CN106446783A - System for sharing biological related data by using cloud computing environment - Google Patents
- Publication number
- CN106446783A CN106446783A CN201610762857.6A CN201610762857A CN106446783A CN 106446783 A CN106446783 A CN 106446783A CN 201610762857 A CN201610762857 A CN 201610762857A CN 106446783 A CN106446783 A CN 106446783A
- Authority
- CN
- China
- Prior art keywords
- cloud computing
- computing environment
- gray
- cell
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/03—Recognition of patterns in medical or anatomical images
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16B—BIOINFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR GENETIC OR PROTEIN-RELATED DATA PROCESSING IN COMPUTATIONAL MOLECULAR BIOLOGY
- G16B30/00—ICT specially adapted for sequence analysis involving nucleotides or amino acids
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
- Investigating Or Analysing Biological Materials (AREA)
Abstract
The present invention provides a system for sharing biological related data by using a cloud computing environment. The system comprises a cell identification module and a sharing system. The cell identification module is used for determining a shared biological species; and the sharing system comprises a cloud computing environment, wherein the cloud computing environment communicates with multiple sample preparation devices, multiple sequencing devices and multiple computing devices. The system provided by the present invention has the following beneficial effect: the cloud computing environment is created to process the biological related data, so that efficiency and timeliness of data processing are improved.
Description
Technical field
The present invention relates to the field of biological data processing, and in particular to a system for sharing biology-related data by using a cloud computing environment.
Background technology
Gene sequencing has become an increasingly important field in gene research and is expected to be used in diagnostic and other applications in the future. In general, gene sequencing involves determining the order of nucleotides in a nucleic acid, such as an RNA or DNA fragment. Typically, relatively short sequences are analysed, and the resulting sequence information can be used in various bioinformatics methods to reliably combine multiple fragments so as to determine the sequence of the much longer stretch of genetic material from which the fragments were derived. Automated, computer-based examination of characteristic fragments has been developed, and such examination has been used in genome mapping and in the identification of genes and their functions. However, existing techniques are extremely time-consuming, and the resulting genomic information is therefore extremely expensive.
Summary of the invention
To solve the above problems, the present invention aims to provide a system for sharing biology-related data by using a cloud computing environment.
The object of the present invention is achieved by the following technical solution:
A system for sharing biology-related data by using a cloud computing environment comprises a cell recognition module and a sharing system. The cell recognition module is used to determine the biological species being shared. The sharing system comprises a cloud computing environment that communicates with multiple sample preparation devices, multiple sequencing devices and multiple computing devices. The cloud computing environment includes at least one server, configured to communicate with at least one of the sample preparation devices remote from the server, at least one of the sequencing devices and at least one of the computing devices, so as to receive and store sample preparation data from the at least one sample preparation device, and to receive and store sequencing data from the at least one sequencing device, while the sample preparation data and sequencing data are being generated.
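As a sketch of the data flow just described, the following Python example shows a cloud-side store that receives and stores sample preparation data while sequencing data is being generated, with two device threads streaming concurrently. The class and method names are illustrative assumptions, not part of the patent:

```python
# Minimal sketch (not the patented implementation): a cloud-side store that
# receives sample-preparation data and sequencing data at the same time.
import threading

class CloudStore:
    def __init__(self):
        self._lock = threading.Lock()
        self.sample_prep = {}   # device_id -> list of prep records
        self.sequencing = {}    # device_id -> list of sequence records

    def receive_sample_prep(self, device_id, record):
        with self._lock:
            self.sample_prep.setdefault(device_id, []).append(record)

    def receive_sequencing(self, device_id, record):
        with self._lock:
            self.sequencing.setdefault(device_id, []).append(record)

store = CloudStore()
# Two device threads stream data concurrently, mimicking "receive and store
# sample preparation data while sequencing data is generated".
t1 = threading.Thread(target=lambda: [store.receive_sample_prep("prep-1", i) for i in range(100)])
t2 = threading.Thread(target=lambda: [store.receive_sequencing("seq-1", i) for i in range(100)])
t1.start(); t2.start(); t1.join(); t2.join()
```

The lock makes concurrent device uploads safe; a real deployment would of course use a durable store behind the server rather than in-memory dictionaries.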
The beneficial effects of the present invention are as follows: a cloud computing environment is created to process biology-related data, improving the efficiency and timeliness of data processing.
Brief description
The invention is further described with reference to the accompanying drawings; however, the embodiments in the drawings do not limit the present invention in any way. For those of ordinary skill in the art, other drawings can be obtained from the following drawings without creative effort.
Fig. 1 is the structural representation of shared system;
Fig. 2 is the structural representation of cell recognition module.
Reference numerals: 1. cell recognition module; 11. cell image segmentation unit; 12. feature extraction unit; 13. classification and identification unit.
Detailed description of the invention
The invention is further described below in combination with the following application scenarios.
Application scenarios 1
Referring to Fig. 1 and Fig. 2, an embodiment of this application scenario provides a system for sharing biology-related data by using a cloud computing environment. The system comprises a cell recognition module and a sharing system. The cell recognition module is used to determine the biological species being shared. The sharing system comprises a cloud computing environment that communicates with multiple sample preparation devices, multiple sequencing devices and multiple computing devices. The cloud computing environment includes at least one server, configured to communicate with at least one of the sample preparation devices remote from the server, at least one of the sequencing devices and at least one of the computing devices, so as to receive and store sample preparation data from the at least one sample preparation device, and to receive and store sequencing data from the at least one sequencing device, while the data are being generated.
Preferably, the at least one server is configured to receive sample-extraction-related data.
This preferred embodiment improves the sample sharing speed.
Preferably, the cloud computing environment includes at least one processor configured to generate a sample extraction log based at least on the sample-extraction-related data.
This preferred embodiment facilitates organising the data.
Preferably, the cell recognition module 1 includes a cell image segmentation unit 11, a feature extraction unit 12 and a classification and identification unit 13. The cell image segmentation unit 11 is used to distinguish the background, nuclei and cytoplasm in the cell images collected by a cell image acquisition module; the feature extraction unit 12 is used to extract texture features from the cell images; and the classification and identification unit 13 is used to classify and identify the cell images according to the texture features by means of a classifier.
This preferred embodiment establishes the unit structure of the cell recognition module 1.
Preferably, the cell image segmentation unit 11 includes an image conversion subunit, a noise removal subunit, a coarse segmentation subunit, a nuclear-centre calibration subunit and an accurate segmentation subunit, specifically:
(1) The image conversion subunit is used to convert the collected cell image into a gray-level image;
(2) The noise removal subunit is used to denoise the gray-level image, as follows:
For a pixel (x, y), take its 3 × 3 neighborhood Sx,y and its (2N+1) × (2N+1) neighborhood Lx,y, where N is an integer greater than or equal to 2;
First, judge whether the pixel is a boundary point: set a threshold T, T ∈ [13, 26]; compute the gray difference between pixel (x, y) and each pixel in its neighborhood Sx,y, and compare the differences with T; if the number of differences greater than T is at least 6, pixel (x, y) is a boundary point, otherwise it is a non-boundary point;
If (x, y) is a boundary point, the following noise reduction is carried out:
h(x, y) = (1/k) Σ q(i, j), the sum being taken over the points (i, j) of the neighborhood Lx,y whose gray values fall within the interval [q(x, y) - 1.5σ, q(x, y) + 1.5σ]
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(x, y) is the gray value of pixel (x, y) before noise reduction, σ is the standard deviation of the gray values in the neighborhood Lx,y, and k is the number of points in Lx,y whose gray values fall within the interval [q(x, y) - 1.5σ, q(x, y) + 1.5σ];
If (x, y) is a non-boundary point, the following noise reduction is carried out:
h(x, y) = Σ w(i, j) q(i, j) / Σ w(i, j), the sums being taken over the points (i, j) of the neighborhood Lx,y
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(i, j) is the gray value of point (i, j) in the image, and w(i, j) is the Gaussian weight corresponding to point (i, j) in the neighborhood Lx,y;
(3) The coarse segmentation subunit is used to roughly partition the denoised cell image into background, cytoplasm and nuclei, specifically:
Each pixel (x, y) is represented by a four-dimensional feature vector:
f(x, y) = [h(x, y), have(x, y), hmed(x, y), hsta(x, y)]
where h(x, y) is the gray value of (x, y), have(x, y) is the gray mean of its neighborhood Sx,y, hmed(x, y) is the gray median of Sx,y, and hsta(x, y) is the gray variance of Sx,y;
The K-means clustering method is then used to divide the pixels into three classes: background, cytoplasm and nucleus;
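The feature-vector and clustering step can be illustrated with a minimal sketch. The plain k-means loop below is a stand-in for any K-means implementation, and border pixels are simply skipped for brevity:

```python
import numpy as np

def coarse_segment(gray, k=3, iters=20, seed=0):
    """Coarse segmentation sketch: each interior pixel is described by the
    4-D feature [gray value, 3x3 mean, 3x3 median, 3x3 variance] and the
    features are clustered into k=3 classes (background, cytoplasm, nucleus)."""
    H, W = gray.shape
    feats = []
    for x in range(1, H - 1):
        for y in range(1, W - 1):
            s = gray[x-1:x+2, y-1:y+2].astype(float)
            feats.append([gray[x, y], s.mean(), np.median(s), s.var()])
    X = np.array(feats)
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]   # random initial centres
    for _ in range(iters):
        # assign each feature vector to its nearest centre, then update centres
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = X[labels == c].mean(axis=0)
    return labels.reshape(H - 2, W - 2)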
(4) The nuclear-centre calibration subunit is used to calibrate the centre of each nucleus:
For the approximate nucleus region obtained by the coarse segmentation subunit, suppose the region contains n points (x1, y1), ..., (xn, yn); an intensity-weighted centre and the geometric centre of the region are computed, and their mean value is taken as the nuclear centre (xz, yz);
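A minimal sketch of the centre calibration, under the assumption that "intensity-weighted" means weighting coordinates by gray value (the exact formula is not reproduced in the text):

```python
import numpy as np

def nucleus_center(points, gray_values):
    """Centre calibration sketch: compute the intensity-weighted centroid and
    the geometric centroid of the n points in the coarse nucleus region,
    then take the mean of the two as the nuclear centre (xz, yz)."""
    pts = np.asarray(points, dtype=float)      # shape (n, 2)
    g = np.asarray(gray_values, dtype=float)   # shape (n,)
    geometric = pts.mean(axis=0)               # plain centroid
    weighted = (pts * g[:, None]).sum(axis=0) / g.sum()   # gray-weighted centroid
    return (geometric + weighted) / 2.0
```

Averaging the two centroids makes the result less sensitive both to irregular region shape and to uneven staining intensity.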
(5) The accurate segmentation subunit is used to accurately segment nucleus and cytoplasm:
Construct the directed line segment from the nuclear centre (xz, yz) to a nucleus/cytoplasm boundary point (xp, yp), and let disp = ⌊ sqrt((xp - xz)^2 + (yp - yz)^2) ⌋, where ⌊ ⌋ denotes rounding down;
Sampling along the segment at unit length yields disp points (xi, yi); if the coordinates of a sampling point are not integers, its gray value is obtained by linear interpolation from the surrounding pixels;
The gray difference at point (xi, yi) along the segment direction is:
hd(xi, yi) = h(xi-1, yi-1) - h(xi, yi)
A gray-difference inhibition function is defined, and the gradient gra(xi, yi) at point (xi, yi) along the segment direction is computed from it;
The maximum point of the gradient is chosen as the precise edge between nucleus and cytoplasm.
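The sampling-and-gradient procedure can be sketched as follows. Since the inhibition function and gradient formulas are not reproduced in the text, the absolute gray difference |hd| is used here as a stand-in for the gradient:

```python
import numpy as np

def edge_along_ray(img, center, boundary):
    """Accurate-segmentation sketch: sample points at unit spacing on the
    directed segment from the nucleus centre to a coarse boundary point
    (bilinear interpolation for non-integer coordinates), form the gray
    differences h_d along the ray, and return the sample index with the
    largest |h_d| as the edge location."""
    (xz, yz), (xp, yp) = center, boundary
    length = float(np.hypot(xp - xz, yp - yz))
    dis = int(np.floor(length))                 # unit-length steps, rounded down
    ux, uy = (xp - xz) / length, (yp - yz) / length

    def bilinear(x, y):
        # gray value at a (possibly non-integer) point via linear interpolation
        x0, y0 = int(np.floor(x)), int(np.floor(y))
        dx, dy = x - x0, y - y0
        p = img[x0:x0 + 2, y0:y0 + 2].astype(float)
        return (p[0, 0] * (1 - dx) * (1 - dy) + p[1, 0] * dx * (1 - dy)
                + p[0, 1] * (1 - dx) * dy + p[1, 1] * dx * dy)

    samples = [bilinear(xz + i * ux, yz + i * uy) for i in range(dis + 1)]
    hd = np.abs(np.diff(samples))               # gray differences h_d along the ray
    return int(np.argmax(hd)) + 1               # sample index of the strongest transition
```

Casting one such ray per coarse boundary point and keeping the strongest transition on each ray traces the precise nucleus/cytoplasm contour.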
This preferred embodiment provides a noise removal subunit that effectively combines the spatial proximity and gray similarity between the centre pixel and its neighboring pixels for noise reduction. In flat regions of the image, where the gray values of neighboring pixels differ little, a Gaussian filter is applied to weight and smooth the gray values; in strongly varying boundary regions, edge-preserving filtering is applied, which helps preserve image edges. K-means clustering is used to extract the coarse contours of nucleus and cytoplasm, which effectively removes the interference of noise. The nuclear-centre calibration subunit facilitates the subsequent accurate localisation of nucleus and cytoplasm contours. The accurate segmentation subunit makes full use of directional information and overcomes the interference of inflammatory cells with the edge map, so that the edges of nucleus and cytoplasm can be extracted accurately.
Preferably, the extraction of texture features from the cell image includes:
(1) The comprehensive gray-level co-occurrence matrix of the cell image is obtained by an improved gray-level co-occurrence matrix method; this comprehensive matrix reflects the texture features of the cell in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90° and 135° be h(x, y, d, 0°), h(x, y, d, 45°), h(x, y, d, 90°) and h(x, y, d, 135°), with corresponding matrix element quantities X1, X2, X3 and X4; the comprehensive gray-level co-occurrence matrix is then computed as:
H(x, y, d) = w1·h(x, y, d, 0°) + w2·h(x, y, d, 45°) + w3·h(x, y, d, 90°) + w4·h(x, y, d, 135°)
and the element quantity of the comprehensive gray-level co-occurrence matrix is the corresponding weighted combination of X1, X2, X3 and X4;
Here d denotes the distance, with value range [2, 4], and wi (i = 1, 2, 3, 4) are weight coefficients calculated from the contrast parameters of the gray-level co-occurrence matrices in the four directions: if the contrast parameter of the matrix in each direction is Di and their mean value is D̄ (i = 1, 2, 3, 4), the weight coefficient wi is computed from Di and D̄;
(2) The comprehensive gray-level co-occurrence matrix and its element quantity are used to obtain the four required texture feature parameters: contrast, sum of variance, energy and mean;
(3) The four texture feature parameters are normalised to obtain the final normalised texture feature values.
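A hedged sketch of this texture-feature pipeline: GLCMs in the four directions at distance d, combined with contrast-derived weights (normalised inverse contrast is an assumption, since the patent's weight formula is not reproduced), followed by four common GLCM statistics as stand-ins for the named parameters, and a final normalisation:

```python
import numpy as np

def glcm(img, dx, dy, levels=8):
    """Normalised gray-level co-occurrence matrix for one displacement (dx, dy)."""
    m = np.zeros((levels, levels))
    H, W = img.shape
    for x in range(H):
        for y in range(W):
            u, v = x + dx, y + dy
            if 0 <= u < H and 0 <= v < W:
                m[img[x, y], img[u, v]] += 1
    return m / m.sum()

def combined_glcm_features(img, d=2):
    """Comprehensive-GLCM sketch: combine the 0/45/90/135-degree matrices at
    distance d with contrast-derived weights, then read off four statistics
    and normalise them. The weighting scheme and the exact feature formulas
    are illustrative assumptions."""
    offsets = [(0, d), (-d, d), (-d, 0), (-d, -d)]   # 0, 45, 90, 135 degrees
    mats = [glcm(img, dx, dy) for dx, dy in offsets]
    idx = np.arange(mats[0].shape[0])
    diff2 = (idx[:, None] - idx[None, :]) ** 2
    contrast = [float((m * diff2).sum()) for m in mats]
    inv = np.array([1.0 / (c + 1e-9) for c in contrast])
    w = inv / inv.sum()                              # assumed weighting scheme
    M = sum(wi * m for wi, m in zip(w, mats))        # comprehensive matrix
    feats = np.array([
        (M * diff2).sum(),                           # contrast
        (M * (idx[:, None] + idx[None, :]) ** 2).sum(),  # sum-based variance proxy
        (M ** 2).sum(),                              # energy
        (M * idx[:, None]).sum(),                    # mean
    ])
    return feats / np.linalg.norm(feats)             # normalised feature vector
```

Down-weighting high-contrast directions reduces the direction-dependent variation that the text attributes to lighting angle and gas-flow interference.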
Based on the improved gray-level co-occurrence matrix method, this preferred embodiment obtains the comprehensive gray-level co-occurrence matrix of the cell image by setting weight coefficients, and then extracts the texture features of the cell in the four specified directions. This solves the problem that, owing to external interference (such as the influence of the lighting angle during cell image acquisition, or the interference of gas flow), the texture feature parameter values of a cell differ considerably in different directions, and it improves the precision of cell image texture feature extraction. The four selected texture features (contrast, sum of variance, energy and mean) eliminate redundant and repeated feature parameters; normalising the four texture feature parameters facilitates the subsequent classification and identification of cell images.
In this application scenario, the threshold is set to T = 13 and d = 2; the image denoising effect is relatively improved by 5%, and the extraction accuracy of cell image features is improved by 8%.
Application scenarios 2
Referring to Fig. 1 and Fig. 2, an embodiment of this application scenario provides the same system for sharing biology-related data by using a cloud computing environment as described in application scenario 1; its modules, subunits and processing steps are identical.
In this application scenario, the threshold is set to T = 15 and d = 2; the image denoising effect is relatively improved by 6%, and the extraction accuracy of cell image features is improved by 8%.
Application scenarios 3
Seeing Fig. 1, Fig. 2, it is biological related that the one of an embodiment of this application scene utilizes cloud computing environment to share
The system of data, including cell recognition module and shared system, described cell recognition module is used for determining shared biological species, institute
State shared system and include cloud computing environment, this cloud computing environment and multiple sample preparation apparatus, multiple sequencing device and multiple meter
Calculating device communication, wherein cloud computing environment includes at least one server, and at least one server is configured to and away from least
At least one sequencing device at least one sample preparation apparatus in the sample preparation apparatus of one server, sequencing device
With at least one computing device communication in computing device, to prepare while data and sequencing data generate from described at sample
At least one sample preparation apparatus receives and storage sample is prepared data and receives from least one sequencing device described and storage
Sequence data.
Preferably, at least one server described is configured to receive sample extraction related data.
This preferred embodiment improves sample and shares speed.
Preferably, described cloud computing environment includes at least one processor, and this at least one processor is configured at least
Generate sample extraction daily record based on sample extraction related data.
This preferred embodiment contributes to arranging data.
Preferably, described cell recognition module 1 includes that the 12nd, Methods of Segmentation On Cell Images unit the 11st, feature extraction unit classifies knowledge
Other unit 13;Described Methods of Segmentation On Cell Images unit 11 is for distinguishing the back of the body in the cell image being gathered by cell image acquisition module
Scape, nucleus and cytoplasm;Described feature extraction unit 12 is for extracting to the textural characteristics of cell image;Described classification
Recognition unit 13 is for utilizing grader to realize to cell image Classification and Identification according to textural characteristics.
This preferred embodiment constructs the unit structure of cell recognition module 1.
Preferably, described Methods of Segmentation On Cell Images unit 11 includes image conversion subelement, noise remove subelement, coarse segmentation
Subelement, nuclear centers demarcate subelement, Accurate Segmentation subelement, are specially:
(1) image conversion subelement, for being converted into gray level image by the cell image of collection;
(2) noise remove subelement, is used for carrying out denoising to gray level image, including:
For pixel, (x y), chooses its neighborhood S of 3 × 3x,y(2N+1) the neighborhood L of × (2N+1)x,y, N for more than
Integer equal to 2;
First whether be that boundary point judges to pixel, set threshold value T, T ∈ [13,26], calculate pixel (x, y)
With its neighborhood Sx,yIn the gray scale difference value of each pixel, and compare with threshold value T, if gray scale difference value is more than the number of threshold value T
More than or equal to 6, then (x, y) is boundary point to pixel, and otherwise, (x y) is non-boundary point to pixel;
If (x, y) is boundary point, then carry out following noise reduction process:
In formula, h (x, y) be after noise reduction pixel ((x y) is noise reduction preceding pixel point (x, ash y) to q for x, gray value y)
Angle value, σ is pixel (x, y) neighborhood Lx,yInterior gray value mark difference, q (i, j) ∈ [q (and x, y)-1.5 σ, q (x, y)+1.5 σ] represent
Neighborhood Lx,yInterior gray value fall within interval [q (and x, y)-1.5 σ, q (x, y)+1.5 σ] point, k represents neighborhood Lx,yInterior gray value falls within
Interval [q (x, y)-1.5 σ, q (x, y)+1.5 σ] the quantity of point;
If (x, y) is non-boundary point, then carry out following noise reduction process:
In formula, (x y) is pixel (x, gray value y), q (i, j) representative image midpoint (i, j) ash at place after noise reduction to h
Angle value, (i j) is neighborhood L to wx,yInterior point (i, j) corresponding Gauss weight;
(3) coarse segmentation subelement, for slightly drawing to the background in the cell image after denoising, cytoplasm, nucleus
Point, it is specially:
By each pixel, (x y) represents with four dimensional feature vectors:
In formula, (x y) represents (x, gray value y), h to have(x y) represents its neighborhood Sx,yGray average, hmed(x, y) generation
Table its neighborhood Sx,yGray scale intermediate value, hsta(x y) represents its neighborhood Sx,yGray variance;
K-means clustering procedure is used to be divided into background, cytoplasm, nucleus three class;
(4) nuclear centers demarcates subelement, for demarcating nuclear centers:
Coarse segmentation subelement is obtained nucleus approximate region, if nuclear area comprises n point:(x1,y1),…,(xn,
yn), carry out intensity-weighted demarcation to this region and geometric center is demarcated, take its mean value as nuclear centers (xz,yz):
(5) Accurate Segmentation subelement, for carrying out Accurate Segmentation to nucleus, cytoplasm;
Build from nuclear centers (xz,yz) arrive nucleus and cytoplasm boundary point (xp,yp) directed line segmentDistanceRepresent and round downwards;
Carry out sampling along line segment with unit length and can obtain dispIndividual pointIf adopting
The coordinate of sampling point is not integer, and its gray value is obtained by surrounding pixel linear interpolation;
Point (xi,yi) place is along the gray scale difference of line segment direction:
hd(xi,yi)=h (xi-1,yi-1)-h(xi,yi)
Definition gray scale difference inhibition function:
Point (xi,yi) place is along the gradient gra (x of line segment directioni,yi):
Choose the maximum value point of gradient as nucleus and cytoplasmic precise edge.
In this preferred embodiment, the noise removal subunit effectively integrates the spatial proximity and gray-level similarity between the center pixel and its neighborhood pixels for noise reduction: in flat regions of the image, where the pixel gray values in a neighborhood differ little, a Gaussian filter performs weighted filtering on the gray values; in sharply changing border regions, edge-preserving filtering is performed, which helps preserve the image edges. K-means clustering extracts the coarse contours of the nucleus and cytoplasm and effectively removes the interference of noise. The nuclear-center calibration subunit facilitates the subsequent accurate localization of the nucleus and cytoplasm contours. The accurate segmentation subunit makes full use of directional information, overcomes the interference of inflammatory cells with the edge map, and accurately extracts the nucleus and cytoplasm edges.
Preferably, extracting the textural features of the cell image includes:
(1) Computing the comprehensive gray-level co-occurrence matrix of the cell image based on an improved gray-level co-occurrence matrix method; the comprehensive matrix captures the cell's textural features in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90° and 135° be h(x,y,d,0°), h(x,y,d,45°), h(x,y,d,90°) and h(x,y,d,135°), with matrix element counts X1, X2, X3 and X4 respectively; the comprehensive gray-level co-occurrence matrix is then computed as:
H(x,y,d) = w1h(x,y,d,0°) + w2h(x,y,d,45°) + w3h(x,y,d,90°) + w4h(x,y,d,135°)
and its element count is:
X = w1X1 + w2X2 + w3X3 + w4X4
In these formulas, d denotes the distance, with values in [2,4], and wi (i=1,2,3,4) are weight coefficients computed from the contrast parameters of the gray-level co-occurrence matrices in the four directions: if the contrast parameter of the matrix in direction i is Di, whose mean over i=1,2,3,4 is D̄, the weight coefficient wi is derived from Di and D̄;
(2) Using the comprehensive gray-level co-occurrence matrix and its matrix element count to obtain the four required textural feature parameters: contrast, sum variance, energy and mean;
(3) Normalizing the four textural feature parameters to obtain the final normalized textural feature values.
In this preferred embodiment, the comprehensive gray-level co-occurrence matrix of the cell image is computed with weight coefficients based on the improved gray-level co-occurrence matrix method, and the cell's textural features are then extracted in the four specified directions. This solves the problem that external interference during cell image acquisition (such as the illumination angle or the flow of gas) makes the cell's textural feature parameters differ significantly across directions, improving the precision of cell image texture feature extraction. The four selected textural features (contrast, sum variance, energy and mean) eliminate redundant and repeated feature parameters, and normalizing them facilitates the subsequent classification and recognition of cell images.
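The weighted combination H(x,y,d) = Σ wi·h(x,y,d,θi) can be sketched as below. The displacement vectors chosen for the four angles and the normalization of the contrast-derived weights wi are assumptions, since the patent's weight formula is not reproduced in the extracted text:

```python
import numpy as np

def glcm(img, dx, dy, levels):
    """Gray-level co-occurrence matrix for one displacement (dx, dy)."""
    h = np.zeros((levels, levels))
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                h[img[r, c], img[r2, c2]] += 1
    return h

def comprehensive_glcm(img, d, levels=8):
    """Weighted sum of the 0°, 45°, 90°, 135° co-occurrence matrices,
    weighted by each matrix's contrast parameter (assumed normalization)."""
    offsets = [(d, 0), (d, -d), (0, -d), (-d, -d)]  # assumed four directions
    mats = [glcm(img, dx, dy, levels) for dx, dy in offsets]
    # Contrast parameter of each directional matrix: sum of (i-j)^2 * counts.
    i, j = np.indices((levels, levels))
    contrasts = [float(((i - j) ** 2 * m).sum()) for m in mats]
    total = sum(contrasts) or 1.0
    weights = [c / total for c in contrasts]  # normalized to sum to 1
    return sum(w * m for w, m in zip(weights, mats))
```

On a tiny checkerboard image the horizontal and vertical matrices carry all the contrast, so only those directions contribute to H.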
In this application scenario, with threshold T=18 and d=3, the image denoising effect is relatively improved by 7%, and the extraction accuracy of cell image features is improved by 7%.
Application scenarios 4
Referring to Fig. 1 and Fig. 2, an embodiment of this application scenario provides a system for sharing biological related data using a cloud computing environment, comprising a cell recognition module and a sharing system. The cell recognition module is used to determine the biological species to be shared. The sharing system includes a cloud computing environment that communicates with multiple sample preparation devices, multiple sequencing devices and multiple computing devices. The cloud computing environment includes at least one server, configured to communicate with at least one of the sample preparation devices, at least one of the sequencing devices and at least one of the computing devices remote from the server, so as to receive and store sample preparation data from the at least one sample preparation device and sequencing data from the at least one sequencing device while those data are being generated.
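A minimal sketch of the server role described above: it receives and stores sample preparation data and sequencing data per device as they are generated, and exposes them for sharing with computing devices. All class and method names here are illustrative, not from the patent:

```python
from collections import defaultdict

class CloudEnvironmentServer:
    """Sketch of the cloud-environment server: receive and store sample
    preparation data and sequencing data from remote devices."""

    def __init__(self):
        # Records are kept per device so multiple devices can contribute.
        self.sample_prep_data = defaultdict(list)
        self.sequencing_data = defaultdict(list)

    def receive_sample_prep(self, device_id, record):
        """Store a sample preparation record as it is generated."""
        self.sample_prep_data[device_id].append(record)

    def receive_sequencing(self, device_id, record):
        """Store a sequencing record as it is generated."""
        self.sequencing_data[device_id].append(record)

    def shared_records(self, device_id):
        """Expose stored records for sharing with computing devices."""
        return {
            "sample_prep": list(self.sample_prep_data[device_id]),
            "sequencing": list(self.sequencing_data[device_id]),
        }
```

A production system would add authentication, persistence and streaming transport; this only illustrates the receive-and-store data flow.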
Preferably, the at least one server is configured to receive sample-extraction-related data.
This preferred embodiment improves the sample sharing speed.
Preferably, the cloud computing environment includes at least one processor, configured to generate a sample extraction log based at least on the sample-extraction-related data.
This preferred embodiment helps organize the data.
Preferably, the cell recognition module 1 includes a cell image segmentation unit 11, a feature extraction unit 12 and a classification and recognition unit 13. The cell image segmentation unit 11 distinguishes the background, nucleus and cytoplasm in the cell images gathered by the cell image acquisition module; the feature extraction unit 12 extracts the textural features of the cell image; the classification and recognition unit 13 uses a classifier to classify and recognize cell images according to their textural features.
This preferred embodiment sets out the unit structure of cell recognition module 1.
Preferably, the cell image segmentation unit 11 includes an image conversion subunit, a noise removal subunit, a coarse segmentation subunit, a nuclear-center calibration subunit and an accurate segmentation subunit, specifically:
(1) The image conversion subunit converts the collected cell image into a gray-level image;
(2) The noise removal subunit denoises the gray-level image, as follows:
For a pixel (x,y), choose its 3×3 neighborhood Sx,y and its (2N+1)×(2N+1) neighborhood Lx,y, where N is an integer greater than or equal to 2;
First judge whether the pixel is a boundary point: set a threshold T, T ∈ [13,26], compute the gray difference between pixel (x,y) and each pixel in its neighborhood Sx,y, and compare each difference with T; if the number of gray differences greater than T is greater than or equal to 6, pixel (x,y) is a boundary point; otherwise, pixel (x,y) is a non-boundary point;
If (x,y) is a boundary point, the following noise reduction is performed:
h(x,y) = (1/k) Σ q(i,j), the sum running over the points (i,j) in Lx,y with q(i,j) ∈ [q(x,y)−1.5σ, q(x,y)+1.5σ]
where h(x,y) is the gray value of pixel (x,y) after noise reduction, q(x,y) is the gray value of pixel (x,y) before noise reduction, σ is the standard deviation of the gray values within the neighborhood Lx,y, and k is the number of points in Lx,y whose gray values fall within the interval [q(x,y)−1.5σ, q(x,y)+1.5σ];
If (x,y) is a non-boundary point, the following noise reduction is performed:
h(x,y) = Σ w(i,j) q(i,j), the sum running over the points (i,j) in Lx,y
where h(x,y) is the gray value of pixel (x,y) after noise reduction, q(i,j) is the gray value of the image at point (i,j), and w(i,j) is the Gaussian weight corresponding to point (i,j) in the neighborhood Lx,y;
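The boundary/non-boundary rule can be sketched for a single pixel as follows, using the definitions of T, Sx,y, Lx,y, σ and k given above; the isotropic Gaussian kernel used in the flat-region branch is an assumption, since the patent does not specify the kernel:

```python
import numpy as np

def denoise_pixel(img, x, y, T=20, N=2):
    """Edge-preserving denoising of pixel (x, y): boundary points get a
    trimmed mean over gray values within 1.5 sigma of the pixel; flat-region
    points get a Gaussian-weighted average over the larger neighborhood."""
    # 3x3 neighborhood S_{x,y}: count gray differences exceeding T.
    s = img[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2].astype(float)
    is_boundary = (np.abs(s - img[y, x]) > T).sum() >= 6
    # (2N+1)x(2N+1) neighborhood L_{x,y}.
    L = img[max(y - N, 0):y + N + 1, max(x - N, 0):x + N + 1].astype(float)
    if is_boundary:
        # Mean of gray values within [q-1.5*sigma, q+1.5*sigma]: preserves edges.
        sigma = L.std()
        q = float(img[y, x])
        sel = L[(L >= q - 1.5 * sigma) & (L <= q + 1.5 * sigma)]
        return float(sel.mean())
    # Flat region: Gaussian-weighted average over L (assumed kernel).
    yy, xx = np.indices(L.shape)
    cy, cx = (L.shape[0] - 1) / 2, (L.shape[1] - 1) / 2
    w = np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2.0 * N ** 2))
    return float((w * L).sum() / w.sum())
```

On a constant image both branches leave the pixel value unchanged, as expected of any averaging filter.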
(3) The coarse segmentation subunit roughly divides the denoised cell image into background, cytoplasm and nucleus, specifically:
Each pixel (x,y) is represented by a four-dimensional feature vector [h(x,y), have(x,y), hmed(x,y), hsta(x,y)], where h(x,y) is the gray value of (x,y), have(x,y) is the gray mean of its neighborhood Sx,y, hmed(x,y) is the gray median of Sx,y, and hsta(x,y) is the gray variance of Sx,y;
K-means clustering is then used to divide the pixels into three classes: background, cytoplasm and nucleus;
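A sketch of the coarse segmentation on the four-dimensional feature vectors; plain k-means with three clusters is used, and the initialization from the darkest, middle and brightest pixels is an assumption (the patent does not specify an initialization):

```python
import numpy as np

def pixel_features(img, x, y):
    """Four-dimensional feature vector for pixel (x, y): gray value, plus
    the mean, median and variance of its 3x3 neighborhood S_{x,y}."""
    s = img[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2].astype(float)
    return [float(img[y, x]), float(s.mean()), float(np.median(s)), float(s.var())]

def coarse_segment(img, iters=10):
    """Cluster every pixel into 3 classes (background, cytoplasm, nucleus)
    by plain k-means on the 4-D feature vectors."""
    h, w = img.shape
    feats = np.array([pixel_features(img, x, y) for y in range(h) for x in range(w)])
    # Assumed initialization: darkest, middle and brightest feature vectors.
    order = np.argsort(feats[:, 0])
    centers = feats[[order[0], order[len(order) // 2], order[-1]]]
    for _ in range(iters):
        # Assign each pixel to its nearest center, then recompute centers.
        d = ((feats[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for c in range(3):
            if (labels == c).any():
                centers[c] = feats[labels == c].mean(axis=0)
    return labels.reshape(h, w)
```

The returned label map has one of three class indices per pixel; which index corresponds to the nucleus can be decided afterwards, e.g. by mean gray value.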
(4) The nuclear-center calibration subunit calibrates the nuclear center:
From the approximate nuclear region obtained by the coarse segmentation subunit, suppose the nuclear region contains n points (x1,y1),…,(xn,yn); an intensity-weighted calibration and a geometric-center calibration are performed on this region, and the mean of the two results is taken as the nuclear center (xz,yz);
(5) The accurate segmentation subunit performs accurate segmentation of the nucleus and cytoplasm;
A directed line segment is constructed from the nuclear center (xz,yz) to a boundary point (xp,yp) of the nucleus or cytoplasm, with length dis = ⌊√((xp−xz)² + (yp−yz)²)⌋, where ⌊·⌋ denotes rounding down;
Sampling along the segment at unit length yields dis points (xi,yi); if the coordinates of a sampling point are not integers, its gray value is obtained by linear interpolation from the surrounding pixels;
The gray difference at point (xi,yi) along the segment direction is:
hd(xi,yi) = h(xi−1,yi−1) − h(xi,yi)
A gray-difference suppression function is defined, and the gradient gra(xi,yi) at point (xi,yi) along the segment direction is computed from the gray difference and the suppression function;
The maximum-gradient points are chosen as the precise edges of the nucleus and cytoplasm.
In this preferred embodiment, the noise removal subunit effectively integrates the spatial proximity and gray-level similarity between the center pixel and its neighborhood pixels for noise reduction: in flat regions of the image, where the pixel gray values in a neighborhood differ little, a Gaussian filter performs weighted filtering on the gray values; in sharply changing border regions, edge-preserving filtering is performed, which helps preserve the image edges. K-means clustering extracts the coarse contours of the nucleus and cytoplasm and effectively removes the interference of noise. The nuclear-center calibration subunit facilitates the subsequent accurate localization of the nucleus and cytoplasm contours. The accurate segmentation subunit makes full use of directional information, overcomes the interference of inflammatory cells with the edge map, and accurately extracts the nucleus and cytoplasm edges.
Preferably, extracting the textural features of the cell image includes:
(1) Computing the comprehensive gray-level co-occurrence matrix of the cell image based on an improved gray-level co-occurrence matrix method; the comprehensive matrix captures the cell's textural features in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90° and 135° be h(x,y,d,0°), h(x,y,d,45°), h(x,y,d,90°) and h(x,y,d,135°), with matrix element counts X1, X2, X3 and X4 respectively; the comprehensive gray-level co-occurrence matrix is then computed as:
H(x,y,d) = w1h(x,y,d,0°) + w2h(x,y,d,45°) + w3h(x,y,d,90°) + w4h(x,y,d,135°)
and its element count is:
X = w1X1 + w2X2 + w3X3 + w4X4
In these formulas, d denotes the distance, with values in [2,4], and wi (i=1,2,3,4) are weight coefficients computed from the contrast parameters of the gray-level co-occurrence matrices in the four directions: if the contrast parameter of the matrix in direction i is Di, whose mean over i=1,2,3,4 is D̄, the weight coefficient wi is derived from Di and D̄;
(2) Using the comprehensive gray-level co-occurrence matrix and its matrix element count to obtain the four required textural feature parameters: contrast, sum variance, energy and mean;
(3) Normalizing the four textural feature parameters to obtain the final normalized textural feature values.
In this preferred embodiment, the comprehensive gray-level co-occurrence matrix of the cell image is computed with weight coefficients based on the improved gray-level co-occurrence matrix method, and the cell's textural features are then extracted in the four specified directions. This solves the problem that external interference during cell image acquisition (such as the illumination angle or the flow of gas) makes the cell's textural feature parameters differ significantly across directions, improving the precision of cell image texture feature extraction. The four selected textural features (contrast, sum variance, energy and mean) eliminate redundant and repeated feature parameters, and normalizing them facilitates the subsequent classification and recognition of cell images.
In this application scenario, with threshold T=20 and d=4, the image denoising effect is relatively improved by 8%, and the extraction accuracy of cell image features is improved by 6%.
Application scenarios 5
Referring to Fig. 1 and Fig. 2, an embodiment of this application scenario provides a system for sharing biological related data using a cloud computing environment, comprising a cell recognition module and a sharing system. The cell recognition module is used to determine the biological species to be shared. The sharing system includes a cloud computing environment that communicates with multiple sample preparation devices, multiple sequencing devices and multiple computing devices. The cloud computing environment includes at least one server, configured to communicate with at least one of the sample preparation devices, at least one of the sequencing devices and at least one of the computing devices remote from the server, so as to receive and store sample preparation data from the at least one sample preparation device and sequencing data from the at least one sequencing device while those data are being generated.
Preferably, the at least one server is configured to receive sample-extraction-related data.
This preferred embodiment improves the sample sharing speed.
Preferably, the cloud computing environment includes at least one processor, configured to generate a sample extraction log based at least on the sample-extraction-related data.
This preferred embodiment helps organize the data.
Preferably, the cell recognition module 1 includes a cell image segmentation unit 11, a feature extraction unit 12 and a classification and recognition unit 13. The cell image segmentation unit 11 distinguishes the background, nucleus and cytoplasm in the cell images gathered by the cell image acquisition module; the feature extraction unit 12 extracts the textural features of the cell image; the classification and recognition unit 13 uses a classifier to classify and recognize cell images according to their textural features.
This preferred embodiment sets out the unit structure of cell recognition module 1.
Preferably, the cell image segmentation unit 11 includes an image conversion subunit, a noise removal subunit, a coarse segmentation subunit, a nuclear-center calibration subunit and an accurate segmentation subunit, specifically:
(1) The image conversion subunit converts the collected cell image into a gray-level image;
(2) The noise removal subunit denoises the gray-level image, as follows:
For a pixel (x,y), choose its 3×3 neighborhood Sx,y and its (2N+1)×(2N+1) neighborhood Lx,y, where N is an integer greater than or equal to 2;
First judge whether the pixel is a boundary point: set a threshold T, T ∈ [13,26], compute the gray difference between pixel (x,y) and each pixel in its neighborhood Sx,y, and compare each difference with T; if the number of gray differences greater than T is greater than or equal to 6, pixel (x,y) is a boundary point; otherwise, pixel (x,y) is a non-boundary point;
If (x,y) is a boundary point, the following noise reduction is performed:
h(x,y) = (1/k) Σ q(i,j), the sum running over the points (i,j) in Lx,y with q(i,j) ∈ [q(x,y)−1.5σ, q(x,y)+1.5σ]
where h(x,y) is the gray value of pixel (x,y) after noise reduction, q(x,y) is the gray value of pixel (x,y) before noise reduction, σ is the standard deviation of the gray values within the neighborhood Lx,y, and k is the number of points in Lx,y whose gray values fall within the interval [q(x,y)−1.5σ, q(x,y)+1.5σ];
If (x,y) is a non-boundary point, the following noise reduction is performed:
h(x,y) = Σ w(i,j) q(i,j), the sum running over the points (i,j) in Lx,y
where h(x,y) is the gray value of pixel (x,y) after noise reduction, q(i,j) is the gray value of the image at point (i,j), and w(i,j) is the Gaussian weight corresponding to point (i,j) in the neighborhood Lx,y;
(3) The coarse segmentation subunit roughly divides the denoised cell image into background, cytoplasm and nucleus, specifically:
Each pixel (x,y) is represented by a four-dimensional feature vector [h(x,y), have(x,y), hmed(x,y), hsta(x,y)], where h(x,y) is the gray value of (x,y), have(x,y) is the gray mean of its neighborhood Sx,y, hmed(x,y) is the gray median of Sx,y, and hsta(x,y) is the gray variance of Sx,y;
K-means clustering is then used to divide the pixels into three classes: background, cytoplasm and nucleus;
(4) The nuclear-center calibration subunit calibrates the nuclear center:
From the approximate nuclear region obtained by the coarse segmentation subunit, suppose the nuclear region contains n points (x1,y1),…,(xn,yn); an intensity-weighted calibration and a geometric-center calibration are performed on this region, and the mean of the two results is taken as the nuclear center (xz,yz);
(5) The accurate segmentation subunit performs accurate segmentation of the nucleus and cytoplasm;
A directed line segment is constructed from the nuclear center (xz,yz) to a boundary point (xp,yp) of the nucleus or cytoplasm, with length dis = ⌊√((xp−xz)² + (yp−yz)²)⌋, where ⌊·⌋ denotes rounding down;
Sampling along the segment at unit length yields dis points (xi,yi); if the coordinates of a sampling point are not integers, its gray value is obtained by linear interpolation from the surrounding pixels;
The gray difference at point (xi,yi) along the segment direction is:
hd(xi,yi) = h(xi−1,yi−1) − h(xi,yi)
A gray-difference suppression function is defined, and the gradient gra(xi,yi) at point (xi,yi) along the segment direction is computed from the gray difference and the suppression function;
The maximum-gradient points are chosen as the precise edges of the nucleus and cytoplasm.
In this preferred embodiment, the noise removal subunit effectively integrates the spatial proximity and gray-level similarity between the center pixel and its neighborhood pixels for noise reduction: in flat regions of the image, where the pixel gray values in a neighborhood differ little, a Gaussian filter performs weighted filtering on the gray values; in sharply changing border regions, edge-preserving filtering is performed, which helps preserve the image edges. K-means clustering extracts the coarse contours of the nucleus and cytoplasm and effectively removes the interference of noise. The nuclear-center calibration subunit facilitates the subsequent accurate localization of the nucleus and cytoplasm contours. The accurate segmentation subunit makes full use of directional information, overcomes the interference of inflammatory cells with the edge map, and accurately extracts the nucleus and cytoplasm edges.
Preferably, extracting the textural features of the cell image includes:
(1) Computing the comprehensive gray-level co-occurrence matrix of the cell image based on an improved gray-level co-occurrence matrix method; the comprehensive matrix captures the cell's textural features in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90° and 135° be h(x,y,d,0°), h(x,y,d,45°), h(x,y,d,90°) and h(x,y,d,135°), with matrix element counts X1, X2, X3 and X4 respectively; the comprehensive gray-level co-occurrence matrix is then computed as:
H(x,y,d) = w1h(x,y,d,0°) + w2h(x,y,d,45°) + w3h(x,y,d,90°) + w4h(x,y,d,135°)
and its element count is:
X = w1X1 + w2X2 + w3X3 + w4X4
In these formulas, d denotes the distance, with values in [2,4], and wi (i=1,2,3,4) are weight coefficients computed from the contrast parameters of the gray-level co-occurrence matrices in the four directions: if the contrast parameter of the matrix in direction i is Di, whose mean over i=1,2,3,4 is D̄, the weight coefficient wi is derived from Di and D̄;
(2) Using the comprehensive gray-level co-occurrence matrix and its matrix element count to obtain the four required textural feature parameters: contrast, sum variance, energy and mean;
(3) Normalizing the four textural feature parameters to obtain the final normalized textural feature values.
In this preferred embodiment, the comprehensive gray-level co-occurrence matrix of the cell image is computed with weight coefficients based on the improved gray-level co-occurrence matrix method, and the cell's textural features are then extracted in the four specified directions. This solves the problem that external interference during cell image acquisition (such as the illumination angle or the flow of gas) makes the cell's textural feature parameters differ significantly across directions, improving the precision of cell image texture feature extraction. The four selected textural features (contrast, sum variance, energy and mean) eliminate redundant and repeated feature parameters, and normalizing them facilitates the subsequent classification and recognition of cell images.
In this application scenario, with threshold T=26 and d=2, the image denoising effect is relatively improved by 7.5%, and the extraction accuracy of cell image features is improved by 8%.
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solution of the present invention, not to limit its scope of protection. Although the present invention has been explained in detail with reference to preferred embodiments, those of ordinary skill in the art should understand that the technical solution of the present invention may be modified or equivalently substituted without departing from the essence and scope of the technical solution of the present invention.
Claims (3)
1. A system for sharing biological related data using a cloud computing environment, characterized by comprising a cell recognition module and a sharing system, wherein the cell recognition module is used to determine the biological species to be shared; the sharing system includes a cloud computing environment communicating with multiple sample preparation devices, multiple sequencing devices and multiple computing devices; the cloud computing environment includes at least one server, the at least one server being configured to communicate with at least one of the sample preparation devices, at least one of the sequencing devices and at least one of the computing devices remote from the at least one server, so as to receive and store sample preparation data from the at least one sample preparation device and to receive and store sequencing data from the at least one sequencing device while the sample preparation data and sequencing data are being generated.
2. The system for sharing biological related data using a cloud computing environment according to claim 1, characterized in that the at least one server is configured to receive sample-extraction-related data.
3. The system for sharing biological related data using a cloud computing environment according to claim 2, characterized in that the cloud computing environment includes at least one processor, the at least one processor being configured to generate a sample extraction log based at least on the sample-extraction-related data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610762857.6A CN106446783A (en) | 2016-08-30 | 2016-08-30 | System for sharing biological related data by using cloud computing environment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106446783A (en) | 2017-02-22 |
Family
ID=58090334
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610762857.6A Pending CN106446783A (en) | 2016-08-30 | 2016-08-30 | System for sharing biological related data by using cloud computing environment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106446783A (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105051742A (en) * | 2013-01-25 | 2015-11-11 | Illumina Inc. | Methods and systems for using a cloud computing environment to share biological related data |
CN105894464A (en) * | 2016-03-28 | 2016-08-24 | Fuzhou Rockchip Electronics Co., Ltd. | Median filtering image processing method and apparatus |
Non-Patent Citations (2)
Title |
---|
李宽 (Li Kuan): "Research on Segmentation, Texture Extraction and Recognition Methods for Cell Images", China Doctoral Dissertations Full-text Database, Information Science and Technology * |
梁光明 (Liang Guangming): "Research on Key Technologies for Intelligent Recognition of Formed Elements in Body Fluid Cell Images", China Doctoral Dissertations Full-text Database, Information Science and Technology * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20170222 |