CN106372596A - Biological information collection device - Google Patents

Biological information collection device

Info

Publication number
CN106372596A
CN106372596A, CN201610783725.1A, CN201610783725A
Authority
CN
China
Prior art keywords
gray
reagent
cell
sample
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610783725.1A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201610783725.1A
Publication of CN106372596A
Legal status: Pending (current)


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; scene-specific elements
    • G06V20/60 - Type of objects
    • G06V20/69 - Microscopic objects, e.g. biological cells or cellular parts

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides a biological information collection device. The device comprises a cell identification module and a biosensor; the cell identification module is configured to determine the biological species, and the biosensor is formed of a dried porous material. The device improves the permeability of the reaction-layer carrier and achieves more uniform penetration of the sample, yielding a biosensor that allows convenient, rapid, highly sensitive and high-performance measurement.

Description

A biological information acquisition device
Technical field
The present invention relates to the field of biosensors, and in particular to a biological information acquisition device.
Background technology
In recent years there has been a demand at the point of medical diagnosis for measuring devices that are rapid, simple and accurate. In the existing methods, however, adding a sample solution to the sensor portion requires a series of operations: if the sample solution is blood, for example, blood is drawn with a syringe, the formed components (blood cells) are separated from the plasma, usually with a centrifugal separator, and the sample is then applied to the sensor portion with instruments such as a dispenser or a pipette. Drawing blood with a syringe requires medical skill, and the centrifugation step requires special instruments and training, so ordinary households and individuals without such training cannot perform the measurement themselves. In addition, quantifying the solution under examination requires instruments such as a dispenser and involves cumbersome handling.
Content of the invention
To solve the above problems, the present invention provides a biological information acquisition device.
The object of the present invention is achieved by the following technical solution:
A biological information acquisition device comprises a cell recognition module and a biosensor. The cell recognition module is used to determine the biological species. The biosensor is formed of a dried porous material and has a sample introduction part into which a sample solution is introduced and a developing layer in which the sample solution spreads. The developing layer carries a labelled-reagent holding part, in which a labelled reagent is held in a dry state so that it can be eluted as the sample solution spreads through the layer, and a reagent-immobilization part, in which a reagent that participates in the reaction and can bind to the analyte is fixed to the developing layer so that it does not elute. A sample solution introduced into the sample introduction part permeates the developing layer and reaches the labelled-reagent holding part; the sample solution elutes the labelled reagent while moving toward the reagent-immobilization part, so that the analyte reacts with the labelled reagent and with the immobilized reagent.
By measuring the amount of labelled reagent bound in the reagent-immobilization part, the biosensor determines the analyte contained in the sample solution qualitatively or quantitatively.
The invention has the following beneficial effects: the permeability of the reaction-layer carrier is improved and a more uniform penetration is achieved, giving a biosensor that is simple and rapid to use and provides highly sensitive, high-performance measurement.
Brief description of the drawings
The invention will be further described with reference to the accompanying drawings, but the embodiments shown in the drawings do not limit the present invention in any way; a person of ordinary skill in the art may obtain other drawings from the following drawings without inventive effort.
Fig. 1 is a schematic structural diagram of the biosensor of the present invention;
Fig. 2 is a schematic structural diagram of the cell recognition module.
Reference numerals:
cell recognition module 1, cell image segmentation unit 11, feature extraction unit 12, classification and identification unit 13.
Specific embodiments
The invention will be further described with reference to the following application scenarios.
Application Scenario 1
Referring to Fig. 1 and Fig. 2, a biological information acquisition device according to an embodiment of this application scenario comprises a cell recognition module and a biosensor. The cell recognition module is used to determine the biological species. The biosensor is formed of a dried porous material and has a sample introduction part into which a sample solution is introduced and a developing layer in which the sample solution spreads. The developing layer carries a labelled-reagent holding part, in which a labelled reagent is held in a dry state so that it can be eluted as the sample solution spreads through the layer, and a reagent-immobilization part, in which a reagent that participates in the reaction and can bind to the analyte is fixed to the developing layer so that it does not elute. A sample solution introduced into the sample introduction part permeates the developing layer and reaches the labelled-reagent holding part; the sample solution elutes the labelled reagent while moving toward the reagent-immobilization part, so that the analyte reacts with the labelled reagent and with the immobilized reagent.
By measuring the amount of labelled reagent bound in the reagent-immobilization part, the biosensor determines the analyte contained in the sample solution qualitatively or quantitatively.
Preferably, a mesh structure is arranged in the sample introduction part.
This preferred embodiment increases the contact area.
Preferably, a cell-shrinking-agent holding part, containing an agent that contracts the cellular components, is provided in the sample introduction part, in the labelled-reagent holding part, or at a position other than the sample introduction surface.
This preferred embodiment makes the measurement more accurate.
Preferably, the cell recognition module 1 comprises a cell image segmentation unit 11, a feature extraction unit 12 and a classification and identification unit 13. The cell image segmentation unit 11 distinguishes the background, the nucleus and the cytoplasm in the cell image collected by a cell image acquisition module; the feature extraction unit 12 extracts the texture features of the cell image; and the classification and identification unit 13 uses a classifier to classify and identify the cell image according to the texture features.
This preferred embodiment defines the unit structure of the cell recognition module 1.
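The three units above act as a sequential pipeline: unit 11 segments, unit 12 extracts texture features, unit 13 classifies. The following Python sketch shows one minimal way to wire them together; the class name, the callable signatures and the masking of the background are illustrative assumptions and are not taken from the patent.

from dataclasses import dataclass
from typing import Callable
import numpy as np

@dataclass
class CellRecognitionModule:
    """Sketch of cell recognition module 1 (names are assumptions, not patent text)."""
    segment: Callable[[np.ndarray], np.ndarray]            # unit 11: label map (0 background, 1 cytoplasm, 2 nucleus)
    extract_features: Callable[[np.ndarray], np.ndarray]   # unit 12: texture feature vector
    classify: Callable[[np.ndarray], str]                  # unit 13: class label from the feature vector

    def identify(self, cell_image: np.ndarray) -> str:
        labels = self.segment(cell_image)
        cell_only = cell_image * (labels > 0)              # keep nucleus and cytoplasm, drop background
        return self.classify(self.extract_features(cell_only))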
Preferably, the cell image segmentation unit 11 comprises an image conversion subunit, a noise removal subunit, a coarse segmentation subunit, a nucleus-center calibration subunit and an accurate segmentation subunit, specifically:
(1) the image conversion subunit converts the collected cell image into a gray-level image;
(2) the noise removal subunit performs denoising on the gray-level image, comprising:
for a pixel (x, y), choose its 3 × 3 neighborhood s_{x,y} and its (2n+1) × (2n+1) neighborhood l_{x,y}, where n is an integer greater than or equal to 2;
first judge whether the pixel is a boundary point: set a threshold t, t ∈ [13, 26], compute the gray-level difference between pixel (x, y) and each pixel in its neighborhood s_{x,y}, and compare the differences with the threshold t; if the number of differences greater than t is at least 6, pixel (x, y) is a boundary point, otherwise it is a non-boundary point;
if (x, y) is a boundary point, the following noise reduction is applied:
h(x,y) = \frac{1}{k} \sum_{\substack{(i,j)\in l_{x,y} \\ q(i,j)\in[q(x,y)-1.5\sigma,\ q(x,y)+1.5\sigma]}} q(i,j)
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(x, y) is the gray value of pixel (x, y) before noise reduction, σ is the standard deviation of the gray values in the neighborhood l_{x,y} of pixel (x, y), the condition q(i, j) ∈ [q(x, y) − 1.5σ, q(x, y) + 1.5σ] restricts the sum to the points of l_{x,y} whose gray values fall within that interval, and k is the number of such points;
if (x, y) is a non-boundary point, the following noise reduction is applied:
h(x,y) = \frac{\sum_{(i,j)\in l_{x,y}} w(i,j)\, q(i,j)}{\sum_{(i,j)\in l_{x,y}} w(i,j)}
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(i, j) is the gray value of the image at point (i, j), and w(i, j) is the Gaussian weight corresponding to point (i, j) in the neighborhood l_{x,y};
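A minimal Python/NumPy sketch of this edge-preserving denoising step is given below, assuming n = 2 (a 5 × 5 neighborhood l_{x,y}), t = 13 and an arbitrarily chosen Gaussian width; the function name and the untouched image border are illustrative choices, not part of the patent.

import numpy as np

def denoise(gray: np.ndarray, t: float = 13.0, n: int = 2) -> np.ndarray:
    """Trimmed mean at boundary points, Gaussian-weighted mean elsewhere (sketch)."""
    h, w = gray.shape
    out = gray.astype(np.float64).copy()
    r = n                                                    # radius of the (2n+1) x (2n+1) neighborhood l_{x,y}
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    gauss = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * 1.5 ** 2))  # Gaussian weights w(i, j); width 1.5 is an assumption
    for y in range(r, h - r):
        for x in range(r, w - r):
            s = gray[y - 1:y + 2, x - 1:x + 2].astype(np.float64)          # 3 x 3 neighborhood s_{x,y}
            l = gray[y - r:y + r + 1, x - r:x + r + 1].astype(np.float64)  # neighborhood l_{x,y}
            q = float(gray[y, x])
            if np.sum(np.abs(s - q) > t) >= 6:               # boundary point
                sigma = l.std()
                keep = np.abs(l - q) <= 1.5 * sigma          # gray values inside [q - 1.5 sigma, q + 1.5 sigma]
                out[y, x] = l[keep].mean()
            else:                                            # non-boundary point
                out[y, x] = np.sum(gauss * l) / np.sum(gauss)
    return out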
(3) the coarse segmentation subunit coarsely partitions the denoised cell image into background, cytoplasm and nucleus, specifically:
each pixel (x, y) is represented by a four-dimensional feature vector:
\vec{u}(x,y) = [\, h(x,y),\ h_{ave}(x,y),\ h_{med}(x,y),\ h_{sta}(x,y)\,]
where h(x, y) is the gray value of (x, y), h_{ave}(x, y) is the gray-level mean of its neighborhood s_{x,y}, h_{med}(x, y) is the gray-level median of its neighborhood s_{x,y}, and h_{sta}(x, y) is the gray-level variance of its neighborhood s_{x,y};
the pixels are divided into three classes (background, cytoplasm and nucleus) by k-means clustering;
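A sketch of this coarse segmentation in Python, using scikit-learn's KMeans on the four per-pixel features, is given below; the way the three clusters are mapped to background, cytoplasm and nucleus (by mean brightness, assuming the nucleus stains darkest) is an assumption for illustration.

import numpy as np
from scipy.ndimage import median_filter, uniform_filter
from sklearn.cluster import KMeans

def coarse_segment(gray: np.ndarray) -> np.ndarray:
    """k-means partition into background (0), cytoplasm (1) and nucleus (2) using
    the per-pixel features [h, h_ave, h_med, h_sta] (sketch)."""
    g = gray.astype(np.float64)
    h_ave = uniform_filter(g, size=3)                        # 3 x 3 gray-level mean
    h_med = median_filter(g, size=3)                         # 3 x 3 gray-level median
    h_sta = uniform_filter(g ** 2, size=3) - h_ave ** 2      # 3 x 3 gray-level variance
    feats = np.stack([g, h_ave, h_med, h_sta], axis=-1).reshape(-1, 4)
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(feats).reshape(g.shape)
    # Assumed mapping: order clusters by mean gray value, darkest = nucleus, brightest = background.
    order = np.argsort([g[labels == c].mean() for c in range(3)])
    lut = np.empty(3, dtype=int)
    lut[order[0]], lut[order[1]], lut[order[2]] = 2, 1, 0
    return lut[labels]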
(4) the nucleus-center calibration subunit calibrates the nucleus center:
the approximate nucleus region is obtained from the coarse segmentation subunit; assuming the nucleus region contains n points (x_1, y_1), …, (x_n, y_n), an intensity-weighted center and a geometric center are computed for this region, and their mean is taken as the nucleus center (x_z, y_z):
x_z = \frac{1}{2}\left(\frac{\sum_{i=1}^{n} x_i\, h(x_i,y_i)}{\sum_{i=1}^{n} h(x_i,y_i)} + \frac{1}{n}\sum_{i=1}^{n} x_i\right)
y_z = \frac{1}{2}\left(\frac{\sum_{i=1}^{n} y_i\, h(x_i,y_i)}{\sum_{i=1}^{n} h(x_i,y_i)} + \frac{1}{n}\sum_{i=1}^{n} y_i\right)
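The center calibration is a direct translation of the two formulas above; the sketch below assumes the label map produced by the coarse segmentation sketch (nucleus labelled 2) and the denoised gray image.

import numpy as np

def nucleus_center(gray: np.ndarray, labels: np.ndarray, nucleus_label: int = 2):
    """Average of the intensity-weighted centroid and the geometric centroid of
    the coarse nucleus region (sketch)."""
    ys, xs = np.nonzero(labels == nucleus_label)
    h = gray[ys, xs].astype(np.float64)
    x_weighted, y_weighted = np.sum(xs * h) / np.sum(h), np.sum(ys * h) / np.sum(h)
    x_geom, y_geom = xs.mean(), ys.mean()
    return 0.5 * (x_weighted + x_geom), 0.5 * (y_weighted + y_geom)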
(5) the accurate segmentation subunit accurately segments the nucleus and the cytoplasm;
a directed line segment is constructed from the nucleus center (x_z, y_z) to a nucleus or cytoplasm boundary point (x_p, y_p), with length dis_p = \lfloor \sqrt{(x_p - x_z)^2 + (y_p - y_z)^2} \rfloor, where \lfloor\cdot\rfloor denotes rounding down;
sampling along the line segment at unit length gives dis_p points (x_1, y_1), …, (x_{dis_p}, y_{dis_p}); if the coordinates of a sampling point are not integers, its gray value is obtained by linear interpolation of the surrounding pixels;
the gray-level difference at point (x_i, y_i) along the direction of the line segment is:
h_d(x_i, y_i) = h(x_{i-1}, y_{i-1}) - h(x_i, y_i)
a gray-difference suppression function is defined:
y(x) = \begin{cases} x, & x \le 0 \\ 0.5x, & x > 0 \end{cases}
the gradient gra(x_i, y_i) at point (x_i, y_i) along the direction of the line segment is:
gra(x_i, y_i) = \frac{\lvert y(h_d(x_i, y_i))\rvert + \lvert y(h_d(x_{i+1}, y_{i+1}))\rvert}{2}
the point with the maximum gradient is chosen as the accurate edge of the nucleus or the cytoplasm.
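One possible Python realization of this radial edge search is sketched below, assuming the nucleus center and one coarse boundary point are already known; the bilinear interpolation helper and the handling of very short rays are illustrative assumptions.

import numpy as np

def _bilinear(gray: np.ndarray, x: float, y: float) -> float:
    """Gray value at a non-integer coordinate by linear interpolation of the four neighbors."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, gray.shape[1] - 1), min(y0 + 1, gray.shape[0] - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * gray[y0, x0] + fx * gray[y0, x1]
    bot = (1 - fx) * gray[y1, x0] + fx * gray[y1, x1]
    return (1 - fy) * top + fy * bot

def refine_edge_point(gray: np.ndarray, center, boundary):
    """Precise edge on the ray from the nucleus center to one coarse boundary point,
    located at the maximum of the suppressed directional gradient (sketch)."""
    (xz, yz), (xp, yp) = center, boundary
    dis = int(np.floor(np.hypot(xp - xz, yp - yz)))          # floor of the Euclidean length
    if dis < 3:                                              # ray too short to evaluate a gradient
        return boundary
    ux, uy = (xp - xz) / dis, (yp - yz) / dis
    pts = [(xz + i * ux, yz + i * uy) for i in range(1, dis + 1)]   # unit-length samples
    vals = [_bilinear(gray, x, y) for x, y in pts]
    hd = [vals[i - 1] - vals[i] for i in range(1, len(vals))]       # directional gray differences h_d
    suppress = lambda v: v if v <= 0 else 0.5 * v                   # suppression function y(x)
    grad = [(abs(suppress(hd[i])) + abs(suppress(hd[i + 1]))) / 2.0
            for i in range(len(hd) - 1)]
    return pts[int(np.argmax(grad)) + 1]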
In this preferred embodiment, the noise removal subunit effectively combines the spatial proximity and the gray-level similarity of the center pixel and its neighborhood pixels in the noise reduction: in flat regions of the image, where neighboring gray values differ little, the gray values are filtered with a Gaussian weighted average, while in boundary regions with sharp changes an edge-preserving filter is used, which helps preserve the image edges. Extracting the coarse contours of the nucleus and the cytoplasm with k-means clustering effectively removes the interference of noise. The nucleus-center calibration subunit makes it easier to accurately locate the nucleus and cytoplasm contours afterwards. The accurate segmentation subunit makes full use of directional information, overcomes the interference of inflammatory cells with the edge map, and can accurately extract the edges of the nucleus and the cytoplasm.
Preferably, extracting the texture features of the cell image comprises:
(1) computing a composite gray-level co-occurrence matrix of the cell image based on an improved gray-level co-occurrence matrix method, the composite matrix reflecting the texture features of the cell in different directions:
let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90° and 135° be h(x, y, d, 0°), h(x, y, d, 45°), h(x, y, d, 90°) and h(x, y, d, 135°), with corresponding matrix elements x_1, x_2, x_3, x_4; the composite gray-level co-occurrence matrix is then computed as:
h(x, y, d) = w_1 h(x, y, d, 0°) + w_2 h(x, y, d, 45°) + w_3 h(x, y, d, 90°) + w_4 h(x, y, d, 135°)
and the elements of the composite gray-level co-occurrence matrix are:
x = \sum_{i=1}^{4} w_i x_i
where d is the distance, taking values in [2, 4], and w_i (i = 1, 2, 3, 4) are weight coefficients computed from the contrast parameter of the gray-level co-occurrence matrix in each of the four directions; if the contrast parameters of the four directional matrices are d_i, with mean \bar{d}, the weight coefficients w_i are computed as:
w_i = \frac{1/(\lvert d_i - \bar{d}\rvert + 1)}{\sum_{i=1}^{4} 1/(\lvert d_i - \bar{d}\rvert + 1)}
(2) the four required texture feature parameters are obtained from the composite gray-level co-occurrence matrix and its elements: contrast, sum of variances, energy and mean;
(3) the four texture feature parameters are normalized to obtain the final normalized texture feature values.
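A compact NumPy sketch of the contrast-weighted composite co-occurrence matrix and the four features follows; the gray-level quantization, the exact definitions of 'mean' and 'sum of variances', and the final normalization are assumptions, since the patent does not specify them.

import numpy as np

DIRS = {0: (0, 1), 45: (-1, 1), 90: (-1, 0), 135: (-1, -1)}           # (dy, dx) per direction

def glcm(q: np.ndarray, d: int, angle: int, levels: int) -> np.ndarray:
    """Normalized gray-level co-occurrence matrix for one direction and distance d."""
    dy, dx = DIRS[angle][0] * d, DIRS[angle][1] * d
    h, w = q.shape
    m = np.zeros((levels, levels))
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                m[q[y, x], q[y2, x2]] += 1
    return m / max(m.sum(), 1.0)

def texture_features(gray: np.ndarray, d: int = 2, levels: int = 16) -> np.ndarray:
    """Contrast-weighted composite GLCM, then contrast, sum of variances, energy
    and mean, normalized (sketch; feature definitions are assumptions)."""
    q = np.floor(gray.astype(np.float64) / 256.0 * levels).astype(int).clip(0, levels - 1)
    mats = [glcm(q, d, a, levels) for a in (0, 45, 90, 135)]
    i, j = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    contrasts = np.array([np.sum((i - j) ** 2 * m) for m in mats])    # directional contrast parameters d_i
    inv = 1.0 / (np.abs(contrasts - contrasts.mean()) + 1.0)
    wts = inv / inv.sum()                                             # weight coefficients w_i
    comp = sum(wi * m for wi, m in zip(wts, mats))                    # composite co-occurrence matrix
    contrast = np.sum((i - j) ** 2 * comp)
    mean = np.sum(i * comp)                                           # assumed: mean over the row index
    var_sum = np.sum(((i - mean) ** 2 + (j - mean) ** 2) * comp)      # assumed 'sum of variances'
    energy = np.sum(comp ** 2)
    feats = np.array([contrast, var_sum, energy, mean])
    return feats / max(np.linalg.norm(feats), 1e-12)                  # normalization (assumed L2)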
This preferred embodiment uses the improved gray-level co-occurrence matrix method, computing the composite gray-level co-occurrence matrix of the cell image with weight coefficients and then extracting the texture features of the cell in the four specified directions. It solves the problem that, because of external interference (for example the illumination angle during cell image acquisition or gas-flow disturbances), the texture feature parameters of a cell differ considerably between directions, and it improves the precision of cell image texture feature extraction. The selected four texture features (contrast, sum of variances, energy and mean) eliminate redundant and repeated feature parameters, and normalizing the four parameters facilitates the subsequent classification and identification of the cell image.
In this application scenario, with threshold t = 13 and d = 2, the relative image-denoising performance improves by 5% and the accuracy of cell-image feature extraction improves by 8%.
Application Scenario 2
Referring to Fig. 1 and Fig. 2, the biological information acquisition device of an embodiment of this application scenario, including all of the preferred embodiments described above (the mesh structure in the sample introduction part, the cell-shrinking-agent holding part, and the cell recognition module 1 with its segmentation, feature extraction and classification units), is configured identically to that of Application Scenario 1; only the parameter settings differ, as follows.
In this application scenario, with threshold t = 15 and d = 2, the relative image-denoising performance improves by 6% and the accuracy of cell-image feature extraction improves by 8%.
Application Scenario 3
Referring to Fig. 1 and Fig. 2, the biological information acquisition device of an embodiment of this application scenario, including all of the preferred embodiments described above, is configured identically to that of Application Scenario 1; only the parameter settings differ, as follows.
In this application scenario, with threshold t = 18 and d = 3, the relative image-denoising performance improves by 7% and the accuracy of cell-image feature extraction improves by 7%.
Application Scenario 4
Referring to Fig. 1 and Fig. 2, the biological information acquisition device of an embodiment of this application scenario, including all of the preferred embodiments described above, is configured identically to that of Application Scenario 1; only the parameter settings differ, as follows.
In this application scenario, with threshold t = 20 and d = 4, the relative image-denoising performance improves by 8% and the accuracy of cell-image feature extraction improves by 6%.
Application Scenario 5
Referring to Fig. 1, Fig. 2, a kind of biological information acquisition device of an embodiment of this application scene, including cell recognition Module and biosensor, described cell recognition module is used for determining biological species, described biosensor is by being dried porous Material is constituted, and above-mentioned biosensor is that have to import the sample lead-in portion of sample solution and to be configured with expansion said sample molten The developer layer of liquid, containing by launching said sample solution the drying regime holding mark eluting on above-mentioned developer layer The labelled reagent holding part of note reagent, and can be combined with analyte and not wash containing the reagent in order to participate in reacting The Immobilized reagents part carrying and being fixed on this developer layer, imports sample solution in said sample lead-in portion, by leaching Above-mentioned developer layer reaches labelled reagent holding part thoroughly, and it is solid to mentioned reagent that said sample solution elutes labelled reagent Determining partly moves, so that analyte and labelled reagent and immobilized reagent are reacted and are constituted,
By mentioned reagent immobilization partly middle measure above-mentioned labelled reagent binding capacity, will be in said sample solution The analyte containing carries out qualitative or quantitative biosensor.
Preferably, in said sample lead-in portion configuration mesh structure.
This preferred embodiment can increase contact area.
Preferably, in said sample lead-in portion or being labelling sample holding part not equal to being that sample imports On the position of surface, there is the cell shrinkage agent holding part of contractive cell composition.
The measurement of this preferred embodiment is more accurate.
Preferably, described cell recognition module 1 includes Methods of Segmentation On Cell Images unit 11, feature extraction unit 12, classification knowledge Other unit 13;Described Methods of Segmentation On Cell Images unit 11 is used for distinguishing the back of the body in the cell image being gathered by cell image acquisition module Scape, nucleus and Cytoplasm;Described feature extraction unit 12 is used for the textural characteristics of cell image are extracted;Described classification Recognition unit 13 is used for utilizing grader to realize to cell image Classification and Identification according to textural characteristics.
This preferred embodiment constructs the unit structure of cell recognition module 1.
Preferably, described Methods of Segmentation On Cell Images unit 11 includes image conversion subunit, noise remove subelement, coarse segmentation Subelement, nuclear centers demarcate subelement, Accurate Segmentation subelement, particularly as follows:
(1) image conversion subunit, for being converted into gray level image by the cell image of collection;
(2) noise remove subelement, for carrying out denoising to gray level image, comprising:
For pixel (x, y), choose its 3 × 3 neighborhood sx,y(2n+1) the neighborhood l of × (2n+1)x,y, n be more than Integer equal to 2;
Whether it is that boundary point judges first to pixel, given threshold t, t ∈ [13,26], calculate pixel (x, y) With its neighborhood sx,yIn each pixel gray scale difference value, and be compared with threshold value t, if gray scale difference value is more than the number of threshold value t More than or equal to 6, then pixel (x, y) is boundary point, and otherwise, pixel (x, y) is non-boundary point;
If (x, y) is boundary point, then carry out following noise reduction process:
h ( x , y ) = σ q ( i , j ) &element; [ q ( x , y ) - 1.5 σ , q ( x , y ) + 1.5 σ ] q ( i , j ) k
In formula, h (x, y) is the gray value of pixel (x, y) after noise reduction, and q (x, y) is the ash of noise reduction preceding pixel point (x, y) Angle value, σ is pixel (x, y) neighborhood lx,yInterior gray value mark is poor, and q (i, j) ∈ [q (x, y) -1.5 σ, q (x, y)+1.5 σ] represents Neighborhood lx,yInterior gray value falls within the point of interval [q (x, y) -1.5 σ, q (x, y)+1.5 σ], and k represents neighborhood lx,yInterior gray value falls within The quantity of the point of interval [q (x, y) -1.5 σ, q (x, y)+1.5 σ];
If (x, y) is non-boundary point, then carry out following noise reduction process:
h ( x , y ) = σ ( i , j ) &element; l x , y w ( i , j ) q ( i , j ) σ ( i , j ) &element; l x , y w ( i , j )
In formula, h (x, y) is the gray value of pixel (x, y) after noise reduction, the ash at q (i, j) representative image midpoint (i, j) place Angle value, w (i, j) is neighborhood lx,yThe corresponding Gauss weight of interior point (i, j);
(3) coarse segmentation subelement, for slightly being drawn to the background in the cell image after denoising, Cytoplasm, nucleus Point, particularly as follows:
Each pixel (x, y) is represented with four dimensional feature vectors:
u → ( x , y ) = [ h ( x , y ) , h a v e ( x , y ) , h m e d ( x , y ) , h s t a ( x , y ) ]
In formula, h (x, y) represents the gray value of (x, y), have(x, y) represents its neighborhood sx,yGray average, hmed(x, y) generation Table its neighborhood sx,yGray scale intermediate value, hsta(x, y) represents its neighborhood sx,yGray variance;
Background, Cytoplasm, nucleus three class are divided into using k-means clustering procedure;
(4) nuclear centers demarcate subelement, for demarcating to nuclear centers:
Nucleus approximate region is obtained by coarse segmentation subelement, if nuclear area comprises n point: (x1,y1),…,(xn, yn), this region is carried out with intensity-weighted demarcation and geometric center is demarcated, take its meansigma methods as nuclear centers (xz,yz):
x z = 1 2 ( σ i = 1 n x i h ( x i , y i ) σ i = 1 n h ( x i , y i ) + σ i = 1 n x i n )
y z = 1 2 ( σ i = 1 n y i h ( x i , y i ) σ i = 1 n h ( x i , y i ) + σ i = 1 n y i n )
(5) Accurate Segmentation subelement, for carrying out Accurate Segmentation to nucleus, Cytoplasm;
Build from nuclear centers (xz,yz) arrive nucleus and Cytoplasm boundary point (xp,yp) directed line segmentDistanceRepresent and round downwards;
Along line segment, sampling is carried out with unit length and can obtain dispIndividual pointIf adopting The coordinate of sampling point is not integer, and its gray value is obtained by surrounding pixel linear interpolation;
Point (xi,yi) place along line segment direction gray scale difference:
hd(xi,yi)=h (xi-1,yi-1)-h(xi,yi)
Definition gray scale difference inhibition function:
y ( x ) = x i f x ≤ 0 0.5 x i f x > 0
Point (xi,yi) place along line segment direction gradient gra (xi,yi):
g r a ( x i , y i ) = | y ( h d ( x i , y i ) ) | + | y ( h d ( x i + 1 , y i + ! ) ) | 2
Choose the maximum value point of gradient as nucleus and cytoplasmic precise edge.
This preferred embodiment arranges noise remove subelement, and effective integration center pixel is closed on the space of neighborhood territory pixel Property and grey similarity carrying out noise reduction process, flat site in the picture, in neighborhood, grey scale pixel value is more or less the same, and adopts Gaussian filter is weighted to gray value filtering, and is changing violent borderline region, row bound keeps filtering, is conducive to image The holding at edge;Nucleus and Cytoplasm coarse contour are extracted using k mean cluster, can effectively remove the interference of noise;Setting is thin Subelement is demarcated at karyon center, is easy to subsequently nucleus and Cytoplasm profile are accurately positioned;Accurate Segmentation subelement fills Divide and make use of directional information, overcome the interference to edge graph for the inflammatory cell, can accurately extract nucleus and Cytoplasm side Edge.
Preferably, the described textural characteristics to cell image extract, comprising:
(1) the Gray co-occurrence matrix of cell image, described comprehensive ash is asked for based on improved gray level co-occurrence matrixes method Degree co-occurrence matrix embodies cell textural characteristics in different directions:
Be located at 0 °, 45 °, 90 °, the gray level co-occurrence matrixes on 135 ° of four directions be respectively h (x, y, d, 0 °), h (x, y, d, 45 °), h (x, y, d, 90 °), h (x, y, d, 135 °), corresponding matrix element project is x1、x2、x3、x4, then Gray is common The computing formula of raw matrix is:
H(x, y, d) = w_1 h(x, y, d, 0°) + w_2 h(x, y, d, 45°) + w_3 h(x, y, d, 90°) + w_4 h(x, y, d, 135°)
The element item of the comprehensive gray co-occurrence matrix is:
x = \sum_{i=1}^{4} w_i x_i
In the formula, d represents the distance, with value range [2, 4]; w_i (i = 1, 2, 3, 4) are weight coefficients calculated from the contrast parameters corresponding to the gray level co-occurrence matrices in each of the four directions. If the contrast parameters corresponding to the gray level co-occurrence matrices in the four directions are d_i, with mean \bar{d}, the computing formula of the weight coefficient w_i is:
w_i = \frac{1 / \left( \left| d_i - \bar{d} \right| + 1 \right)}{\sum_{i=1}^{4} 1 / \left( \left| d_i - \bar{d} \right| + 1 \right)}
(2) the four required textural feature parameters are obtained using the comprehensive gray co-occurrence matrix and its element item: contrast, variance sum, energy and mean;
(3) the four textural feature parameters are normalized, finally yielding the normalized texture feature values.
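The sketch below covers steps (1)-(3) under the formulas above: per-direction co-occurrence matrices at distance d, contrast-based weights w_i, the comprehensive matrix, and the four texture parameters. It relies on scikit-image's `graycomatrix`; the exact form of the "variance sum" feature is not defined above, so the GLCM variance about the matrix mean is used as an assumption, and the normalization of step (3), typically min-max scaling over a set of images, is left to the caller.

```python
import numpy as np
from skimage.feature import graycomatrix

def comprehensive_glcm_features(gray_u8, d=2, levels=256):
    """Comprehensive gray co-occurrence matrix H(x, y, d) built from the four
    directional matrices weighted by their contrast parameters, plus four features."""
    angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]        # 0, 45, 90, 135 degrees
    glcm = graycomatrix(gray_u8, distances=[d], angles=angles,
                        levels=levels, symmetric=True, normed=True)
    mats = [glcm[:, :, 0, k] for k in range(4)]              # h(x, y, d, theta_i)

    i_idx, j_idx = np.indices((levels, levels))
    d_i = np.array([np.sum((i_idx - j_idx) ** 2 * m) for m in mats])  # per-direction contrast
    w = 1.0 / (np.abs(d_i - d_i.mean()) + 1.0)
    w = w / w.sum()                                          # weight coefficients w_i

    H = sum(wi * m for wi, m in zip(w, mats))                # comprehensive matrix H(x, y, d)

    mean = np.sum(i_idx * H)                                 # mean feature
    contrast = np.sum((i_idx - j_idx) ** 2 * H)              # contrast feature
    energy = np.sum(H ** 2)                                  # energy feature
    variance = np.sum((i_idx - mean) ** 2 * H)               # "variance sum" feature (assumed form)
    return np.array([contrast, variance, energy, mean])      # step (3) normalization applied later
```

For 8-bit images, quantizing to fewer gray levels (for example 16 or 32) before computing the matrices is a common way to keep the co-occurrence matrices well populated.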
Based on the improved gray level co-occurrence matrix method, this preferred embodiment computes the comprehensive gray co-occurrence matrix of the cell image by setting weight coefficients and then extracts the textural features of the cell in the four specified directions. This solves the problem that, owing to external interference (such as the lighting angle during cell image acquisition or gas flow disturbances), the textural feature parameter values of a cell differ considerably in different directions, and it improves the precision of cell image texture feature extraction. The chosen contrast, variance sum, energy and mean features eliminate redundant and repeated characteristic parameters, and normalizing these four textural feature parameters facilitates the subsequent classification and identification of the cell image.
In this application scenario, with threshold t = 26 and d = 2, the image denoising effect is relatively improved by 7.5%, and the accuracy of cell image feature extraction is improved by 8%.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present invention and not to limit its scope of protection. Although the present invention has been explained in detail with reference to preferred embodiments, those of ordinary skill in the art should understand that the technical solution of the present invention may be modified or equivalently substituted without departing from the essence and scope of the technical solution of the present invention.

Claims (3)

1. A biological information acquisition device, characterized in that it comprises a cell recognition module and a biosensor, the cell recognition module being used for determining the biological species and the biosensor being constituted by a dried porous material; the biosensor has a sample lead-in portion for importing a sample solution, a developer layer configured to spread said sample solution, a labelled reagent holding part containing a labelled reagent held in a dry state such that it can be eluted by said sample solution spreading on the developer layer, and a reagent immobilization part containing a reagent that participates in the reaction, can be combined with the analyte, does not elute and is fixed on the developer layer; a sample solution imported into said sample lead-in portion reaches the labelled reagent holding part by permeating the developer layer, elutes the labelled reagent and moves to said reagent immobilization part, so that the analyte reacts with the labelled reagent and the immobilized reagent,
and the biosensor qualitatively or quantitatively determines the analyte contained in said sample solution by measuring the binding amount of the labelled reagent in said reagent immobilization part.
2. The biological information acquisition device according to claim 1, characterized in that a network structure is configured in said sample lead-in portion.
3. The biological information acquisition device according to claim 2, characterized in that a cell-shrinking agent holding part, containing an agent for shrinking the cellular components, is provided in said sample lead-in portion or at a position on the sample lead-in portion side of the labelled reagent holding part.