CN106244420A - A kind of producing device of high-density biochip - Google Patents


Info

Publication number
CN106244420A
CN106244420A (application CN201610768011.3A)
Authority
CN
China
Prior art keywords
needle
gray
point
pixel
cell
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610768011.3A
Other languages
Chinese (zh)
Other versions
CN106244420B (en)
Inventor
Inventor not disclosed (不公告发明人)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Taizhou Longze Environmental Technology Co., Ltd.
Original Assignee
孟玲 (Meng Ling)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 孟玲 (Meng Ling)
Priority to CN201610768011.3A priority Critical patent/CN106244420B/en
Publication of CN106244420A publication Critical patent/CN106244420A/en
Application granted granted Critical
Publication of CN106244420B publication Critical patent/CN106244420B/en
Legal status: Active (granted); anticipated expiration tracked

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B01: PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
    • B01J: CHEMICAL OR PHYSICAL PROCESSES, e.g. CATALYSIS OR COLLOID CHEMISTRY; THEIR RELEVANT APPARATUS
    • B01J19/00: Chemical, physical or physico-chemical processes in general; their relevant apparatus
    • B01J19/0046: Sequential or parallel reactions, e.g. for the synthesis of polypeptides or polynucleotides; apparatus and devices for combinatorial chemistry or for making molecular arrays
    • B01J2219/00: Chemical, physical or physico-chemical processes in general; their relevant apparatus
    • B01J2219/00274: Sequential or parallel reactions; apparatus and devices for combinatorial chemistry or for making arrays; chemical library technology
    • B01J2219/00277: Apparatus

Landscapes

  • Chemical & Material Sciences (AREA)
  • Organic Chemistry (AREA)
  • Chemical Kinetics & Catalysis (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Measuring Or Testing Involving Enzymes Or Micro-Organisms (AREA)

Abstract

A production device for a high-density biochip, comprising a cell recognition module and a spotting needle for making the biochip. The cell recognition module is used to determine the biological species. The needle body of the spotting needle for making the biochip is a polygonal prism; the end face of the needle tip is flat, with a circular recess at its center; the lower end of the needle body has 2-6 axial clearance channels, which meet at the axis, their bottom ends lying above the top of the tip recess and communicating with the recess. The benefit of the invention is that high-density biochips can be made efficiently.

Description

A production device for a high-density biochip
Technical field
The present invention relates to the field of biology, and specifically to a production device for a high-density biochip.
Background technology
The chip spotting needle is the key component for making high-density gene chips; the quality, size and density of the spots on a gene chip depend to a large extent on the design and finish of the spotting needle. The main types in use are solid spotting needles and various needles based on the siphon principle. The solid spotting needle is the traditional approach: it picks up sample solution by surface adhesion at the needle tip and then deposits the liquid by touching the slide surface. Such a tip is economical and durable, but its drawbacks are that the total number of spots is small, the sampling volume is small, and each sampling allows only a single spot, so it is unsuited to large-scale rapid production. Moreover, since every spot requires a fresh sampling and the sampled volume varies from one sampling to the next, the uniformity and quality of the spot sizes suffer.
Summary of the invention
To solve the above problems, the present invention provides a production device for a high-density biochip.
The purpose of the present invention is achieved through the following technical solution:
A production device for a high-density biochip, comprising a cell recognition module and a spotting needle for making the biochip. The cell recognition module is used to determine the biological species. The needle body of the spotting needle for making the biochip is a polygonal prism; the end face of the needle tip is flat, with a circular recess at its center; the lower end of the needle body has 2-6 axial clearance channels, which meet at the axis, their bottom ends lying above the top of the tip recess and communicating with the recess.
The benefit of the invention is that high-density biochips can be made efficiently.
Brief description of the drawings
The invention is further described with reference to the accompanying drawings, but the embodiments in the drawings do not limit the invention in any way; for those of ordinary skill in the art, other drawings can be obtained from the following drawings without creative effort.
Fig. 1 is a schematic diagram of the spotting needle of the present invention;
Fig. 2 is a structural diagram of the cell recognition module.
Reference numerals:
Cell recognition module 1, cell image segmentation unit 11, feature extraction unit 12, classification and identification unit 13.
Detailed description of the invention
The invention is further described in conjunction with the following application scenarios.
Application scenario 1
Referring to Fig. 1 and Fig. 2, an embodiment of this application scenario provides a production device for a high-density biochip, comprising a cell recognition module and a spotting needle for making the biochip. The cell recognition module is used to determine the biological species. The needle body of the spotting needle for making the biochip is a polygonal prism; the end face of the needle tip is flat, with a circular recess at its center; the lower end of the needle body has 2-6 axial clearance channels, which meet at the axis, their bottom ends lying above the top of the tip recess and communicating with the recess.
Preferably, the tail-end sidewall of the spotting needle is provided with a polygonal-prism-shaped boss.
This preferred embodiment makes the spotting needle convenient to handle.
Preferably, the needle body is 1.5-5 cm long and the spotting needle 0.8-3 cm in diameter; the tip end face is 8-400 μm in diameter; the central recess is 5-350 μm in diameter and 0.6-3.8 mm deep; the clearance channels are 2-8 mm long and 15-300 μm wide.
This preferred embodiment is better suited to industrial production.
Preferably, the cell recognition module 1 comprises a cell image segmentation unit 11, a feature extraction unit 12 and a classification and identification unit 13. The cell image segmentation unit 11 distinguishes background, nucleus and cytoplasm in the cell image acquired by a cell image acquisition module; the feature extraction unit 12 extracts the texture features of the cell image; the classification and identification unit 13 uses a classifier to classify and identify cell images from the texture features.
This preferred embodiment defines the unit structure of the cell recognition module 1.
Preferably, the cell image segmentation unit 11 comprises an image conversion subunit, a noise removal subunit, a coarse segmentation subunit, a nucleus center calibration subunit and an accurate segmentation subunit, specifically:
(1) The image conversion subunit converts the acquired cell image into a gray-level image;
(2) The noise removal subunit denoises the gray-level image as follows:
For a pixel (x, y), take its 3×3 neighborhood S_{x,y} and its (2N+1)×(2N+1) neighborhood L_{x,y}, where N is an integer greater than or equal to 2.
First judge whether the pixel is a boundary point: set a threshold T, T ∈ [13, 26], compute the gray difference between pixel (x, y) and each pixel in its neighborhood S_{x,y}, and compare each difference with T. If the number of gray differences exceeding T is at least 6, pixel (x, y) is a boundary point; otherwise it is a non-boundary point.
If (x, y) is a boundary point, the following noise reduction is applied:
$$h(x,y)=\frac{1}{k}\sum_{q(i,j)\in[q(x,y)-1.5\sigma,\;q(x,y)+1.5\sigma]} q(i,j)$$
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(x, y) is its gray value before noise reduction, σ is the standard deviation of the gray values in the neighborhood L_{x,y}, q(i, j) ∈ [q(x,y) − 1.5σ, q(x,y) + 1.5σ] denotes the points of L_{x,y} whose gray values fall in that interval, and k is the number of such points.
If (x, y) is a non-boundary point, the following noise reduction is applied:
$$h(x,y)=\frac{\sum_{(i,j)\in L_{x,y}} w(i,j)\,q(i,j)}{\sum_{(i,j)\in L_{x,y}} w(i,j)}$$
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(i, j) is the gray value at point (i, j) of the image, and w(i, j) is the Gaussian weight of point (i, j) within L_{x,y}.
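The two-branch denoising rule above can be sketched in NumPy as follows. The function and parameter names are illustrative, and the spread of the Gaussian weights over L_{x,y} is an assumption, since the text does not fix it:

```python
import numpy as np

def denoise(img, T=13, N=2):
    """Edge-preserving denoising sketch: range-restricted mean on boundary
    pixels, Gaussian-weighted mean on non-boundary pixels."""
    H, W = img.shape
    out = img.astype(np.float64).copy()
    # Gaussian weights w(i, j) over the (2N+1)x(2N+1) neighborhood L
    ax = np.arange(-N, N + 1)
    gx, gy = np.meshgrid(ax, ax)
    w = np.exp(-(gx**2 + gy**2) / (2.0 * N**2))
    for x in range(N, H - N):
        for y in range(N, W - N):
            q = float(img[x, y])
            S = img[x-1:x+2, y-1:y+2].astype(np.float64)      # 3x3 S_{x,y}
            L = img[x-N:x+N+1, y-N:y+N+1].astype(np.float64)  # L_{x,y}
            # boundary test: at least 6 of the 3x3 grays differ by more than T
            if np.sum(np.abs(S - q) > T) >= 6:
                sigma = L.std()
                mask = np.abs(L - q) <= 1.5 * sigma
                out[x, y] = L[mask].mean()   # mean of the k in-range grays
            else:
                out[x, y] = np.sum(w * L) / np.sum(w)  # Gaussian-weighted mean
    return out
```

On a flat region the non-boundary branch reproduces the input gray value, which is the edge-preserving behavior the embodiment describes.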
(3) The coarse segmentation subunit roughly partitions the denoised cell image into background, cytoplasm and nucleus, specifically:
Each pixel (x, y) is represented by a four-dimensional feature vector:
$$\vec{u}(x,y)=\left[h(x,y),\;h_{ave}(x,y),\;h_{med}(x,y),\;h_{sta}(x,y)\right]$$
where h(x, y) is the gray value of (x, y), h_{ave}(x, y) is the gray mean of its neighborhood S_{x,y}, h_{med}(x, y) is the gray median of S_{x,y}, and h_{sta}(x, y) is the gray variance of S_{x,y}.
The K-means clustering method is then used to divide the pixels into three classes: background, cytoplasm and nucleus.
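A minimal sketch of this coarse segmentation step, under the assumption of a plain NumPy k-means with k = 3 (the text does not specify the initialisation or iteration count; all names are illustrative):

```python
import numpy as np

def coarse_segment(img, n_iter=20, seed=0):
    """Coarse 3-class segmentation sketch: per-pixel 4-D feature vector
    [gray, 3x3 mean, 3x3 median, 3x3 variance] clustered with k-means
    into background / cytoplasm / nucleus."""
    H, W = img.shape
    pad = np.pad(img.astype(float), 1, mode='edge')
    f = np.zeros((H, W, 4))
    for x in range(H):
        for y in range(W):
            S = pad[x:x+3, y:y+3]                  # 3x3 neighborhood S_{x,y}
            f[x, y] = [img[x, y], S.mean(), np.median(S), S.var()]
    X = f.reshape(-1, 4)
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), 3, replace=False)]
    for _ in range(n_iter):
        # assign each pixel to the nearest of the 3 centers, then update
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for k in range(3):
            if np.any(labels == k):
                centers[k] = X[labels == k].mean(axis=0)
    return labels.reshape(H, W)
```

Which label index corresponds to nucleus, cytoplasm or background would in practice be decided afterwards, e.g. by mean gray level of each cluster.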
(4) The nucleus center calibration subunit calibrates the nucleus center:
The approximate nucleus region is obtained from the coarse segmentation subunit. Suppose the nucleus region contains n points (x_1, y_1), …, (x_n, y_n). Both an intensity-weighted calibration and a geometric-center calibration are performed on this region, and their average is taken as the nucleus center (x_z, y_z):
$$x_z=\frac{1}{2}\left(\frac{\sum_{i=1}^{n}x_i\,h(x_i,y_i)}{\sum_{i=1}^{n}h(x_i,y_i)}+\frac{\sum_{i=1}^{n}x_i}{n}\right)$$

$$y_z=\frac{1}{2}\left(\frac{\sum_{i=1}^{n}y_i\,h(x_i,y_i)}{\sum_{i=1}^{n}h(x_i,y_i)}+\frac{\sum_{i=1}^{n}y_i}{n}\right)$$
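The two centroid calibrations and their average can be written directly from the formulas above; this helper is a sketch with illustrative names:

```python
import numpy as np

def nucleus_center(points, grays):
    """Center calibration sketch: average of the intensity-weighted
    centroid and the geometric centroid of the nucleus region."""
    pts = np.asarray(points, dtype=float)   # n x 2 array of (x_i, y_i)
    g = np.asarray(grays, dtype=float)      # gray value h(x_i, y_i) per point
    weighted = (pts * g[:, None]).sum(axis=0) / g.sum()   # intensity-weighted
    geometric = pts.mean(axis=0)                          # geometric center
    return 0.5 * (weighted + geometric)
```

With uniform gray values the two centroids coincide, and the result reduces to the plain geometric center.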
(5) The accurate segmentation subunit accurately segments the nucleus and the cytoplasm:
Construct the directed line segment from the nucleus center (x_z, y_z) to a nucleus/cytoplasm boundary point (x_p, y_p); its length is

$$dis_p=\left\lfloor\sqrt{(x_p-x_z)^2+(y_p-y_z)^2}\right\rfloor$$

where $\lfloor\cdot\rfloor$ denotes rounding down. Sampling along the segment at unit length yields $dis_p$ points $(x_1,y_1),\dots,(x_{dis_p},y_{dis_p})$; if a sample point's coordinates are not integers, its gray value is obtained by linear interpolation from the surrounding pixels.
The gray difference at point (x_i, y_i) along the segment direction is:

$$h_d(x_i,y_i)=h(x_{i-1},y_{i-1})-h(x_i,y_i)$$
Define the gray-difference suppression function:
$$Y(x)=\begin{cases}x, & x\le 0\\ 0.5\,x, & x>0\end{cases}$$
The gradient gra(x_i, y_i) at point (x_i, y_i) along the segment direction is:
$$gra(x_i,y_i)=\frac{\left|Y\left(h_d(x_i,y_i)\right)\right|+\left|Y\left(h_d(x_{i+1},y_{i+1})\right)\right|}{2}$$
The maximum points of this gradient are chosen as the precise edges of the nucleus and the cytoplasm.
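Along a single sampled ray, the gray difference, the suppression function Y and the gradient above reduce to a few lines. This sketch assumes the ray has already been sampled to gray values; the returned index is the sample taken as the edge:

```python
import numpy as np

def edge_along_ray(gray_samples):
    """Edge localisation sketch along one ray from the nucleus center:
    consecutive gray differences h_d, asymmetric suppression Y (positive
    differences halved), then the sample index where the mean of adjacent
    suppressed differences is largest is taken as the edge."""
    h = np.asarray(gray_samples, dtype=float)
    hd = h[:-1] - h[1:]                     # h(x_{i-1}, y_{i-1}) - h(x_i, y_i)
    Y = np.where(hd <= 0, hd, 0.5 * hd)     # gray-difference suppression
    gra = (np.abs(Y[:-1]) + np.abs(Y[1:])) / 2.0
    return int(np.argmax(gra)) + 1          # index of the edge sample
```

Halving the positive differences biases the detector toward dark-to-bright transitions, which matches the suppression function's asymmetry.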
This preferred embodiment provides a noise removal subunit that effectively combines the spatial proximity and gray similarity between the center pixel and its neighborhood pixels for noise reduction. In flat image regions, where neighboring gray values differ little, a Gaussian filter performs weighted filtering of the gray values; in sharply changing regions, boundary-preserving filtering is applied, which helps preserve image edges. K-means clustering extracts the coarse contours of nucleus and cytoplasm and effectively removes noise interference. The nucleus center calibration subunit facilitates the subsequent accurate localization of nucleus and cytoplasm contours. The accurate segmentation subunit makes full use of directional information, overcomes the interference of inflammatory cells with the edge map, and can accurately extract the edges of nucleus and cytoplasm.
Preferably, the texture features of the cell image are extracted as follows:
(1) A composite gray-level co-occurrence matrix of the cell image is computed by an improved gray-level co-occurrence matrix method; the composite matrix reflects the texture features of the cell in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90° and 135° be h(x, y, d, 0°), h(x, y, d, 45°), h(x, y, d, 90°) and h(x, y, d, 135°), with corresponding matrix element numbers X_1, X_2, X_3 and X_4. The composite gray-level co-occurrence matrix is then computed as:
$$H(x,y,d)=w_1\,h(x,y,d,0°)+w_2\,h(x,y,d,45°)+w_3\,h(x,y,d,90°)+w_4\,h(x,y,d,135°)$$
and its element number is:
$$X=\sum_{i=1}^{4}w_i X_i$$
where d is the distance, with value range [2, 4], and the w_i (i = 1, 2, 3, 4) are weight coefficients computed from the contrast parameter of the gray-level co-occurrence matrix in each of the four directions. If the contrast parameters of the four directional matrices are D_i, with mean D̄ (i = 1, 2, 3, 4), the weight coefficients w_i are computed as:
$$w_i=\frac{1}{|D_i-\bar{D}|+1}\Bigg/\sum_{i=1}^{4}\frac{1}{|D_i-\bar{D}|+1}$$
(2) The composite gray-level co-occurrence matrix and its element number are used to obtain the four required texture feature parameters: contrast, variance sum, energy and mean;
(3) The four texture feature parameters are normalized to obtain the final normalized texture feature values.
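Steps (1)-(3) can be sketched as follows. The quantisation to 16 gray levels, the pixel offsets used for the four directions, and min-max normalisation over the four features are assumptions not fixed by the text, and the patent's "variance sum" feature is approximated here by a plain variance about the matrix mean:

```python
import numpy as np

def composite_glcm_features(img, d=2, levels=16):
    """Composite-GLCM sketch: GLCMs at 0/45/90/135 degrees, combined with
    weights w_i = (1/(|D_i - mean(D)|+1)) / sum_j(1/(|D_j - mean(D)|+1))
    derived from each direction's contrast D_i, then four texture features
    (contrast, variance stand-in, energy, mean), min-max normalised."""
    q = img.astype(int) * levels // 256            # quantise gray levels
    offsets = [(0, 1), (-1, 1), (-1, 0), (-1, -1)]  # 0, 45, 90, 135 degrees
    H, W = q.shape
    glcms, contrasts = [], []
    for dx, dy in offsets:
        m = np.zeros((levels, levels))
        for x in range(H):
            for y in range(W):
                x2, y2 = x + dx * d, y + dy * d
                if 0 <= x2 < H and 0 <= y2 < W:
                    m[q[x, y], q[x2, y2]] += 1
        m /= max(m.sum(), 1)                        # normalise to probabilities
        i, j = np.indices(m.shape)
        glcms.append(m)
        contrasts.append(((i - j) ** 2 * m).sum())  # contrast parameter D_i
    D = np.array(contrasts)
    w = 1.0 / (np.abs(D - D.mean()) + 1.0)
    w /= w.sum()                                    # weight coefficients w_i
    M = sum(wi * m for wi, m in zip(w, glcms))      # composite GLCM H(x, y, d)
    i, j = np.indices(M.shape)
    mean = (i * M).sum()
    feats = np.array([((i - j) ** 2 * M).sum(),     # contrast
                      (((i - mean) ** 2) * M).sum(),  # variance (sum-variance stand-in)
                      (M ** 2).sum(),               # energy
                      mean])                        # mean
    span = feats.max() - feats.min()
    return (feats - feats.min()) / (span if span else 1.0)
```

Directions whose contrast lies close to the four-direction mean receive the largest weight, which is how the scheme damps direction-dependent interference.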
Based on the improved gray-level co-occurrence matrix method, this preferred embodiment uses weight coefficients to compute the composite gray-level co-occurrence matrix of the cell image and then extracts cell texture features in the four specified directions. This solves the problem that external interference (e.g. the influence of the illumination angle during cell image acquisition, or disturbance from gas flow) makes the texture feature values of a cell differ markedly between directions, and it improves the accuracy of cell image texture feature extraction. The chosen contrast, variance sum, energy and mean features eliminate redundant and repeated feature parameters, and normalizing the four parameters simplifies the subsequent classification and identification of cell images.
In this application scenario, with threshold T = 13 and d = 2, the image denoising effect improves by a relative 5%, and the extraction accuracy of cell image features improves by 8%.
Application scenario 2
Referring to Fig. 1 and Fig. 2, an embodiment of this application scenario uses the same device structure and cell recognition pipeline as application scenario 1; the description is identical and is not repeated here.
In this application scenario, with threshold T = 15 and d = 2, the image denoising effect improves by a relative 6%, and the extraction accuracy of cell image features improves by 8%.
Application scenario 3
See Fig. 1, Fig. 2, the producing device of a kind of high-density biochip of an embodiment of this application scene, including Cell recognition module and for making the spotting needle of biochip, described cell recognition module is used for determining biological species, described Being that many arris lean on body for making the needle body of the spotting needle of biochip, the end face of needle point is plane, and center is shaped with circular depressed, Needle body lower end axially has 2-6 clearance channel, and each clearance channel communicates in axle center, and the top that bottom land end caves in needle point, with recessed Fall into and communicate.
Preferably, the tail end sidewall of spotting needle is provided with the ledge torr of Polygonal column shape.
This preferred embodiment is easy to the operation of spotting needle.
Preferably, a length of 1.5-5cm of needle body, diameter 0.8-3cm of spotting needle;A diameter of 8-400 μm of needle point end face, A diameter of 5-350 μm of central concave, cup depth is 0.6-3.8mm;The a length of 2-8mm of clearance channel, width is 15-300 μ m。
This preferred embodiment is more suitable for commercial production.
Preferably, described cell recognition module 1 includes that Methods of Segmentation On Cell Images unit 11, feature extraction unit 12, classification are known Other unit 13;Described Methods of Segmentation On Cell Images unit 11 is for distinguishing the back of the body in the cell image gathered by cell image acquisition module Scape, nucleus and Cytoplasm;Described feature extraction unit 12 is for extracting the textural characteristics of cell image;Described classification Recognition unit 13 is for utilizing grader to realize cell image Classification and Identification according to textural characteristics.
This preferred embodiment constructs the unit structure of cell recognition module 1.
Preferably, described Methods of Segmentation On Cell Images unit 11 includes that image changes subelement, noise remove subelement, coarse segmentation Subelement, nuclear centers demarcate subelement, Accurate Segmentation subelement, particularly as follows:
(1) image conversion subelement, for being converted into gray level image by the cell image of collection;
(2) noise remove subelement, for gray level image is carried out denoising, including:
For pixel, (x y), chooses its neighborhood S of 3 × 3x,y(2N+1) the neighborhood L of × (2N+1)x,y, N is for being more than Integer equal to 2;
First whether be that boundary point judges to pixel, set threshold value T, T ∈ [13,26], calculate pixel (x, y) With its neighborhood Sx,yIn the gray scale difference value of each pixel, and compare with threshold value T, if gray scale difference value is more than the number of threshold value T More than or equal to 6, then (x, y) is boundary point to pixel, and otherwise, (x y) is non-boundary point to pixel;
If (x, y) is boundary point, then carry out following noise reduction process:
h ( x , y ) = Σ q ( i , j ) ∈ [ q ( x , y ) - 1.5 σ , q ( x , y ) + 1.5 σ ] q ( i , j ) k
In formula, h (x, y) be after noise reduction pixel ((x y) is noise reduction preceding pixel point (x, ash y) to q for x, gray value y) Angle value, σ is pixel (x, y) neighborhood Lx,yInterior gray value mark is poor, q (i, j) ∈ [q (and x, y)-1.5 σ, q (x, y)+1.5 σ] represent Neighborhood Lx,yInterior gray value fall within interval [q (and x, y)-1.5 σ, q (x, y)+1.5 σ] point, k represents neighborhood Lx,yInterior gray value falls within Interval
[q (and x, y)-1.5 σ, q (x, y)+1.5 σ] the quantity of point;
If (x, y) is non-boundary point, then carry out following noise reduction process:
h ( x , y ) = Σ ( i , j ) ∈ L x , y w ( i , j ) q ( i , j ) Σ ( i , j ) ∈ L x , y w ( i , j )
In formula, (x y) is pixel (x, gray value y), q (i, j) representative image midpoint (i, j) ash at place after noise reduction to h Angle value, (i j) is neighborhood L to wx,yInterior point (i, j) corresponding Gauss weight;
(3) coarse segmentation subelement, for slightly drawing the background in the cell image after denoising, Cytoplasm, nucleus Point, particularly as follows:
By each pixel (x, y) represents with four dimensional feature vectors:
u → ( x , y ) = [ h ( x , y ) , h a v e ( x , y ) , h m e d ( x , y ) , h s t a ( x , y ) ]
In formula, (x y) represents (x, gray value y), h to have(x y) represents its neighborhood Sx,yGray average, hmed(x, y) generation Table its neighborhood Sx,yGray scale intermediate value, hsta(x y) represents its neighborhood Sx,yGray variance;
K-means clustering procedure is used to be divided into background, Cytoplasm, nucleus three class;
(4) nuclear centers demarcates subelement, for demarcating nuclear centers:
Nucleus approximate region is obtained, if nuclear area comprises n point: (x by coarse segmentation subelement1,y1),…,(xn, yn), this region is carried out intensity-weighted demarcation and geometric center is demarcated, take its meansigma methods as nuclear centers (xz,yz):
x z = 1 2 ( Σ i = 1 n x i h ( x i , y i ) Σ i = 1 n h ( x i , y i ) + Σ i = 1 n x i n )
y z = 1 2 ( Σ i = 1 n y i h ( x i , y i ) Σ i = 1 n h ( x i , y i ) + Σ i = 1 n y i n )
(5) The accurate segmentation subelement precisely segments the nucleus and the cytoplasm:
Construct the directed line segment from the nucleus center (x_z, y_z) to a boundary point (x_p, y_p) of the nucleus and cytoplasm region; its length is dis_p = ⌊((x_p − x_z)² + (y_p − y_z)²)^{1/2}⌋, where ⌊·⌋ denotes rounding down;
Sampling along the segment at unit length yields dis_p points (x_1, y_1), …, (x_{dis_p}, y_{dis_p}); if the coordinates of a sample point are not integers, its gray value is obtained by linear interpolation of the surrounding pixels;
The gray difference at point (x_i, y_i) along the segment direction is:
h_d(x_i, y_i) = h(x_{i−1}, y_{i−1}) − h(x_i, y_i)
Define the gray-difference suppression function:
Y(x) = x if x ≤ 0; 0.5x if x > 0
The gradient gra(x_i, y_i) at point (x_i, y_i) along the segment direction is:
gra(x_i, y_i) = (|Y(h_d(x_i, y_i))| + |Y(h_d(x_{i+1}, y_{i+1}))|) / 2
The maximum-gradient point is chosen as the precise edge between nucleus and cytoplasm.
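The edge-localization rule described above (forward gray differences, the suppression function Y, and the two-term averaged gradient) can be sketched in a few lines. The following is an illustrative NumPy version, not part of the patent text; it operates on a single ray of gray values already sampled at unit steps, and the function name is chosen here for illustration:

```python
import numpy as np

def edge_along_ray(profile):
    """Locate the nucleus/cytoplasm edge along one ray of gray values
    sampled at unit steps outward from the nucleus center.
    Steps follow the text: h_d(i) = h(i-1) - h(i); Y halves positive
    differences to damp interference; gra averages two adjacent |Y|
    values; the maximum-gradient sample is taken as the edge."""
    p = np.asarray(profile, dtype=float)
    hd = p[:-1] - p[1:]                  # hd[j] is h_d at sample j + 1
    Y = np.where(hd <= 0, hd, 0.5 * hd)  # gray-difference suppression function
    gra = (np.abs(Y[:-1]) + np.abs(Y[1:])) / 2.0
    return int(gra.argmax()) + 1         # index of the edge sample on the ray
```

For a profile that is flat and then steps sharply, the returned index sits at the step, as the maximum-gradient rule intends.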
This preferred embodiment provides the noise removal subelement, which effectively combines the spatial proximity and gray-level similarity between the center pixel and its neighborhood pixels for noise reduction: in flat regions of the image, where neighboring gray values differ little, a Gaussian filter applies weighted filtering to the gray values; in sharply varying boundary regions, edge-preserving filtering is performed, which helps retain image edges. K-means clustering extracts the coarse contours of nucleus and cytoplasm and effectively removes noise interference. The nuclear-center calibration subelement facilitates subsequent accurate localization of the nucleus and cytoplasm contours. The accurate segmentation subelement makes full use of directional information, overcomes the interference of inflammatory cells on the edge map, and can accurately extract the nucleus and cytoplasm edges.
Preferably, extracting the texture features of the cell image includes:
(1) Computing a comprehensive gray-level co-occurrence matrix of the cell image based on an improved gray-level co-occurrence matrix method; the comprehensive matrix reflects the texture features of the cell in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90°, and 135° be h(x, y, d, 0°), h(x, y, d, 45°), h(x, y, d, 90°), and h(x, y, d, 135°), with corresponding matrix element items X_1, X_2, X_3, X_4. The comprehensive gray-level co-occurrence matrix is computed as:
H(x, y, d) = w_1 h(x, y, d, 0°) + w_2 h(x, y, d, 45°) + w_3 h(x, y, d, 90°) + w_4 h(x, y, d, 135°)
and its element item is:
X = Σ_{i=1}^{4} w_i X_i
where d is the distance, with value range [2, 4], and w_i (i = 1, 2, 3, 4) are weight coefficients computed from the contrast parameter of the co-occurrence matrix in each of the four directions. Let the contrast parameters of the four directional matrices be D_i, with mean D̄ (i = 1, 2, 3, 4); the weight coefficients w_i are then:
w_i = (1 / (|D_i − D̄| + 1)) / Σ_{i=1}^{4} (1 / (|D_i − D̄| + 1))
(2) Using the comprehensive gray-level co-occurrence matrix and its element item to obtain the four required texture feature parameters: contrast, sum of variance, energy, and mean;
(3) Normalizing the four texture feature parameters to obtain the final normalized texture feature values.
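The contrast-based weighting and fusion of the four directional matrices can be sketched as follows. This NumPy fragment is illustrative only and not part of the patent text; it assumes the four directional GLCMs and their contrast parameters D_i have already been computed elsewhere (e.g. by a standard GLCM routine):

```python
import numpy as np

def direction_weights(contrasts):
    """Weight coefficients w_i = (1/(|D_i - Dbar| + 1)) normalized to
    sum to 1, as in the patent's formula. Directions whose contrast is
    closest to the four-direction mean receive the largest weight."""
    D = np.asarray(contrasts, dtype=float)
    raw = 1.0 / (np.abs(D - D.mean()) + 1.0)
    return raw / raw.sum()

def fuse_glcms(glcms, contrasts):
    """Comprehensive GLCM H = sum_i w_i * h_i over the four directional
    co-occurrence matrices (0, 45, 90, 135 degrees)."""
    w = direction_weights(contrasts)
    return np.tensordot(w, np.asarray(glcms, dtype=float), axes=1)
```

With equal contrast in all four directions the weights reduce to 0.25 each, so the fusion degenerates to a plain average, which matches the intent: only directions whose contrast deviates from the mean are down-weighted.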
Based on the improved gray-level co-occurrence matrix method, this preferred embodiment computes the comprehensive gray-level co-occurrence matrix of the cell image with weight coefficients and then extracts the cell texture features in the four specified directions. This solves the problem that external interference (such as the lighting angle during image acquisition or the flow of gas) makes the texture feature parameters of a cell differ considerably between directions, and improves the precision of cell image texture feature extraction. The four selected texture features (contrast, sum of variance, energy, and mean) eliminate redundant and repeated feature parameters, and normalizing them facilitates the subsequent classification and recognition of cell images.
In this application scenario, with threshold T = 18 and d = 3, the image denoising effect improves by 7% and the extraction accuracy of cell image features improves by 7%.
Application scenario 4
Referring to Fig. 1 and Fig. 2, the production device for a high-density biochip of one embodiment of this application scenario comprises a cell recognition module and a spotting needle for making the biochip. The cell recognition module determines the biological species. The needle body of the spotting needle for making the biochip is a multi-edged body; the end face of the needle tip is flat, with a circular recess formed at its center; the lower end of the needle body has 2-6 axial clearance channels, which meet at the axis, and the bottom end of each channel reaches the top of the tip recess and communicates with it.
Preferably, a polygonal-column-shaped ledge is provided on the tail-end sidewall of the spotting needle.
This preferred embodiment facilitates handling of the spotting needle.
Preferably, the needle body is 1.5-5 cm long and the spotting needle 0.8-3 cm in diameter; the needle-tip end face is 8-400 μm in diameter; the central recess is 5-350 μm in diameter and 0.6-3.8 mm deep; the clearance channels are 2-8 mm long and 15-300 μm wide.
This preferred embodiment is better suited to industrial production.
Preferably, the cell recognition module 1 includes a cell image segmentation unit 11, a feature extraction unit 12, and a classification and recognition unit 13. The cell image segmentation unit 11 distinguishes background, nucleus, and cytoplasm in the cell image acquired by the cell image acquisition module; the feature extraction unit 12 extracts the texture features of the cell image; the classification and recognition unit 13 uses a classifier to classify and recognize the cell image according to the texture features.
This preferred embodiment sets out the unit structure of the cell recognition module 1.
Preferably, the cell image segmentation unit 11 includes an image conversion subelement, a noise removal subelement, a coarse segmentation subelement, a nuclear-center calibration subelement, and an accurate segmentation subelement, specifically:
(1) The image conversion subelement converts the acquired cell image into a gray-level image;
(2) The noise removal subelement denoises the gray-level image, as follows:
For a pixel (x, y), choose its 3 × 3 neighborhood S_{x,y} and its (2N+1) × (2N+1) neighborhood L_{x,y}, where N is an integer greater than or equal to 2;
First determine whether the pixel is a boundary point: set a threshold T, T ∈ [13, 26]; compute the gray difference between pixel (x, y) and each pixel in its neighborhood S_{x,y}, and compare the differences with T. If the number of gray differences greater than T is greater than or equal to 6, pixel (x, y) is a boundary point; otherwise, it is a non-boundary point;
If (x, y) is a boundary point, apply the following noise reduction:
h(x, y) = (1/k) Σ_{q(i,j) ∈ [q(x,y) − 1.5σ, q(x,y) + 1.5σ]} q(i, j)
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(x, y) is the gray value of (x, y) before noise reduction, σ is the standard deviation of the gray values within the neighborhood L_{x,y} of (x, y), q(i, j) ∈ [q(x,y) − 1.5σ, q(x,y) + 1.5σ] denotes the points in L_{x,y} whose gray values fall within the interval [q(x,y) − 1.5σ, q(x,y) + 1.5σ], and k is the number of such points;
If (x, y) is a non-boundary point, apply the following noise reduction:
h(x, y) = Σ_{(i,j) ∈ L_{x,y}} w(i, j) q(i, j) / Σ_{(i,j) ∈ L_{x,y}} w(i, j)
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(i, j) is the gray value of point (i, j) in the image, and w(i, j) is the Gaussian weight of point (i, j) within the neighborhood L_{x,y};
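The two-branch denoising rule above (trimmed mean on boundary points, Gaussian-weighted mean elsewhere) can be sketched per pixel as follows. This NumPy fragment is an illustrative sketch, not part of the patent text; the Gaussian weight spread is an assumption, since the patent does not specify it:

```python
import numpy as np

def denoise_pixel(img, x, y, N=2, T=18):
    """Edge-aware denoising of one interior pixel, per the scheme above.
    The 3x3 neighborhood S decides boundary-ness; the (2N+1)x(2N+1)
    neighborhood L supplies the averaging support. T in [13, 26] and
    N >= 2 are the patent's parameter ranges."""
    S = img[x - 1:x + 2, y - 1:y + 2].astype(float)
    L = img[x - N:x + N + 1, y - N:y + N + 1].astype(float)
    q = float(img[x, y])
    # Boundary point: at least 6 of the 3x3 gray differences exceed T
    if np.sum(np.abs(S - q) > T) >= 6:
        sigma = L.std()
        sel = L[(L >= q - 1.5 * sigma) & (L <= q + 1.5 * sigma)]
        return float(sel.mean())       # trimmed mean over [q-1.5s, q+1.5s]
    # Non-boundary: Gaussian-weighted mean over L (spread chosen as N here)
    ax = np.arange(-N, N + 1)
    gx, gy = np.meshgrid(ax, ax, indexing="ij")
    w = np.exp(-(gx**2 + gy**2) / (2.0 * N**2))
    return float((w * L).sum() / w.sum())
```

On a flat patch the pixel passes through unchanged, while on a one-pixel-wide bright line the boundary branch averages only gray values near q, so the line is preserved rather than blurred.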
(3) The coarse segmentation subelement roughly partitions the denoised cell image into background, cytoplasm, and nucleus, specifically:
Each pixel (x, y) is represented by a four-dimensional feature vector:
u(x, y) = [h(x, y), h_ave(x, y), h_med(x, y), h_sta(x, y)]
where h(x, y) is the gray value of (x, y), and h_ave(x, y), h_med(x, y), h_sta(x, y) are the gray mean, gray median, and gray variance of its neighborhood S_{x,y}, respectively;
K-means clustering is then used to divide the pixels into three classes: background, cytoplasm, and nucleus.
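The feature construction and three-way clustering described above can be sketched as follows. This is an illustrative NumPy version, not part of the patent text; the minimal k-means here uses a deterministic gray-value seeding, an assumption made so the example is reproducible (the patent does not specify an initialization):

```python
import numpy as np

def coarse_segment(gray, k=3, iters=20):
    """Coarse segmentation sketch: build the 4-D feature vector
    [h, h_ave, h_med, h_sta] (gray value, 3x3 mean, median, variance)
    for every interior pixel, then run a minimal k-means to split the
    pixels into k classes (background, cytoplasm, nucleus for k = 3)."""
    H, W = gray.shape
    feats = np.zeros((H - 2, W - 2, 4))
    for i in range(1, H - 1):
        for j in range(1, W - 1):
            S = gray[i - 1:i + 2, j - 1:j + 2].astype(float)
            feats[i - 1, j - 1] = [gray[i, j], S.mean(), np.median(S), S.var()]
    X = feats.reshape(-1, 4)
    # Deterministic seeding: darkest, middle, brightest pixels (assumption)
    order = np.argsort(X[:, 0])
    centers = X[order[np.linspace(0, len(X) - 1, k).astype(int)]].copy()
    for _ in range(iters):
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = X[labels == c].mean(axis=0)
    return labels.reshape(H - 2, W - 2)
```

On a synthetic image with three distinct gray bands, pixels deep inside the darkest and brightest bands end up in different clusters, which is the behavior the coarse segmentation relies on.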
(4) The nuclear-center calibration subelement calibrates the center of the nucleus:
The approximate nucleus region is obtained from the coarse segmentation subelement. Suppose the nucleus region contains n points (x_1, y_1), …, (x_n, y_n). An intensity-weighted center and a geometric center are both computed for this region, and their mean is taken as the nucleus center (x_z, y_z):
x_z = (1/2) (Σ_{i=1}^{n} x_i h(x_i, y_i) / Σ_{i=1}^{n} h(x_i, y_i) + (1/n) Σ_{i=1}^{n} x_i)
y_z = (1/2) (Σ_{i=1}^{n} y_i h(x_i, y_i) / Σ_{i=1}^{n} h(x_i, y_i) + (1/n) Σ_{i=1}^{n} y_i)
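The two formulas above average an intensity-weighted centroid with a geometric centroid. A minimal NumPy sketch (illustrative only, not part of the patent text):

```python
import numpy as np

def nucleus_center(points, gray_values):
    """Average of the intensity-weighted centroid and the geometric
    centroid, as in the (x_z, y_z) formulas. `points` is an (n, 2)
    array of nucleus pixel coordinates; `gray_values` has length n."""
    pts = np.asarray(points, dtype=float)
    h = np.asarray(gray_values, dtype=float)
    weighted = (pts * h[:, None]).sum(axis=0) / h.sum()  # intensity-weighted
    geometric = pts.mean(axis=0)                         # geometric centroid
    return 0.5 * (weighted + geometric)
```

With uniform gray values the two centroids coincide, so the result is the plain geometric center; brighter points pull the result toward themselves otherwise.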
(5) The accurate segmentation subelement precisely segments the nucleus and the cytoplasm:
Construct the directed line segment from the nucleus center (x_z, y_z) to a boundary point (x_p, y_p) of the nucleus and cytoplasm region; its length is dis_p = ⌊((x_p − x_z)² + (y_p − y_z)²)^{1/2}⌋, where ⌊·⌋ denotes rounding down;
Sampling along the segment at unit length yields dis_p points (x_1, y_1), …, (x_{dis_p}, y_{dis_p}); if the coordinates of a sample point are not integers, its gray value is obtained by linear interpolation of the surrounding pixels;
The gray difference at point (x_i, y_i) along the segment direction is:
h_d(x_i, y_i) = h(x_{i−1}, y_{i−1}) − h(x_i, y_i)
Define the gray-difference suppression function:
Y(x) = x if x ≤ 0; 0.5x if x > 0
The gradient gra(x_i, y_i) at point (x_i, y_i) along the segment direction is:
gra(x_i, y_i) = (|Y(h_d(x_i, y_i))| + |Y(h_d(x_{i+1}, y_{i+1}))|) / 2
The maximum-gradient point is chosen as the precise edge between nucleus and cytoplasm.
This preferred embodiment provides the noise removal subelement, which effectively combines the spatial proximity and gray-level similarity between the center pixel and its neighborhood pixels for noise reduction: in flat regions of the image, where neighboring gray values differ little, a Gaussian filter applies weighted filtering to the gray values; in sharply varying boundary regions, edge-preserving filtering is performed, which helps retain image edges. K-means clustering extracts the coarse contours of nucleus and cytoplasm and effectively removes noise interference. The nuclear-center calibration subelement facilitates subsequent accurate localization of the nucleus and cytoplasm contours. The accurate segmentation subelement makes full use of directional information, overcomes the interference of inflammatory cells on the edge map, and can accurately extract the nucleus and cytoplasm edges.
Preferably, extracting the texture features of the cell image includes:
(1) Computing a comprehensive gray-level co-occurrence matrix of the cell image based on an improved gray-level co-occurrence matrix method; the comprehensive matrix reflects the texture features of the cell in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90°, and 135° be h(x, y, d, 0°), h(x, y, d, 45°), h(x, y, d, 90°), and h(x, y, d, 135°), with corresponding matrix element items X_1, X_2, X_3, X_4. The comprehensive gray-level co-occurrence matrix is computed as:
H(x, y, d) = w_1 h(x, y, d, 0°) + w_2 h(x, y, d, 45°) + w_3 h(x, y, d, 90°) + w_4 h(x, y, d, 135°)
and its element item is:
X = Σ_{i=1}^{4} w_i X_i
where d is the distance, with value range [2, 4], and w_i (i = 1, 2, 3, 4) are weight coefficients computed from the contrast parameter of the co-occurrence matrix in each of the four directions. Let the contrast parameters of the four directional matrices be D_i, with mean D̄ (i = 1, 2, 3, 4); the weight coefficients w_i are then:
w_i = (1 / (|D_i − D̄| + 1)) / Σ_{i=1}^{4} (1 / (|D_i − D̄| + 1))
(2) Using the comprehensive gray-level co-occurrence matrix and its element item to obtain the four required texture feature parameters: contrast, sum of variance, energy, and mean;
(3) Normalizing the four texture feature parameters to obtain the final normalized texture feature values.
Based on the improved gray-level co-occurrence matrix method, this preferred embodiment computes the comprehensive gray-level co-occurrence matrix of the cell image with weight coefficients and then extracts the cell texture features in the four specified directions. This solves the problem that external interference (such as the lighting angle during image acquisition or the flow of gas) makes the texture feature parameters of a cell differ considerably between directions, and improves the precision of cell image texture feature extraction. The four selected texture features (contrast, sum of variance, energy, and mean) eliminate redundant and repeated feature parameters, and normalizing them facilitates the subsequent classification and recognition of cell images.
In this application scenario, with threshold T = 20 and d = 4, the image denoising effect improves by 8% and the extraction accuracy of cell image features improves by 6%.
Application scenario 5
Referring to Fig. 1 and Fig. 2, the production device for a high-density biochip of one embodiment of this application scenario comprises a cell recognition module and a spotting needle for making the biochip. The cell recognition module determines the biological species. The needle body of the spotting needle for making the biochip is a multi-edged body; the end face of the needle tip is flat, with a circular recess formed at its center; the lower end of the needle body has 2-6 axial clearance channels, which meet at the axis, and the bottom end of each channel reaches the top of the tip recess and communicates with it.
Preferably, a polygonal-column-shaped ledge is provided on the tail-end sidewall of the spotting needle.
This preferred embodiment facilitates handling of the spotting needle.
Preferably, the needle body is 1.5-5 cm long and the spotting needle 0.8-3 cm in diameter; the needle-tip end face is 8-400 μm in diameter; the central recess is 5-350 μm in diameter and 0.6-3.8 mm deep; the clearance channels are 2-8 mm long and 15-300 μm wide.
This preferred embodiment is better suited to industrial production.
Preferably, the cell recognition module 1 includes a cell image segmentation unit 11, a feature extraction unit 12, and a classification and recognition unit 13. The cell image segmentation unit 11 distinguishes background, nucleus, and cytoplasm in the cell image acquired by the cell image acquisition module; the feature extraction unit 12 extracts the texture features of the cell image; the classification and recognition unit 13 uses a classifier to classify and recognize the cell image according to the texture features.
This preferred embodiment sets out the unit structure of the cell recognition module 1.
Preferably, the cell image segmentation unit 11 includes an image conversion subelement, a noise removal subelement, a coarse segmentation subelement, a nuclear-center calibration subelement, and an accurate segmentation subelement, specifically:
(1) The image conversion subelement converts the acquired cell image into a gray-level image;
(2) The noise removal subelement denoises the gray-level image, as follows:
For a pixel (x, y), choose its 3 × 3 neighborhood S_{x,y} and its (2N+1) × (2N+1) neighborhood L_{x,y}, where N is an integer greater than or equal to 2;
First determine whether the pixel is a boundary point: set a threshold T, T ∈ [13, 26]; compute the gray difference between pixel (x, y) and each pixel in its neighborhood S_{x,y}, and compare the differences with T. If the number of gray differences greater than T is greater than or equal to 6, pixel (x, y) is a boundary point; otherwise, it is a non-boundary point;
If (x, y) is a boundary point, apply the following noise reduction:
h(x, y) = (1/k) Σ_{q(i,j) ∈ [q(x,y) − 1.5σ, q(x,y) + 1.5σ]} q(i, j)
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(x, y) is the gray value of (x, y) before noise reduction, σ is the standard deviation of the gray values within the neighborhood L_{x,y} of (x, y), q(i, j) ∈ [q(x,y) − 1.5σ, q(x,y) + 1.5σ] denotes the points in L_{x,y} whose gray values fall within the interval [q(x,y) − 1.5σ, q(x,y) + 1.5σ], and k is the number of such points;
If (x, y) is a non-boundary point, apply the following noise reduction:
h(x, y) = Σ_{(i,j) ∈ L_{x,y}} w(i, j) q(i, j) / Σ_{(i,j) ∈ L_{x,y}} w(i, j)
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(i, j) is the gray value of point (i, j) in the image, and w(i, j) is the Gaussian weight of point (i, j) within the neighborhood L_{x,y};
(3) The coarse segmentation subelement roughly partitions the denoised cell image into background, cytoplasm, and nucleus, specifically:
Each pixel (x, y) is represented by a four-dimensional feature vector:
u(x, y) = [h(x, y), h_ave(x, y), h_med(x, y), h_sta(x, y)]
where h(x, y) is the gray value of (x, y), and h_ave(x, y), h_med(x, y), h_sta(x, y) are the gray mean, gray median, and gray variance of its neighborhood S_{x,y}, respectively;
K-means clustering is then used to divide the pixels into three classes: background, cytoplasm, and nucleus.
(4) The nuclear-center calibration subelement calibrates the center of the nucleus:
The approximate nucleus region is obtained from the coarse segmentation subelement. Suppose the nucleus region contains n points (x_1, y_1), …, (x_n, y_n). An intensity-weighted center and a geometric center are both computed for this region, and their mean is taken as the nucleus center (x_z, y_z):
x_z = (1/2) (Σ_{i=1}^{n} x_i h(x_i, y_i) / Σ_{i=1}^{n} h(x_i, y_i) + (1/n) Σ_{i=1}^{n} x_i)
y_z = (1/2) (Σ_{i=1}^{n} y_i h(x_i, y_i) / Σ_{i=1}^{n} h(x_i, y_i) + (1/n) Σ_{i=1}^{n} y_i)
(5) The accurate segmentation subelement precisely segments the nucleus and the cytoplasm:
Construct the directed line segment from the nucleus center (x_z, y_z) to a boundary point (x_p, y_p) of the nucleus and cytoplasm region; its length is dis_p = ⌊((x_p − x_z)² + (y_p − y_z)²)^{1/2}⌋, where ⌊·⌋ denotes rounding down;
Sampling along the segment at unit length yields dis_p points (x_1, y_1), …, (x_{dis_p}, y_{dis_p}); if the coordinates of a sample point are not integers, its gray value is obtained by linear interpolation of the surrounding pixels;
The gray difference at point (x_i, y_i) along the segment direction is:
h_d(x_i, y_i) = h(x_{i−1}, y_{i−1}) − h(x_i, y_i)
Define the gray-difference suppression function:
Y(x) = x if x ≤ 0; 0.5x if x > 0
The gradient gra(x_i, y_i) at point (x_i, y_i) along the segment direction is:
gra(x_i, y_i) = (|Y(h_d(x_i, y_i))| + |Y(h_d(x_{i+1}, y_{i+1}))|) / 2
The maximum-gradient point is chosen as the precise edge between nucleus and cytoplasm.
This preferred embodiment provides the noise removal subelement, which effectively combines the spatial proximity and gray-level similarity between the center pixel and its neighborhood pixels for noise reduction: in flat regions of the image, where neighboring gray values differ little, a Gaussian filter applies weighted filtering to the gray values; in sharply varying boundary regions, edge-preserving filtering is performed, which helps retain image edges. K-means clustering extracts the coarse contours of nucleus and cytoplasm and effectively removes noise interference. The nuclear-center calibration subelement facilitates subsequent accurate localization of the nucleus and cytoplasm contours. The accurate segmentation subelement makes full use of directional information, overcomes the interference of inflammatory cells on the edge map, and can accurately extract the nucleus and cytoplasm edges.
Preferably, extracting the texture features of the cell image includes:
(1) Computing a comprehensive gray-level co-occurrence matrix of the cell image based on an improved gray-level co-occurrence matrix method; the comprehensive matrix reflects the texture features of the cell in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90°, and 135° be h(x, y, d, 0°), h(x, y, d, 45°), h(x, y, d, 90°), and h(x, y, d, 135°), with corresponding matrix element items X_1, X_2, X_3, X_4. The comprehensive gray-level co-occurrence matrix is computed as:
H(x, y, d) = w_1 h(x, y, d, 0°) + w_2 h(x, y, d, 45°) + w_3 h(x, y, d, 90°) + w_4 h(x, y, d, 135°)
and its element item is:
X = Σ_{i=1}^{4} w_i X_i
where d is the distance, with value range [2, 4], and w_i (i = 1, 2, 3, 4) are weight coefficients computed from the contrast parameter of the co-occurrence matrix in each of the four directions. Let the contrast parameters of the four directional matrices be D_i, with mean D̄ (i = 1, 2, 3, 4); the weight coefficients w_i are then:
w_i = (1 / (|D_i − D̄| + 1)) / Σ_{i=1}^{4} (1 / (|D_i − D̄| + 1))
(2) Using the comprehensive gray-level co-occurrence matrix and its element item to obtain the four required texture feature parameters: contrast, sum of variance, energy, and mean;
(3) Normalizing the four texture feature parameters to obtain the final normalized texture feature values.
Based on the improved gray-level co-occurrence matrix method, this preferred embodiment computes the comprehensive gray-level co-occurrence matrix of the cell image with weight coefficients and then extracts the cell texture features in the four specified directions. This solves the problem that external interference (such as the lighting angle during image acquisition or the flow of gas) makes the texture feature parameters of a cell differ considerably between directions, and improves the precision of cell image texture feature extraction. The four selected texture features (contrast, sum of variance, energy, and mean) eliminate redundant and repeated feature parameters, and normalizing them facilitates the subsequent classification and recognition of cell images.
In this application scenario, with threshold T = 26 and d = 2, the image denoising effect improves by 7.5% and the extraction accuracy of cell image features improves by 8%.
Finally, it should be noted that the above embodiments only illustrate the technical solution of the present invention and do not limit its scope of protection. Although the present invention has been explained in detail with reference to preferred embodiments, those of ordinary skill in the art should understand that the technical solution of the present invention may be modified or equivalently replaced without departing from the essence and scope of the technical solution of the present invention.

Claims (3)

1. A production device for a high-density biochip, characterized in that it comprises a cell recognition module and a spotting needle for making the biochip; the cell recognition module is used to determine the biological species; the needle body of the spotting needle for making the biochip is a multi-edged body; the end face of the needle tip is flat, with a circular recess formed at its center; the lower end of the needle body has 2-6 axial clearance channels, which meet at the axis, and the bottom end of each channel reaches the top of the tip recess and communicates with it.
2. The production device for a high-density biochip according to claim 1, characterized in that a polygonal-column-shaped ledge is provided on the tail-end sidewall of the spotting needle.
3. The production device for a high-density biochip according to claim 2, characterized in that the needle body of the spotting needle is 1.5-5 cm long and 0.8-3 cm in diameter; the needle-tip end face is 8-400 μm in diameter; the central recess is 5-350 μm in diameter and 0.6-3.8 mm deep; the clearance channels are 2-8 mm long and 15-300 μm wide.
CN201610768011.3A 2016-08-30 2016-08-30 A kind of producing device of high-density biochip Active CN106244420B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610768011.3A CN106244420B (en) 2016-08-30 2016-08-30 A kind of producing device of high-density biochip


Publications (2)

Publication Number Publication Date
CN106244420A true CN106244420A (en) 2016-12-21
CN106244420B CN106244420B (en) 2018-11-23

Family

ID=58079443

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610768011.3A Active CN106244420B (en) 2016-08-30 2016-08-30 A kind of producing device of high-density biochip

Country Status (1)

Country Link
CN (1) CN106244420B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1302904A (en) * 2000-12-18 2001-07-11 陈超 Device and process for preparing high-density biochips
CN101900737A (en) * 2010-06-10 2010-12-01 上海理工大学 Automatic identification system for urinary sediment visible components based on support vector machine
CN103984939A (en) * 2014-06-03 2014-08-13 爱威科技股份有限公司 Sample visible component classification method and system




Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20181015

Address after: 225400 the south side of Wenchang Road, Taixing hi tech Industrial Development Zone, Taizhou, Jiangsu.

Applicant after: Taizhou Longze Environmental Technology Co., Ltd.

Address before: 315200 No. 555 north tunnel road, Zhenhai District, Ningbo, Zhejiang

Applicant before: Meng Ling

GR01 Patent grant
GR01 Patent grant