CN106244420B - A production device for a high-density biochip - Google Patents


Publication number
CN106244420B
CN106244420B (application CN201610768011.3A)
Authority
CN
China
Prior art keywords
point
pixel
gray
neighborhood
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610768011.3A
Other languages
Chinese (zh)
Other versions
CN106244420A (en)
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Taizhou Longze Environmental Technology Co., Ltd.
Original Assignee
Taizhou Longze Environmental Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Taizhou Longze Environmental Technology Co Ltd
Priority to CN201610768011.3A
Publication of CN106244420A
Application granted
Publication of CN106244420B


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B01: PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
    • B01J: CHEMICAL OR PHYSICAL PROCESSES, e.g. CATALYSIS OR COLLOID CHEMISTRY; THEIR RELEVANT APPARATUS
    • B01J 19/00: Chemical, physical or physico-chemical processes in general; Their relevant apparatus
    • B01J 19/0046: Sequential or parallel reactions, e.g. for the synthesis of polypeptides or polynucleotides; Apparatus and devices for combinatorial chemistry or for making molecular arrays
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B01: PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
    • B01J: CHEMICAL OR PHYSICAL PROCESSES, e.g. CATALYSIS OR COLLOID CHEMISTRY; THEIR RELEVANT APPARATUS
    • B01J 2219/00: Chemical, physical or physico-chemical processes in general; Their relevant apparatus
    • B01J 2219/00274: Sequential or parallel reactions; Apparatus and devices for combinatorial chemistry or for making arrays; Chemical library technology
    • B01J 2219/00277: Apparatus

Landscapes

  • Chemical & Material Sciences (AREA)
  • Organic Chemistry (AREA)
  • Chemical Kinetics & Catalysis (AREA)
  • Image Analysis (AREA)
  • Measuring Or Testing Involving Enzymes Or Micro-Organisms (AREA)
  • Image Processing (AREA)

Abstract

A production device for a high-density biochip comprises a cell recognition module and a spotting needle for making the biochip. The cell recognition module is used to determine the biological species. The needle body of the spotting needle for making the biochip is a polygonal prism; the end face of the needle tip is flat, with a circular recess formed at its center. The lower end of the needle body is axially provided with 2-6 slit channels; the channels meet at the axis, and the bottom of each channel extends to the top of the recess and communicates with it. Beneficial effects of the present invention: high-density biochips can be produced efficiently.

Description

A production device for a high-density biochip
Technical field
The present invention relates to the field of biology, and in particular to a production device for a high-density biochip.
Background art
The chip spotting needle is the critical component for making high-density gene chips; the quality, size and density of the spots on a gene chip depend to a significant degree on the design and finish of the spotting needle. The needles mainly used internationally at present are solid spotting needles and various spotting needles based on the siphon principle. The solid spotting needle is a traditional approach whose principle is to pick up sample solution by the adhesion of the needle surface and then touch the slide surface to deposit the sample liquid on it. The shortcomings of this needle in economy and durability are that the total number of spots is relatively small and the sampling volume is small, and each sampling can deposit only one spot, so it is unfavorable for large-scale rapid production. In addition, since a sample must be taken for every spot and the amount taken varies from sampling to sampling, the uniformity and quality of the spot size are affected.
Summary of the invention
To solve the above problems, the present invention aims to provide a production device for a high-density biochip.
The purpose of the present invention is realized by the following technical scheme:
A production device for a high-density biochip comprises a cell recognition module and a spotting needle for making the biochip. The cell recognition module is used to determine the biological species. The needle body of the spotting needle for making the biochip is a polygonal prism; the end face of the needle tip is flat, with a circular recess formed at its center. The lower end of the needle body is axially provided with 2-6 slit channels; the channels meet at the axis, and the bottom of each channel extends to the top of the recess and communicates with it.
Beneficial effects of the present invention: high-density biochips can be produced efficiently.
Brief description of the drawings
The present invention will be further described with reference to the accompanying drawings. The embodiments shown in the drawings do not constitute any limitation of the invention; for those of ordinary skill in the art, other drawings can be obtained from the following drawings without creative effort.
Fig. 1 is a schematic diagram of the spotting needle of the present invention;
Fig. 2 is a structural schematic diagram of the cell recognition module.
Reference numerals:
cell recognition module 1; cell image segmentation unit 11; feature extraction unit 12; classification and identification unit 13.
Specific embodiments
The invention will be further described in conjunction with the following application scenarios.
Application scenario 1
Referring to Fig. 1 and Fig. 2, a production device for a high-density biochip according to one embodiment of this application scenario comprises a cell recognition module and a spotting needle for making the biochip. The cell recognition module is used to determine the biological species. The needle body of the spotting needle for making the biochip is a polygonal prism; the end face of the needle tip is flat, with a circular recess formed at its center. The lower end of the needle body is axially provided with 2-6 slit channels; the channels meet at the axis, and the bottom of each channel extends to the top of the recess and communicates with it.
Preferably, a raised polygonal-prism bracket is provided on the side wall of the tail end of the spotting needle.
This preferred embodiment facilitates handling of the spotting needle.
Preferably, the needle body of the spotting needle is 1.5-5 cm long and 0.8-3 cm in diameter; the diameter of the needle tip end face is 8-400 μm, the diameter of the central recess is 5-350 μm, and the recess depth is 0.6-3.8 mm; each slit channel is 2-8 mm long and 15-300 μm wide.
This preferred embodiment is better suited to industrial production.
Preferably, the cell recognition module 1 comprises a cell image segmentation unit 11, a feature extraction unit 12 and a classification and identification unit 13. The cell image segmentation unit 11 is used to distinguish the background, cell nuclei and cytoplasm in the cell images acquired by a cell image acquisition module; the feature extraction unit 12 is used to extract the texture features of the cell image; the classification and identification unit 13 is used to classify and identify the cell image with a classifier according to the texture features.
This preferred embodiment establishes the unit structure of the cell recognition module 1.
Preferably, the cell image segmentation unit 11 comprises an image conversion subunit, a noise removal subunit, a coarse segmentation subunit, a nucleus center calibration subunit and an accurate segmentation subunit, specifically:
(1) The image conversion subunit converts the acquired cell image into a grayscale image;
(2) The noise removal subunit performs denoising on the grayscale image, including:
For a pixel (x, y), its 3 × 3 neighborhood S_{x,y} and its (2N+1) × (2N+1) neighborhood L_{x,y} are chosen, where N is an integer greater than or equal to 2;
First it is judged whether the pixel is a boundary point. A threshold T ∈ [13, 26] is set; the gray difference between pixel (x, y) and each pixel in its neighborhood S_{x,y} is computed and compared with T. If the number of gray differences greater than T is greater than or equal to 6, pixel (x, y) is a boundary point; otherwise, pixel (x, y) is a non-boundary point;
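The boundary-point test can be sketched in Python as follows (an illustrative sketch, not code from the patent; the function name and the `min_count` parameter are our own):

```python
import numpy as np

def is_boundary_point(img, x, y, T=13, min_count=6):
    """Classify pixel (x, y) as a boundary point: count how many of the
    8 neighbors in the 3x3 neighborhood S differ from it in gray value
    by more than the threshold T; min_count or more marks a boundary point."""
    center = float(img[y, x])
    diffs = []
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dx == 0 and dy == 0:
                continue
            diffs.append(abs(float(img[y + dy, x + dx]) - center))
    return sum(d > T for d in diffs) >= min_count
```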
If (x, y) is boundary point, following noise reduction process is carried out:
In formula, h (x, y) is the gray value of pixel (x, y) after noise reduction, and q (x, y) is the ash of noise reduction preceding pixel point (x, y) Angle value, σ are pixel (x, y) neighborhood Lx,yInterior gray value mark is poor, and q (i, j) ∈ [+1.5 σ of q (x, y) -1.5 σ, q (x, y)] is indicated Neighborhood Lx,yInterior gray value falls within the point of section [+1.5 σ of q (x, y) -1.5 σ, q (x, y)], and k indicates neighborhood Lx,yInterior gray value is fallen within The quantity of the point in section [+1.5 σ of q (x, y) -1.5 σ, q (x, y)];
If (x, y) is non-boundary point, following noise reduction process is carried out:
In formula, h (x, y) is the gray value of pixel (x, y) after noise reduction, the ash at q (i, j) representative image midpoint (i, j) Angle value, w (i, j) are neighborhood Lx,yThe corresponding Gauss weight of interior point (i, j);
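Both noise-reduction branches can be sketched together (a sketch under the formulas as read above; `sigma_g`, the Gaussian spread, is an illustrative parameter not specified in the text):

```python
import numpy as np

def denoise_pixel(img, x, y, boundary, N=2, sigma_g=1.0):
    """Edge-preserving denoising of pixel (x, y) over its (2N+1)x(2N+1)
    neighborhood L. Boundary point: average only the neighborhood grays
    within [q - 1.5*std, q + 1.5*std]; non-boundary point: Gaussian-
    weighted mean of the neighborhood grays."""
    y0, y1 = max(0, y - N), min(img.shape[0], y + N + 1)
    x0, x1 = max(0, x - N), min(img.shape[1], x + N + 1)
    L = img[y0:y1, x0:x1].astype(float)
    q = float(img[y, x])
    if boundary:
        std = L.std()
        sel = L[(L >= q - 1.5 * std) & (L <= q + 1.5 * std)]
        return float(sel.mean()) if sel.size else q
    # Gaussian weights w(i, j) centered on (x, y)
    yy, xx = np.mgrid[y0:y1, x0:x1]
    w = np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2.0 * sigma_g ** 2))
    return float((w * L).sum() / w.sum())
```

In a flat region both branches leave the gray value unchanged, which matches the edge-preserving intent described below.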
(3) The coarse segmentation subunit roughly partitions the denoised cell image into background, cytoplasm and nucleus, specifically:
Each pixel (x, y) is represented by a four-dimensional feature vector
u(x, y) = [h(x, y), h_ave(x, y), h_med(x, y), h_sta(x, y)]^T,
where h(x, y) is the gray value of (x, y), h_ave(x, y) is the gray mean of its neighborhood S_{x,y}, h_med(x, y) is the gray median of S_{x,y}, and h_sta(x, y) is the gray variance of S_{x,y};
The pixels are then divided into the three classes background, cytoplasm and nucleus by K-means clustering;
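A minimal sketch of this coarse-segmentation step, building the four-dimensional feature vectors and clustering them into three classes with a small hand-rolled K-means (all names are illustrative; handling of image borders by edge padding is our assumption):

```python
import numpy as np

def pixel_features(img):
    """Four-dimensional feature vector per pixel: gray value, plus the
    mean, median and variance of its 3x3 neighborhood S (borders handled
    by edge padding)."""
    p = np.pad(img.astype(float), 1, mode='edge')
    h, w = img.shape
    # stack the 9 shifted copies of the image: one slice per 3x3 offset
    win = np.stack([p[dy:dy + h, dx:dx + w]
                    for dy in range(3) for dx in range(3)], axis=-1)
    return np.stack([img.astype(float),
                     win.mean(-1), np.median(win, -1), win.var(-1)], axis=-1)

def kmeans3(feats, iters=20, seed=0):
    """Minimal k-means (k=3) labeling each pixel as one of three classes
    (background / cytoplasm / nucleus) by its feature vector."""
    X = feats.reshape(-1, feats.shape[-1])
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), 3, replace=False)]
    for _ in range(iters):
        lab = np.argmin(((X[:, None] - C) ** 2).sum(-1), axis=1)
        for k in range(3):
            if (lab == k).any():
                C[k] = X[lab == k].mean(0)
    return lab.reshape(feats.shape[:2])
```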
(4) The nucleus center calibration subunit calibrates the center of the nucleus:
The approximate nuclear region is obtained from the coarse segmentation subunit. Suppose the nuclear region contains n points (x_1, y_1), …, (x_n, y_n). An intensity-weighted centroid and a geometric centroid are both computed for this region, and their average is taken as the nucleus center (x_z, y_z):
x_z = (1/2)(Σ h_i x_i / Σ h_i + (1/n) Σ x_i), y_z = (1/2)(Σ h_i y_i / Σ h_i + (1/n) Σ y_i),
where h_i is the gray value at point (x_i, y_i);
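Under that reading (the average of the intensity-weighted and geometric centroids), the calibration can be sketched as:

```python
import numpy as np

def nucleus_center(points, grays):
    """Nucleus center as the average of the intensity-weighted centroid
    and the plain geometric centroid of the coarse nuclear region.
    points: iterable of (x, y); grays: gray value h_i at each point."""
    pts = np.asarray(points, float)
    g = np.asarray(grays, float)
    weighted = (pts * g[:, None]).sum(0) / g.sum()
    geometric = pts.mean(0)
    return (weighted + geometric) / 2.0
```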
(5) The accurate segmentation subunit performs accurate segmentation of the nucleus and cytoplasm:
A directed line segment is constructed from the nucleus center (x_z, y_z) to a boundary point (x_p, y_p) of the coarse nucleus/cytoplasm boundary, of length dis_p = ⌊√((x_p − x_z)² + (y_p − y_z)²)⌋, where ⌊·⌋ denotes rounding down;
Sampling along the segment at unit length yields dis_p points (x_1, y_1), …, (x_{dis_p}, y_{dis_p}); if the coordinates of a sample point are not integers, its gray value is obtained by linear interpolation from the surrounding pixels;
The gray difference along the segment direction at point (x_i, y_i) is:
h_d(x_i, y_i) = h(x_{i−1}, y_{i−1}) − h(x_i, y_i)
A gray-difference inhibition function is defined:
The gradient gra(x_i, y_i) along the segment direction at point (x_i, y_i) is:
The point of maximum gradient is chosen as the precise edge between the nucleus and the cytoplasm.
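The radial sampling, interpolation and maximum-gradient selection can be sketched as follows. The patent's gray-difference inhibition function is not recoverable from the text, so this sketch simply takes the sample with the largest absolute gray difference:

```python
import numpy as np

def bilinear(img, x, y):
    """Gray value at a non-integer point by linear interpolation
    from the four surrounding pixels."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, img.shape[1] - 1), min(y0 + 1, img.shape[0] - 1)
    fx, fy = x - x0, y - y0
    top = img[y0, x0] * (1 - fx) + img[y0, x1] * fx
    bot = img[y1, x0] * (1 - fx) + img[y1, x1] * fx
    return top * (1 - fy) + bot * fy

def radial_edge(img, center, boundary):
    """Sample the segment from the nucleus center to a coarse boundary
    point at unit steps, form successive gray differences h_d, and take
    the sample of largest |h_d| as the refined edge location."""
    c, b = np.asarray(center, float), np.asarray(boundary, float)
    dis = int(np.floor(np.linalg.norm(b - c)))          # dis_p samples
    ts = np.arange(dis + 1) / max(dis, 1)
    pts = c + ts[:, None] * (b - c)
    imgf = img.astype(float)
    grays = np.array([bilinear(imgf, p[0], p[1]) for p in pts])
    hd = np.abs(np.diff(grays))                          # |h_d| per step
    i = int(hd.argmax()) + 1
    return tuple(pts[i])
```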
In this preferred embodiment, the noise removal subunit effectively combines the spatial proximity and gray similarity of the center pixel and its neighborhood pixels for noise reduction: in flat regions of the image, where neighborhood gray values differ little, a Gaussian filter applies weighted filtering to the gray values; in sharply varying boundary regions, edge-preserving filtering is applied, which favors the preservation of image edges. K-means clustering extracts the coarse contours of the nucleus and cytoplasm and effectively removes the interference of noise. The nucleus center calibration subunit facilitates the subsequent accurate localization of the nucleus and cytoplasm contours. The accurate segmentation subunit makes full use of directional information, overcomes the interference of inflammatory cells on the edge map, and can accurately extract the edges of the nucleus and cytoplasm.
Preferably, the extraction of the texture features of the cell image comprises:
(1) A composite gray-level co-occurrence matrix of the cell image is computed based on an improved gray-level co-occurrence matrix method; this composite matrix embodies the texture features of the cell in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90° and 135° be h(x, y, d, 0°), h(x, y, d, 45°), h(x, y, d, 90°) and h(x, y, d, 135°), with corresponding matrix element numbers X_1, X_2, X_3 and X_4. The composite gray-level co-occurrence matrix is then computed as:
H (x, y, d)=w1h(x,y,d,0°)+w2h(x,y,d,45°)+w3h(x,y,d,90°)+w4h(x,y,d,135°)
The number of elements of the composite gray-level co-occurrence matrix is: X = w_1 X_1 + w_2 X_2 + w_3 X_3 + w_4 X_4;
In these formulas, d denotes the distance, with value range [2, 4], and the w_i (i = 1, 2, 3, 4) are weighting coefficients computed from the contrast parameters of the gray-level co-occurrence matrices in the four directions. Let the contrast parameter of the matrix in each direction be D_i, with mean D̄ (i = 1, 2, 3, 4); the weighting coefficient w_i is then computed as:
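A sketch of the composite-matrix computation. The exact weight formula is not recoverable from the text, so the sketch assumes weights proportional to each direction's contrast, normalized to sum to one (w_i = D_i / Σ_j D_j); this normalization is an assumption:

```python
import numpy as np

# pixel offsets (dy, dx) for the four standard GLCM directions
OFFSETS = {0: (0, 1), 45: (-1, 1), 90: (-1, 0), 135: (-1, -1)}

def glcm(img, d, angle, levels=8):
    """Gray-level co-occurrence matrix h(x, y, d, angle) for one
    direction, normalized to a probability matrix. img must already be
    quantized to integer gray levels in [0, levels)."""
    dy, dx = OFFSETS[angle]
    dy, dx = dy * d, dx * d
    H = np.zeros((levels, levels))
    rows, cols = img.shape
    for y in range(rows):
        for x in range(cols):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < rows and 0 <= x2 < cols:
                H[img[y, x], img[y2, x2]] += 1
    return H / max(H.sum(), 1)

def contrast(P):
    """Contrast parameter D of a normalized co-occurrence matrix."""
    i, j = np.indices(P.shape)
    return ((i - j) ** 2 * P).sum()

def composite_glcm(img, d=2):
    """Composite matrix H = sum_i w_i * h_i with contrast-derived
    weights; returns (H, w)."""
    Hs = [glcm(img, d, a) for a in (0, 45, 90, 135)]
    D = np.array([contrast(H) for H in Hs])
    w = D / D.sum() if D.sum() > 0 else np.full(4, 0.25)
    return sum(wi * Hi for wi, Hi in zip(w, Hs)), w
```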
(2) The four required texture feature parameters are obtained from the composite gray-level co-occurrence matrix and the matrix element numbers: contrast, variance, energy and mean;
(3) The four texture feature parameters are normalized to obtain the final normalized texture feature values.
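The normalization step, sketched as per-feature min-max scaling across a batch of cell images; the exact normalization used is not specified in the text, so min-max is an assumption:

```python
import numpy as np

def normalize_features(samples):
    """Min-max normalize each of the four texture parameters (contrast,
    variance, energy, mean) across a batch of cell images so every
    feature lies in [0, 1]; a constant column maps to 0."""
    X = np.asarray(samples, float)
    lo, hi = X.min(0), X.max(0)
    span = np.where(hi > lo, hi - lo, 1.0)   # avoid division by zero
    return (X - lo) / span
```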
This preferred embodiment, based on an improved gray-level co-occurrence matrix method, computes the composite gray-level co-occurrence matrix of the cell image by means of weighting coefficients and thereby extracts the texture features of the cell in the four specified directions. It solves the problem that external interference (such as the illumination angle during cell image acquisition, or disturbance from gas flow) causes the texture feature parameters of a cell to differ considerably between directions, and improves the accuracy of cell image texture feature extraction. Selecting the four texture features contrast, variance, energy and mean eliminates redundant and duplicated feature parameters, and normalizing the four parameters facilitates the subsequent classification and identification of the cell images.
In this application scenario, the threshold is set to T = 13 and d = 2; the image denoising effect is relatively improved by 5%, and the extraction accuracy of the cell image features is improved by 8%.
Application scenario 2
Referring to Fig. 1 and Fig. 2, the device of one embodiment of this application scenario and its preferred embodiments are identical to those of application scenario 1 and are not repeated here.
In this application scenario, the threshold is set to T = 15 and d = 2; the image denoising effect is relatively improved by 6%, and the extraction accuracy of the cell image features is improved by 8%.
Application scenario 3
Referring to Fig. 1, Fig. 2, a kind of producing device of high-density biochip of one embodiment of this application scene, including Cell recognition module and spotting needle for making biochip, the cell recognition module is used to determine biological species, described The needle body of spotting needle for making biochip is that polygon rib leans on body, and the end face of needle point is plane, and center is formed with circular recess, Needle body lower end is axially provided with 2-6 clearance channel, and each clearance channel communicates in axle center, and the top that slot bottom end is recessed to needle point, and recessed It falls into and communicates.
Preferably, the protrusion frame torr of Polygonal column shape is provided on the tail end side wall of spotting needle.
This preferred embodiment is convenient for the operation of spotting needle.
Preferably, the needle body length of spotting needle is 1.5-5cm, diameter 0.8-3cm;The diameter of needle point end face is 8-400 μm, The diameter of central concave is 5-350 μm, cup depth 0.6-3.8mm;The length of clearance channel is 2-8mm, and width is 15-300 μ m。
This preferred embodiment is more suitable for industrial production.
Preferably, the cell recognition module 1 includes Methods of Segmentation On Cell Images unit 11, feature extraction unit 12, classification knowledge Other unit 13;The Methods of Segmentation On Cell Images unit 11 is used to distinguish the back in the cell image acquired by cell image acquisition module Scape, nucleus and cytoplasm;The feature extraction unit 12 is for extracting the textural characteristics of cell image;The classification Recognition unit 13 is used to be realized using classifier to cell image Classification and Identification according to textural characteristics.
This preferred embodiment constructs the unit structure of cell recognition module 1.
Preferably, the Methods of Segmentation On Cell Images unit 11 includes image conversion subunit, noise remove subelement, coarse segmentation Subelement, nuclear centers demarcate subelement, Accurate Segmentation subelement, specially:
(1) image conversion subunit, for converting gray level image for the cell image of acquisition;
(2) noise remove subelement is used to carry out denoising to gray level image, including:
For pixel (x, y), its 3 × 3 neighborhood S is chosenx,yThe neighborhood L of (2N+1) × (2N+1)x,y, N be greater than Integer equal to 2;
It whether is first that boundary point judges to pixel, given threshold T, T ∈ [13,26] is calculated pixel (x, y) With its neighborhood Sx,yIn each pixel gray scale difference value, and be compared with threshold value T, if gray scale difference value is greater than the number of threshold value T More than or equal to 6, then pixel (x, y) is boundary point, and otherwise, pixel (x, y) is non-boundary point;
If (x, y) is boundary point, following noise reduction process is carried out:
In formula, h (x, y) is the gray value of pixel (x, y) after noise reduction, and q (x, y) is the ash of noise reduction preceding pixel point (x, y) Angle value, σ are pixel (x, y) neighborhood Lx,yInterior gray value mark is poor, and q (i, j) ∈ [+1.5 σ of q (x, y) -1.5 σ, q (x, y)] is indicated Neighborhood Lx,yInterior gray value falls within the point of section [+1.5 σ of q (x, y) -1.5 σ, q (x, y)], and k indicates neighborhood Lx,yInterior gray value is fallen within Section
The quantity of the point of [+1.5 σ of q (x, y) -1.5 σ, q (x, y)];
If (x, y) is non-boundary point, following noise reduction process is carried out:
In formula, h (x, y) is the gray value of pixel (x, y) after noise reduction, the ash at q (i, j) representative image midpoint (i, j) Angle value, w (i, j) are neighborhood Lx,yThe corresponding Gauss weight of interior point (i, j);
(3) coarse segmentation subelement, for slightly being drawn to the background in the cell image after denoising, cytoplasm, nucleus Point, specially:
Each pixel (x, y) is indicated with four dimensional feature vectors:
In formula, h (x, y) represents the gray value of (x, y), have(x, y) represents its neighborhood Sx,yGray average, hmed(x, y) generation Its neighborhood of table Sx,yGray scale intermediate value, hsta(x, y) represents its neighborhood Sx,yGray variance;
Background, cytoplasm, nucleus three classes are divided into using K-means clustering procedure;
(4) nuclear centers demarcate subelement, for demarcating to nuclear centers:
Nucleus approximate region is obtained by coarse segmentation subelement, if nuclear area includes n point:(x1,y1),…,(xn, yn), intensity-weighted calibration is carried out to the region and geometric center is demarcated, takes its average value as nuclear centers (xz,yz):
(5) Accurate Segmentation subelement, for carrying out Accurate Segmentation to nucleus, cytoplasm;
It constructs from nuclear centers (xz,yz) arrive nucleus and cytoplasm boundary point (xp,yp) directed line segmentDistanceIt indicates to be rounded downwards;
It carries out sampling available dis with unit length along line segmentpA point (x1,y1) ...,If sampling The coordinate of point is not integer, and gray value is obtained by surrounding pixel linear interpolation;
Point (xi,yi) at along line segment direction gray scale difference:
hd(xi,yi)=h (xi-1,yi-1)-h(xi,yi)
Define gray scale difference inhibition function:
Point (xi,yi) at along line segment direction gradient gra (xi,yi):
It chooses the maximum value point of gradient and is used as nucleus and cytoplasmic precise edge.
This preferred embodiment is arranged noise remove subelement, and the space of effective integration center pixel and neighborhood territory pixel is closed on Property and grey similarity carry out noise reduction process, flat site in the picture, grey scale pixel value is not much different in neighborhood, uses Gaussian filter is weighted filtering to gray value, is changing violent borderline region, row bound keeps filtering, is conducive to image The holding at edge;Nucleus and cytoplasm coarse contour are extracted using K mean cluster, the interference of noise can be effectively removed;Setting is thin Subelement is demarcated at karyon center, is accurately positioned convenient for subsequent to nucleus and cytoplasm profile;Accurate Segmentation subelement fills Divide and directional information is utilized, overcomes interference of the inflammatory cell to edge graph, can accurately extract nucleus and cytoplasm side Edge.
Preferably, the extraction of texture features from the cell image includes:
(1) Computing the integrated gray-level co-occurrence matrix of the cell image based on an improved gray-level co-occurrence matrix method; the integrated matrix captures the texture features of cells in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90°, and 135° be h(x,y,d,0°), h(x,y,d,45°), h(x,y,d,90°), and h(x,y,d,135°), with corresponding matrix element counts X1, X2, X3, X4. The integrated gray-level co-occurrence matrix is then computed as:
H(x,y,d)=w1h(x,y,d,0°)+w2h(x,y,d,45°)+w3h(x,y,d,90°)+w4h(x,y,d,135°)
The element count of the integrated gray-level co-occurrence matrix is:
In the formula, d denotes the distance, with d in the range [2,4]; wi (i=1,2,3,4) are weighting coefficients computed from the contrast parameter of the gray-level co-occurrence matrix in each of the four directions. Let the contrast parameters of the four directional matrices be Di with mean value D̄ (i=1,2,3,4); then the weighting coefficient wi is computed as:
(2) Obtaining the four required texture feature parameters from the integrated gray-level co-occurrence matrix and the matrix element count: contrast, variance, energy, and mean;
(3) Normalizing the four texture feature parameters to obtain the final normalized texture feature values.
Based on the improved gray-level co-occurrence matrix method, this preferred embodiment computes the integrated gray-level co-occurrence matrix of the cell image using weighting coefficients and then extracts the texture features of the cell in the four specified directions. This solves the problem that external interference (such as the illumination angle or gas-flow disturbance during cell-image acquisition) causes the cell's texture feature parameters to differ considerably across directions, and improves the precision of cell-image texture feature extraction. The four selected texture features (contrast, variance, energy, and mean) eliminate redundant and duplicated feature parameters, and normalizing the four parameters facilitates the subsequent classification and recognition of cell images.
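The weighted combination of the four directional gray-level co-occurrence matrices can be sketched as follows. The patent's formula for the weighting coefficients wi appears only as an image, so the sketch assumes wi = Di / ΣDi (each direction weighted by its contrast) purely for illustration; the function names are hypothetical.

```python
import numpy as np

def glcm(img, d, offset, levels):
    """Plain gray-level co-occurrence matrix for one (row, col) offset
    at distance d; img must contain integer gray levels in [0, levels)."""
    dy, dx = offset
    m = np.zeros((levels, levels))
    H, W = img.shape
    for y in range(H):
        for x in range(W):
            y2, x2 = y + dy * d, x + dx * d
            if 0 <= y2 < H and 0 <= x2 < W:
                m[img[y, x], img[y2, x2]] += 1
    return m

def integrated_glcm(img, d=3, levels=8):
    """Weighted combination of the four directional GLCMs.

    Assumption: w_i = D_i / sum(D), where D_i is the contrast of the
    i-th directional matrix; the patent's exact w_i formula is not
    reproduced in the text."""
    offsets = {0: (0, 1), 45: (-1, 1), 90: (-1, 0), 135: (-1, -1)}
    mats = [glcm(img, d, off, levels) for off in offsets.values()]
    i, j = np.indices((levels, levels))
    # Contrast parameter D_i of each normalized directional matrix.
    D = np.array([(m * (i - j) ** 2).sum() / max(m.sum(), 1) for m in mats])
    w = D / D.sum() if D.sum() > 0 else np.full(4, 0.25)
    return sum(wi * m for wi, m in zip(w, mats)), w
```

The texture parameters (contrast, variance, energy, mean) would then be computed from the returned integrated matrix and normalized, as in steps (2) and (3) above.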
In this application scenario, with threshold T=18 and d=3, the image denoising effect shows a relative improvement of 7%, and the accuracy of cell-image feature extraction improves by 7%.
Application scenario 4
Referring to Fig. 1 and Fig. 2, a producing device of a high-density biochip according to an embodiment of this application scenario includes a cell recognition module and a spotting needle for making the biochip. The cell recognition module is used to determine the biological species. The needle body of the spotting needle is a polygonal prism; the end face of the needle tip is a plane with a circular recess formed at its center; the lower end of the needle body is axially provided with 2-6 gap channels, each of which meets the others at the axis, and the bottom end of each channel extends to the top of the needle-tip recess and communicates with it.
Preferably, a polygonal-prism-shaped protruding holder is provided on the tail-end side wall of the spotting needle.
This preferred embodiment facilitates handling of the spotting needle.
Preferably, the needle body of the spotting needle is 1.5-5 cm long and 0.8-3 cm in diameter; the end face of the needle tip is 8-400 μm in diameter; the central recess is 5-350 μm in diameter and 0.6-3.8 mm deep; the gap channels are 2-8 mm long and 15-300 μm wide.
This preferred embodiment is better suited to industrial production.
Preferably, the cell recognition module 1 includes a cell-image segmentation unit 11, a feature extraction unit 12, and a classification and recognition unit 13. The cell-image segmentation unit 11 distinguishes the background, nucleus, and cytoplasm in the cell image acquired by the cell-image acquisition module; the feature extraction unit 12 extracts the texture features of the cell image; the classification and recognition unit 13 uses a classifier to classify and recognize the cell image according to its texture features.
This preferred embodiment establishes the unit structure of the cell recognition module 1.
Preferably, the cell-image segmentation unit 11 includes an image conversion subunit, a noise-removal subunit, a coarse-segmentation subunit, a nucleus-center calibration subunit, and a precise-segmentation subunit, specifically:
(1) the image conversion subunit converts the acquired cell image into a gray-level image;
(2) the noise-removal subunit denoises the gray-level image, as follows:
For a pixel (x,y), choose its 3 × 3 neighborhood Sx,y and its (2N+1) × (2N+1) neighborhood Lx,y, where N is an integer greater than or equal to 2;
First judge whether the pixel is a boundary point: set a threshold T, T ∈ [13,26], compute the gray-level difference between pixel (x,y) and each pixel in its neighborhood Sx,y, and compare each difference with T. If at least 6 of the differences exceed T, pixel (x,y) is a boundary point; otherwise, pixel (x,y) is a non-boundary point;
If (x,y) is a boundary point, perform the following noise reduction:
In the formula, h(x,y) is the gray value of pixel (x,y) after noise reduction, q(x,y) is its gray value before noise reduction, σ is the standard deviation of the gray values within the neighborhood Lx,y of pixel (x,y), q(i,j) ∈ [q(x,y)-1.5σ, q(x,y)+1.5σ] denotes the points in Lx,y whose gray values fall within the interval [q(x,y)-1.5σ, q(x,y)+1.5σ], and k is the number of such points;
If (x,y) is a non-boundary point, perform the following noise reduction:
In the formula, h(x,y) is the gray value of pixel (x,y) after noise reduction, q(i,j) is the gray value of point (i,j) in the image, and w(i,j) is the Gaussian weight corresponding to point (i,j) within the neighborhood Lx,y;
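The two-branch noise reduction above can be sketched as follows. Both noise-reduction formulas appear only as images in the text, so the boundary branch is assumed to average the k in-band gray values and the non-boundary branch to take a normalized Gaussian-weighted mean; `denoise_pixel` is an illustrative reading, not the patent's exact formula.

```python
import numpy as np

def denoise_pixel(q, x, y, T=18, N=2):
    """Edge-preserving denoising of pixel (x, y) in gray image q
    (indexed [row=y, col=x]), following the patent's two-branch rule."""
    # 3x3 neighborhood S_{x,y}: boundary test -- pixel (x, y) is a
    # boundary point if at least 6 gray differences exceed threshold T.
    S = q[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2]
    is_boundary = (np.abs(S - q[y, x]) > T).sum() >= 6
    # (2N+1) x (2N+1) neighborhood L_{x,y}.
    L = q[max(y - N, 0):y + N + 1, max(x - N, 0):x + N + 1]
    if is_boundary:
        # Assumed reading: average the k points of L whose gray values
        # fall within [q(x,y) - 1.5*sigma, q(x,y) + 1.5*sigma].
        sigma = L.std()
        sel = L[np.abs(L - q[y, x]) <= 1.5 * sigma]
        return sel.mean()
    # Assumed reading: normalized Gaussian-weighted mean over L.
    yy, xx = np.indices(L.shape)
    cy, cx = y - max(y - N, 0), x - max(x - N, 0)
    w = np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2.0 * N ** 2))
    return (w * L).sum() / w.sum()
```

In a flat region the output equals the local gray level, while on a thin bright line the boundary branch keeps the line's gray value, which is the edge-preserving behavior the paragraph below describes.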
(3) the coarse-segmentation subunit coarsely divides the denoised cell image into background, cytoplasm, and nucleus, as follows:
Each pixel (x,y) is represented by a four-dimensional feature vector:
In the formula, h(x,y) is the gray value of (x,y), have(x,y) is the gray mean of its neighborhood Sx,y, hmed(x,y) is the gray median of its neighborhood Sx,y, and hsta(x,y) is the gray variance of its neighborhood Sx,y;
The pixels are divided into three classes, background, cytoplasm, and nucleus, by K-means clustering;
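The coarse segmentation step, four-dimensional feature vectors clustered by K-means into three classes, can be sketched as follows. The minimal K-means here is a stand-in for any standard implementation, and its deterministic initialization is an illustrative choice; `pixel_features` and `kmeans` are hypothetical names.

```python
import numpy as np

def pixel_features(img):
    """Four-dimensional feature vector per pixel: gray value plus the
    mean, median, and variance of its 3x3 neighborhood S_{x,y}."""
    H, W = img.shape
    feats = np.empty((H, W, 4))
    for y in range(H):
        for x in range(W):
            S = img[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2]
            feats[y, x] = (img[y, x], S.mean(), np.median(S), S.var())
    return feats.reshape(-1, 4)

def kmeans(X, k=3, iters=20):
    # Minimal deterministic K-means: centers start from k spread-out
    # distinct feature rows; a library implementation could be used.
    uniq = np.unique(X, axis=0)
    centers = uniq[np.linspace(0, len(uniq) - 1, k).astype(int)].astype(float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels
```

Calling `kmeans(pixel_features(img), k=3)` yields one label per pixel, giving the background/cytoplasm/nucleus coarse partition.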
(4) the nucleus-center calibration subunit calibrates the nucleus center:
The approximate nucleus region is obtained from the coarse-segmentation subunit. Suppose the nucleus region contains n points: (x1,y1), …, (xn,yn). Gray-weighted calibration and geometric-center calibration are performed on this region, and the average of the two results is taken as the nucleus center (xz,yz):
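The nucleus-center calibration, averaging the gray-weighted centroid and the geometric centroid of the coarse nucleus region, can be sketched as follows. The patent's formula appears only as an image, so this averaging of the two calibrations is the assumed reading, and `nucleus_center` is a hypothetical name.

```python
import numpy as np

def nucleus_center(points, grays):
    """Nucleus center (xz, yz) as the average of the gray-weighted
    centroid and the geometric centroid of the region's n points."""
    pts = np.asarray(points, float)
    g = np.asarray(grays, float)
    weighted = (pts * g[:, None]).sum(0) / g.sum()  # gray-weighted centroid
    geometric = pts.mean(0)                          # geometric centroid
    return (weighted + geometric) / 2
```

With uniform gray values the two centroids coincide, so the result reduces to the plain geometric center of the region.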
(5) the precise-segmentation subunit precisely segments the nucleus and cytoplasm;
Construct the directed line segment from the nucleus center (xz,yz) to a boundary point (xp,yp) of the nucleus or cytoplasm; its length is disp = ⌊√((xp-xz)²+(yp-yz)²)⌋, where ⌊·⌋ denotes rounding down;
Sampling along the line segment at unit length yields disp points (x1,y1), …, (xdisp,ydisp); if the coordinates of a sampled point are not integers, its gray value is obtained by linear interpolation of the surrounding pixels;
The gray-scale difference along the segment direction at point (xi,yi) is:
hd(xi,yi)=h(xi-1,yi-1)-h(xi,yi)
Define the gray-scale difference suppression function:
The gradient gra(xi,yi) along the segment direction at point (xi,yi) is:
The point of maximum gradient is chosen as the precise edge of the nucleus and cytoplasm.
This preferred embodiment provides a noise-removal subunit that effectively combines the spatial proximity and gray-level similarity of the center pixel and its neighborhood pixels for noise reduction: in flat regions of the image, where neighboring gray values differ little, a Gaussian filter applies weighted filtering to the gray values; in boundary regions with sharp gray-level changes, edge-preserving filtering is performed, which helps preserve image edges. K-means clustering extracts the coarse contours of the nucleus and cytoplasm, effectively removing noise interference. The nucleus-center calibration subunit facilitates the subsequent precise localization of the nucleus and cytoplasm contours. The precise-segmentation subunit makes full use of directional information, overcoming the interference of inflammatory cells with the edge map, and can accurately extract the nucleus and cytoplasm edges.
Preferably, the extraction of texture features from the cell image includes:
(1) Computing the integrated gray-level co-occurrence matrix of the cell image based on an improved gray-level co-occurrence matrix method; the integrated matrix captures the texture features of cells in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90°, and 135° be h(x,y,d,0°), h(x,y,d,45°), h(x,y,d,90°), and h(x,y,d,135°), with corresponding matrix element counts X1, X2, X3, X4. The integrated gray-level co-occurrence matrix is then computed as:
H(x,y,d)=w1h(x,y,d,0°)+w2h(x,y,d,45°)+w3h(x,y,d,90°)+w4h(x,y,d,135°)
The element count of the integrated gray-level co-occurrence matrix is:
In the formula, d denotes the distance, with d in the range [2,4]; wi (i=1,2,3,4) are weighting coefficients computed from the contrast parameter of the gray-level co-occurrence matrix in each of the four directions. Let the contrast parameters of the four directional matrices be Di with mean value D̄ (i=1,2,3,4); then the weighting coefficient wi is computed as:
(2) Obtaining the four required texture feature parameters from the integrated gray-level co-occurrence matrix and the matrix element count: contrast, variance, energy, and mean;
(3) Normalizing the four texture feature parameters to obtain the final normalized texture feature values.
Based on the improved gray-level co-occurrence matrix method, this preferred embodiment computes the integrated gray-level co-occurrence matrix of the cell image using weighting coefficients and then extracts the texture features of the cell in the four specified directions. This solves the problem that external interference (such as the illumination angle or gas-flow disturbance during cell-image acquisition) causes the cell's texture feature parameters to differ considerably across directions, and improves the precision of cell-image texture feature extraction. The four selected texture features (contrast, variance, energy, and mean) eliminate redundant and duplicated feature parameters, and normalizing the four parameters facilitates the subsequent classification and recognition of cell images.
In this application scenario, with threshold T=20 and d=4, the image denoising effect shows a relative improvement of 8%, and the accuracy of cell-image feature extraction improves by 6%.
Application scenario 5
Referring to Fig. 1 and Fig. 2, a producing device of a high-density biochip according to an embodiment of this application scenario includes a cell recognition module and a spotting needle for making the biochip. The cell recognition module is used to determine the biological species. The needle body of the spotting needle is a polygonal prism; the end face of the needle tip is a plane with a circular recess formed at its center; the lower end of the needle body is axially provided with 2-6 gap channels, each of which meets the others at the axis, and the bottom end of each channel extends to the top of the needle-tip recess and communicates with it.
Preferably, a polygonal-prism-shaped protruding holder is provided on the tail-end side wall of the spotting needle.
This preferred embodiment facilitates handling of the spotting needle.
Preferably, the needle body of the spotting needle is 1.5-5 cm long and 0.8-3 cm in diameter; the end face of the needle tip is 8-400 μm in diameter; the central recess is 5-350 μm in diameter and 0.6-3.8 mm deep; the gap channels are 2-8 mm long and 15-300 μm wide.
This preferred embodiment is better suited to industrial production.
Preferably, the cell recognition module 1 includes a cell-image segmentation unit 11, a feature extraction unit 12, and a classification and recognition unit 13. The cell-image segmentation unit 11 distinguishes the background, nucleus, and cytoplasm in the cell image acquired by the cell-image acquisition module; the feature extraction unit 12 extracts the texture features of the cell image; the classification and recognition unit 13 uses a classifier to classify and recognize the cell image according to its texture features.
This preferred embodiment establishes the unit structure of the cell recognition module 1.
Preferably, the cell-image segmentation unit 11 includes an image conversion subunit, a noise-removal subunit, a coarse-segmentation subunit, a nucleus-center calibration subunit, and a precise-segmentation subunit, specifically:
(1) the image conversion subunit converts the acquired cell image into a gray-level image;
(2) the noise-removal subunit denoises the gray-level image, as follows:
For a pixel (x,y), choose its 3 × 3 neighborhood Sx,y and its (2N+1) × (2N+1) neighborhood Lx,y, where N is an integer greater than or equal to 2;
First judge whether the pixel is a boundary point: set a threshold T, T ∈ [13,26], compute the gray-level difference between pixel (x,y) and each pixel in its neighborhood Sx,y, and compare each difference with T. If at least 6 of the differences exceed T, pixel (x,y) is a boundary point; otherwise, pixel (x,y) is a non-boundary point;
If (x,y) is a boundary point, perform the following noise reduction:
In the formula, h(x,y) is the gray value of pixel (x,y) after noise reduction, q(x,y) is its gray value before noise reduction, σ is the standard deviation of the gray values within the neighborhood Lx,y of pixel (x,y), q(i,j) ∈ [q(x,y)-1.5σ, q(x,y)+1.5σ] denotes the points in Lx,y whose gray values fall within the interval [q(x,y)-1.5σ, q(x,y)+1.5σ], and k is the number of such points;
If (x,y) is a non-boundary point, perform the following noise reduction:
In the formula, h(x,y) is the gray value of pixel (x,y) after noise reduction, q(i,j) is the gray value of point (i,j) in the image, and w(i,j) is the Gaussian weight corresponding to point (i,j) within the neighborhood Lx,y;
(3) the coarse-segmentation subunit coarsely divides the denoised cell image into background, cytoplasm, and nucleus, as follows:
Each pixel (x,y) is represented by a four-dimensional feature vector:
In the formula, h(x,y) is the gray value of (x,y), have(x,y) is the gray mean of its neighborhood Sx,y, hmed(x,y) is the gray median of its neighborhood Sx,y, and hsta(x,y) is the gray variance of its neighborhood Sx,y;
The pixels are divided into three classes, background, cytoplasm, and nucleus, by K-means clustering;
(4) the nucleus-center calibration subunit calibrates the nucleus center:
The approximate nucleus region is obtained from the coarse-segmentation subunit. Suppose the nucleus region contains n points: (x1,y1), …, (xn,yn). Gray-weighted calibration and geometric-center calibration are performed on this region, and the average of the two results is taken as the nucleus center (xz,yz):
(5) the precise-segmentation subunit precisely segments the nucleus and cytoplasm;
Construct the directed line segment from the nucleus center (xz,yz) to a boundary point (xp,yp) of the nucleus or cytoplasm; its length is disp = ⌊√((xp-xz)²+(yp-yz)²)⌋, where ⌊·⌋ denotes rounding down;
Sampling along the line segment at unit length yields disp points (x1,y1), …, (xdisp,ydisp); if the coordinates of a sampled point are not integers, its gray value is obtained by linear interpolation of the surrounding pixels;
The gray-scale difference along the segment direction at point (xi,yi) is:
hd(xi,yi)=h(xi-1,yi-1)-h(xi,yi)
Define the gray-scale difference suppression function:
The gradient gra(xi,yi) along the segment direction at point (xi,yi) is:
The point of maximum gradient is chosen as the precise edge of the nucleus and cytoplasm.
This preferred embodiment provides a noise-removal subunit that effectively combines the spatial proximity and gray-level similarity of the center pixel and its neighborhood pixels for noise reduction: in flat regions of the image, where neighboring gray values differ little, a Gaussian filter applies weighted filtering to the gray values; in boundary regions with sharp gray-level changes, edge-preserving filtering is performed, which helps preserve image edges. K-means clustering extracts the coarse contours of the nucleus and cytoplasm, effectively removing noise interference. The nucleus-center calibration subunit facilitates the subsequent precise localization of the nucleus and cytoplasm contours. The precise-segmentation subunit makes full use of directional information, overcoming the interference of inflammatory cells with the edge map, and can accurately extract the nucleus and cytoplasm edges.
Preferably, the extraction of texture features from the cell image includes:
(1) Computing the integrated gray-level co-occurrence matrix of the cell image based on an improved gray-level co-occurrence matrix method; the integrated matrix captures the texture features of cells in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90°, and 135° be h(x,y,d,0°), h(x,y,d,45°), h(x,y,d,90°), and h(x,y,d,135°), with corresponding matrix element counts X1, X2, X3, X4. The integrated gray-level co-occurrence matrix is then computed as:
H(x,y,d)=w1h(x,y,d,0°)+w2h(x,y,d,45°)+w3h(x,y,d,90°)+w4h(x,y,d,135°)
The element count of the integrated gray-level co-occurrence matrix is:
In the formula, d denotes the distance, with d in the range [2,4]; wi (i=1,2,3,4) are weighting coefficients computed from the contrast parameter of the gray-level co-occurrence matrix in each of the four directions. Let the contrast parameters of the four directional matrices be Di with mean value D̄ (i=1,2,3,4); then the weighting coefficient wi is computed as:
(2) Obtaining the four required texture feature parameters from the integrated gray-level co-occurrence matrix and the matrix element count: contrast, variance, energy, and mean;
(3) Normalizing the four texture feature parameters to obtain the final normalized texture feature values.
Based on the improved gray-level co-occurrence matrix method, this preferred embodiment computes the integrated gray-level co-occurrence matrix of the cell image using weighting coefficients and then extracts the texture features of the cell in the four specified directions. This solves the problem that external interference (such as the illumination angle or gas-flow disturbance during cell-image acquisition) causes the cell's texture feature parameters to differ considerably across directions, and improves the precision of cell-image texture feature extraction. The four selected texture features (contrast, variance, energy, and mean) eliminate redundant and duplicated feature parameters, and normalizing the four parameters facilitates the subsequent classification and recognition of cell images.
In this application scenario, with threshold T=26 and d=2, the image denoising effect shows a relative improvement of 7.5%, and the accuracy of cell-image feature extraction improves by 8%.
Finally, it should be noted that the above embodiments merely illustrate the technical solutions of the present invention and do not limit its scope of protection. Although the present invention has been explained in detail with reference to preferred embodiments, those skilled in the art should understand that modifications or equivalent replacements may be made to the technical solutions of the present invention without departing from the essence and scope of those technical solutions.

Claims (3)

1. A producing device of a high-density biochip, characterized in that it includes a cell recognition module and a spotting needle for making the biochip, the cell recognition module being used to determine the biological species; the needle body of the spotting needle is a polygonal prism, the end face of the needle tip is a plane with a circular recess formed at its center, and the lower end of the needle body is axially provided with 2-6 gap channels, each of which meets the others at the axis, the bottom end of each channel extending to the top of the needle-tip recess and communicating with it;
The cell recognition module includes a cell-image segmentation unit, a feature extraction unit, and a classification and recognition unit; the cell-image segmentation unit distinguishes the background, nucleus, and cytoplasm in the cell image acquired by the cell-image acquisition module; the feature extraction unit extracts the texture features of the cell image; the classification and recognition unit uses a classifier to classify and recognize the cell image according to its texture features;
The cell-image segmentation unit includes an image conversion subunit, a noise-removal subunit, a coarse-segmentation subunit, a nucleus-center calibration subunit, and a precise-segmentation subunit, specifically:
(1) the image conversion subunit converts the acquired cell image into a gray-level image;
(2) the noise-removal subunit denoises the gray-level image, as follows:
For a pixel (x,y), choose its 3 × 3 neighborhood Sx,y and its (2N+1) × (2N+1) neighborhood Lx,y, where N is an integer greater than or equal to 2;
First judge whether the pixel is a boundary point: set a threshold T, T ∈ [13,26], compute the gray-level difference between pixel (x,y) and each pixel in its neighborhood Sx,y, and compare each difference with T. If at least 6 of the differences exceed T, pixel (x,y) is a boundary point; otherwise, pixel (x,y) is a non-boundary point;
If (x,y) is a boundary point, perform the following noise reduction:
In the formula, h(x,y) is the gray value of pixel (x,y) after noise reduction, q(x,y) is its gray value before noise reduction, σ is the standard deviation of the gray values within the neighborhood Lx,y of pixel (x,y), q(i,j) ∈ [q(x,y)-1.5σ, q(x,y)+1.5σ] denotes the points in Lx,y whose gray values fall within the interval [q(x,y)-1.5σ, q(x,y)+1.5σ], and k is the number of such points;
If (x,y) is a non-boundary point, perform the following noise reduction:
In the formula, h(x,y) is the gray value of pixel (x,y) after noise reduction, q(i,j) is the gray value of point (i,j) in the image, and w(i,j) is the Gaussian weight corresponding to point (i,j) within the neighborhood Lx,y;
(3) the coarse-segmentation subunit coarsely divides the denoised cell image into background, cytoplasm, and nucleus, as follows:
Each pixel (x,y) is represented by a four-dimensional feature vector:
In the formula, h(x,y) is the gray value of (x,y), have(x,y) is the gray mean of its neighborhood Sx,y, hmed(x,y) is the gray median of its neighborhood Sx,y, and hsta(x,y) is the gray variance of its neighborhood Sx,y;
The pixels are divided into three classes, background, cytoplasm, and nucleus, by K-means clustering;
(4) the nucleus-center calibration subunit calibrates the nucleus center:
The approximate nucleus region is obtained from the coarse-segmentation subunit. Suppose the nucleus region contains n points: (x1,y1), …, (xn,yn). Gray-weighted calibration and geometric-center calibration are performed on this region, and the average of the two results is taken as the nucleus center (xz,yz):
(5) the precise-segmentation subunit precisely segments the nucleus and cytoplasm;
Construct the directed line segment from the nucleus center (xz,yz) to a boundary point (xp,yp) of the nucleus or cytoplasm; its length is disp = ⌊√((xp-xz)²+(yp-yz)²)⌋, where ⌊·⌋ denotes rounding down;
Sampling along the line segment at unit length yields disp points (x1,y1), …, (xdisp,ydisp); if the coordinates of a sampled point are not integers, its gray value is obtained by linear interpolation of the surrounding pixels;
The gray-scale difference along the segment direction at point (xi,yi) is:
hd(xi,yi)=h(xi-1,yi-1)-h(xi,yi)
Define the gray-scale difference suppression function:
The gradient gra(xi,yi) along the segment direction at point (xi,yi) is:
The point of maximum gradient is chosen as the precise edge of the nucleus and cytoplasm.
2. The producing device of a high-density biochip according to claim 1, characterized in that a polygonal-prism-shaped protruding holder is provided on the tail-end side wall of the spotting needle.
3. The producing device of a high-density biochip according to claim 2, characterized in that the needle body of the spotting needle is 1.5-5 cm long and 0.8-3 cm in diameter; the end face of the needle tip is 8-400 μm in diameter; the central recess is 5-350 μm in diameter and 0.6-3.8 mm deep; and the gap channels are 2-8 mm long and 15-300 μm wide.
CN201610768011.3A 2016-08-30 2016-08-30 A kind of producing device of high-density biochip Active CN106244420B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610768011.3A CN106244420B (en) 2016-08-30 2016-08-30 A kind of producing device of high-density biochip


Publications (2)

Publication Number Publication Date
CN106244420A CN106244420A (en) 2016-12-21
CN106244420B true CN106244420B (en) 2018-11-23





Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20181015

Address after: 225400 the south side of Wenchang Road, Taixing hi tech Industrial Development Zone, Taizhou, Jiangsu.

Applicant after: Taizhou Longze Environmental Technology Co., Ltd.

Address before: 315200 No. 555 north tunnel road, Zhenhai District, Ningbo, Zhejiang

Applicant before: Meng Ling

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant