CN106361280A - Optical imaging system used by a bio-optical imaging device combining iris and cortical tissue - Google Patents


Info

Publication number
CN106361280A
CN106361280A (application CN201610777489.2A)
Authority
CN
China
Prior art keywords
optical imaging
combined
image
light
gray
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610777489.2A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed (不公告发明人)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201610777489.2A
Publication of CN106361280A
Legal status: Pending

Classifications

    • A61B 5/117 Identification of persons
    • A61B 5/0059 Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7203 Signal processing for noise prevention, reduction or removal
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • G06F 18/23213 Non-hierarchical clustering techniques using statistics or function optimisation with a fixed number of clusters, e.g. K-means clustering
    • G06F 18/24 Classification techniques
    • G06V 10/30 Noise filtering
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components

Abstract

The invention discloses an optical imaging system used by a bio-optical imaging device combining iris and cortical tissue. The optical imaging system comprises a cell recognition module, used to determine the biological type, and an optical imaging apparatus consisting of an illumination light source unit and an imaging unit. The benefits of the optical imaging system are that it is suitable for the general population and meets the needs of nationwide large-scale deployment; it performs highly reliable biological tissue liveness detection and analysis, ensuring the reliability of the biometric system; and its image-analysis-based assay method improves the accuracy and reliability of the biometric system.

Description

Optical imaging device used by a bio-optical imaging device combining iris and cortical tissue
Technical field
The present invention relates to the field of optical imaging, and in particular to an optical imaging device used by a bio-optical imaging device combining iris and cortical tissue.
Background technology
At present, it is desirable to improve the performance of biometric systems, including: solving the errors produced when heterophoria deforms the iris and makes it non-circular; finding better representations of the feature image; further improving the information entropy of feature extraction and encoding; and improving the accuracy and reliability of template estimation and of the FAR/FRR (false accept rate / false reject rate).
Summary of the invention
To solve the above problems, the present invention provides an optical imaging device for use in a bio-optical imaging device combining iris and cortical tissue.
The purpose of the present invention is achieved through the following technical solutions:
An optical imaging device used by a bio-optical imaging device combining iris and cortical tissue comprises a cell recognition module and an optical imaging apparatus. The cell recognition module is used to determine the biological type; the optical imaging apparatus consists of an illumination light source unit and an imaging unit, which together are configured as a multispectral, multi-polarization-state combined optical imaging system. The system produces at least four combined imaging modes: ultraviolet light with orthogonal polarization, ultraviolet light with parallel polarization, near-infrared light with orthogonal polarization, and near-infrared light with parallel polarization.
The multispectral, multi-polarization-state combined optical imaging system comprises:
an illumination light source unit consisting of at least an ultraviolet light source, a near-infrared light source, and a polarizer;
an imaging unit consisting of at least an ultraviolet imaging light path, a near-infrared imaging light path, and an analyzer.
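As a minimal illustration of the four combined imaging modes described above, the pairing of spectral bands and polarization states can be sketched as follows. The band names, wavelength ranges, and data structures here are illustrative assumptions, not part of the patent's claims:

```python
from itertools import product

# Spectral bands (nm) and polarization states taken from the description;
# the dictionary/tuple representation is an illustrative assumption.
BANDS = {"ultraviolet": (300, 500), "near_infrared": (700, 900)}
POLARIZATIONS = ("orthogonal", "parallel")

def imaging_modes():
    """List every (band, polarization) combined imaging mode."""
    return [(band, pol) for band, pol in product(BANDS, POLARIZATIONS)]

for band, pol in imaging_modes():
    print(f"{band} light with {pol} polarization")
```

Adding the optional visible band (500-700 nm, 45-degree polarization) from the later preferred embodiment would simply extend the two tables.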
The benefits of the invention are: suitability for the general population, meeting the needs of nationwide large-scale deployment; highly reliable biological tissue liveness detection and analysis, ensuring the reliability of the biometric system itself; and an image-analysis-based assay method that improves the accuracy and reliability of the biometric system.
Brief description of the drawings
The invention is further described below with reference to the accompanying drawings, but the embodiments in the drawings do not limit the invention in any way; a person of ordinary skill in the art can obtain other drawings from the following drawings without creative effort.
Fig. 1 is a structural schematic diagram of the present invention;
Fig. 2 is a structural schematic diagram of the cell recognition module.
Reference numerals:
Cell recognition module 1, cell image segmentation unit 11, feature extraction unit 12, classification and identification unit 13.
Specific embodiment
The invention is further described below in conjunction with the following application scenarios.
Application scenario 1
Referring to Figs. 1 and 2, an optical imaging device used by a bio-optical imaging device combining iris and cortical tissue according to an embodiment of this application scenario comprises a cell recognition module and an optical imaging apparatus. The cell recognition module is used to determine the biological type; the optical imaging apparatus consists of an illumination light source unit and an imaging unit, which together are configured as a multispectral, multi-polarization-state combined optical imaging system. The system produces at least four combined imaging modes: ultraviolet light with orthogonal polarization, ultraviolet light with parallel polarization, near-infrared light with orthogonal polarization, and near-infrared light with parallel polarization.
The multispectral, multi-polarization-state combined optical imaging system comprises:
an illumination light source unit consisting of at least an ultraviolet light source, a near-infrared light source, and a polarizer;
an imaging unit consisting of at least an ultraviolet imaging light path, a near-infrared imaging light path, and an analyzer.
Preferably, the illumination light source unit and the imaging unit are configured so that the wavelength range of at least the ultraviolet light is 300-500 nm and that of the near-infrared light is 700-900 nm, and the polarizer and analyzer are configured to provide at least orthogonal and parallel polarization states.
This preferred embodiment gives higher image quality.
Preferably, the illumination light source unit and the imaging unit of the multispectral, multi-polarization-state combined optical imaging system are further configured to add a visible wavelength range of 500-700 nm, with the polarizer and analyzer configured to a 45-degree polarization state.
This preferred embodiment gives a wider imaging range.
Preferably, the cell recognition module 1 comprises a cell image segmentation unit 11, a feature extraction unit 12, and a classification and identification unit 13. The cell image segmentation unit 11 distinguishes the background, nucleus, and cytoplasm in the cell image collected by the cell image acquisition module; the feature extraction unit 12 extracts the texture features of the cell image; the classification and identification unit 13 uses a classifier to classify and identify the cell image according to its texture features.
This preferred embodiment establishes the unit structure of the cell recognition module 1.
Preferably, the cell image segmentation unit 11 comprises an image conversion subunit, a noise removal subunit, a coarse segmentation subunit, a nucleus center calibration subunit, and an accurate segmentation subunit, specifically:
(1) The image conversion subunit converts the collected cell image into a grayscale image;
(2) The noise removal subunit denoises the grayscale image as follows:
For a pixel (x, y), take its 3 × 3 neighborhood s_{x,y} and its (2n+1) × (2n+1) neighborhood l_{x,y}, where n is an integer greater than or equal to 2;
First determine whether the pixel is a boundary point. Set a threshold t, t ∈ [13, 26]. Compute the gray-level difference between pixel (x, y) and each pixel in its neighborhood s_{x,y} and compare it with t; if the number of differences exceeding t is greater than or equal to 6, pixel (x, y) is a boundary point, otherwise it is a non-boundary point;
If (x, y) is a boundary point, apply the following noise reduction:

$$h(x,y) = \frac{1}{k} \sum_{q(i,j) \in [q(x,y) - 1.5\sigma,\; q(x,y) + 1.5\sigma]} q(i,j)$$

where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(x, y) is the gray value of pixel (x, y) before noise reduction, σ is the standard deviation of the gray values in the neighborhood l_{x,y} of (x, y), q(i, j) ∈ [q(x, y) − 1.5σ, q(x, y) + 1.5σ] denotes the points of l_{x,y} whose gray values fall within that interval, and k is the number of such points;
If (x, y) is a non-boundary point, apply the following noise reduction:

$$h(x,y) = \frac{\sum_{(i,j) \in l_{x,y}} w(i,j)\, q(i,j)}{\sum_{(i,j) \in l_{x,y}} w(i,j)}$$

where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(i, j) is the gray value of point (i, j) in the image, and w(i, j) is the Gaussian weight corresponding to point (i, j) in the neighborhood l_{x,y};
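A compact sketch of the two-branch denoising rule above, assuming an interior pixel and a Gaussian spatial kernel with unit standard deviation (both assumptions, since the description does not fix the kernel parameters; function names are illustrative):

```python
import numpy as np

def denoise_pixel(img, x, y, t=13, n=2):
    """Edge-aware denoising of one interior pixel: boundary points get a
    trimmed mean over l_{x,y}, non-boundary points a Gaussian-weighted mean."""
    s = img[x-1:x+2, y-1:y+2].astype(float)      # 3x3 neighborhood s_{x,y}
    L = img[x-n:x+n+1, y-n:y+n+1].astype(float)  # (2n+1)x(2n+1) neighborhood l_{x,y}
    q = float(img[x, y])
    # boundary test: at least 6 neighbors differ from q by more than t
    if np.sum(np.abs(s - q) > t) >= 6:
        sd = L.std()
        mask = (L >= q - 1.5 * sd) & (L <= q + 1.5 * sd)
        return float(L[mask].mean())             # trimmed mean keeps the edge
    xs, ys = np.mgrid[-n:n+1, -n:n+1]
    w = np.exp(-(xs**2 + ys**2) / 2.0)           # Gaussian weights, sigma = 1
    return float((w * L).sum() / w.sum())

flat = np.full((7, 7), 50.0)
print(round(denoise_pixel(flat, 3, 3), 9))  # 50.0: a flat region is unchanged
```

A flat region passes through the Gaussian branch unchanged, while an isolated bright pixel is classified as a boundary point and averaged only over values near its own gray level.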
(3) The coarse segmentation subunit roughly partitions the denoised cell image into background, cytoplasm, and nucleus, specifically:
Each pixel (x, y) is represented by a four-dimensional feature vector:

$$\vec{u}(x,y) = \left[ h(x,y),\; h_{ave}(x,y),\; h_{med}(x,y),\; h_{sta}(x,y) \right]$$

where h(x, y) is the gray value of (x, y), h_{ave}(x, y) is the gray mean of its neighborhood s_{x,y}, h_{med}(x, y) is the gray median of s_{x,y}, and h_{sta}(x, y) is the gray variance of s_{x,y};
The pixels are then divided into three classes (background, cytoplasm, and nucleus) using k-means clustering;
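The 4-D feature vector and the three-way clustering can be sketched with a minimal k-means loop. This is a simplified stand-in for whatever k-means implementation the text assumes, and the function names are illustrative:

```python
import numpy as np

def pixel_features(img, x, y):
    """4-D feature vector [gray, neighborhood mean, median, variance]
    over the 3x3 neighborhood s_{x,y} of an interior pixel."""
    s = img[x-1:x+2, y-1:y+2].astype(float)
    return np.array([float(img[x, y]), s.mean(), float(np.median(s)), s.var()])

def kmeans(X, k=3, iters=20, seed=0):
    """Plain Lloyd's algorithm: assign to nearest center, recompute means."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers
```

In practice every pixel's feature vector would be stacked into X and the three resulting clusters mapped to background, cytoplasm, and nucleus by mean gray level.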
(4) The nucleus center calibration subunit calibrates the nucleus center:
The approximate nucleus region is obtained from the coarse segmentation subunit. Suppose the nucleus region contains n points (x_1, y_1), …, (x_n, y_n). An intensity-weighted centroid and a geometric centroid are both computed for this region, and their average is taken as the nucleus center (x_z, y_z):

$$x_z = \frac{1}{2} \left( \frac{\sum_{i=1}^{n} x_i\, h(x_i, y_i)}{\sum_{i=1}^{n} h(x_i, y_i)} + \frac{\sum_{i=1}^{n} x_i}{n} \right)$$

$$y_z = \frac{1}{2} \left( \frac{\sum_{i=1}^{n} y_i\, h(x_i, y_i)}{\sum_{i=1}^{n} h(x_i, y_i)} + \frac{\sum_{i=1}^{n} y_i}{n} \right)$$
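The averaged-centroid formulas amount to the following direct transcription (the list-of-points input representation is an assumption):

```python
import numpy as np

def nucleus_center(points, gray):
    """Average of the intensity-weighted centroid and the geometric
    centroid of the candidate nucleus region."""
    pts = np.asarray(points, dtype=float)   # shape (n, 2): the (x_i, y_i)
    g = np.asarray(gray, dtype=float)       # h(x_i, y_i) for each point
    weighted = (pts * g[:, None]).sum(axis=0) / g.sum()
    geometric = pts.mean(axis=0)
    return (weighted + geometric) / 2.0

# Two points of equal gray value: both centroids coincide at the midpoint.
print(nucleus_center([(0, 0), (4, 2)], [10, 10]))  # [2. 1.]
```

Averaging the two centroids damps the pull that a few very bright pixels would otherwise exert on a purely intensity-weighted center.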
(5) The accurate segmentation subunit precisely segments the nucleus and cytoplasm:
Construct the directed line segment from the nucleus center (x_z, y_z) to a boundary point (x_p, y_p) between nucleus and cytoplasm, with length dis_p = ⌊√((x_p − x_z)² + (y_p − y_z)²)⌋, where ⌊·⌋ denotes rounding down;
Sampling along the segment at unit length yields dis_p points; if the coordinates of a sample point are not integers, its gray value is obtained by linear interpolation from the surrounding pixels;
The gray difference at point (x_i, y_i) along the segment direction is:

$$h_d(x_i, y_i) = h(x_{i-1}, y_{i-1}) - h(x_i, y_i)$$

Define the gray-difference suppression function:

$$y(x) = \begin{cases} x & \text{if } x \le 0 \\ 0.5x & \text{if } x > 0 \end{cases}$$
The gradient gra(x_i, y_i) at point (x_i, y_i) along the segment direction is:

$$gra(x_i, y_i) = \frac{\left| y(h_d(x_i, y_i)) \right| + \left| y(h_d(x_{i+1}, y_{i+1})) \right|}{2}$$

The point with the maximum gradient value is chosen as the precise edge between nucleus and cytoplasm.
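The radial edge search can be sketched as follows, operating on a 1-D gray profile sampled along the directed segment. The profile-based formulation is an illustrative simplification of the 2-D sampling and interpolation described above:

```python
import numpy as np

def suppress(x):
    """Gray-difference suppression y(x): full weight for x <= 0, half for x > 0."""
    return x if x <= 0 else 0.5 * x

def edge_index(profile):
    """Index of the maximum-gradient sample along a radial gray profile,
    using hd(i) = h(i-1) - h(i) and the two-sample averaged gradient."""
    h = np.asarray(profile, dtype=float)
    hd = h[:-1] - h[1:]                       # hd at sample positions 1..len-1
    y = np.array([abs(suppress(d)) for d in hd])
    gra = (y[:-1] + y[1:]) / 2.0              # gradient at positions 1..len-2
    return int(np.argmax(gra)) + 1            # shift back to profile indexing
```

Because the suppression halves positive differences (bright-to-dark steps going outward), the search favors the dark-to-bright transition at the nucleus boundary over brighter inflammatory-cell clutter.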
This preferred embodiment provides the noise removal subunit, which effectively combines the spatial proximity and gray similarity between the center pixel and its neighborhood pixels for noise reduction. In flat regions of the image, where neighborhood gray values differ little, Gaussian filtering is applied as a weighted average of gray values; in regions of sharp change, edge-preserving filtering is applied, which helps preserve image edges. K-means clustering extracts the coarse contours of nucleus and cytoplasm and effectively removes noise interference. The nucleus center calibration subunit facilitates the subsequent precise localization of the nucleus and cytoplasm contours. The accurate segmentation subunit makes full use of directional information, overcomes the interference of inflammatory cells on the edge map, and accurately extracts the nucleus and cytoplasm edges.
Preferably, extracting the texture features of the cell image comprises:
(1) Computing the comprehensive gray co-occurrence matrix of the cell image by an improved gray-level co-occurrence matrix method; the comprehensive matrix reflects the texture characteristics of cells in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90°, and 135° be h(x, y, d, 0°), h(x, y, d, 45°), h(x, y, d, 90°), and h(x, y, d, 135°), with corresponding matrix elements x_1, x_2, x_3, x_4. The comprehensive gray co-occurrence matrix is then computed as:

$$h(x, y, d) = w_1 h(x, y, d, 0°) + w_2 h(x, y, d, 45°) + w_3 h(x, y, d, 90°) + w_4 h(x, y, d, 135°)$$
The elements of the comprehensive gray co-occurrence matrix are:

$$x = \sum_{i=1}^{4} w_i x_i$$

where d is the distance, with values in [2, 4], and w_i (i = 1, 2, 3, 4) are weight coefficients computed from the contrast parameters of the gray-level co-occurrence matrices in the four directions. Let the contrast parameter of the matrix in each direction be d_i (i = 1, 2, 3, 4), with mean d̄; the weight coefficient w_i is then computed as:

$$w_i = \frac{1}{|d_i - \bar{d}| + 1} \Big/ \sum_{i=1}^{4} \frac{1}{|d_i - \bar{d}| + 1}$$
(2) Obtaining the four required texture feature parameters from the comprehensive gray co-occurrence matrix and its elements: contrast, sum of variance, energy, and mean;
(3) Normalizing the four texture feature parameters to obtain the normalized texture feature values.
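The weighting scheme for combining the four directional matrices can be sketched directly from the formula above (the contrast values and matrices here are toy inputs, and the function names are illustrative):

```python
import numpy as np

def direction_weights(contrasts):
    """w_i proportional to 1 / (|d_i - mean(d)| + 1), normalized to sum to 1:
    directions whose contrast sits near the mean get the larger weights."""
    d = np.asarray(contrasts, dtype=float)
    inv = 1.0 / (np.abs(d - d.mean()) + 1.0)
    return inv / inv.sum()

def combined_glcm(glcms, contrasts):
    """Weighted sum of the 0/45/90/135-degree co-occurrence matrices."""
    w = direction_weights(contrasts)
    return sum(wi * g for wi, g in zip(w, glcms))
```

Down-weighting the direction whose contrast deviates most from the mean is what makes the combined matrix less sensitive to direction-dependent disturbances such as illumination angle.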
Based on the improved gray-level co-occurrence matrix method, this preferred embodiment computes the comprehensive gray co-occurrence matrix of the cell image using weight coefficients and then extracts the texture features of cells in the four specified directions. This solves the problem that external interference (such as the illumination angle during cell image capture or disturbance from gas flow) makes the texture feature values of cells differ greatly in different directions, and improves the precision of cell image texture feature extraction. The four selected texture features (contrast, sum of variance, energy, and mean) eliminate redundant and repeated feature parameters. Normalizing the four parameters facilitates the subsequent classification and identification of cell images.
In this application scenario, with threshold t = 13 and d = 2, the image denoising effect is relatively improved by 5% and the extraction accuracy of cell image features by 8%.
Application scenario 2
Referring to Figs. 1 and 2, this application scenario is identical in structure and processing to Application scenario 1 described above. In this application scenario, with threshold t = 15 and d = 2, the image denoising effect is relatively improved by 6% and the extraction accuracy of cell image features by 8%.
Application scenarios 3
Referring to Fig. 1 and Fig. 2, an embodiment of this application scenario provides an optical imaging system adopted by a biological optical imaging device of combined iris and cortical tissues, comprising a cell recognition module and an optical imaging device. The cell recognition module is used to determine the biological species. The optical imaging device consists of a light-source illumination unit and an imaging unit, which are configured together as a multi-spectral, multi-polarization-state combined optical imaging system. This combined optical imaging system produces at least four combined imagings, comprising: invisible light with orthogonal polarization states, invisible light with parallel polarization states, near-infrared light with orthogonal polarization states, and near-infrared light with parallel polarization states.
The multi-spectral, multi-polarization-state combined optical imaging system comprises:
a light-source illumination unit, consisting of at least an invisible-light source, a near-infrared light source, and a polarizer;
an imaging unit, consisting of at least an invisible-light imaging light path, a near-infrared imaging light path, and an analyzer.
Preferably, the light-source illumination unit and the imaging unit are jointly configured so that the wavelength range of the invisible light is at least 300-500 nm and that of the near-infrared light is 700-900 nm, and the polarizer and analyzer are jointly configured to provide at least orthogonal and parallel polarization states.
This preferred embodiment yields higher imaging quality.
Preferably, in the multi-spectral, multi-polarization-state combined optical imaging system, the light-source illumination unit and the imaging unit are further configured to add a visible-light wavelength range of 500-700 nm, with the polarizer and analyzer jointly configured to a 45-degree polarization state.
This preferred embodiment provides a wider imaging range.
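As a minimal sketch, the at-least-four combined imagings can be enumerated by pairing each spectral band with each polarization state. The band labels and the Python representation below are illustrative, not identifiers from the patent; the wavelength ranges follow the preferred embodiment.

```python
from itertools import product

# Illustrative labels; wavelength ranges (nm) follow the preferred embodiment.
bands = {"invisible": (300, 500), "near_infrared": (700, 900)}
polarization_states = ("orthogonal", "parallel")

# Every band paired with every polarization state gives the four combined
# imagings named in the text.
modes = [(band, state) for band, state in product(bands, polarization_states)]
```

Adding the visible band (500-700 nm) and the 45-degree polarization state of the further preferred embodiment would simply extend `bands` and `polarization_states`.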
Preferably, the cell recognition module 1 includes a cell image segmentation unit 11, a feature extraction unit 12 and a classification and identification unit 13. The cell image segmentation unit 11 distinguishes the background, nucleus and cytoplasm in the cell image gathered by the cell image acquisition module; the feature extraction unit 12 extracts the textural features of the cell image; the classification and identification unit 13 uses a classifier to classify and identify the cell image according to its textural features.
This preferred embodiment sets out the unit structure of the cell recognition module 1.
Preferably, the cell image segmentation unit 11 includes an image conversion subunit, a noise removal subunit, a coarse segmentation subunit, a nucleus-center calibration subunit and a precise segmentation subunit, specifically:
(1) the image conversion subunit converts the collected cell image into a gray-level image;
(2) the noise removal subunit denoises the gray-level image as follows:
for a pixel (x, y), take its 3 × 3 neighborhood s_{x,y} and its (2n+1) × (2n+1) neighborhood l_{x,y}, where n is an integer greater than or equal to 2;
first judge whether the pixel is a boundary point: set a threshold t, t ∈ [13, 26], compute the gray difference between pixel (x, y) and each pixel in its neighborhood s_{x,y}, and compare these differences with t; if the number of gray differences greater than t is at least 6, pixel (x, y) is a boundary point; otherwise it is a non-boundary point;
if (x, y) is a boundary point, apply the following noise reduction:
h(x, y) = (1/k) · Σ q(i, j), summed over q(i, j) ∈ [q(x, y) − 1.5σ, q(x, y) + 1.5σ]
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(x, y) is its gray value before noise reduction, σ is the standard deviation of gray values within the neighborhood l_{x,y} of pixel (x, y), q(i, j) ∈ [q(x, y) − 1.5σ, q(x, y) + 1.5σ] denotes the points of l_{x,y} whose gray value falls within that interval, and k is the number of such points;
if (x, y) is a non-boundary point, apply the following noise reduction:
h(x, y) = Σ_{(i,j) ∈ l_{x,y}} w(i, j) q(i, j) / Σ_{(i,j) ∈ l_{x,y}} w(i, j)
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(i, j) is the gray value of point (i, j) in the image, and w(i, j) is the Gaussian weight corresponding to point (i, j) in l_{x,y};
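The two-branch noise-reduction rule above (sigma-clipped mean for boundary points, Gaussian-weighted mean for flat regions) can be sketched in Python roughly as follows. The reflect padding and the Gaussian sigma of 1 are our own assumptions; the text does not specify them.

```python
import numpy as np

def denoise(img, t=15, n=2):
    # Sketch of the two-branch filter: boundary pixels get a sigma-clipped
    # mean over l_{x,y}; other pixels a Gaussian-weighted mean.
    img = img.astype(float)
    r = n
    pad = np.pad(img, r, mode="reflect")
    ax = np.arange(-r, r + 1)
    gx, gy = np.meshgrid(ax, ax)
    gauss = np.exp(-(gx ** 2 + gy ** 2) / 2.0)  # Gauss weights w(i, j)
    out = np.empty_like(img)
    H, W = img.shape
    for y in range(H):
        for x in range(W):
            q = img[y, x]
            l = pad[y:y + 2 * r + 1, x:x + 2 * r + 1]          # l_{x,y}
            s = pad[y + r - 1:y + r + 2, x + r - 1:x + r + 2]  # s_{x,y}
            if np.sum(np.abs(s - q) > t) >= 6:                 # boundary point
                sigma = l.std()
                sel = l[np.abs(l - q) <= 1.5 * sigma]
                out[y, x] = sel.mean() if sel.size else q
            else:                                              # flat region
                out[y, x] = (gauss * l).sum() / gauss.sum()
    return out
```

On a perfectly flat image the boundary test never fires, and the Gaussian-weighted mean leaves gray values unchanged, which matches the edge-preserving intent described here.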
(3) the coarse segmentation subunit roughly partitions the denoised cell image into background, cytoplasm and nucleus, specifically:
each pixel (x, y) is represented by a four-dimensional feature vector:
u⃗(x, y) = [h(x, y), h_ave(x, y), h_med(x, y), h_sta(x, y)]
where h(x, y) is the gray value of (x, y), and h_ave(x, y), h_med(x, y) and h_sta(x, y) are the gray mean, gray median and gray variance of its neighborhood s_{x,y};
the pixels are then divided into the three classes of background, cytoplasm and nucleus by k-means clustering;
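A minimal sketch of this coarse segmentation step, assuming a plain-numpy k-means (the text does not name an implementation) and 3 × 3 neighborhoods for the feature statistics:

```python
import numpy as np

def coarse_segment(img, n_clusters=3, iters=20, seed=0):
    # Each pixel gets the 4-D feature [gray, 3x3 mean, 3x3 median, 3x3
    # variance], then k-means with k = 3 (background/cytoplasm/nucleus).
    img = img.astype(float)
    pad = np.pad(img, 1, mode="reflect")
    H, W = img.shape
    feats = np.empty((H * W, 4))
    idx = 0
    for y in range(H):
        for x in range(W):
            s = pad[y:y + 3, x:x + 3]
            feats[idx] = [img[y, x], s.mean(), np.median(s), s.var()]
            idx += 1
    rng = np.random.default_rng(seed)
    centers = feats[rng.choice(H * W, n_clusters, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(feats[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):  # keep old center if a cluster empties
                centers[k] = feats[labels == k].mean(axis=0)
    return labels.reshape(H, W)
```

Cluster indices are arbitrary; which label corresponds to the nucleus would have to be decided afterwards, e.g. by mean gray value.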
(4) the nucleus-center calibration subunit calibrates the nucleus center:
the approximate nucleus region is obtained from the coarse segmentation subunit; supposing the nucleus region contains n points (x_1, y_1), …, (x_n, y_n), both an intensity-weighted calibration and a geometric-center calibration are performed on this region, and their average is taken as the nucleus center (x_z, y_z):
x_z = (1/2) [ Σ_{i=1}^{n} x_i h(x_i, y_i) / Σ_{i=1}^{n} h(x_i, y_i) + (1/n) Σ_{i=1}^{n} x_i ]
y_z = (1/2) [ Σ_{i=1}^{n} y_i h(x_i, y_i) / Σ_{i=1}^{n} h(x_i, y_i) + (1/n) Σ_{i=1}^{n} y_i ]
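The two calibrations and their average can be expressed compactly; the function name and array layout are illustrative:

```python
import numpy as np

def nucleus_center(points, gray):
    # Average of the intensity-weighted centroid and the geometric centroid,
    # per the two formulas above. `points` is an (n, 2) array of (x_i, y_i);
    # `gray` holds the matching h(x_i, y_i) values.
    points = np.asarray(points, dtype=float)
    gray = np.asarray(gray, dtype=float)
    weighted = (points * gray[:, None]).sum(axis=0) / gray.sum()
    geometric = points.mean(axis=0)
    return 0.5 * (weighted + geometric)
```

With uniform gray values the two centroids coincide, so the result reduces to the geometric center, as the formulas imply.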
(5) the precise segmentation subunit precisely segments the nucleus and cytoplasm:
construct the directed line segment from the nucleus center (x_z, y_z) to a boundary point (x_p, y_p) between nucleus and cytoplasm, with length dis_p = ⌊√((x_p − x_z)² + (y_p − y_z)²)⌋, where ⌊·⌋ denotes rounding down;
sampling along the line segment at unit length yields dis_p points (x_i, y_i), i = 1, …, dis_p; if a sample point's coordinates are not integers, its gray value is obtained by linear interpolation from the surrounding pixels;
the gray difference at point (x_i, y_i) along the direction of the line segment is:
h_d(x_i, y_i) = h(x_{i−1}, y_{i−1}) − h(x_i, y_i)
define the gray-difference suppression function:
y(x) = x, if x ≤ 0; y(x) = 0.5x, if x > 0
the gradient gra(x_i, y_i) at point (x_i, y_i) along the direction of the line segment is:
gra(x_i, y_i) = ( |y(h_d(x_i, y_i))| + |y(h_d(x_{i+1}, y_{i+1}))| ) / 2
the maximum-gradient point is chosen as the precise edge between nucleus and cytoplasm.
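A rough sketch of locating the precise edge on one sampled ray, following the gray-difference, suppression-function and gradient formulas above; the indexing convention for the averaged gradient is our own:

```python
import numpy as np

def edge_along_ray(profile):
    # `profile` holds the gray values h at the dis_p unit-length samples
    # taken from the nucleus centre toward a boundary point.
    profile = np.asarray(profile, dtype=float)
    hd = profile[:-1] - profile[1:]               # h(x_{i-1}) - h(x_i)
    y = np.where(hd <= 0, hd, 0.5 * hd)           # suppression function y(x)
    gra = 0.5 * (np.abs(y[:-1]) + np.abs(y[1:]))  # gra(x_i)
    return int(gra.argmax()) + 1                  # index of the edge sample
```

The full subunit would repeat this for many rays around the nucleus center and join the per-ray edge points into a closed contour.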
This preferred embodiment provides a noise removal subunit that effectively combines the spatial proximity and gray similarity between the center pixel and its neighborhood pixels for noise reduction. In flat regions of the image, where neighboring gray values differ little, a Gaussian filter performs weighted gray-value filtering; in sharply varying boundary regions, boundary-preserving filtering is performed, which helps preserve image edges. Extracting the coarse contours of the nucleus and cytoplasm by k-means clustering effectively removes the interference of noise. The nucleus-center calibration subunit facilitates the subsequent precise positioning of the nucleus and cytoplasm contours. The precise segmentation subunit makes full use of directional information, overcomes the interference of inflammatory cells with the edge map, and accurately extracts the nucleus and cytoplasm edges.
Preferably, the textural features of the cell image are extracted as follows:
(1) the combined gray-level co-occurrence matrix of the cell image is computed by the improved gray-level co-occurrence matrix method; this combined matrix embodies the textural features of the cell in different directions:
let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90° and 135° be h(x, y, d, 0°), h(x, y, d, 45°), h(x, y, d, 90°) and h(x, y, d, 135°), with corresponding matrix element counts x_1, x_2, x_3, x_4; the combined gray-level co-occurrence matrix is then computed as:
h(x, y, d) = w_1 h(x, y, d, 0°) + w_2 h(x, y, d, 45°) + w_3 h(x, y, d, 90°) + w_4 h(x, y, d, 135°)
and the element count of the combined gray-level co-occurrence matrix is:
x = Σ_{i=1}^{4} w_i x_i
where d is the distance, with value range [2, 4], and w_i (i = 1, 2, 3, 4) are weight coefficients computed from the contrast parameter of the gray-level co-occurrence matrix in each of the four directions; letting the contrast parameters of the four directional matrices be d_i (i = 1, 2, 3, 4) with mean d̄, the weight coefficients are computed as:
w_i = (1 / (|d_i − d̄| + 1)) / Σ_{i=1}^{4} (1 / (|d_i − d̄| + 1))
(2) the four required textural feature parameters are obtained from the combined gray-level co-occurrence matrix and its element count: contrast, variance sum, energy and mean;
(3) the four textural feature parameters are normalized, finally yielding normalized textural feature values.
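A sketch of the weighted-GLCM computation under stated assumptions: quantization to 8 gray levels, out-of-image pixel pairs skipped, and contrast defined as Σ(i − j)²·p(i, j). None of these details are fixed by the text.

```python
import numpy as np

def combined_glcm(img, d=2, levels=8):
    # One normalized co-occurrence matrix per direction at distance d, each
    # weighted by closeness of its contrast to the mean contrast (the w_i
    # formula above).
    img = (img.astype(float) / img.max() * (levels - 1)).astype(int)
    offsets = [(0, d), (-d, d), (-d, 0), (-d, -d)]  # 0, 45, 90, 135 degrees
    H, W = img.shape
    mats, contrasts = [], []
    for dy, dx in offsets:
        m = np.zeros((levels, levels))
        for y in range(H):
            for x in range(W):
                y2, x2 = y + dy, x + dx
                if 0 <= y2 < H and 0 <= x2 < W:
                    m[img[y, x], img[y2, x2]] += 1
        m /= m.sum()
        i, j = np.indices(m.shape)
        contrasts.append(float(((i - j) ** 2 * m).sum()))
        mats.append(m)
    c = np.asarray(contrasts)
    raw = 1.0 / (np.abs(c - c.mean()) + 1.0)
    w = raw / raw.sum()  # weight coefficients w_i, sum to 1
    return sum(wi * mi for wi, mi in zip(w, mats)), w
```

Because the weights sum to 1, the combined matrix stays normalized, so the usual GLCM feature formulas (contrast, energy, mean, variance) apply to it directly.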
This preferred embodiment is based on the improved gray-level co-occurrence matrix method: the combined gray-level co-occurrence matrix of the cell image is computed with weight coefficients, and the textural features of the cell in the four specified directions are then extracted. This resolves the problem that external interference (such as the illumination angle during cell image acquisition, or disturbance from gas flow) makes the textural feature parameters of the cell differ considerably between directions, and improves the accuracy of cell image texture feature extraction. Contrast, variance sum, energy and mean are selected as the four textural features, eliminating redundant and repeated feature parameters. The four textural feature parameters are normalized, which facilitates the subsequent classification and identification of cell images.
In this application scenario, with threshold t = 18 and d = 3, the image denoising effect is relatively improved by 7% and the extraction accuracy of cell image features by 7%.
Application scenario 4
Referring to Fig. 1 and Fig. 2, the device configuration and all preferred embodiments of this application scenario (the combined optical imaging system, the cell recognition module, and its segmentation, feature extraction and classification subunits) are identical to those of application scenario 3 above and are not repeated here.
In this application scenario, with threshold t = 20 and d = 4, the image denoising effect is relatively improved by 8% and the extraction accuracy of cell image features by 6%.
Application scenario 5
Referring to Fig. 1 and Fig. 2, the device configuration and all preferred embodiments of this application scenario (the combined optical imaging system, the cell recognition module, and its segmentation, feature extraction and classification subunits) are identical to those of application scenario 3 above and are not repeated here.
In this application scenario, with threshold t = 26 and d = 2, the image denoising effect is relatively improved by 7.5% and the extraction accuracy of cell image features by 8%.
Finally, it should be noted that the above embodiments only illustrate the technical solution of the present invention and do not limit its scope of protection. Although the present invention has been explained with reference to preferred embodiments, those of ordinary skill in the art should understand that the technical solution may be modified or equivalently substituted without departing from the essence and scope of the technical solution of the present invention.

Claims (3)

1. An optical imaging system adopted by a biological optical imaging device of combined iris and cortical tissues, characterized by comprising a cell recognition module and an optical imaging device, wherein the cell recognition module is used to determine the biological species; the optical imaging device consists of a light-source illumination unit and an imaging unit, the light-source illumination unit and the imaging unit being configured as a multi-spectral, multi-polarization-state combined optical imaging system that produces at least four combined imagings, comprising: invisible light with orthogonal polarization states, invisible light with parallel polarization states, near-infrared light with orthogonal polarization states, and near-infrared light with parallel polarization states;
The multispectral, multi-polarization-state combined optical imaging system comprises:
an illumination light source unit composed of at least a black-light source, a near-infrared light source, and a polarizer; and
an imaging unit composed of at least a black-light imaging path, a near-infrared imaging path, and an analyzer.
2. The optical imaging device for use in a bio-optical imaging apparatus of combined iris and cortical tissue according to claim 1, characterized in that the illumination light source unit and the imaging unit are jointly configured such that the wavelength range of the black light is at least 300-500 nm and the wavelength range of the near-infrared light is 700-900 nm, and the polarizer and analyzer are jointly configured to provide at least orthogonal and parallel polarization states.
3. The optical imaging device for use in a bio-optical imaging apparatus of combined iris and cortical tissue according to claim 2, characterized in that the illumination light source unit and the imaging unit of the multispectral, multi-polarization-state combined optical imaging system are further configured in combination to include a visible-light wavelength range of 500-700 nm, and the polarizer and analyzer are jointly configured to a 45-degree polarization state.
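The combination logic of claims 1-3 (two-then-three spectral bands crossed with polarization states) can be enumerated as a small sketch. The data structures and names here are illustrative assumptions; only the wavelength ranges and polarization states come from the claims.

```python
from itertools import product

# Wavelength ranges in nm, as stated in claims 2 and 3 (assumed structure).
SPECTRA = {
    "black light": (300, 500),     # claim 2
    "near-infrared": (700, 900),   # claim 2
    "visible": (500, 700),         # claim 3 extension
}
POLARIZATIONS = ["orthogonal", "parallel", "45-degree"]  # claims 2 and 3

def base_modes():
    """The at-least-four combined imaging modes required by claim 1."""
    return [(s, p) for s, p in product(["black light", "near-infrared"],
                                       ["orthogonal", "parallel"])]
```

Claim 1's minimum configuration is the four `base_modes()` pairs; the claim 3 extension adds the visible band and the 45-degree polarization state to the available combinations.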
CN201610777489.2A 2016-08-30 2016-08-30 Optical imaging system adopted by bio-optical imaging device of combined iris and cortical tissues Pending CN106361280A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610777489.2A CN106361280A (en) 2016-08-30 2016-08-30 Optical imaging system adopted by bio-optical imaging device of combined iris and cortical tissues

Publications (1)

Publication Number Publication Date
CN106361280A true CN106361280A (en) 2017-02-01

Family

ID=57901027

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610777489.2A Pending CN106361280A (en) 2016-08-30 2016-08-30 Optical imaging system adopted by bio-optical imaging device of combined iris and cortical tissues

Country Status (1)

Country Link
CN (1) CN106361280A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1256635A (en) * 1997-05-22 2000-06-14 默克专利股份有限公司 Peptide-coated implants and methods for producing same
CN1462884A (en) * 2003-06-24 2003-12-24 南京大学 Method of recognizing image of lung cancer cells with high accuracy and low rate of false negative
CN101411606A (en) * 2007-10-15 2009-04-22 倪蔚民 Biological measuring system for combined iris and cortex tissue


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
李宽 (Li Kuan): "Research on Segmentation, Texture Extraction and Recognition Methods for Cell Images", China Doctoral Dissertations Full-text Database, Information Science and Technology *

Similar Documents

Publication Publication Date Title
CN104766058B (en) A kind of method and apparatus for obtaining lane line
CN106529508B (en) Based on local and non local multiple features semanteme hyperspectral image classification method
CN104143079B (en) The method and system of face character identification
CN102054274B (en) Method for full automatic extraction of water remote sensing information in coastal zone
CN102629374B (en) Image super resolution (SR) reconstruction method based on subspace projection and neighborhood embedding
CN105956582A (en) Face identifications system based on three-dimensional data
CN104091169A (en) Behavior identification method based on multi feature fusion
CN109509164A (en) A kind of Multisensor Image Fusion Scheme and system based on GDGF
CN110310289A (en) Lung tissue's image partition method based on deep learning
CN103927758B (en) Saliency detection method based on contrast ratio and minimum convex hull of angular point
CN109871875B (en) Building change detection method based on deep learning
CN106846322B (en) The SAR image segmentation method learnt based on curve wave filter and convolutional coding structure
CN104732215A (en) Remote-sensing image coastline extracting method based on information vector machine
CN105138983B (en) The pedestrian detection method divided based on weighting block model and selective search
CN106128121A (en) Vehicle queue length fast algorithm of detecting based on Local Features Analysis
CN109191416A (en) Image interfusion method based on sparse dictionary study and shearing wave
CN105869166A (en) Human body action identification method and system based on binocular vision
CN104050685A (en) Moving target detection method based on particle filtering visual attention model
CN109635726B (en) Landslide identification method based on combination of symmetric deep network and multi-scale pooling
CN109034213B (en) Hyperspectral image classification method and system based on correlation entropy principle
CN109492570A (en) A kind of SAR image target recognition method based on multiple dimensioned rarefaction representation
CN109509163A (en) A kind of multi-focus image fusing method and system based on FGF
CN112037252A (en) Eagle eye vision-based target tracking method and system
CN105512622A (en) Visible remote-sensing image sea-land segmentation method based on image segmentation and supervised learning
CN105512670B (en) Divided based on KECA Feature Dimension Reduction and the HRCT peripheral nerve of cluster

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170201