CN112505002B - Solution turbidity detection method, medium and image system based on RGB model - Google Patents


Info

Publication number: CN112505002B (application CN202011356206.XA)
Authority: CN (China)
Prior art keywords: image, sample, reference image, light, turbidity
Legal status: Active (assumed; not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN112505002A
Inventors: 田晶晶, 李勇, 段生宝, 丁少华, 陈晔洲, 魏双施, 王红梅, 谢劲松
Current Assignee: Suzhou Institute of Biomedical Engineering and Technology of CAS
Original Assignee: Suzhou Institute of Biomedical Engineering and Technology of CAS
Application filed by Suzhou Institute of Biomedical Engineering and Technology of CAS; priority to CN202011356206.XA
Publication of CN112505002A, followed by grant and publication of CN112505002B

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N 21/59: Transmissivity
    • G01N 21/84: Systems specially adapted for particular applications
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 7/0012: Biomedical image inspection
    • G06T 7/0014: Biomedical image inspection using an image reference approach
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/11: Region-based segmentation
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/90: Determination of colour characteristics
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 20/00: Water conservation; Efficient water supply; Efficient water use
    • Y02A 20/20: Controlling water pollution; Waste water treatment

Abstract

The invention provides a solution turbidity detection method based on an RGB model, comprising the following steps: acquiring a sample image, the sample image at least containing a reference image for comparison; under the RGB model, acquiring the three-channel component values of a single pixel point N_n in the area surrounding the reference image and of a single pixel point M_n in the reference-image area, denoted Y_Nn(R,G,B) and Y_Mn(R,G,B) respectively; obtaining from Y_Nn(R,G,B) and Y_Mn(R,G,B) the brightness values V_R, V_G, V_B of each channel, and recording the maximum of V_R, V_G, V_B as the brightness value V_Nn or V_Mn of the corresponding pixel point; acquiring the brightness values of a plurality of pixel points in the surrounding area and in the reference-image area to obtain V_0 and V; and obtaining the turbidity value τ of the light-permeable solution containing the sample to be tested from the ratio of V to V_0. The invention also relates to an image system and a storage medium. By placing the reference position inside the sample position, the invention prevents differences in reference-position location, and hence differences in illumination intensity, from affecting the final detection result.

Description

Solution turbidity detection method, medium and image system based on RGB model
Technical Field
The invention relates to the technical field of medical detection, in particular to a solution turbidity detection method, medium and image system based on an RGB model.
Background
In the prior art, turbidity detection typically requires at least one sample position and a separate reference position, the sample position being calibrated against the reference position. When the light source illuminates the sample position and the reference position on the well plate, differences in the location of the reference position lead to differences in the brightness of the incident light, which in turn affect the final detection result.
To improve measurement accuracy, the invention provides a novel solution turbidity detection method based on an RGB model to solve this problem.
Disclosure of Invention
In order to overcome the defects of the prior art, the first object of the invention is to provide a solution turbidity detection method based on an RGB model, which comprises the following steps:
acquiring an image of a light-permeable solution containing a sample to be detected, which is shot by an image acquisition device and irradiated by a light source, and marking the image as a sample image; the sample image also at least comprises a reference image for comparison; wherein the region where the reference image is located is an image of a pure dark color block;
acquiring, under the RGB model, the three-channel component values of a single pixel point N_n in the area surrounding the reference image and of a single pixel point M_n in the reference-image area, on which the scattering effect of the light-permeable solution containing the sample to be measured is superimposed; these are denoted Y_Nn(R,G,B) and Y_Mn(R,G,B) respectively;
obtaining from Y_Nn(R,G,B) and Y_Mn(R,G,B) the brightness values V_R, V_G, V_B of each channel, and recording the maximum of V_R, V_G, V_B as the brightness value V_Nn or V_Mn of the corresponding pixel point;
acquiring the brightness value groups {V_Nn} and {V_Mn} of a plurality of pixel points {N_n} in the surrounding area and {M_n} in the reference-image area, and obtaining from them a reference brightness value V_0 and a characteristic brightness value V, where V is the maximum/minimum of the brightness value group {V_Mn}, or its arithmetic mean, and V_0 is the maximum/minimum of the brightness value group {V_Nn}, or its arithmetic mean;
obtaining the turbidity value τ of the light-permeable solution containing the sample to be tested from the ratio of V to V_0.
Preferably,
an HSV model is constructed from the single pixel point N_n in the area surrounding the reference image to obtain the brightness values V_R, V_G, V_B of the corresponding channels; likewise, an HSV model is constructed from the single pixel point M_n in the reference image to obtain the brightness values V_R, V_G, V_B of the corresponding channels.
Preferably,
τ = (1/b) · e^(a·(V/V_0))
where b is the optical path and a is a coefficient associated with the particular solution.
Preferably, the image acquisition device captures a plurality of the sample images.
Preferably, the image segmentation step is further included:
and carrying out image segmentation on an original image shot by the image acquisition device through an algorithm model to acquire a plurality of sample images.
Preferably, the reference image is a circular image within the sample image.
Preferably, when extracting the reference image from the sample image, the method further comprises the steps of:
acquiring a radius of a reference object forming a reference image on the sample container;
determining the position of the center of a circle of a reference image in a sample image according to model information of the sample container, wherein the model information is the relative position relation between the sample position on the sample container and a reference object;
and positioning the region of the reference image in the sample image according to the circle center position of the reference image and the radius of the reference object.
A second object of the invention is to provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs a method as described in any of the above.
A third object of the present invention is to provide an image system including:
a turbidity detection module configured to perform a solution turbidity detection method based on an RGB model as described above;
the image acquisition module is used for acquiring image information of sample positions and recording the image information as a sample image, wherein the sample image comprises a reference image for comparison;
a light emitting module for providing illumination;
the image acquisition module acquires the sample image and the reference image under illumination from the light-emitting module; these are transmitted to the turbidity detection module, which outputs the turbidity value of the sample position.
Preferably, the device further comprises a reflecting module for reflecting the light rays irradiated by the light emitting module from the sample position to the image acquisition module.
Compared with the prior art, the invention has the beneficial effects that:
(1) According to the solution turbidity detection method, the independent setting of the reference position is canceled, so that the influence of the difference of the reference position on the final detection result is avoided.
(2) The pixel values under the most common RGB model are obtained conveniently and rapidly.
(3) A plurality of sample images are captured with one image acquisition device, enabling high-throughput detection.
(4) The image processing is carried out on the three-channel RGB image to be processed by adopting the image segmentation method, so that the method is simple and quick.
(5) The reference image is arranged to be circular, so that light rays irradiating into the reference image are uniform as much as possible.
The foregoing description is only an overview of the present invention, and is intended to provide a better understanding of the present invention, as it is embodied in the following description, with reference to the preferred embodiments of the present invention and the accompanying drawings. Specific embodiments of the present invention are given in detail by the following examples and the accompanying drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a flow chart of a solution turbidity detection method based on RGB model of the present invention;
FIG. 2 is a flow chart of an image segmentation method according to the present invention;
FIG. 3 is a flow chart of the present invention for improving image segmentation accuracy;
FIG. 4 is a flow chart of the present invention for locating a reference image from a sample image;
FIG. 5 is a schematic diagram of an image system according to the present invention;
FIG. 6 is a schematic view of a sample container according to the present invention;
FIG. 7 is a schematic diagram of a light blocking layer according to the present invention;
FIG. 8 is a graph of turbidity of a light-transmissible solution according to the present invention versus a ratio of a representative luminance value to a reference luminance value;
Reference numerals: 1. image characterization body; 11. stage; 12. sample container; 13. light source; 14. image acquisition device; 15. light adjusting device; 121. body; 1211. sample position; 123. light blocking layer; 1231. opaque region.
Detailed Description
The present invention will be further described with reference to the accompanying drawings and detailed description, wherein it is to be understood that, on the premise of no conflict, the following embodiments or technical features may be arbitrarily combined to form new embodiments.
Example 1
The main basis of turbidity detection is the scattering of light by the particles in a suspension. When a beam of light passes through a suspension, the amount of scattered light (equivalently, the attenuation of the transmitted light) is, under certain conditions, proportional to the number of particles in the suspension. The specific formula is I = I_0 · e^(−τb), where I is the transmitted light intensity, I_0 is the incident light intensity, b is the optical path, and τ is the turbidity. The detection techniques currently in use are photoelectric: when the beam passes through the suspension it is scattered or absorbed, reducing the transmitted amount; the suspension concentration is proportional to the optical density and inversely related to the transmittance, and the optical density or transmittance can be measured with a photoelectric device. This is the basis of the photoelectric turbidity detection method. Traditional photoelectric turbidity detection, however, cannot capture the spatial position information of the measured values and cannot achieve high-throughput detection.
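The attenuation relation can be sketched numerically; a minimal example in Python, with arbitrary units and hypothetical values of τ and b, assuming the form I = I_0·e^(−τb):

```python
import math

def transmittance(tau: float, b: float) -> float:
    """Transmitted fraction I/I0 for turbidity tau over optical path b: I = I0 * e^(-tau*b)."""
    return math.exp(-tau * b)

def turbidity_from_transmittance(t: float, b: float) -> float:
    """Invert the relation: tau = -ln(I/I0) / b."""
    return -math.log(t) / b

b = 2.0      # hypothetical optical path
tau = 0.7    # hypothetical turbidity
t = transmittance(tau, b)
recovered = turbidity_from_transmittance(t, b)
```

A clear solution (τ = 0) transmits everything, and the two functions are exact inverses of each other, which is what makes the ratio-based measurement in the method below possible.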
The application relates to a novel solution turbidity detection method based on an RGB model, which is characterized in that image information is acquired through an image acquisition device, and a final turbidity value is obtained through extracting effective image information and performing data processing. The method specifically comprises the following steps, as shown in fig. 1:
S101: acquire an image, captured by an image acquisition device, of the light-permeable solution containing the sample to be tested under light-source illumination, and record it as the sample image; the sample image also contains at least one reference image for comparison, where the region of the reference image is the image of a pure dark color block. The image acquisition device may be a still camera, a video camera, or similar; for example, a digital camera acquiring digital images, or a charge-coupled device (CCD). The light-permeable solution containing the sample to be tested may be a chyle-like emulsion or another light-permeable solution containing particulate matter. The pure dark block image may be a black image. In some embodiments, the reference image is formed when the upper cover of the sample container (the container comprises a body and an upper cover that prevents evaporation) carries a substrate with a pure black ground color; the ground color may be printed directly on the upper cover or fixedly attached to it by post-processing, for example by bonding. Light passing the cover with the pure black ground color forms the black patch (pure dark block) image within the sample image.
It should be understood that the reference image and the sample image may have any shape; preferably both are circular images, to keep the illumination of the reference position and the sample position as uniform as possible.
S102: under the RGB model, acquire the three-channel component values of a single pixel point N_n in the area surrounding the reference image, and of a single pixel point M_n in the reference-image area on which the scattering effect of the light-permeable solution containing the sample to be tested is superimposed; denote them Y_Nn(R,G,B) and Y_Mn(R,G,B) respectively.
S103: obtain from Y_Nn(R,G,B) and Y_Mn(R,G,B) the brightness values V_R, V_G, V_B of each channel, and record the maximum of V_R, V_G, V_B as the brightness value of the corresponding pixel point, V_Nn or V_Mn. Specifically: obtain the channel brightness values V_R, V_G, V_B of Y_Mn(R,G,B) and record their maximum as the brightness value V_Mn of pixel point M_n; obtain the channel brightness values V_R, V_G, V_B of Y_Nn(R,G,B) and record their maximum as the brightness value V_Nn of pixel point N_n.
S104: acquire the brightness value groups {V_Nn} and {V_Mn} of the pixel points {N_n} in the area surrounding the reference image and {M_n} in the reference-image area, and from them obtain the reference brightness value V_0 and the characteristic brightness value V. The surrounding area consists of pixel points N_1, N_2, N_3, N_4, up to N_n; pixels 1 through n cover all pixel points of that area, and S102 and S103 yield the corresponding brightness values. In some embodiments, all values V_Nn obtained for the area are compared in size, and the maximum or minimum single-pixel value represents the brightness value V_0 of the area surrounding the reference image: N_1 has brightness V_N1, N_2 has brightness V_N2, and so on up to V_Nn, and sorting these yields the maximum or minimum. In other embodiments, the arithmetic mean of all V_Nn is taken as the brightness value of the area, which can improve accuracy: V_0 = (V_N1 + V_N2 + V_N3 + ... + V_Nn)/n, where the numerator is the sum of the brightness values of all single pixels in the area and the denominator is the number of single pixels.
Likewise, the reference image consists of pixel points M_1, M_2, M_3, M_4, up to M_n, covering all pixel points of the reference image, with brightness values V_M1 through V_Mn. In some embodiments, the values V_Mn are sorted and the maximum or minimum represents the brightness value V of the area within the reference image; in other embodiments, the arithmetic mean is used, V = (V_M1 + V_M2 + V_M3 + ... + V_Mn)/n, with the same numerator and denominator interpretation as above.
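The extraction in S102 through S104 reduces to taking the maximum channel per pixel and then aggregating over each region. A minimal sketch with hypothetical pixel values, using the arithmetic-mean variant:

```python
def brightness(pixel):
    """Brightness of one RGB pixel: the maximum of its three channel values (S103)."""
    r, g, b = pixel
    return max(r, g, b)

# Hypothetical pixel samples: {N_n} from the bright area surrounding the
# reference image, {M_n} from inside the dark reference image itself.
surrounding = [(200, 190, 180), (210, 200, 185), (205, 195, 190)]   # {N_n}
reference   = [(40, 35, 30), (45, 38, 33), (42, 36, 31)]            # {M_n}

V_N = [brightness(p) for p in surrounding]   # brightness value group {V_Nn}
V_M = [brightness(p) for p in reference]     # brightness value group {V_Mn}

V0 = sum(V_N) / len(V_N)   # reference brightness value V_0 (arithmetic mean)
V  = sum(V_M) / len(V_M)   # characteristic brightness value V (arithmetic mean)
ratio = V / V0             # the quantity fed into the turbidity formula (S105)
```

Swapping `sum(...)/len(...)` for `max(...)` or `min(...)` gives the other aggregation variants the method allows.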
S105: obtain the turbidity value τ of the light-permeable solution containing the sample to be tested from V_0 and V.
In some embodiments, V_0 and V are substituted into the formula
τ = (1/b) · e^(a·(V/V_0))
where b is the optical path and a is a coefficient associated with the particular solution.
Specifically, 6 groups of light-permeable solutions were prepared for linear regression analysis. Corresponding images were acquired for each, and the values of the coefficients a and b were determined from the relation between the ratio of the characteristic brightness value to the reference brightness value and the turbidity of each solution. As shown in FIG. 8, the ratio and the turbidity of the 6 solutions are plotted as abscissa and ordinate respectively: with x = V/V_0 and y = turbidity, the fitted relation is y = 2.2476 · e^(4.3572x), so the coefficient 1/b equals 2.2476 and a equals 4.3572. R^2, the coefficient of determination, reflects the quality of the regression fit: the closer to 1, the better the fit, and a value above 0.8 indicates high goodness of fit. This yields the new, convenient turbidity formula
τ = 2.2476 · e^(4.3572·(V/V_0))
Turbidity can be measured for a single light-permeable solution, or for a plurality of solutions to be tested at once, achieving high-throughput quantitative analysis quickly and conveniently.
The correspondence between the ratio of characteristic to reference brightness value and the turbidity value for the 6 light-permeable solutions is shown in Table 1.

Table 1

V/V_0    Turbidity (%)
0.93     100
0.71     50
0.49     24
0.34     13.5
0.23     6.25
0.16     3.125
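As a check, the log-linear regression described above can be reproduced from Table 1 alone. A sketch in plain Python; the recovered coefficients land near, but not exactly at, the patent's 2.2476 and 4.3572, since the original fit presumably used the raw measurements rather than the rounded table values:

```python
import math

# Table 1: ratio x = V/V_0 vs turbidity y (%)
x = [0.93, 0.71, 0.49, 0.34, 0.23, 0.16]
y = [100, 50, 24, 13.5, 6.25, 3.125]

# Fit ln(y) = ln(c) + a*x by ordinary least squares, i.e. y = c * e^(a*x),
# where c corresponds to 1/b in the patent's formula.
ln_y = [math.log(v) for v in y]
n = len(x)
mx = sum(x) / n
my = sum(ln_y) / n
a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, ln_y))
     / sum((xi - mx) ** 2 for xi in x))
c = math.exp(my - a * mx)

# R^2 of the log-linear fit, as the goodness-of-fit check the text describes.
ss_res = sum((yi - (math.log(c) + a * xi)) ** 2 for xi, yi in zip(x, ln_y))
ss_tot = sum((yi - my) ** 2 for yi in ln_y)
r2 = 1 - ss_res / ss_tot
```

On the table data this gives a ≈ 4.26 and c ≈ 2.33 with R^2 well above 0.9, consistent with the patent's claim that the fit exceeds 0.8.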
When obtaining the brightness values V_R, V_G, V_B of each channel of Y_Mn(R,G,B), an HSV model is constructed from the single pixel point M_n in the sample image. For the model conversion, the RGB model is converted with the formulas corresponding to the HSV model, in which the V component is the brightness. The conversion formulas are as follows.
Let max be the maximum of the R value, G value and B value, and min the minimum. The (H, S, V) values in the corresponding HSV space are:
if max = min, H = 0°;
if max = R and G ≥ B, H = 60° · (G − B)/(max − min);
if max = R and G < B, H = 60° · (G − B)/(max − min) + 360°;
if max = G, H = 60° · (B − R)/(max − min) + 120°;
if max = B, H = 60° · (R − G)/(max − min) + 240°;
if max = 0, S = 0;
if max ≠ 0, S = (max − min)/max;
V = max.
H lies between 0° and 360°, S between 0 and 100%, and V between 0 and max.
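The piecewise conversion above can be written directly. A sketch keeping H in degrees, S in percent, and V as the raw maximum channel, which is the quantity used as the brightness value in S103:

```python
def rgb_to_hsv(r, g, b):
    """RGB (0-255) -> (H in degrees, S in percent, V = max channel)."""
    mx, mn = max(r, g, b), min(r, g, b)
    if mx == mn:
        h = 0.0
    elif mx == r and g >= b:
        h = 60.0 * (g - b) / (mx - mn)
    elif mx == r:                       # g < b
        h = 60.0 * (g - b) / (mx - mn) + 360.0
    elif mx == g:
        h = 60.0 * (b - r) / (mx - mn) + 120.0
    else:                               # mx == b
        h = 60.0 * (r - g) / (mx - mn) + 240.0
    s = 0.0 if mx == 0 else (1.0 - mn / mx) * 100.0   # (max-min)/max as a percentage
    v = mx                                            # brightness value of the pixel
    return h, s, v
```

Pure red, green, and blue map to hues 0°, 120°, and 240°, and any gray pixel has S = 0 with V equal to its level, matching the stated ranges.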
In some embodiments, the conversion model is a formula corresponding to converting an RGB model into a YUV model; wherein, the value corresponding to Y in the YUV model is brightness. The conversion formula can be specifically:
Y=0.299R+0.587G+0.114B
U=-0.147R-0.289G+0.436B
V=0.615R-0.515G-0.100B
R=Y+1.14V
G=Y-0.39U-0.58V
B=Y+2.03U
wherein the RGB value ranges are all 0-255.
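A sketch of the forward and inverse transforms exactly as given, with Y as the brightness; the stated inverse coefficients are rounded approximations, so a round trip recovers RGB only to within a fraction of a level:

```python
def rgb_to_yuv(r, g, b):
    """Forward transform from the text; Y is the brightness component."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.147 * r - 0.289 * g + 0.436 * b
    v = 0.615 * r - 0.515 * g - 0.100 * b
    return y, u, v

def yuv_to_rgb(y, u, v):
    """Approximate inverse transform from the text."""
    return y + 1.14 * v, y - 0.39 * u - 0.58 * v, y + 2.03 * u
```

Since 0.299 + 0.587 + 0.114 = 1 and the U and V rows each sum to zero, pure white (255, 255, 255) maps to Y = 255 with U = V = 0, as expected for a brightness channel.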
Similarly, when obtaining the brightness values V_R, V_G, V_B of each channel of Y_Nn(R,G,B), an HSV model or YUV model is constructed from the three channel components of the single pixel point N_n in the sample image to obtain the brightness value of the corresponding channel.
Specifically, when executing S101, the image captured by the image acquisition device is a raw full-frame image, so the host computer needs to perform feature extraction on it to determine the specific regions of the sample image and the reference image. The feature extraction may produce feature recognition results for the whole image based on an artificial-intelligence algorithm, implemented for example with a deep neural network. The deep neural network may be based on a model such as GoogLeNet, AlexNet, ZFNet or ResNet, trained on a sample database to achieve the image processing and feature extraction. That is, in implementation, feature extraction is performed on the captured image to determine the sample image and the reference image.
The embodiments above are described around a single sample position; in practice, the sample container often includes a plurality of sample positions, and the image captured by the image acquisition device contains a plurality of sample images.
When several sample images are included, an image segmentation step is also included, as shown in fig. 2:
s201: and carrying out image segmentation on an original image shot by the image acquisition device through an algorithm model to acquire a plurality of sample images. In some embodiments, an original image comprising a plurality of sample images may be image segmented according to an intelligent algorithm to obtain a plurality of target images, the plurality of sample images being non-overlapping and independent of each other.
In some embodiments, to improve the accuracy of the acquired images of the plurality of target sample positions, the method further includes, after S201, the following steps, as shown in FIG. 3:
S202: compare the similarity of the original image, which includes a plurality of sample images, with a standard image that includes a plurality of sample positions. A standard image of the sample container may be pre-stored in the computer program and compared with the original image acquired in S201 to determine the similarity between them.
S203: if the similarity meets the image segmentation condition, segment the original image into image blocks corresponding to the sample images. The image segmentation condition may be pre-stored in the computer program or set manually, for example requiring a similarity of 85% or more; it should be understood that the condition can be set according to specific requirements, and when it is met, the original image is segmented.
The accuracy of the image segmentation is ensured by performing S201-S203.
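A hypothetical sketch of S201 through S203. The similarity metric and grid geometry below (normalized histogram overlap, a 2 x 4 plate) are stand-ins for the unspecified algorithm model, not the patent's actual implementation:

```python
def histogram(img, bins=8):
    """Normalized brightness histogram of a single-channel image (list of rows of 0-255 ints)."""
    h = [0] * bins
    for row in img:
        for px in row:
            h[min(px * bins // 256, bins - 1)] += 1
    total = sum(h)
    return [v / total for v in h]

def similarity(img_a, img_b):
    """Histogram intersection in [0, 1]: a simple stand-in similarity measure (S202)."""
    return sum(min(a, b) for a, b in zip(histogram(img_a), histogram(img_b)))

def split_grid(img, rows, cols):
    """S203: cut the original image into rows*cols non-overlapping sample images."""
    h, w = len(img), len(img[0])
    rh, cw = h // rows, w // cols
    return [[row[c * cw:(c + 1) * cw] for row in img[r * rh:(r + 1) * rh]]
            for r in range(rows) for c in range(cols)]

# Synthetic original and standard images standing in for real captures.
original = [[(x * 7 + y * 13) % 256 for x in range(8)] for y in range(4)]
standard = [[(x * 7 + y * 13 + 2) % 256 for x in range(8)] for y in range(4)]

if similarity(original, standard) >= 0.85:      # S202 image-segmentation condition
    samples = split_grid(original, 2, 4)        # 2 x 4 plate -> 8 sample images
```

The 85% threshold mirrors the example condition in the text; any metric and threshold appropriate to the container model could be substituted.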
The reference image referred to in S101 is the orthographic projection of the light-blocking region of the sample container onto the sample position, and it may be any shape; it need only be ensured that the reference image does not extend beyond the region of the sample image. In some embodiments, the reference image is circular, so that the scattered light around the reference image is uniform when light is directed toward it.
When the reference image is circular, the steps are further included in extracting the reference image from the sample image as shown in fig. 4:
S301: acquire the radius of the reference object that forms the reference image on the sample container. In some embodiments, a user manually inputs the radius of the light-blocking region according to the model of the sample container; in other embodiments, the radius is a fixed value for a given sample container, and the medium executing the method stores it so that it can be read directly.
S302: determining the position of the center of a circle of a reference image in a sample image according to model information of the sample container, wherein the model information is the relative position relation between the sample position on the sample container and a reference object; because the reference image is arranged in the sample image, when the position of the sample image and the circle center of the sample image are determined, the circle center position of the reference image in the sample image is determined only according to the theoretical positional relationship between the circle center of the orthographic projection of the light blocking area of the sample container and the circle center of the sample position.
S303: locating the region of the reference image in the sample image according to the circle center position of the reference image and the radius of the reference object. After the circle center is determined in S302, the region of the reference image is located in the sample image using the radius obtained in S301. It should be understood that the light blocking region is a black substrate on the sample container arranged in one-to-one correspondence with the sample positions, and that the orthographic projection of the black substrate lies within the sample position.
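Steps S301-S303 amount to building a circular mask from the circle center and the reference-object radius. This is a sketch only, assuming the model information supplies the reference circle's center as a (row, column) offset from the sample position's center; the names are illustrative.

```python
import numpy as np

def locate_reference_region(sample_shape, sample_center, offset, radius):
    """Return a boolean mask of the reference-image region inside the sample image.

    sample_center: (row, col) of the sample position's circle center.
    offset:        (d_row, d_col) of the reference circle relative to that
                   center, taken from the container's model information (S302).
    radius:        radius of the reference object on the container (S301).
    """
    cy, cx = sample_center[0] + offset[0], sample_center[1] + offset[1]
    rows, cols = np.ogrid[:sample_shape[0], :sample_shape[1]]
    return (rows - cy) ** 2 + (cols - cx) ** 2 <= radius ** 2
```

Pixels where the mask is True belong to the reference image; nearby pixels where it is False form the surrounding comparison area.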
A computer readable storage medium has stored thereon a computer program which, when executed by a processor, performs the method described above.
Example two
As shown in fig. 5-7, an imaging system comprises: a turbidity detection module configured to perform the RGB-model-based solution turbidity detection method of embodiment one; an image acquisition module for acquiring image information of the sample positions, recorded as a sample image, the sample image including a reference image for comparison; and a light emitting module for providing illumination. The image acquisition module acquires the sample image and the reference image under the light-emitting module; these are transmitted to the turbidity detection module, which outputs the turbidity value of the sample position.
In some embodiments, the system further comprises a reflecting module, so that light radiated by the light emitting module through the sample position is reflected toward the image acquisition module; this avoids an overlong light path and keeps the overall size of the imaging system moderate.
The imaging system comprises an image characterization body 1, which is a light-proof shell structure. The image characterization body 1 includes: a stage 11 for placing a sample container 12; a light source 13 for illuminating the sample solution in the sample container 12; at least one image acquisition device 14 for acquiring an image of the sample solution; and the turbidity detection module, which acquires the image of the sample solution and converts it into a turbidity value. The light source 13 and the image acquisition device 14 are located on opposite sides of the stage 11. After the light source 13 is triggered, the image acquisition device 14 acquires images of the sample solution in a plurality of sample positions, and the images are converted by the turbidity detection module into turbidity values representing the suspended matter in the solution at the respective sample positions.
It should be understood that the image characterization body 1 has a closed structure to provide a light-proof acquisition environment for image acquisition, ensuring the accuracy of the detection result. In some embodiments, the turbidity detection module is stored in a host computer, which displays the turbidity. The host computer may be a desktop computer, a tablet, a mobile phone, or the like; it loads the turbidity detection module and includes a display device, so that the turbidity information finally obtained by the turbidity detection module is displayed directly on the host computer and the user can quickly learn the turbidity of the sample solution under test.
In some embodiments, the stage 11 is fixedly mounted inside the image characterization body 1 to provide a support platform for the sample container 12. Alternatively, the stage 11 may be movably arranged inside the image characterization body 1, i.e., movable at an access opening, to facilitate placing and removing the sample container 12. To ensure the accuracy of the detection result, in other embodiments the stage 11 further includes at least one sensor (not shown), which may be a positioning sensor, for accurately positioning the sample container 12 so that the light emitted by the light source 13 is aligned with the sample container 12. To ensure that light can pass through the stage 11, the stage 11 is preferably made of a transparent material.
The light source 13 is switchable between two or more wavelength spectra, such as white light, red light, blue light, green light, ultraviolet (UV), near infrared (near IR), infrared (IR), combinations of the foregoing, and the like. The light source 13 is a surface light source, ensuring that light is directed into the sample position within each sample container.
At least one image acquisition device 14 optically images the sample container 12 on the stage 11. Since a single image acquisition device 14 may miss image information, a plurality of image acquisition devices 14 may be provided within the image characterization body 1 to ensure comprehensive acquisition, arranged around the imaging position to capture images from different perspectives. The image acquisition device 14 may be a digital camera for acquiring digital images, a charge-coupled device (CCD) camera, or any other device with an image acquisition function.
In an embodiment in which the image acquisition device 14 is a CCD camera, after the light source 13 is triggered, the CCD camera obtains image information of each sample position in the sample container 12; the image information is transmitted to a computer, which substitutes it into a preset formula and outputs a plurality of turbidity values characterizing the samples under test. The CCD camera may be communicatively coupled to the computer, and the sample container 12 may include any number of sample positions, configured according to the actual detection requirements. To meet high-throughput detection requirements, the sample container 12 may be a 96-well plate, so that 96 samples are detected at one time: the CCD camera photographs the 96-well plate, the image is uploaded to the computer, the computer first extracts the effective features from the image, and then feeds the effective image information into the preset formula to calculate the turbidity values of the 96 sample positions.
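The per-well flow (photograph the plate, extract one feature per sample position, feed it to the preset formula) can be sketched as below. This is a sketch only: it assumes the wells have already been segmented into individual H x W x 3 arrays, and it uses the per-pixel RGB maximum of claim 1 as the extracted feature; the function names are illustrative.

```python
import numpy as np

def well_brightness(img):
    """Brightness of one well image: per-pixel maximum over the R, G, B
    channels (as in claim 1), averaged over the well."""
    return float(img.max(axis=-1).mean())

def plate_brightness(wells):
    """Brightness value for every well of a plate given as a 2-D nested list
    (e.g. 8 rows x 12 columns for a 96-well plate) of H x W x 3 images."""
    return [[well_brightness(w) for w in row] for row in wells]
```

The resulting per-well values would then be substituted into the preset turbidity formula on the computer.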
In some embodiments, the sample container 12 is a well plate comprising several sample positions, or a rack for holding transparent test tubes. When the sample container 12 is a well plate, it directly contains the sample solution to be measured; when it is a rack, it is loaded with transparent test tubes holding the sample solution to be measured.
In some embodiments, a reflective assembly (not shown) is arranged between the sample container 12 and the image acquisition device 14 to reflect the image of the sample solution toward the image acquisition device 14; this avoids an overlong distance and imaging path between the image acquisition device 14 and the well plate 12, which would make the overall structure of the image characterization body 1 too large.
To form a reference image for comparison in the sample image, the sample container 12 includes: a body 121 provided with a plurality of sample positions 1211, the sample positions 1211 being used for loading the light-permeable solution of the sample to be tested, or transparent test tubes of that solution; and a light blocking layer 123 arranged over the body 121 and covering the sample positions 1211. The light blocking layer 123 comprises a plurality of opaque regions 1231 in one-to-one correspondence with the sample positions 1211, the orthographic projection of each opaque region 1231 lying within its sample position 1211. Light emitted by the light source irradiates, past the plurality of opaque regions 1231 on the light blocking layer 123, the sample positions 1211 corresponding to them, so that the brightness produced by the scattering effect of the light-permeable solution of the sample to be tested is superimposed on the orthographic projection of the opaque region 1231 within the sample position 1211.
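The containment requirement (each opaque region's orthographic projection must lie within its sample position 1211) reduces, for circular regions, to a simple geometric check. This is a sketch under the assumption that both the opaque region and the sample position are circular; the parameter names are hypothetical.

```python
def projection_inside_well(center_offset, block_radius, well_radius):
    """True when a circular opaque region, whose projected center lies
    center_offset away from the well center, projects entirely inside
    a circular sample position of radius well_radius."""
    return center_offset + block_radius <= well_radius
```

A concentric opaque region (offset 0) smaller than the well always passes; pushing it too far off-center makes part of its projection leave the sample position.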
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, one or more embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Moreover, one or more embodiments of the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
Although embodiments of the present invention have been disclosed above, the invention is not limited to the details and embodiments shown; it is well suited to various fields of use, and further modifications may readily be made by those skilled in the art without departing from the general concepts defined by the claims and their equivalents. The invention is therefore not limited to the specific details and examples shown herein.

Claims (6)

1. The solution turbidity detection method based on the RGB model is characterized by comprising the following steps:
acquiring an image of a light-permeable solution containing a sample to be detected, which is shot by an image acquisition device and irradiated by a light source, and marking the image as a sample image; the sample image also at least comprises a reference image for comparison; wherein the region where the reference image is located is an image of a pure dark color block;
the reference image is obtained by:
image segmentation is carried out on an original image shot by an image acquisition device through an algorithm model so as to obtain a plurality of sample images; the reference image is a circular image in the sample image;
when extracting the reference image from the sample image, further comprising the steps of:
acquiring a radius of a reference object forming a reference image on the sample container;
determining the position of the center of a circle of a reference image in a sample image according to model information of the sample container, wherein the model information is the relative position relation between the sample position on the sample container and a reference object;
positioning the region of the reference image in the sample image according to the circle center position of the reference image and the radius of the reference object;
under the RGB model, respectively acquiring a single pixel point N_n in the area surrounding the reference image and a single pixel point M_n in the reference image area on which the scattering effect of the light-permeable solution of the sample to be measured is superimposed, their RGB values being respectively denoted Y_{N_n}(R,G,B) and Y_{M_n}(R,G,B);
respectively obtaining the brightness values V_R, V_G, V_B of each channel of Y_{N_n}(R,G,B) and Y_{M_n}(R,G,B), and recording the maximum of V_R, V_G, V_B as the brightness value V_{N_n} or V_{M_n} of the corresponding pixel point;
respectively acquiring the brightness value groups {V_{N_n}} and {V_{M_n}} of a plurality of pixel points {N_n} in the area surrounding the reference image and {M_n} in the reference image area, and obtaining from them a reference brightness value V_0 and a characteristic brightness value V; the characteristic brightness value V is the maximum/minimum of the brightness value group {V_{M_n}}, or the arithmetic mean of the brightness value group {V_{M_n}}; the reference brightness value V_0 is the maximum/minimum of the brightness value group {V_{N_n}}, or the arithmetic mean of the brightness value group {V_{N_n}};
obtaining the turbidity value τ of the light-permeable solution containing the sample to be detected according to the ratio of V_0 to V:
[turbidity formula, shown in the original only as image FDA0004063449690000021]
where b is the optical path and a is a coefficient associated with the particular solution.
2. The method for detecting turbidity of a solution based on an RGB model of claim 1, wherein
an HSV model is constructed from a single pixel point N_n in the area surrounding the reference image to obtain the brightness values V_R, V_G, V_B of the corresponding channels; an HSV model is constructed from a single pixel point M_n in the reference image to obtain the brightness values V_R, V_G, V_B of the corresponding channels.
3. The method for detecting the turbidity of a solution based on an RGB model according to claim 1, wherein the image acquisition device shoots a plurality of sample images.
4. A computer-readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, performs the method of any of claims 1-3.
5. An imaging system, comprising:
a turbidity detection module configured to perform the RGB model-based solution turbidity detection method of claim 1;
the image acquisition module is used for acquiring image information of sample positions and recording the image information as a sample image, wherein the sample image comprises a reference image for comparison;
a light emitting module for providing illumination;
the image acquisition module acquires the sample image and the reference image under the light-emitting module; the sample image and the reference image are transmitted to the turbidity detection module, which outputs the turbidity value of the sample position.
6. The imaging system of claim 5, further comprising a reflection module to reflect light rays radiated by the light emitting module from the sample site toward the image acquisition module.
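The brightness extraction in claims 1 and 2 can be sketched as follows. This is a sketch only: the final τ formula appears in the original solely as an image, so only the V_0/V ratio is computed here, and the `stat` parameter selects among the max/min/arithmetic-mean variants the claim allows.

```python
import colorsys

def pixel_brightness(rgb):
    """V of one pixel under the RGB model: the maximum of its R, G, B values."""
    return max(rgb)

def group_brightness(pixels, stat="mean"):
    """Reduce a pixel group {N_n} or {M_n} to one brightness value."""
    values = [pixel_brightness(p) for p in pixels]
    reducers = {"max": max, "min": min, "mean": lambda v: sum(v) / len(v)}
    return reducers[stat](values)

def brightness_ratio(surround_pixels, reference_pixels, stat="mean"):
    """V_0 / V: reference brightness (surrounding area {N_n}) over
    characteristic brightness (reference-image area {M_n})."""
    v0 = group_brightness(surround_pixels, stat)
    v = group_brightness(reference_pixels, stat)
    return v0 / v

# Claim 2's HSV route gives the same per-pixel V: in the HSV model,
# the value channel equals max(R, G, B).
_, _, v = colorsys.rgb_to_hsv(0.8, 0.4, 0.2)
assert v == max(0.8, 0.4, 0.2)
```

The resulting ratio would be substituted into the patent's turbidity formula together with the optical path b and the solution coefficient a.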
CN202011356206.XA 2020-11-26 2020-11-26 Solution turbidity detection method, medium and image system based on RGB model Active CN112505002B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011356206.XA CN112505002B (en) 2020-11-26 2020-11-26 Solution turbidity detection method, medium and image system based on RGB model


Publications (2)

Publication Number Publication Date
CN112505002A CN112505002A (en) 2021-03-16
CN112505002B true CN112505002B (en) 2023-05-12

Family

ID=74966762


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115046966B (en) * 2022-08-16 2022-11-04 山东国慈新型材料科技有限公司 Method for detecting recycling degree of environmental sewage

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2534631B1 (en) * 2013-09-24 2016-02-02 Universidad De Valladolid Multi-analysis system of laser spectrofluorimetry for oils
CN105954282B (en) * 2016-05-04 2018-11-02 浙江大学 A kind of water turbidity detection device and method based on underwater observation net
CN211877766U (en) * 2019-07-25 2020-11-06 淮北师范大学 Water turbidity measuring device based on infrared camera shooting
CN110274893A (en) * 2019-07-25 2019-09-24 淮北师范大学 Water turbidity measuring device, image capturing system and method based on infrared photography
CN110672523A (en) * 2019-11-14 2020-01-10 厦门华联电子股份有限公司 Turbidity sensor



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant