WO2022217544A1 - Method and system for sample quality control - Google Patents

Method and system for sample quality control

Info

Publication number
WO2022217544A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
sample
target
liquid sample
determining
Prior art date
Application number
PCT/CN2021/087520
Other languages
French (fr)
Inventor
Chenxi Zhang
Haihua WANG
Weibin Xing
Xiaojun Tao
Jing Qian
Qi Zhou
Yin Qian
Original Assignee
F. Hoffmann-La Roche Ag
Roche Diagnostics Operations, Inc.
Roche Diagnostics Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by F. Hoffmann-La Roche Ag, Roche Diagnostics Operations, Inc., Roche Diagnostics Gmbh filed Critical F. Hoffmann-La Roche Ag
Priority to PCT/CN2021/087520 priority Critical patent/WO2022217544A1/en
Publication of WO2022217544A1 publication Critical patent/WO2022217544A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • G06T7/0014Biomedical image inspection using an image reference approach
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/46Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G01J3/52Measurement of colour; Colour measuring devices, e.g. colorimeters using colour charts
    • G01J3/524Calibration of colorimeters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing

Definitions

  • the embodiments of the disclosure relate to quality control of liquid samples, for example in the field of health related diagnostics.
  • Diagnostic analytical testing can provide physicians with pivotal information and thus can be of great importance for health related decisions, population health management, etc.
  • Abnormal quality of samples (e.g., hemolysis or lipidemia) may compromise the reliability of such testing results.
  • a computer-implemented method for processing an image of a liquid sample comprises: obtaining an image of a liquid sample captured by a target capturing device; calibrating the image of the liquid sample using a calibration function, the calibration function being determined based on a pair of images of a color reference object, the pair of images comprising a target image captured by the target capturing device and a corresponding baseline image; and determining a sub-image corresponding to a target portion from the calibrated image for classification.
  • the method further comprises: determining a classification tag of the liquid sample based on the sub-image, the classification tag indicating whether the liquid sample is normal or abnormal.
  • the liquid sample comprises a serum sample and the classification tag comprises at least one of: a first tag indicating that the serum sample is normal; a second tag indicating that the serum sample is hemolytic; a third tag indicating that the serum sample is icteric; or a fourth tag indicating that the serum sample is lipaemic.
  • At least one of the second tag, the third tag and the fourth tag further indicates a severity level of the serum sample.
  • the method further comprises: determining a first feature vector of the target image and a second feature vector of the baseline image, the first feature vector comprising a first set of values in multiple dimensions, the second feature vector comprising a second set of values in the multiple dimensions; and determining the calibration function based on the first set of values and the second set of values.
  • calibrating the image of the liquid sample using a calibration function comprises: determining a third feature vector of the image of the liquid sample, the third feature vector comprising a third set of values in multiple dimensions; and deriving a calibrated feature vector by applying the calibration function to the third set of values.
  • the baseline image is an image of the color reference object captured by a baseline capturing device different from the target capturing device.
  • the calibration function is determined based on multiple color reference objects, each color reference object comprising one tube containing mockup liquid.
  • the liquid sample comprises a serum sample
  • each color reference object has a color corresponding to a real life serum sample and is associated with a classification tag indicating whether the corresponding real life serum sample is normal or abnormal.
  • the method further comprises: determining a target color reference object from the multiple color reference objects based on a difference between baseline images of the multiple color reference objects and the sub-image; and determining a classification tag of the serum sample based on the classification tag associated with the target color reference object.
  • determining a sub-image corresponding to a target portion from the calibrated image for analysis comprises: determining a segmentation line indicating a boundary of the target portion in the calibrated image of the serum sample; and determining the sub-image corresponding to the target portion based on the segmentation line.
  • determining a sub-image corresponding to a target portion from the calibrated image for analysis comprises: determining a boundary of the target portion through applying a feature vector of the calibrated image to a machine learning model; and determining the sub-image from the calibrated image based on the determined boundary.
  • the method further comprises: in accordance with a determination that a width of the sub-image is less than a width threshold, providing a first warning that the captured image of the serum sample is not qualified for classification.
  • the method further comprises: obtaining a first image of a first liquid sample captured by the target capturing device; and providing a second warning if an attribute of the first image is out of a predetermined range.
  • the method further comprises: obtaining a second image of a second liquid sample captured by the target capturing device; calibrating the second image of the second liquid sample using the calibration function; and providing a third warning if a sub-image corresponding to a target portion fails to be determined from the calibrated second image of the second liquid sample.
  • a sample processing system comprises: a target capturing device being configured for capturing an image of a liquid sample; and a quality control system being configured for performing the computer-implemented method according to the first aspect.
  • a sample processing system may be a pre-analytical, analytical, or post-analytical system.
  • a pre-analytical system can usually be used for the preliminary processing of sample tubes or liquid samples, e.g. sorting of sample tubes into racks or aliquoting liquid samples for further processing.
  • An analytical system can be designed, for example, to use a liquid sample or part of the liquid sample and a reagent in order to produce a measurable signal, on the basis of which it is possible to determine whether the analyte is present, and if desired in what concentration.
  • a post-analytical system can usually be used for the post-processing of liquid samples like the archiving of liquid samples.
  • Such pre-analytical systems, analytical systems, and post-analytical systems are well known in the art.
  • a quality control system comprises: a processing unit; and a memory coupled to the processing unit and having instructions stored thereon that, when executed by the processing unit, cause the quality control system to perform the method according to the first aspect.
  • a computer-readable medium storing instructions that when executed cause performing the method according to the first aspect.
  • a computer program product being tangibly stored on a computer storage medium and comprising machine-executable instructions which, when executed by a device, cause the device to perform the method according to the first aspect.
  • Fig. 1 illustrates a schematic diagram of an exemplary sample processing system according to an implementation of the subject matter described herein;
  • Fig. 2 illustrates a block diagram of the quality control system according to an implementation of the subject matter described herein;
  • Fig. 3 illustrates a schematic diagram of determining a sub-image according to an implementation of the subject matter described herein;
  • Fig. 4 illustrates a flowchart of a process for processing an image for a liquid sample according to an implementation of the subject matter described herein;
  • Fig. 5 illustrates a schematic block diagram of an example device for implementing embodiments of the present disclosure.
  • References in the present disclosure to “one embodiment,” “an embodiment,” “an example embodiment,” and the like indicate that the embodiment described may include a particular feature, structure, or characteristic, but it is not necessary that every embodiment includes the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an example embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
  • Terms such as “first” and “second” may be used herein to describe various elements, but these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and similarly, a second element could be termed a first element, without departing from the scope of example embodiments.
  • the term “and/or” includes any and all combinations of one or more of the listed terms.
  • Fig. 1 illustrates a schematic diagram of an exemplary sample processing system 100 according to an implementation of the subject matter described herein.
  • the sample processing system 100 may comprise a target capturing device 120 for capturing an image of a liquid sample 110 to be processed.
  • the liquid sample 110 may be contained in a tube.
  • the liquid sample may be a specimen taken from a patient or a control person.
  • the examples of the liquid sample may include but are not limited to blood sample, serum sample, plasma sample, urine sample, spinal cord fluid sample and the like.
  • the target capturing device 120 may be configured to capture an image 130 of the liquid sample 110.
  • the target capturing device 120 may be a part of a pre-analytical system, an analytical system or a post-analytical system.
  • the image 130 of the liquid sample 110 may for example be presented to an operator of the sample processing system.
  • the operator may then determine whether the liquid sample is normal or abnormal based on an attribute of the image 130, e.g. a color of the image 130.
  • The determination of sample quality thus depends both on the operator’s personal skill and on the quality of the image. Normal samples or samples of normal quality may lead to a higher possibility of correct and reliable analytical testing results compared to abnormal samples or samples of abnormal quality.
  • different capturing devices may have cameras with different qualities, which may lead to different color performances of the captured images. For example, an image captured by a capturing device may have a higher saturation than an image captured by another capturing device.
  • Different capturing environments (e.g., different light intensities) may also affect the color representation of the captured images.
  • For example, an image captured at noon may have a higher luminance than an image captured at night.
  • the image 130 of the liquid sample 110 may not be able to correctly represent its true color, which may then result in an inaccurate sample quality determination for example by an operator of the sample processing system.
  • a solution for processing an image of a liquid sample is proposed.
  • an image of a liquid sample captured by a target capturing device is obtained and then calibrated using a calibration function.
  • the calibration function is determined based on a pair of images of a color reference object, and the pair of images comprise a target image captured by the target capturing device and a corresponding baseline image. Further, a sub-image corresponding to a target portion is determined from the calibrated image for classification. In this way, by calibrating the captured image based on a pair of images for a color reference object, the embodiments may improve the performance of sample quality control.
  • the image 130 of the liquid sample 110 captured by the target capturing device 120 may be obtained by a quality control system 140.
  • the quality control system 140 may be a separate device from the target capturing device 120.
  • the quality control system 140 may be implemented in a same device as the target capturing device 120.
  • either the target capturing device 120 or the quality control system 140 may be a part of a pre-analytical system, an analytical system or a post-analytical system.
  • Fig. 2 illustrates a block diagram of the quality control system 140 according to an implementation of the subject matter described herein.
  • the quality control system 140 may comprise an obtaining module 210.
  • the obtaining module 210 may be configured to obtain an image 130 of the liquid sample 110 captured by the target capturing device 120.
  • the target capturing device 120 may store the captured image 130 on a shared storage device, and the obtaining module 210 may retrieve the captured image 130 from the storage device.
  • the obtaining module 210 may provide the obtained image 130 of the liquid sample 110 to a calibration module 220.
  • the calibration module 220 is configured to calibrate the image 130 of the liquid sample 110 using a calibration function.
  • the calibration function may be determined based on a pair of images of a color reference object, and the pair of images may comprise a target image captured by the target capturing device and a corresponding baseline image.
  • the calibration function may be determined before real life liquid samples are to be processed by the sample processing system 100.
  • the target capturing device 120 may first capture a target image of a color reference object.
  • a color reference object may comprise any object(s) with proper color representation.
  • a color reference object may comprise a piece of paper with a color spectrum printed thereon.
  • a color reference object may comprise a tube containing mockup liquid.
  • Such mockup liquid may represent a color or a range of colors due to the contained pigment(s).
  • a mockup liquid may be an artificially produced liquid comprising a predefined amount of pigments or mixture of pigments defining a color or a range of colors.
  • a tube containing mockup liquid may be provided to the target capturing device for obtaining a target image of the mockup liquid.
  • a baseline image of the color reference object may be obtained.
  • the baseline image may comprise an image of the color reference object captured by a baseline capturing device, which is different from the target capturing device 120.
  • the baseline capturing device may comprise a finely-tuned camera, e.g., a calibrated reference camera.
  • the baseline capturing device may comprise a capturing device, for which the classification model has been optimized.
  • the classification model included in the classification module 250 may be trained based on the images captured by a particular capturing device, and the particular capturing device may then be used as the baseline capturing device.
  • a first feature vector of the target image and a second feature vector of the baseline image may then be determined based on the image pairs of the color reference object.
  • the first feature vector may comprise a first set of values in multiple dimensions and the second feature vector may comprise a second set of values in the multiple dimensions. It should be understood that any proper method such as Resnet50 may be utilized to convert the target image and the baseline image into feature vectors.
  • the calibration function may be determined based on one pair of feature vectors, i.e., based on the first set of values and the second set of values.
  • the calibration function may comprise a set of co-linear functions, which are determined based on a value in the first set and a corresponding value in the second set. It should be understood that any proper function indicating the transformation between the first set of values and the second set of values may be applied, and the disclosure is not intended to be limited in this regard.
  • the color reference object may comprise multiple color reference objects.
  • the calibration function may be determined based on multiple pairs of feature vectors.
  • multiple co-linear functions may be determined based on the pairs of feature vectors, and each co-linear function may correspond to one of the multiple dimensions.
  • the calibration function may be determined by any proper devices or systems, including but not limited to the target capturing device 120 or the quality control system 140.
  • the calibration module 220 may for example obtain the determined calibration function and then utilize the calibration function to calibrate the image 130 of the liquid sample 110.
  • the calibration module 220 may first convert the image 130 of the liquid sample 110 into a third feature vector, which comprises a third set of values in multiple dimensions. Further, the calibration module 220 may then derive a calibrated feature vector by applying the calibration function to the third set of values. As an example, the multiple co-linear functions included in the calibration function may be applied to the values in the multiple dimensions respectively.
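The per-dimension co-linear calibration described above can be sketched as follows. This is a minimal illustration with synthetic feature vectors; the function names, the least-squares fit, and the dimensions are assumptions for illustration, not the patent's exact method:

```python
import numpy as np

def fit_calibration(target_feats, baseline_feats):
    """Fit one linear (co-linear) function a*x + b per feature dimension.

    target_feats, baseline_feats: arrays of shape (n_objects, n_dims)
    holding feature vectors of target-device and baseline-device images
    of the same color reference objects.
    """
    n_dims = target_feats.shape[1]
    coeffs = np.empty((n_dims, 2))
    for d in range(n_dims):
        # Least-squares fit of baseline = a * target + b in dimension d.
        a, b = np.polyfit(target_feats[:, d], baseline_feats[:, d], deg=1)
        coeffs[d] = (a, b)
    return coeffs

def apply_calibration(coeffs, sample_feat):
    """Map a sample feature vector into the baseline color space."""
    return coeffs[:, 0] * sample_feat + coeffs[:, 1]

# Synthetic example: baseline features are an exact affine map of the
# target features, so the fit should recover the map.
rng = np.random.default_rng(0)
target = rng.random((8, 4))
baseline = 1.5 * target + 0.2
coeffs = fit_calibration(target, baseline)
calibrated = apply_calibration(coeffs, target[0])
assert np.allclose(calibrated, baseline[0], atol=1e-6)
```

With multiple color reference objects, each dimension's function is fitted over all pairs, matching the description of one co-linear function per dimension.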
  • the calibration module 220 may first determine whether the image 130 is qualified for calibration, and proceed with the calibration process as discussed above if the image 130 is determined as being qualified for calibration.
  • the calibration module 220 may determine that the image 130 is not qualified for calibration in a case that an attribute of the image is out of a predetermined range. For example, if a luminance of the image 130 is greater than a threshold, the calibration module 220 may determine that the image 130 is not qualified for calibration and may provide a warning accordingly.
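The qualification check on an image attribute can be sketched as below; the luminance range, the Rec. 601 luma weights, and the function name are illustrative assumptions, since the patent only requires that some attribute be within a predetermined range:

```python
import numpy as np

def check_image_qualified(image, luminance_range=(20.0, 235.0)):
    """Return False (not qualified for calibration) if the image's mean
    luminance falls outside a predetermined range.

    image: array of shape (H, W, 3) with RGB values in [0, 255].
    """
    # Rec. 601 luma weights on the RGB channels.
    luminance = image @ np.array([0.299, 0.587, 0.114])
    mean_lum = luminance.mean()
    return luminance_range[0] <= mean_lum <= luminance_range[1]

overexposed = np.full((10, 10, 3), 250.0)  # too bright -> warning case
normal = np.full((10, 10, 3), 128.0)
assert not check_image_qualified(overexposed)
assert check_image_qualified(normal)
```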
  • the calibration module 220 may further generate the calibrated image 150 and then provide the calibrated image 150 to a detection module 230.
  • the calibrated image 150 may be generated by processing the calibrated vector with a trained decoder.
  • Such decoder may comprise a machine learning model configured for receiving an image feature vector and generating an image based on the image feature vector.
  • the detection module 230 may be configured to determine a sub-image 160 corresponding to a target portion from the calibrated image 150 for classification. In some embodiments, the detection module 230 may utilize segmentation line detection to determine the sub-image 160. In particular, the detection module 230 may determine a segmentation line indicating a boundary of the target portion in the calibrated image 150.
  • Fig. 3 illustrates a schematic diagram 300 of determining a sub-image according to an implementation of the subject matter described herein. As shown in Fig. 3, in most cases, the target portion may have a different color representation than neighboring regions.
  • the detection module 230 may utilize a proper line detection method (e.g., an edge detection algorithm) to determine the segmentation line 320 or the segmentation line 350, which segments the serum portion from the other portions in the tube.
  • the detection module 230 may also detect the vertical segmentation lines of the target portion. In some embodiments, an inner wall of the tube may be determined as the vertical segmentation lines.
  • the serum portion may be covered with a barcode 310.
  • the detection module 230 may detect the boundary of the barcode for determining the vertical segmentation lines 330 and 340.
  • the detection module 230 may determine the sub-image 160 corresponding to the target portion based on the segmentation line(s).
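The horizontal segmentation-line detection can be sketched with a simple row-wise color-change heuristic; the threshold value, the function name, and the synthetic tube image are assumptions standing in for a full edge detection algorithm:

```python
import numpy as np

def find_horizontal_boundaries(image, threshold=30.0):
    """Return row indices where the mean row color changes sharply.

    image: array of shape (H, W, 3). Rows whose mean color differs from
    the previous row by more than `threshold` are boundary candidates,
    approximating the segmentation lines between the serum portion and
    its neighboring regions (air gap, separation gel, blood cells).
    """
    row_means = image.reshape(image.shape[0], -1).mean(axis=1)
    diffs = np.abs(np.diff(row_means))
    return np.nonzero(diffs > threshold)[0] + 1

# Synthetic tube image: three stacked bands of different brightness.
img = np.zeros((90, 20, 3))
img[0:30] = 200.0   # air region above the liquid
img[30:60] = 120.0  # serum band (the target portion)
img[60:90] = 40.0   # clot / gel region below
boundaries = find_horizontal_boundaries(img)
assert list(boundaries) == [30, 60]
```

The rows between the two detected boundaries would then be cropped as the sub-image of the target portion.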
  • the detection module 230 may also utilize a machine learning model to determine the sub-image 160.
  • the detection module 230 may determine a boundary of the target portion through applying a feature vector of the calibrated image 150 to a machine learning model, and determine the sub-image 160 from the calibrated image 150 based on the determined boundary.
  • an object detection model such as YOLO (You Only Look Once) Model may be trained and then applied to output the boundary of the target portion.
  • a sub-image 160 corresponding to the target portion may be determined for classification.
  • the sub-image 160 may be provided to an operator of the sample processing system for determining whether the liquid sample is normal or abnormal.
  • the detection module 230 may provide a warning. For example, if the YOLO model is unable to provide a boundary of the serum portion, a warning may be then provided.
  • the sub-image 160 may be further utilized for automatic classification.
  • the sub-image 160 may further be provided to a filtering module 240.
  • the filtering module 240 may be configured to determine a width of the sub-image 160, and may provide a warning 245 that the captured image of the serum sample is not qualified for classification if the width is less than a width threshold.
  • the sub-image 160 may have a relatively narrow width and may therefore not be qualified for classification.
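The width filter can be sketched as a one-line check; the threshold value and the warning text are illustrative assumptions, since the patent only specifies a comparison against a width threshold:

```python
def check_subimage_width(sub_image_width, width_threshold=15):
    """Return a warning string if the serum sub-image is too narrow
    (e.g., largely occluded by a barcode label) to classify reliably,
    or None if the sub-image passes the filter."""
    if sub_image_width < width_threshold:
        return "warning: captured image not qualified for classification"
    return None

assert check_subimage_width(8) is not None   # too narrow -> warning
assert check_subimage_width(40) is None      # wide enough -> passes
```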
  • the filtering module 240 may provide the sub-image 160 to a classification module 250.
  • the classification module 250 may utilize a machine learning model to determine a classification tag of the liquid sample based on the sub-image 160, wherein the classification tag may indicate whether the liquid sample 110 is normal or abnormal.
  • the examples of the utilized machine learning model may comprise, but are not limited to, various types of deep neural networks (DNN), convolutional neural networks (CNN), support vector machines (SVM), decision trees, random forest models, and so on.
  • the machine learning model may be trained based on real life liquid samples, and the corresponding index results of the real life liquid samples may be used to determine the ground-truth labels for training the machine learning model.
  • Real life liquid samples may be specimens taken from patients and/or control persons.
  • the liquid sample 110 may comprise a serum sample and the classification tag comprises at least one of: a first tag indicating that the serum sample is normal; a second tag indicating that the serum sample is hemolytic; a third tag indicating that the serum sample is icteric; or a fourth tag indicating that the serum sample is lipaemic.
  • a serum sample may be assigned with two or more tags of the second tag, the third tag and the fourth tag.
  • a serum sample may be determined as being both hemolytic and icteric.
  • At least one of the second tag, the third tag and the fourth tag may further indicate a severity level of the serum sample.
  • the severity level of the serum sample may indicate an impact on the reliability of an analytical testing result obtained from the serum sample.
  • a tag “hemolytic +++” may indicate a higher severity level than the tag “hemolytic +”, and the serum sample with the tag “hemolytic +++” may lead to a higher possibility of incorrect analytical testing results due to the high degree of hemolysis.
  • the classification tags may also be determined based on the color reference object using predetermined rules, rather than training and applying a machine learning model.
  • each color reference object may have a color corresponding to a real life serum sample and may be associated with a classification tag indicating whether the corresponding real life serum sample is normal or abnormal.
  • the first group of tubes may comprise 4 tubes of mockup liquid, and the colors of the mockup liquid may correspond to 4 real life serum samples which are determined as being normal.
  • each of the 4 tubes of mockup liquid may then be associated with the corresponding classification tag “normal”.
  • the second group of tubes may comprise 4 tubes of mockup liquid, and the colors of the mockup liquid may correspond to 4 real life serum samples which are determined as being “hemolytic”, “hemolytic +”, “hemolytic ++” and “hemolytic +++” respectively.
  • each of the 4 tubes of mockup liquid may then be associated with the corresponding classification tag “hemolytic”, “hemolytic +”, “hemolytic ++” or “hemolytic +++”.
  • the third group of tubes may comprise 4 tubes of mockup liquid, and the colors of the mockup liquid may correspond to 4 real life serum samples which are determined as being “icteric”, “icteric +”, “icteric ++” and “icteric +++” respectively.
  • each of the 4 tubes of mockup liquid may then be associated with the corresponding classification tag “icteric”, “icteric +”, “icteric ++” or “icteric +++”.
  • the fourth group of tubes may comprise 4 tubes of mockup liquid, and the colors of the mockup liquid may correspond to 4 real life serum samples which are determined as being “lipaemic”, “lipaemic +”, “lipaemic ++” and “lipaemic +++” respectively.
  • each of the 4 tubes of mockup liquid may then be associated with the corresponding classification tag “lipaemic”, “lipaemic +”, “lipaemic ++” or “lipaemic +++”.
  • the classification module 250 may determine a target color reference object from the multiple color reference objects based on a difference between baseline images of the multiple color reference objects and the sub-image. For example, the classification module 250 may convert the baseline images of the 4 groups of tubes and the sub-image of the serum sample into feature vectors, and calculate the difference based on the feature vectors. Then, a target baseline image with the shortest distance to the sub-image may be selected, and the corresponding target color reference object may be then determined.
  • the classification module 250 may determine a classification tag of the serum sample based on a classification tag associated with the target color reference object. For example, if the baseline image of the tube of mockup liquid with a tag “lipaemic ++” is determined as closest to the sub-image, the classification module 250 may determine the classification tag of the serum sample as “lipaemic ++” .
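The rule-based nearest-reference classification can be sketched as follows; the feature values, tag set, and Euclidean distance are illustrative assumptions (the patent says only that a difference between feature vectors is calculated and the closest baseline image is selected):

```python
import numpy as np

# Hypothetical reference library: one feature vector per mockup tube,
# each associated with the classification tag of the corresponding
# real life serum sample.
reference_feats = np.array([
    [0.90, 0.80, 0.70],  # "normal"
    [0.80, 0.20, 0.20],  # "hemolytic +"
    [0.90, 0.80, 0.10],  # "icteric +"
    [0.95, 0.90, 0.85],  # "lipaemic ++"
])
reference_tags = ["normal", "hemolytic +", "icteric +", "lipaemic ++"]

def classify_by_reference(sub_image_feat):
    """Assign the tag of the color reference object whose baseline-image
    feature vector is closest (Euclidean distance) to the sub-image's
    feature vector."""
    dists = np.linalg.norm(reference_feats - sub_image_feat, axis=1)
    return reference_tags[int(np.argmin(dists))]

# A sub-image feature close to the "hemolytic +" reference.
assert classify_by_reference(np.array([0.82, 0.25, 0.18])) == "hemolytic +"
```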
  • A liquid sample 110 with a certain classification tag (e.g., a tag indicating a high severity level) may be excluded from further processing.
  • the sample processing system 100 may comprise an output unit 507 in form of a display and the classification tag of the liquid sample is displayed on the display.
  • the display may be part of a pre-analytical system or the quality control system. Accordingly, an operator of the sample processing system can identify and exclude a liquid sample 110 with a tag indicating a high severity level from further processing such as for example producing an analytical testing result on an analytical system.
  • the classification tag of a liquid sample may be associated with or linked to an analytical testing result of the liquid sample.
  • the sample processing system 100 may comprise an output unit 507 in form of a display.
  • the classification tag of the liquid sample and the analytical testing result of the liquid sample are displayed together on the display.
  • the display may be part of an analytical system or the quality control system. Accordingly, an operator of the sample processing system can interpret, validate, or invalidate the analytical testing result of the liquid sample based on the displayed classification tag of the liquid sample.
  • Although the image 110, the calibrated image 150 and the sub-image 160 are shown as visible pictures, they may also be processed as digital image data, e.g., image feature vectors, without being converted to visible pictures.
  • Fig. 4 illustrates a flowchart of a process 400 of processing an image of a liquid sample according to some implementations of the subject matter as described herein.
  • the process 400 may be implemented by the quality control system 140.
  • the process 400 may also be implemented by any other devices or device clusters similar to the quality control system 140.
  • the process 400 is described with reference to Fig. 1.
  • the quality control system 140 obtains an image of a liquid sample captured by a target capturing device.
  • the quality control system 140 calibrates the image of the liquid sample using a calibration function, wherein the calibration function is determined based on a pair of images of a color reference object, and the pair of images comprise a target image captured by the target capturing device and a corresponding baseline image.
  • the quality control system 140 determines a sub-image corresponding to a target portion from the calibrated image for classification.
  • the quality control system 140 may determine a classification tag of the liquid sample based on the sub-image, wherein the classification tag indicates whether the liquid sample is normal or abnormal.
  • the liquid sample comprises a serum sample and the classification tag comprises at least one of: a first tag indicating that the serum sample is normal; a second tag indicating that the serum sample is hemolytic; a third tag indicating that the serum sample is icteric; or a fourth tag indicating that the serum sample is lipaemic.
  • At least one of the second tag, the third tag and the fourth tag further indicates a severity level of the serum sample.
  • the quality control system 140 may determine a first feature vector of the target image and a second feature vector of the baseline image, wherein the first feature vector comprises a first set of values in multiple dimensions, and the second feature vector comprises a second set of values in the multiple dimensions. Further, the quality control system 140 may determine the calibration function based on the first set of values and the second set of values.
  • calibrating the image of the liquid sample using a calibration function comprises: determining a third feature vector of the image of the liquid sample, the third feature vector comprising a third set of values in multiple dimensions; and deriving a calibrated feature vector by applying the calibration function to the third set of values.
  • the baseline image is an image of the color reference object captured by a baseline capturing device different from the target capturing device.
  • the calibration function is determined based on multiple color reference objects, each color reference object comprising one tube containing mockup liquid.
  • the liquid sample comprises a serum sample
  • each color reference object has a color corresponding to a real life serum sample and is associated with a classification tag indicating whether the corresponding real life serum sample is normal or abnormal.
  • the quality control system 140 may determine a target color reference object from the multiple color reference objects based on a difference between baseline images of the multiple color reference objects and the sub-image. Further, the quality control system 140 may determine a classification tag of the serum sample based on the classification tag associated with the target color reference object.
  • determining a sub-image corresponding to a target portion from the calibrated image for analysis comprises: determining a segmentation line indicating a boundary of the target portion in the calibrated image of the serum sample; and determining the sub-image corresponding to the target portion based on the segmentation line.
  • determining a sub-image corresponding to a target portion from the calibrated image for analysis comprises: determining a boundary of the target portion through applying a feature vector of the calibrated image to a machine learning model; and determining the sub-image from the calibrated image based on the determined boundary.
  • the quality control system 140 may provide a first warning that the captured image of the serum sample is not qualified for classification.
  • the quality control system 140 may obtain a first image of a first liquid sample captured by the target capturing device.
  • the first image may for example comprise an additional image of a liquid sample the same as or different from the liquid sample 110. Further, the quality control system 140 may provide a second warning if an attribute of the first image is out of a predetermined range.
  • the quality control system 140 may obtain a second image of a second liquid sample captured by the target capturing device.
  • the second image may for example comprise an additional image of a liquid sample the same as or different from the liquid sample 110.
  • the quality control system may calibrate the second image of the second liquid sample using the calibration function. Further, the quality control system 140 may provide a third warning if a sub-image corresponding to a target portion fails to be determined from the calibrated second image of the second liquid sample.
  • Fig. 5 illustrates a schematic block diagram of an example device 500 for implementing embodiments of the present disclosure.
  • the sample processing system 100 and/or the quality control system 140 can be implemented by the device 500.
  • the device 500 includes a central processing unit (CPU) 501, which can execute various suitable actions and processing based on the computer program instructions stored in a read-only memory (ROM) 502 or computer program instructions loaded in a random-access memory (RAM) 503 from a storage unit 508.
  • the RAM 503 may also store all kinds of programs and data required by the operations of the device 500.
  • the CPU 501, ROM 502 and RAM 503 are connected to each other via a bus 504.
  • the input/output (I/O) interface 505 is also connected to the bus 504.
  • a plurality of components in the device 500 are connected to the I/O interface 505, including: an input unit 506, for example, a keyboard, a mouse, and the like; an output unit 507, for example, various kinds of displays, loudspeakers, and the like; a storage unit 508, such as a magnetic disk, an optical disk, and the like; and a communication unit 509, such as a network card, a modem, a wireless transceiver, and the like.
  • the communication unit 509 allows the device 500 to exchange information/data with other devices via a computer network, such as the Internet, and/or various telecommunication networks.
  • the above described process and processing can also be performed by the processing unit 501.
  • the process 400 may be implemented as a computer software program being tangibly included in the machine-readable medium, for example, the storage unit 508.
  • the computer program may be partially or fully loaded and/or mounted to the device 500 via the ROM 502 and/or communication unit 509.
  • when the computer program is loaded into the RAM 503 and executed by the CPU 501, one or more steps of the above described methods or processes can be implemented.
  • the present disclosure may be a method, a device, a system and/or a computer program product.
  • the computer program product may include a computer-readable storage medium, on which the computer-readable program instructions for executing various aspects of the present disclosure are loaded.
  • the computer-readable storage medium may be a tangible device that maintains and stores instructions utilized by the instruction executing devices.
  • the computer-readable storage medium may be, but is not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device or any appropriate combination of the above.
  • the computer-readable storage medium includes: a portable computer disk, a hard disk, a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash), a static random-access memory (SRAM), a portable compact disk read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanical coding device, a punched card stored with instructions thereon, or a projection in a slot, and any appropriate combination of the above.
  • the computer-readable storage medium utilized herein is not interpreted as transient signals per se, such as radio waves or freely propagated electromagnetic waves, electromagnetic waves propagated via waveguide or other transmission media (such as optical pulses via fiber-optic cables) , or electric signals propagated via electric wires.
  • the described computer-readable program instructions may be downloaded from the computer-readable storage medium to each computing/processing device, or to an external computer or external storage via the Internet, a local area network, a wide area network and/or a wireless network.
  • the network may include copper-transmitted cables, optical fiber transmissions, wireless transmissions, routers, firewalls, switches, network gate computers and/or edge servers.
  • the network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in the computer-readable storage medium of each computing/processing device.
  • the computer program instructions for executing operations of the present disclosure may be assembly instructions, instructions of an instruction set architecture (ISA), machine instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages, e.g., Smalltalk, C++, and so on, and conventional procedural programming languages, such as the “C” language or similar programming languages.
  • the computer-readable program instructions may be implemented fully on a user computer, partially on the user computer, as an independent software package, partially on the user computer and partially on a remote computer, or completely on the remote computer or a server.
  • the remote computer may be connected to the user computer via any type of network, including a local area network (LAN) and a wide area network (WAN), or to the external computer (e.g., connected via the Internet using an Internet service provider).
  • state information of the computer-readable program instructions is used to customize an electronic circuit, e.g., a programmable logic circuit, a field programmable gate array (FPGA) or a programmable logic array (PLA) .
  • the electronic circuit may execute computer-readable program instructions to implement various aspects of the present disclosure.
  • the computer-readable program instructions may be provided to the processing unit of a general-purpose computer, dedicated computer or other programmable data processing devices to manufacture a machine, such that the instructions, when executed by the processing unit of the computer or other programmable data processing apparatuses, generate an apparatus for implementing functions/actions stipulated in one or more blocks in the flow chart and/or block diagram.
  • the computer-readable program instructions may also be stored in the computer-readable storage medium and cause the computer, programmable data processing apparatus and/or other devices to work in a particular manner, such that the computer-readable medium stored with instructions contains an article of manufacture, including instructions for implementing various aspects of the functions/actions stipulated in one or more blocks of the flow chart and/or block diagram.
  • the computer-readable program instructions may also be loaded into a computer, other programmable data processing apparatuses or other devices, so as to execute a series of operation steps on the computer, other programmable data processing apparatuses or other devices to generate a computer-implemented procedure. Therefore, the instructions executed on the computer, other programmable data processing apparatuses or other devices implement functions/actions stipulated in one or more blocks of the flow chart and/or block diagram.
  • each block in the flow chart or block diagram can represent a module, a portion of program segment or code, where the module and the portion of program segment or code include one or more executable instructions for performing stipulated logic functions.
  • the functions indicated in the block may also take place in an order different from the one indicated in the drawings. For example, two successive blocks may be in fact executed in parallel or sometimes in a reverse order depending on the involved functions.
  • each block in the block diagram and/or flow chart and combinations of the blocks in the block diagram and/or flow chart may be implemented by a hardware-based system exclusively for executing stipulated functions or actions, or by a combination of dedicated hardware and computer instructions.

Abstract

A method for processing an image (130) of a liquid sample (110), comprising: obtaining an image (130) of the liquid sample (110) captured by a target capturing device (120), (402); calibrating the image (130) of the liquid sample (110) using a calibration function (404), the calibration function being determined based on a pair of images of a color reference object, the pair of images comprising a target image captured by the target capturing device (120) and a corresponding baseline image; and determining a sub-image (160) corresponding to a target portion from the calibrated image (150) for classification (406). Method and system (140) for quality control are also provided. Through the solution, the performance of sample quality control may be improved.

Description

METHOD AND SYSTEM FOR SAMPLE QUALITY CONTROL
FIELD
The embodiments of the disclosure relate to quality control of liquid samples, for example in the field of health related diagnostics.
BACKGROUND
Diagnostic analytical testing can provide physicians with pivotal information and thus can be of great importance for health related decisions, population health management, etc. However, abnormal quality of samples, e.g., hemolysis or lipaemia, may lead to incorrect analytical testing results. Therefore, there is a need to detect such abnormal samples.
SUMMARY
It is the purpose of this invention to provide systems, methods, and mediums that expand the current state of the art.
For this purpose, systems, methods, and mediums according to the independent claims are proposed and specific embodiments of the invention are set out in the dependent claims.
In a first aspect, there is provided a computer-implemented method for processing an image of a liquid sample. The method comprises: obtaining an image of a liquid sample captured by a target capturing device; calibrating the image of the liquid sample using a calibration function, the calibration function being determined based on a pair of images of a color reference object, the pair of images comprising a target image captured by the target capturing device and a corresponding baseline image; and determining a sub-image corresponding to a target portion from the calibrated image for classification.
In some embodiments, the method further comprises: determining a classification tag of the liquid sample based on the sub-image, the classification tag indicating whether the liquid sample is normal or abnormal.
In some embodiments, the liquid sample comprises a serum sample and the  classification tag comprises at least one of: a first tag indicating that the serum sample is normal; a second tag indicating that the serum sample is hemolytic; a third tag indicating that the serum sample is icteric; or a fourth tag indicating that the serum sample is lipaemic.
In some embodiments, at least one of the second tag, the third tag and the fourth tag further indicates a severity level of the serum sample.
In some embodiments, the method further comprises: determining a first feature vector of the target image and a second feature vector of the baseline image, the first feature vector comprising a first set of values in multiple dimensions, the second feature vector comprising a second set of values in the multiple dimensions; and determining the calibration function based on the first set of values and the second set of values.
In some embodiments, calibrating the image of the liquid sample using a calibration function comprises: determining a third feature vector of the image of the liquid sample, the third feature vector comprising a third set of values in multiple dimensions; and deriving a calibrated feature vector by applying the calibration function to the third set of values.
In some embodiments, the baseline image is an image of the color reference object captured by a baseline capturing device different from the target capturing device.
In some embodiments, the calibration function is determined based on multiple color reference objects, each color reference object comprising one tube containing mockup liquid.
In some embodiments, the liquid sample comprises a serum sample, and each color reference object has a color corresponding to a real life serum sample and is associated with a classification tag indicating whether the corresponding real life serum sample is normal or abnormal.
In some embodiments, the method further comprises: determining a target color reference object from the multiple color reference objects based on a difference between baseline images of the multiple color reference objects and the sub-image; and determining a classification tag of the serum sample based on the classification tag associated with the target color reference object.
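The matching step in the embodiment above can be illustrated with a minimal sketch. The function name, the pairing of baseline feature vectors with tags, and the use of Euclidean distance as the "difference" measure are illustrative assumptions, not features disclosed herein:

```python
import math

def classify_by_reference(sub_image_vec, references):
    """references: list of (baseline_feature_vector, classification_tag) pairs.

    Selects the color reference object whose baseline image is closest to
    the sub-image (Euclidean distance here, as one possible difference
    measure) and returns the associated classification tag.
    """
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    _, tag = min(references, key=lambda ref: distance(ref[0], sub_image_vec))
    return tag
```

Any other distance or similarity measure between feature vectors could be substituted without changing the overall scheme.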
In some embodiments, determining a sub-image corresponding to a target portion from the calibrated image for analysis comprises: determining a segmentation line indicating a boundary of the target portion in the calibrated image of the serum sample; and determining the sub-image corresponding to the target portion based on the segmentation line.
In some embodiments, determining a sub-image corresponding to a target portion from the calibrated image for analysis comprises: determining a boundary of the target portion through applying a feature vector of the calibrated image to a machine learning model; and determining the sub-image from the calibrated image based on the determined boundary.
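The extraction step shared by the two embodiments above might be sketched as follows, assuming the boundary rows have already been determined (whether via a segmentation line or a machine learning model). The function name and the row-based image representation are illustrative assumptions:

```python
def crop_target_portion(image_rows, top, bottom):
    """image_rows: an image as a list of pixel rows; top/bottom: row indices
    of the boundaries enclosing the target portion (e.g. the serum layer).

    Returns the rows between the two boundaries. How the boundaries are
    found (rule-based segmentation or a learned model) is left abstract.
    """
    if not (0 <= top < bottom <= len(image_rows)):
        raise ValueError("segmentation boundaries outside the image")
    return image_rows[top:bottom]
```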
In some embodiments, the method further comprises: in accordance with a determination that a width of the sub-image is less than a width threshold, providing a first warning that the captured image of the serum sample is not qualified for classification.
In some embodiments, the method further comprises: obtaining a first image of a first liquid sample captured by the target capturing device; and providing a second warning if an attribute of the first image is out of a predetermined range.
In some embodiments, the method further comprises: obtaining a second image of a second liquid sample captured by the target capturing device; calibrating the second image of the second liquid sample using the calibration function; and providing a third warning if a sub-image corresponding to a target portion fails to be determined from the calibrated second image of the second liquid sample.
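The three warnings described in the embodiments above can be collected in one minimal sketch. The choice of brightness as the checked image attribute, the numeric thresholds, and all names are assumptions for illustration only:

```python
def quality_warnings(sub_image_width, image_brightness, sub_image_found,
                     width_threshold=20, brightness_range=(30, 220)):
    """Return the applicable warnings for a captured sample image."""
    warnings = []
    if sub_image_width < width_threshold:
        # first warning: sub-image too narrow for classification
        warnings.append("image not qualified for classification")
    lo, hi = brightness_range
    if not (lo <= image_brightness <= hi):
        # second warning: image attribute out of the predetermined range
        warnings.append("image attribute out of predetermined range")
    if not sub_image_found:
        # third warning: no target portion found in the calibrated image
        warnings.append("target portion could not be determined")
    return warnings
```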
In a second aspect, there is provided a sample processing system. The sample processing system comprises: a target capturing device being configured for capturing an image of a liquid sample; and a quality control system being configured for performing the computer-implemented method according to the first aspect.
A sample processing system may be a pre-analytical, analytical, or post-analytical system. A pre-analytical system can usually be used for the preliminary processing of sample tubes or liquid samples, e.g. sorting of sample tubes into racks or aliquoting liquid samples for further processing. An analytical system can be designed, for example, to use a liquid sample or part of the liquid sample and a reagent in order to produce a measurable signal, on the basis of which it is possible to determine whether the analyte is present, and if desired in what concentration. A post-analytical system can usually be used for the post-processing of liquid samples like the archiving of liquid samples. Such pre-analytical systems, analytical systems, and post-analytical systems are well known in the art.
In a third aspect, there is provided a quality control system. The quality control system comprises: a processing unit; and a memory coupled to the processing unit and having  instructions stored thereon that, when executed by the processing unit, cause the quality control system to perform the method according to the first aspect.
In a fourth aspect, there is provided a computer-readable medium storing instructions that when executed cause performing the method according to the first aspect.
In a fifth aspect, there is provided with a computer program product being tangibly stored on a computer storage medium and comprising machine-executable instructions which, when executed by a device, cause the device to perform the method according to the first aspect.
It is to be understood that the summary section is not intended to identify key or essential features of embodiments of the present disclosure, nor is it intended to be used to limit the scope of the present disclosure. Other features of the present disclosure will become easily comprehensible through the following description.
BRIEF DESCRIPTION OF THE DRAWINGS
Through the following detailed description with reference to the accompanying drawings, the above and other objectives, features, and advantages of example embodiments of the present disclosure will become more apparent. In the example embodiments of the present disclosure, the same reference numerals usually refer to the same components.
Fig. 1 illustrates a schematic diagram of an exemplary sample processing system according to an implementation of the subject matter described herein;
Fig. 2 illustrates a block diagram of the quality control system according to an implementation of the subject matter described herein;
Fig. 3 illustrates a schematic diagram of determining a sub-image according to an implementation of the subject matter described herein;
Fig. 4 illustrates a flowchart of a process for processing an image for a liquid sample according to an implementation of the subject matter described herein; and
Fig. 5 illustrates a schematic block diagram of an example device for implementing embodiments of the present disclosure.
DETAILED DESCRIPTION
Principle of the present disclosure will now be described with reference to some embodiments. It is to be understood that these embodiments are described only for the purpose of illustration and help those skilled in the art to understand and implement the present disclosure, without suggesting any limitation as to the scope of the disclosure. The disclosure described herein can be implemented in various manners other than the ones described below.
In the following description and claims, unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.
References in the present disclosure to “one embodiment, ” “an embodiment, ” “an example embodiment, ” and the like indicate that the embodiment described may include a particular feature, structure, or characteristic, but it is not necessary that every embodiment includes the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an example embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
It shall be understood that although the terms “first” and “second” etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and similarly, a second element could be termed a first element, without departing from the scope of example embodiments. As used herein, the term “and/or” includes any and all combinations of one or more of the listed terms.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms “a” , “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” , “comprising” , “has” , “having” , “includes” and/or “including” , when used herein, specify the presence of stated features, elements, and/or components etc., but do not preclude the presence or addition of one or more other features, elements, components and/or combinations thereof.
Exemplary Sample Processing System
As discussed above, it is important to control the quality of the liquid samples, e.g. serum samples, thereby improving the accuracy and reliability of the analytical testing results. Fig. 1 illustrates a schematic diagram of an exemplary sample processing system 100 according to an implementation of the subject matter described herein.
As shown in Fig. 1, the sample processing system 100 may comprise a target capturing device 120 for capturing an image of a liquid sample 110 to be processed. In some embodiments, the liquid sample 110 may be contained in a tube. The liquid sample may be a specimen taken from a patient or a control person. Examples of the liquid sample may include, but are not limited to, a blood sample, a serum sample, a plasma sample, a urine sample, a spinal cord fluid sample and the like.
As shown in Fig. 1, the target capturing device 120 may be configured to capture an image 130 of the liquid sample 110. In some embodiments, the target capturing device 120 may be a part of a pre-analytical system, an analytical system or a post-analytical system.
According to a traditional solution, the image 130 of the liquid sample 110 may, for example, be presented to an operator of the sample processing system. The operator may then determine whether the liquid sample is normal or abnormal based on an attribute of the image 130, e.g., a color of the image 130. In this way, the determination of sample quality depends on both the operator’s personal skills and the quality of the image. Normal samples or samples of normal quality may lead to a higher possibility of correct and reliable analytical testing results compared to abnormal samples or samples of abnormal quality.
However, different capturing devices may have cameras with different qualities,  which may lead to different color performances of the captured images. For example, an image captured by a capturing device may have a higher saturation than an image captured by another capturing device.
On the other hand, different capturing environments, e.g., different light intensities, may also lead to different colors of the image. For example, an image captured at noon may have a higher luminance than an image captured at night.
Therefore, the image 130 of the liquid sample 110 may not be able to correctly represent its true color, which may then result in an inaccurate sample quality determination for example by an operator of the sample processing system.
According to the implementations of the subject matter described herein, a solution for processing an image of a liquid sample is proposed. In this solution, an image of a liquid sample captured by a target capturing device is obtained and then calibrated using a calibration function. The calibration function is determined based on a pair of images of a color reference object, and the pair of images comprise a target image captured by the target capturing device and a corresponding baseline image. Further, a sub-image corresponding to a target portion is determined from the calibrated image for classification. In this way, by calibrating the captured image based on a pair of images for a color reference object, the embodiments may improve the performance of sample quality control.
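As a rough illustration of this flow, the following sketch wires the described steps together. All callables are placeholders supplied by the caller, not disclosed implementations; the function name and signature are assumptions:

```python
from typing import Callable, List, Optional

def process_sample_image(
    image: List[float],
    calibrate: Callable[[List[float]], List[float]],
    extract_target_portion: Callable[[List[float]], Optional[List[float]]],
    classify: Callable[[List[float]], str],
) -> Optional[str]:
    """Obtain -> calibrate -> segment -> classify, mirroring the flow above."""
    calibrated = calibrate(image)                    # remove device/lighting bias
    sub_image = extract_target_portion(calibrated)   # e.g. the serum portion
    if sub_image is None:
        return None                                  # no target portion found
    return classify(sub_image)                       # e.g. "normal", "hemolytic"
```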
As shown in Fig. 1, according to the implementations of the subject matter, the image 130 of the liquid sample 110 captured by the target capturing device 120 may be obtained by a quality control system 140. In some embodiments, the quality control system 140 may be a separate device from the target capturing device 120. In some other embodiments, the quality control system 140 may be implemented in the same device as the target capturing device 120. For example, either the target capturing device 120 or the quality control system 140 may be a part of a pre-analytical system, an analytical system or a post-analytical system.
The detailed process for processing the image 130 of the liquid sample 110 by the quality control system 140 will be discussed in detail with reference to Fig. 2.
Working Principle and Example Structure
The basic principles and several example implementations of the subject matter described herein are described below with reference to the figures.
Reference is first made to Fig. 2, which illustrates a block diagram of the quality control system 140 according to an implementation of the subject matter described herein.
As shown in Fig. 2, the quality control system 140 may comprise an obtaining module 210. The obtaining module 210 may be configured to obtain an image 130 of the liquid sample 110 captured by the target capturing device 120.
In some embodiments, the target capturing device 120 may store the captured image 130 on a shared storage device, and the obtaining module 210 may retrieve the captured image 130 from the storage device.
As shown in Fig. 2, the obtaining module 210 may provide the obtained image 130 of the liquid sample 110 to a calibration module 220. The calibration module 220 is configured to calibrate the image 130 of the liquid sample 110 using a calibration function. In some embodiments, the calibration function may be determined based on a pair of images of a color reference object, and the pair of images may comprise a target image captured by the target capturing device and a corresponding baseline image.
In some embodiments, the calibration function may be determined before real life liquid samples are to be processed by the sample processing system 100. In particular, the target capturing device 120 may first capture a target image of a color reference object. In some embodiments, a color reference object may comprise any object (s) with proper color representation. For example, a color reference object may comprise a piece of paper with a color spectrum printed thereon.
For another example, a color reference object may comprise a tube containing mockup liquid. Such mockup liquid may represent a color or a range of colors due to the contained pigment(s). A mockup liquid may be an artificially produced liquid comprising a predefined amount of pigments or a mixture of pigments defining a color or a range of colors. As an example, a tube containing mockup liquid may be provided to the target capturing device for obtaining a target image of the mockup liquid.
On the other hand, a baseline image of the color reference object may be obtained. In some embodiments, the baseline image may comprise an image of the color reference object captured by a baseline capturing device, which is different from the target capturing device 120. As an example, the baseline capturing device may comprise a finely-tuned camera, e.g., a calibrated reference camera.
For another example, the baseline capturing device may comprise a capturing device, for which the classification model has been optimized. For example, the classification model included in the classification module 250 may be trained based on the images captured by a particular capturing device, and the particular capturing device may then be used as the baseline capturing device.
Further, a first feature vector of the target image and a second feature vector of the baseline image may then be determined based on the pair of images of the color reference object. The first feature vector may comprise a first set of values in multiple dimensions and the second feature vector may comprise a second set of values in the multiple dimensions. It should be understood that any proper method such as ResNet50 may be utilized to convert the target image and the baseline image into feature vectors.
Additionally, the calibration function may be determined based on one pair of feature vectors, i.e., based on the first set of values and the second set of values. In some embodiments, the calibration function may comprise a set of co-linear functions, which are determined based on a value in the first set and a corresponding value in the second set. It should be understood that any proper function indicating the transformation between the first set of values and the second set of values may be applied, and the disclosure is not intended to be limited in this regard.
In some embodiments, the color reference object may comprise multiple color reference objects. In this case, the calibration function may be determined based on multiple pairs of feature vectors. For example, multiple co-linear functions may be determined based on the pairs of feature vectors, and each co-linear function may correspond to one of the multiple dimensions.
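The determination of per-dimension calibration functions from multiple pairs of feature vectors may be sketched as follows. This is a minimal illustration only: the function name `fit_calibration` is hypothetical, and a simple linear function (slope and intercept) per dimension is assumed as one possible realization of the co-linear functions discussed above.

```python
import numpy as np

def fit_calibration(target_vectors, baseline_vectors):
    """Fit one linear function per feature dimension.

    target_vectors, baseline_vectors: arrays of shape (num_pairs, num_dims),
    holding the feature vectors of the target images and the baseline images
    of the color reference objects.  Returns a list of (slope, intercept)
    pairs, one per dimension, mapping target-space values to baseline-space
    values.
    """
    target_vectors = np.asarray(target_vectors, dtype=float)
    baseline_vectors = np.asarray(baseline_vectors, dtype=float)
    coeffs = []
    for dim in range(target_vectors.shape[1]):
        # Least-squares fit of: baseline = slope * target + intercept.
        slope, intercept = np.polyfit(target_vectors[:, dim],
                                      baseline_vectors[:, dim], 1)
        coeffs.append((float(slope), float(intercept)))
    return coeffs
```

With more color reference objects (more pairs of feature vectors), the least-squares fit becomes more robust to noise in any single captured image.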
It should be understood that the calibration function may be determined by any proper devices or systems, including but not limited to the target capturing device 120 or the quality control system 140.
The calibration module 220 may for example obtain the determined calibration function and then utilize the calibration function to calibrate the image 130 of the liquid sample 110.
In some embodiments, the calibration module 220 may first convert the image 130 of the liquid sample 110 into a third feature vector, which comprises a third set of values in multiple dimensions. Further, the calibration module 220 may then derive a calibrated feature vector by applying the calibration function to the third set of values. As an example, the multiple co-linear functions included in the calibration function may be applied to the values in the multiple dimensions respectively.
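Applying the calibration function to the third set of values may be sketched as follows; the function name `calibrate_vector` is hypothetical, and per-dimension linear functions are again assumed.

```python
import numpy as np

def calibrate_vector(feature_vector, coeffs):
    """Apply per-dimension linear calibration functions to a feature vector
    of a captured sample image.  coeffs is a list of (slope, intercept)
    pairs, one per dimension of the feature vector.
    """
    v = np.asarray(feature_vector, dtype=float)
    # Each dimension is transformed by its own co-linear function.
    return np.array([slope * x + intercept
                     for x, (slope, intercept) in zip(v, coeffs)])
```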
In some further embodiments, before calibrating the image 130 of the liquid sample 110, the calibration module 220 may first determine whether the image 130 is qualified for calibration, and proceed with the calibration process as discussed above if the image 130 is determined as being qualified for calibration.
In some embodiments, the calibration module 220 may determine that the image 130 is not qualified for calibration in a case that an attribute of the image is out of a predetermined range. For example, if a luminance of the image 130 is greater than a threshold, the calibration module 220 may determine that the image 130 is not qualified for calibration and may provide a warning accordingly.
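The attribute check described above may be sketched as follows; the function name, the use of mean luminance as the attribute, and the threshold values are illustrative assumptions, not values given in the disclosure.

```python
import numpy as np

def is_qualified_for_calibration(image_rgb, lum_min=30.0, lum_max=220.0):
    """Return True if the image's mean luminance falls within a
    predetermined range; images outside the range would trigger a warning.
    The thresholds here are placeholder values.
    """
    img = np.asarray(image_rgb, dtype=float)
    # Rec. 601 luma weights for the R, G and B channels.
    luminance = (img[..., 0] * 0.299
                 + img[..., 1] * 0.587
                 + img[..., 2] * 0.114)
    return lum_min <= luminance.mean() <= lum_max
```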
As shown in Fig. 2, after calibrating the image 130 of the liquid sample 110, the calibration module 220 may further generate the calibrated image 150 and then provide the calibrated image 150 to a detection module 230. For example, the calibrated image 150 may be generated by processing the calibrated vector with a trained decoder. Such decoder may comprise a machine learning model configured for receiving an image feature vector and generating an image based on the image feature vector.
The detection module 230 may be configured to determine a sub-image 160 corresponding to a target portion from the calibrated image 150 for classification. In some embodiments, the detection module 230 may utilize segmentation line detection to determine the sub-image 160. In particular, the detection module 230 may determine a segmentation line indicating a boundary of the target portion in the calibrated image 150.
Taking Fig. 3 as an example, Fig. 3 illustrates a schematic diagram 300 of determining a sub-image according to an implementation of the subject matter described herein. As shown in Fig. 3, in most cases, the target portion may have a different color representation than neighboring regions.
For example, in a case that the liquid sample comprises a serum sample and the target portion is the serum portion, the detection module 230 may utilize a proper line detection method (e.g., an edge detection algorithm) to determine the segmentation line 320 or the segmentation line 350, each of which separates the serum portion from other portions in the tube.
In some embodiments, the detection module 230 may also detect the vertical segmentation lines of the target portion. In some embodiments, an inner wall of the tube may be determined as the vertical segmentation lines.
In some other embodiments, as shown in Fig. 3, the serum portion may be covered with a barcode 310. In this case, the detection module 230 may detect the boundary of the barcode for determining the vertical segmentation lines 330 and 340.
Further, the detection module 230 may determine the sub-image 160 corresponding to the target portion based on the segmentation line (s) .
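Because the target portion tends to differ in color from its neighboring regions, horizontal segmentation lines can be located where row intensity changes sharply. The sketch below is a simple row-gradient stand-in for a full edge detection algorithm (such as Canny); the function name and the approach are illustrative assumptions.

```python
import numpy as np

def find_horizontal_boundaries(gray_image, top_k=2):
    """Locate candidate horizontal segmentation lines as the rows where the
    mean row intensity changes most strongly between successive rows.
    Returns the top_k candidate row indices, sorted top-to-bottom.
    """
    img = np.asarray(gray_image, dtype=float)
    row_means = img.mean(axis=1)
    # Absolute difference between successive row means; large values mark
    # transitions between regions of different color representation.
    gradient = np.abs(np.diff(row_means))
    rows = np.argsort(gradient)[-top_k:]
    return sorted(int(r) + 1 for r in rows)
```

For a tube image, the two strongest transitions would correspond, for example, to lines such as 320 and 350 bounding the serum portion.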
In some other embodiments, the detection module 230 may also utilize a machine learning module to determine the sub-image 160. In particular, the detection module 230 may determine a boundary of the target portion through applying a feature vector of the calibrated image 150 to a machine learning model, and determine the sub-image 160 from the calibrated image 150 based on the determined boundary. As an example, an object detection model such as YOLO (You Only Look Once) Model may be trained and then applied to output the boundary of the target portion.
In this way, a sub-image 160 corresponding to the target portion may be determined for classification. For example, the sub-image 160 may be provided to an operator of the sample processing system for determining whether the liquid sample is normal or abnormal.
In some other embodiments, if the detection module 230 fails to detect a sub-image corresponding to the target portion from the calibrated image 150, the detection module 230 may provide a warning. For example, if the YOLO model is unable to provide a boundary of the serum portion, a warning may be then provided.
In a case that the sub-image 160 is successfully determined, the sub-image 160 may be further utilized for automatic classification. Referring back to Fig. 2, the sub-image 160 may further be provided to a filtering module 240.
The filtering module 240 may be configured to determine a width of the sub-image 160, and may provide a warning 245 that the captured image of the serum sample is not qualified for classification if the width is less than a width threshold.
Taking Fig. 3 as an example, if a majority part of the target portion is covered by the barcode 310, the sub-image 160 may have a relatively narrow width and may then not be qualified for classification.
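The width filtering performed by the filtering module 240 may be sketched as follows; the function name and the threshold value are illustrative assumptions.

```python
def check_subimage_width(sub_image_width, width_threshold=40):
    """Return (qualified, warning): the sub-image is rejected when its
    width in pixels falls below the threshold (a placeholder value),
    e.g. when a barcode covers most of the serum portion.
    """
    if sub_image_width < width_threshold:
        return False, ("captured image of the serum sample is not "
                       "qualified for classification")
    return True, None
```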
In some embodiments, if the width is greater than or equal to the width threshold, the filtering module 240 may provide the sub-image 160 to a classification module 250. The classification module 250 may utilize a machine learning model to determine a classification tag of the liquid sample based on the sub-image 160, wherein the classification tag may indicate whether the liquid sample 110 is normal or abnormal. Examples of the utilized machine learning model may comprise, but are not limited to, various types of deep neural networks (DNN) , convolutional neural networks (CNN) , support vector machines (SVM) , decision trees, random forest models, and so on.
In some embodiments, the machine learning model may be trained based on real life liquid samples, and the corresponding index results of the real life liquid samples may be used to determine the ground-truth labels for training the machine learning model. Real life liquid samples may be specimens taken from patients and/or control persons.
In some embodiments, the liquid sample 110 may comprise a serum sample and the classification tag comprises at least one of: a first tag indicating that the serum sample is normal; a second tag indicating that the serum sample is hemolytic; a third tag indicating that the serum sample is icteric; or a fourth tag indicating that the serum sample is lipaemic.
In some embodiments, a serum sample may be assigned with two or more tags of the second tag, the third tag and the fourth tag. For example, a serum sample may be determined as being both hemolytic and icteric.
In some embodiments, at least one of the second tag, the third tag and the fourth tag may further indicate a severity level of the serum sample. In some embodiments, the severity level of the serum sample may indicate an impact on the reliability of an analytical testing result obtained from the serum sample. As an example, a tag “hemolytic +++” may indicate a higher severity level than the tag “hemolytic +” , and the serum sample with the tag “hemolytic +++” may lead to a higher possibility of incorrect analytical testing results due to the high hemolysis.
In some embodiments, the classification tags may also be determined based on the color reference object using predetermined rules, rather than training and applying a machine learning model.
In some embodiments, as discussed above, multiple color reference objects may be used to determine the calibration function as used by the calibration module 220. To further utilize the multiple color reference objects to determine a classification tag, each color reference object may have a color corresponding to a real life serum sample and may be associated with a classification tag indicating whether the corresponding real life serum sample is normal or abnormal.
For example, for the serum samples, 4 groups of tubes containing mockup liquid may be used as the multiple color reference objects. The first group of tubes may comprise 4 tubes of mockup liquid, and the colors of the mockup liquid may be corresponding to 4 real life serum samples which are determined as being normal. In this case, each of the 4 tubes of mockup liquid may then be associated with a corresponding classification tag “normal” .
The second group of tubes may comprise 4 tubes of mockup liquid, and the colors of the mockup liquid may be corresponding to 4 real life serum samples which are determined as being “hemolytic” , “hemolytic +” , “hemolytic ++” and “hemolytic +++” respectively. In this case, each of the 4 tubes of mockup liquid may then be associated with a corresponding classification tag “hemolytic” , “hemolytic +” , “hemolytic ++” or “hemolytic +++” .
The third group of tubes may comprise 4 tubes of mockup liquid, and the colors of the mockup liquid may be corresponding to 4 real life serum samples which are determined as being “icteric” , “icteric +” , “icteric ++” and “icteric +++” respectively. In this case, each of the 4 tubes of mockup liquid may then be associated with a corresponding classification tag “icteric” , “icteric +” , “icteric ++” or “icteric +++” .
The fourth group of tubes may comprise 4 tubes of mockup liquid, and the colors of the mockup liquid may be corresponding to 4 real life serum samples which are determined as being “lipaemic” , “lipaemic +” , “lipaemic ++” and “lipaemic +++” respectively. In this case, each of the 4 tubes of mockup liquid may then be associated with a corresponding classification tag “lipaemic” , “lipaemic +” , “lipaemic ++” or “lipaemic +++” .
In this case, the classification module 250 may determine a target color reference object from the multiple color reference objects based on a difference between baseline images of the multiple color reference objects and the sub-image. For example, the classification module 250 may convert the baseline images of the 4 groups of tubes and the sub-image of the serum sample into feature vectors, and calculate the difference based on the feature vectors. Then, a target baseline image with the shortest distance to the sub-image may be selected, and the corresponding target color reference object may be then determined.
Further, the classification module 250 may determine a classification tag of the serum sample based on a classification tag associated with the target color reference object. For example, if the baseline image of the tube of mockup liquid with a tag “lipaemic ++” is determined as closest to the sub-image, the classification module 250 may determine the classification tag of the serum sample as “lipaemic ++” .
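The nearest-reference classification described above may be sketched as follows; the function name is hypothetical, and Euclidean distance between feature vectors is assumed as the difference measure.

```python
import numpy as np

def classify_by_reference(sub_image_vector, reference_vectors, reference_tags):
    """Pick the classification tag of the color reference object whose
    baseline-image feature vector is nearest (by Euclidean distance) to the
    feature vector of the sub-image of the serum sample.
    """
    v = np.asarray(sub_image_vector, dtype=float)
    refs = np.asarray(reference_vectors, dtype=float)
    distances = np.linalg.norm(refs - v, axis=1)
    return reference_tags[int(np.argmin(distances))]
```

A simple lookup like this avoids training and running a full classification model, which is the source of the lowered calculation complexity mentioned below.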
In this way, the calculation complexity for classification of the liquid sample may be lowered, thereby improving the performance of the quality control system.
In some embodiments, a liquid sample 110 with a certain classification tag, e.g. a tag indicating a high severity level, is identified and excluded from further processing. In one embodiment, the sample processing system 100 may comprise an output unit 507 in the form of a display, and the classification tag of the liquid sample is displayed on the display. For example, the display may be part of a pre-analytical system or the quality control system. Accordingly, an operator of the sample processing system can identify and exclude a liquid sample 110 with a tag indicating a high severity level from further processing such as for example producing an analytical testing result on an analytical system.
In some embodiments, the classification tag of a liquid sample may be associated with or linked to an analytical testing result of the liquid sample. In one embodiment, the sample processing system 100 may comprise an output unit 507 in the form of a display. The classification tag of the liquid sample and the analytical testing result of the liquid sample are displayed together on the display. For example, the display may be part of an analytical system or the quality control system. Accordingly, an operator of the sample processing system can interpret, validate, or invalidate the analytical testing result of the liquid sample based on the displayed classification tag of the liquid sample.
In some embodiments, though the image 130, the calibrated image 150 and the sub-image 160 are shown as visible pictures, the image 130, the calibrated image 150 and the sub-image 160 may also be processed as digital image data, e.g., image feature vectors, without being converted to visible pictures.
Example Process
Fig. 4 illustrates a flowchart of a process 400 of processing an image of a liquid sample according to some implementations of the subject matter as described herein. The process 400 may be implemented by the quality control system 140. The process 400 may also be implemented by any other devices or device clusters similar to the quality control system 140. For the purpose of description, the process 400 is described with reference to Fig. 1.
At 402, the quality control system 140 obtains an image of a liquid sample captured by a target capturing device.
At 404, the quality control system 140 calibrates the image of the liquid sample using a calibration function, wherein the calibration function is determined based on a pair of images of a color reference object, and the pair of images comprise a target image captured by the target capturing device and a corresponding baseline image.
At 406, the quality control system 140 determines a sub-image corresponding to a target portion from the calibrated image for classification.
In some embodiments, the quality control system 140 may determine a classification tag of the liquid sample based on the sub-image, wherein the classification tag indicates whether the liquid sample is normal or abnormal.
In some embodiments, the liquid sample comprises a serum sample and the classification tag comprises at least one of: a first tag indicating that the serum sample is normal; a second tag indicating that the serum sample is hemolytic; a third tag indicating that the serum sample is icteric; or a fourth tag indicating that the serum sample is lipaemic.
In some embodiments, at least one of the second tag, the third tag and the fourth tag further indicates a severity level of the serum sample.
In some embodiments, the quality control system 140 may determine a first feature vector of the target image and a second feature vector of the baseline image, wherein the first feature vector comprises a first set of values in multiple dimensions, and the second feature vector comprises a second set of values in the multiple dimensions. Further, the quality control system may determine the calibration function based on the first set of values and the second set of values.
In some embodiments, calibrating the image of the liquid sample using a calibration function comprises: determining a third feature vector of the image of the liquid sample, the third feature vector comprising a third set of values in multiple dimensions; and deriving a calibrated feature vector by applying the calibration function to the third set of values.
In some embodiments, the baseline image is an image of the color reference object captured by a baseline capturing device different from the target capturing device.
In some embodiments, the calibration function is determined based on multiple color reference objects, each color reference object comprising one tube containing mockup liquid.
In some embodiments, the liquid sample comprises a serum sample, and each color reference object has a color corresponding to a real life serum sample and is associated with a classification tag indicating whether the corresponding real life serum sample is normal or abnormal.
In some embodiments, the quality control system 140 may determine a target color reference object from the multiple color reference objects based on a difference between baseline images of the multiple color reference objects and the sub-image. Further, the quality control system 140 may determine a classification tag of the serum sample based on the classification tag associated with the target color reference object.
In some embodiments, determining a sub-image corresponding to a target portion from the calibrated image for analysis comprises: determining a segmentation line indicating a boundary of the target portion in the calibrated image of the serum sample; and determining the sub-image corresponding to the target portion based on the segmentation line.
In some embodiments, determining a sub-image corresponding to a target portion from the calibrated image for analysis comprises: determining a boundary of the target portion through applying a feature vector of the calibrated image to a machine learning model; and determining the sub-image from the calibrated image based on the determined boundary.
In some embodiments, in accordance with a determination that a width of the sub-image is less than a width threshold, the quality control system 140 may provide a first warning that the captured image of the serum sample is not qualified for classification.
In some embodiments, the quality control system 140 may obtain a first image of a first liquid sample captured by the target capturing device. The first image may for example comprise an additional image of a liquid sample the same as or different from the liquid sample 110. Further, the quality control system 140 may provide a second warning if an attribute of the first image is out of a predetermined range.
In some embodiments, the quality control system 140 may obtain a second image of a second liquid sample captured by the target capturing device. The second image may for example comprise an additional image of a liquid sample the same as or different from the liquid sample 110. The quality control system may calibrate the second image of the second liquid sample using the calibration function. Further, the quality control system 140 may provide a third warning if a sub-image corresponding to a target portion fails to be determined from the calibrated second image of the second liquid sample.
Example Device
Fig. 5 illustrates a schematic block diagram of an example device 500 for implementing embodiments of the present disclosure. For example, the sample processing system 100 and/or the quality control system 140 according to the embodiment of the present disclosure can be implemented by the device 500. As shown, the device 500 includes a central processing unit (CPU) 501, which can execute various suitable actions and processing based on the computer program instructions stored in a read-only memory (ROM) 502 or computer program instructions loaded in a random-access memory (RAM) 503 from a storage unit 508. The RAM 503 may also store all kinds of programs and data required by the operations of the device 500. The CPU 501, ROM 502 and RAM 503 are connected to each other via a bus 504. The input/output (I/O) interface 505 is also connected to the bus 504.
A plurality of components in the device 500 is connected to the I/O interface 505, including: an input unit 506, for example, a keyboard, a mouse, and the like; an output unit 507, for example, various kinds of displays and loudspeakers, and the like; a storage unit 508, such as a magnetic disk and an optical disk, and the like; and a communication unit 509, such as a network card, a modem, a wireless transceiver, and the like. The communication unit 509 allows the device 500 to exchange information/data with other devices via the computer network, such as Internet, and/or various telecommunication networks.
The above described process and processing, for example, the process 400, can also be performed by the processing unit 501. For example, in some embodiments, the process 400 may be implemented as a computer software program being tangibly included in the machine-readable medium, for example, the storage unit 508. In some embodiments, the computer program may be partially or fully loaded and/or mounted to the device 500 via the ROM 502 and/or communication unit 509. When the computer program is loaded to the RAM 503 and executed by the CPU 501, one or more steps of the above described methods or processes can be implemented.
The present disclosure may be a method, a device, a system and/or a computer program product. The computer program product may include a computer-readable storage medium, on which the computer-readable program instructions for executing various aspects of the present disclosure are loaded.
The computer-readable storage medium may be a tangible device that maintains and stores instructions utilized by the instruction executing devices. The computer-readable storage medium may be, but is not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device or any appropriate combination of the above. More concrete examples of the computer-readable storage medium (non-exhaustive list) include: a portable computer disk, a hard disk, a random-access memory (RAM) , a read-only memory (ROM) , an erasable programmable read-only memory (EPROM or flash) , a static random-access memory (SRAM) , a portable compact disk read-only memory (CD-ROM) , a digital versatile disk (DVD) , a memory stick, a floppy disk, a mechanical coding device, a punched card stored with instructions thereon, or a projection in a slot, and any appropriate combination of the above. The computer-readable storage medium utilized herein is not interpreted as transient signals per se, such as radio waves or freely propagated electromagnetic waves, electromagnetic waves propagated via waveguide or other transmission media (such as optical pulses via fiber-optic cables) , or electric signals propagated via electric wires.
The described computer-readable program instructions may be downloaded from the computer-readable storage medium to each computing/processing device, or to an external computer or external storage via Internet, local area network, wide area network and/or wireless network. The network may include copper-transmitted cables, optical fiber transmissions, wireless transmissions, routers, firewalls, switches, network gate computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in the computer-readable storage medium of each computing/processing device.
The computer program instructions for executing operations of the present disclosure may be assembly instructions, instructions of instruction set architecture (ISA) , machine instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source code or object code written in any combination of one or more programming languages, where the programming languages include object-oriented programming languages, e.g., Smalltalk, C++, and so on, and conventional procedural programming languages, such as “C” language or similar programming languages. The computer-readable program instructions may be implemented fully on a user computer, partially on the user computer, as an independent software package, partially on the user computer and partially on a remote computer, or completely on the remote computer or a server. In the case where a remote computer is involved, the remote computer may be connected to the user computer via any type of network, including a local area network (LAN) and a wide area network (WAN) , or to the external computer (e.g., connected via Internet using an Internet service provider) . In some embodiments, state information of the computer-readable program instructions is used to customize an electronic circuit, e.g., a programmable logic circuit, a field programmable gate array (FPGA) or a programmable logic array (PLA) . The electronic circuit may execute computer-readable program instructions to implement various aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to a flow chart and/or block diagram of method, device (system) and computer program products according to embodiments of the present disclosure. It should be appreciated that each block of the flow chart and/or block diagram and the combination of various blocks in the flow chart and/or block diagram can be implemented by computer-readable program instructions.
The computer-readable program instructions may be provided to the processing unit of a general-purpose computer, dedicated computer or other programmable data processing devices to manufacture a machine, such that the instructions, when executed by the processing unit of the computer or other programmable data processing apparatuses, generate an apparatus for implementing functions/actions stipulated in one or more blocks in the flow chart and/or block diagram. The computer-readable program instructions may also be stored in the computer-readable storage medium and cause the computer, programmable data processing apparatus and/or other devices to work in a particular manner, such that the computer-readable medium stored with instructions contains an article of manufacture, including instructions for implementing various aspects of the functions/actions stipulated in one or more blocks of the flow chart and/or block diagram.
The computer-readable program instructions may also be loaded into a computer, other programmable data processing apparatuses or other devices, so as to execute a series of operation steps on the computer, other programmable data processing apparatuses or other devices to generate a computer-implemented procedure. Therefore, the instructions executed on the computer, other programmable data processing apparatuses or other devices implement functions/actions stipulated in one or more blocks of the flow chart and/or block diagram.
The flow chart and block diagram in the drawings illustrate system architectures, functions and operations that may be implemented by a system, a method and a computer program product according to multiple implementations of the present disclosure. In this regard, each block in the flow chart or block diagram can represent a module, a portion of program segment or code, where the module and the portion of program segment or code include one or more executable instructions for performing stipulated logic functions. In some alternative implementations, it should be appreciated that the functions indicated in the block may also take place in an order different from the one indicated in the drawings. For example, two successive blocks may be in fact executed in parallel or sometimes in a reverse order depending on the involved functions. It should also be appreciated that each block in the block diagram and/or flow chart and combinations of the blocks in the block diagram and/or flow chart may be implemented by a hardware-based system exclusively for executing stipulated functions or actions, or by a combination of dedicated hardware and computer instructions.
Various implementations of the present disclosure have been described above; the above description is only exemplary rather than exhaustive, and the present disclosure is not limited to the implementations disclosed. Many modifications and alterations, without deviating from the scope and spirit of the explained various implementations, are obvious to those skilled in the art. The selection of terms in the text aims to best explain principles and actual applications of each implementation and technical improvements made in the market by each embodiment, or to enable others of ordinary skill in the art to understand implementations of the present disclosure.

Claims (19)

  1. A computer-implemented method for processing an image of a liquid sample, comprising:
    obtaining an image of a liquid sample captured by a target capturing device;
    calibrating the image of the liquid sample using a calibration function, the calibration function being determined based on a pair of images of a color reference object, the pair of images comprising a target image captured by the target capturing device and a corresponding baseline image; and
    determining a sub-image corresponding to a target portion from the calibrated image for classification.
  2. The method of Claim 1, further comprising:
    determining a classification tag of the liquid sample based on the sub-image, the classification tag indicating whether the liquid sample is normal or abnormal.
  3. The method of Claim 2, wherein the liquid sample comprises a serum sample and wherein the classification tag comprises at least one of:
    a first tag indicating that the serum sample is normal;
    a second tag indicating that the serum sample is hemolytic;
    a third tag indicating that the serum sample is icteric; or
    a fourth tag indicating that the serum sample is lipaemic.
  4. The method of Claim 3, wherein at least one of the second tag, the third tag and the fourth tag further indicates a severity level of the serum sample.
  5. The method of Claim 1, further comprising:
    determining a first feature vector of the target image and a second feature vector of the baseline image, the first feature vector comprising a first set of values in multiple dimensions, the second feature vector comprising a second set of values in the multiple dimensions; and
    determining the calibration function based on the first set of values and the second set of values.
  6. The method of Claim 5, wherein calibrating the image of the liquid sample using a calibration function comprises:
    determining a third feature vector of the image of the liquid sample, the third feature vector comprising a third set of values in multiple dimensions; and
    deriving a calibrated feature vector by applying the calibration function to the third set of values.
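The calibration described in claims 5 and 6 — determining a calibration function from the target and baseline feature vectors, then applying it to a sample image's feature vector — could be sketched as a per-dimension linear map. This is a minimal sketch under stated assumptions: the least-squares gain/offset form, the function names, and the use of multiple reference readings per dimension are illustrative choices, not the patent's specified implementation.

```python
def fit_calibration(target_values, baseline_values):
    """Fit a per-dimension linear map (gain, offset) so that values from
    the target capturing device match the baseline device's values.

    target_values / baseline_values: lists of feature vectors (one pair
    per color reference reading), playing the role of claim 5's first
    and second sets of values in multiple dimensions.
    """
    dims = len(target_values[0])
    params = []
    for d in range(dims):
        xs = [v[d] for v in target_values]
        ys = [v[d] for v in baseline_values]
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        var = sum((x - mean_x) ** 2 for x in xs)
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        # Least-squares gain; fall back to identity if the input is constant.
        gain = cov / var if var else 1.0
        offset = mean_y - gain * mean_x
        params.append((gain, offset))
    return params

def calibrate(sample_values, params):
    """Apply the calibration function to a sample image's feature vector
    (claim 6's third set of values), yielding the calibrated vector."""
    return [g * v + o for v, (g, o) in zip(sample_values, params)]
```

In practice the feature dimensions might be per-channel color statistics of the image, but the claims leave the choice of dimensions open.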
  7. The method of Claim 1, wherein the baseline image is an image of the color reference object captured by a baseline capturing device different from the target capturing device.
  8. The method of Claim 1, wherein the calibration function is determined based on multiple color reference objects, each color reference object comprising one tube containing mockup liquid.
  9. The method of Claim 8, wherein the liquid sample comprises a serum sample, and wherein each color reference object has a color corresponding to a real-life serum sample and is associated with a classification tag indicating whether the corresponding real-life serum sample is normal or abnormal.
  10. The method of Claim 9, further comprising:
    determining a target color reference object from the multiple color reference objects based on a difference between baseline images of the multiple color reference objects and the sub-image; and
    determining a classification tag of the serum sample based on the classification tag associated with the target color reference object.
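Claim 10's selection of a target color reference object by its difference from the sub-image could be sketched as a nearest-neighbor lookup. The squared-Euclidean distance and the `(feature_vector, tag)` pairing are illustrative assumptions; the claims do not specify the difference measure.

```python
def classify_by_reference(sub_image_vec, references):
    """Pick the color reference object whose baseline-image feature
    vector is closest to the sample's sub-image vector (claim 10) and
    return the classification tag associated with that reference.

    references: list of (feature_vector, tag) pairs, one per mockup tube.
    """
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    _, best_tag = min(references, key=lambda r: sq_dist(sub_image_vec, r[0]))
    return best_tag
```

For example, with references whose vectors are mean RGB colors, a reddish sub-image vector would land on the reference tube mocking up a hemolytic sample.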
  11. The method of Claim 1, wherein determining a sub-image corresponding to a target portion from the calibrated image for classification comprises:
    determining a segmentation line indicating a boundary of the target portion in the calibrated image of the liquid sample; and
    determining the sub-image corresponding to the target portion based on the segmentation line.
  12. The method of Claim 1, wherein determining a sub-image corresponding to a target portion from the calibrated image for classification comprises:
    determining a boundary of the target portion through applying a feature vector of the calibrated image to a machine learning model; and
    determining the sub-image from the calibrated image based on the determined boundary.
  13. The method of Claim 1, further comprising:
    in accordance with a determination that a width of the sub-image is less than a width threshold, providing a first warning that the captured image of the liquid sample is not qualified for classification.
  14. The method of Claim 1, further comprising:
    obtaining a first image of a first liquid sample captured by the target capturing device; and
    providing a second warning if an attribute of the first image is out of a predetermined range.
  15. The method of Claim 1, further comprising:
    obtaining a second image of a second liquid sample captured by the target capturing device;
    calibrating the second image of the second liquid sample using the calibration function; and
    providing a third warning if a sub-image corresponding to a target portion fails to be determined from the calibrated second image of the second liquid sample.
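The three warning paths of claims 13-15 amount to a quality gate that runs before classification. A minimal sketch is below; the specific thresholds, the use of brightness as the checked attribute, and the `None` encoding for a failed segmentation are illustrative assumptions, since the claims leave the attribute and ranges unspecified.

```python
def quality_gate(image_brightness, sub_image_width,
                 width_threshold=20, brightness_range=(30, 220)):
    """Apply the warning checks of claims 13-15 to one captured image.

    image_brightness: the checked image attribute (claim 14).
    sub_image_width: width of the segmented target portion, or None if
    no target portion could be determined (claim 15).
    Returns a list of warnings; an empty list means the image passes.
    """
    warnings = []
    lo, hi = brightness_range
    if not (lo <= image_brightness <= hi):
        # Claim 14: attribute of the image out of a predetermined range.
        warnings.append("attribute out of predetermined range")
    if sub_image_width is None:
        # Claim 15: the target portion failed to be determined.
        warnings.append("target portion could not be segmented")
    elif sub_image_width < width_threshold:
        # Claim 13: sub-image too narrow, e.g. the tube label occludes it.
        warnings.append("sub-image too narrow for classification")
    return warnings
```

A sample whose tube is largely covered by a barcode label would typically trip the width check, while an over- or under-exposed capture would trip the attribute check.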
  16. A sample processing system, comprising:
    a target capturing device configured to capture an image of a liquid sample; and
    a quality control system configured to perform the computer-implemented method according to any of Claims 1-15.
  17. A quality control system, comprising:
    a processing unit; and
    a memory coupled to the processing unit and having instructions stored thereon that, when executed by the processing unit, cause the quality control system to perform the method according to any of Claims 1-15.
  18. A computer-readable medium comprising instructions that, when executed, cause a device to perform the method according to any of Claims 1-15.
  19. A computer program product being tangibly stored on a computer storage medium and comprising machine-executable instructions which, when executed by a device, cause the device to perform the method according to any of Claims 1-15.
PCT/CN2021/087520 2021-04-15 2021-04-15 Method and system for sample quality control WO2022217544A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/087520 WO2022217544A1 (en) 2021-04-15 2021-04-15 Method and system for sample quality control

Publications (1)

Publication Number Publication Date
WO2022217544A1 true WO2022217544A1 (en) 2022-10-20

Family

ID=83639979

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/087520 WO2022217544A1 (en) 2021-04-15 2021-04-15 Method and system for sample quality control

Country Status (1)

Country Link
WO (1) WO2022217544A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002190014A (en) * 2000-09-29 2002-07-05 Konica Corp Color correction method in id card preparation system, and id card preparation system
US6654048B1 (en) * 1997-03-03 2003-11-25 Meat & Livestock Australia Limited Calibration of imaging systems
CN108562584A (en) * 2018-06-01 2018-09-21 安图实验仪器(郑州)有限公司 Serology Quality method of discrimination
CN110520737A (en) * 2017-04-13 2019-11-29 美国西门子医学诊断股份有限公司 Method and apparatus for carrying out label compensation during sample characterizes
CN110914667A (en) * 2017-07-19 2020-03-24 美国西门子医学诊断股份有限公司 Stray light compensation method and device for characterizing samples
CN111556961A (en) * 2018-01-10 2020-08-18 美国西门子医学诊断股份有限公司 Method and apparatus for biological fluid sample characterization using neural networks with reduced training

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117114511A (en) * 2023-10-23 2023-11-24 山东希尔福生物科技有限公司 Soft capsule production workshop intelligent management system based on Internet of things
CN117114511B (en) * 2023-10-23 2024-01-09 山东希尔福生物科技有限公司 Soft capsule production workshop intelligent management system based on Internet of things


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21936426

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21936426

Country of ref document: EP

Kind code of ref document: A1