CN111027409A - Liquid transfer workbench and method for recognizing and monitoring consumable materials by using liquid transfer workbench - Google Patents


Info

Publication number
CN111027409A
Authority
CN
China
Prior art keywords
consumable
image
experiment
information
area
Prior art date
Legal status
Granted
Application number
CN201911134409.1A
Other languages
Chinese (zh)
Other versions
CN111027409B (en)
Inventor
李政
罗淑芬
孙瑶
苗保刚
李明
彭年才
Current Assignee
Xi'an Tianlong Science & Technology Co ltd
Original Assignee
Xi'an Tianlong Science & Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Xi'an Tianlong Science & Technology Co ltd
Priority to CN201911134409.1A
Publication of CN111027409A
Application granted
Publication of CN111027409B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B01 PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
    • B01L CHEMICAL OR PHYSICAL LABORATORY APPARATUS FOR GENERAL USE
    • B01L3/00 Containers or dishes for laboratory use, e.g. laboratory glassware; Droppers
    • B01L3/02 Burettes; Pipettes
    • B01L3/021 Pipettes, i.e. with only one conduit for withdrawing and redistributing liquids
    • B01L9/00 Supporting devices; Holding devices
    • B01L9/02 Laboratory benches or tables; Fittings therefor
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures

Abstract

The invention relates to a pipetting workstation and a method for identifying and monitoring consumables with it, addressing the low efficiency and proneness to error of operating existing automatic pipetting workstations. The workstation comprises an experiment table, a three-dimensional motion mechanism, a pipetting pump and a machine vision component; the pipetting pump and the machine vision component are fixed on the Z axis of the three-dimensional motion mechanism. The machine vision component comprises an annular light source and an auto-focusing camera and moves in three dimensions together with the pipetting pump, so that consumables of different heights on the table top can be photographed and imaged. The method for identifying and monitoring the consumable state acquires images of specific areas of the experiment table according to a positioning procedure; determines the type, quantity, initial position, state and mutual correspondence of the consumables through image enhancement, image segmentation, image extraction and image feature comparison; and monitors in real time during the experiment whether the quantity, position and state of the consumables involved change and whether their mutual correspondence remains correct.

Description

Liquid transfer workbench and method for recognizing and monitoring consumable materials by using liquid transfer workbench
Technical Field
The invention relates to a liquid transfer workbench and a method for recognizing and monitoring consumables by using the liquid transfer workbench.
Background
With the continuing spread of modern medical diagnostic technology, common in vitro diagnostic instruments for biochemistry, immunoassay, molecular diagnostics and the like are developing toward integration and automation. The pipetting workstation is a common automated device used in these instruments; it can perform liquid-handling operations such as pipetting, gradient dilution, aliquoting and pooling, reducing manual error, improving experimental repeatability and lowering the cost of liquid handling, and is therefore in wide demand.
The operation of a pipetting workstation usually involves several different samples, reagents, consumables and pipetting flows, with the samples and reagents placed in various consumables. Verifying that this experimental information is correct, for example whether each consumable is placed in its proper position and whether its cover is open, is what ensures the accuracy of the pipetting operations and the reliability of the workflow. Relying only on the judgment and monitoring of operators is inefficient and error-prone.
Therefore, an intelligent consumable state identification and monitoring system that can automatically judge the consumable state and monitor it before and during the experiment is an urgent need of automatic pipetting workstations.
Disclosure of Invention
To solve the low efficiency and proneness to error of operating existing automatic pipetting workstations, and to address their consumable management requirements, the invention provides a pipetting workstation that can intelligently identify and monitor the consumable state, supporting the functional integration of in vitro diagnostic instruments and the automation and intelligence of their workflows.
In order to achieve the purpose, the technical scheme of the invention is as follows:
a liquid transfer worktable is characterized in that: comprises a laboratory table and a computer; a three-dimensional motion mechanism is arranged on the experiment table; a liquid moving pump and a machine vision component are arranged on the three-dimensional movement mechanism; the pipette pump is used for completing relevant operations required by tip loading and unloading, pipetting and other experiments; the machine vision component is used for photographing and imaging consumables at a certain height on the experiment table; when the three-dimensional motion mechanism runs, the liquid moving pump and the machine vision component can synchronously move along the xyz direction;
the computer comprises a memory and a processor, the memory having stored therein a computer program which, when executed in the processor, performs the process of:
S1, determining the current experiment program, the initial consumable information corresponding to the experiment program, and the consumable information corresponding to each step of the experiment, according to the test items selected by the user;
S2, controlling the three-dimensional motion mechanism to drive the machine vision component to the different areas of the experiment table, photographing the area units in each area in turn, and storing the initial images of the area units;
S3, analyzing the initial images of the area units to obtain the initial consumable information of each area, comparing it with the initial consumable information determined in step S1, and judging whether the initial consumable information of each area is correct; if it is correct, proceeding to step S5, otherwise prompting the user to update the consumables;
S4, after the user finishes updating the consumables as prompted, re-acquiring the area unit images and repeating step S3 until all the initial consumable information is correct;
S5, running the steps of the experiment program in turn: the three-dimensional motion mechanism drives the pipetting pump and the machine vision component to the experiment table area corresponding to each step and the step is executed; for steps that change consumable information, photographing the area and storing the experiment area unit image;
S6, analyzing the captured experiment area unit image to obtain the consumable information after the change, comparing it with the consumable information for that step determined in step S1, and judging whether the change is correct; if it is correct, updating the consumable information, carrying out the next step and repeating step S6 until the experiment is finished; if a fault occurs, acting and prompting according to the fault level.
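A minimal Python sketch of this S1-S6 control flow is given below. Everything in it (the StepSpec class, the stage, camera, analyze and handle_fault objects and their methods) is a hypothetical illustration of the described workflow, not an interface defined by the patent.

```python
# Hypothetical sketch of the S1-S6 workflow; all names are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class StepSpec:
    region: str                # bench region the step operates in
    expected: dict             # consumable info expected after the step
    changes_consumables: bool  # whether the step needs a follow-up photograph

def run_protocol(protocol, expected_initial, stage, camera, analyze, handle_fault):
    # S2-S4: verify the initial consumable layout region by region
    for region, expected in expected_initial.items():
        while True:
            stage.move_to(region)                 # drive the X/Y/Z axes
            observed = analyze(camera.capture())  # S3: image -> consumable info
            if observed == expected:
                break
            input(f"Consumables in {region} are wrong; reload and press Enter.")  # S4
    # S5-S6: run each step and re-check the consumables it changed
    state = dict(expected_initial)
    for step in protocol:                         # protocol is a list of StepSpec
        stage.move_to(step.region)                # move pump + camera, then operate
        if step.changes_consumables:
            observed = analyze(camera.capture())
            if observed != step.expected:
                handle_fault(step)                # act according to the fault level
                continue
        state[step.region] = step.expected        # update the tracked consumable info
    return state
```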
Further, the experiment program, the initial consumable information corresponding to it and the consumable information corresponding to each step of the experiment in step S1 are preset in the computer:
the experiment program comprises the operation content, the number of operations and the operation flow;
the initial consumable information and the per-step consumable information each comprise the consumable type, the position coordinates of the consumable area, the consumable quantity, the consumable hole site states, the quantity and positions of the samples to be tested, and the correspondence between the samples to be tested and the type, quantity and position of the consumables required to test them.
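Purely as an illustration, the preset information listed above could be organized as data structures like the following; the class and field names are assumptions and do not come from the patent.

```python
# Hypothetical layout of the preset experiment/consumable information.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class ConsumableInfo:
    consumable_type: str                       # e.g. "pipette tip", "magnetic rod sleeve"
    region_xy: Tuple[float, float]             # position coordinates of the consumable area
    quantity: int                              # expected number of consumables
    well_states: Dict[int, str] = field(default_factory=dict)  # hole site -> "present"/"empty"

@dataclass
class ExperimentPreset:
    program: List[dict]                              # operation content, counts and flow
    initial_consumables: Dict[str, ConsumableInfo]   # region id -> info before the run
    per_step_consumables: List[ConsumableInfo]       # expected info after each step
    sample_to_consumables: Dict[str, dict]           # sample -> required type/quantity/position
```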
Further, the method for obtaining the initial consumable information of each area unit image in step S3 and the method for obtaining the consumable information after the consumable information is changed in the experimental process in step S6 both include the following steps:
step 1a, determining the type of a consumable, the position coordinates of a consumable area and the image characteristics of the consumable;
determining the area position coordinates corresponding to the area unit initial image or the experiment area unit image, and acquiring the type of consumables and consumable image characteristics corresponding to the area unit initial image or the experiment area unit image according to prior information;
the prior information comprises the type of regional consumables, the image characteristics of each type of consumables, regional position coordinates, the number of consumable hole sites in a template region unit, the central coordinates of hole sites in the template region unit and the side length M of the hole sites;
step 2a, determining the quantity and the hole site state of consumables;
step 2a1, determining the approximate positions of all consumable hole sites on the area unit initial image or the experiment area unit image;
mapping the hole site center coordinates of the template area unit in the prior information onto the currently captured area unit image, binarizing the image and searching for connected regions using all the template hole site centers as base points; each connected region found is the approximate position of a consumable hole site on the area unit image;
step 2a2, determining the true center positions of all consumable hole sites on the area unit initial image or the experiment area unit image, obtaining the consumable quantity from the number of true center positions, and extracting the local image S of each consumable hole site;
comparing the approximate-position center of each consumable hole site obtained in step 2a1 with the hole site center coordinates of the template area unit to obtain the offset of each hole site on the image, and averaging these offsets; taking the average offset as the true offset of all hole sites on the image, and obtaining the true center position of each consumable hole site on the currently captured image from the true offset and the template hole site center coordinates;
taking each true center position as a base point, extracting an M × M square area on the currently captured area unit initial image or experiment area unit image using the hole site side length M corresponding to the consumable type in the prior information, and taking this square area as the local image S of that consumable hole site;
step 2a3, calculating the image features of the consumable hole site local image; the image features include the energy feature, the contrast feature of the gray level co-occurrence matrix, the cross-correlation similarity, and the standard deviation of the image histogram;
step 2a4, judging the state of each consumable hole site from the image features of its local image: for the energy feature and the contrast feature of the gray level co-occurrence matrix, a feature value greater than the decision threshold means a consumable is present in the hole site, and a value less than or equal to the threshold means the hole site is empty; for the cross-correlation similarity and the standard deviation of the image histogram, a feature value smaller than the decision threshold means a consumable is present, and a value greater than or equal to the threshold means the hole site is empty;
and 2a5, outputting the states of all consumable hole sites.
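The sketch below shows one way steps 2a1-2a2 could be realized with NumPy/SciPy under simplifying assumptions (a single global binarization threshold, nearest-region matching); the function and parameter names are illustrative, not the patent's.

```python
# Illustrative implementation of steps 2a1-2a2: coarse well localization by
# binarization + connected regions, offset averaging, and M x M patch extraction.
import numpy as np
from scipy import ndimage

def locate_wells(image, template_centers, M, threshold=128):
    binary = image > threshold                        # 2a1: binarize the unit image
    labels, n = ndimage.label(binary)                 # connected regions = coarse positions
    if n == 0:
        return [], [], 0
    coarse = np.array(ndimage.center_of_mass(binary, labels, range(1, n + 1)))

    # offset of each template hole site to its nearest coarse region
    offsets = []
    for (ty, tx) in template_centers:
        d = np.hypot(coarse[:, 0] - ty, coarse[:, 1] - tx)
        offsets.append(coarse[d.argmin()] - np.array([ty, tx]))
    mean_offset = np.mean(offsets, axis=0)            # 2a2: average offset = true offset

    centers, patches = [], []
    half = M // 2
    for (ty, tx) in template_centers:
        cy = int(round(ty + mean_offset[0]))          # true center from template + offset
        cx = int(round(tx + mean_offset[1]))
        centers.append((cy, cx))
        patches.append(image[cy - half:cy + half, cx - half:cx + half])  # local image S
    return centers, patches, n                        # n approximates the consumable count
```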
Further, when there is only one consumable in the zone unit, the method of obtaining the initial consumable information of each zone unit image in step S3 and obtaining the consumable information after the consumable information is changed in the experimental process in step S6 includes the following steps:
step 1b, determining the type of the consumable, the position coordinates of the consumable area and the image characteristics of the consumable;
determining the area position coordinates corresponding to the area unit initial image or the experiment area unit image, and acquiring the type of consumables and consumable image characteristics corresponding to the area unit initial image or the experiment area unit image according to prior information;
the prior information comprises the type of regional consumables, the image characteristics of each type of consumables, regional position coordinates, the number of consumable hole sites in a template region unit, the central coordinates of hole sites in the template region unit and the side length M of the hole sites;
step 2b, determining the hole site state of the consumable;
step 2b1, calculating the image features of the consumable hole site local image; the image features include the energy feature, the contrast feature of the gray level co-occurrence matrix, the cross-correlation similarity, and the standard deviation of the image histogram;
step 2b2, judging the state of each consumable hole site from the image features of its local image: for the energy feature and the contrast feature of the gray level co-occurrence matrix, a feature value greater than the decision threshold means a consumable is present in the hole site, and a value less than or equal to the threshold means the hole site is empty; for the cross-correlation similarity and the standard deviation of the image histogram, a feature value smaller than the decision threshold means a consumable is present, and a value greater than or equal to the threshold means the hole site is empty;
and 2b3, outputting the states of all consumable hole sites.
Further, the image feature of the consumable hole site local images in steps 2a3 and 2b1 may be the energy feature, calculated by the following formula:
$$E = \sum_{i=1}^{M}\sum_{j=1}^{M} S(i,j)^{2}$$
where i and j index the pixels of the local image S, i denoting the ith row and j the jth column, and M is the width and height of the local image S;
further, the image feature of the consumable hole site local images in steps 2a3 and 2b1 may also be the contrast feature of the gray level co-occurrence matrix, calculated by the following formula:
$$Con = \sum_{n=0}^{k-1} n^{2} \sum_{|i-j|=n} G(i,j)$$
where G is the gray level co-occurrence matrix, k is the total number of gray levels, and n denotes the nth gray level;
further, the image feature of the consumable hole site local images in steps 2a3 and 2b1 may also be the cross-correlation similarity with the template area unit image in the prior information, calculated by the following formula:
$$R = \frac{\sum_{i=1}^{M}\sum_{j=1}^{M} S(i,j)\,T(i,j)}{\sqrt{\sum_{i=1}^{M}\sum_{j=1}^{M} S(i,j)^{2}}\,\sqrt{\sum_{i=1}^{M}\sum_{j=1}^{M} T(i,j)^{2}}}$$
where S is the local image of the current image, T is the template area unit image in the prior information, and M is the width and height of S and T;
or, the image feature of the hole site local images in steps 2a3 and 2b1 is the standard deviation of the image histogram, calculated by the following formula:
$$\sigma = \sqrt{\sum_{i=0}^{L-1} \left(z_{i} - m\right)^{2} p(z_{i})}$$
where L is the total number of gray levels, z_i denotes the ith gray level, p(z_i) is the normalized histogram probability of gray level z_i, and m is the mean of the histogram.
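For illustration, the four features can be computed in NumPy as below, following the formulas above; the construction of the gray level co-occurrence matrix itself (offsets, quantization) is assumed to be given, and the normalization choices are assumptions.

```python
# Illustrative NumPy implementations of the four hole-site features.
import numpy as np

def energy(S):
    return float(np.sum(S.astype(np.float64) ** 2))   # sum of squared pixel values

def glcm_contrast(G):
    i, j = np.indices(G.shape)                         # (i - j)^2 weighting is equivalent
    return float(np.sum(((i - j) ** 2) * G))           # to summing n^2 over |i - j| = n

def cross_correlation(S, T):
    S = S.astype(np.float64)
    T = T.astype(np.float64)
    return float(np.sum(S * T) / (np.sqrt(np.sum(S ** 2)) * np.sqrt(np.sum(T ** 2))))

def histogram_std(S, L=256):
    hist, _ = np.histogram(S, bins=L, range=(0, L))
    p = hist / hist.sum()                              # normalized histogram p(z_i)
    z = np.arange(L)
    m = np.sum(z * p)                                  # histogram mean
    return float(np.sqrt(np.sum(((z - m) ** 2) * p)))
```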
Further, the fault levels in step S6 are as follows:
Level I fault: the consumable information is wrong, but subsequent steps of the experiment program are not affected; the experiment program keeps running and the user is shown the fault information after the experiment finishes;
Level II fault: the consumable information is wrong and subsequent steps are not affected, but the experimental results may be; the experiment program is paused, the user is shown the fault information, and the system waits for the user to continue or abort the current experiment;
Level III fault: the consumable information is wrong and subsequent steps of the experiment program cannot run; the experiment program is stopped, the user is shown the fault information, and the current experiment is terminated.
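A minimal sketch of how the three fault levels could be dispatched is shown below; the enum values and the pause/stop/notify hooks are hypothetical placeholders.

```python
# Illustrative dispatch of the three fault levels.
from enum import Enum

class FaultLevel(Enum):
    LEVEL_I = 1    # wrong consumable info, later steps and results unaffected
    LEVEL_II = 2   # later steps unaffected, but the results may be
    LEVEL_III = 3  # later steps cannot run

def handle_fault(level, message, pause, stop, notify):
    if level is FaultLevel.LEVEL_I:
        notify("after run: " + message)   # keep running, report once the run finishes
    elif level is FaultLevel.LEVEL_II:
        pause()                           # wait for the user to continue or abort
        notify(message)
    else:
        stop()                            # terminate the current experiment
        notify(message)
```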
Further, the prior information is stored in the computer in advance: the three-dimensional motion mechanism is controlled to drive the machine vision component, template images and template area unit images of the layout areas of the experiment table are captured, the area coordinate positions, the area consumable types and the image features of each consumable type are obtained from the layout-area template images, and the number of consumable hole sites, the hole site center coordinates and the hole site side length M of each template area unit are obtained from the template area unit images.
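The prior (template) information described here might be stored in a structure like the following; the field names are assumptions for illustration only.

```python
# Hypothetical container for the per-region prior information.
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class RegionPrior:
    consumable_type: str                   # type of consumable placed in this region
    feature_template: Dict[str, float]     # reference image features for this type
    region_xy: Tuple[float, float]         # region coordinate position on the bench
    well_count: int                        # number of hole sites per template area unit
    well_centers: List[Tuple[int, int]]    # hole site centers in the template unit image
    well_side_px: int                      # hole site side length M in pixels
```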
Further, after step S6 the computer program, when executed in the processor, also performs the following:
displaying the number of consumables used in the experiment, the number of consumables remaining, and other experiment information.
Further, the three-dimensional motion mechanism comprises at least one X-axis motion mechanism, one Y-axis motion mechanism and one Z-axis motion mechanism;
the experiment table is provided with a fixed frame, and the X-axis movement mechanism is arranged at the top of the fixed frame and can move along the top of the fixed frame in the X-axis direction; a Y-axis movement mechanism is arranged below the experiment table base and used for driving the fixed frame to move along the Y-axis direction; the Z-axis movement mechanism is arranged on the X-axis movement mechanism, and the X-axis movement mechanism is used for driving the Z-axis movement mechanism to move along the X-axis direction; the liquid transfer pump and the machine vision part are fixed on a Z-axis movement mechanism of the three-dimensional movement mechanism;
the machine vision components include an annular light source and an automatically focusing camera. But auto focusing's camera can carry out three-dimensional motion along with the liquid-transfering pump together to the formation of image is shot to the consumptive material on the laboratory bench. The annular light source can carry out three-dimensional motion along with the liquid-transfering pump together, carries out the light filling when the formation of image is shot to the camera, guarantees image quality.
The invention also provides a method for identifying and monitoring consumables by using the liquid transfer workbench, which comprises the following steps:
step one, presetting information
Presetting an experiment program, initial consumable information corresponding to the experiment program and consumable information corresponding to each step in the experiment process;
the experimental program comprises operation content, operation times and an operation flow;
the initial consumable information corresponding to the experiment program and the consumable information corresponding to each step of the experiment each comprise the consumable type, the position coordinates of the consumable area, the consumable quantity, the consumable hole site states, the quantity and positions of the samples to be tested, and the correspondence between the samples to be tested and the type, quantity and position of the consumables required to test them;
step two, obtaining prior information
The three-dimensional motion mechanism is controlled to drive the machine vision component, template images and template area unit images of the layout areas of the experiment table are captured, and the area coordinate positions, the area consumable types and the consumable image features are obtained from the layout-area template images; the number of consumable hole sites, the hole site center coordinates and the hole site side length M of each template area unit are obtained from the template area unit images;
step three, determining the current experiment program, the initial consumable information corresponding to it and the consumable information corresponding to each step of the experiment, according to the test items selected by the user;
step four, controlling the three-dimensional motion mechanism to drive the machine vision component to the different areas of the experiment table for photographing, and storing the initial images of the area units;
step five, analyzing the initial images of the area units to obtain the initial consumable information of each area unit, comparing it with the initial consumable information determined in step three, and judging whether the initial consumable information of each area unit is correct; if it is correct, proceeding to step seven, otherwise prompting the user to update the consumables;
step six, after the user finishes updating the consumables as prompted, re-acquiring the area unit images and repeating step five until all the initial consumable information is correct;
step seven, running the steps of the experiment program in turn: the three-dimensional motion mechanism drives the pipetting pump and the machine vision component to the experiment area unit corresponding to each step and the step is executed; for steps that change consumable information, photographing the area and storing the experiment area unit image;
step eight, analyzing the captured experiment area unit image to obtain the consumable information after the change, comparing it with the consumable information for that step determined in step three, and judging whether the change is correct; if it is correct, updating the consumable information, carrying out the next step and repeating step eight until the experiment is finished; if a fault occurs, acting and prompting according to the fault level.
Further, the method for obtaining the initial consumable information of each area unit image in the step five and obtaining the consumable information after the consumable information is changed in the experiment process in the step eight respectively comprises the following steps:
step 1a, determining the type of a consumable, the position coordinates of a consumable area and the image characteristics of the consumable;
determining the area position coordinates corresponding to the area unit initial image or the experiment area unit image, and acquiring the type of consumables and consumable image characteristics corresponding to the area unit initial image or the experiment area unit image according to prior information;
the prior information comprises the type of regional consumables, the image characteristics of each type of consumables, the position of a regional coordinate, the number of consumable hole sites in a template region unit, the central coordinate of hole sites in the template region unit and the side length M of the hole sites;
step 2a, determining the quantity and the hole site state of consumables;
step 2a1, determining the approximate positions of all consumable hole sites on the area unit initial image or the experiment area unit image;
mapping the hole site center coordinates of the template area unit in the prior information onto the currently captured area unit image, binarizing the image and searching for connected regions using all the template hole site centers as base points; each connected region found is the approximate position of a consumable hole site on the area unit image;
step 2a2, determining the true center positions of all consumable hole sites on the area unit initial image or the experiment area unit image, obtaining the consumable quantity from the number of true center positions, and extracting the local image S of each consumable hole site;
comparing the approximate-position center of each consumable hole site obtained in step 2a1 with the hole site center coordinates of the template area unit to obtain the offset of each hole site on the image, and averaging these offsets; taking the average offset as the true offset of all hole sites on the image, and obtaining the true center position of each consumable hole site on the currently captured image from the true offset and the template hole site center coordinates;
taking each true center position as a base point, extracting an M × M square area on the currently captured area unit initial image or experiment area unit image using the hole site side length M corresponding to the consumable type in the prior information, and taking this square area as the local image S of that consumable hole site;
step 2a3, calculating the image characteristics of the consumable hole site local image; the image characteristics comprise energy characteristics, contrast characteristics of a gray level co-occurrence matrix, cross-correlation similarity and standard deviation of an image histogram;
step 2a4, judging the state of each consumable hole site from the image features of its local image: for the energy feature and the contrast feature of the gray level co-occurrence matrix, a feature value greater than the decision threshold means a consumable is present in the hole site, and a value less than or equal to the threshold means the hole site is empty; for the cross-correlation similarity and the standard deviation of the image histogram, a feature value smaller than the decision threshold means a consumable is present, and a value greater than or equal to the threshold means the hole site is empty;
and 2a5, outputting the states of all consumable hole sites.
Further, when the area unit contains only one consumable, the method for obtaining the initial consumable information of each area unit image in step five and for obtaining the consumable information after a change during the experiment in step eight comprises the following steps:
step 1b, determining the type of the consumable, the position coordinates of the consumable area and the image characteristics of the consumable;
determining the area position coordinates corresponding to the area unit initial image or the experiment area unit image, and acquiring the type of consumables and consumable image characteristics corresponding to the area unit initial image or the experiment area unit image according to prior information;
the prior information comprises the type of regional consumables, the image characteristics of each type of consumables, regional position coordinates, the number of consumable hole sites in a template region unit, the central coordinates of hole sites in the template region unit and the side length M of the hole sites;
step 2b, determining the hole site state of the consumable;
step 2b1, calculating the image characteristics of the consumable hole site local image; the image characteristics comprise energy characteristics, contrast characteristics of a gray level co-occurrence matrix, cross-correlation similarity and standard deviation of an image histogram;
step 2b2, judging the state of each consumable hole site from the image features of its local image: for the energy feature and the contrast feature of the gray level co-occurrence matrix, a feature value greater than the decision threshold means a consumable is present in the hole site, and a value less than or equal to the threshold means the hole site is empty; for the cross-correlation similarity and the standard deviation of the image histogram, a feature value smaller than the decision threshold means a consumable is present, and a value greater than or equal to the threshold means the hole site is empty;
and 2b3, outputting the states of all consumable hole sites.
Further, the image feature of the consumable hole site local images in steps 2a3 and 2b1 may be the energy feature, calculated by the following formula:
$$E = \sum_{i=1}^{M}\sum_{j=1}^{M} S(i,j)^{2}$$
where i and j index the pixels of the local image S, i denoting the ith row and j the jth column, and M is the width and height of the local image S;
or, the image feature of the consumable hole site local images in steps 2a3 and 2b1 is the contrast feature of the gray level co-occurrence matrix, calculated by the following formula:
$$Con = \sum_{n=0}^{k-1} n^{2} \sum_{|i-j|=n} G(i,j)$$
where G is the gray level co-occurrence matrix, k is the total number of gray levels, and n denotes the nth gray level;
or, the image feature of the consumable hole site local images in steps 2a3 and 2b1 is the cross-correlation similarity with the template area unit image in the prior information, calculated by the following formula:
$$R = \frac{\sum_{i=1}^{M}\sum_{j=1}^{M} S(i,j)\,T(i,j)}{\sqrt{\sum_{i=1}^{M}\sum_{j=1}^{M} S(i,j)^{2}}\,\sqrt{\sum_{i=1}^{M}\sum_{j=1}^{M} T(i,j)^{2}}}$$
where S is the local image of the current image, T is the template area unit image in the prior information, and M is the width and height of S and T;
or, the image feature of the consumable hole site local images in steps 2a3 and 2b1 is the standard deviation of the image histogram, calculated by the following formula:
$$\sigma = \sqrt{\sum_{i=0}^{L-1} \left(z_{i} - m\right)^{2} p(z_{i})}$$
where L is the total number of gray levels, z_i denotes the ith gray level, p(z_i) is the normalized histogram probability of gray level z_i, and m is the mean of the histogram.
Further, the fault levels in step eight are as follows:
Level I fault: the consumable information is wrong, but subsequent steps of the experiment program are not affected; the experiment program keeps running and the user is shown the fault information after the experiment finishes;
Level II fault: the consumable information is wrong and subsequent steps are not affected, but the experimental results may be; the experiment program is paused, the user is shown the fault information, and the system waits for the user to continue or abort the current experiment;
Level III fault: the consumable information is wrong and subsequent steps of the experiment program cannot run; the experiment program is stopped, the user is shown the fault information, and the current experiment is terminated.
The invention has the beneficial effects that:
1. The invention photographs the different areas before the experiment starts according to the preset experiment program and consumable information and judges whether the consumable information is correct, a check that would be inefficient and error-prone if it relied only on the judgment and monitoring of operators. During the experiment it also monitors the consumable information of each step and judges its correctness, ensuring the accuracy of the pipetting operations and the reliability of the workflow throughout.
2. Because the pipetting pump and the machine vision component are fixed on the three-dimensional motion mechanism, the camera can move with the pipetting pump to any position of the system and its photographing range covers the whole experiment table. Since the auto-focusing camera also travels along the Z axis, it can photograph areas of different heights on the experiment table top.
Drawings
FIG. 1 is a schematic structural diagram of a pipetting station according to an embodiment of the invention;
FIG. 2 is a schematic diagram of the Z-axis motion mechanism, pipetting pump and machine vision components of an embodiment of the invention;
FIG. 3 is a schematic view of a working flow of a pipetting station according to an embodiment of the invention;
FIG. 4 is a schematic view illustrating a process of acquiring consumable information according to the present invention;
FIG. 5 is a comparison of a regional unit image taken by a machine vision component of one embodiment of the present invention and the computer-determined result image, in this embodiment the consumable identified is a pipette tip;
FIG. 6 is a comparison of a region unit image captured by a machine vision component and the computer-determined result image according to an embodiment of the present invention, wherein the consumable identified in this embodiment is a magnetic rod sleeve;
the reference numbers in the figures are: the system comprises a test table 1, a 2-X axis motion mechanism, a 3-Y axis motion mechanism, a 4-Z axis motion mechanism, a 5-liquid moving pump, a 6-machine vision component, a 7-fixed frame, an 8-annular light source and a 9-camera.
Detailed Description
To further clarify the objects, advantages and features of the present invention, the system and method for identifying and monitoring pipetting workstation consumables according to the present invention are described in more detail below with reference to the accompanying drawings and specific examples. It should be noted that the drawings are in a very simplified form and not to precise scale, serving only to describe the embodiments conveniently and clearly; the structures shown are often only part of the actual structures; and the drawings emphasize different aspects and sometimes use different scales.
The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
As can be seen from fig. 1, the liquid transferring table of the present embodiment includes a laboratory table 1, a three-dimensional movement mechanism including an X-axis movement mechanism 2, a Y-axis movement mechanism 3, and a Z-axis movement mechanism 4, a liquid transferring pump 5 fixed on the Z-axis movement mechanism 4, a machine vision component 6, and a computer.
The experiment table 1 is provided with a fixed frame 7, and the X-axis movement mechanism 2 is arranged at the top of the fixed frame 7 and can move along the X-axis direction; a Y-axis movement mechanism 3 is arranged below the base of the experiment table 1 and used for driving the fixed frame 7 to move along the Y-axis direction; the Z-axis movement mechanism 4 is arranged on the X-axis movement mechanism 2 and can move along the X-axis direction under the driving of the X-axis movement mechanism 2.
As can be seen from fig. 2, the pipetting pump 5 and the machine vision part 6 are fixed on the Z-axis movement mechanism 4.
The experiment table comprises several areas, each area corresponds to one type of consumable, each type of consumable corresponds to one image feature, and each area comprises several identical area units. The experiment table 1 is loaded with the samples, consumables and/or reagents required by the experiment; the user can lay them out according to the specific experimental requirements and flow, and the number of consumables loaded must not be less than the threshold number needed by the experiment. The three-dimensional motion mechanism can drive the pipetting pump 5 to a specific position of the experiment table 1 to perform tip loading, pipetting and the other operations required by the experiment. At the same time, it can drive the machine vision component 6 to a specific position of the experiment table 1 to identify consumables and monitor other operations. The machine vision component 6 comprises an annular light source 8 and an auto-focusing camera 9; the camera 9 moves in three dimensions with the pipetting pump 5 and can therefore photograph and image the consumables on the experiment table 1, while the annular light source 8, also moving with the pipetting pump, provides fill light during capture to ensure image quality. Because the camera 9 of the machine vision component 6 can auto-focus and also travels along the Z axis with the Z-axis motion, it can adjust its focus distance and field of view and thus photograph and image consumables of different heights on the table top of the experiment table 1.
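For illustration only, photographing one region could look like the sketch below; the device objects and their methods (move_xyz, autofocus, capture) are placeholders, not the workstation's actual control interface.

```python
# Illustrative capture routine: move pump + camera over the region, light, focus, shoot.
def photograph_region(stage, ring_light, camera, region_xyz):
    x, y, z = region_xyz            # z chosen from the known height of the consumable
    stage.move_xyz(x, y, z)         # pipetting pump and vision unit travel together
    ring_light.on()                 # fill light keeps image quality stable
    camera.autofocus()              # refocus because consumables differ in height
    frame = camera.capture()
    ring_light.off()
    return frame
```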
The computer comprises a memory and a processor, wherein the memory stores a computer program, and the computer program needs preset information and acquires prior information before executing the program:
preset information
Presetting an experiment program, initial consumable information corresponding to the experiment program and consumable information corresponding to each step in the experiment process;
the experimental program comprises operation content, operation times and an operation flow;
wherein the initial consumable information corresponding to the experiment program and the consumable information corresponding to each step of the experiment each comprise the consumable type, the consumable area position coordinates, the consumable quantity, the consumable hole site states, the quantity and positions of the samples to be tested, and the correspondence between the samples to be tested and the type, quantity and position of the consumables required to test them.
Acquiring prior information
The three-dimensional motion mechanism is controlled to drive the machine vision component, template images and template area unit images of the layout areas of the experiment table are captured, and the area coordinate positions, the area consumable types and the consumable image features are obtained from the layout-area template images; the number of consumable hole sites, the hole site center coordinates and the hole site side length M of each template area unit are obtained from the template area unit images.
The computer program, when executed in a processor, implements the following process, as shown in fig. 3:
S1, determining the current experiment program, the initial consumable information corresponding to it and the consumable information corresponding to each step of the experiment, according to the test items selected by the user; the user loads the consumables into the system according to the consumable information provided by the system software.
S2, controlling the three-dimensional motion mechanism to drive the machine vision component to the different area units of the experiment table for photographing, and storing the initial images of the area units;
S3, analyzing the initial image of each area unit to obtain its initial consumable information, comparing it with the initial consumable information determined in step S1, and judging whether the initial consumable information of each area unit is correct; if it is correct, proceeding to step S5, otherwise prompting the user to update the consumables;
S4, after the user finishes updating the consumables as prompted, re-acquiring the area unit images and repeating step S3 until all the initial consumable information is correct;
S5, running the steps of the experiment program in turn: the three-dimensional motion mechanism drives the pipetting pump and the machine vision component to the experiment table area unit corresponding to each step and the step is executed; for steps that change consumable information, photographing the area and storing the experiment area unit image;
S6, analyzing the captured experiment area unit image to obtain the consumable information after the change, comparing it with the consumable information for that step determined in step S1, and judging whether the change is correct; if it is correct, updating the consumable information, carrying out the next step and repeating step S6 until the experiment is finished; if a fault occurs, acting and prompting according to the fault level.
Specifically, the fault levels and the corresponding actions are:
Level I fault: the consumable information is wrong, but subsequent steps of the experiment program are not affected. The system keeps running the experiment program and shows the user the fault information after the experiment finishes;
Level II fault: the consumable information is wrong and subsequent steps are not affected, but the experimental results may be. The system pauses the experiment program, shows the user the fault information and waits for the user to continue or abort the current experiment;
Level III fault: the consumable information is wrong and subsequent steps of the experiment program cannot run. The system stops the experiment program, shows the user the fault information and waits for the current experiment to be terminated;
S7. After the experiment finishes, the system displays the number of consumables used in the experiment, the number of consumables remaining, and other experiment information.
The method for obtaining the initial consumable information of each area unit image in step S3 and obtaining the consumable information after the consumable information is changed in the experimental process in step S6, as shown in fig. 4, includes the following steps:
step 1, acquiring a consumable type, consumable image characteristics and a region coordinate position corresponding to a region unit initial image or an experiment region unit image according to prior information;
the prior information comprises the type of regional consumables, the image characteristics of each type of consumables, the position of a regional coordinate, the number of consumable hole sites in a template region unit, the central coordinate of hole sites in the template region unit and the side length M of the hole sites;
step 2, determining the approximate positions of all consumable hole sites on the area unit initial image or the experiment area unit image;
mapping the hole site center coordinates of the template area unit in the prior information onto the area unit initial image or the experiment area unit image, binarizing the image and searching for connected regions using all the template hole site centers as base points; each connected region found is the approximate position of a consumable hole site on the area unit initial image or the experiment area unit image;
step 3, determining the true center positions of all consumable hole sites on the area unit initial image or the experiment area unit image, and extracting the local image S of each consumable hole site;
comparing the approximate-position center of each consumable hole site obtained in step 2 with the hole site center coordinates of the template area unit to obtain the offset of each hole site on the image, and averaging these offsets; taking the average offset as the true offset of all hole sites on the image, and obtaining the true center position of each consumable hole site on the currently captured image from the true offset and the template hole site center coordinates;
taking each true center position as a base point, extracting an M × M square area on the currently captured area unit initial image or experiment area unit image using the hole site side length M corresponding to the consumable type in the prior information, and taking this square area as the local image S of that consumable hole site;
step 4, calculating the image characteristics of the consumable hole position local image S;
energy characteristics in some embodiments:
Figure BDA0002279198230000141
wherein i and j are positions of pixel points of the local image S, i represents the ith row, j represents the j column, and M represents the width and height of the local image S;
In some embodiments, the contrast feature of the gray level co-occurrence matrix of the area unit image is used, calculated as:
$$Con = \sum_{n=0}^{k-1} n^{2} \sum_{|i-j|=n} G(i,j)$$
where G is the gray level co-occurrence matrix, k is the total number of gray levels, and n denotes the nth gray level.
In some embodiments, the cross-correlation similarity with the template image in the prior information is calculated according to the following formula:
Figure BDA0002279198230000151
wherein S represents a local image of a current image, T is a template area unit image in prior information, and M is the width and height of S and T;
In some embodiments, the standard deviation of the image histogram is used, calculated as:
$$\sigma = \sqrt{\sum_{i=0}^{L-1} \left(z_{i} - m\right)^{2} p(z_{i})}$$
where L is the total number of gray levels, z_i denotes the ith gray level, p(z_i) is the normalized histogram probability of gray level z_i, and m is the mean of the histogram.
Step 5, judging the state of each consumable hole site from the image features of its local image: for the energy feature and the contrast feature of the gray level co-occurrence matrix, a feature value greater than the decision threshold means a consumable is present in the hole site, and a value less than or equal to the threshold means the hole site is empty; for the cross-correlation similarity and the standard deviation of the image histogram, a feature value smaller than the decision threshold means a consumable is present, and a value greater than or equal to the threshold means the hole site is empty;
and 6, outputting the states of all consumable hole sites.
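The step-5 decision rule can be summarized as below; the feature names and the example threshold values are placeholders that would in practice be calibrated per consumable type.

```python
# Illustrative threshold rule from step 5: larger energy / GLCM contrast means a
# consumable is present; smaller cross-correlation / histogram std means present.
GREATER_MEANS_PRESENT = {"energy", "glcm_contrast"}
SMALLER_MEANS_PRESENT = {"cross_correlation", "histogram_std"}

def well_state(feature_name, value, threshold):
    if feature_name in GREATER_MEANS_PRESENT:
        return "present" if value > threshold else "empty"
    if feature_name in SMALLER_MEANS_PRESENT:
        return "present" if value < threshold else "empty"
    raise ValueError("unknown feature: " + feature_name)

# e.g. states = [well_state("energy", energy(patch), 1.5e6) for patch in patches]
```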
FIG. 5 is a comparison of a regional unit image taken by a machine vision component of one embodiment of the present invention and the computer-determined result image, in this embodiment the consumable identified is a pipette tip; FIG. 6 is a comparison of a region unit image captured by a machine vision component and a computer-determined result image according to an embodiment of the present invention, in which the consumable identified is a magnetic rod sleeve.

Claims (15)

1. A pipetting station, characterized in that it comprises an experiment table (1) and a computer; a three-dimensional motion mechanism is arranged on the experiment table (1); a pipetting pump (5) and a machine vision component (6) are mounted on the three-dimensional motion mechanism; the pipetting pump (5) performs tip loading and unloading, pipetting and the other operations required by the experiment; the machine vision component (6) photographs and images consumables at a given height on the experiment table; when the three-dimensional motion mechanism runs, the pipetting pump (5) and the machine vision component (6) move synchronously along the xyz directions;
the computer comprises a memory and a processor, the memory having stored therein a computer program which, when executed in the processor, performs the process of:
S1, determining the current experiment program, the initial consumable information corresponding to the experiment program, and the consumable information corresponding to each step of the experiment, according to the test items selected by the user;
S2, controlling the three-dimensional motion mechanism to drive the machine vision component to the different areas of the experiment table, photographing the area units in each area in turn, and storing the initial images of the area units;
S3, analyzing the initial images of the area units to obtain their initial consumable information, comparing it with the initial consumable information determined in step S1, and judging whether the initial consumable information of each area unit is correct; if it is correct, proceeding to step S5, otherwise prompting the user to update the consumables;
S4, after the user finishes updating the consumables as prompted, re-acquiring the area unit images and repeating step S3 until all the initial consumable information is correct;
S5, running the steps of the experiment program in turn: the three-dimensional motion mechanism drives the pipetting pump and the machine vision component to the experiment area unit corresponding to each step and the step is executed; for steps that change consumable information, photographing the area and storing the experiment area unit image;
S6, analyzing the captured experiment area unit image to obtain the consumable information after the change, comparing it with the consumable information for that step determined in step S1, and judging whether the change is correct; if it is correct, updating the consumable information, carrying out the next step and repeating step S6 until the experiment is finished; if a fault occurs, acting and prompting according to the fault level.
2. The pipetting workbench of claim 1, wherein the experiment program in step S1, the initial consumable information corresponding to the experiment program, and the consumable information corresponding to each step of the experiment are preset in the computer:
the experiment program comprises the operation content, the number of operations and the operation flow;
the initial consumable information corresponding to the experiment program and the consumable information corresponding to each step of the experiment each comprise the consumable types, the consumable area position coordinates, the consumable quantities, the consumable hole-site states, the quantity and positions of the samples to be detected, and the correspondence between the samples to be detected and the type, quantity and position of the consumables required to detect them.
3. The pipetting workbench of claim 2, wherein the initial consumable information of each area unit image in step S3 and the consumable information after a change of consumable information during the experiment in step S6 are both obtained by the following steps:
step 1a, determining the consumable type, the consumable area position coordinates and the consumable image features;
determining the area position coordinates corresponding to the area unit initial image or the experiment area unit image, and obtaining, from the prior information, the consumable type and the consumable image features corresponding to that image;
the prior information comprises the consumable type of each area, the image features of each consumable type, the area position coordinates, the number of consumable hole sites in the template area unit, the hole-site center coordinates in the template area unit and the hole-site side length M;
step 2a, determining the quantity of consumables and the state of each hole site;
step 2a1, determining the fuzzy positions of all consumable hole sites on the area unit initial image or the experiment area unit image;
mapping the hole-site center coordinates of the template area unit in the prior information onto the currently captured area unit image, binarizing the image and searching for connected regions with each template hole-site center coordinate as a base point; the connected regions obtained are the fuzzy positions of the consumable hole sites on the area unit image;
step 2a2, determining the true center positions of all consumable hole sites on the area unit initial image or the experiment area unit image, obtaining the quantity of consumables from the number of true center positions, and extracting a local image S of each consumable hole site;
comparing the fuzzy-position center of each consumable hole site obtained in step 2a1 with the hole-site center coordinates of the template area unit to obtain the offset of each hole site on the area unit initial image or the experiment area unit image, and averaging these offsets; taking the average offset as the true offset of all hole sites on the image, and obtaining the true center position of each consumable hole site on the currently captured image from the true offset and the template hole-site center coordinates;
taking each true center position as a base point, extracting an M×M square area on the currently captured area unit initial image or experiment area unit image using the hole-site side length M corresponding to the consumable type in the prior information, this square area being the local image S of the corresponding consumable hole site;
step 2a3, calculating the image features of each consumable hole-site local image; the image features comprise the energy feature, the contrast feature of the gray level co-occurrence matrix, the cross-correlation similarity and the standard deviation of the image histogram;
step 2a4, judging the state of each consumable hole site from the image features of its local image; for the energy feature and the contrast feature of the gray level co-occurrence matrix, a feature value greater than the judgment threshold indicates that a consumable is present at the hole site, and a value less than or equal to the threshold indicates that no consumable is present; for the cross-correlation similarity and the standard deviation of the image histogram, a feature value smaller than the judgment threshold indicates that a consumable is present at the hole site, and a value greater than or equal to the threshold indicates that no consumable is present;
step 2a5, outputting the states of all consumable hole sites.
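A minimal sketch of steps 2a1 and 2a2, assuming an 8-bit grayscale area unit image and using OpenCV's connected-component analysis as one possible way to find the connected regions; taking the component under each template hole-site center as that hole site's fuzzy position is an interpretation of the claim, not the patent's stated implementation.

```python
import cv2
import numpy as np


def locate_hole_sites(area_img, template_centers, M):
    """Return (true center, M x M local image S) for each hole site of one area unit.

    area_img:          8-bit grayscale image of the area unit
    template_centers:  hole-site center coordinates (x, y) from the template area unit
    M:                 hole-site side length in pixels (prior information)
    """
    # Step 2a1: binarize and look up the connected region under each template center;
    # these regions give the fuzzy hole-site positions on the current image.
    _, binary = cv2.threshold(area_img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    _, labels, _, centroids = cv2.connectedComponentsWithStats(binary)

    offsets = []
    for tx, ty in template_centers:
        label = labels[int(ty), int(tx)]          # component under the template center
        if label != 0:                            # 0 is the background label
            cx, cy = centroids[label]             # fuzzy center of this hole site
            offsets.append((cx - tx, cy - ty))

    # Step 2a2: the average offset is taken as the true offset of all hole sites.
    dx, dy = np.mean(offsets, axis=0) if offsets else (0.0, 0.0)

    results = []
    for tx, ty in template_centers:
        x, y = int(round(tx + dx)), int(round(ty + dy))
        x0, y0 = x - M // 2, y - M // 2
        patch = area_img[max(y0, 0):y0 + M, max(x0, 0):x0 + M]   # local image S
        results.append(((x, y), patch))
    return results
```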
4. The pipetting workbench of claim 2, wherein, when an area unit contains only one consumable, the initial consumable information of each area unit image in step S3 and the consumable information after a change of consumable information during the experiment in step S6 are both obtained by the following steps:
step 1b, determining the consumable type, the consumable area position coordinates and the consumable image features;
determining the area position coordinates corresponding to the area unit initial image or the experiment area unit image, and obtaining, from the prior information, the consumable type and the consumable image features corresponding to that image;
the prior information comprises the consumable type of each area, the image features of each consumable type, the area position coordinates, the number of consumable hole sites in the template area unit, the hole-site center coordinates in the template area unit and the hole-site side length M;
step 2b, determining the state of the consumable hole site;
step 2b1, calculating the image features of the consumable hole-site local image; the image features comprise the energy feature, the contrast feature of the gray level co-occurrence matrix, the cross-correlation similarity and the standard deviation of the image histogram;
step 2b2, judging the state of the consumable hole site from the image features of its local image; for the energy feature and the contrast feature of the gray level co-occurrence matrix, a feature value greater than the judgment threshold indicates that a consumable is present at the hole site, and a value less than or equal to the threshold indicates that no consumable is present; for the cross-correlation similarity and the standard deviation of the image histogram, a feature value smaller than the judgment threshold indicates that a consumable is present at the hole site, and a value greater than or equal to the threshold indicates that no consumable is present;
step 2b3, outputting the states of all consumable hole sites.
5. The pipetting workbench of claim 3 or 4, wherein the image feature of the consumable hole-site local images in steps 2a3 and 2b1 is the energy feature, calculated by the following formula:
E = \sum_{i=1}^{M} \sum_{j=1}^{M} S(i,j)^2
where i and j are the positions of the pixels of the local image S, i denoting the ith row and j the jth column, and M is the width and height of the local image S;
or, the image feature of the consumable hole-site local images in steps 2a3 and 2b1 is the contrast feature of the gray level co-occurrence matrix, calculated by the following formula:
C = \sum_{n=1}^{k} n^2 \Big( \sum_{|i-j|=n} G(i,j) \Big)
where G is the gray level co-occurrence matrix, k is the total number of gray levels, and n denotes the nth gray level;
or, the image feature of the consumable hole-site local images in steps 2a3 and 2b1 is the cross-correlation similarity with the template area unit image in the prior information, calculated by the following formula:
R = \frac{\sum_{i=1}^{M}\sum_{j=1}^{M} S(i,j)\,T(i,j)}{\sqrt{\sum_{i=1}^{M}\sum_{j=1}^{M} S(i,j)^2}\,\sqrt{\sum_{i=1}^{M}\sum_{j=1}^{M} T(i,j)^2}}
where S is the local image of the current image, T is the template area unit image in the prior information, and M is the width and height of S and T;
or, the image feature of the consumable hole-site local images in steps 2a3 and 2b1 is the standard deviation of the image histogram, calculated by the following formula:
\sigma = \sqrt{\sum_{i=0}^{L-1} (z_i - m)^2\, p(z_i)}
where L is the total number of gray levels, z_i denotes the ith gray level, p(z_i) is the normalized histogram value (the proportion of pixels with gray level z_i), and m is the mean of the histogram.
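The four features can be sketched as follows, assuming 8-bit grayscale local images and using scikit-image (version 0.19 or later) for the gray level co-occurrence matrix; the judgment thresholds themselves are left open, as in the claim.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops


def energy(S: np.ndarray) -> float:
    """Energy feature: sum of squared pixel values of the local image S (scaled to [0, 1])."""
    s = S.astype(np.float64) / 255.0
    return float(np.sum(s ** 2))


def glcm_contrast(S: np.ndarray) -> float:
    """Contrast feature of the gray level co-occurrence matrix of S."""
    glcm = graycomatrix(S, distances=[1], angles=[0], levels=256, symmetric=True, normed=True)
    return float(graycoprops(glcm, 'contrast')[0, 0])


def cross_correlation(S: np.ndarray, T: np.ndarray) -> float:
    """Normalized cross-correlation similarity between local image S and template patch T."""
    s, t = S.astype(np.float64), T.astype(np.float64)
    return float(np.sum(s * t) / (np.sqrt(np.sum(s ** 2)) * np.sqrt(np.sum(t ** 2))))


def histogram_std(S: np.ndarray, levels: int = 256) -> float:
    """Standard deviation of the normalized gray-level histogram of S."""
    hist, _ = np.histogram(S, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    z = np.arange(levels)
    m = np.sum(z * p)
    return float(np.sqrt(np.sum((z - m) ** 2 * p)))
```

A hole-site state would then follow from comparing one of these values with the corresponding judgment threshold, in the directions given in steps 2a4 and 2b2.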
6. The pipetting workbench of claim 3 or 4, wherein the fault levels in step S6 comprise the following:
Level I fault: the consumable information is wrong, but the running of the subsequent experiment program is not affected; the experiment program continues to run and the user is prompted with the fault information after the experiment is finished;
Level II fault: the consumable information is wrong and the running of the subsequent experiment program is not affected, but the experiment results may be affected; the experiment program is paused, the user is prompted with the fault information, and the system waits for the user to choose to continue or terminate the current experiment;
Level III fault: the consumable information is wrong and the running of the subsequent experiment program is affected; the experiment program stops running, the user is prompted with the fault information, and the current experiment is terminated.
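A minimal sketch of the three fault levels, with print statements standing in for the workbench's real prompting, pausing and termination logic (which the claim does not specify).

```python
from enum import Enum


class FaultLevel(Enum):
    LEVEL_I = 1    # wrong consumable information; later steps and results unaffected
    LEVEL_II = 2   # later steps unaffected, but results may be compromised
    LEVEL_III = 3  # later steps cannot run


def handle_fault(level: FaultLevel, message: str) -> str:
    """Return the action taken for a given fault level."""
    if level is FaultLevel.LEVEL_I:
        print(f"[deferred] {message}")   # keep running, report after the experiment
        return "continue"
    if level is FaultLevel.LEVEL_II:
        print(f"[paused] {message}")     # pause and let the user decide
        return "ask_user"
    print(f"[stopped] {message}")        # Level III: stop the program
    return "terminate"
```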
7. The pipetting workbench of claim 6, wherein the prior information is stored in the computer in advance: the three-dimensional motion mechanism is controlled to drive the machine vision component, and a template image of the experiment table layout area and template area unit images are captured; the area coordinate positions, the area consumable types and the consumable image features are obtained from the experiment table layout area template image; and the number of consumable hole sites in each template area unit, the hole-site center coordinates in the template area unit and the hole-site side length M are obtained from the template area unit images.
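The prior information described above can be sketched as a simple record per layout area; the field names below are illustrative, not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class AreaPrior:
    consumable_type: str                      # e.g. "pipette tip", "magnetic rod sleeve"
    area_position: Tuple[float, float]        # area coordinate position on the bench
    hole_count: int                           # number of hole sites in the template area unit
    hole_centers: List[Tuple[float, float]]   # hole-site center coordinates in the template
    hole_side_length: int                     # hole-site side length M, in pixels
    template_image_path: str                  # template area unit image captured in advance
```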
8. The pipetting workbench of claim 1, wherein the computer program, when executed by the processor, further performs the following process after step S6:
displaying the number of consumables used in the experiment, the number of remaining consumables, and other experiment information.
9. The pipetting workbench of claim 2 or 3, wherein the three-dimensional motion mechanism comprises at least an X-axis motion mechanism, a Y-axis motion mechanism and a Z-axis motion mechanism;
a fixed frame is arranged on the experiment table, and the X-axis motion mechanism is mounted on the top of the fixed frame and can move along the top of the fixed frame in the X-axis direction; the Y-axis motion mechanism is arranged below the base of the experiment table and drives the fixed frame along the Y-axis direction; the Z-axis motion mechanism is mounted on the X-axis motion mechanism, which drives it along the X-axis direction; and the liquid transfer pump and the machine vision component are fixed on the Z-axis motion mechanism of the three-dimensional motion mechanism.
10. The pipetting workbench of claim 9, wherein the machine vision component comprises an annular light source and an auto-focusing camera.
11. A method for consumable identification and monitoring using the pipetting workbench of claim 1, comprising the following steps:
step one, presetting information
Presetting an experiment program, initial consumable information corresponding to the experiment program and consumable information corresponding to each step in the experiment process;
the experimental program comprises operation content, operation times and an operation flow;
the initial consumable information corresponding to the experimental program and the consumable information corresponding to each step in the experimental process respectively comprise the type of the consumable, the position coordinates of the consumable area, the quantity of the consumable, the hole position state of the consumable, the quantity and the position of the consumable to be detected and the relationship between the type, the quantity and the position of the consumable required by detecting the samples to be detected;
step two, obtaining prior information
the three-dimensional motion mechanism is controlled to drive the machine vision component, and a template image of the experiment table layout area and template area unit images are captured; the area coordinate positions, the area consumable types and the consumable image features are obtained from the experiment table layout area template image; the number of consumable hole sites in each template area unit, the hole-site center coordinates in the template area unit and the hole-site side length M are obtained from the template area unit images;
step three, determining the current experiment program, the initial consumable information corresponding to the experiment program and the consumable information corresponding to each step of the experiment according to the detection items selected by the user;
step four, controlling the three-dimensional motion mechanism to drive the machine vision component to the different areas of the experiment table for photographing, and storing the initial images of the area units;
step five, analyzing the initial images of the area units to obtain the initial consumable information of the area units, comparing it with the initial consumable information determined in step three, and judging whether the initial consumable information of each area unit is correct; if it is correct, proceeding to step seven, otherwise prompting the user to update the initial consumable information;
step six, after the user has updated the consumables according to the prompt, acquiring the area unit image again and repeating step five until all of the initial consumable information is correct;
step seven, running the steps of the experiment program in sequence, the three-dimensional motion mechanism driving the liquid transfer pump and the machine vision component to the experiment area unit corresponding to each step for operation, photographing the steps in which consumable information changes during the experiment, and storing the experiment area unit images;
step eight, analyzing the captured experiment area unit image to obtain the consumable information after the change, comparing it with the consumable information for this step determined in step three, and judging whether the change of consumable information in this step is correct; if it is correct, updating the consumable information, carrying out the next step of the experiment, and repeating step eight until the experiment is finished; if it is not, acting and prompting the user according to the fault level.
12. The method for consumable identification and monitoring according to claim 11, wherein the initial consumable information of each area unit image in step five and the consumable information after a change of consumable information during the experiment in step eight are both obtained by the following steps:
step 1a, determining the consumable type, the consumable area position coordinates and the consumable image features;
determining the area position coordinates corresponding to the area unit initial image or the experiment area unit image, and obtaining, from the prior information, the consumable type and the consumable image features corresponding to that image;
the prior information comprises the consumable type of each area, the image features of each consumable type, the area position coordinates, the number of consumable hole sites in the template area unit, the hole-site center coordinates in the template area unit and the hole-site side length M;
step 2a, determining the quantity of consumables and the state of each hole site;
step 2a1, determining the fuzzy positions of all consumable hole sites on the area unit initial image or the experiment area unit image;
mapping the hole-site center coordinates of the template area unit in the prior information onto the currently captured area unit image, binarizing the image and searching for connected regions with each template hole-site center coordinate as a base point; the connected regions obtained are the fuzzy positions of the consumable hole sites on the area unit image;
step 2a2, determining the true center positions of all consumable hole sites on the area unit initial image or the experiment area unit image, obtaining the quantity of consumables from the number of true center positions, and extracting a local image S of each consumable hole site;
comparing the fuzzy-position center of each consumable hole site obtained in step 2a1 with the hole-site center coordinates of the template area unit to obtain the offset of each hole site on the area unit initial image or the experiment area unit image, and averaging these offsets; taking the average offset as the true offset of all hole sites on the image, and obtaining the true center position of each consumable hole site on the currently captured image from the true offset and the template hole-site center coordinates;
taking each true center position as a base point, extracting an M×M square area on the currently captured area unit initial image or experiment area unit image using the hole-site side length M corresponding to the consumable type in the prior information, this square area being the local image S of the corresponding consumable hole site;
step 2a3, calculating the image features of each consumable hole-site local image; the image features comprise the energy feature, the contrast feature of the gray level co-occurrence matrix, the cross-correlation similarity and the standard deviation of the image histogram;
step 2a4, judging the state of each consumable hole site from the image features of its local image; for the energy feature and the contrast feature of the gray level co-occurrence matrix, a feature value greater than the judgment threshold indicates that a consumable is present at the hole site, and a value less than or equal to the threshold indicates that no consumable is present; for the cross-correlation similarity and the standard deviation of the image histogram, a feature value smaller than the judgment threshold indicates that a consumable is present at the hole site, and a value greater than or equal to the threshold indicates that no consumable is present;
step 2a5, outputting the states of all consumable hole sites.
13. The method for consumable identification and monitoring according to claim 11, wherein, when an area unit contains only one consumable, the initial consumable information of each area unit image in step five and the consumable information after a change of consumable information during the experiment in step eight are both obtained by the following steps:
step 1b, determining the consumable type, the consumable area position coordinates and the consumable image features;
determining the area position coordinates corresponding to the area unit initial image or the experiment area unit image, and obtaining, from the prior information, the consumable type and the consumable image features corresponding to that image;
the prior information comprises the consumable type of each area, the image features of each consumable type, the area position coordinates, the number of consumable hole sites in the template area unit, the hole-site center coordinates in the template area unit and the hole-site side length M;
step 2b, determining the state of the consumable hole site;
step 2b1, calculating the image features of the consumable hole-site local image; the image features comprise the energy feature, the contrast feature of the gray level co-occurrence matrix, the cross-correlation similarity and the standard deviation of the image histogram;
step 2b2, judging the state of the consumable hole site from the image features of its local image; for the energy feature and the contrast feature of the gray level co-occurrence matrix, a feature value greater than the judgment threshold indicates that a consumable is present at the hole site, and a value less than or equal to the threshold indicates that no consumable is present; for the cross-correlation similarity and the standard deviation of the image histogram, a feature value smaller than the judgment threshold indicates that a consumable is present at the hole site, and a value greater than or equal to the threshold indicates that no consumable is present;
step 2b3, outputting the states of all consumable hole sites.
14. The method for consumable identification and monitoring according to claim 12 or 13, wherein the image feature of the consumable hole-site local images in steps 2a3 and 2b1 is the energy feature, calculated by the following formula:
E = \sum_{i=1}^{M} \sum_{j=1}^{M} S(i,j)^2
where i and j are the positions of the pixels of the local image S, i denoting the ith row and j the jth column, and M is the width and height of the local image S;
or, the image feature of the consumable hole-site local images in steps 2a3 and 2b1 is the contrast feature of the gray level co-occurrence matrix, calculated by the following formula:
C = \sum_{n=1}^{k} n^2 \Big( \sum_{|i-j|=n} G(i,j) \Big)
where G is the gray level co-occurrence matrix, k is the total number of gray levels, and n denotes the nth gray level;
or, the image feature of the consumable hole-site local images in steps 2a3 and 2b1 is the cross-correlation similarity with the template area unit image in the prior information, calculated by the following formula:
R = \frac{\sum_{i=1}^{M}\sum_{j=1}^{M} S(i,j)\,T(i,j)}{\sqrt{\sum_{i=1}^{M}\sum_{j=1}^{M} S(i,j)^2}\,\sqrt{\sum_{i=1}^{M}\sum_{j=1}^{M} T(i,j)^2}}
where S is the local image of the current image, T is the template area unit image in the prior information, and M is the width and height of S and T;
or, the image feature of the consumable hole-site local images in steps 2a3 and 2b1 is the standard deviation of the image histogram, calculated by the following formula:
\sigma = \sqrt{\sum_{i=0}^{L-1} (z_i - m)^2\, p(z_i)}
where L is the total number of gray levels, z_i denotes the ith gray level, p(z_i) is the normalized histogram value (the proportion of pixels with gray level z_i), and m is the mean of the histogram.
15. The method for consumable identification and monitoring according to claim 12 or 13, wherein the fault levels in step eight comprise the following:
Level I fault: the consumable information is wrong, but the running of the subsequent experiment program is not affected; the experiment program continues to run and the user is prompted with the fault information after the experiment is finished;
Level II fault: the consumable information is wrong and the running of the subsequent experiment program is not affected, but the experiment results may be affected; the experiment program is paused, the user is prompted with the fault information, and the system waits for the user to choose to continue or terminate the current experiment;
Level III fault: the consumable information is wrong and the running of the subsequent experiment program is affected; the experiment program stops running, the user is prompted with the fault information, and the current experiment is terminated.
CN201911134409.1A 2019-11-19 2019-11-19 Liquid transfer workbench and method for recognizing and monitoring consumable materials by using liquid transfer workbench Active CN111027409B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911134409.1A CN111027409B (en) 2019-11-19 2019-11-19 Liquid transfer workbench and method for recognizing and monitoring consumable materials by using liquid transfer workbench

Publications (2)

Publication Number Publication Date
CN111027409A true CN111027409A (en) 2020-04-17
CN111027409B CN111027409B (en) 2023-04-18

Family

ID=70205837

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911134409.1A Active CN111027409B (en) 2019-11-19 2019-11-19 Liquid transfer workbench and method for recognizing and monitoring consumable materials by using liquid transfer workbench

Country Status (1)

Country Link
CN (1) CN111027409B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180372766A1 (en) * 2017-06-21 2018-12-27 Abbott Molecular Inc. Analysis Systems and Methods of Identifying Consumables and Reagents
CN109949362A (en) * 2019-03-01 2019-06-28 广东九联科技股份有限公司 A kind of material visible detection method
CN110097537A (en) * 2019-04-12 2019-08-06 江南大学 A kind of meat quantitative analysis evaluation method based on three-D grain feature

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王祖德; 张大伟; 杨海马; 涂建坤; 姚龙隆: "Design of an oil pastel inspection system based on machine vision" (基于机器视觉的油画棒检测系统的设计) *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111812341A (en) * 2020-07-22 2020-10-23 英华达(上海)科技有限公司 Automatic equipment detection system and method for detecting internal operation of automatic equipment
CN113848691A (en) * 2021-09-10 2021-12-28 广州众诺电子技术有限公司 Consumable capacity recording method, consumable chip, storage medium and consumable box
CN113820195A (en) * 2021-11-25 2021-12-21 成都宜乐芯生物科技有限公司 Multichannel parallel pretreatment device
CN113820195B (en) * 2021-11-25 2022-03-08 成都宜乐芯生物科技有限公司 Multichannel parallel pretreatment device
CN115155400A (en) * 2022-06-20 2022-10-11 南京祥中生物科技有限公司 Full-automatic magnetic dispersion solid phase extraction device
CN115155400B (en) * 2022-06-20 2023-08-08 南京祥中生物科技有限公司 Full-automatic magnetic dispersion solid phase extraction device
CN115841199A (en) * 2022-11-29 2023-03-24 深圳市国赛生物技术有限公司 Automatic consumable management system
CN115841199B (en) * 2022-11-29 2024-04-30 深圳市国赛生物技术有限公司 Automatic consumable management system
CN116083225A (en) * 2023-03-07 2023-05-09 苏州君跻基因科技有限公司 Automated control system, method and apparatus for pipetting workstations
CN116681377A (en) * 2023-05-24 2023-09-01 天津瑞利通科技有限公司 Tool management method, system, electronic device and readable storage medium
CN116681377B (en) * 2023-05-24 2024-02-27 天津瑞利通科技有限公司 Tool management method, system, electronic device and readable storage medium

Also Published As

Publication number Publication date
CN111027409B (en) 2023-04-18

Similar Documents

Publication Publication Date Title
CN111027409B (en) Liquid transfer workbench and method for recognizing and monitoring consumable materials by using liquid transfer workbench
US7454053B2 (en) System and method for automatically recovering video tools in a vision system
US20160334777A1 (en) Numerical controller capable of checking mounting state of tool used for machining
CN107615336B (en) Location-based detection of tray slot type and cuvette type in a vision system
US20130162806A1 (en) Enhanced edge focus tool
US20130027538A1 (en) Multi-region focus navigation interface
CN106525622B (en) Hardness testing device and hardness testing method
US20210181222A1 (en) Autosampler
CN108544531B (en) Automatic chemical examination mechanical arm device based on visual calibration, control system and control method thereof
JP2003513251A (en) Apparatus and method for verifying the location of a region of interest in a sample in an image generation system
US20220114398A1 (en) Microscopy System and Method for Verification of a Trained Image Processing Model
US20220091405A1 (en) Microscopy System, Method and Computer Program for Aligning a Specimen Carrier
CN113310997A (en) PCB defect confirmation method and device, automatic optical detection equipment and storage medium
JP2007017424A POSITION ALIGNMENT SYSTEM BY XYθ STAGE
DE112019004891T9 (en) Automatic sample and standard preparation based on the detection of sample identity and sample type
US20220236551A1 (en) Microscopy System and Method for Checking a Rotational Position of a Microscope Camera
CN105277122B (en) Image measuring apparatus and method of displaying measurement result
CN114002840A (en) Microscope and method for determining the distance to a reference plane of a sample
CN106525620B (en) Hardness testing device and hardness testing method
CN106525624B (en) Hardness testing device and hardness testing method
CN112529856A (en) Method for determining the position of an operating object, robot and automation system
CN117115105A (en) Workpiece processing method, device, equipment and storage medium
US20150287177A1 (en) Image measuring device
CN109863536B (en) Image processing method and image processing apparatus
CN110766671B (en) Image processing method based on machine vision software

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant