WO2022095082A1 - Micromanipulation platform three-dimensional positioning method for cell injection - Google Patents
- Publication number
- WO2022095082A1 (PCT/CN2020/127759)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/10—Machine learning using kernel methods, e.g. support vector machines [SVM]
-
- C—CHEMISTRY; METALLURGY
- C12—BIOCHEMISTRY; BEER; SPIRITS; WINE; VINEGAR; MICROBIOLOGY; ENZYMOLOGY; MUTATION OR GENETIC ENGINEERING
- C12N—MICROORGANISMS OR ENZYMES; COMPOSITIONS THEREOF; PROPAGATING, PRESERVING, OR MAINTAINING MICROORGANISMS; MUTATION OR GENETIC ENGINEERING; CULTURE MEDIA
- C12N15/00—Mutation or genetic engineering; DNA or RNA concerning genetic engineering, vectors, e.g. plasmids, or their isolation, preparation or purification; Use of hosts therefor
- C12N15/09—Recombinant DNA-technology
- C12N15/87—Introduction of foreign genetic material using processes not otherwise provided for, e.g. co-transformation
- C12N15/89—Introduction of foreign genetic material using processes not otherwise provided for, e.g. co-transformation using microinjection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/80—Geometric correction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10056—Microscopic image
- G06T2207/10061—Microscopic image from scanning electron microscope
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
Definitions
- The invention relates to the technical field of micromanipulation, and in particular to a three-dimensional positioning method of a micromanipulation platform for cell injection.
- Micromanipulation technology refers to an operation method that uses an actuator with micro/nano motion precision to complete specific experimental tasks on controlled objects such as organisms, materials, and chemical molecules, through feedback channels such as vision and force.
- In recent years, micromanipulation technology has made significant progress, and researchers have developed micromanipulation systems with various mechanical structures, driving methods, and control methods. However, these systems have not achieved wide applicability, and research on the three-dimensional positioning of cells in particular faces the following problems.
- The first approach obtains the depth information of the target cells using a contact detection algorithm.
- The usual practice of the contact detection algorithm is to move the injection needle directly above the target cell, lower the microinjection needle, and track image changes in the area near the needle tip. Through visual feedback, when the needle touches the target cell the cell surface deforms, and this position is calibrated as the zero point of the depth coordinate.
- However, since the needle tip needs to directly contact the cell, the tip may deform, and the target tracking algorithms used for visual feedback (such as template matching, the SSD optical flow method, motion history images (MHI), and active contour models) have limited generality and cannot meet the real-time requirements of the operation.
- Moreover, the needle tip can very easily puncture the cell during the contact process, causing the micromanipulation to fail.
- The second is the direct acquisition method.
- Inspired by macroscopic binocular stereo vision, this approach was introduced into the field of micromanipulation. By placing microscopes in the horizontal and vertical directions, or by using a stereo microscope, binocular stereo vision can be achieved and the depth information of the target cells obtained. However, relative to the micron-level tolerances required, the errors of this method are large, and ensuring accuracy remains a particular challenge.
- the purpose of the present invention is to provide a three-dimensional positioning method of a micromanipulation platform for cell injection with high efficiency and high precision. It adopts the following technical solutions:
- a three-dimensional positioning method of a micromanipulation platform for cell injection comprising:
- the positive sample library includes an injection needle positive sample library and a suction needle positive sample library
- the negative sample library includes an injection needle negative sample library and a suction needle negative sample library, wherein the needle tip in the positive sample library is in the focal plane of the microscope, and the needle tip in the negative sample library is not in the focal plane of the microscope;
- a positive sample training set and a negative sample training set are respectively established through the positive sample library and the negative sample library, and a support vector machine model is generated by training, which specifically includes:
- G x (x, y) = H(x+1, y) - H(x-1, y) and G y (x, y) = H(x, y+1) - H(x, y-1), where H(x, y) represents the pixel value of the input sample image at the pixel point (x, y), and G x (x, y), G y (x, y) represent the horizontal gradient and the vertical gradient, respectively;
- a Block includes multiple Cells, and the HOG feature is obtained by concatenating the feature vectors of all Cells in each Block;
- the training data set of support vector machine is obtained as:
- T = {(x 1 , y 1 ), (x 2 , y 2 ), ..., (x N , y N )}
- where i = 1, 2, 3, ..., N, and N is the number of samples in the positive library or in the negative library
- x i ∈ R n is the i-th feature vector
- y i ∈ {-1, +1} is the class label of the i-th feature vector: y i = +1 for a sample from the positive library, and y i = -1 for a sample from the negative library;
- a support vector machine model is generated according to the support vector machine training data set.
- judging whether a picture is a positive sample specifically includes: determining whether the captured pictures of the injection needle and the holding needle are positive samples by calculating their HOG feature values.
- before the collection of sample pictures and the establishment of the positive and negative sample libraries, the method further includes:
- the error correction of the micromanipulation platform specifically includes:
- the calculation of the systematic error of the micromanipulation platform in this direction specifically includes:
- the calculation of the pixel spacing specifically includes:
- the actual displacement distance between the front and rear images in that direction is calculated according to the pixel spacing, specifically including:
- the systematic error in that direction is obtained from the actual displacement distance, specifically including:
- the systematic error in that direction is autonomously compensated in order to correct it, specifically including:
- the compensation calculation is performed through computer closed-loop feedback, autonomously compensating and thereby correcting the systematic error in that direction.
- the three-dimensional positioning method of the micro-operation platform for cell injection of the present invention can realize positioning during cell injection, and has the advantages of high positioning efficiency and high positioning accuracy.
- FIG. 1 is a flow chart of a three-dimensional positioning method of a micromanipulation platform for cell injection in a preferred embodiment of the present invention
- Fig. 2 is a schematic diagram of the completed needle alignment of the injection needle and the holding needle in the preferred embodiment of the present invention;
- Fig. 3 is the flow chart of performing the error correction of the micro-operation platform in the preferred embodiment of the present invention.
- FIG. 4 is a schematic diagram of a scale in a preferred embodiment of the present invention.
- FIG. 5 is a schematic diagram of the front and rear images in the preferred embodiment of the present invention.
- Fig. 6 is a schematic diagram of the stitching of the front and rear images in the preferred embodiment of the present invention.
- FIG. 7 is a schematic diagram of stitching the front and rear images for movement in the positive X-axis direction in a preferred embodiment of the present invention.
- the three-dimensional positioning method of the micromanipulation platform for cell injection in the preferred embodiment of the present invention includes the following steps:
- Step S10: collect sample pictures and establish a positive sample library and a negative sample library, wherein the positive sample library includes a positive sample library for injection needles and a positive sample library for holding needles, and the negative sample library includes a negative sample library for injection needles and a negative sample library for holding needles; the needle tip in the positive sample library is in the focal plane of the microscope, and the needle tip in the negative sample library is not in the focal plane of the microscope.
- Step S20 Establish a positive sample training set and a negative sample training set respectively through the positive sample library and the negative sample library, and train to generate a support vector machine model.
- The features used are HOG (histogram of oriented gradients) features.
- G x (x, y) = H(x+1, y) - H(x-1, y) and G y (x, y) = H(x, y+1) - H(x, y-1), where H(x, y) represents the pixel value of the input sample image at the pixel point (x, y), and G x (x, y), G y (x, y) represent the horizontal gradient and the vertical gradient, respectively;
- a Block includes multiple Cells, and the HOG feature is obtained by concatenating the feature vectors of all Cells in each Block; the other parameters include the number of bins, the gradient orientation range, and so on.
- the training data set of the support vector machine is obtained as:
- T = {(x 1 , y 1 ), (x 2 , y 2 ), ..., (x N , y N )}
- where i = 1, 2, 3, ..., N, and N is the number of samples in the positive library or in the negative library
- x i ∈ R n is the i-th feature vector
- y i ∈ {-1, +1} is the class label of the i-th feature vector: y i = +1 for a sample from the positive library, and y i = -1 for a sample from the negative library;
- a support vector machine model is generated according to the support vector machine training data set.
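The gradient formulas and the Cell/Block HOG construction above, followed by linear SVM training, can be sketched in Python. This is a minimal NumPy-only illustration: the cell size, bin count, normalization, and the Pegasos-style sub-gradient SVM solver are all assumptions, since the patent fixes none of these choices.

```python
import numpy as np

def gradients(H):
    """Gx(x,y) = H(x+1,y) - H(x-1,y), Gy(x,y) = H(x,y+1) - H(x,y-1),
    with the one-pixel image border left at zero."""
    H = np.asarray(H, dtype=float)
    Gx = np.zeros_like(H)
    Gy = np.zeros_like(H)
    Gx[1:-1, :] = H[2:, :] - H[:-2, :]   # central difference along the first axis
    Gy[:, 1:-1] = H[:, 2:] - H[:, :-2]   # central difference along the second axis
    return Gx, Gy

def hog_feature(H, cell=8, bins=9):
    """Per-Cell orientation histograms, concatenated over the whole image."""
    Gx, Gy = gradients(H)
    mag = np.hypot(Gx, Gy)
    ang = np.mod(np.arctan2(Gy, Gx), np.pi)      # unsigned orientation in [0, pi)
    h, w = np.asarray(H).shape
    feats = []
    for i in range(0, h - cell + 1, cell):       # one histogram per Cell
        for j in range(0, w - cell + 1, cell):
            hist, _ = np.histogram(ang[i:i+cell, j:j+cell],
                                   bins=bins, range=(0.0, np.pi),
                                   weights=mag[i:i+cell, j:j+cell])
            feats.append(hist)
    v = np.concatenate(feats)                    # concatenate all Cell vectors
    return v / (np.linalg.norm(v) + 1e-9)        # block-style normalization

def train_linear_svm(X, y, lam=0.01, epochs=300, seed=0):
    """Pegasos-style sub-gradient training of a linear SVM, y_i in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b, t = np.zeros(d), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)
            if y[i] * (X[i] @ w + b) < 1.0:      # margin violation
                w = (1.0 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:
                w = (1.0 - eta * lam) * w
    return w, b
```

In use, each in-focus picture would be mapped through `hog_feature` and labeled +1, each out-of-focus picture labeled -1, and the resulting (feature, label) pairs fed to `train_linear_svm`.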
- Step S30: move the injection needle and the holding needle along preset routes under the microscope, and take pictures of each. Specifically, under the microscope, the injection needle and the holding needle are moved with a fixed step from above the focal plane in the negative Z-axis direction, or from below the focal plane in the positive Z-axis direction, and one picture is taken at each step.
- Step S40: process the captured pictures with the trained support vector machine model, and judge whether each picture is a positive sample, until the captured pictures of the injection needle and the holding needle are both positive samples, at which point the needle alignment of the injection needle and the holding needle is complete.
- By calculating the HOG feature values, it is determined whether the pictures of the injection needle and the holding needle are positive samples.
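The fixed-step Z search in steps S30 and S40 can be sketched as a simple loop over hardware and classifier callbacks. `move_z`, `capture`, and `classify` are assumed interfaces for illustration, not names from the patent.

```python
def align_needle_to_focus(move_z, capture, classify, step_um=5.0, max_steps=200):
    """Move the needle toward the focal plane with a fixed step, photographing
    at each step, until the classifier labels a picture a positive sample
    (needle tip in the focal plane).  Returns the number of steps taken."""
    for k in range(max_steps):
        image = capture()                 # one picture per step
        if classify(image) == +1:         # positive sample: tip is in focus
            return k
        move_z(-step_um)                  # fixed step in the negative Z direction
    raise RuntimeError("focal plane not reached within max_steps")
```

Running the same loop once for the injection needle and once for the holding needle completes the needle alignment when both classifiers report positive samples.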
- The present invention further includes step A: performing error correction on the micromanipulation platform, in order to achieve high-precision two-dimensional positioning in the XY plane.
- step A specifically includes:
- Step A1: place the scale on the micromanipulation platform, move the platform in a fixed direction with a fixed step, and acquire images of the scale, ensuring that the front and rear images partially overlap.
- the scale is a two-dimensional plane scale, as shown in FIG. 4 .
- The two successively acquired images, namely the front frame and the rear frame, are shown in Figure 5.
- Step A2: stitch the front and rear images in that direction.
- the stitched image is shown in Figure 6.
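The patent does not specify how the overlapping front and rear frames are registered before stitching; one common choice (an assumption here) is phase correlation, which directly yields the relative pixel displacement AA1 used later in step A3.

```python
import numpy as np

def pixel_shift(front, back):
    """Estimate the relative pixel displacement between the overlapping front
    and rear frames via phase correlation (an assumed registration method;
    the patent does not name the stitching algorithm).  Returns (dx, dy)
    such that front == np.roll(back, (dy, dx), axis=(0, 1)) for a pure
    circular shift."""
    F = np.fft.fft2(front)
    B = np.fft.fft2(back)
    R = F * np.conj(B)
    R /= np.abs(R) + 1e-12                       # normalized cross-power spectrum
    corr = np.fft.ifft2(R).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = np.asarray(front).shape
    if dy > h // 2:                              # unwrap to signed shifts
        dy -= h
    if dx > w // 2:
        dx -= w
    return dx, dy
```

The magnitude of the recovered shift along the motion axis corresponds to AA1, the pixel displacement between the two frames.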
- Step A3: calculate the systematic error of the micromanipulation platform in that direction, which specifically includes:
- Step A31: calculate the pixel spacing.
- Step A32: calculate the actual displacement distance between the front and rear images in that direction according to the pixel spacing.
- Step A33: obtain the systematic error in that direction from the actual displacement distance.
- The X-axis and Y-axis components of this error are the compensation values that need to be applied when the X axis moves in the positive direction; the compensation values for movement in the other directions are obtained in the same way.
- Step A4: autonomously compensate the systematic error in that direction to correct it, which specifically includes:
- Compensation calculation is performed through computer closed-loop feedback, and the system error in this direction is compensated autonomously to correct the system error in this direction.
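The per-direction calculation of steps A3 and A4 uses the formulas given in the claims: S = M/N, AA1_actual = S * AA1, and the cos(θ)/sin(θ) decomposition. They can be sketched as below; the `compensation` helper is an assumption about how the closed-loop feedback might use these values, since the patent does not state that formula explicitly.

```python
import math

def pixel_spacing(scale_len_um, n_pixels):
    """Claim 7: S = M / N — scale length M over the pixel count N it spans."""
    return scale_len_um / n_pixels

def actual_displacement(S, aa1_pixels):
    """Claim 8: AA1_actual = S * AA1 — the pixel displacement between the
    stitched front and rear frames converted to a physical distance."""
    return S * aa1_pixels

def error_components(aa1_actual_um, theta_rad):
    """Claim 9: decompose the measured displacement along the platform axes,
    theta being the angle between the image and platform coordinate systems."""
    return aa1_actual_um * math.cos(theta_rad), aa1_actual_um * math.sin(theta_rad)

def compensation(commanded_um, aa1_actual_um, theta_rad):
    """Hedged sketch of the closed-loop step: the shortfall along the motion
    axis and the stray cross-axis component are fed back as corrections."""
    ex, ey = error_components(aa1_actual_um, theta_rad)
    return commanded_um - ex, -ey
```

For example, a 100 um scale spanning 200 pixels gives S = 0.5 um/pixel; a measured shift of 198 pixels after a commanded 100 um move means the platform actually traveled 99 um, leaving roughly 1 um to feed back on the next move.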
- performing the error correction of the micro-operation platform further includes:
- Step A5: move the micromanipulation platform in the other directions with a fixed step, acquire images, calculate the systematic errors of the platform in those directions, and autonomously compensate them, thereby completing the systematic error correction in all directions.
- All directions include the positive direction of the X axis, the negative direction of the X axis, the positive direction of the Y axis, and the negative direction of the Y axis.
- Performing the error correction of the micromanipulation platform further includes: taking multiple sets of images and performing the calculation multiple times to obtain the average systematic error of the platform in that direction.
- Averaging improves the accuracy of the systematic error calculation and, ultimately, the accuracy of the error correction.
- The method for obtaining the two-dimensional position in the three-dimensional positioning method of the micromanipulation platform for cell injection proposed by the present invention abandons the traditional practice of separately acquiring data for, calibrating, and manually compensating each source of error (such as translational moving parts, rotational moving parts, and rolling moving parts).
- Instead, the mechanical errors, CCD installation errors, and pixel-to-micron conversion errors that affect the accuracy of microinjection are treated in an integrated, unified way; without manual assistance, the system achieves autonomous compensation and correction, and the error can be controlled at the pixel level.
- The autonomous systematic error compensation algorithm described here is applicable not only to the micromanipulation system but also to other motion platforms.
- The error correction is simple to perform, efficient, and accurate.
- The method for obtaining the depth position in the three-dimensional positioning method of the micromanipulation platform for cell injection proposed by the present invention determines the depth position by bringing the needle to the focal plane, based on HOG features combined with machine learning.
- the new method not only has high processing speed, but also has high accuracy, thus solving the problem of inaccurate depth position acquisition in the current positioning method.
Claims (10)
- 1. A three-dimensional positioning method of a micromanipulation platform for cell injection, characterized by comprising: collecting sample pictures and establishing a positive sample library and a negative sample library, wherein the positive sample library includes an injection needle positive sample library and a holding needle positive sample library, the negative sample library includes an injection needle negative sample library and a holding needle negative sample library, the needle tips in the positive sample library are in the focal plane of the microscope, and the needle tips in the negative sample library are not in the focal plane of the microscope; establishing a positive sample training set and a negative sample training set from the positive and negative sample libraries respectively, and training a support vector machine model; moving the injection needle and the holding needle along preset routes under the microscope and taking pictures of each; and processing the captured pictures with the trained support vector machine model and judging whether each picture is a positive sample, until the captured pictures of the injection needle and the holding needle are both positive samples, at which point the needle alignment of the injection needle and the holding needle is complete.
- 2. The method according to claim 1, characterized in that establishing the training sets and training the support vector machine model specifically comprises: first, for each sample image in the sample library, computing the gradients in the x-axis and y-axis directions, G x (x, y) = H(x+1, y) - H(x-1, y) and G y (x, y) = H(x, y+1) - H(x, y-1), where H(x, y) is the pixel value of the input sample image at the pixel point (x, y), and G x (x, y), G y (x, y) are the horizontal and vertical gradients, respectively; then dividing the sample image into several uniform Cells, setting the Cell size and other parameters to obtain each Cell's feature vector, and combining Cells into larger Blocks, where a Block includes multiple Cells and the HOG feature is obtained by concatenating the feature vectors of all Cells in each Block; then, using machine learning, training on the positive sample training set and the negative sample training set to obtain the support vector machine training data set T = {(x 1 , y 1 ), (x 2 , y 2 ), ..., (x N , y N )}, where i = 1, 2, 3, ..., N, N is the number of samples in the positive library or in the negative library, x i ∈ R n is the i-th feature vector, and y i ∈ {-1, +1} is the i-th class label, y i = +1 for a sample from the positive library and y i = -1 for a sample from the negative library; and generating the support vector machine model from the training data set.
- 3. The method according to claim 2, characterized in that judging whether a picture is a positive sample specifically comprises: judging whether the captured pictures of the injection needle and the holding needle are positive samples by calculating their HOG feature values.
- 4. The method according to claim 1, characterized in that, before collecting the sample pictures and establishing the positive and negative sample libraries, the method further comprises: performing error correction of the micromanipulation platform.
- 5. The method according to claim 4, characterized in that performing error correction of the micromanipulation platform specifically comprises: placing a scale on the micromanipulation platform, moving the platform in a fixed direction with a fixed step and acquiring images of the scale, ensuring that consecutive images partially overlap; stitching the front and rear images in that direction; calculating the systematic error of the platform in that direction; autonomously compensating the systematic error in that direction to correct it; and moving the platform in the other directions with a fixed step, acquiring images, calculating the systematic errors in those directions, and autonomously compensating them, thereby completing the systematic error correction in all directions.
- 6. The method according to claim 5, characterized in that calculating the systematic error in that direction specifically comprises: calculating the pixel spacing; calculating the actual displacement distance between the front and rear images in that direction according to the pixel spacing; and obtaining the systematic error in that direction from the actual displacement distance.
- 7. The method according to claim 6, characterized in that calculating the pixel spacing specifically comprises: calculating the pixel spacing with the formula S = M/N, where S is the pixel spacing, M is the scale length, and N is the number of pixels within length M.
- 8. The method according to claim 7, characterized in that calculating the actual displacement distance specifically comprises: calculating it with the formula AA 1actual = S * AA 1 , where AA 1actual is the actual displacement distance and AA 1 is the number of pixels of relative displacement between the front and rear images in that direction.
- 9. The method according to claim 8, characterized in that obtaining the systematic error from the actual displacement distance specifically comprises: obtaining the two components of the systematic error in that direction as AA 1actual * cos(θ) and AA 1actual * sin(θ), where θ is the angle between the image coordinate system and the micromanipulation platform coordinate system.
- 10. The method according to claim 5, characterized in that autonomously compensating the systematic error in that direction specifically comprises: performing the compensation calculation through computer closed-loop feedback, autonomously compensating and thereby correcting the systematic error in that direction.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011213646.X | 2020-11-04 | ||
CN202011213646.XA CN112101575B (en) | 2020-11-04 | 2020-11-04 | Three-dimensional positioning method of micromanipulation platform for cell injection |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022095082A1 true WO2022095082A1 (en) | 2022-05-12 |
Family
ID=73784516
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/127759 WO2022095082A1 (en) | 2020-11-04 | 2020-11-10 | Micromanipulation platform three-dimensional positioning method for cell injection |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112101575B (en) |
WO (1) | WO2022095082A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114034225A (en) * | 2021-11-25 | 2022-02-11 | 广州市华粤行医疗科技有限公司 | Method for testing movement precision of injection needle under microscope |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106204642A (en) * | 2016-06-29 | 2016-12-07 | 四川大学 | A kind of cell tracker method based on deep neural network |
CN110796661A (en) * | 2018-08-01 | 2020-02-14 | 华中科技大学 | Fungal microscopic image segmentation detection method and system based on convolutional neural network |
CN110841139A (en) * | 2019-12-10 | 2020-02-28 | 深圳市中科微光医疗器械技术有限公司 | Remaining needle capable of realizing needle tip positioning in image environment |
CN111652848A (en) * | 2020-05-07 | 2020-09-11 | 南开大学 | Robotized adherent cell three-dimensional positioning method |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN2710010Y (en) * | 2004-03-26 | 2005-07-13 | 张志宏 | Microscopic operation system of biocell computer |
US9598281B2 (en) * | 2011-03-03 | 2017-03-21 | The Regents Of The University Of California | Nanopipette apparatus for manipulating cells |
CN103255049A (en) * | 2013-05-20 | 2013-08-21 | 苏州大学 | Composite piezoelectric injection system and injection method |
- 2020-11-04: CN CN202011213646.XA filed, patent CN112101575B (active)
- 2020-11-10: WO PCT/CN2020/127759 filed, publication WO2022095082A1 (application filing)
Also Published As
Publication number | Publication date |
---|---|
CN112101575B (en) | 2021-04-30 |
CN112101575A (en) | 2020-12-18 |
Similar Documents
Publication | Title | Publication Date |
---|---|---|
CN112122840B (en) | Visual positioning welding system and welding method based on robot welding | |
CN105354531B (en) | Method for annotating face key points | |
Yu et al. | Microrobotic cell injection | |
CN110666798B (en) | Robot vision calibration method based on perspective transformation model | |
CN111775146A (en) | Visual alignment method under industrial mechanical arm multi-station operation | |
CN111089569A (en) | Large box body measuring method based on monocular vision | |
CN111496779B (en) | Intelligent microscopic operation system based on machine vision | |
WO2022095082A1 (en) | Micromanipulation platform three-dimensional positioning method for cell injection | |
WO2021057422A1 (en) | Microscope system, smart medical device, automatic focusing method and storage medium | |
CN113409285A (en) | Method and system for monitoring three-dimensional deformation of immersed tunnel joint | |
CN101073528A (en) | Digital operating bed system with double-plane positioning and double-eyes visual tracting | |
CN114343847B (en) | Hand-eye calibration method of surgical robot based on optical positioning system | |
CN116643393B (en) | Microscopic image deflection-based processing method and system | |
CN103170823A (en) | Control device and method of inserting micro-pipe into micro-hole through monocular microscopy visual guidance | |
US8094191B2 (en) | System and method for correcting an image | |
CN110211183A (en) | The multi-target positioning system and method for big visual field LED lens attachment are imaged based on single | |
Yang et al. | Automatic vision-guided micromanipulation for versatile deployment and portable setup | |
WO2021227189A1 (en) | Micromanipulation platform autonomous error correction algorithm based on machine vision | |
CN111679421A (en) | Intelligent microscopic operation system based on 3D imaging technology | |
CN113403198A (en) | Multi-view composite single-cell micro-operation system and control method | |
CN109491409B (en) | Method for automatically extracting mitochondria of target cell in vitro | |
CN114359393A (en) | Cross-platform visual guide dispensing guiding method | |
CN110490801B (en) | High-speed splicing method for ultra-large matrix pictures of metallographic microscope | |
CN114067646A (en) | Visual simulation teaching system of puncture surgical robot | |
Fan et al. | Posture adjustment and robust microinjection of zebrafish larval heart |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20960539; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 20960539; Country of ref document: EP; Kind code of ref document: A1 |
| 32PN | Ep: public notification in the ep bulletin as the address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 22.09.2023) |