WO2022095082A1 - Micromanipulation platform three-dimensional positioning method for cell injection - Google Patents

Micromanipulation platform three-dimensional positioning method for cell injection

Info

Publication number
WO2022095082A1
Authority
WO
WIPO (PCT)
Prior art keywords
sample library
needle
positive
platform
micromanipulation
Prior art date
Application number
PCT/CN2020/127759
Other languages
French (fr)
Chinese (zh)
Inventor
汝长海
郝淼
陈瑞华
翟荣安
岳春峰
孙钰
Original Assignee
江苏集萃微纳自动化系统与装备技术研究所有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 江苏集萃微纳自动化系统与装备技术研究所有限公司 filed Critical 江苏集萃微纳自动化系统与装备技术研究所有限公司
Publication of WO2022095082A1 publication Critical patent/WO2022095082A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • G06N20/10Machine learning using kernel methods, e.g. support vector machines [SVM]
    • CCHEMISTRY; METALLURGY
    • C12BIOCHEMISTRY; BEER; SPIRITS; WINE; VINEGAR; MICROBIOLOGY; ENZYMOLOGY; MUTATION OR GENETIC ENGINEERING
    • C12NMICROORGANISMS OR ENZYMES; COMPOSITIONS THEREOF; PROPAGATING, PRESERVING, OR MAINTAINING MICROORGANISMS; MUTATION OR GENETIC ENGINEERING; CULTURE MEDIA
    • C12N15/00Mutation or genetic engineering; DNA or RNA concerning genetic engineering, vectors, e.g. plasmids, or their isolation, preparation or purification; Use of hosts therefor
    • C12N15/09Recombinant DNA-technology
    • C12N15/87Introduction of foreign genetic material using processes not otherwise provided for, e.g. co-transformation
    • C12N15/89Introduction of foreign genetic material using processes not otherwise provided for, e.g. co-transformation using microinjection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/80Geometric correction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • G06T2207/10061Microscopic image from scanning electron microscope
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning

Definitions

  • The invention relates to the technical field of micromanipulation computing, and in particular to a three-dimensional positioning method of a micromanipulation platform for cell injection.
  • Micromanipulation technology refers to an operation method that uses an actuator with micro-nano motion precision to complete specific experimental tasks for controlled objects such as organisms, materials, and chemical molecules through feedback methods such as vision and force.
  • In recent years, micromanipulation technology has made significant progress, and researchers have developed micromanipulation systems with various mechanical structures, driving methods, and control methods; however, these systems have not achieved wide applicability.
  • In particular, problems remain in research on methods for the three-dimensional positioning of cells, as follows.
  • The depth information of the target cells is obtained using a contact detection algorithm.
  • The usual practice of the contact detection algorithm is to move the injection needle directly above the target cell, lower the microinjection needle, and track image changes in the area near the needle tip through visual feedback; when the needle touches the target cell, the cell surface deforms, and this position is calibrated as the zero point of the depth coordinate.
  • However, since the needle tip must directly contact the cell in order to deform it, and the target tracking algorithms used for visual feedback (such as template matching, the SSD optical flow method, motion history images (MHI), and active contour models) are each task-specific, this approach cannot meet the real-time requirements of the operation.
  • Moreover, the needle tip can easily puncture the cell during contact, causing the micromanipulation to fail.
  • The second is the direct acquisition method.
  • Inspired by macroscopic binocular stereo vision, researchers introduced it into the field of micromanipulation. By placing microscopes in the horizontal and vertical directions, or by using a stereo microscope to realize binocular stereo vision, the depth information of the target cells can be obtained; however, for biological micromanipulation that demands micron-level accuracy, the errors of this method are relatively large, and ensuring precision remains a particular challenge.
  • The purpose of the present invention is to provide a highly efficient, high-precision three-dimensional positioning method of a micromanipulation platform for cell injection, which adopts the following technical solution:
  • a three-dimensional positioning method of a micromanipulation platform for cell injection comprising:
  • The positive sample library includes an injection needle positive sample library and a holding needle positive sample library;
  • The negative sample library includes an injection needle negative sample library and a holding needle negative sample library; the needle tip in the positive sample library is in the focal plane of the microscope, and the needle tip in the negative sample library is not in the focal plane of the microscope;
  • a positive sample training set and a negative sample training set are respectively established through the positive sample library and the negative sample library, and a support vector machine model is generated by training, which specifically includes:
  • H(x, y) represents the pixel value of the input sample image at pixel (x, y); G_x(x, y) and G_y(x, y) represent the horizontal and vertical gradients, respectively, computed as G_x(x, y) = H(x+1, y) − H(x−1, y) and G_y(x, y) = H(x, y+1) − H(x, y−1);
  • The sample image is divided into several uniform cells; the cell size and other parameters are set to obtain each cell's feature vector, and the cells are then combined into larger blocks; a block includes multiple cells, and the HOG feature is obtained by concatenating the feature vectors of all cells in a block;
  • the training data set of support vector machine is obtained as:
  • T = {(x_1, y_1), (x_2, y_2), …, (x_N, y_N)}
  • where i = 1, 2, 3, …, N, and N is the number of samples in the positive library or in the negative library;
  • x_i is the i-th feature vector, x_i ∈ R^n;
  • y_i is the class label of the i-th feature vector, y_i ∈ {−1, +1}; when y_i equals +1 the sample is from the positive sample library, and when y_i equals −1 it is from the negative sample library;
  • a support vector machine model is generated according to the support vector machine training data set.
  • Judging whether a picture is a positive sample specifically includes: judging whether the captured pictures of the injection needle and the holding needle are positive samples by computing their HOG feature values.
  • Before the collection of sample pictures and the establishment of the positive sample library and the negative sample library, the method further includes:
  • the error correction of the micromanipulation platform specifically includes:
  • the calculation of the systematic error of the micromanipulation platform in this direction specifically includes:
  • the calculation of the pixel spacing specifically includes:
  • The actual displacement distance between the two successive images in this direction is calculated according to the pixel spacing, specifically including:
  • The systematic error in this direction is obtained according to the actual displacement distance, specifically including:
  • The systematic error in this direction is autonomously compensated to correct it, specifically including:
  • Compensation calculation is performed through computer closed-loop feedback, and the system error in this direction is compensated autonomously to correct the system error in this direction.
  • the three-dimensional positioning method of the micro-operation platform for cell injection of the present invention can realize positioning during cell injection, and has the advantages of high positioning efficiency and high positioning accuracy.
  • FIG. 1 is a flow chart of a three-dimensional positioning method of a micromanipulation platform for cell injection in a preferred embodiment of the present invention
  • Fig. 2 is a schematic diagram of the completed needle alignment of the injection needle and the holding needle in the preferred embodiment of the present invention.
  • Fig. 3 is a flow chart of the error correction of the micromanipulation platform in the preferred embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a scale in a preferred embodiment of the present invention.
  • FIG. 5 is a schematic diagram of the two successive images in the preferred embodiment of the present invention.
  • Fig. 6 is a schematic diagram of stitching the two successive images in the preferred embodiment of the present invention.
  • FIG. 7 is a schematic diagram of stitching the two successive images along the positive X-axis direction in a preferred embodiment of the present invention.
  • the three-dimensional positioning method of the micromanipulation platform for cell injection in the preferred embodiment of the present invention includes the following steps:
  • Step S10: collect sample pictures and establish a positive sample library and a negative sample library. The positive sample library includes an injection needle positive sample library and a holding needle positive sample library, and the negative sample library includes an injection needle negative sample library and a holding needle negative sample library; the needle tip in the positive sample library is in the focal plane of the microscope, and the needle tip in the negative sample library is not in the focal plane of the microscope.
  • Step S20 Establish a positive sample training set and a negative sample training set respectively through the positive sample library and the negative sample library, and train to generate a support vector machine model.
  • HOG: histogram of oriented gradients.
  • H(x, y) represents the pixel value of the input sample image at pixel (x, y); G_x(x, y) and G_y(x, y) represent the horizontal and vertical gradients, respectively, computed as G_x(x, y) = H(x+1, y) − H(x−1, y) and G_y(x, y) = H(x, y+1) − H(x, y−1);
  • The sample image is divided into several uniform cells; the cell size and other parameters (such as the number of bins and the gradient orientation range) are set to obtain each cell's feature vector, and the cells are then combined into larger blocks; a block includes multiple cells, and the HOG feature is obtained by concatenating the feature vectors of all cells in a block;
  • the training data set of the support vector machine is obtained as:
  • T = {(x_1, y_1), (x_2, y_2), …, (x_N, y_N)}
  • where i = 1, 2, 3, …, N, and N is the number of samples in the positive library or in the negative library;
  • x_i is the i-th feature vector, x_i ∈ R^n;
  • y_i is the class label of the i-th feature vector, y_i ∈ {−1, +1}; when y_i equals +1 the sample is from the positive sample library, and when y_i equals −1 it is from the negative sample library;
  • a support vector machine model is generated according to the support vector machine training data set.
  • Step S30: respectively move the injection needle and the holding needle along a preset route under the microscope, and take pictures. Specifically, under the microscope, the injection needle and the holding needle are each moved with a fixed step from above the focal plane in the negative Z-axis direction, or from below the focal plane in the positive Z-axis direction, and one picture is taken at each step.
  • Step S40: use the trained support vector machine model to process each captured picture and determine whether it is a positive sample; when the captured pictures of both the injection needle and the holding needle are positive samples, the needle alignment of the injection needle and the holding needle is completed.
  • By computing the HOG feature value, it is determined whether the pictures of the injection needle and the holding needle are positive samples.
  • In order to achieve high-precision two-dimensional positioning in the XY plane, the present invention further includes step A: performing error correction on the micromanipulation platform.
  • step A specifically includes:
  • Step A1: place the ruler on the micromanipulation platform, move the platform in a fixed direction with a fixed step while acquiring images of the ruler, and ensure that successive images partially overlap.
  • the scale is a two-dimensional plane scale, as shown in FIG. 4 .
  • The two successively acquired images are shown in Figure 5, namely the front frame and the rear frame.
  • Step A2: stitch the two successive images in this direction.
  • the stitched image is shown in Figure 6.
  • Step A3: calculate the systematic error of the micromanipulation platform in this direction, specifically including:
  • Step A31: calculate the pixel spacing, specifically including:
  • Step A32: calculate the actual displacement distance between the two successive images in this direction according to the pixel spacing, specifically including:
  • Step A33: obtain the systematic error in this direction according to the actual displacement distance, specifically including:
  • +X_Δx and +X_Δy denote the compensation values required when the X axis moves in the positive direction; the compensation values for movement in the other directions are obtained in the same way.
  • Step A4: autonomously compensate the systematic error in this direction to correct it, specifically including:
  • Compensation calculation is performed through computer closed-loop feedback, and the systematic error in this direction is autonomously compensated and corrected.
  • performing the error correction of the micro-operation platform further includes:
  • Step A5: move the micromanipulation platform in the other directions with a fixed step and acquire images respectively, calculate the systematic errors of the platform in each of the other directions, and autonomously compensate them to complete the systematic error correction for all directions.
  • All directions include the positive direction of the X axis, the negative direction of the X axis, the positive direction of the Y axis, and the negative direction of the Y axis.
  • Performing the error correction of the micromanipulation platform further includes: taking multiple sets of images and performing multiple calculations to obtain the average systematic error of the platform in this direction.
  • This improves the accuracy of the systematic error calculation and, ultimately, the accuracy of the error correction.
  • The method for obtaining the two-dimensional position in the three-dimensional positioning method of the micromanipulation platform for cell injection proposed by the present invention abandons the traditional approach of separately acquiring data for, calibrating, and manually compensating each manual error factor (such as translational, rotational, and rolling motion parts).
  • Instead, the mechanical errors, CCD installation errors, and pixel/micron conversion errors that currently affect the accuracy of microinjection are integrated and unified; without manual assistance, self-compensation and correction can be realized, and the error can be controlled at the pixel level.
  • The systematic error autonomous compensation algorithm described is applicable not only to the micromanipulation system but also to other mobile platforms.
  • The error correction is not only simple to operate but also highly efficient and accurate.
  • The method for obtaining the depth position in the three-dimensional positioning method of the micromanipulation platform for cell injection proposed by the present invention determines the depth position by bringing the needle to the focal plane, based on HOG features combined with machine learning.
  • The new method offers both high processing speed and high accuracy, thereby solving the problem of inaccurate depth position acquisition in current positioning methods.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Genetics & Genomics (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Biomedical Technology (AREA)
  • Software Systems (AREA)
  • Organic Chemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Zoology (AREA)
  • Wood Science & Technology (AREA)
  • Biotechnology (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Molecular Biology (AREA)
  • Plant Pathology (AREA)
  • Artificial Intelligence (AREA)
  • Microbiology (AREA)
  • Biophysics (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Manipulator (AREA)
  • Microscopes, Condenser (AREA)

Abstract

A micromanipulation platform three-dimensional positioning method for cell injection, comprising: acquiring sample pictures and establishing a positive sample library and a negative sample library, the positive sample library comprising an injection needle positive sample library and a holding needle positive sample library, and the negative sample library comprising an injection needle negative sample library and a holding needle negative sample library; respectively establishing a positive sample training set and a negative sample training set by means of the positive sample library and the negative sample library, and generating a support vector machine model by training; respectively moving an injection needle and a holding needle under a microscope along a preset route, and respectively capturing pictures; and using the trained support vector machine model to process the captured pictures, and determining whether the pictures are positive samples until the captured injection needle and holding needle pictures are all positive samples, thereby completing the alignment of the injection needle with the holding needle. The method makes it possible to perform positioning during cell injection, and has the advantages of high positioning efficiency and high positioning accuracy.

Description

A Three-Dimensional Positioning Method of a Micromanipulation Platform for Cell Injection

Technical Field

The invention relates to the technical field of micromanipulation computing, and in particular to a three-dimensional positioning method of a micromanipulation platform for cell injection.

Background Art
Since the beginning of the 21st century, with the extensive application of assisted reproductive technology and biomedicine, micromanipulation technology in the field of cell engineering has gradually become a cutting-edge technology inseparable from the development of human society. Micromanipulation technology refers to an operation method that uses actuators with micro/nano motion precision to complete specific experimental tasks on controlled objects such as organisms, materials, and chemical molecules, guided by feedback such as vision and force. Traditional micromanipulation is performed entirely by hand, generally by experienced laboratory personnel with at least two or three years of professional training, in tasks such as intracytoplasmic sperm injection (ICSI), nuclear transfer, chimera techniques, embryo transfer, and microdissection. Among these, cell microinjection is the most widely applied, with major applications including intracellular drug injection, nuclear transfer, embryo transfer, and ICSI. During microinjection, the experimenter sits in front of the microscope, observes the workspace by eye, and operates the micromanipulator according to personal experience; since the operated objects are generally at the micron scale, this poses certain challenges for the operator. This process is evidently time-consuming and labor-intensive, depends heavily on the operator's personal experience, and makes it difficult to ensure consistency across operations, which to a certain extent restricts the development of bioengineering technology and the life sciences.
With the continuous advancement of machining and automation technology, replacing manual labor with machines to improve the level of automation is the current research trend. In recent years, micromanipulation technology has made significant progress, and researchers have developed micromanipulation systems with various mechanical structures, driving methods, and control methods; however, these systems have not achieved wide applicability, and certain problems remain in research on methods for the three-dimensional positioning of cells, as follows.
On the one hand, when acquiring the two-dimensional position, the calibration accuracy of the microscopic vision system is low. Since the objects of micromanipulation are all at the micron scale, the requirements on the motion precision of the operating platform are very high. At present, the precision of micromanipulation depends mainly on mechanical precision; however, mechanical equipment inevitably has inherent errors and installation errors, which reduce the precision of micromanipulation and can even directly affect its results. Currently, mechanical auxiliary instruments are often used to assist installation and reduce installation error. This form of error compensation requires many repeated measurements accompanied by a large amount of data processing; it is labor-intensive and inefficient, and the measurement process itself can introduce additional errors. Therefore, in the absence of an effective error compensation method, current micromanipulation still relies without exception on purely manual or semi-automatic operation, which clearly cannot meet the high-efficiency, high-quality operation requirements of modern intelligent medical technology.
On the other hand, when acquiring the depth position, the depth information cannot be obtained accurately. Acquiring depth information is a challenging problem in the field of micromanipulation robotics. There are essentially two existing approaches. The first is the indirect acquisition method: based on image processing, a contact detection algorithm is used to obtain the depth information of the target cells. The usual procedure is to move the injection needle directly above the target cell, lower the microinjection needle, and track image changes in the area near the needle tip through visual feedback; when the needle touches the target cell, the cell surface deforms, and this position is calibrated as the zero point of the depth coordinate. However, because the needle tip must directly contact and deform the cell, and the target tracking algorithms used for visual feedback (such as template matching, the SSD optical flow method, motion history images (MHI), and active contour models) are each task-specific, this approach cannot meet the real-time requirements of the operation; moreover, the needle tip can easily puncture the cell during contact, causing the micromanipulation to fail. The second is the direct acquisition method: inspired by macroscopic binocular stereo vision, researchers introduced it into the field of micromanipulation, placing microscopes in the horizontal and vertical directions or using a stereo microscope to realize binocular stereo vision and thereby obtain the depth information of the target cells. For biological micromanipulation that demands micron-level accuracy, however, the errors of this method are relatively large, and ensuring precision remains a particularly prominent challenge.
Summary of the Invention

In view of the deficiencies of the prior art, the purpose of the present invention is to provide a highly efficient, high-precision three-dimensional positioning method of a micromanipulation platform for cell injection, which adopts the following technical solution:
A three-dimensional positioning method of a micromanipulation platform for cell injection, comprising:

Collecting sample pictures and establishing a positive sample library and a negative sample library, wherein the positive sample library includes an injection needle positive sample library and a holding needle positive sample library, and the negative sample library includes an injection needle negative sample library and a holding needle negative sample library; the needle tip in the positive sample library is in the focal plane of the microscope, and the needle tip in the negative sample library is not in the focal plane of the microscope;

Establishing a positive sample training set and a negative sample training set from the positive sample library and the negative sample library, respectively, and training to generate a support vector machine model;

Moving the injection needle and the holding needle along a preset route under the microscope, respectively, and taking pictures of each;

Using the trained support vector machine model to process the captured pictures and determine whether each picture is a positive sample; when the captured pictures of both the injection needle and the holding needle are positive samples, the needle alignment of the injection needle and the holding needle is completed.
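The needle-alignment procedure above can be sketched as a simple search loop. Everything below is a hypothetical illustration: `capture`, `classify`, and `move_z` stand in for the camera, the trained support vector machine, and the Z-axis stage, and the mock objects replace real hardware.

```python
def align_needle(capture, classify, move_z, z_start, step, max_steps=200):
    """Lower the needle from above the focal plane in fixed Z steps,
    classifying each captured frame until the tip is judged in focus
    (a positive sample). Returns the Z position at which focus was found,
    or None if the search budget is exhausted."""
    z = z_start
    for _ in range(max_steps):
        move_z(z)
        if classify(capture()) == +1:  # +1: positive sample (tip in focal plane)
            return z
        z -= step                      # fixed step toward the focal plane
    return None

# Mock hardware: the focal plane sits at z = 0; the mock classifier reports
# "in focus" when the needle is within half a step of it.
focal_plane = 0.0
state = {"z": None}
move_z = lambda z: state.update(z=z)
capture = lambda: state["z"]
classify = lambda frame: +1 if abs(frame - focal_plane) < 0.5 else -1

z_found = align_needle(capture, classify, move_z, z_start=10.0, step=1.0)
```

In the method itself this loop would be run once for the injection needle and once for the holding needle, stopping each when its picture is classified as a positive sample.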
As a further improvement of the present invention, establishing a positive sample training set and a negative sample training set from the positive and negative sample libraries, respectively, and training to generate a support vector machine model specifically includes:
Calculating the gradients G_x(x, y) and G_y(x, y) of each sample image in the sample library along the x-axis and y-axis directions:

G_x(x, y) = H(x+1, y) − H(x−1, y)

G_y(x, y) = H(x, y+1) − H(x, y−1)

where H(x, y) represents the pixel value of the input sample image at pixel (x, y), and G_x(x, y) and G_y(x, y) represent the horizontal and vertical gradients, respectively;
Dividing the sample image into several uniform cells, setting the cell size and other parameters to obtain each cell's feature vector, and combining the cells into larger blocks; a block includes multiple cells, and the HOG feature is obtained by concatenating the feature vectors of all cells in a block;
Combining this with machine learning, by training on the positive sample training set and the negative sample training set, the support vector machine training data set is obtained as:

T = {(x_1, y_1), (x_2, y_2), …, (x_N, y_N)}

where i = 1, 2, 3, …, N; N is the number of samples in the positive library or in the negative library; x_i is the i-th feature vector, x_i ∈ R^n; y_i is the class label of the i-th feature vector, y_i ∈ {−1, +1}, with y_i = +1 indicating the positive sample library and y_i = −1 the negative sample library;

A support vector machine model is generated according to the support vector machine training data set.
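As a minimal illustration of the gradient formulas and training pairs above, the computation can be sketched in plain NumPy. This is a sketch under assumptions, not the patent's implementation: the axis convention, the zero-padded borders, the 9-bin unsigned-orientation histogram, and the synthetic sample images are all choices made here for demonstration.

```python
import numpy as np

def gradients(H):
    # Central differences per the patent's formulas:
    # G_x(x, y) = H(x+1, y) - H(x-1, y),  G_y(x, y) = H(x, y+1) - H(x, y-1).
    # The first array axis is treated as x; borders are left at zero.
    H = H.astype(float)
    Gx = np.zeros_like(H)
    Gy = np.zeros_like(H)
    Gx[1:-1, :] = H[2:, :] - H[:-2, :]
    Gy[:, 1:-1] = H[:, 2:] - H[:, :-2]
    return Gx, Gy

def cell_histogram(Gx, Gy, n_bins=9):
    # One HOG cell: every pixel votes its gradient magnitude into the bin
    # of its unsigned gradient orientation (0..180 degrees).
    mag = np.hypot(Gx, Gy)
    ang = np.rad2deg(np.arctan2(Gy, Gx)) % 180.0
    bins = np.minimum((ang / (180.0 / n_bins)).astype(int), n_bins - 1)
    hist = np.zeros(n_bins)
    np.add.at(hist, bins.ravel(), mag.ravel())
    return hist

# Synthetic stand-ins: a "sharp" image with a strong edge (in-focus needle)
# and a featureless one (out of focus), labelled +1 / -1 as in T above.
sharp = np.zeros((8, 8))
sharp[:, 3:5] = 255.0
blurred = np.full((8, 8), sharp.mean())
T = [(cell_histogram(*gradients(sharp)), +1),
     (cell_histogram(*gradients(blurred)), -1)]
```

A full pipeline would concatenate the histograms of all cells in a block into the HOG feature vector x_i and feed the pairs in T to an SVM trainer.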
As a further improvement of the present invention, judging whether a picture is a positive sample specifically includes: judging whether the captured pictures of the injection needle and the holding needle are positive samples by computing their HOG feature values.
As a further improvement of the present invention, before collecting sample pictures and establishing the positive and negative sample libraries, the method further includes:

Performing error correction on the micromanipulation platform.
作为本发明的进一步改进,所述进行显微操作平台误差矫正,具体包括:As a further improvement of the present invention, the error correction of the micromanipulation platform specifically includes:
将标尺放置在显微操作平台上,以固定步进分别让显微操作平台沿固定方向移动并获取标尺图像,并保证前后两张图像有部分重叠;Place the ruler on the micro-operating platform, move the micro-operating platform in a fixed direction at a fixed step, and acquire the ruler image, and ensure that the two images before and after partially overlap;
将该方向的前后两张图像进行拼接;Stitch the two images before and after the direction;
计算显微操作平台在该方向的系统误差;Calculate the systematic error of the micromanipulation platform in this direction;
对该方向的系统误差进行自主补偿,矫正该方向系统误差;Perform autonomous compensation for the system error in the direction to correct the system error in the direction;
以固定步进分别让显微操作平台沿其他方向移动并分别获取图像,计算显微操作平台在其他各个方向上的系统误差,并对其他各个方向的系统误差进行自主补偿,完成所有方向的系统误差矫正。Move the micromanipulation platform in other directions with fixed steps and acquire images respectively, calculate the systematic errors of the micromanipulation platform in other directions, and automatically compensate the system errors in other directions to complete the system in all directions. Error correction.
As a further improvement of the present invention, calculating the systematic error of the micromanipulation platform in that direction specifically includes:
calculating the pixel pitch;
calculating the actual displacement between the two consecutive images in that direction from the pixel pitch;
obtaining the systematic error in that direction from the actual displacement.
As a further improvement of the present invention, calculating the pixel pitch specifically includes:
calculating the pixel pitch using the formula S = M/N, where S is the pixel pitch, M is the scale length, and N is the number of pixels within length M.
As a further improvement of the present invention, calculating the actual displacement between the two consecutive images in that direction from the pixel pitch specifically includes:
calculating the actual displacement using the formula AA1_actual = S × AA1, where AA1_actual is the actual displacement and AA1 is the number of pixels of relative displacement between the two consecutive images in that direction.
As a further improvement of the present invention, obtaining the systematic error in that direction from the actual displacement specifically includes:
obtaining the two components of the systematic error in that direction from the formulas AA1_actual × cos(θ) and AA1_actual × sin(θ), where θ is the angle between the image coordinate system and the micromanipulation platform coordinate system.
As a further improvement of the present invention, autonomously compensating the systematic error in that direction to correct it specifically includes:
performing the compensation calculation through computer closed-loop feedback to autonomously compensate the systematic error in that direction and thereby correct it.
Beneficial effects of the present invention:
The three-dimensional positioning method of a micromanipulation platform for cell injection of the present invention achieves positioning during cell injection and offers high positioning efficiency and high positioning accuracy.
The above description is only an overview of the technical solution of the present invention. So that the technical means of the present invention may be understood more clearly and implemented in accordance with the contents of the specification, and so that the above and other objects, features, and advantages of the present invention will be more readily apparent, preferred embodiments are described in detail below with reference to the accompanying drawings.
Description of the Drawings
FIG. 1 is a flowchart of the three-dimensional positioning method of a micromanipulation platform for cell injection in a preferred embodiment of the present invention;
FIG. 2 is a schematic diagram of the completed needle alignment of the injection needle and the holding needle in a preferred embodiment of the present invention;
FIG. 3 is a flowchart of the error correction of the micromanipulation platform in a preferred embodiment of the present invention;
FIG. 4 is a schematic diagram of the scale in a preferred embodiment of the present invention;
FIG. 5 is a schematic diagram of two consecutive images in a preferred embodiment of the present invention;
FIG. 6 is a schematic diagram of the stitching of two consecutive images in a preferred embodiment of the present invention;
FIG. 7 is a schematic diagram of the stitching of two consecutive images in the positive X-axis direction in a preferred embodiment of the present invention.
Detailed Description
The present invention is further described below with reference to the accompanying drawings and specific embodiments, so that those skilled in the art can better understand and implement the present invention; the embodiments given, however, are not intended to limit the present invention.
As shown in FIG. 1, the three-dimensional positioning method of a micromanipulation platform for cell injection in a preferred embodiment of the present invention includes the following steps:
Step S10: collect sample pictures and establish a positive sample library and a negative sample library, where the positive sample library includes an injection-needle positive sample library and a holding-needle positive sample library, and the negative sample library includes an injection-needle negative sample library and a holding-needle negative sample library; in the positive sample library the needle tip lies in the focal plane of the microscope, while in the negative sample library it does not.
Step S20: establish a positive sample training set and a negative sample training set from the positive sample library and the negative sample library, respectively, and train them to generate a support vector machine model.
Specifically, histogram of oriented gradients (HOG) features are extracted from the positive sample library and the negative sample library, which specifically includes:
first, calculating the gradients G_x(x, y) and G_y(x, y) of each sample image in the sample library in the x-axis and y-axis directions:
G_x(x, y) = H(x+1, y) - H(x-1, y)
G_y(x, y) = H(x, y+1) - H(x, y-1)
where H(x, y) denotes the pixel value of the input sample image at pixel (x, y), and G_x(x, y) and G_y(x, y) denote the horizontal and vertical gradients, respectively;
then dividing the sample image into several uniform cells and setting the cell size and other parameters to obtain each cell's feature vector, and grouping the cells into larger blocks, each block comprising multiple cells; the HOG feature is obtained by concatenating the feature vectors of all cells within each block. The other parameters include the number of histogram bins, the gradient orientations, and so on.
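To make the cell/block pipeline above concrete, the following is a minimal NumPy-only HOG sketch. It is not the patent's implementation; the cell size, block size, and bin count are illustrative assumptions:

```python
import numpy as np

def hog_features(img, cell=8, block=2, bins=9):
    """Minimal HOG sketch: per-cell orientation histograms of gradient
    magnitude, then overlapping block-wise concatenation with L2
    normalisation. Parameter defaults are illustrative only."""
    img = np.asarray(img, dtype=np.float64)
    gy, gx = np.gradient(img)
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0   # unsigned orientations
    n_cy, n_cx = img.shape[0] // cell, img.shape[1] // cell
    hist = np.zeros((n_cy, n_cx, bins))
    for cy in range(n_cy):
        for cx in range(n_cx):
            m = mag[cy * cell:(cy + 1) * cell, cx * cell:(cx + 1) * cell]
            a = ang[cy * cell:(cy + 1) * cell, cx * cell:(cx + 1) * cell]
            idx = np.minimum((a / (180.0 / bins)).astype(int), bins - 1)
            for b in range(bins):
                hist[cy, cx, b] = m[idx == b].sum()
    feats = []  # one normalised vector per overlapping block of cells
    for by in range(n_cy - block + 1):
        for bx in range(n_cx - block + 1):
            v = hist[by:by + block, bx:bx + block].ravel()
            feats.append(v / (np.linalg.norm(v) + 1e-6))
    return np.concatenate(feats)
```

For a 32×32 image with these defaults, the descriptor concatenates 3×3 overlapping blocks of 2×2 cells with 9 bins each, giving a 324-dimensional feature vector.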
Then, using machine learning, training on the positive sample training set and the negative sample training set yields the support vector machine training data set
T = {(x_1, y_1), (x_2, y_2), ..., (x_N, y_N)}
where i = 1, 2, 3, ..., N, N is the number of samples in the positive sample library or in the negative sample library, x_i is the i-th feature vector, x_i ∈ R^n, y_i is the i-th class label, y_i ∈ {-1, +1}, y_i = +1 indicating the positive sample library and y_i = -1 the negative sample library;
a support vector machine model is generated from the support vector machine training data set.
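The training step can be sketched as follows. The patent does not specify a solver, so this example uses a toy hinge-loss sub-gradient method on a labelled set with y_i in {-1, +1}; all hyper-parameters are assumptions of this sketch:

```python
import numpy as np

def train_linear_svm(X, y, lr=0.01, lam=0.01, epochs=200, seed=0):
    """Toy linear SVM trained by sub-gradient descent on the hinge loss,
    with labels y_i in {-1, +1} as in the training data set T above."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            if y[i] * (X[i] @ w + b) < 1:      # margin violated
                w += lr * (y[i] * X[i] - lam * w)
                b += lr * y[i]
            else:
                w -= lr * lam * w              # regularisation step only
    return w, b

def predict(w, b, X):
    """+1 = positive sample (tip in focus), -1 = negative sample."""
    return np.where(X @ w + b >= 0, 1, -1)
```

In practice the rows of X would be the HOG feature vectors extracted from the sample libraries.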
Step S30: move the injection needle and the holding needle along a preset route under the microscope, taking pictures of each. Specifically, each needle is moved under the microscope in fixed steps from above the focal plane in the negative Z-axis direction, or from below the focal plane in the positive Z-axis direction, and one picture is taken per step.
Step S40: process the captured pictures with the trained support vector machine model and judge whether each picture is a positive sample; once the captured pictures of both the injection needle and the holding needle are positive samples, the needle alignment of the injection needle and the holding needle is complete.
Whether the captured pictures of the injection needle and the holding needle are positive samples is judged by computing the HOG feature values.
In one embodiment, the present invention further includes step A: performing error correction of the micromanipulation platform, so as to achieve high-precision two-dimensional positioning in the XY plane.
Specifically, step A includes:
Step A1: place a scale on the micromanipulation platform, move the platform along a fixed direction in fixed steps while capturing images of the scale, and ensure that the two consecutive images partially overlap.
In this embodiment, the scale is a two-dimensional planar scale, as shown in FIG. 4. The two captured images, namely the previous frame and the subsequent frame, are shown in FIG. 5.
Step A2: stitch the two consecutive images in that direction. The stitched image is shown in FIG. 6.
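One way to realise this stitching step, i.e., to recover the relative pixel displacement AA1 between the previous and subsequent frames, is phase correlation. The patent does not prescribe a particular registration algorithm, so this choice is an assumption of the sketch:

```python
import numpy as np

def pixel_shift(prev, curr):
    """Integer pixel displacement of `curr` relative to `prev` by phase
    correlation; exact for cyclic shifts and a good estimate for frames
    that genuinely overlap."""
    F = np.conj(np.fft.fft2(prev)) * np.fft.fft2(curr)
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > prev.shape[0] // 2:   # wrap shifts into the signed range
        dy -= prev.shape[0]
    if dx > prev.shape[1] // 2:
        dx -= prev.shape[1]
    return dy, dx
```

The magnitude of the returned shift along the direction of motion corresponds to AA1 in the formulas below.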
Step A3: calculate the systematic error of the micromanipulation platform in that direction. This specifically includes:
Step A31: calculate the pixel pitch, specifically:
calculating the pixel pitch using the formula S = M/N, where S is the pixel pitch, M is the scale length, and N is the number of pixels within length M.
Step A32: calculate the actual displacement between the two consecutive images in that direction from the pixel pitch, specifically:
calculating the actual displacement using the formula AA1_actual = S × AA1, where AA1_actual is the actual displacement and AA1 is the number of pixels of relative displacement between the two consecutive images in that direction.
Step A33: obtain the systematic error in that direction from the actual displacement, specifically:
obtaining the two components of the systematic error in that direction from the formulas AA1_actual × cos(θ) and AA1_actual × sin(θ), where θ is the angle between the image coordinate system and the micromanipulation platform coordinate system.
As shown in FIG. 7, when the micromanipulation platform moves in the positive X-axis direction, the systematic error in this direction is +XΔ, which can be decomposed into the two components +XΔ_x and +XΔ_y, satisfying the following formulas:
+XΔ_x = AA1_actual × cos(θ);
+XΔ_y = AA1_actual × sin(θ);
+XΔ_x and +XΔ_y are the compensation values required when the platform moves in the positive X-axis direction. The compensation values for motion in the other directions are obtained in the same way.
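Putting steps A31 to A33 together with illustrative numbers (the values below are assumptions for the sketch, not measurements from the patent):

```python
import math

# Illustrative numbers only (not measurements from the patent).
M = 1000.0      # scale length, in micrometres
N = 2000        # number of pixels spanning length M
S = M / N       # step A31: pixel pitch, 0.5 um per pixel

AA1 = 40                  # relative frame displacement, in pixels
AA1_actual = S * AA1      # step A32: actual displacement, 20.0 um

theta = math.radians(1.5)            # image-vs-platform axis offset angle
dXx = AA1_actual * math.cos(theta)   # step A33: +XΔ_x compensation component
dXy = AA1_actual * math.sin(theta)   # step A33: +XΔ_y compensation component
```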
Step A4: autonomously compensate the systematic error in that direction to correct it. This specifically includes:
performing the compensation calculation through computer closed-loop feedback to autonomously compensate the systematic error in that direction and thereby correct it.
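The closed-loop feedback could take the following shape; `stage_move` and `measure_error` are hypothetical callbacks standing in for the platform driver and the image-based error measurement, and the tolerance and iteration limit are assumptions:

```python
def compensated_move(stage_move, measure_error, target, max_iters=5, tol=0.5):
    """Closed-loop compensation sketch: command a move, measure the
    residual error from images, and re-command the residual as a
    correction until it falls below the tolerance."""
    stage_move(target)
    err = measure_error()
    for _ in range(max_iters):
        if abs(err) <= tol:
            break
        stage_move(-err)          # feed the residual back as a correction
        err = measure_error()
    return err
```

A quick way to exercise this is to simulate a stage whose every move overshoots by a small systematic gain error and check that the loop drives the residual below the tolerance.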
In this embodiment, performing error correction of the micromanipulation platform further includes:
Step A5: move the micromanipulation platform along the other directions in fixed steps, capture images in each, calculate the systematic error of the platform in each of the other directions, and autonomously compensate them, thereby completing the systematic error correction in all directions. Here, all directions include the positive X-axis, negative X-axis, positive Y-axis, and negative Y-axis directions.
In this embodiment, performing error correction of the micromanipulation platform further includes: capturing multiple sets of images and performing multiple calculations to obtain the average systematic error of the platform in that direction. This improves the accuracy of the systematic error calculation and, ultimately, the accuracy of the error correction.
For two-dimensional position acquisition, the three-dimensional positioning method of a micromanipulation platform for cell injection proposed by the present invention abandons the traditional approach of separately acquiring calibration data for each possible error source (for example, translational, rotational, and rolling moving parts) and manually compensating to correct the errors. Instead, based on image stitching, it integrates and unifies the mechanical errors, CCD installation errors, and pixel-to-micron conversion errors that currently affect microinjection accuracy; no manual assistance is required, autonomous compensation and correction are achieved, and the error can be controlled at the pixel level.
For two-dimensional position acquisition, the autonomous systematic error compensation algorithm proposed in the three-dimensional positioning method of the present invention applies not only to micromanipulation systems but also to the error correction of other motion platforms; it is simple to operate while offering high efficiency and high accuracy.
For depth position acquisition, the three-dimensional positioning method of the present invention proposes a new approach that determines the depth position by "needle alignment at the focal plane" based on HOG features combined with machine learning; it offers both high processing speed and high accuracy, thereby solving problems such as the inaccurate depth-position acquisition of current positioning methods.
The above embodiments are merely preferred embodiments given to fully illustrate the present invention, and the protection scope of the present invention is not limited thereto. Equivalent substitutions or transformations made by those skilled in the art on the basis of the present invention all fall within the protection scope of the present invention, which is defined by the claims.

Claims (10)

  1. A three-dimensional positioning method of a micromanipulation platform for cell injection, characterized by comprising:
    collecting sample pictures and establishing a positive sample library and a negative sample library, wherein the positive sample library comprises an injection-needle positive sample library and a holding-needle positive sample library, the negative sample library comprises an injection-needle negative sample library and a holding-needle negative sample library, the needle tip in the positive sample library lies in the focal plane of the microscope, and the needle tip in the negative sample library does not lie in the focal plane of the microscope;
    establishing a positive sample training set and a negative sample training set from the positive sample library and the negative sample library, respectively, and training them to generate a support vector machine model;
    moving the injection needle and the holding needle along a preset route under the microscope, respectively, and taking pictures of each;
    processing the captured pictures with the trained support vector machine model and judging whether each picture is a positive sample, until the captured pictures of both the injection needle and the holding needle are positive samples, whereupon the needle alignment of the injection needle and the holding needle is complete.
  2. The three-dimensional positioning method of a micromanipulation platform for cell injection according to claim 1, characterized in that establishing a positive sample training set and a negative sample training set from the positive sample library and the negative sample library, respectively, and training them to generate a support vector machine model specifically comprises:
    first, calculating the gradients G_x(x, y) and G_y(x, y) of each sample image in the sample library in the x-axis and y-axis directions:
    G_x(x, y) = H(x+1, y) - H(x-1, y)
    G_y(x, y) = H(x, y+1) - H(x, y-1)
    where H(x, y) denotes the pixel value of the input sample image at pixel (x, y), and G_x(x, y) and G_y(x, y) denote the horizontal and vertical gradients, respectively;
    then dividing the sample image into several uniform cells and setting the cell size and other parameters to obtain each cell's feature vector, and grouping the cells into larger blocks, each block comprising multiple cells, the HOG feature being obtained by concatenating the feature vectors of all cells within each block;
    then, using machine learning, training on the positive sample training set and the negative sample training set to obtain the support vector machine training data set
    T = {(x_1, y_1), (x_2, y_2), ..., (x_N, y_N)}
    where i = 1, 2, 3, ..., N, N is the number of samples in the positive sample library or in the negative sample library, x_i is the i-th feature vector, x_i ∈ R^n, y_i is the i-th class label, y_i ∈ {-1, +1}, y_i = +1 indicating the positive sample library and y_i = -1 the negative sample library;
    generating a support vector machine model from the support vector machine training data set.
  3. The three-dimensional positioning method of a micromanipulation platform for cell injection according to claim 2, characterized in that judging whether the picture is a positive sample specifically comprises: judging, by computing the HOG feature values, whether the captured pictures of both the injection needle and the holding needle are positive samples.
  4. The three-dimensional positioning method of a micromanipulation platform for cell injection according to claim 1, characterized in that, before collecting the sample pictures and establishing the positive sample library and the negative sample library, the method further comprises:
    performing error correction of the micromanipulation platform.
  5. The three-dimensional positioning method of a micromanipulation platform for cell injection according to claim 4, characterized in that performing error correction of the micromanipulation platform specifically comprises:
    placing a scale on the micromanipulation platform, moving the platform along a fixed direction in fixed steps while capturing images of the scale, and ensuring that the two consecutive images partially overlap;
    stitching the two consecutive images in that direction;
    calculating the systematic error of the micromanipulation platform in that direction;
    autonomously compensating the systematic error in that direction to correct it;
    moving the micromanipulation platform along the other directions in fixed steps, capturing images in each, calculating the systematic error of the platform in each of the other directions, and autonomously compensating them, thereby completing the systematic error correction in all directions.
  6. The three-dimensional positioning method of a micromanipulation platform for cell injection according to claim 5, characterized in that calculating the systematic error of the micromanipulation platform in that direction specifically comprises:
    calculating the pixel pitch;
    calculating the actual displacement between the two consecutive images in that direction from the pixel pitch;
    obtaining the systematic error in that direction from the actual displacement.
  7. The three-dimensional positioning method of a micromanipulation platform for cell injection according to claim 6, characterized in that calculating the pixel pitch specifically comprises:
    calculating the pixel pitch using the formula S = M/N, where S is the pixel pitch, M is the scale length, and N is the number of pixels within length M.
  8. The three-dimensional positioning method of a micromanipulation platform for cell injection according to claim 7, characterized in that calculating the actual displacement between the two consecutive images in that direction from the pixel pitch specifically comprises:
    calculating the actual displacement using the formula AA1_actual = S × AA1, where AA1_actual is the actual displacement and AA1 is the number of pixels of relative displacement between the two consecutive images in that direction.
  9. The three-dimensional positioning method of a micromanipulation platform for cell injection according to claim 8, characterized in that obtaining the systematic error in that direction from the actual displacement specifically comprises:
    obtaining the two components of the systematic error in that direction from the formulas AA1_actual × cos(θ) and AA1_actual × sin(θ), where θ is the angle between the image coordinate system and the micromanipulation platform coordinate system.
  10. The three-dimensional positioning method of a micromanipulation platform for cell injection according to claim 5, characterized in that autonomously compensating the systematic error in that direction to correct it specifically comprises:
    performing the compensation calculation through computer closed-loop feedback to autonomously compensate the systematic error in that direction and thereby correct it.
PCT/CN2020/127759 2020-11-04 2020-11-10 Micromanipulation platform three-dimensional positioning method for cell injection WO2022095082A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011213646.X 2020-11-04
CN202011213646.XA CN112101575B (en) 2020-11-04 2020-11-04 Three-dimensional positioning method of micromanipulation platform for cell injection

Publications (1)

Publication Number Publication Date
WO2022095082A1 true WO2022095082A1 (en) 2022-05-12

Family

ID=73784516

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/127759 WO2022095082A1 (en) 2020-11-04 2020-11-10 Micromanipulation platform three-dimensional positioning method for cell injection

Country Status (2)

Country Link
CN (1) CN112101575B (en)
WO (1) WO2022095082A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114034225A (en) * 2021-11-25 2022-02-11 广州市华粤行医疗科技有限公司 Method for testing movement precision of injection needle under microscope

Citations (4)

Publication number Priority date Publication date Assignee Title
CN106204642A (en) * 2016-06-29 2016-12-07 四川大学 A kind of cell tracker method based on deep neural network
CN110796661A (en) * 2018-08-01 2020-02-14 华中科技大学 Fungal microscopic image segmentation detection method and system based on convolutional neural network
CN110841139A (en) * 2019-12-10 2020-02-28 深圳市中科微光医疗器械技术有限公司 Remaining needle capable of realizing needle tip positioning in image environment
CN111652848A (en) * 2020-05-07 2020-09-11 南开大学 Robotized adherent cell three-dimensional positioning method

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
CN2710010Y (en) * 2004-03-26 2005-07-13 张志宏 Microscopic operation system of biocell computer
US9598281B2 (en) * 2011-03-03 2017-03-21 The Regents Of The University Of California Nanopipette apparatus for manipulating cells
CN103255049A (en) * 2013-05-20 2013-08-21 苏州大学 Composite piezoelectric injection system and injection method

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
CN106204642A (en) * 2016-06-29 2016-12-07 四川大学 A kind of cell tracker method based on deep neural network
CN110796661A (en) * 2018-08-01 2020-02-14 华中科技大学 Fungal microscopic image segmentation detection method and system based on convolutional neural network
CN110841139A (en) * 2019-12-10 2020-02-28 深圳市中科微光医疗器械技术有限公司 Remaining needle capable of realizing needle tip positioning in image environment
CN111652848A (en) * 2020-05-07 2020-09-11 南开大学 Robotized adherent cell three-dimensional positioning method

Also Published As

Publication number Publication date
CN112101575B (en) 2021-04-30
CN112101575A (en) 2020-12-18

Similar Documents

Publication Publication Date Title
CN112122840B (en) Visual positioning welding system and welding method based on robot welding
CN105354531B (en) A kind of mask method of face key point
Yu et al. Microrobotic cell injection
CN110666798B (en) Robot vision calibration method based on perspective transformation model
CN111775146A (en) Visual alignment method under industrial mechanical arm multi-station operation
CN111089569A (en) Large box body measuring method based on monocular vision
CN111496779B (en) Intelligent microscopic operation system based on machine vision
WO2022095082A1 (en) Micromanipulation platform three-dimensional positioning method for cell injection
WO2021057422A1 (en) Microscope system, smart medical device, automatic focusing method and storage medium
CN113409285A (en) Method and system for monitoring three-dimensional deformation of immersed tunnel joint
CN101073528A (en) Digital operating bed system with double-plane positioning and double-eyes visual tracting
CN114343847B (en) Hand-eye calibration method of surgical robot based on optical positioning system
CN116643393B (en) Microscopic image deflection-based processing method and system
CN103170823A (en) Control device and method of inserting micro-pipe into micro-hole through monocular microscopy visual guidance
US8094191B2 (en) System and method for correcting an image
CN110211183A (en) The multi-target positioning system and method for big visual field LED lens attachment are imaged based on single
Yang et al. Automatic vision-guided micromanipulation for versatile deployment and portable setup
WO2021227189A1 (en) Micromanipulation platform autonomous error correction algorithm based on machine vision
CN111679421A (en) Intelligent microscopic operation system based on 3D imaging technology
CN113403198A (en) Multi-view composite single-cell micro-operation system and control method
CN109491409B (en) Method for automatically extracting mitochondria of target cell in vitro
CN114359393A (en) Cross-platform visual guide dispensing guiding method
CN110490801B (en) High-speed splicing method for ultra-large matrix pictures of metallographic microscope
CN114067646A (en) Visual simulation teaching system of puncture surgical robot
Fan et al. Posture adjustment and robust microinjection of zebrafish larval heart

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20960539

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20960539

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 22.09.2023)
