CN112101575B - Three-dimensional positioning method of micromanipulation platform for cell injection


Info

Publication number
CN112101575B
CN112101575B (application CN202011213646.XA)
Authority
CN
China
Prior art keywords
sample library
needle
positive sample
positive
calculating
Prior art date
Legal status
Active
Application number
CN202011213646.XA
Other languages
Chinese (zh)
Other versions
CN112101575A (en)
Inventor
汝长海
郝淼
陈瑞华
翟荣安
岳春峰
孙钰
Current Assignee
Jiangsu Jicui Micro Nano Automation System And Equipment Technology Research Institute Co ltd
Original Assignee
Jiangsu Jicui Micro Nano Automation System And Equipment Technology Research Institute Co ltd
Priority date
Filing date
Publication date
Application filed by Jiangsu Jicui Micro Nano Automation System And Equipment Technology Research Institute Co ltd filed Critical Jiangsu Jicui Micro Nano Automation System And Equipment Technology Research Institute Co ltd
Priority to CN202011213646.XA priority Critical patent/CN112101575B/en
Priority to PCT/CN2020/127759 priority patent/WO2022095082A1/en
Publication of CN112101575A publication Critical patent/CN112101575A/en
Application granted granted Critical
Publication of CN112101575B publication Critical patent/CN112101575B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 - Machine learning
    • G06N 20/10 - Machine learning using kernel methods, e.g. support vector machines [SVM]
    • C - CHEMISTRY; METALLURGY
    • C12 - BIOCHEMISTRY; BEER; SPIRITS; WINE; VINEGAR; MICROBIOLOGY; ENZYMOLOGY; MUTATION OR GENETIC ENGINEERING
    • C12N - MICROORGANISMS OR ENZYMES; COMPOSITIONS THEREOF; PROPAGATING, PRESERVING, OR MAINTAINING MICROORGANISMS; MUTATION OR GENETIC ENGINEERING; CULTURE MEDIA
    • C12N 15/00 - Mutation or genetic engineering; DNA or RNA concerning genetic engineering, vectors, e.g. plasmids, or their isolation, preparation or purification; Use of hosts therefor
    • C12N 15/09 - Recombinant DNA-technology
    • C12N 15/87 - Introduction of foreign genetic material using processes not otherwise provided for, e.g. co-transformation
    • C12N 15/89 - Introduction of foreign genetic material using processes not otherwise provided for, e.g. co-transformation using microinjection
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 - Geometric image transformation in the plane of the image
    • G06T 3/40 - Scaling the whole image or part thereof
    • G06T 3/4038 - Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • G06T 5/80
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10056 - Microscopic image
    • G06T 2207/10061 - Microscopic image from scanning electron microscope
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20081 - Training; Learning

Abstract

The invention discloses a three-dimensional positioning method of a micromanipulation platform for cell injection, which comprises the following steps: collecting sample pictures and establishing a positive sample library and a negative sample library, the positive sample library comprising an injection needle positive sample library and a holding needle positive sample library, and the negative sample library comprising an injection needle negative sample library and a holding needle negative sample library; establishing a positive sample training set and a negative sample training set from the positive and negative sample libraries respectively, and training to generate a support vector machine model; moving the injection needle and the holding needle respectively along preset routes under a microscope and taking pictures; and processing the shot pictures with the trained support vector machine model and judging whether each picture is a positive sample, the needle alignment of the injection needle and the holding needle being complete once the shot pictures of both needles are positive samples. The method realizes positioning during cell injection with high positioning efficiency and high positioning precision.

Description

Three-dimensional positioning method of micromanipulation platform for cell injection
Technical Field
The invention relates to the technical field of micromanipulation, and in particular to a three-dimensional positioning method of a micromanipulation platform for cell injection.
Background
With the wide application of assisted reproduction technology and biomedicine, micromanipulation in the field of cell engineering has gradually become a leading technology inseparable from the development of human society since the beginning of the 21st century. Micromanipulation uses actuators with micro/nano motion precision, guided by visual, force, and other feedback, to perform specific experimental tasks on controlled objects such as organisms, materials, and chemical molecules. Conventional micromanipulation, including intracytoplasmic sperm injection, nuclear transfer, chimera techniques, embryo transfer, and microdissection, is performed manually by experienced experimenters who typically require at least two to three years of specialized training. Among these techniques, cell microinjection is the most widely used, with applications including intracellular drug injection, nuclear transfer, embryo transfer, and intracytoplasmic sperm injection (ICSI). During microinjection, the experimenter sits at a microscope, observes the workspace by eye, and operates a micromanipulator from personal experience; since the objects are generally micron-sized, the task poses a real challenge to the operator. The process is therefore time-consuming and labor-intensive, depends on the operator's personal experience, makes consistency across operations difficult to guarantee, and to some extent restricts the development of bioengineering technology and the life sciences.
With continuing progress in machining and automation technology, replacing manual work with machines to raise the level of automated operation is a current research trend. Micromanipulation has advanced markedly in recent years, and researchers have developed micromanipulation systems with various mechanical structures, driving means, and control methods, but none has achieved wide applicability; in particular, the three-dimensional localization of cells still presents the following problems.
On one hand, when acquiring the two-dimensional position, the calibration precision of the microscopic vision system is low. Because the objects of micromanipulation are all micron-sized, the motion precision required of the operation platform is very high. At present, this precision depends mainly on the mechanics, but mechanical equipment inevitably carries manufacturing and installation errors that reduce the precision of the micromanipulation and can even directly affect its outcome. Installation errors are currently reduced with the aid of mechanical auxiliary instruments, an error-compensation approach that requires repeated measurement and a large amount of data processing, so the workload is heavy, the efficiency is low, and the detection process itself may introduce additional errors. In the absence of an effective error-compensation method, current microscopic operation relies without exception on purely manual or semi-automatic operation, which clearly cannot meet the high-efficiency, high-quality demands of modern intelligent medical technology.
On the other hand, depth information cannot be acquired accurately when acquiring the depth position. Acquiring depth information is a challenging problem in the field of micromanipulation robotics, and there are essentially two existing approaches. The first is indirect acquisition: depth information of the target cell is obtained with a contact-detection algorithm based on image processing. Typically, the injection needle is moved directly above the target cell and lowered while the image near the needle tip is tracked; through visual feedback, when the needle touches the target cell the cell surface deforms, and that position is calibrated as the zero point of the depth coordinate. However, this method requires the needle tip to contact and deform the cell, and the target-tracking algorithms used for visual feedback (template matching, the SSD optical-flow method, motion history images (MHI), active contour models, and the like) are application-specific and cannot meet the real-time requirements of the operation. The second is direct acquisition: inspired by macroscopic binocular stereo vision, microscopes are arranged in the horizontal and vertical directions, or binocular stereo vision is realized with a stereoscopic microscope, to obtain the depth of the target cell. For biological micromanipulation requiring micron-level accuracy, however, the error of this approach is relatively large, and guaranteeing accuracy remains a particularly prominent problem.
Disclosure of Invention
Aiming at the defects of the prior art, the invention aims to provide a high-efficiency, high-precision three-dimensional positioning method of a micromanipulation platform for cell injection. The technical scheme is as follows:
a method of three-dimensional localization of a micromanipulation platform for cell injection, comprising:
collecting sample pictures and establishing a positive sample library and a negative sample library; the positive sample library comprises an injection needle positive sample library and a holding needle positive sample library, the negative sample library comprises an injection needle negative sample library and a holding needle negative sample library, the needle point in the positive sample library is on the focal plane of the microscope, and the needle point in the negative sample library is not on the focal plane of the microscope;
respectively establishing a positive sample training set and a negative sample training set through the positive sample library and the negative sample library, and training to generate a support vector machine model;
respectively moving the injection needle and the holding needle under a microscope along a preset route, and respectively shooting pictures;
and processing the shot pictures with the trained support vector machine model and judging whether each picture is a positive sample; the needle alignment of the injection needle and the holding needle is complete once the shot pictures of both the injection needle and the holding needle are positive samples.
As a further improvement of the present invention, a positive sample training set and a negative sample training set are respectively established through the positive sample library and the negative sample library, and are trained to generate a support vector machine model, which specifically includes:
calculating the gradients of each sample image in the sample library in the x-axis and y-axis directions:
Gx(x, y) = H(x+1, y) - H(x-1, y)
Gy(x, y) = H(x, y+1) - H(x, y-1)
wherein H(x, y) represents the pixel value of the input sample image at pixel point (x, y), and Gx(x, y) and Gy(x, y) represent the horizontal and vertical gradients, respectively;
dividing the sample image into a number of uniform cells ("Cell" units), and setting the cell size and other parameters to obtain the feature vector of each cell; combining the cells into larger blocks, each "Block" containing several cells, and concatenating the feature vectors of all cells in each Block to obtain the HOG feature;
training the positive sample training set and the negative sample training set with a machine-learning method to obtain the support vector machine training data set:
T = {(x1, y1), (x2, y2), ..., (xN, yN)}
wherein i = 1, 2, ..., N, N being the number of samples in the positive sample library or in the negative sample library; xi ∈ R^n is the ith feature vector; yi ∈ {-1, +1} is the ith class label; when yi = +1 the sample belongs to the positive sample library, and when yi = -1 to the negative sample library;
and generating a support vector machine model according to the training data set of the support vector machine.
As a further improvement of the present invention, the determining whether the picture is a positive sample specifically includes: judging whether the shot images of the injection needle and the holding needle are positive samples by calculating the HOG feature value.
As a further improvement of the present invention, before the acquiring a sample picture and establishing a positive sample library and a negative sample library, the method further includes:
and (5) correcting errors of the micro-operation platform.
As a further improvement of the present invention, the performing of the error correction of the micromanipulation platform specifically includes:
placing the scale on a micro-operation platform, respectively moving the micro-operation platform along a fixed direction in a fixed step manner to obtain scale images, and ensuring that the front image and the rear image are partially overlapped;
splicing the front image and the rear image in the direction;
calculating the systematic error of the micro-operation platform in the direction;
performing autonomous compensation on the system error in the direction, and correcting the system error in the direction;
and respectively moving the micro-operation platform along other directions by fixed stepping to respectively acquire images, calculating the system errors of the micro-operation platform in other directions, and performing autonomous compensation on the system errors in other directions to finish the system error correction in all directions.
As a further improvement of the present invention, the calculating the systematic error of the micromanipulation platform in the direction specifically includes:
calculating a pixel pitch;
calculating the actual displacement distance of the front image and the rear image in the direction according to the pixel distance;
and obtaining the system error of the direction according to the actual displacement distance.
As a further improvement of the present invention, the calculating the pixel pitch specifically includes:
calculating the pixel pitch by adopting a formula S = M/N; wherein S is the pixel pitch, M is the length of the scale, and N is the number of pixels within the length of M.
As a further improvement of the present invention, the calculating the actual displacement distance between the front image and the rear image in the direction according to the pixel pitch specifically includes:
calculating the actual displacement distance using the formula AA1_actual = S * AA1; wherein AA1_actual is the actual displacement distance, and AA1 is the number of pixels of relative displacement between the two images in that direction.
As a further improvement of the present invention, the obtaining of the systematic error in the direction according to the actual displacement distance specifically includes:
obtaining the two components of the systematic error in that direction according to the formulas AA1_actual * cos(q) and AA1_actual * sin(q); wherein q is the deflection angle between the image coordinate system and the coordinate system of the micromanipulation platform.
As a further improvement of the present invention, the autonomously compensating the systematic error of the direction and correcting the systematic error of the direction specifically includes:
and performing compensation calculation through closed-loop feedback of a computer, performing autonomous compensation on the system error in the direction, and correcting the system error in the direction.
The invention has the beneficial effects that:
the three-dimensional positioning method of the micromanipulation platform for cell injection can realize positioning during cell injection, and has the advantages of high positioning efficiency and high positioning precision.
The foregoing is merely an overview of the technical solution of the present invention. So that the technical means of the invention may be more clearly understood and implemented in accordance with the content of the description, and so that the above and other objects, features, and advantages of the invention may be more readily apparent, preferred embodiments are described in detail below with reference to the accompanying drawings.
Drawings
FIG. 1 is a flow chart of a method for three-dimensional positioning of a micromanipulation platform for cell injection in a preferred embodiment of the invention;
FIG. 2 is a schematic needle alignment of a completed injection needle and a holding needle in a preferred embodiment of the present invention;
FIG. 3 is a flow chart of the preferred embodiment of the present invention for performing micromanipulation platform error correction;
FIG. 4 is a schematic view of a scale in a preferred embodiment of the invention;
FIG. 5 is a schematic illustration of two images in front and back in accordance with a preferred embodiment of the present invention;
FIG. 6 is a schematic diagram of the stitching of two images in front and back in accordance with the preferred embodiment of the present invention;
FIG. 7 is a schematic diagram of the stitching of the two images in front and back of the X-axis in the preferred embodiment of the present invention.
Detailed Description
The present invention is further described below in conjunction with the following figures and specific examples so that those skilled in the art may better understand the present invention and practice it, but the examples are not intended to limit the present invention.
As shown in FIG. 1, the method for three-dimensional positioning of a micromanipulation platform for cell injection in a preferred embodiment of the present invention comprises the following steps:
step S10, collecting sample pictures and establishing a positive sample library and a negative sample library; the positive sample library comprises an injection needle positive sample library and a holding needle positive sample library, the negative sample library comprises an injection needle negative sample library and a holding needle negative sample library, the needle point in the positive sample library is on the focal plane of the microscope, and the needle point in the negative sample library is not on the focal plane of the microscope.
And step S20, respectively establishing a positive sample training set and a negative sample training set through the positive sample library and the negative sample library, and training to generate a support vector machine model.
Specifically, Histogram of Oriented Gradients (HOG) features are extracted from the positive sample library and the negative sample library respectively, as follows:
Firstly, the gradients of each sample image in the sample library in the x-axis and y-axis directions are calculated:
Gx(x, y) = H(x+1, y) - H(x-1, y)
Gy(x, y) = H(x, y+1) - H(x, y-1)
wherein H(x, y) represents the pixel value of the input sample image at pixel point (x, y), and Gx(x, y) and Gy(x, y) represent the horizontal and vertical gradients, respectively.
Then, the sample image is divided into a number of uniform cells ("Cell" units), and the cell size and other parameters are set to obtain the feature vector of each cell; the cells are combined into larger blocks, each "Block" containing several cells, and the feature vectors of all cells in each Block are concatenated to obtain the HOG feature. The other parameters include the bin count, the gradient directions, and so on.
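For illustration, the gradient, Cell, and Block pipeline just described corresponds to a standard HOG extractor. The following is a minimal sketch in Python, assuming OpenCV and scikit-image; the 128x128 normalized size, 8x8 cells, 2x2 blocks, and 9 orientation bins are assumed values, not parameters fixed by the method:

```python
import cv2
from skimage.feature import hog

def extract_hog(img):
    """Compute a HOG feature vector for one grayscale sample image."""
    img = cv2.resize(img, (128, 128))   # normalize size so vectors align
    return hog(img,
               orientations=9,          # the bin / gradient-direction parameter
               pixels_per_cell=(8, 8),  # the uniform "Cell" units
               cells_per_block=(2, 2),  # cells combined into one "Block"
               block_norm='L2-Hys')     # block vectors normalized and concatenated
```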
Then, the positive sample training set and the negative sample training set are trained with a machine-learning method to obtain the support vector machine training data set:
T = {(x1, y1), (x2, y2), ..., (xN, yN)}
wherein i = 1, 2, ..., N, N being the number of samples in the positive sample library or in the negative sample library; xi ∈ R^n is the ith feature vector; yi ∈ {-1, +1} is the ith class label; when yi = +1 the sample belongs to the positive sample library, and when yi = -1 to the negative sample library;
and generating a support vector machine model according to the training data set of the support vector machine.
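A minimal training sketch following the data set T above, assuming scikit-learn's LinearSVC as the support vector machine implementation; the directory layout and PNG file format are hypothetical, and extract_hog is the helper from the previous sketch:

```python
import glob
import cv2
import numpy as np
from sklearn.svm import LinearSVC

def train_needle_svm(pos_dir, neg_dir):
    """Fit an SVM on HOG features; labels follow y_i in {-1, +1}."""
    X, y = [], []
    for label, folder in ((+1, pos_dir), (-1, neg_dir)):
        for path in glob.glob(folder + '/*.png'):
            img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            X.append(extract_hog(img))   # feature vector x_i
            y.append(label)              # +1: tip on focal plane; -1: off it
    model = LinearSVC()
    model.fit(np.array(X), np.array(y))
    return model
```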
And step S30, moving the injection needle and the holding needle respectively under the microscope along a preset route, and taking pictures. Specifically, the injection needle and the holding needle are each moved in fixed steps in the negative Z direction from above the focal plane, or in the positive Z direction from below it, with one picture taken at each step.
And step S40, processing the shot pictures with the trained support vector machine model and judging whether each picture is a positive sample; the needle alignment of the injection needle and the holding needle is complete once the shot pictures of both needles are positive samples.
Whether the shot images of the injection needle and the holding needle are positive samples is judged by calculating the HOG feature value.
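Steps S30 and S40 together amount to a focus-search loop. A sketch under assumed hardware handles (the stage and camera objects and the 5 um step are hypothetical; extract_hog and the SVM come from the sketches above):

```python
def focus_needle(stage, camera, model, step_um=5.0, max_steps=200):
    """Step a needle along Z until the SVM labels the frame a positive sample."""
    for _ in range(max_steps):
        frame = camera.grab()              # grayscale microscope image
        feat = extract_hog(frame)          # same HOG pipeline as in training
        if model.predict([feat])[0] == +1:
            return True                    # needle tip is on the focal plane
        stage.move_z(-step_um)             # descend one fixed step
    return False                           # focal plane not reached
```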
In one embodiment, the present invention further comprises step A: performing error correction of the micromanipulation platform, so as to realize high-precision two-dimensional positioning in the XY plane.
Specifically, the step a specifically includes:
and step A1, placing the ruler on the micromanipulation platform, respectively moving the micromanipulation platform along a fixed direction by fixed stepping, acquiring the image of the ruler, and ensuring that the front image and the rear image are partially overlapped.
In this embodiment, the scale is a two-dimensional planar scale, as shown in FIG. 4. The two acquired images, i.e., the preceding frame and the following frame, are shown in FIG. 5.
And step A2, splicing the front image and the rear image in the direction. The stitched image is shown in FIG. 6.
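The relative pixel displacement between the two overlapping frames (the AA1 count used in step A3 below) can be estimated, for example, by phase correlation; this choice of registration algorithm is an assumption, since the method does not name the one used for stitching:

```python
import cv2
import numpy as np

def find_pixel_shift(prev_img, next_img):
    """Sub-pixel (dx, dy) shift between two overlapping grayscale images."""
    a = np.float32(prev_img)
    b = np.float32(next_img)
    (dx, dy), _ = cv2.phaseCorrelate(a, b)
    return dx, dy   # along the motion direction, this shift gives AA1
```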
And A3, calculating the systematic error of the micromanipulation platform in the direction. The method specifically comprises the following steps:
and step A31, calculating the pixel pitch. The method specifically comprises the following steps:
calculating the pixel pitch by adopting a formula S = M/N; wherein S is the pixel pitch, M is the length of the scale, and N is the number of pixels within the length of M.
Step A32, calculating the actual displacement distance of the front image and the rear image in the direction according to the pixel distance; the method specifically comprises the following steps:
Using the formula AA1_actual = S * AA1, the actual displacement distance is calculated; wherein AA1_actual is the actual displacement distance, and AA1 is the number of pixels of relative displacement between the two images in that direction.
And step A33, obtaining the system error of the direction according to the actual displacement distance. The method specifically comprises the following steps:
According to the formulas AA1_actual * cos(q) and AA1_actual * sin(q), the two components of the systematic error in that direction are obtained; wherein q is the deflection angle between the image coordinate system and the coordinate system of the micromanipulation platform.
As shown in FIG. 7, when the micromanipulation platform moves in the positive X direction, the systematic error in that direction is +XΔ, which can be decomposed into the two components +XΔx and +XΔy satisfying the following formulas:
+XΔx = AA1_actual * cos(q);
+XΔy = AA1_actual * sin(q);
+XΔx and +XΔy are the compensation values to be applied when the X axis moves in the positive direction. The compensation values for motion in the other directions are obtained in the same way.
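A worked example of steps A31 to A33 (all numbers below are assumed for illustration only):

```python
import math

M = 1000.0                  # scale length, micrometers (assumed)
N = 2000                    # pixels spanned by the scale (assumed)
S = M / N                   # pixel pitch S = M/N = 0.5 um per pixel

AA1 = 4100                  # relative shift between frames, pixels (assumed)
AA1_actual = S * AA1        # actual displacement = 2050 um

q = math.radians(0.3)       # deflection angle between the coordinate systems
x_comp = AA1_actual * math.cos(q)   # +XΔx compensation component
y_comp = AA1_actual * math.sin(q)   # +XΔy compensation component
```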
And A4, automatically compensating the system error of the direction, and correcting the system error of the direction. The method specifically comprises the following steps:
and performing compensation calculation through closed-loop feedback of a computer, performing autonomous compensation on the system error in the direction, and correcting the system error in the direction.
In this embodiment, the error correction of the micromanipulation platform further includes:
and step A5, respectively moving the micro-operation platform along other directions by fixed stepping and respectively acquiring images, calculating the system errors of the micro-operation platform in other directions, and performing autonomous compensation on the system errors in other directions to finish the system error correction in all directions. Wherein, all directions comprise X-axis positive direction, X-axis negative direction, Y-axis positive direction and Y-axis negative direction.
In this embodiment, the error correction of the micromanipulation platform further includes shooting several groups of images and repeating the calculation to obtain the average systematic error of the platform in each direction; this improves the accuracy of the systematic-error calculation and, ultimately, of the error correction.
The two-dimensional position acquisition in the proposed method abandons the traditional approach, in which every factor that may introduce error (such as the translational, rotary, and rolling motion parts) is measured and calibrated separately by hand and compensated manually. Based on image stitching, it integrates and unifies the errors that currently affect microinjection precision, including mechanical error, CCD installation error, and pixel/micrometer conversion error; it requires no manual assistance, realizes autonomous compensation and correction, and can control the error at the pixel level.
In the proposed three-dimensional positioning method, the autonomous systematic-error compensation algorithm used for two-dimensional position acquisition is applicable not only to the micromanipulation system but also to error correction of other motion platforms, and has the characteristics of simple operation, high efficiency, and high accuracy.
For depth-position acquisition, the invention provides a novel focal-plane needle-alignment method that determines the depth position based on HOG features combined with machine learning; it is fast and accurate, and thereby solves the inaccurate depth-position acquisition of current positioning methods.
The above embodiments are merely preferred embodiments described to fully illustrate the present invention, and the scope of the present invention is not limited thereto. Equivalent substitutions or changes made by those skilled in the art on the basis of the invention all fall within its protection scope. The protection scope of the invention is defined by the claims.

Claims (10)

1. A method for three-dimensional positioning of a micromanipulation platform for cell injection, comprising:
collecting sample pictures and establishing a positive sample library and a negative sample library; the positive sample library comprises an injection needle positive sample library and a holding needle positive sample library, the negative sample library comprises an injection needle negative sample library and a holding needle negative sample library, the needle point in the positive sample library is on the focal plane of the microscope, and the needle point in the negative sample library is not on the focal plane of the microscope;
respectively establishing a positive sample training set and a negative sample training set through the positive sample library and the negative sample library, and training to generate a support vector machine model;
respectively moving the injection needle and the holding needle under a microscope along a preset route, and respectively shooting pictures;
and processing the shot pictures with the trained support vector machine model and judging whether each picture is a positive sample; the needle alignment of the injection needle and the holding needle is complete once the shot pictures of both the injection needle and the holding needle are positive samples.
2. The three-dimensional positioning method for the micromanipulation platform for cell injection according to claim 1, wherein the establishing of the positive sample training set and the negative sample training set by the positive sample library and the negative sample library respectively, and the training to generate the support vector machine model specifically comprise:
firstly, calculating the gradients Gx(x, y) and Gy(x, y) of each sample image in the sample library in the x-axis and y-axis directions:
Gx(x, y) = H(x+1, y) - H(x-1, y)
Gy(x, y) = H(x, y+1) - H(x, y-1)
wherein H(x, y) denotes the pixel value of the input sample image at pixel point (x, y), and Gx(x, y) and Gy(x, y) denote the horizontal and vertical gradients, respectively;
then, dividing the sample image into a number of uniform cells ("Cell" units), setting the cell size, bins, and gradient directions to obtain the feature vector of each cell; combining the cells into larger blocks, each "Block" containing several cells, and concatenating the feature vectors of all cells in each Block to obtain the HOG feature;
then, by combining a machine learning method, training a positive sample training set and a negative sample training set to obtain a support vector machine training data set as follows:
T = {(x1, y1), (x2, y2), ..., (xN, yN)}
wherein i = 1, 2, ..., N, N being the number of samples in the positive sample library or in the negative sample library; xi ∈ R^n is the ith feature vector; yi ∈ {-1, +1} is the ith class label; when yi = +1 the sample belongs to the positive sample library, and when yi = -1 to the negative sample library;
and generating a support vector machine model according to the training data set of the support vector machine.
3. The method of claim 2, wherein the determining whether the picture is a positive sample comprises: judging whether the shot images of the injection needle and the holding needle are positive samples by calculating the HOG feature value.
4. The method of claim 1, further comprising, prior to the step of taking the sample picture and creating the positive sample library and the negative sample library:
and (5) correcting errors of the micro-operation platform.
5. The method for three-dimensional positioning of a micromanipulation platform for cell injection according to claim 4, wherein the performing of micromanipulation platform error correction specifically comprises:
placing the scale on a micro-operation platform, respectively moving the micro-operation platform along a fixed direction in a fixed step manner to obtain scale images, and ensuring that the front image and the rear image are partially overlapped;
splicing the front image and the rear image in the direction;
calculating the systematic error of the micro-operation platform in the direction;
performing autonomous compensation on the system error in the direction, and correcting the system error in the direction;
and respectively moving the micro-operation platform along other directions by fixed stepping to respectively acquire images, calculating the system errors of the micro-operation platform in other directions, and performing autonomous compensation on the system errors in other directions to finish the system error correction in all directions.
6. The method of claim 5, wherein the calculating the systematic error of the micromanipulation platform in the direction comprises:
calculating a pixel pitch;
calculating the actual displacement distance of the front image and the rear image in the direction according to the pixel distance;
and obtaining the system error of the direction according to the actual displacement distance.
7. The method of claim 6, wherein the calculating the pixel pitch comprises:
calculating the pixel pitch by adopting the formula S = M/N; wherein S is the pixel pitch, M is the length of the scale, and N is the number of pixels within the length M.
8. The three-dimensional positioning method of the micromanipulation platform for cell injection according to claim 7, wherein the calculating the actual displacement distance of the two images in front and back in the direction according to the pixel pitch specifically comprises:
using the formula AA1_actual = S * AA1 to calculate the actual displacement distance; wherein AA1_actual is the actual displacement distance, and AA1 is the number of pixels of relative displacement between the two images in that direction.
9. The method according to claim 8, wherein the obtaining the systematic error of the direction according to the actual displacement distance comprises:
according to the formulas AA1_actual * cos(θ) and AA1_actual * sin(θ), obtaining two components of the systematic error in that direction; wherein θ is the deflection angle between the image coordinate system and the coordinate system of the micromanipulation platform.
10. The method of claim 5, wherein the self-compensating for the systematic error of the orientation and correcting the systematic error of the orientation comprises:
and performing compensation calculation through closed-loop feedback of a computer, performing autonomous compensation on the system error in the direction, and correcting the system error in the direction.
CN202011213646.XA 2020-11-04 2020-11-04 Three-dimensional positioning method of micromanipulation platform for cell injection Active CN112101575B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011213646.XA CN112101575B (en) 2020-11-04 2020-11-04 Three-dimensional positioning method of micromanipulation platform for cell injection
PCT/CN2020/127759 WO2022095082A1 (en) 2020-11-04 2020-11-10 Micromanipulation platform three-dimensional positioning method for cell injection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011213646.XA CN112101575B (en) 2020-11-04 2020-11-04 Three-dimensional positioning method of micromanipulation platform for cell injection

Publications (2)

Publication Number Publication Date
CN112101575A CN112101575A (en) 2020-12-18
CN112101575B (en) 2021-04-30

Family

ID=73784516

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011213646.XA Active CN112101575B (en) 2020-11-04 2020-11-04 Three-dimensional positioning method of micromanipulation platform for cell injection

Country Status (2)

Country Link
CN (1) CN112101575B (en)
WO (1) WO2022095082A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114034225A (en) * 2021-11-25 2022-02-11 广州市华粤行医疗科技有限公司 Method for testing movement precision of injection needle under microscope

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2710010Y (en) * 2004-03-26 2005-07-13 张志宏 Microscopic operation system of biocell computer
CN103255049A (en) * 2013-05-20 2013-08-21 苏州大学 Composite piezoelectric injection system and injection method
CN111652848A (en) * 2020-05-07 2020-09-11 南开大学 Robotized adherent cell three-dimensional positioning method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101951968B1 (en) * 2011-03-03 2019-02-25 더 리전트 오브 더 유니버시티 오브 캘리포니아 Nanopipette apparatus for manipulating cells
CN106204642B (en) * 2016-06-29 2019-07-09 四川大学 A kind of cell tracker method based on deep neural network
CN110796661B (en) * 2018-08-01 2022-05-31 华中科技大学 Fungal microscopic image segmentation detection method and system based on convolutional neural network
CN110841139A (en) * 2019-12-10 2020-02-28 深圳市中科微光医疗器械技术有限公司 Remaining needle capable of realizing needle tip positioning in image environment

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2710010Y (en) * 2004-03-26 2005-07-13 张志宏 Microscopic operation system of biocell computer
CN103255049A (en) * 2013-05-20 2013-08-21 苏州大学 Composite piezoelectric injection system and injection method
CN111652848A (en) * 2020-05-07 2020-09-11 南开大学 Robotized adherent cell three-dimensional positioning method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A Cell Polar Body Positioning Method based on SVM Classification; Di Chen et al.; Proceedings of the 2014 IEEE International Conference on Robotics and Biomimetics; 2014-12-10; pp. 505-509 *

Also Published As

Publication number Publication date
CN112101575A (en) 2020-12-18
WO2022095082A1 (en) 2022-05-12

Similar Documents

Publication Publication Date Title
CN112122840B (en) Visual positioning welding system and welding method based on robot welding
CN111775146B (en) Visual alignment method under industrial mechanical arm multi-station operation
Zhang et al. Robotic immobilization of motile sperm for clinical intracytoplasmic sperm injection
CN110276806B (en) Online hand-eye calibration and grabbing pose calculation method for four-degree-of-freedom parallel robot stereoscopic vision hand-eye system
Yu et al. Microrobotic cell injection
CN101033972A (en) Method for obtaining three-dimensional information of space non-cooperative object
CN110666798A (en) Robot vision calibration method based on perspective transformation model
CN105444699B (en) A kind of method that micromanipulation system coordinate is detected and compensated with displacement error
CN105323455B (en) A kind of location compensation method based on machine vision
CN112101575B (en) Three-dimensional positioning method of micromanipulation platform for cell injection
CN112949478A (en) Target detection method based on holder camera
CN116643393B (en) Microscopic image deflection-based processing method and system
WO2009111877A1 (en) Method and apparatus for microscopy
CN110211183A (en) The multi-target positioning system and method for big visual field LED lens attachment are imaged based on single
CN112818990A (en) Target detection frame generation method, image data automatic labeling method and system
Yang et al. Towards automatic robot-assisted microscopy: An uncalibrated approach for robotic vision-guided micromanipulation
Roy et al. A semi-automated positioning system for contact-mode atomic force microscopy (AFM)
Mattos et al. A fast and precise micropipette positioning system based on continuous camera-robot recalibration and visual servoing
CN114092552A (en) Method for carrying out butt joint on large rigid body member based on fixed end image
CN110197508B (en) 2D and 3D co-fusion vision guiding movement method and device
WO2021227189A1 (en) Micromanipulation platform autonomous error correction algorithm based on machine vision
Karimirad et al. Vision-based robot-assisted biological cell micromanipulation
Fujishiro et al. Microinjection system to enable real-time 3d image presentation through focal position adjustment
Ghanbari et al. Cell image recognition and visual servo control for automated cell injection
Liu et al. Automated mouse embryo injection moves toward practical use

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant