WO2023245708A1 - Electrode implantation method and system based on machine vision - Google Patents
- Publication number
- WO2023245708A1 (PCT/CN2022/102359)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- implantation
- electrode
- implant
- image
- Prior art date
Images
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/34—Trocars; Puncturing needles
- A61B17/3468—Trocars; Puncturing needles for implanting or removing devices, e.g. prostheses, implants, seeds, wires
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/34—Trocars; Puncturing needles
- A61B17/3403—Needle locating or guiding means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00022—Sensing or detecting at the treatment site
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/34—Trocars; Puncturing needles
- A61B17/3403—Needle locating or guiding means
- A61B2017/3405—Needle locating or guiding means using mechanical guide means
- A61B2017/3409—Needle locating or guiding means using mechanical guide means including needle or instrument drives
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/371—Surgical systems with images on a monitor during operation with simultaneous use of two cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30101—Blood vessel; Artery; Vein; Vascular
Definitions
- the present disclosure relates to the field of life science technology, and specifically relates to an electrode implantation method and system based on machine vision.
- In the field of neurosurgical robotics, flexible electrodes are implanted onto the surface of the brain. During the implantation process, the implant device first passes through the electrode and is then implanted onto the brain surface together with the electrode.
- The surgical area on the brain surface is usually a millimeter-scale small window. A robotic arm or an external stepper motor first moves above the small window, implantable sites that avoid blood vessels are analyzed, and the implantation tool is controlled to perform the implantation. In order to reduce or avoid bleeding when implanting electrodes, the implantable area needs to be identified automatically.
- It is also required to clearly define the correspondence between the implanted brain area and the electrode channels, and the implantation positions need to be numbered.
- Because the implant tool is not necessarily completely vertical and the brain surface has undulations, the implant angle and position of the tool relative to the surface must be controlled accurately and strictly, and the spatial position of the implant tool must be determined precisely. Therefore, it is also necessary to design a corresponding three-dimensional microscopic imaging system to monitor the position of the implant tool in real time and predict its landing point on the brain surface.
- This application proposes an electrode implantation method and system based on machine vision.
- A machine vision-based electrode implantation method, including: capturing a first image of the brain surface with a first camera, and capturing a second image of the brain surface with a second camera; performing computational processing on the first image and the second image, wherein a brain surface blood vessel area mask is obtained based on a blood vessel segmentation algorithm to determine the implantable area in the brain surface image; selecting at least one implantation position in the implantable area, and calculating the distance between the at least one implantation position and the known electrode position, thereby determining the implantation sequence of the electrodes; matching the imaging of the first camera and the second camera to obtain a transformation matrix, projecting the first straight line on which the implantation position lies in the imaging of the first camera into the imaging of the second camera, and determining the intersection of that first straight line with the second straight line on which the implantation position lies in the imaging of the second camera as the predicted landing point of the implant device; and controlling the implant device accordingly.
- A machine vision-based electrode implantation system, including: a first camera configured to capture a first image of the brain surface; a second camera configured to capture a second image of the brain surface; and a blood vessel segmentation calculation unit configured to perform computational processing on the first image and the second image, wherein a brain surface blood vessel area mask is obtained based on the blood vessel segmentation algorithm to determine the implantable areas in the brain surface image.
- An implantation sequence determination unit is configured to select at least one implantation position in the implantable region and calculate the distance between the at least one implantation position and the known electrode position, thereby determining the implantation sequence of the electrodes.
- The implant landing point prediction unit is configured to match the imaging of the first camera and the second camera to obtain a transformation matrix, and to project the first straight line on which the implantation position lies in the imaging of the first camera into the imaging of the second camera.
- The implant device control unit is configured to control the implant device in real time according to the predicted landing point, until the implantation point coincides with the predicted landing point.
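Within each image plane, the landing-point prediction described above reduces to intersecting two 2D lines: the line on which the implantation position lies in the second camera's imaging, and the projection of the first camera's line into that same imaging (via the transformation matrix, assumed already applied here). A minimal sketch of the intersection step; the function name and point/direction parameterization are illustrative, not from the patent:

```python
import numpy as np

def intersect_lines(p1, d1, p2, d2):
    """Intersection of two 2D lines, each given by a point and a
    direction vector: solves p1 + t*d1 == p2 + s*d2 for (t, s)."""
    p1, d1 = np.asarray(p1, float), np.asarray(d1, float)
    p2, d2 = np.asarray(p2, float), np.asarray(d2, float)
    A = np.column_stack([d1, -d2])
    t, _ = np.linalg.solve(A, p2 - p1)  # raises LinAlgError if parallel
    return p1 + t * d1
```

For example, the line through (0, 0) with direction (1, 1) and the line through (0, 2) with direction (1, -1) meet at (1, 1); in practice the returned point would be taken as the predicted landing point in the second camera's pixel coordinates.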
- An advantage of embodiments according to the present disclosure is that they are suitable for a variety of imaging modes; the algorithm used has relatively good universality, achieves better boundary segmentation, and yields stable imaging recognition results.
- Another advantage of embodiments according to the present disclosure is that an automated blood vessel segmentation algorithm can be provided, algorithm parameters can be easily adjusted, and the calculation amount of the segmentation algorithm can be reduced.
- FIG. 1 is a schematic diagram illustrating a machine vision-based electrode implantation system according to an embodiment of the present disclosure.
- FIG. 2 is a configuration diagram illustrating a machine vision-based electrode implantation system according to an embodiment of the present disclosure.
- FIG. 3 is a flowchart illustrating a blood vessel segmentation algorithm according to an embodiment of the present disclosure.
- FIG. 4 is a schematic diagram illustrating the steps of a blood vessel segmentation algorithm according to an embodiment of the present disclosure.
- FIG. 5 is an effect diagram illustrating a blood vessel segmentation algorithm according to an embodiment of the present disclosure.
- Figure 6 is a schematic diagram illustrating implantable location selection and path planning of brain surface electrodes according to an embodiment of the present disclosure.
- FIG. 7 is a schematic diagram illustrating implant tool landing point prediction according to an embodiment of the present disclosure.
- Figure 8 is a flowchart illustrating an implant tool control algorithm in accordance with an embodiment of the present disclosure.
- FIG. 9 is a schematic diagram illustrating the steps of a machine vision-based electrode implantation method according to an embodiment of the present disclosure.
- Identification of blood vessels on the brain surface mainly involves regional segmentation of brain surface imaging, that is, distinguishing vascular from non-vascular areas.
- Methods to achieve regional segmentation include threshold segmentation, edge detection, and segmentation methods based on mathematical morphology, among others.
- Threshold segmentation is the most common parallel segmentation method that directly detects regions, and also the simplest. This method generally makes certain assumptions about the image: the target and background occupy different grayscale ranges, the grayscale difference between adjacent pixels inside the target or inside the background is small, while pixels on the two sides of the target-background interface differ greatly in gray value. If an appropriate grayscale threshold T is selected and the gray value of each pixel is compared with T, the pixels can be divided into two categories based on the comparison results: pixels whose gray value is greater than the threshold fall into one category, and the rest into the other.
- the threshold can also be determined using grayscale histogram features and statistical judgment methods.
- In this case the grayscale histogram of the image shows two peaks and one valley; the two peaks correspond to the central gray level of the target and that of the background, respectively.
- Boundary points lie around the target, with gray levels between the target gray level and the background gray level, so the gray level of the boundary corresponds to the valley between the two peaks.
- The gray level of the valley point is used as the segmentation threshold. Because histograms are uneven, the valley value is difficult to determine directly and a dedicated search method is needed. There are currently many methods for determining the optimal threshold (the trough), such as estimating Gaussian model parameters or finding extrema by fitting the histogram curve.
- Threshold-based segmentation has the advantages of simple calculation and high computational efficiency. However, this type of method does not consider spatial characteristics and is sensitive to noise and grayscale diversity; it is difficult to obtain an accurate segmentation threshold for images where the grayscale difference between target and background is not obvious. In practical applications it is usually combined with other image segmentation methods to achieve satisfactory results.
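One widely used way to automate the trough search mentioned above is Otsu's method, which picks the threshold maximizing the between-class variance of the two histogram modes. A minimal numpy sketch, shown only as an illustration (the patent does not prescribe this particular estimator):

```python
import numpy as np

def otsu_threshold(gray):
    """Return the gray level that best separates a bimodal 8-bit image
    into two classes by maximizing the between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()   # class weights
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0        # class means
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

For an image whose two histogram peaks sit at gray levels 50 and 200, the returned threshold falls in the valley between the peaks, as the histogram analysis above predicts.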
- Existing technology proposes a statistical model based on a physical model of blood flow for extracting blood vessels from time-of-flight magnetic resonance angiography images. To improve vessel segmentation, the velocity and phase information of PCA are fused, and an adaptive local threshold method and a single global threshold method are used to segment two different statistical models, so that even aneurysms with very low nearby signal achieve a good segmentation effect.
- Existing technology also combines local and global thresholding for three-dimensional reconstruction of cerebral blood vessel images: local thresholding enhances the contrast of small blood vessels, while global thresholding extracts target vessels from the background.
- Edge detection is a parallel boundary segmentation technique based on grayscale discontinuity and is the first step of all boundary segmentation methods. Because the edge is the dividing line between target and background, the two can only be distinguished once the edge has been extracted. Edge detection generally exploits differences between target and background in certain characteristics, such as grayscale, color, or texture. Edges are generally detected with first-order or second-order derivatives, but in actual digital images, difference operations are used to approximate the derivative operations. The gray values of points on the two sides of an edge change abruptly, so these points have large difference values; the difference value is largest when the differencing direction is perpendicular to the boundary. Differentiation is thus a directional operation that measures the change of gray level along the differencing direction.
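The difference approximation of the derivative described above can be sketched with forward differences (illustrative only; production edge detectors typically use smoothed operators such as Sobel):

```python
import numpy as np

def gradient_magnitude(gray):
    """Approximate the image gradient with forward differences, the
    discrete stand-in for the first derivative used in edge detection."""
    g = np.asarray(gray, float)
    gx = np.zeros_like(g)
    gy = np.zeros_like(g)
    gx[:, :-1] = g[:, 1:] - g[:, :-1]   # horizontal difference
    gy[:-1, :] = g[1:, :] - g[:-1, :]   # vertical difference
    return np.hypot(gx, gy)            # large only where gray changes abruptly
```

On a step image (dark left half, bright right half), the magnitude is large exactly along the vertical dividing line and zero inside the flat regions, matching the behavior described above.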
- the basic principle of the segmentation method based on mathematical morphology is to use structural elements with certain shapes to perform basic operations on the image to achieve the purpose of image analysis and recognition.
- the basic operations of mathematical morphology include dilation and erosion, as well as the opening and closing operations formed by their combination.
- The opening operation erodes first and then dilates, while the closing operation dilates first and then erodes.
- The various operations each have their own characteristics when processing images: dilation expands the image while erosion shrinks it.
- Both the opening operation and the closing operation can smooth the outline of the image, but the two operations have opposite effects.
- The opening operation can break narrow connections and eliminate thin protrusions; the closing operation can eliminate small holes in the image, fill small breaks in contour lines, and merge narrow gaps and long, thin gulfs.
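The four morphological operations above can be sketched in numpy alone, using a 3×3 square structuring element and the duality between erosion and dilation (a minimal illustration, not the patent's implementation):

```python
import numpy as np

def dilate(mask, iterations=1):
    """Binary dilation with a 3x3 square structuring element,
    implemented as an OR over the nine shifted copies of the mask."""
    m = np.asarray(mask, bool)
    H, W = m.shape
    for _ in range(iterations):
        p = np.pad(m, 1)                    # pad with False
        out = np.zeros((H, W), bool)
        for dy in (0, 1, 2):
            for dx in (0, 1, 2):
                out |= p[dy:dy + H, dx:dx + W]
        m = out
    return m

def erode(mask, iterations=1):
    """Erosion via duality: erode(A) == complement(dilate(complement(A)))."""
    return ~dilate(~np.asarray(mask, bool), iterations)

def opening(mask):
    """Erode then dilate: removes thin protrusions and isolated specks."""
    return dilate(erode(mask))

def closing(mask):
    """Dilate then erode: fills small holes and narrow breaks."""
    return erode(dilate(mask))
```

Opening an image containing a single isolated pixel removes it entirely, while closing a solid block with a one-pixel hole fills the hole, exactly the behaviors described above.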
- neural networks are also used in image area segmentation in practice.
- On one hand, the neural network can learn; on the other hand, the nonlinearity of the network can be exploited for boundary segmentation during training.
- The disadvantage is that whenever new features are added to the network system, learning and training must be carried out again, and the debugging process is complicated.
- An algorithm widely used in the learning process is the back-propagation algorithm. Since the training data set drives learning, the size of the training data determines the learning process.
- the positioning of implanted tools mainly includes hand-eye calibration in robot vision applications.
- the goal is to obtain the relationship between the robot coordinate system and the camera coordinate system, and finally transfer the visual recognition results to the robot coordinate system.
- In industry, common hand-eye calibration methods are mainly divided into two kinds: the nine-point calibration method and the calibration plate calibration method.
- Nine-point calibration directly establishes the coordinate transformation between the camera and the manipulator. The indicator needle at the end of the manipulator is made to touch nine points to obtain their coordinates in the robot coordinate system; at the same time, the camera identifies the same nine points in the initial image to obtain their pixel coordinates. This yields nine pairs of corresponding coordinates, from which the affine transformation matrix between the image and the robot coordinates can be solved.
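The nine-point solution amounts to a least-squares fit of a 2×3 affine matrix from the nine pixel/robot coordinate pairs. A sketch under that assumption (function names are illustrative):

```python
import numpy as np

def fit_affine(pix, robot):
    """Solve for the 2x3 affine matrix mapping pixel coordinates to
    robot coordinates from N >= 3 correspondences (nine in the
    nine-point method), in the least-squares sense."""
    pix = np.asarray(pix, float)
    robot = np.asarray(robot, float)
    X = np.hstack([pix, np.ones((len(pix), 1))])   # homogeneous pixel coords
    A, *_ = np.linalg.lstsq(X, robot, rcond=None)
    return A.T                                     # shape (2, 3)

def apply_affine(A, pts):
    """Map pixel points through the fitted affine matrix."""
    pts = np.asarray(pts, float)
    return pts @ A[:, :2].T + A[:, 2]
```

With noise-free correspondences the fit recovers the exact transform; with measurement noise, the extra equations beyond the minimum of three points average the error out, which is one reason nine points are used rather than three.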
- the calibration plate calibration method uses a checkerboard calibration plate or a circular grid calibration plate to obtain the internal and external parameters of the camera, and then the coordinate transformation relationship between the image and the manipulator can be obtained.
- binocular systems are divided into two categories based on the positional relationship of the optical axes, namely, parallel binocular systems where the optical axes are basically parallel and convergent binocular systems where the optical axes intersect.
- Because the optical axes of a parallel binocular system are parallel, the overlap of the fields of view is small, so such systems are rarely used when the field of view is small and are generally used when the working distance is much larger than the distance between the lenses. For observing millimeter-scale objects, a convergent binocular system is more suitable.
- In order to accurately predict the landing point of the implant tool on the brain surface during hand-eye calibration, the imaging system must use at least two cameras, i.e., a binocular system; moreover, owing to the imaging characteristics of the different binocular systems, a convergent binocular system is required. To obtain clear imaging, a lens with a magnification of ×1 or ×2 is needed. For the same camera resolution, the field of view is then smaller and the depth of field of the lens is very limited, generally around 1 mm. The camera positions must therefore be adjusted to obtain clear imaging, which makes the two cameras relatively unstable with respect to each other. However, binocular camera calibration only holds when the camera positions are relatively fixed, which means the traditional binocular calibration method cannot be directly applied to the brain surface electrode implantation system disclosed in this application.
- the inventor of the present application proposes an improved electrode implantation method and system based on machine vision.
- it relates to a brain surface electrode implantation method and system based on blood vessel segmentation processing of machine imaging.
- The technical solution of the present disclosure mainly includes automatically detecting and avoiding blood vessels during the electrode implantation process, and automatically detecting multiple candidate implantation sites based on the size, number, and shape of the implantation tool and the implanted electrodes.
- Image registration and fusion technology is used to project brain regions on the brain surface, making it easier to locate the brain areas where electrodes are implanted during surgery.
- After a point is selected, the implantation system automatically moves to the selected position, controls the precise movement of the electrode implant device in three-dimensional space based on the selected position and the electrode implantation angle, and monitors the precise distance between the implant device and the brain surface in real time.
- the depth of electrode implantation is predicted based on the distance from the brain surface.
- technologies such as target tracking are used to achieve precise movement and implantation of electrodes to the selected implant site.
- FIG. 1 and FIG. 2 respectively show a schematic diagram and a configuration diagram of a machine vision-based electrode implantation system according to an embodiment of the present disclosure.
- the hardware structure of the brain surface electrode implantation system of the present application includes an optical system and a motion control system.
- the optical system is mainly associated with two cameras 101 and 102 (hereinafter also referred to as the "first camera” and the "second camera”).
- CMOS industrial cameras can be used, fitted with magnifying telecentric lenses 103 and 104, each capable of imaging the surgical area 106.
- the cameras 101 and 102 have the same imaging plane, are at a certain angle to each other on a projection plane, are fixed on a rigid base plate, and are equipped with a coaxial light source (not shown) to shorten the exposure time and increase the frame rate.
- the light source can be an external point light source, which is mainly used to make the surgical area 106 receive uniform light and avoid blurring or overexposure of the implanted device 105 and the surgical area 106, thus facilitating subsequent image and data processing.
- The light source can be white light or light of another given wavelength, for example green light (wavelength of 495 nm to 570 nm).
- Figure 1 illustrates a non-limiting embodiment of the system disclosed herein, in which camera 101 and camera 102 are angled approximately 90° from each other in horizontal projection.
- The motion control system is mainly composed of three stepper motors and is used to control the movement of the implant device 105 in three directions of a given spatial coordinate system (the Δx, Δy and Δz directions shown in Figure 1).
- Cameras 101 and 102 are respectively coupled to the motion control system.
- The three motors include one stepper motor that controls Δz-direction motion (not shown) and two micro stepper motors, for example with a stroke of 5 mm, that respectively control the movement of the implant device 105 in the Δx and Δy directions, i.e., movement substantially parallel to the surgical area 106.
- the motion control system may also include a robotic arm with similar motion control functions.
- the camera 101/102 is set above the robotic arm.
- Since the movement accuracy of the robotic arm (e.g., ±30 μm) cannot meet the accuracy required by the system of this application (e.g., ±10 μm), the robotic arm is used to roughly locate the implantation position, while fine adjustment of the electrode position is still completed by the two micro stepper motors.
- the implantation device 105 is configured to implant the flexible electrode into a designated position in the surgical area 106, and includes an implantation needle, an implantation feeding mechanism, and an implantation actuator.
- the implanted needle structure is used to connect the needle part to the free end of the electrode so as to drive the electrode to move.
- the implant feeding mechanism is used to move the implant needle along the longitudinal direction of the implant device.
- the implant actuator mechanism is used to drive the implant needle to insert the needle portion of the implant needle into the surgical area 106 .
- the implantation device 105 may also be equipped with an implantation movement mechanism to enable the implantation device 105 to implant electrodes from different angles and in different directions.
- FIG. 2 illustrates a non-limiting embodiment of a brain surface electrode implantation system.
- the brain surface electrode implantation system 20 adopts a binocular system, which mainly includes a first camera 201, a second camera 202, a blood vessel segmentation calculation unit 203, an implantation sequence determination unit 204, an implantation point prediction unit 205 and Implanted device control unit 206.
- The first camera 201 and the second camera 202 correspond respectively to the cameras 101 and 102 in Figure 1 and are used to image, at an angle to each other, the position and direction of the implant device relative to the surgical area. Similar features will not be described again here.
- The brain surface electrode implantation system 20 uses the first camera 201 and the second camera 202 to image the brain surface, capturing the first image 2010 and the second image 2020 respectively.
- the first image 2010 and the second image 2020 are imaging of the implanted device and the surgical area in different directions, as shown in Figure 1.
- If a three-dimensional coordinate system is established along the control directions, then, given the mutually angled first camera 201 and second camera 202, the position coordinates of the implant device and the surgical area within that coordinate system can be determined from the first image 2010 and the second image 2020.
- the blood vessel segmentation calculation unit 203 is configured to perform calculation processing on the first image 2010 and the second image 2020 .
- the main function performed by the blood vessel segmentation calculation unit 203 is to obtain the brain surface blood vessel area mask based on the blood vessel segmentation algorithm to determine the implantable area 2030 in the brain surface image.
- blood vessel segmentation algorithms can be implemented in many ways, including threshold division, edge extraction, mathematical morphology processing, etc., as mentioned above.
- the algorithm used in this application combines the advantages of several processing methods to remove the jitter of the video itself, process multi-frame images, and obtain a smooth blood vessel image mask.
- Figure 3 shows a non-limiting embodiment of the blood vessel segmentation algorithm
- Figure 4 shows a schematic diagram of the results of each step of the blood vessel segmentation algorithm.
- step S301 of FIG. 3 the first image 2010 and/or the second image 2020 are input into the blood vessel segmentation algorithm.
- step S302 a series of image processing steps are performed on the input image.
- the input image is converted into a grayscale image, and then adaptive threshold segmentation is performed to find the outline of blood vessels and remove small outline noise in the processed results, and then an opening operation is performed to eliminate bubbles in the blood vessels in the original image.
- The denoised mask is then inverted, the blood vessel part is dilated to add a safety margin (equivalently, the implantable area is "eroded" by a safety displacement), and finally the mask is inverted again.
- the image mask of the implantable area is obtained in step S303.
- a judgment is set at step S304 so that the series of processes in S302 is repeated until the number of processed images reaches the preset smoothing number n.
- in step S305 the implantable areas in the n most recently obtained images are intersected, and finally in S306 a relatively stable blood-vessel image mask is output.
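The steps above (per-frame mask, safety margin, and intersection over the last n frames) can be sketched as follows. This is a simplified, hypothetical NumPy-only illustration: the patent's algorithm uses adaptive local thresholding, contour filtering, and an opening operation (typically done with OpenCV), whereas here a global mean threshold and a cross-shaped dilation stand in for those steps.

```python
import numpy as np

def dilate(mask, iterations=1):
    """Binary dilation with a 3x3 cross structuring element (safety margin)."""
    out = mask.copy()
    for _ in range(iterations):
        nxt = out.copy()
        nxt[1:, :] |= out[:-1, :]   # neighbour above
        nxt[:-1, :] |= out[1:, :]   # neighbour below
        nxt[:, 1:] |= out[:, :-1]   # neighbour to the left
        nxt[:, :-1] |= out[:, 1:]   # neighbour to the right
        out = nxt
    return out

def implantable_mask(frames, offset=5, safety=2):
    """Intersect per-frame implantable masks over the last n frames (cf. S304-S305)."""
    masks = []
    for gray in frames:
        # simplified global threshold; the patent uses adaptive local thresholding
        vessels = gray < gray.mean() - offset
        # dilate vessels to keep a safety margin around them
        vessels = dilate(vessels, iterations=safety)
        masks.append(~vessels)                 # implantable = non-vessel
    out = masks[0]
    for m in masks[1:]:
        out &= m                               # intersection -> temporally stable mask
    return out
```

The intersection step is what suppresses frame-to-frame jitter: a pixel counts as implantable only if every one of the last n frames agreed.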
- FIG. 4 mainly shows the intermediate results obtained after each step of the series of processes in step S302.
- converting to grayscale in S402 eliminates interference from blood-vessel color, and adaptive threshold segmentation in S403 roughly divides vessel and non-vessel areas. Because an adaptive threshold algorithm is used, the vessel gray-level threshold does not need to be computed in advance and no prior data are required; under suitable imaging conditions, stable results can be obtained.
- contour noise is removed in S404 and the bubble-like noise patterns inside the blood vessels are removed in S405, so that changes in the extracted contour edges are minimized in the time domain, improving the stability and safety of the contour extraction algorithm.
- through the processing of S406 to S408, the image area identified as a blood vessel carries a reasonable safety margin, minimizing the risk of identifying a blood vessel as part of the implantable area.
- Figure 5 shows an effect diagram of the blood vessel segmentation algorithm according to the above embodiment.
- the following blood-vessel analysis results were obtained in a 3 mm × 3 mm macaque brain surgical area.
- the image recognition results were obtained through a series of algorithm processing.
- the striped mask is the segmented blood-vessel area, and the blank parts are implantable areas. The figure shows that the blood vessel segmentation algorithm disclosed in this application can effectively and stably identify the blood-vessel area in brain-surface imaging, and ensures with high accuracy that the implantable area contains only non-vessel parts.
- the blood vessel segmentation algorithm disclosed in this application can flexibly adjust parameters.
- the algorithm parameters of the brain surface electrode implantation system can be adjusted based on the electrode implantation sites. For example, when the required number of insertable points or the required accuracy changes, the edge-detection precision threshold or the dilation safety margin in the blood vessel segmentation algorithm may need to change. For the optical system, changes in the object distance between the lens and the surgical area enlarge or shrink specific areas in the imaging, which in turn affects the number of detectable sites, the point spacing, and the imaging resolution. Alternatively, the user can specify, the system can automatically select, or the system can assist the user in determining the required number and spacing of electrode sites, and the algorithm parameters can be adjusted accordingly.
- the implantation sequence determining unit 204 is configured to select at least one implantation location in the implantable area 2030 .
- in particular, when multiple electrode implantation positions must be selected, the implantation sequence determination unit 204 calculates the distance between each implantation position and the known electrode position, and performs path planning on the order of electrode implantation to obtain the electrode implantation sequence 2040. Within a designated area determined from the surgical area, with the electrode direction as a reference, the motion control system controls the movement direction of the implant device via stepper motors or a robotic arm, implanting the electrodes into the determined positions in sequence.
- to prevent undesired interaction between electrodes, i.e., so that an electrode being implanted exerts no force on already-implanted electrodes, the implantation sequence determining unit 204 performs path planning according to the following principles: a later-implanted electrode must not interfere with previously implanted electrodes, and implanted electrodes must not be pulled during movement. In other words, the ideal electrode implantation path should avoid crossings, lateral jumps, and the like.
- one order the implantation sequence determining unit 204 may adopt is from near to far and from left to right of the implanted electrodes relative to the brain surface, as shown in FIG. 6.
- Figure 6 shows the determination of the implantation position and implantation sequence based on the image processing result.
- Figure 6(A) is a reference path-planning sequence, in which the implantable positions on the brain surface obtained by the aforementioned series of processes and their distribution are simplified into a 5 × 7 lattice in a two-dimensional coordinate system, with the electrodes to be implanted (not shown).
- if the lattice is traversed from top to bottom relative to the brain surface (i.e., the positive y-direction in the figure), the sequence indicated by the arrows between lattice points in Figure 6(A) is obtained: along the positive x-direction within a row, then row by row along the positive y-direction.
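The row-major traversal described above (rows from near to far, left to right within each row) can be sketched as a simple sort. This is a hypothetical illustration; the function name and the `row_tol` row-binning parameter are not from the patent, which describes the order only abstractly.

```python
def implantation_order(points, row_tol=0.5):
    """Order candidate (x, y) implantation points row-major: group points into
    rows by y (within row_tol), then sort left-to-right inside each row, so
    later insertions do not cross or pull earlier electrodes."""
    return sorted(points, key=lambda p: (round(p[1] / row_tol), p[0]))
```

On a perfect lattice this reproduces the arrow sequence of Figure 6(A); the `row_tol` binning additionally tolerates the small vertical scatter of real detected sites.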
- the number of lattice in (A) of Figure 6 is only for illustration and is not limiting.
- the implant placement prediction unit 205 is configured to match the imaging of the first camera 201 and the second camera 202 to obtain a transformation matrix. Since both cameras can clearly image the blood vessels in the surgical area, and blood vessels carry many features, the two cameras can be calibrated based on feature matching of the data. Features usable for matching in practical applications include SURF features and SIFT features.
- in the case of SURF features, the two cameras are matched based on SURF features to obtain the affine transformation matrix between the two cameras.
- Gaussian filters of successively different scales are used to process the image, and scale-invariant feature points in the image are detected through the difference of Gaussians.
- the Hessian matrix used in blob detection detects the feature points; its determinant represents the amount of variation around a pixel, so a feature point must be a local maximum or minimum of the determinant. For a point p = (x, y) in the image, the Hessian matrix at scale σ is H(p, σ) = [ L_xx(p, σ), L_xy(p, σ) ; L_xy(p, σ), L_yy(p, σ) ].
- functions such as L_xx(p, σ) in the matrix are the grayscale image after second-order differentiation. A 9 × 9 box filter serves as the lowest scale of SURF, approximating a Gaussian filter with σ = 1.2.
- SURF uses a box filter to replace the Gaussian filter in SIFT, achieving an approximation of Gaussian blur; the resulting approximated Hessian determinant is commonly expressed as det(H_approx) = D_xx · D_yy − (0.9 · D_xy)².
- using a box filter together with the integral image greatly improves calculation speed: only the four corner values of the filter's square need to be read.
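Once matched keypoint pairs are available (SURF/SIFT detection and matching are typically done with a library such as OpenCV), the affine transformation between the two cameras can be estimated by least squares. The sketch below is a hypothetical illustration of that final step only, operating directly on matched coordinates rather than on images.

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares 2x3 affine A mapping src -> dst, i.e. dst ≈ A @ [x, y, 1]^T.
    src, dst: (N, 2) arrays of matched feature coordinates, N >= 3."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    X = np.hstack([src, np.ones((len(src), 1))])     # (N, 3) homogeneous coords
    A, *_ = np.linalg.lstsq(X, dst, rcond=None)      # solves X @ A ≈ dst
    return A.T                                        # (2, 3) affine matrix
```

With more than three matches the least-squares fit averages out per-feature localization noise, which matters here because the vessel features, not a fixed rig, serve as the calibration target.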
- Figure 7 further shows a schematic diagram of the principle of implant tool impact point prediction.
- when the implant device appears in the imaging of the binocular system, each of the two cameras can locate, in its own imaging, a straight line on which the implant tool lies; that is, the first camera 201 and the second camera 202 each determine a plane in three-dimensional space (and/or the established spatial coordinate system) containing the implant tool. Because the two cameras' viewing angles differ, the image and position information obtained in their imaging are not identical. Taking the first camera 201 capturing the first image 2010 and the second camera 202 capturing the second image 2020 as an example, (A) and (B) of Figure 7 respectively show schematic diagrams of projecting either one of the two cameras onto the imaging direction of the other.
- (C) of FIG. 7 shows a schematic diagram of the implantation position in the second image 2020.
- the implantation landing-point prediction unit 205 projects the first straight line 7001 on which the implantation position lies in the first image 2010 into the second image 2020; the implantation position itself falls on the second straight line 7002 in the second image 2020; the intersection point of the first straight line 7001 and the second straight line 7002 is then determined as the predicted landing point 2050 of the implant device in the second image 2020.
- similarly, (D) of FIG. 7 shows the predicted landing point of the implantation position in the first image 2010. Due to calculation error in the transformation relationship between the two cameras, the two intersection points obtained may not coincide exactly in the actual images, but the actual predicted landing point can be determined to lie near these two intersection points.
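The intersection step above is a small 2D linear solve; when the two per-image intersections disagree slightly (calibration error), a simple reconciliation is their midpoint. This is a hypothetical sketch; the midpoint choice is one plausible reading of "near the two intersection points", not a formula stated in the patent.

```python
import numpy as np

def line_intersection(p1, d1, p2, d2):
    """Intersection of lines p1 + t*d1 and p2 + s*d2 in the image plane.
    Solves the 2x2 system [d1 | -d2] @ [t, s]^T = p2 - p1."""
    p1, d1, p2, d2 = (np.asarray(v, float) for v in (p1, d1, p2, d2))
    t, _ = np.linalg.solve(np.column_stack([d1, -d2]), p2 - p1)
    return p1 + t * d1

def predicted_landing(i1, i2):
    """Reconcile the two per-image intersections, which may not coincide
    exactly due to calibration error, by taking their midpoint."""
    return (np.asarray(i1, float) + np.asarray(i2, float)) / 2
```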
- returning to FIG. 2, the implant device control unit 206 is described. Based on the implantation sequence 2040 determined by the implantation sequence determining unit 204 and the predicted landing points 2050 determined by the implantation landing-point prediction unit 205, the implant device control unit 206 controls the implant device in real time, following the implantation sequence 2040 and each predicted landing point 2050, until the implantation point coincides with the predicted landing point.
- the implant device control unit 206 can adopt fully supervised or semi-supervised control, selectable by the user to achieve personalized control effects.
- FIG. 8 illustrates a flow diagram of an implant tool control algorithm in one non-limiting embodiment.
- the control algorithm takes the camera intrinsic parameters in the optical system as input at step S801.
- the series of processes in step S802 is executed first: place the calibration plate in the surgical area and ensure the surgical area lies within the movement range of the tungsten-wire fine-tuning electrode; adjust the camera's imaging conditions until the brain surface is imaged clearly; obtain the camera's extrinsic parameters from the calibration plate's position, then remove the calibration plate while keeping the surgical area's position unchanged.
- in step S803, the falling position of the tungsten wire is monitored in real time; the wire stops when it touches the brain surface, and the position of the tungsten-wire landing point in the current image is recorded.
- in step S804, blood vessels are identified and the implantation point is selected manually under the constraints of the implantation sequence 2040; the selection can also be made automatically by the system or with computer-assisted calculation.
- coordinate conversion is performed in step S805 to convert the two image coordinates into world coordinates. Specifically, conversion is performed according to "pixel coordinate system ⁇ image coordinate system ⁇ camera coordinate system ⁇ world coordinate system".
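The "pixel → image → camera → world" chain of step S805 collapses, for a pinhole model, into a back-projection through the intrinsics followed by the inverse extrinsic transform. The sketch below is a hypothetical illustration assuming the standard distortion-free pinhole model (x_cam = R @ X_world + t, pixel = K @ x_cam / z) and a known depth along the optical axis; the patent does not give these equations explicitly.

```python
import numpy as np

def pixel_to_world(u, v, depth, K, R, t):
    """Back-project pixel (u, v) at a known depth (distance along the optical
    axis) into world coordinates:
    pixel -> camera frame (via intrinsics K) -> world frame (via extrinsics R, t)."""
    uv1 = np.array([u, v, 1.0])
    x_cam = depth * (np.linalg.inv(K) @ uv1)   # 3D point in the camera frame
    return R.T @ (x_cam - t)                   # invert x_cam = R @ X_world + t
```

In the monocular variants described later, `depth` is exactly the quantity that must be held constant (or tracked), which is why the binocular system is preferred.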
- in step S806, the movement vector from the implant device to the electrode target is calculated and decomposed into the two directions of electrode movement; for example, in addition to the stepper motor for the z-axis direction, two micro stepper motors finely control movement in the x and y directions.
- step S807 the aforementioned motors in different directions are controlled to move the tungsten wire to be implanted.
- Step S808 repeats the process of step S803.
- next, a judgment is performed at step S809: whether a blood vessel is present at the tungsten-wire landing point in the current image.
- if the determination at step S809 is "No" (no vessel at the landing point), the process continues to another judgment at step S810: whether the difference between the tungsten-wire landing point and the implantation point is within the acceptable error range. If the determination at step S809 is "Yes", i.e., there is a blood vessel at the tungsten-wire landing point in the current image, the process returns to S804 to select a new implantation point that meets the requirements. At step S810, a result of "Yes" indicates that the implant tool control algorithm has navigated successfully and electrode implantation can begin.
- if the judgment result of step S810 is "No", the error between the tungsten-wire landing point in the current image and the selected implantation point is unacceptable, and the process returns to S805 to perform coordinate conversion again and obtain a more accurate landing position.
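The closed loop of steps S803 through S810 can be sketched as follows. This is a hedged, simulation-only illustration: `observe_landing_point`, `has_vessel_at`, `select_implant_point`, and `move_motors` are hypothetical stand-ins for the camera monitoring, vessel check, point selection, and stepper-motor commands described in the text, and the pixel tolerance `tol` is an assumed parameter.

```python
def navigate(observe_landing_point, has_vessel_at, select_implant_point,
             move_motors, tol=2.0, max_iter=50):
    """Iterate until the tungsten-wire landing point coincides with the chosen
    implantation point within `tol` pixels; returns the final landing point."""
    target = select_implant_point()                 # S804: choose implantation point
    for _ in range(max_iter):
        landing = observe_landing_point()           # S803/S808: monitor wire tip
        if has_vessel_at(landing):                  # S809: vessel under the tip?
            target = select_implant_point()         # back to S804: pick a new point
            continue
        dx = target[0] - landing[0]
        dy = target[1] - landing[1]
        if abs(dx) <= tol and abs(dy) <= tol:       # S810: within acceptable error
            return landing                          # navigation succeeded
        move_motors(dx, dy)                         # S806/S807: decomposed x/y move
    raise RuntimeError("navigation did not converge")
```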
- the electrode implantation method 9000 mainly includes the following steps: at step S901, a first image is captured for the brain surface by a first camera, and a second image is captured for the brain surface by a second camera. At step S902, arithmetic processing is performed on the first image and the second image, wherein a brain surface blood vessel area mask is obtained based on a blood vessel segmentation algorithm to determine the implantable area in the brain surface image.
- step S903 at least one implantation position is selected in the implantable area, and the distance between the at least one implantation position and the electrode position is calculated based on the known electrode position, thereby determining the implantation sequence of the electrodes.
- in step S904, the imaging of the first camera and the second camera is matched to obtain a transformation matrix; the first straight line on which the implantation position lies in the first camera's imaging is projected into the second camera's imaging, and the intersection of that first straight line with the second straight line on which the implantation position lies in the second camera's imaging is determined as the predicted landing point of the implant device.
- step S905 the implantation device is controlled in real time according to the predicted landing point until the implantation point coincides with the predicted landing point.
- machine vision-based electrode implantation method and system disclosed in this application can also have other implementations.
- the brain surface electrode implantation system disclosed in this application does not necessarily need to use a binocular system with two cameras for calibration; a monocular system can also perform position control in the plane of the implantation tool. Two implementations are possible: vertical observation and oblique observation.
- in vertical observation, the camera shoots from vertically above the brain surface to obtain a still image, in which the blood vessels are segmented.
- the position of the implant device is controlled through the transformation between pixel coordinates and robotic-arm coordinates, and the device is moved above the brain surface for implantation.
- an obliquely placed observation camera can be attached to facilitate observation of the implantation.
- in oblique observation, the camera is tilted relative to the brain surface to obtain an angled picture.
- the implant device's position is obtained through coordinate transformation, and the electrode is controlled to move into position for implantation.
- owing to the imaging angle, the control accuracy and blood-vessel segmentation accuracy of the vertical observation method are better than those of the oblique observation method.
- if a monocular system is used, calibration mainly employs the calibration-plate method and the nine-point calibration method. The calibration-plate method places a calibration plate immediately adjacent to the surgical area; after the camera's extrinsic parameters are obtained, the conversion between image coordinates and actual coordinates is calculated.
- the nine-point calibration method photographs the brain area at a fixed position, then moves the tungsten wire to nine specified points, recording the wire's position in the image and in the actual coordinate system at each point, thereby obtaining the coordinate transformation relationship. From this relationship, the displacement the tungsten wire should move in actual space can be calculated from the distance between the tungsten-wire landing point and the target point in the image.
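The nine-point method reduces to fitting an image-to-robot affine map from the nine recorded pairs and then pushing pixel offsets through its linear part (the translation cancels in a difference). The sketch below is a hypothetical illustration; function names and the least-squares formulation are not from the patent, which describes the method only procedurally.

```python
import numpy as np

def calibrate_nine_points(pixel_pts, robot_pts):
    """Fit the pixel -> robot-coordinate 2x3 affine map from recorded pairs."""
    P = np.hstack([np.asarray(pixel_pts, float),
                   np.ones((len(pixel_pts), 1))])          # homogeneous pixels
    A, *_ = np.linalg.lstsq(P, np.asarray(robot_pts, float), rcond=None)
    return A.T                                              # (2, 3)

def required_displacement(A, landing_px, target_px):
    """Physical displacement for the wire: map the pixel offset between landing
    point and target through the linear part of the calibration."""
    d = np.asarray(target_px, float) - np.asarray(landing_px, float)
    return A[:, :2] @ d
```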
- compared with the binocular system, both monocular methods require that the height of the brain surface relative to the camera remain unchanged, which affects their accuracy.
- in practice, because the brain surface itself is not a plane, the camera must adjust its height in real time, i.e., reach the same position above the brain surface with micron-level accuracy.
- the word "exemplary” means “serving as an example, instance, or illustration” rather than as a “model” that will be accurately reproduced. Any implementation illustratively described herein is not necessarily to be construed as preferred or advantageous over other implementations. Furthermore, the disclosure is not bound by any expressed or implied theory presented in the above technical field, background, brief summary or detailed description.
- the word “substantially” is meant to include any minor variations resulting from design or manufacturing defects, device or component tolerances, environmental effects, and/or other factors.
- the word “substantially” also allows for differences from perfect or ideal conditions due to parasitic effects, noise, and other practical considerations that may be present in actual implementations.
Abstract
A machine vision-based electrode implantation method and system. The method includes: capturing a first image (2010) of the brain surface with a first camera (101, 201) and a second image (2020) of the brain surface with a second camera (102, 202) (S901); computationally processing the first image (2010) and the second image (2020), wherein a brain-surface blood-vessel area mask is obtained based on a blood vessel segmentation algorithm to determine the implantable area in the brain-surface image (S902); selecting at least one implantation position in the implantable area and calculating the distance between the at least one implantation position and a known electrode position, thereby determining the electrode implantation order (S903); matching the imaging of the first camera and the second camera to obtain a transformation matrix, projecting the first straight line on which the implantation position lies in the first camera's imaging into the second camera's imaging, and determining the intersection of the first straight line with the second straight line on which that implantation position lies in the second camera's imaging as the predicted landing point of the implant device (S904); and controlling the implant device in real time according to the predicted landing point until the implantation point coincides with the predicted landing point (S905).
Description
The present disclosure relates to the field of life-science technology, and in particular to a machine vision-based electrode implantation method and system.
In the field of neurosurgical robots, flexible electrodes are implanted into the brain surface. During implantation, the implant device first threads the electrode and then carries it into the brain surface. The surgical area on the brain surface is usually a millimeter-scale window: a robotic arm or an external stepper motor first moves roughly above the window, implantable sites avoiding blood vessels are analyzed, and the implant tool is controlled to perform the implantation. To reduce bleeding during electrode implantation, implantable areas must be identified automatically. Blood-vessel distribution on an animal's brain surface is complex, with many capillaries, and image quality places strict demands on imaging conditions: changes in illumination strongly affect vessel recognition. A stable optical system is therefore needed to guarantee image quality, together with an algorithm that can stably recognize cerebral blood vessels and provide implantable areas.
Further, electrode implantation requires a clear correspondence between the implanted brain region and the electrode channel, so implantation positions must be numbered. Multiple sites often need implantation, yet electrode length is limited; electrodes must not be pulled, and their arrangement relative to other electrodes must be considered, which imposes requirements on the order of implantation positions. In addition, because the implant tool is not necessarily perfectly vertical and the brain surface undulates, the implantation angle and position of the tool relative to the surface must be strictly controlled, and the tool's spatial position accurately judged. A corresponding stereoscopic microscopic imaging system is therefore also needed to monitor the implant tool's position in real time and predict its landing point on the brain surface.
SUMMARY
This application proposes a machine vision-based electrode implantation method and system.
According to a first aspect of embodiments of the present disclosure, there is provided a machine vision-based electrode implantation method, comprising: capturing a first image of the brain surface with a first camera and a second image of the brain surface with a second camera; computationally processing the first and second images, wherein a brain-surface blood-vessel area mask is obtained based on a blood vessel segmentation algorithm to determine the implantable area in the brain-surface image; selecting at least one implantation position in the implantable area and calculating the distance between the at least one implantation position and a known electrode position, thereby determining the electrode implantation order; matching the imaging of the first camera and the second camera to obtain a transformation matrix, projecting the first straight line on which the implantation position lies in the first camera's imaging into the second camera's imaging, and determining the intersection of the first straight line with the second straight line on which that implantation position lies in the second camera's imaging as the predicted landing point of the implant device; and controlling the implant device in real time according to the predicted landing point until the implantation point coincides with the predicted landing point.
According to a second aspect of embodiments of the present disclosure, there is provided a machine vision-based electrode implantation system, comprising: a first camera configured to capture a first image of the brain surface; a second camera configured to capture a second image of the brain surface; a blood vessel segmentation calculation unit configured to computationally process the first and second images, wherein a brain-surface blood-vessel area mask is obtained based on a blood vessel segmentation algorithm to determine the implantable area in the brain-surface image; an implantation sequence determination unit configured to select at least one implantation position in the implantable area and calculate the distance between the at least one implantation position and a known electrode position, thereby determining the electrode implantation order; an implantation landing-point prediction unit configured to match the imaging of the first and second cameras to obtain a transformation matrix, project the first straight line on which the implantation position lies in the first camera's imaging into the second camera's imaging, and determine the intersection of the first straight line with the second straight line on which that implantation position lies in the second camera's imaging as the predicted landing point of the implant device; and an implant device control unit configured to control the implant device in real time according to the predicted landing point until the implantation point coincides with the predicted landing point.
An advantage of embodiments of the present disclosure is applicability to a variety of imaging modes: the algorithms used have good universality, achieve good boundary segmentation, and yield stable imaging-recognition results.
Another advantage of embodiments of the present disclosure is the provision of an automated blood vessel segmentation algorithm whose parameters are easy to adjust, while reducing the computational load of the segmentation algorithm.
It should be recognized that the above advantages need not all be achieved in any one particular embodiment; they may be distributed among different embodiments according to the present disclosure. Embodiments according to the present disclosure may have one or some of the above advantages, or alternatively or additionally have other advantages.
Other features and advantages of the present invention will become clearer from the following detailed description of exemplary embodiments with reference to the accompanying drawings.
FIG. 1 is a schematic diagram of a machine vision-based electrode implantation system according to an embodiment of the present disclosure.
FIG. 2 is a configuration diagram of a machine vision-based electrode implantation system according to an embodiment of the present disclosure.
FIG. 3 is a flowchart of a blood vessel segmentation algorithm according to an embodiment of the present disclosure.
FIG. 4 is a step-by-step schematic diagram of the blood vessel segmentation algorithm according to an embodiment of the present disclosure.
FIG. 5 is an effect diagram of the blood vessel segmentation algorithm according to an embodiment of the present disclosure.
FIG. 6 is a schematic diagram of implantable-position selection and path planning for brain surface electrodes according to an embodiment of the present disclosure.
FIG. 7 is a schematic diagram of implant-tool landing-point prediction according to an embodiment of the present disclosure.
FIG. 8 is a flowchart of an implant tool control algorithm according to an embodiment of the present disclosure.
FIG. 9 is a step diagram of a machine vision-based electrode implantation method according to an embodiment of the present disclosure.
Various exemplary embodiments of the present disclosure are described in detail below with reference to the drawings. Note that unless specifically stated otherwise, the relative arrangements of components and steps, numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the disclosure.
The following description of at least one exemplary embodiment is merely illustrative and in no way limits the disclosure or its application or use. That is, the structures and methods herein are shown by way of example to illustrate different embodiments of the structures and methods of the disclosure; those skilled in the art will understand that they merely illustrate exemplary, not exhaustive, ways of practicing the disclosure. Furthermore, the drawings are not necessarily to scale, and some features may be exaggerated to show details of particular components.
Techniques, methods, and devices known to those of ordinary skill in the relevant art may not be discussed in detail, but where appropriate they should be regarded as part of the granted specification.
Electrode implantation on the brain surface requires blood-vessel recognition and implantation-site selection, implantation path planning, and positioning of the implant tool.
Generally, blood-vessel recognition on the brain surface mainly involves region segmentation of brain-surface imaging, i.e., distinguishing vessel from non-vessel regions; approaches to region segmentation include at least threshold segmentation, edge detection, and segmentation methods based on mathematical morphology.
Specifically, threshold segmentation is the most common parallel method of directly detecting regions, and also the simplest. It generally makes assumptions about the image: the target and background occupy different gray-level ranges, gray-value differences between neighboring pixels inside the target or background are small, while pixels on either side of the target-background boundary differ considerably in gray value. If an appropriate gray threshold T is chosen and each pixel's gray value is compared with T, pixels fall into two classes: those above the threshold are assigned 1, and those below are assigned 0, yielding a binary image that extracts the target from the background. The segmentation threshold is usually determined from prior knowledge, or from gray-histogram features and statistical decision methods. The image's gray histogram shows two peaks and a valley; the peaks correspond to the central gray levels of the target and the background, while boundary points around the target have gray values between them, so the boundary corresponds to the valley between the peaks. To minimize the probability of misclassifying pixels, the valley's gray level is taken as the segmentation threshold. Because histograms are jagged, the valley is hard to determine, and specific search methods must be designed; many methods exist for finding the optimal threshold (valley), such as fitting Gaussian model parameters or fitting the histogram curve and solving for extrema.
Threshold-based segmentation is computationally simple and efficient, but it ignores spatial characteristics, is sensitive to noise and gray-level diversity, and struggles to find accurate thresholds for images where target and background gray levels differ little; in practice it is usually combined with other segmentation methods to achieve satisfactory results. In the prior art, for extracting vessels from time-of-flight magnetic resonance angiography images, a statistical model based on a physical model of blood flow was proposed; to improve vessel segmentation, PCA velocity and phase information were fused, and an adaptive local threshold method and a single global threshold method were used to segment two different statistical models, achieving better segmentation of aneurysms with very low nearby signal. The prior art has also combined local and global thresholds for three-dimensional reconstruction of cerebrovascular images: local thresholds enhance the contrast of small vessels, while global thresholds extract the target vessels from the background.
Edge detection is a parallel boundary-segmentation technique based on gray-level discontinuity, and it is the first step of all boundary-based segmentation methods. Because the edge is the dividing line between target and background, only by extracting it can the two be distinguished. Edge detection typically exploits differences between target and background in some property, such as gray level, color, or texture. Edges are usually detected with first- or second-order derivatives, although in actual digital images differentiation is approximated by difference operations. Pixels on either side of an edge show abrupt gray-value changes and thus large differential values, which are maximal when the differentiation direction is perpendicular to the boundary; differentiation is therefore a directional operation that measures gray-level change along its direction.
Segmentation based on mathematical morphology operates on images with structuring elements of given shapes to analyze and recognize them. Its basic operations are dilation and erosion, together with their combinations: opening (erosion followed by dilation) and closing (dilation followed by erosion). Each operation has its own character: dilation enlarges an image region while erosion shrinks it. Both opening and closing smooth contours, but they act oppositely: opening breaks narrow connections and removes thin protrusions, while closing removes small holes in the image, fills breaks in contour lines, and merges narrow gaps and slender gulfs. Combining these basic operations according to an image's specific characteristics yields various morphological algorithms for processing and analyzing image shape and structure, such as edge detection, image filtering, feature extraction, and image enhancement; algorithms commonly used on medical images include the top-hat transformation and the watershed transformation.
To implement the various methods above, neural networks have in practice also been applied to image region segmentation. On the one hand, neural networks can learn; on the other hand, their nonlinearity can be used for boundary segmentation during training. Their drawback is that whenever new features are added to the network system, learning and training must be redone, and the debugging process is complex. For the network to exploit its learnability in classifying boundaries among features, as many object features as possible should be selected. A widely used training algorithm is back-propagation; since the training data set determines the learning, the amount of training data determines the learning process.
Further, positioning the implant tool mainly involves hand-eye calibration as used in robot-vision applications. Its goal is to obtain the relationship between the robot coordinate system and the camera coordinate system, and finally to transfer the visual-recognition results into the robot coordinate system.
Hand-eye calibration takes two forms in the industry depending on where the camera is fixed: if the camera is fixed to the robot end-effector, it is called "eye in hand"; if the camera is fixed to a base outside the robot, it is called "eye to hand".
Industrially, common hand-eye calibration methods are the nine-point calibration method and the calibration-plate method. Nine-point calibration directly establishes the coordinate transformation between camera and manipulator: the pointer at the manipulator's end touches nine points to obtain their coordinates in the robot coordinate system, while the camera identifies the same nine points in the initial frame to obtain their pixel coordinates; from the nine corresponding coordinate pairs, solving the transformation yields the affine matrix between image and manipulator coordinates. The calibration-plate method uses a checkerboard or circle-grid calibration plate to obtain the camera's intrinsic and extrinsic parameters, giving the coordinate transformation between image and manipulator.
Monocular calibration applies only when the observed object lies in one horizontal plane and cannot recover depth information, so traditional hand-eye calibration cannot accurately predict the implant tool's landing point on the brain surface. Binocular systems, by contrast, fall into two classes according to the positions of the optical axes: parallel binocular systems (axes essentially parallel) and convergent binocular systems (axes intersecting). Parallel binocular systems, because their axes are parallel, have little field-of-view overlap and are rarely used for small fields of view; they are generally used when the working distance greatly exceeds the lens separation. For observing millimeter-scale objects, a convergent binocular system better fits the need.
However, the prior art still has many shortcomings. On the one hand, for image region-segmentation algorithms, some algorithms proposed for specific imaging modes lack universality and cannot be applied to other modes. Vessel-boundary judgment relies on the pixel gray-gradient field, but in regions of slow or complex blood flow the gradient is often not high enough, degrading boundary-detection precision. Algorithms assume each tissue's gray distribution is Gaussian, but in reality this does not fully hold, causing bias between the proposed models and clinical data. Meanwhile, segmentation algorithms involve many parameters that must be adjusted, and parameter estimation is very difficult. Some interactive algorithms require manual selection of seed or termination points inside vessels, limiting automation. Moreover, segmentation methods are in general computation-heavy and computationally expensive.
On the other hand, for implant-tool positioning, accurately predicting the tool's landing point on the brain surface through hand-eye calibration requires an imaging system with at least two cameras, i.e., a binocular system, and, owing to the imaging characteristics of different binocular systems, a convergent one. Clear imaging further requires lenses with ×1 or ×2 magnification; at equal camera resolution the field of view is small and the depth of field very limited, generally around 1 mm. Camera positions must therefore be adjusted to obtain clear imaging, so the two cameras are not fixed relative to each other. Yet binocular calibration holds only when the camera positions are relatively fixed, which means traditional binocular calibration methods cannot be applied directly to the brain surface electrode implantation system disclosed in this application.
To solve the above technical problems, the inventors of this application propose an improved machine vision-based electrode implantation method and system, and in particular a brain surface electrode implantation method and system based on blood-vessel segmentation of machine imaging. In outline, the technical solution of the present disclosure automatically detects and avoids blood vessels during electrode implantation, and automatically detects multiple candidate sites according to the size, number, and shape of the implant tool and the electrodes to be implanted. Image registration and fusion techniques project brain partitions onto the brain surface, facilitating intraoperative localization of the brain region for electrode implantation. After site selection, the implantation system automatically moves to the selected position; based on the selected position and the electrode implantation angle, the electrode implant device is controlled to move precisely in three-dimensional space, while the precise distance between the implant device and the brain surface is monitored in real time, and the electrode implantation depth is estimated from the distance to the brain surface. Throughout this process, target tracking and similar techniques achieve precise movement of the electrode to, and implantation at, the selected site.
Embodiments according to the present disclosure are described in detail below with reference to the drawings. First, FIGS. 1 and 2 show, respectively, a schematic diagram and a configuration diagram of the machine vision-based electrode implantation system. As shown in FIG. 1, the hardware of this application's brain surface electrode implantation system comprises an optical system and a motion control system. The optical system is mainly associated with two cameras 101 and 102 (hereinafter also "first camera" and "second camera"), for example high-definition area-array CMOS industrial cameras, each with a telecentric lens 103, 104 that magnifies the imaging of the surgical area 106. Cameras 101 and 102 share the same imaging plane, are set at an angle to each other in one projection plane, are fixed on a rigid base plate, and are fitted with a coaxial light source (not shown) so that exposure time is shortened and the frame rate increased. The light source may be an external point source, mainly used to illuminate the surgical area 106 uniformly and avoid blurred or over-exposed imaging of the implant device 105 and the surgical area 106, aiding subsequent image and data processing. White light or light of another given wavelength may be used; preferably, owing to the colors of the brain surface and the vessels themselves, green light (e.g., light with a wavelength of 495 nm to 570 nm) can achieve a better imaging effect.
In addition, three-axis slides are arranged behind cameras 101 and 102 so that the magnifying lenses 103 and 104 can be brought to working distance, imaging the position and angle of the implant device 105 relative to the surgical area 106 from multiple orientations. FIG. 1 shows a non-limiting embodiment of the disclosed system in which cameras 101 and 102 form an angle of about 90° in horizontal projection.
The motion control system mainly comprises three stepper motors for controlling movement of the implant device 105 along the three directions of a spatial coordinate system (the ±x, ±y, and ±z directions shown in FIG. 1). Cameras 101 and 102 are each coupled to the motion control system. In the non-limiting embodiment shown in FIG. 1, the three motors comprise one stepper motor controlling ±z motion and two micro stepper motors (not shown), e.g., with 5 mm travel, for controlling motion of the implant device 105 in the ±x and ±y directions, i.e., movement in an area substantially parallel to the surgical area 106.
Alternatively, the motion control system may comprise a robotic arm or the like with similar motion-control functions, with cameras 101/102 arranged above the arm. Because the arm's motion accuracy (e.g., ±30 μm) cannot meet the accuracy required by this application's system (e.g., ±10 μm), the robotic arm is used to find the implantation position roughly, while fine adjustment of the electrode position is still performed by the two micro operating motors.
Additionally, the implant device 105 is configured to implant a flexible electrode into a specified position in the surgical area 106 and comprises an implant needle, an implant feed mechanism, and an implant actuation mechanism. The implant needle engages the free end of the electrode with its tip portion so as to carry the electrode; the implant feed mechanism moves the implant needle along the longitudinal direction of the implant device; and the implant actuation mechanism drives the implant needle so that its tip pierces into the surgical area 106. The implant device 105 may further be provided with an implant motion mechanism enabling electrode implantation from different angles and in different orientations.
FIG. 2 shows a non-limiting embodiment of the brain surface electrode implantation system. The brain surface electrode implantation system 20 adopts a binocular arrangement and mainly comprises a first camera 201, a second camera 202, a blood vessel segmentation calculation unit 203, an implantation sequence determination unit 204, an implantation landing-point prediction unit 205, and an implant device control unit 206. The first camera 201 and second camera 202 correspond, respectively, to cameras 101 and 102 in FIG. 1 and image, at mutual angles, the position and direction of the implant device relative to the surgical area; similar features are not repeated here.
Specifically, the system 20 uses the first camera 201 and the second camera 202 to image the brain surface, capturing a first image 2010 and a second image 2020 respectively. These are images of the implant device and the surgical area from different directions; as shown in FIG. 1, in one non-limiting example, if a three-dimensional coordinate system is established according to the control directions of the stepper motors in the motion control system, then, given the mutually angled first and second cameras, the position coordinates of the implant device and the surgical area can be determined within that coordinate system based on the first image 2010 and the second image 2020.
The blood vessel segmentation calculation unit 203 is configured to computationally process the first image 2010 and the second image 2020. Its main function is to obtain a brain-surface blood-vessel area mask based on the blood vessel segmentation algorithm, thereby determining the implantable area 2030 in the brain-surface image. In general, vessel segmentation can be implemented in many ways, as described above, including thresholding, edge extraction, and mathematical morphology processing. The algorithm adopted in this application combines the advantages of several of these methods to remove jitter in the video itself and process multiple frames, yielding a smooth blood-vessel image mask. FIG. 3 shows a non-limiting embodiment of this algorithm, and FIG. 4 shows a schematic of the result obtained at each step.
Specifically, in step S301 of FIG. 3, the first image 2010 and/or the second image 2020 are input to the blood vessel segmentation algorithm. Next, in step S302, a series of image-processing steps is applied to the input image. First, the input image is converted to grayscale and adaptive threshold segmentation is performed; vessel contours are found in the processed result and small contour noise is removed; an opening operation then eliminates bubble-like noise patterns belonging to the vessels in the original image; the result is inverted; the vessel part of the inverted result is dilated to obtain a safety margin (also described as "eroding" a safe displacement); and the image is inverted again. The image mask of the implantable area is obtained in step S303. Further, a judgment is set in step S304 so that the series of processes in S302 is repeated until the number of images obtained reaches the preset smoothing number n. After this judgment, step S305 intersects the implantable areas of the n most recently obtained images, and step S306 finally outputs a relatively stable blood-vessel image mask.
Correspondingly, FIG. 4 mainly shows the intermediate result after each step of the series of processes in S302. As shown, converting to grayscale in S402 removes interference from vessel color, and adaptive threshold segmentation in S403 roughly divides vessel and non-vessel regions; because an adaptive threshold algorithm is used, no vessel gray threshold need be computed in advance and no prior data are required, and under suitable imaging conditions stable results are obtained. Next, removing contour noise in S404 and the bubble-like noise patterns in the vessels in S405 minimizes temporal change of the extracted contour edges, improving the stability and safety of the contour extraction algorithm. Finally, the processing of S406 to S408 gives the image area identified as vessel a reasonable safety margin, minimizing the risk of identifying a vessel as implantable area.
FIG. 5 shows the effect of the blood vessel segmentation algorithm according to the above embodiment. As shown, the following vessel-analysis results were obtained in a 3 mm × 3 mm macaque brain surgical area: after the camera's raw imaging was input, a series of algorithmic processing produced the image-recognition result, in which the striped mask is the segmented vessel area and the blank parts are implantable areas. The figure shows that the disclosed segmentation algorithm effectively and stably identifies the vessel area in brain-surface imaging and ensures with high accuracy that the implantable area contains only non-vessel parts.
Additionally, the parameters of the disclosed blood vessel segmentation algorithm can be adjusted flexibly. In general, the system's algorithm parameters can be adjusted based on the electrode implantation sites; for instance, changed requirements on the number of insertable points or on accuracy may affect the edge-detection precision threshold or the dilation safety margin in the segmentation algorithm. For the optical system, changes in the object distance between the lens and the surgical area enlarge or shrink particular regions in the imaging, affecting the number of sites to be detected, the point spacing, and the imaging resolution. Alternatively, the user may specify, the system may automatically select, or the system may assist the user in determining the required number and spacing of electrode sites, and the algorithm parameters can be adjusted accordingly.
Returning to FIG. 2, the implantation sequence determination unit 204 is configured to select at least one implantation position in the implantable area 2030. In particular, where multiple electrode implantation positions must be selected, unit 204 calculates the distance between each implantation position and the known electrode position and path-plans the order of electrode implantation, yielding the electrode implantation sequence 2040. Within a designated area determined from the surgical area, with the electrode direction as reference, the motion control system controls the movement direction of the implant device via stepper motors or a robotic arm, implanting the electrodes into the determined positions in sequence. To prevent undesired interaction between electrodes, i.e., so that an electrode being implanted exerts no force on already-implanted electrodes, unit 204 plans paths according to the following principles: a later-implanted electrode must not interfere with previously implanted electrodes, and movement must not pull implanted electrodes. That is, the ideal implantation path avoids crossings, lateral jumps, and the like as much as possible.
In one non-limiting embodiment, unit 204 may adopt an order from near to far and from left to right of the implanted electrodes relative to the brain surface, as shown in FIG. 6. Taking the segmentation-based image-processing result of FIG. 5 as an example, FIG. 6 shows the determination of implantation positions and implantation order for that result. FIG. 6(A) is a reference path-planning order in which the implantable positions on the brain surface obtained by the aforementioned series of processes and their distribution are simplified into a 5 × 7 lattice in a two-dimensional coordinate system; if the electrodes to be implanted (not shown) traverse the lattice from top to bottom relative to the brain surface (i.e., the positive y-direction in the figure), the order indicated by the arrows between lattice points in FIG. 6(A) is obtained: along the positive x-direction, then along the positive y-direction. Note that the number of lattice points in FIG. 6(A) is illustrative only and non-limiting. Applying the example order of FIG. 6(A) to the image-processing result of FIG. 5 gives the path plan shown in FIG. 6(B), which shows 19 computed positions: positions with square markers indicate implantation sites that are not recommended, circular markers indicate sites recommended by the algorithm, and triangular markers indicate sites where electrodes are already implanted. The numbers indicate the implantation order; as the figure shows, the implantation order connecting these positions has no crossings, and the differently marked site states assist observation and the implantation process, so that later-implanted electrodes neither disturb nor pull earlier-implanted ones.
Next, the implantation landing-point prediction unit 205 of FIG. 2 is described. Unit 205 is configured to match the imaging of the first camera 201 and the second camera 202 to obtain a transformation matrix. Since both cameras can clearly image the vessels in the surgical area, and vessels carry many features, the two cameras can be calibrated based on feature matching of the data; features usable for matching in practical applications include SURF features and SIFT features.
Where SURF features are used, the two cameras are matched based on SURF features to obtain the affine transformation matrix between the two cameras. Gaussian filters of successively different scales process the image, and scale-invariant feature points are detected via the difference of Gaussians. In addition, the Hessian matrix of blob detection is used to detect feature points: its determinant represents the amount of variation around a pixel, so feature points must take extremal (maximum or minimum) determinant values. Moreover, to achieve scale invariance, SURF also uses the determinant at scale σ for feature-point detection; for a given point p = (x, y) in the image, the Hessian matrix at scale σ is
H(p, σ) = [ L_xx(p, σ), L_xy(p, σ) ; L_xy(p, σ), L_yy(p, σ) ],
where functions such as L_xx(p, σ) in the matrix are the grayscale image after second-order differentiation. A 9 × 9 box filter serves as the lowest scale of SURF, approximating a Gaussian filter with σ = 1.2.
Where SIFT features are used, computation is efficient and image matching can be performed quickly. Gaussian filters of successively different scales process the image, and scale-invariant feature points are detected via the difference of Gaussians. SURF replaces SIFT's Gaussian filter with a box filter to approximate Gaussian blur; the resulting approximated Hessian determinant is commonly expressed as det(H_approx) = D_xx · D_yy − (0.9 · D_xy)².
Moreover, using box filters together with the integral image greatly accelerates computation: only the four corner values of the filter's square need to be read.
After feature matching, the first camera 201 and the second camera 202 are calibrated; that is, for any group or groups of first images 2010 and second images 2020, their correspondence and transformation relationship are obtained. On this basis, FIG. 7 further shows a schematic of the principle of implant-tool landing-point prediction.
In one non-limiting embodiment, when the implant device appears in the binocular system's imaging, each of the two cameras can locate, in its own imaging, a straight line on which the implant tool lies; that is, the first camera 201 and the second camera 202 can each determine, in three-dimensional space (and/or the established spatial coordinate system), a plane containing the implant tool. Because the two cameras' viewing angles differ, the image and position information captured in their imaging are not identical. Taking the first camera 201 capturing the first image 2010 and the second camera 202 capturing the second image 2020 as an example, FIGS. 7(A) and 7(B) respectively show projections of either one of the two cameras onto the imaging direction of the other, and FIG. 7(C) shows a schematic of the implantation position in the second image 2020. The implantation landing-point prediction unit 205 projects the first straight line 7001 on which the implantation position lies in the first image 2010 into the second image 2020; the implantation position itself lies on the second straight line 7002 in the second image 2020; the intersection of lines 7001 and 7002 is determined as the predicted landing point 2050 of the implant device in the second image 2020. Similarly, FIG. 7(D) shows the predicted landing point of the implantation position in the first image 2010. Owing to calculation error in the transformation between the two cameras, the two intersections obtained may not coincide exactly in the actual images, but the actual predicted landing point can be determined to lie near these two intersections.
Returning again to FIG. 2, the implant device control unit 206 is described. Based on the implantation sequence 2040 determined by the implantation sequence determination unit 204 and the predicted landing points 2050 determined by the implantation landing-point prediction unit 205, unit 206 finally controls the implant device in real time, following the implantation sequence 2040 and each predicted landing point 2050 in turn, until the implantation point coincides with the predicted landing point. The implant device control unit 206 may adopt fully supervised or semi-supervised control, selectable by the user to achieve personalized control effects.
FIG. 8 shows a flow diagram of the implant tool control algorithm in one non-limiting embodiment. At step S801 the control algorithm takes the camera intrinsic parameters of the optical system as input. The series of processes in step S802 is executed first: place the calibration plate in the surgical area, ensuring the surgical area lies within the movement range of the tungsten-wire fine-tuning electrode; adjust the camera's imaging conditions for clear imaging of the brain surface; obtain the camera's extrinsic parameters from the calibration plate's position, then remove the plate, keeping the surgical area's position unchanged. Next, step S803 monitors the tungsten wire's falling position in real time, stopping when it touches the brain surface and recording the wire's landing-point position in the current image. Step S804 identifies blood vessels and selects the implantation point manually under the constraints of the implantation sequence 2040; the selection can also be made automatically or with assistance by the system. Step S805 performs coordinate conversion, converting the two image coordinates to world coordinates, specifically following "pixel coordinate system → image coordinate system → camera coordinate system → world coordinate system". Step S806 calculates the implant device's movement vector toward the electrode target and decomposes it into the two directions of electrode movement, for example finely controlling the two micro stepper motors in the x and y directions in addition to the z-axis stepper motor. Step S807 then drives the aforementioned motors in the different directions to move the tungsten wire to be implanted, and step S808 repeats the processing of step S803.
Next, a judgment is performed at step S809: when no blood vessel is present at the tungsten-wire landing position (S809 determined "No"), the process continues to another judgment at step S810, namely whether the difference between the tungsten-wire landing point and the implantation point is within the acceptable error range. If step S809 is determined "Yes", i.e., a blood vessel is present at the landing position in the current image, the process returns to S804 to select a new implantation point meeting the requirements. Then at step S810, a result of "Yes" indicates that the implant tool control algorithm has navigated successfully and electrode implantation can begin. If the judgment result of step S810 is "No", the error between the tungsten-wire landing position in the current image and the selected implantation point is unacceptable, and the process returns to S805 for renewed coordinate conversion to obtain a more accurate landing position.
FIG. 9 is a step diagram of the machine vision-based electrode implantation method according to an embodiment of the present disclosure. Based on the foregoing, the electrode implantation method 9000 mainly comprises the following steps. At step S901, a first image is captured of the brain surface by a first camera, and a second image is captured of the brain surface by a second camera. At step S902, the first and second images are computationally processed, wherein a brain-surface blood-vessel area mask is obtained based on the blood vessel segmentation algorithm to determine the implantable area in the brain-surface image. Next, at step S903, at least one implantation position is selected in the implantable area, and the distance between the at least one implantation position and the known electrode position is calculated, determining the electrode implantation order. Then, at step S904, the imaging of the first and second cameras is matched to obtain a transformation matrix; the first straight line on which the implantation position lies in the first camera's imaging is projected into the second camera's imaging, and the intersection of that line with the second straight line on which the implantation position lies in the second camera's imaging is determined as the predicted landing point of the implant device. Finally, at step S905, the implant device is controlled in real time according to the predicted landing point until the implantation point coincides with the predicted landing point.
Alternatively, the machine vision-based electrode implantation method and system disclosed in this application can also have other implementations.
In one example, the disclosed brain surface electrode implantation system need not use a two-camera binocular system for calibration; a monocular system can also perform position control in the plane of the implantation tool. Two implementations are possible: vertical observation and oblique observation. In vertical observation, the camera shoots from vertically above the brain surface to obtain a still image, in which the blood vessels are segmented; the implant device's position is controlled through the transformation between pixel and robotic-arm coordinates and moved above the brain surface for implantation. In this implementation, an obliquely placed observation camera may be added to facilitate monitoring the implantation. In oblique observation, the camera is placed tilted relative to the brain surface, yielding an angled picture; the implant device's position is obtained via coordinate transformation, and the electrode is controlled to move into position for implantation. By comparison, owing to the imaging angle, vertical observation offers better control accuracy and vessel-segmentation accuracy than oblique observation.
If a monocular system is used, calibration mainly employs the calibration-plate method and the nine-point calibration method to meet the system's target requirements. The calibration-plate method places a calibration plate immediately adjacent to the surgical area; after the camera's extrinsic parameters are obtained, the conversion between image and actual coordinates is calculated. The nine-point calibration method photographs the brain area at a fixed position, then moves the tungsten wire to nine specified points, recording the wire's position in the image and in the actual coordinate system at each point, thereby obtaining the coordinate transformation relationship; from it, the displacement the wire should move in actual space can be calculated from the image distance between the tungsten-wire landing point and the target point. Compared with the binocular system, both monocular methods require that the height of the brain surface from the camera remain unchanged, which affects their accuracy. In practice, since the brain surface itself is not a plane, the camera must adjust its height in real time, i.e., reach the same position above the brain surface with micron-level accuracy.
Words such as "front", "rear", "top", "bottom", "above", and "below" in the specification and claims, if present, are used for descriptive purposes and not necessarily to describe fixed relative positions. It should be understood that words so used are interchangeable where appropriate, so that embodiments of the disclosure described herein can, for example, operate in orientations other than those shown or otherwise described herein.
As used herein, the word "exemplary" means "serving as an example, instance, or illustration", not as a "model" to be reproduced exactly. Any implementation described herein as exemplary is not necessarily to be construed as preferred or advantageous over other implementations. Furthermore, the disclosure is not bound by any expressed or implied theory presented in the above technical field, background, summary, or detailed description.
As used herein, the word "substantially" includes any minor variation caused by design or manufacturing defects, device or component tolerances, environmental effects, and/or other factors. The word "substantially" also allows for differences from a perfect or ideal case due to parasitic effects, noise, and other practical considerations that may be present in actual implementations.
For reference purposes only, terms such as "first" and "second" may be used herein and are not intended as limitations. For example, unless the context clearly indicates otherwise, the words "first", "second", and other such numeric terms relating to structures or elements imply no sequence or order.
It should also be understood that the word "comprise/include", as used herein, specifies the presence of the stated features, integers, steps, operations, units, and/or components, but does not preclude the presence or addition of one or more other features, integers, steps, operations, units, and/or components, and/or combinations thereof.
As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the disclosure. As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Those skilled in the art will appreciate that the boundaries between the above operations are merely illustrative: multiple operations may be combined into a single operation, a single operation may be distributed among additional operations, and operations may be executed at least partially overlapping in time. Moreover, alternative embodiments may include multiple instances of particular operations, and the order of operations may be changed in various other embodiments. Other modifications, variations, and substitutions are likewise possible. The specification and drawings are therefore to be regarded as illustrative rather than restrictive.
Although some specific embodiments of the present disclosure have been described in detail by way of example, those skilled in the art should understand that the above examples are illustrative only and are not intended to limit the scope of the disclosure. The embodiments disclosed herein may be combined arbitrarily without departing from the spirit and scope of the disclosure. Those skilled in the art should also understand that various modifications may be made to the embodiments without departing from the scope and spirit of the disclosure. The scope of the disclosure is defined by the appended claims.
Claims (30)
- A machine vision-based electrode implantation method, comprising: capturing a first image of the brain surface with a first camera and a second image of the brain surface with a second camera; computationally processing the first and second images, wherein a brain-surface blood-vessel area mask is obtained based on a blood vessel segmentation algorithm to determine an implantable area in the brain-surface image; selecting at least one implantation position in the implantable area and calculating the distance between the at least one implantation position and a known electrode position, thereby determining an electrode implantation order; matching the imaging of the first camera and the second camera to obtain a transformation matrix, projecting a first straight line on which the implantation position lies in the first camera's imaging into the second camera's imaging, and determining the intersection of the first straight line with a second straight line on which that implantation position lies in the second camera's imaging as the predicted landing point of an implant device; and controlling the implant device in real time according to the predicted landing point until the implantation point coincides with the predicted landing point.
- The electrode implantation method of claim 1, wherein the implant device is controlled in a fully supervised or semi-supervised manner according to the predicted landing point.
- The electrode implantation method of claim 1, wherein the first camera and the second camera have the same imaging plane.
- The electrode implantation method of claim 3, wherein the first camera and the second camera form an angle of about 90° in horizontal projection.
- The electrode implantation method of claim 1, wherein each of the first camera and the second camera comprises an optical system, and the first camera and the second camera are each coupled to a motion control system.
- The electrode implantation method of claim 5, wherein the optical system comprises an external light source that illuminates the brain surface uniformly.
- The electrode implantation method of claim 6, wherein the wavelength range of the external light source is 495 nm to 570 nm.
- The electrode implantation method of claim 5, further comprising: performing image processing by the first camera and the second camera respectively, and merging the image-processed data from the first camera and the second camera; and determining, from the merged data, the coordinates of the implant device in the imaging of the first camera and the second camera.
- The electrode implantation method of claim 8, wherein the motion control system controls movement of the implant device according to the coordinates.
- The electrode implantation method of claim 5, wherein the motion control system comprises three stepper motors for controlling movement of the implant device in an area substantially parallel to the electrode-implantation surgical area.
- 根据权利要求1所述的电极植入方法,其中所述脑血管分割算法包括以下步骤:将脑表面图像转变为灰度图;针对该灰度图根据自适应阈值进行分割;对分割后的灰度图去除小轮廓噪声;进行开运算,去除血管中的气泡状噪声图案;进行反运算;进行膨胀处理以获取血管区域边界处的安全距离;以及再次进行反运算。
- 根据权利要求11所述的电极植入方法,其中:基于要检测到的位点个数、点距、成像分辨率调整所述脑血管分割算法中的参数。
- 根据权利要求1所述的电极植入方法,其中:确定电极的植入顺序使得正在植入的电极不会对已植入电极施加作用力。
- 根据权利要求13所述的电极植入方法,其中:基于电极的植入顺序规划植入电极的路径,其中所述路径不发生交叉。
- 根据权利要求1所述的电极植入方法,其中:对第一相机与第二相机进行数据的特征匹配以进行标定。
- 根据权利要求15所述的电极植入方法,其中:使用方形滤波器实现高斯模糊的图像处理效果。
- 根据权利要求1所述的电极植入方法,其中:所述植入装置包括植入针、植入进给机构和植入执行机构,其中,所述植入针构造成用于将以其针头部分接合电极的自由端部,以便带动电极运动,所述植入进给机构构造成用于使植入针沿植入装置的纵向方向移动,以及所述植入执行机构构造成用于驱动植入针以将植入针的针头部分扎入脑部中。
- 根据权利要求17所述的电极植入方法,其中:所述植入装置配设有植入运动机构,所述植入运动机构构造成用于使所述植入装置能够从不同角度在不同的朝向下进行电极的植入。
- A machine-vision-based electrode implantation system, comprising: a first camera configured to capture a first image of a brain surface; a second camera configured to capture a second image of the brain surface; a blood-vessel segmentation computation unit configured to perform computational processing on the first image and the second image, wherein a brain-surface blood-vessel region mask is obtained based on a blood-vessel segmentation algorithm so as to determine an implantable region in the brain-surface image; an implantation-order determination unit configured to select at least one implantation position in the implantable region and to compute, from known electrode positions, the distance between the at least one implantation position and the electrode positions, thereby determining an implantation order for the electrodes; an implantation landing-point prediction unit configured to match the imaging of the first camera and the second camera to obtain a transformation matrix, to project a first line on which the implantation position lies in the imaging of the first camera into the imaging of the second camera, and to determine the intersection of the first line with a second line on which that implantation position lies in the imaging of the second camera as a predicted landing point of an implantation device; and an implantation-device control unit configured to control the implantation device in real time according to the predicted landing point until the implantation point coincides with the predicted landing point.
- The electrode implantation system according to claim 19, wherein: the implantation device is controlled in a fully supervised or semi-supervised manner according to the predicted landing point.
- The electrode implantation system according to claim 19, wherein: the first camera and the second camera have the same imaging plane.
- The electrode implantation system according to claim 19, wherein: each of the first camera and the second camera includes a respective optical system, and the first camera and the second camera are each coupled to a motion control system.
- The electrode implantation system according to claim 22, wherein: the optical system includes an external light source that illuminates the brain surface uniformly.
- The electrode implantation system according to claim 19, wherein: image processing is performed separately for the first camera and the second camera, and the image-processed data from the first camera and the second camera are merged; the coordinates of the implantation device in the imaging of the first camera and the second camera are determined from the merged data.
- The electrode implantation system according to claim 24, wherein: the motion control system controls movement of the implantation device according to the coordinates.
- The electrode implantation system according to claim 19, wherein the cerebral blood-vessel segmentation algorithm comprises the following steps: converting the brain-surface image into a grayscale image; segmenting the grayscale image according to an adaptive threshold; removing small-contour noise from the segmented grayscale image; performing an opening operation to remove bubble-like noise patterns within the vessels; performing an inversion operation; performing dilation to obtain a safety distance at the boundary of the vessel region; and performing the inversion operation again.
- The electrode implantation system according to claim 26, wherein: the implantation-order determination unit is further configured to determine the implantation order of the electrodes such that an electrode being implanted exerts no force on already implanted electrodes.
- The electrode implantation system according to claim 27, wherein: paths for implanting the electrodes are planned based on the implantation order of the electrodes, wherein the paths do not cross.
- The electrode implantation system according to claim 19, wherein: feature matching is performed on data from the first camera and the second camera for calibration.
- The electrode implantation system according to claim 19, wherein: the implantation device includes an implantation needle, an implantation feed mechanism, and an implantation actuation mechanism, wherein the implantation needle is configured to engage a free end of an electrode with its needle-tip portion so as to carry the electrode along, the implantation feed mechanism is configured to move the implantation needle in a longitudinal direction of the implantation device, and the implantation actuation mechanism is configured to drive the implantation needle so as to insert the needle-tip portion of the implantation needle into the brain.
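The blood-vessel segmentation pipeline recited in claims 11 and 26 can be sketched as follows. This is a simplified, illustrative sketch, not the disclosed implementation: a global-mean threshold stands in for the adaptive threshold, the small-contour removal step is left to the opening (which already suppresses small noise), and the claimed invert/dilate/invert sequence is folded into dilating the vessel mask and inverting once, which is equivalent; all function names are assumptions.

```python
import numpy as np

def dilate(mask, iters=1):
    """Binary dilation with a 3x3 structuring element."""
    out = mask.astype(bool)
    for _ in range(iters):
        p = np.pad(out, 1, constant_values=False)
        out = (p[:-2, :-2] | p[:-2, 1:-1] | p[:-2, 2:] |
               p[1:-1, :-2] | p[1:-1, 1:-1] | p[1:-1, 2:] |
               p[2:, :-2] | p[2:, 1:-1] | p[2:, 2:])
    return out

def erode(mask, iters=1):
    """Binary erosion with a 3x3 structuring element (dual of dilation)."""
    out = mask.astype(bool)
    for _ in range(iters):
        p = np.pad(out, 1, constant_values=True)
        out = (p[:-2, :-2] & p[:-2, 1:-1] & p[:-2, 2:] &
               p[1:-1, :-2] & p[1:-1, 1:-1] & p[1:-1, 2:] &
               p[2:, :-2] & p[2:, 1:-1] & p[2:, 2:])
    return out

def implantable_mask(gray, margin=2):
    """Implantable-region mask from a grayscale brain-surface image."""
    vessels = gray < gray.mean()       # threshold: vessels image darker than background
    vessels = dilate(erode(vessels))   # opening: remove bubble-like noise in the vessels
    vessels = dilate(vessels, margin)  # grow vessels: safety distance at the boundary
    return ~vessels                    # invert: True where implantation is allowed
```

The `margin` parameter plays the role of the safety distance at the vessel boundary; as claims 12 notes for the real algorithm, such parameters would be tuned against site count, site spacing, and imaging resolution.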
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210696347.9A CN115068082A (zh) | 2022-06-20 | 2022-06-20 | Machine-vision-based electrode implantation method and system |
CN202210696347.9 | 2022-06-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023245708A1 | 2023-12-28 |
Family
ID=83252767
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/102359 WO2023245708A1 (zh) | 2022-06-20 | 2022-06-29 | Machine-vision-based electrode implantation method and system |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN115068082A (zh) |
WO (1) | WO2023245708A1 (zh) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117982212A (zh) * | 2024-04-03 | 2024-05-07 | 北京智冉医疗科技有限公司 | Electrode implantation apparatus and electrode implantation method |
CN117982211A (zh) * | 2024-04-03 | 2024-05-07 | 北京智冉医疗科技有限公司 | Electrode implantation apparatus and electrode implantation method |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116473673B (zh) * | 2023-06-20 | 2024-02-27 | 浙江华诺康科技有限公司 | Endoscope path planning method, apparatus, system, and storage medium |
CN117789923B (zh) * | 2024-02-23 | 2024-05-31 | 湖南安泰康成生物科技有限公司 | Method, apparatus, device, system, and storage medium for determining an electrode-patch application scheme |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210007808A1 (en) * | 2019-07-12 | 2021-01-14 | Neuralink Corp. | Optical coherence tomography for robotic brain surgery |
CN113017566A (zh) * | 2021-02-26 | 2021-06-25 | 北京伟浩君智能技术有限公司 | Image-based blood vessel recognition and positioning method and apparatus |
CN113797440A (zh) * | 2021-09-27 | 2021-12-17 | 首都医科大学附属北京天坛医院 | Automatic deep-brain electrode implantation system based on imaging and real-time electrophysiological positioning |
CN215691052U (zh) * | 2021-09-27 | 2022-02-01 | 首都医科大学附属北京天坛医院 | Automatic deep-brain electrode implantation system based on imaging and real-time electrophysiological positioning |
CN115461781A (zh) * | 2020-02-20 | 2022-12-09 | 得克萨斯州大学系统董事会 | Methods for optimized planning and placement of probes in the brain via multimodal 3D analyses of cerebral anatomy |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4742356B2 (ja) * | 2005-02-02 | 2011-08-10 | 独立行政法人産業技術総合研究所 | Implantable electrode device and electrode implantation device |
KR20200054937A (ko) * | 2017-07-17 | 2020-05-20 | 아이스 뉴로시스템즈 아이엔씨 | Systems and methods for placing an intracranial device using brain activity |
US11291508B2 (en) * | 2018-09-14 | 2022-04-05 | Neuralink, Corp. | Computer vision techniques |
TWI680744B (zh) * | 2018-10-04 | 2020-01-01 | 臺北榮民總醫院 | Method and system for locating intracranial electrodes |
CN111631813B (zh) * | 2020-05-27 | 2021-08-17 | 武汉联影智融医疗科技有限公司 | Automatic ordering method, ordering system, device, and computer-readable storage medium for implantable electrodes |
CN114259205A (zh) * | 2020-09-16 | 2022-04-01 | 中国科学院脑科学与智能技术卓越创新中心 | Brain cognitive function detection system |
2022
- 2022-06-20 CN CN202210696347.9A patent/CN115068082A/zh active Pending
- 2022-06-29 WO PCT/CN2022/102359 patent/WO2023245708A1/zh unknown
Also Published As
Publication number | Publication date |
---|---|
CN115068082A (zh) | 2022-09-20 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22947471 Country of ref document: EP Kind code of ref document: A1 |