CN117297769A - Bone layer identification method in hard bone tissue operation - Google Patents
- Publication number
- CN117297769A (application number CN202311034845.8A)
- Authority
- CN
- China
- Prior art keywords
- bone
- hard
- force
- model
- different
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/50—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/16—Bone cutting, breaking or removal means other than saws, e.g. Osteoclasts; Drills or chisels for bones; Trepans
- A61B17/17—Guides or aligning means for drills, mills, pins or wires
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/76—Manipulators having means for providing feel, e.g. force or tactile feedback
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/213—Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/243—Classification techniques relating to the number of classes
- G06F18/2431—Multiple classes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/20—Design optimisation, verification or simulation
- G06F30/23—Design optimisation, verification or simulation using finite element methods [FEM] or finite difference methods [FDM]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/20—Design optimisation, verification or simulation
- G06F30/27—Design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0464—Convolutional networks [CNN, ConvNet]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/108—Computer aided selection or customisation of medical implants or cutting guides
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
- G06F2218/08—Feature extraction
- G06F2218/10—Feature extraction by analysing the shape of a waveform, e.g. extracting parameters relating to peaks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
- G06F2218/12—Classification; Matching
Abstract
The invention provides a bone layer identification method for hard bone tissue surgery. A finite element simulation model of the surgery is established: the hard bone tissue is geometrically modeled, discretized into a finite element mesh, its material characteristics are defined, and the constraints of the operating process are introduced into the model. Corresponding interaction force changes are obtained for different parameters. Real-time interaction force signals between the surgical tool and the hard bone tissue are collected, and several different types of bone layer features are selected from them. A force interaction curve is then obtained from the finite element simulation model, and a Gaussian mixture model is established to describe the distribution of the different bone layer features. By training a network model, different bone layers can be accurately identified. The invention solves the problem of distinguishing cortical bone from cancellous bone during hard bone tissue surgery.
Description
Technical Field
The invention provides a bone layer identification method in hard bone tissue surgery, and belongs to the technical field of image processing.
Background
Conventional bone layer identification relies on imaging methods such as X-ray, CT scanning and magnetic resonance imaging to provide image information about the bone structure, but accurately locating the inner cortical bone remains difficult. The inner cortical bone lies close to nerves and is often obscured by soft tissue, fat and blood vessels, making accurate identification during surgery difficult. In addition, position information about the inner cortical bone can be extracted automatically or semi-automatically from medical images using image processing and computer vision techniques. Common methods include edge detection, thresholding, region growing and morphological operations. These methods identify and segment the bone structure based on its characteristics, density and morphology, providing more accurate surgical navigation and localization.
Image-based bone layer identification, however, can only be performed before the operation; it cannot identify bone layers from signals in real time. It also requires doctors to label images one by one, which creates a large workload.
One piece of prior art (Hand-held bone cutting tool with autonomous penetration detection for spinal surgery, IEEE) proposes a hand-held bone-cutting tool system that detects workpiece penetration from force. The system learns the cutting and moving states from surgeon demonstrations, autonomously detects workpiece penetration, and stops driving the cutting tool immediately before complete penetration. The proposed penetration detection scheme requires no knowledge of the shape or position of the workpiece, and therefore needs no expensive equipment such as robotic arms or position sensor systems. It can also easily be applied to cutting tools of various shapes. The developed system was evaluated experimentally, and the results show satisfactory performance in both powered and hand-held settings. This technique has the following disadvantages:
1. The method applies only to handheld tools and only distinguishes the surgeon's operations; it cannot identify bone layers.
2. The surgical tool must be controllable in real time and therefore customized, which is impractical for hospital use.
CN202210471044.7 provides a robot milling chatter type identification method based on power spectrum entropy difference, comprising the following steps: during robot milling, collect the original vibration signal at the robot end effector; determine the optimal modal decomposition number and decompose the original vibration signal into several sub-signals accordingly; mark the sub-signal whose center frequency is close to the natural frequency of the tool–spindle system as B1 and the remaining sub-signals below the natural frequency as A1; filter the spindle rotation frequency and its harmonics out of A1 and B1 to obtain A2 and B2; calculate the power spectrum entropy of each signal and from it the power spectrum entropy difference; determine the optimal classification threshold of the power spectrum entropy difference and identify the chatter type. That invention comprehensively considers regenerative chatter caused by the flexibility of the tool–spindle structure and mode-coupling chatter caused by insufficient rigidity of the robot structure, and thereby identifies the robot milling chatter type. This technique has the following disadvantages:
1. Vibration signals are used for identification, but they are strongly affected by the environment, and the inhomogeneity of hard bone material makes effective vibration features difficult to obtain.
2. Modal decomposition of vibration signals tends to lose important features.
3. The method is designed for metal machining; the natural frequency of hard bone tissue is low, so it cannot be applied to hard bone tissue.
Another piece of prior art (Force perception and bone recognition of vertebral lamina milling by robot-assisted ultrasonic bone scalpel based on backpropagation neural network, IEEE) proposes a robotic system that measures lamina milling forces with an ultrasonic bone scalpel and implements a safe milling strategy. The bone recognition model, based on a backpropagation neural network, suits robot-assisted lamina milling using milling layering and recognition algorithms. The model takes the characteristic milling force, milling speed, milling depth and ultrasonic scalpel power as inputs to determine whether milling has reached cortical bone, thereby identifying and judging bone layers. Live animal experiments showed that the model can accurately determine the safe milling endpoint. Overall, the recognition model can significantly improve the safety and reliability of robot-assisted laminectomy and has strong translational prospects. This technique has the following disadvantages:
1. BP neural networks typically require a large amount of labeled data to train, especially for complex tasks and multi-class classification. If labeled data is limited, the network may underfit and fail to fully exploit the information in the data.
2. If the range of the input data varies greatly or is unevenly distributed, training and performance may suffer. The input data usually has to be normalized or standardized to reduce this effect.
Further prior art (State recognition of decompressive laminectomy with multiple information in robot-assisted surgery, IEEE) proposes a state recognition system for robot-assisted tele-surgery. By combining learning methods with conventional methods, the slave-side robot can reason about the current operation state like a surgeon and give the master-side surgeon more information and decision advice, helping the surgeon work more safely in teleoperation. For windowing, an image-based state recognition method is proposed, consisting of a U-Net-derived network, gray redistribution and dynamic receptive fields, which helps control the grinding process and prevents the grinding head from damaging the spinal nerves through the inner edge of the lamina. For internal fixation, a state identification method based on audio and force signals is proposed, consisting of signal feature extraction, LSTM-based prediction and information fusion; it assists in monitoring the drilling process to prevent the drill bit from penetrating the outer edge of the pedicle and damaging spinal nerves. This technique has the following disadvantages:
1. Using both sound and force signals requires more sensors.
2. LSTM prediction has low precision and a limited prediction range, making it unsuitable for bone layer identification.
Finally, prior art (Tactile perception for surgical status recognition in robot-assisted laminectomy, IEEE) proposes a surgical state sensing method, modeled on the human tactile sensing mechanism, that uses an acceleration sensor and a force sensor mounted on the robot. A Sinc convolution layer processes the high-frequency vibration signal, and the resulting features are processed together with the average force signal by a one-dimensional convolutional network. The method classifies the surgical state and outputs a burr-blockage probability. Experiments on animal bones verify the effectiveness of the proposed model, and fusing the two tactile signals was shown to significantly improve state recognition accuracy under varying milling parameters. This technique has the following disadvantages:
1. All data labels are based on subjective judgment and differ somewhat from the actual situation.
2. Using both acceleration and force sensors increases the computational load of the network.
3. The two signals are not weighted differently, even though the accuracies of the acceleration sensor and the force sensor are not identical.
Disclosure of Invention
The invention provides a bone layer identification method in hard bone tissue operation, which mainly aims to solve the problem of identification of cortical bone and cancellous bone in the hard bone tissue operation process.
The specific technical scheme provided by the invention is as follows:
1. a method for identifying a bone layer in a hard bone tissue operation, comprising the steps of:
step 1, establishing a finite element simulation model of hard bone tissue surgery; geometrically modeling the hard bone tissue, discretizing it into a finite element mesh, defining its material characteristics, and introducing the constraints of the operating process into the finite element model;
step 2, in the simulation process, different operation scenes are considered, and corresponding interaction force changes are obtained according to different parameters;
step 3, performing grinding, drilling and cutting operations by the hard bone tissue operation robot according to the preoperative planning; during the operation, collecting real-time interaction force signals of the surgical tool and the hard bone tissue;
step 4, selecting a plurality of different types of bone layer characteristics for the interaction force signals in order to adapt to different bone materials and different operation parameters; then, a force interaction curve is obtained through a finite element simulation model and is used for guiding labeling of outer cortical bone, cancellous bone and inner cortical bone under different bone layer characteristics;
step 5, establishing a Gaussian mixture model to describe the distribution condition of different bone layer characteristics;
and step 6, training the network model to accurately identify different bone layers. In practical application, the trained network model is used for bone layer identification of the real-time force signals.
Further, the material characteristics of the hard bone tissue in step 1 include elastic modulus, shear modulus and Poisson's ratio.
The specific method of step 3 is that a force sensor is mounted at the end of the robotic arm, and the surgical tool is mounted on the movable face of the force sensor; during execution of the operation, the force sensor senses changes in the force signal, and the interaction force signal between the surgical tool and the hard bone tissue is obtained by acquiring the sensor's output.
The bone layer characteristics described in step 4 are based on the amplitude, frequency, time domain and frequency domain characteristics of the force signal.
In step 5, the bone layer features are grouped into three types: peak force, root-mean-square force and spectral features. The weights and probability density functions of the different features are then estimated from the feature distributions, and weight information for the different bone layer features is obtained by clustering and distribution modeling of the feature vectors; a neural network model is designed and trained based on the labeled data set and the bone layer features established by the Gaussian mixture model.
The neural network model is a deep learning model, comprising a convolutional neural network or a recurrent neural network, used to learn the association between the force signals and the bone layer categories.
Step 6, performing bone layer identification on the real-time force signals by using the trained network model; by inputting the real-time force signal into the network model, the model will output the corresponding bone layer category.
The invention mainly aims to provide a bone layer identification method for robot-assisted hard bone tissue surgery, solving the bone layer identification problem across a variety of hard bone tissue operations; it has the following technical effects:
1. Data annotation is guided by the finite element simulation model rather than by subjective judgment.
2. The finite element model suits many different hard bone tissues and can be adjusted to actual operating conditions.
3. Only a force sensor is used, rather than multi-sensor fusion, simplifying signal acquisition and processing.
4. Multiple feature types are established, which can improve training accuracy.
5. Feature weights are adjusted for each group of force-signal data through the Gaussian mixture model, maximizing the effectiveness of the signals.
6. The trained model can be used for real-time bone layer identification and for safety decision-making by the surgical robot.
Drawings
Fig. 1 is a flow chart of the present invention.
Detailed Description
The invention provides a bone layer identification method in hard bone tissue surgery. The whole flow is as shown in figure 1, and the specific steps are as follows:
In step 1, a finite element simulation model of the hard bone tissue operation is established. The hard bone tissue is geometrically modeled and discretized into a finite element mesh, and its material characteristics, such as elastic modulus, shear modulus and Poisson's ratio, are defined. The constraints of the operating process are introduced into the finite element model.
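As an illustrative sketch (not part of the patent), the finite element idea of step 1 can be reduced to a one-dimensional layered bar: each bone layer is meshed into axial elements with its own elastic modulus, one end is fixed, and a prescribed tip displacement stands in for the tool constraint. All layer thicknesses and moduli below are placeholder values chosen for illustration.

```python
import numpy as np

# Illustrative layer stack: (name, thickness in mm, elastic modulus in MPa).
# All values are placeholders, not data from the patent.
layers = [
    ("outer cortical", 2.0, 17000.0),
    ("cancellous",     8.0,   400.0),
    ("inner cortical", 2.0, 17000.0),
]

n_per_layer = 10                       # elements per layer (the "mesh")
elems = []                             # (element length, elastic modulus)
for _name, t, E in layers:
    elems += [(t / n_per_layer, E)] * n_per_layer

n_nodes = len(elems) + 1
A = 1.0                                # unit cross-section, mm^2
K = np.zeros((n_nodes, n_nodes))       # assemble the global stiffness matrix
for i, (L, E) in enumerate(elems):
    k = E * A / L
    K[i:i + 2, i:i + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])

# Constraints of the "operating process": node 0 fixed, a prescribed
# tool displacement u_tip at the far end.
u_tip = 0.05                           # mm
free = np.arange(1, n_nodes - 1)
u = np.zeros(n_nodes)
u[-1] = u_tip
u[free] = np.linalg.solve(K[np.ix_(free, free)], -K[free, -1] * u_tip)
reaction = K[-1] @ u                   # interaction force at the tool, N
print(f"tool reaction force: {reaction:.3f} N")
```

For this 1-D case the assembled model reproduces the series-spring stiffness of the three layers exactly; a real surgical simulation would of course use a 3-D mesh and contact constraints.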
In step 2, different operation scenarios, such as bone cutting, nailing and screw fastening, are considered during the simulation. The surgical tool and the surgical operation are also given different parameters according to actual conditions, such as tool diameter, cutting angle, feed rate, drilling speed and feed angle. The corresponding interaction force changes are obtained for the different parameters.
And 3, performing grinding, drilling, cutting and other operations by the robot according to preoperative planning. During the execution of the operation, real-time interactive force signals of the surgical tool and the hard bone tissue are acquired. Specifically, the end of the mechanical arm is provided with a force sensor, and the surgical tool is arranged on the movable end face of the force sensor. During the operation execution, the force sensor senses the change of the force signal, and the interaction force signal of the operation tool and the hard bone tissue can be acquired by acquiring the signal sensed by the force sensor.
And 4, selecting a plurality of different types of bone layer characteristics for the interactive force signals to adapt to different bone materials and different operation parameters. These characteristics may be based on amplitude, frequency, time and frequency domain characteristics of the force signal, and the like. For example, peak force, root mean square force, frequency spectral features, time domain statistics, etc. may be selected as bone layer features. And then, a force interaction curve obtained through a finite element simulation model is used for guiding labeling of the outer cortical bone, the cancellous bone and the inner cortical bone under different bone layer characteristics, and the purpose of labeling is to determine corresponding change conditions of different bone layers under different characteristics.
And 5, establishing a Gaussian mixture model (Gaussian Mixture Model, GMM) to describe the distribution condition of different bone layer characteristics. The bone layer features are respectively set into three types of peak force, root mean square force and frequency spectrum features, then the weights and probability density functions of different features are estimated according to the feature distribution conditions, and the weight information of the different bone layer features is obtained by clustering and distribution modeling of feature vectors. And designing and training a proper neural network model based on the bone layer characteristics established by the marked data set and the Gaussian mixture model. This may be a deep learning model, such as a Convolutional Neural Network (CNN) or a Recurrent Neural Network (RNN), for learning the association between the force signal and the bone layer categories.
And step 6, training the network model to accurately identify different bone layers. In practical application, the trained network model is used for bone layer identification of the real-time force signals. By inputting the real-time force signal into the network model, the model will output the corresponding bone layer category. The method can realize real-time bone layer identification and can make corresponding decisions or control according to the requirements.
Claims (7)
1. A method for identifying a bone layer in a hard bone tissue operation, comprising the steps of:
step 1, establishing a finite element simulation model of hard bone tissue surgery; geometrically modeling the hard bone tissue, discretizing it into a finite element mesh, defining its material characteristics, and introducing the constraints of the operating process into the finite element model;
step 2, in the simulation process, different operation scenes are considered, and corresponding interaction force changes are obtained according to different parameters;
step 3, performing grinding, drilling and cutting operations by the hard bone tissue operation robot according to the preoperative planning; during the operation, collecting real-time interaction force signals of the surgical tool and the hard bone tissue;
step 4, selecting a plurality of different types of bone layer characteristics for the interaction force signals in order to adapt to different bone materials and different operation parameters; then, a force interaction curve is obtained through a finite element simulation model and is used for guiding labeling of outer cortical bone, cancellous bone and inner cortical bone under different bone layer characteristics;
step 5, establishing a Gaussian mixture model to describe the distribution of the different bone layer features;
and step 6, training the network model to accurately identify the different bone layers, the trained network model being used in practical application for bone layer identification on real-time force signals.
2. The method for identifying bone layers in hard bone tissue surgery according to claim 1, wherein the material properties of the hard bone tissue in step 1 include elastic modulus, shear modulus and Poisson's ratio.
3. The method for identifying bone layers in hard bone tissue surgery according to claim 1, wherein the specific method of step 3 is: a force sensor is mounted at the end of the mechanical arm, and the surgical tool is mounted on the movable end face of the force sensor; during execution of the operation, the force sensor senses changes in the force signal, and the interaction force signal between the surgical tool and the hard bone tissue is obtained by acquiring the signal sensed by the force sensor.
4. The method according to claim 1, wherein the bone layer features of step 4 are based on the amplitude, frequency, time-domain and frequency-domain characteristics of the force signal.
5. The bone layer identification method in hard bone tissue surgery according to claim 1, wherein in step 5 the bone layer features are set as three types, namely peak force, root-mean-square force and spectral features; the weights and probability density functions of the different features are then estimated from their distributions, and weight information for the different bone layer features is obtained by clustering and distribution modeling of the feature vectors; and a neural network model is designed and trained on the labeled data set and the bone layer features established by the Gaussian mixture model.
6. The method for identifying bone layers in hard bone tissue surgery according to claim 5, wherein the neural network model is a deep learning model comprising a convolutional neural network or a recurrent neural network for learning the association between the force signal and the bone layer categories.
7. The method for bone layer identification in hard bone tissue surgery according to claim 1, wherein in step 6 the trained network model is used to perform bone layer identification on the real-time force signal: the real-time force signal is input into the network model, and the model outputs the corresponding bone layer category.
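The weight and density estimation described in step 5 (claim 5) can be sketched with a minimal expectation-maximization loop for a one-dimensional, two-component Gaussian mixture. This is an assumption-laden illustration: in practice the full three-feature vectors and a library implementation (e.g. scikit-learn's GaussianMixture) would be used, and the data below are made up for demonstration.

```python
import math

def fit_gmm_1d(x, iters=100):
    """Minimal EM for a two-component 1-D Gaussian mixture: estimates
    component weights, means and variances, as in the weight/density
    estimation of step 5. Illustrative only."""
    mu = [min(x), max(x)]          # crude initialization at the extremes
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for xi in x:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(xi - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means and variances.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(x)
            mu[k] = sum(r[k] * xi for r, xi in zip(resp, x)) / nk
            var[k] = max(sum(r[k] * (xi - mu[k]) ** 2
                             for r, xi in zip(resp, x)) / nk, 1e-6)
    return w, mu, var
```

The fitted weights play the role of the per-feature weight information that claim 5 obtains by clustering and distribution modeling of the feature vectors.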
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311034845.8A CN117297769A (en) | 2023-08-17 | 2023-08-17 | Bone layer identification method in hard bone tissue operation |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117297769A true CN117297769A (en) | 2023-12-29 |
Family
ID=89272669
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311034845.8A Pending CN117297769A (en) | 2023-08-17 | 2023-08-17 | Bone layer identification method in hard bone tissue operation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117297769A (en) |
- 2023-08-17 CN CN202311034845.8A patent/CN117297769A/en active Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11642179B2 (en) | Artificial intelligence guidance system for robotic surgery | |
US10342622B2 (en) | System and method for estimating the spatial position of a tool within an object | |
US10058392B2 (en) | Neural monitor-based dynamic boundaries | |
US11202676B2 (en) | Neural monitor-based dynamic haptics | |
Sun et al. | State recognition of decompressive laminectomy with multiple information in robot-assisted surgery | |
Li et al. | Tactile perception for surgical status recognition in robot-assisted laminectomy | |
Dai et al. | Human-inspired haptic perception and control in robot-assisted milling surgery | |
Zahedi et al. | Towards skill transfer via learning-based guidance in human-robot interaction: An application to orthopaedic surgical drilling skill | |
Xia et al. | Vertebral lamina state estimation in robotic bone milling process via vibration signals fusion | |
Li et al. | Grinding trajectory generator in robot-assisted laminectomy surgery | |
Qu et al. | Force perception and bone recognition of vertebral lamina milling by robot-assisted ultrasonic bone scalpel based on backpropagation neural network | |
Qi et al. | An automatic path planning method of pedicle screw placement based on preoperative CT images | |
Ying et al. | Bone milling: on monitoring cutting state and force using sound signals | |
Jin et al. | Model-based state recognition of bone drilling with robotic orthopedic surgery system | |
CN117297769A (en) | Bone layer identification method in hard bone tissue operation | |
Osa et al. | Autonomous penetration detection for bone cutting tool using demonstration-based learning | |
Li et al. | State sensing of spinal surgical robot based on fusion of sound and force signals | |
Ying et al. | Autonomous penetration perception for bone cutting during laminectomy | |
CN115645063A (en) | Vertebral plate cutting control method and surgical robot | |
CN115422838A (en) | Autonomous learning method, apparatus, device and medium for surgical robot | |
Xia et al. | Tactile Perception Based Depth and Angle Control During Robot-Assisted Bent Bone Grinding | |
CN113951988B (en) | Grinding method, device and system for ultrasonic bone knife | |
Bian et al. | Robotic Automatic Drilling for Craniotomy: Algorithms and In Vitro Animal Experiments | |
Li et al. | Impedance Control of Robot Bone Penetration based on Self-adaptive Shutdown Discrimination | |
Zhang et al. | Safety control strategy of spinal lamina cutting based on force and cutting depth signals |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||