CN116570367A - Intelligent sensing prediction method device and equipment for bone grinding and bone quality of robot operation - Google Patents
- Publication number
- CN116570367A (application number CN202310532076.8A)
- Authority
- CN
- China
- Prior art keywords
- bone
- signal
- grinding
- cross
- moment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0464—Convolutional networks [CNN, ConvNet]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/16—Bone cutting, breaking or removal means other than saws, e.g. Osteoclasts; Drills or chisels for bones; Trepans
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01L—MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
- G01L3/00—Measuring torque, work, mechanical power, or mechanical efficiency, in general
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01P—MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
- G01P3/00—Measuring linear or angular speed; Measuring differences of linear or angular speeds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/16—Bone cutting, breaking or removal means other than saws, e.g. Osteoclasts; Drills or chisels for bones; Trepans
- A61B2017/1602—Mills
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/108—Computer aided selection or customisation of medical implants or cutting guides
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Abstract
The application provides an intelligent perception prediction method and device for the quality of bone ground during robotic surgical operation, together with an electronic device and a computer-readable storage medium. The intelligent perception prediction method comprises the following steps: acquiring a speed signal and a torque signal of the motor while the bone tissue to be identified is being ground; calculating cross-correlation features between the speed signal and the torque signal; and inputting the speed signal, the torque signal and the cross-correlation features into a preset bone tissue grinding identification model, which outputs a corresponding bone tissue identification result. According to the embodiments of the application, the quality of ground bone tissue can be identified more accurately.
Description
Technical Field
The application belongs to the technical field of deep learning intelligent recognition, and particularly relates to an intelligent perception prediction method and device for bone grinding bone quality of robot operation, electronic equipment and a computer readable storage medium.
Background
Currently, bone tissue is identified during bone grinding mainly from an acquired time-domain signal or frequency-domain signal. Such a single-signal approach cannot identify the ground bone tissue accurately.
How to identify ground bone tissue more accurately is therefore a technical problem to be solved by those skilled in the art.
Disclosure of Invention
The embodiment of the application provides a robot operation bone grinding bone intelligent perception prediction method, a device, electronic equipment and a computer readable storage medium, which can more accurately identify bone tissue grinding bone.
In a first aspect, an embodiment of the present application provides a method for intelligent sensing and predicting bone grinding bone quality in a robotic surgical operation, including:
acquiring a speed signal and a moment signal of a motor under the condition that bone tissues to be identified grind bone;
calculating a cross-correlation characteristic between the speed signal and the torque signal;
the speed signal, the moment signal and the cross-correlation characteristic are input into a preset bone tissue grinding bone identification model, and a corresponding bone tissue grinding bone identification result is output.
Optionally, before inputting the speed signal, the moment signal and the cross-correlation feature into the preset bone tissue grinding bone identification model, the method further comprises:
respectively acquiring speed signals and moment signals of a motor under three conditions of air, cancellous bone and cortical bone;
data preprocessing is carried out on the speed signal and the moment signal;
for each case, calculating the cross correlation characteristic between the speed signal and the moment signal after data preprocessing;
generating a data set based on the speed signal, the moment signal and the cross-correlation characteristic after data preprocessing;
and performing model training on the convolutional neural network by using the data set to obtain a bone tissue grinding bone identification model.
Optionally, the data preprocessing for the speed signal and the torque signal includes:
labeling speed signals and moment signals of the motor under three conditions of air, cancellous bone and cortical bone respectively;
and segmenting the labeled signals at the same time interval and sampling frequency to produce individual data samples.
Optionally, for each case, calculating a cross-correlation characteristic between the data-preprocessed speed signal and the torque signal includes:
carrying out mean value removal treatment on the sample data;
and dividing the sample data subjected to the mean value removal processing into a matrix with uniform size by using a sliding window method, and calculating the cross-correlation characteristic between the speed signal and the moment signal in each acquired sliding window.
Optionally, the sliding window method is used to divide the sample data after the mean value removal process into matrices with uniform size, and the cross-correlation feature between the velocity signal and the moment signal in each sliding window is calculated, which includes:
calculating the average value of each row in each matrix;
the cross-correlation characteristic between the speed signal and the torque signal is calculated based on the average value of each row in each matrix.
Optionally, the convolutional neural network includes five groups of network structures, a Dropout layer, a Flatten layer, a fully connected layer and a Softmax layer connected in sequence, where each group of network structures includes a convolutional layer, a Dropout layer and a pooling layer connected in sequence.
Optionally, in the model training process, setting a Dropout parameter to be 0.5;
setting a loss function as a multi-classification loss function;
setting an optimization function as SGD;
setting a linear rectification ReLU activation function in a convolution layer;
setting the number of neurons of the full connection layer as 512;
the maximum number of rounds of model training is 1000 rounds, the batch size is 32, the initial learning rate is 0.001, and the weight is updated by using an Adam optimization function.
In a second aspect, an embodiment of the present application provides a bone tissue grinding bone identification device, the device comprising:
the signal acquisition module is used for acquiring a speed signal and a moment signal of the motor under the condition that bone tissues to be identified grind bones;
the cross-correlation characteristic calculation module is used for calculating the cross-correlation characteristic between the speed signal and the moment signal;
the result output module is used for inputting the speed signal, the moment signal and the cross-correlation characteristic into a preset bone tissue grinding bone recognition model and outputting a corresponding bone tissue grinding bone recognition result.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor and a memory storing computer program instructions;
the processor, when executing the computer program instructions, implements the intelligent perception prediction method for bone grinding and bone quality of robotic surgery operation according to the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having stored thereon computer program instructions that, when executed by a processor, implement the robotic surgical operation bone grinding bone intelligent perception prediction method according to the first aspect.
According to the intelligent perception prediction method and device for the bone grinding and bone quality of the robot operation, the electronic equipment and the computer readable storage medium, the bone grinding and bone quality of the bone tissue can be more accurately identified.
The intelligent perception prediction method obtains a speed signal and a torque signal of the motor while the bone tissue to be identified is being ground; calculates cross-correlation features between the speed signal and the torque signal; and inputs the speed signal, the torque signal and the cross-correlation features into a preset bone tissue grinding identification model, which outputs a corresponding identification result.
In this way the method exploits the cross-correlation between multiple signals, rather than relying on a single signal, to identify the bone tissue being ground, and can therefore identify it more accurately than the prior art.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described, and it is obvious that the drawings in the description below are some embodiments of the present application, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for intelligent perception prediction of bone grinding and bone quality for robotic surgical procedures according to one embodiment of the present application;
FIG. 2 is a flow chart of a method for intelligent perception prediction of bone grinding and bone quality for robotic surgical procedures according to one embodiment of the present application;
FIG. 3 is a graph of signal characteristics for three cases provided by one embodiment of the present application;
FIG. 4 is a schematic diagram of a sliding window method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a 1D-CNN neural network according to an embodiment of the present application;
FIG. 6 is a schematic diagram showing a structure of a bone tissue grinding bone recognition device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Features and exemplary embodiments of various aspects of the present application will be described in detail below, and in order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described in further detail below with reference to the accompanying drawings and the detailed embodiments. It should be understood that the particular embodiments described herein are meant to be illustrative of the application only and not limiting. It will be apparent to one skilled in the art that the present application may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the application by showing examples of the application.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising … …" does not exclude the presence of other like elements in a process, method, article or apparatus that comprises the element.
Currently, bone tissue is identified during bone grinding mainly from an acquired time-domain signal or frequency-domain signal. Such a single-signal approach cannot identify the ground bone tissue accurately.
In order to solve the problems in the prior art, the embodiment of the application provides a robot operation bone grinding bone intelligent perception prediction method, a device, equipment and a computer readable storage medium. The following first describes the intelligent sensing and predicting method for bone grinding and bone quality of the robotic surgery operation provided by the embodiment of the application.
Fig. 1 shows a flow chart of an intelligent perception prediction method for bone grinding bone quality of a robotic surgical operation according to an embodiment of the present application. As shown in fig. 1, the intelligent perception prediction method for bone grinding bone quality of the robotic surgery operation comprises the following steps:
s101, acquiring a speed signal and a moment signal of a motor under the condition that bone tissues to be identified grind bone;
s102, calculating the cross correlation characteristic between the speed signal and the moment signal;
s103, inputting the speed signal, the moment signal and the cross-correlation characteristic into a preset bone tissue grinding bone identification model, and outputting a corresponding bone tissue grinding bone identification result.
In one embodiment, the method further comprises, prior to inputting the speed signal, the moment signal and the cross-correlation feature into the preset bone tissue grinding bone identification model:
respectively acquiring speed signals and moment signals of a motor under three conditions of air, cancellous bone and cortical bone;
data preprocessing is carried out on the speed signal and the moment signal;
for each case, calculating the cross correlation characteristic between the speed signal and the moment signal after data preprocessing;
generating a data set based on the speed signal, the moment signal and the cross-correlation characteristic after data preprocessing;
and performing model training on the convolutional neural network by using the data set to obtain a bone tissue grinding bone identification model.
Specifically, as shown in fig. 2, data acquisition and data preprocessing are performed first, and the cross-correlation features are then calculated to generate a data set. A model is trained on the data set, and the trained model is used to identify the bone tissue being ground.
The acquired signal data comprise speed signals and torque signals of the motor under three conditions: air, cancellous bone and cortical bone. The signal characteristics for the three cases are shown in fig. 3.
Generating the data set proceeds as follows: each sample contains 2000 raw signal values, and 2000 cross-correlation features are calculated, so each final sample contains 4000 values.
The sample data are divided into training, test and validation sets in a 7:1:2 ratio.
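The 7:1:2 split described above can be sketched as follows; the function name, the shuffling scheme and the fixed seed are illustrative assumptions, not part of the patent.

```python
import numpy as np

def split_dataset(X, y, ratios=(0.7, 0.1, 0.2), seed=42):
    # Shuffle once, then cut into train : test : validation = 7 : 1 : 2.
    idx = np.random.default_rng(seed).permutation(len(X))
    n_train = int(len(X) * ratios[0])
    n_test = int(len(X) * ratios[1])
    tr = idx[:n_train]
    te = idx[n_train:n_train + n_test]
    va = idx[n_train + n_test:]
    return (X[tr], y[tr]), (X[te], y[te]), (X[va], y[va])

X = np.zeros((100, 4000))                 # 100 samples of 4000 values each
y = np.repeat([0, 1, 2], [34, 33, 33])    # air / cancellous / cortical labels
train, test, valid = split_dataset(X, y)
```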
In one embodiment, data preprocessing of the speed signal and the torque signal includes:
labeling speed signals and moment signals of the motor under three conditions of air, cancellous bone and cortical bone respectively;
and segmenting the labeled signals at the same time interval and sampling frequency to produce individual data samples.
Specifically, the data preprocessing includes: labeling the data acquired under the three conditions of air, cancellous bone and cortical bone, where signals acquired in air are labeled 0, signals acquired in cancellous bone are labeled 1, and signals acquired in cortical bone are labeled 2. The signal data are then segmented into individual samples over equal time intervals; with the interval set to 2 seconds and a sampling frequency of 500 Hz, one data sample contains 2000 values (1000 points for each of the two signal channels).
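The labeling-and-windowing step just described can be sketched as follows. The function and constant names are illustrative; the 2 × 1000 sample layout (speed row, torque row) follows the 2 × m sample matrix described later in the specification.

```python
import numpy as np

FS = 500               # sampling frequency (Hz)
WIN = 2 * FS           # 2-second window -> 1000 points per channel

LABELS = {"air": 0, "cancellous": 1, "cortical": 2}

def make_samples(speed, torque, label):
    # Cut one labeled recording into fixed (2 x 1000) samples:
    # row 0 = speed signal, row 1 = torque signal -> 2000 values per sample.
    n_windows = len(speed) // WIN
    samples = [np.stack([speed[i * WIN:(i + 1) * WIN],
                         torque[i * WIN:(i + 1) * WIN]])
               for i in range(n_windows)]
    return samples, [label] * n_windows

speed = np.zeros(5000)    # a 10-second recording at 500 Hz
torque = np.ones(5000)
samples, labels = make_samples(speed, torque, LABELS["cancellous"])
```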
In one embodiment, for each case, calculating a cross-correlation characteristic between the data pre-processed speed signal and the torque signal includes:
carrying out mean value removal treatment on the sample data;
and dividing the sample data subjected to the mean value removal processing into a matrix with uniform size by using a sliding window method, and calculating the cross-correlation characteristic between the speed signal and the moment signal in each acquired sliding window.
Specifically, considering the characteristic relationship between the speed signal and torque signal sequences, the application provides a feature extraction method based on cross-correlation. The calculation proceeds as follows.

First, each sample data matrix is de-meaned (standardized) column by column:

H_ij = (x_ij − x̄_j) / σ_j

where x_ij denotes the sample data matrix, x̄_j the mean of each column of the sample data matrix, and σ_j the standard deviation of each column. The resulting matrix H_ij is then divided into matrices of uniform size using a sliding-window method, and the cross-correlation between the signals in each successive window is calculated.
In one embodiment, the method for dividing the sample data after the mean removal process into matrices with uniform size by using a sliding window method, and calculating the cross-correlation characteristic between the velocity signal and the moment signal in each acquired sliding window includes:
calculating the average value of each row in each matrix;
the cross-correlation characteristic between the speed signal and the torque signal is calculated based on the average value of each row in each matrix.
Fig. 4 is a schematic diagram of the sliding window method according to an embodiment of the present application. As shown in fig. 4, the matrix length, i.e. the sample sequence length, is m, and the sample matrix dimension is 2 × m. The sliding window size is chosen as L = 2N + 1 (the gray frame in the figure), where N is the length of one signal period; analysis of the signal data shows that one period spans about 100 data points, so N = 100 is set and the cross-correlation feature between two consecutive periods is calculated. The step size is set to e = 1, so that feature information is computed between every two consecutive data points. The gray frame indicates the range used for feature extraction at the i-th position in the signal, i.e. the cross-correlation feature is computed over the surrounding 2N adjacent data points.
The mean of each row within each window is then calculated:

H̄_m = (1/L) · Σ_{j=1}^{L} H_{m,j},  m = 1, 2

where j = 1, 2, …, L and L denotes the sliding window size. The normalized cross-correlation feature between the two rows is:

M_mn = Σ_{j=1}^{L} (H_{m,j} − H̄_m)(H_{n,j} − H̄_n) / √( Σ_{j=1}^{L} (H_{m,j} − H̄_m)² · Σ_{j=1}^{L} (H_{n,j} − H̄_n)² )

where M_mn denotes the cross-correlation feature between the m-th row and the n-th row of data, m, n = 1, 2, m ≠ n.
The cross-correlation feature M_i of the i-th column signal is expressed as:

M_i = (M_12, M_21)
Each column signal thus yields 2 features, and the cross-correlation feature of each data sample is denoted by M:

M = (M_1, M_2, ..., M_m)

where M represents the cross-correlation features of the sample. The number of features is 2·m; since m = 1000, each sample has 2000 cross-correlation features in total.
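The sliding-window cross-correlation computation can be sketched as follows. This is a hedged sketch under stated assumptions: rows are z-scored, the window L = 2N + 1 is centred on each column (clipped at the edges), the step is e = 1, and the per-window feature is taken as the Pearson correlation between the two rows, so M_12 = M_21 here; the patent's exact formula may differ.

```python
import numpy as np

def cross_corr_features(sample, N=100):
    # Z-score each row (speed, torque) of the 2 x m sample matrix.
    H = (sample - sample.mean(axis=1, keepdims=True)) \
        / sample.std(axis=1, keepdims=True)
    m = H.shape[1]
    feats = np.zeros((2, m))
    for i in range(m):
        # Window of size L = 2N + 1 centred at column i, clipped at the edges.
        lo, hi = max(0, i - N), min(m, i + N + 1)
        a, b = H[0, lo:hi], H[1, lo:hi]
        r = np.corrcoef(a, b)[0, 1]     # per-window correlation
        feats[0, i] = r                 # M12
        feats[1, i] = r                 # M21 (symmetric in this sketch)
    return feats                        # 2 * m = 2000 features for m = 1000

rng = np.random.default_rng(0)
sample = rng.normal(size=(2, 1000))     # one 2-channel sample
M = cross_corr_features(sample)
```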
In one embodiment, the convolutional neural network comprises five groups of network structures, a Dropout layer, a Flatten layer, a fully connected layer and a Softmax layer connected in sequence, where each group of network structures comprises a convolutional layer, a Dropout layer and a pooling layer connected in sequence.
In one embodiment, the Dropout parameter is set to 0.5 during model training;
setting a loss function as a multi-classification loss function;
setting an optimization function as SGD;
setting a linear rectification ReLU activation function in a convolution layer;
setting the number of neurons of the full connection layer as 512;
the maximum number of rounds of model training is 1000 rounds, the batch size is 32, the initial learning rate is 0.001, and the weight is updated by using an Adam optimization function.
Specifically, the application designs a one-dimensional CNN network that extracts signal features of the motor under the different conditions. To avoid overfitting during training, the Dropout technique is used with the Dropout parameter set to 0.5. Average pooling is selected; the loss function is a multi-classification loss function; the optimization function is SGD; a linear rectification (ReLU) activation function is used in the convolutional layers; the number of neurons in the fully connected layer is set to 512; and classification is finally performed by a softmax classifier. The maximum number of training rounds of the CNN network is 1000, the batch size is 32, the initial learning rate is 0.001, and the weights are updated using the Adam optimization function. The CNN network structure and the parameters of each convolutional layer, finally determined through experimental analysis and verification, are shown in fig. 5. The acquired original signals, i.e. the speed signals and torque signals, together with the calculated cross-correlation features, are assembled into data samples, converted into a one-dimensional format, and input into the neural network for training.
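A quick shape walk-through of the five conv-pool stages described above, under assumptions not stated in the patent (stride-1 "same" convolutions with kernel size 3, pooling of size 2, channel count ignored): the 4000-value input is halved five times before the Flatten layer.

```python
def conv1d_out(n, kernel=3, padding="same"):
    # Output length of a stride-1 1-D convolution.
    return n if padding == "same" else n - kernel + 1

def pool1d_out(n, size=2):
    # Output length of a 1-D pooling layer with stride == size.
    return n // size

n = 4000                      # 2000 raw signal values + 2000 features
for _ in range(5):            # five groups of conv -> dropout -> pool
    n = conv1d_out(n)         # "same" padding keeps the length
    n = pool1d_out(n)         # each pooling stage halves the length
flattened = n                 # length entering the Flatten layer
```

With these assumptions, flattened = 4000 / 2**5 = 125 per channel, which then feeds the 512-neuron fully connected layer and the 3-way softmax.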
According to the application, the cross correlation algorithm is used for calculating the cross correlation characteristics between the speed signal and the moment signal, and the structure of the CNN is specifically designed, so that the effective characteristics can be better extracted from the signal data, the accuracy of model classification is improved, and the signal recognition precision is further improved.
Fig. 6 is a schematic structural diagram of a bone tissue grinding identification device according to an embodiment of the present application, the device comprising:
the signal acquisition module 601 is used for acquiring a speed signal and a moment signal of a motor under the condition that bone tissues to be identified grind bone;
a cross-correlation feature calculation module 602, configured to calculate a cross-correlation feature between the speed signal and the torque signal;
the result output module 603 is configured to input the speed signal, the moment signal, and the cross-correlation feature into a preset bone tissue grinding bone identification model, and output a corresponding bone tissue grinding bone identification result.
Fig. 7 shows a schematic structural diagram of an electronic device according to an embodiment of the present application.
The electronic device may include a processor 701 and a memory 702 storing computer program instructions.
In particular, the processor 701 may comprise a Central Processing Unit (CPU), or an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), or may be configured as one or more integrated circuits implementing embodiments of the present application.
Memory 702 may include mass storage for data or instructions. By way of example, and not limitation, memory 702 may comprise a Hard Disk Drive (HDD), floppy Disk Drive, flash memory, optical Disk, magneto-optical Disk, magnetic tape, or universal serial bus (Universal Serial Bus, USB) Drive, or a combination of two or more of the foregoing. The memory 702 may include removable or non-removable (or fixed) media, where appropriate. The memory 702 may be internal or external to the electronic device, where appropriate. In a particular embodiment, the memory 702 may be a non-volatile solid state memory.
In one embodiment, memory 702 may be read-only memory (ROM). The ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically alterable ROM (EAROM), or flash memory, or a combination of two or more of these.
The processor 701 reads and executes the computer program instructions stored in the memory 702 to implement the intelligent perception prediction method for bone grinding bone quality in robotic surgery according to any of the above embodiments.
In one example, the electronic device may also include a communication interface 703 and a bus 710. As shown in fig. 7, the processor 701, the memory 702, and the communication interface 703 are connected by a bus 710 and perform communication with each other.
The communication interface 703 is mainly used for implementing communication between each module, device, unit and/or apparatus in the embodiment of the present application.
Bus 710 includes hardware, software, or both that couple the components of the electronic device to one another. By way of example, and not limitation, bus 710 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a Front Side Bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an InfiniBand interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association Local Bus (VLB), another suitable bus, or a combination of two or more of the above. Bus 710 may include one or more buses, where appropriate. Although embodiments of the application have been described and illustrated with respect to a particular bus, the application contemplates any suitable bus or interconnect.
In addition, in combination with the intelligent perception prediction method for bone grinding bone quality in robotic surgery of the above embodiments, the embodiments of the application may be realized as a computer-readable storage medium. The computer-readable storage medium stores computer program instructions; when executed by a processor, the computer program instructions implement the intelligent perception prediction method for bone grinding bone quality in robotic surgery of any of the above embodiments.
It should be understood that the application is not limited to the particular arrangements and instrumentalities described above and shown in the drawings. For the sake of brevity, a detailed description of known methods is omitted here. In the above embodiments, several specific steps are described and shown as examples. However, the method processes of the present application are not limited to the specific steps described and shown; those skilled in the art may make various changes, modifications, and additions, or change the order of steps, after appreciating the spirit of the present application.
The functional blocks shown in the above-described structural block diagrams may be implemented in hardware, software, firmware, or a combination thereof. When implemented in hardware, it may be, for example, an electronic circuit, an Application Specific Integrated Circuit (ASIC), suitable firmware, a plug-in, a function card, or the like. When implemented in software, the elements of the application are the programs or code segments used to perform the required tasks. The program or code segments may be stored in a machine readable medium or transmitted over transmission media or communication links by a data signal carried in a carrier wave. A "machine-readable medium" may include any medium that can store or transfer information. Examples of machine-readable media include electronic circuitry, semiconductor memory devices, ROM, flash memory, erasable ROM (EROM), floppy disks, CD-ROMs, optical disks, hard disks, fiber optic media, radio Frequency (RF) links, and the like. The code segments may be downloaded via computer networks such as the internet, intranets, etc.
It should also be noted that the exemplary embodiments mentioned in this disclosure describe some methods or systems based on a series of steps or devices. However, the present application is not limited to the order of the above-described steps, that is, the steps may be performed in the order mentioned in the embodiments, or may be performed in a different order from the order in the embodiments, or several steps may be performed simultaneously.
Aspects of the present application are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, enable the implementation of the functions/acts specified in the flowchart and/or block diagram block or blocks. Such a processor may be, but is not limited to being, a general purpose processor, a special purpose processor, an application specific processor, or a field programmable logic circuit. It will also be understood that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware which performs the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In the foregoing, only the specific embodiments of the present application are described, and it will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the systems, modules and units described above may refer to the corresponding processes in the foregoing method embodiments, which are not repeated herein. It should be understood that the scope of the present application is not limited thereto, and any equivalent modifications or substitutions can be easily made by those skilled in the art within the technical scope of the present application, and they should be included in the scope of the present application.
Claims (10)
1. An intelligent perception prediction method for bone grinding bone quality in robotic surgery, characterized by comprising the following steps:
acquiring a speed signal and a moment signal of a motor while bone tissue to be identified is being ground;
calculating a cross-correlation characteristic between the speed signal and the moment signal;
inputting the speed signal, the moment signal, and the cross-correlation characteristic into a preset bone tissue grinding bone identification model, and outputting a corresponding bone tissue grinding bone identification result.
2. The intelligent perception prediction method for bone grinding bone quality in robotic surgery according to claim 1, wherein before inputting the speed signal, the moment signal, and the cross-correlation characteristic into the preset bone tissue grinding bone identification model, the method further comprises:
acquiring speed signals and moment signals of the motor under each of three conditions: air, cancellous bone, and cortical bone;
performing data preprocessing on the speed signals and the moment signals;
for each condition, calculating the cross-correlation characteristic between the data-preprocessed speed signal and moment signal;
generating a data set based on the data-preprocessed speed signals, moment signals, and cross-correlation characteristics;
and performing model training on a convolutional neural network using the data set to obtain the bone tissue grinding bone identification model.
3. The intelligent perception prediction method for bone grinding bone quality in robotic surgery according to claim 2, wherein performing data preprocessing on the speed signals and the moment signals comprises:
labeling the speed signals and moment signals of the motor acquired under the three conditions of air, cancellous bone, and cortical bone;
and segmenting the labeled signals at the same time interval and sampling frequency, each segment forming one data sample.
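The labeling-and-segmentation step of claim 3 could be sketched as follows. The label codes, the function `segment_signals`, and the non-overlapping window layout are hypothetical choices; the claim only requires that every sample span the same time interval at the same sampling frequency.

```python
import numpy as np

# Hypothetical label codes for the three grinding conditions
LABELS = {"air": 0, "cancellous": 1, "cortical": 2}

def segment_signals(speed, moment, label, window_len):
    """Cut a labeled (speed, moment) recording into fixed-length samples.

    Consecutive, non-overlapping windows of `window_len` points are taken
    at the recording's native sampling rate, so every sample covers the
    same time interval; a trailing remainder shorter than one window is
    discarded.
    """
    n = min(len(speed), len(moment)) // window_len
    samples = []
    for i in range(n):
        sl = slice(i * window_len, (i + 1) * window_len)
        samples.append((np.asarray(speed[sl]), np.asarray(moment[sl]), label))
    return samples
```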
4. The intelligent perception prediction method for bone grinding bone quality in robotic surgery according to claim 3, wherein calculating, for each condition, the cross-correlation characteristic between the data-preprocessed speed signal and moment signal comprises:
performing mean-removal processing on the sample data;
and dividing the mean-removed sample data into matrices of uniform size using a sliding window method, and calculating the cross-correlation characteristic between the speed signal and the moment signal within each sliding window.
5. The intelligent perception prediction method for bone grinding bone quality in robotic surgery according to claim 4, wherein dividing the mean-removed sample data into matrices of uniform size using a sliding window method, and calculating the cross-correlation characteristic between the speed signal and the moment signal within each sliding window, comprises:
calculating the average value of each row in each matrix;
and calculating the cross-correlation characteristic between the speed signal and the moment signal based on the average value of each row in each matrix.
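One plausible reading of claims 4 and 5 (mean removal, sliding windows stacked into uniform-size matrices, row averages, cross-correlation of the row-mean sequences) can be sketched as below. The window length, step size, and the energy normalization are assumptions, not taken from the patent.

```python
import numpy as np

def rowmean_xcorr_feature(speed, moment, win, step):
    """Hypothetical sketch of claims 4-5: remove the mean, slide a window
    over both signals to form uniform-size matrices (one window per row),
    average each row, and cross-correlate the two row-mean sequences."""
    s = np.asarray(speed, dtype=float) - np.mean(speed)    # mean removal
    m = np.asarray(moment, dtype=float) - np.mean(moment)
    starts = range(0, len(s) - win + 1, step)
    S = np.stack([s[i:i + win] for i in starts])           # speed matrix
    M = np.stack([m[i:i + win] for i in starts])           # moment matrix
    s_rows, m_rows = S.mean(axis=1), M.mean(axis=1)        # row averages
    xc = np.correlate(s_rows, m_rows, mode="full")
    denom = np.sqrt(np.sum(s_rows**2) * np.sum(m_rows**2))
    return xc / denom if denom > 0 else xc
```

With proportional inputs the normalized sequence peaks at 1.0, so the feature is insensitive to overall signal amplitude.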
6. The intelligent perception prediction method for bone grinding bone quality in robotic surgery according to claim 5, wherein the convolutional neural network comprises five groups of network structures, a Dropout layer, a Flatten layer, a fully connected layer, and a Softmax layer connected in sequence, and each group of network structures comprises a convolutional layer, a Dropout layer, and a pooling layer connected in sequence.
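A minimal PyTorch sketch of the network layout in claim 6 (five convolution/Dropout/pooling groups, then Dropout, Flatten, a 512-neuron fully connected layer, and Softmax) might look like the following. The channel widths, kernel size, input length, and the final three-class output layer are assumptions; the claim does not specify them.

```python
import torch
import torch.nn as nn

def conv_group(in_ch, out_ch):
    # One of the five repeated groups: convolution -> Dropout -> pooling
    return nn.Sequential(
        nn.Conv1d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(),        # ReLU activation in the convolutional layer (claim 7)
        nn.Dropout(0.5),  # Dropout parameter 0.5 (claim 7)
        nn.MaxPool1d(2),
    )

class BoneGrindNet(nn.Module):
    """Sketch of the claimed CNN: five conv/Dropout/pool groups, then
    Dropout -> Flatten -> fully connected (512 neurons) -> Softmax."""
    def __init__(self, in_ch=3, seq_len=1024, n_classes=3):
        super().__init__()
        chans = [in_ch, 16, 32, 64, 128, 256]  # assumed channel widths
        self.features = nn.Sequential(
            *[conv_group(chans[i], chans[i + 1]) for i in range(5)]
        )
        feat_len = seq_len // 2**5             # five 2x poolings
        self.head = nn.Sequential(
            nn.Dropout(0.5),
            nn.Flatten(),
            nn.Linear(256 * feat_len, 512),    # 512-neuron fully connected layer
            nn.ReLU(),
            nn.Linear(512, n_classes),         # assumed three-class output
            nn.Softmax(dim=1),
        )

    def forward(self, x):
        return self.head(self.features(x))
```

The three input channels correspond to the speed signal, the moment signal, and the cross-correlation characteristic, one per channel, though the patent does not mandate this arrangement.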
7. The intelligent perception prediction method for bone grinding bone quality in robotic surgery according to claim 6, wherein in the model training process, the Dropout parameter is set to 0.5;
the loss function is set to a multi-class classification loss function;
the optimization function is set to SGD;
a linear rectification (ReLU) activation function is used in each convolutional layer;
the number of neurons in the fully connected layer is set to 512;
and the maximum number of training rounds is 1000, the batch size is 32, the initial learning rate is 0.001, and the weights are updated using an Adam optimization function.
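The training settings of claim 7 could be wired up roughly as below. The claim names both SGD and Adam; this sketch follows the final clause and uses Adam with the stated learning rate. The toy model and tensor shapes are placeholders, not part of the claim.

```python
import torch
import torch.nn as nn

# Hyperparameters as listed in the claim
EPOCHS, BATCH_SIZE, LR = 1000, 32, 0.001

# Placeholder model standing in for the claimed CNN
model = nn.Sequential(nn.Flatten(), nn.Linear(30, 512), nn.ReLU(), nn.Linear(512, 3))
criterion = nn.CrossEntropyLoss()  # multi-class classification loss
optimizer = torch.optim.Adam(model.parameters(), lr=LR)  # Adam weight updates

def train_step(x, y):
    """One optimization step: forward, loss, backward, weight update."""
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
    return float(loss)
```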
8. A bone tissue grinding bone identification device, characterized in that the device comprises:
a signal acquisition module, configured to acquire a speed signal and a moment signal of a motor while bone tissue to be identified is being ground;
a cross-correlation characteristic calculation module, configured to calculate a cross-correlation characteristic between the speed signal and the moment signal;
and a result output module, configured to input the speed signal, the moment signal, and the cross-correlation characteristic into a preset bone tissue grinding bone identification model and to output a corresponding bone tissue grinding bone identification result.
9. An electronic device, the electronic device comprising: a processor and a memory storing computer program instructions;
the processor, when executing the computer program instructions, implements the intelligent perception prediction method for bone grinding bone quality in robotic surgery according to any one of claims 1-7.
10. A computer-readable storage medium, characterized in that computer program instructions are stored on the computer-readable storage medium, and when executed by a processor, the computer program instructions implement the intelligent perception prediction method for bone grinding bone quality in robotic surgery according to any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310532076.8A CN116570367A (en) | 2023-05-12 | 2023-05-12 | Intelligent sensing prediction method device and equipment for bone grinding and bone quality of robot operation |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116570367A true CN116570367A (en) | 2023-08-11 |
Family
ID=87537135
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310532076.8A Pending CN116570367A (en) | 2023-05-12 | 2023-05-12 | Intelligent sensing prediction method device and equipment for bone grinding and bone quality of robot operation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116570367A (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106725711A (en) * | 2016-12-13 | 2017-05-31 | 中国科学院深圳先进技术研究院 | Sclerotin grinding machine people, vertebral plate grinding surgery operation robot control system and method |
CN110399821A (en) * | 2019-07-17 | 2019-11-01 | 上海师范大学 | Customer satisfaction acquisition methods based on facial expression recognition |
US20190336097A1 (en) * | 2014-07-21 | 2019-11-07 | Zebra Medical Vision Ltd. | Systems and methods for prediction of osteoporotic fracture risk |
US20200030036A1 (en) * | 2018-07-25 | 2020-01-30 | Think Surgical Inc. | Intraoperative adjustment of a pre-operatively planned implant cavity to improve implant fit |
CN113222951A (en) * | 2021-05-20 | 2021-08-06 | 吉林大学 | Osteoporosis artificial intelligence diagnostic device capable of identifying hip joint X-ray |
CN113762072A (en) * | 2021-07-27 | 2021-12-07 | 新绎健康科技有限公司 | Method and system for identifying gait signals based on cross correlation |
WO2022126392A1 (en) * | 2020-12-15 | 2022-06-23 | 中山大学孙逸仙纪念医院 | Model training method and apparatus, and electronic device, medium and bone mass measurement system |
CN115081476A (en) * | 2022-06-14 | 2022-09-20 | 中国医学科学院北京协和医院 | Method and device for bone recognition for bone tissue grinding |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109085469A (en) | A kind of method and system of the signal type of the signal of cable local discharge for identification | |
CN111738302A (en) | System for classifying and diagnosing Alzheimer disease based on multi-modal data | |
CN112560948B (en) | Fundus image classification method and imaging method under data deviation | |
CN115062674B (en) | Tool arrangement and tool changing method and device based on deep learning and storage medium | |
CN112578419A (en) | GPS data reconstruction method based on GRU network and Kalman filtering | |
Harvey | Parameter estimation of signal detection models: RscorePlus user’s manual | |
CN113971437B (en) | Cross-domain gesture recognition method based on commercial Wi-Fi equipment | |
CN116570367A (en) | Intelligent sensing prediction method device and equipment for bone grinding and bone quality of robot operation | |
CN116597002B (en) | Automatic femoral stem placement method, device and equipment based on deep reinforcement learning | |
CN109460863A (en) | Equipment state prediction method based on deep learning | |
CN117350992A (en) | Multi-task segmentation network metal implant identification method based on self-guiding attention mechanism | |
CN116650110A (en) | Automatic knee joint prosthesis placement method and device based on deep reinforcement learning | |
CN112907541B (en) | Palm image quality evaluation model construction method and device | |
CN114098764B (en) | Data processing method, device, electronic equipment and storage medium | |
CN110443276A (en) | Time series classification method based on depth convolutional network Yu the map analysis of gray scale recurrence | |
CN112749998A (en) | Income information output method and device, electronic equipment and computer storage medium | |
US11475255B2 (en) | Method for adaptive context length control for on-line edge learning | |
CN117204910B (en) | Automatic bone cutting method for real-time tracking of knee joint position based on deep learning | |
CN113406574A (en) | Online clustering method for multifunctional radar working mode sequence | |
CN114066804A (en) | Curved surface fault layer tooth position identification method based on deep learning | |
CN112733728A (en) | Visibility edge calculation method and device, electronic equipment and storage medium | |
CN111477273A (en) | Method for predicting individual age information based on brain tissue gene expression | |
CN113608860B (en) | Real-time ruminant behavior identification method for dairy cows based on edge calculation | |
CN116712035B (en) | Sleep stage method and system based on CNN-PSO-BiLSTM | |
CN117911273B (en) | Auxiliary positioning method for cutting protection leather sheath with keyboard by iPad |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||