CN116758127B - Model registration method, device, storage medium and processor for femur


Info

Publication number
CN116758127B
Authority
CN
China
Prior art keywords
point cloud
cloud data
matrix
point
target
Prior art date
Legal status
Active
Application number
CN202311034475.8A
Other languages
Chinese (zh)
Other versions
CN116758127A (en)
Inventor
刘兴业
陈亚刚
冯彦彰
顾席铭
Current Assignee
Beijing AK Medical Co Ltd
Original Assignee
Beijing AK Medical Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing AK Medical Co Ltd
Priority to CN202311034475.8A
Publication of CN116758127A
Application granted
Publication of CN116758127B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0464Convolutional networks [CNN, ConvNet]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a model registration method for a femur. The method comprises the following steps: acquiring first point cloud data and second point cloud data of the femur; performing a state transition operation on the first point cloud data to obtain a first target point cloud data set, and performing a state transition operation on the second point cloud data to obtain a second target point cloud data set; pooling the first target point cloud data set to obtain global features of the first point cloud data, and pooling the second target point cloud data set to obtain global features of the second point cloud data; performing a deformation operation on the global features of the first point cloud data to obtain reconstructed point cloud data of the first point cloud data, and performing a deformation operation on the global features of the second point cloud data to obtain reconstructed point cloud data of the second point cloud data; and performing a loss calculation on the reconstructed point cloud data of the first point cloud data and the reconstructed point cloud data of the second point cloud data to obtain a point cloud registration model of the femur. The method solves the technical problem of the low robustness of femur registration models.

Description

Model registration method, device, storage medium and processor for femur
Technical Field
The invention relates to the medical field, and in particular to a model registration method, apparatus, storage medium and processor for a femur.
Background
At present, in knee joint surgical navigation systems, the collected key points are generally used to guide scaling registration of a femur model prestored in the navigation system. However, rigid registration only scales and translates in each dimension, so some structurally complex and critical parts of the femur cannot be made to fit the prestored model completely, which causes the technical problems of a poor fit of the femur registration result and low robustness of the model.
No effective solution has yet been proposed for the problems of the poor fit of femur registration results and the low robustness of the model.
Disclosure of Invention
The embodiments of the invention provide a model registration method, apparatus, storage medium and processor for a femur, so as to at least solve the technical problems of the poor fit of femur registration results and the low robustness of the model.
According to one aspect of an embodiment of the present invention, a method of model registration of a femur is provided. The method may include: acquiring first point cloud data and second point cloud data of a femur, wherein the first point cloud data is point cloud data of a femoral head of the femur, and the second point cloud data is point cloud data of a femoral body of the femur; performing state transition operation on the first point cloud data to obtain a first target point cloud data set, and performing state transition operation on the second point cloud data to obtain a second target point cloud data set; executing pooling operation on the first target point cloud data set to obtain global features of the first point cloud data, and executing pooling operation on the second target point cloud data set to obtain global features of the second point cloud data; performing deformation operation on the global features of the first point cloud data to obtain reconstructed point cloud data of the first point cloud data, and performing deformation operation on the global features of the second point cloud data to obtain reconstructed point cloud data of the second point cloud data; and performing loss calculation on the reconstructed point cloud data of the first point cloud data and the reconstructed point cloud data of the second point cloud data to obtain a point cloud registration model of the femur, wherein the point cloud registration model is used for attaching the femur to the femur prosthesis so as to register the femur prosthesis with each part of the femur.
Optionally, performing a state transition operation on the first point cloud data to obtain a first target point cloud data set, and performing a state transition operation on the second point cloud data to obtain a second target point cloud data set, including: performing rotation operation on the first point cloud data to obtain a rotation angle space of the first point cloud data, and performing rotation operation on the second point cloud data to obtain a rotation angle space of the second point cloud data; the first target point cloud data set is determined based on the rotation angle space of the first point cloud data, and the second target point cloud data set is determined based on the rotation angle space of the second point cloud data.
Optionally, determining the first target point cloud data set based on the rotation angle space of the first point cloud data, and determining the second target point cloud data set based on the rotation angle space of the second point cloud data, includes: determining n rotation matrixes of the first point cloud data in a rotation angle space of the first point cloud data, and determining n rotation matrixes of the second point cloud data in a rotation angle space of the second point cloud data, wherein n is a natural number larger than 0; combining the n rotation matrixes of the first point cloud data to obtain a rotation matrix set of the first point cloud data, and combining the n rotation matrixes of the second point cloud data to obtain a rotation matrix set of the second point cloud data; the method comprises the steps of determining a rotation matrix set of first point cloud data and an initial matrix set of first point cloud data as a first target point cloud data set, and determining a rotation matrix set of second point cloud data and an initial matrix set of second point cloud data as a second target point cloud data set.
Optionally, performing a pooling operation on the first target point cloud data set to obtain global features of the first point cloud data, and performing a pooling operation on the second target point cloud data set to obtain global features of the second point cloud data, including: performing feature extraction operation on the first target point cloud data set to obtain point-by-point features of the first point cloud data, and performing feature extraction operation on the second target point cloud data set to obtain point-by-point features of the second point cloud data; and executing pooling operation on the point-by-point characteristics of the first point cloud data to obtain global characteristics of the first point cloud data, and executing pooling operation on the point-by-point characteristics of the second point cloud data to obtain the global characteristics of the second point cloud data.
Optionally, performing a deformation operation on the global feature of the first point cloud data to obtain reconstructed point cloud data of the first point cloud data, and performing a deformation operation on the global feature of the second point cloud data to obtain reconstructed point cloud data of the second point cloud data, including: performing similarity calculation on the point-by-point characteristics of the first point cloud data and the point-by-point characteristics of the second point cloud data to obtain a similarity matrix between any two points in the femur and femur model; performing transformation operation on elements in the similarity matrix to obtain a relation matrix between any two points in the femur and femur model; and determining reconstructed point cloud data of the first point cloud data and reconstructed point cloud data of the second point cloud data based on the relation matrix.
Optionally, determining the reconstructed point cloud data of the first point cloud data and the reconstructed point cloud data of the second point cloud data based on the relationship matrix includes: arranging the first target point cloud data sets according to the relation matrix to obtain intermediate point cloud data of the first target point cloud data sets, and arranging the second target point cloud data sets according to the relation matrix to obtain intermediate point cloud data of the second target point cloud data sets; the method comprises the steps of determining reconstructed point cloud data of first point cloud data based on intermediate point cloud data of a first target point cloud data set, and determining reconstructed point cloud data of second point cloud data based on intermediate point cloud data of a second target point cloud data set.
Optionally, determining the reconstructed point cloud data of the first point cloud data based on the intermediate point cloud data of the first target point cloud data set, and determining the reconstructed point cloud data of the second point cloud data based on the intermediate point cloud data of the second target point cloud data set, includes: splicing the intermediate point cloud data of the second target point cloud data set with the global features of the first point cloud data to obtain a first splicing matrix, and splicing the intermediate point cloud data of the first target point cloud data set with the global features of the second point cloud data to obtain a second splicing matrix; splicing the first replication matrix and the first splicing matrix to obtain a first target matrix, and splicing the second replication matrix and the second splicing matrix to obtain a second target matrix, wherein the first replication matrix is obtained by performing a deformation operation on the point-by-point features of the first point cloud data, and the second replication matrix is obtained by performing a deformation operation on the point-by-point features of the second point cloud data; and inputting the first target matrix into the multi-layer perceptron to obtain the reconstructed point cloud data of the first point cloud data, and inputting the second target matrix into the multi-layer perceptron to obtain the reconstructed point cloud data of the second point cloud data.
According to an aspect of an embodiment of the present invention, there is provided a model registration apparatus for a femur. The apparatus may include: an acquisition unit configured to acquire first point cloud data and second point cloud data of a femur, wherein the first point cloud data is one set of point cloud data of the femur and the second point cloud data is another set of point cloud data of the femur; a transfer unit configured to perform a state transition operation on the first point cloud data to obtain a first target point cloud data set, and to perform a state transition operation on the second point cloud data to obtain a second target point cloud data set; a pooling unit configured to perform a pooling operation on the first target point cloud data set to obtain global features of the first point cloud data, and to perform a pooling operation on the second target point cloud data set to obtain global features of the second point cloud data; a deformation unit configured to perform a deformation operation on the global features of the first point cloud data to obtain reconstructed point cloud data of the first point cloud data, and to perform a deformation operation on the global features of the second point cloud data to obtain reconstructed point cloud data of the second point cloud data; and a computing unit configured to perform a loss calculation on the reconstructed point cloud data of the first point cloud data and the reconstructed point cloud data of the second point cloud data to obtain a point cloud registration model of the femur, wherein the point cloud registration model is used to fit the femur to the femoral prosthesis so as to register the femoral prosthesis with each part of the femur.
According to another aspect of an embodiment of the present invention, there is also provided a computer-readable storage medium. The computer readable storage medium comprises a stored program, wherein the device in which the computer readable storage medium is located is controlled to execute the model registration method of the femur of the embodiment of the invention when the program runs.
According to another aspect of an embodiment of the present invention, there is also provided a processor. The processor is used for running a program, wherein the program is executed by the processor to execute the model registration method of the femur of the embodiment of the invention.
In the embodiment of the invention, point cloud data of the femur of the patient to be operated on are acquired. A state transition operation is performed on the first point cloud data to obtain a first target point cloud data set, and on the second point cloud data to obtain a second target point cloud data set. A pooling operation is performed on the first target point cloud data set to obtain the global features of the first point cloud data, and on the second target point cloud data set to obtain the global features of the second point cloud data. A deformation operation is performed on the global features of the first point cloud data to obtain the reconstructed point cloud data of the first point cloud data, and on the global features of the second point cloud data to obtain the reconstructed point cloud data of the second point cloud data. A loss calculation is then performed on the reconstructed point cloud data of the first point cloud data and the reconstructed point cloud data of the second point cloud data to obtain the point cloud registration model of the femur of the patient to be operated on, so that the femur can be fitted to the femur model. This solves the technical problems of the poor fit of femur registration results and the low robustness of the model, and achieves the technical effect of improving the fit and robustness of femur registration.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a flow chart of a method of model registration of a femur in accordance with an embodiment of the present invention;
FIG. 2 is a flow chart of a method of point cloud reconstruction according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of constructing rotation matrix processing data based on a Markov process in accordance with an embodiment of the present invention;
FIG. 4 is a schematic diagram of a point cloud reconstruction process according to an embodiment of the present invention;
fig. 5 is a schematic view of a model registration apparatus for a femur in accordance with an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, shall fall within the scope of the invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
According to an embodiment of the present invention, a method of model registration of a femur is provided, it being noted that the steps shown in the flowchart of the figures may be performed in a computer system, such as a set of computer executable instructions, and, although a logical order is shown in the flowchart, in some cases, the steps shown or described may be performed in an order different than what is shown herein.
Fig. 1 is a flowchart of a method of model registration of a femur, as shown in fig. 1, in accordance with an embodiment of the present invention, the method may include the steps of:
step S101, acquiring first point cloud data and second point cloud data of a femur.
In the technical solution provided in the above step S101 of the present invention, the first point cloud data may be point cloud data of a femur, the second point cloud data may be another point cloud data of the femur, and the first point cloud data and the second point cloud data may be obtained by scanning an object to be detected from different positions or angles with an electronic device, where the electronic device may be a computed tomography (Computed Tomography, abbreviated as CT) device, a nuclear magnetic resonance imaging (Nuclear Magnetic Resonance Imaging, abbreviated as NMRI) device, or the like, and the object to be detected may be the femur of a patient and an existing femur model, which is only illustrated herein without being limited thereto.
Optionally, the CT device is used to scan the femoral part of the patient to be operated on, so as to obtain medical image data about the femoral part, that is, obtain point cloud data of the femur, and the CT device is used to scan the existing femur model, so as to obtain medical image data about the femur model, that is, obtain another point cloud data of the femur.
Step S102, performing a state transition operation on the first point cloud data to obtain a first target point cloud data set, and performing a state transition operation on the second point cloud data to obtain a second target point cloud data set.
In the technical solution provided in the step S102 of the present invention, the first target point cloud data set may be obtained by expanding the original point cloud data a, and the second target point cloud data set may be obtained by expanding another original point cloud data B.
Optionally, after the first point cloud data and the second point cloud data of the femur are acquired, a state transition operation is performed on the first point cloud data based on a markov process, a first target point cloud data set may be obtained, and a state transition operation is performed on the second point cloud data based on the markov process, a second target point cloud data set may be obtained, for example, a state transition operation is performed on the original point cloud data a, a target point cloud data set corresponding to the original point cloud data a may be obtained, and a state transition operation is performed on another original point cloud data B, a target point cloud data set corresponding to the other original point cloud data B may be obtained.
Step S103, performing pooling operation on the first target point cloud data set to obtain global features of the first point cloud data, and performing pooling operation on the second target point cloud data set to obtain global features of the second point cloud data.
In the solution provided in the step S103 of the present invention, the point-by-point characteristic of the first point cloud data may be obtained by performing characteristic extraction on a target point cloud data set corresponding to the original point cloud data a, and the point-by-point characteristic of the second point cloud data may be obtained by performing characteristic extraction on a target point cloud data set corresponding to the other original point cloud data B.
Optionally, after performing a state transition operation on the first point cloud data to obtain a first target point cloud data set and performing a state transition operation on the second point cloud data to obtain a second target point cloud data set, performing a pooling operation on the first target point cloud data set to obtain global features of the first point cloud data, and performing a pooling operation on the second target point cloud data set to obtain global features of the second point cloud data, for example, performing a maximum average pooling operation on a point-by-point feature corresponding to the original point cloud data a to obtain global features corresponding to the original point cloud data a, and performing a maximum average pooling operation on a point-by-point feature set corresponding to the other original point cloud data B to obtain global features corresponding to the other original point cloud data B.
Step S104, a deformation operation is performed on the global features of the first point cloud data to obtain reconstructed point cloud data of the first point cloud data, and a deformation operation is performed on the global features of the second point cloud data to obtain reconstructed point cloud data of the second point cloud data.
In the technical solution provided in the step S104 of the present invention, the reconstructed point cloud data of the first point cloud data may be the reconstructed point cloud data corresponding to the original point cloud data a, and the reconstructed point cloud data of the second point cloud data may be the reconstructed point cloud data corresponding to the other original point cloud data B.
Optionally, after performing the pooling operation on the first target point cloud data set to obtain the global features of the first point cloud data and performing the pooling operation on the second target point cloud data set to obtain the global features of the second point cloud data, a deformation operation is performed on the global features of the first point cloud data and the deformed global features are input into the multi-layer perceptron, so that the reconstructed point cloud data of the first point cloud data can be obtained; likewise, a deformation operation is performed on the global features of the second point cloud data and the deformed global features are input into the multi-layer perceptron, so that the reconstructed point cloud data of the second point cloud data can be obtained. For example, performing a deformation operation on the target point cloud data set, point-by-point features and global features corresponding to the original point cloud data A yields the reconstructed point cloud data of the original point cloud data A, and performing the same operation on the target point cloud data set, point-by-point features and global features corresponding to the other original point cloud data B yields the reconstructed point cloud data of the other original point cloud data B.
Step S105, loss calculation is carried out on the reconstructed point cloud data of the first point cloud data and the reconstructed point cloud data of the second point cloud data, and a point cloud registration model of the femur is obtained.
In the technical solution provided in the step S105 of the present invention, the loss calculation may be implemented with a modified CorrNet3D network, and the point cloud registration model may be used to fit the femur to be registered to the target femur, so as to register each part of the femur to be registered with the target femur.
Optionally, after performing a deformation operation on the global features of the first point cloud data to obtain reconstructed point cloud data of the first point cloud data, and performing a deformation operation on the global features of the second point cloud data to obtain reconstructed point cloud data of the second point cloud data, a loss calculation is performed on the first point cloud data and its reconstructed point cloud data, and on the second point cloud data and its reconstructed point cloud data, respectively, so as to obtain the point cloud registration model of the femur. For example, performing the loss calculation on the point cloud data and reconstructed point cloud data corresponding to the original point cloud data A, and on the point cloud data and reconstructed point cloud data corresponding to the other original point cloud data B, yields the point cloud registration model between the femur of the patient to be operated on and the femur model, thereby achieving the technical effect of improving the robustness of the femur registration model.
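For illustration only, the loss in step S105 may be sketched as follows. The patent states only that a loss is calculated between each point cloud and its reconstruction using a modified CorrNet3D; the symmetric Chamfer distance used below is an assumption, not the disclosed loss function.

```python
# Sketch of a reconstruction loss for step S105 (assumed Chamfer distance;
# the patent does not disclose the exact loss of its modified CorrNet3D).
import numpy as np

def chamfer_distance(p, q):
    """Symmetric Chamfer distance between point clouds p (n, 3) and q (m, 3)."""
    d2 = np.sum((p[:, None, :] - q[None, :, :]) ** 2, axis=-1)  # (n, m) squared distances
    return d2.min(axis=1).mean() + d2.min(axis=0).mean()

def registration_loss(a, a_rec, b, b_rec):
    # Sum the per-cloud losses between each original cloud and its reconstruction.
    return chamfer_distance(a, a_rec) + chamfer_distance(b, b_rec)
```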
In steps S101 to S105, the point cloud data of the femur of the patient to be operated on is acquired as the first point cloud data and the point cloud data of the femur model is acquired as the second point cloud data; a state transition operation based on the Markov process is performed on each of them, so that the first target point cloud data set and the second target point cloud data set can be obtained. Feature extraction and pooling operations are performed on the first target point cloud data set to obtain the point-by-point features and global features of the first point cloud data, and on the second target point cloud data set to obtain the point-by-point features and global features of the second point cloud data. The target point cloud data sets, point-by-point features and global features are then passed through the deformation operation and the multi-layer perceptron, so that the reconstructed point cloud data of the first point cloud data and of the second point cloud data can be obtained. Finally, a loss calculation is performed on each point cloud and its reconstruction to obtain the point cloud registration model of the femur, so that the femur can be fitted to the femur model more closely. This solves the technical problems of the poor fit of femur registration results and the low robustness of the model, and achieves the technical effect of improving the fit and robustness of femur registration.
The above-described method of this embodiment is further described below.
As an optional embodiment, step S102, performing a state transition operation on the first point cloud data to obtain a first target point cloud data set, and performing a state transition operation on the second point cloud data to obtain a second target point cloud data set, includes: performing rotation operation on the first point cloud data to obtain a rotation angle space of the first point cloud data, and performing rotation operation on the second point cloud data to obtain a rotation angle space of the second point cloud data; the first target point cloud data set is determined based on the rotation angle space of the first point cloud data, and the second target point cloud data set is determined based on the rotation angle space of the second point cloud data.
In this embodiment, the rotation angle space of the first point cloud data may include a storage space RX of the original point cloud data a rotated about the x-axis, a storage space RY of the original point cloud data a rotated about the y-axis, and a storage space RZ of the original point cloud data a rotated about the z-axis, and the rotation angle space of the second point cloud data may include a storage space RX of the original point cloud data B rotated about the x-axis, a storage space RY of the original point cloud data B rotated about the y-axis, and a storage space RZ of the original point cloud data B rotated about the z-axis.
Optionally, after the first point cloud data and the second point cloud data of the femur are acquired, a rotation operation is performed on the first point cloud data based on a markov process, a rotation angle space of the first point cloud data may be obtained, and a rotation operation is performed on the second point cloud data based on the markov process, a rotation angle space of the second point cloud data may be obtained, a first target point cloud data set may be determined based on the rotation angle space of the first point cloud data, and a second target point cloud data set may be determined based on the rotation angle space of the second point cloud data, for example, a rotation operation is performed on the original point cloud data a based on the markov process, a rotation angle space of the original point cloud data a may be obtained, and a rotation angle space of another original point cloud data B may be obtained, a target point cloud data set corresponding to the original point cloud data a may be determined according to the rotation angle space of the original point cloud data a, and a target point cloud data set corresponding to the another original point cloud data B may be determined according to the rotation angle space of the another original point cloud data B.
Optionally, the rotation operation is performed on the original point cloud data a based on the markov process, so that a rotation angle space of the original point cloud data a can be obtained, and the rotation operation is performed on the other original point cloud data B based on the markov process, so that a rotation angle space of the other original point cloud data B can be obtained by the following formula:
(1)
(2)
(3)
(4)
(5)
(6)
(7)
(8)
wherein R is x Can be used to represent a collection of angles of rotation about the x-axis, ang xi ,i∈[1,N]Can be used to represent the set R x The angle of rotation of the ith element of (S) XTn Can be used for representing at T n State of rotation angle around x-axis at any time, ang XTn ,Ang XTn ∈R x Can be used to represent the system at T n Time-of-day selection of rotation data about the x-axis, S XTn-1 Can be used asRepresenting the system at T n-1 State of rotation angle around x-axis at any time, ang XTn-1 ,Ang XTn-1 ∈R x Can be used to represent the system at T n-1 Time-selective rotation data about the x-axis, P (Ang xn-1 ,Ang xn ) Can be used for representing at T n-1 The time is selected to the data Ang xn-1 Thereafter, at T n Time selection state data Ang xn One-step transition probability, P X Can be used to represent a set R of angles of rotation about the x-axis x Probability transition matrix between each element of (an), P (Ang xi ,Ang xj )i,j∈[1,N]Can be used to represent a set R of angles of rotation about the x-axis x Transition probabilities between elements, P (N-1) X Can be used to represent a set R of angles of rotation about the x-axis x N-1 step transition probability matrix, P between each element of (B) (N-1) (Angxi,Angxj) Can be used to represent a set R of angles of rotation about the x-axis x N-1 step transition probabilities between elements of Ang xT ,Ang xT ∈R x Can be used to represent R from x Wherein RX can be used to represent the storage space of the rotation angle around the x-axis and S can be used to represent the rotation angle from R x The number of times of selecting the rotation data around the x-axis is randomly selected, ang XTt ,t∈[1,S]Can be used to represent R from x Rotation data about the x-axis selected at random.
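For illustration, the selection process of formulas (1) to (8) can be read as a random walk over the candidate angles. The sketch below assumes a uniform transition matrix $P_X$, since the patent does not disclose how the transition probabilities are chosen; the rule that a selected angle is never selected again follows the text.

```python
# Sketch of Markov-chain selection of rotation angles about one axis
# (uniform transition probabilities are an assumption).
import numpy as np

def sample_axis_angles(angles, s, rng=None):
    """Walk a Markov chain over `angles` (the set R_x) and collect `s`
    distinct angles (the storage space RX)."""
    rng = np.random.default_rng(rng)
    n = len(angles)
    p = np.full((n, n), 1.0 / n)      # one-step transition matrix P_X (assumed uniform)
    available = list(range(n))
    state = rng.choice(available)     # initial state
    selected = [angles[state]]
    available.remove(state)
    for _ in range(s - 1):
        probs = p[state, available]   # restrict to angles not yet selected
        state = rng.choice(available, p=probs / probs.sum())
        selected.append(angles[state])
        available.remove(state)
    return selected                   # RX: s rotation angles about the axis

RX = sample_axis_angles(np.linspace(-180.0, 180.0, 36, endpoint=False), s=10)
```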
Alternatively, when one angle is randomly selected from $R_x$ as the selected data for rotation about the x-axis, the currently selected data will not be selected again in the subsequent selection process.
Alternatively, according to a selection process of selecting a rotation angle around the x-axis, a rotation angle around the y-axis may be obtained as well, and the process of obtaining a rotation angle around the y-axis may be achieved by:
$R_y = \{Ang_{y1}, Ang_{y2}, \ldots, Ang_{yN}\}$ (9)
$P_Y = \left[ P(Ang_{yi}, Ang_{yj}) \right]_{N \times N}, \quad i, j \in [1, N]$ (10)
$P^{(N-1)}_Y = \left[ P^{(N-1)}(Ang_{yi}, Ang_{yj}) \right]_{N \times N}$ (11)
$RY = \{Ang_{YT_1}, Ang_{YT_2}, \ldots, Ang_{YT_S}\}$ (12)
wherein $R_y$ denotes the set of rotation angles about the y-axis; $P_Y$ denotes the probability transition matrix between the elements of $R_y$; $P^{(N-1)}_Y$ denotes the N-1 step transition probability matrix between the elements of $R_y$; and $RY$ denotes the storage space of the rotation angles about the y-axis.
Alternatively, according to a selection process of selecting a rotation angle around the x-axis, a rotation angle around the z-axis may be obtained as well, and the process of obtaining a rotation angle around the z-axis may be achieved by the following formula:
$R_z = \{Ang_{z1}, Ang_{z2}, \ldots, Ang_{zN}\}$ (13)
$P_Z = \left[ P(Ang_{zi}, Ang_{zj}) \right]_{N \times N}, \quad i, j \in [1, N]$ (14)
$P^{(N-1)}_Z = \left[ P^{(N-1)}(Ang_{zi}, Ang_{zj}) \right]_{N \times N}$ (15)
$RZ = \{Ang_{ZT_1}, Ang_{ZT_2}, \ldots, Ang_{ZT_S}\}$ (16)
wherein $R_z$ denotes the set of rotation angles about the z-axis; $P_Z$ denotes the probability transition matrix between the elements of $R_z$; $P^{(N-1)}_Z$ denotes the N-1 step transition probability matrix between the elements of $R_z$; and $RZ$ denotes the storage space of the rotation angles about the z-axis.
As an alternative embodiment, determining the first target point cloud data set based on the rotation angle space of the first point cloud data, and determining the second target point cloud data set based on the rotation angle space of the second point cloud data, includes: determining n rotation matrixes of the first point cloud data in a rotation angle space of the first point cloud data, and determining n rotation matrixes of the second point cloud data in a rotation angle space of the second point cloud data; combining the n rotation matrixes of the first point cloud data to obtain a rotation matrix set of the first point cloud data, and combining the n rotation matrixes of the second point cloud data to obtain a rotation matrix set of the second point cloud data; the method comprises the steps of determining a rotation matrix set of first point cloud data and an initial matrix set of first point cloud data as a first target point cloud data set, and determining a rotation matrix set of second point cloud data and an initial matrix set of second point cloud data as a second target point cloud data set.
In this embodiment, the n rotation matrices of the first point cloud data may be the n rotation matrices corresponding to the original point cloud data A, the n rotation matrices of the second point cloud data may be the n rotation matrices corresponding to the other original point cloud data B, the rotation matrix set of the first point cloud data may be a rotation matrix set formed by randomly selecting one rotation angle from each of RX, RY and RZ corresponding to the original point cloud data A multiple times (e.g., S' times), the rotation matrix set of the second point cloud data may be a rotation matrix set formed by randomly selecting one rotation angle from each of RX, RY and RZ corresponding to the other original point cloud data B multiple times (e.g., S' times), the first target point cloud data set may be obtained by expanding the original point cloud data A, the second target point cloud data set may be obtained by expanding the other original point cloud data B, and n may be a natural number greater than 0.
Optionally, after performing a rotation operation on the first point cloud data to obtain a rotation angle space of the first point cloud data and performing a rotation operation on the second point cloud data to obtain a rotation angle space of the second point cloud data, determining n rotation matrices corresponding to the original point cloud data a among RX, RY and RZ corresponding to the original point cloud data a, and determining n rotation matrices corresponding to the original point cloud data B among RX, RY and RZ corresponding to the other original point cloud data B; combining the n rotation matrixes corresponding to the original point cloud data A to obtain a rotation matrix set corresponding to the original point cloud data A, and combining the n rotation matrixes corresponding to the other original point cloud data B to obtain a rotation matrix set corresponding to the other original point cloud data B; the rotation matrix set corresponding to the original point cloud data a and the initial matrix set corresponding to the original point cloud data a are determined as the target point cloud data set corresponding to the original point cloud data a, and the n rotation matrices corresponding to the other original point cloud data B and the initial matrix set corresponding to the other original point cloud data B are determined as the target point cloud data set corresponding to the other original point cloud data B.
Optionally, in the rotation angle space of the first point cloud data, n rotation matrices of the first point cloud data are determined, and in the rotation angle space of the second point cloud data, n rotation matrices of the second point cloud data are determined by:
$A_{xyz} = (Ang_{XT}, Ang_{YT}, Ang_{ZT}), \quad Ang_{XT} \in RX, Ang_{YT} \in RY, Ang_{ZT} \in RZ$ (17)
$Rot = Euler\langle A_{xyz} \rangle$ (18)
wherein $A_{xyz}$ denotes the angle combination of elements randomly selected from RX, RY and RZ, and $Rot$ denotes the rotation matrix constructed from the elements of $A_{xyz}$ by Euler angle calculation.
Optionally, combining the n rotation matrices of the first point cloud data to obtain a rotation matrix set of the first point cloud data, and combining the n rotation matrices of the second point cloud data to obtain a rotation matrix set of the second point cloud data may be implemented by the following formula:
$Rotset = \{Rot_1, Rot_2, \ldots, Rot_{S'}\}$ (19)
$D' = Rot\langle D \rangle, \quad Rot \in Rotset$ (20)
wherein $Rot_i, i \in [1, S']$ denotes the rotation matrix obtained by Euler angle calculation from the i-th angle combination $A_{xyz}$; $Rotset$ denotes the set of rotation matrices formed by the angle combinations obtained by randomly selecting an element from RX, RY and RZ a plurality of times (e.g., S' times); $D$ denotes the input training data; $Rot\langle * \rangle$ denotes the rotation of the data; and $D'$ denotes the rotated data.
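For illustration, formulas (17) to (20) may be sketched as follows. The 'xyz' Euler rotation order and degree units are assumptions, and `build_rotset` and `expand_point_cloud` are illustrative names rather than functions disclosed in the patent.

```python
# Sketch of building Rotset from RX, RY, RZ and rotating the training data D.
import numpy as np
from scipy.spatial.transform import Rotation

def build_rotset(rx, ry, rz, s_prime, rng=None):
    rng = np.random.default_rng(rng)
    rotset = []
    for _ in range(s_prime):
        a_xyz = (rng.choice(rx), rng.choice(ry), rng.choice(rz))  # angle combination A_xyz
        # Rot: rotation matrix from the Euler angles of A_xyz ('xyz' order assumed).
        rotset.append(Rotation.from_euler("xyz", a_xyz, degrees=True).as_matrix())
    return rotset

def expand_point_cloud(d, rotset):
    """Apply every matrix in Rotset to the (n, 3) cloud D; the list is D'."""
    return [d @ rot.T for rot in rotset]
```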
As an optional embodiment, step S103, performing pooling operation on the first target point cloud data set to obtain a global feature of the first point cloud data, and performing pooling operation on the second target point cloud data set to obtain a global feature of the second point cloud data, includes: performing feature extraction operation on the first target point cloud data set to obtain point-by-point features of the first point cloud data, and performing feature extraction operation on the second target point cloud data set to obtain point-by-point features of the second point cloud data; and executing pooling operation on the point-by-point characteristics of the first point cloud data to obtain global characteristics of the first point cloud data, and executing pooling operation on the point-by-point characteristics of the second point cloud data to obtain the global characteristics of the second point cloud data.
In this embodiment, the point-by-point features of the first point cloud data may be obtained by inputting the target point cloud data set corresponding to the original point cloud data A into a dynamic graph convolutional neural network (Dynamic Graph Convolutional Neural Network, abbreviated as DGCNN) model, the point-by-point features of the second point cloud data may be obtained by inputting the target point cloud data set corresponding to the other original point cloud data B into the DGCNN model, the global features of the first point cloud data may be obtained by performing a maximum average pooling operation on the point-by-point features corresponding to the original point cloud data A, and the global features of the second point cloud data may be obtained by performing a maximum average pooling operation on the point-by-point features corresponding to the other original point cloud data B.
Optionally, after performing a state transition operation on the first point cloud data to obtain a first target point cloud data set and performing a state transition operation on the second point cloud data to obtain a second target point cloud data set, performing a feature extraction operation on the target point cloud data set corresponding to the original point cloud data a to obtain a point-by-point feature of the target point cloud data set corresponding to the original point cloud data a, and performing a feature extraction operation on the target point cloud data set corresponding to the other original point cloud data B to obtain a point-by-point feature of the target point cloud data set corresponding to the other original point cloud data B; and performing maximum average pooling operation on the point-by-point characteristics of the target point cloud data set corresponding to the original point cloud data A to obtain global characteristics of the target point cloud data set corresponding to the original point cloud data A, and performing maximum average pooling operation on the point-by-point characteristics of the target point cloud data set corresponding to the other original point cloud data B to obtain global characteristics of the target point cloud data set corresponding to the other original point cloud data B.
Optionally, performing a feature extraction operation on the first target point cloud data set to obtain a point-by-point feature of the first point cloud data, and performing a feature extraction operation on the second target point cloud data set to obtain a point-by-point feature of the second point cloud data may be implemented by the following formula:
$F_A = DGCNN\langle Aset \rangle$ (21)
$F_B = DGCNN\langle Bset \rangle$ (22)
wherein $Aset$ denotes the expanded set of point cloud A; $F_A$ denotes the point-by-point features of the expanded point cloud A; $Bset$ denotes the expanded set of point cloud B; $F_B$ denotes the point-by-point features of the expanded point cloud B; and $DGCNN\langle * \rangle$ denotes extracting the point-by-point features of the expanded point clouds A and B using the DGCNN.
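A complete DGCNN is beyond the scope of a short sketch, so the stand-in below uses a randomly initialized shared per-point MLP purely to illustrate the shapes in formulas (21) and (22): each (n, 3) cloud maps to an (n, d) point-by-point feature matrix. It is not the trained DGCNN used in the patent.

```python
# Shape-only stand-in for DGCNN feature extraction (random weights, not trained).
import numpy as np

def pointwise_features(cloud, d=64, rng=None):
    """cloud: (n, 3) -> F: (n, d) point-by-point features."""
    rng = np.random.default_rng(rng)
    w1 = rng.standard_normal((3, 32))  # placeholder weights
    w2 = rng.standard_normal((32, d))
    h = np.maximum(cloud @ w1, 0.0)    # shared layer + ReLU, applied per point
    return np.maximum(h @ w2, 0.0)     # F_A or F_B
```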
Optionally, performing a pooling operation on the point-by-point feature of the first point cloud data to obtain a global feature of the first point cloud data, and performing the pooling operation on the point-by-point feature of the second point cloud data to obtain the global feature of the second point cloud data may be implemented by the following formula:
$FG_A = MA\langle F_A \rangle$ (23)
$FG_B = MA\langle F_B \rangle$ (24)
wherein $FG_A$ denotes the global features of the expanded point cloud A; $FG_B$ denotes the global features of the expanded point cloud B; and $MA\langle * \rangle$ denotes extracting the global features of the expanded point clouds A and B using the maximum average pooling operation.
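The exact form of the maximum average pooling operator $MA\langle * \rangle$ is not spelled out in the text; the sketch below assumes it concatenates max pooling and average pooling over the point dimension.

```python
# Assumed reading of MA<*>: concatenated max and average pooling over points.
import numpy as np

def max_average_pool(f):
    """f: (n, d) point-by-point features -> FG: (2d,) global feature."""
    return np.concatenate([f.max(axis=0), f.mean(axis=0)])
```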
As an optional embodiment, step S104 includes performing a deformation operation on the global feature of the first point cloud data to obtain reconstructed point cloud data of the first point cloud data, and performing a deformation operation on the global feature of the second point cloud data to obtain reconstructed point cloud data of the second point cloud data, where the step includes: performing similarity calculation on the point-by-point characteristics of the first point cloud data and the point-by-point characteristics of the second point cloud data to obtain a similarity matrix between any two points in the femur and femur model; performing transformation operation on elements in the similarity matrix to obtain a relation matrix between any two points in the femur and femur model; and determining reconstructed point cloud data of the first point cloud data and reconstructed point cloud data of the second point cloud data based on the relation matrix.
In this embodiment, the similarity matrix may be obtained by calculating the degree of similarity between any two points of the expanded point cloud A and the expanded point cloud B using the Pearson correlation coefficient, the relationship matrix may be obtained by normalizing the elements in the similarity matrix row by row and applying a softmax, the reconstructed point cloud data of the first point cloud data may be the reconstructed point cloud data corresponding to the original point cloud data A, and the reconstructed point cloud data of the second point cloud data may be the reconstructed point cloud data corresponding to the other original point cloud data B.
Optionally, after performing pooling operation on the first target point cloud data set to obtain global features of the first point cloud data and performing pooling operation on the second target point cloud data set to obtain global features of the second point cloud data, performing similarity calculation on point-by-point features of the target point cloud data set corresponding to the original point cloud data a and point-by-point features of the target point cloud data set corresponding to the other original point cloud data B by using pearson correlation coefficients, so as to obtain a similarity matrix between any two points in the femur and femur model; normalizing and softmax processing are carried out on elements in the similarity matrix according to a row unit, so that a relation matrix between any two points in the femur and femur model can be obtained; from the obtained relationship matrix, the reconstructed point cloud data corresponding to the original point cloud data a and the reconstructed point cloud data corresponding to the other original point cloud data B can be determined.
Optionally, performing similarity calculation on the point-by-point feature of the first point cloud data and the point-by-point feature of the second point cloud data to obtain a similarity matrix between any two points in the femur and femur model may be implemented by the following formula:
$E(F_{Ai}) = \frac{1}{d} \sum_{k=1}^{d} F_{Aik}$ (25)
$E(F_{Bj}) = \frac{1}{d} \sum_{k=1}^{d} F_{Bjk}$ (26)
$Cov(F_{Ai}, F_{Bj}) = E\left[ (F_{Ai} - E(F_{Ai}))(F_{Bj} - E(F_{Bj})) \right]$ (27)
$\sigma_{F_{Ai}} = \sqrt{E\left[ (F_{Ai} - E(F_{Ai}))^2 \right]}$ (28)
$\sigma_{F_{Bj}} = \sqrt{E\left[ (F_{Bj} - E(F_{Bj}))^2 \right]}$ (29)
$prs(F_{Ai}, F_{Bj}) = \frac{Cov(F_{Ai}, F_{Bj})}{\sigma_{F_{Ai}} \sigma_{F_{Bj}}}$ (30)
$PR_{ij} = abs(prs(F_{Ai}, F_{Bj}))$ (31)
$PR = \left[ PR_{ij} \right]$ (32)
wherein $F_{Ai}$ denotes the set of all features of the i-th point in the expanded point cloud A; $F_{Aik}$ denotes the k-th feature value of the i-th point in the expanded point cloud A; $E(F_{Ai})$ denotes the mean value of all features of the i-th point in the expanded point cloud A; $F_{Bj}$ denotes the set of all features of the j-th point in the expanded point cloud B; $F_{Bjk}$ denotes the k-th feature value of the j-th point in the expanded point cloud B; $E(F_{Bj})$ denotes the mean value of all features of the j-th point in the expanded point cloud B; $Cov(F_{Ai}, F_{Bj})$ denotes the covariance between the set of all features of the i-th point in the expanded point cloud A and the set of all features of the j-th point in the expanded point cloud B; $\sigma_{F_{Ai}}$ denotes the population standard deviation of all features of the i-th point in the expanded point cloud A; $\sigma_{F_{Bj}}$ denotes the population standard deviation of all features of the j-th point in the expanded point cloud B; $prs(F_{Ai}, F_{Bj})$ denotes the Pearson correlation coefficient between the i-th point in the expanded point cloud A and the j-th point in the expanded point cloud B; $abs()$ denotes taking the absolute value of the parameter; and $PR$ denotes the similarity matrix between any two points of the expanded point cloud A and the expanded point cloud B.
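Formulas (25) to (32) can be vectorized directly; the sketch below computes the absolute Pearson correlation between every pair of feature rows. The small epsilon is a numerical-safety detail not in the patent.

```python
# Pairwise absolute Pearson correlation between point-wise feature rows.
import numpy as np

def pearson_similarity(f_a, f_b):
    """f_a: (n, d), f_b: (m, d) -> PR: (n, m), PR[i, j] = |prs(F_Ai, F_Bj)|."""
    a = f_a - f_a.mean(axis=1, keepdims=True)     # center each feature row
    b = f_b - f_b.mean(axis=1, keepdims=True)
    cov = a @ b.T / f_a.shape[1]                  # Cov(F_Ai, F_Bj)
    sd_a = a.std(axis=1, keepdims=True)           # sigma_FAi (population std)
    sd_b = b.std(axis=1, keepdims=True)           # sigma_FBj
    return np.abs(cov / (sd_a @ sd_b.T + 1e-12))  # PR
```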
Optionally, performing transformation operation on the elements in the similarity matrix to obtain a relationship matrix between any two points in the femur and femur model can be implemented by the following formula:
$N_{PR_{rc}} = g \cdot \frac{PR_{rc} - \mu_r}{\sigma_r} + b$ (33)
$CM_{rc} = softmax_r\langle N_{PR_{rc}} \rangle$ (34)
wherein $g$ and $b$ denote rescale parameters; $PR_{rc}$ denotes the value in row r, column c of PR; $\mu_r$ denotes the mean of the values in row r of PR; $\sigma_r$ denotes the standard deviation of the values in row r of PR; $N_{PR_{rc}}$ denotes the normalized value of row r, column c of PR; $softmax_r\langle * \rangle$ denotes applying softmax to the parameter row by row; and $CM_{rc}$ denotes the value in row r, column c of the final correspondence matrix CM.
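For illustration, formulas (33) and (34) may be sketched as follows; the fixed values g = 1 and b = 0 are placeholders, whereas in the patent g and b are rescale parameters.

```python
# Row-wise rescaled normalization of PR followed by a row-wise softmax -> CM.
import numpy as np

def correspondence_matrix(pr, g=1.0, b=0.0):
    mu = pr.mean(axis=1, keepdims=True)                 # mu_r
    sigma = pr.std(axis=1, keepdims=True)               # sigma_r
    n_pr = g * (pr - mu) / (sigma + 1e-12) + b          # normalized N_PR
    e = np.exp(n_pr - n_pr.max(axis=1, keepdims=True))  # numerically stable softmax
    return e / e.sum(axis=1, keepdims=True)             # CM (rows sum to 1)
```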
As an alternative embodiment, determining reconstructed point cloud data of the first point cloud data and reconstructed point cloud data of the second point cloud data based on the relation matrix includes: arranging the first target point cloud data sets according to the relation matrix to obtain intermediate point cloud data of the first target point cloud data sets, and arranging the second target point cloud data sets according to the relation matrix to obtain intermediate point cloud data of the second target point cloud data sets; the method comprises the steps of determining reconstructed point cloud data of first point cloud data based on intermediate point cloud data of a first target point cloud data set, and determining reconstructed point cloud data of second point cloud data based on intermediate point cloud data of a second target point cloud data set.
In this embodiment, the intermediate point cloud data of the first target point cloud data set may be obtained by rearranging the points in the expanded point cloud a according to the final correspondence matrix CM, and the intermediate point cloud data of the second target point cloud data set may be obtained by rearranging the points in the expanded point cloud B according to the final correspondence matrix CM.
Optionally, after performing a transformation operation on the elements in the similarity matrix to obtain a relation matrix between any two points in the femur and femur model, arranging the target point cloud data sets corresponding to the original point cloud data a according to the final corresponding matrix CM to obtain intermediate point cloud data of the target point cloud data sets corresponding to the original point cloud data a, and arranging the target point cloud data sets corresponding to the other original point cloud data B according to the final corresponding matrix CM to obtain intermediate point cloud data of the target point cloud data sets corresponding to the other original point cloud data B; the reconstruction point cloud data corresponding to the original point cloud data A is determined according to the intermediate point cloud data of the target point cloud data set corresponding to the original point cloud data A, and the reconstruction point cloud data corresponding to the other original point cloud data B is determined according to the intermediate point cloud data of the target point cloud data set corresponding to the other original point cloud data B.
Optionally, the first target point cloud data set is arranged according to the relation matrix to obtain intermediate point cloud data of the first target point cloud data set, and the second target point cloud data set is arranged according to the relation matrix to obtain intermediate point cloud data of the second target point cloud data set by the following formula:
$A^* = CM\langle Aset \rangle$ (35)
$B^* = CM\langle Bset \rangle$ (36)
wherein $A^*$ denotes the intermediate point cloud obtained by rearranging the expanded point cloud A according to the correspondence matrix, and $B^*$ denotes the intermediate point cloud obtained by rearranging the expanded point cloud B according to the correspondence matrix.
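The exact rearrangement operation behind formulas (35) and (36) is not disclosed, so the sketch below treats CM as a soft permutation applied by matrix multiplication, in the spirit of CorrNet3D; the pairing of CM and its transpose with the two clouds is an assumption.

```python
# Soft rearrangement of the expanded clouds by the correspondence matrix CM.
import numpy as np

def rearrange(a, b, cm):
    """a: (n, 3), b: (m, 3), cm: (n, m) row-stochastic correspondence matrix."""
    a_star = cm.T @ a  # A*: A rearranged according to CM (assumed pairing)
    b_star = cm @ b    # B*: B rearranged according to CM (assumed pairing)
    return a_star, b_star
```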
As an alternative embodiment, determining reconstructed point cloud data of the first point cloud data based on the intermediate point cloud data of the first target point cloud data set, and determining reconstructed point cloud data of the second point cloud data based on the intermediate point cloud data of the second target point cloud data set, includes: splicing the intermediate point cloud data of the second target point cloud data set with the global features of the first point cloud data to obtain a first splicing matrix, and splicing the intermediate point cloud data of the first target point cloud data set with the global features of the second point cloud data to obtain a second splicing matrix; splicing the first replication matrix and the first splicing matrix to obtain a first target matrix, and splicing the second replication matrix and the second splicing matrix to obtain a second target matrix; and inputting the first target matrix into the multi-layer perceptron to obtain the reconstructed point cloud data of the first point cloud data, and inputting the second target matrix into the multi-layer perceptron to obtain the reconstructed point cloud data of the second point cloud data.
In this embodiment, the first splicing matrix may be obtained by splicing the global features of the expanded point cloud A with the intermediate point cloud B*, and the second splicing matrix may be obtained by splicing the global features of the expanded point cloud B with the intermediate point cloud A*. The first replication matrix may be obtained by performing a deformation operation on the global features of the first point cloud data through the point-by-point features of the first point cloud data, and the second replication matrix may be obtained by performing a deformation operation on the global features of the second point cloud data through the point-by-point features of the second point cloud data; for example, the first replication matrix may be obtained by performing a deformation operation on the global features of the expanded point cloud A through the point-by-point features of the expanded point cloud A, and the second replication matrix may be obtained by performing a deformation operation on the global features of the expanded point cloud B through the point-by-point features of the expanded point cloud B, which is merely illustrative and not particularly limited here.
Optionally, after arranging the first target point cloud data set according to the relation matrix to obtain the intermediate point cloud data of the first target point cloud data set and arranging the second target point cloud data set according to the relation matrix to obtain the intermediate point cloud data of the second target point cloud data set, the intermediate point cloud data of the expanded point cloud B is spliced with the global features of the expanded point cloud A to obtain the splicing matrix of the expanded point cloud A, and the intermediate point cloud data of the expanded point cloud A is spliced with the global features of the expanded point cloud B to obtain the splicing matrix of the expanded point cloud B; the replication matrix of the expanded point cloud A is spliced with the splicing matrix of the expanded point cloud A to obtain the target matrix of the expanded point cloud A, and the replication matrix of the expanded point cloud B is spliced with the splicing matrix of the expanded point cloud B to obtain the target matrix of the expanded point cloud B; the target matrix of the expanded point cloud A is input into a multi-layer perceptron to obtain the reconstructed point cloud data of the expanded point cloud A, namely the reconstructed point cloud data corresponding to the original point cloud data A, and the target matrix of the expanded point cloud B is input into the multi-layer perceptron to obtain the reconstructed point cloud data of the expanded point cloud B, namely the reconstructed point cloud data corresponding to the original point cloud data B.
Optionally, the global feature FG_A ∈ R^d and the global feature FG_B ∈ R^d are spliced with the intermediate point cloud B* ∈ R^{n×3} and the intermediate point cloud A* ∈ R^{n×3}, respectively; that is, the global features FG_A and FG_B are first copied directly to n×d matrices (the copied features being denoted Cop_FGA ∈ R^{n×d} and Cop_FGB ∈ R^{n×d}), and then spliced with the intermediate point cloud B* and the intermediate point cloud A*, respectively, to form n×(3+d)-dimensional matrices (the spliced matrices being denoted Con_B*CopFGA ∈ R^{n×(3+d)} and Con_A*CopFGB ∈ R^{n×(3+d)}).
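A compact sketch of this copy-and-splice step, assuming NumPy and illustrative names (splice_global_feature is not from the patent): the global feature is tiled to n rows and concatenated to the intermediate point cloud along the feature axis.

```python
import numpy as np

def splice_global_feature(intermediate_cloud, global_feature):
    """Tile a global feature FG in R^d to an (n, d) replication matrix
    and concatenate it with an (n, 3) intermediate point cloud, giving
    an n x (3 + d) splicing matrix."""
    n = intermediate_cloud.shape[0]
    cop = np.tile(global_feature, (n, 1))  # Cop_FG in R^{n x d}
    return np.concatenate([intermediate_cloud, cop], axis=1)

# e.g. Con_B*CopFGA = splice_global_feature(B_star, FG_A)  # illustrative
```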
Optionally, performing a deformation operation on the global features of the first point cloud data through the point-by-point features of the first point cloud data to obtain the first replication matrix, and performing a deformation operation on the global features of the second point cloud data through the point-by-point features of the second point cloud data to obtain the second replication matrix, may be implemented by the following formulas:
N_FArc = (F_Arc − μ_FAr) / σ_FAr (37)

N_FBrc = (F_Brc − μ_FBr) / σ_FBr (38)

NewCop_FGA = N_FA ⊙ Cop_FGA (39)

NewCop_FGB = N_FB ⊙ Cop_FGB (40)
where F_Arc denotes the value in row r, column c of F_A; μ_FAr denotes the mean of the values in row r of F_A; σ_FAr denotes the standard deviation of the values in row r of F_A; N_FArc denotes the normalized result of the value in row r, column c of F_A; F_Brc denotes the value in row r, column c of F_B; μ_FBr denotes the mean of the values in row r of F_B; σ_FBr denotes the standard deviation of the values in row r of F_B; N_FBrc denotes the normalized result of the value in row r, column c of F_B; Cop_FGArc denotes the value in row r, column c of Cop_FGA; Cop_FGBrc denotes the value in row r, column c of Cop_FGB; ⊙ denotes element-wise multiplication; NewCop_FGA ∈ R^{n×d} denotes the copy result for FG_A; and NewCop_FGB ∈ R^{n×d} denotes the copy result for FG_B. NewCop_FGA and NewCop_FGB are spliced with Con_B*CopFGA and Con_A*CopFGB, respectively, to form n×(3+2d)-dimensional matrices (the spliced matrices being denoted Con_ConB*CopFGA,NewCopFGA ∈ R^{n×(3+2d)} and Con_ConA*CopFGB,NewCopFGB ∈ R^{n×(3+2d)}), and Con_ConB*CopFGA,NewCopFGA and Con_ConA*CopFGB,NewCopFGB are input into the multi-layer perceptron for processing to obtain the reconstructed point cloud data of point cloud A and the reconstructed point cloud data of point cloud B.
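The row-wise normalization of formulas (37)–(38) and the element-wise product of formulas (39)–(40) can be sketched as follows (a NumPy approximation under the stated definitions; the small epsilon guarding against zero standard deviation is an implementation assumption):

```python
import numpy as np

def new_replication_matrix(F, cop, eps=1e-8):
    """Normalize each row of the point-wise features F (n x d) as in
    formulas (37)-(38), then multiply element-wise with the replication
    matrix cop (n x d) as in formulas (39)-(40)."""
    mu = F.mean(axis=1, keepdims=True)          # per-row mean
    sigma = F.std(axis=1, keepdims=True) + eps  # per-row standard deviation
    return (F - mu) / sigma * cop               # NewCop_FG
```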
Alternatively, the improvement of the CorrNet 3D network can be achieved by the loss functions shown in the following formulas, so as to obtain the non-rigid registration model proposed in the present application:
L_perm = ||CM · CM^T − I||_F^2 (41)

L_A = Σ_r Σ_{k∈Ω_a} ||CM_r − CM_k||_2^2 (42)

L_B = Σ_r Σ_{k∈Ω_b} ||CM_r − CM_k||_2^2 (43)
where I denotes an n×n identity matrix; CM_r denotes the r-th row of CM; CM_k denotes the k-th row of CM; Ω_a denotes the index set of the k nearest neighbors of point a_r used when the DGCNN network extracts the point-by-point features of the point cloud; Ω_b denotes the index set of the k nearest neighbors of point b_r; a_k denotes a neighbor point of point a_r; and b_k denotes a neighbor point of point b_r.
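Since the loss formulas are reproduced here only from their variable definitions, the following sketch shows just one plausible component, a permutation loss pushing CM toward a permutation matrix; the exact weighting and the neighborhood terms over Ω_a and Ω_b in the patent figures may differ.

```python
import numpy as np

def permutation_loss(CM):
    """One plausible reading of loss (41): penalize the Frobenius
    distance between CM @ CM.T and the n x n identity, so that the
    correspondence matrix is driven toward a permutation matrix."""
    n = CM.shape[0]
    residual = CM @ CM.T - np.eye(n)
    return float((residual ** 2).sum())
```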
In the embodiment of the invention, the point cloud data of the femur of the patient to be operated on is acquired as the first point cloud data and a state transition operation based on a Markov process is performed on it to obtain the first target point cloud data set; the point cloud data of the femur model is acquired as the second point cloud data and a state transition operation based on a Markov process is performed on it to obtain the second target point cloud data set. Feature extraction and pooling operations are performed on the first target point cloud data set to obtain the point-by-point features and global features of the first point cloud data, and on the second target point cloud data set to obtain the point-by-point features and global features of the second point cloud data. The point-by-point features and global features of the two point clouds are then processed by a deformation operation and a multi-layer perceptron to obtain the reconstructed point cloud data of the first point cloud data and the reconstructed point cloud data of the second point cloud data, and loss calculation is performed on the reconstructed point cloud data to obtain the point cloud registration model of the femur. This solves the technical problems of low fitting degree of the femur registration model result and low robustness of the model, and achieves the technical effects of improving the fitting degree of the femur registration model result and the robustness of the femur registration model.
Example 2
The technical solution of the embodiment of the present invention is illustrated below with reference to a preferred embodiment.
With the continuous development of the economy and the improvement of medical care, current knee joint surgical navigation systems generally use collected key points to guide scaling registration of the femur model prestored in the surgical navigation system. However, only rigid registration is used to scale and translate in each dimension, so some structurally complex and critical parts of the femur cannot be completely fitted to the prestored model, causing the technical problem of low robustness of the femur registration model.
Accordingly, in order to overcome the above problems, a related art discloses a femoral neck registration method based on point cloud segmentation, the method comprising: segmenting an acquired CT image of a fractured femoral neck to obtain two parts, a fractured femoral head A and a femoral shaft B; inputting the point cloud data of the femoral head A and the femoral shaft B into a point neural network, and segmenting to obtain a femoral head joint surface a and a femoral shaft fracture joint surface b, respectively; registering the segmented femoral head joint surface a and femoral shaft fracture joint surface b using an iterative procedure to obtain a rotation matrix M_source→target; and applying the obtained rotation matrix to the segmented femoral head A to obtain the finally spliced femur model. However, this method only calculates the error by searching for the nearest matching point of each point to obtain an optimal rotation matrix; it neither constructs a rotation matrix according to a Markov process to process the point cloud data nor obtains a non-rigid femur point cloud registration model by improving the CorrNet 3D network, so it cannot improve the robustness of the femur registration model.
In contrast, the embodiment of the invention provides a femur model registration method, which processes the point cloud data by constructing a rotation matrix with a Markov process and then obtains a femur point cloud registration model with an improved CorrNet 3D network, thereby achieving the purpose of deforming the point cloud, solving the technical problem of low robustness of the femur registration model, and realizing the technical effect of improving the robustness of the femur registration model.
Fig. 2 is a flowchart of a point cloud reconstruction method according to an embodiment of the present invention, and as shown in fig. 2, the point cloud reconstruction method may include the following steps:
step S201, determining a set R of rotation angles around the x-axis from 1 DEG to 360 DEG x
Determining a set R of rotation angles about the x-axis in a range from 1 DEG to 360 DEG x After that, the flow advances to step S202, where a one-step transition probability of the rotation angle around the x-axis is calculated.
After calculating the one-step transition probability of the rotation angle around the x-axis, the process proceeds to step S203, where R is calculated x Probability transition moments between individual elementsArray P X
In calculating R x Probability transition matrix P between elements X After that, the process advances to step S204 to calculate R x N-1 step transition probability matrix between each element.
In calculating R x After the N-1 step transition probability matrix between each element, the process proceeds to step S205 from R x Is selected randomly.
At the slave R x After randomly selecting one data, the process proceeds to step S206, where the selected data is stored in the storage space RX rotated by an angle around the x-axis.
Step S211, determining the set R_y of rotation angles around the y-axis in the range from 1° to 360°.

After determining the set R_y of rotation angles around the y-axis in the range from 1° to 360°, the flow proceeds to step S212, where the one-step transition probability of the rotation angle around the y-axis is calculated.

After calculating the one-step transition probability of the rotation angle around the y-axis, the flow proceeds to step S213, where the probability transition matrix P_Y between the elements of R_y is calculated.

After calculating the probability transition matrix P_Y between the elements of R_y, the flow proceeds to step S214, where the (N−1)-step transition probability matrix between the elements of R_y is calculated.

After calculating the (N−1)-step transition probability matrix between the elements of R_y, the flow proceeds to step S215, where one piece of data is randomly selected from R_y.

After randomly selecting one piece of data from R_y, the flow proceeds to step S216, where the selected data is stored in the storage space RY for the rotation angle around the y-axis.
Step S221, determining the set R_z of rotation angles around the z-axis in the range from 1° to 360°.

After determining the set R_z of rotation angles around the z-axis in the range from 1° to 360°, the flow proceeds to step S222, where the one-step transition probability of the rotation angle around the z-axis is calculated.

After calculating the one-step transition probability of the rotation angle around the z-axis, the flow proceeds to step S223, where the probability transition matrix P_Z between the elements of R_z is calculated.

After calculating the probability transition matrix P_Z between the elements of R_z, the flow proceeds to step S224, where the (N−1)-step transition probability matrix between the elements of R_z is calculated.

After calculating the (N−1)-step transition probability matrix between the elements of R_z, the flow proceeds to step S225, where one piece of data is randomly selected from R_z.

After randomly selecting one piece of data from R_z, the flow proceeds to step S226, where the selected data is stored in the storage space RZ for the rotation angle around the z-axis.
After the selected data are stored in the storage space RX for the rotation angle around the x-axis, the storage space RY for the rotation angle around the y-axis, and the storage space RZ for the rotation angle around the z-axis, the flow proceeds to step S231, where the angle combination A_xyz is composed.

After composing the angle combination A_xyz, the flow proceeds to step S232, where the rotation matrix Rot is constructed.

After constructing the rotation matrix Rot, the flow proceeds to step S233, where the rotation matrix set Rotset is constructed.

After constructing the rotation matrix set Rotset, the flow proceeds to step S234, where the expanded input data Aset and Bset are obtained.

After obtaining the expanded input data Aset and Bset, the flow proceeds to step S235, where the point-by-point features F_A and F_B and the global features FG_A and FG_B are calculated.
After calculating the point-by-point features F_A and F_B and the global features FG_A and FG_B, the flow proceeds to steps S236, S237, S245, S246, S248 and S250, respectively: calculating the average value of all the features of each point in point cloud A, calculating the average value of all the features of each point in point cloud B, constructing the replication matrix Cop_FGA, constructing the replication matrix Cop_FGB, normalizing each element of each row in F_A, and normalizing each element of each row in F_B.
After constructing the replication matrix Cop_FGA, the flow proceeds to step S247, where a new replication matrix NewCop_FGA is constructed; after constructing the replication matrix Cop_FGB, the flow proceeds to step S249, where a new replication matrix NewCop_FGB is constructed.

After calculating the average value of all the features of each point in point cloud A and the average value of all the features of each point in point cloud B, the flow proceeds to steps S238 and S239, where the covariance between the feature sets of each point in point cloud A and point cloud B, the overall standard deviation of all the features of each point in point cloud A, and the overall standard deviation of all the features of each point in point cloud B are calculated.

After calculating the covariance between the feature sets of each point in point cloud A and point cloud B and the overall standard deviations, the flow proceeds to step S240, where the Pearson correlation coefficients between each point in point cloud A and each point in point cloud B are calculated.

After calculating the Pearson correlation coefficients between each point in point cloud A and each point in point cloud B, the flow proceeds to step S241, where the degree of similarity ps between each point in point cloud A and each point in point cloud B is calculated.

After calculating the degree of similarity ps between each point in point cloud A and each point in point cloud B, the flow proceeds to step S242, where the similarity matrix PR between each point in point cloud A and each point in point cloud B is calculated.

After calculating the similarity matrix PR between each point in point cloud A and each point in point cloud B, the flow proceeds to step S243, where the correspondence matrix CM between point cloud A and point cloud B is calculated.

After calculating the correspondence matrix CM between point cloud A and point cloud B, the flow proceeds to step S244, where the intermediate point cloud A* of point cloud A and the intermediate point cloud B* of point cloud B are calculated.

After calculating the intermediate point cloud A* of point cloud A and the intermediate point cloud B* of point cloud B, the flow proceeds to steps S251 and S252, where the splicing matrix Con_B*CopFGA and the splicing matrix Con_A*CopFGB are constructed.

After constructing the splicing matrix Con_B*CopFGA, the flow proceeds to steps S253 and S254, where the final splicing matrix Con_ConB*CopFGA,NewCopFGA is constructed and the reconstructed point cloud of point cloud A is acquired through the MLP; after constructing the splicing matrix Con_A*CopFGB, the flow proceeds to steps S255 and S256, where the final splicing matrix Con_ConA*CopFGB,NewCopFGB is constructed and the reconstructed point cloud of point cloud B is acquired through the MLP.
FIG. 3 is a schematic diagram of a process for constructing a rotation matrix based on a Markov process according to an embodiment of the present invention. The rotation range around the x-axis is defined as 1° to 360°, the rotation direction is clockwise, and the rotation step is a°, generating 360/a angles in total; setting N = 360/a, the set of rotation angles around the x-axis can be determined as R_x, as shown in the following formula (1):
R_x = {Ang_x1, Ang_x2, ..., Ang_xN} (1)
where R_x denotes the set of rotation angles around the x-axis, and Ang_xi, i ∈ [1, N], denotes the angle by which the i-th element in the set R_x is rotated.
Optionally, during each acquisition of the rotation angle around the x-axis, each element Ang_x1, Ang_x2, Ang_x3, ..., Ang_xN of the set R_x may serve as the state value of the selection result at a certain moment in the selection process, with the set R_x serving as the state space of the whole process; the process of obtaining the rotation angle around the x-axis can then be expressed by the following formula (2):
P(S_XTn = Ang_XTn | S_XTn−1 = Ang_XTn−1, ..., S_XT1 = Ang_XT1) = P(S_XTn = Ang_XTn | S_XTn−1 = Ang_XTn−1) (2)
where S_XTn denotes the state of the rotation angle around the x-axis at time T_n; Ang_XTn ∈ R_x denotes the rotation data around the x-axis selected by the system at time T_n; S_XTn−1 denotes the state of the rotation angle around the x-axis at time T_n−1; and Ang_XTn−1 ∈ R_x denotes the rotation data around the x-axis selected by the system at time T_n−1.
Alternatively, it can be obtained from formula (2) above that, during rotation around the x-axis, the one-step transition probability of selecting the state data Ang_xn at time T_n after Ang_xn−1 has been selected at time T_n−1 is shown in the following formula (3). According to the one-step transition probability formula (3) between state values, the probability transition matrix P_X between the elements of the set R_x of rotation angles around the x-axis can be obtained, where P_X can be expressed by the following formula (4). In the initial stage, each element of P_X is given a random initial value satisfying the condition shown in the following formula (5). According to the Chapman–Kolmogorov equation, the (N−1)-step transition probability matrix between the elements equals the (N−1)-th power of the one-step transition matrix, P^(N−1) = P^(N−1), so the (N−1)-step transition probability matrix between the elements of R_x can be obtained as shown in the following formula (6). After the (N−1)-step transition probability matrix is determined, starting from a certain piece of data, the data corresponding to each probability is selected with that probability after the (N−1)-step transition; for example, if the starting data is Ang_x1, the probabilities of selecting each piece of data in R_x after the (N−1)-step transition are P^(N−1)_(Angx1,Angx1), P^(N−1)_(Angx1,Angx2), ..., P^(N−1)_(Angx1,AngxN), respectively. According to the corresponding probabilities, one piece of data is randomly selected from R_x as the x-axis rotation data, denoted by Ang_xT, Ang_xT ∈ R_x. After Ang_xT is selected, the transition matrix P_X may be updated according to the following formula (7) so that Ang_xT will not be repeatedly selected in the subsequent selection process. The processes from formula (4) to formula (7) are repeated a number of times; assuming the selection is repeated S times, a number of Ang_xT values are selected and placed in a specified storage space, whose form is shown in the following formula (8):
P(Ang_xn−1, Ang_xn) = P(S_XTn = Ang_xn | S_XTn−1 = Ang_xn−1) (3)

P_X = [P(Ang_xi, Ang_xj)], i, j ∈ [1, N] (4)

0 ≤ P(Ang_xi, Ang_xj) ≤ 1, Σ_{j=1}^{N} P(Ang_xi, Ang_xj) = 1 (5)

P_X^(N−1) = (P_X)^(N−1) = [P^(N−1)_(Angxi,Angxj)], i, j ∈ [1, N] (6)
P(Ang_xi, Ang_xT) = 0 for all i ∈ [1, N], with each row of P_X then renormalized to sum to 1 (7)
RX = {Ang_XT1, Ang_XT2, ..., Ang_XTS} (8)
where P(Ang_xn−1, Ang_xn) denotes the one-step transition probability of selecting the state data Ang_xn at time T_n after the data Ang_xn−1 has been selected at time T_n−1; P_X denotes the probability transition matrix between the elements of the set R_x of rotation angles around the x-axis; P(Ang_xi, Ang_xj), i, j ∈ [1, N], denotes the transition probability between elements of the set R_x; P_X^(N−1) denotes the (N−1)-step transition probability matrix between the elements of the set R_x; P^(N−1)_(Angxi,Angxj) denotes the (N−1)-step transition probability between elements of the set R_x; Ang_xT, Ang_xT ∈ R_x, denotes one piece of data randomly selected from R_x as the selected x-axis rotation data; RX denotes the storage space for the rotation angle around the x-axis; S denotes the number of times rotation data around the x-axis is randomly selected from R_x; and Ang_XTt, t ∈ [1, S], denotes the rotation data around the x-axis randomly selected from R_x.
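To make the sampling scheme concrete, here is a small self-contained sketch (NumPy; all names and the step size are illustrative): the one-step matrix is initialized with random row-stochastic values per condition (5), raised to the (N−1)-th power per the Chapman–Kolmogorov equation, and one angle is drawn from the row of the starting state.

```python
import numpy as np

rng = np.random.default_rng(0)
a = 10                              # rotation step in degrees (assumed)
angles = np.arange(a, 361, a)       # R_x = {a, 2a, ..., 360}, N = 360/a
N = len(angles)

P = rng.random((N, N))
P /= P.sum(axis=1, keepdims=True)   # random initial values, rows sum to 1 (condition (5))

P_n1 = np.linalg.matrix_power(P, N - 1)  # (N-1)-step transition matrix, formula (6)
start = 0                                # start from Ang_x1
idx = rng.choice(N, p=P_n1[start])       # draw per the (N-1)-step probabilities
ang_xT = angles[idx]                     # selected x-axis rotation angle
```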
Alternatively, following the selection process for the rotation angle around the x-axis, the rotation angle around the y-axis can be obtained in the same way; the process of obtaining the rotation angle around the y-axis can be implemented by the following formulas:
R_y = {Ang_y1, Ang_y2, ..., Ang_yN} (9)

P_Y = [P(Ang_yi, Ang_yj)], i, j ∈ [1, N] (10)

P_Y^(N−1) = (P_Y)^(N−1) (11)

RY = {Ang_YT1, Ang_YT2, ..., Ang_YTS} (12)
where R_y denotes the set of rotation angles around the y-axis; P_Y denotes the probability transition matrix between the elements of the set R_y; P_Y^(N−1) denotes the (N−1)-step transition probability matrix between the elements of the set R_y; and RY denotes the storage space for the rotation angle around the y-axis.
Alternatively, following the selection process for the rotation angle around the x-axis, the rotation angle around the z-axis can be obtained in the same way; the process of obtaining the rotation angle around the z-axis can be implemented by the following formulas:
R_z = {Ang_z1, Ang_z2, ..., Ang_zN} (13)

P_Z = [P(Ang_zi, Ang_zj)], i, j ∈ [1, N] (14)

P_Z^(N−1) = (P_Z)^(N−1) (15)

RZ = {Ang_ZT1, Ang_ZT2, ..., Ang_ZTS} (16)
where R_z denotes the set of rotation angles around the z-axis; P_Z denotes the probability transition matrix between the elements of the set R_z; P_Z^(N−1) denotes the (N−1)-step transition probability matrix between the elements of the set R_z; and RZ denotes the storage space for the rotation angle around the z-axis.
Optionally, after RX, RY and RZ are obtained, an element is randomly selected from each of RX, RY and RZ to compose an angle combination A_xyz, as shown in the following formula (17); a rotation matrix is constructed from the elements of A_xyz through the Z-Y-X Euler angle rotation, as shown in formula (18). The process of formula (17) is repeated a number of times, assumed to be S′ times, to compose different angle combinations A_xyz; each time a new A_xyz is formed, a rotation matrix is constructed according to formula (18), so that the rotation matrix set Rotset can be constructed, as shown in the following formula (19). After the rotation matrix set Rotset is constructed, the data is processed with the rotation matrices in Rotset, as shown in the following formula (20):
A_xyz = (Ang_xT, Ang_yT, Ang_zT), Ang_xT ∈ RX, Ang_yT ∈ RY, Ang_zT ∈ RZ (17)

Rot = R_z(Ang_zT) · R_y(Ang_yT) · R_x(Ang_xT) (18)

Rotset = {Rot_1, Rot_2, ..., Rot_S′} (19)

D′ = Rot⟨D⟩ (20)
where A_xyz denotes the angle combination of elements randomly selected from RX, RY and RZ; Rot denotes the rotation matrix constructed from the elements of A_xyz through Euler angle calculation; Rot_i, i ∈ [1, S′], denotes the i-th rotation matrix constructed from an A_xyz through Euler angle calculation; Rotset denotes the set of rotation matrices formed by randomly selecting an element from each of RX, RY and RZ a number of times (e.g., S′ times); D denotes the input training data; Rot⟨*⟩ denotes the rotation of the data; and D′ denotes the rotated data.
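Formulas (17)–(20) can be illustrated with a short NumPy sketch, assuming the common Z-Y-X Euler convention Rot = Rz·Ry·Rx with angles in degrees; the helper name and the row-vector application D′ = D·Rotᵀ are assumptions:

```python
import numpy as np

def euler_zyx_rotation(ang_x, ang_y, ang_z):
    """Build the rotation matrix Rot of formula (18) from one angle
    combination A_xyz = (ang_x, ang_y, ang_z), in degrees."""
    ax, ay, az = np.radians([ang_x, ang_y, ang_z])
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(ax), -np.sin(ax)],
                   [0, np.sin(ax),  np.cos(ax)]])
    Ry = np.array([[ np.cos(ay), 0, np.sin(ay)],
                   [0, 1, 0],
                   [-np.sin(ay), 0, np.cos(ay)]])
    Rz = np.array([[np.cos(az), -np.sin(az), 0],
                   [np.sin(az),  np.cos(az), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx  # Z-Y-X Euler rotation

# Applying formula (20) to an (n, 3) point cloud D:
# D_prime = D @ euler_zyx_rotation(30, 45, 60).T
```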
Optionally, the processed data is fed into the model network together with the raw data; that is, point cloud A is eventually expanded into A and A′ (denoted by Aset), and point cloud B is eventually expanded into B and B′ (denoted by Bset).
Fig. 4 is a schematic diagram of a point cloud reconstruction process according to an embodiment of the present invention, and as shown in fig. 4, the point cloud reconstruction process may be implemented by:
Step one, feature encoding: after the training data Aset ∈ R^{n×3} and Bset ∈ R^{n×3} are input, the DGCNN network model is used to obtain the point-by-point features F_A ∈ R^{n×d} of point cloud Aset and the point-by-point features F_B ∈ R^{n×d} of point cloud Bset, and the maximum average pooling operation is used to extract the global features FG_A ∈ R^d of point cloud Aset and FG_B ∈ R^d of point cloud Bset from F_A and F_B, where n denotes the number of points in the point cloud, 3 denotes that each point has three-dimensional coordinates (x, y, z), and d denotes the dimension of the features. The above feature encoding process can be implemented by the following formulas:
F_A = DGCNN⟨Aset⟩ (21)

F_B = DGCNN⟨Bset⟩ (22)

FG_A = MA⟨F_A⟩ (23)

FG_B = MA⟨F_B⟩ (24)
where Aset denotes the expanded set of point cloud A; F_A denotes the point-by-point features of the expanded point cloud A; Bset denotes the expanded set of point cloud B; F_B denotes the point-by-point features of the expanded point cloud B; DGCNN⟨*⟩ denotes extracting the point-by-point features of the expanded point clouds A and B with DGCNN; FG_A denotes the global features of the expanded point cloud A; FG_B denotes the global features of the expanded point cloud B; and MA⟨*⟩ denotes extracting the global features of the expanded point clouds A and B with the maximum average pooling operation.
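The patent does not spell out the maximum average pooling operation MA⟨*⟩; the sketch below assumes one common reading, averaging a max pool and a mean pool over the point dimension, to reduce the n×d point-wise features to a d-dimensional global feature:

```python
import numpy as np

def max_average_pool(F):
    """Reduce point-wise features F (n x d) to a global feature FG (d,).
    Assumed reading of MA<*>: average of max pooling and mean pooling
    over the n points."""
    return 0.5 * (F.max(axis=0) + F.mean(axis=0))
```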
Step two, calculating the matching relation: after F_A and F_B are obtained, the Pearson correlation coefficient is used to calculate the similarity between any two points in point cloud A and point cloud B; the absolute value of the result calculated from the Pearson correlation coefficient is taken and its reciprocal is used as the similarity between the two points, yielding the similarity matrix PR between any two points of point cloud A and point cloud B. After the similarity matrix PR is obtained, the elements in PR are normalized and softmax-processed in units of rows to obtain the final correspondence matrix CM ∈ R^{n×n} between point cloud A and point cloud B. The above process of calculating the matching relation can be implemented by the following formulas:
E(F_Ai) = (1/d) Σ_{k=1}^{d} F_Aik (25)

E(F_Bj) = (1/d) Σ_{k=1}^{d} F_Bjk (26)

Cov(F_Ai, F_Bj) = (1/d) Σ_{k=1}^{d} (F_Aik − E(F_Ai)) (F_Bjk − E(F_Bj)) (27)

σ_FAi = ((1/d) Σ_{k=1}^{d} (F_Aik − E(F_Ai))^2)^{1/2} (28)

σ_FBj = ((1/d) Σ_{k=1}^{d} (F_Bjk − E(F_Bj))^2)^{1/2} (29)

Prs(F_Ai, F_Bj) = Cov(F_Ai, F_Bj) / (σ_FAi · σ_FBj) (30)

ps(i, j) = 1 / abs(Prs(F_Ai, F_Bj)) (31)

PR_ij = g · ps(i, j) + b (32)

N_PRrc = (PR_rc − μ_r) / σ_r (33)

CM_rc = softmax_r(N_PRrc) = exp(N_PRrc) / Σ_{c′=1}^{n} exp(N_PRrc′) (34)
where F_Ai denotes the set of all features of the i-th point in the expanded point cloud A; F_Aik denotes the k-th feature value of the i-th point in the expanded point cloud A; E(F_Ai) denotes the mean of all features of the i-th point in the expanded point cloud A; F_Bj denotes the set of all features of the j-th point in the expanded point cloud B; F_Bjk denotes the k-th feature value of the j-th point in the expanded point cloud B; E(F_Bj) denotes the mean of all features of the j-th point in the expanded point cloud B; Cov(F_Ai, F_Bj) denotes the covariance between the feature set of the i-th point in the expanded point cloud A and the feature set of the j-th point in the expanded point cloud B; σ_FAi denotes the overall standard deviation of all features of the i-th point in the expanded point cloud A; σ_FBj denotes the overall standard deviation of all features of the j-th point in the expanded point cloud B; Prs(F_Ai, F_Bj) denotes the Pearson correlation coefficient between the i-th point in the expanded point cloud A and the j-th point in the expanded point cloud B; abs(*) denotes taking the absolute value of the parameter; PR denotes the similarity matrix between any two points of the expanded point cloud A and the expanded point cloud B; g and b denote rescaling parameters; PR_rc denotes the value in row r, column c of PR; μ_r denotes the mean of the values in row r of PR; σ_r denotes the standard deviation of the values in row r of PR; N_PRrc denotes the normalized result of the value in row r, column c of PR; softmax_r(*) denotes softmax processing of the parameter in units of rows; and CM_rc denotes the value in row r, column c of the final correspondence matrix CM.
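Formulas (25)–(34) map directly onto a vectorized computation. The sketch below follows the stated pipeline (Pearson correlation, reciprocal of its absolute value, rescaling with g and b, row-wise normalization, row-wise softmax); the linear form of the rescaling in (32) and the epsilon guards are assumptions:

```python
import numpy as np

def correspondence_matrix(F_A, F_B, g=1.0, b=0.0, eps=1e-8):
    """Compute CM (n x n) from point-wise features F_A, F_B (n x d)."""
    FA = F_A - F_A.mean(axis=1, keepdims=True)       # center rows, (25)
    FB = F_B - F_B.mean(axis=1, keepdims=True)       # center rows, (26)
    cov = FA @ FB.T / F_A.shape[1]                   # covariance, (27)
    sd = np.outer(F_A.std(axis=1), F_B.std(axis=1))  # std products, (28)-(29)
    prs = cov / (sd + eps)                           # Pearson coefficients, (30)
    PR = g / (np.abs(prs) + eps) + b                 # similarity, (31)-(32)
    N = (PR - PR.mean(axis=1, keepdims=True)) / (PR.std(axis=1, keepdims=True) + eps)  # (33)
    E = np.exp(N - N.max(axis=1, keepdims=True))     # numerically stable softmax
    return E / E.sum(axis=1, keepdims=True)          # row-wise softmax, (34)
```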
Step three, determining the non-rigid registration model: after the final correspondence matrix CM ∈ R^{n×n} is obtained by calculation, the points in point cloud A and point cloud B are rearranged according to the correspondence matrix to obtain the intermediate point clouds A* ∈ R^{n×3} and B* ∈ R^{n×3}, as in the following formulas (35) to (36). The calculated global features FG_A ∈ R^d and FG_B ∈ R^d are spliced with the intermediate point clouds B* ∈ R^{n×3} and A* ∈ R^{n×3}, respectively; that is, the global features FG_A and FG_B are directly copied to n×d matrices (the copied features being denoted Cop_FGA ∈ R^{n×d} and Cop_FGB ∈ R^{n×d}) and spliced with the intermediate point clouds B* and A*, respectively, to form n×(3+d)-dimensional matrices (the spliced matrices being denoted Con_B*CopFGA ∈ R^{n×(3+d)} and Con_A*CopFGB ∈ R^{n×(3+d)}). Each element of each row of the calculated point-by-point features F_A and F_B is normalized in units of rows, and the values in the normalized results are multiplied element-by-element with the values in Cop_FGA and Cop_FGB to obtain NewCop_FGA and NewCop_FGB, as in the following formulas (37) to (40). The obtained NewCop_FGA and NewCop_FGB are spliced with Con_B*CopFGA and Con_A*CopFGB, respectively, to form n×(3+2d)-dimensional matrices (the spliced matrices being denoted Con_ConB*CopFGA,NewCopFGA ∈ R^{n×(3+2d)} and Con_ConA*CopFGB,NewCopFGB ∈ R^{n×(3+2d)}), and Con_ConB*CopFGA,NewCopFGA and Con_ConA*CopFGB,NewCopFGB are input into the MLP respectively to obtain the reconstructed point cloud Â of point cloud A and the reconstructed point cloud B̂ of point cloud B. The improvement of the CorrNet 3D network is realized by the loss functions shown in the following formulas (41) to (43), thereby obtaining the non-rigid registration model proposed in the present application:
A* = CM^T · A (35)

B* = CM · B (36)

N_FArc = (F_Arc − μ_FAr) / σ_FAr (37)

N_FBrc = (F_Brc − μ_FBr) / σ_FBr (38)

NewCop_FGA = N_FA ⊙ Cop_FGA (39)

NewCop_FGB = N_FB ⊙ Cop_FGB (40)
L_perm = ||CM · CM^T − I||_F^2 (41)

L_A = Σ_r Σ_{k∈Ω_a} ||CM_r − CM_k||_2^2 (42)

L_B = Σ_r Σ_{k∈Ω_b} ||CM_r − CM_k||_2^2 (43)
where A* denotes the intermediate point cloud obtained by rearranging the expanded point cloud A according to the correspondence matrix; B* denotes the intermediate point cloud obtained by rearranging the expanded point cloud B according to the correspondence matrix; F_Arc denotes the value in row r, column c of F_A; μ_FAr denotes the mean of the values in row r of F_A; σ_FAr denotes the standard deviation of the values in row r of F_A; N_FArc denotes the normalized result of the value in row r, column c of F_A; F_Brc denotes the value in row r, column c of F_B; μ_FBr denotes the mean of the values in row r of F_B; σ_FBr denotes the standard deviation of the values in row r of F_B; N_FBrc denotes the normalized result of the value in row r, column c of F_B; Cop_FGArc denotes the value in row r, column c of Cop_FGA; Cop_FGBrc denotes the value in row r, column c of Cop_FGB; ⊙ denotes element-wise multiplication; NewCop_FGA ∈ R^{n×d} denotes the copy result for FG_A; NewCop_FGB ∈ R^{n×d} denotes the copy result for FG_B (NewCop_FGA and NewCop_FGB are spliced with Con_B*CopFGA and Con_A*CopFGB, respectively, to form the n×(3+2d)-dimensional matrices Con_ConB*CopFGA,NewCopFGA ∈ R^{n×(3+2d)} and Con_ConA*CopFGB,NewCopFGB ∈ R^{n×(3+2d)}); I denotes an n×n identity matrix; CM_r denotes the r-th row of CM; CM_k denotes the k-th row of CM; Ω_a denotes the index set of the k nearest neighbors of point a_r used when the DGCNN network extracts the point-by-point features of the point cloud; Ω_b denotes the index set of the k nearest neighbors of point b_r; a_k denotes a neighbor point of point a_r; and b_k denotes a neighbor point of point b_r.
In this embodiment, the set R_x of rotation angles around the x-axis in the range from 1° to 360° is first determined; the one-step transition probability of the rotation angle around the x-axis and the (N−1)-step transition probability matrix between the elements of R_x are calculated; one piece of data is randomly selected from R_x and stored in the storage space RX for the rotation angle around the x-axis; the rotation angle around the y-axis and the rotation angle around the z-axis are selected following the selection process for the rotation angle around the x-axis; the angle combination A_xyz is composed, and the rotation matrix Rot and the rotation matrix set Rotset are constructed, thereby obtaining the expanded input data Aset and Bset. The point-by-point features F_A and F_B and the global features FG_A and FG_B are calculated, and the correspondence matrix CM between point cloud A and point cloud B is determined from them; the points in point cloud A and point cloud B are rearranged according to the correspondence matrix CM to obtain the intermediate point clouds A* and B*; the obtained intermediate point clouds A* and B* each then undergo two matrix splicings, and the spliced matrices are input into the MLP to obtain the reconstructed point cloud Â of point cloud A and the reconstructed point cloud B̂ of point cloud B. Loss calculation is performed on the reconstructed point clouds Â and B̂, so that the non-rigid registration model can be obtained, which solves the technical problems of low fitting degree of the femur registration model result and low robustness of the model, and achieves the technical effects of improving the fitting degree of the femur registration model result and the robustness of the femur registration model.
Example 3
According to an embodiment of the present invention, a model registration apparatus for the femur is also provided. The apparatus may be used to perform the model registration method for the femur in embodiment 1.
Fig. 5 is a schematic view of a model registration apparatus for a femur in accordance with an embodiment of the present invention. As shown in fig. 5, a model registration apparatus 500 for a femur may include: an acquisition unit 501, a transfer unit 502, a pooling unit 503, a morphing unit 504, and a calculation unit 505.
The obtaining unit 501 is configured to acquire first point cloud data and second point cloud data of a femur, where the first point cloud data is point cloud data of the femur, and the second point cloud data is another point cloud data of the femur.
The transferring unit 502 is configured to perform a state transferring operation on the first point cloud data to obtain a first target point cloud data set, and perform a state transferring operation on the second point cloud data to obtain a second target point cloud data set.
The pooling unit 503 is configured to perform a pooling operation on the first target point cloud data set to obtain global features of the first point cloud data, and perform a pooling operation on the second target point cloud data set to obtain global features of the second point cloud data.
And a deforming unit 504, configured to perform a deforming operation on the global feature of the first point cloud data to obtain reconstructed point cloud data of the first point cloud data, and perform a deforming operation on the global feature of the second point cloud data to obtain reconstructed point cloud data of the second point cloud data.
The computing unit 505 is configured to perform loss computation on the reconstructed point cloud data of the first point cloud data and the reconstructed point cloud data of the second point cloud data to obtain a point cloud registration model of the femur, where the point cloud registration model is used to attach the femur to the femoral prosthesis so as to register the femoral prosthesis with each part of the femur.
Alternatively, the transfer unit 502 may include: the transfer module is used for executing rotation operation on the first point cloud data to obtain a rotation angle space of the first point cloud data, and executing rotation operation on the second point cloud data to obtain a rotation angle space of the second point cloud data; the first determining module is used for determining a first target point cloud data set based on the rotation angle space of the first point cloud data and determining a second target point cloud data set based on the rotation angle space of the second point cloud data.
Optionally, the first determining module may include: the first determining submodule is used for determining n rotation matrixes of the first point cloud data in the rotation angle space of the first point cloud data and determining n rotation matrixes of the second point cloud data in the rotation angle space of the second point cloud data, wherein n is a natural number larger than 0; the combining sub-module is used for combining the n rotation matrixes of the first point cloud data to obtain a rotation matrix set of the first point cloud data, and combining the n rotation matrixes of the second point cloud data to obtain a rotation matrix set of the second point cloud data; the second determining sub-module is used for determining the rotation matrix set of the first point cloud data and the initial matrix set of the first point cloud data as a first target point cloud data set, and determining the rotation matrix set of the second point cloud data and the initial matrix set of the second point cloud data as a second target point cloud data set.
Alternatively, the pooling unit 503 may include: the extraction module is used for executing feature extraction operation on the first target point cloud data set to obtain point-by-point features of the first point cloud data, and executing feature extraction operation on the second target point cloud data set to obtain point-by-point features of the second point cloud data; the pooling module is used for performing pooling operation on the point-by-point characteristics of the first point cloud data to obtain global characteristics of the first point cloud data, and performing pooling operation on the point-by-point characteristics of the second point cloud data to obtain global characteristics of the second point cloud data.
Alternatively, the deforming unit 504 may include: the computing module is used for performing similarity computation on the point-by-point characteristics of the first point cloud data and the point-by-point characteristics of the second point cloud data to obtain a similarity matrix between any two points in the femur and the femur model; the transformation module is used for performing transformation operation on the elements in the similarity matrix to obtain a relation matrix between any two points in the femur and the femur model; and the second determining module is used for determining the reconstructed point cloud data of the first point cloud data and the reconstructed point cloud data of the second point cloud data based on the relation matrix.
Optionally, the second determining module may include: the arrangement sub-module is used for arranging the first target point cloud data set according to the relation matrix to obtain intermediate point cloud data of the first target point cloud data set, and arranging the second target point cloud data set according to the relation matrix to obtain intermediate point cloud data of the second target point cloud data set; and the third determining sub-module is used for determining the reconstruction point cloud data of the first point cloud data based on the intermediate point cloud data of the first target point cloud data set and determining the reconstruction point cloud data of the second point cloud data based on the intermediate point cloud data of the second target point cloud data set.
Optionally, the third determination submodule may be further configured to: splicing the global features of the intermediate point cloud data and the first point cloud data of the second target point cloud data set to obtain a first splicing matrix, and splicing the global features of the intermediate point cloud data and the second point cloud data of the first target point cloud data set to obtain a second splicing matrix; splicing the first replication matrix and the first splicing matrix to obtain a first target matrix, and splicing the second replication matrix and the second splicing matrix to obtain a second target matrix, wherein the first replication matrix is obtained by performing deformation operation on point-by-point characteristics of the first point cloud data, and the second replication matrix is obtained by performing deformation operation on the point-by-point characteristics of the second point cloud data; and inputting the first target matrix into the multi-layer perceptron to obtain the reconstructed point cloud data of the first point cloud data, and inputting the second target matrix into the multi-layer perceptron to obtain the reconstructed point cloud data of the second point cloud data.
In this embodiment, the obtaining unit is configured to obtain first point cloud data and second point cloud data of the femur, where the first point cloud data is point cloud data of the femur, and the second point cloud data is another point cloud data of the femur; the transfer unit is used for performing state transfer operation on the first point cloud data to obtain a first target point cloud data set, and performing state transfer operation on the second point cloud data to obtain a second target point cloud data set; the pooling unit is used for performing pooling operation on the first target point cloud data set to obtain global features of the first point cloud data, and performing pooling operation on the second target point cloud data set to obtain global features of the second point cloud data; the deformation unit is used for executing deformation operation on the global features of the first point cloud data to obtain reconstructed point cloud data of the first point cloud data, and executing deformation operation on the global features of the second point cloud data to obtain reconstructed point cloud data of the second point cloud data; the calculation unit is used for carrying out loss calculation on the reconstructed point cloud data of the first point cloud data and the reconstructed point cloud data of the second point cloud data to obtain a point cloud registration model of the femur, wherein the point cloud registration model is used for attaching the femur to the femoral prosthesis so as to register each part of the femur and the femoral prosthesis, the technical problems that the result attaching degree of the femoral registration model is low and the robustness of the model is low are solved, and the technical effects of improving the result attaching degree of the femoral registration model and the robustness of the femoral registration model are achieved.
Example 4
According to an embodiment of the present invention, there is also provided a computer-readable storage medium. The computer-readable storage medium includes a stored program, wherein the apparatus in which the computer-readable storage medium is located is controlled to execute the model registration method of the femur in embodiment 1 when the program is run.
Example 5
According to an embodiment of the invention, a processor is also provided. The processor is used to run a program, wherein the program when run by the processor performs the model registration method of the femur of example 1.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present invention, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technology content may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary, and the division of units may be a logic function division, and there may be another division manner in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interfaces, units or modules, or may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server or a network device, etc.) to perform all or part of the steps of the method of the various embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely a preferred embodiment of the present invention and it should be noted that modifications and adaptations to those skilled in the art may be made without departing from the principles of the present invention, which are intended to be comprehended within the scope of the present invention.

Claims (7)

1. A method of model registration of a femur, comprising:
acquiring first point cloud data and second point cloud data of a femur, wherein the first point cloud data is the point cloud data of the femur, and the second point cloud data is the other point cloud data of the femur;
performing state transition operation on the first point cloud data to obtain a first target point cloud data set, and performing state transition operation on the second point cloud data to obtain a second target point cloud data set;
executing pooling operation on the first target point cloud data set to obtain global features of the first point cloud data, and executing pooling operation on the second target point cloud data set to obtain global features of the second point cloud data;
performing deformation operation on the global features of the first point cloud data to obtain reconstructed point cloud data of the first point cloud data, and performing deformation operation on the global features of the second point cloud data to obtain reconstructed point cloud data of the second point cloud data;
Performing loss calculation on the reconstructed point cloud data of the first point cloud data and the reconstructed point cloud data of the second point cloud data to obtain a point cloud registration model of the femur, wherein the point cloud registration model is used for attaching the femur to a femoral prosthesis so as to register the femoral prosthesis with each part of the femur;
wherein performing a deformation operation on the global features of the first point cloud data to obtain the reconstructed point cloud data of the first point cloud data, and performing a deformation operation on the global features of the second point cloud data to obtain the reconstructed point cloud data of the second point cloud data, comprises the following steps: performing similarity calculation on the point-by-point features of the first point cloud data and the point-by-point features of the second point cloud data by using Pearson correlation coefficients to obtain a similarity matrix between any two points in the femur and the femur model, wherein the Pearson correlation coefficients are used for representing correlation coefficients between the i-th point in the first point cloud data and the j-th point in the second point cloud data; normalizing and performing softmax processing on the elements in the similarity matrix in units of rows to obtain a relation matrix between any two points in the femur and the femur model; arranging the first target point cloud data set according to the relation matrix to obtain intermediate point cloud data of the first target point cloud data set, and arranging the second target point cloud data set according to the relation matrix to obtain intermediate point cloud data of the second target point cloud data set; splicing the intermediate point cloud data of the second target point cloud data set and the global features of the first point cloud data to obtain a first splicing matrix, and splicing the intermediate point cloud data of the first target point cloud data set and the global features of the second point cloud data to obtain a second splicing matrix; splicing a first replication matrix and the first splicing matrix to obtain a first target matrix, and splicing a second replication matrix and the second splicing matrix to obtain a second target matrix, wherein the first replication matrix is obtained by performing a deformation operation on the point-by-point features of the first point cloud data, and the second replication matrix is obtained by performing a deformation operation on the point-by-point features of the second point cloud data; and inputting the first target matrix into a multi-layer perceptron to obtain the reconstructed point cloud data of the first point cloud data, and inputting the second target matrix into the multi-layer perceptron to obtain the reconstructed point cloud data of the second point cloud data.
2. The method of claim 1, wherein performing a state transition operation on the first point cloud data to obtain the first target point cloud data set, and performing a state transition operation on the second point cloud data to obtain the second target point cloud data set, comprises:
performing a rotation operation on the first point cloud data to obtain a rotation angle space of the first point cloud data, and performing a rotation operation on the second point cloud data to obtain a rotation angle space of the second point cloud data;
the first set of target point cloud data is determined based on a rotation angle space of the first point cloud data, and the second set of target point cloud data is determined based on a rotation angle space of the second point cloud data.
3. The method of claim 2, wherein determining the first set of target point cloud data based on the rotation angle space of the first point cloud data and determining the second set of target point cloud data based on the rotation angle space of the second point cloud data comprises:
determining n rotation matrices of the first point cloud data in a rotation angle space of the first point cloud data, and determining n rotation matrices of the second point cloud data in a rotation angle space of the second point cloud data, wherein n is a natural number greater than 0;
Combining the n rotation matrixes of the first point cloud data to obtain a rotation matrix set of the first point cloud data, and combining the n rotation matrixes of the second point cloud data to obtain a rotation matrix set of the second point cloud data;
determining a rotation matrix set of the first point cloud data and an initial matrix set of the first point cloud data as the first target point cloud data set, and determining a rotation matrix set of the second point cloud data and an initial matrix set of the second point cloud data as the second target point cloud data set.
4. The method of claim 1, wherein performing a pooling operation on the first set of target point cloud data results in global features of the first point cloud data and performing a pooling operation on the second set of target point cloud data results in global features of the second point cloud data, comprising:
performing feature extraction operation on the first target point cloud data set to obtain point-by-point features of the first point cloud data, and performing feature extraction operation on the second target point cloud data set to obtain point-by-point features of the second point cloud data;
And executing pooling operation on the point-by-point characteristics of the first point cloud data to obtain global characteristics of the first point cloud data, and executing pooling operation on the point-by-point characteristics of the second point cloud data to obtain the global characteristics of the second point cloud data.
5. A model registration apparatus for a femur, comprising:
an acquisition unit, wherein the acquisition unit is used for acquiring first point cloud data and second point cloud data of a femur, the first point cloud data being point cloud data of the femur, and the second point cloud data being point cloud data of a model of the femur;
a transfer unit, used for performing a state transfer operation on the first point cloud data to obtain a first target point cloud data set, and performing a state transfer operation on the second point cloud data to obtain a second target point cloud data set;
a pooling unit, used for performing a pooling operation on the first target point cloud data set to obtain global features of the first point cloud data, and performing a pooling operation on the second target point cloud data set to obtain global features of the second point cloud data;
a deformation unit, used for performing a deformation operation on the global features of the first point cloud data to obtain reconstructed point cloud data of the first point cloud data, and performing a deformation operation on the global features of the second point cloud data to obtain reconstructed point cloud data of the second point cloud data;
a computing unit, used for performing a loss calculation on the reconstructed point cloud data of the first point cloud data and the reconstructed point cloud data of the second point cloud data to obtain a point cloud registration model of the femur, wherein the point cloud registration model is used for fitting the femur to a femoral prosthesis so as to register each part of the femur with the femoral prosthesis;
wherein the deformation unit obtains the reconstructed point cloud data of the first point cloud data and the reconstructed point cloud data of the second point cloud data by performing the following steps:
performing a similarity calculation on the point-by-point features of the first point cloud data and the point-by-point features of the second point cloud data by using a Pearson correlation coefficient to obtain a similarity matrix between any two points in the femur and the femur model, wherein the Pearson correlation coefficient represents the correlation between an ith point in the first point cloud data and a jth point in the second point cloud data;
normalizing and classifying the elements of the similarity matrix row by row to obtain a relation matrix between any two points in the femur and the femur model;
rearranging the first target point cloud data set according to the relation matrix to obtain intermediate point cloud data of the first target point cloud data set, and rearranging the second target point cloud data set according to the relation matrix to obtain intermediate point cloud data of the second target point cloud data set;
concatenating the intermediate point cloud data of the second target point cloud data set with the global features of the first point cloud data to obtain a first concatenated matrix, and concatenating the intermediate point cloud data of the first target point cloud data set with the global features of the second point cloud data to obtain a second concatenated matrix;
concatenating a first replication matrix with the first concatenated matrix to obtain a first target matrix, and concatenating a second replication matrix with the second concatenated matrix to obtain a second target matrix, wherein the first replication matrix is obtained by performing a deformation operation on the point-by-point features of the first point cloud data, and the second replication matrix is obtained by performing a deformation operation on the point-by-point features of the second point cloud data;
and inputting the first target matrix into a multi-layer perceptron to obtain the reconstructed point cloud data of the first point cloud data, and inputting the second target matrix into the multi-layer perceptron to obtain the reconstructed point cloud data of the second point cloud data.
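The computing unit's loss is not pinned down by the claims; a symmetric Chamfer distance between the two reconstructed clouds is one common choice for training this kind of registration network and is shown below purely as an assumption.

```python
# Hedged sketch of one possible loss for the computing unit: a symmetric
# Chamfer distance between the reconstructed clouds (an assumption; the
# claims do not specify the loss function).
import numpy as np

def chamfer(a, b):
    """Mean squared nearest-neighbour distance, a -> b plus b -> a."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1)  # (Na, Nb)
    return float(d2.min(axis=1).mean() + d2.min(axis=0).mean())

rng = np.random.default_rng(3)
loss = chamfer(rng.normal(size=(64, 3)), rng.normal(size=(64, 3)))
print(f"{loss:.4f}")
```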
6. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a stored program, wherein, when the program runs, the program controls a device in which the computer-readable storage medium is located to perform the model registration method for a femur according to any one of claims 1 to 4.
7. A processor, characterized in that the processor is configured to run a program, wherein, when run by the processor, the program performs the model registration method for a femur according to any one of claims 1 to 4.
CN202311034475.8A 2023-08-16 2023-08-16 Model registration method, device, storage medium and processor for femur Active CN116758127B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311034475.8A CN116758127B (en) 2023-08-16 2023-08-16 Model registration method, device, storage medium and processor for femur


Publications (2)

Publication Number Publication Date
CN116758127A CN116758127A (en) 2023-09-15
CN116758127B true CN116758127B (en) 2023-12-19

Family

ID=87950017

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311034475.8A Active CN116758127B (en) 2023-08-16 2023-08-16 Model registration method, device, storage medium and processor for femur

Country Status (1)

Country Link
CN (1) CN116758127B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106264731A (en) * 2016-10-11 2017-01-04 昆明医科大学第附属医院 Method for constructing a virtual knee joint unicondylar replacement model based on point-to-point registration
CN113077498A (en) * 2021-05-06 2021-07-06 北京爱康宜诚医疗器材有限公司 Pelvis registration method, pelvis registration device and pelvis registration system
CN113450294A (en) * 2021-06-07 2021-09-28 刘星宇 Multi-modal medical image registration and fusion method and device and electronic equipment
WO2022257344A1 (en) * 2021-06-07 2022-12-15 刘星宇 Image registration fusion method and apparatus, model training method, and electronic device
CN113409301A (en) * 2021-07-12 2021-09-17 上海精劢医疗科技有限公司 Point cloud segmentation-based femoral neck registration method, system and medium
CN115511960A (en) * 2022-09-05 2022-12-23 重庆博仕康科技有限公司 Method and device for positioning central axis of femur, computer equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Study on the influence of the number of selected points on the accuracy of CT-navigation-based femoral registration; 曹延祥; 赵燕鹏; 徐晓军; 张立海; 王国旗; 李建涛; 周坚锋; 陈宇翔; 娄盛涵; 唐佩福; China Digital Medicine (Issue 09); full text *


Similar Documents

Publication Publication Date Title
Zhang et al. Improved deep hashing with soft pairwise similarity for multi-label image retrieval
De Silva et al. Persistent cohomology and circular coordinates
EP0910832B1 (en) Method and apparatus for image registration
Wang et al. Laplacian pyramid adversarial network for face completion
CN113139996B (en) Point cloud registration method and system based on three-dimensional point cloud geometric feature learning
Li et al. Object matching using a locally affine invariant and linear programming techniques
CN113436211B (en) Medical image active contour segmentation method based on deep learning
CN113593001A (en) Target object three-dimensional reconstruction method and device, computer equipment and storage medium
CN112651345A (en) Human body posture recognition model optimization method and device and terminal equipment
CN112767463A (en) Countermeasure registration method and device, computer equipment and storage medium
CN114612902A (en) Image semantic segmentation method, device, equipment, storage medium and program product
CN113158970B (en) Action identification method and system based on fast and slow dual-flow graph convolutional neural network
CN111209946B (en) Three-dimensional image processing method, image processing model training method and medium
Korez et al. Intervertebral disc segmentation in MR images with 3D convolutional networks
Hudagi et al. Bayes-probabilistic-based fusion method for image inpainting
CN114708952B (en) Image annotation method and device, storage medium and electronic equipment
CN115564915A (en) Map construction method and device for environment digital area of transformer substation
CN115239740A (en) GT-UNet-based full-center segmentation algorithm
CN115861396A (en) Medical image registration method based on deep learning
Hua et al. Spectral geometry of shapes
CN113496468B (en) Depth image restoration method, device and storage medium
CN114693873A (en) Point cloud completion method based on dynamic graph convolution and attention mechanism
CN113723417A (en) Image matching method, device and equipment based on single view and storage medium
CN112837420A (en) Method and system for completing shape of terracotta warriors point cloud based on multi-scale and folding structure

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant