CN114360349A - Operation training device and system and operation instrument movement information acquisition method - Google Patents
- Publication number: CN114360349A (application CN202111499425.8A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
- G09B23/28—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
Abstract
The application relates to the field of simulation training, and in particular to a surgical training device and system and a surgical instrument motion information acquisition method. The surgical training device comprises at least one surgical instrument and an operation space model. The surgical instrument includes an operation portion, a connection portion, and a clamping portion. A first attitude sensor is arranged at the end of the connection portion close to the clamping portion, a second attitude sensor is arranged on the operation portion, and a distance sensor is arranged at the end of the connection portion close to the operation portion. The operation space model comprises at least one operation hole, and the connection portion and the clamping portion of the surgical instrument pass through the operation hole and extend into the operation space model. The motion information of the surgical instrument in the operation space model is acquired from the data collected by the first attitude sensor, the second attitude sensor, and the distance sensor. The algorithm for obtaining the motion information is simple, which can effectively reduce the implementation cost of the scheme.
Description
Technical Field
The application relates to the field of simulation training, in particular to a surgical training device and system and a surgical instrument movement information acquisition method.
Background
A virtual surgery training system can graphically simulate a virtual medical environment through computer technology and is widely used in the skill training and assessment of surgeons. Such a system needs to record the motion information of the surgical instruments as the user operates, and trains or evaluates the operator based on that motion information.
The existing virtual operation training system needs the cooperation of a plurality of devices to simulate the medical environment and obtain the motion information of the surgical instruments in the operation process of an operator. The algorithm for acquiring the motion information of the surgical instrument is complex, and the use cost of the virtual surgery training system is high.
Disclosure of Invention
The application mainly aims to provide a surgical training device, a surgical training system and a surgical instrument motion information acquisition method, and aims to solve the problems that an algorithm for acquiring surgical instrument motion information is complex, and the use cost of a virtual surgical training system is high.
In a first aspect, the present application provides a surgical training apparatus comprising: at least one surgical instrument and an operation space model. The surgical instrument comprises an operation portion, a connection portion, and a clamping portion; one end of the connection portion is connected with the clamping portion, the operation portion is rotatably connected with the other end of the connection portion, and the operation portion, when moved, drives the clamping portion to open. A first attitude sensor is arranged at the end of the connection portion close to the clamping portion, a second attitude sensor is arranged on the operation portion, and a distance sensor is arranged at the end of the connection portion close to the operation portion. The operation space model comprises at least one operation hole, and the connection portion and the clamping portion of the surgical instrument pass through the operation hole and extend into the operation space model.
The surgical training device provided by the application includes at least one surgical instrument and an operation space model; the surgical instrument can extend into the operation space model to perform a simulated operation. The motion information of the surgical instrument in the operation space model is acquired from the data collected by the first attitude sensor, the second attitude sensor, and the distance sensor, so that simulated surgical training is realized at a relatively low cost.
In some embodiments, a third attitude sensor is disposed at the origin of the reference frame where the surgical space model is located.
In some embodiments, the surgical training apparatus further includes a control component communicatively coupled to the first attitude sensor, the second attitude sensor, the third attitude sensor, and the distance sensor.
In some embodiments, the operation space model further comprises at least one observation hole.
In a second aspect, the present application provides a method for acquiring motion information of a surgical instrument, which is applied to the surgical training apparatus provided in the first aspect, and the method includes: and acquiring first motion information of the clamping part in the operation space model through data acquired by the first attitude sensor and the distance sensor, wherein the first motion information is used for representing the moving state of the surgical instrument. And acquiring second motion information of the clamping part through data acquired by the second attitude sensor, wherein the second motion information is used for representing the operating state of the clamping part.
The surgical instrument movement information acquisition method provided by the application is based on the surgical training device provided by the first aspect, and the movement state and the operation state of the surgical instrument are calculated through data acquired by arranging a plurality of sensors on the surgical instrument. The algorithm is simple and the implementation cost of the scheme can be effectively reduced.
In some embodiments, the first motion information of the clamping portion in the operation space model comprises a rotation angle of the clamping portion, and the angular coordinate of the reference frame origin where the operation space model is located is acquired by the third attitude sensor.
Acquiring the rotation angle of the clamping part in the operation space model, comprising the following steps: a first angular coordinate of the clamping portion is acquired through the first attitude sensor. And acquiring a second angular coordinate of the clamping part according to the angular coordinate of the origin of the reference system where the operation space model is located and the first angular coordinate, wherein the second angular coordinate is the coordinate of the clamping part in the reference system where the operation space model is located, and the second angular coordinate is used for expressing the rotation angle of the clamping part.
In some embodiments, obtaining the second angular coordinate of the clamping portion according to the angular coordinate of the origin of the reference system where the operation space model is located and the first angular coordinate comprises: and subtracting the angular coordinate of the origin of the reference system where the operation space model is positioned from the first angular coordinate to obtain a second angular coordinate.
In some embodiments, the second motion information of the grip includes an opening angle of the grip.
Obtaining the opening angle of the clamping portion includes: acquiring a third angular coordinate of the operation portion through the second attitude sensor, where the third angular coordinate is the angular coordinate of the operation portion in the reference system where the operation space model is located and is used to represent the rotation angle of the operation portion; and acquiring the opening angle of the clamping portion according to the angular coordinate of the origin of the reference system where the operation space model is located, the first angular coordinate, and the third angular coordinate.
In some embodiments, obtaining the opening angle of the clamping portion according to the angular coordinate of the origin of the reference system where the operation space model is located, the first angular coordinate, and the third angular coordinate includes: subtracting the roll angle of the first angular coordinate and the roll angle of the angular coordinate of the origin of the reference system where the operation space model is located from the roll angle of the third angular coordinate to obtain the opening angle of the clamping portion.
In some embodiments, the first motion information of the clamp within the surgical space model further includes a displacement distance of the clamp.
Obtaining the displacement distance of the clamping portion within the operation space model includes: acquiring measurement data of a plurality of distance sensors, where the measurement data is the distance between a distance sensor and the inner wall of the operation space model; acquiring a plurality of coordinates of the clamping portion according to the measurement data of the plurality of distance sensors and the coordinates of the operation hole in the reference system where the operation space model is located; and calculating the displacement distance of the clamping portion from the plurality of coordinates of the clamping portion.
In some embodiments, the distance between the distance sensor and the clamp is a first distance.
The coordinates of the clamping part are obtained according to the measurement data of the distance sensor and the coordinates of the operation hole, and the method comprises the following steps: and acquiring the coordinates of the clamping part according to the measurement data of the distance sensor, the coordinates of the operation hole and the first distance through a space vector algorithm.
In a third aspect, the application provides a surgical training system, which includes the surgical training apparatus provided in the first aspect and a terminal device. The terminal device is in communication connection with a control assembly in the operation training device, and when the terminal device runs, the method for acquiring the motion information of the surgical instrument provided by the second aspect is realized.
The surgical training system provided by the third aspect includes the surgical training apparatus provided by the first aspect, and applies the surgical instrument motion information obtaining method provided by the second aspect, and the beneficial effects thereof may refer to the first aspect and the second aspect, which are not described herein again.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show some embodiments of the present application; other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a schematic view of a surgical instrument according to an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating a surgical training apparatus according to an embodiment of the present application;
fig. 3 is a flowchart illustrating a surgical instrument movement information acquiring method according to an embodiment of the present application.
Reference numerals: 1-surgical instruments; 11-a clamping part; 12-a connecting part; 13-an operating part; 14-a first attitude sensor; 15-a distance sensor; 16-a second attitude sensor; 2-an operation space model; 21-an operation hole; 22-a third attitude sensor; 23-Observation hole.
The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The flow diagrams depicted in the figures are merely illustrative and do not necessarily include all of the elements and operations/steps, nor do they necessarily have to be performed in the order depicted. For example, some operations/steps may be decomposed, combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
The scheme provided by the application comprises at least one surgical instrument and an operation space model. The motion information of the surgical instrument in the operation space model is acquired from the data collected by the first attitude sensor, the second attitude sensor, and the distance sensor. This provides a virtual surgery training scheme with a simple algorithm and low implementation cost, and solves the problems that the algorithm for acquiring the motion information of surgical instruments is complex and the use cost of virtual surgery training systems is high.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Fig. 1 is a schematic structural diagram of a surgical instrument according to an embodiment of the present application, and fig. 2 is a schematic structural diagram of a surgical training apparatus according to an embodiment of the present application.
In some embodiments, referring to fig. 1 and 2, a surgical training apparatus includes at least one surgical instrument 1 and a surgical space model 2. The surgical instrument 1 comprises an operation part 13, a connection part 12 and a clamping part 11, wherein one end of the connection part 12 is connected with the clamping part 11, the operation part 13 is connected with the other end of the connection part 12 in a rotating manner, and the operation part 13 drives the clamping part 11 to open when moving. A first posture sensor 14 is provided at one end of the connecting portion 12 close to the grip portion 11, a second posture sensor 16 is provided at the operating portion 13, and a distance sensor 15 is provided at one end of the connecting portion 12 close to the operating portion 13. The operation space model 2 comprises at least one operation hole 21, and the connecting part 12 and the clamping part 11 of the surgical instrument 1 penetrate through the operation hole 21 and extend into the operation space model 2.
In the present application, the surgical instrument 1 is exemplified by a curved dissecting forceps, but the surgical instrument is not limited thereto, and may be, for example, a needle holder, a closer, or the like.
In some embodiments, the attitude sensor may be a three-axis gyroscope, a three-axis accelerometer, a three-axis electronic compass, or the like, and the distance sensor may be an infrared distance sensor, a laser distance sensor, an ultrasonic distance sensor, or the like, which is not limited by the application to the type of the sensor.
In some embodiments, a third attitude sensor 22 is provided at the origin of the reference frame in which the surgical space model 2 is located. Referring to fig. 2, the origin of the reference frame where the surgical space model 2 is located may be set to the lower left corner of the surgical space model 2 facing the user, i.e. where the third posture sensor 22 is located.
In some embodiments, the surgical training apparatus further includes a control assembly (not shown) communicatively coupled to the first attitude sensor 14, the second attitude sensor 16, the third attitude sensor 22, and the distance sensor 15.
In some embodiments, the operation space model 2 further includes at least one observation hole 23. The observation hole 23 may be a through hole in the operation space model 2, or may be formed from a transparent material when the operation space model 2 is manufactured. The operator can observe the position of the surgical instrument 1 in the operation space model 2 through the observation hole 23 to facilitate the operation.
Fig. 3 is a flowchart illustrating a surgical instrument movement information acquiring method according to an embodiment of the present application.
Referring to fig. 3, the surgical instrument motion information acquisition method includes:
s310, acquiring first motion information of the clamping part in the operation space model through data acquired by the first attitude sensor and the distance sensor.
In some embodiments, the first motion information is used to represent a movement state of the surgical instrument. The first motion information of the clamping part in the operation space model comprises the rotation angle of the clamping part, and the angular coordinate of the origin of the reference system where the operation space model is located is acquired through the third attitude sensor. Acquiring the rotation angle of the clamping part in the operation space model, comprising the following steps: a first angular coordinate of the clamping portion is acquired through the first attitude sensor. And acquiring a second angular coordinate of the clamping part according to the angular coordinate of the origin of the reference system where the operation space model is located and the first angular coordinate, wherein the second angular coordinate is the coordinate of the clamping part in the reference system where the operation space model is located, and the second angular coordinate is used for expressing the rotation angle of the clamping part.
As an example, referring to the coordinate system of the operation space model shown in fig. 2, the angular coordinate of the reference point R0 at the origin of the reference system of the operation space model is (a0, b0, c0). The first attitude sensor may be disposed at a position close to the clamping portion, for example 5 cm from it. The angular coordinate (a11, b11, c11) of the measured angle R11 of the clamping portion, i.e. the first angular coordinate, can be obtained through the first attitude sensor.
In some embodiments, the first attitude sensor and the operation space model are in the same coordinate system, so the first angular coordinate can be calibrated based on the reference point R0. The calibrated angular coordinate is the angular coordinate (a1, b1, c1) of the rotation angle R1 of the tip of the clamping portion, i.e. the second angular coordinate.
As an example, when calibrating the first angular coordinate based on the angular coordinate of the reference point R0, the angular coordinate of the origin of the reference system where the operation space model is located can be subtracted from the first angular coordinate to obtain the second angular coordinate.
Referring to the above example, the first angular coordinate is (a11, b11, c11), and the angular coordinate of the origin of the reference system where the operation space model is located is (a0, b0, c0). The angular coordinate (a1, b1, c1) of R1 is then:

a1 = a11 - a0

b1 = b11 - b0

c1 = c11 - c0
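As an illustrative sketch (the function name and the example angle values are hypothetical), the component-wise subtraction above can be written as:

```python
def calibrate_angular_coordinate(first, origin):
    """Second angular coordinate (a1, b1, c1): the first angular coordinate
    measured by the first attitude sensor, calibrated by subtracting the
    angular coordinate of the reference-system origin R0 component-wise."""
    a11, b11, c11 = first
    a0, b0, c0 = origin
    return (a11 - a0, b11 - b0, c11 - c0)

# e.g. a measured angle of (95, 42, 181) degrees against an origin of (5, 2, 1)
second = calibrate_angular_coordinate((95, 42, 181), (5, 2, 1))  # (90, 40, 180)
```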
in some embodiments, the first motion information of the clamp within the surgical space model further includes a displacement distance of the clamp. The displacement distance of the clamping part can be obtained by acquiring the coordinates of the clamping part for multiple times and calculating according to the acquired coordinates.
In some embodiments, obtaining the displacement distance of the clamping portion within the operation space model includes: acquiring measurement data of a plurality of distance sensors, where the measurement data is the distance between a distance sensor and the inner wall of the operation space model; acquiring a plurality of coordinates of the clamping portion according to the measurement data of the plurality of distance sensors and the coordinates of the operation hole in the reference system where the operation space model is located; and calculating the displacement distance of the clamping portion from the plurality of coordinates of the clamping portion.
In some embodiments, the distance between the distance sensor and the clamping portion is a first distance, denoted L, which may be, for example, 20 cm. As an example, the coordinates of the clamping portion (taken as the coordinates of its tip) may be obtained through a space vector algorithm from the measurement data of the distance sensor, the coordinates of the operation hole, and the first distance.
Referring to fig. 2, the operation space model 2 has length A in the X direction, length B in the Y direction, and height C in the Z direction, and the boundary distances between the operation hole 21 and the edges of the operation space model 2 are Tx in the X-axis direction and Ty in the Y-axis direction. The measurement data D1 of the distance sensor is the distance between the distance sensor and the plane of the X and Y axes; set D1 = h1. The coordinate of the origin O is (0, 0, 0), the coordinate P1 of the tip of the clamping portion is written as (X1, Y1, Z1), the coordinate Q of the operation hole 21 is written as (Tx, Ty, C), the position of the distance sensor is M, and the distance between M and P1 is L.
The following can be obtained through space vector calculation.

According to the space vector included-angle formula, the angles between the instrument axis and the three coordinate axes are given by the calibrated angular coordinate (a1, b1, c1), that is, the direction cosines of the axis are cos(a1), cos(b1), and cos(c1). The coordinate of M on the Z axis is the value measured by the distance sensor, i.e.:

Z_M = h1

from which the following can be obtained:

Z1 = L*cos(c1) + h1

According to a similar principle, since M and the operation hole Q both lie on the instrument axis:

X_M = Tx + (h1 - C)*cos(a1)/cos(c1)

from which the following can be obtained:

X1 = Tx + (h1 - C)*cos(a1)/cos(c1) + L*cos(a1)

The same principle gives:

Y1 = Ty + (h1 - C)*cos(b1)/cos(c1) + L*cos(b1)

In summary, the coordinate P1 = (X1, Y1, Z1) of the tip of the clamping portion is:

X1 = Tx + (h1 - C)*cos(a1)/cos(c1) + L*cos(a1)

Y1 = Ty + (h1 - C)*cos(b1)/cos(c1) + L*cos(b1)

Z1 = L*cos(c1) + h1
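A minimal Python sketch of the space-vector computation of P1. The X1 and Y1 expressions are reconstructed from the setup (M and the operation hole Q lie on the instrument axis, whose direction cosines are cos(a1), cos(b1), cos(c1)); the function name, the use of degrees, and the example values are assumptions:

```python
import math

def clamp_tip_coordinate(a1, b1, c1, h1, L, Tx, Ty, C):
    """Tip coordinate P1 = (X1, Y1, Z1) of the clamping portion from the
    calibrated direction angles (degrees, against the X, Y, Z axes), the
    distance-sensor reading h1, the sensor-to-tip distance L, the operation
    hole offsets Tx, Ty, and the model height C."""
    ca, cb, cc = (math.cos(math.radians(v)) for v in (a1, b1, c1))
    x1 = Tx + (h1 - C) * ca / cc + L * ca
    y1 = Ty + (h1 - C) * cb / cc + L * cb
    z1 = L * cc + h1
    return (x1, y1, z1)

# Sanity check: an instrument axis pointing straight down (90°, 90°, 180°)
# leaves the tip directly under the operation hole, L below the sensor.
x1, y1, z1 = clamp_tip_coordinate(90, 90, 180, h1=25, L=20, Tx=10, Ty=12, C=40)
```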
In some embodiments, when a plurality of P1 coordinates are acquired over time, the displacement distance of the clamping portion can be obtained by calculating the Euclidean distance between each pair of adjacent P1 coordinates and accumulating the results.
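A short sketch of accumulating the adjacent-point distances (the sampled coordinates below are made up for illustration):

```python
import math

def displacement_distance(points):
    """Displacement of the clamping portion: the sum of Euclidean distances
    between each pair of adjacent sampled P1 coordinates."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

path = [(0.0, 0.0, 0.0), (3.0, 4.0, 0.0), (3.0, 4.0, 12.0)]
d = displacement_distance(path)  # 5.0 + 12.0 = 17.0
```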
And S320, acquiring second motion information of the clamping part through the data acquired by the second attitude sensor.
In some embodiments, the second motion information is used to indicate an operating state of the gripping portion. The second motion information of the nip comprises an opening angle of the nip.
Obtaining the opening angle of the clamping portion includes: acquiring a third angular coordinate of the operation portion through the second attitude sensor, where the third angular coordinate is the angular coordinate of the operation portion in the reference system where the operation space model is located and is used to represent the rotation angle of the operation portion; and acquiring the opening angle of the clamping portion according to the angular coordinate of the origin of the reference system where the operation space model is located, the first angular coordinate, and the third angular coordinate.
As an example, the angular coordinate of the origin of the reference system where the operation space model is located is the angular coordinate (a0, b0, c0) of the reference point R0 in the above example; the first angular coordinate is the angular coordinate (a11, b11, c11) of the measured angle R11 of the clamping portion obtained by the first attitude sensor; and the third angular coordinate is the angular coordinate (a21, b21, c21) of the measured angle R21 obtained by the second attitude sensor provided on the operation portion.
Because the operation portion drives the clamping portion to open when force is applied, the opening angle of the clamping portion changes with the roll angle of the operation portion. Thus, in some embodiments, the opening angle R′1 of the clamping portion may be acquired from the roll angle of the operation portion. For example, the roll angle of the first angular coordinate and the roll angle of the angular coordinate of the origin of the reference system where the operation space model is located are subtracted from the roll angle of the third angular coordinate to obtain the opening angle of the clamping portion, namely:

R′1 = a21 - a11 - a0
In some embodiments, the actually measured motion information of the clamping portion differs from the expected value because of factors such as angle and distance errors introduced when the sensors are mounted. The motion information may be calibrated by computing a linear regression equation. It should be noted that linear regression is a statistical analysis method in mathematical statistics for determining the quantitative relationship between two or more variables. In statistics, a linear regression equation models the relationship between one or more independent variables and a dependent variable using a least-squares function; such a function is a linear combination of one or more model parameters called regression coefficients.
In some embodiments, the surgical training apparatus includes a surgical instrument, in this case, the motion information (X) in the X direction in the coordinates, the actual measured values and the expected values are two variables in a set of linear regression equations. The linear regression equation for two variables can be expressed as:
y=a*x+b
where y represents the expected value, x represents the actual measured value, a is the fitted coefficient, and b represents the error term. With n groups of measurements, a and b can be expressed as:

a = (n*Σ(x*y) - Σx*Σy) / (n*Σ(x²) - (Σx)²)

b = (Σy - a*Σx) / n

Alternatively, they can also be expressed as:

a = Σ((x - x̄)*(y - ȳ)) / Σ((x - x̄)²)

b = ȳ - a*x̄

where x̄ and ȳ are the means of x and y.
as an example, table 1 shows 6 sets of actually measured x data and the corresponding expected values y:
TABLE 1
x | y
24.9449 | 25.6821
25.9782 | 28.6025
26.0865 | 27.5865
26.1798 | 27.7786
25.2140 | 27.3585
30.1454 | 30.6699
From the above formula, and the measurement data in table 1, the linear regression equation of the motion information in the X direction can be calculated as:
y=0.7870*x+7.1507
i.e., b = 7.1507 and a = 0.7870.
Similarly, a linear regression equation that can calculate the motion information in the Y direction is:
y=0.7986*x+7.2046
i.e., b = 7.2046 and a = 0.7986.
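As a check, the fit for the X direction can be reproduced in pure Python from the Table 1 data using the mean-centered form of the least-squares formulas (the coefficients should match the stated values up to rounding):

```python
def fit_line(xs, ys):
    """Least-squares fit y = a*x + b with
    a = sum((x - xbar)*(y - ybar)) / sum((x - xbar)**2) and b = ybar - a*xbar."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    a = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    return a, ybar - a * xbar

# Table 1: actually measured x and corresponding expected y (X direction)
xs = [24.9449, 25.9782, 26.0865, 26.1798, 25.2140, 30.1454]
ys = [25.6821, 28.6025, 27.5865, 27.7786, 27.3585, 30.6699]
a, b = fit_line(xs, ys)  # a ≈ 0.7870, b ≈ 7.1507
```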
In some embodiments, the surgical training apparatus includes more than one surgical instrument, for example, when the surgical training apparatus includes two surgical instruments and the two surgical instruments pass through the same operation hole, the motion information of each surgical instrument is obtained in the same manner as in S310 and S320, which is not described herein again. When two surgical instruments are operated, the position information error generated by the interaction of the surgical instruments can be calculated by means of a linear regression equation.
In some embodiments, the loss function of the linear regression can be used as a criterion for how well the linear regression equation fits the data. The loss function of the linear regression is:

J(θ) = Σ_{i=1}^{m} (y^(i) - ŷ^(i))²

The smaller the value of the loss function, the better the linear regression equation fits the real data, and the more accurate the calculated error is. Here y^(i) represents the accurate data value, and ŷ^(i) represents the result calculated by the linear regression equation. The motion information of the first surgical instrument in the X direction is written as X1, and the motion information of the second surgical instrument in the X direction is written as X2.
θ = (θ0, θ1, θ2, …, θn)^T, and X^(i) is represented as a row vector:

X^(i) = (X0^(i), X1^(i), …, Xn^(i)), where X0^(i) = 1

Stacking the row vectors X^(i) line by line spreads them from vectors into the matrix Xb, so that the predictions can be written as:

ŷ = Xb*θ
the loss function J (θ) is developed to obtain:
J(θ)=θT*Xb T*Xb*θ-2*(Xb*θ)T*y+yT*y
Setting the derivative of J(θ) with respect to θ to zero gives:
Xb T*Xb*θ=Xb T*y
an expression for θ can be found:
θ=(Xb T*Xb)-1*Xb T*y
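The normal-equation solution θ = (Xbᵀ·Xb)⁻¹·Xbᵀ·y can be sketched as follows. The data here are synthetic, generated from known coefficients rather than the Table 2 measurements, so the recovered θ is known in advance:

```python
import numpy as np

def fit_normal_equation(X, y):
    """Solve Xb^T·Xb·θ = Xb^T·y, where Xb is X with a leading column of
    ones corresponding to the intercept term θ0."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    return np.linalg.solve(Xb.T @ Xb, Xb.T @ y)

# Synthetic data generated from y = 1 + 2*X1 + 3*X2; the fit should
# therefore recover θ = (1, 2, 3).
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 2.0], [2.0, 1.0]])
y = 1 + 2 * X[:, 0] + 3 * X[:, 1]
theta = fit_normal_equation(X, y)
```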
as an example, table 2 shows 5 sets of data obtained from actual measurements:
TABLE 2
y | X1 | X2
32.2680 | 30.4836 | 34.7798
35.4657 | 33.9440 | 37.1566
34.8306 | 32.0958 | 36.3354
34.8960 | 32.3838 | 36.5793
34.5660 | 32.9330 | 36.8416
The linear regression equation for the motion information of the two surgical instruments in the X direction can be calculated as:
y=-17.3772-0.3129*X1+1.7037*X2
i.e., b = -17.3772, a1 = -0.3129, a2 = 1.7037.
Similarly, the linear regression equation of the motion information of the two surgical instruments in the Y direction is:
y=-15.8779-0.3211*X1+1.2567*X2
i.e., b = -15.8779, a1 = -0.3211, a2 = 1.2567.
In some embodiments, the operation score of the operator can be further obtained according to the measured first motion information and the second motion information.
As an example, a specification for a set of surgical instrument motion trajectories may be pre-designed to simulate the separation, cutting and movement processes in a surgical procedure. The operator follows the motion trajectories specified in the specification by operating one or two surgical instruments (e.g., curved separation forceps). While the operator operates, the first motion information and the second motion information are recorded.
For example, when the surgical instrument is a curved separation forceps, the clamping part is the forceps head of the curved separation forceps, and the operating part is the operating handle of the curved separation forceps. The rotation angle and displacement distance of the tip of the curved separation forceps constitute the first motion information, and the opening angle of the forceps head constitutes the second motion information.
The measured motion information, such as the rotation angle and displacement distance of the tip of the curved separation forceps and the opening angle of the forceps head, is compared with the motion information specified in the pre-designed motion trajectory specification to obtain the operator's operation score. For example, the similarity between the measured motion information and the specified motion information may be calculated, and the similarity value used as the operation score.
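As one possible illustration of such a similarity-based score (the text does not fix a specific formula, so the function below, its trajectory format and its 0-100 scale are all assumptions), the measured trajectory can be compared point by point with the pre-designed trajectory and the mean deviation mapped to a score:

```python
import numpy as np

def operation_score(measured, reference, tolerance=10.0):
    """Map the mean point-wise deviation between the measured and the
    pre-designed (reference) motion trajectory to a 0-100 score.

    Each trajectory is an (N, 3) array of samples, e.g. (rotation angle,
    displacement distance, opening angle); `tolerance` is the assumed
    deviation (in the same units) at which the score drops to 0.
    """
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    deviation = np.linalg.norm(measured - reference, axis=1).mean()
    similarity = max(0.0, 1.0 - deviation / tolerance)
    return round(100.0 * similarity, 1)

# A perfect trace scores 100; a deviating one scores lower.
reference = np.array([[0.0, 0.0, 0.0], [5.0, 2.0, 10.0], [10.0, 4.0, 0.0]])
assert operation_score(reference, reference) == 100.0
assert operation_score(reference + 1.0, reference) < 100.0
```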
In the present application, a surgical training device and a method for acquiring surgical instrument motion information based on the device are provided. The motion data of the surgical instrument in the operation space model are acquired by arranging a first attitude sensor, a second attitude sensor and a distance sensor on the surgical instrument; the operation data of the operator are then calculated from the motion data acquired by the sensors. The algorithm is simple, so the implementation cost of the scheme can be effectively reduced.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
The embodiments of the present application provide a computer program product which, when run on a mobile terminal, enables the mobile terminal to implement the steps in the above method embodiments.
An embodiment of the present application provides a chip system, where the chip system includes a memory and a processor, and the processor executes a computer program stored in the memory to implement the steps in the foregoing method embodiments.
An embodiment of the present application provides a chip system, where the chip system includes a processor, the processor is coupled to a computer-readable storage medium, and the processor executes a computer program stored in the computer-readable storage medium to implement the steps in the above-mentioned method embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments. While the application has been described with reference to specific embodiments, its scope is not limited thereto, and those skilled in the art can easily conceive various equivalent modifications or substitutions within the disclosed technical scope. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (12)
1. A surgical training apparatus, comprising: at least one surgical instrument and a surgical space model;
the surgical instrument comprises an operation part, a connecting part and a clamping part, wherein one end of the connecting part is connected with the clamping part, the operation part is rotatably connected with the other end of the connecting part, and the operation part drives the clamping part to open when moving;
a first attitude sensor is arranged at one end of the connecting part close to the clamping part, a second attitude sensor is arranged on the operating part, and a distance sensor is arranged at one end of the connecting part close to the operating part;
the operation space model comprises at least one operation hole, and the connecting part and the clamping part of the surgical instrument penetrate through the operation hole and extend into the operation space model.
2. The surgical training apparatus as defined in claim 1, wherein a third attitude sensor is arranged at the origin of the reference system where the operation space model is located.
3. The surgical training apparatus of claim 2, further comprising a control assembly communicatively coupled to the first attitude sensor, the second attitude sensor, the third attitude sensor, and the distance sensor.
4. The surgical training apparatus of claim 3, further comprising at least one viewing port on the operation space model.
5. A surgical instrument movement information acquisition method applied to the surgical training apparatus according to any one of claims 1 to 4, the method comprising:
acquiring first motion information of the clamping part in the operation space model through data acquired by a first attitude sensor and a distance sensor, wherein the first motion information is used for representing the motion state of the surgical instrument;
and acquiring second motion information of the clamping part through data acquired by the second attitude sensor, wherein the second motion information is used for representing the operating state of the clamping part.
6. The method of claim 5, wherein the first motion information of the clamping part within the operation space model comprises a rotation angle of the clamping part, and an angular coordinate of the origin of the reference system where the operation space model is located is acquired by a third attitude sensor;
acquiring the rotation angle of the clamping part in the operation space model, wherein the rotation angle comprises the following steps:
acquiring a first angular coordinate of the clamping part through the first attitude sensor;
and acquiring a second angular coordinate of the clamping part according to the angular coordinate of the origin of the reference system where the operation space model is located and the first angular coordinate, wherein the second angular coordinate is the coordinate of the clamping part in the reference system where the operation space model is located, and the second angular coordinate is used for representing the rotation angle of the clamping part.
7. The method of claim 6, wherein obtaining the second angular coordinate of the clamping part from the first angular coordinate and the angular coordinate of the origin of the reference system where the operation space model is located comprises:
and subtracting the angular coordinate of the origin of the reference system where the operation space model is located from the first angular coordinate to obtain the second angular coordinate.
8. The method of claim 6, wherein the second motion information of the clamping part includes an opening angle of the clamping part;
obtaining an opening angle of the clamping portion, including:
acquiring a third angular coordinate of the operating part through the second attitude sensor, wherein the third angular coordinate is the angular coordinate of the operating part in the reference system where the operation space model is located, and the third angular coordinate is used for representing a rotation angle of the operating part;
and acquiring the opening angle of the clamping part according to the angular coordinate of the origin of the reference system where the operation space model is located, the first angular coordinate and the third angular coordinate.
9. The method of claim 8, wherein obtaining the opening angle of the clamping part according to the angular coordinate of the origin of the reference system where the operation space model is located, the first angular coordinate and the third angular coordinate comprises:
and subtracting the roll angle of the first angular coordinate and the roll angle of the angular coordinate of the origin of the reference system where the operation space model is located from the roll angle of the third angular coordinate to obtain the opening angle of the clamping part.
10. The method of any one of claims 5-9, wherein the first motion information of the clamping part within the operation space model further includes a displacement distance of the clamping part;
obtaining a displacement distance of the clamping part in the operation space model, comprising:
acquiring measurement data of a plurality of distance sensors, wherein the measurement data are distances between the distance sensors and the inner wall of the operation space model;
in a reference system where the operation space model is located, acquiring a plurality of coordinates of the clamping part according to the measurement data of the plurality of distance sensors and the coordinates of the operation hole;
and calculating the displacement distance of the clamping part according to the plurality of coordinates of the clamping part.
11. The method of claim 10, wherein the distance between the distance sensor and the clamping part is a first distance;
obtaining the coordinates of the clamping part according to the measurement data of the distance sensor and the coordinates of the operation hole, and the method comprises the following steps:
and acquiring the coordinates of the clamping part according to the measurement data of the distance sensor, the coordinates of the operation hole and the first distance through a space vector algorithm.
12. A surgical training system comprising the surgical training apparatus according to any one of claims 1 to 4 and a terminal device;
the terminal device is communicatively connected with the control component in the surgical training apparatus, and the terminal device, when operating, implements the surgical instrument motion information acquisition method according to any one of claims 5 to 11.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111499425.8A CN114360349B (en) | 2021-12-09 | 2021-12-09 | Operation training device and system and surgical instrument motion information acquisition method |
PCT/CN2022/137027 WO2023104057A1 (en) | 2021-12-09 | 2022-12-06 | Surgical training device and system, and method for acquiring motion information of surgical instrument |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111499425.8A CN114360349B (en) | 2021-12-09 | 2021-12-09 | Operation training device and system and surgical instrument motion information acquisition method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114360349A true CN114360349A (en) | 2022-04-15 |
CN114360349B CN114360349B (en) | 2022-09-23 |
Family
ID=81097389
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111499425.8A Active CN114360349B (en) | 2021-12-09 | 2021-12-09 | Operation training device and system and surgical instrument motion information acquisition method |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN114360349B (en) |
WO (1) | WO2023104057A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023104057A1 (en) * | 2021-12-09 | 2023-06-15 | 深圳先进技术研究院 | Surgical training device and system, and method for acquiring motion information of surgical instrument |
CN116822850A (en) * | 2023-06-07 | 2023-09-29 | 北京贝德信诚科技有限公司 | Simulation teaching management system |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1488127A (en) * | 2001-01-24 | 2004-04-07 | | Method and system for simulation of surgical procedures |
CN103280144A (en) * | 2013-04-07 | 2013-09-04 | 浙江工业大学 | Analogue operation training system |
CN103456223A (en) * | 2012-06-01 | 2013-12-18 | 苏州敏行医学信息技术有限公司 | Laparoscopic surgery simulation system based on force feedback |
CN104200730A (en) * | 2014-09-09 | 2014-12-10 | 华中科技大学 | Device, method and system for virtual laparoscopic surgery |
US20170312031A1 (en) * | 2016-04-27 | 2017-11-02 | Arthrology Consulting, Llc | Method for augmenting a surgical field with virtual guidance content |
KR101887805B1 (en) * | 2017-03-23 | 2018-08-10 | 최재용 | System for simulating laparoscopic surgery base on augmented reality and method for using the same |
CN110800033A (en) * | 2017-06-29 | 2020-02-14 | 威博外科公司 | Virtual reality peritoneoscope formula instrument |
CN110807968A (en) * | 2019-11-28 | 2020-02-18 | 上海褚信医学科技有限公司 | Puncture operation teaching system, realization method, teaching terminal and teaching equipment |
CN111819611A (en) * | 2018-03-09 | 2020-10-23 | 拉帕罗有限公司 | Working tool and manipulation and measurement kit for laparoscopic trainer |
CN113066336A (en) * | 2021-04-19 | 2021-07-02 | 中国科学院深圳先进技术研究院 | Abdominal organ tumor simulation platform |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5614980B2 (en) * | 2009-12-25 | 2014-10-29 | 三菱プレシジョン株式会社 | Simulation tool position setting device for trocar position setting |
CN202871171U (en) * | 2012-09-06 | 2013-04-10 | 佛山市金天皓科技有限公司 | Neuroendoscopy simulation training apparatus and system thereof |
CN203573550U (en) * | 2013-11-29 | 2014-04-30 | 魏东新 | Device applied to single hole laparoscopic training |
US10695123B2 (en) * | 2016-01-29 | 2020-06-30 | Covidien Lp | Surgical instrument with sensor |
CN114360349B (en) * | 2021-12-09 | 2022-09-23 | 深圳先进技术研究院 | Operation training device and system and surgical instrument motion information acquisition method |
2021
- 2021-12-09 CN CN202111499425.8A patent/CN114360349B/en active Active
2022
- 2022-12-06 WO PCT/CN2022/137027 patent/WO2023104057A1/en unknown
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1488127A (en) * | 2001-01-24 | 2004-04-07 | | Method and system for simulation of surgical procedures |
CN103456223A (en) * | 2012-06-01 | 2013-12-18 | 苏州敏行医学信息技术有限公司 | Laparoscopic surgery simulation system based on force feedback |
CN103280144A (en) * | 2013-04-07 | 2013-09-04 | 浙江工业大学 | Analogue operation training system |
CN104200730A (en) * | 2014-09-09 | 2014-12-10 | 华中科技大学 | Device, method and system for virtual laparoscopic surgery |
US20170312031A1 (en) * | 2016-04-27 | 2017-11-02 | Arthrology Consulting, Llc | Method for augmenting a surgical field with virtual guidance content |
KR101887805B1 (en) * | 2017-03-23 | 2018-08-10 | 최재용 | System for simulating laparoscopic surgery base on augmented reality and method for using the same |
CN110800033A (en) * | 2017-06-29 | 2020-02-14 | 威博外科公司 | Virtual reality peritoneoscope formula instrument |
CN111819611A (en) * | 2018-03-09 | 2020-10-23 | 拉帕罗有限公司 | Working tool and manipulation and measurement kit for laparoscopic trainer |
CN110807968A (en) * | 2019-11-28 | 2020-02-18 | 上海褚信医学科技有限公司 | Puncture operation teaching system, realization method, teaching terminal and teaching equipment |
CN113066336A (en) * | 2021-04-19 | 2021-07-02 | 中国科学院深圳先进技术研究院 | Abdominal organ tumor simulation platform |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023104057A1 (en) * | 2021-12-09 | 2023-06-15 | 深圳先进技术研究院 | Surgical training device and system, and method for acquiring motion information of surgical instrument |
CN116822850A (en) * | 2023-06-07 | 2023-09-29 | 北京贝德信诚科技有限公司 | Simulation teaching management system |
Also Published As
Publication number | Publication date |
---|---|
WO2023104057A1 (en) | 2023-06-15 |
CN114360349B (en) | 2022-09-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114360349B (en) | Operation training device and system and surgical instrument motion information acquisition method | |
JP7235665B2 (en) | Laparoscopic training system | |
US11452569B2 (en) | Systems and methods for device-aware flexible tool registration | |
EP3404664B1 (en) | Systems and methods of tracking and analyzing use of medical instruments | |
US20200107899A1 (en) | Systems and methods for adaptive input mapping | |
US10376178B2 (en) | Systems and methods for registration of a medical device using rapid pose search | |
WO2018218175A1 (en) | Laparoscopic training system | |
KR20150017327A (en) | Systems and methods for deformation compensation using shape sensing | |
WO2020213484A1 (en) | Surgery evaluation system | |
Cheng et al. | Design and integration of electrical bio-impedance sensing in surgical robotic tools for tissue identification and display | |
Noonan et al. | A dual-function wheeled probe for tissue viscoelastic property identification during minimally invasive surgery | |
WO2017098506A9 (en) | Autonomic goals-based training and assessment system for laparoscopic surgery | |
WO2008072756A1 (en) | Reaction force presentation method and force presentation system | |
Liu et al. | Rolling mechanical imaging: a novel approach for soft tissue modelling and identification during minimally invasive surgery | |
EP4280996A1 (en) | Method for tracking a medical tool during a medical procedure using deep learning | |
Chowriappa et al. | A predictive model for haptic assistance in robot assisted trocar insertion | |
US20230310087A1 (en) | Navigation method and navigation system for surgical instrument | |
CN113889224B (en) | Training of operation prediction model and operation indication method | |
US20240257666A1 (en) | Systems and methods for trocar simulation with admittance haptic feedback | |
Ilewicz | Inference about transient states of innovative RCM structure for soft tissue surgery using FEM taking into account inputs from in vitro experiment on cardiovascular tissue | |
Marin et al. | Stress Analysis at Trocar and Endoscope Interface Using Computational Simulation: A Preliminary Study | |
Ferguson | Estimation in Minimally Invasive Surgical Robotics: Enabling Novel Technologies in Image Guidance and Sensing | |
Wang | Development of a robotic biopsy system compatible with label-free digital pathology and methods for needle-tissue classification | |
CN118695810A (en) | Method for tracking medical tools during medical procedures using deep learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||