CN112712016B - Surgical instrument identification method, identification platform and medical robot system - Google Patents

Surgical instrument identification method, identification platform and medical robot system

Info

Publication number
CN112712016B
Authority
CN
China
Prior art keywords
surgical instrument
identification information
identification
information
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011595244.0A
Other languages
Chinese (zh)
Other versions
CN112712016A (en)
Inventor
宋进
陈功
朱祥
何超
李明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Microport Medbot Group Co Ltd
Original Assignee
Shanghai Microport Medbot Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Microport Medbot Group Co Ltd filed Critical Shanghai Microport Medbot Group Co Ltd
Priority to CN202011595244.0A priority Critical patent/CN112712016B/en
Publication of CN112712016A publication Critical patent/CN112712016A/en
Application granted granted Critical
Publication of CN112712016B publication Critical patent/CN112712016B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • A61B34/35Surgical robots for telesurgery
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70Manipulators specially adapted for use in surgery
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90Identification means for patients or instruments, e.g. tags
    • A61B90/98Identification means for patients or instruments, e.g. tags using electromagnetic means, e.g. transponders
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03Recognition of patterns in medical or anatomical images
    • G06V2201/034Recognition of patterns in medical or anatomical images of medical instruments

Abstract

The invention provides a surgical instrument identification method, an identification platform and a medical robot system. The surgical instrument identification method comprises the following steps: acquiring image information of a surgical instrument placed on a support table; extracting characteristic information of the surgical instrument from the image information through a neural network; obtaining identification information of the surgical instrument according to the characteristic information, wherein the identification information comprises first identification information obtained before surgery and second identification information obtained after surgery; and judging whether the first identification information is consistent with the second identification information, and outputting a judgment result. Acquiring the identification information of the surgical instrument with the identification platform replaces manual counting, which is prone to omissions, so that an operator can conveniently and accurately judge whether any surgical instrument has been left behind during surgery.

Description

Surgical instrument identification method, identification platform and medical robot system
Technical Field
The invention relates to the technical field of robot assisted surgery, in particular to a surgical instrument identification method, an identification platform and a medical robot system.
Background
Minimally invasive surgery is a new technique in which operations are performed inside the human body through endoscopes such as laparoscopes and thoracoscopes. It offers advantages such as small trauma, light pain and less bleeding, effectively shortens patient recovery time, reduces discomfort, and avoids some of the harmful side effects of traditional open surgery.
The minimally invasive surgical robot system enables an operator to observe tissue features inside a patient through a two-dimensional or three-dimensional display device at the main console and to remotely control a mechanical arm and surgical instruments on the slave operating robot to complete the surgical procedure. In robotic surgery, it is often necessary to introduce surgical instruments or auxiliary items, such as suture needles and hemostatic clips, into the body. At present, the surgical instruments and consumables used are managed by a bedside nurse, and the consumables used during surgery are checked after the operation. This mode of operation depends on the records of the bedside nurse and is generally completed post-operatively. Because the bedside nurse manually counts the types and quantities of consumables used during surgery, omissions are likely to occur; and once a surgical instrument is left behind, a serious medical accident may result.
Disclosure of Invention
The invention aims to provide a surgical instrument identification method, an identification platform and a medical robot system, so as to solve the problem that manually counting the consumables used in surgery is prone to omissions.
To solve the above technical problem, according to a first aspect of the present invention, there is provided a surgical instrument identification method, including:
acquiring image information of a surgical instrument placed on a support table;
extracting characteristic information of the surgical instrument in the image information through a neural network;
according to the characteristic information, obtaining identification information of the surgical instrument; the identification information of the surgical instrument comprises first identification information obtained before operation and second identification information obtained after operation; and
judging whether the first identification information is consistent with the second identification information, and outputting a judgment result.
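As a minimal sketch (not the patented implementation), the steps above can be modeled with identification information as a mapping from instrument classification to count; `extract_features` and `classify` are hypothetical stand-ins for the neural-network stages described in the claims:

```python
from collections import Counter

def identify_instruments(image, extract_features, classify):
    """Hypothetical pipeline: image -> feature list -> {classification: count}."""
    return Counter(classify(f) for f in extract_features(image))

def check_consistency(first_info, second_info):
    """Compare pre-operative and post-operative identification information.

    Returns (judgment result, difference): the difference lists any
    classification whose post-operative count falls short of the
    pre-operative count, i.e. a possibly retained instrument.
    """
    missing = Counter(first_info) - Counter(second_info)
    return (not missing, dict(missing))
```

For instance, `check_consistency({"gauze": 5}, {"gauze": 4})` returns `(False, {"gauze": 1})`, flagging one unaccounted-for piece of gauze.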
Optionally, in the surgical instrument identification method, the feature information includes: at least one of shape, texture, and color of a corresponding surgical instrument in the image information.
Optionally, in the surgical instrument identification method, the method for extracting feature information of the surgical instrument in the image information through a neural network includes:
preprocessing the image information to reduce noise in the image information; and
extracting the characteristic information of the surgical instrument from the preprocessed image information.
Optionally, in the surgical instrument identification method, before extracting the characteristic information of the surgical instrument in the image information, the surgical instrument identification method includes:
extracting a target area covering a target surgical instrument from the image information;
performing grayscale conversion on the color image information within the target area;
scanning the grayed image information, extracting pixel values, and judging whether the illumination of the target area is uniform: if the illumination is uneven, performing illumination correction on the target area to make its illumination uniform; if the illumination is uniform, performing no correction; and
outputting the preprocessed image information.
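A minimal NumPy sketch of these preprocessing steps; the graying formula (BT.601 luma weights), the quadrant-based uniformity check, and the gain correction are illustrative assumptions, since the patent does not prescribe specific algorithms or thresholds:

```python
import numpy as np

def preprocess(rgb, region, tol=30.0):
    """Crop the target area, convert it to grayscale, and equalize illumination.

    `tol` is an assumed uniformity threshold; the patent leaves it unspecified.
    """
    y0, y1, x0, x1 = region
    # Graying: weighted luma conversion (BT.601 weights, one common choice).
    gray = rgb[y0:y1, x0:x1].astype(float) @ np.array([0.299, 0.587, 0.114])
    h, w = gray.shape
    # Scan pixel values quadrant by quadrant to judge illumination uniformity.
    quads = [gray[:h // 2, :w // 2], gray[:h // 2, w // 2:],
             gray[h // 2:, :w // 2], gray[h // 2:, w // 2:]]
    means = [q.mean() for q in quads]
    if max(means) - min(means) > tol:   # uneven: correct each quadrant's gain
        target = gray.mean()
        for q, m in zip(quads, means):
            q *= target / m             # quads are views, so gray is updated
    return gray                         # preprocessed image information
```

With a test image whose top-left quadrant is twice as bright as the rest, the correction brings all four quadrants to the global mean brightness.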
Optionally, the surgical instrument identification method further includes:
after the first identification information is acquired, displaying the first identification information; and after the second identification information is acquired and it is judged whether the first identification information is consistent with the second identification information, displaying the second identification information and the judgment result; or,
after the first identification information and the second identification information are acquired and it is judged whether they are consistent, displaying the first identification information, the second identification information and the judgment result, or displaying only the judgment result.
Optionally, the surgical instrument identification method further includes:
after the second identification information is obtained, judging whether the first identification information is consistent with the second identification information, and outputting a first alarm signal if they are judged inconsistent.
Optionally, the surgical instrument identification method further includes:
before acquiring the image information of the surgical instrument placed on the support table, acquiring third identification information of the surgical instrument from input information; and
after the first identification information of the surgical instrument is obtained, judging whether the first identification information matches the third identification information, and outputting a second alarm signal if it does not.
In order to solve the above technical problem, according to a second aspect of the present invention, there is also provided an identification platform, including: the device comprises a supporting table board, an image acquisition module and an identification processor;
the supporting table top is used for placing surgical instruments;
the image acquisition module is arranged above the supporting table top and is in communication connection with the image recognition module; the image acquisition module is used for acquiring image information of the surgical instrument placed on the supporting table top and transmitting the image information to the image recognition module;
the identification processor comprises an image recognition module and an image processing module;
the image recognition module is in communication connection with the image processing module and is used for extracting characteristic information of the surgical instrument in the image information through a neural network and transmitting the characteristic information to the image processing module;
the image processing module obtains the identification information of the surgical instrument according to the characteristic information; the identification information of the surgical instrument comprises first identification information obtained before operation and second identification information obtained after operation;
the image processing module is used for judging whether the first identification information is consistent with the second identification information or not and outputting a judging result.
Optionally, in the identification platform, the image identification module includes: a preprocessing unit and an extracting unit;
the preprocessing unit is used for preprocessing the image information so as to reduce noise of the image information;
the extraction unit is in communication connection with the preprocessing unit and is used for extracting the characteristic information of the surgical instrument from the image information preprocessed by the preprocessing unit through the trained neural network.
Optionally, in the identification platform, the preprocessing unit includes: a feature selection subunit, a graying subunit and an illumination correction subunit;
the feature selection subunit is used for determining the image area in which the characteristic information of the surgical instrument to be extracted is located; the graying subunit is used for converting the image of the target area to grayscale when the target area is a color image; and the illumination correction subunit is used for correcting the grayed target-area image when the illumination of the target area is uneven.
Optionally, in the identification platform, the image processing module includes a test unit, and the test unit is communicatively connected with the image identification module, and is configured to obtain the identification information from the feature information through a trained classifier.
Optionally, in the identification platform, the image processing module further includes a training unit, and the training unit is used for training the classifier.
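The patent does not specify the classifier type, so as a toy illustration of the training unit and test unit, a nearest-centroid classifier over feature vectors could look like this (all names and feature encodings are hypothetical):

```python
def train(features, labels):
    """Training unit: compute one centroid per instrument classification."""
    model = {}
    for lab in set(labels):
        group = [f for f, l in zip(features, labels) if l == lab]
        model[lab] = [sum(col) / len(group) for col in zip(*group)]
    return model

def classify(model, feature):
    """Test unit: return the classification whose centroid is nearest."""
    def dist2(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, feature))
    return min(model, key=lambda lab: dist2(model[lab]))
```

A real system would train a neural-network classifier on many labeled instrument images, but the train/test split of responsibilities mirrors the training unit and test unit described above.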
Optionally, the identification platform further includes: a display module; the display module is in communication connection with the image processing module and is used for displaying at least one of the first identification information, the second identification information and the judging result.
Optionally, the identification platform further includes an alarm device, where the alarm device is in communication connection with the identification processor, and the alarm device is configured to be controlled to output a first alarm signal when the judgment result is that the first identification information does not match the second identification information.
Optionally, the identification platform further includes an input component, the input component is in communication connection with the identification processor, the input component is used for inputting input information of the surgical instrument, the image processing module obtains third identification information of the surgical instrument through the input component, compares the third identification information with the first identification information, and if the third identification information is not equal to the first identification information, the identification processor controls the alarm device to output a second alarm signal.
To solve the above technical problem, according to a third aspect of the present invention, there is also provided a medical robot system, comprising: an identification platform as described above.
In summary, in the surgical instrument identification method, identification platform and medical robot system provided by the invention, the surgical instrument identification method comprises the following steps: acquiring image information of a surgical instrument placed on a support table; extracting characteristic information of the surgical instrument from the image information through a neural network; obtaining identification information of the surgical instrument according to the characteristic information, wherein the identification information comprises first identification information obtained before surgery and second identification information obtained after surgery; and judging whether the first identification information is consistent with the second identification information, and outputting a judgment result. The identification information of the surgical instrument can be obtained by extracting features from and recognizing the image information of the surgical instrument, thereby providing a guarantee for automatically checking surgical instruments. Acquiring the identification information with the identification platform replaces manual counting, which is prone to omissions, so that an operator can conveniently and accurately judge whether any surgical instrument has been left behind during surgery, and consumable inspection, postoperative evaluation and the like are facilitated.
Drawings
Those of ordinary skill in the art will appreciate that the figures are provided for a better understanding of the present invention and do not constitute any limitation on the scope of the present invention. Wherein:
FIG. 1 is a schematic view of a medical robotic system according to an embodiment of the invention;
FIG. 2 is a schematic diagram of a doctor-side control device of a medical robotic system according to an embodiment of the invention;
FIG. 3 is a patient-side control device of a medical robotic system according to an embodiment of the present invention;
FIG. 4 is a schematic view of an image trolley of a medical robotic system according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of an identification platform according to an embodiment of the present invention;
FIG. 6 is a schematic view of image information of a surgical instrument acquired by an image acquisition module according to an embodiment of the present invention;
FIG. 7 is a symbolized illustration of first identification information of a surgical instrument obtained by an identification platform according to one embodiment of the present invention;
FIG. 8 is a symbolized illustration of second identification information of a surgical instrument obtained by an identification platform according to one embodiment of the present invention;
FIG. 9 is a schematic diagram of a component module of an identification platform according to an embodiment of the present invention;
FIG. 10 is a flow chart of a surgical instrument identification method according to an embodiment of the present invention;
FIG. 11 is a flow chart of extracting feature information of a surgical instrument from image information via a neural network according to an embodiment of the present invention;
FIG. 12 is a flow chart of preprocessing image information in accordance with an embodiment of the present invention;
FIG. 13 is a schematic view showing the first identification information and the second identification information when they match according to an embodiment of the present invention;
fig. 14 is a schematic view showing a case where the first identification information and the second identification information do not match in accordance with an embodiment of the present invention.
In the accompanying drawings:
100-doctor end control device; 101-a main operator; 102-an imaging device; 103-foot operated surgical control device;
200-patient end control device; 201-surgical adjustment arm; 202-a surgical working arm; 203-surgical instrument; 204-stand columns;
300-image trolley; 301-endoscope; 302-an endoscope processor; 303-a display device;
400-identifying a platform; 401-an image acquisition module; 402-supporting a mesa; 403-a display module; 404-identifying a processor; 4041-an image recognition module; 40411-a pretreatment unit; 40412-an extraction unit; 40421-test unit; 40422-training unit; 4042-an image processing module; 405-a memory module;
500-a main controller; 501-circle symbol.
Detailed Description
The invention will be described in further detail below with reference to the drawings and specific embodiments in order to make its objects, advantages and features more apparent. It should be noted that the drawings are in greatly simplified form and not drawn to scale, and are intended merely to aid in conveniently and clearly describing the embodiments of the invention. Furthermore, the structures shown in the drawings are often only portions of the actual structures. In particular, the drawings place different emphases in illustrating the various embodiments.
As used in this disclosure, the singular forms "a," "an," and "the" include plural referents; the term "or" is generally used in the sense of "and/or"; the term "several" generally means "at least one"; and the term "at least two" generally means "two or more." The terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or the number of technical features indicated; thus, a feature defined by "first," "second," or "third" may explicitly or implicitly include one or at least two such features. The term "proximal" typically refers to the end closer to the operator, and the term "distal" typically refers to the end closer to the patient (i.e., closer to the lesion); the terms "one end" and "the other end," like "proximal" and "distal," typically refer to the respective two portions, including not only the endpoints. The terms "mounted," "connected," and "coupled" should be construed broadly: a connection may be fixed, removable, or integral; mechanical or electrical; direct or indirect through an intermediary; or an internal communication or interaction between two elements, unless the context clearly dictates otherwise.
The invention provides a surgical instrument identification method, an identification platform and a medical robot system, which are used to solve the problem that manually counting the consumables used in surgery is prone to omissions.
The following description refers to the accompanying drawings.
The embodiments are described below with reference to fig. 1 to 14.
As shown in fig. 1, an embodiment of the present invention provides a medical robot system, which includes a master-slave teleoperation surgical robot, that is, the medical robot system includes a doctor-side control device 100, a patient-side control device 200, and a main controller 500.
Referring to fig. 2, the doctor-side control device 100 is the operation side of the teleoperated surgical robot and includes a main manipulator 101 mounted thereon. The main manipulator 101 receives the operator's hand motion information, which serves as the motion control signal input of the whole system. Optionally, the main controller 500 is also provided on the doctor-side control device 100. Preferably, the doctor-side control device 100 further includes an imaging device 102, which can provide a stereoscopic image to the operator together with the surgical operation information needed to perform the procedure. The surgical operation information includes the type, number, pose in the abdomen, morphology and arrangement of the diseased organ tissue and the surrounding organ tissues and vessels. Optionally, the doctor-side control device 100 further includes a foot-operated surgical control device 103, through which the operator can input relevant operation instructions such as electric cutting and electric coagulation.
The patient-side control device 200 is the specific execution platform of the teleoperated surgical robot and includes a column 204 and surgical execution components mounted thereon. The surgical execution assembly includes a mechanical arm and a surgical instrument 203. In one embodiment, the mechanical arm includes a surgical adjustment arm 201 and a surgical working arm 202. The surgical working arm 202 is a mechanical fixed-point mechanism for driving the surgical instrument to move around a mechanical fixed point, and the surgical adjustment arm 201 is used for adjusting the position of the fixed point in the working space. In another embodiment, the mechanical arm is a mechanism having a spatial configuration with at least six degrees of freedom, used for driving the surgical instrument to move around an active fixed point under program control. The surgical instrument 203 is used to perform specific surgical operations, such as clamping, cutting and shearing.
The main controller 500 is communicatively connected to the doctor-side control device 100 and the patient-side control device 200, respectively, for controlling the movement of the surgical execution assembly according to the movement of the main manipulator 101. Specifically, the main controller includes a master-slave mapping module, which obtains the end pose of the main manipulator 101 and a predetermined master-slave mapping relationship, derives the expected end pose of the surgical execution assembly, and then controls the mechanical arm to drive the surgical instrument to the expected end pose. Further, the master-slave mapping module also receives instrument function operation instructions (such as electric cutting and electric coagulation instructions) and controls the energy driver of the instrument to release energy, so as to realize surgical operations such as electric cutting and electric coagulation.
Further, the medical robot system includes an image trolley 300. As shown in fig. 4, the image trolley 300 includes an endoscope 301 communicatively connected to an endoscope processor 302. The endoscope 301 is used to obtain intra-cavity (i.e., inside the patient's body cavity) surgical operation information. The endoscope processor 302 is used for imaging the operation information acquired by the endoscope 301 and transmitting it to the imaging device 102 so that the operator can observe it. Optionally, the image trolley 300 further comprises a display device 303. The display device 303 is communicatively coupled to the endoscope processor 302 for displaying the surgical operation information in real time to auxiliary operators (e.g., nurses).
In operation, the operator sits before the doctor-side control device 100 located outside the sterile field, observes the returned surgical operation information through the imaging device 102, and controls the motion of the surgical execution assembly and the laparoscope by operating the main manipulator 101 to complete various surgical operations.
Further, the medical robot system comprises an identification platform 400. As shown in fig. 5 and 9, the identification platform 400 includes an image acquisition module 401, a support table 402 and an identification processor 404. The support table 402 is used for placing surgical instruments; the image acquisition module 401 is used for acquiring image information of the surgical instruments placed on the support table 402; the identification processor 404 is communicatively connected to the image acquisition module 401 and is configured to obtain identification information of the surgical instruments from the image information, judge whether the first identification information obtained before surgery matches the second identification information obtained after surgery, and output a judgment result.
In this embodiment, the identification processor 404 and the main controller 500 are provided separately. In other embodiments, the identification processor 404 is integrated with the main controller 500; for example, the corresponding functions of the identification processor 404 are integrated as functional modules in the main controller 500 alongside its other modules (e.g., the master-slave mapping module).
It should be noted that the surgical instruments described herein include not only instruments in a narrow sense (such as surgical knife, scissors, needle, etc.), but also consumables used in surgery (such as cotton ball, gauze, hemostatic clips, etc.). Preferably, the identification information includes identification feature classifications of surgical instruments, and the number of surgical instruments under each of the identification feature classifications. Accordingly, the first identification information of the surgical instrument includes identification feature classifications of the surgical instrument before surgery and the number of the surgical instruments under each identification feature classification; the second identification information includes identification feature classifications of the surgical instruments after the operation, and the number of surgical instruments under each identification feature classification. Further, the identification feature classification includes the class and specification of surgical instruments, etc., such as scissors of different models, etc. In addition, in this embodiment, the second identification information matches the first identification information, which can be understood as that the types and the numbers of surgical instruments before and after the operation are identical, that is, no surgical instrument is used and consumed in the cavity during the operation, and all the surgical instruments are recovered, that is, it can be determined that no omission exists. If there is difference information between the second identification information and the first identification information, it is indicated that a surgical instrument is left in the cavity. The second identification information accords with the first identification information, and the type and the number of the surgical instruments after the operation accord with a certain preset rule. 
For example, some surgical instruments are intended to remain in the human body to assist wound healing or to secure target tissue. For instruments of this type, the preoperative count will therefore be greater than or equal to the postoperative count. Further, if the first identification information and the second identification information of the surgical instruments do not match, the alarm device is triggered to output a first alarm signal to remind the operator to check. The present invention does not particularly limit the form of the first alarm signal; a person skilled in the art may implement it according to the prior art, for example, by voice, by light, or by an error prompt on a user interaction interface.
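The matching logic described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the category names, counts, and the particular preset rule (consumable categories may have a lower postoperative count; all others must be fully recovered) are assumptions chosen for demonstration.

```python
# Identification information modeled as {identification feature classification: count}
first_info = {"scissors_A": 3, "gauze": 10, "hemostatic_clip": 5}    # preoperative
second_info = {"scissors_A": 3, "gauze": 10, "hemostatic_clip": 3}   # postoperative

# Categories a preset rule allows to be consumed intraoperatively
# (e.g. clips intentionally left in the body to secure tissue)
consumable = {"hemostatic_clip"}

def matches(first, second, consumable_categories):
    """Return (ok, differences): ok is True when the second identification
    information matches the first under the preset rule."""
    diffs = {}
    for category, n_before in first.items():
        n_after = second.get(category, 0)
        if category in consumable_categories:
            ok_here = n_after <= n_before   # some may remain in the body
        else:
            ok_here = n_after == n_before   # must be fully recovered
        if not ok_here:
            diffs[category] = (n_before, n_after)
    return (len(diffs) == 0, diffs)

ok, diffs = matches(first_info, second_info, consumable)
print(ok)  # True: the clip shortfall is permitted by the preset rule
```

In a real system the difference dictionary would drive the first alarm signal; here it simply reports which classifications failed the rule.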
The identification platform 400 is described in detail below in conjunction with figs. 5 to 10.
The support table 402 includes a plurality of predefined areas, each for receiving one surgical instrument. As shown in fig. 6, surgical instruments of the same type are placed in the same column and different types are placed in different columns. The surgical instruments are thus arranged by category, with instruments of the same category placed in order, which facilitates their identification.
The image acquisition module 401 is disposed above the support table 402 via a support arm and is communicatively connected to the recognition processor 404. The image acquisition module 401 is configured to acquire image information of the surgical instruments placed on the support table and transmit the image information to the recognition processor 404. Fig. 6 shows a schematic diagram of the image information of the surgical instruments acquired by the image acquisition module 401. The schematic shows four different types or models of surgical instruments, each type arranged in its own column, with surgical instruments of the same type and model preferably placed one per row within that column.
Preferably, the recognition processor 404 includes an image recognition module 4041 and an image processing module 4042. The image recognition module 4041 is communicatively connected to the image acquisition module 401 and the image processing module 4042, and is configured to extract, through a neural network, the characteristic information of the surgical instruments in the image information acquired by the image acquisition module 401 and transmit the characteristic information to the image processing module 4042. The image processing module 4042 obtains the identification information of the surgical instruments according to the characteristic information, the identification information comprising first identification information obtained before the operation and second identification information obtained after the operation; the image processing module 4042 is further configured to judge whether the first identification information matches the second identification information and output a judgment result. Optionally, the characteristic information includes at least one of the shape, texture, and color of the corresponding surgical instrument in the image information.
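To make the shape/texture/color features concrete, the toy sketch below computes hand-crafted descriptors for an instrument region. This is explicitly a stand-in for demonstration: the embodiment uses a trained neural network, and the background threshold and descriptor definitions here are assumptions, not the patent's method.

```python
import numpy as np

def describe_instrument(region):
    """Toy shape/texture/color descriptors for an RGB instrument region.

    A hand-crafted stand-in for the neural-network features of image
    recognition module 4041, for illustration only.
    """
    rgb = region.astype(float)
    gray = rgb @ np.array([0.299, 0.587, 0.114])   # luma-weighted grayscale
    mask = gray < 250                              # assume near-white background
    return {
        "color": tuple(rgb[mask].mean(axis=0).round(1)),  # mean R, G, B
        "shape": round(float(mask.mean()), 3),            # area fraction covered
        "texture": round(float(gray[mask].std()), 1),     # gray-level variation
    }

# 4x4 toy region: a dark "instrument" block on a white background
region = np.full((4, 4, 3), 255, dtype=np.uint8)
region[1:3, 1:3] = (60, 60, 60)     # 4 instrument pixels
features = describe_instrument(region)
print(features["shape"])  # 0.25: the instrument covers a quarter of the region
```

A real pipeline would feed such feature vectors (or learned ones) to the classifier in the test unit described below.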
Optionally, the image recognition module 4041 includes a preprocessing unit 40411 and an extraction unit 40412. The preprocessing unit 40411 is configured to preprocess the image information to reduce noise of the image information, and for a specific method of preprocessing the image information, please refer to the following detailed description; the extracting unit 40412 is communicatively connected to the preprocessing unit 40411, and is configured to extract, through a trained neural network, characteristic information of a surgical instrument in the image information preprocessed by the preprocessing unit 40411.
Optionally, the image processing module 4042 includes a test unit 40421. The test unit 40421 is communicatively connected to the image recognition module 4041 and is configured to obtain the identification information from the feature information through a trained classifier. The specific type of classifier is not particularly limited in this embodiment; it is, for example, a support vector machine (SVM), a Bayesian classifier, a k-nearest-neighbors (KNN) algorithm, an AdaBoost method, or a Rocchio algorithm. Further, the image processing module 4042 further comprises a training unit 40422 for training the classifier. For example, when the classifier is a support vector machine, a nonlinear training set is mapped into a linearly separable one in a high-dimensional space through a suitable kernel function, such as a Gaussian radial basis function; the support vector machine is then applied, and data evaluation and optimization are performed to obtain the trained support vector machine.
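A minimal sketch of the SVM-with-RBF-kernel option follows, using scikit-learn as an assumed implementation. The two-dimensional feature vectors are toy stand-ins for the extracted shape/texture/color features, and the two classes stand in for two identification feature classifications.

```python
import numpy as np
from sklearn.svm import SVC

# Toy feature vectors for two instrument classes (e.g. scissors vs. clips)
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
y = np.array([0, 0, 1, 1])

# The Gaussian radial basis function kernel implicitly maps the nonlinear
# training set into a high-dimensional space where it is linearly separable.
clf = SVC(kernel="rbf", gamma="scale", C=1.0)
clf.fit(X, y)

print(clf.predict([[0.15, 0.15], [0.85, 0.85]]))  # expect [0 1]
```

In the embodiment, the training unit 40422 would perform this fitting (plus evaluation and optimization) offline, and the test unit 40421 would call only the trained classifier's prediction step.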
Optionally, the preprocessing unit 40411 includes a feature selecting subunit, a graying subunit, and an illumination correcting subunit. The selecting subunit is used to determine the image area where the characteristic information to be extracted for the surgical instrument is located; the graying subunit is used to gray the image of the target area when the target area is a color image; the illumination correcting subunit is used to correct the grayed target area image when the illumination of the target area is uneven. The specific method of graying the image is not particularly limited in this embodiment. For example, for an image in RGB format, the G value, to which the human eye is relatively sensitive, may be taken as the gray value, or the R, G, and B values may be weighted to obtain the gray value; for a YUV- or YCbCr-encoded image, the Y value of each pixel may be taken directly as the gray value. Likewise, this embodiment does not particularly limit the method of correcting illumination unevenness. Examples include algorithms based on the Retinex theory, histogram equalization (HE), the unsharp mask method, morphological filtering, methods based on a spatial illuminance map, and adaptive correction based on a two-dimensional gamma function.
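The two graying options for RGB images can be sketched in a few lines of NumPy. The weights used for the weighted variant are the common ITU-R BT.601 luma coefficients, chosen here as one illustrative weighting; the embodiment does not mandate particular weights.

```python
import numpy as np

rgb = np.array([[[200, 100, 50], [0, 255, 0]]], dtype=np.uint8)  # 1x2 RGB image

# Option 1: take the G channel directly (the channel the eye is most sensitive to)
gray_g = rgb[..., 1]

# Option 2: weighted sum of the R, G, and B values (BT.601 weights, illustrative)
weights = np.array([0.299, 0.587, 0.114])
gray_w = (rgb.astype(float) @ weights).astype(np.uint8)

print(gray_g[0, 1])  # 255 (pure-green pixel's G channel)
print(gray_w[0, 1])  # 149 (0.587 * 255, truncated to uint8)
```

For YUV/YCbCr input the analogous step would simply slice out the Y plane, since it already encodes luminance.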
Preferably, the identification platform 400 further comprises a display module 403. The display module 403 is communicatively connected to the image processing module 4042, and is configured to display at least one of the first identification information, the second identification information, and the determination result.
Referring to figs. 7 and 8, to simplify the recognition process, the identification information of the surgical instruments may be given a symbolized representation. Fig. 7 shows a symbolized representation of the first identification information, i.e., the identification information of the surgical instruments before surgery; fig. 8 shows a symbolized representation of the second identification information, i.e., the identification information of the surgical instruments after surgery. More specifically, in the symbolized representation, each identification feature classification may be represented by a particular symbol. As shown in figs. 7 and 8, symbols differ between columns, representing different identification feature classifications, and the number of symbols in a column represents the number of surgical instruments under that classification. The specific symbol may be chosen as a simplified identifier based on the shape of the surgical instrument. The symbolized representation may be displayed on the display module to prompt the operator, assisting the operator in making decisions and taking relevant action on site. Of course, those skilled in the art may express the identification information in other ways, such as alphanumerically, and the present invention is not limited in this respect. In the symbolized representation of the first identification information shown in fig. 7, there are 3 circular symbols 501, representing that the number of surgical instruments of a certain category before surgery is 3. In the symbolized representation of the second identification information shown in fig. 8, there are 2 circular symbols 501, representing that the number of surgical instruments of this category after surgery is 2.
In one exemplary embodiment, when the first identification information is acquired, the display module displays its symbolized representation. When the second identification information is acquired and it has been judged whether the first identification information matches the second identification information, the display module displays the symbolized representations of the first identification information, the second identification information, and the comparison result. The operator can thus see at a glance whether the identification information of the surgical instruments differs before and after the operation. It should be noted that a differing number of circular symbols 501 in the first and second identification information does not by itself mean that they do not match: if the surgical instruments represented by the circular symbols 501 are expected to be consumed in surgery according to a preset rule (such as the expected surgical procedure), the second identification information should be understood to match the first identification information.
As shown in fig. 13, when the judgment result is that the preoperative first identification information and the postoperative second identification information of the surgical instruments match, the system indicates a pass: the display module shows the symbolized representations of the first identification information and the second identification information, and the judgment result is marked by attaching a green filter to the symbolized representation of the second identification information and/or a '√' symbol. After the operator confirms again that there is no error, the end of the operation can be confirmed.
Further, the identification platform comprises an alarm device. The alarm device is communicatively connected to the recognition processor 404 and is configured to be controlled by the recognition processor 404 to output a first alarm signal when the judgment result is that the first identification information does not match the second identification information. The alarm device may be a warning lamp and/or a buzzer; one or more may be provided, located on the support table 402, the physician-side control device 100, and/or the patient-side control device 200. The first alarm signal may be the warning lamp turning on or flashing, the buzzer beeping, and the like.
As shown in fig. 14, when the judgment result is that the preoperative first identification information and the postoperative second identification information of the surgical instruments do not match, the system indicates a failure: the recognition processor 404 controls the alarm device to output the first alarm signal, the display module shows the symbolized representations of the first identification information and the second identification information, and the judgment result is marked by attaching a red filter to the symbolized representation of the second identification information and/or a '×' symbol. The operator should then identify the cause; if it is confirmed that the surgical instrument was consumed during surgery or intentionally remains in the human body, manual intervention can be performed, and a record and treatment plan can be made.
The identification platform 400 also includes an input component in communication with the recognition processor 404 for inputting input information of the surgical instruments. The input information is associated with third identification information of the surgical instruments and may be identification information of at least part of the surgical instruments or additional information of the surgical instruments, such as a two-dimensional code or a bar code. In an alternative embodiment, the input component includes a microphone, mouse, keyboard, scanning gun, or NFC card reader communicatively coupled to the recognition processor 404. In another embodiment, the input component includes a touch element disposed on the display module 403, the two together forming a touch screen on which an operator can make inputs directly. The image processing module 4042 obtains the third identification information of the surgical instruments from the input information of the input component and compares it with the first identification information; if the two are not equal, this indicates an entry error or an image recognition error. Further, the recognition processor 404 may control the alarm device to issue a second alarm signal. The second alarm signal may be embodied with reference to the first alarm signal but should be distinguishable from it.
In an alternative embodiment, the identification platform 400 further includes a storage module 405. The storage module 405 is communicatively connected to the image acquisition module 401 and the recognition processor 404, and information such as the image information captured by the image acquisition module 401, the preprocessed images, the identification information, and the preset third identification information associated with the input information may be stored in the storage module for retrieval by the recognition processor 404.
Referring to fig. 10, based on the above-mentioned identification platform 400, the present invention further provides a surgical instrument identification method, which includes:
step SC1: acquiring image information of a surgical instrument placed on the support table 402;
step SC2: extracting characteristic information of the surgical instrument in the image information through a neural network;
step SC3: obtaining identification information of the surgical instrument according to the characteristic information, the identification information of the surgical instrument comprising first identification information obtained before the operation and second identification information obtained after the operation; and
step SC4: and judging whether the first identification information is consistent with the second identification information, and outputting a judging result.
Preferably, as shown in fig. 11, step SC2 includes:
Step SC21: preprocessing the image information to reduce noise of the image information;
step SC22: and extracting the characteristic information of the surgical instrument from the preprocessed image information.
In the step of extracting the characteristic information of the image information through the neural network, the preprocessing unit first preprocesses the image information of the surgical instruments acquired by the image acquisition module 401 to reduce image noise; the characteristic information (such as color, shape, and texture) of the surgical instruments in the image is then extracted, preferably the salient characteristic information.
Preferably, as shown in fig. 12, before extracting the characteristic information of the surgical instrument in the image information in step SC2, the surgical instrument identification method includes:
step SC211: extracting an image area covering the target surgical instrument from the image information, i.e., extracting a target area from the image information. The target area may be, for example, the image region in which all or part of the surgical instrument(s) in the acquired image is located. On the one hand, the items resting on the support table 402 include not only surgical instruments but possibly other items, so a target area must be set; on the other hand, a surgical instrument may include many features, not all of which help obtain the identification information, so some features should be rejected to reduce the computation of subsequent identification. The specific method of extracting the image area covering the target surgical instrument is not particularly limited in this embodiment; for example, the target area may be determined by a manually set rule or by manual selection.
Step SC212: and carrying out graying processing on the color image information in the target area range.
Step SC213: scanning the grayed image information, extracting pixel values, and judging whether the illumination of the target area is uniform. If the illumination is uneven, illumination correction is performed on the target area to make its illumination uniform; if the illumination is uniform, no correction is performed. Further, whether the illumination of the target area is uniform is judged again for the corrected image information, and if it is still non-uniform, illumination correction is performed again until the illumination of the target area is uniform.
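The judge-correct-judge-again loop of step SC213 can be sketched in pure NumPy. Both ingredients are illustrative assumptions: the uniformity criterion (standard deviation of mean intensity over vertical strips below a threshold) and the correction (dividing out a crude per-column spatial illuminance map) are simplified stand-ins for the methods the embodiment lists.

```python
import numpy as np

def block_means_std(gray, blocks=4):
    """Std. dev. of mean intensity over vertical strips; a large value is
    taken here (illustrative criterion) to mean uneven illumination."""
    w = gray.shape[1] // blocks * blocks
    return gray[:, :w].T.reshape(blocks, -1).mean(axis=1).std()

def correct_illumination(gray):
    """Divide out a per-column illuminance estimate (a crude spatial
    illuminance map) and rescale to the image's mean brightness."""
    illum = gray.mean(axis=0, keepdims=True)
    corrected = gray / np.maximum(illum, 1) * gray.mean()
    return np.clip(corrected, 0, 255).astype(np.uint8)

# A left-dark, right-bright gradient: clearly non-uniform illumination
gray = np.tile(np.linspace(40, 215, 64, dtype=np.uint8), (64, 1))

# Step SC213 as a loop: judge, correct, then judge again until uniform
while block_means_std(gray) > 5.0:
    gray = correct_illumination(gray)
```

On this synthetic gradient one correction pass flattens the illumination and the loop exits; real images would typically need a smoother illuminance estimate (e.g., morphological filtering) and a more robust uniformity test.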
Step SC214: and outputting the preprocessed image information.
Preferably, in step SC3, the feature information is passed through a trained classifier to obtain the identification information. Further, when the classifier is a support vector machine, a nonlinear training set is mapped into a linearly separable one in a high-dimensional space through a kernel function, such as a Gaussian radial basis function; the support vector machine is then applied, and data evaluation and optimization are performed to obtain the trained support vector machine.
Preferably, the surgical instrument identification method further comprises: displaying at least one of the first identification information, the second identification information, and the judgment result, for example, on the display device 303. In one embodiment, the first identification information is displayed after it is acquired; after the second identification information is acquired and it is judged whether the first identification information matches the second identification information, the second identification information and the judgment result are displayed. In another embodiment, after both the first and second identification information are obtained and the match judgment is made, the first identification information, the second identification information, and the judgment result are displayed, or only the judgment result is displayed.
Optionally, after the second identification information is obtained and whether the first identification information is consistent with the second identification information is judged, if the judgment result is that the first identification information is inconsistent with the second identification information, a first alarm signal is output.
Optionally, before step SC1, third identification information of the surgical instrument is obtained through input information; after the first identification information of the surgical instrument is obtained, whether the first identification information is consistent with the third identification information is judged, and if not, a second alarm signal is output.
In an alternative embodiment, the medical robot system may also be an orthopedic robot. In that case, the orthopedic robot has only the patient-side control device 200, without the doctor-side control device 100, and the surgical instrument may be a bone saw, a bone drill, or the like. Other devices, such as the identification platform, are similar to the embodiments described above and are therefore not repeated.
In summary, in the surgical instrument identification method, identification platform, and medical robot system provided by the present invention, the surgical instrument identification method comprises: acquiring image information of surgical instruments placed on a support table; extracting characteristic information of the surgical instruments in the image information through a neural network; obtaining identification information of the surgical instruments according to the characteristic information, the identification information comprising first identification information obtained before the operation and second identification information obtained after the operation; and judging whether the first identification information matches the second identification information and outputting a judgment result. The identification information of the surgical instruments can be obtained by extracting features from and recognizing the image information, thereby providing a guarantee for automatically checking the surgical instruments. Using the identification platform to acquire the identification information avoids the omissions to which manual counting is prone, so that an operator can conveniently and accurately judge whether any surgical instrument has been left behind during the operation, facilitating consumable inspection, postoperative evaluation, and the like.
The above description is only illustrative of the preferred embodiments of the present invention and is not intended to limit the scope of the present invention, and any alterations and modifications made by those skilled in the art based on the above disclosure shall fall within the scope of the appended claims.

Claims (16)

1. A method of identifying a surgical instrument, comprising:
acquiring image information of a surgical instrument placed on a support table;
extracting characteristic information of the surgical instrument in the image information through a neural network;
according to the characteristic information, obtaining identification information of the surgical instrument, wherein the identification information of the surgical instrument is represented by a symbolized diagram; the identification information of the surgical instrument comprises first identification information obtained before operation and second identification information obtained after operation; and
judging whether the first identification information is consistent with the second identification information or not, and outputting a judging result;
the supporting table top comprises a plurality of preset areas, each preset area is used for accommodating one surgical instrument, and the surgical instruments of the same type are placed in the same row; the different columns of symbols are inconsistent and represent different identification feature classifications, and the number of symbols in the same column represents the number of surgical instruments under the identification feature classification.
2. A surgical instrument identification method according to claim 1, wherein the characteristic information includes: at least one of shape, texture, and color of a corresponding surgical instrument in the image information.
3. The surgical instrument identification method according to claim 1, wherein the method of extracting the characteristic information of the surgical instrument from the image information through the neural network includes:
preprocessing the image information to reduce noise of the image information; and
and extracting the characteristic information of the surgical instrument from the preprocessed image information.
4. A surgical instrument identification method according to claim 1 or 3, characterized in that the surgical instrument identification method includes, before extracting the characteristic information of the surgical instrument in the image information:
extracting a target area covering a target surgical instrument from the image information;
carrying out graying treatment on the color image information in the target area range;
scanning the image information after graying, extracting pixel values, and judging whether the target area is uniform in illumination or not: if the illumination is uneven, carrying out illumination correction on the target area so as to make the illumination of the target area even; if the illumination is uniform, not correcting; and
And outputting the preprocessed image information.
5. The surgical instrument identification method according to claim 1, characterized in that the surgical instrument identification method further comprises:
after the first identification information is acquired, displaying the first identification information; after acquiring the second identification information and judging whether the first identification information is consistent with the second identification information, displaying the second identification information and the judging result, or,
and after the first identification information and the second identification information are acquired and whether the first identification information is consistent with the second identification information is judged, displaying the first identification information, the second identification information and a judging result or displaying the judging result.
6. The surgical instrument identification method according to claim 1, characterized in that the surgical instrument identification method further comprises:
after the second identification information is obtained, whether the first identification information is consistent with the second identification information is judged, and if the judgment result is that the first identification information is inconsistent with the second identification information, a first alarm signal is output.
7. The surgical instrument identification method according to claim 1, characterized in that the surgical instrument identification method further comprises:
Before acquiring image information of the surgical instrument placed on the supporting table, acquiring third identification information of the surgical instrument through input information;
after the first identification information of the surgical instrument is obtained, judging whether the first identification information is equal to the third identification information, and if not, outputting a second alarm signal.
8. An identification platform for surgical instruments, comprising: the device comprises a supporting table board, an image acquisition module and an identification processor;
the supporting table top is used for placing surgical instruments; the supporting table top comprises a plurality of preset areas, each preset area is used for accommodating one surgical instrument, and the surgical instruments of the same type are placed in the same row; the symbols of different columns are inconsistent and represent different identification feature classifications, and the number of the symbols of the same column represents the number of surgical instruments under the identification feature classification;
the recognition processor comprises an image recognition module and an image processing module;
the image acquisition module is arranged above the supporting table top and is in communication connection with the image recognition module; the image acquisition module is used for acquiring image information of the surgical instrument placed on the supporting table top and transmitting the image information to the image recognition module;
The image recognition module is in communication connection with the image processing module and is used for extracting characteristic information of the surgical instrument in the image information through a neural network and transmitting the characteristic information to the image processing module;
the image processing module obtains the identification information of the surgical instrument according to the characteristic information, and the identification information of the surgical instrument is represented by a symbolized diagram; the identification information of the surgical instrument comprises first identification information obtained before operation and second identification information obtained after operation;
the image processing module is used for judging whether the first identification information is consistent with the second identification information or not and outputting a judging result.
9. The recognition platform of claim 8, wherein the image recognition module comprises: a preprocessing unit and an extracting unit;
the preprocessing unit is used for preprocessing the image information so as to reduce noise of the image information;
the extraction unit is in communication connection with the preprocessing unit and is used for extracting the characteristic information of the surgical instrument from the image information preprocessed by the preprocessing unit through the trained neural network.
10. The identification platform of claim 9, wherein the preprocessing unit comprises: a feature selection subunit, a graying subunit and an illumination correction subunit;
the selecting subunit is used for determining an image area where the characteristic information to be extracted by the surgical instrument is located; the graying subunit is used for graying the image of the target area when the target area is a color image; the illumination correction subunit is used for correcting the target area image subjected to the graying treatment when the illumination of the target area is uneven.
11. The identification platform of claim 8, wherein the image processing module comprises a test unit communicatively coupled to the image identification module for obtaining the identification information from the feature information via a trained classifier.
12. The recognition platform of claim 11, wherein the image processing module further comprises a training unit to train the classifier.
13. The identification platform of claim 8, further comprising: a display module; the display module is in communication connection with the image processing module and is used for displaying at least one of the first identification information, the second identification information and the judging result.
14. The identification platform of claim 8, further comprising an alarm device in communication with the identification processor, the alarm device configured to be controlled to output a first alarm signal when the determination is that the first identification information does not match the second identification information.
15. The identification platform of claim 14, further comprising an input assembly in communication with the identification processor, the input assembly for inputting input information of a surgical instrument, the image processing module obtaining third identification information of the surgical instrument via the input assembly and comparing the third identification information with the first identification information, the identification processor controlling the alarm device to output a second alarm signal if the third identification information and the first identification information are not equal.
16. A medical robotic system comprising an identification platform according to any one of claims 8-15.
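The verification flow recited in claims 8-16 — gray the instrument image, recognize first identification information from it, compare it with the second identification information received from the instrument, and raise an alarm on mismatch — can be sketched roughly as follows. All function and variable names here are hypothetical illustrations; the patent discloses no source code:

```python
# Minimal sketch of the identification/verification logic of claims 8-16.
# Names are hypothetical; this is an illustration, not the patent's code.

def grayscale(pixel_rgb):
    """Luminance-weighted graying of one RGB pixel (the graying subunit
    acts when the target area is a color image)."""
    r, g, b = pixel_rgb
    return round(0.299 * r + 0.587 * g + 0.114 * b)

def verify(first_id, second_id, third_id=None):
    """Compare identification information and decide which alarm, if any.

    first_id  -- info recognized from the instrument image (claims 8, 11)
    second_id -- info received from the instrument itself (claim 8)
    third_id  -- info entered via the input assembly (claim 15), optional
    Returns None when everything matches, otherwise the alarm to raise.
    """
    if first_id != second_id:
        return "first_alarm"   # claim 14: image/instrument mismatch
    if third_id is not None and third_id != first_id:
        return "second_alarm"  # claim 15: manually entered info mismatch
    return None
```

The two-level alarm mirrors the claims: the first alarm signal flags a disagreement between the recognized and the communicated identification information, while the second flags a disagreement with operator input.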
CN202011595244.0A 2020-12-29 2020-12-29 Surgical instrument identification method, identification platform and medical robot system Active CN112712016B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011595244.0A CN112712016B (en) 2020-12-29 2020-12-29 Surgical instrument identification method, identification platform and medical robot system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011595244.0A CN112712016B (en) 2020-12-29 2020-12-29 Surgical instrument identification method, identification platform and medical robot system

Publications (2)

Publication Number Publication Date
CN112712016A CN112712016A (en) 2021-04-27
CN112712016B true CN112712016B (en) 2024-01-26

Family

ID=75546352

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011595244.0A Active CN112712016B (en) 2020-12-29 2020-12-29 Surgical instrument identification method, identification platform and medical robot system

Country Status (1)

Country Link
CN (1) CN112712016B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113673350B (en) * 2021-07-21 2024-02-20 苏州爱医斯坦智能科技有限公司 Surgical instrument identification method, device, equipment and storage medium
CN116919597B (en) * 2023-09-15 2023-12-26 中南大学 Operation navigation system based on multi-vision fusion acquisition information

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202277385U (en) * 2011-10-14 2012-06-20 上海理工大学 Automatic identifying and counting system of surgical instrument
CN108171269A (en) * 2018-01-04 2018-06-15 吴勤旻 A kind of medical instrument pattern recognition device
CN109409905A (en) * 2018-09-28 2019-03-01 微创(上海)医疗机器人有限公司 A kind of medical instrument automatic identification check system and method
CN110051443A (en) * 2019-05-24 2019-07-26 苏州爱医斯坦智能科技有限公司 Automatic method and apparatus monitoring identification and check surgical instrument
CN110301981A (en) * 2019-06-28 2019-10-08 华中科技大学同济医学院附属协和医院 A kind of intelligence checks the scanner and control method of surgical instrument
CN110678902A (en) * 2017-05-31 2020-01-10 Eizo株式会社 Surgical instrument detection system and computer program
CN111161875A (en) * 2019-12-31 2020-05-15 上海市肺科医院 Intelligent checking system for surgical auxiliary instrument
CN111553422A (en) * 2020-04-28 2020-08-18 南京新空间信息科技有限公司 Automatic identification and recovery method and system for surgical instruments
CN111860711A (en) * 2020-06-28 2020-10-30 维怡医疗科技有限公司 Surgical instrument management method, system and device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7180014B2 (en) * 2003-03-20 2007-02-20 Boris Farber Method and equipment for automated tracking and identification of nonuniform items
US20130113929A1 (en) * 2011-11-08 2013-05-09 Mary Maitland DeLAND Systems and methods for surgical procedure safety


Also Published As

Publication number Publication date
CN112712016A (en) 2021-04-27

Similar Documents

Publication Publication Date Title
US20210157403A1 (en) Operating room and surgical site awareness
CN110074863B (en) Augmented surgical reality environment for robotic surgical system
US10849709B2 (en) Systems and methods for removing occluding objects in surgical images and/or video
CN112804958A (en) Indicator system
CN112712016B (en) Surgical instrument identification method, identification platform and medical robot system
JP2019526315A (en) System and method for on-screen menu in teleoperated medical system
CN112704566B (en) Surgical consumable checking method and surgical robot system
CN111067468B (en) Method, apparatus, and storage medium for controlling endoscope system
US20200184638A1 (en) Systems and methods for enhancing surgical images and/or video
CN111374688B (en) System and method for correcting medical scans
KR20170095992A (en) Head-mountable computing device, method and computer program product
CN113476142B (en) Surgical instrument clamping force self-adaptive control system and control method and surgical robot
CN110573107B (en) Medical system and related method
CN114533263B (en) Mechanical arm collision prompting method, readable storage medium, surgical robot and system
CN116423547A (en) Surgical robot pedal control system, method, readable medium and surgical robot
WO2023161193A2 (en) Medical imaging device, medical system, method for operating a medical imaging device, and method of medical imaging
CN114266292A (en) Operation grade determination method, device, system, equipment and medium
CN115443108A (en) Surgical procedure guidance system
US20190192252A1 (en) Surgical Instrument Positioning System and Positioning Method Thereof
CN114424975A (en) Surgical robot assistance system, method, medium, terminal, and surgical robot
CN114081631B (en) Health monitoring system and surgical robot system
CN113925607B (en) Operation robot operation training method, device, system, medium and equipment
CN115908349B (en) Automatic endoscope parameter adjusting method and device based on tissue identification
CN114081631A (en) Health monitoring system and surgical robot system
CN115120341A (en) Computer readable storage medium, electronic equipment and surgical robot system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant