CN112712016A - Surgical instrument identification method, identification platform and medical robot system - Google Patents


Info

Publication number
CN112712016A
CN112712016A (application CN202011595244.0A; granted as CN112712016B)
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011595244.0A
Other languages
Chinese (zh)
Other versions
CN112712016B (en)
Inventor
宋进 (Song Jin)
陈功 (Chen Gong)
朱祥 (Zhu Xiang)
何超 (He Chao)
李明 (Li Ming)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Microport Medbot Group Co Ltd
Original Assignee
Shanghai Microport Medbot Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Microport Medbot Group Co Ltd filed Critical Shanghai Microport Medbot Group Co Ltd
Priority to CN202011595244.0A
Publication of CN112712016A
Application granted; publication of CN112712016B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; scene-specific elements
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; manipulators or robots specially adapted for use in surgery
    • A61B 34/30: Surgical robots
    • A61B 34/35: Surgical robots for telesurgery
    • A61B 34/70: Manipulators specially adapted for use in surgery
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/90: Identification means for patients or instruments, e.g. tags
    • A61B 90/98: Identification means for patients or instruments using electromagnetic means, e.g. transponders
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/08: Learning methods
    • G06V 2201/00: Indexing scheme relating to image or video recognition or understanding
    • G06V 2201/03: Recognition of patterns in medical or anatomical images
    • G06V 2201/034: Recognition of patterns in medical or anatomical images of medical instruments

Abstract

The invention provides a surgical instrument identification method, an identification platform and a medical robot system. The surgical instrument identification method comprises the following steps: acquiring image information of a surgical instrument placed on a supporting table top; extracting feature information of the surgical instrument from the image information through a neural network; obtaining identification information of the surgical instrument according to the feature information, the identification information comprising first identification information obtained before an operation and second identification information obtained after the operation; and judging whether the first identification information is consistent with the second identification information, and outputting the judgment result. Because the identification information is acquired automatically by the identification platform rather than by manual counting, which is prone to omissions, the operator can conveniently and accurately judge whether any surgical instrument has been left behind during the operation.

Description

Surgical instrument identification method, identification platform and medical robot system
Technical Field
The invention relates to the technical field of robot-assisted surgery, in particular to a surgical instrument identification method, an identification platform and a medical robot system.
Background
Minimally invasive surgery is a technique for performing operations inside the human body through endoscopes such as laparoscopes and thoracoscopes. It causes less trauma, less pain and less bleeding, so it effectively shortens the patient's recovery time, reduces patient discomfort, and avoids some of the harmful side effects of traditional open surgery.
A minimally invasive surgical robot system enables the operator to observe tissue features inside the patient through a two-dimensional or three-dimensional display device at the master console and to remotely operate the mechanical arms and surgical instruments of the surgical robot to complete the operation. In robotic surgery, it is often necessary to deliver surgical instruments or auxiliary items, such as suture needles and hemostatic clips, into the body. At present, the surgical instruments used are managed by a bedside nurse, and the intraoperative consumables are checked after the operation. This mode of operation depends on the bedside nurse's records and is typically performed post-operatively: the nurse manually counts the types and numbers of consumables used, so omissions are likely. Once a surgical instrument is missed, a serious medical accident may follow.
Disclosure of Invention
The invention aims to provide a surgical instrument identification method, an identification platform and a medical robot system, so as to solve the problem that manually counting the consumables used in an operation is prone to omissions.
In order to solve the above technical problem, according to a first aspect of the present invention, there is provided a surgical instrument identification method comprising:
acquiring image information of a surgical instrument placed on a supporting table top;
extracting feature information of the surgical instrument from the image information through a neural network;
obtaining identification information of the surgical instrument according to the feature information, the identification information comprising first identification information obtained before an operation and second identification information obtained after the operation; and
judging whether the first identification information is consistent with the second identification information, and outputting the judgment result.
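The pre-operative versus post-operative consistency check at the heart of these steps can be sketched in a few lines. This is a minimal illustration, not the patented implementation: the function name `check_instruments` and the classification strings are hypothetical, and in practice the counts would come from the neural-network recognition step.

```python
from collections import Counter

def check_instruments(pre_counts, post_counts):
    """Compare pre-op and post-op instrument tallies.

    pre_counts / post_counts map an identification-feature classification
    (e.g. "scissors/type-A") to the number of instruments detected on the
    supporting table. Returns (consistent, differences), where differences
    lists every classification whose count changed between the two scans.
    """
    differences = {}
    for key in set(pre_counts) | set(post_counts):
        before, after = pre_counts.get(key, 0), post_counts.get(key, 0)
        if before != after:
            differences[key] = (before, after)
    return (not differences), differences

pre = Counter({"scissors/type-A": 2, "gauze": 10, "hemostatic clip": 5})
post = Counter({"scissors/type-A": 2, "gauze": 9, "hemostatic clip": 5})
consistent, diff = check_instruments(pre, post)
# one gauze pad unaccounted for, so the judgment result is "inconsistent"
```

When the two tallies agree, the judgment result is "consistent" and `differences` is empty; any nonempty difference corresponds to the mismatch that triggers the alarm described later.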
Optionally, in the surgical instrument identification method, the feature information includes at least one of the shape, texture and color of the corresponding surgical instrument in the image information.
Optionally, in the surgical instrument identification method, extracting feature information of the surgical instrument from the image information through a neural network comprises:
preprocessing the image information to reduce its noise; and
extracting the feature information of the surgical instrument from the preprocessed image information.
Optionally, before extracting feature information of the surgical instrument from the image information, the surgical instrument identification method comprises:
extracting a target region covering the target surgical instrument in the image information;
performing graying processing on color image information within the target region;
scanning the grayed image information, extracting pixel values, and judging whether the target region is uniformly illuminated: if the illumination is not uniform, performing illumination correction on the target region to make it uniform; if the illumination is uniform, performing no correction; and
outputting the preprocessed image information.
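The preprocessing steps above (graying, uniformity check, illumination correction) can be sketched as follows. This is an illustrative toy version under assumed parameters: a real system would operate on camera frames with an image library and use a finer-grained uniformity test, and the helper names and threshold value are invented for illustration.

```python
def to_gray(rgb_img):
    # ITU-R BT.601 luma weights; rgb_img is a list of rows of (r, g, b) tuples.
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in rgb_img]

def illumination_uniform(gray, threshold=30.0):
    # Crude uniformity test: compare the mean brightness of the left and
    # right halves of the target region.
    w = len(gray[0]) // 2
    left = [p for row in gray for p in row[:w]]
    right = [p for row in gray for p in row[w:]]
    mean = lambda xs: sum(xs) / len(xs)
    return abs(mean(left) - mean(right)) <= threshold

def correct_illumination(gray, target=128.0):
    # Simple global gain correction toward a target mean brightness,
    # clipped to the 8-bit range.
    total = sum(p for row in gray for p in row)
    mean = total / (len(gray) * len(gray[0]))
    gain = target / mean if mean else 1.0
    return [[min(255.0, p * gain) for p in row] for row in gray]
```

An image with a dark left half and a bright right half fails the uniformity test and is corrected before feature extraction; a uniformly lit image is passed through unchanged, matching the branch in the claim above.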
Optionally, the surgical instrument identification method further comprises:
displaying the first identification information after it is acquired, and displaying the second identification information and the judgment result after the second identification information is acquired and the consistency judgment is made; or
displaying the first identification information, the second identification information and the judgment result, or the judgment result alone, after both pieces of identification information are acquired and the consistency judgment is made.
Optionally, the surgical instrument identification method further comprises:
after the second identification information is acquired and the consistency judgment is made, outputting a first alarm signal if the judgment result shows that the first identification information is not consistent with the second identification information.
Optionally, the surgical instrument identification method further comprises:
before acquiring the image information of the surgical instrument placed on the supporting table top, acquiring third identification information of the surgical instrument from input information; and
after the first identification information of the surgical instrument is acquired, judging whether the first identification information is equal to the third identification information, and outputting a second alarm signal if it is not.
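The third-identification check just described (comparing an operator-entered inventory against the pre-operative scan) might look like the sketch below; the function name `preop_verify` and the alarm-message format are hypothetical.

```python
def preop_verify(registered, first_id):
    """Compare the operator-entered inventory (third identification
    information) with the pre-operative scan (first identification
    information). Both arguments map a classification to a count.
    Returns None when they agree, otherwise a second-alarm message
    naming the disagreeing classifications."""
    mismatched = sorted(
        key for key in set(registered) | set(first_id)
        if registered.get(key, 0) != first_id.get(key, 0)
    )
    if mismatched:
        return ("second alarm: scan disagrees with entered inventory for "
                + ", ".join(mismatched))
    return None
```

Running the check before the operation catches registration errors early, so the first/second comparison after the operation starts from a verified baseline.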
To solve the above technical problem, according to a second aspect of the present invention, there is also provided an identification platform comprising a supporting table top, an image acquisition module and an identification processor;
the supporting table top is used for placing surgical instruments;
the image acquisition module is arranged above the supporting table top and is in communication connection with the image recognition module; the image acquisition module is used for acquiring image information of a surgical instrument placed on the supporting table top and transmitting the image information to the image identification module;
the recognition processor comprises an image recognition module and an image processing module;
the image recognition module is in communication connection with the image processing module, and is used for extracting the characteristic information of the surgical instrument in the image information through a neural network and transmitting the characteristic information to the image processing module;
the image processing module obtains the identification information of the surgical instrument according to the feature information, the identification information comprising first identification information obtained before an operation and second identification information obtained after the operation; and
the image processing module judges whether the first identification information is consistent with the second identification information and outputs the judgment result.
Optionally, in the recognition platform, the image recognition module includes: a preprocessing unit and an extraction unit;
the preprocessing unit is used for preprocessing the image information to reduce the noise of the image information;
the extraction unit is in communication connection with the preprocessing unit and is used for extracting the characteristic information of the surgical instrument in the image information preprocessed by the preprocessing unit through the trained neural network.
Optionally, in the identification platform, the preprocessing unit includes a feature selection subunit, a graying subunit and an illumination correction subunit;
the feature selection subunit is used for determining the image region where the feature information to be extracted from the surgical instrument is located; the graying subunit is used for performing graying processing on the image of the target region when the target region is a color image; and the illumination correction subunit is used for correcting the grayed target-region image when the illumination of the target region is not uniform.
Optionally, in the identification platform, the image processing module includes a testing unit, which is communicatively connected with the image recognition module and configured to obtain the identification information from the feature information through a trained classifier.
Optionally, in the identification platform, the image processing module further includes a training unit configured to train the classifier.
Optionally, the identification platform further includes: a display module; the display module is in communication connection with the image processing module and is used for displaying at least one of the first identification information, the second identification information and the judgment result.
Optionally, the identification platform further includes an alarm device communicatively connected with the identification processor, the alarm device being configured to output a first alarm signal when the judgment result indicates that the first identification information is not consistent with the second identification information.
Optionally, the identification platform further includes an input component communicatively connected with the identification processor and used for inputting information of a surgical instrument. The image processing module obtains third identification information of the surgical instrument through the input component and compares it with the first identification information; if the third identification information is not equal to the first identification information, the identification processor controls the alarm device to output a second alarm signal.
In order to solve the above technical problem, according to a third aspect of the present invention, there is also provided a medical robot system including: the identification platform as described above.
In summary, in the surgical instrument identification method, identification platform and medical robot system according to the present invention, the surgical instrument identification method includes: acquiring image information of a surgical instrument placed on a supporting table top; extracting feature information of the surgical instrument from the image information through a neural network; obtaining identification information of the surgical instrument according to the feature information, the identification information comprising first identification information acquired before an operation and second identification information acquired after the operation; and judging whether the first identification information is consistent with the second identification information, and outputting the judgment result. By extracting and recognizing features from the image information, the identification information of the surgical instruments can be obtained, which provides a basis for automatically checking the surgical instruments. Using the identification platform to acquire the identification information replaces manual counting, which is prone to omissions, so that the operator can conveniently and accurately judge whether any surgical instrument has been left behind during the operation, facilitating consumable checking, post-operative assessment and the like.
Drawings
It will be appreciated by those skilled in the art that the drawings are provided for a better understanding of the invention and do not constitute any limitation to the scope of the invention. Wherein:
FIG. 1 is a schematic view of a medical robotic system according to an embodiment of the invention;
FIG. 2 is a schematic view of a doctor-side control device of a medical robotic system in accordance with an embodiment of the present invention;
FIG. 3 is a schematic view of a patient-side control device of the medical robotic system in accordance with an embodiment of the present invention;
FIG. 4 is a schematic view of an imaging trolley of the medical robotic system of an embodiment of the present invention;
FIG. 5 is a schematic diagram of an identification platform according to an embodiment of the invention;
FIG. 6 is a schematic view of image information of a surgical instrument acquired by an image acquisition module according to an embodiment of the present invention;
FIG. 7 is a graphical representation of first identification information of a surgical instrument derived from the identification platform in accordance with one embodiment of the present invention;
FIG. 8 is a graphical representation of second identification information for a surgical instrument derived from the identification platform in accordance with one embodiment of the present invention;
FIG. 9 is a block diagram of an embodiment of the recognition platform;
FIG. 10 is a flow chart of a surgical instrument identification method in accordance with an embodiment of the present invention;
FIG. 11 is a flow chart of extracting feature information of a surgical instrument from image information via a neural network according to an embodiment of the present invention;
FIG. 12 is a flow chart of pre-processing image information according to an embodiment of the present invention;
FIG. 13 is a schematic diagram of a display of the first identification information and the second identification information matching according to an embodiment of the invention;
fig. 14 is a schematic display diagram illustrating a case where the first identification information and the second identification information do not match according to an embodiment of the present invention.
In the drawings:
100-doctor end control device; 101-main operator; 102-an imaging device; 103-a foot-operated surgical control device;
200-a patient-side control device; 201-surgical adjustment arm; 202-a surgical working arm; 203-surgical instruments; 204-column;
300-an image trolley; 301-endoscope; 302-an endoscope processor; 303-a display device;
400-identification platform; 401-image acquisition module; 402-support table; 403-display module; 404-identification processor; 4041-image recognition module; 40411-preprocessing unit; 40412-extraction unit; 4042-image processing module; 40421-testing unit; 40422-training unit; 405-storage module;
500-a master controller; 501-circular symbols.
Detailed Description
To further clarify the objects, advantages and features of the present invention, a more particular description of the invention will be rendered by reference to the specific embodiments illustrated in the appended drawings. It is noted that the drawings are in greatly simplified form and not to scale, and are intended merely to facilitate and clarify the explanation of the embodiments of the present invention. Further, the structures illustrated in the drawings are often only part of the actual structures; in particular, different drawings may have different emphases and may use different scales.
As used in this disclosure, the singular forms "a", "an" and "the" include plural referents; the term "or" is generally employed in a sense including "and/or"; the terms "a" and "an" are generally employed in a sense including "at least one"; the term "at least two" is generally employed in a sense including "two or more"; and the terms "first", "second" and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first", "second" or "third" may explicitly or implicitly include one or at least two of such features. The term "proximal" generally refers to the end near the operator, and the term "distal" to the end near the patient (i.e., near the lesion); the terms "proximal end" and "distal end" refer to the corresponding two parts, which include not only the end points. The terms "mounted", "connected" and "coupled" are to be understood broadly: for example, as fixedly connected, detachably connected, or integral; as mechanically or electrically connected; as directly connected or indirectly connected through intervening media; or as internal communication between two elements or interaction between two elements, unless the context clearly dictates otherwise.
The core idea of the invention is to provide a surgical instrument identification method, an identification platform and a medical robot system, so as to solve the problem that consumables used in the existing manual statistics operation are easy to leak.
The following description refers to the accompanying drawings.
Referring to fig. 1 to 14.
As shown in fig. 1, an embodiment of the present invention provides a medical robot system including a master-slave teleoperation surgical robot, that is, the medical robot system includes a doctor-side control device 100, a patient-side control device 200, and a master controller 500.
Referring to fig. 2, the doctor-side control device 100 is the operation end of the teleoperation surgical robot and includes a main manipulator 101 mounted thereon. The main manipulator 101 receives the operator's hand motion information as the motion control signal input of the whole system. Optionally, the main controller 500 is also disposed on the doctor-side control device 100. Preferably, the doctor-side control device 100 further includes an imaging device 102, which provides the operator with a stereoscopic image and with the surgical operation information needed to perform the operation. The operation information includes the type and number of the surgical instruments, their poses in the abdomen, the shape and arrangement of the patient's organ tissues and of the surrounding blood vessels, and the like. Optionally, the doctor-side control device 100 further includes a foot-operated surgical control device 103, through which the operator can input relevant operation instructions such as electro-cutting and electro-coagulation.
The patient-side control device 200 is the implementation platform of the teleoperation surgical robot and includes a column 204 and surgical execution assemblies mounted thereon. A surgical execution assembly includes a robotic arm and a surgical instrument 203. In one embodiment, the robotic arm includes a surgical adjustment arm 201 and a surgical working arm 202. The surgical working arm 202 is a mechanical remote-center-of-motion mechanism that drives the surgical instrument around a mechanically fixed point, and the surgical adjustment arm 201 adjusts the position of that fixed point in the working space. In another embodiment, the robotic arm is a spatial mechanism with at least six degrees of freedom that drives the surgical instrument around an actively maintained fixed point under program control. The surgical instrument 203 performs specific surgical operations such as clamping, cutting and shearing.
The main controller 500 is communicatively connected with the doctor-side control device 100 and the patient-side control device 200, respectively, and controls the movement of the surgical execution assembly according to the movement of the main manipulator 101. Specifically, the master controller includes a master-slave mapping module, which obtains the end pose of the main manipulator 101 and, using a predetermined master-slave mapping relationship, computes the expected end pose of the surgical execution assembly, and then controls the robotic arm to drive the surgical instrument to that expected end pose. Further, the master-slave mapping module also receives functional operation instructions for the instrument (such as electro-cutting and electro-coagulation instructions) and controls the instrument's energy driver to release energy and so realize those surgical operations.
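The position component of such a master-slave mapping is often an incremental, motion-scaled update. The sketch below illustrates only that idea and is not the patent's mapping module (which maps full end poses, including orientation); the function name and the default scale factor are assumptions.

```python
def map_master_to_slave(master_delta, slave_pos, scale=0.5):
    """Map an incremental master-handle displacement to a new instrument-tip
    position. Motion scaling (scale < 1) turns large hand motions into fine
    tip motions; orientation mapping is omitted for brevity."""
    return tuple(s + scale * d for s, d in zip(slave_pos, master_delta))

# A 2 mm hand motion along x becomes a 1 mm tip motion at scale 0.5.
tip = map_master_to_slave((2.0, 0.0, -4.0), (0.0, 0.0, 0.0))
```

Scaling down the master motion is one common reason teleoperated instruments can be steered more precisely than a freehand tool.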
Further, the medical robot system also includes an image trolley 300. As shown in fig. 4, the image trolley 300 includes an endoscope 301 and an endoscope processor 302 communicatively connected to it. The endoscope 301 acquires surgical operation information inside the body cavity of the patient. The endoscope processor 302 images the surgical operation information acquired by the endoscope 301 and transmits it to the imaging device 102 so that the operator can observe it. Optionally, the image trolley 300 further comprises a display device 303, communicatively coupled to the endoscope processor 302, for providing a real-time display of the surgical operation information to an operator (e.g., a nurse).
In operation, the operator sits at the doctor-side control device 100, located outside the sterile area, observes the returned operation information through the imaging device 102, and controls the surgical execution assembly and the laparoscope by operating the main manipulator 101 to complete various surgical operations.
Further, the medical robot system also comprises an identification platform 400. As shown in fig. 5 and 9, the identification platform 400 includes an image acquisition module 401, a support table 402 and an identification processor 404. The support table 402 is used for the placement of surgical instruments; the image acquisition module 401 acquires image information of the surgical instruments placed on the support table 402; and the identification processor 404, communicatively connected with the image acquisition module 401, obtains the identification information of the surgical instruments from the image information, judges whether the first identification information obtained before the operation matches the second identification information obtained after the operation, and outputs the judgment result.
In this embodiment, the identification processor 404 and the main controller 500 are separately provided. In other embodiments, the identification processor 404 is integrated with the main controller 500, for example, the corresponding functions of the identification processor 404 are integrated in the main controller 500 as functional modules with other modules (e.g., master-slave mapping module) in the main controller 500.
It should be noted that the surgical instruments described herein include not only instruments in a narrow sense (such as scalpels, scissors, needles, etc.), but also consumables used in surgery (such as cotton balls, gauzes, hemostatic clips, etc.). Preferably, the identification information includes identification feature classifications of the surgical instruments, and the number of surgical instruments under each of the identification feature classifications. Correspondingly, the first identification information of the surgical instruments comprises identification feature classifications of the preoperative surgical instruments and the number of the surgical instruments under each identification feature classification; the second identifying information includes identifying feature classifications of the post-operative surgical instruments and a number of surgical instruments under each identifying feature classification. Further, the recognition feature classification includes a category and specification of the surgical instrument, etc., such as different types of scissors, etc. In addition, in the embodiment, the second identification information matches the first identification information, which can be understood as that the types and the numbers of the surgical instruments before and after the operation are completely the same, that is, no surgical instrument is used and consumed in the cavity during the operation, and all the surgical instruments are recycled, that is, it can be determined that there is no omission. If the second identification information and the first identification information have difference information, the surgical instrument is left in the cavity. The second identification information is consistent with the first identification information, and the type and the number of the surgical instruments after the operation can be understood to accord with certain preset rules. 
For example, portions of a surgical instrument may intentionally remain in the body to aid wound healing or to secure target tissue; for such instruments, the preoperative count will be greater than or equal to the postoperative count. Further, if the first identification information and the second identification information do not match, the alarm device is triggered to output a first alarm signal to remind the operator to check. The present invention does not particularly limit the first alarm signal; a person skilled in the art can set it according to the prior art, for example an error prompt via sound, light, or a user interaction interface.
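The pre-/post-operative comparison described above can be sketched as a count check per identification feature classification. This is a minimal illustration, not the patented implementation: the category names and the `consumable` set (instruments whose count is allowed to decrease, per the preset rules) are invented for the example.

```python
def check_instruments(first_id, second_id, consumable=frozenset()):
    """Compare pre-op (first) and post-op (second) counts per category.

    Non-consumable categories must match exactly; consumable categories
    (e.g. clips intended to remain in the body) may only decrease.
    Returns a list of (category, pre_count, post_count) discrepancies;
    an empty list means the check passes.
    """
    discrepancies = []
    for category in set(first_id) | set(second_id):
        pre = first_id.get(category, 0)
        post = second_id.get(category, 0)
        if category in consumable:
            if post > pre:          # more recovered than laid out: error
                discrepancies.append((category, pre, post))
        elif pre != post:           # non-consumable counts must be equal
            discrepancies.append((category, pre, post))
    return discrepancies

first = {"scissors": 2, "gauze": 10, "hemostatic clip": 5}
second = {"scissors": 2, "gauze": 8, "hemostatic clip": 5}

# gauze is consumable here, so 10 -> 8 is acceptable; all else matches
assert check_instruments(first, second, consumable={"gauze"}) == []
```

A non-empty result would correspond to triggering the first alarm signal.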
The recognition platform 400 is described in detail below with reference to figs. 5 to 10.
The support table 402 includes a plurality of predefined areas, each for receiving a surgical instrument. As shown in fig. 6, surgical instruments of the same type are placed in the same column and different types in different columns. The surgical instruments are thus arranged by category, with instruments of the same category placed in order, which facilitates identification.
The image acquisition module 401 is disposed above the support table 402 via an arm and is communicatively connected to the recognition processor 404. The image acquisition module 401 is configured to acquire image information of the surgical instruments placed on the support table and transmit the image information to the recognition processor 404. Fig. 6 shows a schematic diagram of the image information acquired by the image acquisition module 401: four different types or models of surgical instruments, each type or model arranged in its own column, with instruments of the same type and model preferably arranged in rows.
Preferably, the recognition processor 404 includes an image recognition module 4041 and an image processing module 4042. The image recognition module 4041 is in communication connection with the image acquisition module 401 and the image processing module 4042, and the image recognition module 4041 is configured to extract feature information of a surgical instrument in the image information through a neural network according to the image information acquired by the image acquisition module 401, and transmit the feature information to the image processing module 4042. The image processing module 4042 obtains the identification information of the surgical instrument according to the feature information; the identification information of the surgical instrument comprises first identification information obtained before an operation and second identification information obtained after the operation; the image processing module 4042 is further configured to determine whether the first identification information matches the second identification information, and output a determination result. Optionally, the feature information includes: at least one of a shape, texture, and color of a corresponding surgical instrument in the image information.
Optionally, the image recognition module 4041 includes a preprocessing unit 40411 and an extraction unit 40412. The preprocessing unit 40411 is configured to preprocess the image information to reduce its noise, as described in detail below; the extraction unit 40412 is communicatively connected to the preprocessing unit 40411 and is configured to extract, through the trained neural network, the feature information of the surgical instrument from the image information preprocessed by the preprocessing unit 40411.
Optionally, the image processing module 4042 includes a testing unit 40421. The testing unit 40421 is communicatively connected to the image recognition module 4041 and is configured to obtain the identification information from the feature information through a trained classifier. The present embodiment places no particular limitation on the type of classifier, which may be, for example, a support vector machine (SVM), a Bayesian classifier, a k-nearest-neighbor (KNN) algorithm, the AdaBoost method, or the Rocchio algorithm. Further, the image processing module 4042 also comprises a training unit 40422 for training the classifier. For example, when the classifier is a support vector machine, the nonlinear training set is mapped into a linear training set in a high-dimensional space through a suitable kernel function, such as a Gaussian radial basis function; the linear training set is then fed into the support vector machine and optimized through data evaluation to obtain the trained support vector machine.
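The classifier training just described can be sketched with an off-the-shelf RBF-kernel SVM. This is an illustrative stand-in, not the patented training unit: the two-dimensional feature vectors (standing in for shape/texture/color descriptors) and the category labels are invented, and scikit-learn's `SVC` is assumed as the SVM implementation.

```python
from sklearn.svm import SVC

# Invented 2-D feature vectors for two instrument categories.
X_train = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
y_train = ["scissors", "forceps" if False else "scissors", "forceps", "forceps"]
y_train = ["scissors", "scissors", "forceps", "forceps"]

# The Gaussian radial basis function kernel implicitly maps the
# nonlinear training set into a high-dimensional space where it is
# linearly separable, as the text describes.
clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X_train, y_train)

assert clf.predict([[0.15, 0.15]])[0] == "scissors"
assert clf.predict([[0.85, 0.85]])[0] == "forceps"
```

In practice the features would come from the extraction unit 40412, and the "data evaluation optimization" would be a held-out validation pass over kernel and regularization parameters.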
Optionally, the preprocessing unit 40411 includes a feature selection subunit, a graying subunit, and an illumination correction subunit. The feature selection subunit is used for determining the image area where the feature information to be extracted from the surgical instrument is located; the graying subunit is used for performing graying processing on the image of the target area when the target area is a color image; the illumination correction subunit is used for correcting the grayed target area image when the illumination of the target area is not uniform. The present embodiment does not particularly limit the specific method of image graying. For example, for an image in RGB format, the G value, to which human eyes are most sensitive, may be taken as the gray value, or the R, G, and B values may be weighted to obtain the gray value; for an image in YUV or YCbCr encoding, the Y value of each pixel may be taken directly as the gray value. Likewise, the present embodiment does not particularly limit the method of correcting uneven illumination, which may be, for example, an algorithm based on Retinex theory, histogram equalization (HE), unsharp masking, morphological filtering, a method based on a spatial illuminance map, or an adaptive correction method based on a two-dimensional gamma function.
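The weighted RGB graying and the histogram equalization (HE) option mentioned above can be sketched as follows. This is a minimal illustration operating on nested lists rather than a real image buffer; the weights 0.299/0.587/0.114 are the common ITU-R BT.601 luma coefficients, an assumption since the patent does not fix the weighting.

```python
def to_gray(rgb_image):
    """Weighted graying: gray = 0.299 R + 0.587 G + 0.114 B."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in row]
            for row in rgb_image]

def equalize(gray_image, levels=256):
    """Histogram equalization (HE), one of the illumination-correction
    methods listed in the text."""
    flat = [p for row in gray_image for p in row]
    n = len(flat)
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    # Cumulative distribution, rescaled back to the [0, levels-1] range.
    cdf, total = [0] * levels, 0
    for i, h in enumerate(hist):
        total += h
        cdf[i] = round((total / n) * (levels - 1))
    return [[cdf[p] for p in row] for row in gray_image]

gray = to_gray([[(255, 0, 0), (0, 255, 0)], [(0, 0, 255), (255, 255, 255)]])
assert gray == [[76, 150], [29, 255]]
```

For YUV/YCbCr input, `to_gray` would simply be replaced by selecting the Y channel.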
Preferably, the identification platform 400 further comprises a display module 403. The display module 403 is communicatively connected to the image processing module 4042, and is configured to display at least one of the first identification information, the second identification information, and the determination result.
Referring to figs. 7 and 8, to simplify the identification process, the identification information of the surgical instruments can be given a symbolic representation. Fig. 7 illustrates a symbolic representation of the first identification information, i.e., the identification information of the preoperative surgical instruments; fig. 8 shows a symbolic representation of the second identification information, i.e., the identification information of the postoperative surgical instruments. More specifically, in the symbolic representation, each identification feature classification may be represented by a particular symbol. As shown in figs. 7 and 8, symbols in different columns differ and represent different identification feature classifications, and the number of symbols in the same column represents the number of surgical instruments under that classification. The specific symbol can be chosen as a simplified mark based on the shape of the surgical instrument. The symbolic representation can be displayed on the display module to prompt the operator and assist on-site judgment and related measures. Of course, those skilled in the art may also express the identification information in other ways, such as with letters and numbers, which the present invention does not limit. In fig. 7, the symbolic representation of the first identification information has 3 circular symbols 501, indicating that the preoperative number of surgical instruments in a certain category is 3. In fig. 8, the symbolic representation of the second identification information has 2 circular symbols 501, indicating that the postoperative number in this category is 2.
In one exemplary embodiment, when the first identification information is obtained, the display module displays its symbolic representation. When the second identification information has been acquired and it has been judged whether the first identification information is consistent with the second, the display module displays the symbolic representations of the first identification information, the second identification information, and the comparison result, so that the operator can visually see whether the pre- and postoperative identification information differ. It should be noted that a difference in the number of circular symbols 501 between the first and second identification information does not necessarily mean that they fail to correspond: if, according to a preset rule (such as the expected surgical procedure), the surgical instrument represented by the circular symbols 501 is expected to be consumed during surgery, the second identification information should still be understood as corresponding to the first.
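The symbolic representation described above amounts to repeating one symbol per counted instrument in each category. A hypothetical rendering, with an invented symbol mapping, matching the 3-circle / 2-circle example of figs. 7 and 8:

```python
# Invented mapping from identification feature classification to symbol.
SYMBOLS = {"scissors": "✂", "clip": "○", "gauze": "□"}

def symbolize(identification):
    """Render identification info as one symbol string per category,
    repeating the category's symbol once per counted instrument."""
    return {cat: SYMBOLS.get(cat, "?") * count
            for cat, count in identification.items()}

pre = symbolize({"clip": 3})    # first identification information
post = symbolize({"clip": 2})   # second identification information
assert pre == {"clip": "○○○"}
assert post == {"clip": "○○"}
```

Displaying `pre` and `post` side by side is what lets the operator see a count difference at a glance.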
As shown in fig. 13, when the determination result is that the first identification information (preoperative) matches the second identification information (postoperative), the system indicates a pass: in addition to the symbolic representation of the first identification information, the display module displays the symbolic representation of the second identification information with a green filter attached and/or a check mark "√" to identify the determination result. After the operator confirms that there is no error, the end of the operation can be confirmed.
Further, the identification platform comprises an alarm device. The alarm device is communicatively connected to the identification processor 404 and is controlled by the identification processor 404 to output a first alarm signal when the determination result is that the first identification information does not match the second identification information. The alarm device can be one or more warning lamps and/or buzzers, arranged on the support table 402, the doctor-end control device 100, and/or the patient-end control device 200. The first alarm signal can be the warning lamp lighting up or flashing, the buzzer sounding, and so on.
As shown in fig. 14, when the determination result is that the first identification information (preoperative) and the second identification information (postoperative) do not match, the system indicates a failure: the recognition processor 404 controls the alarm device to output the first alarm signal, and the display module displays the symbolic representation of the first identification information and the symbolic representation of the second identification information, the latter with a red filter attached and/or a cross "×" to identify the determination result. The operator should then search for the cause; if it is confirmed that the surgical instrument was consumed during the operation or intentionally remains in the human body, the operator can intervene manually and make a record and a treatment plan.
The recognition platform 400 also includes an input assembly communicatively coupled to the recognition processor 404 for entering input information of the surgical instruments. The input information is associated with the third identification information of the surgical instruments and may be identification information of at least part of the surgical instruments or information attached to them, such as a two-dimensional code or a barcode. In an alternative embodiment, the input assembly includes a microphone, mouse, keyboard, scanning gun, or NFC reader communicatively coupled to the recognition processor 404. In another embodiment, the input assembly includes a touch piece disposed on the display module 403, the two together forming a touch screen on which the operator can input directly. The image processing module 4042 obtains the third identification information of the surgical instruments from the input information and compares it with the first identification information; if the two are not equal, an input error or an image recognition error has occurred. Further, the identification processor 404 may control the alarm device to issue a second alarm signal, whose form may be similar to the first alarm signal but should be distinguishable from it.
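The input-consistency check just described can be sketched as a plain equality comparison between the image-derived and operator-entered identification information. A hypothetical illustration; the alarm marker string is invented:

```python
def verify_input(first_id, third_id):
    """Compare image-recognized (first) and operator-entered (third)
    identification info. Returns None if consistent; otherwise a marker
    for the second alarm signal, indicating an input error or an image
    recognition error."""
    return None if first_id == third_id else "SECOND_ALARM"

assert verify_input({"scissors": 2}, {"scissors": 2}) is None
assert verify_input({"scissors": 2}, {"scissors": 3}) == "SECOND_ALARM"
```

Running this before the operation (i.e., on the first identification information) is what catches scanning or recognition mistakes early.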
In an alternative embodiment, the identification platform 400 further includes a storage module 405. The storage module 405 is communicatively connected to the image acquisition module 401 and the identification processor 404, and may store information such as the image information acquired by the image acquisition module 401, preprocessed images, identification information, and the preset third identification information associated with the input information, for the identification processor 404 to call.
Referring to fig. 10, based on the above-mentioned identification platform 400, the present invention further provides a surgical instrument identification method, which includes:
step SC 1: acquiring image information of a surgical instrument placed on the support table 402;
step SC 2: extracting characteristic information of the surgical instrument in the image information through a neural network;
step SC 3: acquiring identification information of the surgical instrument according to the characteristic information; the identification information of the surgical instrument comprises first identification information obtained before an operation and second identification information obtained after the operation; and
step SC 4: and judging whether the first identification information is consistent with the second identification information or not, and outputting a judgment result.
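The steps SC1-SC4 above can be sketched as a small pipeline. The helper stages are placeholders standing in for the real acquisition, neural-network feature extraction, and classification modules; all names here are invented for illustration.

```python
def identify_instruments(acquire, extract_features, classify):
    """Run SC1-SC3 once: acquire an image, extract features, classify."""
    image = acquire()                    # SC1: image of the support table
    features = extract_features(image)   # SC2: neural-network features
    return classify(features)            # SC3: identification information

def compare(first_id, second_id):
    """SC4: judge whether the first and second identification info match."""
    return first_id == second_id

# Stub stages standing in for the real modules.
first = identify_instruments(lambda: "pre-op image",
                             lambda img: [("scissors", 2)],
                             lambda feats: dict(feats))
second = identify_instruments(lambda: "post-op image",
                              lambda img: [("scissors", 2)],
                              lambda feats: dict(feats))
assert compare(first, second) is True
```

The same `identify_instruments` call is run twice, once pre-operatively and once post-operatively, which is what makes the SC4 comparison meaningful.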
Preferably, as shown in fig. 11, step SC2 includes:
step SC 21: preprocessing the image information to reduce noise of the image information;
step SC 22: and extracting the characteristic information of the surgical instrument in the preprocessed image information.
In the step of extracting the feature information from the image information through the neural network, the preprocessing unit first preprocesses the image information of the surgical instruments acquired by the image acquisition module 401 to reduce image noise; then, preferably, distinctive feature information of the surgical instruments (such as color, shape, and texture) is extracted from the image.
Preferably, as shown in fig. 12, before the step SC2 extracts feature information of the surgical instrument from the image information, the surgical instrument recognition method includes:
step SC 211: extracting the image region in the image information that covers the target surgical instrument, i.e., extracting the target region. The target region may be, for example, the image area in which all or part of one or more surgical instruments is located in the acquired image. On the one hand, the items resting on the support table 402 may include items other than surgical instruments, which is why a target region is set; on the other hand, a surgical instrument may include many features, not all of which help acquire the identification information, so some features need to be excluded to reduce the computation of subsequent identification. The present embodiment does not particularly limit the method of extracting this region; for example, the target region may be determined by a manually set rule or determined manually.
Step SC 212: and carrying out gray processing on the color image information in the target area range.
Step SC 213: scanning the image information after graying, extracting pixel values, and judging whether the target area is uniformly illuminated: if the illumination is not uniform, performing illumination correction on the target area to enable the illumination of the target area to be uniform; if the illumination is uniform, no correction is needed, namely no correction is carried out. And further, judging whether the target area is uniformly illuminated or not again for the image information subjected to illumination correction, and if the illumination is not uniform, performing illumination correction again on the target area until the illumination of the target area is uniform.
Step SC 214: and outputting the preprocessed image information.
Preferably, in step SC3, the identification information is obtained from the feature information by a trained classifier. Further, when the classifier is a support vector machine, the nonlinear training set is mapped into a linear training set in a high-dimensional space through a kernel function, such as a Gaussian radial basis function; the linear training set is then fed into the support vector machine and optimized through data evaluation to obtain the trained support vector machine.
Preferably, the surgical instrument identification method further includes displaying at least one of the first identification information, the second identification information, and the determination result, for example via the display device 303. In one embodiment, after the first identification information is acquired, the first identification information is displayed; after the second identification information is acquired and whether the first identification information is consistent with the second is judged, the second identification information and the determination result are displayed. In another embodiment, after the first and second identification information are acquired and the judgment is made, the first identification information, the second identification information, and the determination result are displayed, or only the determination result is displayed.
Optionally, after obtaining the second identification information and determining whether the first identification information matches the second identification information, if the determination result is that the first identification information does not match the second identification information, outputting a first alarm signal.
Optionally, before step SC1, third identification information of the surgical instrument is acquired through input information; after the first identification information of the surgical instrument is acquired, whether the first identification information is consistent with the third identification information is judged, and if not, a second alarm signal is output.
In an alternative embodiment, the medical robotic system may also be an orthopaedic robot. Accordingly, the orthopaedic robot has only the patient-end control device 200 without the surgeon-end control device 100, and the surgical instrument may be a bone saw, a bone drill, or the like. Other devices, such as identification platforms, etc., are similar to those of the above-described embodiments and are therefore not repeated.
In summary, in the surgical instrument identification method, the surgical instrument identification platform, and the medical robot system according to the present invention, the surgical instrument identification method includes: acquiring image information of the surgical instruments placed on the support table; extracting feature information of the surgical instruments from the image information through a neural network; acquiring identification information of the surgical instruments according to the feature information, the identification information comprising first identification information obtained before the operation and second identification information obtained after the operation; and judging whether the first identification information is consistent with the second identification information, and outputting the determination result. By extracting and recognizing the image information of the surgical instruments, their identification information can be obtained, providing a guarantee for automatic checking of surgical instruments. Using the identification platform to acquire this identification information avoids the omissions that easily arise from manual counting, so that the operator can conveniently and accurately determine whether any surgical instrument has been left behind during the operation, facilitating consumable checking, postoperative assessment, and the like.
The above description is only of the preferred embodiments of the present invention and is not intended to limit its scope; any variations and modifications made by those skilled in the art based on the above disclosure fall within the scope of the appended claims.

Claims (16)

1. A surgical instrument identification method, comprising:
acquiring image information of a surgical instrument placed on a supporting table top;
extracting characteristic information of the surgical instrument in the image information through a neural network;
acquiring identification information of the surgical instrument according to the characteristic information; the identification information of the surgical instrument comprises first identification information obtained before an operation and second identification information obtained after the operation; and
judging whether the first identification information is consistent with the second identification information, and outputting a judgment result.
2. The surgical instrument identification method according to claim 1, wherein the feature information includes: at least one of a shape, texture, and color of a corresponding surgical instrument in the image information.
3. The method for identifying a surgical instrument according to claim 1, wherein the method for extracting feature information of the surgical instrument from the image information through a neural network includes:
preprocessing the image information to reduce the noise of the image information; and
and extracting the characteristic information of the surgical instrument in the preprocessed image information.
4. The surgical instrument recognition method according to claim 1 or 3, wherein before extracting feature information of a surgical instrument in the image information, the surgical instrument recognition method includes:
extracting a target region covering a target surgical instrument in the image information;
carrying out gray processing on the color image information in the target area range;
scanning the image information after graying, extracting pixel values, and judging whether the target area is uniformly illuminated: if the illumination is not uniform, performing illumination correction on the target area to enable the illumination of the target area to be uniform; if the illumination is uniform, no correction is carried out; and
and outputting the preprocessed image information.
5. The surgical instrument identification method according to claim 1, further comprising:
after first identification information is acquired, displaying the first identification information; after second identification information is acquired and whether the first identification information is consistent with the second identification information is judged, the second identification information and a judgment result are displayed, or,
and after acquiring the first identification information and the second identification information and judging whether the first identification information is consistent with the second identification information or not, displaying the first identification information, the second identification information and a judgment result or displaying the judgment result.
6. The surgical instrument identification method according to claim 1, further comprising:
and after second identification information is acquired and whether the first identification information is consistent with the second identification information is judged, if the judgment result shows that the first identification information is not consistent with the second identification information, outputting a first alarm signal.
7. The surgical instrument identification method according to claim 1, further comprising:
before acquiring the image information of the surgical instrument placed on the supporting table top, acquiring third identification information of the surgical instrument through input information;
and after the first identification information of the surgical instrument is acquired, judging whether the first identification information is equal to the third identification information, and if not, outputting a second alarm signal.
8. An identification platform for a surgical instrument, comprising: the device comprises a supporting table surface, an image acquisition module and an identification processor;
the supporting table top is used for placing surgical instruments;
the recognition processor comprises an image recognition module and an image processing module;
the image acquisition module is arranged above the supporting table top and is in communication connection with the image recognition module; the image acquisition module is used for acquiring image information of a surgical instrument placed on the supporting table top and transmitting the image information to the image identification module;
the image recognition module is in communication connection with the image processing module, and is used for extracting the characteristic information of the surgical instrument in the image information through a neural network and transmitting the characteristic information to the image processing module;
the image processing module acquires identification information of the surgical instrument according to the characteristic information; the identification information of the surgical instrument comprises first identification information obtained before an operation and second identification information obtained after the operation;
the image processing module is used for judging whether the first identification information is consistent with the second identification information or not and outputting a judgment result.
9. The recognition platform of claim 8, wherein the image recognition module comprises: a preprocessing unit and an extraction unit;
the preprocessing unit is used for preprocessing the image information to reduce the noise of the image information;
the extraction unit is in communication connection with the preprocessing unit and is used for extracting the characteristic information of the surgical instrument in the image information preprocessed by the preprocessing unit through the trained neural network.
10. The recognition platform of claim 9, wherein the preprocessing unit comprises: a characteristic selection subunit, a graying subunit and an illumination correction subunit;
the feature selection subunit is used for determining the image area where the characteristic information required to be extracted by the surgical instrument is located; the graying subunit is used for performing graying processing on the image of the target area when the target area is a color image; the illumination correction subunit is used for correcting the grayed target area image when the illumination of the target area is not uniform.
11. The recognition platform of claim 8, wherein the image processing module comprises a testing unit, the testing unit is communicatively coupled to the image recognition module, and is configured to obtain the recognition information from the feature information through a trained classifier.
12. The recognition platform of claim 11, wherein the image processing module further comprises a training unit to train the classifier.
13. The recognition platform of claim 8, further comprising: a display module; the display module is in communication connection with the image processing module and is used for displaying at least one of the first identification information, the second identification information and the judgment result.
14. The identification platform of claim 8, further comprising an alarm device communicatively connected to the identification processor, wherein the alarm device is configured to be controlled to output a first alarm signal when the first identification information is determined not to correspond to the second identification information.
15. The identification platform of claim 14, further comprising an input component communicatively connected to the identification processor, wherein the input component is configured to input information of a surgical instrument, the image processing module obtains third identification information of the surgical instrument through the input component, compares the third identification information with the first identification information, and controls the alarm device to output a second alarm signal if the third identification information is not equal to the first identification information.
16. A medical robotic system comprising an identification platform according to any one of claims 8 to 15.
CN202011595244.0A 2020-12-29 2020-12-29 Surgical instrument identification method, identification platform and medical robot system Active CN112712016B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011595244.0A CN112712016B (en) 2020-12-29 2020-12-29 Surgical instrument identification method, identification platform and medical robot system

Publications (2)

Publication Number Publication Date
CN112712016A true CN112712016A (en) 2021-04-27
CN112712016B CN112712016B (en) 2024-01-26


Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040186683A1 (en) * 2003-03-20 2004-09-23 Boris Farber Method and equipment for automated tracking and identification of nonuniform items
CN202277385U (en) * 2011-10-14 2012-06-20 上海理工大学 Automatic identifying and counting system of surgical instrument
US20130113929A1 (en) * 2011-11-08 2013-05-09 Mary Maitland DeLAND Systems and methods for surgical procedure safety
CN108171269A * 2018-01-04 2018-06-15 吴勤旻 Medical instrument pattern recognition device
CN109409905A * 2018-09-28 2019-03-01 微创(上海)医疗机器人有限公司 Automatic medical instrument identification and checking system and method
CN110051443A * 2019-05-24 2019-07-26 苏州爱医斯坦智能科技有限公司 Method and apparatus for automatically monitoring, identifying and checking surgical instruments
CN110301981A * 2019-06-28 2019-10-08 华中科技大学同济医学院附属协和医院 Intelligent scanner for checking surgical instruments and control method thereof
CN110678902A (en) * 2017-05-31 2020-01-10 Eizo株式会社 Surgical instrument detection system and computer program
CN111161875A (en) * 2019-12-31 2020-05-15 上海市肺科医院 Intelligent checking system for surgical auxiliary instrument
CN111553422A (en) * 2020-04-28 2020-08-18 南京新空间信息科技有限公司 Automatic identification and recovery method and system for surgical instruments
CN111860711A (en) * 2020-06-28 2020-10-30 维怡医疗科技有限公司 Surgical instrument management method, system and device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113673350A (en) * 2021-07-21 2021-11-19 苏州爱医斯坦智能科技有限公司 Surgical instrument identification method, device, equipment and storage medium
CN113673350B (en) * 2021-07-21 2024-02-20 苏州爱医斯坦智能科技有限公司 Surgical instrument identification method, device, equipment and storage medium
CN116919597A * 2023-09-15 2023-10-24 中南大学 Surgical navigation system based on multi-vision fused information acquisition
CN116919597B * 2023-09-15 2023-12-26 中南大学 Surgical navigation system based on multi-vision fused information acquisition

Also Published As

Publication number Publication date
CN112712016B (en) 2024-01-26

Similar Documents

Publication Publication Date Title
US11096749B2 (en) Augmented surgical reality environment for a robotic surgical system
US11080854B2 (en) Augmented surgical reality environment
US10908681B2 (en) Operating room and surgical site awareness
CN112804958A (en) Indicator system
JP2003265408A (en) Endoscope guide device and method
CN112712016B (en) Surgical instrument identification method, identification platform and medical robot system
CN111067468B (en) Method, apparatus, and storage medium for controlling endoscope system
JP2019010513A (en) System and method for glass state view in real-time three-dimensional (3d) cardiac imaging
CN112704566B (en) Surgical consumable checking method and surgical robot system
US20200113636A1 (en) Robotically-assisted surgical device, robotically-assisted surgery method, and system
US20200184638A1 (en) Systems and methods for enhancing surgical images and/or video
CN111374688B (en) System and method for correcting medical scans
WO2019092950A1 (en) Image processing device, image processing method, and image processing system
EP3643265A1 (en) Loose mode for robot
CN113476142B (en) Surgical instrument clamping force self-adaptive control system and control method and surgical robot
US7340291B2 (en) Medical apparatus for tracking movement of a bone fragment in a displayed image
EP3933481A1 (en) Medical observation system and method, and medical observation device
CN107496029A (en) Intelligent minimally invasive surgery system
US20190192252A1 (en) Surgical Instrument Positioning System and Positioning Method Thereof
CN115908349B (en) Automatic endoscope parameter adjusting method and device based on tissue identification
WO2022054498A1 (en) Medical arm control system, medical arm device, medical arm control method, and program
US20210298854A1 (en) Robotically-assisted surgical device, robotically-assisted surgical method, and system
CN115120341A (en) Computer readable storage medium, electronic equipment and surgical robot system
US20190069866A1 (en) Display method, and display control device
CN113925607A (en) Operation training method, device, system, medium and equipment for surgical robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant