CN111407406B - Head position identification system, intraoperative control system and control method - Google Patents


Info

Publication number
CN111407406B
CN111407406B (granted publication of application CN202010243839.3A)
Authority
CN
China
Prior art keywords
data signal
position data
head
central control
control device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010243839.3A
Other languages
Chinese (zh)
Other versions
CN111407406A (en)
Inventor
汪全全 (Wang Quanquan)
李盛 (Li Sheng)
Current Assignee
Wuhan United Imaging Zhirong Medical Technology Co Ltd
Original Assignee
Wuhan United Imaging Zhirong Medical Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Wuhan United Imaging Zhirong Medical Technology Co Ltd
Priority to CN202010243839.3A
Publication of CN111407406A
Application granted
Publication of CN111407406B
Legal status: Active
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30: Surgical robots
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2055: Optical tracking systems
    • A61B 2034/2068: Surgical navigation using pointers, e.g. pointers having reference marks for determining coordinates of body points


Abstract

The application relates to a head position identification system, an intraoperative control system, and a control method. The visual identifier of the head position identification system is fixed to the head frame through the connecting frame, and the head frame is adapted to be secured to the head of a patient, so that the visual identifier forms a rigid connection with the patient's head. When the head is displaced, the visual identifier moves with it, and the position of the first optical signal changes synchronously. The visual tracking device collects the first optical signal and converts it into a first position data signal; the central control device acquires the first position data signal and obtains the head displacement from it. Because light propagates through air at high speed, measuring displacement with an optical signal improves the precision of the head position identification system. The system thus provides accurate head displacement information to the surgical operating equipment and improves its safety.

Description

Head position identification system, intraoperative control system and control method
Technical Field
The present application relates to the field of medical technology, and in particular, to a head position identification system, an intraoperative control system, and a control method.
Background
Conventional surgical operating equipment cannot detect movement of the patient's head during use. Such equipment must therefore establish a rigid connection with the patient's head, and the head must not move or rotate relative to the equipment throughout the procedure.
If the surgical operating equipment or the operating table collapses during the procedure, the patient can be seriously injured. How to improve the safety of surgical operating equipment is therefore an urgent problem.
Disclosure of Invention
In view of the above, it is necessary to provide a head position identification system, an intraoperative control system, and a control method that address the problem of how to improve the safety of surgical operating equipment.
A head position identification system includes a head frame, a connecting frame, a visual identifier, a visual tracking device, and a central control device. The head frame is adapted to be secured to the head of a patient. One end of the connecting frame is connected to the head frame, and the visual identifier is connected to the other end. The visual identifier is used to generate a first optical signal and is arranged within the light-collection range of the visual tracking device. The visual tracking device is configured to collect the first optical signal and convert it into a first position data signal. The central control device is connected to the visual tracking device and is configured to acquire the first position data signal in real time and obtain the head displacement from the first position data signal.
In one embodiment, the connecting frame comprises a plurality of connecting rods connected end to end in sequence.
In one embodiment, two adjacent links are pivotally connected.
In one embodiment, the visual identifier includes a bracket and an optical marker. The bracket is connected to the end of the connecting frame away from the head frame and includes at least one branch. The optical marker is disposed on at least one branch and is used to generate the first optical signal.
An intraoperative control system includes the head position identification system of any of the embodiments described above, and further includes a mechanical arm, a second position identification device, and a controller. The second position identification device is arranged on the mechanical arm and is used to generate a second optical signal that identifies the position of the mechanical arm. The visual tracking device is further configured to collect the second optical signal and convert it into a second position data signal. The central control device is further configured to acquire the second position data signal in real time and to generate travel path information according to the head displacement and the second position data signal. The central control device and the mechanical arm are respectively connected to the controller; the central control device outputs the travel path information to the controller, and the controller controls the mechanical arm to reach a target position according to the travel path information.
In one embodiment, the intraoperative control system further comprises a mobile cart. The central control device and the controller are accommodated in the mobile cart, and the mechanical arm is arranged on the mobile cart.
In one embodiment, the head frame is adapted to be coupled to a patient bed, and the mobile cart is coupled to the head frame.
A method of controlling an intraoperative control system as in any one of the embodiments above, comprising:
and S100, controlling the visual tracking device to acquire the first optical signal and the second optical signal. The visual tracking device converts the first light signal into a first position data signal. The visual tracking device also converts the second light signal to a second position data signal.
S200, controlling the central control device to collect the first position data signal and the second position data signal. The central control device generates travel path information from the first position data signal and the second position data signal.
S300, controlling the central control device to output the travel path information to the controller. The controller controls the mechanical arm to reach the target position according to the travel path information.
In one embodiment, the step of generating the travel path information from the first position data signal and the second position data signal in S200 includes:
S210, the central control device determines whether the control state of the controller is the automatic state or the manual state.
S220, if the controller is in the manual state, the central control device prohibits the mechanical arm from moving through the controller.
S230, the central control device updates the target position information according to the first position data signal, generates the travel path information according to the target position information and the second position data signal, and transmits the travel path information to the controller.
In one embodiment, after S210, the method further includes:
and S211, if the controller is in an automatic state, the central control device enables the mechanical arm to stop moving through the controller.
S212, the central control device updates the target position information according to the first position data signal, generates the travel path information according to the updated target position information and the second position data signal, and sends the travel path information to the controller.
S213, the controller controls the mechanical arm to reach the target position according to the travel path information.
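For illustration only, the branching of steps S210 through S230 and S211 through S213 can be sketched in Python. The interfaces (`control_state`, `stop_arm`, `update_target`, `plan_path`, `set_path`, `move_arm`) are hypothetical names, not part of the disclosure; in both control states the arm is halted and the path is regenerated from the updated target position, and only the automatic state then resumes motion:

```python
def handle_head_motion(central_control, controller, first_pos, second_pos):
    """Sketch of S210-S230 / S211-S213: react to a detected head
    displacement according to the controller's control state."""
    state = controller.control_state                      # S210: automatic or manual
    controller.stop_arm()                                 # S211 / S220: halt the arm
    target = central_control.update_target(first_pos)     # S212 / S230: new target
    path = central_control.plan_path(target, second_pos)  # regenerate travel path
    controller.set_path(path)                             # send path to controller
    if state == "automatic":                              # S213: only the automatic
        controller.move_arm(path)                         # state resumes motion
```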
In one embodiment, after the step of the central control device acquiring the first position data signal and the second position data signal in S200, the method further includes:
S201, the central control device obtains the real-time head displacement from the current first position data signal and the previous first position data signal; if the real-time displacement is greater than a preset threshold, S210 is executed.
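A minimal sketch of the S201 threshold test, assuming the first position data signal reduces to a 3-D coordinate; the 1.0 mm default threshold is purely illustrative, since the patent leaves the preset threshold unspecified:

```python
import math

def displacement_exceeds_threshold(prev_signal, curr_signal, threshold_mm=1.0):
    """S201: derive the real-time head displacement from the current and
    previous first position data signals and compare it with a preset
    threshold; S210 is executed only when this returns True."""
    displacement = math.sqrt(sum((c - p) ** 2
                                 for p, c in zip(prev_signal, curr_signal)))
    return displacement > threshold_mm
```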
In the head position identification system provided by the embodiments of the present application, the visual identifier is connected to the head frame through the connecting frame, and the head frame is adapted to be secured to the head of a patient, so the visual identifier forms a rigid connection with the patient's head. When the head is displaced, the visual identifier moves together with it, and the position of the first optical signal changes synchronously. The visual tracking device collects the first optical signal and converts it into a first position data signal, from which the central control device obtains the head displacement. Because light propagates through air at high speed, the first optical signal changes essentially without delay relative to the movement of the patient's head. Measuring displacement with an optical signal therefore improves the precision of the head position identification system. The system provides accurate head displacement information to the surgical operating equipment and improves its safety.
Drawings
FIG. 1 is a schematic structural view of the head position identification system and the intraoperative control system provided in one embodiment of the present application;
FIG. 2 is a communication diagram of the head position identification system provided in one embodiment of the present application;
FIG. 3 is a schematic diagram of a partial structure of the head position identification system provided in an embodiment of the present application;
FIG. 4 is a communication diagram of the intraoperative control system provided in one embodiment of the present application;
FIG. 5 is a schematic structural diagram of the intraoperative control system provided in another embodiment of the present application;
FIG. 6 is a flow chart of a control method of the intraoperative control system provided in one embodiment of the present application;
FIG. 7 is a flow chart of a control method of the intraoperative control system provided in another embodiment of the present application.
Reference numerals:
intraoperative control system 10
Human body 100
Head position identification system 20
Head frame 210
Connecting frame 220
Connecting rod 221
Visual identifier 230
Support 231
Branch 201
Optical marker 232
Visual tracking device 30
Robotic arm 410
Second position identification device 420
Controller 430
Central control device 50
Movable trolley 60
Display device 70
Alarm device 80
Detailed Description
In order to make the aforementioned objects, features, and advantages of the present application more comprehensible, embodiments of the present application are described in detail below with reference to the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The application may, however, be embodied in many forms other than those described herein, and those skilled in the art can make similar modifications without departing from its spirit; the application is therefore not limited to the embodiments disclosed below.
Ordinal terms such as "first" and "second" are used herein only to distinguish the objects described and carry no sequential or technical meaning. Unless otherwise indicated, the terms "connected" and "coupled" include both direct and indirect connections (couplings). Terms such as "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", and "counterclockwise" indicate orientations or positional relationships based on those shown in the drawings; they are used only for convenience and simplicity of description, do not indicate or imply that the devices or elements referred to must have a particular orientation or be constructed and operated in a particular orientation, and are therefore not to be considered limiting.
In this application, unless expressly stated or limited otherwise, a first feature being "on" or "under" a second feature may mean that the two features are in direct contact or in indirect contact through an intermediate medium. A first feature "on", "over", or "above" a second feature may be directly or obliquely above it, or may simply mean that the first feature is at a higher level than the second. A first feature "under", "below", or "beneath" a second feature may be directly or obliquely below it, or may simply mean that the first feature is at a lower level than the second.
Referring to fig. 1, 2 and 3, a head position identification system 20 according to an embodiment of the present application includes a head frame 210, a connecting frame 220, a visual identifier 230, a visual tracking device 30, and a central control device 50. The head frame 210 is adapted to be secured to the head of a patient. One end of the connecting frame 220 is connected to the head frame 210, and the visual identifier 230 is connected to the other end. The visual identifier 230 is used to generate a first optical signal and is disposed within the light-collection range of the visual tracking device 30. The visual tracking device 30 is configured to collect the first optical signal and convert it into a first position data signal. The central control device 50 is connected to the visual tracking device 30 and is configured to acquire the first position data signal in real time and obtain the head displacement from it.
In the head position identification system 20 provided by the embodiments of the present application, the visual identifier 230 is connected to the head frame 210 through the connecting frame 220, and the head frame 210 is adapted to be secured to the head of a patient, so the visual identifier 230 forms a rigid connection with the patient's head. When the head is displaced, the visual identifier 230 moves together with it, and the position of the first optical signal changes synchronously. The visual tracking device 30 collects the first optical signal and converts it into a first position data signal, from which the central control device 50 obtains the head displacement. Because light propagates through air at high speed, the first optical signal changes essentially without delay relative to the movement of the patient's head. Measuring displacement with an optical signal improves the precision of the head position identification system 20. The system provides accurate head displacement information to the surgical operating equipment and improves its safety.
During the operation, the human body lies on the operating bed. The connecting frame 220 keeps the visual identifier 230 away from the head, so the head position identification system 20 does not obstruct the movement of head surgical instruments and leaves sufficient space for them, facilitating real-time head surgery. At the same time, because it is kept away from the head, the head position identification system 20 can monitor the head displacement in real time throughout the operation. In one embodiment, the central control device 50 is further configured to update the head position information according to the first position data signal.
The central control device 50 may be a computer, a CPU, a central control unit, or a remote control unit.
The visual identifier 230 is a visual tracking marker matched to the visual tracking device 30.
In one embodiment, there is at least one visual identifier 230 and at least one visual tracking device 30, and the at least one visual tracking device 30 is connected to the central control device 50.
In one embodiment, there are a plurality of visual identifiers 230 and a plurality of visual tracking devices 30. The plurality of visual identifiers 230 respectively generate a plurality of first optical signals, which may be collected by a plurality of the visual tracking devices 30 or by a single visual tracking device 30.
In one embodiment, the central control device 50 derives the head displacement from the first position data signal. The head displacement is a vector: it includes both the direction and the magnitude (distance) of the movement.
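As an informal illustration (the patent discloses no source code), deriving the head displacement vector from two successive first position data signals can be sketched in Python; the function name and the reduction of each signal to a 3-D coordinate are assumptions:

```python
import math

def head_displacement(prev_pos, curr_pos):
    """Return the head displacement between two sampled first position
    data signals as (vector, magnitude): the displacement is a vector,
    carrying both the direction and the distance of the movement."""
    vector = tuple(c - p for p, c in zip(prev_pos, curr_pos))
    magnitude = math.sqrt(sum(v * v for v in vector))
    return vector, magnitude
```

For example, `head_displacement((0.0, 0.0, 0.0), (3.0, 4.0, 0.0))` yields the vector `(3.0, 4.0, 0.0)` with magnitude `5.0`.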
The visual tracking device 30 is connected to the central control device 50 for information transfer, by a wireless or wired connection. The wireless connection includes Bluetooth, WIFI, or a cellular network.

Referring also to fig. 4, in one embodiment, the connecting frame 220 includes a plurality of connecting rods 221 connected end to end in sequence.
In one embodiment, the maximum diameter of the link 221 distal from the headgear 210 is greater than the maximum diameter of the link 221 proximal to the headgear 210 to ensure stability of the connecting frame 220.
In one embodiment, the rotational connection between two adjacent links 221 facilitates adjusting the position of the visual marker 230.
The visual marker 230 may be a marked point or an array of marks, etc. The mark point can be an active luminous body or a passive reflecting body. The mark points can be in regular structures such as spheres, sheets and squares, and can also be in irregular structures. The marker array can be one or more of a polygon array, a cross array or other irregular array. Optical marks are disposed on the array of marks. The optical markers may be active emitters or passive reflectors. The optical mark can be in a regular structure such as a sphere, a sheet, a square and the like, and can also be in an irregular structure. The number of the optical marks is not limited. The kind of the optical mark is not limited.
In one embodiment, the visual identifier 230 includes a bracket 231 and an optical marker 232. The bracket 231 is connected to the end of the connecting frame 220 away from the head frame 210 and comprises at least one branch 201. The optical marker 232 is disposed on at least one branch 201 and is used to generate the first optical signal.
In one embodiment, the holder 231 includes one branch 201. One of the optical markers 232 is disposed at the middle or end of the branch 201. One end of the connecting frame 220 is connected to the middle or end of the branch 201.
In one embodiment, the holder 231 includes one branch 201, on which a plurality of the optical markers 232 are arranged. The plurality of optical markers 232 may be dispersed along the branch 201 or concentrated at its middle or end. One end of the connecting frame 220 is connected to the middle or end of the branch 201.
In one embodiment, the holder 231 includes a plurality of branches 201. The plurality of branches 201 may be arranged in one or more of a polygonal array, a cross array or other irregular array, and may also be arranged irregularly. A plurality of optical marks 232 are respectively disposed on the plurality of branches 201. The number of optical marks 232 on each of the branches 201 may be the same or different.
In the above embodiments, one or more of the branches 201 may be left without an optical marker 232.
In one embodiment, the visual identifier 230 includes a cross-shaped bracket and four optical markers 232, which are light-reflecting spheres. The middle of the cross-shaped bracket is fixedly connected to the end of the connecting frame 220 away from the head frame 210, and the four optical markers 232 are correspondingly arranged at the ends of the cross-shaped bracket, within the light-collection range of the visual tracking device 30.
Displacement monitoring is ensured as long as one of the four optical markers 232 can normally generate the first optical signal, which increases the reliability of the visual identifier 230.
When the head of the human body 100 moves, it drives the head frame 210, the connecting frame 220, the cross-shaped bracket, and the optical markers 232 to move. Because the four optical markers 232 occupy different positions, the light they reflect comes from different positions; the first optical signal is the light-reflection position signal of the optical markers 232. The visual tracking device 30 collects the light-reflection position signals of the four optical markers 232 and uploads the corresponding data signals to the central control device 50.
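For illustration only, the head translation can be estimated from the light-reflection position signals of the four optical markers 232 while tolerating markers that fail to reflect, consistent with the reliability note above. Representing a missing marker as `None`, and all names, are assumptions:

```python
def marker_set_displacement(prev_markers, curr_markers):
    """Average the per-marker displacement over the markers visible in
    both samples; monitoring continues as long as at least one optical
    marker still generates the first optical signal."""
    deltas = [tuple(c - p for p, c in zip(prev, curr))
              for prev, curr in zip(prev_markers, curr_markers)
              if prev is not None and curr is not None]
    if not deltas:
        raise ValueError("no visible optical marker")
    n = len(deltas)
    return tuple(sum(d[i] for d in deltas) / n for i in range(3))
```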
In the above embodiments, the head frame 210 may be replaced by other supports to adapt the head position identification system 20 to other surgical scenarios.
Referring to fig. 5, an intraoperative control system 10 including a head position identification system 20 according to any of the embodiments is provided. The intraoperative control system 10 further includes a robotic arm 410, a second position identifying device 420, and a controller 430.
The second position identification device 420 is disposed on the robotic arm 410 and is used to generate a second optical signal that identifies the position of the robotic arm 410. The visual tracking device 30 is further configured to collect the second optical signal and convert it into a second position data signal. The central control device 50 is further configured to collect the second position data signal in real time and to generate the travel path information according to the head displacement and the second position data signal. The central control device 50 and the robotic arm 410 are respectively connected to the controller 430. The central control device 50 outputs the travel path information to the controller 430, and the controller 430 controls the robotic arm 410 to reach a target position according to the travel path information.
The central control device 50 and the robotic arm 410 are respectively connected to the controller 430, by wireless or wired connection; the wireless connection includes Bluetooth, WIFI, or a cellular network. The surgical procedure, from start to finish, may involve both manual and automatic operation of the robotic arm 410. In automatic operation, the end of the robotic arm 410 reaches a target position along a planned path; the surgeon then inserts the surgical instrument through the adapter at the end of the robotic arm 410 and into the patient's head to perform the operation. Throughout the procedure, the robotic arm 410 serves as a positioning aid, addressing the inaccuracy and hand jitter of manual positioning by the physician.
The second position identification device 420 is disposed on the robotic arm 410 and moves in synchronization with it when the robotic arm 410 moves or rotates. The displacement and real-time position of the robotic arm 410 can therefore be monitored by monitoring the displacement of the second position identification device 420.
The intraoperative control system 10 provided by the embodiments of the present application monitors the movement of the human head and the movement of the robotic arm 410 simultaneously, and updates the travel path according to the first position data signal and the second position data signal. This reduces operation errors caused by movement of the head or of the robotic arm 410 and improves the accuracy and safety of the intraoperative control system 10.
In one embodiment, the intraoperative control system 10 further includes a mobile cart 60. The central control device 50 and the controller 430 are housed in the moving cart 60. The robot arm 410 is disposed on the moving cart 60.
The mobile cart 60 is used to coarsely adjust the position of the robotic arm 410. Before surgery, the operator pushes the mobile cart 60 into place so that the robotic arm 410 can reach its operating range.
Referring also to fig. 6, in one embodiment, the head frame 210 is adapted to be fixedly connected to a patient's bed, and the mobile cart 60 is fixedly connected to the head frame 210 to substantially fix the head frame 210 and establish a rigid connection between the robotic arm 410 and the head frame 210.
In one embodiment, the head frame 210 is only fixedly connected to the patient bed, and is not connected to the mobile cart 60, so as to prevent the patient bed from collapsing and damaging the head and cervical vertebrae of the human body.
In one embodiment, the intraoperative control system 10 further includes a display device 70 electrically connected to the central control device 50. The display device 70 is used for displaying the head movement position information, the movement information of the robotic arm 410, the travel path information, and the like.
In one embodiment, the display device 70 is fixedly disposed on the mobile cart 60, so that an operator can obtain relevant information in time.
The display device 70 may be an LED display screen, a monitor, or another display apparatus.
In one embodiment, the intraoperative control system 10 further includes an alarm device 80. The alarm device 80 is connected to the central control device 50. The alarm device 80 includes an audible alarm, a photoelectric alarm, a screen reminder, or the like.
The alarm device 80 is disposed on the mobile cart 60 so that the operator can notice alarm information in time. The alarm device 80 and the display device 70 are respectively connected to the central control device 50, by wireless or wired connection; the wireless connection includes Bluetooth, WIFI, or a cellular network.
Referring to fig. 7, the present embodiment provides a control method of the intraoperative control system 10. The intraoperative control system 10 includes a head frame 210, a connecting frame 220, a visual identifier 230, a visual tracking device 30, a robotic arm 410, a second position identification device 420, a controller 430, and a central control device 50. The head frame 210 is adapted to be secured to the head of a patient. One end of the connecting frame 220 is fixed to the head frame 210, and the visual identifier 230 is fixedly connected to the other end; the visual identifier 230 is used to generate a first optical signal. The second position identification device 420 is fixed to the robotic arm 410 and is used to generate a second optical signal. The visual tracking device 30, the controller 430, and the robotic arm 410 are respectively connected to the central control device 50. The control method includes:
S100, controlling the visual tracking device 30 to collect the first optical signal and the second optical signal. The visual tracking device 30 converts the first optical signal into the first position data signal and converts the second optical signal into the second position data signal.
S200, controlling the central control device 50 to collect the first position data signal and the second position data signal. The central control device 50 generates travel path information from the first position data signal and the second position data signal.
S300, controlling the central control device 50 to output the travel path information to the controller 430. The controller 430 controls the robotic arm 410 to reach a target position according to the travel path information.
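Steps S100 to S300 can be pictured as one acquisition-and-planning pass from optical signals to travel path information. The sketch below is illustrative only, not the patented implementation; `VisualTracker`, `CentralControl`, `PositionSignal`, and the straight-line path model are hypothetical names and simplifications.

```python
from dataclasses import dataclass

@dataclass
class PositionSignal:
    """A 3-D position decoded from a tracker light signal, in millimetres."""
    x: float
    y: float
    z: float

class VisualTracker:
    """S100: converts raw optical signals into position data signals (hypothetical)."""
    def convert(self, light_signal):
        # A real tracker would triangulate the optical markers; here the
        # "light signal" is already an (x, y, z) tuple for illustration.
        return PositionSignal(*light_signal)

class CentralControl:
    """S200/S300: derives travel path information for the controller."""
    def plan_path(self, head_pos, arm_pos):
        # Travel path modelled as a straight-line displacement from arm to target.
        return (head_pos.x - arm_pos.x,
                head_pos.y - arm_pos.y,
                head_pos.z - arm_pos.z)

tracker = VisualTracker()
head = tracker.convert((10.0, 0.0, 0.0))      # first position data signal
arm = tracker.convert((4.0, 0.0, 0.0))        # second position data signal
path = CentralControl().plan_path(head, arm)  # travel path information
print(path)  # (6.0, 0.0, 0.0)
```

In a real system the path would be re-planned continuously as new position data signals arrive, which is what the update steps below describe.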
The embodiments of the present application provide a control method for the intraoperative control system 10 that monitors both the movement of the head and the movement of the robotic arm 410. The control method updates the travel path according to the first position data signal and the second position data signal, reducing operation errors caused by movement of the head or of the robotic arm 410 and improving the accuracy and safety of the intraoperative control system 10.
In one embodiment, before S100, the control method further includes:
S010, adjusting the visual tracking device 30 so that the visual identifier 230 and the second position identification device 420 are within the lighting range of the visual tracking device 30.
Step S010 positions the visual tracking device 30.
In one embodiment, when performing a procedure on a human body with the robotic arm 410, the surgeon may first manually make a coarse adjustment of the position of the robotic arm 410 to bring it within the range of automated operation. After the manual coarse adjustment is complete, the intraoperative control system 10 automatically plans the path and performs the fine operation.
In one embodiment, the step of generating the travel path information from the first position data signal and the second position data signal in S200 includes:
S210, the central control device 50 determines whether the control state of the controller 430 is an automatic state or a manual state.
S220, if the controller 430 is in the manual state, the central control device 50 prohibits, through the controller 430, the robotic arm 410 from moving.
S230, the central control device 50 updates the target position information according to the first position data signal, generates the travel path information according to the target position information and the second position data signal, and sends the travel path information to the controller 430.
In one embodiment, after S230, the control method further includes:
S240, the central control device 50 controls, through the controller 430, the robotic arm 410 to enter a standby state, in which the robotic arm 410 waits for the doctor's operation.
In one embodiment, after S210, the control method further includes:
S211, if the controller 430 is in the automatic state, the central control device 50 stops the robotic arm 410 through the controller 430.
S212, the central control device 50 updates the target position information according to the first position data signal, generates the travel path information according to the updated target position information and the second position data signal, and sends the travel path information to the controller 430.
S213, the controller 430 controls the robotic arm 410 to reach the target position according to the travel path information.
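Taken together, the manual-state branch (S220 to S240) and the automatic-state branch (S211 to S213) amount to a small state dispatch. The sketch below is illustrative only; `ControlState`, `on_head_moved`, and the command strings are hypothetical names, not part of the patented system.

```python
from enum import Enum

class ControlState(Enum):
    MANUAL = "manual"
    AUTOMATIC = "automatic"

def on_head_moved(state, replan):
    """Dispatch for the two branches after S210 (hypothetical helper).

    Returns the sequence of commands the central control device would
    issue to the controller once head movement is detected; `replan`
    stands in for the target-position update and path generation step.
    """
    if state is ControlState.MANUAL:
        # S220: forbid arm motion; S230: replan; S240: stand by for the doctor.
        return ["inhibit_motion", replan(), "standby"]
    # S211: stop the arm; S212: replan; S213: move to the updated target.
    return ["stop", replan(), "move_to_target"]

plan = lambda: "updated_travel_path"
print(on_head_moved(ControlState.MANUAL, plan))
print(on_head_moved(ControlState.AUTOMATIC, plan))
```

The design point is that both branches replan from the same position data signals; they differ only in whether the arm is halted and re-dispatched automatically or left waiting for the doctor.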
In one embodiment, after the step of the central control device 50 acquiring the first position data signal and the second position data signal in S200, the control method further includes:
S201, the central control device 50 obtains the real-time displacement of the head from the current first position data signal and the previous first position data signal; if the real-time displacement is greater than a predetermined threshold, S210 is executed.
In one embodiment, the predetermined threshold is 0.3 mm, which balances the safety of the patient against uninterrupted operation.
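The threshold test in S201 reduces to a Euclidean-distance comparison between two consecutive first position data signals. A minimal sketch, assuming each signal is an (x, y, z) tuple in millimetres; `head_displacement` and `needs_replanning` are hypothetical helper names.

```python
import math

THRESHOLD_MM = 0.3  # predetermined threshold from the description

def head_displacement(prev, curr):
    """S201: Euclidean displacement between the previous and current
    first position data signals, each an (x, y, z) tuple in mm."""
    return math.dist(prev, curr)

def needs_replanning(prev, curr, threshold=THRESHOLD_MM):
    """True when the head has moved far enough that S210 must be executed."""
    return head_displacement(prev, curr) > threshold

print(needs_replanning((0.0, 0.0, 0.0), (0.1, 0.2, 0.0)))  # ~0.224 mm -> False
print(needs_replanning((0.0, 0.0, 0.0), (0.3, 0.2, 0.1)))  # ~0.374 mm -> True
```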
The intraoperative control system described above may also be applied to other surgical sites: a position identification device is provided at the site to be operated on, and the control method is used to control the intraoperative control system for that site.
Although the individual steps in the above flowchart are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, there is no strict restriction on the order of execution, and the steps may be performed in other orders. Moreover, at least some of the steps may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times; these sub-steps or stages need not be performed sequentially, but may be performed in turn or alternately with other steps, or with at least some of the sub-steps or stages of other steps.
The technical features of the embodiments described above may be combined arbitrarily. For brevity, not all possible combinations are described, but any combination of these technical features that involves no contradiction should be considered within the scope of this specification.
The examples above represent only several embodiments of the present application and are not to be construed as limiting the scope of the claims. A person skilled in the art may make several variations and modifications without departing from the concept of the present application, all of which fall within its scope of protection. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An intraoperative control system, comprising:
head position identification means comprising:
a head frame (210) for fixing to the head of a human body (100);
a connecting frame (220), one end of the connecting frame (220) being connected to the head frame (210);
a visual identifier (230), the visual identifier (230) being connected to the other end of the connecting frame (220), the visual identifier (230) being configured to generate a first light signal;
a visual tracking device (30), the visual identifier (230) being arranged within the lighting range of the visual tracking device (30), the visual tracking device (30) being used for collecting the first light signal and converting the first light signal into a first position data signal;
a central control device (50) connected with the visual tracking device (30), the central control device (50) being used for acquiring the first position data signal in real time and obtaining a head displacement according to the first position data signal, the central control device (50) further updating the position information of the head according to the first position data signal;
a mechanical arm (410);
a second position identification device (420) disposed on the mechanical arm (410), the second position identification device (420) being configured to generate a second light signal for identifying the position of the mechanical arm (410), the visual tracking device (30) being further configured to collect the second light signal and to convert the second light signal into a second position data signal, the central control device (50) being further configured to acquire the second position data signal in real time and to generate travel path information according to the head displacement and the second position data signal, the central control device (50) being further configured to update the travel path information according to the first position data signal and the second position data signal; and
a controller (430), the central control device (50) and the mechanical arm (410) being respectively connected with the controller (430), the central control device (50) being used for outputting the travel path information to the controller (430), and the controller (430) being used for controlling the mechanical arm (410) to reach a target position according to the travel path information.
2. The intraoperative control system of claim 1, wherein the connecting frame (220) includes a plurality of links (221) connected end-to-end in series.
3. The intraoperative control system of claim 2, wherein two adjacent links (221) are rotatably connected.
4. The intraoperative control system of claim 1, wherein the visual identifier (230) comprises:
a holder (231) connected to an end of the connecting frame (220) remote from the head frame (210), the holder (231) comprising at least one branch (201);
an optical marker (232), the optical marker (232) being arranged on at least one branch (201), the optical marker (232) being configured to generate the first light signal.
5. The intraoperative control system of claim 1, further comprising:
a mobile trolley (60), wherein the central control device (50) and the controller (430) are contained in the mobile trolley (60), and the mechanical arm (410) is arranged on the mobile trolley (60).
6. The intraoperative control system of claim 5, wherein the head frame (210) is configured for connection with a patient bed, and the mobile trolley (60) is connected with the head frame (210).
7. A method for controlling an intraoperative control system, characterized in that the intraoperative control system (10) comprises a head frame (210), a connecting frame (220), a visual identifier (230), a visual tracking device (30), a mechanical arm (410), a second position identification device (420), a controller (430) and a central control device (50), wherein the head frame (210) is adapted to be fixed to the head of a patient, one end of the connecting frame (220) is connected to the head frame (210), the visual identifier (230) is connected to the other end of the connecting frame (220), the visual identifier (230) is used for generating a first light signal, the second position identification device (420) is arranged on the mechanical arm (410), the second position identification device (420) is used for generating a second light signal, and the visual tracking device (30), the controller (430) and the mechanical arm (410) are respectively connected with the central control device (50), the control method comprising the following steps:
S100, controlling the visual tracking device (30) to collect the first light signal and the second light signal, wherein the visual tracking device (30) converts the first light signal into a first position data signal, and the visual tracking device (30) further converts the second light signal into a second position data signal;
S200, controlling the central control device (50) to collect the first position data signal and the second position data signal, the central control device (50) generating travel path information according to the first position data signal and the second position data signal;
S300, controlling the central control device (50) to output the travel path information to the controller (430), the controller (430) controlling the mechanical arm (410) to reach a target position according to the travel path information.
8. The control method of the intraoperative control system according to claim 7, wherein the step of generating the travel path information according to the first position data signal and the second position data signal in S200 includes:
S210, the central control device (50) determines whether the control state of the controller (430) is an automatic state or a manual state;
S220, if the controller (430) is in the manual state, the central control device (50) prohibits, through the controller (430), the mechanical arm (410) from moving;
S230, the central control device (50) updates target position information according to the first position data signal, and the central control device (50) generates the travel path information according to the target position information and the second position data signal and sends the travel path information to the controller (430).
9. The control method of the intraoperative control system according to claim 8, further comprising, after S210:
S211, if the controller (430) is in the automatic state, the central control device (50) stops the mechanical arm (410) through the controller (430);
S212, the central control device (50) updates the target position information according to the first position data signal, and the central control device (50) generates the travel path information according to the updated target position information and the second position data signal and sends the travel path information to the controller (430);
S213, the controller (430) controls the mechanical arm (410) to reach the target position according to the travel path information.
10. The control method of the intraoperative control system according to claim 7, further comprising, after the step of the central control device (50) acquiring the first position data signal and the second position data signal in S200:
S201, the central control device (50) obtains a real-time displacement of the head according to the current first position data signal and the previous first position data signal, and if the real-time displacement is greater than a predetermined threshold, S210 is executed.
CN202010243839.3A 2020-03-31 2020-03-31 Head position identification system, intraoperative control system and control method Active CN111407406B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010243839.3A CN111407406B (en) 2020-03-31 2020-03-31 Head position identification system, intraoperative control system and control method


Publications (2)

Publication Number Publication Date
CN111407406A CN111407406A (en) 2020-07-14
CN111407406B true CN111407406B (en) 2022-04-26

Family

ID=71485348

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010243839.3A Active CN111407406B (en) 2020-03-31 2020-03-31 Head position identification system, intraoperative control system and control method

Country Status (1)

Country Link
CN (1) CN111407406B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101257844A (en) * 2005-04-29 2008-09-03 范德比特大学 System and methods of using image-guidance for providing an access to a cochlear of a living subject
CN103445929A (en) * 2013-09-10 2013-12-18 河南科技大学第一附属医院 Operation fixing cap
CN103735317A (en) * 2013-12-18 2014-04-23 宁波市全灵医疗设备股份有限公司 Navigation device in orthopedics department and preparation method of navigation device
CN105050527A (en) * 2013-03-15 2015-11-11 圣纳普医疗(巴巴多斯)公司 Intelligent positioning system and methods therefore
CN105193458A (en) * 2005-12-28 2015-12-30 Pt稳定股份公司 Method and system for compensating a self-caused displacement of tissue
CN105208958A (en) * 2013-03-15 2015-12-30 圣纳普医疗(巴巴多斯)公司 Systems and methods for navigation and simulation of minimally invasive therapy
CN106580470A (en) * 2016-10-18 2017-04-26 南京医科大学附属口腔医院 System and method for head positioning on basis of binocular vision
CN106687063A (en) * 2014-08-13 2017-05-17 株式会社高永科技 Tracking system and tracking method using same
CN108201470A (en) * 2016-12-16 2018-06-26 上海铂联医疗科技有限公司 Autonomous dental implant robot system and device and method thereof
CN109330687A (en) * 2018-11-26 2019-02-15 上海术凯机器人有限公司 Surgical robot system
CN109571412A (en) * 2019-01-15 2019-04-05 北京华晟经世信息技术有限公司 Autonomous navigation and movement system and method for a mechanical arm
CN109864806A (en) * 2018-12-19 2019-06-11 江苏集萃智能制造技术研究所有限公司 Needle-driven robot navigation system with dynamic compensation based on binocular vision
CN110494095A (en) * 2017-04-20 2019-11-22 直观外科手术操作公司 System and method for constraining virtual reality surgery systems
CN110650703A (en) * 2017-05-05 2020-01-03 斯科皮斯有限公司 Surgical navigation system

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104083217B (en) * 2014-07-03 2016-08-17 北京天智航医疗科技股份有限公司 Surgical positioning device and robotic surgical system
JP6657933B2 (en) * 2015-12-25 2020-03-04 ソニー株式会社 Medical imaging device and surgical navigation system
US10631935B2 (en) * 2016-10-25 2020-04-28 Biosense Webster (Israel) Ltd. Head registration using a personalized gripper
KR102019482B1 (en) * 2017-07-31 2019-09-06 경북대학교 산학협력단 Optical tracking system and controlling method thereof
CN107440797B (en) * 2017-08-21 2020-04-03 刘洋 Registration and registration system and method for surgical navigation
CN108042218A (en) * 2017-12-05 2018-05-18 北京军秀咨询有限公司 Automatic visual positioning device and method for head markers of neurosurgery patients
CN107970060A (en) * 2018-01-11 2018-05-01 上海联影医疗科技有限公司 Surgical robot system and control method thereof
CN108705536A (en) * 2018-06-05 2018-10-26 雅客智慧(北京)科技有限公司 Dental robot path planning system and method based on visual navigation
CN109692050B (en) * 2018-12-26 2020-05-22 雅客智慧(北京)科技有限公司 Calibration and tracking method and device for dental implant navigation operation




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant