WO2023124732A1 - Device control method and system for image-guided interventional puncture - Google Patents

Device control method and system for image-guided interventional puncture

Info

Publication number
WO2023124732A1
WO2023124732A1 (PCT/CN2022/135624; CN2022135624W)
Authority
WO
WIPO (PCT)
Prior art keywords
surgical robot
imaging device
motion state
control
surgical
Prior art date
Application number
PCT/CN2022/135624
Other languages
English (en)
French (fr)
Inventor
柯贤锋
谢强
Original Assignee
武汉联影智融医疗科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 武汉联影智融医疗科技有限公司
Publication of WO2023124732A1 publication Critical patent/WO2023124732A1/zh

Classifications

    • A HUMAN NECESSITIES; A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/34 Trocars; Puncturing needles
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges

Definitions

  • the present application relates to the field of medical devices, in particular to a device control method and system for image-guided interventional puncture.
  • With the development of computerized tomography (CT) technology, surgical robots guided by CT imaging equipment are increasingly used to assist doctors in puncture operations. Automatic needle insertion by surgical robots, or doctor-led surgical robot needle insertion, has become the main development trend of CT image-guided interventional puncture surgery.
  • In order for the CT imaging device to guide the surgical robot to complete the puncture action, it is generally necessary to obtain the position information of a specific area inside the patient's body through the CT imaging device, and then guide the surgical robot to that position to complete the puncture action.
  • Because the image-guided interventional puncture system is often independent of the CT system, the CT imaging equipment cannot detect the working status of the surgical robot, and the surgical robot cannot detect the working status of the CT imaging equipment; there may therefore be unintended relative motion that causes physical injury to the patient.
  • One of the embodiments of this specification provides an image-guided interventional puncture device control method. The method includes: acquiring the initial motion state of the imaging device and/or the surgical robot; and controlling the target motion state of the surgical robot and/or the imaging device according to the initial motion state.
  • In some embodiments, controlling the target motion state of the surgical robot and/or the imaging device according to the initial motion state includes: controlling the second motion state of the surgical robot according to the first motion state of the imaging device, and/or controlling the first motion state of the imaging device according to the second motion state of the surgical robot.
  • In some embodiments, controlling the second motion state of the surgical robot according to the first motion state of the imaging device includes: when it is determined from the first motion state that the imaging device is moving, controlling the surgical robot to keep still.
  • In some embodiments, controlling the first motion state of the imaging device according to the second motion state of the surgical robot includes: when it is determined from the second motion state that the surgical robot is moving, controlling the imaging device to keep still.
  • In some embodiments, controlling the second motion state of the surgical robot according to the first motion state of the imaging device includes: determining a first motion trajectory of the imaging device according to the first motion state of the imaging device; and controlling the surgical robot to move according to the first motion trajectory.
  • In some embodiments, controlling the surgical robot to move according to the first motion trajectory includes: predicting the distance between the imaging device and the surgical robot according to the first motion trajectory of the imaging device; and, when the distance is less than a distance threshold, simultaneously controlling the imaging device and the surgical robot to keep still.
  • In some embodiments, controlling the surgical robot to move according to the first motion trajectory includes: planning a second motion trajectory of the surgical robot according to the first motion trajectory; and controlling the movement of the surgical robot according to the second motion trajectory.
  • In some embodiments, controlling the target motion state of the surgical robot and/or the imaging device according to the initial motion state includes: controlling the motion speed of the surgical robot and/or the imaging device according to the initial motion state of the imaging device and/or the surgical robot.
  • the method further includes: controlling the movement speed of the surgical robot and/or the imaging device according to the environment information.
  • In some embodiments, the method further includes: acquiring a first end signal generated by the imaging device when ending the current preset procedure, or a second end signal generated by the surgical robot when ending the current preset procedure; and, according to the first end signal or the second end signal, controlling the imaging device and/or the surgical robot to enter the motion state of the next procedure.
  • In some embodiments, controlling the imaging device and/or the surgical robot to enter the motion state of the next procedure includes: controlling the imaging device to keep still and/or releasing the static state of the surgical robot according to the first end signal; and controlling the surgical robot to keep still and/or releasing the static state of the imaging device according to the second end signal.
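As an illustrative sketch only (the function and signal names below are hypothetical and not part of this application), the end-signal handoff described in the preceding embodiment can be modeled as swapping which device is held still:

```python
def handle_end_signal(signal: str, locks: dict) -> dict:
    """Swap which device is held still when one device ends its preset procedure.

    `locks` maps a device name to True (held still) or False (free to move).
    Signal names are illustrative placeholders.
    """
    if signal == "imaging_end":          # first end signal, from the imaging device
        locks["imaging_device"] = True   # keep the imaging device still
        locks["surgical_robot"] = False  # release the robot's static state
    elif signal == "robot_end":          # second end signal, from the surgical robot
        locks["surgical_robot"] = True
        locks["imaging_device"] = False
    return locks
```

Under this sketch, each end signal simultaneously freezes the finishing device and frees the other, so exactly one side may move in any procedure.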
  • In some embodiments, the method further includes: controlling the imaging device and the surgical robot to enter an integrated working mode according to an access request from the surgical robot or the imaging device; in the integrated working mode, the motion states of the imaging device and the surgical robot are related to each other.
  • In some embodiments, the method further includes: obtaining an interrupt request sent by the imaging device or the surgical robot, and controlling the imaging device and the surgical robot to enter an independent working mode according to the interrupt request. In the independent working mode, the motion states of the imaging device and the surgical robot are independent of each other.
  • In some embodiments, the method further includes: detecting the connection relationship between the imaging device and the surgical robot; and, when the connection relationship is abnormal, simultaneously controlling the imaging device and the surgical robot to keep still.
  • the method further includes: in response to failure of the imaging device or the surgical robot, controlling the imaging device and the surgical robot to enter an independent working mode.
  • One of the embodiments of this specification provides an image-guided interventional puncture device control system. The system includes: an imaging device, used to acquire image data of a target object; a surgical robot, used to perform a puncture operation; and a control module, used to control the target motion state of the surgical robot and/or the imaging device according to the initial motion state of the imaging device and/or the surgical robot.
  • the system further includes: a display module, configured to receive control instruction information and motion state information output by the imaging device and/or the surgical robot, and display them on a display interface.
  • In some embodiments, the system further includes: a first interlocking interface, used to control the surgical robot to establish or terminate the connection relationship with the imaging device, and to detect the connection relationship; a second interlocking interface, used to control the motion state of the imaging device according to the motion state of the surgical robot; and a third interlocking interface, used to control the imaging device and/or the surgical robot to keep still in a preset emergency situation.
  • In some embodiments, the system further includes: a first transmission channel, configured to transmit the image data acquired by the imaging device to the surgical robot, so that the surgical robot performs the puncture operation according to the image data; and a second transmission channel, configured to transmit the first motion state information of the imaging device to the surgical robot, and/or transmit the second motion state information of the surgical robot to the imaging device.
  • One of the embodiments of this specification provides a computer-readable storage medium. The storage medium stores computer instructions, and after a computer reads the computer instructions in the storage medium, the computer executes the method described above.
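The interlock idea running through the embodiments above, in which whichever side is moving forces the other side to remain still, can be sketched minimally as follows (names are illustrative, not from the application):

```python
def interlock(imaging_moving: bool, robot_moving: bool) -> dict:
    """Mutual-exclusion interlock between the imaging device and the
    surgical robot: a moving device forces the other to stay still."""
    return {
        "surgical_robot": "still" if imaging_moving else "free",
        "imaging_device": "still" if robot_moving else "free",
    }
```

For example, while the scanning bed is translating the patient, the robot's target state is "still"; once the bed stops, the robot is free to execute its planned motion.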
  • Fig. 1 is a schematic diagram of an application scenario of an image-guided interventional puncture system according to some embodiments of this specification;
  • Fig. 2 is a block diagram of a processing device according to some embodiments of this specification;
  • Fig. 3 is a schematic flowchart of an image-guided interventional puncture device control method according to some embodiments of this specification;
  • Fig. 4 is a schematic diagram of the working modes of the image-guided interventional puncture system according to some embodiments of this specification;
  • Fig. 5 is a schematic flowchart of an image-guided interventional puncture device control method according to other embodiments of this specification;
  • Fig. 6 is a schematic structural diagram of an image-guided interventional puncture system according to some embodiments of this specification;
  • Fig. 7 is a schematic diagram of the connection relationships of the image-guided interventional puncture system according to some embodiments of this specification.
  • Terms such as "system", "device", "unit" and/or "module" used in this specification are means for distinguishing different components, elements, parts or assemblies at different levels. The words may be replaced by other expressions if those expressions can achieve the same purpose.
  • CT imaging equipment-guided interventional surgery robots can be divided into two categories: miniaturized design products, such as XACT and iSYS, and surgical execution arm design products, such as MAXIO and ZeroBot.
  • the puncture device of the miniaturized design product is directly fixed on the scanning bed or bound to the patient, thus avoiding the risk of unexpected relative movement between the surgical robot and the moving parts of the CT imaging equipment.
  • In view of this, the embodiments of the present application provide an image-guided interventional puncture device control method and system, which control the motion state of one of the imaging device and the surgical robot according to the motion state of the other. This solves the technical problem of possible unexpected relative motion between the surgical robot and the CT imaging device, and improves the safety of the CT imaging equipment guiding the surgical robot to complete the puncture action.
  • Fig. 1 is a schematic diagram of an application scene of an image-guided interventional puncture system according to some embodiments of the present specification.
  • the image-guided interventional puncture system 100 may include an imaging device 110 , a surgical robot 120 , a processing device 130 , a terminal device 140 , a storage device 150 and a network 160 .
  • processing device 130 may be a part of imaging device 110 and/or surgical robot 120 .
  • the imaging device 110 may scan the target object in the detection area or the scanning area to obtain image data (for example, a scanned image, etc.) of the target object.
  • In some embodiments, the imaging device 110 may be a computed tomography (CT) scanner, a magnetic resonance imaging (MRI) scanner, a positron emission tomography (PET) scanner, a single photon emission computed tomography (SPECT) scanner, etc., or any combination thereof, to obtain at least one of CT images, MR images, PET images, SPECT images and combined images of the target object.
  • CT equipment obtains scan data according to the differences in X-ray absorption and transmittance among different tissues of the human body, and then inputs the scan data into a computer to generate a cross-sectional or three-dimensional image of the inspected part.
  • MRI equipment obtains image data by examining the hydrogen element in the human body.
  • PET equipment uses radioactive tracers to obtain image data of scanned objects.
  • SPECT equipment obtains photons through radioactive tracers and converts them into electrical signals to obtain image data. It can be understood that the above description of the imaging device is for illustration purposes only, and is not intended to limit the scope of this specification.
  • the surgical robot 120 may be used to perform terminal operations (eg, ablation, puncture, suturing, etc.) on the target object.
  • the surgical robot 120 may include a surgical execution arm structure, and the end of the surgical execution arm has a fixing structure for fixing functional components (eg, ablation needles, puncture needles, etc.) and other surgical equipment.
  • For example, see the surgical robot 620 in FIG. 6 and its related descriptions, which will not be repeated here.
  • the processing device 130 can guide the surgical robot 120 to perform a corresponding operation (for example, a puncture operation) through remote operation control.
  • the processing device 130 can be electrically connected to the end of the robot arm (for example, the end of the surgical execution arm 623) through a communication device (for example, the network 160), and is used to control the end of the robot arm to drive functional components (for example, ablation needle, puncture needle, etc.) to perform synchronous operations.
  • the processing device 130 may drive the puncture needle to perform corresponding operations by controlling the rotation and translation of the end of the robot arm.
  • the processing device 130 may drive the puncture needle to implement the puncture operation by controlling the end of the robotic arm to advance.
  • In some embodiments, the surgical robot 120 can include a mechanical arm body, which is used to drive the end of the robotic arm to move, so as to control and/or adjust the operation and/or posture (such as angle, position, etc.) of the end of the robotic arm.
  • the processing device 130 can process data and/or information acquired from the imaging device 110 , the surgical robot 120 , the terminal device 140 , the storage device 150 or other components of the image-guided interventional puncture system 100 .
  • the processing device 130 may acquire the first motion state (for example, motion, static, etc.) of the imaging device 110, and analyze and process it to determine the corresponding second motion state (such as motion, static, etc.) of the surgical robot 120 and/or its trajectory.
  • the processing device 130 may acquire a current image of the target object (for example, a CT scan image) from the imaging device 110 and analyze and process it, so as to control the surgical robot 120 to guide and adjust the puncture needle.
  • processing device 130 may be local or remote.
  • the processing device 130 can access information and/or data from the imaging device 110 , the surgical robot 120 , the terminal device 140 and/or the storage device 150 through the network 160 .
  • In some embodiments, the processing device 130 and the imaging device 110 may be integrated into one device. In some embodiments, the processing device 130 and the imaging device 110 may be connected directly or indirectly, and work together to implement the methods and/or functions described in this specification.
  • the processing device 130 and the surgical robot 120 may be integrated. In some embodiments, the processing device 130 and the surgical robot 120 may be connected directly or indirectly, and work together to implement the methods and/or functions described in this specification.
  • the processing device 130 may be a control module in the surgical robot 120 shown in FIG. 7 .
  • the imaging device 110 , the surgical robot 120 and the processing device 130 may be integrated, for example, the imaging device 610 , the surgical robot 620 and the control module 630 in the image-guided interventional puncture system 600 .
  • In some embodiments, the imaging device 110, the surgical robot 120, and the processing device 130 may be directly or indirectly connected to jointly implement the methods and/or functions described in this specification. For more relevant content, please refer to the descriptions of FIG. 6 and FIG. 7, which will not be repeated here.
  • processing device 130 may include input devices and/or output devices. Through the input device and/or the output device, the interaction with the user (for example, displaying the motion state information of the imaging device 110 and/or the surgical robot 120, etc.) can be realized.
  • the input device and/or output device may include a display screen, keyboard, mouse, microphone, etc. or any combination thereof.
  • the terminal device 140 may be connected to and/or communicate with the imaging device 110 , the surgical robot 120 , the processing device 130 and/or the storage device 150 .
  • the terminal device 140 can obtain and display the current image of the target object from the imaging device 110, so that the user can monitor the actual puncture area of the puncture needle in real time.
  • the terminal device 140 may include a mobile device 141, a tablet computer 142, a notebook computer 143, etc. or any combination thereof.
  • the terminal device 140 (or all or part of its functions) may be integrated in the imaging device 110 or the processing device 130 .
  • Storage device 150 may store data, instructions and/or any other information.
  • the storage device 150 can store data obtained from the imaging device 110, the surgical robot 120, and/or the processing device 130 (for example, the current image of the target object, the motion state of the imaging device 110 and/or the surgical robot 120, trajectory, preset process, etc.).
  • the storage device 150 may store computer instructions for implementing the image-guided interventional puncture method.
  • the storage device 150 may include one or more storage components, and each storage component may be an independent device or a part of other devices.
  • the storage device 150 may include random access memory (RAM), read only memory (ROM), mass storage, removable memory, volatile read-write memory, etc., or any combination thereof.
  • Exemplary mass storage may include magnetic disks, optical disks, solid state disks, and the like.
  • storage device 150 may be implemented on a cloud platform.
  • Network 160 may include any suitable network capable of facilitating the exchange of information and/or data, eg, a wireless network, a wired network.
  • at least one component of the image-guided interventional puncture system 100 can communicate with at least one other component in the system 100 through the network 160.
  • Components exchange information and/or data.
  • the processing device 130 may acquire the planned image and/or the current image of the target object from the imaging device 110 through the network 160 .
  • the image-guided interventional puncture system 100 is provided for illustrative purposes only and is not intended to limit the scope of this description.
  • various modifications or changes can be made according to the description in this specification.
  • the image-guided interventional puncture system 100 may implement similar or different functions on other devices. However, these changes and modifications do not depart from the scope of this specification.
  • Fig. 2 is a block diagram of a processing device according to some embodiments of the present specification.
  • the processing device 130 may include an acquisition unit 210 , a control unit 220 and a detection unit 230 .
  • the obtaining unit 210 may be used to obtain data and/or information related to components in the image-guided puncture system.
  • the obtaining unit 210 may be used to obtain preset procedures, image data, etc. stored in the storage device 150 .
  • the acquiring unit 210 can be used to acquire access requests, interrupt requests, end signals, displacement data (eg, linear velocity and angular velocity of each part of the surgical robot), position data, motion trajectory, etc. of imaging equipment or surgical robots.
  • the acquiring unit 210 may be used to acquire an initial motion state of an imaging device (eg, imaging device 110 ) and/or a surgical robot (eg, surgical robot 120 ). In some embodiments, the acquisition unit 210 may be configured to acquire the first end signal generated by the imaging device when the current preset procedure is ended, or the second end signal generated by the surgical robot when the current preset procedure is ended. In some embodiments, the obtaining unit 210 may be used to obtain access requests and interrupt requests sent by imaging devices and/or surgical robots. In some embodiments, the acquiring unit 210 can be used to acquire environment information.
  • the control unit 220 can be used to control the components (eg, the imaging device 110 , the surgical robot 120 , the terminal device 140 ) in the image-guided interventional puncture system (eg, the image-guided interventional puncture system 100 ).
  • the control unit 220 may be used to control the imaging device 110 to scan the target object to obtain image data of the target object.
  • the control unit 220 may be used to control the surgical robot 120 to perform a puncture operation on the target object.
  • control unit 220 can be used to control the target motion state of the surgical robot and/or the imaging device according to the initial motion state of the imaging device and/or the surgical robot. In some embodiments, the control unit 220 can be used to control the second motion state of the surgical robot according to the first motion state of the imaging device; and/or control the first motion state of the imaging device according to the second motion state of the surgical robot .
  • In some embodiments, the control unit 220 can be used to control the surgical robot to keep still when it is determined, according to the first motion state of the imaging device, that the imaging device is moving; and/or to control the imaging device to keep still when it is determined, according to the second motion state of the surgical robot, that the surgical robot is moving.
  • control unit 220 may be configured to determine a first motion track of the imaging device according to the first motion state of the imaging device, and control the surgical robot to move according to the first motion track.
  • control unit 220 can be used to predict the distance between the imaging device and the surgical robot according to the first movement track of the imaging device, and control the imaging device and the surgical robot to keep still when the distance is less than a distance threshold.
  • the control unit 220 can be used to plan a second motion trajectory of the surgical robot according to the first motion trajectory, and control the surgical robot to move according to the second motion trajectory.
  • control unit 220 can be used to control the movement speed of the surgical robot and/or the imaging device according to the initial motion state of the imaging device and/or the surgical robot. In some embodiments, the control unit 220 can be used to control the movement speed of the surgical robot and/or the imaging device according to the environment information.
  • control unit 220 can be used to control the imaging device and/or the surgical robot to enter the next procedure and/or its motion state according to the first end signal or the second end signal. In some embodiments, the control unit 220 may be configured to control the imaging device to keep still and/or release the static state of the surgical robot according to the first end signal. In some embodiments, the control unit 220 can be configured to control the surgical robot to keep still and/or release the still state of the imaging device according to the second end signal.
  • control unit 220 may be configured to control the imaging device and the surgical robot to enter an integrated working mode according to the access request of the surgical robot or the imaging device. In the integrated working mode, the motion states of the imaging equipment and the surgical robot are interrelated. In some embodiments, the control unit 220 can be used to control the imaging device and the surgical robot to enter an independent working mode according to an interrupt request or a fault detection result. In the independent working mode, the motion states between the imaging equipment and the surgical robot are independent of each other.
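A minimal sketch of the working-mode transitions handled by the control unit described above (mode and event names are hypothetical placeholders, not terms from the application):

```python
from enum import Enum

class Mode(Enum):
    INDEPENDENT = "independent"  # motion states independent of each other
    INTEGRATED = "integrated"    # motion states interrelated (interlocked)

def next_mode(current: Mode, event: str) -> Mode:
    """Transition between working modes on the events described above:
    an access request enters the integrated mode; an interrupt request
    or a detected fault returns the system to the independent mode."""
    if event == "access_request":
        return Mode.INTEGRATED
    if event in ("interrupt_request", "fault_detected"):
        return Mode.INDEPENDENT
    return current  # unrecognized events leave the mode unchanged
```

This mirrors the text: interlock control only applies while the devices are in the integrated working mode.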
  • For more information on controlling imaging equipment and/or surgical robots, please refer to FIGS. 3-6 and their related descriptions.
  • the detection unit 230 can be used for fault detection of components in the image-guided interventional puncture system.
  • the detection unit 230 may be used to detect the terminal device 140 to determine whether it can normally display the current image of the target object and so on.
  • the detection unit 230 can be used to detect the imaging device 110 to determine whether it can work normally, such as whether the scanning bed can move normally, whether the scanning frame can scan the target object, and so on.
  • the detection unit 230 can be used to detect the imaging equipment and the surgical robot respectively, and when a failure of the equipment is detected, a detection signal is immediately generated and sent to the control unit 220 .
  • the detection unit 230 can be used to detect the connection relationship between the imaging device and the surgical robot, and when the connection relationship is abnormal, generate a feedback signal and send it to the control unit 220 .
  • the control unit 220 can be used to control the imaging device and the surgical robot to enter an independent working mode or keep still at the same time in response to a failure of the imaging device or the surgical robot, or an abnormal connection relationship.
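The detection-unit responses described above can be sketched as a simple dispatch (an illustrative sketch; the function name and return values are assumptions, not from the application): an abnormal connection halts both devices at once, while a device fault drops the system back to the independent working mode.

```python
def safety_response(connection_ok: bool, device_fault: bool) -> dict:
    """Sketch of the safety policy: abnormal link -> halt both devices;
    device fault -> enter the independent working mode; otherwise no action."""
    if not connection_ok:
        return {"action": "halt_both"}
    if device_fault:
        return {"action": "enter_independent_mode"}
    return {"action": "none"}
```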
  • each unit in the above-mentioned processing device 130 may be fully or partially implemented by software, hardware or a combination thereof.
  • the above units may be embedded in or independent of the processing device 130 in the form of hardware, and may also be stored in the memory of the processing device 130 in the form of software, so that the processing device 130 can call and execute the operations corresponding to the above modules.
  • Fig. 3 is a schematic flowchart of a device control method for image-guided interventional puncture according to some embodiments of the present specification.
  • the process 300 may be performed by the image-guided interventional puncture system 100 (eg, the processing device 130 ). As shown in Figure 3, the process 300 may include the following steps:
  • Step 310: Acquire the initial motion state of the imaging device and/or the surgical robot.
  • step 310 may be performed by the obtaining unit 210 .
  • the initial motion state may reflect the current motion state of the imaging device and/or the surgical robot.
  • the initial motion state may include that the imaging device or the surgical robot is moving, and the imaging device or the surgical robot remains stationary.
  • the initial motion state may include motion data (eg, motion speed, motion direction, motion acceleration, etc.) of the imaging device or surgical robot.
  • For example, the initial motion state may include data such as the imaging device moving at a constant speed of 1 cm/s, accelerating, or decelerating, and data such as the linear and angular velocities of components of the surgical robot (for example, robotic arms, joints, etc.).
  • the initial motion state may be obtained based on the motion state information fed back by the imaging device and/or the surgical robot.
  • the imaging device 110 may acquire current movement speed, position and other information through a position sensor or a speed sensor, etc., and generate first movement state information and transmit it to the processing device 130 .
  • the surgical robot 120 may obtain whether the surgical robot is currently moving or stationary by reading a preset process in the processor, and generate the second motion state information and transmit it to the processing device 130 .
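As a non-authoritative illustration of how first motion state information might be derived from sensor feedback (function name, units and threshold are assumptions), two successive position samples from, e.g., a scanning-bed position sensor suffice to report whether the device is moving and at what speed:

```python
def first_motion_state(pos_mm: float, prev_pos_mm: float, dt_s: float) -> dict:
    """Derive a motion-state message from two position samples:
    estimated speed, plus a moving/still flag for the interlock logic."""
    speed = (pos_mm - prev_pos_mm) / dt_s  # finite-difference velocity estimate
    return {"moving": abs(speed) > 1e-6, "speed_mm_per_s": speed}
```

The resulting message is what the processing device 130 would consume when deciding the other device's target motion state.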
  • Step 320: Control the target motion state of the surgical robot and/or the imaging device according to the initial motion state. In some embodiments, step 320 may be performed by the control unit 220.
  • the target motion state may refer to the motion state and/or motion trajectory expected to be realized by the surgical robot and/or the imaging device.
  • For example, suppose the initial motion states of the imaging device and the surgical robot are a static state and a motion state, respectively. If the surgical robot has retreated to the preset height after the operation, the surgical robot needs to remain in a static state while the scanning bed of the imaging device starts to move to send the patient out of the aperture. At this time, the target motion states of the surgical robot and the imaging device are to maintain a static state and a motion state, respectively.
  • the preset height may be a height set in advance.
  • According to the initial motion state, there are multiple implementations for controlling the target motion state of the surgical robot and/or the imaging device. For example, intelligent control can be performed through a preset process; as another example, manual control can be used.
  • the process of controlling the target motion state of the surgical robot and/or the imaging device according to the initial motion state of the imaging device and/or the surgical robot may also be referred to as interlock control.
  • controlling the target motion state of the surgical robot and/or the imaging device according to the initial motion state may include: controlling the second motion state of the surgical robot according to the first motion state of the imaging device; and/or controlling the first motion state of the imaging device according to the second motion state of the surgical robot.
  • the first motion state may include information such as whether the imaging device is currently in motion or still, current linear velocity, and historical linear velocity.
  • the second motion state may include information such as whether the surgical robot is currently moving or stationary, the angular velocity and the linear velocity of the surgical manipulator arm.
  • the first motion state of the imaging device and the second motion state of the surgical robot can be determined manually, by installing sensors for detection, or by other methods.
  • the first motion state of the imaging device and the second motion state of the surgical robot can be determined through a preset process of the system.
  • according to the first motion state of the imaging device, when it is determined that the imaging device is moving, the surgical robot is controlled to keep still; similarly, according to the second motion state of the surgical robot, when it is determined that the surgical robot is moving, the imaging device is controlled to keep still.
  • keeping still may include controlling the moving parts of the imaging device or the surgical robot to be still, or locking the moving parts.
  • the scanning bed of the imaging device can be locked.
  • the surgery performing arm of the surgical robot 120 may be locked.
  • after a moving part is locked, its movement cannot be controlled even if the corresponding control button is operated, and it can move again only after being unlocked.
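The interlock behavior described above can be sketched as a small state check. This is a minimal illustration only; `Device`, `interlock`, and the lock flags are hypothetical names, not part of the disclosed system:

```python
class Device:
    """Minimal stand-in for an imaging device or surgical robot."""
    def __init__(self, name):
        self.name = name
        self.moving = False
        self.locked = False

def interlock(imaging, robot):
    """Keep one device locked while the other is in motion."""
    if imaging.moving:
        robot.locked = True       # lock the surgical execution arm
    elif robot.moving:
        imaging.locked = True     # lock the scanning bed
    else:
        # neither device is moving: both may be unlocked
        imaging.locked = robot.locked = False

ct = Device("imaging")
arm = Device("robot")
ct.moving = True
interlock(ct, arm)
print(arm.locked)  # True while the scanner moves
```

While a part is locked this way, a control-button press would simply be ignored until the lock flag is cleared.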
  • when the imaging device moves, the surgical robot can be controlled to move according to the motion information of the imaging device.
  • the processing device 130 can determine the real-time position of the imaging device based on the displacement data of the imaging device 110, and when the imaging device is located in the target area or the distance from the edge of the target area is small, control the surgical robot to keep still or avoid the area.
  • the target area may be an area where the surgical robot and the imaging device may collide.
  • target areas can be predicted based on historical motion data of surgical robots and imaging equipment.
  • the first motion track of the imaging device may be determined according to the first motion state of the imaging device, and the surgical robot is controlled to move according to the first motion track.
  • the first motion track may refer to a motion path of the imaging device in a three-dimensional space for a period of time.
  • for example, when the scanning table of the imaging device sends the target object into the aperture of the gantry for scanning, the first movement track of the imaging device is the route along which the scanning table moves from its current position to the aperture.
  • the trajectory shape of the first movement trajectory is not limited, and may be a straight line, or other shapes such as an arc.
  • the first motion track of the imaging device can be determined in various ways.
  • the first motion trajectory of the imaging device can be determined by establishing a three-dimensional space coordinate system, and determining information such as its motion speed and position according to the first motion state of the imaging device.
  • the first motion track can be obtained directly from the imaging device.
  • the distance between the imaging device and the surgical robot can be predicted according to the first motion track of the imaging device, and when the distance is less than a distance threshold, the imaging device and the surgical robot are controlled to keep still.
  • the distance threshold may refer to a minimum allowable distance.
  • the distance between the imaging device and the surgical robot can be predicted in various ways, including but not limited to program algorithms.
  • the processing device 130 can determine the current position of the imaging device (for example, its three-dimensional space coordinates) according to the first motion trajectory, use a program algorithm to calculate the position of the imaging device at the next moment based on the current movement speed, and compare that position with the position of the surgical robot to determine the distance between the imaging device and the surgical robot.
  • the distance between the imaging device and the surgical robot can be acquired by a sensor.
  • for example, infrared sensors, laser sensors, ultrasonic sensors, etc. can be installed on the surgical robot to obtain the distance between the imaging device and the surgical robot.
  • the installation position and/or type of the sensor can be set according to actual needs, for example, on the scanning frame of the imaging device, on the side edge of the scanning bed, etc., which is not limited in this specification.
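The predict-then-compare logic above can be sketched as follows, assuming a constant-velocity extrapolation over one control step; the function names, the time step, and the threshold value are all illustrative, not values from the disclosure:

```python
import math

def predict_position(pos, velocity, dt):
    """Extrapolate the next position assuming constant velocity (a simplification)."""
    return tuple(p + v * dt for p, v in zip(pos, velocity))

def should_stop(imaging_pos, imaging_vel, robot_pos, dt=0.1, threshold=0.5):
    """Return True when the predicted imaging-device position comes too close
    to the surgical robot, i.e. the distance falls below the threshold."""
    nxt = predict_position(imaging_pos, imaging_vel, dt)
    return math.dist(nxt, robot_pos) < threshold

# scanning bed moving along x toward the robot (positions in metres)
print(should_stop((0.0, 0.0, 0.0), (4.0, 0.0, 0.0), (0.45, 0.0, 0.0)))  # True
```

When `should_stop` returns `True`, both devices would be commanded to keep still, matching the distance-threshold rule in the text.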
  • the second motion trajectory of the surgical robot can be planned according to the first motion trajectory; and the surgical robot can be controlled to move according to the second motion trajectory.
  • the second motion trajectory may refer to the motion path of the surgical robot in three-dimensional space over a period of time. For example, after the imaging device finishes scanning the target object, the surgical robot will perform local anesthesia; the surgical robot then needs to move from its current position to the operable area, and the path from the current position to the operable area can be considered the second motion trajectory of the surgical robot.
  • the trajectory shape of the second motion trajectory can also have various types, such as straight line, arc and so on.
  • the operable area may refer to an area where the surgical robot can perform relevant surgical operations.
  • the motion position of the imaging device at each moment may be determined according to the first motion trajectory, and the second motion trajectory of the surgical robot is planned based on the motion position, so as to avoid collision between the surgical robot and the imaging device.
  • the processing device 130 may determine the spatial coordinates of the imaging device at different moments based on the first motion trajectory, and plan the movement position of the surgical robot according to those spatial coordinates so that the distance between the surgical robot and the imaging device stays greater than a preset value, thereby determining the second motion trajectory of the surgical robot.
  • the motion speed of the surgical robot and/or the imaging device can be controlled according to the initial motion state of the imaging device and/or the surgical robot.
  • the real-time distance between the imaging device and the surgical robot can be determined according to the first motion state of the imaging device and the second motion state of the surgical robot. When the real-time distance is greater than a preset threshold (for example, 1 meter or 2 meters), the surgical robot or imaging device is controlled to move at a first speed; when the real-time distance is less than or equal to the preset threshold, the surgical robot or imaging device is controlled to move at a second speed, wherein the first speed is greater than the second speed.
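The two-speed rule above reduces to a simple threshold comparison. The numeric speeds below are hypothetical placeholders, not values specified in the disclosure:

```python
def select_speed(distance, threshold=1.0, fast=0.2, slow=0.05):
    """Pick a motion speed (m/s) from the real-time distance between devices.

    Far apart -> move at the first (faster) speed; at or inside the
    threshold -> drop to the second (slower) speed.
    """
    return fast if distance > threshold else slow

print(select_speed(2.0))   # 0.2  (first speed)
print(select_speed(0.5))   # 0.05 (second speed)
```

In practice the threshold (1 meter, 2 meters, etc.) would be chosen from the geometry of the gantry and the surgical execution arm.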
  • the processing device 130 can determine the real-time position of the surgical robot 120 according to the second motion state of the surgical robot 120, and determine the distance between the two based on the position of the imaging device 110 and the position of the surgical robot 120.
  • the processing device 130 may determine the first position of the imaging device 110 according to the first motion state and the second position of the surgical robot 120 according to the second motion state, and determine the distance between the two based on the first position and the second position at the same moment.
  • a sensor may be used to determine the real-time distance between the two, which is not limited in this description.
  • the real-time distance between the imaging device 110 and the surgical robot 120 can be determined through a radar sensor installed on the imaging device 110 or the surgical robot 120 .
  • control of the movement speed of the surgical robot and/or the imaging device may be consistent or different, depending on the actual situation.
  • the movement speed of the surgical robot and/or the imaging equipment can also be controlled according to the environment information.
  • the environment information may refer to relevant information in the area where the imaging device and the surgical robot are located.
  • the environment information can be the position information of the surgical robot and imaging equipment, or the action information of the target object (for example, moving the body, raising the hand, etc.), or any object contained in the surrounding area of the equipment (for example, people, other equipment, etc.).
  • Environmental information can be obtained in many ways. For example, it may be acquired through a sensor, or may be acquired through a camera device, etc., and the acquisition method is not limited.
  • the movement speed of the surgical robot and/or the imaging equipment can also be controlled by means of manual speed regulation, program automatic speed regulation, and the like.
  • for example, when the target object raises a hand, the program can control the surgical robot to reduce its speed to avoid a collision that would injure the target object; after the target object's hand returns to its original position, the program can control the surgical robot to resume its original speed, saving equipment running time and improving surgical efficiency.
  • the movement speed of the surgical robot (for example, the moving speed of the surgical execution arm) can be increased until an object is detected in the preset area, and the speed of the surgical robot is controlled to decrease (for example, down to less than or equal to initial velocity, or down to 0).
  • the imaging device can be controlled to stop moving at the same time.
  • in some embodiments, the method may include: acquiring the first end signal generated by the imaging device when the current preset process ends, or the second end signal generated by the surgical robot when the current preset process ends; and controlling the imaging equipment and/or surgical robot to enter the motion state of the next process according to the first end signal or the second end signal.
  • the preset process is a work task process preset by the system.
  • the complete preset process can be: scanning → local anesthesia puncture → feeding the patient into the aperture → puncture → moving the patient out of the aperture → end, wherein scanning, feeding the patient into the aperture, and moving the patient out of the aperture are performed by the imaging equipment, and local anesthesia puncture and puncture are performed by the surgical robot.
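The preset process above can be modeled as an ordered list of steps, each paired with the device that performs it. This is an illustrative sketch; the step names mirror the text but the data structure and function are assumptions:

```python
# Each step of the preset process paired with the device that performs it.
PRESET_PROCESS = [
    ("scanning", "imaging"),
    ("local anesthesia puncture", "robot"),
    ("patient feeding into the aperture", "imaging"),
    ("puncture", "robot"),
    ("patient moving out of the aperture", "imaging"),
]

def next_step(current):
    """Return the (step, device) that follows `current`, or None at the end."""
    names = [s for s, _ in PRESET_PROCESS]
    i = names.index(current)
    return PRESET_PROCESS[i + 1] if i + 1 < len(PRESET_PROCESS) else None

print(next_step("scanning"))  # ('local anesthesia puncture', 'robot')
```

An end signal from the device that owns the current step would trigger the transition to `next_step(current)`.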
  • the current preset procedure may refer to preset procedure information corresponding to the current stage, such as procedure information such as scanning the target object and performing puncture surgery on the target object.
  • the system can automatically determine the corresponding preset procedure according to the patient information, or the doctor can manually set the preset procedure for the current target object.
  • the first end signal may refer to a feedback signal generated when the imaging device completes the current preset process. For example, after the imaging device 110 controls the scanning bed to send the target object into the aperture of the gantry, the first end signal may be generated. For another example, after the imaging device 110 finishes scanning the target object, it may generate a first end signal.
  • the second end signal may refer to a feedback signal generated when the surgical robot completes the current preset procedure. For example, after the surgical robot 120 finishes puncturing the target subject under local anesthesia, it may generate a second end signal. For another example, after the surgical robot 120 completes the puncture operation on the target object, it may generate a second end signal.
  • the first end signal and the second end signal may have various forms of expression. For example, warning sounds and indicator lights can be used to reflect that the imaging equipment and surgical robot have completed the current preset process.
  • the first end signal and the second end signal may be code information generated in the processing device.
  • the first end signal and the second end signal may include the execution content of the current preset process.
  • the first end signal may display content such as "the target object has been sent to the aperture" on the display interface in the form of a pop-up window.
  • the imaging device and/or the surgical robot can send the first end signal or the second end signal through the data transmission channel.
  • the imaging device 110 or the surgical robot 120 may transmit the first end signal or the second end signal through the second transmission channel.
  • for details about the second transmission channel, please refer to FIG. 7 and its description.
  • the next process may refer to a work task corresponding to a next stage after the current preset process is completed. For example, after the imaging device finishes scanning the target object, the next process may be performing local anesthesia and puncture on the target object by the surgical robot.
  • the imaging device can be controlled to remain still (for example, control the moving parts of the imaging device to enter a locked state) according to the first end signal, and/or release the static state of the surgical robot (for example, release the moving parts of the surgical robot). lock).
  • the surgical robot can be controlled to keep still according to the second end signal, and/or the imaging device can be released from the still state. For example, when the imaging device 110 finishes scanning the target object, a first end signal is generated, and according to the first end signal, the control unit 220 can control the scanning table of the imaging device 110 to enter a locked state, and unlock the moving parts of the surgical robot 120 state, so that the surgical robot can complete the local anesthesia operation.
  • similarly, when the surgical robot 120 completes the local anesthesia operation, a second end signal is generated. According to the second end signal, the control unit 220 can control the moving parts of the surgical robot 120 to enter a locked state and unlock the scanning couch of the imaging device 110, so that the scanning couch can move the target object out of the gantry aperture.
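The end-signal handling described above amounts to swapping which device is locked. A minimal sketch follows; the signal names and the `locks` mapping are illustrative assumptions:

```python
def on_end_signal(signal, locks):
    """Swap which device is locked when an end signal arrives.

    `locks` maps device name -> locked flag. A first end signal (imaging
    finished its step) locks the imaging device and frees the robot; a
    second end signal (robot finished) does the opposite.
    """
    if signal == "first_end":          # imaging finished its step
        locks["imaging"], locks["robot"] = True, False
    elif signal == "second_end":       # robot finished its step
        locks["imaging"], locks["robot"] = False, True
    return locks

locks = {"imaging": False, "robot": True}
print(on_end_signal("first_end", locks))  # {'imaging': True, 'robot': False}
```

This guarantees that at every hand-over between processes exactly one device is free to move, which is the interlocking property the text emphasizes.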
  • the movement of the surgical robot can be controlled based on patient information. For example, according to different body types, different displacements are preset for the surgical arm (for example, those with more fat have larger displacements, etc.), and when the puncture operation is performed, the body shape of the current target object is obtained, and the operation is controlled according to the preset corresponding relationship. Execute the arm movement to perform the piercing operation on the target object.
  • the imaging device and the surgical robot can be controlled to enter an integrated working mode according to the access request of the surgical robot or the imaging device.
  • the target motion state of the surgical robot and/or the imaging device is controlled according to the initial motion state of the imaging device and/or the surgical robot.
  • the imaging device and the surgical robot can be controlled to enter an independent working mode according to the interrupt request. For details about the integrated working mode and the independent working mode, please refer to FIG. 4 and its related descriptions, which will not be repeated here.
  • the unexpected relative motion between the imaging device and the surgical robot can be effectively solved, and damage to the device and physical injury to the patient can be avoided.
  • the motion state of the imaging equipment and the surgical robot is controlled after the current preset process is completed, so as to ensure the interlocking control of the motion state when the preset process is switched, avoiding unexpected relative motion, and further reducing the risk of image guidance. Risk of interfering with the piercing system.
  • Fig. 4 is a schematic diagram of the working mode of the image-guided interventional puncture system according to some embodiments of the present specification.
  • the image-guided interventional puncture system 400 can include an integrated working mode and an independent working mode, and the switching of the working modes can be realized through a connection interface (for example, a first interlocking interface).
  • the motion states of the imaging equipment and the surgical robot are interrelated.
  • the independent working mode the imaging equipment and the surgical robot can work independently, and their motion states are independent of each other without interlocking control.
  • for example, in the integrated working mode, the surgical robot enters the locked state when the imaging device moves, and the imaging device enters the locked state when the surgical robot moves; in the independent working mode, the connection between the imaging device and the surgical robot is disconnected, and the movements of the two do not affect each other.
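The mode switching driven by access and interrupt requests can be sketched as a toy controller; the class and request names below are illustrative, not part of the disclosed interface:

```python
class Controller:
    """Toy controller tracking the working mode of the puncture system."""
    def __init__(self):
        self.mode = "independent"

    def handle(self, request):
        # access requests join the devices; interrupt requests split them
        if request in ("first_access", "second_access"):
            self.mode = "integrated"
        elif request in ("first_interrupt", "second_interrupt"):
            self.mode = "independent"
        return self.mode

c = Controller()
print(c.handle("first_access"))      # integrated
print(c.handle("second_interrupt"))  # independent
```

Either device may issue either request, which matches the symmetric behavior described for the imaging device and the surgical robot.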
  • the imaging device or the surgical robot may send an access request to the controller (for example, the processing device 130, the control module 630), and the controller establishes a connection relationship between the imaging device and the surgical robot according to the access request, so as to Control the imaging equipment and surgical robot to enter the integrated working mode.
  • the access request may be an instruction signal requesting access sent by a surgical robot or an imaging device.
  • after the control module 630 receives the first access request from the imaging device 610 or the second access request from the surgical robot 620, it can establish connection channels with the imaging device 610 and the surgical robot 620 respectively, so as to use the control module 630 as the intermediary of data transmission and establish the connection relationship between the imaging device 610 and the surgical robot 620 .
  • the imaging device may send a first access request to the surgical robot, and after the surgical robot accepts the first access request, the surgical robot and the imaging device enter an integrated working mode.
  • the surgical robot may send a second access request to the imaging device, and after the imaging device accepts the second access request, the surgical robot and the imaging device enter an integrated working mode.
  • an imaging device or a surgical robot may send a first access request or a second access request during the preparation phase of the image-guided interventional puncture system.
  • the first access request or the second access request may be generated in response to a user operation.
  • the user may connect the communication cable of the imaging device 110 to the interface board of the controller of the surgical robot 120, and the imaging device 110 generates a first access request after detecting that the communication cable is connected.
  • the surgical robot 120 may generate a second access request in response to the user's operation of the working mode switching button on the surgical robot 120 .
  • the first access request or the second access request may be generated based on preset conditions. For example, after the imaging device 110 and the surgical robot 120 are connected through hardware, the imaging device 110 or the surgical robot 120 may generate an access request when the connection duration reaches a preset time threshold.
  • the imaging equipment can be used to guide the surgical robot to perform the puncture operation, so that the integrated working mode can be used during puncture surgery.
  • interlocking control can be used to lock the surgical robot while the imaging equipment is scanning (for example, controlling the moving parts of the surgical robot to remain stationary or locked), and to lock the imaging equipment while the surgical robot performs puncture surgery (for example, controlling the moving parts of the imaging equipment to remain stationary or locked).
  • the imaging device or the moving parts of the surgical robot can be controlled based on a preset process or the motion state of the imaging device/surgical robot.
  • the processing device 130 can lock the moving parts of the surgical robot 120 .
  • the processing device 130 may control the imaging device 110 to keep still and/or lock its moving parts, and unlock the moving parts of the surgical robot 120 at the same time.
  • the imaging device or the surgical robot may send an interrupt request (for example, a first interrupt request or a second interrupt request) to the controller (for example, the processing device 130 or the control module 630), and the controller interrupts the connection relationship between the imaging device and the surgical robot according to the interrupt request, so as to control the imaging equipment and the surgical robot to enter the independent working mode.
  • the interruption request may refer to an instruction signal requesting interruption sent by a surgical robot or an imaging device.
  • the interrupt request may include a third-party instruction, such as an operation instruction input by a user.
  • when the imaging device 610 needs to exit the integrated working mode, it can send a first interrupt request to the control module 630, and the control module 630 interrupts the connection relationship between the imaging device 610 and the surgical robot 620 after receiving the first interrupt request.
  • the control module 630 may interrupt the connection channels with the imaging device 610 and the surgical robot 620 respectively, thereby interrupting the connection relationship between the imaging device 610 and the surgical robot 620 .
  • when the surgical robot 620 needs to exit the integrated working mode, the surgical robot 620 sends a second interrupt request to the control module 630, and the control module 630 interrupts the connection relationship between the surgical robot 620 and the imaging device 610 after receiving the second interrupt request.
  • the imaging device may send a first interruption request to the surgical robot (for example, after the intervention-guided operation ends), and the surgical robot interrupts the connection after accepting the first interruption request, and the surgical robot and the imaging device enter an independent working mode.
  • the surgical robot may send a second interrupt request to the imaging device, and the imaging device interrupts the connection relationship after receiving the second interrupt request, and the surgical robot and the imaging device enter an independent working mode.
  • in the independent working mode, the surgical robot is withdrawn, and the imaging equipment can be used alone to perform scanning operations (for example, clinical image scanning).
  • by interrupting the connection between the imaging equipment and the surgical robot, the surgical robot can be evacuated in time after the operation, and the imaging equipment can work independently without affecting the normal operation of the system, thereby improving the efficiency and user experience of the image-guided interventional puncture system.
  • the imaging device and the surgical robot can be controlled to enter a failure mode.
  • the failure mode may include simultaneously controlling the imaging device and the moving parts of the surgical robot to keep still, and/or locking the imaging device and the moving parts of the surgical robot.
  • connection relationship between the imaging device and the surgical robot can be detected; when the connection relationship is abnormal, the imaging device and the surgical robot are controlled to keep still.
  • the connection relationship can be a hardware connection relationship or a software connection relationship. If abnormal conditions are detected in the connection relationship, it indicates that there may be a fault in the system, and continuing the surgical operation may lead to a dangerous situation. Therefore, it is necessary to send control signals to the imaging equipment and the surgical robot at the same time, and control the imaging equipment and the surgical robot to end the motion state, so as to avoid harm to the patient.
  • the control module 630 can detect the working status of the hardware interface and software interface of the imaging device 610 and the surgical robot 620, and when there is an abnormality in the interface cable or software, generate a control signal and send it to the imaging device 610 and the surgical robot 620, so as to control the imaging device 610 and the surgical robot 620 to enter a locked state.
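The connection-health check that triggers the failure mode can be sketched as follows. The `hw_ok`/`sw_ok` flags stand in for the interface-cable and software-channel checks, whose real implementation is device specific and not given by the disclosure:

```python
def check_connection(hw_ok, sw_ok, locks):
    """Lock both devices when the hardware or software link is abnormal.

    `locks` maps device name -> locked flag; locking both at once is the
    failure mode described in the text.
    """
    if not (hw_ok and sw_ok):
        locks["imaging"] = locks["robot"] = True   # enter failure mode
        return "failure"
    return "ok"

locks = {"imaging": False, "robot": False}
print(check_connection(True, False, locks))  # failure
print(locks)                                 # both devices locked
```

Sending one control signal that locks both devices simultaneously avoids the window in which only one side has stopped.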
  • the surgical robot or the imaging device can perform self-healing after entering the locked state. If the self-repair fails, it can be checked and repaired manually.
  • after restarting, the imaging device and surgical robot can directly enter the integrated working mode or the independent working mode, or can enter either mode based on an access request or interrupt request from the imaging device or surgical robot; the specific behavior can be set by the user.
  • the imaging device and the surgical robot can be controlled to enter a failure mode.
  • the patient image can be collected in real time through the camera, and when it is recognized from the collected images that the user has made an unexpected movement (for example, the preset standard posture is lying flat with both hands at the sides of the legs, but the user raises a hand or touches the puncture site), the imaging equipment and the surgical robot can be simultaneously controlled to keep still.
  • the control module may also control the imaging device and the surgical robot to enter an independent working mode in response to failure of the imaging device or the surgical robot.
  • Faults may include an abnormal connection between the imaging device and the surgical robot, a downtime of the surgical robot or the imaging device, and the like.
  • when the control module detects the failure, it can generate a control signal and send it to the imaging device 110 and the surgical robot 120 to control the imaging device 110 and the surgical robot 120 to forcibly disconnect and enter the independent working mode; at this time, the imaging device 110 can perform the scanning operation independently.
  • by detecting the connection relationship between the imaging equipment and the surgical robot, and locking the moving parts of both when the connection relationship is abnormal, unexpected relative motion is avoided, thereby reducing the impact of accidents on the image-guided interventional puncture system and improving its safety.
  • Fig. 5 is a schematic flowchart of a device control method for image-guided interventional puncture according to other embodiments of the present specification.
  • the process 500 may be performed by the image-guided interventional puncture system 100 (eg, the processing device 130 ). As shown in Figure 5, the process 500 may include the following steps:
  • Step 510, registration of the surgical robot.
  • step 510 may be performed by the processing device 130 .
  • registration can refer to matching the 3D space of the target object with the 3D space of the scanned image, so that both are unified into the same coordinate system.
  • through registration, the surgical robot associates the patient's 3D coordinates or scanned images into a unified coordinate system, so as to realize the transformation between the 3D space coordinates or scanned image coordinates and the coordinates of the surgical robot, determine the surgical location, and establish connection channels.
  • Step 520 control the surgical robot to keep still, and control the imaging equipment to scan the target object.
  • step 520 may be performed by processing device 130 .
  • the surgical robot may generate a second end signal and send it to the controller (eg, the processing device 130 ).
  • the controller controls the surgical robot to keep still, and controls the imaging device to scan the target object.
  • the processing device 130 may lock the surgical performing arm of the surgical robot 120 based on the second end signal, and control the imaging device 110 to scan the target object, so as to obtain a scanned image of the target object.
  • the processing device 130 may control the surgical robot 120 to move to a target position (for example, a position that will not interfere with the movement of the imaging device) and then stop, based on the second end signal, and lock the surgical performing arm. Then the imaging device 110 is controlled to scan the target object to obtain a scanned image of the target object.
  • Step 530 control the movement of the imaging device according to a preset process.
  • step 530 may be performed by processing device 130 .
  • the processing device 130 can control the imaging device to move or keep still according to a preset process.
  • the preset process includes local anesthesia puncture.
  • for example, after the imaging device 110 finishes scanning, it can generate a first end signal and send it to the processing device 130. After receiving the first end signal, the processing device 130 locks the scanning bed of the imaging device 110 and controls the surgical robot 120 to enter the workflow of local anesthesia puncture, so that the surgical arm performs local anesthesia puncture on the patient. After completing this workflow, the surgical robot 120 generates a second end signal and sends it to the processing device 130; the processing device 130 unlocks the scanning table according to the second end signal and controls the imaging device 110 to enter the workflow of feeding the patient into the aperture, so that the imaging device 110 carries the patient into the aperture of the gantry through the scanning table.
  • the processing device 130 may control the imaging device 110 to transport the target object to the aperture of the scanning frame through the scanning bed after the imaging device 110 finishes scanning according to the preset procedure, And locate the puncture level.
  • Step 540 according to the first end signal, control the imaging device to keep still, and release the static state of the surgical robot.
  • step 540 may be performed by processing device 130 .
  • the imaging device can generate a first end signal and send it to the controller (for example, the processing device 130), and the controller controls the moving parts of the imaging device according to the first end signal (for example, Scanning table) remains stationary and/or enters a locked state, and releases the locked state or static state for moving parts of the surgical robot.
  • the processing device 130 may lock the scanning bed and the scanning frame of the imaging device 110 based on the received end signal, and unlock the surgical performing arm of the surgical robot 120, so that the surgical performing arm enters the aperture of the scanning frame to perform the operation on the patient. Perform a piercing action.
  • Step 550: control the surgical robot to move according to the preset procedure.
  • In some embodiments, step 550 may be performed by the processing device 130.
  • The processing device 130 can send a control signal to the surgical robot according to the preset procedure; after receiving the control signal, the surgical robot controls the surgical execution arm to move, so that the arm enters the aperture of the scanning frame and performs the master-slave puncture action on the target object. Further, after the master-slave puncture action is completed, the surgical robot controls the surgical execution arm to move out of the aperture of the scanning frame according to the control instruction.
  • Step 560: control the surgical robot to remain stationary, and release the stationary state of the imaging device.
  • In some embodiments, step 560 may be performed by the processing device 130.
  • When the surgical robot ends the current preset procedure, it can generate a second end signal and send it to the controller (for example, the processing device 130); according to the second end signal, the controller controls a moving part of the surgical robot (for example, the surgical execution arm) to remain stationary and/or enter a locked state, and releases the stationary state of the imaging device.
  • For example, the processing device 130 may lock the surgical execution arm and unlock the scanning bed according to the second end signal, so that the imaging device 110 enters the next workflow (moving the patient out of the aperture) and moves the patient out of the aperture of the scanning frame on the scanning bed.
  • The processing device 130 can determine whether the puncture operation has ended; if it has, proceed to step 570; otherwise, return to step 530 and control the movement of the imaging device according to the preset process (for example, according to the next workflow).
  • Whether the puncture operation has ended can be determined according to the preset process.
  • Step 570: control the imaging device and the surgical robot to enter the independent working mode.
  • In some embodiments, step 570 may be performed by the processing device 130.
  • The processing device 130 may unlock the imaging device (e.g., the scanning bed) and/or the surgical robot (e.g., the surgical execution arm), and disconnect the connection relationship between the imaging device and the surgical robot, so that they enter the independent working mode.
  • In the embodiments above, the motion states of the imaging device and the surgical robot are controlled after each preset process is completed, which ensures interlocked control of the motion states when preset processes are switched and avoids unexpected relative motion.
  • In this way, the risk of the image-guided interventional puncture system can be further reduced.
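The alternating lock/unlock hand-off described in steps 530–570 can be sketched as a small state machine. This is a minimal illustrative sketch, not the disclosed implementation: the workflow step names and the `run_workflow` function are assumptions introduced for illustration.

```python
# Illustrative sketch of the interlocked workflow in steps 530-570: during
# each preset process exactly one device's moving parts are unlocked, and the
# other device stays locked until an end signal hands control over.

WORKFLOW = [
    ("scan", "imaging"),             # imaging device scans the target object
    ("local_anesthesia", "robot"),   # robot performs local-anesthesia puncture
    ("move_into_aperture", "imaging"),
    ("puncture", "robot"),           # master-slave puncture inside the aperture
    ("move_out_of_aperture", "imaging"),
]

def run_workflow(workflow):
    """Return (step, unlocked_device, locked_device) tuples, emulating how an
    end signal from the active device unlocks the other one for the next step."""
    states = []
    for step, active in workflow:
        # whichever device is not active has its moving parts locked
        locked = "robot" if active == "imaging" else "imaging"
        states.append((step, active, locked))
    return states

for step, active, locked in run_workflow(WORKFLOW):
    print(f"{step}: {active} unlocked, {locked} locked")
```

In every state exactly one device can move, mirroring the interlock that prevents unexpected relative motion when preset processes are switched.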
  • Fig. 6 is a schematic structural diagram of an image-guided interventional puncture system according to some embodiments of the present specification.
  • an image-guided interventional puncture system 600 may include an imaging device 610, a surgical robot 620 and a control module 630.
  • the imaging device 610, the surgical robot 620 and the control module 630 are structures or components similar to the imaging device 110, the surgical robot 120 and the processing device 130 in the image-guided interventional puncture system 100.
  • the control module 630 can be integrated into the surgical robot 620 and communicate with the imaging device 610, so as to control the imaging device 610 and the surgical robot 620.
  • the imaging device 610 may include a scanning frame 613 and a scanning bed 615.
  • the scanning frame 613 can be equipped with an X-ray tube, a beam filter, a collimator, a reference detector, a signal detector, electronic circuits and various moving parts.
  • the moving parts of the scanning frame 613 can drive the scanning frame to perform linear motion, rotational motion, forward and backward tilting motion and other motions.
  • the distance between the X-ray tube and the target object can be changed by the moving parts, and the tilt angle of the scanning frame 613 can be adjusted; the tilt angle can reach ±20° to ±30°.
  • the scanning bed 615 is the carrier of the target object.
  • the scanning bed 615 has vertical moving parts and horizontal longitudinal moving parts, and can automatically enter and exit the aperture of the scanning frame 613 according to a preset process, carrying the target object to a designated scanning position.
  • the surgical robot 620 may include a surgical execution arm 621, a surgical execution arm end 623, and a surgical device 625.
  • the surgical execution arm 621 can be used to support the surgical execution arm end 623 and carry it to a designated operation position.
  • the surgical execution arm end 623 can be used to fix the surgical device 625 and control the surgical device 625 to perform operations such as puncture, suture, and ablation.
  • the control module 630 can be used to control the target motion state of the surgical robot 620 and/or the imaging device 610 according to the initial motion state of the imaging device 610 and/or the surgical robot 620.
  • the control module 630 may be configured to control the surgical robot 620 to remain stationary when it is determined, according to the first motion state of the imaging device 610, that the imaging device 610 is moving. For example, when the imaging device 610 starts to move, it can output a feedback signal to the control module 630; after receiving the feedback signal, the control module 630 outputs a control signal to the surgical robot 620 to control the moving parts of the surgical robot 620 to enter the locked state.
  • the control module 630 may be configured to control the imaging device 610 to remain stationary when it is determined, according to the second motion state of the surgical robot 620, that the surgical robot 620 is moving. For example, when the surgical robot 620 starts to move, it can output a feedback signal to the control module 630; after receiving the feedback signal, the control module 630 outputs a control signal to the imaging device 610 to control the scanning bed of the imaging device 610 to enter the locked state.
  • the control module 630 can simultaneously control the imaging device 610 and the surgical robot 620 to remain stationary. For example, after the control module 630 receives feedback signals output by other modules (for example, the terminal device 140), it simultaneously outputs control signals to the imaging device 610 and the surgical robot 620, so as to simultaneously control their moving parts to remain stationary. As another example, when the control module 630 detects that the imaging device 610 and/or the surgical robot 620 is malfunctioning, or that the connection relationship between the two is abnormal, it can simultaneously output control signals to the imaging device 610 and the surgical robot 620 to control the moving parts of both to enter the locked state.
  • the imaging device 610 and the surgical robot 620 may include a brake locking structure (not shown in the figure), which locks the moving parts of the imaging device 610 and the surgical robot 620 while they remain stationary, so as to avoid accidental abnormal movement.
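The simultaneous-lock behavior above can be illustrated with a minimal sketch. The `Device` and `ControlModule` classes and their method names are hypothetical, introduced only to show the idea that any fault or abnormal connection locks both devices at once.

```python
# Hedged sketch of the simultaneous-lock response: on a device fault or an
# abnormal connection, the control module commands both devices' brake
# locking structures at the same time.

class Device:
    def __init__(self, name):
        self.name = name
        self.locked = False

    def lock(self):
        # engage the brake locking structure so moving parts cannot move
        self.locked = True

class ControlModule:
    def __init__(self, imaging, robot):
        self.imaging = imaging
        self.robot = robot

    def on_fault(self, reason):
        """Any fault (device failure or abnormal connection) locks both devices."""
        self.imaging.lock()
        self.robot.lock()
        return reason

imaging, robot = Device("imaging_610"), Device("robot_620")
ctrl = ControlModule(imaging, robot)
ctrl.on_fault("connection_abnormal")
print(imaging.locked, robot.locked)  # True True
```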
  • the imaging device 610 can also be used to generate a first end signal when ending the current preset process, and transmit it to the control module 630.
  • the surgical robot 620 may be configured to generate a second end signal when ending the current preset procedure, and transmit it to the control module 630.
  • the first end signal is a feedback signal generated by the imaging device 610 when it ends the preset process,
  • and the second end signal is a feedback signal generated by the surgical robot 620 when it ends the preset process.
  • the control module 630 receives the first end signal or the second end signal, and controls the imaging device 610 and the surgical robot 620 to enter the next preset procedure.
  • each time a preset process ends, an end signal is generated and sent to the control module 630, so that the control module 630 controls the imaging device 610 and the surgical robot 620 to enter the next preset process according to the end signal. This ensures interlocked control of the imaging device 610 and the surgical robot 620 when switching between different preset processes, and improves the safety of the image-guided interventional puncture system.
  • the control module 630 can also be used to control the imaging device 610 to remain stationary according to the first end signal, and/or release the stationary state of the surgical robot 620; and/or control the surgical robot 620 to remain stationary according to the second end signal, and/or release the stationary state of the imaging device 610.
  • When the imaging device ends its current procedure, the surgical robot is ready to enter the next stage of the workflow, so the moving parts of the imaging device need to be locked; if the surgical robot was previously locked, its moving parts need to be unlocked.
  • Likewise, when the surgical robot ends its current procedure, the imaging device is ready to enter the next stage of the workflow, so the moving parts of the surgical robot need to be locked and the moving parts of the imaging device need to be unlocked. It should be noted that when two consecutive procedures are performed by the same device, for example, moving the patient directly into the aperture after the scan is completed, the device can remain unlocked so that the scanning bed can be controlled to move the patient into the aperture.
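The hand-off rule above, including the same-device exception, can be written as a small decision function. This is an illustrative sketch under assumed names (`handoff`, the `"imaging"`/`"robot"` labels), not the patent's implementation.

```python
# Illustrative decision rule for the end-of-procedure hand-off: lock the device
# that just finished and unlock the other one, UNLESS the next procedure is
# performed by the same device (e.g. scan -> move patient into aperture), in
# which case that device stays unlocked.

def handoff(finished_device, next_device):
    """Return a dict mapping each device to its lock state after a procedure ends."""
    other = "imaging" if finished_device == "robot" else "robot"
    if finished_device == next_device:
        # same device continues: keep it unlocked, keep the other locked
        return {finished_device: "unlocked", other: "locked"}
    return {finished_device: "locked", next_device: "unlocked"}

print(handoff("imaging", "robot"))    # {'imaging': 'locked', 'robot': 'unlocked'}
print(handoff("imaging", "imaging"))  # {'imaging': 'unlocked', 'robot': 'locked'}
```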
  • the imaging device 610 and/or the surgical robot 620 can also be used to send access requests and interrupt requests to the control module 630 .
  • the control module 630 can be used to control the imaging device 610 and the surgical robot 620 to enter the integrated working mode according to the access request, and to control the imaging device 610 and the surgical robot 620 to enter the independent working mode according to the interrupt request.
  • the control module 630 may be configured to control the imaging device 610 and the surgical robot 620 to enter an independent working mode in response to a failure of the imaging device 610 and/or the surgical robot 620 .
  • the image-guided interventional puncture system 600 may include a first control module and a second control module, and the first control module and the second control module respectively control the imaging device 610 and the surgical robot 620 .
  • the first control module is integrated in the imaging device 610
  • the second control module is integrated in the surgical robot 620.
  • the control module 630 can also be used to determine the first motion trajectory of the imaging device 610 according to the first motion state of the imaging device 610 when the imaging device 610 is moving, and to control the surgical robot 620 to move according to the first motion trajectory. In some embodiments, the control module 630 can also be used to control the motion speed of the surgical robot 620 and/or the imaging device 610 according to environment information and the initial motion state of the imaging device 610 and/or the surgical robot 620. For more details, refer to FIG. 3 and its related descriptions, which will not be repeated here.
  • the image-guided interventional puncture system 600 may further include a display module 640 .
  • the display module 640 may be a module for displaying information such as images, including but not limited to a CRT display, a liquid crystal display, or an LED display.
  • the display module 640 can be used to receive control instruction information and motion state information output by the imaging device 610 and/or the surgical robot 620, and display them on the display interface.
  • the control instruction information may refer to instruction information such as the first end signal and the second end signal.
  • the motion state information may refer to the first motion state, the second motion state, etc., reflecting the motion status (for example, stationary, moving) of the imaging device 610 and/or the surgical robot 620 .
  • the display module 640 can display image data (eg, scanned images) acquired by the imaging device 610.
  • the display module 640 can be connected with the control module 630 , so as to receive control instructions and operating status information output by the imaging device 610 and the surgical robot 620 through the control module 630 .
  • the display module 640 can present information on the display interface in various forms; for example, as text, as images, or as any combination of images and text.
  • through the display module, the user (for example, a doctor) can observe the progress of the puncture operation in real time according to the control instruction information and motion state information output by the imaging device and/or the surgical robot and displayed on the display module, ensuring the safety of the surgical procedure.
  • Fig. 7 is a schematic diagram of the connection relationship of the image-guided interventional puncture system according to some embodiments of the present specification.
  • the connection between the imaging device (eg, imaging device 610) and the surgical robot (eg, surgical robot 620) in the image-guided interventional puncture system 700 can be divided into two levels, hardware and software: interlocking control is realized at the hardware level, and data and/or information interaction (for example, image data and status information) is realized at the software level.
  • the hardware connection between the imaging device and the surgical robot may include three interfaces: a first interlocking interface, a second interlocking interface, and a third interlocking interface.
  • the first interlock interface can also be called a safety interlock interface, which is used to control the surgical robot to establish or terminate the connection relationship with the imaging device, and to detect the connection relationship between the surgical robot and the imaging device.
  • the controller of the surgical robot 620 may be connected with the interface installed on the frame of the imaging device 610 (for example, the scanning frame 613 ) through protocol communication.
  • after the imaging device is connected to the surgical robot, identification and verification can be performed; after the verification succeeds, the system enters the integrated working mode, so as to ensure the connection safety of the image-guided interventional puncture system.
  • the second interlock interface can also be called a motion lock interface, and is used to control the motion state of the imaging device according to the motion state of the surgical robot, and/or control the motion state of the surgical robot according to the motion state of the imaging device.
  • for example, the controller of the surgical robot 620 can be connected through a cable to the interface installed on the frame (for example, the scanning frame 613) of the imaging device 610; when the surgical robot is in a moving state, the moving parts of the imaging device (e.g., the scanning bed and the scanning frame) are locked through the second interlocking interface.
  • the moving parts of the imaging device can be locked by braking mechanisms such as brakes.
  • the third interlock interface can also be called an emergency stop interface, which can be used to control the system under preset emergency situations (for example, collisions, obstacles around the equipment, abnormal patients, abnormal operations, equipment failures, abnormal connection relationships, etc.) Moving parts of imaging equipment and surgical robots remain stationary or go into lockout.
  • for example, the controller of the surgical robot 620 can be connected through a cable to the interface installed on the frame (for example, the scanning frame 613) of the imaging device 610, so as to trigger an emergency stop of the surgical robot 620 when the imaging device 610 stops in an emergency, or to trigger an emergency stop of both the imaging device 610 and the surgical robot 620 when the patient's condition is abnormal.
  • the third interlocking interface can be used to control the imaging device and/or the surgical robot to move a corresponding distance in the direction opposite to the original direction of motion in a preset emergency situation, so as to move away from the collision object. In this way, unexpected movement between the imaging device 610 and the surgical robot 620 can be avoided in an emergency, ensuring the safety of the image-guided interventional puncture system.
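The emergency-stop-with-retreat behavior can be sketched along a single motion axis. This is a hedged illustration: the function name, the one-dimensional model, and the default retreat distance are assumptions, not values from the disclosure.

```python
# Sketch of the third-interlock behavior: on a preset emergency, stop the
# moving device and back it away from the collision object by a small distance
# along the reverse of its original direction of motion.
# Units: position and retreat_distance in centimetres, velocity in cm/s.

def emergency_stop(position, velocity, retreat_distance=5):
    """Return (new_position, new_velocity) after an e-stop with retreat."""
    if velocity == 0:
        return position, 0           # already stationary: nothing to do
    direction = 1 if velocity > 0 else -1
    # retreat a corresponding distance opposite to the original direction
    return position - direction * retreat_distance, 0

print(emergency_stop(120, 10))   # -> (115, 0)
print(emergency_stop(120, -10))  # -> (125, 0)
```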
  • the first interlocking interface, the second interlocking interface and the third interlocking interface may be integrated on an interface board of the imaging device or the surgical robot.
  • the imaging device 610 can be connected to the controller of the surgical robot 620 through a bus consisting of two cables (for motion locking and emergency stop, respectively) and a communication protocol line.
  • the software connection between the imaging device and the surgical robot may include two transmission channels: a first transmission channel and a second transmission channel.
  • the first transmission channel can be used to transmit image data.
  • the imaging device 610 may transmit the acquired image data to the surgical robot 620 through the first transmission channel, so as to guide the surgical robot 620 to perform a puncture operation according to the image data.
  • the second transmission channel can be used to transmit motion state information.
  • the second transmission channel may be used to transmit the first motion state information of the imaging device 110 to the surgical robot 120 , and/or transmit the second motion state information of the surgical robot 120 to the imaging device 110 .
  • the hard-wired transmission channel between the imaging device and the surgical robot is established through the safety interlock interface, which ensures the stability of the interlocking structure.
  • the imaging device and the surgical robot are also connected at the software level, which enables information interaction between the two devices, so that the imaging device and the surgical robot can obtain each other's information in time and adjust the surgical operation, effectively improving the accuracy and execution efficiency of the image-guided interventional puncture system.
  • numbers describing the quantities of components and attributes are used in the description. It should be understood that such numbers used in the description of the embodiments are qualified in some examples by the modifiers "about", "approximately" or "substantially". Unless otherwise stated, "about", "approximately" or "substantially" indicates that the stated figure allows for a variation of ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that can vary depending upon the desired characteristics of individual embodiments. In some embodiments, numerical parameters should take into account the specified significant digits and adopt the general digit-retention method. Although the numerical ranges and parameters used in some embodiments of this specification to confirm the breadth of their ranges are approximations, in specific embodiments such numerical values are set as precisely as practicable.

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Robotics (AREA)
  • Manipulator (AREA)

Abstract

The embodiments of the present specification provide a device control method and system for image-guided interventional puncture. The method includes: acquiring an initial motion state of an imaging device and/or a surgical robot; and controlling a target motion state of the surgical robot and/or the imaging device according to the initial motion state.

Description

A device control method and system for image-guided interventional puncture
Claim of priority
This application claims priority to Chinese Application No. 202111660066.X, filed on December 30, 2021, the entire contents of which are incorporated herein by reference.
Technical field
The present application relates to the field of medical devices, and in particular to a device control method and system for image-guided interventional puncture.
Background
With the development of computed tomography (CT) technology and surgical robots, CT imaging devices that guide surgical robots are increasingly used to assist doctors in performing puncture surgery. Automatic needle insertion by the surgical robot, or doctor-led needle insertion through the surgical robot, has become the main development trend of CT image-guided interventional puncture surgery.
To enable a CT imaging device to guide a surgical robot through a puncture action, it is generally necessary to obtain the position of a specific region inside the patient's body through the CT imaging device, and then guide the surgical robot to that position to complete the puncture. However, because the image-guided interventional puncture system is often independent of the CT system, the CT imaging device cannot detect the working state of the surgical robot while it is operating, and the surgical robot likewise cannot detect the working state of the CT imaging device. As a result, unexpected relative motion may occur between the surgical robot and the CT imaging device, causing physical injury to the patient.
Therefore, it is desirable to provide a device control method and system for image-guided interventional puncture that solves the problem of possible unexpected relative motion between the surgical robot and the CT imaging device.
Summary of the invention
One of the embodiments of the present specification provides a device control method for image-guided interventional puncture, the method including: acquiring an initial motion state of an imaging device and/or a surgical robot; and controlling a target motion state of the surgical robot and/or the imaging device according to the initial motion state.
In some embodiments, controlling the target motion state of the surgical robot and/or the imaging device according to the initial motion state includes: controlling a second motion state of the surgical robot according to a first motion state of the imaging device; and/or controlling the first motion state of the imaging device according to the second motion state of the surgical robot.
In some embodiments, controlling the second motion state of the surgical robot according to the first motion state of the imaging device includes: controlling the surgical robot to remain stationary when it is determined, according to the first motion state of the imaging device, that the imaging device is moving.
In some embodiments, controlling the first motion state of the imaging device according to the second motion state of the surgical robot includes: controlling the imaging device to remain stationary when it is determined, according to the second motion state of the surgical robot, that the surgical robot is moving.
In some embodiments, controlling the second motion state of the surgical robot according to the first motion state of the imaging device includes: determining a first motion trajectory of the imaging device according to the first motion state of the imaging device; and controlling the surgical robot to move according to the first motion trajectory.
In some embodiments, controlling the surgical robot to move according to the first motion trajectory includes: predicting the distance between the imaging device and the surgical robot according to the first motion trajectory of the imaging device; and simultaneously controlling the imaging device and the surgical robot to remain stationary when the distance is less than a distance threshold.
In some embodiments, controlling the surgical robot to move according to the first motion trajectory includes: planning a second motion trajectory of the surgical robot according to the first motion trajectory; and controlling the surgical robot to move according to the second motion trajectory.
In some embodiments, controlling the target motion state of the surgical robot and/or the imaging device according to the initial motion state includes: controlling the motion speed of the surgical robot and/or the imaging device according to the initial motion state of the imaging device and/or the surgical robot.
In some embodiments, the method further includes: controlling the motion speed of the surgical robot and/or the imaging device according to environment information.
In some embodiments, the method further includes: acquiring a first end signal generated by the imaging device when ending the current preset process, or a second end signal generated by the surgical robot when ending the current preset process; and controlling the imaging device and/or the surgical robot to enter the motion state of the next process according to the first end signal or the second end signal.
In some embodiments, controlling the imaging device and/or the surgical robot to enter the motion state of the next process according to the first end signal or the second end signal includes: controlling the imaging device to remain stationary and/or releasing the stationary state of the surgical robot according to the first end signal; and controlling the surgical robot to remain stationary and/or releasing the stationary state of the imaging device according to the second end signal.
In some embodiments, the method further includes: controlling the imaging device and the surgical robot to enter an integrated working mode according to an access request from the surgical robot or the imaging device; in the integrated working mode, the motion states of the imaging device and the surgical robot are associated with each other.
In some embodiments, the method further includes: acquiring an interrupt request sent by the imaging device or the surgical robot, and controlling the imaging device and the surgical robot to enter an independent working mode according to the interrupt request; in the independent working mode, the motion states of the imaging device and the surgical robot are independent of each other.
In some embodiments, the method further includes: detecting the connection relationship between the imaging device and the surgical robot; and simultaneously controlling the imaging device and the surgical robot to remain stationary when the connection relationship is abnormal.
In some embodiments, the method further includes: controlling the imaging device and the surgical robot to enter the independent working mode in response to a failure of the imaging device or the surgical robot.
One of the embodiments of the present specification provides a device control system for image-guided interventional puncture, the system including: an imaging device for acquiring image data of a target object; a surgical robot for performing a puncture operation; and a control module for controlling a target motion state of the surgical robot and/or the imaging device according to an initial motion state of the imaging device and/or the surgical robot.
In some embodiments, the system further includes: a display module for receiving control instruction information and motion state information output by the imaging device and/or the surgical robot, and displaying them on a display interface.
In some embodiments, the system further includes: a first interlocking interface for controlling the surgical robot to establish or interrupt a connection relationship with the imaging device, and for detecting the connection relationship; a second interlocking interface for controlling the motion state of the imaging device according to the motion state of the surgical robot; and a third interlocking interface for controlling the imaging device and/or the surgical robot to remain stationary in a preset emergency situation.
In some embodiments, the system further includes: a first transmission channel for transmitting the image data acquired by the imaging device to the surgical robot, so that the surgical robot performs the puncture operation according to the image data; and a second transmission channel for transmitting first motion state information of the imaging device to the surgical robot, and/or transmitting second motion state information of the surgical robot to the imaging device.
One of the embodiments of the present specification provides a computer-readable storage medium storing computer instructions; after a computer reads the computer instructions in the storage medium, the computer executes the method described above.
Brief description of the drawings
The present specification is further described by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. These embodiments are not limiting; in these embodiments, the same reference numerals denote the same structures, wherein:
Fig. 1 is a schematic diagram of an application scenario of an image-guided interventional puncture system according to some embodiments of the present specification;
Fig. 2 is a schematic block diagram of a processing device according to some embodiments of the present specification;
Fig. 3 is a schematic flowchart of a device control method for image-guided interventional puncture according to some embodiments of the present specification;
Fig. 4 is a schematic diagram of the working modes of an image-guided interventional puncture system according to some embodiments of the present specification;
Fig. 5 is a schematic flowchart of a device control method for image-guided interventional puncture according to other embodiments of the present specification;
Fig. 6 is a schematic structural diagram of an image-guided interventional puncture system according to some embodiments of the present specification; and
Fig. 7 is a schematic diagram of the connection relationships of an image-guided interventional puncture system according to some embodiments of the present specification.
Detailed description
To more clearly illustrate the technical solutions of the embodiments of the present specification, the accompanying drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are merely some examples or embodiments of the present specification; for a person of ordinary skill in the art, the present specification can also be applied to other similar scenarios based on these drawings without creative effort. Unless it is obvious from the context or otherwise stated, the same reference numerals in the figures denote the same structures or operations.
It should be understood that the terms "system", "device", "unit" and/or "module" used herein are a way of distinguishing different components, elements, parts, portions or assemblies at different levels. However, these words may be replaced by other expressions if the other expressions achieve the same purpose.
As shown in the present specification and claims, unless the context clearly indicates an exception, the words "a", "an", "one" and/or "the" do not specifically refer to the singular and may also include the plural. In general, the terms "comprise" and "include" only indicate the inclusion of clearly identified steps and elements, which do not constitute an exclusive list; a method or device may also include other steps or elements.
Flowcharts are used in the present specification to illustrate the operations performed by a system according to embodiments of the present specification. It should be understood that the preceding or following operations are not necessarily performed in exact order. Instead, the steps may be processed in reverse order or simultaneously. Other operations may also be added to these processes, or one or more steps may be removed from them.
With the development of CT technology and surgical robots, CT imaging devices guiding surgical robots are increasingly used to assist doctors in performing puncture surgery, and automatic needle insertion by the surgical robot or doctor-led needle insertion has become the main development trend of CT image-guided interventional puncture surgery. In general, products in which a CT imaging device guides an interventional surgical robot can be divided into two categories: miniaturized designs, such as XACT and iSYS, and surgical-execution-arm designs, such as MAXIO and ZeroBot. In miniaturized products, the puncture device is fixed directly to the scanning bed or bound to the patient, which avoids the risk of unexpected relative motion between the surgical robot and the moving parts of the CT imaging device. However, within the aperture of the CT imaging device, unexpected motion caused by patient misoperation may still occur, so that the puncture needle injures the patient. Moreover, because of their small size, miniaturized products have limited functionality, and their working space is insufficient for most clinical puncture scenarios. Surgical-execution-arm products provide a large working space for the puncture device, but unexpected relative motion between the surgical execution arm and the moving parts of the CT imaging device is often difficult to avoid.
The embodiments of the present application provide a device control method and system for image-guided interventional puncture, in which the motion state of one of the imaging device and the surgical robot is controlled according to the motion state of the other. This solves the technical problem of possible unexpected relative motion between the surgical robot and the CT imaging device, and improves the safety of CT image-guided robotic puncture.
Fig. 1 is a schematic diagram of an application scenario of an image-guided interventional puncture system according to some embodiments of the present specification.
In some embodiments, as shown in Fig. 1, the image-guided interventional puncture system 100 may include an imaging device 110, a surgical robot 120, a processing device 130, a terminal device 140, a storage device 150 and a network 160. In some embodiments, the processing device 130 may be part of the imaging device 110 and/or the surgical robot 120.
The imaging device 110 can scan a target object within a detection or scanning region to obtain image data (for example, scanned images) of the target object. In some embodiments, the imaging device 110 may be a computed tomography (CT) scanner, a magnetic resonance imaging (MRI) scanner, a positron emission tomography (PET) scanner, a single-photon emission computed tomography (SPECT) scanner, or the like, or any combination thereof, so as to obtain at least one of a CT image, an MR image, a PET image, a SPECT image, or a combined image of the target object. A CT device acquires scan data according to the different absorption rates and transmittances of X-rays by different tissues of the human body, and the scan data is then input into a computer to generate cross-sectional or three-dimensional images of the examined part. An MRI device obtains image data by examining hydrogen nuclei in the human body. A PET device obtains image data of the scanned object through radioactive tracers. A SPECT device acquires photons through radioactive tracers and converts them into electrical signals to obtain image data. It can be understood that the above descriptions of imaging devices are for illustrative purposes only and are not intended to limit the scope of the present specification.
The surgical robot 120 can be used to perform end operations on the target object (for example, surgical actions such as ablation, puncture and suture). In some embodiments, the surgical robot 120 may include a surgical execution arm structure, the end of which has a fixing structure for holding surgical equipment such as functional components (for example, ablation needles, puncture needles, etc.). For more details, refer to Fig. 6 (for example, the surgical robot 620) and its related description, which will not be repeated here.
In some embodiments, the processing device 130 can guide the surgical robot 120 to perform a corresponding operation (for example, a puncture operation) through remote operation control. In some embodiments, the processing device 130 may be electrically connected to the end of the robot arm (for example, the surgical execution arm end 623) through a communication device (for example, the network 160), and used to control the arm end to drive the functional components (for example, ablation needles, puncture needles, etc.) to perform synchronized operations. For example, the processing device 130 can drive the puncture needle to perform a corresponding operation by controlling the rotation or translation of the arm end. As another example, the processing device 130 can drive the puncture needle to perform a puncture by controlling the arm end to advance forward. In some embodiments, the surgical robot 120 may be a robot arm body used to drive the movement of the arm end, so as to control and/or adjust the operation and/or posture (for example, angle, position, etc.) of the functional component (for example, a puncture needle) carried by the arm end.
The processing device 130 can process data and/or information obtained from the imaging device 110, the surgical robot 120, the terminal device 140, the storage device 150 or other components of the image-guided interventional puncture system 100. For example, the processing device 130 can acquire the first motion state (for example, moving, stationary, etc.) of the imaging device 110 and analyze it to determine the corresponding second motion state (for example, moving, stationary, etc.) of the surgical robot 120 and/or its motion trajectory. As another example, the processing device 130 can obtain the current image (for example, a CT scan image) of the target object from the imaging device 110 and analyze it, so as to control the surgical robot 120 to guide and adjust the puncture needle. In some embodiments, the processing device 130 may be local or remote. For example, the processing device 130 can access information and/or data from the imaging device 110, the surgical robot 120, the terminal device 140 and/or the storage device 150 through the network 160.
In some embodiments, the processing device 130 and the imaging device 110 may be integrated into one unit. In some embodiments, the processing device 130 and the imaging device 110 may be directly or indirectly connected and act jointly to implement the methods and/or functions described in the present specification.
In some embodiments, the processing device 130 and the surgical robot 120 may be integrated into one unit. In some embodiments, the processing device 130 and the surgical robot 120 may be directly or indirectly connected and act jointly to implement the methods and/or functions described in the present specification. For example, the processing device 130 may be the control module of the surgical robot 120 shown in Fig. 7.
In some embodiments, the imaging device 110, the surgical robot 120 and the processing device 130 may be integrated into one unit, for example, as the imaging device 610, the surgical robot 620 and the control module 630 in the image-guided interventional puncture system 600. In some embodiments, the imaging device 110, the surgical robot 120 and the processing device 130 may be directly or indirectly connected and act jointly to implement the methods and/or functions described in the present specification. For more details, refer to the descriptions of Fig. 6 and Fig. 7, which will not be repeated here.
In some embodiments, the processing device 130 may include an input device and/or an output device. Through the input and/or output devices, interaction with the user can be realized (for example, displaying the motion state information of the imaging device 110 and/or the surgical robot 120). In some embodiments, the input and/or output devices may include a display screen, a keyboard, a mouse, a microphone, or the like, or any combination thereof.
The terminal device 140 can be connected to and/or communicate with the imaging device 110, the surgical robot 120, the processing device 130 and/or the storage device 150. For example, the terminal device 140 can obtain and display the current image of the target object from the imaging device 110, so that the user can monitor the actual puncture region of the puncture needle in real time. In some embodiments, the terminal device 140 may include a mobile device 141, a tablet computer 142, a laptop computer 143, or the like, or any combination thereof. In some embodiments, the terminal device 140 (or all or part of its functions) may be integrated in the imaging device 110 or the processing device 130.
The storage device 150 can store data, instructions and/or any other information. In some embodiments, the storage device 150 can store data obtained from the imaging device 110, the surgical robot 120 and/or the processing device 130 (for example, the current image of the target object, and the motion states, motion trajectories and preset processes of the imaging device 110 and/or the surgical robot 120). In some embodiments, the storage device 150 can store computer instructions for implementing the image-guided interventional puncture method.
In some embodiments, the storage device 150 may include one or more storage components, each of which may be an independent device or part of another device. In some embodiments, the storage device 150 may include random-access memory (RAM), read-only memory (ROM), mass storage, removable storage, volatile read-write memory, or the like, or any combination thereof. Exemplary mass storage may include magnetic disks, optical disks, solid-state disks, etc. In some embodiments, the storage device 150 may be implemented on a cloud platform.
The network 160 may include any suitable network capable of facilitating the exchange of information and/or data, for example, a wireless network or a wired network. In some embodiments, at least one component of the image-guided interventional puncture system 100 (for example, the imaging device 110, the surgical robot 120, the processing device 130, the terminal device 140, the storage device 150) can exchange information and/or data with at least one other component of the system 100 through the network 160. For example, the processing device 130 can obtain the planned image and/or the current image of the target object from the imaging device 110 through the network 160.
It should be noted that the image-guided interventional puncture system 100 is provided for illustrative purposes only and is not intended to limit the scope of the present specification. For a person of ordinary skill in the art, various modifications or changes can be made according to the description of the present specification. For example, the image-guided interventional puncture system 100 may implement similar or different functions on other devices. However, these changes and modifications do not depart from the scope of the present specification.
Fig. 2 is a schematic block diagram of a processing device according to some embodiments of the present specification.
As shown in Fig. 2, in some embodiments, the processing device 130 may include an acquisition unit 210, a control unit 220 and a detection unit 230.
The acquisition unit 210 can be used to acquire data and/or information related to components of the image-guided puncture system. For example, the acquisition unit 210 can be used to acquire the preset processes, image data, etc., stored in the storage device 150. As another example, the acquisition unit 210 can be used to acquire access requests, interrupt requests, end signals, displacement data (for example, the linear velocity and angular velocity of each part of the surgical robot), position data, motion trajectories, etc., of the imaging device or the surgical robot.
In some embodiments, the acquisition unit 210 can be used to acquire the initial motion state of the imaging device (for example, the imaging device 110) and/or the surgical robot (for example, the surgical robot 120). In some embodiments, the acquisition unit 210 can be used to acquire the first end signal generated by the imaging device when ending the current preset process, or the second end signal generated by the surgical robot when ending the current preset process. In some embodiments, the acquisition unit 210 can be used to acquire access requests and interrupt requests sent by the imaging device and/or the surgical robot. In some embodiments, the acquisition unit 210 can be used to acquire environment information.
For more details on the initial motion state, the preset process, the current preset process, the first end signal and the second end signal, refer to Fig. 3 and its related description. For more details on the image data, the access request and the interrupt request, refer to Fig. 4 and its related description, which will not be repeated here.
The control unit 220 can be used to control components (for example, the imaging device 110, the surgical robot 120, the terminal device 140) of the image-guided interventional puncture system (for example, the image-guided interventional puncture system 100). For example, the control unit 220 can be used to control the imaging device 110 to scan the target object so as to obtain image data of the target object. As another example, the control unit 220 can be used to control the surgical robot 120 to perform a puncture operation on the target object.
In some embodiments, the control unit 220 can be used to control the target motion state of the surgical robot and/or the imaging device according to the initial motion state of the imaging device and/or the surgical robot. In some embodiments, the control unit 220 can be used to control the second motion state of the surgical robot according to the first motion state of the imaging device, and/or to control the first motion state of the imaging device according to the second motion state of the surgical robot.
In some embodiments, the control unit 220 can be used to control the surgical robot to remain stationary when it is determined, according to the first motion state of the imaging device, that the imaging device is moving; or to control the imaging device to remain stationary when it is determined, according to the second motion state of the surgical robot, that the surgical robot is moving. In some embodiments, the control unit 220 can be used to determine the first motion trajectory of the imaging device according to its first motion state, and to control the surgical robot to move according to the first motion trajectory. In some embodiments, the control unit 220 can be used to predict the distance between the imaging device and the surgical robot according to the first motion trajectory of the imaging device, and to simultaneously control the imaging device and the surgical robot to remain stationary when the distance is less than a distance threshold. In some embodiments, the control unit 220 can be used to plan the second motion trajectory of the surgical robot according to the first motion trajectory, and to control the surgical robot to move according to the second motion trajectory.
In some embodiments, the control unit 220 can be used to control the motion speed of the surgical robot and/or the imaging device according to the initial motion state of the imaging device and/or the surgical robot. In some embodiments, the control unit 220 can be used to control the motion speed of the surgical robot and/or the imaging device according to environment information.
In some embodiments, the control unit 220 can be used to control the imaging device and/or the surgical robot to enter the next process and/or its motion state according to the first end signal or the second end signal. In some embodiments, the control unit 220 can be used to control the imaging device to remain stationary and/or release the stationary state of the surgical robot according to the first end signal. In some embodiments, the control unit 220 can be used to control the surgical robot to remain stationary and/or release the stationary state of the imaging device according to the second end signal.
In some embodiments, the control unit 220 can be used to control the imaging device and the surgical robot to enter the integrated working mode according to an access request from the surgical robot or the imaging device. In the integrated working mode, the motion states of the imaging device and the surgical robot are associated with each other. In some embodiments, the control unit 220 can be used to control the imaging device and the surgical robot to enter the independent working mode according to an interrupt request or a fault detection result. In the independent working mode, the motion states of the imaging device and the surgical robot are independent of each other.
For more details on controlling the imaging device and/or the surgical robot, refer to Figs. 3 to 6 and their related descriptions.
The detection unit 230 can be used to perform fault detection on components of the image-guided interventional puncture system. For example, the detection unit 230 can be used to detect whether the terminal device 140 can normally display the current image of the target object. As another example, the detection unit 230 can be used to detect whether the imaging device 110 can work normally, for example, whether the scanning bed can move normally and whether the scanning frame can scan the target object.
In some embodiments, the detection unit 230 can be used to detect the imaging device and the surgical robot separately, and to immediately generate a detection signal and send it to the control unit 220 when a device fault is detected. In some embodiments, the detection unit 230 can be used to detect the connection relationship between the imaging device and the surgical robot, and to generate a feedback signal and send it to the control unit 220 when the connection relationship is abnormal. The control unit 220 can be used to control the imaging device and the surgical robot to enter the independent working mode, or to simultaneously remain stationary, in response to a fault of the imaging device or the surgical robot or an abnormal connection relationship.
For the specific definition of the processing device 130, reference may be made to the definition of the device control method herein, which will not be repeated. It can be understood that each unit in the processing device 130 may be implemented in whole or in part by software, hardware, or a combination thereof. The above units may be embedded in hardware form in, or independent of, the processing device 130, or stored in software form in a memory of the processing device 130, so that the processing device 130 can invoke and execute the operations corresponding to the above modules.
Fig. 3 is a schematic flowchart of a device control method for image-guided interventional puncture according to some embodiments of the present specification. In some embodiments, the process 300 may be performed by the image-guided interventional puncture system 100 (for example, the processing device 130). As shown in Fig. 3, the process 300 may include the following steps:
Step 310: acquire the initial motion state of the imaging device and/or the surgical robot. In some embodiments, step 310 may be performed by the acquisition unit 210.
The initial motion state can reflect the current motion state of the imaging device and/or the surgical robot. For example, the initial motion state may include the imaging device or the surgical robot being in motion, or the imaging device or the surgical robot remaining stationary. In some embodiments, the initial motion state may include motion data of the imaging device or the surgical robot (for example, motion speed, motion direction, acceleration, etc.). For example, the initial motion state may include data such as the imaging device moving at a constant speed of 1 cm/s, accelerating or decelerating, and the linear and angular velocities of components of the surgical robot (for example, the robot arm and its joints).
In some embodiments, the initial motion state may be obtained based on motion state information fed back by the imaging device and/or the surgical robot. For example, the imaging device 110 can obtain information such as its current motion speed and position through position sensors or speed sensors, generate first motion state information, and transmit it to the processing device 130. As another example, the surgical robot 120 can determine whether it is currently moving or stationary by reading the preset process in its processor, generate second motion state information, and transmit it to the processing device 130.
Step 320: control the target motion state of the surgical robot and/or the imaging device according to the initial motion state. In some embodiments, step 320 may be performed by the control unit 220.
The target motion state may refer to the motion state and/or motion trajectory that the surgical robot and/or the imaging device is expected to achieve. For example, suppose the initial motion states of the imaging device and the surgical robot are stationary and moving, respectively. If the surgical robot has retracted to a preset height after completing the operation, the surgical robot needs to remain stationary while the scanning bed of the imaging device starts moving to carry the patient out of the aperture; at this time, the target motion states of the surgical robot and the imaging device are remaining stationary and moving, respectively. The preset height may be a height set in advance.
In some embodiments, there are various ways to control the target motion state of the surgical robot and/or the imaging device according to the initial motion state. For example, intelligent control can be performed through a preset process according to the initial motion state; as another example, manual control can be used. In some embodiments, the process of controlling the target motion state of the surgical robot and/or the imaging device according to the initial motion state of the imaging device and/or the surgical robot may also be called interlocking control.
In some embodiments, controlling the target motion state of the surgical robot and/or the imaging device according to the initial motion state may include: controlling the second motion state of the surgical robot according to the first motion state of the imaging device; and/or controlling the first motion state of the imaging device according to the second motion state of the surgical robot. For example, the first motion state may include information such as whether the imaging device is currently moving or stationary, its current linear velocity, and its historical linear velocity. As another example, the second motion state may include information such as whether the surgical robot is currently moving or stationary, and the angular and linear velocities of the surgical execution arm.
The first motion state of the imaging device and the second motion state of the surgical robot can be determined in various ways, such as manually or through installed sensors. In some embodiments, the first motion state of the imaging device and the second motion state of the surgical robot can be determined through the preset process of the system.
In some embodiments, when it is determined, according to the first motion state of the imaging device, that the imaging device is moving, the surgical robot can be controlled to remain stationary.
In some embodiments, when it is determined, according to the second motion state of the surgical robot, that the surgical robot is moving, the imaging device can be controlled to remain stationary.
In some embodiments, remaining stationary may include controlling the moving parts of the imaging device or the surgical robot to be still, or locking the moving parts. For example, when the surgical robot 120 is moving, the scanning bed of the imaging device can be locked. As another example, when the imaging device 110 is moving, the surgical execution arm of the surgical robot 120 can be locked. When a moving part is locked, its motion cannot be controlled even if the corresponding control button is operated; its motion can only be controlled again after unlocking.
In some embodiments, when the imaging device is moving, the surgical robot can be controlled to move according to the motion information of the imaging device. For example, the processing device 130 can determine the real-time position of the imaging device based on its displacement data, and control the surgical robot to remain stationary or avoid the target region when the imaging device is within the target region or close to its edge. The target region may be a region where the surgical robot and the imaging device may collide. For example, the target region can be predicted based on the historical motion data of the surgical robot and the imaging device.
In some embodiments, the first motion trajectory of the imaging device can be determined according to the first motion state of the imaging device, and the surgical robot can be controlled to move according to the first motion trajectory.
The first motion trajectory may refer to the motion path of the imaging device in three-dimensional space over a period of time. For example, before the puncture operation, the scanning bed of the imaging device carries the target object into the aperture of the scanning frame for scanning; at this time, the first motion trajectory of the imaging device is the route of the scanning bed from its current position into the aperture. The shape of the first motion trajectory is not limited; it may be a straight line, an arc, or another shape.
According to the first motion state of the imaging device, the first motion trajectory of the imaging device can be determined in various ways. For example, a three-dimensional spatial coordinate system can be established, and information such as the motion speed and position of the imaging device can be determined according to its first motion state, so as to determine its first motion trajectory. In some embodiments, the first motion trajectory can be obtained directly from the imaging device.
In some embodiments, the distance between the imaging device and the surgical robot can be predicted according to the first motion trajectory of the imaging device; when the distance is less than a distance threshold, the imaging device and the surgical robot are simultaneously controlled to remain stationary. For example, when the distance is less than the distance threshold, the moving parts of the imaging device and the surgical robot can be simultaneously controlled to remain still, or be locked. The distance threshold may refer to the minimum allowable distance.
According to the first motion trajectory of the imaging device, the distance between the imaging device and the surgical robot can be predicted in various ways, including but not limited to program algorithms. For example, the processing device 130 can determine the current position (for example, three-dimensional spatial coordinates) of the imaging device according to the first motion trajectory, calculate the position of the imaging device at the next moment based on its current motion speed using a program algorithm, and compare that position with the position of the surgical robot to determine the distance between the two.
In some embodiments, the distance between the imaging device and the surgical robot can be obtained through sensors. For example, infrared sensors, laser sensors, ultrasonic sensors, etc. can be installed on the surgical robot to obtain the distance between the imaging device and the surgical robot. In some embodiments, the installation position and/or type of the sensors can be set according to actual needs, for example, on the scanning frame of the imaging device or on the side edge of the scanning bed, which is not limited in the present specification.
When the distance between the imaging device and the surgical robot is less than the distance threshold, simultaneously controlling the moving parts of both to remain still, or locking the moving parts, can prevent a collision between the two from damaging the equipment and protect the personal safety of the target object.
在一些实施例中,可以根据第一运动轨迹,规划手术机器人的第二运动轨迹;根据第二运动轨迹,控制手术机器人进行运动。
第二运动轨迹可以是指手术机器人在三维空间内一段时间的运动路径。例如,影像设备完成对目标对象的扫描后,手术机器人将进行局部麻醉操作,此时手术机器人需要从当前位置移动至可操作区域,那么从当前位置移动至可操作区域的这段路径就可以认为是手术机器人的第二运动轨迹。第二运动轨迹的轨迹形状也可以有多种,例如直线形、弧线等。其中,可操作区域可以是指手术机器人可以实现相关手术操作的区域。
在一些实施例中,可以根据第一运动轨迹,确定影像设备在每个时刻的运动位置,基于该运动位置规划手术机器人的第二运动轨迹,以避免手术机器人与影像设备发生碰撞。例如,处理设备130可以基于第一运动轨迹确定影像设备在不同时刻的空间坐标,根据空间坐标规划手术机器人的运动位置,使得手术机器人与影像设备之间的距离大于预设值,以确定手术机器人的第二运动轨迹。
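根据影像设备各时刻位置为手术机器人规划第二运动轨迹的思路,可以用如下示意性代码表示。从候选位置中逐时刻选点只是一种假设的简化规划方式,实际的轨迹规划算法不在此限:

```python
import math

def plan_robot_trajectory(imaging_trajectory, candidate_positions, min_distance):
    """示意:对影像设备第一运动轨迹上的每个时刻位置,
    从候选位置中选取与影像设备距离大于 min_distance(预设值)的位置;
    无可行位置时该时刻记为 None,表示手术机器人保持静止。"""
    trajectory = []
    for imaging_pos in imaging_trajectory:
        chosen = None
        for cand in candidate_positions:
            if math.dist(cand, imaging_pos) > min_distance:
                chosen = cand
                break
        trajectory.append(chosen)
    return trajectory
```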
在一些实施例中,可以根据影像设备和/或手术机器人的初始运动状态,控制手术机器人和/或影像设备的运动速度。
在一些实施例中,可以根据影像设备的第一运动状态和手术机器人的第二运动状态,确定影像设备与手术机器人之间的实时距离,当该距离大于预设阈值(例如,1米或2米)时,控制手术机器人或影像设备以第一速度进行运动;当该距离小于或等于预设阈值时,控制手术机器人或影像设备以第二速度进行运动,其中第一速度大于第二速度。例如,当影像设备110保持静止时,处理设备130可以根据手术机器人120的第二运动状态确定手术机器人120的实时位置,并基于影像设备110的位置和手术机器人120的位置,确定两者之间的距离。又如,当影像设备110和手术机器人120同时运动时,处理设备130可以根据第一运动状态确定影像设备110的第一位置,根据第二运动状态确定手术机器人120的第二位置,基于同一时刻的第一位置和第二位置确定两者之间的距离。在一些实施例中,可以通过传感器确定两者之间的实时距离,本说明书对此不做限制。例如,可以通过安装在影像设备110或手术机器人120上的雷达传感器确定两者之间的实时距离。
其中,对手术机器人和/或影像设备的运动速度的大小的控制可以一致,也可以不同,具体视实际情况而定。
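上述基于实时距离的双速控制可以用如下示意性代码表示(函数名与具体数值均为说明而设的假设,非实际实现):

```python
import math

def realtime_distance(imaging_pos, robot_pos):
    """基于同一时刻的第一位置与第二位置确定两者之间的距离。"""
    return math.dist(imaging_pos, robot_pos)

def select_speed(imaging_pos, robot_pos, preset_threshold, first_speed, second_speed):
    """距离大于预设阈值时返回第一速度,否则返回第二速度。
    按正文约定,第一速度大于第二速度(first_speed > second_speed)。"""
    if realtime_distance(imaging_pos, robot_pos) > preset_threshold:
        return first_speed
    return second_speed
```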
在一些实施例中,还可以根据环境信息,控制手术机器人和/或影像设备的运动速度。
环境信息可以是指影像设备与手术机器人所在区域内的相关信息。例如,环境信息可以是手术机器人、影像设备的位置信息,也可以是目标对象的动作信息(例如,身体挪动、抬手等)等信息,或者是设备的周围区域内包含的任意物体(例如,人、其他设备等)。
环境信息可以通过多种方式进行获取。例如,可以通过传感器获取,也可以通过摄像装置获取等,其获取方式不限。
在一些实施例中,根据环境信息,也可以通过人工手动调速、程序自动调速等方式控制手术机器人和/或影像设备的运动速度。例如,当手术机器人向目标对象进行移动时,目标对象忽然抬手以致手与手术机器人的距离很近,此时程序可以控制手术机器人降低速度,以避免发生碰撞对目标对象造成伤害;当目标对象的手放回原位之后,程序可以再控制手术机器人恢复至原先的速度,以节省设备运行时间,提高手术效率。又如,当检测到预设区域内没有物体时,可以提高手术机器人的运动速度(例如,手术执行臂的移动速度),直到检测到预设区域内出现物体,控制手术机器人速度下降(例如,降至小于或等于初始速度、或降至0)。在一些实施例中,当手术机器人停止运动后,影像设备与手术机器人之间的距离仍在缩小时,可以同时控制影像设备停止运动。
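基于环境信息的调速逻辑可以简化为如下示意性代码(仅为草图,假设环境检测结果已抽象为"预设区域内是否出现物体"这一布尔量):

```python
def adjust_speed(object_in_region, initial_speed, reduced_speed):
    """根据环境信息调整运动速度(示意):
    预设区域内检测到物体时降速(reduced_speed 小于或等于初始速度,可降至 0);
    区域内无物体时恢复/保持初始速度,以节省设备运行时间。"""
    return reduced_speed if object_in_region else initial_speed
```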
在一些实施例中,还可以获取影像设备在结束当前预设流程时生成的第一结束信号,或手术机器人在结束当前预设流程时生成的第二结束信号;根据第一结束信号或第二结束信号,控制影像设备和/或手术机器人进入下一个流程的运动状态。
其中,预设流程为系统预先设置的工作任务流程。例如,完整的预设流程可以为:扫描→局麻穿刺→患者送入孔径→穿刺→患者移出孔径→结束,其中,扫描、患者送入孔径、患者移出孔径需要由影像设备执行,局麻穿刺、穿刺需要由手术机器人执行。当前预设流程可以是指当前这一阶段对应的预设流程信息,例如对目标对象进行扫描、对目标对象进行穿刺手术等流程信息。在一些实施例中,系统可以根据患者信息自动确定相应的预设流程,或由医生手动设置当前目标对象的预设流程。
第一结束信号可以是指影像设备完成当前预设流程时生成的反馈信号。例如,影像设备110控制扫描床将目标对象送至扫描架孔径内之后,可以生成第一结束信号。又例如,影像设备110完成对目标对象的扫描之后,可以生成第一结束信号。
第二结束信号可以是指手术机器人完成当前预设流程时生成的反馈信号。例如,手术机器人120对目标对象完成局部麻醉穿刺后,可以生成第二结束信号。又例如,手术机器人120对目标对象完成穿刺手术后,可以生成第二结束信号。
第一结束信号与第二结束信号可以有多种表现形式。例如,可以通过警示音、指示灯等形式来反映影像设备与手术机器人已完成当前预设流程。在一些实施例中,第一结束信号与第二结束信号可以是处理设备中生成的代码信息。在一些实施例中,第一结束信号与第二结束信号可以包括当前预设流程的执行内容。例如,第一结束信号可以通过弹窗形式在显示界面展示“已将目标对象送至孔径”等内容。
在一些实施例中,影像设备和/或手术机器人可以通过数据传输通道发送第一结束信号或第二结束信号。例如,影像设备110或手术机器人120可以通过第二传输通道传输第一结束信号或第二结束信号,第二传输通道的更多详细内容可以参见图7及其描述。
下一个流程可以是指完成当前预设流程后的下一个阶段对应的工作任务。例如,影像设备结束对目标对象的扫描之后,下一个流程可以为手术机器人对目标对象进行局部麻醉穿刺等。
在一些实施例中,可以根据第一结束信号控制影像设备保持静止(例如,控制影像设备的运动部件进入锁定状态),和/或解除手术机器人的静止状态(例如,解除对手术机器人的运动部件的锁定)。在一些实施例中,可以根据第二结束信号控制手术机器人保持静止,和/或解除影像设备的静止状态。例如,当影像设备110结束对目标对象的扫描时生成第一结束信号,根据第一结束信号,控制单元220可以控制影像设备110的扫描床进入锁定状态,并对手术机器人120的运动部件解除锁定状态,以使手术机器人可以完成局部麻醉操作。又例如,当手术机器人120完成穿刺手术并退回至原位时生成第二结束信号,根据第二结束信号,控制单元220可以控制手术机器人120的运动部件进入锁定状态,并对影像设备110的扫描床解除锁定状态,以使影像设备110的扫描床可以将目标对象移出扫描架孔径。
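结束信号驱动的流程切换与锁定/解锁,可以用如下示意性状态机表示。流程顺序取自正文示例(扫描→局麻穿刺→患者送入孔径→穿刺→患者移出孔径→结束);WORKFLOW、FlowController 等名称均为假设,仅用于说明联锁控制的切换规则:

```python
# 预设流程及各阶段的执行设备(基于正文示例)
WORKFLOW = ["扫描", "局麻穿刺", "患者送入孔径", "穿刺", "患者移出孔径", "结束"]
EXECUTOR = {"扫描": "imaging", "患者送入孔径": "imaging", "患者移出孔径": "imaging",
            "局麻穿刺": "robot", "穿刺": "robot"}

class FlowController:
    """根据第一/第二结束信号,在预设流程切换时进行联锁控制(示意)。"""

    def __init__(self):
        self.step = 0
        # 初始:影像设备执行扫描,手术机器人被锁定
        self.locked = {"imaging": False, "robot": True}

    def on_end_signal(self):
        """收到结束信号:锁定当前流程的执行设备,解锁下一流程的执行设备。
        前后两个流程由同一设备执行时,该设备最终保持解锁状态。"""
        current = EXECUTOR[WORKFLOW[self.step]]
        self.step += 1
        next_stage = WORKFLOW[self.step]
        if next_stage == "结束":
            self.locked[current] = True
            return None
        nxt = EXECUTOR[next_stage]
        self.locked[current] = True   # 先锁定当前设备
        self.locked[nxt] = False      # 再解锁下一设备(同一设备时相当于保持解锁)
        return nxt
```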
在一些实施例中,可以基于患者信息,控制手术机器人进行运动。例如,针对不同体型,对手术执行臂预设不同的位移(例如,脂肪较多者位移较大等),在进行穿刺操作时,获取当前目标对象的体型,根据预设的对应关系,控制手术执行臂移动,以对该目标对象执行穿刺操作。
在一些实施例中,可以根据手术机器人或影像设备的接入请求,控制影像设备与手术机器人进入一体工作模式。响应于进入一体工作模式,根据影像设备和/或手术机器人的初始运动状态,控制手术机器人和/或影像设备的目标运动状态。在一些实施例中,可以根据中断请求控制影像设备与手术机器人进入独立工作模式。关于一体工作模式和独立工作模式的详细内容可以参见图4及其相关描述,此处不再赘述。
通过基于影像设备的运动状态控制手术机器人的运动状态、基于手术机器人的运动状态控制影像设备的运动状态,可以有效解决影像设备与手术机器人非预期的相对运动,避免设备的损坏与患者的身体损伤。通过结束信号,在完成当前预设流程后对影像设备以及手术机器人的运动状态进行控制,以保证在预设流程切换时实现运动状态的联锁控制,避免非预期相对运动,进一步降低了影像引导介入穿刺系统的风险。通过基于影像设备和/或手术机器人的初始运动状态以及环境信息,控制手术机器人和/或影像设备的运动速度,不仅可以避免设备发生碰撞造成损坏,而且还可以有效节省设备的运行时间,加快手术进度,提高手术效率。
图4是根据本说明书一些实施例所示的影像引导介入穿刺系统的工作模式示意图。
如图4所示,在一些实施例中,影像引导介入穿刺系统400可以包括一体工作模式与独立工作模式,通过连接接口(例如,第一联锁接口)可以实现工作模式的切换。关于连接接口的内容可以参见图6及其相关描述,此处不再赘述。
在一体工作模式下,影像设备和手术机器人的运动状态相互关联。在独立工作模式下,影像设备和手术机器人可以独立工作,彼此之间的运动状态相互独立,无需进行联锁控制。例如,在一体工作模式下,影像设备运动时手术机器人进入锁定状态,手术机器人运动时影像设备进入锁定状态;在独立工作模式下,影像设备与手术机器人之间的连接断开,两者之间的运动互不影响。
在一些实施例中,影像设备或手术机器人可以向控制器(例如,处理设备130、控制模块630)发送接入请求,控制器根据接入请求建立影像设备与手术机器人之间的连接关系,以控制影像设备与手术机器人进入一体工作模式。其中,接入请求可以是手术机器人或影像设备发出的请求接入的指令信号。例如,控制模块630接收到影像设备610的第一接入请求或手术机器人620的第二接入请求后,可以分别建立与影像设备610以及手术机器人620的连接通道,从而通过控制模块630作为数据传输的中介,建立影像设备610与手术机器人620的连接关系。通过控制器建立影像设备与手术机器人之间的连接关系,以方便直接传输控制信号,进而直接对影像设备以及手术机器人的运动状态进行控制,无需每次进行联锁控制时先建立连接通道,提高了影像引导介入穿刺系统的执行效率。
在一些实施例中,影像设备可以向手术机器人发送第一接入请求,手术机器人接受该第一接入请求后,手术机器人与影像设备进入一体工作模式。在一些实施例中,手术机器人可以向影像设备发送第二接入请求,影像设备接受第二接入请求后,手术机器人与影像设备进入一体工作模式。例如,影像设备或手术机器人可以在影像引导介入穿刺系统准备阶段,发送第一接入请求或第二接入请求。
在一些实施例中,可以响应于用户操作,生成第一接入请求或第二接入请求。例如,用户可以将影像设备110的通信线缆连接至手术机器人120的控制器的接口板,影像设备110检测到通信线缆连通后生成第一接入请求。又如,在影像设备110与手术机器人120通过硬件连接后,手术机器人120可以响应于用户对手术机器人120上的工作模式切换按钮的操作,生成第二接入请求。此种情况下,影像设备110与手术机器人120通过硬件连接后,仍处于独立工作模式。在一些实施例中,可以基于预设条件,生成第一接入请求或第二接入请求。例如,影像设备110与手术机器人120通过硬件连接后,可以当连通时长达到预设时间阈值时,由影像设备110或手术机器人120生成接入请求。
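接入请求/中断请求驱动的工作模式切换可以用如下示意性代码表示(ModeController 及其方法名均为假设,仅为草图):

```python
class ModeController:
    """根据接入/中断请求在一体工作模式与独立工作模式之间切换(示意)。"""

    INDEPENDENT = "independent"  # 独立工作模式:两端运动互不影响
    INTEGRATED = "integrated"    # 一体工作模式:运动状态相互关联(联锁控制)

    def __init__(self):
        self.mode = self.INDEPENDENT
        self.channels = set()

    def on_access_request(self, requester):
        """接入请求:分别建立与影像设备、手术机器人的连接通道,进入一体工作模式。"""
        self.channels.update({"imaging", "robot"})
        self.mode = self.INTEGRATED

    def on_interrupt_request(self, requester):
        """中断请求:中断连接关系,进入独立工作模式。"""
        self.channels.clear()
        self.mode = self.INDEPENDENT
```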
当影像设备与手术机器人进入一体工作模式后,可通过影像设备引导手术机器人执行穿刺手术,从而将该一体工作模式应用于介入穿刺手术。在一些实施例中,执行过程中,可以通过联锁控制,实现在影像设备扫描时对手术机器人进行锁定控制(例如,控制手术机器人的运动部件保持静止或锁定),在手术机器人执行穿刺手术时对影像设备进行锁定控制(例如,控制影像设备的运动部件保持静止或锁定)。在一些实施例中,可以基于预设流程、或影像设备/手术机器人的运动状态,对影像设备或手术机器人的运动部件进行控制。例如,当影像设备110运动时,处理设备130可以对手术机器人120的运动部件进行锁定。又如,当影像设备110结束当前预设流程时,处理设备130可以控制影像设备110保持静止和/或对其运动部件进行锁定,同时解锁手术机器人120的运动部件。
在一些实施例中,影像设备或手术机器人可以向控制器(例如,处理设备130、控制模块630)发送中断请求(例如,第一中断请求、第二中断请求),控制器根据中断请求中断影像设备与手术机器人之间的连接关系,以控制影像设备与手术机器人进入独立工作模式。其中,中断请求可以是指手术机器人或影像设备发出的请求中断的指令信号。在一些实施例中,中断请求可以包括第三方指令,例如用户输入的操作指令等。
示例性地,在影像设备610需要退出一体工作模式时,可以向控制模块630发送第一中断请求,控制模块630接收到第一中断请求后,中断影像设备610与手术机器人620之间的连接关系。例如,控制模块630可以分别中断与影像设备610以及手术机器人620的连接通道,从而中断影像设备610与手术机器人620的连接关系。
示例性地,在手术机器人620需要退出一体工作模式时,由手术机器人620向控制模块630发送第二中断请求,控制模块630接收到第二中断请求后,中断手术机器人620与影像设备610之间的连接关系。
在一些实施例中,影像设备可以向手术机器人发送第一中断请求(例如,介入引导手术结束后),手术机器人接受该第一中断请求后中断连接关系,手术机器人与影像设备进入独立工作模式。在一些实施例中,手术机器人可以向影像设备发送第二中断请求,影像设备接收该第二中断请求后中断连接关系,手术机器人与影像设备进入独立工作模式。
独立工作模式下,手术机器人撤离,影像设备可单独用于执行扫描操作(例如,临床影像扫描)。
通过中断影像设备与手术机器人之间的连接关系,从而保证手术结束后手术机器人能够及时撤离,保证影像设备独立工作,不影响系统的正常运行,从而提高了影像引导介入穿刺系统的使用效率和使用体验。
在一些实施例中,可以当影像设备和手术机器人之间的连接关系异常时,控制影像设备和手术机器人进入故障模式。例如,故障模式可以包括同时控制影像设备与手术机器人的运动部件保持静止,和/或对影像设备以及手术机器人的运动部件进行锁定。
在一些实施例中,在进入一体工作模式后,可以对影像设备和手术机器人之间的连接关系进行检测;当连接关系异常时,同时控制影像设备与手术机器人保持静止。连接关系可以为硬件连接关系,也可以为软件连接关系。若检测到连接关系存在异常状况,则表明系统中可能存在故障,继续执行手术操作可能会导致危险状况。因此,需要同时向影像设备以及手术机器人发送控制信号,并控制影像设备以及手术机器人结束运动状态,从而避免对患者的伤害。
例如,在一体工作模式下,控制模块630可以对影像设备610以及手术机器人620的硬件接口以及软件接口的工作状态进行检测,当接口线缆或者软件存在异常状况时,生成控制信号并发送至影像设备610与手术机器人620,以控制影像设备610以及手术机器人620进入锁定状态。在一些实施例中,当进入锁定状态后,手术机器人或影像设备可以进行自修复。若自修复失败,则可以由人工进行检查、修复。修复完成后,重启影像设备与手术机器人可以直接进入一体工作模式或独立工作模式,或者可以基于影像设备或手术机器人的接入请求或中断请求进入一体工作模式或独立工作模式,具体方式可以由用户自行设定。
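连接关系检测与故障模式的处理逻辑可以简化为如下示意性代码(假设硬件接口与软件接口的工作状态已抽象为布尔量,仅为草图):

```python
def check_connections(hardware_ok: bool, software_ok: bool, locked: dict) -> str:
    """检测硬件/软件连接关系(示意):任一异常时进入故障模式,
    同时锁定影像设备与手术机器人的运动部件,避免继续执行手术导致危险。"""
    if hardware_ok and software_ok:
        return "normal"
    locked["imaging"] = True
    locked["robot"] = True
    return "fault"
```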
在一些实施例中,可以当确定目标对象存在非预期运动时,控制影像设备和手术机器人进入故障模式。例如,可以通过摄像头实时采集患者图像,当基于采集的图像识别到目标对象存在非预期运动时(例如,预设标准姿势为双手放在腿两侧平躺,但目标对象将手举起或放至穿刺部位),可以同时控制影像设备与手术机器人保持静止。
在一些实施例中,控制模块还可以响应于影像设备或手术机器人发生故障,控制影像设备与手术机器人进入独立工作模式。故障可以包括影像设备与手术机器人的连接关系存在异常、手术机器人或影像设备宕机等。例如,在一体工作模式下,若手术机器人120发生故障,控制模块检测到其故障之后,可以生成控制信号发送至影像设备110与手术机器人120,以控制影像设备110与手术机器人120强制断开连接,进入独立工作模式,此时,影像设备110可以独立执行扫描操作。
通过对影像设备以及手术机器人之间的连接关系进行检测,并在连接关系存在异常时锁定影像设备以及手术机器人的运动部件,从而避免非预期相对运动,进而避免意外事件对影像引导介入穿刺系统的影响,提高了影像引导介入穿刺系统的安全性。
图5是根据本说明书另一些实施例所示的影像引导介入穿刺的设备控制方法的流程示意图。在一些实施例中,流程500可以由影像引导介入穿刺系统100(例如,处理设备130)执行。如图5所示,流程500可以包括以下步骤:
步骤510,手术机器人注册配准。在一些实施例中,步骤510可以由处理设备130执行。
注册配准可以是指将目标对象的三维空间与扫描影像的三维空间进行匹配,以统一到相同的坐标系下。
例如,影像设备与手术机器人进入一体工作模式后,手术机器人通过注册配准,将患者的三维坐标或扫描影像关联到统一的坐标系下,从而实现三维空间坐标或扫描影像坐标与手术机器人坐标的转换,以确定手术位置,并建立连接通道。
步骤520,控制手术机器人保持静止,并控制影像设备对目标对象进行扫描。在一些实施例中,步骤520可以由处理设备130执行。
在一些实施例中,当完成注册配准后,手术机器人可以生成第二结束信号,并发送至控制器(例如,处理设备130)。控制器接收到第二结束信号后,控制手术机器人保持静止,并控制影像设备对目标对象进行扫描。例如,处理设备130可以基于第二结束信号对手术机器人120的手术执行臂进行锁定,并控制影像设备110对目标对象进行扫描,以获取目标对象的扫描影像。又如,处理设备130可以基于第二结束信号,控制手术机器人120运动到目标位置(例如,不会干扰影像设备运动的位置)后静止,并对手术执行臂进行锁定。然后控制影像设备110对目标对象进行扫描,以获取目标对象的扫描影像。
步骤530,根据预设流程,控制影像设备运动。在一些实施例中,步骤530可以由处理设备130执行。
在一些实施例中,处理设备130可以根据预设流程,控制影像设备运动或保持静止。例如,当预设流程中包括局麻穿刺(局部麻醉穿刺)时,影像设备110完成扫描后,可以生成第一结束信号并发送至处理设备130,处理设备130接收到第一结束信号后,对影像设备110的扫描床进行锁定,并控制手术机器人120进入工作流程-局麻穿刺,以使手术执行臂对患者进行局麻穿刺;完成该工作流程后,手术机器人120生成第二结束信号并发送至处理设备130,处理设备130根据第二结束信号对扫描床进行解锁,并控制影像设备110进入工作流程-患者送入孔径,以使影像设备110通过扫描床将患者运载至扫描架孔径内。又如,当预设流程中不包括局麻穿刺时,处理设备130可以根据预设流程,当影像设备110完成扫描后,控制影像设备110通过扫描床将目标对象运送至扫描架的孔径内,并对穿刺层面进行定位。
步骤540,根据第一结束信号,控制影像设备保持静止,并解除手术机器人的静止状态。在一些实施例中,步骤540可以由处理设备130执行。
在一些实施例中,当完成穿刺层面定位之后,影像设备可以生成第一结束信号并发送至控制器(例如,处理设备130),控制器根据第一结束信号控制影像设备的运动部件(例如,扫描床)保持静止和/或进入锁定状态,并解除对手术机器人的运动部件的锁定状态或静止状态。例如,处理设备130可以基于接收到的结束信号,对影像设备110的扫描床和扫描架进行锁定,并对手术机器人120的手术执行臂进行解锁,以使手术执行臂进入扫描架孔径内对患者进行穿刺动作。
步骤550,根据预设流程,控制手术机器人进行运动。在一些实施例中,步骤550可以由处理设备130执行。
在一些实施例中,处理设备130可以根据预设流程发送控制信号至手术机器人,手术机器人接收到控制信号之后控制手术执行臂进行运动,以使手术执行臂进入扫描架孔径内对目标对象进行主从穿刺动作。进一步地,完成主从穿刺动作后,手术机器人根据控制指令控制手术执行臂移出扫描架孔径内。
步骤560,根据第二结束信号,控制手术机器人保持静止,并解除影像设备的静止状态。在一些实施例中,步骤560可以由处理设备130执行。
在一些实施例中,当手术执行臂移出扫描架孔径后,手术机器人可以生成第二结束信号并发送至控制器(例如,处理设备130),控制器根据该第二结束信号,控制手术机器人的运动部件(例如,手术执行臂)保持静止或进入锁定状态,并解除对影像设备的运动部件的静止状态或锁定状态。例如,处理设备130可以根据该第二结束信号,对手术执行臂进行锁定,并对扫描床进行解锁,以使影像设备110进入工作流程-患者移出孔径,影像设备110通过扫描床将患者移出扫描架孔径。
进一步地,处理设备130可以判断穿刺手术是否结束,若结束,则进入步骤570;否则,进入步骤530:根据预设流程,控制影像设备运动(例如,根据下一个工作流程控制影像设备运动)。在一些实施例中,可以根据预设流程判断穿刺手术是否结束。在一些实施例中,可以根据结束信号判断穿刺手术是否结束。例如,可以基于影像设备110发送的“患者已移出孔径”的信号,确定穿刺手术结束。
步骤570,控制影像设备与手术机器人进入独立工作模式。在一些实施例中,步骤570可以由处理设备130执行。
在一些实施例中,响应于穿刺手术已结束,处理设备130可以对影像设备(例如,扫描床)和/或手术机器人(例如,手术执行臂)进行解锁,并中断影像设备与手术机器人的连接关系,以进入独立工作模式。
通过第一结束信号与第二结束信号,在完成预设流程后对影像设备与手术机器人的运动状态进行控制,以保证预设流程切换时实现运动状态的联锁控制,避免非预期相对运动,可以进一步降低影像引导介入穿刺系统的风险。
图6是根据本说明书一些实施例所示的影像引导介入穿刺系统的结构示意图。
如图6所示,在一些实施例中,影像引导介入穿刺系统600可以包括影像设备610、手术机器人620和控制模块630。其中,影像设备610、手术机器人620和控制模块630分别为与影像引导介入穿刺系统100中影像设备110、手术机器人120和处理设备130类似的结构或组件。在一些实施例中,控制模块630可以集成于手术机器人620,并与影像设备610进行通讯连接,以用于对影像设备610以及手术机器人620进行控制。
在一些实施例中,影像设备610可以包括扫描架613、扫描床615。以影像设备610为CT设备为例,在一些实施例中,扫描架613上可以装有X线球管、滤线器、准直器、参考探测器、信号探测器、电子线路以及各种运动部件。在一些实施例中,扫描架613的运动部件可以控制扫描架做直线运动、旋转运动、前后倾斜运动等运动。在一些实施例中,可以基于运动部件改变X线球管和目标对象之间的距离,以及调节扫描架613的倾斜角度,其倾斜角度可以达到±20°~±30°。扫描床615为目标对象的运载工具。在一些实施例中,扫描床615具有垂直运动部件以及水平纵向运动部件,能够按照预设流程实现自动进出扫描架613的孔径,将目标对象运载至指定的扫描位置。
在一些实施例中,手术机器人620可以包括手术执行臂621、手术执行臂末端623以及手术设备625。其中,手术执行臂621可以用于对手术执行臂末端623进行支撑,并将手术执行臂末端623运载至指定的手术位置。手术执行臂末端623可以用于固定手术设备625,并控制手术设备625执行穿刺、缝合、消融等手术动作。
在一些实施例中,控制模块630可以用于根据影像设备610和/或手术机器人620的初始运动状态,控制手术机器人620和/或影像设备610的目标运动状态。
在一些实施例中,控制模块630可以用于根据影像设备610的第一运动状态,当确定影像设备610运动时,控制手术机器人620保持静止。例如,影像设备610开始进入运动状态时,可以输出反馈信号至控制模块630,控制模块630接收到该反馈信号后,输出控制信号至手术机器人620,以控制手术机器人620的运动部件进入锁定状态。
在一些实施例中,控制模块630可以用于根据手术机器人620的第二运动状态,当确定手术机器人620运动时,控制影像设备610保持静止。例如,手术机器人620开始进入运动状态时,可以输出反馈信号至控制模块630,控制模块630接收到该反馈信号后,输出控制信号至影像设备610,以控制影像设备610的扫描床进入锁定状态。
在一些实施例中,控制模块630可以同时控制影像设备610和手术机器人620保持静止。例如,控制模块630接收其他模块(例如,终端设备140)输出的反馈信号后,同时输出控制信号至影像设备610以及手术机器人620,以同时控制影像设备610以及手术机器人620的运动部件保持静止。又如,控制模块630检测到影像设备610和/或手术机器人620发生故障、或两者之间的连接关系异常时,可以同时输出控制信号至影像设备610以及手术机器人620,以同时控制影像设备610以及手术机器人620的运动部件进入锁定状态。
在一些实施例中,影像设备610以及手术机器人620可以包括抱闸锁定结构(图中未示出),以用于在影像设备610以及手术机器人620的运动部件保持静止时,对运动部件进行锁定,以避免意外非正常运动。
通过在手术机器人以及影像设备中一端运动时,使得另一端保持静止,或者在意外情况下同时控制手术机器人以及影像设备保持静止,避免了影像设备和手术机器人的运动部件的独立控制所导致的非预期相对运动风险,从而提高了影像引导介入穿刺系统600在各种情形下的安全性。
在一些实施例中,影像设备610还可以用于在结束当前预设流程时生成第一结束信号,并传输至控制模块630。在一些实施例中,手术机器人620可以用于在结束当前预设流程时生成第二结束信号,并传输至控制模块630。
第一结束信号为影像设备610在结束预设流程时生成的反馈信号,第二结束信号为手术机器人620在结束预设流程时生成的反馈信号。控制模块630接收第一结束信号或第二结束信号,并控制影像设备610以及手术机器人620进入下一个预设流程。
通过在每个流程结束时,生成结束信号并发送至控制模块630,以使控制模块630根据结束信号控制影像设备610以及手术机器人620进入下一个预设流程,保证了不同预设流程切换时对影像设备610和手术机器人620的联锁控制,提高了影像引导介入穿刺系统的安全性。
在一些实施例中,控制模块630还可以用于根据第一结束信号,控制影像设备610保持静止,和/或解除手术机器人620的静止状态;和/或,根据第二结束信号,控制手术机器人620保持静止,和/或解除影像设备610的静止状态。
可以理解的是,当影像设备完成当前的工作流程后,手术机器人准备进入下一阶段的工作流程,因此需要对影像设备的运动部件进行锁定,若手术机器人原来处于锁定状态,则需要对手术机器人的运动部件进行解锁。相应地,当手术机器人完成当前的工作流程后,影像设备准备进入下一阶段的工作流程,因此需要对手术机器人的运动部件进行锁定,若影像设备原来处于锁定状态,则需要对影像设备的运动部件进行解锁。需要注意的是,当前后两个流程是由相同的设备执行时,例如,扫描完成后直接送患者进入孔径,可以控制该设备保持解锁状态,以控制扫描床运动从而将患者送入孔径。
在一些实施例中,影像设备610和/或手术机器人620还可以用于发送接入请求、中断请求至控制模块630。控制模块630可以用于根据接入请求控制影像设备610与手术机器人620进入一体工作模式,以及根据中断请求控制影像设备610与手术机器人620进入独立工作模式。在一些实施例中,控制模块630可以用于响应于影像设备610和/或手术机器人620发生故障,控制影像设备610与手术机器人620进入独立工作模式。
在一些实施例中,影像引导介入穿刺系统600可以包括第一控制模块以及第二控制模块,第一控制模块以及第二控制模块分别对影像设备610以及手术机器人620进行控制。例如,第一控制模块集成在影像设备610、第二控制模块集成在手术机器人620,通过建立第一控制模块与第二控制模块之间的连接关系,使影像设备与手术机器人进入一体工作模式。关于独立工作模式和一体工作模式的更多内容可以参见图4及其相关描述,此处不再赘述。
在一些实施例中,控制模块630还可以用于当影像设备610运动时,根据影像设备610的第一运动状态确定影像设备610的第一运动轨迹,并根据第一运动轨迹控制手术机器人620进行运动。在一些实施例中,控制模块630还可以用于根据环境信息、影像设备610和/或手术机器人620的初始运动状态,控制手术机器人620和/或影像设备610的运动速度。更多内容可以参见图3及其相关描述,此处不再赘述。
在一些实施例中,影像引导介入穿刺系统600还可以包括显示模块640。显示模块640可以是用于显示图像等信息的模块,包括但不限于CRT显示器、液晶显示器或者LED显示器等。
在一些实施例中,显示模块640可以用于接收影像设备610和/或手术机器人620输出的控制指令信息以及运动状态信息,并在显示界面进行显示。其中,控制指令信息可以是指第一结束信号、第二结束信号等指令信息。运动状态信息可以是指第一运动状态、第二运动状态等反映影像设备610和/或手术机器人620运动情况(例如,静止、运动)的信息。
在一些实施例中,显示模块640可以显示影像设备610获取的影像数据(例如,扫描图像)。
在一些实施例中,显示模块640可以与控制模块630进行连接,从而通过控制模块630接收影像设备610以及手术机器人620输出的控制指令以及运行状态信息。在一些实施例中,当显示模块640接收到影像设备610和/或手术机器人620输出的指令信息以及运动状态信息后,可以通过多种形式在显示界面进行显示。例如,可以通过文字进行显示,可以利用图像画面进行显示,也可以利用图像结合文字等任意一种方式进行显示。
在本说明书的一些实施例中,通过显示模块,用户(例如,医生)可以根据显示模块上实时显示的影像设备和/或手术机器人输出的控制指令信息以及运动状态信息,实时观察穿刺手术的进程,以确保手术过程的安全。
图7是根据本说明书一些实施例所示的影像引导介入穿刺系统的连接关系的示意图。
如图7所示,在一些实施例中,影像引导介入穿刺系统700中的影像设备(例如,影像设备610)与手术机器人(例如,手术机器人620)的连接可以分为硬件和软件两个层面,在硬件层面实现联锁控制,在软件层面实现数据和/或信息(例如,影像数据和状态信息)的交互。
在一些实施例中,影像设备与手术机器人的硬件连接可以包括三个接口:第一联锁接口、第二联锁接口以及第三联锁接口。
第一联锁接口也可称为安全联锁接口,用于控制手术机器人建立或中断与影像设备的连接关系,以及对手术机器人与影像设备的连接关系进行检测。例如,手术机器人620的控制器可以与安装于影像设备610的机架(例如,扫描架613)上的接口通过协议通讯方式连接。
在一些实施例中,影像设备与手术机器人连接后,可以对影像设备进行身份识别与校验,校验成功后进入一体工作模式,以此来保证影像引导介入穿刺系统的连接安全性。
第二联锁接口也可称为运动锁接口,用于根据手术机器人的运动状态控制影像设备的运动状态,和/或根据影像设备的运动状态控制手术机器人的运动状态。例如,手术机器人620的控制器可以与安装于影像设备610的机架(例如,扫描架613)上的接口通过线缆连接,当手术机器人处于运动状态时,通过第二联锁接口对影像设备的运动部件(例如,扫描床、扫描架)进行锁定,以使影像设备在手术机器人工作时处于静止状态,从而避免非预期运动。在此模式下影像设备的运动部件可以通过抱闸等制动器进行锁定。
第三联锁接口也可称为急停接口,可以用于在预设紧急情况(例如,发生碰撞、设备周围存在障碍物、患者身体异常、手术异常、设备故障、连接关系异常等)下控制影像设备和手术机器人的运动部件保持静止或进入锁定状态。例如,手术机器人620的控制器可以与安装于影像设备610的机架(例如,扫描架613)上的接口通过线缆连接,手术机器人620急停时触发影像设备610急停、影像设备610急停时触发手术机器人620急停、或患者身体异常时同时触发影像设备610和手术机器人620急停等。在一些实施例中,第三联锁接口可以用于在预设紧急情况下,控制影像设备和/或手术机器人向与原先的运动方向相反的方向运动相应距离,以远离碰撞对象。从而在紧急状况下避免影像设备610与手术机器人620之间的非预期运动,保证影像引导介入穿刺系统的安全性。
在一些实施例中,第一联锁接口、第二联锁接口以及第三联锁接口可以集成在影像设备或手术机器人的一个接口板上。例如,影像设备610可以通过由两根线缆(分别用于运动锁和急停)和一根通讯协议线路组成的总线与手术机器人620的控制器连接。
在一些实施例中,影像设备与手术机器人的软件连接可以包括两个传输通道:第一传输通道与第二传输通道。
第一传输通道可以用于传输影像数据。例如,影像设备610可以通过第一传输通道将获取的影像数据传输至手术机器人620,以引导手术机器人620根据影像数据执行穿刺操作。
第二传输通道可以用于传输运动状态信息。例如,第二传输通道可以用于将影像设备610的第一运动状态信息传输至手术机器人620,和/或将手术机器人620的第二运动状态信息传输至影像设备610。
通过安全联锁接口建立影像设备与手术机器人之间的硬线传输通道,保证了联锁结构的稳定性。影像设备与手术机器人之间通过软件方式连接,可以实现两个设备之间的信息交互,使得影像设备与手术机器人能够及时获取对方的信息并对手术操作进行调整,有效提高了影像引导介入穿刺系统的准确度及执行效率。
上文已对基本概念做了描述,显然,对于本领域技术人员来说,上述详细披露仅仅作为示例,而并不构成对本说明书的限定。虽然此处并没有明确说明,本领域技术人员可能会对本说明书进行各种修改、改进和修正。该类修改、改进和修正在本说明书中被建议,所以该类修改、改进、修正仍属于本说明书示范实施例的精神和范围。
同时,本说明书使用了特定词语来描述本说明书的实施例。如“一个实施例”、“一实施例”、和/或“一些实施例”意指与本说明书至少一个实施例相关的某一特征、结构或特点。因此,应强调并注意的是,本说明书中在不同位置两次或多次提及的“一实施例”或“一个实施例”或“一个替代性实施例”并不一定是指同一实施例。此外,本说明书的一个或多个实施例中的某些特征、结构或特点可以进行适当的组合。
此外,除非权利要求中明确说明,本说明书所述处理元素和序列的顺序、数字字母的使用、或其他名称的使用,并非用于限定本说明书流程和方法的顺序。尽管上述披露中通过各种示例讨论了一些目前认为有用的发明实施例,但应当理解的是,该类细节仅起到说明的目的,附加的权利要求并不仅限于披露的实施例,相反,权利要求旨在覆盖所有符合本说明书实施例实质和范围的修正和等价组合。例如,虽然以上所描述的系统组件可以通过硬件设备实现,但是也可以只通过软件的解决方案得以实现,如在现有的服务器或移动设备上安装所描述的系统。
同理,应当注意的是,为了简化本说明书披露的表述,从而帮助对一个或多个发明实施例的理解,前文对本说明书实施例的描述中,有时会将多种特征归并至一个实施例、附图或对其的描述中。但是,这种披露方法并不意味着本说明书对象所需要的特征比权利要求中提及的特征多。实际上,实施例的特征要少于上述披露的单个实施例的全部特征。
一些实施例中使用了描述成分、属性数量的数字,应当理解的是,此类用于实施例描述的数字,在一些示例中使用了修饰词“大约”、“近似”或“大体上”来修饰。除非另外说明,“大约”、“近似”或“大体上”表明所述数字允许有±20%的变化。相应地,在一些实施例中,说明书和权利要求中使用的数值参数均为近似值,该近似值根据个别实施例所需特点可以发生改变。在一些实施例中,数值参数应考虑规定的有效数位并采用一般位数保留的方法。尽管本说明书一些实施例中用于确认其范围广度的数值域和参数为近似值,在具体实施例中,此类数值的设定在可行范围内尽可能精确。
针对本说明书引用的每个专利、专利申请、专利申请公开物和其他材料,如文章、书籍、说明书、出版物、文档等,特此将其全部内容并入本说明书作为参考。与本说明书内容不一致或产生冲突的申请历史文件除外,对本说明书权利要求最广范围有限制的文件(当前或之后附加于本说明书中的)也除外。需要说明的是,如果本说明书附属材料中的描述、定义、和/或术语的使用与本说明书所述内容有不一致或冲突的地方,以本说明书的描述、定义和/或术语的使用为准。
最后,应当理解的是,本说明书中所述实施例仅用以说明本说明书实施例的原则。其他的变形也可能属于本说明书的范围。因此,作为示例而非限制,本说明书实施例的替代配置可视为与本说明书的教导一致。相应地,本说明书的实施例不仅限于本说明书明确介绍和描述的实施例。

Claims (20)

  1. 一种影像引导介入穿刺的设备控制方法,包括:
    获取影像设备和/或手术机器人的初始运动状态;
    根据所述初始运动状态,控制所述手术机器人和/或所述影像设备的目标运动状态。
  2. 根据权利要求1所述的方法,所述根据所述初始运动状态,控制所述手术机器人和/或所述影像设备的目标运动状态,包括:
    根据所述影像设备的第一运动状态,控制所述手术机器人的第二运动状态;和/或
    根据所述手术机器人的所述第二运动状态,控制所述影像设备的所述第一运动状态。
  3. 根据权利要求2所述的方法,所述根据所述影像设备的第一运动状态,控制所述手术机器人的第二运动状态,包括:
    根据所述影像设备的第一运动状态,当确定所述影像设备运动时,控制所述手术机器人保持静止。
  4. 根据权利要求2所述的方法,所述根据所述手术机器人的所述第二运动状态,控制所述影像设备的所述第一运动状态,包括:
    根据所述手术机器人的所述第二运动状态,当确定所述手术机器人运动时,控制所述影像设备保持静止。
  5. 根据权利要求2所述的方法,所述根据所述影像设备的第一运动状态,控制所述手术机器人的第二运动状态,包括:
    根据所述影像设备的第一运动状态,确定所述影像设备的第一运动轨迹;
    根据所述第一运动轨迹,控制所述手术机器人进行运动。
  6. 根据权利要求5所述的方法,所述根据所述第一运动轨迹,控制所述手术机器人进行运动,包括:
    根据所述影像设备的所述第一运动轨迹,预测所述影像设备与所述手术机器人之间的距离;
    当所述距离小于距离阈值时,同时控制所述影像设备和所述手术机器人保持静止。
  7. 根据权利要求5所述的方法,所述根据所述第一运动轨迹,控制所述手术机器人进行运动,包括:
    根据所述第一运动轨迹,规划所述手术机器人的第二运动轨迹;
    根据所述第二运动轨迹,控制所述手术机器人进行运动。
  8. 根据权利要求1所述的方法,所述根据所述初始运动状态,控制所述手术机器人和/或所述影像设备的目标运动状态,包括:
    根据所述影像设备和/或所述手术机器人的所述初始运动状态,控制所述手术机器人和/或所述影像设备的运动速度。
  9. 根据权利要求1所述的方法,还包括:
    根据环境信息,控制所述手术机器人和/或所述影像设备的运动速度。
  10. 根据权利要求1所述的方法,还包括:
    获取所述影像设备在结束当前预设流程时生成的第一结束信号,或所述手术机器人在结束当前预设流程时生成的第二结束信号;
    根据所述第一结束信号或所述第二结束信号,控制所述影像设备和/或所述手术机器人进入下一个流程的运动状态。
  11. 根据权利要求10所述的方法,所述根据所述第一结束信号或所述第二结束信号,控制所述影像设备和/或所述手术机器人进入下一个流程的运动状态,包括:
    根据所述第一结束信号控制所述影像设备保持静止,和/或解除所述手术机器人的静止状态;
    根据所述第二结束信号控制所述手术机器人保持静止,和/或解除所述影像设备的静止状态。
  12. 根据权利要求1所述的方法,还包括:
    根据所述手术机器人或所述影像设备的接入请求,控制所述影像设备与所述手术机器人进入一体工作模式,在所述一体工作模式下,所述影像设备和所述手术机器人的运动状态相互关联。
  13. 根据权利要求12所述的方法,还包括:
    获取所述影像设备或所述手术机器人发送的中断请求,并根据所述中断请求控制所述影像设备与所述手术机器人进入独立工作模式,在所述独立工作模式下,所述影像设备和所述手术机器人之间的运动状态相互独立。
  14. 根据权利要求12所述的方法,还包括:
    对所述影像设备和所述手术机器人之间的连接关系进行检测;
    当所述连接关系异常时,同时控制所述影像设备以及所述手术机器人保持静止。
  15. 根据权利要求12所述的方法,还包括:
    响应于所述影像设备或所述手术机器人发生故障,控制所述影像设备与所述手术机器人进入独立工作模式。
  16. 一种影像引导介入穿刺的设备控制系统,包括:
    影像设备,用于获取目标对象的影像数据;
    手术机器人,用于执行穿刺操作;
    控制模块,用于根据所述影像设备和/或所述手术机器人的初始运动状态,控制所述手术机器人和/或所述影像设备的目标运动状态。
  17. 根据权利要求16所述的系统,还包括:
    显示模块,用于接收所述影像设备和/或所述手术机器人输出的控制指令信息以及运动状态信息,并在显示界面进行显示。
  18. 根据权利要求16所述的系统,还包括:
    第一联锁接口,用于控制所述手术机器人建立或中断与所述影像设备的连接关系,以及对所述连接关系进行检测;
    第二联锁接口,用于根据所述手术机器人的运动状态控制所述影像设备的运动状态;
    第三联锁接口,用于在预设紧急情况下控制所述影像设备和/或所述手术机器人保持静止。
  19. 根据权利要求16所述的系统,还包括:
    第一传输通道,用于将所述影像设备获取的所述影像数据传输至所述手术机器人,以使所述手术机器人根据所述影像数据执行所述穿刺操作;
    第二传输通道,用于将所述影像设备的第一运动状态信息传输至所述手术机器人,和/或将所述手术机器人的第二运动状态信息传输至所述影像设备。
  20. 一种计算机可读存储介质,所述存储介质存储计算机指令,当计算机读取存储介质中的计算机指令后,计算机执行如权利要求1-15中任一项所述的方法。
PCT/CN2022/135624 2021-12-30 2022-11-30 一种影像引导介入穿刺的设备控制方法和系统 WO2023124732A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111660066.XA CN114305613B (zh) 2021-12-30 2021-12-30 影像引导介入穿刺系统
CN202111660066.X 2021-12-30

Publications (1)

Publication Number Publication Date
WO2023124732A1 true WO2023124732A1 (zh) 2023-07-06

Family

ID=81018008

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/135624 WO2023124732A1 (zh) 2021-12-30 2022-11-30 一种影像引导介入穿刺的设备控制方法和系统

Country Status (2)

Country Link
CN (1) CN114305613B (zh)
WO (1) WO2023124732A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114305613B (zh) * 2021-12-30 2024-01-30 武汉联影智融医疗科技有限公司 影像引导介入穿刺系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016175489A1 (ko) * 2015-04-30 2016-11-03 현대중공업 주식회사 바늘삽입형 중재시술 로봇의 마스터 콘솔 및 이를 포함하는 로봇시스템
CN107970060A (zh) * 2018-01-11 2018-05-01 上海联影医疗科技有限公司 手术机器人系统及其控制方法
CN110584784A (zh) * 2018-06-13 2019-12-20 上海联影医疗科技有限公司 机器人辅助手术系统
CN111202583A (zh) * 2020-01-20 2020-05-29 上海奥朋医疗科技有限公司 跟踪手术床运动的方法、系统及介质
CN114305613A (zh) * 2021-12-30 2022-04-12 武汉联影智融医疗科技有限公司 影像引导介入穿刺系统

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2561821A1 (en) * 2011-08-25 2013-02-27 Perfint Healthcare Private Limited Tool positioning system
KR20140129702A (ko) * 2013-04-30 2014-11-07 삼성전자주식회사 수술 로봇 시스템 및 그 제어방법
CA2926714C (en) * 2013-10-07 2022-08-02 Technion Research & Development Foundation Ltd. Gripper for robotic image guided needle insertion
CN107645924B (zh) * 2015-04-15 2021-04-20 莫比乌斯成像公司 集成式医学成像与外科手术机器人系统
KR101758741B1 (ko) * 2015-09-09 2017-08-11 울산대학교 산학협력단 의료영상을 사용하는 중재시술 가이드 방법 및 이를 위한 중재시술 시스템
US10610307B2 (en) * 2017-09-28 2020-04-07 General Electric Company Workflow assistant for image guided procedures
US20210275263A1 (en) * 2017-10-16 2021-09-09 Epica International, Inc Robot-assisted surgical guide system for performing surgery
CN110051436B (zh) * 2018-01-18 2020-04-17 上海舍成医疗器械有限公司 自动化协同工作组件及其在手术器械中的应用
CN110664484A (zh) * 2019-09-27 2020-01-10 江苏工大博实医用机器人研究发展有限公司 一种机器人与影像设备的空间注册方法及系统
CN110623731A (zh) * 2019-11-03 2019-12-31 北京诺影医疗科技有限公司 一种高集成度骨科手术机器人
CN211534702U (zh) * 2019-12-23 2020-09-22 武汉联影智融医疗科技有限公司 介入穿刺系统及具有其的诊疗设备
CN111513849B (zh) * 2020-04-30 2022-04-19 京东方科技集团股份有限公司 一种用于穿刺的手术系统、控制方法及控制装置
CN212879562U (zh) * 2020-09-18 2021-04-06 浙江伽奈维医疗科技有限公司 一种远程遥控步进穿刺机器人系统


Also Published As

Publication number Publication date
CN114305613A (zh) 2022-04-12
CN114305613B (zh) 2024-01-30


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22913987

Country of ref document: EP

Kind code of ref document: A1