WO2023274098A1 - Positioning guidance method and system for a movable device, and surgical robot system - Google Patents

Positioning guidance method and system for a movable device, and surgical robot system

Info

Publication number
WO2023274098A1
Authority
WO
WIPO (PCT)
Prior art keywords
movable
coordinate information
movable device
positioning
dimensional coordinate
Prior art date
Application number
PCT/CN2022/101376
Other languages
English (en)
French (fr)
Inventor
费璠
戴婷萍
何超
毛亮亮
Original Assignee
上海微创医疗机器人(集团)股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海微创医疗机器人(集团)股份有限公司
Publication of WO2023274098A1

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0033 - Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B 5/0046 - Arrangements of imaging apparatus in a room, e.g. room provided with shielding or for improved access to apparatus
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101 - Computer-aided simulation of surgical operations
    • A61B 2034/102 - Modelling of surgical devices, implants or prosthesis

Definitions

  • the present application relates to the technical field of medical devices, in particular to a positioning guidance method and system for a movable device, and a surgical robot system.
  • Surgical robots can not only replace the doctor's eyes, enabling the doctor to see three-dimensional images of the organs inside the patient's body and helping the doctor judge the location of the lesion tissue, but can also replace the doctor's hands to complete precise, complex and flexible surgical operations. Moreover, using surgical robots to perform minimally invasive surgery can also reduce surgical risks and the probability of postoperative complications.
  • However, the equipment included in a surgical robot system is large in size, and surgery needs to be performed in a dedicated operating room. If the various pieces of equipment are not placed properly, the surgical robot can easily collide with other equipment and its robotic arms become difficult to extend, which affects the accuracy of the operation; in that case, the positions of the various devices of the surgical robot have to be re-adjusted, which prolongs the operation time.
  • In view of this, the purpose of this application is to provide a positioning guidance method and system for a movable device, and a surgical robot system, aiming to accurately guide the operator in positioning various movable devices, such as the doctor control terminal device, the image trolley and the patient operation terminal device of a surgical robot system, so as to shorten the operation time.
  • To achieve the above purpose, the present application provides a positioning guidance method for a movable device, including:
  • obtaining three-dimensional coordinate information of a fixed device;
  • planning a target placement position of the movable device according to the three-dimensional coordinate information of the fixed device and the posture of the movable device;
  • acquiring a three-dimensional model of the movable device; and
  • placing the three-dimensional model of the movable device at the target placement position and superimposing and displaying it in a real scene.
  • Optionally, the positioning guidance method further includes:
  • obtaining three-dimensional coordinate information of the movable device;
  • planning a moving path of the movable device according to the three-dimensional coordinate information of the movable device, the three-dimensional coordinate information of the fixed device, the posture of the movable device and the target placement position of the movable device, so that the movable device reaches the target placement position when moving along the moving path without interfering with the fixed device; and
  • superimposing and displaying the moving path in the real scene.
  • Optionally, the positioning guidance method further includes:
  • obtaining the three-dimensional coordinate information of the movable device in real time;
  • updating the moving path of the movable device in real time according to the relative relationship between the real-time three-dimensional coordinate information of the movable device, the three-dimensional coordinate information of the fixed device and the three-dimensional coordinate information of the target placement position; and
  • superimposing and displaying the updated moving path in the real scene.
  • Optionally, there are a plurality of movable devices, and the positioning guidance method further includes:
  • obtaining three-dimensional coordinate information of the movable device currently being positioned;
  • planning the moving path of the movable device currently being positioned according to the three-dimensional coordinate information and target placement position of the movable device currently being positioned, the three-dimensional coordinate information of the fixed device, the three-dimensional coordinate information of the other movable devices, and the postures of the movable device currently being positioned and of the other movable devices, so that the movable device currently being positioned reaches the target placement position when moving along the moving path without interfering with the fixed device or the other movable devices during the movement; and
  • superimposing and displaying the moving path in the real scene.
  • Optionally, the positioning guidance method further includes:
  • obtaining the three-dimensional coordinate information of the movable device currently being positioned and of the other movable devices in real time;
  • updating the moving path in real time according to the relative relationship between the real-time three-dimensional coordinate information of the movable device currently being positioned, the real-time three-dimensional coordinate information of the other movable devices, the three-dimensional coordinate information of the fixed device, and the three-dimensional coordinate information of the target placement position of the movable device currently being positioned; and
  • superimposing and displaying the updated moving path in the real scene.
  • Optionally, the positioning guidance method further includes: determining the pose of the movable device from its three-dimensional coordinate information, performing virtual-real fusion registration on the three-dimensional model of the movable device and the pose of the movable device, and superimposing and displaying the result in the real scene.
  • Optionally, the positioning guidance method further includes: obtaining, in real time, the deviation between the three-dimensional coordinate information of the movable device and the three-dimensional coordinate information of the target placement position, and judging, according to the deviation, whether the movable device has reached the target placement position.
  • Optionally, there are a plurality of movable devices, and the positioning guidance method further includes: planning a positioning sequence of the plurality of movable devices according to a preset rule.
  • the present application also provides a positioning guidance system, including:
  • the positioning device is used to obtain the three-dimensional coordinate information of the fixed equipment and the three-dimensional coordinate information of the movable equipment;
  • an augmented reality device, used to superimpose and display the three-dimensional model of the movable device and the moving path of the movable device in a real scene; and
  • the control unit is connected in communication with the positioning device and the augmented reality device, and is configured to execute the position guidance method for the mobile device as described in any one of the preceding items.
  • Optionally, the control unit is configured to establish, based on the positioning device, a mapping relationship between the coordinate system of the augmented reality device, the coordinate system of the movable device and the coordinate system of the fixed device, and then superimpose and display the three-dimensional model of the movable device and the moving path of the movable device in the real scene acquired by the augmented reality device.
  • Optionally, the control unit is configured to establish, in the world coordinate system, a mapping relationship between the coordinate system of the positioning device and the coordinate system of the augmented reality device, a mapping relationship between the coordinate system of the positioning device and the coordinate system of the movable device, and a mapping relationship between the coordinate system of the positioning device and the coordinate system of the fixed device, and then establish the mapping relationship between the coordinate system of the augmented reality device, the coordinate system of the movable device and the coordinate system of the fixed device.
  • Optionally, the positioning device is arranged on the augmented reality device; the coordinate system of the positioning device establishes a mapping relationship with the coordinate system of the augmented reality device through a mechanical position.
  • Optionally, the positioning device is separate from the augmented reality device; the three-dimensional coordinate information of the augmented reality device is obtained through the positioning device, and a mapping relationship between the coordinate system of the positioning device and the coordinate system of the augmented reality device is thereby established.
  • The present application also provides a surgical robot system, including the positioning guidance system described in any one of the preceding items and a movable device, the movable device including at least one of a doctor control terminal device, an image trolley and a patient operation terminal device.
  • the present application also provides a computer-readable storage medium, on which a program is stored, and when the program is executed, the position guidance method for a mobile device as described in any one of the preceding items is executed.
  • the positioning guidance method and system of the mobile device and the surgical robot system of the present application have the following advantages:
  • The aforementioned positioning guidance method for a movable device includes: obtaining three-dimensional coordinate information of a fixed device; planning a target placement position of the movable device according to the three-dimensional coordinate information of the fixed device and the posture of the movable device; acquiring a three-dimensional model of the movable device; and placing the three-dimensional model of the movable device at the target placement position and superimposing and displaying it in the real scene.
  • By adopting the positioning guidance method for a movable device provided in this application, the three-dimensional model of the movable device and the target placement position can be displayed in combination with the actual scene, which makes it convenient to guide the operator to move the movable device to the target placement position and complete the positioning operation.
  • When the positioning guidance method is applied to the positioning of medical equipment used in surgery, the positioning of the equipment can be completed quickly and accurately, which avoids mutual interference between the various devices during the operation and helps to save operation time.
  • Fig. 1 is a schematic diagram of an application scenario of a surgical robot system provided by the present application according to an embodiment
  • Fig. 2 is a schematic diagram of the application scene of the surgical robot system provided by the present application according to an embodiment. The difference between Fig. 2 and Fig. 1 is that the viewing directions are different;
  • Fig. 3 is a flow chart of a positioning guidance method for a mobile device according to an embodiment of the present application
  • Fig. 4 is a schematic diagram of obtaining a three-dimensional model of a mobile device by using a binocular vision device in a positioning guidance method for a mobile device according to an embodiment of the present application;
  • FIG. 5 is a schematic diagram of the principle of binocular stereo vision three-dimensional measurement provided by the present application according to an embodiment
  • Fig. 6 is a schematic diagram of the moving direction of the doctor control terminal device when the doctor control terminal device is placed in the positioning guidance method for the movable device according to an embodiment of the present application;
  • Fig. 7 is a schematic diagram of the moving direction of the image trolley when the image trolley is placed in the position guidance method for the movable device according to an embodiment of the present application;
  • Fig. 8 is a schematic diagram of the moving direction of the patient's implementing end and the moving direction of the supporting mechanism when the patient's implementing end device is positioned in the method for guiding the positioning of the movable device according to an embodiment of the present application;
  • Fig. 9 is a schematic diagram of obtaining the spatial mapping relationship between the movable device and its three-dimensional model in the positioning guidance method for a movable device according to an embodiment of the present application, in which the binocular vision device and the AR glasses are separate from each other;
  • FIG. 10 is a schematic structural diagram of mutually independent binocular vision devices and AR glasses in a positioning guidance method for a mobile device according to an embodiment of the present application;
  • Fig. 11 is a schematic diagram of obtaining the spatial mapping relationship between the movable device and its three-dimensional model in the positioning guidance method for a movable device according to an embodiment of the present application, in which the binocular vision device is integrated on the AR glasses;
  • FIG. 12 is a schematic structural diagram of a binocular vision device integrated into AR glasses in a positioning guidance method for a mobile device according to an embodiment of the present application;
  • Fig. 13a is a schematic diagram of judging whether the doctor's control terminal device has reached the target placement position in the positioning guidance method of the movable device according to an embodiment of the present application;
  • Fig. 13b is a schematic diagram of judging whether the patient-executing end device has reached the target placement position in the positioning guidance method of the movable device according to an embodiment of the present application;
  • Fig. 13c is a schematic diagram of judging whether the image trolley has reached the target placement position in the position guidance method of the mobile device according to an embodiment of the present application;
  • Fig. 14 is a flow chart of positioning the doctor's control device, the patient's execution device, and the image trolley in the surgical robot system according to an embodiment of the present application.
  • [The reference signs are described as follows]: 110 - doctor control terminal device, 120 - patient execution end device, 130 - image trolley; 11 - hospital bed, 12 - surgical lighting lamp, 13 - ventilator, 14 - target; 200 - positioning device, 300 - augmented reality device.
  • In addition, each of the embodiments described below has one or more technical features, but this does not mean that a user of the present application must implement all the technical features of any embodiment at the same time, or can only implement some or all of the technical features of different embodiments separately. In other words, provided that implementation is possible, those skilled in the art can, according to the disclosure of the application and depending on design specifications or implementation requirements, selectively implement some or all of the technical features of any embodiment, or selectively implement a combination of some or all of the technical features of multiple embodiments, thereby increasing the flexibility of implementing the present application.
  • the singular forms “a”, “an” and “the” include plural objects, and the plural form “a plurality” includes two or more objects, unless the content clearly states otherwise.
  • The term "or" is generally used in the sense including "and/or" unless the content clearly indicates otherwise, and the terms "install", "connect" and "couple" should be understood in a broad sense: for example, a connection may be a fixed connection, a detachable connection or an integral connection; it may be a mechanical connection or an electrical connection; it may be a direct connection or an indirect connection through an intermediary; and it may be internal communication between two elements or an interaction relationship between two elements. Those of ordinary skill in the art can understand the specific meanings of the above terms in this application according to the specific situation.
  • The core idea of the present application is to provide a positioning guidance method for a movable device, a positioning guidance system for a movable device, a surgical robot system and a computer-readable storage medium, aiming to help the operator quickly and accurately place the various devices of the surgical robot system, so as to avoid equipment interference during the operation, which would be unfavorable to the operation, and to shorten the operation time.
  • To realize the above idea, the positioning guidance method for a movable device includes: obtaining three-dimensional coordinate information of a fixed device; planning the target placement position of the movable device according to the three-dimensional coordinate information of the fixed device and the posture of the movable device; acquiring the three-dimensional model of the movable device; and placing the three-dimensional model of the movable device at the target placement position and superimposing and displaying it in the real scene, thereby guiding the operator in positioning the movable device.
  • the three-dimensional model of the mobile device is superimposed and displayed in the real scene through an augmented reality device.
  • the operator can position the movable device according to the three-dimensional model of the movable device displayed in the augmented reality device and the target placement position.
  • It should be understood that, for different application scenarios, the movable device refers to a device whose pose needs to be adjusted before a specific operation is performed so that it is in a suitable pose, while the fixed device refers to a device whose pose has already been fixed before this specific operation is performed.
  • the pose includes the position and posture of the device.
  • The initially acquired three-dimensional coordinate information of the fixed device may be the three-dimensional coordinate information of the fixed device in the world coordinate system, and the target placement position of the movable device in the world coordinate system can then be planned.
  • the basic standard should be that the movable device does not interfere with the fixed device when it is in the target placement position.
  • The posture of the movable device can be determined by those skilled in the art according to the specific type of the movable device, or can be determined according to the length, height and width of the movable device (that is, according to the three-dimensional coordinate information of the movable device).
  • Correspondingly, the target placement position includes the three-dimensional coordinate information of the movable device in the world coordinate system, that is, the specific position of the movable device in the X, Y and Z directions.
  • Moreover, once the target placement position has been planned, the mapping relationship between the three-dimensional coordinate system of the target placement position and the three-dimensional coordinate system of the fixed device in the world coordinate system can be obtained.
  • The three-dimensional model of the movable device can be obtained in any suitable way, and by establishing, in the world coordinate system, the mapping relationship between the coordinate system of the augmented reality device and the coordinate system of the three-dimensional model of the movable device, the three-dimensional model of the movable device and the target placement position are unified into the coordinate system of the augmented reality device, so that the three-dimensional model of the movable device can be placed at the target placement position and displayed through the augmented reality device.
  • Further, after the target placement position of the movable device is obtained, the three-dimensional coordinate information of the movable device may also be obtained, and the moving path of the movable device is planned according to the three-dimensional coordinate information of the movable device, the three-dimensional coordinate information of the fixed device, the posture of the movable device and the target placement position of the movable device, so that the movable device can reach the target placement position when moving along the moving path without interfering with the fixed device during the movement.
  • the moving path is also superimposed and displayed in the real scene through the augmented reality device.
  • the acquired three-dimensional coordinate information of the mobile device may be three-dimensional coordinate information of the mobile device in a world coordinate system.
  • By establishing, in the world coordinate system, the mapping relationship between the coordinate system of the augmented reality device and the three-dimensional coordinate system of the movable device, the three-dimensional model of the movable device is unified with the actual position of the movable device into the coordinate system of the augmented reality device and displayed by the augmented reality device.
  • the pose of the movable device may be determined according to the three-dimensional coordinate information of the movable device, and virtual-real fusion registration is performed on the three-dimensional model of the movable device and the pose of the movable device.
  • the operator can be guided to move the movable device along the moving path according to the display of the augmented reality device for positioning.
  • Moreover, to ensure that the operator can smoothly move the movable device to the target placement position, the positioning guidance method preferably further includes: acquiring the three-dimensional coordinate information of the movable device in real time, updating the moving path of the movable device in real time according to the relative relationship between the real-time three-dimensional coordinate information of the movable device, the three-dimensional coordinate information of the fixed device and the three-dimensional coordinate information of the target placement position, and superimposing and displaying the updated moving path in the real scene through the augmented reality device.
  • Further, for application scenarios that include multiple movable devices, the positioning guidance method further includes: planning the moving path of the movable device currently being positioned according to the three-dimensional coordinate information and target placement position of the movable device currently being positioned, the three-dimensional coordinate information of the fixed device, the three-dimensional coordinate information of the other movable devices, and the postures of the movable device currently being positioned and of the other movable devices, so that the movable device currently being positioned can reach the target placement position when moving along the moving path without interfering with the fixed device or the other movable devices during the movement.
  • Still further, for application scenarios that include multiple movable devices, the positioning guidance method preferably further includes: acquiring the three-dimensional coordinate information of the movable device currently being positioned and of the other movable devices in real time, and updating the moving path in real time according to the relative relationship between the real-time three-dimensional coordinate information of the movable device currently being positioned, the real-time three-dimensional coordinate information of the other movable devices, the three-dimensional coordinate information of the fixed device, and the three-dimensional coordinate information of the target placement position of the movable device currently being positioned.
  • Furthermore, the positioning guidance method may also include: acquiring, in real time, the deviation between the three-dimensional coordinate information of the movable device and the three-dimensional coordinate information of the target placement position, and judging, according to the deviation, whether the movable device has reached the target placement position. Accurately judging whether the movable device has reached the target placement position through position calculation avoids inaccurate placement of the movable device caused by subjective errors in manual judgment.
  • FIG. 1 and FIG. 2 show schematic diagrams of application scenarios of the surgical robot system.
  • the surgical robot system includes a doctor control terminal device 110 , a patient execution terminal device 120 , an image trolley 130 and other equipment.
  • The doctor control terminal device 110 is provided with a master manipulator.
  • The patient execution end device 120 includes at least one image arm (not marked in the figure) and at least one tool arm (not marked in the figure); the image arm is used to mount an endoscope, and the tool arm is used to mount surgical instruments.
  • the endoscope and the surgical instrument are respectively used to enter the patient's body from a wound on the patient's body, wherein the endoscope is used to obtain information on human tissue, information on surgical instruments in the human body, and information on the operating environment.
  • Surgical instruments are used to perform surgical procedures.
  • Further, the master manipulator is communicatively connected with the tool arm and the surgical instrument, and the master manipulator forms a master-slave control relationship with the tool arm and the surgical instrument. That is, the tool arm moves according to the movement of the master manipulator during the operation, and the surgical instrument executes the motion instructions associated with the master manipulator. In other words, during the operation the doctor operates the master manipulator so that the surgical instrument performs the corresponding surgical operation.
  • the image trolley 130 is used to display the situation inside the human body, which is convenient for nurses to observe.
  • The patient is supported on the hospital bed 11, and the operating room is also equipped with equipment such as a surgical lighting lamp 12 and a ventilator 13.
  • Before the operation is performed, the hospital bed 11, the surgical lighting lamp 12, the ventilator 13 and other such equipment are already in a fixed pose, while the doctor control terminal device 110, the patient execution end device 120, the image trolley 130 and other such equipment need to be properly positioned according to the poses of the hospital bed 11, the surgical lighting lamp 12, the ventilator 13 and the other equipment, so as to ensure that the image arm and the tool arm of the patient execution end device 120 can be fully deployed while avoiding interference with the various devices (including the fixed devices and the other movable devices) in the operating room.
  • the fixed equipment includes at least the hospital bed 11 , the surgical lighting 12 , the ventilator 13 and other equipment.
  • In this scenario, the movable devices include the doctor control terminal device 110, the patient execution end device 120 and the image trolley 130.
  • this article will describe in detail the application scenario of the positioning guidance method for a movable device provided by the present application by taking the preoperative preparation for a surgical operation by using a surgical robot system as an example.
  • the guidance method for the mobile device can also be applied to other systems, which is not limited in this application.
  • Fig. 3 shows a flow chart of a positioning guidance method for a mobile device provided by the present application according to a non-limiting embodiment.
  • step S110 is first performed: obtaining the three-dimensional coordinate information of the fixed device and the mobile device.
  • the three-dimensional coordinate information of the fixed device and the movable device may be three-dimensional coordinate information in a world coordinate system.
  • The three-dimensional coordinate information of the hospital bed 11, the surgical lighting lamp 12, the ventilator 13, the doctor control terminal device 110, the patient execution end device 120 and the image trolley 130 can all be obtained based on a positioning device 200 (as shown in Fig. 4).
  • the positioning apparatus 200 is configured to obtain target information of the fixed device and the movable device, and the target information is used to obtain three-dimensional coordinate information of the corresponding device.
  • the positioning device 200 is, for example, a binocular vision device.
  • the three-dimensional geometric information of the measured object can be obtained from multiple images based on the principle of parallax.
  • Specifically, binocular vision generally acquires two digital images of the measured object from different angles simultaneously with two cameras, or obtains two digital images of the measured object from different angles with a single camera at different times, and restores the three-dimensional geometric information of the measured object based on the parallax principle so as to obtain the position of the measured object. That is to say, when the three-dimensional coordinate information of the measured object is acquired by the binocular vision device, the target information is the image information of the measured object.
  • Fig. 5 schematically shows the principle of three-dimensional measurement of a binocular vision device.
  • The point P(x, y, z) is a feature point on the measured object, O_l is the optical center of the left camera, and O_r is the optical center of the right camera.
  • the optical center distance of the two cameras is the baseline b, and the focal lengths of the two cameras are both f.
  • the two cameras shoot the same feature point P(x,y,z) of the measured object at the same moment, and obtain the following relationship according to the principle of similar triangles:
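  • The relationship itself is not reproduced in this text. A standard reconstruction of the similar-triangle relation for rectified cameras, taking the optical center of the left camera as the origin of the measurement frame (an assumption; the original figure may use a different convention) and writing (x_l, y_l) and (x_r, y_r) for the projections of P in the left and right images, is:

$$x_l = \frac{f\,x}{z},\qquad y_l = y_r = \frac{f\,y}{z},\qquad x_r = \frac{f\,(x-b)}{z}$$

  • Solving these relations for the camera-frame coordinates of P gives:

$$z = \frac{f\,b}{x_l - x_r},\qquad x = \frac{x_l\,z}{f},\qquad y = \frac{y_l\,z}{f}$$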
  • the three-dimensional coordinate information of the feature point P on the measured object in the coordinate system of the binocular vision device (ie, the positioning device 200 ) can be obtained.
  • the three-dimensional coordinate information of other feature points on the measured object in the coordinate system of the binocular vision device is obtained, and then the three-dimensional coordinate information of the measured object in the coordinate system of the binocular vision device is obtained.
  • the mapping relationship between the coordinate system of the positioning device 200 and the world coordinate system can be obtained through the rotation matrix R and the translation vector t.
  • The coordinates P(x_c, y_c, z_c) of the measured point P in the coordinate system of the positioning device 200 and its coordinates P(x_w, y_w, z_w) in the world coordinate system satisfy the following relationship:
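  • The relationship referred to here is omitted from this text; given the definitions of R, t and M_b below, it is presumably the standard homogeneous rigid-body transform:

$$\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} = M_b \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix},\qquad M_b = \begin{bmatrix} R & t \\ \mathbf{0} & 1 \end{bmatrix}$$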
  • where R is a 3×3 rotation matrix, t is a 3×1 translation vector, 0 is the row vector (0, 0, 0), and M_b is a 4×4 matrix, also called the camera external parameter matrix.
  • the parameter matrix can be obtained through an existing camera calibration method, which will not be described in detail in this application.
  • the three-dimensional coordinate information of any measured object in the world coordinate system can be obtained.
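  • As an illustration only (not part of the original disclosure), the following sketch combines the two relations above: it triangulates a feature point from a rectified stereo pair and maps it from the positioning-device frame to the world frame. The function and variable names are hypothetical.

```python
import numpy as np

def triangulate(xl: float, yl: float, xr: float, f: float, b: float) -> np.ndarray:
    """Camera-frame coordinates of a feature point from a rectified stereo pair.

    xl, yl: pixel coordinates in the left image (relative to the principal point)
    xr:     x pixel coordinate of the same feature point in the right image
    f:      focal length (in pixels); b: baseline between the two optical centers
    """
    disparity = xl - xr
    z = f * b / disparity          # depth from the similar-triangle relation
    x = xl * z / f
    y = yl * z / f
    return np.array([x, y, z])

def camera_to_world(p_cam: np.ndarray, M_b: np.ndarray) -> np.ndarray:
    """Map a point from the positioning-device (camera) frame to the world frame.

    M_b is the 4x4 extrinsic matrix satisfying [p_cam; 1] = M_b @ [p_world; 1],
    so the world coordinates are obtained with its inverse.
    """
    p_h = np.append(p_cam, 1.0)
    p_world_h = np.linalg.inv(M_b) @ p_h
    return p_world_h[:3]

# Example with identity extrinsics, i.e. the camera frame coincides with the world frame.
M_b = np.eye(4)
p_cam = triangulate(xl=120.0, yl=40.0, xr=100.0, f=800.0, b=0.1)
print(camera_to_world(p_cam, M_b))
```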
  • In some embodiments, the hospital bed 11, the surgical lighting lamp 12, the ventilator 13, the doctor control terminal device 110, the patient execution end device 120 and the image trolley 130 each include a target 14 that can be identified by the positioning device 200 (marked in Fig. 4). The positioning device 200 acquires the image information of the targets 14, and the three-dimensional coordinate information of the doctor control terminal device 110, the patient execution end device 120 and the image trolley 130 can then be acquired based on the image information of the targets 14. It can be understood that the operation of acquiring the three-dimensional coordinate information of the corresponding device based on the image information of the target 14 is generally performed by a computer program.
  • step S120 is executed: planning the target placement position of the movable device and the moving path of the movable device.
  • the target placement position of the movable device is planned according to the three-dimensional coordinate information of the fixed device and the posture of the movable device.
  • The moving path of the movable device is planned according to the three-dimensional coordinate information of the fixed device, the posture of the movable device and the target placement position of the movable device.
  • the posture of the movable device may be determined according to the specific type of the movable device, or determined according to the three-dimensional coordinate information of the movable device.
  • the target placement position and the movable path can be planned by a computer program according to any suitable planning method, alternatively, they can also be planned manually by an operator.
  • In this way, when one of the movable devices is placed at its corresponding target placement position, that movable device will not interfere with the fixed device or the other movable devices.
  • Correspondingly, when planning the movement path of the movable device currently being positioned, the three-dimensional coordinate information of the other movable devices and the poses of the movable device currently being positioned and of the other movable devices are also taken into account, so that the movable device currently being positioned does not interfere with the other movable devices when it moves along the moving path.
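  • A minimal sketch of this non-interference criterion, using axis-aligned bounding boxes as a stand-in for the device footprints (the actual planning method is not specified in the source, so the names and the box representation are assumptions):

```python
from typing import List, Tuple

Box = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max) footprint on the floor

def overlaps(a: Box, b: Box, clearance: float = 0.0) -> bool:
    """True if two footprints (expanded by a safety clearance) intersect."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return not (ax1 + clearance < bx0 or bx1 + clearance < ax0 or
                ay1 + clearance < by0 or by1 + clearance < ay0)

def placement_is_valid(candidate: Box, fixed: List[Box], other_movable: List[Box]) -> bool:
    """A candidate target placement is valid only if it interferes with nothing."""
    return all(not overlaps(candidate, obstacle) for obstacle in fixed + other_movable)
```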
  • the "other movable equipment” here refers to the movable equipment other than the movable equipment currently being positioned when there are multiple movable equipment.
  • For example, when the movable devices include the doctor control terminal device 110, the patient execution end device 120 and the image trolley 130, and the doctor control terminal device 110 is being positioned, the other movable devices refer to the patient execution end device 120 and the image trolley 130.
  • When there is only one movable device, there are no other movable devices while that movable device is being positioned.
  • When positioning the doctor control terminal device 110, the doctor control terminal device 110 is mainly driven to move forward, backward, left and right on the horizontal plane (that is, along the X direction and/or the Y direction in Fig. 6), so that it is in a suitable horizontal position.
  • When the image trolley 130 is positioned, the image trolley 130 is likewise mainly driven to move forward, backward, left and right on the horizontal plane (along the X direction and/or the Y direction in Fig. 7), so that it is in a suitable horizontal position.
  • the "target placement position" mentioned herein includes the three-dimensional coordinate information of the doctor-end control device 110, the patient-end device 120, and the image trolley 130 in the world coordinate system (i.e.
  • the specific position of the movable device in the X, Y, and Z directions is known.
  • the three-dimensional coordinate information of the target placement position in the world coordinate system is known, and the relative relationship between it and the coordinates of the fixed device is known.
  • step S130 is executed: acquiring the three-dimensional model of the movable device.
  • the three-dimensional model of the movable device can be established and stored in advance, and can be called directly when executing the position guidance method of the movable device.
  • Alternatively, the three-dimensional model of the movable device may be established, when the positioning guidance method is executed, based on the three-dimensional coordinate information of the movable device obtained by the positioning device, and the mapping relationship between the coordinate system of the three-dimensional model of the movable device and the coordinate system of the movable device may be established at the same time.
  • After the three-dimensional model is called, step S140 is performed: virtual-real fusion registration is performed on the three-dimensional model of the movable device and the pose of the movable device, and the mapping relationship between the coordinate system of the movable device and the coordinate system of the three-dimensional model of the movable device is established.
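  • The registration itself is not detailed in the source. One common way to establish such a mapping from corresponding points (for example, marker points of the target 14 measured by the positioning device and the same points in the three-dimensional model) is a least-squares rigid registration (Kabsch/SVD); the sketch below is offered as an assumption for illustration only:

```python
import numpy as np

def register_rigid(model_pts: np.ndarray, measured_pts: np.ndarray) -> np.ndarray:
    """Find R, t such that R @ model_pts[i] + t ~= measured_pts[i] in the least-squares sense.

    model_pts, measured_pts: (N, 3) arrays of corresponding points, N >= 3.
    Returns a 4x4 homogeneous transform from model coordinates to device coordinates.
    """
    cm = model_pts.mean(axis=0)
    cd = measured_pts.mean(axis=0)
    H = (model_pts - cm).T @ (measured_pts - cd)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = cd - R @ cm
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```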
  • Then step S150 is executed: the three-dimensional model of the movable device is placed at the target placement position and superimposed and displayed in the real scene through the augmented reality device 300 (marked in Fig. 4 and Fig. 9), and meanwhile the moving path of the movable device is superimposed and displayed in the real scene through the augmented reality device 300.
  • the pose of the movable device may be determined according to the three-dimensional coordinate information of the movable device.
  • the augmented reality device 300 is, for example, AR glasses.
  • Specifically, the positioning device 200 can be used to establish a mapping relationship between the coordinate system of the augmented reality device 300 and the coordinate system of the movable device in the world coordinate system; this mapping relationship is then combined with the mapping relationship between the coordinate system of the movable device and the coordinate system of its three-dimensional model to establish the mapping relationship between the coordinate system of the augmented reality device 300 and the coordinate system of the three-dimensional model of the movable device.
  • Using the positioning device 200 to establish the mapping relationship between the coordinate system of the augmented reality device 300 and the coordinate system of the movable device specifically includes: establishing, in the world coordinate system, the mapping relationship between the coordinate system of the positioning device 200 and the coordinate system of the augmented reality device 300 and the mapping relationship between the coordinate system of the positioning device 200 and the coordinate system of the movable device, and then establishing the mapping relationship between the coordinate system of the augmented reality device 300 and the coordinate system of the movable device.
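  • In practice such chained mappings are usually composed as homogeneous transforms. A minimal sketch with hypothetical names, assuming the positioning device 200 reports the pose of both the augmented reality device 300 and the movable device in its own coordinate system:

```python
import numpy as np

def compose_mappings(T_loc_ar: np.ndarray, T_loc_dev: np.ndarray) -> np.ndarray:
    """Mapping from the movable-device frame to the augmented-reality frame.

    T_loc_ar:  4x4 pose of the AR device 300 in the positioning-device frame
    T_loc_dev: 4x4 pose of the movable device in the positioning-device frame
    Chaining the two measurements removes the positioning-device frame from the result.
    """
    return np.linalg.inv(T_loc_ar) @ T_loc_dev

# A point known in the movable-device frame can then be drawn in the AR frame:
# p_ar_h = compose_mappings(T_loc_ar, T_loc_dev) @ np.append(p_dev, 1.0)
```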
  • As shown in Fig. 9 and Fig. 10, when the augmented reality device 300 and the positioning device 200 are two separate devices, the augmented reality device 300 includes a target (not shown in the figure); the positioning device 200 captures a digital image of this target, from which the three-dimensional coordinate information of the augmented reality device 300 in the world coordinate system can be obtained, and the mapping relationship between the coordinate system of the positioning device 200 and the coordinate system of the augmented reality device 300 is thereby established.
  • As shown in Fig. 11 and Fig. 12, when the positioning device 200 is arranged on the augmented reality device 300, the mechanical position between the positioning device 200 and the augmented reality device 300 is fixed, so that the mapping relationship between the coordinate system of the positioning device 200 and the coordinate system of the augmented reality device 300 can be established according to the mechanical position of the positioning device 200 relative to the augmented reality device 300.
  • the mechanical position is a known fixed relative position between the positioning device 200 and the augmented reality device 300 .
  • Based on the three-dimensional coordinate information of the target placement position in the coordinate system of the positioning device 200 and the mapping relationship between the coordinate system of the positioning device 200 and the coordinate system of the augmented reality device 300, the three-dimensional coordinate information of the target placement position in the coordinate system of the augmented reality device 300 can be obtained.
  • In this way, the target placement position, the three-dimensional model of the movable device and the moving path are all unified into the coordinate system of the augmented reality device 300 and can be displayed in that coordinate system, so that the three-dimensional model of the movable device is placed at the target placement position and superimposed and displayed in the real scene, and the moving path of the movable device is superimposed and displayed in the real scene through the augmented reality device 300.
  • the sequence of the step S110 to the step S150 shown in FIG. 3 is not fixed, for example, the step S110 may be executed separately. Or, before performing the step S120, the step S130 and the step S140 may be performed first. Alternatively, the step S130 is executed first, then the step S120 is executed, and then the step S140 is executed. That is to say, as long as the three-dimensional model of the mobile device at the target placement position and the moving path can be finally displayed on the augmented reality apparatus 300 .
  • the operator can position the movable device according to the display of the augmented reality device 300 .
  • the method for guiding the positioning of the movable device further includes step S160: updating the moving path of the movable device.
  • Specifically, the three-dimensional coordinate information of the current position of the movable device being positioned is obtained in real time, the moving path of the movable device is updated according to the three-dimensional coordinate information of the movable device currently being positioned, the three-dimensional coordinate information of the fixed device, the three-dimensional coordinate information of the other movable devices, and the three-dimensional coordinate information of the target placement position corresponding to the movable device being positioned, and the augmented reality device 300 is used to superimpose and display the updated moving path in the real scene.
  • In this way, when the movable device being positioned moves along the moving path, it can smoothly reach the target placement position without interfering with the fixed device or the other movable devices during the movement.
  • It should be noted that the coordinate systems of the multiple movable devices can also be mapped to one another based on the positioning device 200; specifically, the mapping relationship between the coordinate system of the positioning device 200 and the coordinate system of each movable device is established in the world coordinate system, and the mapping relationships between the coordinate systems of the movable devices are then established.
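  • Step S160 amounts to a closed loop of measuring and re-planning. A schematic sketch of that loop is given below; the planner itself and all names are placeholders, since the source does not prescribe a particular planning algorithm:

```python
import time
import numpy as np

def guide_to_target(get_pose, plan_path, display, target, tol=0.05, period=0.1):
    """Repeatedly re-plan and display the moving path until the device reaches the target.

    get_pose():           returns the current 3D coordinates of the device being positioned
    plan_path(p, target): returns a collision-free path from p to target (placeholder)
    display(path):        superimposes the path in the real scene via the AR device
    """
    while True:
        p = np.asarray(get_pose())
        if np.linalg.norm(p - np.asarray(target)) <= tol:   # deviation check (cf. step S170)
            return
        display(plan_path(p, target))
        time.sleep(period)
```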
  • In addition, the positioning guidance method for a movable device further includes step S170: obtaining, in real time, the deviation between the three-dimensional coordinate information of the movable device currently being positioned and the three-dimensional coordinate information of the target placement position, and determining, according to the obtained deviation, whether the movable device has reached the target placement position.
  • In Fig. 13a, the solid line indicates the current position of the doctor control terminal device 110, and the dotted line indicates the target placement position of the doctor control terminal device 110.
  • If the current position of the doctor control terminal device 110 does not coincide with the target placement position (that is, the solid-line area shown in Fig. 13a does not coincide with the dotted-line area), so that there is a deviation between the current three-dimensional coordinate information of the doctor control terminal device 110 and the three-dimensional coordinate information of the target placement position, it may be determined that the doctor control terminal device 110 has not yet reached the target placement position.
  • If the displacement required of the doctor control terminal device 110 is S1, the doctor control terminal device 110 is moved accordingly so that its current position coincides with the target placement position, that is, the current three-dimensional coordinate information of the doctor control terminal device 110 coincides with the three-dimensional coordinate information of the target placement position, and the positioning of the doctor control terminal device 110 is thus completed.
  • the positioning accuracy of the movable device is improved.
  • the process of judging whether the positioning of the patient executive device 120 and the image trolley 130 is completed is similar (refer to FIG. 13b and FIG. 13c ), and will not be repeated here.
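  • A minimal sketch of the deviation test used to decide whether a device has reached its target placement position; the tolerance value is an assumption, since the source only states that the deviation is evaluated:

```python
import numpy as np

def remaining_displacement(device_xyz, target_xyz) -> np.ndarray:
    """Displacement (such as S1 in Fig. 13a) still needed to reach the target position."""
    return np.asarray(target_xyz) - np.asarray(device_xyz)

def has_reached_target(device_xyz, target_xyz, tolerance: float = 0.05) -> bool:
    """True once the deviation between current and target coordinates is within tolerance."""
    return bool(np.linalg.norm(remaining_displacement(device_xyz, target_xyz)) <= tolerance)
```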
  • It should be noted that, in step S120, the target placement positions and moving paths of the doctor control terminal device 110, the patient execution end device 120 and the image trolley 130 can be planned simultaneously, and the operator can then position the doctor control terminal device 110, the patient execution end device 120 and the image trolley 130 respectively according to the multiple target placement positions and multiple moving paths displayed by the augmented reality device 300. That is to say, the operator can complete the positioning of each movable device in sequence according to actual needs.
  • the positioning guidance method of the mobile device further includes step S151: planning the positioning sequence of all the mobile devices according to preset rules.
  • the preset rule is, for example, to determine the positioning order according to the collision probabilities between the movable devices and the fixed devices. Specifically, the equipment with a high collision probability may be positioned first, and then the equipment with a low collision probability may be positioned.
  • the collision probability of multiple movable devices may be predetermined by the operator, or judged by a computer program according to the volume of the movable devices or other factors. In some specific embodiments, it can be determined that the collision probability of the doctor control terminal device 110 , the collision probability of the image trolley 130 and the collision probability of the patient execution terminal device 120 decrease in sequence.
  • In such a case, the doctor control terminal device 110 is positioned first, then the image trolley 130 is positioned, and finally the patient execution end device 120 is positioned. This ensures that the image arm and the tool arm of the patient execution end device 120 can be fully extended while collisions between the various devices are avoided.
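  • The preset rule described above reduces to sorting the movable devices by their (predetermined or estimated) collision probability. A trivial sketch; the probability values are illustrative assumptions, as the embodiment only states that the probabilities of the three devices decrease in this order:

```python
def plan_positioning_order(collision_probability: dict) -> list:
    """Return device names ordered from highest to lowest collision probability."""
    return sorted(collision_probability, key=collision_probability.get, reverse=True)

order = plan_positioning_order({
    "doctor control terminal device 110": 0.9,   # illustrative values only
    "image trolley 130": 0.6,
    "patient execution end device 120": 0.3,
})
# -> ['doctor control terminal device 110', 'image trolley 130', 'patient execution end device 120']
```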
  • the step S151 is executed after the step S150 and before the step S160.
  • In this case, positioning can be carried out according to the process shown in Fig. 14:
  • Step S210: position the doctor control terminal device 110. Specifically, step S160 is performed on the doctor control terminal device 110 and the doctor control terminal device 110 is moved, and step S170 is executed while the doctor control terminal device 110 is being moved, until the positioning of the doctor control terminal device 110 is completed.
  • Step S220: position the image trolley 130. Specifically, step S160 is performed on the image trolley 130 and the image trolley 130 is moved, and step S170 is executed while the image trolley 130 is being moved, until the positioning of the image trolley 130 is completed.
  • Step S230: position the patient execution end device 120. Specifically, step S160 is performed on the patient execution end device 120 and the patient execution end device 120 is moved, and step S170 is executed while the patient execution end device 120 is being moved, until the positioning of the patient execution end device 120 is completed.
  • the execution order of the step S151 can also be adjusted as required, for example, it can be executed before the step S120.
  • Alternatively, in step S120, the target placement position and moving path of only one movable device may be planned at a time; that is, the target placement position and moving path of the doctor control terminal device 110 are planned first and its positioning guidance is completed, then the target placement position and moving path of the image trolley 130 are planned and its positioning guidance is completed, and finally the target placement position and moving path of the patient execution end device 120 are planned and its positioning guidance is completed.
  • the present application also provides a positioning guidance system for a mobile device, which includes the aforementioned positioning device 200, an augmented reality device 300, and a control unit.
  • The control unit is configured to implement the positioning guidance method described above.
  • The positioning guidance system provided by this application can quickly complete the positioning of movable devices such as the doctor control terminal device 110, the patient execution end device 120 and the image trolley 130, avoiding situations during the operation in which the image arm and/or the tool arm of the patient execution end device 120 cannot be extended, or in which multiple devices collide and interfere with one another, which in turn helps to shorten the operation time.
  • The control unit described in this application may include a processor, and performs the corresponding operations through the processor.
  • The processor can be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • the general-purpose processor can be a microprocessor or the processor can also be any conventional processor, etc., and the processor is the control center of the surgical robot system, and uses various interfaces and lines to connect various parts of the entire surgical robot system .
  • the memory can be used to store the computer program, and the processor realizes various functions of the surgical robot system by running or executing the computer program stored in the memory and calling the data stored in the memory.
  • Nonvolatile memory can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory can include random access memory (RAM) or external cache memory.
  • RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM) and Rambus dynamic RAM (RDRAM).
  • In addition, the present application also provides a surgical robot system, the surgical robot system including the aforementioned positioning guidance system and a movable device, the movable device including at least one of the aforementioned doctor control terminal device 110, image trolley 130 and patient execution end device 120.
  • the present application also provides a computer-readable storage medium, on which a program is stored, and when the program is executed, the aforementioned method for guiding the positioning of a mobile device is executed.
  • the readable storage medium in the embodiments of the present application may use any combination of one or more computer-readable media.
  • the readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, device, or device, or any combination thereof. More specific examples (non-exhaustive list) of computer readable storage media include: electrical connection with one or more wires, portable computer hard disk, hard disk, random access memory (RAM), read only memory (ROM), Erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), optical storage device, magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in combination with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a data signal carrying computer readable program code in baseband or as part of a carrier wave. Such propagated data signals may take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, which can send, propagate, or transmit a program for use by or in conjunction with an instruction execution system, apparatus, or device.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A positioning guidance method and system for a movable device, and a surgical robot system. The positioning guidance method for a movable device comprises: obtaining three-dimensional coordinate information of a fixed device; planning a target placement position of the movable device according to the three-dimensional coordinate information of the fixed device and the posture of the movable device; acquiring a three-dimensional model of the movable device (S130); and placing the three-dimensional model of the movable device at the target placement position and superimposing and displaying it in a real scene. When the positioning guidance method is applied to a surgical robot system, it helps the operator quickly and accurately complete the positioning of the various devices of the surgical robot system, which is conducive to shortening the operation time.

Description

Positioning guidance method and system for a movable device, and surgical robot system
Technical Field
The present application relates to the technical field of medical devices, and in particular to a positioning guidance method and system for a movable device, and a surgical robot system.
Background Art
As people pay increasing attention to medical care, surgical robot technology is developing rapidly. Surgical robots can not only replace the doctor's eyes, enabling the doctor to see three-dimensional images of the organs inside the patient's body and helping the doctor judge the location of the lesion tissue, but can also replace the doctor's hands to complete precise, complex and flexible surgical operations. Moreover, using surgical robots to perform minimally invasive surgery can also reduce surgical risks and the probability of postoperative complications.
The equipment included in a surgical robot system is large in size, and surgery needs to be performed in a dedicated operating room. If the various pieces of equipment are not placed properly, the surgical robot can easily collide with other equipment and its robotic arms become difficult to extend, which affects the accuracy of the operation; in that case, the positions of the various devices of the surgical robot have to be re-adjusted, which prolongs the operation time.
Summary of the Invention
The purpose of this application is to provide a positioning guidance method and system for a movable device, and a surgical robot system, aiming to accurately guide the operator in positioning various movable devices, such as the doctor control terminal device, the image trolley and the patient operation terminal device of a surgical robot system, so as to shorten the operation time.
To achieve the above purpose, the present application provides a positioning guidance method for a movable device, including:
obtaining three-dimensional coordinate information of a fixed device;
planning a target placement position of the movable device according to the three-dimensional coordinate information of the fixed device and the posture of the movable device;
acquiring a three-dimensional model of the movable device; and
placing the three-dimensional model of the movable device at the target placement position and superimposing and displaying it in a real scene.
Optionally, the positioning guidance method further includes:
obtaining three-dimensional coordinate information of the movable device;
planning a moving path of the movable device according to the three-dimensional coordinate information of the movable device, the three-dimensional coordinate information of the fixed device, the posture of the movable device and the target placement position of the movable device, so that the movable device reaches the target placement position when moving along the moving path without interfering with the fixed device; and
superimposing and displaying the moving path in the real scene.
Optionally, the positioning guidance method further includes:
obtaining the three-dimensional coordinate information of the movable device in real time;
updating the moving path of the movable device in real time according to the relative relationship between the real-time three-dimensional coordinate information of the movable device, the three-dimensional coordinate information of the fixed device and the three-dimensional coordinate information of the target placement position; and
superimposing and displaying the updated moving path in the real scene.
Optionally, there are a plurality of movable devices, and the positioning guidance method further includes:
obtaining three-dimensional coordinate information of the movable device currently being positioned;
planning the moving path of the movable device currently being positioned according to the three-dimensional coordinate information and target placement position of the movable device currently being positioned, the three-dimensional coordinate information of the fixed device, the three-dimensional coordinate information of the other movable devices, and the postures of the movable device currently being positioned and of the other movable devices, so that the movable device currently being positioned reaches the target placement position when moving along the moving path without interfering with the fixed device or the other movable devices during the movement; and
superimposing and displaying the moving path in the real scene.
Optionally, the positioning guidance method further includes:
obtaining the three-dimensional coordinate information of the movable device currently being positioned and of the other movable devices in real time;
updating the moving path in real time according to the relative relationship between the real-time three-dimensional coordinate information of the movable device currently being positioned, the real-time three-dimensional coordinate information of the other movable devices, the three-dimensional coordinate information of the fixed device, and the three-dimensional coordinate information of the target placement position of the movable device currently being positioned; and
superimposing and displaying the updated moving path in the real scene.
Optionally, the positioning guidance method further includes:
determining the pose of the movable device from its three-dimensional coordinate information, performing virtual-real fusion registration on the three-dimensional model of the movable device and the pose of the movable device, and superimposing and displaying the result in the real scene.
Optionally, the positioning guidance method further includes:
obtaining, in real time, the deviation between the three-dimensional coordinate information of the movable device and the three-dimensional coordinate information of the target placement position, and judging, according to the deviation, whether the movable device has reached the target placement position.
Optionally, there are a plurality of movable devices, and the positioning guidance method further includes: planning a positioning sequence of the plurality of movable devices according to a preset rule.
To achieve the above purpose, the present application also provides a positioning guidance system, including:
a positioning device, used to obtain three-dimensional coordinate information of a fixed device and three-dimensional coordinate information of a movable device;
an augmented reality device, used to superimpose and display the three-dimensional model of the movable device and the moving path of the movable device in a real scene; and
a control unit, communicatively connected with the positioning device and the augmented reality device and configured to execute the positioning guidance method for a movable device according to any one of the preceding items.
Optionally, the control unit is configured to establish, based on the positioning device, a mapping relationship between the coordinate system of the augmented reality device, the coordinate system of the movable device and the coordinate system of the fixed device, and then superimpose and display the three-dimensional model of the movable device and the moving path of the movable device in the real scene acquired by the augmented reality device.
Optionally, the control unit is configured to establish, in the world coordinate system, a mapping relationship between the coordinate system of the positioning device and the coordinate system of the augmented reality device, a mapping relationship between the coordinate system of the positioning device and the coordinate system of the movable device, and a mapping relationship between the coordinate system of the positioning device and the coordinate system of the fixed device, and then establish the mapping relationship between the coordinate system of the augmented reality device, the coordinate system of the movable device and the coordinate system of the fixed device.
Optionally, the positioning device is arranged on the augmented reality device; the coordinate system of the positioning device establishes a mapping relationship with the coordinate system of the augmented reality device through a mechanical position.
Optionally, the positioning device is separate from the augmented reality device; the three-dimensional coordinate information of the augmented reality device is obtained through the positioning device, and a mapping relationship between the coordinate system of the positioning device and the coordinate system of the augmented reality device is established.
To achieve the above purpose, the present application also provides a surgical robot system, including the positioning guidance system according to any one of the preceding items and a movable device, the movable device including at least one of a doctor control terminal device, an image trolley and a patient operation terminal device.
To achieve the above purpose, the present application also provides a computer-readable storage medium on which a program is stored; when the program is executed, the positioning guidance method for a movable device according to any one of the preceding items is executed.
Compared with the prior art, the positioning guidance method and system for a movable device and the surgical robot system of the present application have the following advantages:
The aforementioned positioning guidance method for a movable device includes: obtaining three-dimensional coordinate information of a fixed device; planning a target placement position of the movable device according to the three-dimensional coordinate information of the fixed device and the posture of the movable device; acquiring a three-dimensional model of the movable device; and placing the three-dimensional model of the movable device at the target placement position and superimposing and displaying it in a real scene. By adopting the positioning guidance method for a movable device provided in this application, the three-dimensional model of the movable device and the target placement position can be displayed in combination with the actual scene, which makes it convenient to guide the operator to move the movable device to the target placement position and complete the positioning operation. When this positioning guidance method is applied to the positioning of the medical equipment used in surgery, the positioning of the equipment can be completed quickly and accurately, which avoids mutual interference between the various devices during the operation and helps to save operation time.
Brief Description of the Drawings
The accompanying drawings are used for a better understanding of the present application and do not constitute an undue limitation of the present application. In the drawings:
Fig. 1 is a schematic diagram of an application scenario of a surgical robot system provided by the present application according to an embodiment;
Fig. 2 is a schematic diagram of an application scenario of the surgical robot system provided by the present application according to an embodiment, Fig. 2 differing from Fig. 1 in the viewing direction;
Fig. 3 is a flow chart of a positioning guidance method for a movable device provided by the present application according to an embodiment;
Fig. 4 is a schematic diagram of obtaining the three-dimensional model of a movable device by using a binocular vision device in the positioning guidance method for a movable device provided by the present application according to an embodiment;
Fig. 5 is a schematic diagram of the principle of binocular stereo vision three-dimensional measurement provided by the present application according to an embodiment;
Fig. 6 is a schematic diagram of the moving direction of the doctor control terminal device when the doctor control terminal device is positioned in the positioning guidance method for a movable device provided by the present application according to an embodiment;
Fig. 7 is a schematic diagram of the moving direction of the image trolley when the image trolley is positioned in the positioning guidance method for a movable device provided by the present application according to an embodiment;
Fig. 8 is a schematic diagram of the moving direction of the patient execution end and the moving direction of the supporting mechanism when the patient execution end device is positioned in the positioning guidance method for a movable device provided by the present application according to an embodiment;
Fig. 9 is a schematic diagram of obtaining the spatial mapping relationship between a movable device and its three-dimensional model in the positioning guidance method for a movable device provided by the present application according to an embodiment, in which the binocular vision device and the AR glasses are separate from each other;
Fig. 10 is a schematic structural diagram of the mutually independent binocular vision device and AR glasses in the positioning guidance method for a movable device provided by the present application according to an embodiment;
Fig. 11 is a schematic diagram of obtaining the spatial mapping relationship between a movable device and its three-dimensional model in the positioning guidance method for a movable device provided by the present application according to an embodiment, in which the binocular vision device is integrated on the AR glasses;
Fig. 12 is a schematic structural diagram of the binocular vision device integrated on the AR glasses in the positioning guidance method for a movable device provided by the present application according to an embodiment;
Fig. 13a is a schematic diagram of judging whether the doctor control terminal device has reached the target placement position in the positioning guidance method for a movable device provided by the present application according to an embodiment;
Fig. 13b is a schematic diagram of judging whether the patient execution end device has reached the target placement position in the positioning guidance method for a movable device provided by the present application according to an embodiment;
Fig. 13c is a schematic diagram of judging whether the image trolley has reached the target placement position in the positioning guidance method for a movable device provided by the present application according to an embodiment;
Fig. 14 is a flow chart of positioning the doctor control terminal device, the patient execution end device and the image trolley in the surgical robot system provided by the present application according to an embodiment.
[The reference signs are described as follows]:
110 - doctor control terminal device, 120 - patient execution end device, 130 - image trolley;
11 - hospital bed, 12 - surgical lighting lamp, 13 - ventilator, 14 - target;
200 - positioning device, 300 - augmented reality device.
Detailed Description of the Embodiments
The embodiments of the present application are described below by way of specific examples, and other advantages and effects of the present application can be readily understood by those skilled in the art from the disclosure of this specification. The present application may also be implemented or applied in other different specific embodiments, and the details in this specification may be modified or changed in various ways from different viewpoints and for different applications without departing from the spirit of the present application. It should be noted that the figures provided in the embodiments only illustrate the basic concept of the present application in a schematic way; the figures show only the components relevant to the present application and are not drawn according to the number, shape and size of the components in actual implementation; the form, number and proportion of each component in actual implementation may vary arbitrarily, and the component layout may also be more complex.
In addition, each of the embodiments described below has one or more technical features, but this does not mean that a person using the present application must implement all the technical features of any one embodiment at the same time, or can only implement some or all of the technical features of different embodiments separately. In other words, provided implementation is possible, those skilled in the art may, according to the disclosure of the present application and depending on design specifications or practical requirements, selectively implement some or all of the technical features of any embodiment, or selectively implement a combination of some or all of the technical features of multiple embodiments, thereby increasing the flexibility of implementing the present application.
As used in this specification, the singular forms "a", "an" and "the" include plural referents, and the plural form "a plurality of" includes two or more referents, unless the content clearly dictates otherwise. As used in this specification, the term "or" is generally used in a sense including "and/or" unless the content clearly dictates otherwise, and the terms "mounted", "connected" and "coupled" should be understood broadly; for example, a connection may be a fixed connection, a detachable connection or an integral connection; it may be a mechanical connection or an electrical connection; it may be a direct connection or an indirect connection through an intermediate medium, or the internal communication between two elements or the interaction between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in the present application can be understood according to the specific circumstances.
The core idea of the present application is to provide a positioning guidance method for a movable device, a positioning guidance system for a movable device, a surgical robot system and a computer-readable storage medium, which are intended to help the operator quickly and accurately place the individual devices of a surgical robot system, avoid device interference during the operation that would hinder the surgery, and thereby shorten the operation time.
To realize the above idea, the positioning guidance method for a movable device includes: acquiring three-dimensional coordinate information of a fixed device; planning a target placement position of the movable device according to the three-dimensional coordinate information of the fixed device and the posture of the movable device; acquiring a three-dimensional model of the movable device; and placing the three-dimensional model of the movable device at the target placement position and displaying it superimposed on the real scene, thereby guiding the operator in positioning the movable device. In the embodiments of the present application, the three-dimensional model of the movable device is displayed superimposed on the real scene by means of an augmented reality device. The operator can then position the movable device according to the three-dimensional model of the movable device and the target placement position displayed in the augmented reality device. It should be understood that, for different application scenarios, the movable device refers to a device whose pose needs to be adjusted to a suitable pose before a specific operation is performed, and the fixed device refers to a device whose pose has already been fixed before that operation is performed. Those skilled in the art will appreciate that a pose includes the position and the posture of a device.
In the embodiments of the present application, the initially acquired three-dimensional coordinate information of the fixed device may be the three-dimensional coordinate information of the fixed device in the world coordinate system, so that the target placement position of the movable device in the world coordinate system can then be planned.
When planning the target placement position of the movable device, the basic criterion should be that the movable device does not interfere with the fixed device when located at the target placement position. The posture of the movable device may be determined by those skilled in the art according to the specific type of the movable device, or according to the length, height and width of the movable device (i.e. the three-dimensional coordinate information of the movable device); accordingly, the target placement position includes the three-dimensional coordinate information of the movable device in the world coordinate system, i.e. the specific position of the movable device in the X, Y and Z directions. Moreover, once the target placement position has been planned, the mapping relationship between the coordinate system of the target placement position and the coordinate system of the fixed device in the world coordinate system can be obtained.
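By way of a non-limiting illustration of the basic criterion above, a coarse interference check over candidate placement positions may be sketched as follows; the Python helper below, its axis-aligned bounding-box model, safety margin and numerical values are assumptions introduced here for clarity and are not part of the original disclosure:

from dataclasses import dataclass
from itertools import product

@dataclass
class Box:
    # Axis-aligned bounding box in world coordinates: (xmin, ymin, zmin) and (xmax, ymax, zmax).
    lo: tuple
    hi: tuple

def overlaps(a: Box, b: Box, margin: float = 0.1) -> bool:
    # True if the two boxes, inflated by a safety margin (metres), intersect on all three axes.
    return all(a.lo[i] - margin < b.hi[i] and b.lo[i] - margin < a.hi[i] for i in range(3))

def plan_target_position(device_size, fixed_boxes, candidates):
    # device_size: (length, width, height) of the movable device.
    # candidates: iterable of (x, y, z) positions to try, e.g. a coarse grid over the room floor.
    for x, y, z in candidates:
        box = Box((x, y, z), (x + device_size[0], y + device_size[1], z + device_size[2]))
        if not any(overlaps(box, f) for f in fixed_boxes):
            return (x, y, z)  # first candidate that does not interfere with any fixed device
    return None  # no feasible placement found

# Example: a 1 m x 1 m x 1.5 m trolley in a room containing one fixed bed.
bed = Box((2.0, 1.0, 0.0), (4.0, 2.0, 1.0))
grid = product([0.0, 1.5, 3.0, 4.5], [0.0, 1.5, 3.0], [0.0])
print(plan_target_position((1.0, 1.0, 1.5), [bed], grid))

In practice the candidate set would be generated from the free floor area of the operating room, and a finer collision model may be substituted for the bounding boxes.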
In the embodiments of the present application, the three-dimensional model of the movable device may be acquired in any suitable way. By establishing, in the world coordinate system, a mapping relationship between the coordinate system of the augmented reality device and the coordinate system of the three-dimensional model of the movable device, the three-dimensional model of the movable device and the target placement position are unified into the coordinate system of the augmented reality device, so that the three-dimensional model of the movable device can be placed at the target placement position and displayed by the augmented reality device.
Further, after the target placement position of the movable device has been obtained, the three-dimensional coordinate information of the movable device may also be acquired, and the movement path of the movable device may be planned according to the three-dimensional coordinate information of the movable device, the three-dimensional coordinate information of the fixed device, the posture of the movable device, and the target placement position of the movable device, so that when the movable device moves along the movement path it reaches the target placement position without interfering with the fixed device during the movement. The movement path is likewise displayed superimposed on the real scene by the augmented reality device.
In the embodiments of the present application, the acquired three-dimensional coordinate information of the movable device may be its three-dimensional coordinate information in the world coordinate system. By establishing, in the world coordinate system, a mapping relationship between the coordinate system of the augmented reality device and the three-dimensional coordinate system of the movable device, the three-dimensional model of the movable device and the actual position of the movable device are unified into the coordinate system of the augmented reality device and displayed by the augmented reality device. Specifically, the pose of the movable device may be determined from the three-dimensional coordinate information of the movable device, and the three-dimensional model of the movable device may be registered with the pose of the movable device by virtual-real fusion registration. Using this virtual-real fusion registration technique, the operator can be guided, according to the display of the augmented reality device, to move the movable device along the movement path and complete the positioning.
Moreover, to ensure that the operator can smoothly move the movable device to the target placement position, the positioning guidance method preferably further includes: acquiring the three-dimensional coordinate information of the movable device in real time, updating the movement path of the movable device in real time according to the relative relationship between the real-time three-dimensional coordinate information of the movable device, the three-dimensional coordinate information of the fixed device and the three-dimensional coordinate information of the target placement position, and displaying the updated movement path superimposed on the real scene by means of the augmented reality device.
Further, for an application scenario with a plurality of movable devices, the positioning guidance method further includes: planning the movement path of the movable device currently being positioned according to the three-dimensional coordinate information and the target placement position of the movable device currently being positioned, the three-dimensional coordinate information of the fixed device, the three-dimensional coordinate information of the other movable devices, and the postures of the movable device currently being positioned and of the other movable devices, so that when the movable device currently being positioned moves along the movement path it reaches its target placement position without interfering with the fixed device or the other movable devices during the movement.
Still further, for an application scenario with a plurality of movable devices, the positioning guidance method preferably further includes: acquiring the three-dimensional coordinate information of the movable device currently being positioned and of the other movable devices in real time, and updating the movement path in real time according to the relative relationship between the real-time three-dimensional coordinate information of the movable device currently being positioned, the real-time three-dimensional coordinate information of the other movable devices, the three-dimensional coordinate information of the fixed device, and the three-dimensional coordinate information of the target placement position of the movable device currently being positioned.
Furthermore, the positioning guidance method may also include: acquiring, in real time, the deviation between the three-dimensional coordinate information of the movable device and the three-dimensional coordinate information of the target placement position, and judging, based on the deviation, whether the movable device has reached the target placement position. Determining whether the movable device has reached the target placement position by position calculation avoids inaccurate positioning of the movable device caused by subjective error in manual judgment.
To make the objects, advantages and features of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. It should be noted that the drawings are in a very simplified form and use imprecise proportions, and are only intended to assist in explaining the embodiments of the present application conveniently and clearly. The same or similar reference numerals in the drawings denote the same or similar components.
FIG. 1 and FIG. 2 show schematic diagrams of an application scenario of the surgical robot system. As shown in FIG. 1, the surgical robot system includes a doctor console device 110, a patient-side device 120, an image trolley 130 and other devices. A master manipulator is provided on the doctor console device 110. The patient-side device 120 includes at least one image arm (not labelled in the figures) and at least one tool arm (not labelled in the figures); the image arm is used for mounting an endoscope, and the tool arm is used for mounting a surgical instrument. The endoscope and the surgical instrument respectively enter the patient's body through incisions in the patient's body; the endoscope is used to acquire information on the human tissue, on the surgical instruments inside the body and on the surgical environment, and the surgical instrument is used to perform the surgical operation. Further, the master manipulator is communicatively connected with the tool arm and the surgical instrument, and the master manipulator, the tool arm and the surgical instrument form a master-slave control relationship. That is, during the operation the tool arm moves according to the movement of the master manipulator, and the surgical instrument executes the motion commands associated with the master manipulator. In other words, during the operation the doctor operates the master manipulator so that the surgical instrument performs the corresponding surgical operation. Meanwhile, the image trolley 130 is used to display the situation inside the patient's body, making it convenient for the nurses to observe.
As shown in FIG. 2, during an actual operation the patient is supported on a hospital bed 11, and a surgical lighting lamp 12, a ventilator 13 and other equipment are also arranged in the operating room. It should be understood that, in the preoperative preparation stage, the hospital bed 11, the surgical lighting lamp 12, the ventilator 13 and other such equipment are already in fixed poses, while the doctor console device 110, the patient-side device 120, the image trolley 130 and other devices of the surgical robot system need to be reasonably positioned according to the poses of the hospital bed 11, the surgical lighting lamp 12, the ventilator 13 and the like, so as to ensure that the image arm and the tool arm of the patient-side device 120 can extend fully during the operation without interfering with the various pieces of equipment in the operating room (including the fixed devices and the other movable devices).
Therefore, in the specific application scenario in which a surgical robot system performs a surgical operation, the fixed devices include at least the hospital bed 11, the surgical lighting lamp 12, the ventilator 13 and the like. During the preoperative preparation, before the doctor console device 110, the patient-side device 120 and the image trolley 130 have been placed at their target placement positions, the movable devices include the doctor console device 110, the patient-side device 120 and the image trolley 130.
In the following, the preoperative preparation for performing a surgical operation with a surgical robot system is taken as an example to describe in detail the application scenario of the positioning guidance method for a movable device provided by the present application. It should be understood, however, that the positioning guidance method for a movable device may also be applied to other systems, and the present application is not limited in this respect.
FIG. 3 shows a flowchart of the positioning guidance method for a movable device according to a non-limiting embodiment of the present application.
Referring to FIG. 3, when performing the positioning guidance method for a movable device, step S110 is performed first: acquiring the three-dimensional coordinate information of the fixed devices and of the movable devices. In this step, the three-dimensional coordinate information of the fixed devices and the movable devices may be three-dimensional coordinate information in the world coordinate system. The three-dimensional coordinate information of the hospital bed 11, the surgical lighting lamp 12, the ventilator 13, the doctor console device 110, the patient-side device 120 and the image trolley 130 may all be acquired by means of a positioning device 200 (as shown in FIG. 4). Specifically, the positioning device 200 is used to acquire target information of the fixed devices and the movable devices, and the target information is used to obtain the three-dimensional coordinate information of the corresponding devices. The positioning device 200 is, for example, a binocular vision device.
With a binocular vision device, the three-dimensional geometric information of a measured object can be obtained from multiple images on the basis of the parallax principle. In a machine vision system, binocular vision generally acquires two digital images of the measured object simultaneously from different angles with two cameras, or acquires two digital images of the measured object from different angles at different moments with a single camera, and recovers the three-dimensional geometric information of the measured object on the basis of the parallax principle, thereby obtaining the position of the measured object. In other words, when the three-dimensional coordinate information of a measured object is acquired with the binocular vision device, the target information is the image information of the measured object.
FIG. 5 schematically illustrates the principle of three-dimensional measurement with a binocular vision device. Referring to FIG. 5, point P(x, y, z) is a feature point on the measured object, O_l is the optical centre of the left camera and O_r is the optical centre of the right camera. If point P is observed with the left camera, its image point in the left camera is located at P_l, but the three-dimensional position of P cannot be determined from P_l alone; in fact, any point on the line O_lP_l has P_l as its image point in the left camera, so from the position of P_l it is only known that the spatial point P lies on the line O_lP_l. Likewise, observed from the right camera, it is only known that the spatial point P lies on the line O_rP_r. Therefore, when the two cameras capture the same feature point P(x, y, z) of the measured object at the same moment, the intersection of line O_lP_l and line O_rP_r is the position of the spatial point P, i.e. the three-dimensional coordinates of the spatial point P are uniquely determined.
Further, the distance between the optical centres of the two cameras is the baseline b, and both cameras have the same focal length f. The two cameras capture the same feature point P(x, y, z) of the measured object at the same moment. With (x_l, y_l) and (x_r, y_r) denoting the image coordinates of P in the left and right cameras, the principle of similar triangles gives the following relations:

$$\frac{x_l}{f}=\frac{x}{z},\qquad \frac{x_r}{f}=\frac{x-b}{z},\qquad \frac{y_l}{f}=\frac{y_r}{f}=\frac{y}{z}$$

from which it follows that:

$$z=\frac{b\,f}{x_l-x_r},\qquad x=\frac{b\,x_l}{x_l-x_r},\qquad y=\frac{b\,y_l}{x_l-x_r}$$

From this, the three-dimensional coordinate information of the feature point P on the measured object in the coordinate system of the binocular vision device (i.e. the positioning device 200) can be obtained. Based on the same principle, the three-dimensional coordinate information of the other feature points on the measured object in the coordinate system of the binocular vision device is obtained, and hence the three-dimensional coordinate information of the measured object in the coordinate system of the binocular vision device.
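The triangulation above can be illustrated with a minimal numerical sketch; the function name, the assumption of a rectified camera pair with equal focal lengths, and the example values are illustrative additions and not part of the original disclosure:

def triangulate(xl, yl, xr, f, b):
    # Rectified stereo pair: f = focal length (pixels), b = baseline (metres),
    # (xl, yl) and (xr, yr) = image coordinates of the same feature point, with yl == yr.
    d = xl - xr                      # disparity
    if d <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    z = f * b / d                    # depth along the optical axis
    x = b * xl / d                   # lateral position
    y = b * yl / d                   # vertical position
    return x, y, z                   # coordinates in the positioning-device (left-camera) frame

# Example: f = 1000 px, baseline 0.12 m, feature seen at xl = 250 px, xr = 190 px, yl = 40 px.
print(triangulate(250.0, 40.0, 190.0, f=1000.0, b=0.12))  # -> (0.5, 0.08, 2.0)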
The mapping relationship between the coordinate system of the positioning device 200 and the world coordinate system can be obtained from a rotation matrix R and a translation vector t. The coordinates P(x_c, y_c, z_c) of a measured point P in the coordinate system of the positioning device 200 and its coordinates P(x_w, y_w, z_w) in the world coordinate system satisfy the following relation:

$$\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} = M_b \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix}$$

where R is a 3×3 matrix, t is a 3×1 vector, 0 is (0, 0, 0), and M_b is a 4×4 matrix, also called the camera extrinsic parameter matrix; the camera extrinsic parameter matrix can be obtained by existing camera calibration methods, which are not described in detail here.
Thus, the three-dimensional coordinate information of any measured object in the world coordinate system can be obtained from the extrinsic parameter matrix of the camera of the positioning device 200 and the coordinates of the measured object in the coordinate system of the positioning device 200.
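A brief NumPy-based sketch of this conversion from the positioning-device frame to the world frame is given below; the matrix and variable names and the example extrinsics are illustrative assumptions, not part of the original disclosure:

import numpy as np

def camera_to_world(p_cam, M_b):
    # M_b = [[R, t], [0, 1]] maps world coordinates to positioning-device coordinates,
    # so world coordinates are recovered with its inverse.
    p_h = np.append(np.asarray(p_cam, dtype=float), 1.0)   # homogeneous coordinates
    p_w = np.linalg.inv(M_b) @ p_h
    return p_w[:3]

# Example: extrinsics with a 90-degree rotation about Z and a translation of (1, 2, 0).
R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t = np.array([1.0, 2.0, 0.0])
M_b = np.block([[R, t.reshape(3, 1)], [np.zeros((1, 3)), np.ones((1, 1))]])
print(camera_to_world([0.5, 0.08, 2.0], M_b))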
It should be understood that, in the embodiments of the present application, the hospital bed 11, the surgical lighting lamp 12, the ventilator 13, the doctor console device 110, the patient-side device 120 and the image trolley 130 each carry a target marker 14 (as labelled in FIG. 4) that can be recognized by the positioning device 200; the positioning device 200 acquires the image information of the target markers 14, after which the three-dimensional coordinate information of the doctor console device 110, the patient-side device 120 and the image trolley 130 can be obtained on the basis of the image information of the target markers 14. It will be appreciated that the operation of obtaining the three-dimensional coordinate information of the corresponding device based on the image information of the target marker 14 is usually performed by a computer program.
Then, step S120 is performed: planning the target placement positions and the movement paths of the movable devices. Specifically, the target placement position of a movable device is planned according to the three-dimensional coordinate information of the fixed devices and the posture of the movable device, and the movement path of the movable device is further planned according to the three-dimensional coordinate information of the fixed devices, the posture of the movable device and the target placement position of the movable device. The posture of the movable device may be determined according to the specific type of the movable device, or according to the three-dimensional coordinate information of the movable device. The target placement position and the movement path may be planned by a computer program using any suitable planning method, or alternatively may be planned manually by the operator.
Those skilled in the art will appreciate that when a movable device is placed at its corresponding target placement position, that movable device does not interfere with the fixed devices or with the other movable devices. Moreover, when there are a plurality of movable devices, the three-dimensional coordinate information of the other movable devices and the postures of the movable device currently being positioned and of the other movable devices should also be considered when planning the movement path of the movable device currently being positioned, so that the movable device currently being positioned does not interfere with the other movable devices when moving along the movement path. Here, "other movable devices" refers, when there are a plurality of movable devices, to the movable devices other than the one currently being positioned. For example, when the movable devices include the doctor console device 110, the patient-side device 120 and the image trolley 130, and the doctor console device 110 is currently being positioned, the other movable devices are the patient-side device 120 and the image trolley 130. Of course, in other application scenarios, if there is only one movable device and that movable device is being positioned, there are no other movable devices.
Those skilled in the art will also appreciate that, when positioning the doctor console device 110, it is mainly driven to translate forwards, backwards, leftwards and rightwards in the horizontal plane (i.e. along the X direction and/or the Y direction in FIG. 6) so as to bring it to a suitable horizontal position. When positioning the image trolley 130, it is mainly driven to translate forwards, backwards, leftwards and rightwards in the horizontal plane (along the X direction and/or the Y direction in FIG. 7) so as to bring it to a suitable horizontal position. When positioning the patient-side device 120, it needs to be driven to translate forwards, backwards, leftwards and rightwards in the horizontal plane (along the X direction and/or the Y direction shown in FIG. 8) so as to bring it to a suitable horizontal position, and the support mechanism on the patient-side device 120 needs to be driven to move up and down in the vertical plane (along the Z direction shown in FIG. 8) so as to bring the image arm and the tool arm to a suitable height. Accordingly, the "target placement position" referred to herein includes the three-dimensional coordinate information of the doctor console device 110, the patient-side device 120 and the image trolley 130 in the world coordinate system (i.e. the specific positions of the movable devices in the X, Y and Z directions). Furthermore, once the target placement position has been planned, the three-dimensional coordinate information of the target placement position in the world coordinate system is known, and its relative relationship with the coordinates of the fixed devices is known.
Subsequently, step S130 is performed: acquiring the three-dimensional models of the movable devices.
The embodiments of the present application do not particularly limit the way in which the three-dimensional model of a movable device is acquired. In this embodiment, the three-dimensional model of the movable device may be created and stored in advance, and simply called up when the positioning guidance method is performed. Alternatively, the three-dimensional model of the movable device may be created, while the positioning guidance method is being performed, on the basis of the three-dimensional coordinate information of the movable device acquired by the positioning device, in which case the mapping relationship between the coordinate system of the three-dimensional model of the movable device and the coordinate system of the movable device can be established at the same time.
After the three-dimensional model has been called up, step S140 is performed: performing virtual-real fusion registration of the three-dimensional model of the movable device with the pose of the movable device, and accordingly establishing the mapping relationship between the coordinate system of the movable device and the coordinate system of the three-dimensional model of the movable device.
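One common way to compute such a rigid registration is point-set alignment via singular value decomposition; the NumPy-based sketch below, with illustrative names and example values that are not part of the original disclosure, estimates the rotation and translation that align the model points with the measured device points:

import numpy as np

def register_rigid(model_pts, device_pts):
    # model_pts, device_pts: (N, 3) arrays of corresponding points in the model frame
    # and in the measured device frame; returns R, t with device ≈ R @ model + t.
    cm = model_pts.mean(axis=0)
    cd = device_pts.mean(axis=0)
    H = (model_pts - cm).T @ (device_pts - cd)        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                          # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cm
    return R, t

# Example: a model rotated by 90 degrees about Z and shifted by (1, 0, 0).
model = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
device = (Rz @ model.T).T + np.array([1.0, 0.0, 0.0])
R, t = register_rigid(model, device)
print(np.round(R, 3), np.round(t, 3))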
Then, step S150 is performed: placing the three-dimensional model of the movable device at the target placement position and displaying it superimposed on the real scene by means of an augmented reality device 300 (as labelled in FIG. 4 and FIG. 9), while the augmented reality device 300 is also used to display the movement path of the movable device superimposed on the real scene. The pose of the movable device may be determined from the three-dimensional coordinate information of the movable device. The augmented reality device 300 is, for example, a pair of AR glasses.
In this step, the mapping relationship between the coordinate system of the augmented reality device 300 and the coordinate system of the movable device may first be established in the world coordinate system by means of the positioning device 200, and the mapping relationship between the coordinate system of the augmented reality device 300 and the coordinate system of the three-dimensional model of the movable device is then established in combination with the mapping relationship between the coordinate system of the movable device and the coordinate system of the three-dimensional model of the movable device.
Establishing the mapping relationship between the coordinate system of the augmented reality device 300 and the coordinate system of the movable device by means of the positioning device 200 specifically includes: establishing, in the world coordinate system, the mapping relationship between the coordinate system of the positioning device 200 and the coordinate system of the augmented reality device 300, and establishing the mapping relationship between the coordinate system of the positioning device 200 and the coordinate system of the movable device, whereby a mapping relationship is established between the coordinate system of the augmented reality device 300 and the coordinate system of the movable device.
As shown in FIG. 9 and FIG. 10, when the augmented reality device 300 and the positioning device 200 are two separate devices, the augmented reality device 300 carries a target marker (not shown in the figures); the positioning device 200 captures digital images of this target marker, from which the three-dimensional coordinate information of the augmented reality device 300 in the world coordinate system can be obtained, and the mapping relationship between the coordinate system of the positioning device 200 and the coordinate system of the augmented reality device 300 can then be established. Alternatively, as shown in FIG. 11 and FIG. 12, the positioning device 200 is arranged on the augmented reality device 300, in which case the mechanical position between the positioning device 200 and the augmented reality device 300 is fixed, so that the mapping relationship between the coordinate system of the positioning device 200 and the coordinate system of the augmented reality device 300 can be established from the mechanical position of the positioning device 200 relative to the augmented reality device 300. The mechanical position is a known, fixed relative position between the positioning device 200 and the augmented reality device 300.
Further, from the three-dimensional coordinate information of the target placement position in the world coordinate system and the transformation relationship between the coordinate system of the positioning device 200 and the world coordinate system, the three-dimensional coordinate information of the target placement position in the coordinate system of the positioning device 200 can be obtained. Then, from the mapping relationship between the coordinate system of the positioning device 200 and the coordinate system of the augmented reality device 300, the three-dimensional coordinate information of the target placement position in the coordinate system of the augmented reality device 300 can be obtained.
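This chain of transformations may be sketched compactly with homogeneous 4×4 matrices; the names T_cam_world and T_ar_cam and the example values are illustrative assumptions, not part of the original disclosure:

import numpy as np

def to_homogeneous(p):
    return np.append(np.asarray(p, dtype=float), 1.0)

# T_cam_world: world frame -> positioning-device frame (the extrinsic matrix M_b above).
# T_ar_cam: positioning-device frame -> augmented-reality-device frame (obtained either from
#           the target marker on the AR glasses or from the fixed mechanical mounting).
def target_in_ar_frame(target_world, T_cam_world, T_ar_cam):
    p_cam = T_cam_world @ to_homogeneous(target_world)
    p_ar = T_ar_cam @ p_cam
    return p_ar[:3]

# Example with identity extrinsics and a pure translation between the locator and the AR glasses.
T_cam_world = np.eye(4)
T_ar_cam = np.eye(4)
T_ar_cam[:3, 3] = [0.0, -0.1, 0.05]
print(target_in_ar_frame([1.0, 2.0, 0.0], T_cam_world, T_ar_cam))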
On this basis, the target placement position, the three-dimensional model of the movable device and the movement path are all unified into the coordinate system of the augmented reality device 300, so that, in the coordinate system of the augmented reality device 300, the three-dimensional model of the movable device can be placed at the target placement position and displayed superimposed on the real scene, and the movement path of the movable device can likewise be displayed superimposed on the real scene by the augmented reality device 300.
It should be noted that the order of steps S110 to S150 shown in FIG. 3 is not fixed; for example, step S110 may be performed in separate parts. Alternatively, steps S130 and S140 may be performed before step S120; or step S130 may be performed first, then step S120, followed by step S140. In other words, it is sufficient that the three-dimensional model of the movable device located at the target placement position, together with the movement path, can ultimately be displayed in the augmented reality device 300.
In this way, the operator can position the movable device according to the display of the augmented reality device 300.
With continued reference to FIG. 3, during the positioning process the movable device often deviates from the movement path for various reasons, or the pre-planned movement path is inaccurate. Therefore, to avoid collisions between the movable device and other equipment and to improve positioning efficiency, the positioning guidance method for a movable device further includes step S160: updating the movement path of the movable device. Specifically, the three-dimensional coordinate information of the current position of the movable device being positioned is acquired in real time; the movement path of the movable device is updated according to the three-dimensional coordinate information of the movable device currently being positioned, the three-dimensional coordinate information of the fixed devices, the three-dimensional coordinate information of the other movable devices and the three-dimensional coordinate information of the target placement position corresponding to the movable device being positioned; and the updated movement path is displayed superimposed on the real scene by means of the augmented reality device 300. When the movable device being positioned moves along this movement path, it can smoothly reach the target placement position without interfering with the fixed devices or the other movable devices during the movement.
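A highly simplified sketch of such a re-planning step in the horizontal plane is given below; the single-waypoint detour strategy, the device footprint, the safety margin and the example values are assumptions made for illustration only, and an actual implementation may use any suitable path-planning method:

def rect_overlap(a, b, margin=0.1):
    # a, b: ((xmin, ymin), (xmax, ymax)) rectangles in the floor plane.
    return (a[0][0] - margin < b[1][0] and b[0][0] - margin < a[1][0]
            and a[0][1] - margin < b[1][1] and b[0][1] - margin < a[1][1])

def clear(a, b, obstacles, half=0.3, steps=20):
    # True if the straight segment from a to b keeps the device footprint off every obstacle.
    for i in range(steps + 1):
        s = i / steps
        x, y = a[0] + s * (b[0] - a[0]), a[1] + s * (b[1] - a[1])
        probe = ((x - half, y - half), (x + half, y + half))
        if any(rect_overlap(probe, o) for o in obstacles):
            return False
    return True

def update_path(current, target, obstacles):
    # Prefer the direct segment; otherwise try a single lateral detour waypoint.
    if clear(current, target, obstacles):
        return [current, target]
    mid = ((current[0] + target[0]) / 2, (current[1] + target[1]) / 2)
    for dy in (1.0, -1.0, 2.0, -2.0):
        wp = (mid[0], mid[1] + dy)
        if clear(current, wp, obstacles) and clear(wp, target, obstacles):
            return [current, wp, target]
    return None  # no simple path found

# Re-run whenever a new real-time position arrives from the positioning device, e.g.:
print(update_path((0.0, 0.0), (5.0, 0.0), [((2.0, -0.5), (3.0, 0.5))]))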
Those skilled in the art will appreciate that mapping relationships between the coordinate systems of multiple movable devices can also be established on the basis of the positioning device 200, specifically by establishing, in the world coordinate system, the mapping relationship between the coordinate system of the positioning device 200 and the coordinate system of each movable device, and then establishing the mapping relationships between the coordinate systems of the individual movable devices.
Further, the positioning guidance method for a movable device further includes step S170: acquiring, in real time, the deviation between the three-dimensional coordinate information of the movable device currently being positioned and the three-dimensional coordinate information of the target placement position, and judging, based on the deviation, whether the movable device has reached the target placement position.
Specifically, referring to FIG. 13a and taking the positioning guidance of the doctor console device 110 as an example, the solid line represents the current position of the doctor console device 110, and the dashed line represents the target placement position of the doctor console device 110. If the current position of the doctor console device 110 does not coincide with the target placement position (i.e. the solid-line region and the dashed-line region in FIG. 13a are at least partly separate), that is, if there is a deviation between the current three-dimensional coordinate information of the doctor console device 110 and the three-dimensional coordinate information of the target placement position, it can be determined that the doctor console device 110 has not yet reached the target placement position. Given a displacement S1 of the doctor console device 110, the doctor console device 110 is moved so that its current position coincides with the target placement position, that is, the current three-dimensional coordinate information of the doctor console device 110 coincides with the three-dimensional coordinate information of the target placement position, whereupon the positioning of the doctor console device 110 is completed. Comparing the three-dimensional coordinate information of the movable device with its target placement position improves the positioning accuracy of the movable device. The process for judging whether the patient-side device 120 and the image trolley 130 have completed their positioning is similar (see FIG. 13b and FIG. 13c) and is not repeated here.
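A small sketch of such an arrival check is given below; the tolerance value and function name are illustrative assumptions:

import numpy as np

def has_reached_target(current_xyz, target_xyz, tolerance=0.05):
    # Deviation between the device's real-time coordinates and its target placement
    # position, in metres; the device is considered placed once it is within tolerance.
    deviation = np.linalg.norm(np.asarray(current_xyz) - np.asarray(target_xyz))
    return deviation <= tolerance, deviation

done, dev = has_reached_target([1.02, 2.01, 0.0], [1.0, 2.0, 0.0])
print(done, round(dev, 3))   # True 0.022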
It should be noted that, in the embodiments of the present application, step S120 may plan the target placement positions and movement paths of the doctor console device 110, the patient-side device 120 and the image trolley 130 at the same time, and the operator may position the doctor console device 110, the patient-side device 120 and the image trolley 130 respectively according to the multiple target placement positions and multiple movement paths displayed by the augmented reality device 300. In other words, the operator may complete the positioning of each movable device in turn according to actual needs.
Preferably, the positioning guidance method for a movable device further includes step S151: planning the positioning order of all the movable devices according to a preset rule. The preset rule is, for example, to determine the positioning order according to the magnitude of the collision probability between each of the plurality of movable devices and the fixed devices; specifically, the devices with a higher collision probability may be positioned first and the devices with a lower collision probability positioned afterwards. The collision probabilities of the plurality of movable devices may be determined in advance by the operator, or judged by a computer program according to the volume of the movable devices or other factors. In some specific embodiments, the collision probability of the doctor console device 110, the collision probability of the image trolley 130 and the collision probability of the patient-side device 120 may be judged to decrease in that order. Therefore, when positioning the individual devices of the surgical robot system, the doctor console device 110 is positioned first, then the image trolley 130, and finally the patient-side device 120. This ensures that the image arm and the tool arm of the patient-side device 120 can extend fully while collisions between the individual devices are avoided.
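This ordering rule may be sketched as follows; the device names and probability values are purely illustrative assumptions:

def plan_positioning_order(collision_probability):
    # Position the devices with the highest collision probability first.
    return sorted(collision_probability, key=collision_probability.get, reverse=True)

probabilities = {
    "doctor console device 110": 0.7,
    "image trolley 130": 0.5,
    "patient-side device 120": 0.3,
}
print(plan_positioning_order(probabilities))
# -> ['doctor console device 110', 'image trolley 130', 'patient-side device 120']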
Optionally, in the embodiment shown in FIG. 3, step S151 is performed after step S150 and before step S160. Thus, after the respective target placement positions of the doctor console device 110, the patient-side device 120 and the image trolley 130 of the surgical robot system have been determined, positioning may be carried out according to the flow shown in FIG. 14:
Step S210: positioning the doctor console device 110, specifically performing step S160 for the doctor console device 110 and moving the doctor console device 110, while performing step S170 during the movement of the doctor console device 110, until the positioning of the doctor console device 110 is completed.
Step S220: positioning the image trolley 130, specifically performing step S160 for the image trolley 130 and moving the image trolley 130, while performing step S170 during the movement of the image trolley 130, until the positioning of the image trolley 130 is completed.
Step S230: positioning the patient-side device 120, specifically performing step S160 for the patient-side device 120 and moving the patient-side device 120, while performing step S170 during the movement of the patient-side device 120, until the positioning of the patient-side device 120 is completed.
Of course, the order in which step S151 is performed may also be adjusted as needed; for example, it may be performed before step S120. In addition, in step S120 the target placement position and movement path of only one movable device may be planned; that is, the target placement position and movement path of the doctor console device 110 are planned first and the positioning guidance of the doctor console device 110 is completed, then the target placement position and movement path of the image trolley 130 are planned and its positioning guidance completed, and finally the target placement position and movement path of the patient-side device 120 are planned and its positioning guidance carried out.
All the steps of the aforementioned positioning guidance method (i.e. step S110, step S120, step S130, step S140, step S150, step S151, step S160 and step S170) can be executed by a computer program. Therefore, based on the same inventive concept, the present application also provides a positioning guidance system for a movable device, which includes the aforementioned positioning device 200, the augmented reality device 300 and a control unit, the control unit being configured to perform the positioning guidance method described above. With the positioning guidance system provided by the present application, movable devices such as the doctor console device 110, the patient-side device 120 and the image trolley 130 can be positioned quickly, avoiding situations during the operation in which the image arm and/or the tool arm of the patient-side device 120 cannot extend fully or in which multiple devices collide and interfere with one another, thereby helping to shorten the operation time.
It should be understood that the control unit described in the present application may include a processor, and the corresponding operations are performed by the processor. The processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The processor is the control centre of the surgical robot system and connects the various parts of the whole surgical robot system by means of various interfaces and lines.
The memory may be used to store the computer program, and the processor implements the various functions of the surgical robot system by running or executing the computer program stored in the memory and by calling the data stored in the memory.
The memory may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM) and Rambus dynamic RAM (RDRAM).
The present application further provides a surgical robot system, which includes the aforementioned positioning guidance system and movable devices, the movable devices including at least one of the doctor console device 110, the image trolley 130 and the patient-side device 120.
Further, the present application also provides a computer-readable storage medium having a program stored thereon; when the program is executed, the positioning guidance method for a movable device described above is performed.
The readable storage medium of the embodiments of the present application may be any combination of one or more computer-readable media. The readable medium may be a computer-readable signal medium or a computer-readable storage medium. A computer-readable storage medium may be, for example but not limited to, an electric, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples (a non-exhaustive list) of computer-readable storage media include: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fibre, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In this document, a computer-readable storage medium may be any tangible medium that contains or stores a program which can be used by, or in combination with, an instruction execution system, apparatus or device.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, which carries computer-readable program code. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, and may send, propagate or transmit a program for use by, or in combination with, an instruction execution system, apparatus or device.
Although the present application is disclosed as above, it is not limited thereto. Those skilled in the art may make various modifications and variations to the present application without departing from the spirit and scope of the present application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their technical equivalents, the present application is also intended to encompass such modifications and variations.

Claims (15)

  1. A positioning guidance method for a movable device, characterized by comprising:
    acquiring three-dimensional coordinate information of a fixed device;
    planning a target placement position of the movable device according to the three-dimensional coordinate information of the fixed device and a posture of the movable device;
    acquiring a three-dimensional model of the movable device; and
    placing the three-dimensional model of the movable device at the target placement position and displaying it superimposed on a real scene.
  2. The positioning guidance method for a movable device according to claim 1, characterized in that the positioning guidance method further comprises:
    acquiring three-dimensional coordinate information of the movable device;
    planning a movement path of the movable device according to the three-dimensional coordinate information of the movable device, the three-dimensional coordinate information of the fixed device, the posture of the movable device, and the target placement position of the movable device, so that the movable device reaches the target placement position when moving along the movement path and does not interfere with the fixed device; and
    displaying the movement path superimposed on the real scene.
  3. The positioning guidance method for a movable device according to claim 2, characterized in that the positioning guidance method further comprises:
    acquiring the three-dimensional coordinate information of the movable device in real time;
    updating the movement path of the movable device in real time according to a relative relationship between the real-time three-dimensional coordinate information of the movable device, the three-dimensional coordinate information of the fixed device, and three-dimensional coordinate information of the target placement position; and
    displaying the updated movement path superimposed on the real scene.
  4. The positioning guidance method for a movable device according to claim 1, characterized in that there are a plurality of movable devices, and the positioning guidance method further comprises:
    acquiring three-dimensional coordinate information of the movable device currently being positioned;
    planning a movement path of the movable device currently being positioned according to the three-dimensional coordinate information and the target placement position of the movable device currently being positioned, the three-dimensional coordinate information of the fixed device, three-dimensional coordinate information of the other movable devices, and the postures of the movable device currently being positioned and of the other movable devices, so that the movable device currently being positioned reaches the target placement position when moving along the movement path and does not interfere with the fixed device or the other movable devices during the movement; and
    displaying the movement path superimposed on the real scene.
  5. The positioning guidance method for a movable device according to claim 4, characterized in that the positioning guidance method further comprises:
    acquiring the three-dimensional coordinate information of the movable device currently being positioned and of the other movable devices in real time;
    updating the movement path in real time according to a relative relationship between the real-time three-dimensional coordinate information of the movable device currently being positioned, the real-time three-dimensional coordinate information of the other movable devices, the three-dimensional coordinate information of the fixed device, and the three-dimensional coordinate information of the target placement position of the movable device currently being positioned; and
    displaying the updated movement path superimposed on the real scene.
  6. The positioning guidance method for a movable device according to any one of claims 2 to 5, characterized in that the positioning guidance method further comprises:
    determining a pose of the movable device from the three-dimensional coordinate information of the movable device; and registering the three-dimensional model of the movable device with the pose of the movable device by virtual-real fusion registration, and displaying it superimposed on the real scene.
  7. The positioning guidance method for a movable device according to claim 1, characterized in that the positioning guidance method further comprises:
    acquiring, in real time, a deviation between the three-dimensional coordinate information of the movable device and three-dimensional coordinate information of the target placement position, and judging, based on the deviation, whether the movable device has reached the target placement position.
  8. The positioning guidance method for a movable device according to claim 1, characterized in that there are a plurality of movable devices, and the positioning guidance method further comprises: planning a positioning order of the plurality of movable devices according to a preset rule.
  9. A positioning guidance system, characterized by comprising:
    a positioning device for acquiring three-dimensional coordinate information of a fixed device and three-dimensional coordinate information of a movable device;
    an augmented reality device for displaying a three-dimensional model of the movable device and a movement path of the movable device superimposed on a real scene; and
    a control unit communicatively connected to the positioning device and the augmented reality device and configured to perform the positioning guidance method for a movable device according to any one of claims 1 to 8.
  10. The positioning guidance system according to claim 9, characterized in that the control unit is configured to establish, based on the positioning device, a mapping relationship between a coordinate system of the augmented reality device, a coordinate system of the movable device and a coordinate system of the fixed device, and then to display the three-dimensional model of the movable device and the movement path of the movable device superimposed on the real scene acquired by the augmented reality device.
  11. The positioning guidance system according to claim 10, characterized in that the control unit is configured to establish, in a world coordinate system, a mapping relationship between a coordinate system of the positioning device and the coordinate system of the augmented reality device, a mapping relationship between the coordinate system of the positioning device and the coordinate system of the movable device, and a mapping relationship between the coordinate system of the positioning device and the coordinate system of the fixed device, and thereby to establish the mapping relationships between the coordinate system of the augmented reality device, the coordinate system of the movable device and the coordinate system of the fixed device.
  12. The positioning guidance system according to claim 11, characterized in that the positioning device is arranged on the augmented reality device, and the mapping relationship between the coordinate system of the positioning device and the coordinate system of the augmented reality device is established from their mechanical position.
  13. The positioning guidance system according to claim 11, characterized in that the positioning device is separate from the augmented reality device; the three-dimensional coordinate information of the augmented reality device is acquired by the positioning device, and the mapping relationship between the coordinate system of the positioning device and the coordinate system of the augmented reality device is established therefrom.
  14. A surgical robot system, characterized by comprising the positioning guidance system according to any one of claims 9 to 13 and a movable device, the movable device comprising at least one of a doctor console device, an image trolley and a patient-side device.
  15. A computer-readable storage medium having a program stored thereon, characterized in that, when the program is executed, the positioning guidance method for a movable device according to any one of claims 1 to 8 is performed.
PCT/CN2022/101376 2021-06-30 2022-06-27 Positioning guidance method and system for movable device, and surgical robot system WO2023274098A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110738626.2 2021-06-30
CN202110738626.2A CN113456221B (zh) 2021-06-30 2021-06-30 Positioning guidance method and system for movable device, and surgical robot system

Publications (1)

Publication Number Publication Date
WO2023274098A1 true WO2023274098A1 (zh) 2023-01-05

Family

ID=77876737

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/101376 WO2023274098A1 (zh) 2021-06-30 2022-06-27 Positioning guidance method and system for movable device, and surgical robot system

Country Status (2)

Country Link
CN (1) CN113456221B (zh)
WO (1) WO2023274098A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113456221B (zh) * 2021-06-30 2023-06-30 上海微创医疗机器人(集团)股份有限公司 Positioning guidance method and system for movable device, and surgical robot system
CN114305695B (zh) * 2021-12-06 2023-12-26 上海微创医疗机器人(集团)股份有限公司 Movement guidance method and system, readable storage medium, and surgical robot system
CN114564050A (zh) * 2022-03-03 2022-05-31 瑞龙诺赋(上海)医疗科技有限公司 Surgical platform positioning system, and pose information determination method and apparatus
CN115100257A (zh) * 2022-07-21 2022-09-23 上海微创医疗机器人(集团)股份有限公司 Cannula alignment method and apparatus, computer device, and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150134145A1 (en) * 2013-11-08 2015-05-14 Samsung Electronics Co., Ltd. Method and apparatus for controlling movement of medical device
CN106255471A (zh) * 2014-02-05 2016-12-21 直观外科手术操作公司 Systems and methods for dynamic virtual collision objects
US20170347979A1 (en) * 2016-06-01 2017-12-07 Siemens Healthcare Gmbh Method and device for motion control of a mobile medical device
CN108917758A (zh) * 2018-02-24 2018-11-30 石化盈科信息技术有限责任公司 AR-based navigation method and system
CN110547874A (zh) * 2018-05-30 2019-12-10 上海舍成医疗器械有限公司 Method for formulating a movement path, assembly therefor, and application in automated equipment
US20210068907A1 (en) * 2019-09-10 2021-03-11 Verb Surgical Inc. Handheld User Interface Device for a Surgical Robot
CN113456221A (zh) * 2021-06-30 2021-10-01 上海微创医疗机器人(集团)股份有限公司 Positioning guidance method and system for movable device, and surgical robot system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE456048B (sv) * 1982-02-24 1988-08-29 Philips Norden Ab Sett och anordning for att bestemma kollisionsrisken for tva inbordes rorliga kroppar
US11013480B2 (en) * 2012-06-28 2021-05-25 Koninklijke Philips N.V. C-arm trajectory planning for optimal image acquisition in endoscopic surgery
US9639666B2 (en) * 2013-03-15 2017-05-02 Covidien Lp Pathway planning system and method
CN105455901B (zh) * 2015-11-20 2018-02-02 清华大学 Obstacle-avoidance planning method and system for a surgical robot
CN112370159A (zh) * 2016-02-26 2021-02-19 思想外科有限公司 System for guiding a user in positioning a robot

Also Published As

Publication number Publication date
CN113456221B (zh) 2023-06-30
CN113456221A (zh) 2021-10-01

Legal Events

Code | Title | Description
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22831912; Country of ref document: EP; Kind code of ref document: A1
NENP | Non-entry into the national phase | Ref country code: DE
122 | Ep: pct application non-entry in european phase | Ref document number: 22831912; Country of ref document: EP; Kind code of ref document: A1