CN114848143A - Surgical navigation system and method based on a spine surgery assistance robot

Surgical navigation system and method based on a spine surgery assistance robot

Info

Publication number
CN114848143A
Authority
CN
China
Prior art keywords
robot
doctor
coordinate system
spine
patient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210230308.XA
Other languages
Chinese (zh)
Inventor
程鹏飞 (Cheng Pengfei)
何永义 (He Yongyi)
杨浩 (Yang Hao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN202210230308.XA priority Critical patent/CN114848143A/en
Publication of CN114848143A publication Critical patent/CN114848143A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 17/3403: Needle locating or guiding means (under A61B 17/34, Trocars; puncturing needles)
    • A61B 17/3472: Trocars; puncturing needles for bones, e.g. intraosseous injections
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/30: Surgical robots
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A61B 2034/108: Computer-aided selection or customisation of medical implants or cutting guides
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2065: Tracking using image or pattern recognition

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Robotics (AREA)
  • Pathology (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The invention discloses a surgical navigation system and method based on a spine surgery assistance robot. The method comprises preoperative planning, intraoperative execution and postoperative evaluation. Based on the vertebral lesion position and surgical direction confirmed by the doctor in the image, the robot moves automatically to align the surgical direction in the robot coordinate system and thereby guides the doctor through the operation, which facilitates the doctor's work, improves surgical efficiency and increases the accuracy and safety of the operation. In addition, spatial registration between the image coordinate system and the robot coordinate system is completed by using the provided spatial registration probe to pick up target points pasted on the patient's back, so secondary trauma to the patient is avoided.

Description

Surgical navigation system and method based on a spine surgery assistance robot
Technical Field
The invention relates to the technical field of medical instruments, and in particular to a surgical navigation system and method based on a spine surgery assistance robot.
Background
At present, China is gradually entering an aging society, and osteoporosis, now one of the most common metabolic bone diseases, is receiving more and more attention. Osteoporotic vertebral compression fracture (OVCF) is one of the common complications of osteoporosis and poses a significant health risk to patients. Percutaneous vertebroplasty (PVP) is a minimally invasive spinal surgery technique in which biological cement is injected percutaneously into a vertebral body through or outside the pedicle to increase the strength and stability of the vertebral body, prevent collapse, relieve pain and even partially restore vertebral height. Compared with conservative treatment, PVP for osteoporosis relieves pain quickly, largely restores vertebral shape and height, inhibits the vicious circle and improves the patient's condition, and it has therefore become the main clinical treatment for OVCF. In conventional PVP surgery, however, most doctors rely on two-dimensional medical images to assess the vertebral lesion, determine its position and the surgical direction by experience, and then refine them through repeated medical image scanning and adjustment.
Existing surgical navigation systems for assisting such spinal operations also depend on two-dimensional medical image information. They cannot intuitively show the three-dimensional shape of the vertebrae, so the lesion is hard to assess visually, and the whole PVP procedure is prone to inaccurate judgment of the surgical direction, difficult path planning and low precision, which affects the accuracy and safety of the operation. In addition, to establish the spatial registration relationship between the patient and the navigation system, a conventional system must mount a fixing device on a healthy vertebral body or another part of the body as a reference, which causes secondary trauma to the patient.
Disclosure of Invention
Aiming at the above defects of the prior art, the main purpose of the invention is to provide a surgical navigation system and method based on a spine surgery assistance robot, so as to solve the problems that existing surgical navigation systems lack three-dimensional reconstruction of spine vertebra examples, cannot visually judge the lesion state of a vertebral body or reflect the percutaneous vertebroplasty (PVP) procedure, and rely on a fixing device mounted on a healthy vertebral body or another part of the body as a registration reference, which causes secondary trauma to the patient.
In order to achieve the purpose, the invention adopts the following technical scheme:
the auxiliary robot for the spinal surgery comprises a three-coordinate platform, a six-axis parallel robot and a doctor working platform.
A surgical navigation system based on the spine surgery assistance robot, comprising:
the image processing module is used for processing the spinal CT data of the patient, segmenting spinal vertebra examples from other tissues of a human body and reconstructing the segmented spinal vertebra examples in a three-dimensional manner;
the space registration module is used for calculating the space registration relation between the image coordinate system and the robot coordinate system;
the surgical navigation module is used for receiving the position and surgical direction of the lesion area that the doctor selects in the three-dimensionally reconstructed spine vertebra example, obtaining the position and surgical direction of the lesion area in the robot coordinate system according to the spatial registration relation, and generating a surgical navigation instruction from the mounting direction of the end effector and the Z-axis travel distance that the doctor selects on the basis of that position and direction;
and the motion control module is used for controlling the three-coordinate platform and the six-axis parallel robot of the spinal surgery auxiliary robot to move according to the surgery navigation instruction, so that the end effector reaches an appointed pose.
Preferably, the spatial registration module uses bioelectrodes pasted on the back of the patient as target points; the doctor picks up the target points in the image coordinate system and in the robot coordinate system respectively to obtain their coordinate values, and the spatial registration relationship between the image coordinate system and the robot coordinate system is determined through a Helmert transformation.
Preferably, the number of the target points is at least 5, and the manner of picking up the target points in the robot coordinate system is that a doctor controls a space registration probe below the Y-axis tail end of the three-coordinate platform to contact the target points through a teleoperation rod, so as to obtain coordinate values of the target points in the robot coordinate system.
Preferably, the teleoperation rod is a controller for manually controlling the three-coordinate platform and the six-axis parallel robot, the space registration probe below the Y-axis tail end of the three-coordinate platform is a high-precision ruby measuring head and is designed to be of a foldable structure, and after a doctor finishes target point pickup, the space registration probe rotates for 90 degrees to be hidden, so that collision with the back skin of a human body in the operation process is avoided.
Preferably, the surgical navigation instruction comprises the movement amount of each axis of the three-coordinate platform, the pose of the six-axis parallel robot and the needle insertion depth within which the doctor is allowed to operate.
Preferably, the pose of the six-axis parallel robot includes the amount of movement of the X axis, a rotation angle β about the Y axis, and a rotation angle γ about the Z axis.
Preferably, the navigation system further comprises a navigation display module located on the doctor working platform, which interacts with the doctor throughout the operation, displays the spine vertebra example reconstruction result, processes the doctor's selection of the lesion position and direction in the spine vertebra example under the image coordinate system, and displays real-time information of the three-coordinate platform and the six-axis parallel robot as an operative reference for the doctor during the operation.
A surgical navigation method based on a spine surgery auxiliary robot comprises preoperative planning, intraoperative execution and postoperative evaluation;
1) the step of preoperative planning comprises:
1-1) anaesthetizing the patient, pasting a target point for the spatial registration on the back of the patient, scanning the patient by using CT and transmitting the spinal CT data of the patient to the doctor working platform;
1-2) segmenting spinal vertebra examples from other tissues of a human body according to spinal CT data of the patient, and reconstructing the segmented spinal vertebra examples in three dimensions;
1-3) the doctor picks up the target point under the image coordinate system and the robot coordinate system respectively to obtain the coordinate value of the target point, and determines the spatial registration relation between the image coordinate system and the robot coordinate system through Helmert transformation;
1-4) selecting the position and the operation direction of a lesion area by a doctor in the reconstructed spine vertebra example, and obtaining the position and the operation direction of the lesion area under a robot coordinate system according to the spatial registration relation;
1-5) selecting the installation direction of an end effector and the moving distance in the Z-axis direction by a doctor according to the position of a lesion area and the surgical direction in the robot coordinate system to generate a surgical navigation instruction;
2) the intraoperative steps include:
2-1) registering a patient, a guide tube and a puncture needle;
2-2) data analysis, wherein the three-coordinate platform and the six-axis parallel robot execute an operation navigation instruction to enable the end effector to reach an appointed pose;
2-3) a doctor installs a puncture guide tube and manually punctures a puncture needle into a lesion area in the body of a patient;
3) the step of post-operative assessment comprises:
3-1) carrying out CT scanning on the patient again, transmitting the spine CT data to a doctor working platform for image analysis, and comparing whether the puncture needle reaches the lesion position; if not, confirming the position and the operation direction of the lesion again to generate an operation navigation instruction, and if so, injecting biological cement;
3-2) after the current spine vertebra operation is finished, the patient carries out CT scanning again to evaluate the biological cement filling effect;
3-3) judging whether other vertebral bodies need to be operated, if so, continuing the process, otherwise, finishing the operation, and returning the spine operation auxiliary robot to the original point.
Compared with the prior art, the surgical navigation system and method based on the spine surgery assistance robot provided by the invention achieve the following technical effects with the above technical scheme:
They complete spine vertebra example segmentation and three-dimensional reconstruction from the spine CT images, reflect the three-dimensional shape of the vertebrae more intuitively, and make it easier for the doctor to confirm the lesion position and surgical direction in percutaneous vertebroplasty (PVP). In addition, target points pasted on the skin, used together with the high-precision ruby measuring head serving as the spatial registration probe, realize the spatial registration of the image coordinate system and the robot coordinate system without causing secondary trauma to the patient. Furthermore, the system and method generate navigation instructions from the lesion position and surgical direction confirmed by the doctor in the image and control the three-coordinate platform and the six-axis parallel robot so that the end effector reaches the designated pose; this reduces the number of CT scans and of adjustments of the needle insertion position and surgical direction, and allows surgery to be performed consecutively on a patient with several injured vertebrae, which facilitates the doctor's work and improves surgical efficiency.
Drawings
FIG. 1 is a diagrammatic view of a preferred embodiment of a spinal surgery-assisted robot-based surgical navigation system of the present invention.
Fig. 2 depicts an embodiment of the six-axis parallel robot, the rotatable, concealable spatial registration probe and the interchangeable end effector located at the Y-axis end of the three-coordinate platform of the spine surgery assistance robot.
Fig. 3 is a functional block diagram of the spinal surgery assistance robot in fig. 1.
FIG. 4 is a flow chart of the surgical navigation method based on the spine surgery auxiliary robot according to the preferred embodiment of the invention.
Detailed Description
To further illustrate the technical means and effects of the present invention adopted to achieve the above objects, the following detailed description of the embodiments, structures, features and effects of the present invention will be made with reference to the accompanying drawings and preferred embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Example one
Referring to fig. 1, a surgical navigation system based on a spine surgery assistance robot includes:
the image processing module is used for processing the spinal CT data of the patient, segmenting spinal vertebra examples from other tissues of a human body and reconstructing the segmented spinal vertebra examples in a three-dimensional manner;
the space registration module is used for calculating the space registration relation between the image coordinate system and the robot coordinate system;
the surgical navigation module is used for receiving the position and surgical direction of the lesion area that the doctor selects in the three-dimensionally reconstructed spine vertebra example, obtaining the position and surgical direction of the lesion area in the robot coordinate system according to the spatial registration relation, and generating a surgical navigation instruction from the mounting direction of the end effector and the Z-axis travel distance that the doctor selects on the basis of that position and direction;
and the motion control module is used for controlling the three-coordinate platform and the six-axis parallel robot of the spinal surgery auxiliary robot to move according to the surgery navigation instruction, so that the end effector reaches an appointed pose.
The surgical navigation system based on the spine surgery assistance robot automatically moves the robot to align the surgical direction in the robot coordinate system and thereby guides the doctor through the operation, which facilitates the doctor's work, improves surgical efficiency and increases the accuracy and safety of the operation.
Example two
This embodiment is substantially the same as the first embodiment, and is characterized in that:
in this embodiment, the spatial registration module attaches a bioelectrode to the back of the patient as target points, and the doctor picks up the target points in the image coordinate system and the robot coordinate system respectively to obtain coordinate values of the target points, and determines the spatial registration relationship between the image coordinate system and the robot coordinate system through the Helmert transformation.
In this embodiment, the number of the target points is at least 5, and the manner of picking up the target points in the robot coordinate system is that a doctor controls a spatial registration probe below the Y-axis end of the three-coordinate platform to contact the target points through a teleoperation stick, so as to obtain coordinate values of the target points in the robot coordinate system.
In this embodiment, the teleoperation rod is a controller for manually controlling a three-coordinate platform and a six-axis parallel robot, the spatial registration probe below the Y-axis end of the three-coordinate platform is a high-precision ruby probe, and is designed to be of a foldable structure, and after a doctor finishes target point pickup, the probe is rotated by 90 degrees to be hidden, so that collision with the back skin of a human body in an operation process is avoided.
In this embodiment, the surgical navigation instruction includes the movement amount of each axis of the three-coordinate platform, the pose of the six-axis parallel robot, and the needle insertion depth allowed by the doctor.
In the present embodiment, the pose of the six-axis parallel robot includes the amount of movement of the X axis, the rotation angle β around the Y axis, and the rotation angle γ around the Z axis.
In this embodiment, the navigation system further includes a navigation display module located on the doctor working platform, and is configured to interact with a doctor in the entire surgical procedure, display the spinal vertebra instance reconstruction result, the doctor selects the lesion position and direction of the spinal vertebra instance in the image coordinate system, and display real-time information of the three-coordinate platform and the six-axis parallel robot, so that the doctor can make an operative reference in the surgical procedure.
This embodiment can complete spine vertebra example segmentation and three-dimensional reconstruction from the spine CT image, reflects the three-dimensional shape of the vertebrae more intuitively, and makes it easier for the doctor to confirm the lesion position and surgical direction in percutaneous vertebroplasty (PVP). In addition, target points pasted on the skin, used together with the high-precision ruby measuring head serving as the spatial registration probe, realize the spatial registration of the image coordinate system and the robot coordinate system without causing secondary trauma to the patient. The surgical navigation system based on the spine surgery assistance robot generates navigation instructions from the lesion position and surgical direction confirmed by the doctor in the image and controls the three-coordinate platform and the six-axis parallel robot so that the end effector reaches the designated pose; this reduces the number of CT scans and of adjustments of the needle insertion position and surgical direction, and allows surgery to be performed consecutively on a patient with several injured vertebrae, which facilitates the doctor's work and improves surgical efficiency.
EXAMPLE III
This embodiment is substantially the same as the above embodiment, and is characterized in that:
in the embodiment, the surgical navigation method based on the spine surgery auxiliary robot comprises preoperative planning, intraoperative execution and postoperative evaluation;
1) the step of preoperative planning comprises:
1-1) anaesthetizing the patient, pasting a target point for the spatial registration on the back of the patient, scanning the patient by using CT and transmitting the spinal CT data of the patient to the doctor working platform;
1-2) segmenting spinal vertebra examples from other tissues of a human body according to spinal CT data of the patient, and reconstructing the segmented spinal vertebra examples in three dimensions;
1-3) the doctor picks up the target point under the image coordinate system and the robot coordinate system respectively to obtain the coordinate value of the target point, and determines the spatial registration relation between the image coordinate system and the robot coordinate system through Helmert transformation;
1-4) selecting the position and the operation direction of a lesion area by a doctor in the reconstructed spine vertebra example, and obtaining the position and the operation direction of the lesion area under a robot coordinate system according to the spatial registration relation;
1-5) selecting the installation direction of an end effector and the moving distance in the Z-axis direction by a doctor according to the position of a lesion area and the surgical direction in the robot coordinate system to generate a surgical navigation instruction;
2) the intraoperative steps include:
2-1) registering a patient, a guide tube and a puncture needle;
2-2) data analysis, wherein the three-coordinate platform and the six-axis parallel robot execute an operation navigation instruction to enable the end effector to reach an appointed pose;
2-3) a doctor installs a puncture guide tube and manually punctures a puncture needle into a lesion area in the body of a patient;
3) the step of post-operative assessment comprises:
3-1) carrying out CT scanning on the patient again, transmitting the spine CT data to a doctor working platform for image analysis, and comparing whether the puncture needle reaches the lesion position; if not, confirming the position and the operation direction of the lesion again to generate an operation navigation instruction, and if so, injecting biological cement;
3-2) after the current spine vertebra operation is finished, the patient performs CT scanning again to evaluate the filling effect of the biological cement;
3-3) judging whether other vertebral bodies need to be operated, if so, continuing the process, otherwise, finishing the operation, and returning the spine operation auxiliary robot to the original point.
This embodiment is a surgical navigation method based on the spine surgery assistance robot and comprises preoperative planning, intraoperative execution and postoperative evaluation. Preoperative planning comprises: anaesthetizing the patient and pasting the target points; CT scanning and transmitting the data to the doctor working platform; medical image processing with spine segmentation and three-dimensional reconstruction; selecting the target points in the image, moving the spatial registration probe with the teleoperation rod to pick up the corresponding target points and completing the spatial registration; and selecting the diseased vertebral body in the three-dimensional image, confirming the lesion position and surgical direction, and generating the surgical navigation instruction. Intraoperative execution comprises: registering the patient, the guide tube and the puncture needle; data analysis, in which the three-coordinate platform and the six-axis parallel robot execute the surgical navigation instruction so that the end effector reaches the designated pose; and installation of the puncture guide tube by the doctor, who manually advances the puncture needle into the lesion area in the patient. Postoperative evaluation comprises: CT scanning the patient again; transmitting the data to the doctor working platform for image analysis and checking whether the puncture needle has reached the lesion; if the needle insertion requirement is not met, reconfirming the lesion position and surgical direction and regenerating the surgical navigation instruction, otherwise injecting the biological cement; after completion, CT scanning the patient again to evaluate the cement filling effect; and finally deciding whether other vertebral bodies need surgery, in which case the process continues, otherwise the operation ends and the robot returns to its origin. Based on the vertebral lesion position and surgical direction confirmed by the doctor in the image, the spine surgery assistance robot moves automatically to align the surgical direction in the robot coordinate system and guides the doctor through the operation, which facilitates the doctor's work, improves surgical efficiency and increases the accuracy and safety of the operation. Meanwhile, the spatial registration between the image coordinate system and the robot coordinate system is completed by using the provided spatial registration probe to pick up the target points pasted on the patient's back, avoiding secondary trauma to the patient.
Example four
This embodiment is substantially the same as the above embodiment, and is characterized in that:
in this embodiment, referring to fig. 1 and 2, fig. 1 is a diagram of a preferred embodiment of the spinal surgery-assisted robot based surgical navigation system of the present invention, and fig. 2 is a diagram depicting a six-axis parallel robot, a rotatable hidden spatial registration probe, and an embodiment with interchangeable or alternative end effectors located at the Y-axis end of a spinal surgery-assisted robot three-coordinate platform. In the present embodiment, the spine surgery assistance robot includes, but is not limited to, a three-coordinate platform 102, a six-axis parallel robot 103, and a doctor working platform 106. The three-coordinate platform 102 is fixed on the operating table 100 for the prone position of the patient 101 to perform the operation, and is used for moving in a large range during the operation navigation, so as to ensure that the operation area can cover the whole spine range of the patient 101. A rotatable hidden space registration probe 203 is further arranged below the Y-axis tail end of the three-coordinate platform 102, and is used for picking up the target point 104 when the image coordinate system and the robot coordinate system are spatially registered, so that coordinate values of the target point 104 in the robot coordinate system are obtained. The six-axis parallel robot 103 is fixed above the Y-axis tail end of the three-coordinate platform 102, the upper platform 200 of the six-axis parallel robot 103 is a movable platform, an end effector 201 for performing an operation is arranged on the movable platform, and the end effector 201 can reach an appointed operation pose by adjusting the pose of the upper platform 200 during operation navigation. The doctor working platform 106 is provided with a display screen 105 and a remote operating rod 107, the display screen 105 can display a spine vertebra image of a patient in the operation process and the whole process of confirming the operation direction of a lesion position in real time, interaction with a doctor in the operation process is achieved, and the remote operating rod 107 can be used for the doctor to manually operate the three-coordinate platform 102 and the six-axis parallel robot 103. The surgical navigation system 301 is installed and operated in the doctor working platform 106 of the spine surgery assisting robot.
Referring to fig. 3, fig. 3 is a functional block diagram of the spinal surgery assistance robot 300 of fig. 1. In the present embodiment, the spine surgery assisting robot includes, but is not limited to, a surgical navigation system 301, a three-coordinate platform driver 310, a six-axis parallel robot driver 311, a display screen 105, a teleoperation rod 107, a memory 312, and a microprocessor 313. The three-coordinate platform driver 310, the six-axis parallel robot driver 311, the display screen 105, the teleoperation rod 107 and the memory 312 are all connected to the microprocessor 313 through a data bus and can perform information interaction with the surgical navigation system 301 through the microprocessor 313. The memory 312 may be a read-only memory unit ROM, an electrically erasable memory unit EEPROM or a FLASH memory unit FLASH, and is used to store program instruction codes constituting the surgical navigation system 301. The microprocessor 313 may be a Micro Controller Unit (MCU), a data processing chip, or an information processing unit with data processing function, and is configured to execute the surgical navigation system 301 to provide surgical guidance for a patient during a surgical procedure.
In this embodiment, the surgical navigation system 301 based on the spine surgery assistance robot includes, but is not limited to, an image processing module 302, a spatial registration module 303, a surgical navigation module 304, a motion control module 305 and a navigation display module 306. A module in the present invention refers to a series of computer program instruction segments that can be executed by the microprocessor 313 of the spine surgery assistance robot 300 to complete specific functions, and that are stored in the memory 312 of the spine surgery assistance robot 300.
The image processing module 302 is configured to process spinal CT data of a patient, segment a spinal vertebra instance from other tissues of a human body, and reconstruct the segmented spinal vertebra instance in three dimensions. In this embodiment, the spine CT data of the patient is in a DICOM file format, the method for segmenting the spine vertebra example from the rest tissues of the human body is a U-NET-based deep learning image segmentation method, and the method for three-dimensionally reconstructing the segmented spine vertebra example is a three-dimensional reconstruction method based on surface rendering.
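As an illustration of this pipeline (not part of the patent text), the Python sketch below reads a DICOM series, applies a generic U-NET-style network slice by slice and reconstructs a surface mesh with marching cubes as a stand-in for surface rendering; the model, threshold and helper names are assumptions of ours.

```python
# Minimal sketch of the image-processing pipeline: DICOM loading, U-NET
# segmentation and surface reconstruction. All names here are illustrative.
import numpy as np
import SimpleITK as sitk
import torch
from skimage import measure


def load_ct_volume(dicom_dir):
    """Read a DICOM series into a (slices, rows, cols) array plus voxel spacing in mm."""
    reader = sitk.ImageSeriesReader()
    reader.SetFileNames(reader.GetGDCMSeriesFileNames(dicom_dir))
    image = reader.Execute()
    volume = sitk.GetArrayFromImage(image).astype(np.float32)
    spacing = image.GetSpacing()[::-1]          # reorder (x, y, z) -> (z, y, x)
    return volume, spacing


def segment_vertebrae(volume, model):
    """Run a (hypothetical, already trained) U-NET slice by slice; return a binary mask."""
    masks = []
    model.eval()
    with torch.no_grad():
        for ct_slice in volume:
            x = torch.from_numpy(ct_slice)[None, None]      # shape (1, 1, H, W)
            prob = torch.sigmoid(model(x))[0, 0].numpy()
            masks.append(prob > 0.5)                        # assumed threshold
    return np.stack(masks)


def reconstruct_surface(mask, spacing):
    """Surface rendering stand-in: marching cubes gives vertices (mm) and faces."""
    verts, faces, _, _ = measure.marching_cubes(mask.astype(np.uint8),
                                                level=0.5, spacing=spacing)
    return verts, faces
```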
The spatial registration module 303 is configured to calculate a spatial registration relationship between the CT image coordinate system and the robot coordinate system. In this embodiment, in order to realize spatial registration, a doctor sticks a target point 104 on the back of a patient, and picks up the target point 104 in an image coordinate system and a robot coordinate system, respectively, to obtain coordinate values of the target point 104. The spatial registration module 303 records coordinate values of the target point 104 in the image coordinate system and the robot coordinate system, and determines a spatial registration relationship between the image coordinate system and the robot coordinate system through Helmert transformation. The number of the target points is 5, the method for picking up the target points in the robot coordinate system is that a doctor controls a space registration probe 203 below the Y-axis tail end of the three-coordinate platform 102 to contact with the target points 104 through a remote operating rod 107, and coordinate values of the target points 104 in the robot coordinate system are obtained in sequence.
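For illustration only, the Helmert (seven-parameter similarity) transformation mentioned here can be fitted by least squares from the five or more corresponding target-point coordinates, for example with the Umeyama/SVD method sketched below in Python; the function names are ours, not the patent's.

```python
# Minimal sketch of estimating the Helmert (similarity) transform between the
# image and robot coordinate systems from the picked target points.
import numpy as np


def helmert_transform(image_pts, robot_pts):
    """Least-squares fit robot = s * R @ image + t from N >= 3 point pairs.

    image_pts, robot_pts: (N, 3) arrays of corresponding target coordinates.
    Returns scale s, rotation R (3x3) and translation t (3,).
    """
    mu_i = image_pts.mean(axis=0)
    mu_r = robot_pts.mean(axis=0)
    Xi = image_pts - mu_i
    Xr = robot_pts - mu_r
    # Cross-covariance and its SVD give the optimal rotation (Umeyama method).
    H = Xi.T @ Xr / len(image_pts)
    U, S, Vt = np.linalg.svd(H)
    D = np.eye(3)
    if np.linalg.det(Vt.T @ U.T) < 0:           # guard against reflections
        D[2, 2] = -1.0
    R = Vt.T @ D @ U.T
    s = np.trace(np.diag(S) @ D) / np.mean(np.sum(Xi ** 2, axis=1))
    t = mu_r - s * R @ mu_i
    return s, R, t


def image_to_robot(p_img, s, R, t):
    """Map a point from the CT image frame into the robot frame."""
    return s * R @ p_img + t
```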
In this embodiment, the remote operation rod 107 is a controller for manually controlling the three-coordinate platform 102 and the six-axis parallel robot 103, and can control the three-coordinate platform 102 to move along three directions of the X-axis, the Y-axis, and the Z-axis, and also can control six axes of the six-axis parallel robot 103 to individually extend and retract or control the movable platform 200 of the six-axis parallel robot 103 to move along the directions of the X-axis, the Y-axis, and the Z-axis and rotate around the directions of the X-axis, the Y-axis, and the Z-axis. The space registration probe 203 positioned below the tail end of the Y axis of the three-coordinate platform is a high-precision ruby measuring head and is designed to be of a foldable structure, and after registration is completed by a doctor, the space registration probe can be hidden by rotating 90 degrees, so that collision with the skin of the back of a human body in the operation process is avoided. And a force sensor 204 is also arranged behind the spatial registration probe 203 and used for sensing the force and the direction of the spatial registration probe 203 when contacting the target point 104, so that the precision of spatial registration is improved.
The surgical navigation module 304 receives the position and surgical direction of the lesion area that the doctor selects in the reconstructed spine vertebra example, obtains the position and surgical direction of the lesion area in the robot coordinate system according to the spatial registration relationship, and generates a surgical navigation instruction from the mounting direction of the end effector 201 and the Z-axis travel distance that the doctor selects on the basis of that position and direction. In this embodiment, the surgical navigation instruction includes the movement amount of each axis of the three-coordinate platform 102, the pose of the six-axis parallel robot 103 and the needle insertion depth within which the doctor is allowed to operate. The pose of the six-axis parallel robot 103 comprises a movement amount along the X-axis direction, a rotation angle β around the Y-axis and a rotation angle γ around the Z-axis. Depending on the position of the lesion area and the chosen surgical direction, the end effector 201 can be mounted along either the positive or the negative X direction on the moving platform 200 of the six-axis parallel robot 103, which minimizes the required pose change of the robot, enlarges its operative range, shortens the time needed to reach the designated pose and improves surgical efficiency.
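To make the geometry concrete, the sketch below turns a doctor-selected entry point and target point (image frame) into an entry point, a unit surgical direction, the angles β and γ and a needle depth in the robot frame. It assumes the fitted Helmert parameters from the spatial registration module, a tool axis nominally along +X and an Rz(γ)·Ry(β) rotation convention; none of these conventions are fixed by the patent, so treat this purely as an illustration.

```python
# Illustrative sketch of deriving a navigation target in the robot frame.
# The angle convention and names are assumptions, not the patent's method.
import numpy as np


def navigation_target(entry_img, target_img, s, R, t):
    """entry_img / target_img: 3-vectors in the CT image frame (mm).
    s, R, t: Helmert parameters mapping image -> robot coordinates.
    Returns entry point, unit direction, beta, gamma (degrees) and needle depth (mm)."""
    entry_rob = s * R @ entry_img + t
    target_rob = s * R @ target_img + t
    d = target_rob - entry_rob
    depth = float(np.linalg.norm(d))             # straight-line insertion depth
    d = d / depth                                # unit surgical direction
    gamma = float(np.arctan2(d[1], d[0]))                    # rotation about Z
    beta = float(np.arctan2(-d[2], np.hypot(d[0], d[1])))    # rotation about Y
    return entry_rob, d, np.degrees(beta), np.degrees(gamma), depth
```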
The motion control module 305 is configured to control the three-coordinate platform 102 and the six-axis parallel robot 103 to move according to the surgical navigation instruction, so that the end effector 201 reaches a designated pose. In this embodiment, the three-coordinate platform 102 is controlled by controlling the three-coordinate platform driver 310 to further control the movement of the motors and the nut screws of the axes of the three-coordinate platform 102, and the six-axis parallel robot 103 is controlled by calculating the movement amount of each axis of the six-axis parallel robot 103 corresponding to the pose of the six-axis parallel robot 103 in the surgical navigation instruction through a kinematic inverse solution, and further controlling the six-axis parallel robot driver 311 to output, so that each axis of the six-axis parallel robot 103 reaches a specified length.
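The kinematic inverse solution of a six-axis parallel (Stewart-type) platform is closed form: each leg length follows directly from the commanded pose and the anchor points of the leg on the base and on the moving platform. The sketch below illustrates this under assumed anchor layouts and a Z-Y-X Euler-angle convention; it is not the patent's implementation.

```python
# Minimal sketch of the inverse kinematics of a six-axis parallel platform:
# given the commanded pose, compute the length each leg must reach.
import numpy as np


def rot_zyx(alpha, beta, gamma):
    """Rotation built from roll (about X), pitch (about Y), yaw (about Z), composed as Rz @ Ry @ Rx."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    return Rz @ Ry @ Rx


def inverse_kinematics(base_pts, plat_pts, translation, rpy):
    """base_pts, plat_pts: (6, 3) leg anchor points in the base frame and the
    moving-platform frame. translation: desired platform origin in the base
    frame. rpy: (roll, pitch, yaw) in radians. Returns the six leg lengths."""
    R = rot_zyx(*rpy)
    legs = translation + plat_pts @ R.T - base_pts   # leg vectors in base frame
    return np.linalg.norm(legs, axis=1)
```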
The navigation display module 306 is configured to interact with a doctor during a whole surgical procedure, display the spinal vertebra example reconstruction result, process a lesion position and a surgical direction of the spinal vertebra example selected by the doctor under an image coordinate system, and display real-time information of the three-coordinate platform 102 and the six-axis parallel robot 103, so that the doctor can make an operative reference during the surgical procedure. In this embodiment, the navigation display module 306 can display the segmentation and reconstruction results of the spine CT image, and can enlarge or reduce the local area according to the visualization requirement, so that the physician can more carefully observe the lesion area, select the target point 104 in the image, and confirm the location and the surgical direction of the lesion. The navigation display module 306 simultaneously retains the two-dimensional slice information for displaying the original spine CT image, and maps the lesion position and the operation direction confirmed by the doctor in the three-dimensional reconstructed spine vertebra example image into the two-dimensional slice of the original spine CT image, so that the habit of the doctor in reading the two-dimensional slice of the CT image is retained, the doctor can judge whether the operation direction affects other tissues around the lesion region, and the accuracy and the safety of the operation implementation process are improved.
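Mapping a lesion point chosen on the three-dimensional reconstruction back into an original two-dimensional CT slice only requires the standard DICOM geometry tags (ImagePositionPatient, ImageOrientationPatient, PixelSpacing and the slice spacing). The following rough sketch shows one way to do it; the function and argument names are ours.

```python
# Rough sketch of projecting a 3-D lesion point back to CT voxel indices.
import numpy as np


def world_to_voxel(p_world, origin, row_dir, col_dir, slice_dir,
                   pixel_spacing, slice_spacing):
    """p_world: lesion point in patient/world coordinates (mm).
    origin: ImagePositionPatient of the first slice.
    row_dir, col_dir: unit row/column direction cosines from ImageOrientationPatient.
    slice_dir: unit normal along the slice stacking direction.
    Returns (row, column, slice) indices in the CT volume."""
    d = np.asarray(p_world) - np.asarray(origin)
    col = np.dot(d, row_dir) / pixel_spacing[1]   # movement along a row -> column index
    row = np.dot(d, col_dir) / pixel_spacing[0]   # movement down a column -> row index
    slc = np.dot(d, slice_dir) / slice_spacing
    return int(round(row)), int(round(col)), int(round(slc))
```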
The invention also provides a surgical navigation method based on the spine surgery auxiliary robot, which is applied to the spine surgery auxiliary robot 300. Referring to fig. 4, fig. 4 is a flowchart of a preferred embodiment of the surgical navigation method based on the spine surgery auxiliary robot of the present invention. In this embodiment, referring to fig. 1, fig. 2 and fig. 3, the surgical navigation method based on the spine surgery assisting robot includes the following steps: pre-operative planning 410, intra-operative execution 420, post-operative assessment 430; specifically, the preoperative plan 410 further includes the steps of:
step S411, anaesthetizing the patient, and pasting the target point 104 for the space registration on the back of the patient, wherein the pasting of the target point 104 basically covers the whole spine vertebral region, and the precision of the space registration is improved.
Step S412, scanning the patient by using CT and transmitting the spine CT data of the patient to the doctor working platform 106, wherein the spine CT data is in a DICOM file format.
Step S413, according to the patient spine CT data, segmenting spine vertebra examples from other tissues of a human body, and reconstructing the segmented spine vertebra examples in a three-dimensional mode; in this embodiment, the method for processing the spine vertebra example segmented from the rest tissues of the human body by the image processing module 302 is a U-NET-based deep learning image segmentation method, and the method for three-dimensionally reconstructing the segmented spine vertebra example is a three-dimensional reconstruction method based on surface rendering.
Step S414, respectively selecting a target point by a doctor in an image coordinate system, picking up the target point 104 by moving a space registration probe through a teleoperation rod under a robot coordinate system, obtaining a coordinate value of the target point 104, and determining a space registration relation between the image coordinate system and the robot coordinate system through Helmert transformation; specifically, the spatial registration module 303 calculates a spatial registration relationship between the CT image coordinate system and the robot spatial coordinate system. In this embodiment, the spatial registration module 303 records coordinate values of the target point 104 in a CT image coordinate system and a robot spatial coordinate system, and determines a spatial registration relationship between the image coordinate system and the robot coordinate system through Helmert transformation. The number of the target points is 5, and the manner of picking up the target points in the robot space coordinate system is that a doctor controls a space registration probe 203 below the Y-axis tail end of the three-coordinate platform 102 to contact with the target points 104 through a remote operating rod 107, and coordinate values of the target points 104 in the robot coordinate system are obtained in sequence.
In step S415, the doctor selects the diseased vertebral body and confirms the lesion position and surgical direction in the three-dimensionally reconstructed spine vertebra example. Specifically, the surgical navigation module 304 processes the position and surgical direction of the lesion area selected by the doctor in the reconstructed vertebra example, obtains the position and surgical direction of the lesion area in the robot coordinate system according to the spatial registration relationship, and generates a surgical navigation instruction from that position and direction together with the end-effector mounting direction and Z-axis travel distance selected by the doctor. In this embodiment, the surgical navigation command includes the movement amount of each axis of the three-coordinate platform 102, the pose of the six-axis parallel robot 103 and the needle insertion depth within which the doctor is allowed to operate. The pose of the six-axis parallel robot 103 comprises a movement amount along the X-axis direction, a rotation angle β around the Y-axis and a rotation angle γ around the Z-axis. Depending on the position of the lesion area and the chosen surgical direction, the end effector 201 can be mounted in either the positive or the negative X direction on the moving platform 200 of the six-axis parallel robot 103, which minimizes the required pose change of the robot, enlarges its operative range, shortens the time needed to reach the designated pose and improves surgical efficiency.
The intraoperative performing 420 further comprises the steps of:
step S421, registering the patient, the guide tube and the puncture needle; in this embodiment, the puncture needle is used to perform the puncture on the back and the vertebral body of the patient.
Step S422, according to the operation navigation instruction, the three-coordinate platform 102 and the six-axis parallel robot 103 are controlled to move, so that the end effector 201 reaches an appointed pose. Specifically, the motion control module 305 controls the three-coordinate platform 102 and the six-axis parallel robot 103 according to the surgical navigation instruction, so that the end effector 201 reaches a designated pose. In this embodiment, the three-coordinate platform 102 is controlled by controlling the three-coordinate platform driver 310 to further control the movement of the motor and the nut screw of each axis of the three-coordinate platform 102, and the six-axis parallel robot 103 is controlled by calculating the movement amount of each axis of the six-axis parallel robot 103 corresponding to the pose of the six-axis parallel robot 103 in the surgical navigation instruction through a kinematic inverse solution, and further controlling the six-axis parallel robot driver 311 to output, so that each axis of the six-axis parallel robot 103 reaches a specified length.
In step S423, the doctor installs the puncture guide tube and manually pierces the puncture needle into the lesion area in the patient. Specifically, the navigation display module 306 is used for interacting with a doctor throughout the operation process, displaying the spinal vertebra example reconstruction result, processing the position and the direction of the lesion selected by the doctor in an image coordinate system, and displaying real-time information of the three-coordinate platform 102 and the six-axis parallel robot 103 for the doctor to make an operation reference during the operation process. In this embodiment, the navigation display module 306 can display the segmentation and reconstruction results of the spine CT image, and enlarge or reduce the local region according to the visualization requirement, so that the physician can more carefully observe the lesion region, select the target point 104 in the image, and specify the lesion position and the surgical direction. The navigation display module 306 simultaneously retains the two-dimensional slice information for displaying the original spine CT image, maps the lesion position and the operation direction specified by the doctor in the three-dimensional reconstructed spine vertebra example image into the two-dimensional slice of the original spine CT image, retains the habit of the doctor in reading the two-dimensional slice of the CT image, is beneficial to the doctor to judge whether the specified operation direction influences other tissues around the spine vertebra, and improves the accuracy and the safety of the puncture needle process.
The post-operative assessment 430 further comprises the steps of:
step 431, CT scanning the patient again;
step 432, transmitting the spine CT data in the step 431 to the doctor working platform 106, and performing image analysis by the image processing module 302, and comparing whether the puncture needle reaches the lesion position;
step 433, judging whether the needle insertion requirement is met, if not, confirming the position and the operation direction of the lesion again, generating an operation navigation instruction again, and if so, injecting biological cement; after the current spinal vertebra surgery is completed, the patient is CT scanned again to assess the biological cement filling effect.
Step 434, judging whether other spine vertebrae need to be operated on; if so, steps S415 to S433 are repeated, otherwise the operation is finished and the spine surgery assistance robot returns to the origin.
The surgical navigation system and method based on the spine surgery assistance robot can complete spine vertebra example segmentation and three-dimensional reconstruction from the spine CT image, reflect the three-dimensional shape of the vertebrae more intuitively, and make it easier for the doctor to confirm the lesion position and surgical direction in percutaneous vertebroplasty (PVP). In addition, target points pasted on the skin, used together with the high-precision ruby measuring head serving as the spatial registration probe, realize the spatial registration of the image coordinate system and the robot coordinate system without causing secondary trauma to the patient. Furthermore, according to the lesion position and surgical direction of the spine vertebra example confirmed by the doctor in the three-dimensionally reconstructed image, the system controls the three-coordinate platform and the six-axis parallel robot so that the end effector reaches the designated pose; this reduces the number of CT scans and of adjustments of the needle insertion position and surgical direction, and allows surgery to be performed consecutively on a patient with trauma of several vertebrae, which facilitates the doctor's work and improves surgical efficiency.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent functions made by the present specification and drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (8)

1. A surgical navigation system based on a spine surgery auxiliary robot is characterized by comprising:
the image processing module is used for processing the spinal CT data of the patient, segmenting spinal vertebra examples from other tissues of a human body and reconstructing the segmented spinal vertebra examples in a three-dimensional manner;
the space registration module is used for calculating the space registration relation between the image coordinate system and the robot coordinate system;
the surgical navigation module is used for receiving the position and surgical direction of the lesion area that the doctor selects in the three-dimensionally reconstructed spine vertebra example, obtaining the position and surgical direction of the lesion area in the robot coordinate system according to the spatial registration relation, and generating a surgical navigation instruction from the mounting direction of the end effector and the Z-axis travel distance that the doctor selects on the basis of that position and direction;
and the motion control module is used for controlling the three-coordinate platform and the six-axis parallel robot of the spinal surgery auxiliary robot to move according to the surgery navigation instruction, so that the end effector reaches an appointed pose.
2. The surgical navigation system based on the spine surgery auxiliary robot as claimed in claim 1, wherein the spatial registration module uses bioelectrodes pasted on the back of the patient as target points; the doctor picks up the target points under the image coordinate system and the robot coordinate system respectively to obtain their coordinate values, and the spatial registration relationship between the image coordinate system and the robot coordinate system is determined through a Helmert transformation.
3. The surgical navigation system based on the spine surgery auxiliary robot according to claim 2, wherein the number of the target points is at least 5, and the target points are picked up under the robot coordinate system in a manner that a doctor controls a space registration probe below the Y-axis end of the three-coordinate platform to contact with the target points through a teleoperation rod, so as to obtain coordinate values of the target points under the robot coordinate system.
4. The surgical navigation system based on the spine surgery auxiliary robot according to claim 3, wherein the teleoperation rod is a controller for manually controlling the three-coordinate platform and the six-axis parallel robot, the space registration probe below the Y-axis tail end of the three-coordinate platform is a high-precision ruby probe and is designed to be a foldable structure, and the space registration probe rotates 90 degrees after a doctor finishes picking up a target point and is hidden, so that the space registration probe is prevented from colliding with the back skin of a human body in a surgical process.
5. The spine surgery assisted robot based surgical navigation system according to claim 1, wherein the surgical navigation instruction comprises the movement amount of each axis of the three-coordinate platform, the pose of the six-axis parallel robot and the needle insertion depth within which the doctor is allowed to operate.
6. The spine surgery assisted robot based surgical navigation system according to claim 5, wherein the pose of the six-axis parallel robot includes an amount of movement of the X-axis, a rotation angle β around the Y-axis, and a rotation angle γ around the Z-axis.
7. The surgical navigation system based on the spine surgery auxiliary robot as claimed in any one of claims 1 to 6, further comprising a navigation display module located on the doctor working platform and used for interacting with a doctor during the whole surgical process, displaying the reconstruction result of the spine vertebra example, the doctor selecting the lesion position and direction of the spine vertebra example under an image coordinate system, and displaying the real-time information of the three-coordinate platform and the six-axis parallel robot for the doctor to make surgical reference during the surgical process.
8. A surgical navigation method based on a spine surgery auxiliary robot is characterized by comprising preoperative planning, intraoperative execution and postoperative evaluation;
1) the step of preoperative planning comprises:
1-1) anaesthetizing the patient, pasting a target point for the spatial registration on the back of the patient, scanning the patient by using CT and transmitting the spinal CT data of the patient to the doctor working platform;
1-2) segmenting spinal vertebra examples from other tissues of a human body according to spinal CT data of the patient, and reconstructing the segmented spinal vertebra examples in three dimensions;
1-3) the doctor picks up the target point under the image coordinate system and the robot coordinate system respectively to obtain the coordinate value of the target point, and determines the spatial registration relation between the image coordinate system and the robot coordinate system through Helmert transformation;
1-4) selecting the position and the operation direction of a lesion area by a doctor in the reconstructed spine vertebra example, and obtaining the position and the operation direction of the lesion area under a robot coordinate system according to the spatial registration relation;
1-5) selecting the installation direction of an end effector and the moving distance in the Z-axis direction by a doctor according to the position of a lesion area and the surgical direction in the robot coordinate system to generate a surgical navigation instruction;
2) the intraoperative steps include:
2-1) registering a patient, a guide tube and a puncture needle;
2-2) data analysis, wherein the three-coordinate platform and the six-axis parallel robot execute an operation navigation instruction to enable the end effector to reach an appointed pose;
2-3) a doctor installs a puncture guide tube and manually punctures a puncture needle into a lesion area in the body of a patient;
3) the step of post-operative assessment comprises:
3-1) carrying out CT scanning on the patient again, transmitting the spine CT data to a doctor working platform for image analysis, and comparing whether the puncture needle reaches the lesion position; if not, confirming the position and the operation direction of the lesion again, generating an operation navigation instruction, and if so, injecting biological cement;
3-2) after the current spine vertebra operation is finished, the patient carries out CT scanning again to evaluate the biological cement filling effect;
3-3) judging whether other vertebral bodies need to be operated, if so, continuing the process, otherwise, finishing the operation, and returning the spine operation auxiliary robot to the original point.
CN202210230308.XA 2022-03-10 2022-03-10 Operation navigation system and method based on spine operation auxiliary robot Pending CN114848143A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210230308.XA CN114848143A (en) 2022-03-10 2022-03-10 Operation navigation system and method based on spine operation auxiliary robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210230308.XA CN114848143A (en) 2022-03-10 2022-03-10 Operation navigation system and method based on spine operation auxiliary robot

Publications (1)

Publication Number Publication Date
CN114848143A 2022-08-05

Family

ID=82628541

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210230308.XA Pending CN114848143A (en) 2022-03-10 2022-03-10 Operation navigation system and method based on spine operation auxiliary robot

Country Status (1)

Country Link
CN (1) CN114848143A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115462901A (en) * 2022-10-20 2022-12-13 哈尔滨思哲睿智能医疗设备股份有限公司 Operation real-time navigation method, device, system, equipment and medium
CN117323004A (en) * 2023-09-26 2024-01-02 北京长木谷医疗科技股份有限公司 Navigation positioning system of spinal surgery robot
CN117323004B (en) * 2023-09-26 2024-04-26 北京长木谷医疗科技股份有限公司 Navigation positioning system of spinal surgery robot
CN117281594A (en) * 2023-11-17 2023-12-26 中国人民解放军总医院第一医学中心 Spinal surgery puncture locator
CN117281594B (en) * 2023-11-17 2024-02-23 中国人民解放军总医院第一医学中心 Spinal surgery puncture locator

Similar Documents

Publication Publication Date Title
CN108024838B (en) System and method for using registered fluoroscopic images in image-guided surgery
US11950859B2 (en) Navigation and positioning system and method for joint replacement surgery robot
CN112220557B (en) Operation navigation and robot arm device for craniocerebral puncture and positioning method
CN114848143A (en) Operation navigation system and method based on spine operation auxiliary robot
JP4822634B2 (en) A method for obtaining coordinate transformation for guidance of an object
JP4469423B2 (en) Stereotaxic treatment apparatus and method
CN101474075B (en) Navigation system of minimal invasive surgery
CN102784003B (en) Pediculus arcus vertebrae internal fixation operation navigation system based on structured light scanning
CN113316429A (en) System and method for registration and navigation between coordinate systems
CN109984843B (en) Fracture closed reduction navigation system and method
CN109758233B (en) Diagnosis and treatment integrated operation robot system and navigation positioning method thereof
US20110060216A1 (en) Method and Apparatus for Surgical Navigation of a Multiple Piece Construct for Implantation
CN202751447U (en) Vertebral pedicle internal fixation surgical navigation system based on structured light scanning
CN112641512B (en) Spatial registration method applied to preoperative robot planning
EP3673854B1 (en) Correcting medical scans
CN114159160B (en) Surgical navigation method, device, electronic equipment and storage medium
CN114515193B (en) Parallel robot, system and medical system
CN115553883A (en) Percutaneous spinal puncture positioning system based on robot ultrasonic scanning imaging
CN113729941B (en) VR-based operation auxiliary positioning system and control method thereof
CN109745074A (en) A kind of system and method for 3-D supersonic imaging
US20240285355A1 (en) Robot equipped with an ultrasound probe for real-time guidance in percutaneous interventions
CN115775611A (en) Puncture operation planning system
CN115227349A (en) Lung puncture robot based on optical tracking technology
CN114418960A (en) Image processing method, system, computer device and storage medium
CN114617635A (en) Minimally invasive surgery device, control method thereof and minimally invasive surgery system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination