CN114098991A - Surgical robot control method, medium and device based on real-time perspective image - Google Patents

Info

Publication number
CN114098991A
Authority
CN
China
Prior art keywords
surgical instrument
preset
sub
image plane
surgical
Prior art date
Legal status
Pending
Application number
CN202210082854.3A
Other languages
Chinese (zh)
Inventor
刘华根
王玉鑫
王英
Current Assignee
Yishengxin Technology Beijing Co ltd
Original Assignee
Yishengxin Technology Beijing Co ltd
Priority date
Filing date
Publication date
Application filed by Yishengxin Technology Beijing Co ltd
Priority to CN202210082854.3A
Publication of CN114098991A
Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 34/32 Surgical robots operating autonomously
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2065 Tracking using image or pattern recognition

Abstract

The invention relates to a surgical robot control method, medium, and device based on real-time perspective images, wherein the control method comprises the following steps: receiving a first control signal; determining a first action instruction according to the first control signal; and sending the first action instruction, wherein the first action instruction is used for instructing a surgical instrument at the surgical robot execution end to move in a preset perspective image plane of the perspective image. The surgical robot control method based on real-time perspective images provided by the embodiments of the invention can determine the position of the surgical instrument by means of the perspective image and confine the movement of the surgical instrument to the preset perspective image plane, so that the distal end of the surgical instrument can accurately move to a target point and conveniently execute the corresponding surgical actions, improving surgical efficiency and the surgical success rate.

Description

Surgical robot control method, medium and device based on real-time perspective image
Technical Field
The invention relates to the technical field of surgical robots, and in particular to a surgical robot control method, medium, and device based on real-time perspective images.
Background
In the related art, the manipulating end of a surgical robot generally has only one manipulating component, for example a single control rod for controlling the displacement, rotation, and so on of the distal end of the surgical robot. Such a surgical robot is inconvenient for doctors to use and has difficulty meeting the needs of operations with specific requirements on distal-end motion.
Disclosure of Invention
To overcome the problems in the related art, the present invention provides a method, medium, and apparatus for controlling a surgical robot based on real-time fluoroscopic images.
According to a first aspect of the present invention, there is provided a surgical robot control method based on real-time fluoroscopic images, the control method being applied to a main control end, the control method including:
receiving a first control signal;
determining a first action instruction according to the first control signal;
and sending the first action instruction, wherein the first action instruction is used for instructing a surgical instrument of the surgical robot execution end to move in a preset perspective image plane of a perspective image.
In some embodiments of the present invention, the first manipulation signal comprises a first sub-manipulation signal, the first motion instruction comprises a first sub-motion instruction corresponding to the first sub-manipulation signal, and the first sub-motion instruction is used to instruct translation of the surgical instrument in a first direction within the preset fluoroscopic image plane;
the first control signal comprises a second sub-control signal, the first action instruction comprises a second sub-action instruction corresponding to the second sub-control signal, and the second sub-action instruction is used for indicating the rotation of the surgical instrument in the preset perspective image plane;
the first manipulation signal comprises a third sub-manipulation signal, the first action instruction comprises a third sub-action instruction corresponding to the third sub-manipulation signal, and the third sub-action instruction is used for indicating translation of the surgical instrument in a second direction in the preset fluoroscopic image plane; the second direction is the extending direction of the surgical instrument, and the first direction and the second direction form an included angle.
In some embodiments of the invention, the second sub-motion instruction is used to instruct the surgical instrument to rotate within the preset fluoroscopic image plane around a distal end point of the surgical instrument.
In some embodiments of the invention, before receiving the first manipulation signal, the control method further comprises:
receiving a second control signal moving into the preset perspective image plane;
determining a second action instruction according to the second control signal;
and sending the second action instruction, wherein the second action instruction is used for indicating the surgical instrument to move into the preset perspective image plane.
In some embodiments of the invention, the control method further comprises:
receiving a feedback signal of the surgical instrument moving into the preset perspective image plane;
and sending out the feedback signal.
According to a second aspect of the present invention, there is provided a surgical robot control method based on real-time fluoroscopic images, the control method being applied to a manipulation terminal, the control method including:
entering a first preset state based on a received first operation instruction for locking a motion plane or a feedback signal of a surgical instrument at a surgical robot execution end moving to a preset perspective image plane of a perspective image;
in the first preset state,
receiving a second operation instruction;
determining a first control signal according to the second operation instruction;
and sending the first control signal, wherein the first control signal is used for indicating the surgical instrument to move in the preset perspective image plane.
In some embodiments of the invention, the first manipulation signal comprises a first sub-manipulation signal for indicating a translation of the surgical instrument in a first direction within the preset fluoroscopic image plane;
the first manipulation signal comprises a second sub-manipulation signal, and the second sub-manipulation signal is used for indicating the rotation of the surgical instrument in the preset perspective image plane;
the first manipulation signal comprises a third sub-manipulation signal for indicating translation of the surgical instrument in a second direction within the preset fluoroscopic image plane; the second direction is the extending direction of the surgical instrument, and the first direction and the second direction form an included angle.
In some embodiments of the present invention, before receiving the first operation instruction for locking the motion plane or the feedback signal that the surgical instrument at the surgical robot executing end has moved into the preset fluoroscopic image plane of the fluoroscopic image, the control method further includes:
receiving a third operation instruction moving into the preset perspective image plane;
determining a second control signal according to the third operation instruction;
and sending the second control signal, wherein the second control signal is used for indicating the surgical instrument to move into the preset perspective image plane.
According to a third aspect of the present invention, there is provided a computer readable storage medium having stored thereon a computer program which, when executed, carries out the steps of the method as described above.
According to a fourth aspect of the present invention, there is provided a computer device comprising a processor, a memory and a computer program stored on the memory, the processor implementing the steps of the method as described above when executing the computer program.
The surgical robot control method based on perspective images provided by the embodiments of the invention can determine the position of the surgical instrument by means of the perspective image and can confine the movement of the surgical instrument to the preset perspective image plane of the perspective image, so that the distal end of the surgical instrument can accurately move to a target point and conveniently execute the corresponding surgical actions, improving surgical efficiency and the surgical success rate.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an embodiment of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a block diagram of a surgical robotic system shown in accordance with an exemplary embodiment;
FIG. 2 is a schematic diagram of the surgical robot executing end in a surgical robotic system according to an exemplary embodiment;
FIG. 3 is a flowchart illustrating a fluoroscopic-based surgical robot control method according to an exemplary embodiment;
FIG. 4 is a schematic diagram of a fluoroscopic image according to an exemplary embodiment;
FIG. 5 is a schematic structural diagram of a manipulation end in a surgical robotic system shown in accordance with an exemplary embodiment;
FIG. 6 is a schematic diagram illustrating a coordinate system construction of a first rocker in the manipulation end according to an exemplary embodiment;
FIG. 7 is a schematic representation of the operation of the surgical robot executing end corresponding to FIG. 6;
FIG. 8 is a schematic diagram illustrating a coordinate system construction of a second rocker in the manipulation end according to an exemplary embodiment;
FIG. 9 is a schematic representation of the operation of the surgical robot executing end corresponding to FIG. 8;
FIG. 10 is a schematic diagram illustrating a coordinate system construction of a push rod in the manipulation end according to an exemplary embodiment;
FIG. 11 is a schematic representation of the operation of the surgical robot executing end corresponding to FIG. 10;
FIG. 12 is a flowchart illustrating a fluoroscopic-based surgical robot control method according to an exemplary embodiment;
FIG. 13 is a block diagram illustrating a computer device according to an example embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention. It should be noted that the embodiments and features of the embodiments of the present invention may be arbitrarily combined with each other without conflict.
The manipulating end of a surgical robot usually has only one manipulating component, for example a single control rod, to control the displacement and rotation of the distal end of the surgical robot, with a switching device such as a button used to switch the operation mode of that component. Such a surgical robot has difficulty meeting the requirements of operations with specific demands on distal-end motion; for example, in a puncture operation, needle insertion and withdrawal must be carried out along the direction of the puncture needle, and that direction may change at any time.
In addition, after the puncture needle enters the skin at the preset entry point and in the preset puncture direction, the soft tissue can deform under puncture and compression, and tissues of the lung and abdomen also move with respiration; if the puncture needle continues strictly along the preset puncture direction, there is still a risk that the target point cannot be reached.
Based on this, embodiments of the invention provide a surgical robot control method based on real-time perspective images, which can determine the position of the puncture needle by means of the perspective image and confine the movement of the surgical instrument to a preset perspective image plane, so that the distal end of the surgical instrument can accurately move to the target point and execute the corresponding surgical actions, improving surgical efficiency and the surgical success rate.
For ease of understanding, the composition of the surgical robot system to which the surgical robot belongs is first described. As shown in fig. 1, the surgical robot system includes a surgical robot executing end 300, a manipulation terminal 100, and a main control terminal 200. The manipulation terminal 100 is the manipulating part of the surgical robot system: an operator, such as a doctor, performs an operation at the manipulation terminal 100, thereby issuing an operation instruction to it. After receiving the operation instruction, the manipulation terminal 100 generates a manipulation signal according to the operation instruction and sends the manipulation signal to the main control terminal 200; the main control terminal 200 determines an action instruction according to the manipulation signal sent by the manipulation terminal 100 and instructs the surgical robot executing end 300 to perform the corresponding action according to the determined action instruction.
As shown in fig. 2, the surgical robot executing end 300 includes a surgical instrument 310 and a driving mechanism 320, the driving mechanism 320 is connected to the surgical instrument 310 and is used for driving the surgical instrument 310 to move, and the driving mechanism 320 may be, for example, a six-axis driving mechanism, so that various complex motions can be realized, and the accuracy of the motions can be ensured.
The manipulation terminal 100 includes a motion acquisition unit 140 and a manipulation component; the motion acquisition unit 140 is configured to collect motion information of the manipulation component and to send a manipulation signal to the main control terminal 200 according to that motion information. The manipulation signal contains the motion information of the manipulation component: for example, in an embodiment where the manipulation component includes a rocker, the manipulation signal may include the motion angle and motion speed of the rocker; in an embodiment where the manipulation component includes a push rod, the manipulation signal may include the motion direction and motion speed of the push rod.
As shown in fig. 3, an embodiment of the present application provides a surgical robot control method based on real-time fluoroscopic images, where the control method is applied to a main control end 200, and the control method includes:
s100, receiving a first control signal;
s200, determining a first action instruction according to the first control signal;
s300, sending a first action instruction, wherein the first action instruction is used for instructing a surgical instrument at the execution end of the surgical robot to move in a preset perspective image plane of the perspective image.
In this embodiment, after receiving the first manipulation signal sent by the manipulation terminal 100, the main control terminal 200 determines a first action instruction for instructing the surgical robot performing terminal 300 to act according to the first manipulation signal. In some embodiments, the first manipulation signal is motion information of a manipulation component, and the first motion command is motion information of the driving mechanism 320, for example, in an embodiment where the driving mechanism 320 is a six-axis driving mechanism, the first motion command is motion information of each motor in the six-axis driving mechanism. In other embodiments, the surgical robot performing end 300 may have a motion processing device, and in this case, the first motion instruction may be target motion information of the surgical instrument 310, and the motion processing device of the surgical robot performing end 300 may automatically generate motion information of the driving mechanism 320 according to the target motion information of the surgical instrument 310, for example, automatically generate motion information of each motor in a six-axis driving mechanism.
The first action instruction is used to instruct the surgical instrument 310 to move within the preset fluoroscopic image plane of the fluoroscopic image. Movement of the surgical instrument 310 within the preset fluoroscopic image plane means that the central axis of the surgical instrument 310 remains in the preset fluoroscopic image plane throughout the movement, so that the image of the preset fluoroscopic image plane always contains the image of the surgical instrument 310. Illustratively, the surgical instrument 310 is a biopsy needle, a cryoablation needle, a particle implantation needle, a radio-frequency puncture needle, a microwave puncture needle, or the like, and the fluoroscopic images are CT images, DSA images, MR images, ultrasound images, or the like. A display device may be disposed at the manipulation terminal 100 to display the image of the preset fluoroscopic image plane, so as to facilitate the operator's observation of the surgical procedure.
As shown in fig. 4, the fluoroscopic image of the preset fluoroscopic image plane includes a surgical instrument image A, a surgical instrument artifact B, and a target point C. Since the surgical instrument 310 may form an artifact in the preset fluoroscopic image plane, the artifact may be used to indicate the motion trajectory of the surgical instrument 310; that is, the artifact may serve as an extension line of the surgical instrument 310. If the artifact direction is aligned with the target point, the surgical instrument 310 will reach the target point position by continuing its current motion. If the artifact direction is not aligned with the target point, the manipulation terminal 100 may send a first manipulation signal to the main control terminal 200, and the main control terminal 200 sends a first action instruction to the surgical robot executing end 300 according to the first manipulation signal to instruct the surgical instrument 310 to move until the artifact direction of the surgical instrument 310 is aligned with the target point. Because the movement of the surgical instrument 310 is always confined to the preset fluoroscopic image plane, the relative position relationship between the surgical instrument 310 and the target point is conveniently obtained, so that the position of the surgical instrument 310 can be conveniently adjusted and the surgical efficiency and success rate are improved; for example, when the surgical instrument 310 is a puncture needle, the efficiency and success rate of the puncture operation can be improved.
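The alignment check described above can be illustrated with a short Python sketch that tests whether the artifact direction (the extension line of the surgical instrument image) passes close enough to the target point C in the preset fluoroscopic image plane. The function name, the pixel tolerance, and the use of 2D image coordinates are assumptions for illustration; the patent does not prescribe any particular implementation.

```python
import numpy as np

def artifact_aligned(tip_xy, tail_xy, target_xy, tol_px=3.0):
    """Return True if the extension line of the instrument axis passes within
    tol_px pixels of the target point and the target lies ahead of the tip.

    tip_xy, tail_xy: image coordinates of the instrument tip and of a point
                     further back along its axis (from image A / artifact B).
    target_xy:       image coordinates of the target point C.
    """
    tip = np.asarray(tip_xy, dtype=float)
    tail = np.asarray(tail_xy, dtype=float)
    target = np.asarray(target_xy, dtype=float)

    axis = tip - tail
    axis = axis / np.linalg.norm(axis)        # unit vector along the instrument axis
    to_target = target - tip

    perp = to_target - np.dot(to_target, axis) * axis   # component off the extension line
    ahead = np.dot(to_target, axis) > 0                  # target in front of the tip
    return bool(ahead and np.linalg.norm(perp) <= tol_px)
```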
In an embodiment, the first manipulation signal includes a first sub-manipulation signal, and the first motion instruction includes a first sub-motion instruction corresponding to the first sub-manipulation signal, and the first sub-motion instruction is used to instruct the surgical instrument 310 to translate along the first direction within the preset fluoroscopic image plane. The "translation along the first direction" mentioned here means that, when the first sub-motion command indicates that the surgical instrument 310 is moving, the motion trajectory of each point on the surgical instrument 310 is located on a translation straight line, and the translation straight lines of different points on the surgical instrument 310 are all parallel to the first direction, and the motion trajectory of one or more points on the surgical instrument 310 may be located in the first direction.
The first manipulation signal includes a second sub-manipulation signal, the first motion command includes a second sub-motion command corresponding to the second sub-manipulation signal, and the second sub-motion command is used to instruct the surgical instrument 310 to rotate within the preset fluoroscopic image plane. The "rotation in the preset fluoroscopic image plane" mentioned herein means that a point on the surgical instrument 310 is located on the preset fluoroscopic image plane, and the position on the preset fluoroscopic image plane is not changed, the surgical instrument 310 rotates around the point, and the surgical instrument 310 is always in the preset fluoroscopic image plane.
As an example, the second sub-motion command is used to instruct the surgical instrument 310 to rotate within the preset fluoroscopic image plane centering on the distal end point of the surgical instrument 310, so as to more conveniently align the surgical instrument 310 with the target point.
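The rotation centered on the distal end point can be pictured with the following minimal Python sketch, which rotates the instrument by a small angle about its fixed tip using 2D coordinates within the preset fluoroscopic image plane. The function and variable names are illustrative assumptions; they do not appear in the patent.

```python
import numpy as np

def rotate_about_tip(tip_xy, tail_xy, delta_rad):
    """Rotate the instrument by delta_rad about its distal end point.

    tip_xy stays fixed; tail_xy (a point further back on the instrument axis)
    is swung around it, so the instrument stays within the image plane.
    """
    tip = np.asarray(tip_xy, dtype=float)
    tail = np.asarray(tail_xy, dtype=float)
    c, s = np.cos(delta_rad), np.sin(delta_rad)
    rot = np.array([[c, -s], [s, c]])       # in-plane rotation matrix
    new_tail = tip + rot @ (tail - tip)
    return tip, new_tail
```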
The first control signal includes a third sub-control signal, the first action instruction includes a third sub-action instruction corresponding to the third sub-control signal, the third sub-action instruction is used to instruct the surgical instrument 310 to translate along a second direction within the preset perspective image plane, the second direction is an extending direction of the surgical instrument 310, exemplarily, the second direction is a central axis direction of the surgical instrument 310, and the first direction and the second direction are arranged at an included angle, and as an example, the first direction is perpendicular to the second direction. The "translation along the second direction" mentioned here means that, when the third sub-motion command indicates that the surgical instrument 310 is moving, the motion trajectory of each point on the surgical instrument 310 is located on one translation line, and the translation lines of different points on the surgical instrument 310 are all parallel to the second direction, and the motion trajectory of one or more points on the surgical instrument 310 may be located in the second direction.
In this embodiment, the movement of the surgical instrument 310 is divided into a translational movement along the first direction, a translational movement along the second direction, and a rotational movement within the preset fluoroscopic image plane, so as to facilitate the position adjustment of the surgical instrument 310 by the operator. Exemplarily, as shown in fig. 1, the manipulation terminal 100 is provided with three manipulation members, namely, a first manipulation member 110, a second manipulation member 120, and a third manipulation member 130. The first manipulating member 110, the second manipulating member 120 and the third manipulating member 130 are all movable members, and may be a controllable structure such as a rocker, a push rod, a knob, etc., and an operator can operate the movement of each manipulating member.
The action of the first manipulating member 110 is used to instruct the surgical device 310 to translate along the reference plane, where "translation along the reference plane" refers to that when the first manipulating member 110 instructs the surgical device 310 to move, the motion trajectory of each point on the surgical device 310 is located on a translation plane, and the translation planes of different points on the surgical device 310 are all parallel to the reference plane, and the motion trajectory of one or more points on the surgical device 310 may be located on the reference plane.
In one embodiment, as shown in fig. 5, the first manipulating part 110 employs a first rocker 111. The first rocker 111 can be manipulated freely through 360°, so movement of the surgical instrument 310 in any direction of the reference plane can be realized through the first rocker 111; this makes it convenient to establish a correspondence between the motion of the first rocker 111 and the motion of the surgical instrument 310, and also makes intuitive operation easier for the operator. Specifically, a first rocker coordinate system is constructed for the first rocker 111: exemplarily, as shown in fig. 6, with the center position of the first rocker 111 as the origin, two mutually perpendicular directions are taken as the X1 axis and the Y1 axis to construct the first rocker coordinate system. As shown in fig. 7, an instrument coordinate system corresponding to the coordinate system of the first rocker 111 is constructed for the surgical instrument 310: exemplarily, with the distal end point of the surgical instrument 310 as the origin, two mutually perpendicular directions are taken as the Xn axis and the Yn axis to construct the puncture needle coordinate system, and the plane formed by the Xn axis and the Yn axis may be parallel to the horizontal plane.
By constructing these two corresponding coordinate systems, a correspondence between the motion of the first rocker 111 and the motion of the surgical instrument 310 can be established. For example, in the normal mode, as shown in fig. 6 and 7, the angle θ1 between the X1 axis and the projection of the first rocker 111 onto the plane formed by the X1 and Y1 axes corresponds to the angle θ2 between the Xn axis and the movement direction of the distal end point of the surgical instrument 310, i.e., θ1 = θ2. Thus, with x1 and y1 denoting the output values of the first rocker 111 along the X1 and Y1 axes, the angle θ2 between the movement direction of the distal end point of the surgical instrument 310 and the Xn axis may be calculated according to the following formula:
θ2 = arctan(y1 / x1)
The movement velocity v of the surgical instrument 310 is calculated according to the following formula:
v = (√(x1² + y1²) / MAX1) × Vmax
wherein MAX1 is the maximum output value of the first rocker 111, and Vmax is the maximum movement speed of the surgical instrument 310.
When the first rocker 111 returns to the center position, the outputs x1 and y1 are both 0; at this time, the main control terminal 200 issues an instruction to stop motion to the surgical robot executing end 300.
In the embodiment employing the first manipulating part 110, the first sub-manipulation signal can be the output values x1 and y1 of the first rocker 111. Since the first sub-action instruction is used to instruct the surgical instrument 310 to translate in the first direction (the Xn axis direction) within the preset fluoroscopic image plane, in this embodiment, when the first sub-action instruction is determined according to the first sub-manipulation signal, y1 is first set to 0 and then the angle θ2 and the movement speed v are calculated; that is, the angle θ2 between the movement direction of the distal end point of the surgical instrument 310 and the Xn axis is calculated according to the following formula:
θ2 = arctan(0 / x1)
The movement velocity v of the surgical instrument 310 is calculated according to the following formula:
v = (|x1| / MAX1) × Vmax
The first sub-manipulation signal may also consist of x1 and 0, in which case the first sub-action instruction can be determined directly from the first sub-manipulation signal in the manner described above.
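As a rough illustration of the mapping above, the following Python sketch converts the first rocker output (x1, y1) into an in-plane translation command, including the constrained case in which y1 is forced to 0. The formulas follow the reconstruction given in the description; the function name, the normalization of MAX1, and the numeric value of Vmax are assumptions for illustration only, not part of the patent.

```python
import math

MAX1 = 1.0    # maximum output value of the first rocker (assumed normalized)
V_MAX = 5.0   # maximum movement speed of the instrument, mm/s (illustrative)

def first_rocker_to_translation(x1, y1, plane_locked=True):
    """Map the first rocker output (x1, y1) to a translation command.

    In the first preset state the motion is restricted to the preset
    fluoroscopic image plane, which corresponds to forcing y1 to 0 so that
    the instrument only translates along the Xn axis.
    """
    if plane_locked:
        y1 = 0.0                        # lock translation out of the image plane
    if x1 == 0.0 and y1 == 0.0:
        return None                     # rocker centered: stop command
    theta2 = math.atan2(y1, x1)         # movement direction relative to the Xn axis
    v = min(math.hypot(x1, y1) / MAX1, 1.0) * V_MAX
    return theta2, v
```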
The action of the second manipulating member 120 is used to instruct rotation of the surgical instrument 310 relative to the reference plane. The "rotation relative to the reference plane" mentioned here means that one point on the surgical instrument 310, which may be located on the reference plane, keeps its position relative to the reference plane unchanged while the surgical instrument 310 rotates around that point. For example, the distal end point of the surgical instrument 310 is located on the reference plane; the distal end point is the position at which the surgical instrument 310 performs the surgical operation, and in the embodiment where the surgical instrument 310 is a puncture needle, the needle tip is located on the reference plane, so that the operator can control the position of the surgical operation during the procedure.
In one embodiment, as shown in fig. 5, the second manipulating member 120 employs a second rocker 121. The second rocker 121 can be manipulated freely through 360°, so rotation of the surgical instrument 310 relative to the reference plane in any direction can be realized through the second rocker 121; this makes it convenient to establish a correspondence between the motion of the second rocker 121 and the motion of the surgical instrument 310, and also makes intuitive operation easier for the operator. Specifically, a second rocker coordinate system is constructed for the second rocker 121: exemplarily, as shown in fig. 8, with the center position of the second rocker 121 as the origin, two mutually perpendicular directions are taken as the X2 axis and the Y2 axis to construct the second rocker coordinate system. The second rocker coordinate system corresponds to the puncture needle coordinate system (the coordinate system constructed with the distal end point of the surgical instrument 310 as the origin and two mutually perpendicular directions as the Xn and Yn axes), so that a correspondence between the motion of the second rocker 121 and the motion of the surgical instrument 310 can be established. For example, in the normal mode, as shown in fig. 8 and 9, the angle θ3 between the X2 axis and the projection of the second rocker 121 onto the plane formed by the X2 and Y2 axes corresponds to the angle θ4 between the Xn axis and the projection of the surgical instrument 310 onto the plane formed by the Xn and Yn axes, i.e., θ3 = θ4. Thus, with x2 and y2 denoting the output values of the second rocker 121 along the X2 and Y2 axes, the angle θ4 between the Xn axis and the projection of the surgical instrument 310 onto the plane formed by the Xn and Yn axes may be calculated according to the following formula:
θ4 = arctan(y2 / x2)
The rotational angular velocity ω of the surgical instrument 310 is calculated according to the following equation:
ω = (√(x2² + y2²) / MAX2) × ωmax
wherein MAX2 is the maximum output value of the second rocker 121, and ωmax is the maximum rotational angular velocity of the surgical instrument 310.
When the second rocker 121 returns to the center position, the outputs x2 and y2 are both 0; at this time, the main control terminal 200 issues an instruction to stop motion to the surgical robot executing end 300.
In the embodiment employing the second manipulating part 120, the second sub-manipulation signal can be the output values x2 and y2 of the second rocker 121. Since the second sub-action instruction is used to instruct the surgical instrument 310 to rotate within the preset fluoroscopic image plane (i.e., to rotate around the Yn axis), in this embodiment, when the second sub-action instruction is determined according to the second sub-manipulation signal, y2 is first set to 0 and then the angle θ4 and the rotational angular velocity ω are calculated; that is, the angle θ4 between the Xn axis and the projection of the surgical instrument 310 onto the plane formed by the Xn and Yn axes is calculated according to the following formula:
θ4 = arctan(0 / x2)
The rotational angular velocity ω of the surgical instrument 310 is calculated according to the following equation:
ω = (|x2| / MAX2) × ωmax
the second sub-manipulation signal may also be x2And 0, the second sub-action command can be directly determined according to the second sub-control signal.
The action of the third manipulating member 130 is to instruct the surgical device 310 to move along the reference linear path, where "movement along the reference linear path" means that the movement locus of each point on the surgical device 310 is located on a linear locus, and the linear loci of different points on the surgical device 310 are parallel to the reference linear path, and the linear locus of one or more points on the surgical device 310 may be located on the reference linear path. The reference linear path is at a variable angle with respect to the reference plane, and is exemplarily perpendicular to the reference plane, for example, the reference linear path is an extending direction of the surgical instrument 310, so that the operator can conveniently operate the third manipulating part 130 to realize the accurate motion of the surgical instrument 310, for example, in the embodiment where the surgical instrument 310 is a puncture needle, the operator can conveniently realize the accurate needle inserting and withdrawing motion by operating the third manipulating part 130.
In one embodiment, as shown in fig. 5, the third manipulating part 130 employs a push rod 131; using the push rod 131 facilitates the operator's handling and improves operating accuracy. Specifically, as shown in fig. 10, a push rod coordinate system is constructed with the center position of the push rod 131 as the origin and the pushing direction of the push rod 131 as the Z3 axis. As shown in fig. 11, a Zn axis is added on the basis of the Xn and Yn axes; the angle between the Zn axis and the plane formed by the Xn and Yn axes is variable, and the Zn axis is the extending direction of the surgical instrument 310. In this way, a correspondence between the motion of the push rod 131 and the motion of the surgical instrument 310 is established. For example, in the normal mode, as shown in fig. 10 and 11, the angle β1 between the pushing direction of the push rod 131 and the Z3 axis corresponds to the angle β2 between the movement direction of the surgical instrument 310 and the Zn axis, i.e., β1 = β2. Thus, with z3 denoting the output value of the push rod 131 along the Z3 axis, the angle β2 between the movement direction of the surgical instrument 310 and the Zn axis may be calculated according to the following formula:
β2 = 0 when z3 > 0, and β2 = π when z3 < 0
The movement speed vz of the surgical instrument 310 is calculated according to the following formula:
vz = (|z3| / MAX3) × Vmax
wherein MAX3 is the maximum output value of the push rod 131, and Vmax is the maximum movement speed of the surgical instrument 310.
When the push rod 131 returns to the center position, the output z3 is 0; at this time, the main control terminal 200 issues an instruction to stop motion to the surgical robot executing end 300.
In the embodiment employing the third manipulating part 130, the third sub-manipulation signal is the output value z3 of the push rod 131, and the third sub-action instruction is determined from the third sub-manipulation signal in the manner described above.
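For completeness, a short Python sketch of the push rod mapping follows; it converts the push rod output z3 into a needle insertion/withdrawal command along the Zn axis (the extending direction of the instrument). The sign convention, function name, and numeric constants are assumptions for illustration.

```python
MAX3 = 1.0    # maximum output value of the push rod (assumed normalized)
V_MAX = 5.0   # maximum movement speed of the instrument, mm/s (illustrative)

def push_rod_to_axial_translation(z3):
    """Map the push rod output z3 to translation along the Zn axis.

    Positive z3 is taken as insertion (+Zn) and negative z3 as withdrawal
    (-Zn); z3 == 0 (push rod centered) means stop.
    """
    if z3 == 0.0:
        return None
    direction = 1 if z3 > 0 else -1      # +1: insert along Zn, -1: withdraw
    vz = min(abs(z3) / MAX3, 1.0) * V_MAX
    return direction, vz
```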
Before controlling the surgical instrument 310 to move within the preset fluoroscopic image plane, it is required to control the surgical instrument 310 to move into the preset fluoroscopic image plane, which may be performed manually by an operator or automatically by a surgical robot system, and in an embodiment, before receiving the first control signal (i.e., before controlling the surgical instrument 310 to move within the preset fluoroscopic image plane), the control method further includes:
s110, receiving a second control signal moving to a preset perspective image plane;
s120, determining a second action instruction according to the second control signal;
and S130, sending a second action instruction, wherein the second action instruction is used for indicating the surgical instrument 310 to move into the preset perspective image plane.
In this embodiment, after receiving the second manipulation signal sent by the manipulation terminal 100, the main control terminal 200 determines a second action instruction for instructing the surgical robot performing terminal 300 to act according to the second manipulation signal, so as to control the surgical instrument 310 of the surgical robot performing terminal 300 to move to the preset perspective image plane. In an embodiment, the coordinates of the preset perspective image plane are stored in the main control end 200, and the surgical robot executing end 300 feeds back the coordinates of the surgical instrument 310 to the main control end 200, so that the main control end 200 can calculate the motion trajectory of the surgical instrument 310 according to the coordinates of the surgical instrument 310 and the coordinates of the preset perspective image plane, and further determine a second motion instruction for instructing the surgical instrument 310 to move into the preset perspective image plane.
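A rough sketch of this calculation is given below in Python: it computes a translation that brings the distal end point of the instrument onto the preset fluoroscopic image plane, with both expressed in a common base frame. Representing the plane by a point and a unit normal, and the function signature itself, are assumptions made for illustration; the patent does not specify how the plane coordinates are stored or how the trajectory is generated.

```python
import numpy as np

def move_into_plane(tip, axis, plane_point, plane_normal):
    """Translation that places the instrument's distal end point on the preset
    fluoroscopic image plane, plus the out-of-plane tilt of the instrument axis
    (the rotation needed to remove that tilt is left out of this sketch)."""
    tip = np.asarray(tip, dtype=float)
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)

    d = float(np.dot(tip - np.asarray(plane_point, dtype=float), n))  # signed distance to plane
    translation = -d * n                                              # move the tip onto the plane
    tilt = float(np.arcsin(np.clip(np.dot(axis, n), -1.0, 1.0)))      # out-of-plane tilt, rad
    return translation, tilt
```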
In one embodiment, the control method further comprises:
s140, receiving a feedback signal of the surgical instrument moving into a preset perspective image plane;
and S150, sending out a feedback signal.
In this embodiment, after the surgical instrument 310 is moved to the proper position, the surgical robot executing end 300 feeds back a feedback signal indicating that the surgical instrument 310 is moved to the preset perspective image plane to the main control end 200, and after receiving the feedback signal, the main control end 200 sends the feedback signal to the manipulating end 100, so that the operator performs subsequent operations, for example, the operator performs an operation of locking the movement plane at the manipulating end 100.
As shown in fig. 12, an embodiment of the present application provides a surgical robot control method based on real-time fluoroscopic images, where the control method is applied to a manipulation terminal 100, and the control method includes:
and S10, entering a first preset state based on the received first operation instruction for locking the motion plane or the feedback signal of the surgical instrument at the surgical robot execution end moving to the preset perspective image plane of the perspective image.
In this step, the control end 100 may be controlled to enter the first preset state through an operation of an operator, or the control end 100 may automatically enter the first preset state after receiving the feedback signal.
In some embodiments, the manipulation terminal 100 is provided with a first trigger structure, and an operator can operate the first trigger structure to make the manipulation terminal 100 enter the first preset state. Illustratively, as shown in fig. 5, the first trigger structure includes a first toggle piece 151 and a second toggle piece 152 disposed corresponding to the first rocker 111. The first toggle piece 151 is used to lock the translational motion of the surgical instrument 310 along the Xn axis direction; when the first toggle piece 151 is in the triggered state, the output x1 is 0 no matter how the first rocker 111 moves. The second toggle piece 152 is used to lock the translational motion of the surgical instrument 310 along the Yn axis direction; when the second toggle piece 152 is in the triggered state, the output y1 is 0 no matter how the first rocker 111 moves.
The first trigger structure further includes a third toggle piece 153 and a fourth toggle piece 154 disposed corresponding to the second rocker 121. The third toggle piece 153 is used to lock the rotational motion of the surgical instrument 310 around the Xn axis; when the third toggle piece 153 is in the triggered state, the output x2 is 0 no matter how the second rocker 121 moves. The fourth toggle piece 154 is used to lock the rotational motion of the surgical instrument 310 around the Yn axis; when the fourth toggle piece 154 is in the triggered state, the output y2 is 0 no matter how the second rocker 121 moves.
In this embodiment, the first toggle piece 151 and the third toggle piece 153 are set to be in a non-trigger state, and the second toggle piece 152 and the fourth toggle piece 154 are set to be in a trigger state, so that the manipulation terminal 100 enters a first preset state.
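The effect of the toggle pieces can be summarized as a simple masking step applied to the rocker outputs, sketched below in Python. The dictionary representation of the toggle states and the helper name are illustrative assumptions; only the zeroing behaviour itself is taken from the description.

```python
def apply_toggle_locks(x1, y1, x2, y2, locks):
    """Force rocker outputs to 0 according to the toggle-piece states.

    locks maps "first" .. "fourth" to booleans; a triggered toggle piece
    zeroes the corresponding output no matter how the rocker is moved.
    """
    if locks.get("first"):
        x1 = 0.0
    if locks.get("second"):
        y1 = 0.0
    if locks.get("third"):
        x2 = 0.0
    if locks.get("fourth"):
        y2 = 0.0
    return x1, y1, x2, y2

# First preset state: second and fourth toggle pieces triggered,
# first and third untriggered, confining motion to the preset image plane.
FIRST_PRESET_STATE = {"first": False, "second": True, "third": False, "fourth": True}
```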
In other embodiments, after the surgical instrument 310 is moved in place, the surgical robot executing end 300 feeds back a feedback signal indicating that the surgical instrument 310 is moved to the preset perspective image plane to the main control end 200, after receiving the feedback signal, the main control end 200 sends the feedback signal to the manipulating end 100, and after receiving the feedback signal, the manipulating end 100 automatically enters the first preset state.
S20, receiving a second operation instruction in the first preset state;
s30, determining a first control signal according to the second operation instruction;
and S40, sending a first control signal, wherein the first control signal is used for indicating the surgical instrument to move in the preset perspective image plane.
In a first preset state, the manipulation terminal 100 receives a second operation instruction of an operator, for example, a doctor, and illustratively, the operator controls a manipulation component of the manipulation terminal 100 to move, and the motion acquisition unit of the manipulation terminal 100 acquires the motion of the manipulation component and generates a first manipulation signal according to the motion of the manipulation component. The first control signal is used for indicating the surgical instrument 310 to move in the preset perspective image plane, so that the relative position relation between the surgical instrument 310 and the target point can be conveniently obtained, the position of the surgical instrument 310 can be conveniently adjusted, and the surgical efficiency and the surgical success rate are improved. Specifically, reference may be made to the related description of the control method applied to the main control end 200, and details are not described herein.
In one embodiment, the first manipulation signal includes a first sub-manipulation signal for indicating a translation of the surgical instrument 310 in a first direction within the preset fluoroscopic image plane. The first manipulation signal includes a second sub-manipulation signal for indicating the rotation of the surgical instrument 310 within the preset fluoroscopic image plane. The first manipulation signal includes a third sub-manipulation signal, the third sub-manipulation signal is used to instruct the surgical instrument 310 to translate along a second direction within the preset fluoroscopic image plane, the second direction is an extending direction of the surgical instrument 310, and the first direction and the second direction are arranged at an included angle, for example, the first direction is perpendicular to the second direction. For the first sub-manipulation signal, the second sub-manipulation signal and the third sub-manipulation signal, reference may be made to the related description of the control method applied to the main control end 200, and details are not repeated herein.
Before controlling the surgical instrument 310 to move within the preset fluoroscopic image plane, it is necessary to control the surgical instrument 310 to move into the preset fluoroscopic image plane; this may be done manually by an operator or automatically by the surgical robot system. In an embodiment, before receiving the first operation instruction for locking the motion plane or the feedback signal that the surgical instrument of the surgical robot executing end 300 has moved into the preset fluoroscopic image plane, the control method further includes:
s11, receiving a third operation instruction moving to the preset perspective image plane;
s21, determining a second control signal according to the third operation instruction;
and S31, sending a second control signal, wherein the second control signal is used for indicating the surgical instrument to move into the preset perspective image plane.
Illustratively, the manipulating end 100 is provided with a second triggering structure, and the operator can operate the second triggering structure to send a second manipulating signal to the main control end 200, wherein the second manipulating signal indicates that the surgical instrument 310 moves into the preset fluoroscopic image plane. The second triggering structure is, for example, a button disposed on the control end 100, the operator presses the button to send a third operation instruction to the control end 100, and after receiving the third operation instruction, the control end 100 generates a second control signal and sends the second control signal to the main control end 200, so that the main control end 200 controls the surgical instrument 310 to move within the preset perspective image plane.
In one embodiment, the control method further comprises:
and S50, based on the received fourth operation instruction for unlocking or the feedback signal that the surgical instrument at the surgical robot execution end deviates from the preset perspective image plane, entering a second preset state.
In this step, the main control end 200 may be controlled to enter the second preset state through the operation of the operator, or the control end 100 may automatically enter the second preset state after receiving the feedback signal.
In the embodiment where the first toggle piece 151, the second toggle piece 152, the third toggle piece 153, and the fourth toggle piece 154 are disposed on the manipulation terminal 100, the first toggle piece 151, the second toggle piece 152, the third toggle piece 153, and the fourth toggle piece 154 are all set to be in a non-trigger state, so that the manipulation terminal 100 enters a second preset state.
In other embodiments, after the surgical instrument 310 deviates from the preset perspective image plane, the surgical robot executing end 300 feeds back a feedback signal indicating that the surgical instrument 310 deviates from the preset perspective image plane to the main control end 200, the main control end 200 receives the feedback signal and sends the feedback signal to the control end 100, and the control end 100 automatically enters the second preset state after receiving the feedback signal.
In the second preset state, the movement of the surgical instrument 310 is no longer limited, and the manipulation end 100 can control the surgical instrument 310 to perform any movement, which is specifically referred to the aforementioned conventional mode and will not be described herein again.
An embodiment of the present application provides a computer-readable storage medium having stored thereon a computer program, which when executed performs the steps of the method as described above.
Fig. 13 is a block diagram illustrating a computer device 900 for implementing the method described above, according to an exemplary embodiment. For example, the computer device 900 may be provided as a server. Referring to fig. 13, the computer device 900 includes a processor 901; the number of processors may be set to one or more as needed. The computer device 900 also includes a memory 902 for storing instructions, such as application programs, executable by the processor 901. The number of memories may likewise be set to one or more as needed, and one or more application programs may be stored therein. The processor 901 is configured to execute the instructions to perform the method described above.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus (device), or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media having computer-usable program code embodied in the medium. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, including, but not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer, and the like. In addition, communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media as known to those skilled in the art.
The present invention has been described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In the present invention, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that an article or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such article or apparatus. Without further limitation, an element defined by the phrase "comprising … …" does not exclude the presence of additional like elements in the article or device comprising the element.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.

Claims (10)

1. A surgical robot control method based on real-time perspective images is characterized in that the control method is applied to a main control end and comprises the following steps:
receiving a first control signal;
determining a first action instruction according to the first control signal;
and sending the first action instruction, wherein the first action instruction is used for instructing a surgical instrument of the surgical robot execution end to move in a preset perspective image plane of a perspective image.
2. The control method according to claim 1, wherein the first manipulation signal includes a first sub-manipulation signal, and the first motion instruction includes a first sub-motion instruction corresponding to the first sub-manipulation signal, the first sub-motion instruction being used to instruct translation of the surgical instrument in a first direction within the preset fluoroscopic image plane;
the first control signal comprises a second sub-control signal, the first action instruction comprises a second sub-action instruction corresponding to the second sub-control signal, and the second sub-action instruction is used for indicating the rotation of the surgical instrument in the preset perspective image plane;
the first manipulation signal comprises a third sub-manipulation signal, the first action instruction comprises a third sub-action instruction corresponding to the third sub-manipulation signal, and the third sub-action instruction is used for indicating translation of the surgical instrument in a second direction in the preset fluoroscopic image plane; the second direction is the extending direction of the surgical instrument, and the first direction and the second direction form an included angle.
3. The control method according to claim 2, wherein the second sub-motion command is used to instruct the surgical instrument to rotate within the preset fluoroscopic image plane centering on a distal end point of the surgical instrument.
4. The control method according to any one of claims 1 to 3, characterized in that, before receiving the first manipulation signal, the control method further comprises:
receiving a second control signal moving into the preset perspective image plane;
determining a second action instruction according to the second control signal;
and sending the second action instruction, wherein the second action instruction is used for indicating the surgical instrument to move into the preset perspective image plane.
5. The control method according to claim 4, characterized by further comprising:
receiving a feedback signal indicating that the surgical instrument has moved into the preset perspective image plane;
and sending the feedback signal.
6. A surgical robot control method based on real-time perspective images, characterized in that the control method is applied to a control end and comprises the following steps:
entering a first preset state based on a received first operation instruction for locking a motion plane, or based on a received feedback signal indicating that a surgical instrument at a surgical robot execution end has moved into a preset perspective image plane of a perspective image;
in the first preset state,
receiving a second operation instruction;
determining a first control signal according to the second operation instruction;
and sending the first control signal, wherein the first control signal is used to instruct the surgical instrument to move within the preset perspective image plane.
7. The control method according to claim 6, wherein the first control signal comprises a first sub-control signal, and the first sub-control signal is used to instruct translation of the surgical instrument in a first direction within the preset perspective image plane;
the first control signal comprises a second sub-control signal, and the second sub-control signal is used to instruct rotation of the surgical instrument within the preset perspective image plane;
the first control signal comprises a third sub-control signal, and the third sub-control signal is used to instruct translation of the surgical instrument in a second direction within the preset perspective image plane; the second direction is the extending direction of the surgical instrument, and the first direction and the second direction form an included angle.
8. The control method according to claim 6 or 7, wherein, before receiving the first operation instruction for locking the motion plane or the feedback signal indicating that the surgical instrument at the surgical robot execution end has moved into the preset perspective image plane, the control method further comprises:
receiving a third operation instruction for moving the surgical instrument into the preset perspective image plane;
determining a second control signal according to the third operation instruction;
and sending the second control signal, wherein the second control signal is used to instruct the surgical instrument to move into the preset perspective image plane.
9. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed, implements the steps of the method according to any one of claims 1 to 5 or the steps of the method according to any one of claims 6 to 8.
10. A computer device comprising a processor, a memory, and a computer program stored on the memory, characterized in that the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 5 or the steps of the method according to any one of claims 6 to 8.
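The claims above describe the signal-to-instruction mapping and the in-plane motion constraint only in prose. The following Python sketch is not part of the patent disclosure; it is a hypothetical illustration, under assumed names (ImagePlane, SubSignal, dispatch), of how a main control end might map the three sub-control signals of claims 2 and 7 to instrument motions that stay within a preset perspective (fluoroscopic) image plane.

```python
# Hypothetical sketch only; names and kinematic choices are assumptions,
# not the patent's implementation or API.
import numpy as np
from enum import Enum, auto


class SubSignal(Enum):
    TRANSLATE_FIRST_DIR = auto()   # claims 2/7: translation in a first in-plane direction
    ROTATE_ABOUT_TIP = auto()      # claim 3: rotation about the instrument's distal end point
    TRANSLATE_ALONG_AXIS = auto()  # claims 2/7: translation along the instrument's extending direction


class ImagePlane:
    """Preset perspective image plane defined by a point and a unit normal."""

    def __init__(self, point, normal):
        self.point = np.asarray(point, dtype=float)
        self.normal = np.asarray(normal, dtype=float)
        self.normal /= np.linalg.norm(self.normal)

    def project_direction(self, d):
        """Drop the out-of-plane component so the commanded motion stays in the plane."""
        d = np.asarray(d, dtype=float)
        in_plane = d - np.dot(d, self.normal) * self.normal
        n = np.linalg.norm(in_plane)
        return in_plane / n if n > 1e-9 else np.zeros(3)


def dispatch(signal, plane, tip, axis, first_dir, amount):
    """Turn a sub-control signal into a small in-plane motion command.

    tip:       current distal end point of the surgical instrument (3-vector)
    axis:      current extending direction of the instrument (unit 3-vector)
    first_dir: the preset first direction, at an included angle to the axis
    amount:    translation step [mm] or rotation step [rad]
    Returns (new_tip, new_axis).
    """
    if signal is SubSignal.TRANSLATE_FIRST_DIR:
        return tip + plane.project_direction(first_dir) * amount, axis
    if signal is SubSignal.TRANSLATE_ALONG_AXIS:
        return tip + plane.project_direction(axis) * amount, axis
    if signal is SubSignal.ROTATE_ABOUT_TIP:
        # In-plane rotation about the distal tip: rotate the axis around the plane normal
        # using Rodrigues' rotation formula; the tip itself stays fixed.
        c, s = np.cos(amount), np.sin(amount)
        n = plane.normal
        rotated = axis * c + np.cross(n, axis) * s + n * np.dot(n, axis) * (1 - c)
        return tip, plane.project_direction(rotated)
    raise ValueError(f"unknown sub-control signal: {signal}")


if __name__ == "__main__":
    plane = ImagePlane(point=[0, 0, 0], normal=[0, 0, 1])   # image plane = XY plane
    tip = np.array([10.0, 5.0, 0.0])
    axis = np.array([1.0, 0.0, 0.0])                        # instrument extends along +X
    first_dir = np.array([0.0, 1.0, 0.0])                   # first direction, at an angle to the axis

    tip, axis = dispatch(SubSignal.TRANSLATE_FIRST_DIR, plane, tip, axis, first_dir, 2.0)
    tip, axis = dispatch(SubSignal.ROTATE_ABOUT_TIP, plane, tip, axis, first_dir, np.pi / 6)
    tip, axis = dispatch(SubSignal.TRANSLATE_ALONG_AXIS, plane, tip, axis, first_dir, 5.0)
    print("tip:", tip, "axis:", axis)
```

In this sketch the plane constraint is enforced by projecting every commanded direction onto the image plane before it is applied, and the rotation of claim 3 is realized as a rotation of the instrument axis about the plane normal through the distal end point; the claims themselves leave the concrete kinematic realization to the execution end.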
CN202210082854.3A 2022-01-25 2022-01-25 Surgical robot control method, medium and device based on real-time perspective image Pending CN114098991A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210082854.3A CN114098991A (en) 2022-01-25 2022-01-25 Surgical robot control method, medium and device based on real-time perspective image

Publications (1)

Publication Number Publication Date
CN114098991A (en) 2022-03-01

Family

ID=80361253

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210082854.3A Pending CN114098991A (en) 2022-01-25 2022-01-25 Surgical robot control method, medium and device based on real-time perspective image

Country Status (1)

Country Link
CN (1) CN114098991A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101596127A (en) * 2009-06-26 2009-12-09 北京大学深圳医院 Linear array ultrasonic probe puncture synchronous guider
CN101947135A (en) * 2010-10-12 2011-01-19 上海交通大学 Remote-control puncturing and positioning system under C arm machine introduction
CN102485181A (en) * 2010-12-03 2012-06-06 张春霖 Vertebral column navigation surgery robot based on virtual identification registration control
US20120316573A1 (en) * 2011-05-31 2012-12-13 Intuitive Surgical Operations, Inc. Positive control of robotic surgical instrument end effector
US20160074123A1 (en) * 2013-05-31 2016-03-17 Randall Bly Surgery Pathway Guidance And Boundary System
CN110448378A (en) * 2019-08-13 2019-11-15 北京唯迈医疗设备有限公司 A kind of immersion intervention operation overall-in-one control schema platform
CN111948980A (en) * 2020-08-28 2020-11-17 雅客智慧(北京)科技有限公司 Robot controller and control method thereof
CN112336432A (en) * 2020-11-10 2021-02-09 亿盛欣科技(北京)有限公司 Master-slave CT perspective guide real-time puncture system and master-slave operation method
CN113445752A (en) * 2021-05-25 2021-09-28 中联重科股份有限公司 Method, device and system for controlling movement of tail end of arm support, medium and engineering machinery

Similar Documents

Publication Publication Date Title
CN110893118B (en) Surgical robot system and method for controlling movement of robot arm
US11337768B2 (en) Systems and methods for onscreen menus in a teleoperational medical system
US11758262B2 (en) Intelligent manual adjustment of an image control element
CN111050686B (en) Camera control for surgical robotic system
US8583274B2 (en) Method for graphically providing continuous change of state directions to a user of medical robotic system
JP5543331B2 (en) Method, apparatus, and system for non-mechanically limiting and / or programming movement along one axis of a manipulator tool
WO2022126997A1 (en) Surgical robot, and control method and control apparatus therefor
CN109922750A (en) Repositioning system and correlation technique for remote-controllable executor
WO2022141153A1 (en) Ultrasonic positioning puncture system and storage medium
US20210113283A1 (en) Robotic surgical apparatus, surgical instrument, and method of attaching surgical instrument to robot arm
CN115500950A (en) Endoscope pose adjusting method, surgical robot, and storage medium
CN116919590A (en) Surgical robot control method, device and medium for hallux valgus minimally invasive surgery
JP6149175B1 (en) Surgery support apparatus, control method thereof, program, and surgery support system
CN114098988B (en) Surgical robot system, control method thereof, medium, and computer device
CN114098991A (en) Surgical robot control method, medium and device based on real-time perspective image
WO2020117561A2 (en) Improving robotic surgical safety via video processing
WO2018013773A1 (en) System for camera control in robotic and laparoscopic surgery
CN113164216B (en) Method and system for remotely controlling a surgical slave arm
WO2024066047A1 (en) Interventional robot, multi-mode control method, and storage medium
CN116269812A (en) Master-slave operation puncture system and planning method
CN117297773A (en) Surgical instrument control method, surgical robot, and storage medium
Cornella et al. Improving Cartesian Position Accuracy of a Telesurgical Robot
CN117338411A (en) Surgical robot system motion control device, method and computer readable medium
CN116919609A (en) Surgical robot control system, method, electronic device, and storage medium
CN116849818A (en) Surgical robot control method, surgical robot, and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20220301)