CN107714178A - Surgical navigational positioning robot and its control method - Google Patents
Surgical navigational positioning robot and its control method
- Publication number
- CN107714178A (application CN201711027805.5A)
- Authority
- CN
- China
- Prior art keywords
- mechanical arm
- display device
- organ
- surgical navigational
- destination organization
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/378—Surgical systems with images on a monitor during operation using ultrasound
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Veterinary Medicine (AREA)
- Animal Behavior & Ethology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Robotics (AREA)
- Radiology & Medical Imaging (AREA)
- Pathology (AREA)
- Gynecology & Obstetrics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Physics & Mathematics (AREA)
- Biophysics (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
- Surgical Instruments (AREA)
Abstract
The present invention discloses a surgical navigation and positioning robot and a control method therefor. The surgical navigation and positioning robot includes a first mechanical arm, a second mechanical arm, and a robot body. An ultrasonic probe is provided on the first mechanical arm, a 3D display is provided on the second mechanical arm, an infrared locator is provided on the outer surface of the ultrasonic probe, an operation handle is provided on the outer surface of the robot body, and a microcontroller and a memory are provided inside the robot body. The surgical navigation and positioning robot of the present invention acquires ultrasonic images of the target tissue or organ during surgery through the ultrasonic probe, and tracks and locates the lesion region of the target tissue or organ in real time through the infrared locator, which facilitates the surgical operation and improves its efficiency. In addition, the 3D display mounted on the second mechanical arm can move freely to any position the doctor requires, and the ultrasonic images are shown on the 3D display for the doctor's reference during surgery, improving the accuracy and safety of the operation.
Description
Technical field
The present invention relates to the technical field of medical instruments, and more particularly to a surgical navigation and positioning robot and a control method therefor.
Background technology
Surgery refers to the process of cutting or incising the skin, mucous membranes, or other tissues of a patient with medical devices or by other means in order to treat a pathological condition — for example, a laparotomy in which the skin is cut open and internal organs are treated, reconstructed, or excised. Surgical procedures may involve blood loss, side effects, pain, and scarring; surgical robots are therefore currently regarded as a welcome alternative.
At present, surgical robots cannot automatically identify the lesion region of a tissue or organ and cannot automatically guide the doctor to the lesion region, so the accuracy and safety of the operation cannot be guaranteed. In addition, when a surgical robot is used to perform robotic surgery, the operation takes place inside the patient's body while the doctor watches the operative images shown on a display. However, while the doctor sits or stands to perform robotic surgery, the display is usually fixed in place and cannot be moved to suit the doctor's posture and position, which makes it inconvenient for the doctor to watch the operative images on the display during the procedure.
The content of the invention
A primary object of the present invention is to provide a surgical navigation and positioning robot and a control method therefor, aiming to solve the problems that existing surgical robots cannot automatically identify the lesion region of a tissue or organ and that the operative images they acquire cannot be moved freely according to the doctor's posture and position.
To achieve the above object, the present invention provides a surgical navigation and positioning robot including a first mechanical arm, a second mechanical arm, and a robot body. An ultrasonic probe is provided on the first mechanical arm, and a 3D display is provided on the second mechanical arm. An infrared locator is provided on the outer surface of the ultrasonic probe, and an operation handle is provided on the outer surface of the robot body. A microcontroller adapted to execute program instructions and a memory adapted to store a plurality of program instructions are provided inside the robot body. The program instructions are loaded by the microcontroller to perform the following steps: controlling the ultrasonic probe of the first mechanical arm to capture, in real time, ultrasonic images of the target tissue or organ during the patient's surgery; comparing the ultrasonic images of the target tissue or organ with reference images of normal tissues or organs to locate the lesion region of the target tissue or organ; establishing a space coordinate system with the operating table on which the patient lies as the horizontal plane and with the position and direction of the ultrasonic probe, and calculating the position coordinates of the lesion region based on the space coordinate system; generating a surgical navigation instruction according to the position coordinates of the lesion region, and controlling the infrared locator on the ultrasonic probe to produce an infrared guide point; driving the moving direction of the first mechanical arm according to the surgical navigation instruction so that the infrared guide point is projected onto the lesion region; when an operating instruction input from the operation handle is received, generating according to the operating instruction a drive signal for moving the 3D display on the second mechanical arm in front of the doctor's eyes; and driving the second mechanical arm according to the drive signal so that the 3D display moves in front of the doctor's eyes, and showing the ultrasonic images of the target tissue or organ on the 3D display.
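The steps above form a fixed sequence executed by the microcontroller. The sketch below illustrates that sequence as a simple control loop in Python; every class and method name is hypothetical (the patent specifies the steps, not an implementation), and stub return values stand in for real sensor data.

```python
# Illustrative sketch only: the patent describes these steps abstractly;
# all names here (RobotSketch, capture_ultrasound, ...) are invented.

class RobotSketch:
    def __init__(self):
        self.log = []                       # records the order of operations

    def capture_ultrasound(self):           # probe on the first arm
        self.log.append("capture"); return "image"

    def locate_lesion(self, image):         # compare with a reference image
        self.log.append("locate"); return "lesion"

    def lesion_coordinates(self, lesion):   # table plane + probe pose
        self.log.append("coords"); return (0.1, 0.2, 0.0)

    def project_guide_point(self, coords):  # drive arm 1, infrared dot
        self.log.append("guide")

    def handle_input(self):                 # operation handle on the body
        return "move-left"

    def move_display_to_eyes(self):         # drive arm 2
        self.log.append("display")

    def show_on_display(self, image):       # 3D display shows the image
        self.log.append("show")

def control_loop(robot):
    image = robot.capture_ultrasound()
    lesion = robot.locate_lesion(image)
    coords = robot.lesion_coordinates(lesion)
    robot.project_guide_point(coords)
    if robot.handle_input() is not None:    # only move the display on request
        robot.move_display_to_eyes()
    robot.show_on_display(image)

r = RobotSketch()
control_loop(r)
```

After the loop runs, `r.log` reflects the order of steps stated in the summary: capture, locate, compute coordinates, project the guide point, move the display, show the image.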
Preferably, the step of comparing the ultrasonic images of the target tissue or organ with the reference images of normal tissues or organs to locate the lesion region of the target tissue or organ comprises the following steps: reading the reference images of normal tissues or organs from the memory; comparing the ultrasonic images of the target tissue or organ with the reference images to determine the difference in texture distribution between the two; and locating the lesion region of the target tissue or organ according to that difference in texture distribution.
Preferably, the step of generating, according to the operating instruction, the drive signal for moving the 3D display on the second mechanical arm in front of the doctor's eyes comprises the following steps: calculating, according to the operating instruction, the final position to which the 3D display is to be moved; calculating, from the final position of the 3D display, the angle through which the second mechanical arm is to be driven; and generating and transmitting a drive signal that drives the second mechanical arm to move through the calculated angle.
Preferably, the ultrasonic probe, the infrared locator, the operation handle, the 3D display, and the memory are all electrically connected to the microcontroller.
Preferably, the robot body is further provided with a power supply unit, the power supply unit including a rechargeable lithium battery and a charging base; the lithium battery is electrically connected to the microcontroller, and the charging base is electrically connected to the lithium battery.
The present invention also provides a control method for a surgical navigation and positioning robot. The surgical navigation and positioning robot includes a first mechanical arm, a second mechanical arm, and a robot body; an ultrasonic probe is provided on the first mechanical arm, a 3D display is provided on the second mechanical arm, an infrared locator is provided on the outer surface of the ultrasonic probe, and an operation handle is provided on the outer surface of the robot body. The control method of the surgical navigation and positioning robot comprises the steps of: controlling the ultrasonic probe of the first mechanical arm to capture, in real time, ultrasonic images of the target tissue or organ during the patient's surgery; comparing the ultrasonic images of the target tissue or organ with reference images of normal tissues or organs to locate the lesion region of the target tissue or organ; establishing a space coordinate system with the operating table on which the patient lies as the horizontal plane and with the position and direction of the ultrasonic probe, and calculating the position coordinates of the lesion region based on the space coordinate system; generating a surgical navigation instruction according to the position coordinates of the lesion region, and controlling the infrared locator on the ultrasonic probe to produce an infrared guide point; driving the moving direction of the first mechanical arm according to the surgical navigation instruction so that the infrared guide point is projected onto the lesion region; when an operating instruction input from the operation handle is received, generating according to the operating instruction a drive signal for moving the 3D display on the second mechanical arm in front of the doctor's eyes; and driving the second mechanical arm according to the drive signal so that the 3D display moves in front of the doctor's eyes, and showing the ultrasonic images of the target tissue or organ on the 3D display.
Preferably, the step of comparing the ultrasonic images of the target tissue or organ with the reference images of normal tissues or organs to locate the lesion region of the target tissue or organ comprises the following steps: reading the reference images of normal tissues or organs from the memory; comparing the ultrasonic images of the target tissue or organ with the reference images to determine the difference in texture distribution between the two; and locating the lesion region of the target tissue or organ according to the difference in texture distribution.
Preferably, the difference in texture distribution includes the differences in tissue structure, in size, and in outline that arise when a human tissue or organ develops a lesion.
Preferably, the step of generating, according to the operating instruction, the drive signal for moving the 3D display on the second mechanical arm in front of the doctor's eyes comprises the following steps: calculating, according to the operating instruction, the final position to which the 3D display is to be moved; calculating, from the final position of the 3D display, the angle through which the second mechanical arm is to be driven; and generating and transmitting a drive signal that drives the second mechanical arm to move through the calculated angle.
Preferably, the surgical navigation instruction includes the distance and direction information between the first mechanical arm and the lesion region of the target tissue or organ, and the infrared guide point is a visible infrared dot used during the patient's surgery to guide the doctor to the lesion position of the tissue or organ.
Compared with the prior art, the surgical navigation and positioning robot and its control method of the present invention achieve the following technical effects through the above technical solution. Ultrasonic images of the target tissue or organ during surgery are acquired by the ultrasonic probe mounted on the first mechanical arm, and the lesion region of the target tissue or organ is tracked and located in real time by the infrared locator, so that the position of the lesion region is clearly visible and the doctor is automatically navigated to it, which facilitates the surgical operation and improves its efficiency. In addition, the 3D display mounted on the second mechanical arm can move freely to any position the doctor requires, and the ultrasonic images are shown on the 3D display as an operating reference for the doctor during surgery, thereby improving the accuracy and safety of the operation.
Brief description of the drawings
FIG. 1 is a schematic diagram of the application environment of a preferred embodiment of the surgical navigation and positioning robot of the present invention;
FIG. 2 is a schematic diagram of the internal circuit connections of the surgical navigation and positioning robot of the present invention;
FIG. 3 is a flowchart of a preferred embodiment of the control method of the surgical navigation and positioning robot of the present invention.
The realization of the objects of the present invention, its functional features, and its advantages will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Embodiment
To further illustrate the technical means and effects adopted by the present invention to achieve the above objects, the embodiments, structures, features, and effects of the present invention are described in detail below in conjunction with the accompanying drawings and preferred embodiments. It should be understood that the specific embodiments described herein are only intended to explain the present invention, not to limit it.
Referring to FIG. 1, FIG. 1 is a schematic diagram of the application environment of a preferred embodiment of the surgical navigation and positioning robot of the present invention. In this embodiment, the surgical navigation and positioning robot 01 can be placed in an operating room, in which an operating table 02 on which a patient lies for surgery is also placed. The medical robot 01 includes, but is not limited to, a first mechanical arm 1, a second mechanical arm 2, and a robot body 3. A microcontroller 30, a memory 31, and a power supply unit 32 are provided in the robot body 3, and an operation handle 33 is further provided on the outer surface of the robot body 3.
An ultrasonic probe 11 is provided on the first mechanical arm 1. The ultrasonic probe 11 is electrically connected to the microcontroller 30 and captures, in real time by means of ultrasonic waves, ultrasonic images of the patient's tissues and organs during surgery; the ultrasonic images are 3D images of the tissue or organ. An infrared locator 12 is provided on the outer surface of the ultrasonic probe 11 and produces, during the patient's surgery, an infrared guide point for guiding the doctor to the lesion position of the tissue or organ.
A 3D display 21 is provided on the second mechanical arm 2 and is electrically connected to the microcontroller 30. In this embodiment of the invention, the 3D display 21 is mounted on the second mechanical arm 2 so that it can move freely together with the arm. In the prior art, a 3D display 21 is a display system that gives the doctor a stereoscopic virtual sensation, providing the doctor with images perceived in three dimensions and thus a 3D display effect.
In this embodiment, the 3D display 21 can be implemented as a small, lightweight 3D display module. The 3D display 21 can be coupled to the second mechanical arm 2 with certain degrees of freedom of movement, so that the doctor can move the 3D display 21 as required. For example, the 3D display 21 can be moved according to the doctor's posture and needs, such as while the doctor sits or stands to perform robotic surgery. The 3D display 21 can be moved by the second mechanical arm 2 in front of the doctor's eyes, providing the most convenient position for the doctor to watch the images on the 3D display 21.
The microcontroller 30 of this embodiment can drive the first mechanical arm 1 and the second mechanical arm 2 to move freely through certain angles and directions. The microcontroller 30 can be a microprocessor, a microcontroller unit (MCU), or the like embedded in the robot body 3. The microcontroller 30 can receive a manual input instruction from the doctor and determine the final position to which the 3D display 21 is to be moved. Given that final position, the microcontroller 30 can calculate the angle through which the second mechanical arm 2 is to be driven, and then generate and transmit a drive signal that drives the second mechanical arm 2 to move through the calculated angle.
In this embodiment, because the 3D display 21 is coupled to the second mechanical arm 2, the 3D display 21 can be moved to any position the doctor desires, so that the doctor can adopt any posture for the operation while still watching the 3D ultrasonic images and lesion-region images needed during surgery. Therefore, when the robot assists an operation, not only the doctor but also an assistant or nurse can freely adjust the position of the 3D display 21 to watch the 3D images and help the doctor complete the operation accurately.
The microcontroller 30 of this embodiment can transfer a drive signal input through the operation handle 33 to the second mechanical arm 2, so that the 3D display 21 is moved in front of the doctor's eyes. In the case of manual movement by the doctor, when the doctor operates the operation handle 33 to place the 3D display 21 in front of his or her eyes, the microcontroller 30 drives the second mechanical arm 2 according to the input from the operation handle 33 and places the 3D display 21 in front of the doctor's eyes.
Referring to FIG. 2, FIG. 2 is a schematic diagram of the internal circuit connections of the surgical navigation and positioning robot of the present invention. In this embodiment, the ultrasonic probe 11, the infrared locator 12, the 3D display 21, the memory 31, and the power supply unit 32 are electrically connected to the microcontroller 30. In this embodiment, "electrically connected" means that each electrical component is connected to the microcontroller 30 by one or more of conductor wires, signal wires, and control wires, so that the microcontroller 30 can control each of the above electrical components to complete its corresponding function.
In this embodiment, the microcontroller 30 can be a central processing unit (CPU), a microprocessor, a microcontroller unit chip (MCU), a data processing chip, or a control unit with data processing functions. The memory 31 can be a read-only memory unit (ROM), an electrically erasable memory unit (EEPROM), a flash memory unit (FLASH), or the like. The memory 31 stores reference images of normal human tissues and organs, as well as computer program instructions prepared in advance; the microcontroller 30 can read and load the computer program instructions from the memory 31 and execute them, so that the surgical navigation and positioning robot 01 can provide surgical guidance during the patient's surgery.
The power supply unit 32 includes a rechargeable lithium battery 321 and a charging base 322. The lithium battery 321 is electrically connected to the microcontroller 30 and provides working power for the robot 01. The charging base 322 is electrically connected to the lithium battery 321 and connects to an external power source to charge the lithium battery 321.
The present invention also provides a control method for a surgical navigation and positioning robot based on a 3D display function, applied to the surgical navigation and positioning robot 01. Referring to FIG. 3, FIG. 3 is a flowchart of a preferred embodiment of the control method of the surgical navigation and positioning robot of the present invention. In this embodiment, the various method steps of the control method are realized by a computer software program, which takes the form of computer program instructions and is stored in a computer-readable storage medium (such as the memory 31); the storage medium can include a read-only memory, a random access memory, a magnetic disk, an optical disk, or the like. The computer program instructions can be loaded by a processor to perform the following steps S31 to S40.
Step S31: controlling the ultrasonic probe of the first mechanical arm to capture, in real time, ultrasonic images of the target tissue or organ during the patient's surgery. Specifically, the microcontroller 30 controls the first mechanical arm 1 to move near the patient's operating table 02 and starts the ultrasonic probe 11 to capture, in real time, ultrasonic images of the target tissue or organ on the operating table 02 during the patient's surgery. In this embodiment, the ultrasonic probe 11 can be a three-dimensional ultrasonic probe that emits a pyramidal volumetric ultrasonic beam, acquires three-dimensional ultrasonic images of the target tissue or organ in real time, and sends them to the microcontroller 30.
Step S32: comparing the ultrasonic images of the target tissue or organ with the reference images of normal tissues or organs to locate the lesion region of the target tissue or organ. Specifically, the microcontroller 30 reads the reference images of normal tissues or organs stored in the memory 31, and compares the ultrasonic images of the target tissue or organ with the reference images to locate the lesion region of the target tissue or organ. In this embodiment, the microcontroller 30 compares the ultrasonic images of the target tissue or organ with the reference images of normal tissues or organs to determine the difference in texture distribution between the two, and locates the lesion region according to that difference. The difference in texture distribution includes the differences in tissue structure, size, and outline that arise when a human tissue or organ develops a lesion.
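One simple way to make the "difference in texture distribution" idea concrete is to compare the intensity histograms of the target and reference images and flag a lesion when they diverge. The sketch below is a hypothetical, heavily simplified illustration — the patent does not prescribe any particular texture measure, and real ultrasound texture analysis is far more involved.

```python
# Hypothetical sketch: texture distribution approximated by an intensity
# histogram, difference measured as total-variation distance. Pixel values
# are assumed normalized to [0, 1).

def histogram(pixels, bins=4):
    counts = [0] * bins
    for p in pixels:
        counts[min(int(p * bins), bins - 1)] += 1
    total = len(pixels)
    return [c / total for c in counts]

def texture_difference(target, reference, bins=4):
    h1, h2 = histogram(target, bins), histogram(reference, bins)
    # total-variation distance between the two distributions, in [0, 1]
    return sum(abs(a - b) for a, b in zip(h1, h2)) / 2

# Toy 1-D "images": the bright pixels in the target have no counterpart
# in the reference, so the histograms diverge strongly.
target = [0.05, 0.1, 0.9, 0.95, 0.85, 0.1]
reference = [0.05, 0.1, 0.15, 0.1, 0.05, 0.1]
diff = texture_difference(target, reference)
print(diff > 0.3)  # a large divergence would suggest a lesion region
```

A real system would compute such a measure per image region rather than per image, so that the divergent regions themselves localize the lesion.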
Step S33: establishing a space coordinate system with the operating table on which the patient lies as the horizontal plane and with the position and direction of the ultrasonic probe. Specifically, the microcontroller 30 establishes the space coordinate system with the operating table 02 on which the patient lies as the horizontal plane and with the position and direction of the ultrasonic probe 11. In this embodiment, the position coordinates of the lesion region include the position and direction of the lesion region relative to the ultrasonic probe 11. As shown in FIG. 1, the space coordinate system XYZ is established with the operating table 02 on which the patient lies as the horizontal plane and with the position and direction of the ultrasonic probe 11.
Step S34: calculating the position coordinates of the lesion region based on the space coordinate system. Specifically, from the position and direction of the ultrasonic probe 11 in the space coordinate system XYZ, the microcontroller 30 calculates the position coordinates in XYZ of any point in the ultrasonic images, and thereby determines the position and direction of the target tissue or organ relative to the ultrasonic probe 11.
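Mapping a point from the probe's image frame into the table-based XYZ frame is a standard rigid-body transform: rotate by the probe's orientation, then translate by its position. The sketch below illustrates this for a probe whose orientation is reduced to a single yaw angle about the vertical axis; the patent does not prescribe a formula, so the function and parameter names are assumptions.

```python
# Hypothetical sketch of step S34: a point in the probe's frame is
# transformed into the table-based XYZ frame from the probe's known
# position and orientation (here simplified to a yaw angle about Z).
import math

def to_table_frame(point_probe, probe_position, yaw):
    """Rotate by the probe's yaw about Z, then translate by its position."""
    x, y, z = point_probe
    c, s = math.cos(yaw), math.sin(yaw)
    xr, yr = c * x - s * y, s * x + c * y
    px, py, pz = probe_position
    return (xr + px, yr + py, z + pz)

# A point 10 cm straight ahead of a probe rotated 90 degrees and held
# at (0.05, 0.0, 0.20) m over the table:
p = to_table_frame((0.10, 0.0, 0.0), (0.05, 0.0, 0.20), math.pi / 2)
# p is approximately (0.05, 0.10, 0.20)
```

A full implementation would use a 3x3 rotation matrix (or quaternion) for the probe's complete orientation, but the rotate-then-translate structure is the same.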
Step S35: generating a surgical navigation instruction according to the position coordinates of the lesion region. Specifically, the microcontroller 30 generates the surgical navigation instruction according to the position coordinates of the lesion region. In this embodiment, the surgical navigation instruction includes the distance and direction information between the first mechanical arm 1 and the lesion region of the target tissue or organ.
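The distance and direction carried by the navigation instruction follow directly from the two positions in the shared XYZ frame: the distance is the norm of the displacement vector, the direction its unit vector. The sketch below is an illustrative assumption about how such an instruction could be packaged; the patent does not specify a data format.

```python
# Hypothetical sketch of step S35: the navigation instruction carries the
# distance and direction from the first mechanical arm to the lesion region.

def navigation_instruction(arm_pos, lesion_pos):
    delta = tuple(l - a for a, l in zip(arm_pos, lesion_pos))
    distance = sum(d * d for d in delta) ** 0.5
    direction = tuple(d / distance for d in delta)  # unit vector toward lesion
    return {"distance": distance, "direction": direction}

# Arm at the origin, lesion 0.3 m along X and 0.4 m along Z:
instr = navigation_instruction((0.0, 0.0, 0.0), (0.3, 0.0, 0.4))
# distance is approximately 0.5, direction approximately (0.6, 0.0, 0.8)
```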
Step S36: controlling the infrared locator on the ultrasonic probe to produce an infrared guide point. Specifically, the microcontroller 30 controls the infrared locator 12 on the ultrasonic probe 11 to produce the infrared guide point. In this embodiment, the infrared guide point is a visible infrared dot used during the patient's surgery to guide the doctor to the lesion position of the tissue or organ.
Step S37: driving the moving direction of the first mechanical arm according to the surgical navigation instruction so that the infrared guide point is projected onto the lesion region. Specifically, the microcontroller 30 controls the moving direction of the first mechanical arm 1 according to the surgical navigation instruction so that the infrared guide point falls on the lesion region of the target tissue or organ, which helps the doctor find the lesion position of the target tissue or organ quickly and accurately, thereby facilitating the surgical operation and improving its efficiency.
Step S38: when an operating instruction input from the operation handle 33 is received, generating according to the operating instruction a drive signal for moving the 3D display 21 on the second mechanical arm 2 in front of the doctor's eyes. In this embodiment, when the microcontroller 30 receives an operating instruction manually input by the doctor through the operation handle 33, it generates according to the operating instruction a drive signal for driving the second mechanical arm 2. The operating instruction includes moving left, moving right, moving up, and moving down; that is, the doctor or nurse can manually toggle the operation handle 33 to the left, right, forward, or backward, so that the second mechanical arm 2 drives the 3D display 21 to move left, right, up, or down to the required final position. Specifically, the microcontroller 30 calculates according to the operating instruction the final position to which the 3D display 21 is to be moved, calculates from that final position the angle through which the second mechanical arm 2 is to be driven, and then generates and transmits a drive signal that drives the second mechanical arm 2 to move through the calculated angle.
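The angle computation in this step can be illustrated with the simplest possible case: a single base joint rotating the arm in the horizontal plane, where the required rotation is just the difference between the bearings of the current and final display positions. This is an assumed, one-joint simplification — a real arm would need multi-joint inverse kinematics — and the drive-signal dictionary is likewise invented for illustration.

```python
# Hypothetical sketch of step S38: derive the base-joint rotation that
# carries the 3D display from its current position to the final position,
# then package it as a drive signal for the second mechanical arm.
import math

def drive_angle(current_xy, final_xy):
    """Angle between the bearings of the two positions, about the base."""
    a0 = math.atan2(current_xy[1], current_xy[0])
    a1 = math.atan2(final_xy[1], final_xy[0])
    return a1 - a0

def drive_signal(current_xy, final_xy):
    return {"joint": "base", "angle_rad": drive_angle(current_xy, final_xy)}

# Moving the display a quarter turn around the base, from (1, 0) to (0, 1):
sig = drive_signal((1.0, 0.0), (0.0, 1.0))  # angle_rad is pi/2
```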
Step S39: driving the second mechanical arm 2 according to the drive signal so that the 3D display 21 moves in front of the doctor's eyes. Specifically, the microcontroller 30 drives the second mechanical arm 2 to move according to the drive signal, thereby moving the 3D display 21 mounted on the second mechanical arm 2 in front of the doctor's eyes and providing the most convenient position for the doctor to watch the ultrasonic images on the 3D display 21.
Step S40: showing the ultrasonic images of the target tissue or organ on the 3D display 21. Because the 3D display 21 can be moved to any position the doctor desires, the doctor can adopt any posture for the operation while watching the ultrasonic images needed during surgery, which serve as an operating reference and thereby improve the accuracy and safety of the operation. Therefore, when the surgical navigation and positioning robot 01 assists an operation, not only the doctor but also an assistant or nurse can freely adjust the position of the 3D display 21 to watch the 3D ultrasonic images and help the doctor complete the operation accurately.
The surgical navigation and positioning robot of the present invention acquires, through the ultrasonic probe 11 mounted on the first mechanical arm 1, ultrasonic images of the target tissue or organ during surgery, and tracks and locates the lesion region of the target tissue or organ in real time through the infrared locator 12, so that the position of the lesion region is clearly visible and the doctor is automatically navigated to it, which facilitates the surgical operation and improves its efficiency. In addition, in the surgical navigation and positioning robot of the present invention, the 3D display 21 mounted on the second mechanical arm 2 can move freely to any position the doctor requires, and the ultrasonic images are shown on the 3D display 21 as an operating reference for the doctor during surgery, thereby improving the accuracy and safety of the operation.
The above are only preferred embodiments of the present invention and do not limit its scope. Any equivalent structure or equivalent functional transformation made using the contents of the specification and accompanying drawings of the present invention, whether applied directly or indirectly in other related technical fields, falls within the scope of protection of the present invention.
Claims (10)
1. A surgical navigation positioning robot, comprising a first mechanical arm, a second mechanical arm, and a robot body, characterized in that an ultrasonic probe is provided on the first mechanical arm, a 3D display device is provided on the second mechanical arm, an infrared locator is provided on the outer surface of the ultrasonic probe, and an operation handle is provided on the outer surface of the robot body; a microcontroller adapted to execute program instructions and a memory adapted to store a plurality of program instructions are provided in the robot body, and the program instructions are loaded and executed by the microcontroller to perform the following steps:
controlling the ultrasonic probe on the first mechanical arm to capture, in real time, ultrasound images of the target tissue or organ during the patient's operation;
comparing the ultrasound images of the target tissue or organ with reference images of the normal tissue or organ to locate the lesion region of the target tissue or organ;
establishing a spatial coordinate system with the operating table on which the patient lies as the horizontal plane and with the position and orientation of the ultrasonic probe, and calculating the position coordinates of the lesion region in this coordinate system;
generating a surgical navigation instruction according to the position coordinates of the lesion region, and controlling the infrared locator to produce an infrared guiding light spot;
driving the first mechanical arm to move according to the surgical navigation instruction so that the infrared guiding light spot is cast on the lesion region;
upon receiving an operation instruction input from the operation handle, generating, according to the operation instruction, a drive signal for moving the 3D display device on the second mechanical arm in front of the doctor's eyes;
driving the second mechanical arm according to the drive signal so that the 3D display device moves in front of the doctor's eyes, and displaying the ultrasound images of the target tissue or organ on the 3D display device.
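The coordinate step in claim 1 (a table-plane coordinate system anchored to the probe's position and orientation) is not elaborated in the claims. As a minimal sketch, assuming the probe's pose in the operating-table frame is known and rotation is about the table's vertical axis only, a lesion point detected in the probe's own frame could be mapped to table coordinates with a rigid transform; all names and the single-axis assumption are hypothetical:

```python
import math

def lesion_table_coords(probe_pos, probe_yaw_deg, lesion_in_probe):
    """Map a lesion point from the probe frame to the operating-table frame.

    probe_pos:       (x, y, z) of the probe origin in the table frame
    probe_yaw_deg:   probe rotation about the table's vertical axis, in degrees
    lesion_in_probe: (x, y, z) of the lesion in the probe's own frame
    """
    yaw = math.radians(probe_yaw_deg)
    px, py, pz = lesion_in_probe
    # Rotate about the vertical (z) axis, then translate by the probe origin.
    x = probe_pos[0] + px * math.cos(yaw) - py * math.sin(yaw)
    y = probe_pos[1] + px * math.sin(yaw) + py * math.cos(yaw)
    z = probe_pos[2] + pz
    return (x, y, z)

# Example: a lesion 3 cm ahead of a probe at (10, 20, 5), probe turned 90 degrees.
lesion = lesion_table_coords((10.0, 20.0, 5.0), 90.0, (3.0, 0.0, 0.0))
```

A full implementation would use the probe's complete 3-D orientation (e.g. a rotation matrix from a tracking system) rather than a single yaw angle.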
2. The surgical navigation positioning robot of claim 1, characterized in that the step of comparing the ultrasound images of the target tissue or organ with the reference images of the normal tissue or organ to locate the lesion region of the target tissue or organ comprises the following steps:
reading the reference images of the normal tissue or organ from the memory;
comparing the ultrasound images of the target tissue or organ with the reference images of the normal tissue or organ to determine the difference in their texture distributions, and locating the lesion region of the target tissue or organ according to that difference.
3. The surgical navigation positioning robot of claim 1, characterized in that the step of generating, according to the operation instruction, a drive signal for moving the 3D display device on the second mechanical arm in front of the doctor's eyes comprises the following steps:
calculating, according to the operation instruction, the end position to which the 3D display device is to be moved;
calculating, from the end position of the 3D display device, the angle through which the second mechanical arm is to be driven;
generating and transmitting a drive signal that drives the second mechanical arm through the calculated angle.
4. The surgical navigation positioning robot of any one of claims 1 to 3, characterized in that the ultrasonic probe, the infrared locator, the operation handle, the 3D display device, and the memory are each electrically connected to the microcontroller.
5. The surgical navigation positioning robot of claim 4, characterized in that the robot body is further provided with a power supply unit, the power supply unit comprising a rechargeable lithium battery and a charging cradle, the lithium battery being connected to the microcontroller and the charging cradle being electrically connected to the lithium battery.
6. A control method for a surgical navigation positioning robot, the surgical navigation positioning robot comprising a first mechanical arm, a second mechanical arm, and a robot body, characterized in that an ultrasonic probe is provided on the first mechanical arm, a 3D display device is provided on the second mechanical arm, an infrared locator is provided on the outer surface of the ultrasonic probe, and an operation handle is provided on the outer surface of the robot body, wherein the control method of the surgical navigation positioning robot comprises the steps of:
controlling the ultrasonic probe on the first mechanical arm to capture, in real time, ultrasound images of the target tissue or organ during the patient's operation;
comparing the ultrasound images of the target tissue or organ with reference images of the normal tissue or organ to locate the lesion region of the target tissue or organ;
establishing a spatial coordinate system with the operating table on which the patient lies as the horizontal plane and with the position and orientation of the ultrasonic probe, and calculating the position coordinates of the lesion region in this coordinate system;
generating a surgical navigation instruction according to the position coordinates of the lesion region, and controlling the infrared locator to produce an infrared guiding light spot;
driving the first mechanical arm to move according to the surgical navigation instruction so that the infrared guiding light spot is cast on the lesion region;
upon receiving an operation instruction input from the operation handle, generating, according to the operation instruction, a drive signal for moving the 3D display device on the second mechanical arm in front of the doctor's eyes;
driving the second mechanical arm according to the drive signal so that the 3D display device moves in front of the doctor's eyes, and displaying the ultrasound images of the target tissue or organ on the 3D display device.
7. The control method of claim 6, characterized in that the step of comparing the ultrasound images of the target tissue or organ with the reference images of the normal tissue or organ to locate the lesion region of the target tissue or organ comprises the following steps:
reading the reference images of the normal tissue or organ stored in the memory;
comparing the ultrasound images of the target tissue or organ with the reference images of the normal tissue or organ to determine the difference in their texture distributions;
locating the lesion region of the target tissue or organ according to the texture distribution difference.
8. The control method of claim 7, characterized in that the texture distribution difference comprises the differences in tissue structure, size, and appearance contour that arise when a human tissue or organ develops a lesion.
9. The control method of claim 6, characterized in that the step of generating, according to the operation instruction, a drive signal for moving the 3D display device on the second mechanical arm in front of the doctor's eyes comprises the following steps:
calculating, according to the operation instruction, the end position to which the 3D display device is to be moved;
calculating, from the end position of the 3D display device, the angle through which the second mechanical arm is to be driven;
generating and transmitting a drive signal that drives the second mechanical arm through the calculated angle.
10. The control method of claim 6, characterized in that the surgical navigation instruction comprises distance and direction information between the first mechanical arm and the lesion region of the target tissue or organ, and the infrared guiding light spot is a visible infrared dot that guides the doctor to find the lesion location on the tissue or organ during the patient's operation.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711027805.5A CN107714178A (en) | 2017-10-28 | 2017-10-28 | Surgical navigational positioning robot and its control method |
PCT/CN2017/116668 WO2019080317A1 (en) | 2017-10-28 | 2017-12-15 | Robot for surgical navigation and position indication and control method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711027805.5A CN107714178A (en) | 2017-10-28 | 2017-10-28 | Surgical navigational positioning robot and its control method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107714178A true CN107714178A (en) | 2018-02-23 |
Family
ID=61203061
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711027805.5A Pending CN107714178A (en) | 2017-10-28 | 2017-10-28 | Surgical navigational positioning robot and its control method |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN107714178A (en) |
WO (1) | WO2019080317A1 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108742876A (en) * | 2018-08-02 | 2018-11-06 | 雅客智慧(北京)科技有限公司 | A kind of operation navigation device |
CN109938768A (en) * | 2019-03-11 | 2019-06-28 | 深圳市比邻星精密技术有限公司 | Ultrasonic imaging method, device, computer equipment and storage medium |
CN112603546A (en) * | 2020-12-24 | 2021-04-06 | 哈尔滨思哲睿智能医疗设备有限公司 | Remote operation system based on laparoscopic operation robot and control method |
CN113855123A (en) * | 2021-11-12 | 2021-12-31 | 郑州大学第一附属医院 | Surgical operation auxiliary robot |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN2710567Y (en) * | 2004-07-22 | 2005-07-20 | 上海英迈吉东影图像设备有限公司 | Guiding system mechanical arm for operation |
CN100536770C (en) * | 2007-03-29 | 2009-09-09 | 新奥博为技术有限公司 | Surgical operation system under the guide of magnetic resonant image and the operation navigating method |
CN101375805A (en) * | 2007-12-29 | 2009-03-04 | 清华大学深圳研究生院 | Method and system for guiding operation of electronic endoscope by auxiliary computer |
US8958611B2 (en) * | 2011-12-29 | 2015-02-17 | Mako Surgical Corporation | Interactive CSG subtraction |
CN103908345B (en) * | 2012-12-31 | 2017-02-08 | 复旦大学 | Volume data visualization method for surgical navigation based on PPC (Panel Personal Computer) |
CN105943161A (en) * | 2016-06-04 | 2016-09-21 | 深圳市前海康启源科技有限公司 | Surgical navigation system and method based on medical robot |
CN206063225U (en) * | 2016-06-04 | 2017-04-05 | 深圳市前海康启源科技有限公司 | For para-operative medical robot |
2017 events:
- 2017-10-28: CN CN201711027805.5A patent/CN107714178A/en active Pending
- 2017-12-15: WO PCT/CN2017/116668 patent/WO2019080317A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2019080317A1 (en) | 2019-05-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10588701B2 (en) | Systems, apparatus, methods and computer-readable storage media facilitating surgical procedures utilizing augmented reality | |
CN107714178A (en) | Surgical navigational positioning robot and its control method | |
ES2907252T3 (en) | System for performing automated surgical and interventional procedures | |
US11039894B2 (en) | Robotic port placement guide and method of use | |
JP2022017422A (en) | Augmented reality surgical navigation | |
CN107669340A (en) | 3D image surgical navigational robots and its control method | |
CN105943161A (en) | Surgical navigation system and method based on medical robot | |
US11602403B2 (en) | Robotic tool control | |
WO2016081023A1 (en) | Ultrasound imaging system having automatic image presentation | |
EP2844342A2 (en) | Videographic display of real-time medical treatment | |
US20210128248A1 (en) | Robotic medical apparatus, system, and method | |
CN116077155B (en) | Surgical navigation method based on optical tracking equipment and mechanical arm and related device | |
US11660142B2 (en) | Method for generating surgical simulation information and program | |
JP2021166593A (en) | Robot surgery support system, robot surgery support method, and program | |
JP7401447B2 (en) | Ultrasonic imaging plane alignment using neural networks and related devices, systems, and methods | |
CN106236258A (en) | The method and device for planning of abdominal-cavity minimal-invasion surgery puncture path | |
CN109009348A (en) | A kind of robot puncturing system | |
CN108404301A (en) | A kind of tumor radiotherapy auxiliary robot production method | |
CN109152612A (en) | Robotic surgical system with embedded imaging instrument | |
Zinchenko et al. | Virtual reality control of a robotic camera holder for minimally invasive surgery | |
Lim et al. | Image-guided robotic mastoidectomy using human-robot collaboration control | |
CN116966450A (en) | Focusing ultrasonic noninvasive ablation device, and ablation preoperative planning method and system | |
WO2022219559A1 (en) | Systems, methods and programs for estimating needle pose | |
CN115227349A (en) | Lung puncture robot based on optical tracking technology | |
CN113397705A (en) | Fracture reduction navigation method and system |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20180223 |