CN105078580B - Surgical robot system and laparoscope manipulation method thereof, and immersive ("human body temperature type") surgical image processing apparatus and method thereof
- Publication number: CN105078580B
- Application number: CN201510446194.2A
- Authority
- CN
- China
- Prior art keywords
- image
- endoscope
- endoscopic images
- images
- picture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/313—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes
- A61B1/3132—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes for laparoscopy
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/37—Master-slave robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/74—Manipulators with manual electric input means
- A61B2034/742—Joysticks
Abstract
Disclosed is an immersive surgical image processing apparatus, which includes: an image input unit that receives a first endoscopic image and a second endoscopic image provided at mutually different points in time from a surgical endoscope whose distal end rotates; a picture display part that outputs the first endoscopic image and the second endoscopic image to mutually different regions; and a screen display control unit that controls the picture display part so that the first endoscopic image and the second endoscopic image are output to mutually different regions corresponding to the differing viewpoints of the surgical endoscope. By rotating and moving the monitor in accordance with the continually changing viewpoint of the endoscope, the apparatus lets the user experience the reality of the operation more vividly.
Description
This application is a divisional application. The original application was filed on October 28, 2011, under application number 201180052600.7, and is entitled "Surgical robot system and laparoscope manipulation method thereof, and immersive surgical image processing apparatus and method thereof".
Technical field
The present invention relates to surgery, and more particularly to a surgical robot system and a laparoscope manipulation method thereof, and to an immersive surgical image processing apparatus and method.
Background technology
A surgical robot is a robot capable of substituting for the surgical actions performed by a doctor. Compared with a human, such a robot can perform more accurate and precise operations and permits remote surgery. Surgical robots currently under development worldwide include orthopedic surgery robots, laparoscopic (laparoscope) surgery robots, and stereotactic surgery robots. A laparoscopic surgery robot performs minimally invasive surgery using a laparoscope and small surgical tools.
Laparoscopic surgery is an advanced surgical technique in which a hole of about 1 cm is made near the navel and a laparoscope (an endoscope for observing the interior of the abdominal cavity) is inserted through it; it is a field expected to grow considerably in the future. Recent laparoscopes are equipped with computer chips and provide magnified images that are clearer than what the naked eye can see, and with specially designed laparoscopic surgical instruments operated while watching a monitor, virtually any operation can be performed.
Moreover, the scope of laparoscopic surgery is almost the same as that of laparotomy, but it causes fewer complications, allows treatment to resume sooner after the operation, and better preserves the patient's stamina and immune function. For these reasons, in countries such as the United States and those of Europe, laparoscopic surgery is increasingly recognized as the standard procedure for treating colorectal cancer and other conditions.
A surgical robot system generally includes a master robot and a slave robot. When the operator manipulates a manipulator (for example, a handle) provided on the master robot, a surgical tool coupled to or held by the robotic arm of the slave robot is operated, and surgery is thereby performed. The master robot and the slave robot are connected over a network and communicate with each other.
In an existing surgical robot system, however, the operator must perform separate manipulations to obtain images of the surgical site, moving the laparoscope to the required position or adjusting its image input angle. That is, during surgery the operator must use a hand or foot to separately control the laparoscope. This can reduce the operator's concentration during an operation that demands sustained, intense attention, and an imprecise manipulation caused by reduced attention may lead to serious complications.
At present, when laparoscopic surgery is performed, the operative image captured by the laparoscope is output to the user, who performs the operation while watching that image; compared with open surgery performed directly, however, the sense of reality is reduced. One cause of this problem is that while the laparoscope moves and rotates inside the abdominal cavity to view different sites, the image the user sees is always output on a monitor of the same position and size, so the relative sense of distance and motion between the manipulator and the image differs from that between the actual surgical tools and organs inside the abdominal cavity. In addition, the operative image captured by the laparoscope includes only partial images of the surgical tools, so when those tools collide with or overlap one another, the user may find them difficult to operate, or the view may be blocked so that the operation cannot proceed smoothly.
The background art described above is technical information that the inventors possessed in order to derive the present invention or acquired in the course of deriving it, and is not necessarily technology publicly known before the filing of the present application.
Content of the invention
Technical problem
An object of the present invention is to provide a surgical robot system, and a laparoscope manipulation method thereof, in which the operator can control the position and image input angle of the laparoscope simply by performing the action of looking at the desired surgical site.
Another object of the present invention is to provide a surgical robot system, and a laparoscope manipulation method thereof, that let the operator concentrate fully on the surgical action without performing separate manipulations to operate the laparoscope.
A further object of the present invention is to provide a surgical robot system, and a laparoscope manipulation method thereof, whose operation can be understood intuitively using a face recognition apparatus, without the need to learn a special operating method.
A further object of the present invention is to provide a surgical robot system, and a laparoscope manipulation method thereof, in which an endoscope apparatus can be controlled in various ways by facial motion in three-dimensional space alone, without using the arms.
A further object of the present invention is to provide an immersive surgical image processing apparatus, and a method thereof, that change the output position of the endoscopic image on the monitor the user watches, in correspondence with the endoscope viewpoint that changes according to the motion of the surgical endoscope, so that the user experiences the actual surgical situation more realistically.
A further object of the present invention is to provide an immersive surgical image processing apparatus, and a method thereof, that extract a stored endoscopic image input before the current time and output it to the picture display part together with the current endoscopic image, thereby notifying the user of how the endoscopic image has changed.
A further object of the present invention is to provide an immersive surgical image processing apparatus, and a method thereof, in which the endoscopic image actually captured by the endoscope during surgery and a modeled image of the surgical tools generated in advance and stored in an image storage part are matched to each other or corrected (for example, by adjusting their sizes) and then output to a monitoring unit observable by the user.
A further object of the present invention is to provide an immersive surgical image processing apparatus, and a method thereof, that rotate and move the monitor in correspondence with the continually changing viewpoint of the endoscope, so that the user experiences the reality of the operation more vividly.
Other technical problems beyond those set out above will be readily understood from the following description.
Means for solving the technical problem
According to an embodiment of the present invention, there is provided a surgical robot that controls one or more of the position and the image input angle of a vision portion using an operation signal, the robot including: a junction portion that is movable in a direction and by an amount corresponding to the direction and magnitude of motion of the operator's face in contact with it; a motion detecting section that outputs sensing information corresponding to the direction and magnitude of the junction portion's movement; and an operational order generating unit that uses the sensing information to generate and output operational orders concerning one or more of the position and the image input angle of the vision portion.
When the operational order concerns at least one of the linear operation and the rotational operation of the vision portion, the direction of the operation handle of the surgical robot may be changed correspondingly.
The junction portion may be formed as part of a console panel of the surgical robot.
To hold the operator's face in a fixed position, a support may protrude from one or more positions on the junction portion.
An eye-contact portion may be formed as an opening through the junction portion so that the image obtained by the vision portion is visible as visual information. Alternatively, the junction portion may be formed of a light-transmitting material so that the image obtained by the vision portion is visible as visual information.
The surgical robot may further include: a contact detecting unit that detects whether the operator's face is in contact with the junction portion or the support; and an original-state recovery unit that, when the contact detecting unit recognizes release of the contact, returns the junction portion to a normal state, that is, the position and state designated as the default. The original-state recovery unit may return the junction portion to the normal state by operating it in the reverse of the movement direction and magnitude given by the sensing information.
The surgical robot may further include an eye tracking unit that compares the sequentially generated image data in chronological order and generates parsing information concerning at least one of pupil position change, eye shape change, and gaze direction. The surgical robot may also further include: an image pickup unit that photographs the contacting face from inside the surgical robot to generate image data; and a storage unit for storing the generated image data. The operational order generating unit judges whether the parsing information matches a change predetermined for some operational order and, if it does, outputs the corresponding operational order.
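The eye-tracking flow above (comparing sequential frames and mapping a qualifying pupil-position change to an operational order) can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the threshold value and command names are assumptions.

```python
import math

# Assumed threshold: pupil displacement (pixels) treated as an intentional gaze shift.
GAZE_THRESHOLD = 15.0

def parse_pupil_change(prev_xy, curr_xy):
    """Return parsing information from two sequentially captured pupil centers."""
    dx = curr_xy[0] - prev_xy[0]
    dy = curr_xy[1] - prev_xy[1]
    return {"dx": dx, "dy": dy, "magnitude": math.hypot(dx, dy)}

def generate_command(parsing_info):
    """Map parsed pupil change to a vision-portion command, or None if too small."""
    if parsing_info["magnitude"] < GAZE_THRESHOLD:
        return None  # change too small to count as a deliberate operation
    if abs(parsing_info["dx"]) >= abs(parsing_info["dy"]):
        return "PAN_RIGHT" if parsing_info["dx"] > 0 else "PAN_LEFT"
    return "PAN_DOWN" if parsing_info["dy"] > 0 else "PAN_UP"

info = parse_pupil_change((100.0, 100.0), (130.0, 104.0))
print(generate_command(info))  # rightward gaze shift -> PAN_RIGHT
```

Small, involuntary pupil movements fall below the threshold and produce no command, which mirrors the "predetermined change" check described above.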
The vision portion may be a microscope or an endoscope, and the endoscope may be one or more of a laparoscope, thoracoscope, arthroscope, rhinoscope, cystoscope, proctoscope, duodenoscope, mediastinoscope, and cardioscope.
The junction portion may be formed on the front surface of the console panel so as to be supported by an elastic body, and the elastic body may provide a restoring force that returns the junction portion to its original position when the external force applied by the movement of the contacting face is released.
According to another embodiment of the present invention, there is provided a surgical robot that controls one or more of the position and the image input angle of a vision portion using an operation signal, the robot including: an eye-contact portion through which the image obtained by the vision portion is provided as visual information; an eye tracking unit that generates parsing information obtained by parsing at least one of the pupil position change, eye shape change, and gaze direction seen through the eye-contact portion; and an operational order generating unit that judges whether the parsing information matches a change predetermined for some operational order and, if it does, outputs an operational order for operating the vision portion.
The eye tracking unit may include: an image pickup unit that photographs toward the eye-contact portion from inside the surgical robot and generates image data; a memory unit for storing the generated image data; and an eye tracking subunit that compares the sequentially generated image data in chronological order and generates parsing information concerning at least one of pupil position change, eye shape change, and gaze direction. The eye-contact portion may be formed as an opening in the junction portion that forms part of the console panel of the surgical robot.
According to another embodiment of the present invention, there is provided a surgical robot that controls a vision portion using an operation signal, the robot including: a photographing unit that photographs a face and generates image data; an angle and distance calculating unit that parses the interior angle and the rotational direction formed between the picture's center line and the line connecting two central points of the face contained in the image data, compares them with the interior angle and rotational direction parsed from previously captured image data, and generates displacement information concerning the rotational direction and rotation angle of the face; and an operational order generating unit that generates and outputs an operational order to operate the vision portion in correspondence with the displacement information. The angle and distance calculating unit may also calculate distance information between a reference point in the face parsed from the previously captured image data and the same reference point in the face parsed from the current image data, and this distance information may be included in the displacement information for use in a parallel-movement operation of the vision portion.
The angle and distance calculating unit may also calculate the amount of change between the two-point spacing in the face parsed from the previously captured image data and the two-point spacing in the face parsed from the current image data; this amount of change may be included in the displacement information for use in adjusting the image magnification of the vision portion.
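The angle-and-distance calculation above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; it assumes the two facial central points are the eye centers and that the picture's center line is horizontal.

```python
import math

def roll_angle_deg(left_eye, right_eye):
    """Angle between the eye-to-eye line and the picture's horizontal center line."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return math.degrees(math.atan2(dy, dx))

def displacement_info(prev_left, prev_right, curr_left, curr_right):
    """Compare the current frame with the previous one to get roll and zoom."""
    prev_angle = roll_angle_deg(prev_left, prev_right)
    curr_angle = roll_angle_deg(curr_left, curr_right)
    prev_span = math.dist(prev_left, prev_right)
    curr_span = math.dist(curr_left, curr_right)
    return {
        "rotation_deg": curr_angle - prev_angle,  # head roll since last frame
        "zoom_ratio": curr_span / prev_span,      # >1: face closer, so zoom in
    }

# Tilting the head raises one eye relative to the other, yielding a roll angle.
info = displacement_info((100, 200), (160, 200), (100, 200), (160, 230))
print(round(info["rotation_deg"], 1))  # about 26.6 degrees of roll
```

The spacing ratio serves the magnification adjustment described above: moving the face closer to the camera enlarges the eye spacing, which a controller could translate into a zoom-in command.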
The vision portion may be one or more of a laparoscope, thoracoscope, arthroscope, rhinoscope, cystoscope, proctoscope, duodenoscope, mediastinoscope, and cardioscope.
The vision portion may be a device for obtaining a three-dimensional image, and the degree of left/right image overlap required for three-dimensional image processing may be adjusted according to the positions of the face and eyes.
The surgical robot may further include: a storage unit that stores a photographic image for authenticating the user; and a judging unit that calculates the degree of similarity between the facial feature elements in the image data and those in the photographic image and, when the calculated similarity is greater than or equal to a predetermined value, controls the operational order generating unit to generate and output operational orders. Here, the facial feature elements may be one or more of the position and shape of constituent parts, pupil color, facial shape, skin color, wrinkle shape, and blush, and the constituent parts may be one or more of the eyes, eyebrows, nose, and mouth.
The surgical robot may further include: a storage unit for storing valid-user information, which is one or more of the area occupied by the facial contour and a reference value for the facial size in the image data; and a judging unit that judges whether the face contained in the image data satisfies the valid-user information and, when it does, controls the operational order generating unit to generate and output operational orders.
The surgical robot may further include a judging unit that judges whether the facial motion state contained in the image data is maintained for a prescribed time or longer and, when it is, controls the operational order generating unit to generate and output operational orders.
The surgical robot may further include a judging unit that judges whether the degree of facial motion contained in the image data exceeds a preset range and, when it does, controls the operational order generating unit to generate and output operational orders.
The surgical robot may further include: a storage unit that stores operational order information generated for changes in one or more of head motion and facial expression; and a judging unit that parses multiple frames of image data to generate change information on one or more of head motion and facial expression, and controls the operational order generating unit to generate and output the operational order corresponding to the operational order information.
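The two safeguards above (a minimum hold time and a minimum degree of motion) can be sketched as a simple gating function. This is a minimal sketch under assumed thresholds; the values and the sample format are illustrative, not taken from the patent.

```python
# Assumed thresholds for treating a facial motion as intentional.
HOLD_SECONDS = 0.5     # minimum time the motion must be maintained
MIN_DISPLACEMENT = 10  # minimum motion magnitude (pixels) to count as intent

def should_issue_command(samples):
    """samples: list of (timestamp_s, displacement) for one sustained motion.

    A command is issued only if the motion is held long enough AND every
    sample stays above the displacement floor, filtering involuntary movement.
    """
    if not samples:
        return False
    held = samples[-1][0] - samples[0][0]
    sustained = all(d >= MIN_DISPLACEMENT for _, d in samples)
    return held >= HOLD_SECONDS and sustained

# A 0.6 s motion that stays above the displacement floor triggers a command.
samples = [(0.0, 12), (0.2, 14), (0.4, 13), (0.6, 15)]
print(should_issue_command(samples))  # True
```

A motion held for only 0.2 s, or one whose displacement dips below the floor, would be rejected, which corresponds to the hold-time and preset-range checks described above.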
According to another embodiment of the present invention, there is provided a vision portion operating method by which a surgical robot controls one or more of the position and the image input angle of a vision portion, the method comprising the steps of: outputting sensing information corresponding to the direction and magnitude of movement of a junction portion; and using the sensing information to generate and output operational orders concerning one or more of the position and the image input angle of the vision portion. The junction portion is formed as part of the console panel of the surgical robot and is movable in a direction and by an amount corresponding to the direction and magnitude of motion of the operator's face.
The vision portion operating method may further comprise the steps of: judging whether the operator's face is in contact with the junction portion; and, when it is in contact, starting control so that the sensing information is output. The method may further comprise the steps of: when the contact is released, judging whether the junction portion is in the normal state, that is, the position and state designated as the default; and, when it is not, performing processing to return it to the normal state. The return to the normal state may be carried out by operating the junction portion in the reverse of the movement direction and magnitude given by the sensing information.
The vision portion operating method may further comprise the steps of: generating and storing image data photographed toward the junction portion from inside the surgical robot; and comparing the stored image data in chronological order to generate parsing information concerning one or more of pupil position change, eye shape change, and gaze direction. The method may further comprise the steps of: judging whether the parsing information matches a change predetermined for some operational order; and, if it does, outputting the corresponding predetermined operational order.
The junction portion may be formed on the front surface of the console panel so as to be supported by an elastic body, and the elastic body may provide a restoring force so that, as soon as the external force applied by the movement of the contacting face is released, the junction portion returns to its original position.
According to another embodiment of the present invention, there is provided a surgical robot that controls one or more of the position and the image input angle of a vision portion using an operation signal, the robot including: a junction portion through which the image obtained by the vision portion is provided as visual information; an analysis processing unit that generates parsing information obtained by parsing the facial motion seen through the junction portion; and an operational order generating unit that judges whether the parsing information matches a change predetermined for some operational order and, if it does, outputs an operational order for operating the vision portion. The analysis processing unit may include: an image pickup unit that photographs the contacting face from inside the surgical robot and generates image data; a memory unit for storing the generated image data; and an analysis unit that compares, in chronological order, the positional changes of prescribed feature points in the sequentially generated image data and generates parsing information about the facial motion. The junction portion may be formed as part of the console panel of the surgical robot, and may be formed of a light-transmitting material so that the image obtained by the vision portion is visible as visual information.
According to another embodiment of the present invention, there is provided a vision portion operating method by which a surgical robot controls one or more of the position and the image input angle of a vision portion, the method comprising the steps of: generating parsing information obtained by parsing one or more of the pupil position change, eye shape change, and gaze direction seen through an eye-contact portion; judging whether the parsing information matches a change predetermined for some operational order; and, if it does, generating and outputting operational orders concerning one or more of the position and the image input angle of the vision portion. The step of generating the parsing information may include the steps of: generating and storing image data photographed toward the eye-contact portion from inside the surgical robot; and comparing the stored image data in chronological order to generate parsing information concerning one or more of pupil position change, eye shape change, and gaze direction.
According to another embodiment of the present invention, there is provided a vision portion operating method by which a surgical robot operates a vision portion, the method comprising the steps of: photographing a face and generating image data; parsing the interior angle and the rotational direction formed between the picture's center line and the line connecting two central points of the face contained in the image data, comparing them with the interior angle and rotational direction parsed from previously captured image data, and generating displacement information concerning the rotational direction and rotation angle of the face; and generating and outputting an operational order to operate the vision portion in correspondence with the displacement information. The step of generating the displacement information may include the steps of: calculating distance information between a reference point in the face parsed from the previously captured image data and the same reference point in the face parsed from the current image data; and including the calculated distance information in the displacement information for use in a parallel-movement operation of the vision portion.
The step of generating the displacement information may include the steps of: calculating the amount of change between the two-point spacing in the face parsed from the previously captured image data and the two-point spacing in the face parsed from the current image data; and including the amount of change between the two spacings in the displacement information for use in adjusting the image magnification of the vision portion.
The vision portion may be one or more of a laparoscope, thoracoscope, arthroscope, rhinoscope, cystoscope, proctoscope, duodenoscope, mediastinoscope, and cardioscope. The vision portion may be a device for obtaining a three-dimensional image, and the degree of left/right image overlap required for three-dimensional image processing may be adjusted according to the positions of the face and eyes.
The vision portion operating method may further comprise the steps of: calculating the degree of similarity between the facial feature elements contained in the image data and those in the photographic image of the authenticated user stored in advance in the storage unit; and, when the calculated similarity is greater than or equal to a predetermined value, performing control so that the generating and outputting of operational orders is executed. Here, the facial feature elements may be one or more of the position and shape of constituent parts, pupil color, facial shape, skin color, wrinkle shape, and blush, and the constituent parts may be one or more of the eyes, eyebrows, nose, and mouth.
The method may further comprise the steps of: judging whether the face contained in the image data satisfies valid-user information stored in advance; and, when it does, performing control so that the generating and outputting of operational orders is executed. The valid-user information may be stored in advance in the storage unit, and may be one or more of the area occupied by the facial contour in the image data and a reference value for the facial size contained in the image data.
The vision portion operating method may further comprise the steps of: judging whether the facial motion state contained in the image data is maintained for a prescribed time or longer; and, when it is, performing control so that the generating and outputting of operational orders is executed.
The method may further comprise the steps of: judging whether the degree of facial motion contained in the image data exceeds a preset range; and, when it does, performing control so that the generating and outputting of operational orders is executed.
The method may further comprise the steps of: parsing multiple frames of image data to generate change information on one or more of facial expression and head motion; and generating and outputting the corresponding operational order according to the operational order information that corresponds to the change information. The operational order information generated for changes in head motion and facial expression may be stored in advance in the storage unit.
In addition, according to another embodiment of the present invention, there is provided an immersive surgical image processing apparatus including: an image input unit that receives an endoscopic image provided from a surgical endoscope; a picture display part that outputs the endoscopic image to a specific region; and a screen display control unit that changes the specific region of the picture display part where the endoscopic image is output, in correspondence with the viewpoint of the surgical endoscope. Here, the surgical endoscope may be one or more of a laparoscope, thoracoscope, arthroscope, rhinoscope, cystoscope, proctoscope, duodenoscope, mediastinoscope, and cardioscope, or a stereo endoscope.
Here, the screen display control unit may include: an endoscope viewpoint tracking unit that tracks the viewpoint information of the surgical endoscope in correspondence with its movement and rotation; an image movement information extraction unit that extracts movement information of the endoscopic image using the viewpoint information of the surgical endoscope; and a picture position setting unit that uses the movement information to set the specific region of the picture display part where the endoscopic image is output. In addition, the screen display control unit may move the central point of the endoscopic image in correspondence with the change in the viewpoint coordinates of the surgical endoscope.
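The viewpoint-following display control described above can be sketched as shifting the on-screen center of the endoscopic image by the endoscope's viewpoint change. This is a hedged sketch: the screen dimensions, the units-to-pixels scale factor, and the clamping policy are all assumptions for illustration.

```python
# Assumed display geometry and mapping from endoscope viewpoint units to pixels.
SCREEN_W, SCREEN_H = 1920, 1080
IMAGE_W, IMAGE_H = 640, 480
PIXELS_PER_UNIT = 4.0

def shifted_center(center, viewpoint_delta):
    """Shift the image's on-screen center by the endoscope viewpoint change,
    clamped so the image region stays fully on the display."""
    cx = center[0] + viewpoint_delta[0] * PIXELS_PER_UNIT
    cy = center[1] + viewpoint_delta[1] * PIXELS_PER_UNIT
    cx = min(max(cx, IMAGE_W / 2), SCREEN_W - IMAGE_W / 2)
    cy = min(max(cy, IMAGE_H / 2), SCREEN_H - IMAGE_H / 2)
    return (cx, cy)

print(shifted_center((960, 540), (30, 0)))  # endoscope pans right -> (1080.0, 540.0)
```

Because the drawn region moves with the endoscope's viewpoint, the motion the user sees on screen mirrors the motion of the endoscope itself, which is the immersive effect the apparatus aims for.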
According to another embodiment of the present invention, there is provided an immersive surgical image processing apparatus including: an image input unit that receives a first endoscopic image and a second endoscopic image provided from a surgical endoscope at mutually different points in time; a picture display part that outputs the first endoscopic image and the second endoscopic image to mutually different regions; an image storage part for storing the first endoscopic image and the second endoscopic image; and a screen display control unit that controls the picture display part so that the first endoscopic image and the second endoscopic image are output to mutually different regions corresponding to the mutually different viewpoints of the surgical endoscope. Here, the image input unit may receive the first endoscopic image before the second endoscopic image, and the picture display part may output the first and second endoscopic images with one or more of chroma, brightness, color, and picture pattern made different from each other. In addition, the screen display control unit may further include a stored-image display unit that, while the picture display part outputs the second endoscopic image input in real time, extracts the first endoscopic image stored in the image storage part and outputs it to the picture display part.
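One simple way to make the stored first image visually distinguishable from the live second image, as described above, is to reduce its brightness. This sketch is an illustrative assumption: frames are represented as nested lists of 0-255 grayscale values, and the 0.5 dimming factor is not a value from the patent.

```python
# Assumed dimming factor applied to the stored (earlier) endoscopic frame.
DIM_FACTOR = 0.5

def dim_frame(frame, factor=DIM_FACTOR):
    """Return a brightness-reduced copy of a grayscale frame."""
    return [[int(pixel * factor) for pixel in row] for row in frame]

stored_frame = [[200, 100], [50, 255]]  # earlier image, shown dimmed
live_frame = [[180, 120], [60, 240]]    # real-time image, shown unmodified

print(dim_frame(stored_frame))  # [[100, 50], [25, 127]]
```

The same idea extends to the other differentiating attributes the passage mentions (chroma, color, or picture pattern): the stored frame is transformed before display while the live frame passes through unchanged.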
According to another embodiment of the present invention, there is provided an immersive surgical image processing apparatus comprising: an image input unit, which receives an endoscopic image provided by a surgical endoscope; a screen display unit, which outputs the endoscopic image to a specific region; an image storage unit, which stores a modeled image of the surgical tool that operates on the surgical object captured by the surgical endoscope; an image matching unit, which matches the endoscopic image and the modeled image with each other to generate an output image; and a screen display control unit, which changes, in correspondence with the viewpoint of the surgical endoscope, the specific region of the screen display unit to which the endoscopic image is output, and outputs the matched endoscopic image and modeled image to the screen display unit.
Here, the image matching unit may match the actual surgical-tool image contained in the endoscopic image with the modeled surgical-tool image contained in the modeled image to generate the output image.
In addition, the image matching unit may further include: a characteristic-value computation unit, which computes a characteristic value using one or more of the endoscopic image and the position coordinate information of the actual surgical tools coupled to one or more robotic arms; and a modeled-image implementation unit, which implements a modeled image corresponding to the characteristic value computed by the characteristic-value computation unit.
Here, the image matching unit may further include an overlapping-image processing unit, which removes from the modeled surgical-tool image the region where the modeled surgical-tool image and the actual surgical-tool image overlap; furthermore, the position of the modeled surgical-tool image output to the modeled image may be set using the operation information of the surgical tool.
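The overlap removal described above can be sketched as a simple mask-based composite. This is an illustrative assumption of one way the overlapping-image processing unit might operate, not the patented implementation; all names are hypothetical:

```python
def composite_tool_overlay(endo, model, model_mask, actual_mask):
    """Overlay the modeled surgical-tool image onto the endoscopic image,
    skipping every pixel where the modeled tool and the actual tool
    overlap, so the real tool seen by the endoscope is never covered."""
    out = [row[:] for row in endo]
    for y, row in enumerate(model_mask):
        for x, in_model in enumerate(row):
            if in_model and not actual_mask[y][x]:
                out[y][x] = model[y][x]
    return out

# 2x2 toy images: 0 = tissue pixel, 9 = modeled-tool pixel.
endo        = [[0, 0], [0, 0]]
model       = [[9, 9], [9, 9]]
model_mask  = [[1, 1], [0, 0]]   # where the modeled tool is drawn
actual_mask = [[1, 0], [0, 0]]   # where the actual tool is visible
merged = composite_tool_overlay(endo, model, model_mask, actual_mask)
```

In this toy example, only the modeled-tool pixel that does not coincide with the actual tool survives into the output.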
According to another embodiment of the present invention, there is provided an immersive surgical image processing apparatus comprising: an image input unit, which receives an endoscopic image provided by a surgical endoscope; a screen display unit, which outputs the endoscopic image; a screen drive unit, which rotates and moves the screen display unit; and a screen drive control unit, which controls the screen drive unit so that the screen drive unit rotates and moves the screen display unit in correspondence with the viewpoint of the surgical endoscope.
Here, the screen drive control unit may include: an endoscope viewpoint tracking unit, which tracks the viewpoint information of the surgical endoscope in correspondence with its movement and rotation; an image movement information extraction unit, which extracts movement information of the endoscopic image using the viewpoint information of the surgical endoscope; and an actuation information generating unit, which uses the movement information to generate screen actuation information for the screen display unit.
Here, the screen drive unit may be a drive unit one end of which is coupled to the screen display unit and which moves the screen display unit along a defined guide groove, or a drive unit in the form of a robotic arm, one end of which is coupled to the screen display unit and which moves and rotates the screen display unit in space.
Here, the screen display unit may include a dome-shaped screen and a projector that projects the endoscopic image onto the dome-shaped screen.
According to another embodiment of the present invention, there is provided an immersive surgical image processing method for a surgical image processing apparatus to output endoscopic images, comprising the steps of: receiving an endoscopic image provided by a surgical endoscope; outputting the endoscopic image to a specific region of a screen display unit; and changing, in correspondence with the viewpoint of the surgical endoscope, the specific region of the screen display unit to which the endoscopic image is output.
Here, the surgical endoscope may be one or more of a laparoscope, thoracoscope, arthroscope, rhinoscope, cystoscope, proctoscope, duodenoscope, mediastinoscope, cardioscope, or stereo endoscope.
In addition, the step of changing the specific region of the screen display unit may include the steps of: tracking the viewpoint information of the surgical endoscope in correspondence with its movement and rotation; extracting movement information of the endoscopic image using the viewpoint information of the surgical endoscope; and using the movement information to set the specific region of the screen display unit to which the endoscopic image is output.
Here, the step of changing the specific region of the screen display unit may include the step of moving the center point of the endoscopic image in accordance with changes in the viewpoint coordinates of the surgical endoscope.
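As a minimal sketch of the step just described — moving the image center point by the change in the endoscope's viewpoint coordinates — assuming a simple proportional mapping (the gain parameter is an assumption, not part of the specification):

```python
def shift_display_center(center, viewpoint_delta, gain=1.0):
    """Return a new center point for the displayed endoscopic image,
    moved by the change (dx, dy) in the endoscope's viewpoint
    coordinates, scaled by an assumed proportional gain."""
    cx, cy = center
    dx, dy = viewpoint_delta
    return (cx + gain * dx, cy + gain * dy)

# Endoscope viewpoint moved 10 right and 5 up -> image center follows.
new_center = shift_display_center((400, 300), (10, -5))
```

A real system would additionally clamp the center so the image region stays within the screen bounds.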
According to another embodiment of the present invention, there is provided an immersive surgical image processing method for a surgical image processing apparatus to output endoscopic images, comprising the steps of: receiving a first endoscopic image and a second endoscopic image provided by a surgical endoscope at mutually different points in time; outputting the first endoscopic image and the second endoscopic image to mutually different regions of a screen display unit; storing the first endoscopic image and the second endoscopic image; and controlling the screen display unit so that the first endoscopic image and the second endoscopic image are output to mutually different regions in correspondence with the mutually different viewpoints of the surgical endoscope.
Here, in the step of receiving the endoscopic images, the first endoscopic image may be received before the second endoscopic image, and in the output step, the first endoscopic image and the second endoscopic image may be output with one or more of their chroma, brightness, color, and screen pattern made different from each other.
In addition, the screen display unit control step may further include the step of, while the screen display unit is outputting the second endoscopic image received in real time, retrieving the first endoscopic image stored in the image storage unit and outputting it to the screen display unit.
According to another embodiment of the present invention, there is provided an immersive surgical image processing method for a surgical image processing apparatus to output endoscopic images, comprising the steps of: receiving an endoscopic image provided by a surgical endoscope; outputting the endoscopic image to a specific region of a screen display unit; storing a modeled image of the surgical tool that operates on the surgical object captured by the surgical endoscope; matching the endoscopic image and the modeled image with each other to generate an output image; and changing, in correspondence with the viewpoint of the surgical endoscope, the specific region of the screen display unit to which the endoscopic image is output, and outputting the matched endoscopic image and modeled image to the screen display unit.
Here, the step of generating the output image may match the actual surgical-tool image contained in the endoscopic image with the modeled surgical-tool image contained in the modeled image to generate the output image.
In addition, the step of generating the output image may further include the steps of: computing a characteristic value using one or more of the endoscopic image and the position coordinate information of the actual surgical tools coupled to one or more robotic arms; and implementing a modeled image corresponding to the computed characteristic value.
Here, the step of generating the output image may further include the step of removing from the modeled surgical-tool image the region where the modeled surgical-tool image and the actual surgical-tool image overlap.
Furthermore, the position of the modeled surgical-tool image output to the modeled image may be set using the operation information of the surgical tool.
According to another embodiment of the present invention, there is provided an immersive surgical image processing method for a surgical image processing apparatus to output endoscopic images, comprising the steps of: receiving an endoscopic image provided by a surgical endoscope; outputting the endoscopic image to a screen display unit; and rotating and moving the screen display unit in correspondence with the viewpoint of the surgical endoscope.
Here, the step of rotating and moving the screen display unit may include the steps of: tracking the viewpoint information of the surgical endoscope in correspondence with its movement and rotation; extracting movement information of the endoscopic image using the viewpoint information of the surgical endoscope; and using the movement information to generate actuation information for the screen display unit.
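One plausible reading of the actuation-information step above is a linear map from image movement (in pixels) to pan/tilt commands (in degrees) for the motorized display. The scale factor, sign convention, and command format below are assumptions for illustration only:

```python
def actuation_from_movement(dx_px, dy_px, deg_per_px=0.05):
    """Map endoscopic-image movement (pixels) to pan/tilt commands
    (degrees) for the motorized screen display unit.  The linear scale
    and the sign convention (image moving down -> screen tilting up)
    are illustrative assumptions."""
    return {"pan_deg": dx_px * deg_per_px,
            "tilt_deg": -dy_px * deg_per_px}

# Image drifted 100 px right and 40 px up between frames.
command = actuation_from_movement(100, -40)
```

The resulting command would be handed to the screen drive unit, which performs the physical rotation or translation.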
Here, the screen display unit may include a dome-shaped screen and a projector that projects the endoscopic image onto the dome-shaped screen.
According to another embodiment of the present invention, there is provided a recording medium on which is recorded a program readable by a digital processing device, comprising instructions executable by the digital processing device for performing the immersive surgical image processing method described above.
According to another embodiment of the present invention, there is provided an immersive surgical image processing apparatus comprising: an image input unit, which receives a first endoscopic image and a second endoscopic image provided at mutually different points in time by a surgical endoscope one end of which rotates; a screen display unit, which outputs the first endoscopic image and the second endoscopic image to mutually different regions; and a screen display control unit, which controls the screen display unit so that the first endoscopic image and the second endoscopic image are output to mutually different regions in correspondence with the mutually different viewpoints of the surgical endoscope.
Here, the surgical endoscope may rotate periodically, and the present embodiment may further include a rotation operation unit, which sets rotation-related information concerning the rotation of the one end of the surgical endoscope, the information being one or more of rotation direction, rotational angular velocity, acceleration/deceleration profile, rotational speed, rotation start time, rotation end time, rotation duration, and radius of rotation.
Here, the one end of the surgical endoscope may rotate so as to trace a closed figure, for example in a conical or polygonal-pyramid shape, and the rotational trajectory of the surgical endoscope may be any of a circle, an ellipse, a triangle, and a rectangle.
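The conical sweep described here can be illustrated by sampling tip positions on the circular trajectory that a cone-shaped rotation traces. This is a geometric sketch under assumed parameters (pivot at the apex, fixed shaft length, z axis chosen along the shaft), not the claimed control scheme:

```python
import math

def conical_trajectory(apex, half_angle_deg, shaft_len, steps=8):
    """Sample distal-tip positions for an endoscope pivoting at `apex`
    and sweeping a cone of the given half-angle: the tip traces a
    circle of radius shaft_len * tan(half_angle) one shaft length
    from the pivot along the z axis."""
    ax, ay, az = apex
    r = shaft_len * math.tan(math.radians(half_angle_deg))
    return [(ax + r * math.cos(2 * math.pi * i / steps),
             ay + r * math.sin(2 * math.pi * i / steps),
             az + shaft_len)
            for i in range(steps)]

points = conical_trajectory((0.0, 0.0, 0.0), 45.0, 1.0, steps=4)
```

An elliptical, triangular, or rectangular trajectory would replace the circular parameterization with the corresponding closed curve.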
In addition, the screen display control unit may include: a continuous-image generating unit, which extracts the overlapping region of the first endoscopic image and the second endoscopic image and generates a continuous image; and a peripheral-image generating unit, which extracts the non-overlapping regions of the first endoscopic image and the second endoscopic image and generates peripheral images.
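As a one-dimensional illustration of the continuous/peripheral split described above, assume two equal-width frames offset horizontally by a known pixel count; the function and its arguments are hypothetical simplifications of what the two generating units would do:

```python
def split_overlap(first, second, offset):
    """Split two equal-width scanlines, where `second` views a region
    shifted `offset` pixels to the right of `first`, into the
    overlapping part (continuous image) and the two non-overlapping
    parts (peripheral images)."""
    w = len(first)
    continuous = second[:w - offset]        # region seen in both frames
    peripheral = (first[:offset],           # visible only in the first
                  second[w - offset:])      # visible only in the second
    return continuous, peripheral

# Frames share the region [3, 4]; [1, 2] and [5, 6] are peripheral.
continuous, peripheral = split_overlap([1, 2, 3, 4], [3, 4, 5, 6], 2)
```

A full implementation would estimate the offset itself (e.g. by feature matching) and operate on two-dimensional frames.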
According to another embodiment of the present invention, there is provided an immersive surgical image processing apparatus comprising: a first image input unit, which receives a first endoscopic image from a surgical endoscope; a second image input unit, which receives a plurality of second endoscopic images provided at mutually different points in time by an auxiliary endoscope that is coupled to a side of the surgical endoscope and rotates about the surgical endoscope; a screen display unit, which outputs the first endoscopic image and the second endoscopic images to mutually different regions; and a screen display control unit, which controls the screen display unit so that the first endoscopic image and the second endoscopic images are output to mutually different regions in correspondence with the mutually different viewpoints of the surgical endoscope and the auxiliary endoscope.
Here, the auxiliary endoscope may rotate periodically, and the present embodiment may further include a rotation operation unit, which sets rotation-related information concerning the rotation of the one end of the auxiliary endoscope, the information being one or more of rotation direction, rotational angular velocity, acceleration/deceleration profile, rotational speed, rotation start time, rotation end time, rotation duration, and radius of rotation.
In addition, the screen display control unit may include: a continuous-image generating unit, which extracts the overlapping region of the first endoscopic image and the second endoscopic images and generates a continuous image; and a peripheral-image generating unit, which extracts the non-overlapping regions of the first endoscopic image and the second endoscopic images and generates peripheral images.
In addition, the screen display control unit may include: a continuous-image generating unit, which generates a continuous image from the first endoscopic image; and a peripheral-image generating unit, which extracts the second endoscopic images and generates peripheral images for the continuous image.
In addition, the auxiliary endoscope may be detachably coupled to the surgical endoscope.
According to another embodiment of the present invention, there is provided an immersive surgical image processing method, which is a method for a surgical image processing apparatus to output endoscopic images, comprising the steps of: receiving a first endoscopic image and a second endoscopic image provided at mutually different points in time by a surgical endoscope one end of which rotates; outputting the first endoscopic image and the second endoscopic image to mutually different regions of a screen display unit; and controlling the screen display unit so that the first endoscopic image and the second endoscopic image are output to mutually different regions in correspondence with the mutually different viewpoints of the surgical endoscope.
Here, the surgical endoscope may rotate periodically, and the present embodiment may further include the step of setting rotation-related information concerning the rotation of the one end of the surgical endoscope, the information being one or more of rotation direction, rotational angular velocity, acceleration/deceleration profile, rotational speed, rotation start time, rotation end time, rotation duration, and radius of rotation.
Here, the one end of the surgical endoscope may rotate so as to trace a closed figure, for example in a conical or polygonal-pyramid shape, and the rotational trajectory of the surgical endoscope may be any of a circle, an ellipse, a triangle, and a rectangle.
In addition, the step of controlling the screen display unit may include the steps of: extracting the overlapping region of the first endoscopic image and the second endoscopic image and generating a continuous image; and extracting the non-overlapping regions of the first endoscopic image and the second endoscopic image and generating peripheral images.
According to another embodiment of the present invention, there is provided an immersive surgical image processing method comprising the steps of: receiving a first endoscopic image from a surgical endoscope; receiving a plurality of second endoscopic images provided at mutually different points in time by an auxiliary endoscope that is coupled to a side of the surgical endoscope and rotates about the surgical endoscope; outputting the first endoscopic image and the second endoscopic images to mutually different regions; and controlling the screen display unit so that the first endoscopic image and the second endoscopic images are output to mutually different regions in correspondence with the mutually different viewpoints of the surgical endoscope and the auxiliary endoscope.
Here, the auxiliary endoscope may rotate periodically, and the present embodiment may further include the step of setting rotation-related information concerning the rotation of the one end of the auxiliary endoscope, the information being one or more of rotation direction, rotational angular velocity, acceleration/deceleration profile, rotational speed, rotation start time, rotation end time, rotation duration, and radius of rotation.
In addition, the step of controlling the screen display unit may include the steps of: extracting the overlapping region of the first endoscopic image and the second endoscopic images and generating a continuous image; and extracting the non-overlapping regions of the first endoscopic image and the second endoscopic images and generating peripheral images.
In addition, the step of controlling the screen display unit may include the steps of: generating a continuous image from the first endoscopic image; and extracting the second endoscopic images and generating peripheral images for the continuous image.
In addition, the present embodiment may further include the step of detachably coupling the auxiliary endoscope to the surgical endoscope.
Here, the screen display unit may include a dome-shaped screen and a projector that projects the endoscopic image onto the dome-shaped screen.
In addition, according to another embodiment of the present invention, there is provided a recording medium on which is recorded a program readable by a digital processing device, comprising instructions executable by the digital processing device for performing the immersive surgical image processing method described above.
Embodiments, features, and advantages other than those described above will become more apparent from the following drawings, the scope of the claims, and the detailed description of the invention.
Effects of the Invention
According to embodiments of the present invention, the position of the laparoscope and the image input angle can be controlled merely by the operator's act of looking toward the desired surgical site.
Furthermore, the operator does not need to perform a separate manipulation to operate the laparoscope, and can therefore concentrate fully on the surgical actions.
Furthermore, there is no need to learn a device operation method that relies on face recognition, so the operation method can be understood intuitively.
In addition, the endoscope apparatus, which would otherwise require the arms for its control, can be controlled in various ways solely by facial movement in three-dimensional space.
In addition, the immersive surgical image processing apparatus and method according to the present invention change the output position of the endoscopic image on the monitor viewed by the user in correspondence with the endoscope viewpoint, which changes as the endoscope moves during surgery, so that the user can experience the actual surgical situation more realistically.
In addition, the immersive surgical image processing apparatus and method according to the present invention retrieve an endoscopic image that was input and stored before the current time and output it to the screen display unit together with the current endoscopic image, thereby notifying the user of information about changes in the endoscopic image.
In addition, the immersive surgical image processing apparatus and method according to the present invention can match the endoscopic image actually captured by the endoscope during surgery with the modeled image of the surgical tool generated in advance and stored in the image storage unit, or correct the images, for example by adjusting their sizes, and then output them to a monitoring unit observable by the user.
In addition, the immersive surgical image processing apparatus and method according to the present invention rotate and move the monitor in accordance with the continuously changing viewpoint of the endoscope, so that the user can experience the realism of the surgery more vividly.
Brief Description of the Drawings
Fig. 1 is a plan view showing the overall structure of a surgical robot according to an embodiment of the present invention.
Fig. 2 is a conceptual diagram showing the master interface of a surgical robot according to an embodiment of the present invention.
Figs. 3 to 6 are schematic diagrams illustrating motion forms of the coupling portion according to an embodiment of the present invention.
Fig. 7 is a block diagram briefly showing the configuration of a laparoscope display unit for generating laparoscope manipulation commands according to an embodiment of the present invention.
Fig. 8 is a flowchart showing a laparoscope manipulation command transmission method according to an embodiment of the present invention.
Fig. 9 is a block diagram briefly showing the configuration of a laparoscope display unit for generating laparoscope manipulation commands according to another embodiment of the present invention.
Fig. 10 is a flowchart showing a laparoscope manipulation command transmission method according to another embodiment of the present invention.
Fig. 11 is a block diagram briefly showing the configuration of a laparoscope display unit for generating laparoscope manipulation commands according to yet another embodiment of the present invention.
Fig. 12 is a flowchart showing a laparoscope manipulation command transmission method according to yet another embodiment of the present invention.
Fig. 13 is a flowchart showing a laparoscope manipulation command transmission method according to yet another embodiment of the present invention.
Fig. 14 is a schematic diagram illustrating image display forms of the laparoscope display unit according to embodiments of the present invention.
Fig. 15 is a flowchart showing a laparoscope manipulation command transmission method according to yet another embodiment of the present invention.
Fig. 16 is a conceptual diagram showing the master interface of a surgical robot according to another embodiment of the present invention.
Fig. 17 is a block diagram briefly showing the configuration of a laparoscope manipulation unit according to another embodiment of the present invention.
Fig. 18 is a schematic diagram showing the movement concept of a laparoscope manipulation unit according to another embodiment of the present invention.
Figs. 19 and 20 are schematic diagrams respectively illustrating laparoscope manipulation and facial movement according to another embodiment of the present invention.
Fig. 21 is a flowchart showing the operating process of a laparoscope manipulation unit according to another embodiment of the present invention.
Fig. 22 is a flowchart illustrating in detail step 1610 of Fig. 21 according to another embodiment of the present invention.
Fig. 23 is a plan view showing the overall structure of a surgical robot according to embodiments of the present invention.
Fig. 24 is a conceptual diagram showing the master interface of a surgical robot according to a first embodiment of the present invention.
Fig. 25 is a block diagram showing a surgical robot according to the first embodiment of the present invention.
Fig. 26 is a block diagram showing an immersive surgical image processing apparatus according to the first embodiment of the present invention.
Fig. 27 is a flowchart relating to the immersive surgical image processing according to the first embodiment of the present invention.
Fig. 28 is a configuration diagram of an image output by the immersive surgical image processing method according to the first embodiment of the present invention.
Fig. 29 is a block diagram showing a surgical robot according to a second embodiment of the present invention.
Fig. 30 is a block diagram showing an immersive surgical image processing apparatus according to the second embodiment of the present invention.
Fig. 31 is a flowchart showing an immersive surgical image processing method according to the second embodiment of the present invention.
Fig. 32 is a configuration diagram of an image output by the immersive surgical image processing method according to the second embodiment of the present invention.
Fig. 33 is a block diagram showing a surgical robot according to a third embodiment of the present invention.
Fig. 34 is a block diagram showing an immersive surgical image processing apparatus according to the third embodiment of the present invention.
Fig. 35 is a flowchart showing an immersive surgical image processing method according to the third embodiment of the present invention.
Fig. 36 is a configuration diagram of an image output by the immersive surgical image processing method according to the third embodiment of the present invention.
Fig. 37 is a conceptual diagram showing the master interface of a surgical robot according to a fourth embodiment of the present invention.
Fig. 38 is a block diagram showing a surgical robot according to the fourth embodiment of the present invention.
Fig. 39 is a block diagram showing an immersive surgical image processing apparatus according to the fourth embodiment of the present invention.
Fig. 40 is a flowchart showing an immersive surgical image processing method according to the fourth embodiment of the present invention.
Fig. 41 is a conceptual diagram showing the master interface of a surgical robot according to a fifth embodiment of the present invention.
Fig. 42 is a block diagram showing an immersive surgical image processing apparatus according to a sixth embodiment of the present invention.
Fig. 43 is a flowchart showing an immersive surgical image processing method according to the sixth embodiment of the present invention.
Fig. 44 is a schematic diagram showing the rotating action of a surgical endoscope according to the sixth embodiment of the present invention.
Fig. 45 is a schematic diagram showing the rotating action of a surgical endoscope according to the sixth embodiment of the present invention.
Figs. 46 and 47 are schematic diagrams showing endoscopic images according to the sixth embodiment of the present invention.
Fig. 48 is a schematic diagram showing the rotating action of an auxiliary endoscope according to a seventh embodiment of the present invention.
Fig. 49 is a conceptual diagram showing the master interface of a surgical robot according to an eighth embodiment of the present invention.
Detailed Description of the Embodiments
The present invention may be variously modified and may have various embodiments; specific embodiments are illustrated and described in detail herein. However, the present invention is not limited to these embodiments, and should be understood to include all modifications, equivalents, and substitutes falling within the spirit and technical scope of the present invention. In describing the present invention, where a detailed description of a related known technology could obscure the gist of the present invention, that detailed description is omitted.
Terms such as "first" and "second" may be used to describe various constituent elements, but the constituent elements are not limited by these terms. These terms are used only to distinguish one constituent element from another.
The terms used in this application are intended only to describe specific embodiments and are not intended to limit the present invention. A singular expression includes the plural expression unless the context clearly indicates otherwise. In this application, terms such as "comprising" or "having" indicate the presence of the features, numbers, steps, operations, constituent elements, components, or combinations thereof described in the specification, and do not exclude the presence or addition of one or more other features, numbers, steps, operations, constituent elements, components, or combinations thereof. In addition, terms such as "...unit", "...device", and "module" used in the specification denote units that process at least one function or operation, which may be implemented in hardware, in software, or in a combination of hardware and software.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the description with reference to the drawings, identical or corresponding constituent elements are given the same reference numerals, and repeated description thereof is omitted.
In addition, in describing the different embodiments of the present invention, the embodiments should not be interpreted or implemented independently of one another; it should be understood that the characteristic elements and/or technical ideas described in each embodiment may be interpreted or implemented in combination with the other individually described embodiments.
In addition, it will be clear from the following description that the technical idea of the present invention can be widely applied to operations or experiments that use a vision unit such as an endoscope or a microscope. Moreover, the endoscope may be any of various types, such as a laparoscope, thoracoscope, arthroscope, rhinoscope, cystoscope, proctoscope, duodenoscope, mediastinoscope, or cardioscope. Hereinafter, for ease of description and understanding, the description proceeds with the case where the vision unit is a type of endoscope, namely a laparoscope.
Fig. 1 is a plan view showing the overall structure of a surgical robot according to an embodiment of the present invention, Fig. 2 is a conceptual diagram showing the master interface of the surgical robot according to an embodiment of the present invention, and Figs. 3 to 6 are schematic diagrams illustrating motion forms of the coupling portion according to an embodiment of the present invention.
Referring to Figs. 1 and 2, the surgical robot system includes: a slave robot 2, which performs surgery on a patient lying on the operating table; and a master robot 1, by which the operator remotely controls the slave robot 2. The master robot 1 and the slave robot 2 need not be physically independent, separate devices; they may be integrated into one body, in which case the master interface 4 may correspond to, for example, the interface portion of the integrated robot.
The master interface 4 of the master robot 1 includes a monitoring unit 6, a laparoscope display unit 20, and a master manipulator; the slave robot 2 includes robotic arms 3 and a laparoscope 5.
The monitoring unit 6 of the master interface 4 may be composed of one or more monitors, so that each monitor can individually display information required during surgery. Figs. 1 and 2 illustrate the case where one monitor 6 is included on each side of the laparoscope display part 20; the number of monitors may be set differently as needed depending on the type or kind of information to be displayed.
The monitoring unit 6 may, for example, output one or more items of the patient's biological information. In this case, one or more monitors of the monitoring unit 6 may output indexes showing the patient's condition, for example one or more vital signs such as body temperature, pulse, respiration and blood pressure; when multiple items are output, each item may be displayed in its own region. To supply this biological information to the master robot 1, the slave robot 2 may include a biological information measuring unit comprising one or more of a body-temperature measuring module, a pulse measuring module, a respiration measuring module, a blood-pressure measuring module, an electrocardiogram measuring module, and the like. The biological information measured by each module may be transmitted from the slave robot 2 to the master robot 1 in the form of an analog or digital signal, and the master robot 1 may display the received biological information through the monitoring unit 6.
The laparoscope display part (telescopic unit) 20 of the master interface 4 provides the operator with an image of the surgical site input through the laparoscope 5. The operator views the image through the eyepiece portion 220 formed in the junction portion 210 of the laparoscope display part 20, and manipulates the master manipulator to operate the robot arm 3 and the end effector, thereby performing surgery on the surgical site. Fig. 2 illustrates, as one example, a junction portion 210 formed as a panel; however, the junction portion 210 may also be formed as a recess in the inner side of the master interface 4. Fig. 2 also illustrates the case where the eyepiece portion 220, through which the operator views the image obtained by the laparoscope 5, is formed in the junction portion 210; however, when the junction portion 210 is made of a material through which the image behind it can be seen, the eyepiece portion 220 may be omitted. Such a junction portion 210, through which the operator can see the image behind it, may be formed of a transparent material, for example, or of a translucent material such as that of polarized glasses used for watching 3D IMAX films.
The laparoscope display part 20 is configured to function not only as a display device through which the operator checks the image from the laparoscope 5 via the eyepiece portion 220, but also as a control command input unit for controlling the position and image input angle of the laparoscope 5.
A plurality of supports 230, 240 protrude from the junction portion 210 of the laparoscope display part 20; the operator's face contacts or approaches the supports 230, 240 so that the operator's facial movement can be recognized. For example, the support 230 formed at the top contacts the operator's forehead and can be used to fix the forehead position, while the supports 240 formed at the sides contact the area below the operator's eyes (for example, the cheekbones) and can be used to fix the facial position. The positions and number of the supports illustrated in Fig. 2 are merely examples; the supports may take various positions and shapes, such as a jaw-fixing bracket or facial left/right brackets 290, and their number may also vary. The facial left/right brackets may, for example, be formed as strips or plates and shaped so that, when the face moves to the left or right, they support the junction portion 210 so that it moves in the corresponding direction.
With the supports 230, 240 formed in this way fixing the operator's facial position, if the operator turns his or her face in some direction while viewing the image from the laparoscope 5 through the eyepiece portion 220, that facial movement can be detected and used as input information for adjusting the position and/or image input angle of the laparoscope 5. For example, when the operator wants to check the left-hand part of the surgical site currently shown in the image (that is, the area to the left of the display screen), the operator only has to turn his or her head relatively to the left, and the laparoscope 5 can be controlled to perform the corresponding operation and output the image of that area.
That is, the junction portion 210 of the laparoscope display part 20 is coupled to the master interface 4 so that its position and/or angle changes together with the operator's facial movement. For this purpose, the master interface 4 and the junction portion 210 of the laparoscope display part 20 may be coupled to each other through a movable portion 250. The movable portion 250 may, for example, be formed of an elastic body so that the position and/or angle of the laparoscope display part 20 can change and, when the external force applied by the operator's facial movement is released, the laparoscope display part 20 can return to its original state. When the movable portion 250 is formed of an inelastic body, an original-state recovery unit (see Fig. 9) may instead be controlled to return the laparoscope display part 20 to its original state.
Through the movable portion 250, the junction portion 210 can be operated, within the three-dimensional space formed by the X, Y and Z axes, to move linearly or to rotate in any direction (for example, one or more of clockwise and counterclockwise) about a virtual center point and coordinate system. Here, the virtual center point may be any point or axis of the junction portion 210, for example its central point.
Figs. 3 to 6 exemplify motion forms of the junction portion 210.
When the direction of the operator's facial movement is parallel to the X, Y or Z axis, the junction portion 210 translates in the direction of the applied force, as shown in Fig. 3.
When the operator's facial movement is a rotation in the X-Y plane, the junction portion 210 rotates about the reference axis in the direction of the force applied by the facial movement, as illustrated in Fig. 5. Depending on the direction of the applied force, the junction portion 210 may rotate clockwise or counterclockwise.
When the force of the operator's facial movement is applied about two of the X, Y and Z axes, the junction portion 210 rotates about the two axes of the virtual center point to which the force is applied, as illustrated in Fig. 6.
In this way, the junction portion 210 translates vertically/horizontally and rotates as determined by the direction of the force applied by the facial movement, and, as described above, more than one of these motion forms may be combined.
The method and configuration by which the laparoscope display part 20 detects the operator's facial movement and generates the corresponding operation command are described in detail below with reference to the relevant drawings.
As illustrated in Figs. 1 and 2, the master interface 4 has a master manipulator which the operator grips and manipulates with both hands. The master manipulator may have two handles 10, or more; as the operator manipulates the handles 10, corresponding operation signals are transmitted to the slave robot 2 to control the robot arm 3. By manipulating the handles 10, the operator can make the robot arm 3 perform surgical actions such as position movement, rotation and cutting.
For example, the handles 10 may include a main handle and a sub handle. The operator may operate the slave robot arm 3, the laparoscope 5 and the like with the main handle alone, or may also manipulate the sub handle to operate multiple surgical instruments simultaneously in real time. The main handle and sub handle may have various mechanical structures depending on their manner of operation; for example, various input units such as a joystick, keyboard, trackball or touch screen may be used to actuate the robot arm 3 of the slave robot 2 and/or other surgical instruments.
The master manipulator is not limited to the shape of the handles 10; any form capable of controlling the action of the robot arm 3 over a network can be applied without restriction.
The master robot 1 and the slave robot 2 may be connected through a wired or wireless communication network, so that operation signals, the laparoscopic image input through the laparoscope 5, and the like can be transmitted to the other side. When multiple operation signals from the multiple handles 10 of the master interface 4 and/or an operation signal for adjusting the laparoscope 5 need to be transmitted at the same and/or a similar time, each operation signal can be transmitted to the slave robot 2 independently of the others. Here, transmitting each operation signal 'independently' means that the operation signals do not interfere with one another and that one operation signal does not affect another. To transmit multiple operation signals independently in this way, various schemes can be used: header information may be attached to each operation signal at the stage where it is generated, each operation signal may be transmitted in the order of its generation, or a priority may be predetermined for the transmission order of the operation signals and they may be transmitted in that order. In this case, the transmission path of each operation signal may also be provided independently, so that interference between the operation signals is inherently prevented.
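The header/priority scheme above can be sketched as follows. The tuple layout (priority, sequence number, source) and the priority values are illustrative assumptions, not part of the patent.

```python
def order_signals(signals):
    """Each operation signal carries a small header:
    (priority, sequence_no, source).

    Sorting by priority first and by generation order second lets signals
    from different handles and the laparoscope be dispatched without
    interfering with one another: every signal keeps its own identity and
    its place in the queue.
    """
    return [source for _, _, source in sorted(signals)]
```

For example, a laparoscope-adjustment signal tagged with a higher priority (lower number, by this assumption) would be dispatched before handle signals generated at the same moment.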
The robot arm 3 of the slave robot 2 can be driven with multiple degrees of freedom. The robot arm 3 may include, for example: a surgical tool inserted into the surgical site of the patient; a yaw drive unit that rotates the surgical tool in the yaw direction according to the surgical position; a pitch drive unit that rotates the surgical tool in the pitch direction, orthogonal to the rotational drive of the yaw drive unit; a transfer drive unit that moves the surgical tool in the lengthwise direction; a rotation drive unit that rotates the surgical tool; and a surgical tool drive unit located at the end of the surgical tool for incising or cutting the surgical lesion. The structure of the robot arm 3, however, is not limited to this, and it should be understood that the protection scope of the present invention is not limited to these illustrations. In addition, the actual control process by which the operator manipulates the handles 10 to rotate or move the robot arm 3 in the corresponding direction is somewhat removed from the gist of the invention, so a detailed description of it is omitted.
One or more slave robots 2 may be used to operate on the patient, and the laparoscope 5, which displays the surgical site as an image (i.e., a picture image) that the operator can confirm through the eyepiece portion 220, may also be implemented as an independent slave robot 2. In addition, as described above, the embodiments of the invention can be widely applied to operations using various surgical endoscopes other than the laparoscope (for example, a thoracoscope, arthroscope or rhinoscope).
Fig. 7 is a block diagram briefly showing the configuration of the laparoscope display part for generating laparoscope operation commands according to one embodiment of the invention, and Fig. 8 is a flow chart showing the laparoscope operation command transmission method according to one embodiment of the invention.
Referring to Fig. 7, the laparoscope display part 20 includes a motion detecting unit 310, an operation command generating unit 320 and a transmitting unit 330. The laparoscope display part 20 may also include elements allowing the operator to visually recognize, through the eyepiece portion 220, the image of the surgical site input through the laparoscope 5; but since these are somewhat removed from the gist of the invention, their description is omitted.
The motion detecting unit 310 outputs sensing information by detecting that the operator moves his or her face in a certain direction while the face is in contact with the supports 230 (and/or 240) of the junction portion 210. The motion detecting unit 310 may include a sensing unit for detecting the direction and magnitude (for example, distance) of the facial motion. Any sensing unit capable of detecting how far the junction portion 210 has moved in a certain direction may be used: for example, a sensor that detects how far the movable portion 250 supporting the junction portion 210 has stretched in a certain direction, or a sensor located inside the master robot 1 that detects how far a specified point formed on the inner surface of the junction portion 210 has approached and/or rotated.
The operation command generating unit 320 analyzes the direction and magnitude of the operator's facial movement using the sensing information received from the motion detecting unit 310, and generates, according to the analysis result, an operation command for controlling the position and image input angle of the laparoscope 5.
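Resolving a raw sensor reading into a direction and magnitude, as the operation command generating unit must do before deciding how far to move the laparoscope, can be sketched like this. The 2-D reading and the degree convention are assumptions for illustration.

```python
import math

def parse_face_motion(x, y):
    """Resolve a 2-D sensor displacement into a direction (degrees,
    counterclockwise from the +X axis) and a magnitude.

    The downstream command generator could scale the laparoscope's
    movement by the magnitude and choose its direction from the angle.
    """
    return math.degrees(math.atan2(y, x)), math.hypot(x, y)
```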
The transmitting unit 330 transmits the operation command generated by the operation command generating unit 320 to the slave robot 2, so that the position and image input angle of the laparoscope 5 are manipulated and the corresponding image is provided. The transmitting unit 330 may also be the transmitting unit that the master robot 1 has for transmitting operation commands for operating the robot arm 3.
Fig. 8 shows the laparoscope operation command transmission method according to one embodiment of the invention.
Referring to Fig. 8, in step 410, the laparoscope display part 20 detects the operator's facial movement and then proceeds to step 420, where it generates an operation command for operating the laparoscope 5 using the sensing information generated by detecting the facial movement.
Then, in step 430, the operation command for operating the laparoscope 5 generated in step 420 is transmitted to the slave robot 2.
Here, the operation command generated for operating the laparoscope 5 may also cause the master robot 1 to perform a specific action. For example, when a face rotation is detected and the laparoscope 5 is rotated accordingly, the direction of the operation handles of the master robot 1 may also be changed correspondingly while the rotation operation command is transmitted to the slave robot 2, so as to preserve the operator's intuition and surgical convenience. That is, if a rotation signal of the junction portion 210 is detected, the laparoscope 5 rotates according to the generated operation signal; at that point, the positions of the surgical tools seen in the image shown on the screen may no longer coincide with the current positions of the hands on the operation handles, in which case an action of moving the operation handle positions may be performed so that they coincide with the positions of the surgical tools shown on the screen. Such control of the operation handle directions is applicable not only when the junction portion 210 rotates; it can equally be applied to linear motion whenever the position/direction of the surgical tools shown on the screen does not coincide with the position/direction of the actual operation handles.
Fig. 9 is a block diagram briefly showing the configuration of the laparoscope display part for generating laparoscope operation commands according to another embodiment of the invention, and Fig. 10 is a flow chart showing the laparoscope operation command transmission method according to another embodiment of the invention.
Referring to Fig. 9, the laparoscope display part 20 may include a motion detecting unit 310, an operation command generating unit 320, a transmitting unit 330, a contact detecting unit 510 and an original-state recovery unit 520.
The motion detecting unit 310, the operation command generating unit 320 and the transmitting unit 330 have been described above with reference to Fig. 7, so their description is omitted. However, the motion detecting unit 310 may be enabled to act when it recognizes, from the sensing information of the contact detecting unit 510, that the operator's face is in contact with the supports 230 (and/or 240).
The contact detecting unit 510 outputs sensing information by detecting whether the operator's face is in contact with the supports 230 (and/or 240). For this purpose, a contact detecting sensor may, for example, be provided at the ends of the supports; in addition, various other detection schemes capable of detecting whether the face is in contact can also be applied.
The original-state recovery unit 520, when it recognizes from the sensing information of the contact detecting unit 510 that the contact between the operator's face and the supports 230 (and/or 240) has ended, controls the motor drive unit 530 to return the junction portion 210 to its original state. The original-state recovery unit 520 may be included in the motor drive unit 530 described below.
Fig. 9 illustrates the motor drive unit 530, which uses a motor, as the power unit for returning the junction portion 210 to its original state, but the power unit for achieving the same purpose is not limited to this. For example, the junction portion 210 may also be returned to its original state by various methods such as air pressure or hydraulic pressure.
The original-state recovery unit 520 may, for example, control the motor drive unit 530 using information about the normal state (i.e., position and/or angle) of the junction portion 210, or may use the direction and magnitude of the facial movement analyzed by the operation command generating unit 320 to control the motor drive unit 530 to operate in the opposite direction and by the same magnitude, so that the junction portion 210 returns to its original position.
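The second recovery strategy, replaying the analyzed motions in reverse, can be sketched as follows. The (axis, delta) log format is an assumption for illustration.

```python
def recovery_commands(motion_log):
    """Given the logged face motions as (axis, delta) pairs, return the
    motor commands that undo them: same axes, reversed order, negated
    deltas, so the junction portion retraces its path back to the
    original position."""
    return [(axis, -delta) for axis, delta in reversed(motion_log)]
```

Undoing the last motion first means the junction portion unwinds its displacement exactly as it accumulated, which avoids path-dependent errors when rotations are involved.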
For example, suppose the operator has rotated his or her face (the junction portion 210 also moving or rotating correspondingly) in order to check or capture a position different from the surgical site currently shown in the image, so that the laparoscope 5 has been operated correspondingly in that direction. If the end of the face's contact with the junction portion 210 is then detected, the original-state recovery unit 520 can control the motor drive unit 530 to return the junction portion 210 to the normal state designated as the default.
The motor drive unit 530 may include a motor that rotates under the control of the original-state recovery unit 520; the motor drive unit 530 and the junction portion 210 are coupled to each other so that the state (i.e., position and/or angle) of the junction portion 210 can be adjusted through the rotation of the motor. The motor drive unit 530 may be housed inside the master interface 4. The motor included in the motor drive unit 530 may, for example, be a spherical motor capable of multiple-degree-of-freedom motion; to eliminate the limits of the tilt angle, the supporting structure of the spherical motor may be composed of ball bearings and a spherical rotor, or may be configured as a frame structure with three degrees of freedom for supporting the spherical rotor.
Even when the junction portion 210 returns to its original state through the action of the above-described elements, the operation command generating unit 320 does not generate or transmit an operation command for that return, so the image input and output through the laparoscope 5 does not change. Accordingly, the operator can continue checking the image of the laparoscope 5 through the eyepiece portion 220 while maintaining the consistency of the surgery.
So far, the case in which the junction portion 210 of the laparoscope display part 20 is returned to its original state by operating the motor drive unit 530 has been described with reference to Fig. 9; however, when the external force applied by the operator's facial movement is released, the junction portion 210 may also be returned to its original state by a movable portion 250 made of a springy elastic material. When the junction portion 210 is returned to its original state by the spring, no operation signal for operating the laparoscope 5 is generated either.
Fig. 10 illustrates the laparoscope operation command transmission method according to another embodiment of the invention.
Referring to Fig. 10, in step 410, the laparoscope display part 20 detects the operator's facial movement and then proceeds to step 420, where it generates an operation command for operating the laparoscope 5 using the sensing information generated by detecting the facial movement. Afterwards, in step 430, the operation command for operating the laparoscope 5 generated in step 420 is transmitted to the slave robot 2.
Then, in step 610, the laparoscope display part 20 judges whether the contact between the operator and the junction portion 210 has been released. If the contact state is maintained, the flow returns to step 410; if the contact state has been released, the flow proceeds to step 620 to control the junction portion 210 to return to its original position.
Fig. 11 is a block diagram briefly showing the configuration of the laparoscope display part for generating laparoscope operation commands according to yet another embodiment of the invention.
Referring to Fig. 11, the laparoscope display part 20 may include a contact detecting unit 510, an imaging unit 710, a storage unit 720, an eye tracking unit 730, an operation command generating unit 320, a transmitting unit 330 and a control unit 740.
The contact detecting unit 510 outputs sensing information by detecting whether the operator's face is in contact with the supports 230 (and/or 240) protruding from the junction portion 210.
When contact between the operator's face and the junction portion 210 is detected from the sensing information of the contact detecting unit 510, the imaging unit 710 captures images of the operator's eyes in real time. The imaging unit 710 may be arranged inside the master interface 4 so that it can photograph the operator's eyes as seen through the eyepiece portion 220. The images of the operator's eyes captured by the imaging unit 710 are stored in the storage unit 720 so that the eye tracking unit 730 can perform eye tracking processing.
Any image captured by the imaging unit 710 that allows the eye tracking unit 730 to perform eye tracking processing may be used; the image may also be stored in the storage unit 720 after the preprocessing required by the eye tracking unit 730 has been performed. The image generation methods and the types of images generated for eye tracking processing will be apparent to those skilled in the art, so their description is omitted.
The eye tracking unit 730 comparatively analyzes, in chronological order, the images stored in the storage unit 720 in real time or at a specified period, analyzes changes in the position of the operator's pupils and their gaze direction, and outputs the analysis information. In addition, the eye tracking unit 730 may also analyze the appearance of the eyes (for example, blinking) and output analysis information about it.
The operation command generating unit 320 generates, with reference to the analysis information of the eye tracking unit 730, an operation command for controlling the position and/or image input angle of the laparoscope 5 so that it changes correspondingly when the operator's gaze direction changes. In addition, if a change in eye appearance corresponds to a predetermined command input, the operation command generating unit 320 may also generate the corresponding operation command. For example, the commands designated for changes in eye appearance may be predefined such that, say, the laparoscope 5 approaches the surgical site when the right eye blinks twice in succession and rotates clockwise when the left eye blinks twice in succession.
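The double-blink command lookup described above can be sketched as follows. The one-second window and the command names are assumptions; only the right-double-blink/left-double-blink pairing comes from the text.

```python
BLINK_COMMANDS = {
    ("right", "right"): "APPROACH_SITE",  # two right-eye blinks in a row
    ("left", "left"): "ROTATE_CW",        # two left-eye blinks in a row
}

def blink_to_command(blinks, window=1.0):
    """blinks: chronological list of (timestamp_s, eye) events.

    Return the command for the first pair of same-eye blinks that fall
    within `window` seconds of each other, or None when no pattern
    matches (isolated blinks are ignored as ordinary blinking).
    """
    for (t0, e0), (t1, e1) in zip(blinks, blinks[1:]):
        if t1 - t0 <= window and (e0, e1) in BLINK_COMMANDS:
            return BLINK_COMMANDS[(e0, e1)]
    return None
```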
The transmitting unit 330 transmits the operation command generated by the operation command generating unit 320 to the slave robot 2, so that the position and image input angle of the laparoscope 5 are manipulated and the corresponding image is provided. The transmitting unit 330 may also be a transmitting unit provided in the master robot 1 for transmitting operation commands for operating the robot arm 3.
The control unit 740 controls each of the elements to perform the specified actions.
So far, the laparoscope display part 20 that recognizes and processes pupil movement using eye tracking technology has been described with reference to Fig. 11. The invention, however, is not limited to this; the laparoscope display part 20 may also be implemented so as to detect, recognize and process the motion of the operator's face itself. As one example, a face image may be captured by the imaging unit 710, and if an analysis processing unit, in place of the eye tracking unit 730, analyzes the positions and changes of specified points in the captured image (for example, one or more of the positions of the two eyes, the shape of the nose and the position of the philtrum), the operation command generating unit 320 generates the corresponding operation command.
Fig. 12 is a flow chart showing the laparoscope operation command transmission method according to yet another embodiment of the invention.
Referring to Fig. 12, in step 810, when the laparoscope display part 20 detects contact of the operator's face through the contact detecting unit 510, it activates the imaging unit 710 to generate digital image data of the operator's eyes seen through the eyepiece portion 220, and stores the data in the storage unit 720.
In step 820, the laparoscope display part 20 comparatively judges, in real time or at a specified period, the digital image data stored in the storage unit 720 and generates analysis information about changes in the position of the operator's pupils and their gaze direction. In the comparative judgment, the laparoscope display part 20 allows for a certain error, so a change of positional information within a prescribed range may be recognized as the pupil position not having changed.
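The error allowance in the comparative judgment can be sketched as a simple distance test. The pixel tolerance value is an illustrative assumption.

```python
def pupil_changed(prev, curr, tolerance=3.0):
    """Report a pupil position change only when the displacement between
    two samples (x, y) exceeds `tolerance` pixels; smaller moves fall
    within the allowed error and are treated as no change, so sensor
    noise does not register as gaze movement."""
    dx, dy = curr[0] - prev[0], curr[1] - prev[1]
    return (dx * dx + dy * dy) ** 0.5 > tolerance
```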
In step 830, the laparoscope display part 20 judges whether the operator's changed gaze direction has been maintained for a predetermined critical time or longer.
If the changed gaze direction has been maintained for the critical time or longer, then in step 840 the laparoscope display part 20 generates an operation command for operating the laparoscope 5 so that the image of the corresponding position can be received, and transmits it to the slave robot 2. Here, the critical time may be set so that the laparoscope 5 is not operated by a tremor of the operator's pupils or by pupil movements made to survey the surgical site as a whole; the time value may be set experimentally or statistically, or set by the operator.
If the changed gaze direction has not been maintained for the critical time or longer, the flow returns to step 810.
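The critical-time filter described above is in essence a dwell-time check; a minimal sketch follows. The sample format and command naming are assumptions.

```python
def dwell_command(samples, critical_time):
    """samples: chronological (timestamp_s, gaze_direction) pairs.

    Issue a move command only when the same gaze direction persists for
    at least `critical_time` seconds, so pupil tremor and quick survey
    glances across the surgical site do not move the laparoscope.
    """
    if not samples:
        return None
    start, direction = samples[0]
    for t, d in samples[1:]:
        if d != direction:
            start, direction = t, d       # gaze moved: restart the clock
        elif t - start >= critical_time:
            return "MOVE_" + direction
    return None
```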
Fig. 13 is a flow chart showing the laparoscope operation command transmission method according to yet another embodiment of the invention, and Fig. 14 is a schematic diagram illustrating the image display format of the laparoscope display part according to embodiments of the invention.
Referring to Fig. 13, in step 810, when the laparoscope display part 20 detects contact of the operator's face through the contact detecting unit 510, it activates the imaging unit 710 to generate digital image data of the operator's eyes seen through the eyepiece portion 220, and stores the data in the storage unit 720.
In step 820, the laparoscope display part 20 comparatively judges, in real time or at a specified period, the digital image data stored in the storage unit 720 and generates analysis information about changes in the position of the operator's pupils and their gaze direction.
In step 910, the laparoscope display part 20 judges whether the operator's gaze direction corresponds to a predetermined setting position.
Fig. 14 illustrates the image display format of the laparoscope display part 20.
As illustrated in Fig. 14, the operator can recognize, through the eyepiece portion 220, the captured image 1010 provided by the laparoscope 5, and this image may include the surgical site and the instruments 1020. In addition, the image of the laparoscope display part 20 may display the operator's gaze position 1030 in an overlapping manner, and may also display the setting positions together.
The setting positions may include one or more of the outline edges 1040, a first rotation indicating position 1050 and a second rotation indicating position 1060. For example, when the operator gazes at the edge of the outline 1040 in a certain direction for the critical time or longer, the laparoscope 5 may be controlled to move in that direction. That is, if the operator gazes at the left side of the outline 1040 for the critical time or longer, the laparoscope 5 may be controlled to move to the left in order to capture the area located to the left of the currently displayed position. Likewise, if the operator gazes at the first rotation indicating position 1050 for the critical time or longer, the laparoscope may be controlled to rotate counterclockwise, and if the operator gazes at the second rotation indicating position 1060 for the critical time or longer, the laparoscope is controlled to rotate clockwise.
Referring again to Fig. 13, when the operator's gaze position is not one of the above setting positions, the flow returns to step 810. However, when the operator's gaze position is one of the above setting positions, the flow proceeds to step 920, where it is judged whether the operator's gaze has been maintained for the predetermined critical time or longer.
If the operator's gaze at the setting position has been maintained for the critical time or longer, then in step 930 the laparoscope display part 20 generates an operation command for making the laparoscope 5 operate according to the command designated for that setting position, and transmits it to the slave robot 2.
However, if the operator's gaze at the setting position has not been maintained for the critical time or longer, the flow returns to step 810.
Fig. 15 is a flow chart showing the laparoscope operation command transmission method according to yet another embodiment of the invention.
Referring to Fig. 15, in step 810, when the laparoscope display part 20 detects contact of the operator's face through the contact detecting unit 510, it activates the imaging unit 710 to generate digital image data of the operator's eyes seen through the eyepiece portion 220, and stores the data in the storage unit 720.
In step 1110, the laparoscope display part 20 comparatively judges, in real time or at a specified period, the image information stored in the storage unit 720 and generates analysis information about changes in the appearance of the operator's eyes. For example, the analysis information may be information such as how many times the operator's eyes blinked within a prescribed time and, when blinking occurred, which eye blinked.
In step 1120, the laparoscope display unit 20 judges whether the analysis information about the eye-appearance change satisfies a predetermined condition. For example, conditions based on eye-appearance changes may be preset, such as whether the right eye has blinked twice in succession within a specified time, or whether the left eye has blinked twice in succession within a specified time.
If the analysis information about the eye-appearance change satisfies the predetermined condition, the process proceeds to step 1130, where an operation command for operating the laparoscope 5 is generated according to the command assigned to the satisfied condition and transmitted to the slave robot 2. For example, commands corresponding to eye-appearance changes may be preassigned, such as moving the laparoscope 5 closer to the operative site when the right eye blinks twice in succession, and rotating it clockwise when the left eye blinks twice in succession.
If, however, the analysis information about the eye-appearance change does not satisfy the predetermined condition, the process proceeds to step 910.
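The condition table of steps 1110 to 1130 — blink patterns preassigned to laparoscope commands — can be sketched as a simple lookup. The specific patterns, the time window, and the command names are illustrative assumptions.

```python
# Illustrative blink-pattern table; patterns and command names are
# assumptions standing in for whatever an implementation would preassign.
BLINK_COMMANDS = {
    ("right", "right"): "APPROACH_SITE",  # right eye blinks twice in a row
    ("left", "left"): "ROTATE_CW",        # left eye blinks twice in a row
}

def match_blink_command(blink_events, window_s=1.5):
    """blink_events: list of (timestamp, eye) tuples produced by the
    eye-appearance analysis of step 1110. Returns a command when the last
    two blinks form a known pattern inside the time window, else None."""
    if len(blink_events) < 2:
        return None
    (t1, e1), (t2, e2) = blink_events[-2], blink_events[-1]
    if t2 - t1 > window_s:
        return None  # blinks too far apart to count as one pattern
    return BLINK_COMMANDS.get((e1, e2))
```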
Figure 16 is a conceptual diagram showing the master interface of a surgical robot according to another embodiment of the present invention.
Referring to Figure 16, the master interface 4 of the master robot 1 includes a monitor unit 6, a master manipulator, and an imaging unit 1210. Although not illustrated, the slave robot 2 may include a robot arm 3 and a laparoscope 5 as described above.
As illustrated, the monitor unit 6 of the master interface 4 may be composed of one or more monitors, and the information required during surgery (for example, the image captured by the laparoscope 5 and the patient's biometric information) may be displayed individually on each monitor. Naturally, the number of monitors may be set differently according to the type or kind of information to be displayed.
The patient's biometric information displayed by the monitor unit 6 (for example, pulse, respiration, blood pressure, and body temperature) may also be output divided into regions; this information is measured by a biometric information measuring unit provided in the slave robot 2 and then supplied to the master robot 1.
The imaging unit 1210 is a unit that captures the operator's appearance (for example, the facial region) in a contactless manner. The imaging unit 1210 may be realized, for example, by a camera device including an image sensor.
The image captured by the imaging unit 1210 is supplied to the laparoscope operation unit 1200 (see Figure 17). By analyzing the image, the laparoscope operation unit 1200 obtains information about the amount of change of the subject, and uses it to control the laparoscope 5 to perform rotation and movement operations or to enlarge/reduce the image.
In addition, the master interface 4 may have a master manipulator that the operator grips and operates with both hands. The master manipulator may have two or more handles 10; when the operator manipulates a handle 10, a corresponding operation signal is transmitted to the slave robot 2, thereby controlling the robot arm 3. By manipulating the handle 10, the operator can perform surgical actions with the robot arm 3 such as position movement, rotation, and cutting.
For example, the handle 10 may include a main handle and a sub-handle. Using the main handle alone, the operator may operate the multi-degree-of-freedom robot arm 3 or laparoscope 5 of the slave robot 2, or the sub-handle may additionally be operated so that a plurality of surgical devices are operated in real time at the same time. Naturally, the master manipulator is not limited to a handle shape; any form capable of controlling the action of the robot arm 3 over a network may be applied without restriction.
One or more slave robots 2 may be used to operate on the patient, and the laparoscope 5, which displays the operative site through the display unit 6 as a recognizable display image, may also be realized as an independent slave robot 2. As described above, the embodiments of the present invention may be widely applied to surgery using various surgical endoscopes other than a laparoscope (for example, a thoracoscope, an arthroscope, a rhinoscope, and the like).
Figure 17 is a block diagram briefly showing the composition of a laparoscope operation unit according to another embodiment of the present invention; Figure 18 is a schematic diagram showing the operating concept of the laparoscope operation unit according to another embodiment of the present invention; and Figures 19 and 20 are schematic diagrams exemplifying the facial movements used to operate the laparoscope according to another embodiment of the present invention.
Referring to Figure 17, the laparoscope operation unit 1200 includes a storage unit 1220, an angle and distance calculating unit 1230, an operation command generating unit 1240, and a transmitting unit 1250. One or more of the illustrated components may be realized by a software program, or by a combination of hardware and a software program. Furthermore, one or more of the illustrated components may be omitted.
The storage unit 1220 stores the captured images taken by the imaging unit 1210 and the displacement information calculated by the angle and distance calculating unit 1230.
The displacement information may include: information about the interior angle between the extension line of the two eyes and the picture centerline (i.e., the horizontal line passing through the transverse and vertical center point of the captured image), calculated from two captured images within the calculation cycle, together with the associated direction of rotation; information about the moving distance of a reference point on the face; and information about the change in spacing between the center points of the two eyes.
Naturally, the calculation of the displacement information and the like is not limited to using two consecutive captured images; the currently captured image and any previously captured image may be used. For example, when the captured images of the (n-3)th, (n-2)th, and (n-1)th captures and the captured image of the current nth capture exist and the analysis cycle is every three captured images, the displacement information and the like may be calculated using the (n-3)th captured image and the nth captured image. Although separate illustration is omitted, this principle applies not only to the displacement information but equally to the calculation of angles, distances, and the like. In this specification, however, for ease of explanation and understanding, the analysis of captured images for generating specific information is described on the assumption that the currently captured image and the previously captured image are used.
The angle and distance calculating unit 1230 uses two consecutive captured images within the calculation cycle (i.e., the currently captured image and the previously captured image), among the images captured by the imaging unit 1210 and stored in the storage unit 1220, to generate the displacement information between the captured images, and stores it in the storage unit 1220.
Here, using the two consecutive captured images, the angle and distance calculating unit 1230 may generate information about the interior angle between the extension line of the two eyes and the picture centerline together with the direction of rotation, information about the moving distance of a reference point on the face, information about the change in spacing between the center points of the two eyes, and the like.
Below, the method by which the angle and distance calculating unit 1230 generates the displacement information is described. However, image analysis techniques for recognizing the facial region, the eye positions, and the position of a reference point (for example, the center of the nose) by analyzing a captured image will be apparent to those skilled in the art, and a detailed description thereof is therefore omitted.
First, to generate information about facial rotation, as shown in Figure 19, the angle and distance calculating unit 1230 calculates the size of the interior angle between the extension line of both eyes and the picture centerline in the captured image taken by the imaging unit 1210 and stored in the storage unit 1220, and compares it with the interior angle analyzed for the previous captured image, thereby generating information about the direction of rotation and the angle of rotation.
That is, as illustrated in Figures 19(b) and (c), the positions and shapes of both eyes in the captured image generated by the imaging unit 1210 are recognized by an image processing technique such as edge detection; accordingly, after obtaining the center point of each eye, the size of the interior angle formed between the virtual straight line connecting the eye center points and the picture centerline, and the direction of facial rotation, are calculated. That is, if the previously generated captured image is the original image of (a) and the currently generated captured image is the image of (b), then by comparing, between the captured images, the position at which the straight line connecting the two eye center points intersects the picture centerline and the size of the corresponding interior angle, it can be recognized in which direction and by what angle the face has rotated.
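The interior-angle computation described above can be sketched as follows, assuming the eye centers are available as pixel coordinates. The sign convention for "clockwise" is an assumption (image coordinates, with y increasing downward).

```python
import math

def eye_line_angle(left_eye, right_eye):
    """Signed angle (degrees) between the line joining the two eye centers
    and the horizontal picture centerline. Eye centers are (x, y) pixels."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return math.degrees(math.atan2(dy, dx))

def head_rotation(prev_eyes, curr_eyes):
    """Rotation of the face between two captured images, as the change in
    the eye-line angle. prev_eyes/curr_eyes are (left_eye, right_eye)
    pairs; the cw/ccw labeling is a sign-convention assumption."""
    delta = eye_line_angle(*curr_eyes) - eye_line_angle(*prev_eyes)
    direction = "cw" if delta > 0 else "ccw" if delta < 0 else "none"
    return delta, direction
```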
Second, to generate information about parallel movement of the face, the angle and distance calculating unit 1230 detects, in the captured image taken by the imaging unit 1210 and stored in the storage unit 1220, how far a predetermined reference point on the face (for example, the center of the nose) has moved from the transverse and vertical center point of the captured image.
That is, as illustrated in Figure 19(d), in the original captured image shown by the dashed line, the reference point coincides with the center point of the captured image, whereas in the current captured image it can be recognized that the reference point has moved in parallel a certain distance to the right from the center point of the captured image. Naturally, the direction of parallel movement may be any of various directions, such as up and down.
Next, to generate information for enlarging or reducing the picture, the angle and distance calculating unit 1230 calculates the spacing between both eyes (d1, d2, or d3 in Figure 20) in the captured image taken by the imaging unit 1210 and stored in the storage unit 1220, and further computes whether the calculated distance has increased relative to the inter-eye distance in the previously stored captured image (the operator's face has moved toward the imaging unit 1210) or decreased (the operator's face has moved away from the imaging unit 1210). The inter-eye distance may be defined in various ways, for example as the distance between the center points of the eyes, or as the shortest distance between the outer contours of the eyes.
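The spacing comparison of Figure 20 can be sketched as follows, using the first definition above (the distance between the eye center points). The deadband ratio is an added assumption, included only to absorb measurement noise.

```python
import math

def eye_spacing(left_eye, right_eye):
    """Distance between the two eye center points, one of the inter-eye
    distance definitions mentioned in the text."""
    return math.hypot(right_eye[0] - left_eye[0], right_eye[1] - left_eye[1])

def zoom_from_spacing(prev_spacing, curr_spacing, deadband=0.05):
    """Classify approach/withdrawal of the face from the change in eye
    spacing (d1/d2/d3 in Figure 20). The deadband is an assumption."""
    if prev_spacing <= 0:
        return "none"
    ratio = curr_spacing / prev_spacing
    if ratio > 1 + deadband:
        return "zoom_in"   # face moved toward the imaging unit
    if ratio < 1 - deadband:
        return "zoom_out"  # face moved away from the imaging unit
    return "none"
```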
The operation command generating unit 1240 uses the displacement information calculated by the angle and distance calculating unit 1230 to generate operation commands for controlling the position of the laparoscope 5 (for example, movement and rotation) and the image magnification (for example, enlargement and reduction).
If it is recognized that the operator's face has rotated in a certain direction, the operation command generating unit 1240 generates an operation command for rotating the laparoscope 5; if it is recognized that the operator's face has moved in parallel in a certain direction, the operation command generating unit 1240 generates an operation command for moving the laparoscope 5 in the corresponding direction by the corresponding distance; and if it is recognized that the operator's face has moved toward or away from the imaging unit 1210, the operation command generating unit 1240 generates an operation command for enlarging or reducing the magnification of the display image generated by the laparoscope 5.
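The mapping performed by the operation command generating unit 1240 can be sketched as a function turning the three kinds of displacement information into commands. All thresholds and command names here are illustrative assumptions.

```python
def commands_from_displacement(rotation_deg, translation_px, spacing_ratio,
                               angle_eps=2.0, move_eps=5.0, zoom_eps=0.05):
    """Sketch of the operation command generating unit: rotation angle,
    reference-point translation (dx, dy), and inter-eye spacing ratio are
    turned into laparoscope commands. Thresholds are assumptions that
    suppress commands for negligible motion."""
    cmds = []
    if abs(rotation_deg) > angle_eps:
        cmds.append(("ROTATE", rotation_deg))          # face rotated
    dx, dy = translation_px
    if abs(dx) > move_eps or abs(dy) > move_eps:
        cmds.append(("MOVE", dx, dy))                  # face moved in parallel
    if spacing_ratio > 1 + zoom_eps:
        cmds.append(("ZOOM_IN",))                      # face approached
    elif spacing_ratio < 1 - zoom_eps:
        cmds.append(("ZOOM_OUT",))                     # face withdrew
    return cmds
```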
The transmitting unit 1250 transmits the operation command generated by the operation command generating unit 1240 to the slave robot 2, so that the position and the like of the laparoscope 5 are operated to provide the corresponding image. The transmitting unit 1250 may also be the transmitting unit provided in the master robot 1 for transmitting the operation signals that operate the robot arm 3.
In addition, although not shown in Figure 17, a judging unit may further be included, which judges whether the user in the captured image currently taken by the imaging unit 1210 is an authenticated user.
That is, the judging unit judges whether the facial appearance in the currently captured image matches, within an error range, the facial appearance in a captured image stored in advance in the storage unit 1220 as that of an authenticated user, and laparoscope operation in the manner described above may be performed only when they match. Naturally, in determining whether the user is authenticated, the judging unit may use not only the characteristic element of facial appearance but also one or more of characteristic elements such as the positions and shapes of the eyes/eyebrows/nose/mouth, pupil color, skin color, wrinkles, and blushes.
By adding the judging unit in this way, even if many people appear in the captured image taken by the imaging unit 1210, the laparoscope operates only in conjunction with changes in the operator's facial appearance.
Moreover, the judging unit may further determine whether to allow the laparoscope operation unit 1200 to act by judging whether the facial appearance in the captured image is located within a predetermined region, or whether the facial size is greater than or equal to a size predetermined for the captured image. Naturally, instead of judging whether the facial size is greater than or equal to a predetermined size, the judging unit may judge whether the facial size meets a predetermined size standard. This is because misoperation may also be caused when the operator's face comes too close.
For example, as illustrated in Figure 18, the judging unit may determine whether the laparoscope operation unit 1200 is to act by judging whether the face is located within a predetermined region 1320 of the display region 1310 in which the captured image taken by the imaging unit 1210 is shown. Naturally, it need not be required that the face be entirely contained in the predetermined region 1320; it may instead be predetermined that it suffices for part of the face to be contained in the predetermined region 1320.
If the face is located within the predetermined region 1320, the laparoscope operation unit 1200 may be allowed to act; otherwise, the laparoscope operation unit 1200 is prevented from acting.
In addition, as illustrated in Figure 18, the judging unit may determine whether the laparoscope operation unit 1200 is to act by judging whether the size of the face contained in the captured image is larger than the size of the predetermined region 1320. Whether the facial size is larger than the predetermined region 1320 may be determined, for example, by comparing the area calculated from the detected facial outer contour with the size of the predetermined region 1320. At this time, it may also be judged whether the facial size is smaller than a specified threshold.
If, for example, as shown in (a), the facial size is smaller than the size of the predetermined region, the person may be a third party located away from the master interface 4, or even the operator positioned away from the master interface 4 and not performing a surgical action, and the laparoscope operation unit 1200 may therefore be made inoperative.
In this way, the judging unit activates the laparoscope operation unit 1200 only when a face of at least a given size appears in the predetermined region 1320, and it is therefore possible to prevent the laparoscope 5 from being moved and misoperated according to the facial movements of a third party other than the operator, or according to movements unrelated to the surgical action.
Moreover, the judging unit may also allow the laparoscope operation unit 1200 to act only when the facial motion in the captured image taken by the imaging unit 1210 is held for a certain time. For example, it may be predetermined that only when the head is recognized as tilted to the right beyond a predetermined angle, from the normal state in which the operator gazes forward, and is held there for a specified time (for example, two seconds), does the laparoscope operation unit 1200 generate and transmit an operation command so that the laparoscope rotates and/or moves to the right by the corresponding angle. Naturally, the judging unit may allow the laparoscope operation unit 1200 to act not only when the facial motion is held for the specified time, but also when the facial motion exceeds a preset range.
This makes it possible to prevent the danger that, when the operator consciously or unconsciously turns the head or moves, the laparoscope 5 inserted in the patient's abdomen is suddenly operated following the operator's motion. This can be handled similarly to the method described previously for the face-contact structure, i.e., when the face disengages, transmission of the operator's motion to the slave robot 2 is cut off, thereby ensuring the patient's safety.
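The hold-time condition — e.g. a head tilt held for roughly two seconds before any command is generated — can be sketched as a small gate over per-frame pose estimates. The angle tolerance used to decide that the pose is "the same" is an assumption.

```python
class PoseHoldGate:
    """Passes a head pose through only after it has been held for the
    specified time, so that incidental motions of the operator do not
    move the laparoscope. Frame-based sketch; tolerance is an assumption."""

    def __init__(self, hold_s=2.0, tolerance_deg=3.0):
        self.hold_s = hold_s
        self.tolerance_deg = tolerance_deg
        self._pose = None
        self._since = None

    def update(self, tilt_deg, t):
        """tilt_deg: current head-tilt estimate; t: timestamp in seconds.
        Returns the held pose once the hold time has elapsed, else None."""
        if self._pose is None or abs(tilt_deg - self._pose) > self.tolerance_deg:
            self._pose, self._since = tilt_deg, t  # pose changed: restart timer
            return None
        if t - self._since >= self.hold_s:
            return self._pose  # held long enough: allow command generation
        return None
```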
Moreover, the judging unit may also perform a request function: when a change of a characteristic part of the face (for example, a blink) is recognized in the captured image taken by the imaging unit 1210, it requests that a corresponding operation command be generated by the operation command generating unit 1240 and transmitted to the slave robot 2. The storage unit 1220 may store in advance information on which operation command is to be generated for each change of a characteristic part.
For example, when during surgery the image becomes fogged so that the object cannot be seen clearly, if the operator winks only the left eye, the operation command generating unit 1240 may generate an operation command for opening a valve to discharge the carbon dioxide from the abdominal cavity, and transmit it to the slave robot 2. As another example, if during surgery the operator winks only the right eye, an operation command may be generated and output to display, as augmented reality, a previous image of the current surgical site on the monitor 6 of the master interface 4, or to make it disappear.
Naturally, besides the method of instructing the generation of an operation command by recognizing a change of a characteristic part of the face in the captured image, the judging unit may also instruct the generation of a corresponding operation command by recognizing an up-down nodding or left-right head-shaking movement using image analysis techniques. For example, when the operator's nodding action is recognized, it may be interpreted as an affirmative response, and when the operator's head-shaking action is recognized, as a negative response.
Using this, when the surgical robot requires the operator, under specific circumstances, to choose whether or not to proceed with some action (Yes/No), the operator can respond to the surgical robot merely by nodding or shaking the head, without pressing a button located on the console or the operation handle.
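The nod/shake recognition can be sketched by comparing the horizontal and vertical excursions of the facial reference point (e.g. the nose center) across captured images. The 10-pixel motion threshold is an assumption.

```python
def classify_gesture(reference_points):
    """Classifies a nod (up-down, 'yes') vs. a head shake (left-right, 'no')
    from the trajectory of the facial reference point across captured
    images. reference_points: list of (x, y) pixel positions."""
    xs = [p[0] for p in reference_points]
    ys = [p[1] for p in reference_points]
    h_range = max(xs) - min(xs)   # left-right excursion
    v_range = max(ys) - min(ys)   # up-down excursion
    if max(h_range, v_range) < 10:
        return None  # too little motion to count as a gesture
    return "yes" if v_range > h_range else "no"
```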
In addition, when a stereoscopic laparoscope is used, the contactless laparoscope control method according to this embodiment may also adjust the degree of left/right image overlap required for three-dimensional image processing according to the positions of the face and eyes.
Figure 21 is a flow chart showing the operation process of the laparoscope operation unit according to another embodiment of the present invention.
Referring to Figure 21, in step 1610, the laparoscope operation unit 1200 detects the facial region, the eye positions, and the position of a reference point (for example, the center of the nose) by analyzing a captured image of the operator's face.
Then, in step 1620, the laparoscope operation unit 1200 calculates, for the captured image, the interior angle between the extension line of the two eyes and the picture centerline together with the direction of rotation, and uses the difference from the interior angle and rotation direction calculated for the previously captured image to compute the current direction and angle of rotation of the operator's face.
In addition, in step 1630, the laparoscope operation unit 1200 calculates, for the captured image, the change in distance between the facial reference point and the facial reference point in the previously captured image.
In addition, in step 1640, the laparoscope operation unit 1200 calculates, for the captured image, the distance between the two eye center points and, with reference to the distance between the two eye center points calculated for the previously captured image, computes the change in the inter-eye distance.
Steps 1620 to 1640 described above may be executed in sequence or in a different order, or may also be performed simultaneously.
In step 1650, the laparoscope operation unit 1200 generates an operation command corresponding to the displacement information calculated in steps 1620 to 1640 and supplies it to the slave robot 2, so as to operate the laparoscope 5.
Figure 22 is a flow chart illustrating step 1610 of Figure 21 in detail according to another embodiment of the present invention.
Step 1610 of Figure 21 described above is the step in which the laparoscope operation unit 1200 detects the facial region, the eye positions, and the position of a reference point (for example, the center of the nose) by analyzing a captured image of the operator's face.
Referring to Figure 22, which shows a concrete example of step 1610, the laparoscope operation unit 1200 receives a captured image of the operator's face in step 1710, and then proceeds to step 1720, where the facial region, the eye positions, and the position of the reference point (for example, the center of the nose) are detected by analyzing the captured image.
Then, in step 1730, the laparoscope operation unit 1200 judges whether the operator's motion state has been maintained for a specified time.
If the motion state has been maintained for the specified time, it is recognized that the laparoscope 5 needs to be operated, and the process proceeds to steps 1620 to 1640.
If, however, the motion state has not been maintained for the specified time, the motion is recognized as unrelated to operating the laparoscope 5, and the process returns to step 1710.
In this way, to ensure accuracy in deciding whether the laparoscope 5 is to be operated, step 1610 may be embodied as a plurality of steps. In addition, step 1610 may be embodied in various ways.
As one example, it may be judged whether the facial appearance in the received captured image matches, within an error range, the facial appearance of an authenticated user stored in advance in the storage unit 1220, and the process may be embodied so as to proceed to steps 1620 to 1640 only when they match within the error range.
As another example, it may be judged whether the facial appearance in the received captured image is located within the predetermined region, and whether the facial size is greater than or equal to, or less than or equal to, a predetermined size; the process may be embodied so as to proceed to steps 1620 to 1640 only when the facial size is at least the specified size and the face is located within the designated region.
In addition, step 1610 may be embodied in yet other ways, and may also be embodied by integrating one or more of the embodiments described above.
The laparoscope operation method described above may also be realized by a software program or the like. The codes and code segments constituting the program can easily be inferred by computer programmers in the field. Moreover, the program is stored on a computer-readable medium, and the method is realized when the program is read and executed by a computer. The computer-readable media include magnetic recording media, optical recording media, and carrier media.
Figure 23 is a top view showing the overall structure of a surgical robot according to an embodiment of the present invention, and Figure 24 is a conceptual diagram showing the master interface of the surgical robot according to the first embodiment of the present invention.
This embodiment is characterized in that the output position of the endoscopic image on the monitor seen by the user is changed in correspondence with the endoscope viewpoint, which changes with the motion of the surgical endoscope, so that the user experiences the actual surgical situation more realistically. That is, the endoscope viewpoint can be made to coincide with the viewpoint of the user performing the surgery; the present embodiment is therefore characterized in that the viewpoint of the endoscope inside the abdominal cavity is made to coincide with the position and output direction of the monitor, located outside the operative site, that outputs the endoscopic image, so that the action of the system located outside the operative site reflects the action of the actual endoscope moving inside the patient's body, giving a greater sense of reality.
The surgical endoscope according to this embodiment is not limited to a laparoscope; it may be any of various types of instruments used as an imaging tool during surgery, such as a thoracoscope, an arthroscope, a rhinoscope, a cystoscope, a proctoscope, a duodenoscope (as used in ERCP), a mediastinoscope, or a cardioscope. In addition, the surgical endoscope according to this embodiment may be a stereo endoscope. That is, the surgical endoscope according to this embodiment may be a stereo endoscope that generates stereoscopic image information, and this stereoscopic image information may be generated by different techniques. For example, the surgical endoscope according to this embodiment may have a plurality of cameras in order to obtain a plurality of images carrying stereoscopic information, or it may obtain stereoscopic images in other ways, such as obtaining a plurality of images with a single camera. Besides these, the surgical endoscope according to this embodiment may generate stereoscopic images by other different methods.
In addition, the immersive surgical image processing apparatus according to this embodiment is not limited to being realized in a surgical robot system as illustrated; it is applicable to any system that outputs an endoscopic image 9 during surgery and performs surgery using surgical tools. Below, the description centers on the case in which the surgical image processing apparatus according to this embodiment is applied to a surgical robot system.
Referring to Figures 23 and 24, the surgical robot system includes a slave robot 2, which operates on a patient lying on the operating table, and a master robot 1, with which the operator remotely operates the slave robot 2. The master robot 1 and the slave robot 2 need not be physically independent, separate systems; they may be integrated into one, in which case the master interface 4 may correspond, for example, to the interface portion of the integrated robot.
The master interface 4 of the master robot 1 includes a monitor unit 6 and a master manipulator, and the slave robot 2 includes a robot arm 3 and an instrument 5a. The instrument 5a is a surgical tool such as an endoscope, e.g., a laparoscope, or a surgical instrument that acts directly on the patient.
The master interface 4 has a master manipulator that the operator grips and operates with both hands. As illustrated in Figures 23 and 24, the master manipulator may have two handles 10; when the operator manipulates a handle 10, a corresponding operation signal is transmitted to the slave robot 2, thereby controlling the robot arm 3. By manipulating the handle 10, the operator can perform position movement, rotation, cutting operations, and the like with the robot arm 3 and/or the instrument 5a.
For example, the handle 10 may include a main handle and a sub handle. The robot arm 3, the instrument 5a, and the like may be operated with only one handle, or a sub handle may be added so that a plurality of surgical devices are operated in real time at the same time. The main handle and the sub handle may have various mechanical structures according to their manner of operation; for example, various input units such as a joystick form, a keyboard, a trackball, or a touch screen may be used to actuate the robot arm 3 of the slave robot 2 and/or other surgical devices.
The master manipulator is not limited to the shape of the handle 10; any form capable of controlling the action of the robot arm 3 over a network may be applied without restriction.
The endoscopic image 9 input by the instrument 5a, a camera image, and a modeled image are displayed as display images on the monitor unit 6 of the master interface 4. The information displayed on the monitor unit 6 may, moreover, vary according to the selected image type.
The monitor unit 6 may be composed of one or more monitors, so that the information required during surgery may be displayed individually on each monitor. Figures 23 and 24 illustrate the monitor unit 6 as including three monitors, but the number of monitors may be set differently according to the type or kind of information to be displayed. In addition, when the monitor unit 6 includes a plurality of monitors, the screens may be linked and extended with one another. That is, the endoscopic image 9, like a window displayed on one monitor, may move freely across the monitors, and the partial images connected to one another may be output on the respective monitors so that the whole image can be output.
The slave robot 2 and the master robot 1 may be connected by wire or wirelessly, so that the master robot 1 can transmit operation signals to the slave robot 2 and the slave robot 2 can transmit the endoscopic image 9 input through the instrument 5a to the master robot 1. When the two operation signals from the two handles 10 provided on the master interface 4, and/or an operation signal for adjusting the position of the instrument 5a, need to be transmitted at identical and/or similar times, each operation signal can be transmitted to the slave robot 2 independently of the others. Here, transmitting each operation signal 'independently' means that the operation signals do not interfere with one another and that one operation signal does not affect another. To transmit a plurality of operation signals independently of one another, various methods may be used, such as assigning header information to each operation signal at the stage of its generation and transmitting it, transmitting each operation signal according to its order of generation, or presetting priorities for the transmission order of the operation signals and transmitting them accordingly. In this case, a transmission path may be provided independently for each operation signal, so that interference between the operation signals is inherently prevented.
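The header-based variant of independent transmission — tagging each operation signal with its source, sequence number, and priority so that signals from the two handles and the instrument-positioning control can be demultiplexed at the slave robot — can be sketched as follows. The field names and JSON framing are assumptions, standing in for whatever wire format an implementation would actually use.

```python
import itertools
import json

_seq = itertools.count()  # monotonically increasing sequence numbers

def frame_signal(source, payload, priority=0):
    """Wraps one operation signal with header information (source, sequence
    number, priority) before transmission, so that signals generated at the
    same or similar times do not interfere with one another."""
    header = {"source": source, "seq": next(_seq), "priority": priority}
    return json.dumps({"header": header, "payload": payload})

def demux(frames):
    """Slave-side sketch: groups received frames per source, preserving
    each source's order of generation."""
    out = {}
    for f in frames:
        msg = json.loads(f)
        out.setdefault(msg["header"]["source"], []).append(msg["payload"])
    return out
```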
The robot arm 3 of the slave robot 2 may be driven with multiple degrees of freedom. The robot arm 3 may include, for example: a surgical tool inserted into the operative site of the patient; a yaw drive unit that rotates the surgical tool in the yaw direction according to the surgical position; a pitch drive unit that rotates the surgical tool in the pitch direction orthogonal to the rotational drive of the yaw drive unit; a transfer drive unit that moves the surgical tool in the lengthwise direction; a rotation drive unit for rotating the surgical tool; and a surgical tool drive unit, located at the end of the surgical tool, for incising or cutting a surgical lesion. However, the structure of the robot arm 3 is not limited to this, and it should be understood that the protection scope of the present invention is not limited by these examples. In addition, the actual control process by which the operator, by manipulating the handle 10, rotates or moves the robot arm 3 in the corresponding direction is somewhat removed from the gist of the present invention, and a detailed description thereof is therefore omitted.
More than one slave robot 2 can be used to operate on the patient, and the instrument 5a for displaying the operative site as a screen image through the monitor unit 6 can be implemented by an independent slave robot 2; the master robot 1 can also be integrated with the slave robot 2.
Figure 25 is a block diagram of the surgical robot according to the first embodiment of the present invention. Referring to Figure 25, the master robot 1 comprises an image input unit 2310, a picture display part 2320, an arm operating portion 2330, an operation signal generating unit 340, a screen display control unit 2350 and a control unit 370, while the slave robot 2 comprises the robotic arm 3 and the endoscope 8.
The immersive surgical image processing apparatus of this embodiment can be implemented as a module comprising the image input unit 2310, the picture display part 2320 and the screen display control unit 2350; of course, this module can also include the arm operating portion 2330, the operation signal generating unit 340 and the control unit 370.
The image input unit 2310 receives, over a wired or wireless link, the image input from the endoscope 8 of the slave robot 2. The endoscope 8 can itself be regarded as one of the surgical tools of this embodiment, and more than one can be provided.
The picture display part 2320 outputs, as visual information, a screen image corresponding to the image received through the image input unit 2310. The picture display part 2320 can output the endoscopic image at its original size or zoomed in/out, or can output the endoscopic image matched with a modelled image described later, or output the two as separate images.
In addition, the picture display part 2320 can output the endoscopic image together with an image reflecting the overall surgical situation — for example, a camera image of the exterior of the surgical subject captured by a camera — simultaneously and/or matched with it, so that the surgical situation can easily be grasped.
The picture display part 2320 can also provide the following function: part of the output image, or a reduced version of the whole output image (endoscopic image, modelled image, camera image, etc.), is output in a separately generated window on the screen; when the operator selects or rotates a point in the reduced image using the master manipulator, the whole output image is moved or rotated accordingly — the so-called bird's-eye-view function of CAD programs. Functions such as zooming, moving and rotating the image output to the picture display part 2320, as described above, can be controlled by the control unit 370 in response to operation of the master manipulator.
The picture display part 2320 can be implemented in the form of the monitor unit 6 or the like, and the image processing program that outputs the received image as a screen image through the picture display part 2320 can be executed by the control unit 370, the screen display control unit 2350 or a separate image processing unit (not shown). The picture display part 2320 of this embodiment can be a display device implementing various technologies, for example a multi-vision display or an ultra-high-resolution monitor such as a UHDTV (7680 × 4320). The picture display part 2320 of this embodiment can also be a 3D display. For example, it can use the principle of binocular parallax so that the user perceives separate images with the left and right eyes. Such 3D image implementations can be realised in various ways, such as glasses-based methods (e.g. anaglyph (red-blue) glasses, polarised (passive) glasses, shutter (active) glasses), the lenticular-lens method or the parallax-barrier method.
The picture display part 2320 outputs the input endoscopic image to a specific region — a region having a prescribed size and position on the screen. Such a specific region, as described above, can be determined in accordance with changes in the view of the endoscope 8.
The screen display control unit 2350 can set this specific region in correspondence with the view of the endoscope 8. That is, the screen display control unit 2350 tracks the viewpoint of the endoscope 8 as it rotates, moves and so on, and, reflecting this, sets the specific region of the picture display part 2320 in which the endoscopic image is output.
The arm operating portion 2330 is a unit that enables the operator to manipulate the position and function of the robotic arm 3 of the slave robot 2. As illustrated in Figure 24, the arm operating portion 2330 can be formed in the shape of the handle 10, but it is not limited to that shape and can be changed to various shapes serving the same purpose. For example, one part can be formed as a handle and another part as a clutch button or other shape; finger insertion tubes or insertion rings can also be provided, into which the operator's fingers are inserted and fixed, so that the surgical tool can be operated conveniently.
When the operator manipulates the arm operating portion 2330 to move the position of the robotic arm 3 and/or the endoscope 8 or to perform a procedure, the operation signal generating unit 340 generates the corresponding operation signal and transmits it to the slave robot 2. The operation signal can be transmitted over a wired or wireless communication network.
The operation signal generating unit 340 generates the operation signal from the operation information produced as the operator manipulates the arm operating portion 2330, and transmits the generated operation signal to the slave robot 2, with the result that the actual surgical tool performs the operation corresponding to the operation signal. The position and operating shape of the actual surgical tool driven by the operation signal can then be confirmed by the operator through the image input via the endoscope 8.
Figure 26 is a block diagram of the immersive surgical image processing apparatus according to the first embodiment of the present invention. Referring to Figure 26, the screen display control unit 2350 can include an endoscope viewpoint tracking part 351, an image movement information extraction unit 353 and an image position setting unit 355.
The endoscope viewpoint tracking part 351 tracks the view information of the endoscope 8 in accordance with its movement and rotation. Here, view information denotes the viewpoint from which the endoscope looks, and this view information can be extracted from the signals with which the surgical robot system operates the endoscope 8. That is, the view information can be specified from the signals that move and rotate the endoscope 8. Since these operation signals for the endoscope 8 are generated in the surgical robot system and transmitted to the robotic arm 3 that manipulates the endoscope 8, the viewing direction of the endoscope 8 can be tracked using these signals.
The image movement information extraction unit 353 extracts the movement information of the endoscopic image using the view information of the endoscope 8. That is, the view information of the endoscope 8 can include information on the positional change of the subject captured in the endoscopic image, and the movement information of the endoscopic image can be extracted from this information.
The image position setting unit 355 uses the extracted movement information to set the specific region of the picture display part 2320 in which the endoscopic image is output. For example, if the view information of the endoscope 8 changes by a given vector A, the movement information of the endoscopic image of the patient's interior is specified in correspondence with that vector, and the specific region of the picture display part 2320 is set using that movement information. If the endoscopic image changes by a given vector B, this information — together with the size, shape and resolution of the picture display part 2320 — can be used to set the specific region in which the picture display part 2320 actually outputs the endoscopic image.
Figure 27 is a flow chart of the immersive surgical image processing method according to the first embodiment of the present invention. Each of the steps described below can be performed with the screen display control unit 2350 as the main agent, but the steps need not be performed in time series in the order described.
In step S511, the view information of the endoscope 8 — the information on its viewpoint — is tracked in accordance with the movement and rotation of the endoscope 8. Since the view information is specified from the signals that move and rotate the endoscope 8, the viewing direction of the endoscope 8 can be tracked.
In step S513, the movement information of the endoscopic image, which corresponds to the positional change of the subject of the endoscopic image, is extracted using the view information of the endoscope 8.
In step S515, the specific region of the picture display part 2320 in which the endoscopic image is output is set using the extracted movement information. That is, once the view information of the endoscope 8 and the movement information of the endoscopic image have been specified as described above, the specific region in which the picture display part 2320 outputs the endoscopic image is set using this movement information.
In step S517, the endoscopic image is output to the specific region set in the picture display part 2320.
Figure 28 illustrates the composition of the image output by the immersive surgical image processing method according to the first embodiment of the present invention. The picture display part 2320 can be the whole screen, and the endoscopic image 2620 obtained by the endoscope 8 can be output at a specific position on the picture display part 2320, for example with the centre point of the endoscopic image 2620 located at coordinates X, Y. The coordinates X, Y can be set in accordance with the change in the viewpoint of the endoscope 8. For example, when the view information of the endoscope 8 and the movement amount of the endoscopic image change by +1 to the left and −1 vertically, the centre point of the endoscopic image 2620 can move to the position of coordinates X+1, Y−1.
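The coordinate update in the example above can be sketched as follows. `set_display_region` is a hypothetical helper, and the clamping of the centre point so that the image region stays fully on screen is an assumption beyond the text (motivated by its remark that the region depends on the display's size and shape):

```python
def set_display_region(center, delta, screen_w, screen_h, img_w, img_h):
    """Shift the endoscopic image's display centre by the endoscope
    viewpoint delta (dx, dy), e.g. (X, Y) + (+1, -1) -> (X+1, Y-1),
    then clamp so the image rectangle remains inside the screen."""
    x = min(max(center[0] + delta[0], img_w // 2), screen_w - img_w // 2)
    y = min(max(center[1] + delta[1], img_h // 2), screen_h - img_h // 2)
    return x, y
```

With a 400 × 300 screen and a 50 × 50 image, a centre at (100, 100) and a viewpoint delta of (+1, −1) yields (101, 99), matching the X+1, Y−1 example.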
Figure 29 is a block diagram of the surgical robot according to the second embodiment of the present invention. Referring to Figure 29, the master robot 1 comprises an image input unit 2310, a picture display part 2320, an arm operating portion 2330, an operation signal generating unit 340, a screen display control unit 2350, an image storage part 360 and a control unit 370, while the slave robot 2 comprises the robotic arm 3 and the endoscope 8. Below, the description focuses mainly on the differences from the foregoing.
This embodiment is characterised in that an endoscopic image input and stored before the current point in time is extracted and output to the picture display part 2320 together with the current endoscopic image, thereby informing the user of how the endoscopic image has changed.
The image input unit 2310 receives a first endoscopic image and a second endoscopic image provided from the surgical endoscope at mutually different points in time. Here, ordinals such as 'first' and 'second' are merely identifiers for distinguishing different endoscopic images; the first endoscopic image and the second endoscopic image can be images captured by the endoscope 8 at different points in time and from different viewpoints. The image input unit 2310 can receive the first endoscopic image prior to the second endoscopic image.
The image storage part 360 stores the first endoscopic image and the second endoscopic image. The image storage part 360 stores not only the image information — the actual content of the first and second endoscopic images — but also information on the specific regions of the picture display part 2320 to which they are output.
The first endoscopic image and the second endoscopic image are output to mutually different regions of the picture display part 2320, and the screen display control unit 2350 can control the picture display part 2320 so that the first and second endoscopic images are output to different regions corresponding to the mutually different viewpoints of the endoscope 8.
Here, the picture display part 2320 can output the first endoscopic image and the second endoscopic image so that they differ in one or more of chroma, brightness, colour and screen pattern. For example, the picture display part 2320 outputs the currently input second endoscopic image as a colour image and the first endoscopic image — the past image — as a black-and-white image, so that the user can distinguish the two. Referring to Figure 32, the second endoscopic image 622, the currently input image, is output in colour at coordinates X1, Y1, while the first endoscopic image 621, the past input image, is output at coordinates X2, Y2 with a screen pattern — here a slash pattern — applied to it.
In addition, the first endoscopic image — the past image — can be output continuously, or only for a predetermined time. In the latter case, the past image is output to the picture display part 2320 only for the predetermined time, so the picture display part 2320 can continually be refreshed with new endoscopic images.
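The predetermined-time behaviour can be sketched as a small buffer that drops past frames once their display lifetime has elapsed. `PastImageBuffer` is an illustrative name, and the time units are an assumption:

```python
import collections

class PastImageBuffer:
    """Keep past endoscopic frames visible only for a fixed time window;
    expired frames are dropped so the display keeps refreshing with new
    images, as described for the predetermined-time output mode."""
    def __init__(self, lifetime):
        self.lifetime = lifetime
        self._frames = collections.deque()  # (timestamp, frame) pairs

    def push(self, t, frame):
        self._frames.append((t, frame))

    def visible(self, now):
        # drop frames whose display lifetime has expired
        while self._frames and now - self._frames[0][0] > self.lifetime:
            self._frames.popleft()
        return [f for _, f in self._frames]
```

In the continuous-output mode mentioned first, one would simply skip the expiry check and keep all stored frames.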
Figure 30 is a block diagram of the immersive surgical image processing apparatus according to the second embodiment of the present invention. Referring to Figure 30, the screen display control unit 2350 can include an endoscope viewpoint tracking part 351, an image movement information extraction unit 353, an image position setting unit 355 and a stored-image display part 357.
The endoscope viewpoint tracking part 351 tracks the view information of the endoscope 8 in accordance with its movement and rotation, and the image movement information extraction unit 353 extracts the movement information of the endoscopic image using the view information of the endoscope 8. The image position setting unit 355 uses the extracted movement information to set the specific region of the picture display part 2320 in which the endoscopic image is output.
While the picture display part 2320 outputs the second endoscopic image input in real time, the stored-image display part 357 extracts the first endoscopic image stored in the image storage part 360 and outputs it to the picture display part 2320. Since the first and second endoscopic images differ from each other in their output regions and image information, the stored-image display part 357 outputs the first endoscopic image — the past image — to the picture display part 2320 by extracting this stored information from the image storage part 360.
Figure 31 is a flow chart of the immersive surgical image processing method according to the second embodiment of the present invention. Each of the steps described below can be performed with the screen display control unit 2350 as the main agent; they divide broadly into a step of outputting the first endoscopic image and a step of outputting the second endoscopic image, but as described above the first and second endoscopic images can be output simultaneously.
In step S511, the view information of the endoscope 8 — the information on its viewing direction — is tracked in accordance with the first movement and rotation information of the endoscope 8.
In step S513, the movement information of the first endoscopic image is extracted; in step S515, the extracted movement information is used to set the specific region of the picture display part 2320 in which the endoscopic image is output; and in step S517, the first endoscopic image is output at the set position.
In step S519, information about the output first endoscopic image and the first image position are stored in the image storage part 360.
In step S521, the view information of the endoscope 8 — the information on its viewing direction — is tracked in accordance with the second movement and rotation information of the endoscope 8.
In step S522, the movement information of the second endoscopic image is extracted; in step S523, the extracted movement information is used to set the specific region of the picture display part 2320 in which the endoscopic image is output; and in step S524, the second endoscopic image is output at the set position.
In step S525, information about the output second endoscopic image and its image position are stored in the image storage part 360. In step S526, the first endoscopic image stored in the image storage part 360 is output at the first image position, together with the second endoscopic image. Here, the first endoscopic image can be output so as to differ from the second endoscopic image in one or more of chroma, brightness, colour and screen pattern.
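A minimal sketch of this two-image composition: the stored first (past) image is re-rendered at its stored position in black and white, while the current second image keeps its colour. The `compose_display` helper and its dictionary layout are assumptions for illustration, not the patent's data structures:

```python
def compose_display(first, second):
    """first/second: dicts with 'pos' (stored output coordinates) and
    'pixels' (list of RGB tuples). The past image is converted to
    grayscale so the user can tell the two images apart."""
    def to_gray(rgb):
        r, g, b = rgb
        # integer BT.601-style luma approximation
        y = (r * 299 + g * 587 + b * 114) // 1000
        return (y, y, y)
    return [
        {"pos": first["pos"],
         "pixels": [to_gray(p) for p in first["pixels"]],
         "style": "past"},
        {"pos": second["pos"],
         "pixels": second["pixels"],
         "style": "current"},
    ]
```

Distinguishing by screen pattern (e.g. the slash pattern of Figure 32) or by chroma/brightness would replace the grayscale step with the corresponding per-pixel transform.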
Figure 33 is a block diagram of the surgical robot according to the third embodiment of the present invention. Referring to Figure 33, the master robot 1 comprises an image input unit 2310, a picture display part 2320, an arm operating portion 2330, an operation signal generating unit 340, a screen display control unit 2350, a control unit 370 and an image matching portion 450, while the slave robot 2 comprises the robotic arm 3 and the endoscope 8. Below, the description focuses mainly on the differences from the foregoing.
This embodiment is characterised in that the endoscopic image actually captured by the endoscope during surgery and the modelled image of the surgical tool — generated in advance and stored in the image storage part 360 — can each be output, or matched with each other, or output to the user-observable picture display part 2320 after image corrections such as size adjustment.
The image matching portion 450 matches the endoscopic image received through the image input unit 2310 with the modelled image of the surgical tool stored in the image storage part 360, generates an output image and outputs it to the picture display part 2320. The endoscopic image, obtained by photographing the patient's interior with the endoscope, covers only a limited area, so it includes only part of the appearance of the surgical tool.
The modelled image is an image in which the shape of the whole surgical tool is implemented and generated as a 2D or 3D image. The modelled image can be an image of the surgical tool captured at a specific time before the start of surgery, for example in its initially set state. Since the modelled image is generated by surgical simulation technology, the image matching portion 450 can output the modelled image together with the surgical tool shown in the actual endoscopic image. The technology of modelling a real object to obtain an image is somewhat removed from the gist of the invention, so a detailed description of it is omitted. The specific functions and various detailed structures of the image matching portion 450 are described in detail below with reference to the relevant drawings.
The control unit 370 controls the operation of each constituent element so that the functions described above are performed. The control unit 370 can also convert the image input through the image input unit 2310 into the screen image displayed by the picture display part 2320. In addition, the control unit 370 controls the image matching portion 450 so that, when operation information is input in accordance with manipulation of the arm operating portion 2330, the corresponding modelled image is output through the picture display part 2320.
The actual surgical tool contained in the endoscopic image is the tool included in the image input through the endoscope 8 and transmitted to the master robot 1 — the tool that directly performs surgical actions on the patient's body. In contrast, the modelled surgical tool contained in the modelled image is a tool mathematically modelled in advance as a 2D or 3D image and stored in the image storage part 360. The surgical tool of the endoscopic image and the modelled surgical tool of the modelled image can both be controlled by the operation information recognised by the master robot 1 as the operator manipulates the arm operating portion 2330 (i.e. information on the movement, rotation and so on of the surgical tool). The positions and operating shapes of the actual surgical tool and the modelled surgical tool can be determined from this operation information. Referring to Figure 36, the endoscopic image 2620 is matched with the modelled image 2610 and output at coordinates X, Y on the picture display part 2320.
The modelled image can include not only surgical tool images but also images in which the patient's organs are modelled and reconstructed. That is, the modelled image can include 2D or 3D images of the patient's organ surfaces reconstructed with reference to images obtained by imaging equipment such as computed tomography (CT), magnetic resonance (MR) imaging, positron emission tomography (PET), single photon emission computed tomography (SPECT) and ultrasonography (US); if the actual endoscopic image is matched with such computer-modelled images, a more comprehensive image of the operative site can be provided to the operator.
Figure 34 is a block diagram of the immersive surgical image processing apparatus according to the third embodiment of the present invention. Referring to Figure 34, the image matching portion 450 can include a characteristic value computing unit 451, a modelled image implementation unit 453 and an overlap image processing unit 455.
The characteristic value computing unit 451 computes characteristic values using the image provided by the laparoscope 8 of the slave robot 2 and/or coordinate information on the position of the actual surgical tool coupled to the robotic arm 3. The position of the actual surgical tool can be recognised by reference to the position value of the robotic arm 3 of the slave robot 2, and information about this position can also be supplied from the slave robot 2 to the master robot 1.
Using the image from the laparoscope 8, for example, the characteristic value computing unit 451 can compute characteristic values such as the field of view (FOV), magnification, viewpoint (e.g. viewing direction) and viewing depth of the laparoscope 8, and the type, direction, depth and degree of bending of the actual surgical tool. When computing characteristic values from the image of the laparoscope 8, image recognition techniques can also be used — extraction of the outer contour of an object contained in the image, shape recognition, tilt-angle recognition and so on. In addition, the type of the actual surgical tool and similar information can be input in advance, for example in the process of coupling the tool to the robotic arm 3.
The modelled image implementation unit 453 implements a modelled image corresponding to the characteristic values computed by the characteristic value computing unit 451. The data for the modelled image can be extracted from the image storage part 360. That is, the modelled image implementation unit 453 extracts the modelled image data of the surgical tool and so on corresponding to the characteristic values of the laparoscope 8 (FOV, magnification, viewpoint, viewing depth, etc., and the type, direction, depth and degree of bending of the actual surgical tool) and implements the modelled image so that it matches the surgical tool and other content of the endoscopic image.
The modelled image implementation unit 453 can extract the image corresponding to the characteristic values computed by the characteristic value computing unit 451 in various ways. For example, the modelled image implementation unit 453 can extract the corresponding modelled image directly using the characteristic values of the laparoscope 8. That is, it refers to data such as the field of view and magnification of the laparoscope 8, extracts the corresponding 2D or 3D modelled surgical tool image, and matches it with the endoscopic image. Here, characteristic values such as the field of view and magnification can be computed by comparison with a reference image having initially established values, or by comparing the sequentially generated images of the laparoscope 8 with one another.
In addition, according to another embodiment, the modelled image implementation unit 453 can extract the modelled image using the operation information that determines the positions and operating shapes of the laparoscope 8 and the robotic arm 3. That is, the surgical tool in the endoscopic image can, as described above, be controlled by the operation information recognised by the master robot 1 as the operator manipulates the arm operating portion 2330, and the position and operating shape of the modelled surgical tool corresponding to the characteristic values of the endoscopic image can be determined from that operation information.
This operation information can be stored in a separate database in time series, and the modelled image implementation unit 453 can recognise the characteristic values of the actual surgical tool by referring to this database and extract the corresponding modelled image information. That is, the position of the surgical tool output in the modelled image can be set using the accumulated data of its position-change signals. For example, if the operation information for one surgical instrument among the surgical tools includes rotating it clockwise by 90 degrees and moving it 1 cm in the extension direction, the modelled image implementation unit 453 changes the shape of that instrument contained in the modelled image in accordance with the operation information and extracts it.
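Accumulating the stored operation history into the modelled tool's pose, as in the 90-degree / 1 cm example above, might look like the following sketch. `apply_operations` and the pose representation (a roll angle in degrees plus an xyz position in cm) are illustrative assumptions:

```python
def apply_operations(pose, operations):
    """pose: dict with 'angle_deg' (tool roll, clockwise positive) and
    'xyz' (position in cm). Each operation is ('rotate_cw', degrees) or
    ('translate', axis, cm). Replaying the time-series operation history
    from the database yields the pose at which the modelled tool is drawn."""
    angle = pose["angle_deg"]
    xyz = list(pose["xyz"])
    axis_index = {"x": 0, "y": 1, "z": 2}
    for op in operations:
        if op[0] == "rotate_cw":
            angle = (angle + op[1]) % 360
        elif op[0] == "translate":
            xyz[axis_index[op[1]]] += op[2]
    return {"angle_deg": angle, "xyz": tuple(xyz)}
```

Applying the example history — rotate clockwise 90 degrees, then extend 1 cm (taken here as the z axis, an assumption) — to a zero pose gives an angle of 90 degrees and a position of (0, 0, 1).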
Here, the surgical instrument is mounted at the tip of the surgical robotic arm, which has an actuator; driving wheels (not shown) of a drive unit (not shown) receive driving force from the actuator and operate, and the surgical tool connected to the driving wheels and inserted into the patient's body performs the prescribed motions, whereby surgery is carried out. The driving wheel is formed as a disc and is coupled to the actuator to receive the driving force. The number of driving wheels can be determined according to the number of objects to be controlled; this driving-wheel technology will be apparent to those skilled in the art of surgical instruments, so a detailed description of it is omitted.
The overlap image processing unit 455 outputs only part of the modelled image so that the actually captured endoscopic image and the modelled image do not overlap. That is, when the endoscopic image includes a partial shape of the surgical tool and the modelled image implementation unit 453 outputs the corresponding modelled surgical tool, the overlap image processing unit 455 identifies the region in which the actual surgical tool image of the endoscopic image and the modelled surgical tool image overlap, and deletes the overlapping portion from the modelled surgical tool image so that the two images are consistent with each other. The overlap image processing unit 455 thus handles the overlap region by deleting, from the modelled surgical tool image, the region in which it overlaps the actual surgical tool image.
For example, when the total length of the actual surgical instrument is 20 cm and, considering the characteristic values (FOV, magnification, viewpoint, viewing depth, etc., and the type, direction, depth and degree of bending of the actual surgical tool), the length of the actual surgical tool image in the endoscopic image is 3 cm, the overlap image processing unit 455 uses the characteristic values to output only that part of the modelled surgical tool image which is included in the modelled image but not output in the endoscopic image.
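The 20 cm / 3 cm example can be sketched as a one-dimensional interval computation along the tool axis. `model_render_interval` is a hypothetical helper, and measuring both lengths from the tool tip is an assumption:

```python
def model_render_interval(total_len_cm, visible_len_cm):
    """Return the (start, end) interval of the modelled tool, measured
    from the tip, that the overlap processor should draw: the endoscopic
    image already shows [0, visible], so the modelled image supplies only
    the remaining [visible, total] segment and the two never overlap."""
    v = min(visible_len_cm, total_len_cm)
    return (v, total_len_cm)
```

For a 20 cm instrument with 3 cm visible in the endoscopic image, the modelled image contributes the remaining 17 cm segment, i.e. the interval (3, 20); a fully visible tool leaves nothing for the model to draw.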
Figure 35 is a flow chart of the immersive surgical image processing method according to the third embodiment of the present invention. Below, the description focuses mainly on the differences from the foregoing.
In step S131, modelled shapes of the surgical subject and/or the surgical tools are generated in advance and stored. The modelled images can be produced by computer modelling techniques, and this embodiment can also use a separate modelled-image generating device to generate them.
In step S132, the characteristic value computing unit 451 computes the characteristic values of the endoscopic image. As described above, the characteristic value computing unit 451 computes characteristic values using the image provided by the laparoscope 8 of the slave robot 2 and/or coordinate information on the position of the actual surgical tool coupled to the robotic arm 3; the characteristic values can be the field of view (FOV), magnification, viewpoint (e.g. viewing direction) and viewing depth of the laparoscope 8, and the type, direction, depth, degree of bending and so on of the actual surgical tool.
In step S133, the image matching portion 450 extracts the modelled image corresponding to the endoscopic image, processes the overlapping portion so that the two images match each other, and outputs the result to the picture display part 2320. Here, the output timing can be set in various ways: the endoscopic image and the modelled image can initially be output at the same time, or the modelled image can be output together after the endoscopic image has been output.
Figure 37 is a conceptual diagram of the master interface of the surgical robot according to the fourth embodiment of the present invention. Referring to Figure 37, the master interface 4 can include the monitor unit 6, the handles 10, a monitor drive unit 12 and a travel groove 13. Below, the description focuses mainly on the differences from the foregoing.
This embodiment is characterised in that the monitor unit 6 of the master interface 4 is rotated and moved in accordance with the continuously changing viewpoint of the endoscope 8 described above, so that the user can experience the reality of the surgery more vividly.
One end of monitor driver element 12 is combined with monitoring unit 6, and the other end is combined with the body of main interface 4, is passed through
Driving force is applied to monitoring unit 6, drive division 6 is rotated and mobile.Here, rotation is the rotation at not coaxial (X, Y, Z) center,
I.e., it is possible to including pitching (pitch), rolling (roll), the rotation for deflecting (yaw) axle.Reference picture 37, shows by yawing axis
Rotation.
In addition, the monitor drive unit 12 moves (in direction B) along the travel groove 13 formed in the body of the master interface 4 below the monitor portion 6, so that the monitor portion 6 can move in accordance with the viewpoint of the endoscope 8. The travel groove 13 is formed concavely toward the user, so that the front of the monitor portion 6 always faces the user while it moves along the groove 13.
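The "concave groove that keeps the monitor facing the user" can be sketched geometrically. The sketch below assumes the groove is a circular arc centred on the user (the patent does not specify the exact curve), so that wherever the monitor sits on the arc, pointing it back toward the arc's centre keeps its front toward the user; function and parameter names are invented for illustration.

```python
import math

def monitor_pose_on_arc(user_pos, radius, angle_rad):
    """Place the monitor on a circular groove of the given radius centred on
    the user, and orient it so its front faces the user at every position.
    Returns the monitor position and its facing angle (radians)."""
    x = user_pos[0] + radius * math.cos(angle_rad)
    y = user_pos[1] + radius * math.sin(angle_rad)
    # The facing direction points from the monitor back toward the user.
    facing = math.atan2(user_pos[1] - y, user_pos[0] - x)
    return (x, y), facing

# With the user at the origin and a 1 m groove radius, a monitor at angle 0
# sits at (1, 0) and must face along -x (pi radians) to look at the user.
pos, facing = monitor_pose_on_arc((0.0, 0.0), 1.0, 0.0)
```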
Figure 38 is a block diagram showing the surgical robot according to the fourth embodiment of the present invention. Referring to Figure 38, a master robot 1 including an image input portion 2310, a screen display portion 2320, an arm operating portion 2330, an operation signal generating portion 340, a control portion 370, a screen drive control portion 380, and a screen drive portion 390 is shown, together with a slave robot 2 including a robot arm 3 and an endoscope 8. The description below focuses on the differences from the foregoing.
The screen drive portion 390 is a unit that rotates and moves the screen display portion 2320 serving as the monitor portion 6 described above; it may include, for example, a motor and a support unit for the monitor portion 6. The screen drive control portion 380 may control the screen drive portion 390 so that the screen drive portion 390 rotates and moves the screen display portion 2320 in correspondence with the viewpoint of the endoscope 8. The screen drive portion 390 may include the monitor drive unit 12 described above, which moves the monitor portion 6 along the groove 13.
Referring to Figure 39, the screen drive control portion 380 may include: an endoscope viewpoint tracking portion 381, which tracks the viewpoint information of the endoscope 8 in correspondence with its movement and rotation; an image movement information extraction portion 383, which extracts movement information of the endoscopic image using the viewpoint information of the endoscope 8; and a drive information generating portion 385, which uses the movement information to generate drive information (screen drive information) for the screen display portion 2320. Using the screen drive information generated by the drive information generating portion 385, the screen drive portion 390 drives the screen display portion 2320 as described above.
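The three-stage pipeline just described (viewpoint tracking 381 → movement extraction 383 → drive information generation 385) can be sketched as below. This is a minimal illustration, not the patent's implementation: the pose representation, the simple difference-based motion estimate, and the proportional gain are all assumptions.

```python
import math

def track_endoscope_viewpoint(position, rotation):
    """Return a viewpoint record from the endoscope's current pose
    (stand-in for the endoscope viewpoint tracking portion 381)."""
    return {"position": position, "rotation": rotation}

def extract_image_motion(prev_view, curr_view):
    """Image movement as the change between two viewpoints
    (stand-in for the image movement information extraction portion 383)."""
    dx = curr_view["position"][0] - prev_view["position"][0]
    dy = curr_view["position"][1] - prev_view["position"][1]
    dyaw = curr_view["rotation"] - prev_view["rotation"]
    return {"dx": dx, "dy": dy, "dyaw": dyaw}

def generate_drive_info(motion, gain=1.0):
    """Map image motion to monitor pan/rotation commands
    (stand-in for the drive information generating portion 385)."""
    return {"pan_x": gain * motion["dx"],
            "pan_y": gain * motion["dy"],
            "yaw": gain * motion["dyaw"]}

prev = track_endoscope_viewpoint((0.0, 0.0), 0.0)
curr = track_endoscope_viewpoint((2.0, 1.0), math.radians(10))
cmd = generate_drive_info(extract_image_motion(prev, curr))
```

In this sketch the screen drive portion 390 would then consume `cmd` to actuate the monitor; the one-to-one gain is an arbitrary choice.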
In addition, according to another embodiment, the screen drive portion 390 may also be driven according to user commands. For example, the screen drive control portion 380 may be replaced by a user interface, such as a switch operable by the user (e.g., a foot pedal switch); in this case, the rotation and movement of the screen drive portion 390 may also be controlled according to the user's operation.
The operation of the screen drive portion 390 may also be controlled through a touch screen. For example, when the screen display portion 2320 is implemented as a touch screen, if the user drags in a prescribed direction while touching the screen display portion 2320 with a finger or the like, the screen display portion 2320 may rotate and move correspondingly. Furthermore, the driving of the screen display portion 2320 may also be controlled by tracking the user's eyes, by a rotation/movement signal generated according to the movement of a contact portion, by a rotation/movement signal generated according to a voice command, and so on.
Figure 40 is a flowchart showing the human body temperature type surgical image processing method according to the fourth embodiment of the present invention. Each of the steps described below may be performed with the screen drive control portion 380 as the performing subject.
In step S181, the viewpoint information of the endoscope 8, i.e., information about the viewpoint of the endoscope 8, is tracked in correspondence with the movement and rotation of the endoscope 8.
In step S182, the movement information of the endoscopic image is extracted using the viewpoint information of the endoscope 8; this movement information corresponds to the positional variation of the subject captured in the endoscopic image.
In step S183, the screen drive information is generated using the viewpoint information of the endoscope 8 and/or the extracted movement information. That is, once the viewpoint information of the endoscope 8 and the movement information of the endoscopic image are specified as described above, information for moving and rotating the screen display portion 2320 is generated from that movement information.
In step S184, the screen display portion 2320 is moved and rotated in accordance with the screen drive information.
Figure 41 is a conceptual diagram showing the master interface of the surgical robot according to the fifth embodiment of the present invention. Referring to Figure 41, a dome screen 191, a projector 192, a workbench 193, a first endoscopic image 621, and a second endoscopic image 622 are shown.
This embodiment is characterized in that the function of outputting endoscopic images to specific regions of the screen display portion 2320 described above is realized using the dome screen 191 and the projector 192, so that the user can confirm the surgical situation more quickly and easily on a wide screen.
The projector 192 projects the endoscopic images onto the dome screen 191. Here, the projected endoscopic image may be a spherical image whose front end portion is shaped spherically. "Spherical" here does not denote a sphere only in the strict mathematical sense; it may include various forms such as an ellipsoid, a curved cross-sectional shape, or a partial sphere.
The dome screen 191 includes an open front end and an internal dome surface of hemispherical shape that reflects the image projected by the projector 192. The size of the dome screen 191 may be a size convenient for the user to view, for example, a diameter of about 1 m to 2 m. The internal dome surface of the dome screen 191 may be surface-treated region by region, or may have a hemispherical shape. In addition, the dome screen 191 may be formed axially symmetric about a central axis, with the user's line of sight located on the central axis of the dome screen 191.
The projector 192 may be located between the user performing the operation and the dome screen 191, so as to prevent the user from blocking the projected image. In addition, in order to prevent the projected image from being blocked while the user works, and to secure working space, the projector 192 may be mounted on the bottom surface of the workbench 193. The internal dome surface may be formed of, or coated with, a material of high reflectivity.
When the dome screen 191 and the projector 192 are used in this way, the first endoscopic image 621 and the second endoscopic image 622 can be projected onto specific regions of the dome screen 191 in correspondence with the different viewpoints of the endoscope 8, as described above.
Figure 42 is a block diagram showing the human body temperature type surgical image processing apparatus according to the sixth embodiment of the present invention. Referring to Figure 42, the screen display control portion 2350 may include an endoscope viewpoint tracking portion 351, an image movement information extraction portion 353, an image position setting portion 355, a stored image display portion 357, a continuous image generating portion 352, and a peripheral image generating portion 354. The description below focuses on the differences from the foregoing.
The present embodiment is characterized in that one end of the surgical endoscope is rotated so as to acquire multiple images and secure a wide viewing angle for the user. That is, one end of the surgical endoscope is rotated to form a prescribed trajectory, so that not only an image of the surgical site but also peripheral images can be acquired, allowing the user to view a wider area.
The continuous image generating portion 352 extracts the overlapping region of the first endoscopic image and the second endoscopic image acquired from the surgical endoscope, and generates a continuous image. The first endoscopic image and the second endoscopic image may be images provided at mutually different points in time from the rotating surgical endoscope. Referring to Figure 44, the surgical endoscope 221 rotates while tilted about the rotation axis A, thereby acquiring multiple endoscopic images.
One end of the surgical endoscope 221 rotates so as to form various trajectories. For example, the light incident portion (lens portion) at one end of the surgical endoscope 221, which extends with a specific length, forms a rotational trajectory while the other end is located on the rotation axis, so that the endoscope as a whole rotates in the shape of a cone or polygonal pyramid; the rotational trajectory of the one end may have various shapes such as a circle, an ellipse, a triangle, a rectangle, another polygon, or a closed figure. Here, a closed figure may also be understood as a concept that includes a closed curve.
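The lens-end trajectory can be sampled parametrically; the sketch below covers the circular and elliptical cases named above (a polygonal trajectory would simply use the polygon's vertices). The function name, the sample count, and the 2:1 ellipse aspect ratio are illustrative assumptions, not figures from the patent.

```python
import math

def lens_trajectory(shape, size, n=8):
    """Sample n points of the lens-end rotational trajectory in the plane
    perpendicular to the rotation axis A."""
    pts = []
    for k in range(n):
        t = 2 * math.pi * k / n
        if shape == "circle":
            pts.append((size * math.cos(t), size * math.sin(t)))
        elif shape == "ellipse":
            a, b = size, size / 2  # assumed 2:1 aspect for illustration
            pts.append((a * math.cos(t), b * math.sin(t)))
        else:
            raise ValueError("unsupported shape: " + shape)
    return pts

# A circular trajectory of radius 2 (e.g. 2 cm), sampled at 8 positions.
circle = lens_trajectory("circle", 2.0)
```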
In addition, the rotational speed and time of the surgical endoscope 221 can be determined as needed. For example, one end of the surgical endoscope 221 may rotate periodically, or may rotate in correspondence with an arbitrary manner of user operation. Here, "periodically" may include the case in which the surgical endoscope 221 performs circular motion at a constant speed. "Periodically" may also include the case in which a rotating state and a non-rotating state are cyclically repeated.
In addition, according to the embodiment shown in Figure 45, the surgical endoscope 221 has a bent shape, and when it rotates about the rotation axis A, one end of the surgical endoscope 221 can rotate to form a prescribed trajectory. In this case, the surgical endoscope 221 may include: a first shaft 222, extending so as to coincide with the rotation axis A; a second shaft 223, one end of which is coupled to the light incident portion and which extends in the direction of the rotation axis A while spaced apart from the first shaft 222; and a shaft connecting portion 224, extending non-parallel to the direction of the rotation axis A and connecting one end of the first shaft 222 with the other end of the second shaft 223.
In addition, the rotational trajectory of the light incident portion may have various shapes such as a circle, an ellipse, a triangle, a rectangle, another polygon, or a closed figure.
Here, rotation-related attributes of the surgical endoscope 221, such as the direction of rotation, degree of rotation, trajectory shape, trajectory size, and rotational speed, may be programmed in advance and stored in a storage portion (not shown).
The screen display control portion 2350 may refer to the rotation-related attributes stored in advance, extract the overlapping regions of the multiple images, and generate a continuous image from these regions. For example, when the view angle of the surgical endoscope 221 is 70 degrees, the rotational trajectory is circular, and the trajectory radius is 2 cm, the overlapping portions of the captured images are extracted, and the extracted overlapping images can become a continuous image that can be viewed continuously.
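Whether two shots taken along such a trajectory overlap can be checked geometrically: each shot covers a circular footprint of radius d·tan(view angle / 2) on the target, and two footprints overlap when their centres are closer than twice that radius. The 70-degree view angle and 2 cm trajectory radius come from the example above; the 5 cm working distance and 45-degree step between shots are assumed values for illustration.

```python
import math

def fov_footprint_radius(view_angle_deg, distance):
    """Radius of the circular area imaged on the target at the given distance."""
    return distance * math.tan(math.radians(view_angle_deg / 2))

def shots_overlap(traj_radius, step_deg, view_angle_deg, distance):
    """Two consecutive shots on a circular trajectory overlap when the chord
    between their centres is shorter than twice the footprint radius."""
    chord = 2 * traj_radius * math.sin(math.radians(step_deg) / 2)
    return chord < 2 * fov_footprint_radius(view_angle_deg, distance)

# 70-degree view angle, 2 cm trajectory radius; assumed 5 cm working
# distance and 45-degree steps between successive captures.
overlapping = shots_overlap(2.0, 45.0, 70.0, 5.0)
```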
The peripheral image generating portion 354 extracts the non-overlapping regions of the first endoscopic image and the second endoscopic image and generates peripheral images. A non-overlapping region may be a region in which the first endoscopic image and the second endoscopic image do not overlap each other. Moreover, this region may be a predetermined region. For example, in Figure 46, the image of the region other than the predetermined continuous image 232 may be set as a peripheral image 231 of the non-overlapping region.
Referring to Figure 46, the overlapping continuous image 232, which is captured continuously and viewed persistently, and the non-overlapping images 231 treated as peripheral images are shown. Each circle represents a different endoscopic image; to distinguish them from each other, they may be referred to as the first endoscopic image and the second endoscopic image. The continuous image 232 is an image that is continuously and persistently visible on the screen, while the peripheral image 231 is an image seen only in discontinuous captures. To distinguish them on the display, the continuous image 232 is shown clearly, and the peripheral image 231 is shown differently. That is, the continuous image 232 and the peripheral image 231 can be displayed with mutually different brightness, chroma, color, and so on.
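The differentiated display of the continuous and peripheral regions could be sketched as a per-pixel gain applied outside the continuous region. This is a minimal illustration assuming a grayscale image stored as a list of rows; the function name and the 0.5 gain are invented for this example, not taken from the patent.

```python
def mark_regions(image, is_continuous_mask, peripheral_gain=0.5):
    """Return a copy of a grayscale image in which pixels outside the
    continuous region are darkened, making the two regions visually
    distinct as described in the text."""
    out = []
    for row, mask_row in zip(image, is_continuous_mask):
        out.append([px if keep else int(px * peripheral_gain)
                    for px, keep in zip(row, mask_row)])
    return out

img = [[200, 200], [200, 200]]
mask = [[True, False], [False, True]]   # True = continuous region
shaded = mark_regions(img, mask)
```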
Referring to Figure 47, the continuous image 232 may be the image of a predetermined region. That is, as shown in Figure 46, if only the region where all the peripheral images overlap is set as the continuous image 232, the continuous image 232 that is mainly viewed may become small; therefore, the region where several peripheral images 231 overlap, for example two or three of them, may also be set as the continuous image 232. In this case, the continuous image 232, which contains relatively continuous information compared with the peripheral images 231, can have a captured region larger than the region where all the peripheral images 231 overlap.
Figure 43 is a flowchart showing the human body temperature type surgical image processing method according to the sixth embodiment of the present invention. Each of the steps described below may be performed with the screen display control portion 2350 as the performing subject.
In step S211, the image input portion 2310 receives the first endoscopic image and the second endoscopic image provided at mutually different points in time from the rotating surgical endoscope 221. Here, the screen display control portion 2350 tracks, as described above, the viewpoint information of the surgical endoscope 221, i.e., information about the viewpoint of the surgical endoscope 221, in correspondence with the movement and rotation of the one end of the surgical endoscope 221.
In step S212, the movement information of the endoscopic images is extracted using the viewpoint information of the surgical endoscope 221; this movement information corresponds to the positional variation of the subject captured in the endoscopic images.
In step S213, the screen positions at which the first endoscopic image and the second endoscopic image are to be output are set using the viewpoint information of the surgical endoscope 221 and/or the extracted movement information. That is, once the viewpoint information of the surgical endoscope 221 and the movement information of the endoscopic images are specified as described above, the screen positions at which the first endoscopic image and the second endoscopic image are output on the screen display portion 2320 are set using this movement information.
In step S214, the first endoscopic image and the second endoscopic image are output to mutually different regions of the screen display portion 2320 at the set positions.
Figure 48 is a schematic diagram showing the rotating action of the auxiliary endoscope according to the seventh embodiment of the present invention. Referring to Figure 48, a surgical endoscope 241, an auxiliary endoscope 242, and a coupling portion 243 are shown.
In the present embodiment, an auxiliary endoscope 242 that rotates around the surgical endoscope 241 is additionally provided to acquire multiple endoscopic images, so that a continuous image and peripheral images can be generated as described above. That is, the auxiliary endoscope 242 acquires endoscopic images while rotating around the surgical endoscope 241, so that a continuous image and peripheral images can be obtained from the overlapping and non-overlapping images.
The auxiliary endoscope 242 may be rotatably coupled to one side of the surgical endoscope 241, for example its lateral surface. The auxiliary endoscope 242 may also adopt a conventional endoscope structure that acquires images by receiving light through a lens. Here, the image acquired by the surgical endoscope 241 may be referred to as the first endoscopic image, and the image acquired by the auxiliary endoscope 242 as the second endoscopic image.
In addition, the auxiliary endoscope 242 may be detachably coupled to the surgical endoscope 241 about the central axis A, or may be formed integrally with the surgical endoscope 241. The former case has the advantage that the auxiliary endoscope 242 can be coupled to the surgical endoscope 241 outside the patient's body, or can be inserted into the patient's body independently of the surgical endoscope 241 and then coupled to the surgical endoscope 241 inside the body, thereby acquiring the first endoscopic image and the second endoscopic image.
In addition, according to another embodiment, the first endoscopic image of the surgical endoscope 241 may be set as the continuous image, and the second endoscopic image of the auxiliary endoscope 242 as the peripheral image. That is, this arrangement has the advantage that the continuous image and the peripheral images are generated without extracting the overlapping region, so that the image processing time can be shortened.
Figure 49 is a conceptual diagram showing the master interface of the surgical robot according to the eighth embodiment of the present invention. Referring to Figure 49, the master interface 4 may include a monitor portion 6, handles 10, and a spatial movement drive portion 25. The description below focuses on the differences from the foregoing.
The present embodiment is characterized in that, in order for the monitor portion 6 to rotate and move freely in space, the monitor portion 6 is coupled to the spatial movement drive portion 25, which can move freely in space. Here, the monitor portion 6 may be the screen display portion 2320 described above.
One end of the spatial movement drive portion 25 is coupled to the monitor portion 6 and the other end is coupled to the body of the master interface 4; by applying a driving force to the monitor portion 6, it rotates and moves the monitor portion 6 in space. Here, the rotation may include rotation about different axes (X, Y, Z), i.e., the pitch, roll, and yaw axes, corresponding to the plural degrees of articulation of the spatial movement drive portion 25.
The spatial movement drive portion 25 can be realized in the form of a robot arm, and is characterized in that it rotates and moves the monitor portion 6 of the master interface 4 either under operation of the handles 10 or in correspondence with the continually changing viewpoint of the surgical endoscope described above, whereby the user experiences the reality of the operation more vividly.
In addition, another embodiment of the present invention may further include a rotation operating portion (not shown) that rotates the surgical endoscope 221 and/or the auxiliary endoscope 242. Through the rotation operating portion, the user can determine rotation-related information about the endoscope, such as the direction of rotation, angular velocity of rotation, acceleration/deceleration profile, rotational speed, rotation start time, rotation end time, rotation duration, and radius of rotation.
Here, the direction of rotation is the direction in which the endoscope rotates, for example clockwise or counterclockwise, and the acceleration/deceleration profile indicates the manner in which the endoscope rotates, in various forms such as linear, S-curve, or exponential. The angular velocity of rotation and the rotational speed are the speed at which one end of the surgical endoscope 221 or the auxiliary endoscope 242 rotates; the rotation start time is time information at which rotation begins, and the rotation end time is time information at which rotation ends. In addition, the radius of rotation is the distance between the rotation axis and one end of the surgical endoscope 221 during conical rotation, the length of the shaft connecting portion 224 in the case of a bent endoscope, or the separation distance between the auxiliary endoscope 242 and the surgical endoscope 241.
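The rotation-related information listed above maps naturally onto a simple settings record. The sketch below is only an illustration of how such settings could be grouped; the field names, units, and example values are assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class RotationInfo:
    """Rotation-related settings a user could enter through the
    rotation operating portion (field names are illustrative)."""
    direction: str            # "cw" (clockwise) or "ccw" (counterclockwise)
    angular_velocity: float   # rad/s, speed of the rotating end
    accel_profile: str        # e.g. "linear", "s-curve", "exponential"
    start_time: float         # s, rotation start time
    end_time: float           # s, rotation end time
    radius: float             # cm, radius of the rotational trajectory

    @property
    def duration(self):
        """Rotation duration derived from the start and end times."""
        return self.end_time - self.start_time

info = RotationInfo("cw", 1.0, "s-curve", 0.0, 12.5, 2.0)
```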
The rotation operating portion may include an interface operable by the user; for example, the interface may be realized in various forms used to operate the robot arm and/or other surgical equipment, such as a joystick type, a button type, a keyboard, a trackball, or a touch screen. When the user sets the rotation-related information through this interface and inputs it, the surgical endoscope 221 rotates in a conical shape or about the first shaft 222 so that its one end rotates, and the auxiliary endoscope 242 can also rotate about the axis of the surgical endoscope 241.
In addition, detailed descriptions of concrete matters of the human body temperature type surgical image processing apparatus according to the embodiments of the present invention, such as common platform technologies including embedded systems and operating systems, interface standardization technologies such as communication protocols and I/O interfaces, and component standardization technologies such as actuators, batteries, cameras, and sensors, would be obvious to those skilled in the art and are therefore omitted.
The human body temperature type surgical image processing method according to the present invention can be realized in the form of program instructions executable by various computer means and recorded on a computer-readable medium. That is, the recording medium may be a computer-readable recording medium on which a program for executing each of the above steps on a computer is recorded.
The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be specially designed and constructed for the present invention, or may be known to those skilled in computer software. Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM and DVD; magneto-optical media such as floptical disks; and hardware devices specially constructed to store and execute program instructions, such as read-only memory (ROM), random access memory (RAM), and flash memory.
The medium may also be a transmission medium, such as light, metal wire, or a waveguide, including a carrier wave conveying signals that designate program instructions, data structures, and the like. Examples of program instructions include not only machine language code generated by a compiler, but also high-level language code executable by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules in order to perform the operations of the present invention.
The foregoing has been described with reference to the preferred embodiments of the present invention, but those skilled in the art will understand that the present invention can be variously modified and changed without departing from the spirit and scope of the invention recited in the claims.
Claims (3)
1. A human body temperature type surgical image processing apparatus, comprising:
a first image input portion which receives a first endoscopic image from a surgical endoscope;
a second image input portion which receives a plurality of second endoscopic images provided at mutually different points in time from an auxiliary endoscope, the auxiliary endoscope being coupled to a side of the surgical endoscope and rotating about the surgical endoscope;
a screen display portion which outputs the first endoscopic image and the second endoscopic images to mutually different regions; and
a screen display control portion which controls the screen display portion so that the first endoscopic image and the second endoscopic images are output to mutually different regions in correspondence with the mutually different viewpoints of the surgical endoscope and the auxiliary endoscope.
2. The human body temperature type surgical image processing apparatus of claim 1, wherein the surgical endoscope rotates periodically.
3. The human body temperature type surgical image processing apparatus of claim 1, further comprising:
a rotation operating portion for setting at least one piece of rotation-related information among a direction of rotation, an angular velocity of rotation, an acceleration/deceleration profile, a rotational speed, a rotation start time, a rotation end time, a rotation duration, and a radius of rotation related to the rotation of one end of the surgical endoscope.
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020100108156A KR20110049703A (en) | 2009-11-04 | 2010-11-02 | Surgical robot system and laparoscope handling method thereof |
KR10-2010-0108156 | 2010-11-02 | ||
KR10-2010-0117546 | 2010-11-24 | ||
KR1020100117546A KR20110114421A (en) | 2010-04-13 | 2010-11-24 | Apparatus and method for processing surgical image based on motion |
CN201180052600.7A CN103188987B (en) | 2010-11-02 | 2011-10-28 | Surgical robot system and laparoscopic procedure method thereof and human body temperature type operation image processing apparatus and method thereof |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201180052600.7A Division CN103188987B (en) | 2010-11-02 | 2011-10-28 | Surgical robot system and laparoscopic procedure method thereof and human body temperature type operation image processing apparatus and method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105078580A CN105078580A (en) | 2015-11-25 |
CN105078580B true CN105078580B (en) | 2017-09-12 |
Family
ID=46025237
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510446194.2A Active CN105078580B (en) | 2010-11-02 | 2011-10-28 | Surgical robot system and its laparoscopic procedure method and human body temperature type operation image processing apparatus and its method |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN105078580B (en) |
WO (1) | WO2012060586A2 (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102914284B (en) * | 2012-10-19 | 2015-07-08 | 中铁隧道集团有限公司 | Real-time measurement system for work position of operation arm and measurement method thereof |
CN106456145B (en) | 2014-05-05 | 2020-08-18 | 维卡瑞斯外科手术股份有限公司 | Virtual reality surgical device |
JPWO2017199926A1 (en) * | 2016-05-17 | 2019-10-10 | カイロス株式会社 | Endoscope device |
US10799308B2 (en) | 2017-02-09 | 2020-10-13 | Vicarious Surgical Inc. | Virtual reality surgical tools system |
WO2018189742A1 (en) * | 2017-04-13 | 2018-10-18 | V.T.M. (Virtual Tape Measure) Technologies Ltd. | Endoscopic measurement methods and tools |
JP7387588B2 (en) | 2017-09-14 | 2023-11-28 | ヴィカリアス・サージカル・インコーポレイテッド | Virtual reality surgical camera system |
CN110393499B (en) * | 2018-08-31 | 2021-12-07 | 上海微创医疗机器人(集团)股份有限公司 | Electronic endoscope and electronic endoscope system |
WO2020086345A1 (en) | 2018-10-22 | 2020-04-30 | Intuitive Surgical Operations, Inc. | Systems and methods for master/tool registration and control for intuitive motion |
CN115607285B (en) * | 2022-12-20 | 2023-02-24 | 长春理工大学 | Single-port laparoscope positioning device and method |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101176653A (en) * | 2006-11-09 | 2008-05-14 | 奥林巴斯医疗株式会社 | Image display apparatus |
CN101257838A (en) * | 2005-09-09 | 2008-09-03 | 奥林巴斯医疗株式会社 | Image display apparatus |
CN100435712C (en) * | 2003-11-18 | 2008-11-26 | 奥林巴斯株式会社 | Capsule-type medical system |
CN101516252A (en) * | 2006-09-21 | 2009-08-26 | 奥林巴斯医疗株式会社 | Endoscope system |
WO2010110560A2 (en) * | 2009-03-24 | 2010-09-30 | 주식회사 래보 | Surgical robot system using augmented reality, and method for controlling same |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5417210A (en) * | 1992-05-27 | 1995-05-23 | International Business Machines Corporation | System and method for augmentation of endoscopic surgery |
US5657429A (en) * | 1992-08-10 | 1997-08-12 | Computer Motion, Inc. | Automated endoscope system optimal positioning |
DE10025285A1 (en) * | 2000-05-22 | 2001-12-06 | Siemens Ag | Fully automatic, robot-assisted camera guidance using position sensors for laparoscopic interventions |
KR100962472B1 (en) * | 2009-08-28 | 2010-06-14 | 주식회사 래보 | Surgical robot system and control method thereof |
2011
- 2011-10-28 WO PCT/KR2011/008152 patent/WO2012060586A2/en active Application Filing
- 2011-10-28 CN CN201510446194.2A patent/CN105078580B/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100435712C (en) * | 2003-11-18 | 2008-11-26 | 奥林巴斯株式会社 | Capsule-type medical system |
CN101257838A (en) * | 2005-09-09 | 2008-09-03 | 奥林巴斯医疗株式会社 | Image display apparatus |
CN101516252A (en) * | 2006-09-21 | 2009-08-26 | 奥林巴斯医疗株式会社 | Endoscope system |
CN101176653A (en) * | 2006-11-09 | 2008-05-14 | 奥林巴斯医疗株式会社 | Image display apparatus |
WO2010110560A2 (en) * | 2009-03-24 | 2010-09-30 | 주식회사 래보 | Surgical robot system using augmented reality, and method for controlling same |
Also Published As
Publication number | Publication date |
---|---|
CN105078580A (en) | 2015-11-25 |
WO2012060586A3 (en) | 2012-09-07 |
WO2012060586A2 (en) | 2012-05-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105078580B (en) | Surgical robot system and its laparoscopic procedure method and human body temperature type operation image processing apparatus and its method | |
US11547520B2 (en) | Systems, methods, and computer-readable storage media for controlling aspects of a robotic surgical device and viewer adaptive stereoscopic display | |
CN103188987B (en) | Surgical robot system and laparoscopic procedure method thereof and human body temperature type operation image processing apparatus and method thereof | |
US20220331013A1 (en) | Mixing directly visualized with rendered elements to display blended elements and actions happening on-screen and off-screen | |
JP2022017422A (en) | Augmented reality surgical navigation | |
Sielhorst et al. | Advanced medical displays: A literature review of augmented reality | |
US20220387128A1 (en) | Surgical virtual reality user interface | |
KR20080089376A (en) | Medical robotic system providing three-dimensional telestration | |
JP2020156800A (en) | Medical arm system, control device and control method | |
US11871904B2 (en) | Steerable endoscope system with augmented view | |
US20220215539A1 (en) | Composite medical imaging systems and methods | |
US20230186574A1 (en) | Systems and methods for region-based presentation of augmented content | |
Zinchenko et al. | Virtual reality control of a robotic camera holder for minimally invasive surgery | |
US20220096164A1 (en) | Systems and methods for facilitating optimization of an imaging device viewpoint during an operating session of a computer-assisted operation system | |
US10854005B2 (en) | Visualization of ultrasound images in physical space | |
Zhang et al. | From AR to AI: augmentation technology for intelligent surgery and medical treatments | |
KR101713836B1 (en) | Apparatus and Method for processing surgical image based on motion | |
KR101114237B1 (en) | Apparatus and method for processing surgical image based on motion | |
KR20110114421A (en) | Apparatus and method for processing surgical image based on motion | |
Qian | Augmented Reality Assistance for Surgical Interventions Using Optical See-through Head-mounted Displays | |
JP2024514640A (en) | Blending visualized directly on the rendered element showing blended elements and actions occurring on-screen and off-screen | |
CA3221339A1 (en) | Systems, methods, and media for presenting biophysical simulations in an interactive mixed reality environment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||