CN220175076U - Handheld robot-assisted endoscope - Google Patents

Handheld robot-assisted endoscope

Info

Publication number
CN220175076U
CN220175076U (application CN202223391748.8U)
Authority
CN
China
Prior art keywords
endoscope
reusable
image
disposable
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202223391748.8U
Other languages
Chinese (zh)
Inventor
欧阳小龙
欧阳詹姆士
欧阳戴安娜
王士平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meguiar Vision Co
Original Assignee
Meguiar Vision Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 17/941,884, published as US20230117151A1
Application filed by Meguiar Vision Co filed Critical Meguiar Vision Co
Application granted granted Critical
Publication of CN220175076U
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The present utility model relates to a hand-held, robot-assisted endoscope configured to derive the position and/or orientation of the front end portion of the disposable portion of the endoscope in coordination with the position of the handle of the reusable portion of the endoscope, to display, during a medical procedure, side-by-side images of an object (such as an organ of a patient being diagnosed or treated with the endoscope) and of the front end portion of the endoscope, and to provide guidance to the system user by displaying images such as a prior image of the object, a standardized image of the object, or a coaching image related to the medical procedure.

Description

Handheld robot-assisted endoscope
RELATED APPLICATIONS
This patent application claims the benefit of the following provisional applications, which are incorporated by reference:
U.S. provisional application Ser. No. 63/256,634, filed October 18, 2021;
U.S. provisional application Ser. No. 63/282,108, filed November 22, 2021;
U.S. provisional application Ser. No. 63/283,367, filed November 26, 2021;
U.S. provisional application Ser. No. 63/332,233, filed April 18, 2022.
The utility model is a continuation of each of the following patent applications and of the applications they directly or indirectly cite, and claims the benefit of their filing dates, including U.S. provisional and non-provisional applications:
U.S. non-provisional application Ser. No. 16/363,209, filed March 25, 2019;
U.S. non-provisional application Ser. No. 17/362,043, filed June 29, 2021;
U.S. non-provisional application Ser. No. 17/473,587, filed September 13, 2021;
U.S. non-provisional application Ser. No. 17/745,526, filed May 16, 2022;
U.S. non-provisional application Ser. No. 17/521,397, filed November 8, 2021; and
U.S. non-provisional application Ser. No. 17/720,143, filed April 13, 2022.
This patent application is also related to the following international, non-provisional, and provisional applications, which are incorporated by reference:
international patent application PCT/US17/53171, filed September 25, 2017;
U.S. patent No. 8,702,594, issued April 22, 2014;
U.S. patent application Ser. No. 16/363,209, filed March 25, 2019;
international patent application PCT/US19/36060, filed June 7, 2019;
U.S. patent application Ser. No. 16/972,989, filed December 7, 2020;
U.S. provisional application Ser. No. 62/816,366, filed March 11, 2019;
U.S. provisional application Ser. No. 62/671,445, filed May 15, 2018;
U.S. provisional application Ser. No. 62/654,295, filed April 6, 2018;
U.S. provisional application Ser. No. 62/647,817, filed March 25, 2018;
U.S. provisional application Ser. No. 62/558,818, filed September 14, 2017;
U.S. provisional application Ser. No. 62/550,581, filed August 26, 2017;
U.S. provisional application Ser. No. 62/550,560, filed August 25, 2017;
U.S. provisional application Ser. No. 62/550,188, filed August 25, 2017;
U.S. provisional application Ser. No. 62/502,670, filed May 6, 2017;
U.S. provisional application Ser. No. 62/485,641, filed April 14, 2017;
U.S. provisional application Ser. No. 62/485,454, filed April 14, 2017;
U.S. provisional application Ser. No. 62/429,368, filed in December 2016;
U.S. provisional application Ser. No. 62/428,018, filed November 30, 2016;
U.S. provisional application Ser. No. 62/424,381, filed November 18, 2016;
U.S. provisional application Ser. No. 62/423,213, filed November 17, 2016;
U.S. provisional application Ser. No. 62/405,915, filed October 8, 2016;
U.S. provisional application Ser. No. 62/399,712, filed September 26, 2016;
U.S. provisional application Ser. No. 62/399,436, filed September 25, 2016;
U.S. provisional application Ser. No. 62/399,429, filed September 25, 2016;
U.S. provisional application Ser. No. 62/287,901, filed January 28, 2016;
U.S. provisional application Ser. No. 62/279,784, filed January 17, 2016;
U.S. provisional application Ser. No. 62/275,241, filed January 6, 2016;
U.S. provisional application Ser. No. 62/275,222, filed January 5, 2016;
U.S. provisional application Ser. No. 62/259,991, filed November 25, 2015;
U.S. provisional application Ser. No. 62/254,718, filed November 13, 2015;
U.S. provisional application Ser. No. 62/139,754, filed March 29, 2015;
U.S. provisional application Ser. No. 62/120,316, filed February 24, 2015; and
U.S. provisional application Ser. No. 62/119,521, filed February 23, 2015.
Technical Field
The present utility model relates generally to endoscopic instruments and methods. Some embodiments relate to endoscopic instruments that include a disposable portion releasably connected to a reusable portion.
Background
Endoscopes have long been used to view and treat internal tissues of the human body. In conventional rigid and flexible endoscopes, the lens or fiber-optic system is relatively expensive and is reused many times. Each use must therefore be followed by rigorous sterilization and disinfection, which not only requires trained specialized personnel and dedicated equipment but also wears the endoscope down over repeated uses. In recent years, disposable endoscopes have been developed and improved; they generally include a disposable portion, comprising a disposable cannula with a camera at its forward end, releasably connected to a reusable portion that includes image processing electronics and a display. Disposable or single-use endoscopes reduce the risk of cross-contamination and hospital-acquired infections, and are cost-effective. Such endoscopes find application in medical procedures such as imaging and treating the urinary system of men and women and the reproductive system of women, as well as other internal organs and tissues. Examples of disposable endoscopes are discussed in U.S. patent Nos. 10,292,571, 10,874,287, 11,013,396, 11,071,442, 11,330,973, and 11,350,816.
Robots and robot-assisted surgery have attracted considerable attention in industry and academia. Existing systems, however, tend to be large, specialized installations that require dedicated operating rooms, are cumbersome to set up, and have limited flexibility.
The present description relates to a different type of system: small, hand-held, and modular, with digital integration and artificial intelligence. Such a system can be used for robot-assisted surgery, does not require a specialized operating room, can be used in the doctor's office, and is significantly improved over endoscope systems without robotic assistance. This specification is directed to an endoscope system that can be used effectively whether or not one or more of its robotic assists are activated.
The subject matter described or claimed in this patent specification is not limited to embodiments that solve any particular disadvantage or that operate only in environments such as those described above. Rather, the foregoing background is provided only to illustrate an exemplary technology area in which some embodiments described herein may be practiced.
Disclosure of Invention
As described in the claims as originally presented, which may be amended during prosecution of this patent application, in some embodiments a compact, robot-assisted endoscope system includes: an endoscope comprising a disposable portion, which includes a cannula with a camera at a forward end thereof, releasably connected to a reusable portion to form the endoscope; and a first sensor device mounted to at least one of the reusable portion and the disposable portion and configured to obtain a measurement of the relative position of a selected portion of the disposable portion with respect to the reusable portion. The first sensor device is configured to track movement of the disposable portion relative to the reusable portion or another coordinate system in one or more of the following ways:
- laser marking using time of flight;
- ultrasonic positioning using time of flight;
- imaging at least one of the disposable portion and the reusable portion with a VR headset having a camera array;
- radio-frequency tracking of selected portions of the disposable portion;
- driving the cannula through selected multi-degree-of-freedom motions with stepper motors and tracking the motions through the stepper motors' operating parameters;
- tracking movement of the disposable portion with a forward imaging system mounted on the reusable portion;
- tracking reflective labels disposed on the disposable portion with the forward camera system;
- tracking LEDs disposed on the disposable portion with the forward camera system.
The system further includes a processor that receives the tracking-related output of the first sensor device and is configured to derive therefrom camera position coordinates of the disposable portion relative to the reusable portion or relative to a selected portion of another coordinate system, and a display configured to display an image of an object being diagnosed or treated with the endoscope juxtaposed with an image of the front end portion of the disposable portion.
In some embodiments, the system may further include one or more of the following features: (a) a second sensor device configured to measure a handle position indicative of at least one of a position and an orientation of the reusable portion relative to a selected coordinate system; (b) at least a portion of the second sensor device is disposed in a VR headset and is configured to measure the handle position relative to the VR headset; (c) at least a portion of the second sensor device is mounted at a selected position unaffected by movement of the endoscope and is configured to measure the handle position relative to that selected position; (d) the system further includes a source of instructional images related to a medical procedure on the object or a similar object, including a prior image of the object, a standard image related to the object, and/or coaching information related to the medical procedure.
In some embodiments, a compact, handheld, robot-assisted endoscope system includes: an endoscope comprising a disposable portion, which includes a cannula with a camera at a forward end thereof, releasably connected to a reusable portion to form the endoscope; a manual control device at the reusable portion, configured to be operated by a user grasping the reusable portion and to control rotation and translation of at least a portion of the disposable portion relative to the reusable portion as well as the angle of the front end portion of the disposable portion relative to its long axis; a display coupled to the camera and configured to display a current image of an object captured with the camera while also displaying additional images, including one or more of a prior image of the object, an image of a similar object, and an image for guiding a medical procedure on the object; a motorized control configured to electronically control one or more movements of at least part of the disposable portion relative to the reusable portion; and a processor configured to provide the additional images to the display and optionally to drive the motorized control.
The system of the preceding paragraph may further include one or more of the following features: (a) a first tracking device configured to automatically provide an estimate of at least one of a changing position and a changing orientation of a portion of the disposable portion relative to the reusable portion, with the processor configured to use the estimate in displaying on the display a current image of that portion of the disposable portion relative to the object; (b) the first tracking device includes a radio-frequency transmitter at the front end of the cannula and a radio-frequency receiver on the reusable portion; and (c) the first tracking device includes the processor deriving the estimate based at least in part on signals related to the motorized control driving the disposable portion relative to the reusable portion.
In some embodiments, a compact, handheld, robot-assisted endoscope system includes: an endoscope comprising a disposable portion, which includes a cannula with a camera at a forward end thereof, releasably connected to a reusable portion to form the endoscope; a manual control device at the reusable portion, configured to be operated by a user grasping the reusable portion and to control rotation and translation of at least a portion of the disposable portion relative to the reusable portion as well as angular change of the front end portion of the disposable portion relative to its long axis; a display coupled to the camera and configured to display a current image of an object captured with the camera while also displaying additional images, including one or more of a prior image of the object, an image of a similar object, and an image for guiding a medical procedure on the object; a motorized control configured to electronically control one or more movements of at least part of the disposable portion relative to the reusable portion; and a processor configured to provide the additional images to the display and selectively drive the motorized control.
In some embodiments, the system of the immediately preceding paragraph may further include a scanning mode of operation, wherein the control is configured, in response to a push, to automatically scan a predetermined interior region of the object by rotating the front end portion of the disposable portion through a predetermined angle about the long axis of the cannula while the front end portion is angled relative to that axis.
Drawings
To further clarify the above and other advantages and features of the present patent specification, more particular embodiments are illustrated in the drawings. These drawings should be understood as depicting only exemplary embodiments and therefore should not be taken as limiting the scope of the patent specification or the appended claims. The subject matter of the utility model is described and explained with specificity and detail through the use of the accompanying drawings, in which:
fig. 1 is a perspective view of a miniature robotic system in some embodiments.
Fig. 2 illustrates the definition of a coordinate system of a miniature robotic system and an object such as a patient's organ or tissue in some embodiments.
Fig. 3 illustrates Artificial Intelligence (AI) and robotic assisted surgery, in some embodiments using a small robotic system, involving fusion of real-time information from internal and external sensors, and using an Artificial Intelligence (AI) engine and Virtual Reality (VR) headset.
FIG. 4 illustrates Artificial Intelligence (AI) and robotic-assisted surgery, in some embodiments, using a small robotic system, involving fusion of real-time information from internal and external sensors and the use of an Artificial Intelligence (AI) engine.
Fig. 5 illustrates Artificial Intelligence (AI) and robotic-assisted surgery, in some embodiments, involving the use of robotic devices and systems for Artificial Intelligence (AI) -assisted diagnosis and treatment.
FIG. 6 illustrates Artificial Intelligence (AI) and robotic assisted surgery, in some embodiments involving determination of handle position and patient position parameters via laser marking and time-of-flight techniques.
Fig. 7 illustrates AI and robotic-assisted surgery involving determination of handle position and patient position parameters by ultrasound techniques in some embodiments.
Fig. 8 illustrates Artificial Intelligence (AI) and robotic assisted surgery, in some embodiments involving determining handle position and patient position parameters using a camera array of a VR headset.
Fig. 9 illustrates AI and robotic-assisted surgery involving determination of handle position and patient position parameters by radio frequency tracking, in some embodiments.
FIG. 10 illustrates Artificial Intelligence (AI) and robot-assisted surgery, in some embodiments involving stepper-motor operation in determining handle position and patient position parameters.
FIG. 11 illustrates Artificial Intelligence (AI) and robotic assisted surgery, in some embodiments involving the use of a forward facing camera on an integral display to determine handle position and patient position parameters.
FIG. 12 illustrates Artificial Intelligence (AI) and robotic-assisted surgery, in some embodiments involving the use of infrared tracking to determine handle position and patient position parameters.
Fig. 13 illustrates Artificial Intelligence (AI) and robotic assisted surgery, in some embodiments involving the use of a forward facing camera and infrared light to illuminate a cannula with reflective labels to determine handle position and patient position parameters.
Fig. 14 illustrates Artificial Intelligence (AI) and robotic assisted surgery, in some embodiments involving the use of a forward facing camera to determine patient position from a handle position.
Figure 15 illustrates an example of a forward facing camera view associated with determining a patient position from a handle position in some embodiments.
Figure 16 illustrates an example of a forward facing camera view associated with determining a patient position from a handle position in some embodiments.
Figure 17 illustrates yet another example of a forward facing camera view associated with determining a patient position from a handle position in some embodiments.
Fig. 18 illustrates yet another example of a forward facing camera view associated with determining a patient position from a handle position in some embodiments.
Figure 19 illustrates yet another example of a forward facing camera view associated with determining a patient position from a handle position in some embodiments.
Figure 20 illustrates yet another example of a forward facing camera view associated with determining a patient position from a handle position in some embodiments.
Fig. 21 is a perspective view of a floor standing micro-robotic system and an object being inspected or processed in some embodiments.
Fig. 22 is a perspective view of a miniaturized robotic system and an object being inspected or processed in some embodiments.
Fig. 23 is a perspective view of a floor-standing or wall-mounted miniature robotic system and an object being inspected or treated in some embodiments.
Fig. 24 is a schematic view of a small robotic system operating in a scanning mode that automatically acquires up to a 360-degree scan of the interior of an object.
Detailed Description
A detailed description of the preferred embodiments is provided below. While several embodiments are described, it should be understood that the novel subject matter described in this patent specification is not limited to any one embodiment or combination of embodiments described herein, but includes many alternatives, modifications, and equivalents. Furthermore, although numerous specific details are set forth in the following description in order to provide a thorough understanding, some embodiments may be practiced without some or all of these details. Moreover, for the sake of clarity, certain technical material that is known in the prior art has not been described in detail to avoid unnecessarily obscuring the novel subject matter described herein. It should be clear that each feature of one or several of the specific embodiments described herein may be used in combination with features of other described embodiments or other features. Further, like reference numbers and designations in the various drawings indicate like elements.
This patent specification describes endoscope systems whose functionality is enhanced or augmented to varying degrees by various forms of robotic and Artificial Intelligence (AI) assistance, in different but interrelated implementations. The clinician may still have direct manual control of the endoscope and related equipment, but some actions or operations are robotically controlled and motor-assisted. The new systems described in this patent specification enhance the performance of the human operator by combining the skill and judgment of the person with the precision and artificial intelligence of robotic assistance. The systems described here require far less capital equipment than known comprehensive robotic surgical equipment, require relatively little setup and no special rooms, and optimally combine clinician skill with a degree of robotic assistance to achieve efficient and effective results.
The functions of the new system include:
- 3D or stereoscopic vision using multiple cameras with different perspectives
- Precise control of the catheter cannula through feedback from the vision camera
- Motor-driven or manual three-dimensional movement: articulation (angulation), translation, and rotation of components
- An ergonomic hand-held instrument arrangement, with the user's hand in a natural forward position and the hand and instrument in the natural field of view
- Magnification of the image, for example 5 times, facilitating more accurate and smoother positioning of the instrument or component
- Multiframe and multispectral imaging to aid in distinguishing tissue structure and properties
- Real-time recognition, analysis, and guidance using data from prior images and procedures, to assist in planning and executing procedures and to increase flexibility
- A small, portable, hand-held configuration allowing procedures outside a dedicated operating room
- A modular design enabling multiple configurations and the use of several small robot-assisted endoscopes, combining different capabilities or uses in one procedure for more complex surgery or other visualization or treatment
As described in more detail below, an important aspect of the new endoscope system is that the user, such as a surgeon, urologist, or gynecologist, is in contact with or in close proximity to the patient and typically holds the endoscope during the procedure. In contrast, in known full-size robotic surgical systems the user typically does not hold the instruments that enter the patient, but instead operates from a console or microscope located at a distance from the patient.
Fig. 1 illustrates a compact, hand-held, robot-assisted endoscope system in some embodiments. Endoscope 100 includes a disposable portion 102, comprising a cannula 107 with a camera and light source module 103 at its forward end, and a reusable portion 104, comprising a handle 106 and a display screen 108 that typically displays images acquired with the camera and/or other information, such as patient and procedure identification and other images. The module 103 may include two or more image sensors that can act as independent cameras providing stereoscopic or three-dimensional views. As indicated by the arrows, the cannula 107 is configured to rotate and translate relative to the reusable portion 104, and the front end portion 105 of the cannula 107 is configured to bend relative to the long axis of the cannula 107. Handle 106 typically includes buttons, levers, and/or a touch pad 110 or the like, through which a user may control the angulation, rotation, and/or translation of the front end and/or other portions of the disposable portion, for example with the thumb of the hand holding handle 106. The front end portion 105 of the single-use portion 102 may be articulated to the position shown in the figure, in addition to being straight along the long axis of the cannula 107. The illustrated robot-assisted endoscope enhances the performance of a human operator by combining human skills with the precision and artificial intelligence of robotic equipment, as described in more detail below.
Endoscope 100 may be as shown in fig. 1, or may be any of the endoscopes shown and described in the patents and applications incorporated herein by reference, or may combine features thereof, or may be a display-less endoscope as shown in fig. 2, or a similar variation. The display screen 108 may have one or more front-facing or forward-facing cameras whose field of view includes the front end 105 of the disposable portion 102, as discussed in more detail below. The module 103 at the front end of the cannula 107 may include one or more cameras that can selectively image different ranges of light wavelengths, and a light source, such as an LED, in the module 103 can selectively emit light in the desired wavelength ranges. Endoscope 100 may include permanently mounted surgical devices such as graspers, injection needles, and the like; may include a working channel through which a surgical device may be inserted to reach object 301; and may include a fluid channel through which fluid may be introduced into or withdrawn from object 301, as described in the patents and applications incorporated herein by reference.
Fig. 2 shows the definition of the position and orientation of a component, such as the endoscope of fig. 1, and of an object, such as a patient's internal organ or tissue, relative to a coordinate system. As shown in fig. 2, the position of the object 301 may be defined in orthogonal coordinates and its orientation in polar coordinates, providing six degrees of freedom. The term patient position in this patent specification refers to the position and/or orientation of the object at a particular time. The single-use portion 102 typically has one or more cameras at its front end, whose position and/or orientation, defined in a respective coordinate system at a given time, is referred to as the camera position. The position and/or orientation of the reusable portion 104 or handle 106, defined in a respective coordinate system at a given time, is referred to as the handle position.
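As an illustrative sketch only, not part of the patent disclosure, the bookkeeping behind these six-degree-of-freedom poses can be expressed in a few lines of Python; the `Pose` class, frame names, and numeric values below are hypothetical:

```python
import numpy as np

def rotation_matrix(yaw: float, pitch: float, roll: float) -> np.ndarray:
    """3x3 rotation from Z-Y-X Euler angles in radians."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx

class Pose:
    """A six-degree-of-freedom pose: three position coordinates plus
    three orientation angles, as in the fig. 2 definitions."""
    def __init__(self, xyz=(0.0, 0.0, 0.0), ypr=(0.0, 0.0, 0.0)):
        self.R = rotation_matrix(*ypr)         # orientation
        self.t = np.asarray(xyz, dtype=float)  # position

    def compose(self, other: "Pose") -> "Pose":
        """Express `other` (given relative to this pose) in this
        pose's parent frame."""
        out = Pose()
        out.R = self.R @ other.R
        out.t = self.R @ other.t + self.t
        return out

# Handle position in the room frame, and camera position relative to
# the handle (values in meters/radians are purely illustrative):
handle_pose = Pose(xyz=(0.40, 0.10, 1.20), ypr=(0.2, -0.1, 0.0))
camera_rel = Pose(xyz=(0.00, 0.00, 0.25), ypr=(0.0, 0.5, 0.0))
camera_pose = handle_pose.compose(camera_rel)  # camera position, room frame
```

Composing the handle position with the camera-relative pose in this way is the generic chain by which any of the tracking techniques below can report the camera position in a fixed frame.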
Fig. 3 illustrates an embodiment of imaging and/or treating an object 301 during a medical procedure with an endoscope such as that of fig. 1, but without the display screen 108. The object 301 may be the patient's knee joint, as shown, or another organ or tissue, such as the patient's bladder, uterus, or spine. In this embodiment, the procedure utilizes real-time information from internal and external sensors of endoscope 100, an Artificial Intelligence (AI)-capable processor 302, a cloud computing source 304, and a VR headset 306, such as an adaptation of a commercially available model, e.g., the Oculus Quest 2, HTC Vive Pro 2, HTC Vive Cosmos Elite, or HP Reverb G2. The handle position information may be provided in real time or near real time by VR headset 306 using techniques such as laser marking, ultrasound imaging or detection, and camera tracking. The camera position information may be obtained in real time or near real time by radio-frequency tracking of the disposable portion 102 (including its front end portion 105), or from the handle position information combined with the known kinematic relationship between the disposable and reusable portions and the commanded articulation, rotation, and translation, or by a combination of the two techniques. The illustrated system is configured to provide camera position and handle position information to the Artificial Intelligence (AI)-capable processor 302, which can communicate with the VR headset 306 and the cloud computing facility 304; the latter can provide information such as a database from previous procedures and guidance for the current procedure. The processor 302 communicates with the VR headset 306, typically wirelessly, as in current commercially available electronic gaming systems.
FIG. 4 illustrates Artificial Intelligence (AI)-assisted imaging and/or surgery, in some embodiments involving fusion of real-time information from internal and external sensors with an Artificial Intelligence (AI) engine infrastructure. The endoscope 100, with its display screen 108, can view and/or manipulate the object 301 as in fig. 1. In addition, a larger display screen 402 may be driven, preferably wirelessly, by the processor 302 to display information such as images of the object 301 and of the front end portion 105 of the disposable portion or the camera 103, their relative positions and orientations, and/or other information. The user 404 may view the display 108 and/or the display 402 as desired during the medical procedure. As described in connection with fig. 2, endoscope 100 provides real-time camera position and handle position information to processor 302, preferably wirelessly. In this embodiment, the processor 302 provides processed information to the display 402 for displaying images, such as images showing the front end portion 105 of the single-use portion 102, the camera 103, the object 301, their relative positions, and/or other information.
Fig. 5 shows a robotic device and system for diagnosis and treatment with the aid of Artificial Intelligence (AI). The endoscope 100, or another imaging-modality probe, provides an image of the object 301 captured with the endoscope's front end camera. An input/output (I/O) device 504 assembles the position and/or orientation information described above (the camera position and handle position) and provides it to the Artificial Intelligence (AI) engine and system processor 302. An input/output device 506 assembles a real-time image or video of the object 301, taken with the camera module 103 of endoscope 100 or with another probe or in another manner, and supplies the resulting Live TARGET data to unit 302. A database unit 508 stores data such as prior images of the object 301 taken in a previous medical procedure on the same patient, or images taken earlier in the same procedure, and supplies them to unit 302. An I/O and database unit 510 provides unit 302 with data such as images and/or other parameters, designated the Avg TARGET model, that are derived from or relate to an object such as the object 301, e.g., from a collection of such images and/or parameters obtained from a typically large patient population, and possibly from other sources such as anatomical reference materials. Some or all of the information for the Avg TARGET model may come from the internet or another connection to the cloud computing source 512. The Artificial Intelligence (AI) engine and system processor 302 processes the information provided to it by units 504, 506, 508, and 510 to generate real-time images and/or video of the object 301 and endoscope 100 (including its front end 105 and module 103), and of average or typical objects 301, for display on the display 502 and/or the VR headset 306.
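As a sketch of how such outputs might be tiled for the display 502, and not a description taken from the patent, the live view, a prior image from unit 508, and an Avg TARGET reference from unit 510 could be composed side by side; the function name and the equal-size cropping are assumptions:

```python
import numpy as np

def compose_display(live: np.ndarray, prior: np.ndarray,
                    reference: np.ndarray) -> np.ndarray:
    """Tile the Live TARGET view beside a prior image of the object and
    an Avg TARGET reference image; inputs are HxWx3 uint8 frames."""
    h = min(img.shape[0] for img in (live, prior, reference))
    # Crop all panels to a common height so they tile cleanly.
    panels = [img[:h] for img in (live, prior, reference)]
    return np.hstack(panels)
```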
During a medical procedure using the system of fig. 5, the images displayed on units 306 and 502 may guide the user in inserting the single-use portion 102 into the object 301 and, during the procedure, may show in real time the relative position and orientation of the object 301 and the front end of the cannula 107 (and any surgical device protruding therefrom), together with user-selected material: an image or video of the object 301 taken earlier in the procedure, how the object 301 should look (including portions of the object 301 not currently in the view of endoscope 100), and how a similar procedure is performed according to the information provided by the Avg TARGET model. If some movements of the single-use portion 102 and of a surgical device protruding from it are motor-controlled, information from unit 302 can be used to augment manual control of such movements. For example, if analysis of the image of the object 301 by unit 302 indicates that a manually commanded operation is inconsistent with the current environment of the front end portion 105 within the object 301, information from unit 302 may limit the angular range of the front end portion 105 of the cannula 107.
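A minimal sketch of the kind of limit described in the last sentence, assuming the AI engine publishes a maximum safe bend angle for the front end portion 105; the function name and the 120-degree hardware limit are hypothetical:

```python
def limit_bend_command(requested_deg: float, ai_safe_max_deg: float,
                       hardware_max_deg: float = 120.0) -> float:
    """Clamp a manually commanded bend angle of the front end portion to
    the tighter of the hardware limit and the AI-derived safe limit for
    the cannula's current environment inside the object."""
    limit = min(ai_safe_max_deg, hardware_max_deg)
    return max(-limit, min(limit, requested_deg))

# If the AI engine judges that only 35 degrees of bend is safe here,
# a commanded 60-degree bend is reduced accordingly:
applied = limit_bend_command(60.0, ai_safe_max_deg=35.0)  # -> 35.0
```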
Fig. 6 illustrates the use of laser time-of-flight techniques to determine the handle position and the patient position (the position and/or orientation of the reusable portion 104 and the object 301). The patient position relative to handle 106 may be determined using laser light from module 103, or from the front end portion 105 of the disposable portion 102, and/or from a laser source 602 directed from the front of display 108 to one side to illuminate object 301. The arrangement of fig. 6 may be used as endoscope 100 in the system of fig. 5, or as a stand-alone arrangement. The position and/or orientation of handle 106, and/or of the part of the disposable portion 102 that is not within the patient, may be determined relative to a fixed frame of reference using one or more laser sources and an imager 604 at a fixed location, such as on a wall of the room, that illuminates handle 106. Fig. 6 shows the notation of the orthogonal and polar parameters for the position and orientation making up the patient position and the handle position. Techniques for laser time-of-flight measurement are known; see, e.g., https://www.terabee.com/time-of-flight-principle/ and the Wikipedia article on time of flight.
Fig. 7 is otherwise identical to fig. 6, but shows the determination of the handle position and patient position using ultrasonic time-of-flight techniques, for example with ultrasonic sensors mounted on the front end portion 105, the display 108, and/or at a fixed location 702 such as a room wall or ceiling. The arrangement of fig. 7 may be used as endoscope 100 in the system of fig. 5, or as a stand-alone arrangement. Techniques for ultrasonic time-of-flight measurement are known; see, e.g., https://www.terabee.com/time-of-flight-principle/.
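In both the laser (fig. 6) and ultrasonic (fig. 7) variants, the range computation is the same round-trip relation, differing only in the propagation speed; a minimal sketch, with constants and names our own:

```python
SPEED_OF_LIGHT = 2.998e8  # m/s, laser case (fig. 6)
SPEED_OF_SOUND = 343.0    # m/s in air, ultrasonic case (fig. 7)

def tof_distance(round_trip_s: float, wave_speed: float) -> float:
    """Target distance from a round-trip time of flight: the pulse
    travels out and back, so halve the total path."""
    return wave_speed * round_trip_s / 2.0

# A ~6.67 ns laser round trip corresponds to about 1 m of range:
d_laser = tof_distance(6.67e-9, SPEED_OF_LIGHT)  # ~1.0 m
# The same 1 m range by ultrasound takes about 5.8 ms round trip:
d_sound = tof_distance(5.83e-3, SPEED_OF_SOUND)  # ~1.0 m
```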
Fig. 8 illustrates determining the handle position and patient position using a camera array in VR headset 306. A commercially available VR headset, for example the HP Reverb G2 (HP product No. 1G5U1AA), may serve as VR headset 306 to track the movement of the disposable portion 102 and the reusable portion 104 in real time. If desired, tracking markers may be fixed at relevant locations on the disposable portion 102 and the reusable portion 104. The handle position may be determined relative to a fixed coordinate system using sensors such as ultrasonic, optical, or radio-frequency sensors 802 on a wall or ceiling of the room.
Fig. 9 is otherwise similar to fig. 6, but illustrates the use of radio-frequency tracking of the camera position. In this embodiment, one or more radio-frequency receivers 902 are secured to the reusable portion 104, such as at the front surface of the display 108, to receive radio-frequency transmissions from a source 904 at the tip of the front end portion 105 of the disposable portion 102. The indicated camera position information can show in real time the position of the front tip of the cannula 107 relative to the reusable portion 104, including after translation of the cannula 107 along its long axis relative to the handle 106. The arrangement of fig. 9 may be used as endoscope 100 in the system of fig. 5, or as a stand-alone arrangement. Techniques for radio-frequency distance measurement are commercially available; see, e.g., https://www.researchgate.net/publication/224214985_radio_frequency_time-of-flight_distance_measurement_for_low-cost_wireless_sensor_localization. If desired, tracking markers may be fixed at relevant locations on the disposable portion 102 and the reusable portion 104. The handle position may be determined relative to a fixed coordinate system with sensors such as ultrasonic, optical, or radio-frequency sensors 802 on the walls or ceiling of the room.
Fig. 10 shows the use of one or more stepper motors to derive the camera position relative to the handle position. The endoscope 100 in this example includes two spaced-apart, forward-facing imaging systems 1002, with their respective light sources, positioned on the front side of the display 108. Digital stepper motors 1006 within the reusable portion 104 drive the rotation and translation of the cannula 107 relative to the handle 106, as well as the deflection or angulation of the front end portion 105 of the cannula 107. The arrangement of fig. 10 may be used as endoscope 100 in the system of fig. 5, or as a stand-alone arrangement. The forward camera systems 1002 view the cannula 107, including its front end portion 105. The stepper motors 1006 provide motor step signals to a processor 1008 in the reusable portion 104, which is configured to determine the position and/or orientation of the cannula 107, including its front end portion 105 and tip, from the step counts of each motor. The forward imaging systems 1002 generate real-time images of the cannula 107 and its front end portion 105 and tip, which are also fed into processor 1008; the processor is configured to correlate the images with the step counts to determine the camera position relative to the reusable portion 104. If a handle position is desired, it can be determined as in the other embodiments discussed above, yielding the camera position relative to a selected frame of reference as well as relative to the handle 106.
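A sketch of the dead reckoning that a processor such as 1008 might perform from the motor step signals; the steps-per-revolution, lead, and degrees-per-step constants are assumptions, not values from the patent:

```python
STEPS_PER_REV = 200      # assumed full steps per motor revolution
MM_PER_REV = 2.0         # assumed cannula travel per translation-motor rev
DEG_PER_BEND_STEP = 0.9  # assumed front-end deflection per bend-motor step

class CannulaOdometry:
    """Track cannula pose relative to the handle 106 by counting the
    steps commanded to the motors 1006, to be fused with the forward
    imaging systems' view as fig. 10 describes."""
    def __init__(self):
        self.rotation_deg = 0.0    # rotation about the cannula long axis
        self.translation_mm = 0.0  # translation along the long axis
        self.bend_deg = 0.0        # deflection of front end portion 105

    def on_steps(self, rot_steps: int = 0, trans_steps: int = 0,
                 bend_steps: int = 0) -> None:
        self.rotation_deg += 360.0 * rot_steps / STEPS_PER_REV
        self.translation_mm += MM_PER_REV * trans_steps / STEPS_PER_REV
        self.bend_deg += DEG_PER_BEND_STEP * bend_steps
```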
Fig. 11 is otherwise identical to fig. 10, but shows endoscope 100 (a small robotic system) from a different angle. As with the arrangement of fig. 10, the rotation, translation and/or angle of the cannula 107 and the front end portion 105 relative to the reusable portion 104 may be derived from the number of steps performed in response to manual manipulation of the touch panel or joystick 110 (or commanded robotic manipulation by the unit 302 (fig. 5)), and determination of the camera position may be further aided by information from the forward imaging system 1002, processed in the processor 1008 (shown in fig. 10). The handle position may be determined as in the other embodiments discussed above. The arrangement of fig. 11 may be used as endoscope 100 in the system of fig. 5, or as a stand-alone arrangement.
Fig. 12 illustrates the use of a forward camera system and LEDs to determine the movement of the cannula 107 and its front end portion 105 relative to the reusable portion 104. In other respects, the arrangement of fig. 12 is the same as that of figs. 10 and 11. In fig. 12, numeral 1202 designates a front-facing forward imaging system and light source. In this embodiment, the light source of the forward imaging system 1002 may be turned off. A matrix of LEDs 1204 emitting infrared light may be placed at selected locations on the disposable portion 102, for example along the cannula 107 and its front end portion 105. The camera position relative to the reusable portion 104 can be derived from images of the infrared sources along the disposable portion, based on geometric calculations from the image positions of the LEDs 1204 in the field of view of the forward imaging system 1202. The output of the forward camera 1002 is processed by the processor 1008 (fig. 10), as described above. The handle position may be derived as in the other embodiments discussed above. The arrangement of fig. 12 may be used as endoscope 100 in the system of fig. 5, or as a stand-alone arrangement.
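One way such a geometric calculation could be carried out is a perspective-n-point fit of the detected LED image positions against their known locations on the cannula; the sketch below uses OpenCV's solvePnP, and the LED layout and camera intrinsics are illustrative assumptions, not the patent's method:

```python
import numpy as np
import cv2

# Assumed 3D positions (meters) of four infrared LEDs in the cannula's
# own frame; they must not all be collinear for the fit to be unique.
LED_POINTS_3D = np.array([[0.000, 0.000, 0.00],
                          [0.000, 0.000, 0.02],
                          [0.003, 0.000, 0.04],
                          [0.000, 0.003, 0.06]], dtype=np.float32)

def cannula_pose_from_leds(led_pixels: np.ndarray,
                           camera_matrix: np.ndarray):
    """Estimate the cannula's pose in the forward camera's frame from
    the pixel coordinates of the detected LEDs (Nx2, in the same order
    as LED_POINTS_3D)."""
    ok, rvec, tvec = cv2.solvePnP(LED_POINTS_3D,
                                  led_pixels.astype(np.float32),
                                  camera_matrix, None)
    if not ok:
        raise RuntimeError("PnP fit failed; check LED correspondences")
    return rvec, tvec  # rotation (Rodrigues vector) and translation
```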
Fig. 13 is otherwise similar to fig. 12, but uses, along the single-use portion 102 including the cannula 107 and the front end portion 105, a matrix of labels 1302 that reflect light from a light source at the forward imaging system 1002.
Fig. 14 shows an arrangement for deriving the camera position from, or relative to, the handle position using a forward-facing camera 1402. In this embodiment, one or more forward-facing cameras 1402, comprising respective white light sources, are on the display 108 and illuminate a field of view that includes the cannula 107. The forward imaging system 1402 images the field of view to detect movement of the cannula 107 and/or the front end portion 105, and the camera position relative to the reusable portion 104 is derived therefrom by processing the images in processor 1008 (fig. 10). This avoids the need for reflective labels or LEDs along the disposable portion 102. The handle position may be determined as in the other embodiments discussed above. The arrangement of fig. 14 may be used as endoscope 100 in the system of fig. 5, or as a stand-alone arrangement.
Fig. 15 is a perspective view of a complete endoscope, with the forward facing camera 1202 used to derive the camera position from the handle position, as discussed above with respect to fig. 14.
Fig. 16 shows the image processing involved in deriving the camera position from the handle position using the forward imaging system 1002 or 1402 on an image of the disposable portion 102 taken from the reusable portion 104, as described above. The left side of fig. 16 is an image captured with the forward imaging system; the right side is a segmented image in which only the contours or edges of the left-side image remain. This processing may be performed in processor 1008 (fig. 10) or in processor 302.
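The right-hand image of fig. 16 is consistent with standard edge segmentation; a minimal sketch using OpenCV, with the blur kernel and Canny thresholds as assumptions:

```python
import cv2

def segment_cannula_edges(frame_bgr):
    """Reduce a forward-camera frame to its contours/edges, as in the
    right-hand image of fig. 16."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)  # suppress sensor noise
    return cv2.Canny(blurred, 50, 150)           # keep only edges
```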
Fig. 17-20 illustrate other examples of image processing portions involved in deriving camera position from handle position, showing other orientations of the disposable portion 102 relative to the reusable portion 104.
Fig. 21 shows endoscope 100 mounted on articulating robotic arms 2102 and 2104, which are tabletop- or floor-mounted. The robotic arms 2102 and 2104 may be moved manually to position the endoscope 100 as desired in preparation for, or during, a medical procedure. The user can grasp the bracket 2106 or 2108 that receives the handle 106 of the endoscope 100, as described above, and manually operate the control device 110. In addition, unit 302 (fig. 5) may direct movement of the robotic arms 2102 and 2104, and/or movement of the stepper motors of endoscope 100, as described above, if desired or required. A single robotic arm and endoscope may be used instead of the two shown in fig. 21, as needed or desired.
Fig. 22 is otherwise similar to fig. 21, but the robotic arms 2202 and 2204 are mounted on a ceiling rather than on a table or floor. In addition, one or both robotic arms may be mounted on a wall.
Fig. 23 is otherwise similar to fig. 21, but shows endoscope 102 mounted with a touch-sensitive display screen 150 that displays a cross-shaped track 1148, along which a user may move a finger or pointer to command bending of the front portion 110 of cannula 120 in a horizontal plane, a vertical plane, or a plane at an angle to the vertical and horizontal planes.
Fig. 24 is a side view of a small robotic endoscope that may otherwise be similar to those described or referenced above, but has a control knob 1320 that can be conveniently manipulated by the thumb of a user holding handle 140. Knob 1320 is coupled to the stepper motors 1006 (fig. 13) to control the bending of the front end portion 105 of cannula 107. The coupling may be configured so that pushing knob 1320 to the left or right bends front end portion 105 to the left or right by an angle determined by the force on the knob or the duration of the push, and pushing the knob up or down bends front end portion 105 up or down by an angle likewise determined by the force or duration of the push. Pushing the knob inward (in the forward direction) rotates the angled front end portion 105 through a predetermined angle, such as 360 degrees, about the long axis of the cannula 107 to automatically image the entire interior of a body cavity or organ. Such imaging of the interior of an entire body cavity or organ is referred to herein as scanning mode operation, and has been found particularly beneficial in certain medical procedures, for example where it is convenient to preview all, or at least a substantial portion, of a body cavity or organ before focusing on a suspicious region or lesion for examination or treatment.
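A sketch of scanning-mode control logic under stated assumptions: `motor` and `camera` stand for hypothetical driver objects for the stepper motors 1006 and the module 103 camera, and the bend angle and angular step size are illustrative:

```python
import time

SCAN_SWEEP_DEG = 360.0  # predetermined sweep about the cannula long axis
DEG_PER_STEP = 1.8      # assumed rotation per motor step

def run_scan_mode(motor, camera, bend_deg: float = 30.0):
    """With the front end portion angled off the long axis, rotate it
    through the predetermined sweep and capture frames, imaging the
    whole interior of the body cavity or organ."""
    motor.bend_to(bend_deg)              # angle front end portion 105
    frames = []
    for _ in range(int(SCAN_SWEEP_DEG / DEG_PER_STEP)):
        motor.rotate_steps(1)            # one step about the long axis
        time.sleep(0.01)                 # let the motion settle
        frames.append(camera.capture())  # one frame per angular step
    return frames
```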
Although the foregoing has been described in some detail for purposes of clarity of illustration, it will be apparent that certain changes and modifications may be practiced without departing from the principles of the utility model. It should be noted that there are many alternative ways of implementing the processes and apparatuses described herein. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the body of work described herein is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.

Claims (5)

1. A hand-held robot-assisted endoscope, comprising:
an endoscope comprising a disposable portion, which includes a cannula with a camera at a forward end thereof, and a reusable portion, the disposable portion being releasably connectable to the reusable portion to form the endoscope;
a manual control device located on the reusable portion, the device configured to be manipulated by a user grasping the reusable portion and to control rotation and translation of at least a portion of the disposable portion relative to the reusable portion, and the angle of the front end portion of the disposable portion relative to its long axis;
a display coupled to the camera and configured to display a current image of an object captured by the camera while displaying additional images, including one or more of a previous image of the object, an image of a similar object, and an image for guiding a medical procedure on the object;
a motorized control electronically controlling one or more actions of at least a portion of the disposable portion relative to the reusable portion; and
a processor configured to provide the additional image to the display and selectively drive the motorized control.
2. The handheld robot-assisted endoscope according to claim 1, comprising a first tracking device configured to automatically provide an estimate of at least one of a changing position and a changing orientation of the disposable portion relative to the reusable portion, and a processor configured to use the estimate when displaying on the display a current image of the disposable portion relative to the object.
3. The hand-held robot-assisted endoscope of claim 2, wherein: the first tracking device includes a radio frequency transmitter at the forward end of the cannula and a radio frequency receiver at the reusable portion.
4. The hand-held robot-assisted endoscope of claim 2, wherein: the first tracking device includes a processor configured to derive the estimate based at least in part on a signal associated with the motorized control driving the disposable portion relative to the reusable portion.
5. A hand-held robot-assisted endoscope, comprising:
an endoscope comprising a disposable portion, which includes a cannula with a camera at a forward end thereof, and a reusable portion, the disposable portion being releasably connectable to the reusable portion to form the endoscope;
a manual control device at the reusable portion, the device configured to be manipulated by a user grasping the reusable portion and controlling rotation and translation of at least a portion of the disposable portion relative to the reusable portion and the angle of the front end portion of the disposable portion relative to the long axis thereof;
a display coupled to the camera and configured to display a current image of an object captured by the camera while displaying additional images, including one or more of a previous image of the object, an image of a similar object, and an image for guiding a medical procedure on the object;
a motorized control electronically controlling one or more movements of at least part of the disposable portion relative to the reusable portion; and
a processor is configured to provide the additional image to the display and to selectively drive the motorized control.
CN202223391748.8U 2022-09-09 2022-12-08 Handheld robot-assisted endoscope Active CN220175076U (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/941,884 US20230117151A1 (en) 2021-10-18 2022-09-09 Hand-held, robotic-assisted endoscope
US17/941,884 2022-09-09

Publications (1)

Publication Number Publication Date
CN220175076U true CN220175076U (en) 2023-12-15

Family

ID=86483191

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202223391748.8U Active CN220175076U (en) 2022-09-09 2022-12-08 Handheld robot-assisted endoscope
CN202211629055.XA Pending CN115886695A (en) 2022-09-09 2022-12-08 Handheld robot-assisted endoscope

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202211629055.XA Pending CN115886695A (en) 2022-09-09 2022-12-08 Handheld robot-assisted endoscope

Country Status (1)

Country Link
CN (2) CN220175076U (en)

Also Published As

Publication number Publication date
CN115886695A (en) 2023-04-04

Similar Documents

Publication Publication Date Title
CN109069215B (en) System and method for controlling a surgical instrument
CN110325138B (en) System and method for intelligent seed registration
KR102643758B1 (en) Biopsy devices and systems
JP6714085B2 (en) System, controller, and method for using virtual reality devices for robotic surgery
KR101720047B1 (en) Virtual measurement tool for minimally invasive surgery
KR102542848B1 (en) Systems and methods for display of pathological data in an image guided procedure
EP3119286B1 (en) Medical devices and systems using eye gaze tracking
KR101258912B1 (en) Laparoscopic ultrasound robotic surgical system
JP2022140730A (en) Systems and methods for using registered fluoroscopic images in image-guided surgery
US7951070B2 (en) Object observation system and method utilizing three dimensional imagery and real time imagery during a procedure
JP5568574B2 (en) Method for navigating an endoscopic device and device for image display
KR20120087806A (en) Virtual measurement tool for minimally invasive surgery
KR20210005901A (en) Systems and methods related to elongated devices
JPWO2007145327A1 (en) Remote control system
US20200015910A1 (en) Systems and methods for teleoperated control of an imaging instrument
CN115334993A (en) System and method for constrained motion control of a medical instrument
CN220175076U (en) Handheld robot-assisted endoscope
KR20120052574A Surgical robotic system and method of driving endoscope of the same
US20230117151A1 (en) Hand-held, robotic-assisted endoscope
KR20120052573A Surgical robotic system and method of controlling the same
US20220323157A1 (en) System and method related to registration for a medical procedure
US11850004B2 (en) Systems and methods for determining an arrangement of explanted tissue and for displaying tissue information
Sarli Design, modeling and control of continuum robots and dexterous wrists with applications to transurethral bladder cancer resection
CN116348058A (en) Systems and methods for tracking an object through a body wall for operation associated with a computer-assisted system
EP2910194A1 (en) Surgical instrument

Legal Events

Date Code Title Description
GR01 Patent grant