WO2019114605A1 - A laparoscope - Google Patents

A laparoscope

Info

Publication number
WO2019114605A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera module
wing
illumination
light
abdominal cavity
Prior art date
Application number
PCT/CN2018/119585
Other languages
English (en)
French (fr)
Inventor
刘晓龙
Original Assignee
梅达布蒂奇股份有限公司
刘晓龙
Priority date
Filing date
Publication date
Application filed by 梅达布蒂奇股份有限公司 and 刘晓龙
Publication of WO2019114605A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/06 Instruments for performing medical examinations of the interior of cavities or tubes of the body, with illuminating arrangements
    • A61B 1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body, combined with photographic or television appliances
    • A61B 1/313 Instruments for performing medical examinations of the interior of cavities or tubes of the body, for introducing through surgical openings, e.g. laparoscopes
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets

Definitions

  • the present application relates to the field of medical device technology, and in particular to a laparoscope.
  • MIS: Minimally Invasive Surgery
  • the conventional laparoscope is a rigid long-rod laparoscope comprising a trocar, an elongated hollow needle, a monitor connected to the hollow needle, and a camera at the free end of the hollow needle.
  • the camera includes an image sensor, a lens, and an illumination source.
  • the hollow needle is configured for passage through a tissue layer of the abdominal wall into the abdominal cavity, and the camera can be manually mounted to the free end of the hollow needle by a trocar.
  • the camera can collect tissue within the abdominal cavity under illumination from the illumination source and transmit the acquired image to a monitor external to the abdominal cavity.
  • a trocar is placed at one opening, and the surgical instrument is inserted into the abdominal cavity through the trocar and the tissue is surgically operated.
  • the other opening is used to extend the hollow needle into the abdominal cavity to provide illumination and imaging of the tissue in the abdominal cavity.
  • two or more openings cause greater damage to the tissue of the abdominal wall.
  • the specific technical solution is as follows.
  • the embodiment of the present application provides a laparoscope, comprising: a gripping component, an anchoring component, a lighting device, and a camera module;
  • the camera module is connected to the anchoring component by the lighting device;
  • the grasping member is placed outside the abdominal cavity during surgery and fixes the anchoring member to the abdominal wall of the abdominal cavity by the attractive force between the grasping member and the anchoring member;
  • the anchoring member, the illumination device, and the camera module enter the abdominal cavity through a trocar for insertion of a surgical instrument during surgery.
  • the laparoscope further includes: a monitor for displaying the images collected by the camera module.
  • the illumination device comprises: a wing member including at least three wings uniformly arranged in space, a wing deployment mechanism, and a light-emitting component and a lens component on each wing, the lens component covering the outside of the light-emitting component;
  • the wing deployment mechanism is coupled to the wing member and can cause the wing member to deploy; the wing member enters the abdominal cavity in a collapsed state, and after the illumination device enters the abdominal cavity and is in working condition, the wing member is in an unfolded state.
  • the lighting device further includes: a tilting motion mechanism; the tilting motion mechanism can cause the lighting device to tilt.
  • the camera module is fixed at an intermediate position of the wing component; when the wing component is in the unfolded state, the camera module can acquire images in the abdominal cavity, and when the wing component is in the folded state, the camera module is inside the wing component.
  • the laparoscope further includes: a user controller;
  • the gripping component comprises: a controller circuit board;
  • the controller circuit board is configured to receive a first control command sent by the user controller and control the lighting device to perform a first operation according to the first control command, and to receive a second control command sent by the user controller and control the camera module to perform a second operation according to the second control command;
  • the first operation comprises opening the wing member, closing the wing member, moving the lighting device, tilting the lighting device, and adjusting a brightness of the lighting device;
  • the second operation comprises: starting image acquisition and stopping image acquisition.
  • the range of the target illumination area of the illumination device in the abdominal cavity is not less than the range of the image acquisition area of the camera module in the abdominal cavity.
  • the lens component causes light emitted by the light emitting component to be mapped to a target illumination area according to a specified mapping relationship
  • the specified mapping relationship is a mapping relationship under which the illumination uniformity of the target illumination area of the illumination device in the abdominal cavity is not less than a preset uniformity threshold and the illumination intensity is not less than a preset intensity threshold; the specified mapping relationship is determined based on the refractive index of the lens component, the specified volume of the lens component, the size of the light-emitting component, the light intensity distribution of the light-emitting component, and the relative position between the light-emitting component and the target illumination region.
  • the specified mapping relationship is obtained based on a surface gradient, which is the solution of the following equation:
  • φ is a constant coefficient;
  • Ω_s is the light source domain of the light-emitting component;
  • ξ and η are respectively the abscissa and the ordinate of the projection plane of the light-emitting component;
  • I_0 is the light intensity distribution at the axis of the light-emitting component;
  • BC is a boundary condition;
  • E_t is the illuminance distribution function of the preset target illumination area, and E_t is determined according to the preset uniformity threshold and the preset intensity threshold.
  • the surface gradient is determined in the following way:
  • the laparoscope provided by the embodiment of the present application includes: a monitor, a trocar, a gripping component, an anchoring component, a lighting device, and a camera module.
  • the gripping component is placed outside the abdominal cavity
  • the anchoring component, the illumination device and the camera module are all placed in the abdominal cavity
  • the anchoring component is fixed on the abdominal wall by the attractive force between the anchoring component and the gripping component
  • the camera module is attached to the anchoring component via the illumination device, to achieve illumination and image acquisition of the tissue within the abdominal cavity.
  • the anchoring component, the illuminating device and the camera module enter the abdominal cavity through the trocar, which is the trocar used for inserting the surgical instrument; since the opening for this trocar is an opening necessary for the operation anyway, there is no need to open another incision in the abdominal wall, so the number of openings in the abdominal wall can be reduced, thereby reducing the tissue damage to the abdominal wall.
  • FIG. 1 is a schematic structural view of a laparoscope according to an embodiment of the present application.
  • Figure 2a is a reference diagram of an application scenario of a laparoscopic operation in practice
  • 2b is a schematic structural diagram of a movable connection between a lighting device and a camera module
  • 3a and 3b are respectively a schematic structural view of a lighting device in an unfolded state and a folded state according to an embodiment of the present application;
  • FIG. 3c1 is a schematic structural diagram of a lighting device and a camera module according to an embodiment of the present application;
  • 3c2 is another schematic structural diagram of a lighting device and a camera module according to an embodiment of the present application.
  • FIG. 3d is a schematic structural diagram of a lighting device according to an embodiment of the present application;
  • 3e1 and 3e2 are respectively two reference drawings corresponding to FIG. 3d;
  • FIG. 3f1 is another schematic structural diagram of a laparoscope according to an embodiment of the present application.
  • Figure 3f2 is a reference diagram corresponding to Figure 3f1;
  • FIG. 3g1 is a schematic structural diagram of a gripping component provided by an embodiment of the present application.
  • Figure 3g2 is a reference diagram corresponding to Figure 3g1;
  • FIG. 3h is a reference diagram of the tilting of the lighting device controlled by the gripping component provided by the embodiment of the present application.
  • FIG. 3I is a schematic diagram of light redirected by a light-emitting component according to an embodiment of the present application.
  • FIG. 4 is a schematic flow chart of a process for determining a surface gradient according to an embodiment of the present application
  • 5a to 5d are reference diagrams when determining a specified mapping relationship provided by an embodiment of the present application.
  • the embodiment of the present application provides a laparoscope.
  • the present application will be described in detail below through specific embodiments.
  • FIG. 1 is a schematic structural view of a laparoscope according to an embodiment of the present application.
  • FIG. 2 is a reference diagram of an application scenario of a laparoscope according to an embodiment of the present application.
  • the laparoscope includes a gripping member 103, an anchoring member 104, a lighting device 105, and a camera module 106. Also shown in Figure 1 are a monitor 101 and a trocar 102.
  • the camera module 106 can include sub-components such as an imaging sensor, a lens, and the like.
  • the camera module 106 is connected to the anchoring component 104 via the illumination device 105.
  • the grasping member 103 is placed outside the abdominal cavity during surgery and secures the anchoring member 104 to the abdominal wall within the abdominal cavity by the attractive force between the grasping member 103 and the anchoring member 104.
  • the anchoring member 104, the illumination device 105, and the camera module 106 enter the abdominal cavity through a trocar for insertion of a surgical instrument during surgery.
  • the grasping member 103 is placed outside the abdominal cavity, and the anchoring member 104 is placed on the inner wall of the abdominal cavity.
  • the surgical instrument 100 inserted into the trocar 102 is shown in Figure 2a.
  • the gripping member 103 and the anchoring member 104 may both be magnetic members, and the gripping member 103 and the anchoring member 104 may be respectively adsorbed to the outside and the inside of the abdominal cavity wall by magnetic force.
  • the anchoring component 104, the illumination device 105, and the camera module 106 can be fixedly coupled together. After the anchoring member 104 is secured inside the abdominal wall, the illumination device 105 and camera module 106 coupled to the anchoring member 104 are also anchored within the abdominal cavity.
  • the lighting device 105 and the camera module 106 may be fixedly connected together, while the connection between the anchoring component 104 and the lighting device 105 may be a movable connection, that is, the angle between the anchoring component 104 and the lighting device 105 can be adjusted, see Figure 2b.
  • the power supply can be inside the lighting device or inside the camera module, or it can supply power to the lighting device and camera module through a dedicated power cable.
  • the anchoring member 104, the illumination device 105, and the camera module 106 can be delivered into the abdominal cavity through the trocar 102 by forceps.
  • the grasping member 103 can first be placed on the outer side of the abdominal wall near the trocar; the anchoring member 104, the illumination device 105, and the camera module 106 are then sent through the trocar 102 into the abdominal cavity using forceps, and the forceps release the anchoring member 104 once it is attracted by the grip member 103. In this way, the anchoring member 104, the illumination device 105 and the camera module 106 can be fixed in the abdominal cavity.
  • the gripping member 103 can be moved such that the anchoring member 104 is moved under the suction of the gripping member 103 and moved to the target position. After moving to the target position, the anchoring member is fixed to the abdominal wall in the abdominal cavity.
  • the above movements include translation and rotation.
  • the angle between the anchoring component 104 and the illumination device 105 can be adjusted before the anchoring component 104, the illumination device 105, and the camera module 106 are fed into the abdominal cavity.
  • the surgical instrument can be inserted into the abdominal cavity through the trocar, and the tissue in the abdominal cavity can be operated under the illumination of the illumination device.
  • the camera module 106 can transmit the acquired image of the intra-abdominal tissue to the monitor 101.
  • the monitor 101 can display an image for the operator to view.
  • the anchoring component in this embodiment is fixed on the abdominal wall by the attractive force between it and the grasping component, and the camera module is connected with the anchoring component through the illumination device, to realize illumination and image acquisition of the tissue in the abdominal cavity.
  • the anchoring component, the illuminating device and the camera module enter the abdominal cavity through the trocar, which is the trocar used for inserting the surgical instrument; since its opening is one required for the operation anyway, there is no need to open another incision in the abdominal wall, so the number of openings in the abdominal wall can be reduced, thereby reducing tissue damage to the abdominal wall.
  • the gripping member can move the anchoring member through the attractive force between them, thereby moving the lighting device and the camera module, adjusting their positions, and changing the positions of the illumination area and the imaging area in the body.
  • the coaxial configuration of the imaging sensor and the light source causes a lack of shadow depth cues in the output two-dimensional image, which may result in the doctor not being able to more accurately determine the depth and position of each tissue.
  • the coaxial configuration is a configuration in which the central axis of the imaging sensor and the central axis of the light source are parallel.
  • the laparoscope in the embodiment shown in FIG. 1 can be further improved to set the imaging sensor and the light source in a non-coaxial configuration.
  • the non-coaxial configuration is a configuration in which the central axis of the imaging sensor and the central axis of the light source are not parallel.
  • the illumination device 105 can include: a wing member 51 including at least three wings that are spatially evenly arranged, a wing deployment mechanism 52, and a light-emitting member 53 and a lens member 54 on each wing, the lens member 54 covering the outside of the light-emitting member 53.
  • the wing deployment mechanism 52 is coupled to the wing member 51, the wing deployment mechanism 52 can cause the wing member 51 to deploy, the wing member 51 enters the abdominal cavity in a folded state, and when the illumination device 105 enters the abdominal cavity and is in an operational state, the wing member 51 It is in an expanded state.
  • the wing deployment mechanism 52 can be an electric motor or other device capable of providing a driving force.
  • the lighting device 105 may further include: a tilting motion mechanism 55.
  • the tilting motion mechanism 55 can cause the lighting device 105 to tilt.
  • the tilting mechanism 55 can be an electric motor or other device capable of providing a driving force.
  • FIG. 3a is a schematic structural view showing the lighting device in an unfolded state in the embodiment.
  • FIG. 3b is a schematic structural view of the lighting device in a folded state in the embodiment.
  • the illumination device 105 of Figures 3a and 3b includes three wing members 51, a tilting mechanism 55, a wing deployment mechanism 52, and a lighting member 53 and lens member 54 on each wing.
  • the illumination device is delivered into the abdominal cavity in a folded state.
  • once the anchoring member is anchored on the abdominal wall, the illumination device transitions from the collapsed state to the deployed state.
  • in the folded state, the illuminating device may have an outer diameter of 17 mm and may enter the abdominal cavity through a trocar having a diameter of 20 mm.
  • the illuminating device includes a wing member including at least three wings that are spatially evenly arranged, and the illuminating member and the lens member are located on each of the wings.
  • FIG. 3c1 is a schematic structural diagram of a position between a lighting device and a camera module according to an embodiment of the present application
  • FIG. 3c2 is a schematic diagram of the angle between the lighting device and the camera module provided by an embodiment of the present application.
  • the camera module 106 can be fixed to an intermediate position of the wing member 51. When the wing member 51 is in the unfolded state, the camera module 106 can capture an image in the abdominal cavity, and when the wing member 51 is in the folded state, the camera module 106 is inside the wing member 51.
  • this arrangement facilitates the folding and unfolding of the wing member, and the structure is also easier to implement.
  • FIG. 3d is a schematic diagram of an internal structure of a lighting device according to an embodiment of the present application
  • FIG. 3e1 and FIG. 3e2 are respectively two reference drawings corresponding to FIG. 3d.
  • the lighting device further comprises two worm and gear sets 56: the first worm and gear set 561 connects the tilting mechanism 55 with the anchoring member 104, and the second worm and gear set 562 connects the wing deployment mechanism 52 with the wing member 51.
  • the worm and gear set 562 can include a worm and three gears that are coupled to the three wings, respectively.
  • the worms in the first worm and gear set 561 can be coupled to the tilting motion mechanism 55, and the gears in the first worm and gear set 561 are coupled to the anchoring member 104.
  • the worm in the first worm and gear set 561 drives the gear to rotate, causing a certain angle between the illumination device 105 and the anchoring member 104.
  • the worms in the second worm and gear set 562 can be coupled to the wing deployment mechanism 52, and the gears in the second worm and gear set 562 can be coupled to the wing members 51. Driven by the wing deployment mechanism 52, the worm in the second worm and gear set 562 drives the gear to rotate, causing the wing member 51 to unfold or fold.
  • the second worm and gear set 562 can include a worm and three gears. Each of the three gears meshes with the worm, and each of the gears is coupled to each of the wings. Under the driving force provided by the worm, each gear drives the corresponding wing to unfold or fold.
  • the laparoscope shown in FIG. 1 may further include: a user controller 107; the gripping member includes: a controller circuit board.
  • FIG. 3f1 is another structural diagram of a laparoscope according to an embodiment of the present application
  • FIG. 3f2 is a reference diagram corresponding to FIG. 3f1.
  • the user controller 107 can communicate with the gripping component 103.
  • the camera module 106 is coupled to the gripping member 103 and transmits the acquired images to the gripping member 103, and the gripping member 103 forwards the received images to the monitor.
  • the camera module can be connected to the gripping component through a cable.
  • FIG. 3g1 is a schematic diagram of an internal structure of the grip member 103
  • FIG. 3g2 is a reference diagram corresponding to FIG. 3g1.
  • the grip member 103 includes a controller circuit board 31, a gear 32, a permanent magnet 33, a bearing 34, a spur pinion 35, and an electric motor 36.
  • the tilting motion mechanism 55 and the wing deployment mechanism 52 in the lighting device may each be coupled to the controller circuit board 31.
  • the shaft of the permanent magnet 33 is coupled to the shaft of the gear 32 and the bearing 34, respectively.
  • the motor 36 can drive the spur pinion 35 to rotate, and the spur pinion 35 meshes with the gear 32.
  • the motor 36 can drive the permanent magnet 33 to rotate by the gear 32 and the spur pinion 35.
  • the controller circuit board 31 is configured to receive a first control command sent by the user controller 107, and control the lighting device 105 to perform a first operation according to the first control command; receive a second control command sent by the user controller 107, and The camera module 106 is controlled to perform a second operation according to the second control command.
  • the first operation may include at least one of: opening the wing member, closing the wing member, moving the lighting device, tilting the lighting device, and adjusting the brightness of the lighting device; the second operation may include at least one of: starting image acquisition and stopping image acquisition.
  • moving the lighting device includes rotating the lighting device and translating the lighting device.
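The patent does not specify a command protocol between the user controller and the controller circuit board, so the command names and routing below are assumptions; this is only a minimal sketch of how the MCU could route first control commands to the lighting device and second control commands to the camera module, as described above.

```python
# Hypothetical command sets; the patent only names the operations, not a protocol.
FIRST_OPERATIONS = {"open_wings", "close_wings", "move", "tilt", "set_brightness"}
SECOND_OPERATIONS = {"start_capture", "stop_capture"}

def route_command(command: str) -> str:
    """Return which component should execute the given user-controller command."""
    if command in FIRST_OPERATIONS:
        return "lighting_device"   # first operations act on the lighting device
    if command in SECOND_OPERATIONS:
        return "camera_module"     # second operations act on the camera module
    raise ValueError(f"unknown command: {command}")
```

For example, `route_command("tilt")` dispatches to the lighting device, while `route_command("start_capture")` dispatches to the camera module.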
  • FIG. 3h is a reference diagram of the gripping member controlling the tilting motion of the lighting device according to the embodiment of the present application.
  • the component above the abdominal wall is a gripping member
  • the cylindrical block below the abdominal wall is an anchoring component
  • the components below the anchoring component are a lighting device and a camera module.
  • the gripping member can drive the anchoring member to rotate
  • the lighting device and the camera module can also be inclined with respect to the anchoring member.
  • the rotation of the anchoring component and the tilting of the illumination device can greatly increase the illumination range and angle of the illumination device, as well as increase the image acquisition position and angle of the camera module.
  • the controller circuit board 31 can control the rotation of the spur pinion 35 by the motor 36.
  • the spur pinion 35 drives the gear 32 to rotate by the meshing action, and the gear 32 drives the permanent magnet 33 to rotate.
  • the anchoring member 104 can be rotated by the magnetic force between the permanent magnet 33 and the anchoring member 104, and the anchoring member 104 drives the lighting device to rotate.
  • the gripping member 103 can be translated, the gripping member 103 translating the anchoring member 104 by the magnetic force with the anchoring member 104, and the anchoring member 104 causes the illuminating device to translate.
  • the gripping member 103 can be manually translated.
  • the controller circuit board 31 can control the current delivered to the light-emitting components 53 in the illumination device 105 to adjust the brightness of the illumination device 105.
  • the controller circuit board 31 can transmit an instruction to start capturing images to the camera module 106. After receiving the command, the camera module 106 starts acquiring images and transmits the acquired images to the controller circuit board 31.
  • the controller circuit board 31 can directly transmit the image transmitted by the camera module 106 to the monitor 101, and can also process the image transmitted by the camera module 106.
  • the controller circuit board 31 can send an instruction to the camera module 106 to stop acquiring images, and the camera module 106 stops acquiring images after receiving the command.
  • the gripping component may be an external anchoring and control unit (EACU), and the anchoring component, the lighting component, and the camera module may be collectively referred to as a robotic camera.
  • the folded robotic camera is inserted through the trocar into the abdominal cavity.
  • the EACU fixes the robot camera to the inside of the abdominal wall by magnetic force.
  • the flexible cable between the EACU and the robotic camera is used to control signal transmission, imaging data acquisition, power supply, and removal of the robotic camera from the abdominal cavity.
  • the operator can send control signals to the microcontroller (MCU) in the EACU via the user controller.
  • the MCU is the controller board.
  • the MCU controls the wing component of the robot camera to open or close, or controls the robot camera to perform panning, tilting, etc. to adjust the attitude of the robot camera or adjust the brightness of the LED in the illumination device in the robot camera.
  • the controller can also turn the imaging system in the robot camera on or off.
  • the EACU can be the central control unit for the entire laparoscope.
  • the surgical video acquired from the robotic camera is processed in the EACU and sent to the monitor in real time.
  • the MCU may send commands to start or stop image acquisition to the camera in the robot camera, or may send control commands to the motors in the robot camera to open the wing components, close the wing components, rotate, tilt, or adjust the brightness.
  • the EACU has a radially magnetized permanent magnet (EPM) inside.
  • the EPM is magnetically coupled to the internal permanent magnet (IPM) of the robotic camera to provide the anchoring force that fixes the robotic camera to the abdominal wall and the rotational torque for rotational motion control.
  • the tilting motion of the robot camera is driven by the onboard actuation mechanism of the robotic camera.
  • the tilting motion of the robotic camera can be controlled by an onboard drive, i.e., the tilting mechanism 55, together with the first worm and gear set 561.
  • the combination of translational and tilting motion enables the robotic camera to visually cover the entire surgical field.
  • the wing deployment mechanism 52 in the housing controls the opening angle of the wings by the worm and gear set 562.
  • the robot camera can provide a range of pitch motion of 49° and a range of wing motion of 80° (0° in the folded state).
  • the tilting mechanism 55 and the wing deploying mechanism 52 may be a stepping motor.
  • a stepping motor 4 mm in diameter and 14.42 mm in length with a 125:1 planetary gearhead (model ZWBMD004004-125) may be selected.
  • the stepper motor can provide 10 mNm of torque during continuous operation.
  • the worm and gear set for the tilting mechanism and the wing deployment mechanism can have a reduction ratio of 12:1 and 20:1, respectively.
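As a back-of-envelope check of the figures above, the sketch below multiplies the 10 mNm continuous motor torque by each quoted worm-gear reduction ratio. Losses are ignored here, which is an idealization of my own: real worm drives are considerably less efficient, so the true output torque would be lower.

```python
MOTOR_TORQUE_MNM = 10.0   # continuous output torque of the geared stepper, mNm
TILT_REDUCTION = 12       # worm and gear set for the tilting mechanism, 12:1
WING_REDUCTION = 20       # worm and gear set for wing deployment, 20:1

def ideal_output_torque(motor_torque_mnm: float, reduction: int) -> float:
    """Ideal (lossless) output torque: input torque times the reduction ratio."""
    return motor_torque_mnm * reduction

tilt_torque = ideal_output_torque(MOTOR_TORQUE_MNM, TILT_REDUCTION)  # 120.0 mNm
wing_torque = ideal_output_torque(MOTOR_TORQUE_MNM, WING_REDUCTION)  # 200.0 mNm
```

Even with substantial friction losses, torque on this order is ample for tilting a small camera body and opening the wings.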
  • the light emitted from the light-emitting member 53 is bent after passing through the lens member 54, and finally irradiated on the target irradiation region.
  • the range of the target illumination area of the illumination device 105 in the abdominal cavity is not less than the range of the image acquisition area of the camera module 106 in the abdominal cavity.
  • Figure 3I is a schematic illustration of light being redirected by a light emitting component.
  • the FOV (Field Of View) is the field of view of the camera module, and R is the target illumination area.
  • Lighting equipment plays a vital role in determining the quality of surgical images. If a light-emitting diode (LED) combined with a mirror is used as the light source of the illumination device, the unconstrained beam will illuminate areas outside the FOV, wasting most of the energy, or produce a bright center and dark edges on the imaging plane of the imaging sensor.
  • the lighting equipment should meet the following requirements: (1) the illumination is evenly distributed in the target illumination area; and (2) the light efficiency is high, which means that the light is projected to the maximum extent in the field of view of the camera module.
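The patent only requires that uniformity and intensity exceed preset thresholds; the concrete metrics below (a min-to-mean uniformity ratio and an in-FOV flux fraction) are assumptions of mine, chosen to illustrate how the two requirements could be checked on sampled data.

```python
def uniformity(samples: list) -> float:
    """Min-to-mean ratio over the target area; 1.0 means perfectly uniform."""
    mean = sum(samples) / len(samples)
    return min(samples) / mean

def light_efficiency(flux_in_fov: float, total_flux: float) -> float:
    """Fraction of the emitted light that lands inside the camera FOV."""
    return flux_in_fov / total_flux

# Example: illuminance values sampled inside the target illumination area.
target_samples = [0.95, 1.0, 1.05, 1.0]
u = uniformity(target_samples)       # 0.95: fairly uniform
e = light_efficiency(9.0, 10.0)      # 0.9: most light stays inside the FOV
```

A design would then be accepted when both `u` and `e` exceed the preset thresholds.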
  • the lens component 54 can cause the light emitted by the light-emitting component 53 to be mapped to the target illumination area in accordance with the specified mapping relationship.
  • the mapping can also be understood as projection or irradiation, that is, the lens member 54 can cause the light emitted from the light-emitting member 53 to be projected or irradiated onto the target irradiation region according to the specified mapping relationship.
  • the specified mapping relationship is: a mapping relationship between the illumination uniformity of the target illumination area of the illumination device 105 in the abdominal cavity is not less than a preset uniformity threshold, and the illumination intensity is not less than a preset intensity threshold.
  • the specified mapping relationship is determined based on the refractive index of the lens member 54, the designated volume of the lens member 54, the size of the light-emitting member 53, the light intensity distribution of the light-emitting member 53, and the relative position between the light-emitting member and the target irradiation region.
  • the light emitted from the light-emitting component changes its optical path after passing through the lens and is irradiated onto the target illumination area according to the specified mapping relationship, so that the target illumination area has a certain illumination uniformity and illumination intensity, providing reliable and stable illumination for minimally invasive surgery.
  • the above specified mapping relationship can be understood as the mapping relationship determined by the lens, and may be obtained from a surface gradient. Specifically, a surface shape function of the lens is constructed from the surface gradient ∇u_∈ such that, when light emitted by the light-emitting component passes through the lens, the specified mapping relationship holds between the light projected from the lens and the light emitted by the light-emitting component.
  • here u_∈ is the solution of the equation referred to below, in which ∈ is a constant coefficient used to assist in computing the solution; E_s is the illuminance distribution function of the light-emitting component, defined on the computational domain ζ = {(ξ, η) | ξ² + η² ≤ 1}; Ω_s is the source domain of the light-emitting component; ξ and η are the abscissa and the ordinate of the projection plane ξ-η of the light-emitting component, respectively; I_0 is the light intensity distribution on the axis of the light-emitting component, that is, at a polar angle of 0 degrees; BC is the boundary condition; and E_t is the illuminance distribution function of the preset target illumination area, determined according to the preset uniformity threshold and the preset intensity threshold.
  • the above surface gradient can be determined by the steps of the flow diagram shown in FIG. 4:
  • Step S401: take a first initial value as the illuminance distribution function E_t of the target illumination area.
  • Step S402: substitute E_t into the equation and obtain the solution u_∈.
  • Step S403: determine the simulated illuminance distribution function Ẽ_t of the target illumination area according to u_∈. In this step, the surface gradient can be determined from u_∈, the surface shape function of the lens from that gradient, and then, given the known illuminance distribution function of the light-emitting component, the illuminance distribution of the light after the lens surface shape function acts on it is determined and taken as Ẽ_t of the target area.
  • Step S404: determine whether the difference between Ẽ_t and E_t is less than a preset value; if yes, perform step S405, and if no, perform step S406. The difference between Ẽ_t and E_t can be the difference of the two, or the variance between them; the preset value is a value set in advance.
  • Step S405: take the gradient of u_∈ to obtain ∇u_∈.
  • Step S406: calculate the corrected illuminance distribution function, take it as the value of the illuminance distribution function E_t, and return to step S402.
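Steps S401–S406 above form a feedback-correction loop: solve for u_∈ given a requested target distribution, simulate the resulting illuminance, and multiplicatively correct the request until the simulation matches the desired target. A minimal Python sketch, with the PDE solve and optical ray-trace collapsed into a hypothetical placeholder `simulate` (a fixed linear distortion standing in for the real lens model):

```python
import numpy as np

def design_loop(E_target, simulate, max_iter=50, tol=1e-6):
    """Feedback-correction loop of steps S401-S406 (sketch).

    E_target : desired illuminance samples on the target area
    simulate : maps a requested distribution to the illuminance the
               designed lens would actually produce (stands in for
               solving the equation and ray-tracing the lens)
    """
    E_t = E_target.copy()                            # S401: first initial value
    for _ in range(max_iter):
        E_sim = simulate(E_t)                        # S402-S403: solve + simulate
        if np.max(np.abs(E_sim - E_target)) < tol:   # S404: close enough?
            return E_t                               # S405: accept this request
        E_t = E_t * E_target / E_sim                 # S406: multiplicative correction
    return E_t

# toy "lens": delivers 90% of what is asked for, plus a fixed bias
bias = np.array([0.05, 0.02, 0.0, 0.03])
sim = lambda E: 0.9 * E + bias
target = np.ones(4)
E_req = design_loop(target, sim)
print(np.allclose(sim(E_req), target, atol=1e-5))  # corrected request hits the target
```

The multiplicative correction `E_t * E_target / E_sim` mirrors step S406; in the real pipeline, `simulate` would involve solving for u_∈, building the lens surface, and ray-tracing it.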
  • step S402 can be performed in the following manner:
  • Step 1: take a second initial value and a third initial value as the values of u_∈ and ∈, respectively. The second initial value is a guessed solution of the equation.
  • Step 2: substitute the values of u_∈ and ∈ into the equation.
  • Step 3: numerically discretize the equation after substitution, and determine the solution u_∈ of the discretized equation with a numerical solver.
  • Step 4: determine whether the value of ∈ is less than a preset minimum value; if yes, take the determined solution u_∈ as the solution result of the equation; if no, update the values of u_∈ and ∈ and return to step 2.
  • when updating u_∈, the solution u_∈ determined in step 3 can be taken as the updated u_∈; the value of ∈ in the constant sequence can be determined according to the direction of deviation between the solved u_∈ and the substituted u_∈.
  • E s ( ⁇ , ⁇ ) and E t (x, y) denote the illuminance distribution of the LED source, that is, the illuminance distribution of the LED source and the specified target radiation distribution, respectively.
  • the above equation is considered to be a special case of the L 2 Monge-Kantorovich problem. Assuming no transmission energy loss, ⁇ should be satisfied
  • u_∈^h denotes the numerical solution of equation (6) with grid size h.
  • the final value of ∈ in equation (6) is related to h and is chosen to achieve an optimized convergence speed and minimize the error. This relationship depends on the norm used. According to the experimental data obtained in the present application, the smallest global error is obtained when ∈ = h.
  • the first- and second-order partial derivatives in equation (8) are discretized with the central finite-difference method in the interior of Ω_s, and with forward/backward finite-difference methods with second-order truncation error on the boundary ∂Ω_s.
  • the biharmonic term Δ²u_∈ in equation (8) can be discretized with a thirteen-point stencil.
  • U ⁇ represents the vector of the variable u ⁇ .
  • the ray mapping method proposed above requires the irradiance distribution E_s(ξ, η) of the LED light source.
  • however, a high-power LED, usually regarded as a Lambertian source, is defined by a luminous intensity distribution over hemispherical space; this embodiment applies stereographic projection to convert the light intensity of the source into an irradiance distribution defined on a plane.
  • the final form of the irradiance E_s on the ξ-η plane is given by equation (12).
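With the standard stereographic projection, a Lambertian intensity I = I₀cosθ maps to a planar irradiance of the form E_s(ξ, η) = 4I₀(1 − ξ² − η²)/(1 + ξ² + η²)³ inside the unit disk — our reconstruction of the un-rendered formula, so the exact constants are an assumption. A quick numerical check that this planar distribution conserves the total flux πI₀ of a Lambertian source:

```python
import numpy as np

def E_s(xi, eta, I0=1.0):
    """Planar irradiance from stereographic projection of I = I0*cos(theta)."""
    r2 = xi**2 + eta**2
    return np.where(r2 <= 1.0, 4.0 * I0 * (1.0 - r2) / (1.0 + r2)**3, 0.0)

# integrate E_s over the unit disk on a fine grid; a Lambertian source of
# peak intensity I0 = 1 emits a total flux of pi
n = 1001
xi = np.linspace(-1, 1, n)
XI, ETA = np.meshgrid(xi, xi)
flux = np.sum(E_s(XI, ETA)) * (2.0 / (n - 1))**2
print(abs(flux - np.pi) < 1e-2)  # ~pi, confirming energy conservation
```

The closed-form check: ∫₀¹ 4(1 − r²)/(1 + r²)³ · 2πr dr = π, which is exactly the hemispherical flux ∫ I₀cosθ dΩ of the Lambertian model.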
  • based on the computed ray mapping, each pair of coordinates (ξ_i, η_j) in the space Σ_L{x_L, y_L, z_L} can be mapped to a point T′_{i,j} = (x′_i, y′_j, z′(x_i, y_j)) on the target plane in the space Σ_G{x_G, y_G, z_G}, where i and j denote the discretization indices of the light source.
  • using the rotation matrix and translation vector between Σ_G and Σ_L, T′_{i,j} can be represented by T_{i,j} in Σ_L, as shown in Fig. 5d(2).
  • O_{i,j} is defined as the unit outward ray from the optical surface, formulated as O_{i,j} = (T_{i,j} − p_{i,j}) / ‖T_{i,j} − p_{i,j}‖ (equation (13)), where p_{i,j} denotes the point to be constructed on the surface.
  • the initial point p_{1,1} can be selected manually according to the required lens volume; O_{1,1} is then calculated with formula (13).
  • the normal vector at p_{i,j} can be calculated from Snell's law, equation (14), where n_0 denotes the refractive index of the medium surrounding the lens and n_1 the refractive index of the lens.
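Equation (14) recovers the surface normal that refracts the incident unit ray I_{i,j} (inside the lens, index n₁) into the required outgoing unit ray O_{i,j} (in the surrounding medium, index n₀); a common vector form is N = (n₀O − n₁I)/‖n₀O − n₁I‖, taken here as an assumption since the rendered formula is not reproduced in the text. The sketch checks the result against the vector form of Snell's law, n₁(I × N) = n₀(O × N):

```python
import numpy as np

def snell_normal(I, O, n0, n1):
    """Surface normal that refracts unit ray I (index n1) into unit ray O (index n0)."""
    v = n0 * O - n1 * I
    return v / np.linalg.norm(v)

n0, n1 = 1.0, 1.49                    # air and PMMA (refractive index 1.49)
I = np.array([0.0, 0.0, 1.0])         # ray travelling up the lens axis
O = np.array([0.3, 0.0, 0.954])       # desired outgoing direction
O = O / np.linalg.norm(O)

N = snell_normal(I, O, n0, n1)
# vector Snell's law: n1 * (I x N) == n0 * (O x N)
print(np.allclose(n1 * np.cross(I, N), n0 * np.cross(O, N), atol=1e-12))
```

The identity holds by construction, since N is parallel to n₀O − n₁I; in the surface-building loop this normal is what defines the plane intersected by the next ray I_{1,2}.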
  • the coordinates of the next point p_{1,2} on the curve are calculated as the intersection between the ray I_{1,2} and the plane defined by p_{1,1} and N_{1,2}.
  • after the points on the first curve are obtained, the points of the curves in the second direction can be calculated by using the points on the first curve as initial points.
  • ⁇ i, j represents the distance between S and the surface point p i, j .
  • the nonlinear least squares method is used to minimize F 1 ( ⁇ ) 2 +...+F N ( ⁇ ) 2 , where ⁇ i,j is used as a variable.
  • the updated normal vector N_{i,j} is calculated according to equation (14) using the currently optimized ρ and the ray mapping. The iteration proceeds, calculating a new ρ, until the calculated surface points satisfy the convergence condition ‖ρ_t − ρ_{t−1}‖ < δ, where t denotes the current iteration number and δ is the stopping condition.
  • the optical surface can be represented by the use of free surface points with Non-Uniform Rational Basis Spline (NURBS).
  • FIGS 6(a) and (b) show the on-axis and off-axis experiments, which respectively use optical design software to study the effectiveness of optical design methods in different application scenarios.
  • polymethyl methacrylate (PMMA) with a refractive index of 1.49 is used as the lens material, and an NCSWE17A LED with a luminous flux of 118 lm as the light source.
  • the ray mapping relationship in the following is the specified mapping relationship.
  • calculation of the ray mapping relationship: first, the light intensity distribution of the LED (Fig. 6(c)) is converted into a normalized illuminance distribution (Fig. 6(d)).
  • the computational domain ξ ∈ [-1, 1], η ∈ [-1, 1] of the LED is discretized with an 81 × 81 grid; according to the ray mapping algorithm, the grid size h = 0.025 sets the minimum value of ε at 0.025.
  • the present application selects the sequence ε = 1, 0.5, 0.025 to approximate the numerical solution of the ray mapping.
  • the intermediate ray mapping results calculated with ε = 1, 0.5, and 0.025 are demonstrated; the mapping calculated with ε = 0.025 is used to generate the initial surface of the freeform optical lens for the LED.
  • Figure 6 shows the simulation setup for evaluating the freeform optical design method.
  • On-axis test: the LED axis coincides with the axis of the target illumination area; the target illumination area is circular or square during the test.
  • Figure 7 shows the on-axis ray mappings computed for the circular and square target areas with ε = 1, 0.5, 0.025 on an 81 × 81 grid; for clear visualization, a 61 × 61 grid is drawn in the figure.
  • Figure 8 shows the convergence speed of the ray-mapping generation method.
  • the convergence speed is characterized by the residual value of ‖F‖₂ in equation (11) and the number of iterations; the residual ‖F‖₂ is in millimeters.
  • considering that a freeform optical lens can be at the sub-micron scale (10⁻⁴ mm), the convergence threshold can conservatively be set at the nanometer scale (10⁻⁷ mm). In all experiments, ‖F‖₂ reaches 10⁻⁷ after 10 iterations.
  • (a) - (c) and (d) - (f) are the convergence speeds for the circular region and the square region when ε is 1, 0.5 and 0.025, respectively.
  • in Figure 8, "Residual value of ‖F‖₂" denotes the residual ‖F‖₂, "Circular target region" the circular target illumination area, "Square target region" the square target illumination area, and "Iteration" the iteration number.
  • Figure 8(a) shows the simulation setup of the on-axis test for the freeform optical lens design.
  • the on-axis test was performed with a circular target illumination area of radius R = 80 mm and a square target illumination area of side length 2R = 160 mm; the illumination distance from the LED to the center of the target area is set to D = 100 mm.
  • Figures 9(a) and (b) show the designed lens profiles with marked dimensions.
  • Figures 11(c) and (d) show the simulated illuminance distribution on the target illumination area. Considering Fresnel losses, the optical efficiency of the freeform lens is 88.3% and 90.5%, respectively. The illuminance uniformity can be calculated by equation (19), where σ and μ are the standard deviation and average of the collected illuminance data.
  • Table 2 details the optical performance of the on-axis test.
  • Figure 9 is an on-axis freeform lens design for two different illumination patterns.
  • (a) and (b) show the lens profiles of the circular area and the square area, respectively.
  • (c) and (d) show the uniformity of illumination performed on the target plane by (a) and (b), respectively.
  • in Fig. 9, the units of the x-axis and the y-axis are millimeters; lm is the unit lumen; Min, Max and Ave are the minimum, maximum and average values of the average illuminance; Total Flux denotes the total luminous flux; Flux/Emitted Flux denotes the ratio of luminous flux to emitted luminous flux; and Incident Rays denotes the incident rays.
  • FIG 10(b) illustrates the simulated setup for the off-axis test.
  • the illumination area is set to a circular area having a radius R of 80 mm.
  • axial offsets Δd of 5 mm, 10 mm and 15 mm were introduced to evaluate the illumination performance when the axis of the LED and the axis of the target illumination area do not coincide.
  • a transformation matrix is needed to convert the ray map from global coordinates to the local coordinates of the LED.
  • Figure 10 shows the lens profile and simulated illuminance distribution results for each case.
  • Figures 10(a), (d) and (g) show the simulated illuminance distribution of the circular target illumination area.
  • the optical efficiency of the free-form lens is 88.06%, 87.74% and 88.15%, respectively, considering the Fresnel loss.
  • Figures 10(c), (f) and (i) show the illuminance uniformity in the horizontal and vertical directions in the illumination region. Table 3 summarizes the optical performance of the off-axis test.
  • the lens mounting position L on the wing is set to 20.5 mm.
  • a lens volume with a maximum radial length ρ_max of 5.4 mm is set to ensure that the three lenses can be fitted into the robotic camera.
  • the initial illumination distance D is set to 100 mm.
  • the radius of the target circular area R is set to 80 mm. Table 4 summarizes the specifications of the free optical lens design for laparoscopic lighting equipment.
  • Figure 11 shows a three dimensional (3D) design of a laparoscopic lighting device.
  • Figure 11 (a) shows three views of the freeform surface.
  • Figure 11 (b) shows the compactness of the lens that meets the lens volume limit.
  • Figure 11 (c) shows the integration of a lens and an LED in one wing.
  • Figure 11 (d) shows the 3D structure of the assembled laparoscopic lighting device.
  • the Front view is a front view
  • the Side view is a side view
  • the Top view is a top view
  • the Restriction of lens volume is a lens volume limit.
  • Fig. 12(a) shows the illuminance distribution on the target irradiation area.
  • the optical efficiency of the designed freeform lens is 89.45%, which means that 105.55 lm out of the total 118 lm of light is successfully projected onto the desired target illumination area.
  • the average illuminance provided by a single LED is 5473.8 lx.
  • the horizontal and vertical illuminance uniformities are 95.87% and 94.78%, respectively, as shown in Fig. 12(b).
  • Figure 12(c) shows the illuminance distribution on the target illumination area when all of the LEDs are powered.
  • the total luminous flux provided by the illuminating device is 354 lumens, while the total luminous flux falling in the target illuminated area is 316.58 lumens and the optical efficiency is 89.43%.
  • the average illumination of the target illumination area is 12,441lx.
  • Figure 12(d) shows that the illuminance uniformities in the horizontal and vertical directions are 96.33% and 96.79%, respectively.
  • Figure 12(e) shows the illuminance distribution of the target illumination area as a 3D profile. The evaluation results of the lighting performance are summarized in Table 5. It can be clearly seen that the laparoscopic illumination device developed in this embodiment of the present application satisfies all the design requirements in Table 6.
  • the distance D between the camera and the target surgical field may be less than 100 mm.
  • although the wings of the illumination device at an angle of 80 degrees can still illuminate this area, the uniformity of illumination is reduced and more energy is wasted outside the FOV.
  • the in vivo laparoscopic illumination device proposed by the embodiment of the present application has a refocusing function.
  • by adjusting the angle of the wing, the target illumination area can be uniformly illuminated when the distance from the camera module to the target changes, thereby controlling the light beam.
  • the required change of the wing angle can be determined from the geometry of this setup and is calculated to be 6°, so the wing angle is set to 74°.
  • the average illuminance of a circular area having a radius R of 48 mm was calculated to be 45,823 lx.
  • the optical efficiency is about 92%.
  • the illuminance uniformity in the horizontal and vertical directions was 98.29% and 98.22%, respectively.
  • the average illuminance of a circular area having a radius R of 64 mm was calculated to be 24,172 lx. Considering the Fresnel loss, the optical efficiency is 90.9%.
  • the horizontal and vertical illumination uniformity were 95.37% and 95.98%, respectively.
  • the illumination performance of the refocused beam is summarized in Table 7.

Abstract

A laparoscope, comprising: a monitor (101), a trocar (102), a grasping component (103), an anchoring component (104), an illumination device (105), and a camera module (106). The monitor (101) displays images acquired by the camera module (106); the trocar (102) is placed in an opening in the abdominal wall; the grasping component (103) is placed outside the abdominal cavity; the anchoring component (104), the illumination device (105), and the camera module (106) are all placed inside the abdominal cavity. The anchoring component (104) is fixed to the abdominal wall by the attractive force between it and the grasping component (103); the camera module (106) is connected to the anchoring component (104) through the illumination device (105); and the anchoring component (104), illumination device (105), and camera module (106) enter the abdominal cavity through the trocar (102). With the solution provided by this laparoscope, no additional opening needs to be made in the abdominal wall, so the number of openings in the abdominal wall, and hence the tissue damage to the abdominal wall, is reduced.

Description

A Laparoscope
This application claims priority to Chinese Patent Application No. 201711310849.9, filed with the Chinese Patent Office on December 11, 2017 and entitled "A Laparoscope", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the technical field of medical devices, and in particular to a laparoscope.
Background
The development of in-vivo laparoscopes that are inserted into the abdominal cavity through a small incision and can provide surgeons with real-time visual feedback has been an important research topic for improving the clinical operability of minimally invasive surgery (MIS).
A conventional laparoscope is a rigid long-rod laparoscope comprising a trocar, a slender hollow needle, a monitor connected to the hollow needle, and a camera at the free end of the hollow needle. The camera includes an image sensor, a lens, and an illumination source. The hollow needle is configured to pass through the tissue layers of the abdominal wall into the abdominal cavity, and the camera can be mounted manually at the free end of the hollow needle through the trocar. In use, the camera can image the tissue in the abdominal cavity under the illumination source and transmit the acquired images to the monitor outside the abdominal cavity.
When such a laparoscope is used, at least two openings usually need to be made in the abdominal wall. A trocar is placed in one opening, through which surgical instruments reach into the abdominal cavity to operate on the tissue; the other opening is used by the hollow needle to reach into the abdominal cavity to illuminate and image the tissue. However, two or more openings cause considerable tissue damage to the abdominal wall.
Summary
The purpose of the embodiments of the present application is to provide a laparoscope, so as to reduce the number of openings in the abdominal wall and thereby reduce the tissue damage to the abdominal wall. The specific technical solutions are as follows.
An embodiment of the present application provides a laparoscope, comprising: a grasping component, an anchoring component, an illumination device, and a camera module;
the camera module is connected to the anchoring component through the illumination device;
the grasping component is placed outside the abdominal cavity during surgery, and is used to fix the anchoring component onto the abdominal wall inside the abdominal cavity through the attractive force between the grasping component and the anchoring component;
the anchoring component, the illumination device, and the camera module enter the abdominal cavity during surgery through a trocar used for inserting surgical instruments.
Optionally, the laparoscope further comprises a monitor for displaying the images acquired by the camera module.
Optionally, the illumination device comprises: a wing component including at least three wings evenly arranged in space, a wing-unfolding mechanism, and a light-emitting component and a lens component located on each wing, the lens component covering the outside of the light-emitting component;
the wing-unfolding mechanism is connected to the wing component and can cause the wing component to unfold; the wing component enters the abdominal cavity in a folded state, and is in an unfolded state once the illumination device is inside the abdominal cavity and in the working state.
Optionally, the illumination device further comprises a tilting mechanism; the tilting mechanism can cause the illumination device to tilt.
Optionally, the camera module is fixed at the middle position of the wing component; when the wing component is in the unfolded state, the camera module can acquire images of the interior of the abdominal cavity, and when the wing component is in the folded state, the camera module is inside the wing component.
Optionally, the laparoscope further comprises a user controller, and the grasping component comprises a controller circuit board;
the controller circuit board is configured to receive a first control command sent by the user controller and, according to the first control command, control the illumination device to perform a first operation; and to receive a second control command sent by the user controller and, according to the second control command, control the camera module to perform a second operation;
wherein the first operation includes opening the wing component, closing the wing component, moving the illumination device, tilting the illumination device, and adjusting the brightness of the illumination device; and the second operation includes starting image acquisition and stopping image acquisition.
Optionally, the range of the target illumination area of the illumination device in the abdominal cavity is not less than the range of the image acquisition area of the camera module in the abdominal cavity.
Optionally, the lens component causes the light emitted by the light-emitting component to be mapped onto the target illumination area according to a specified mapping relationship;
the specified mapping relationship is a mapping relationship under which the illumination uniformity of the target illumination area of the illumination device in the abdominal cavity is not less than a preset uniformity threshold and the illumination intensity is not less than a preset intensity threshold; the specified mapping relationship is determined according to the refractive index of the lens component, the specified volume of the lens component, the size of the light-emitting component, the light intensity distribution of the light-emitting component, and the relative position between the light-emitting component and the target illumination area.
Optionally, the specified mapping relationship is obtained according to the surface gradient ∇u_∈, where u_∈ is the solution of the equation
−∈ Δ²u_∈ + det(D²u_∈) = E_s(ξ, η) / E_t(∇u_∈), (ξ, η) ∈ Ω_s, subject to the boundary condition BC,
where ∈ is a constant coefficient; E_s is the illuminance distribution function of the light-emitting component, defined on the computational domain ζ = {(ξ, η) | ξ² + η² ≤ 1}; Ω_s is the source domain of the light-emitting component; ξ and η are respectively the abscissa and the ordinate of the projection plane of the light-emitting component; I_0 is the light intensity distribution on the axis of the light-emitting component; BC is the boundary condition; and E_t is the preset illuminance distribution function of the target illumination area, determined according to the preset uniformity threshold and the preset intensity threshold.
Optionally, the surface gradient ∇u_∈ is determined in the following manner:
taking a first initial value as the illuminance distribution function E_t of the target illumination area;
substituting E_t into the above equation to obtain the solution u_∈;
determining a simulated illuminance distribution function Ẽ_t of the target illumination area according to u_∈;
determining whether the difference between Ẽ_t and E_t is less than a preset value;
if yes, taking the gradient of u_∈ to obtain ∇u_∈;
if no, calculating a corrected illuminance distribution function, taking the corrected illuminance distribution function as the value of the illuminance distribution function E_t, and returning to the step of substituting E_t into the equation.
Optionally, the solution u_∈ of the equation is obtained in the following manner:
taking a second initial value and a third initial value as the values of u_∈ and ∈, respectively;
substituting the values of u_∈ and ∈ into the equation;
numerically discretizing the equation after substitution, and determining the solution u_∈ of the discretized equation with a numerical solver;
determining whether the value of ∈ is less than a preset minimum value; if yes, taking the determined solution u_∈ as the solution result of the equation; if no, updating the values of u_∈ and ∈ and returning to the step of substituting the values of u_∈ and ∈ into the equation.
The laparoscope provided by the embodiments of the present application comprises: a monitor, a trocar, a grasping component, an anchoring component, an illumination device, and a camera module. The grasping component is placed outside the abdominal cavity; the anchoring component, the illumination device, and the camera module are all placed inside the abdominal cavity; the anchoring component is fixed on the abdominal wall by the attractive force between it and the grasping component; and the camera module is connected to the anchoring component through the illumination device, so as to illuminate and image the tissue in the abdominal cavity. Moreover, the anchoring component, the illumination device, and the camera module enter the abdominal cavity through the trocar, and this trocar is the one used for inserting surgical instruments; the opening in which the trocar sits is an opening that is necessary for the operation anyway, so no additional opening needs to be made in the abdominal wall. The number of openings in the abdominal wall, and hence the tissue damage to the abdominal wall, can therefore be reduced. Of course, implementing any product or method of the present application does not necessarily require achieving all of the advantages described above at the same time.
Brief Description of the Drawings
To explain the technical solutions of the embodiments of the present application or of the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; other drawings can be obtained from them by those of ordinary skill in the art without creative effort.
Figure 1 is a schematic structural diagram of a laparoscope provided by an embodiment of the present application;
Figure 2a is a reference diagram of a practical application scenario of the laparoscope;
Figure 2b is a schematic structural diagram of a movable connection between the illumination device and the camera module;
Figures 3a and 3b are schematic structural diagrams of the illumination device provided by an embodiment of the present application in the unfolded state and the folded state, respectively;
Figure 3c1 is a schematic structural diagram of the illumination device and the camera module provided by an embodiment of the present application;
Figure 3c2 is another schematic structural diagram of the illumination device and the camera module provided by an embodiment of the present application;
Figure 3d is a schematic structural diagram of the illumination device provided by an embodiment of the present application;
Figures 3e1 and 3e2 are two reference diagrams corresponding to Figure 3d;
Figure 3f1 is another schematic structural diagram of the laparoscope provided by an embodiment of the present application;
Figure 3f2 is a reference diagram corresponding to Figure 3f1;
Figure 3g1 is a schematic structural diagram of the grasping component provided by an embodiment of the present application;
Figure 3g2 is a reference diagram corresponding to Figure 3g1;
Figure 3h is a reference diagram of the grasping component controlling the tilting of the illumination device, provided by an embodiment of the present application;
Figure 3I is a schematic diagram of the light of the light-emitting component being redirected, provided by an embodiment of the present application;
Figure 4 is a schematic flowchart of the process of determining the surface gradient, provided by an embodiment of the present application;
Figures 5a to 5d are reference diagrams for determining the specified mapping relationship, provided by an embodiment of the present application;
Figures 6 to 13 are reference diagrams for the evaluation and testing of the lens optical design, provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below clearly and completely with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
In order to reduce the number of openings in the abdominal wall and thereby reduce the tissue damage to the abdominal wall, an embodiment of the present application provides a laparoscope. The present application is described in detail below through specific embodiments.
Figure 1 is a schematic structural diagram of a laparoscope provided by an embodiment of the present application. Figure 2a is a reference diagram of an application scenario of the laparoscope provided by an embodiment of the present application. The laparoscope comprises: a grasping component 103, an anchoring component 104, an illumination device 105, and a camera module 106. Figure 1 also shows a monitor 101 and a trocar 102. The camera module 106 may include sub-components such as an imaging sensor and a lens.
In this embodiment, the camera module 106 is connected to the anchoring component 104 through the illumination device 105. The grasping component 103 is placed outside the abdominal cavity during surgery, and fixes the anchoring component 104 onto the abdominal wall inside the abdominal cavity through the attractive force between the grasping component 103 and the anchoring component 104. The anchoring component 104, the illumination device 105, and the camera module 106 enter the abdominal cavity during surgery through a trocar used for inserting surgical instruments. In this embodiment, during surgery the grasping component 103 is placed outside the abdominal cavity and the anchoring component 104 is placed on the inner abdominal wall.
Figure 2a shows a surgical instrument 100 inserted into the trocar 102.
In this embodiment, the grasping component 103 and the anchoring component 104 may both be magnetic components, which can be magnetically attached to the outside and the inside of the abdominal wall, respectively. In practical applications, the anchoring component 104, the illumination device 105, and the camera module 106 may be fixedly connected together. When the anchoring component 104 is fixed on the inside of the abdominal wall, the illumination device 105 and the camera module 106 connected to it are also anchored in the abdominal cavity.
In another implementation, the illumination device 105 and the camera module 106 may be fixedly connected together, while the connection between the anchoring component 104 and the illumination device 105 may be a movable connection; that is, the angle between the anchoring component 104 and the illumination device 105 is adjustable, see Figure 2b.
When powering the illumination device and the camera module, the power supply may be inside the illumination device or the camera module, or the two may be powered through a dedicated power cable.
In actual use, the anchoring component 104, the illumination device 105, and the camera module 106 can be delivered into the abdominal cavity through the trocar 102 with forceps. Specifically, the grasping component 103 can first be placed outside the abdominal wall near the trocar; the anchoring component 104, the illumination device 105, and the camera module 106 are then delivered into the abdominal cavity from the trocar 102 with forceps, and the forceps are released once the anchoring component 104 and the grasping component 103 attract each other. The anchoring component 104, the illumination device 105, and the camera module 106 are thereby fixed in the abdominal cavity.
When the positions of the illumination device 105 and the camera module 106 in the abdominal cavity need to be adjusted, the grasping component 103 can be moved, so that the anchoring component 104 moves under the attractive force of the grasping component 103 to a target position. After moving to the target position, the anchoring component is again fixed on the abdominal wall inside the abdominal cavity. The movement described above includes translation and rotation.
When the angle between the anchoring component 104 and the illumination device 105 is adjustable, this angle can be set before the anchoring component 104, the illumination device 105, and the camera module 106 are delivered into the abdominal cavity.
After the anchoring component 104, the illumination device 105, and the camera module 106 have been delivered into the abdominal cavity and fixed, surgical instruments can be inserted into the abdominal cavity through the trocar, and the tissue in the abdominal cavity can be operated on under the light of the illumination device. The camera module 106 can send the acquired images of the tissue in the abdominal cavity to the monitor 101, and the monitor 101 can display the received images for the surgical staff to view.
It can be seen from the above that, in this embodiment, the anchoring component is fixed on the abdominal wall by the attractive force between it and the grasping component, and the camera module is connected to the anchoring component through the illumination device, so as to illuminate and image the tissue in the abdominal cavity. The anchoring component, the illumination device, and the camera module enter the abdominal cavity through the trocar, and this trocar is the one used for inserting surgical instruments; the opening in which the trocar sits is necessary for the operation anyway, so no additional opening needs to be made in the abdominal wall. The number of openings in the abdominal wall, and hence the tissue damage to the abdominal wall, can therefore be reduced.
At the same time, because the grasping component can move the anchoring component through the attractive force between them, the illumination device and the camera module can be moved as well, so that their positions can be adjusted and the positions of the in-vivo illumination area and imaging area can be changed.
In the related art, a coaxial configuration of the imaging sensor and the light source leads to a lack of shadow depth cues in the output two-dimensional images, which prevents the surgeon from determining the depth and position of each tissue accurately. A coaxial configuration is one in which the central axis of the imaging sensor and the central axis of the light source are parallel.
In order to increase the shadow depth information in the intra-abdominal images acquired by the camera module, the laparoscope of the embodiment shown in Figure 1 can be further improved by arranging the imaging sensor and the light source in a non-coaxial configuration, i.e., one in which the central axis of the imaging sensor and the central axis of the light source are not parallel.
In another embodiment of the present application, on the basis of the embodiment shown in Figure 1, the illumination device 105 may comprise: a wing component 51 including at least three wings evenly arranged in space, a wing-unfolding mechanism 52, and a light-emitting component 53 and a lens component 54 located on each wing, the lens component 54 covering the outside of the light-emitting component 53.
The wing-unfolding mechanism 52 is connected to the wing component 51 and can cause the wing component 51 to unfold. The wing component 51 enters the abdominal cavity in a folded state, and is in an unfolded state once the illumination device 105 is inside the abdominal cavity and in the working state. The wing-unfolding mechanism 52 may be an electric motor or another device that can provide a driving force.
Optionally, the illumination device 105 may further comprise a tilting mechanism 55. The tilting mechanism 55 can cause the illumination device 105 to tilt, and may be an electric motor or another device that can provide a driving force.
Figure 3a is a schematic structural diagram of the illumination device of this embodiment in the unfolded state, and Figure 3b in the folded state. The illumination device 105 in Figures 3a and 3b comprises the three wings of the wing component 51, the tilting mechanism 55, the wing-unfolding mechanism 52, and the light-emitting component 53 and lens component 54 on each wing.
In practical applications, the illumination device is delivered into the abdominal cavity in the folded state. When the anchoring component is anchored on the abdominal wall, the structure of the illumination component is converted from the folded state into the unfolded state.
As a specific implementation, in the folded state the outer diameter of the illumination device may be 17 mm, so that it can enter the abdominal cavity through a trocar with a diameter of 20 mm.
In summary, in this embodiment the illumination device includes a wing component with at least three wings evenly arranged in space, with a light-emitting component and a lens component on each wing. In this way, no matter where the camera module is fixed on the illumination device, a non-coaxial configuration of the imaging sensor and the light sources is guaranteed, which increases the shadow depth information in the images and provides the surgical staff with a better field of view.
In another embodiment of the present application, Figure 3c1 is a schematic structural diagram of the relative position of the illumination device and the camera module, and Figure 3c2 is a schematic structural diagram of the illumination device and the camera module from another angle. The camera module 106 may be fixed at the middle position of the wing component 51. When the wing component 51 is in the unfolded state, the camera module 106 can acquire images of the interior of the abdominal cavity; when the wing component 51 is in the folded state, the camera module 106 is inside the wing component 51.
In this implementation, fixing the camera module at the middle position of the wing component facilitates the folding and unfolding of the wing component, and the structure is also easier to implement.
In another embodiment of the present application, Figure 3d is a schematic diagram of the internal structure of the illumination device, and Figures 3e1 and 3e2 are two reference diagrams corresponding to Figure 3d.
In Figure 3d, the illumination device further includes two worm-and-gear sets 56: a first worm-and-gear set 561 connects the tilting mechanism 55 and the anchoring component 104, and a second worm-and-gear set 562 connects the wing-unfolding mechanism 52 and the wing component 51. When the wing component 51 includes three wings, the worm-and-gear set 562 may include one worm and three gears, the three gears being connected to the three wings, respectively.
Specifically, the worm of the first worm-and-gear set 561 may be connected to the tilting mechanism 55, and the gear of the first worm-and-gear set 561 to the anchoring component 104. Driven by the tilting mechanism 55, the worm of the first set 561 drives the gear to rotate, creating an angle between the illumination device 105 and the anchoring component 104.
The worm of the second worm-and-gear set 562 may be connected to the wing-unfolding mechanism 52, and the gears of the second set 562 to the wing component 51. Driven by the wing-unfolding mechanism 52, the worm of the second set 562 drives the gears to rotate, unfolding or folding the wing component 51.
When the wing component of the illumination device includes three wings, the second worm-and-gear set 562 may include one worm and three gears. All three gears mesh with the worm, and each gear is connected to one wing. Under the driving force provided by the worm, each gear drives the corresponding wing to unfold or fold.
In another embodiment of the present application, the laparoscope shown in Figure 1 may further comprise a user controller 107, and the grasping component comprises a controller circuit board. See Figure 3f1, another schematic structural diagram of the laparoscope provided by an embodiment of the present application, and Figure 3f2, a reference diagram corresponding to Figure 3f1.
In the embodiment shown in Figure 3f1, the user controller 107 can communicate with the grasping component 103. The camera module 106 is connected to the grasping component 103 and sends the acquired images to the grasping component 103, which sends the received images on to the monitor. The camera module may be connected to the grasping component through a cable.
Figure 3g1 is a schematic diagram of the internal structure of the grasping component 103, and Figure 3g2 is a corresponding reference diagram. The grasping component 103 comprises: a controller circuit board 31, a gear 32, a permanent magnet 33, a bearing 34, a spur pinion 35, and an electric motor 36. The tilting mechanism 55 and the wing-unfolding mechanism 52 in the illumination device may both be connected to the controller circuit board 31. The shaft of the permanent magnet 33 is connected to the shaft of the gear 32 and to the bearing 34, respectively. The motor 36 can drive the spur pinion 35 to rotate, the spur pinion 35 meshes with the gear 32, and the motor 36 can thus drive the permanent magnet 33 to rotate through the gear 32 and the spur pinion 35.
The controller circuit board 31 is configured to receive a first control command sent by the user controller 107 and control the illumination device 105 to perform a first operation according to the first control command; and to receive a second control command sent by the user controller 107 and control the camera module 106 to perform a second operation according to the second control command.
The first operation may include at least one of: opening the wing component, closing the wing component, moving the illumination device, tilting the illumination device, and adjusting the brightness of the illumination device. The second operation includes at least one of: starting image acquisition and stopping image acquisition. Moving the illumination device includes rotating and translating the illumination device.
Specifically, when the wing-unfolding mechanism 52 in the illumination device 105 is an electric motor, the controller circuit board 31 can open or close the wing component by controlling the wing-unfolding mechanism 52. When the tilting mechanism 55 in the illumination device 105 is an electric motor, the controller circuit board 31 can tilt the illumination device by controlling the tilting mechanism 55. Figure 3h is a reference diagram of the grasping component controlling the tilting motion of the illumination device, provided by an embodiment of the present application.
In Figure 3h, the part above the abdominal wall is the grasping component, the cylindrical block below the abdominal wall is the anchoring component, and the parts below the anchoring component are the illumination device and the camera module. The grasping component can drive the anchoring component to rotate, and the illumination device and the camera module can also tilt relative to the anchoring component. The rotation of the anchoring component and the tilting of the illumination device can greatly increase the illumination range and angles of the illumination device, as well as the image acquisition positions and angles of the camera module.
The controller circuit board 31 can control the rotation of the spur pinion 35 through the motor 36; the spur pinion 35 drives the gear 32 through their meshing, and the gear 32 drives the permanent magnet 33. When the permanent magnet 33 rotates, the magnetic force between the permanent magnet 33 and the anchoring component 104 rotates the anchoring component 104, which in turn rotates the illumination device.
When the illumination device 105 needs to be translated, the grasping component 103 can be translated; through the magnetic force between the grasping component 103 and the anchoring component 104, the anchoring component 104 translates and drives the illumination device with it. In practical applications, the grasping component 103 can be translated manually.
The controller circuit board 31 can control the current delivered to the light-emitting components 53 in the illumination device 105 to adjust the brightness of the illumination device 105.
The controller circuit board 31 can send a start-acquisition command to the camera module 106; after receiving the command, the camera module 106 starts acquiring images and sends them to the controller circuit board 31. The controller circuit board 31 can forward the images sent by the camera module 106 directly to the monitor 101, or can process them first. The controller circuit board 31 can also send a stop-acquisition command to the camera module 106, which then stops acquiring images.
In practical applications, the grasping component may be an External Anchoring and Control Unit (EACU), and the anchoring component, the illumination component, and the camera module may collectively be called the robotic camera. During minimally invasive surgery (MIS), the robotic camera in its folded state is inserted into the abdominal cavity through the trocar. The EACU magnetically fixes the robotic camera on the inside of the abdominal wall. A soft cable between the EACU and the robotic camera is used for control signal transmission, imaging data acquisition, power supply, and retrieval of the robotic camera from the abdominal cavity.
The surgical staff can send control signals through the user controller to a microcontroller (MCU) in the EACU; this MCU is the controller circuit board. The MCU controls the wing component of the robotic camera to open or close, controls the robotic camera to translate, tilt, and so on to adjust its attitude, and adjusts the brightness of the LEDs in the illumination device of the robotic camera. The controller can also switch the imaging system of the robotic camera on and off.
The EACU can be the central control unit of the whole laparoscope. The surgical video obtained from the robotic camera is processed in the EACU and sent to the monitor in real time. After receiving a control command from the user controller, the MCU can send start- and stop-acquisition commands to the camera of the robotic camera, or send control commands to the motors of the robotic camera to open or close the wing component, rotate, tilt, or adjust the brightness.
Inside the EACU there is a radially magnetized permanent magnet (EPM). The EPM is magnetically coupled with the magnet (IPM) inside the robotic camera, providing the anchoring force that fixes the robotic camera against the abdominal wall and the rotational torque for rotary motion control. The tilting motion of the robotic camera is driven by its onboard actuation mechanism.
In addition to the translation driven by the EPM, the tilting motion of the robotic camera can be controlled by the onboard actuator with the worm-and-gear set 561, i.e. the tilting mechanism 55. The combination of translation and tilting enables the robotic camera to visually cover the entire surgical area. The wing-unfolding mechanism 52 in the housing controls the opening angle of the wings through the worm-and-gear set 562. The robotic camera can provide a tilting range of 49° and a wing motion range of 80° (0° in the folded state).
In this embodiment, the tilting mechanism 55 and the wing-unfolding mechanism 52 may be stepper motors, for example stepper motors with a diameter of 4 mm, a length of 14.42 mm, and a 125:1 planetary gearhead (model ZWBMD004004-125). The stepper motor can provide a torque of 10 mNm in continuous operation. The worm-and-gear sets for the tilting mechanism and the wing-unfolding mechanism may have reduction ratios of 12:1 and 20:1, respectively.
In the embodiment shown in Figure 1, the light emitted by the light-emitting component 53 is bent after passing through the lens component 54 and finally falls on the target illumination area. In a specific implementation, the range of the target illumination area of the illumination device 105 in the abdominal cavity is not less than the range of the image acquisition area of the camera module 106 in the abdominal cavity. Figure 3I is a schematic diagram of the light of the light-emitting component being redirected, where FOV (Field of View) is the field of view of the camera module 106 and the circular area of radius R is the target illumination area.
At present, one of the main remaining problems of fully insertable laparoscopes is their poor imaging performance. The illumination device plays a vital role in determining the quality of the surgical images. If a light-emitting diode (LED) combined with a reflector is used as the light source of the illumination device, the unconstrained beam illuminates areas outside the FOV, wasting most of the energy, or produces a bright center and dark edges on the imaging plane of the imaging sensor.
To improve image quality, the illumination device should meet the following requirements: (1) the illuminance is evenly distributed over the target illumination area; (2) the optical efficiency is high, which means the light needs to be projected into the field of view of the camera module to the greatest possible extent.
To meet these requirements, in another embodiment of the present application, the lens component 54 can cause the light emitted by the light-emitting component 53 to be mapped onto the target illumination area according to a specified mapping relationship. The mapping can also be understood as projection or irradiation; that is, the lens component 54 can cause the light emitted by the light-emitting component 53 to be projected or irradiated onto the target illumination area according to the specified mapping relationship.
The specified mapping relationship is one under which the illumination uniformity of the target illumination area of the illumination device 105 in the abdominal cavity is not less than a preset uniformity threshold and the illumination intensity is not less than a preset intensity threshold. The specified mapping relationship is determined according to the refractive index of the lens component 54, the specified volume of the lens component 54, the size of the light-emitting component 53, the light intensity distribution of the light-emitting component 53, and the relative position between the light-emitting component and the target illumination area.
After passing through the lens, the light sent from the light-emitting component changes its optical path and is irradiated onto the target illumination area according to the specified mapping relationship, so that the target illumination area has a certain illumination uniformity and illumination intensity, providing reliable and stable illumination for minimally invasive surgery.
The specified mapping relationship above can be understood as the mapping relationship determined by the lens, and can be obtained according to the surface gradient ∇u_∈. Specifically, a surface shape function of the lens can be constructed from the surface gradient ∇u_∈ such that, when the light emitted by the light-emitting component passes through the lens, the specified mapping relationship holds between the light projected from the lens and the light emitted by the light-emitting component.
The surface gradient ∇u_∈ can be understood as the surface gradient of the lens, where u_∈ is the solution of the equation
−∈ Δ²u_∈ + det(D²u_∈) = E_s(ξ, η) / E_t(∇u_∈), (ξ, η) ∈ Ω_s, subject to the boundary condition BC,
where ∈ is a constant coefficient used to assist in computing the solution of the equation; E_s is the illuminance distribution function of the light-emitting component; ζ = {(ξ, η) | ξ² + η² ≤ 1} is the computational domain of the illuminance of the light-emitting component; Ω_s is the source domain of the light-emitting component; ξ and η are the abscissa and the ordinate of the projection plane ξ-η of the light-emitting component, respectively; I_0 is the light intensity distribution on the axis of the light-emitting component, that is, at a polar angle of 0 degrees; BC is the boundary condition; and E_t is the preset illuminance distribution function of the target illumination area, determined according to the preset uniformity threshold and the preset intensity threshold.
The surface gradient above can be determined by the steps of the flowchart shown in Figure 4:
Step S401: take a first initial value as the illuminance distribution function E_t of the target illumination area.
Step S402: substitute E_t into the equation and obtain the solution u_∈.
Step S403: determine the simulated illuminance distribution function Ẽ_t of the target illumination area according to u_∈.
In this step, the surface gradient can be determined from u_∈, the surface shape function of the lens from the determined surface gradient, and then, given the known illuminance distribution function of the light-emitting component, the illuminance distribution obtained after the light is acted on by the lens surface shape function is determined and taken as Ẽ_t of the target illumination area.
Step S404: determine whether the difference between Ẽ_t and E_t is less than a preset value; if yes, perform step S405, and if no, perform step S406.
The difference between Ẽ_t and E_t can be the difference of the two, or the variance between them. The preset value is a value set in advance.
Step S405: take the gradient of u_∈ to obtain ∇u_∈.
Step S406: calculate the corrected illuminance distribution function, take the corrected illuminance distribution function as the value of the illuminance distribution function E_t, and return to step S402.
In a specific implementation, step S402 can be performed in the following manner:
Step 1: take a second initial value and a third initial value as the values of u_∈ and ∈, respectively.
The second initial value is a guessed solution of the equation. ∈ can take values from a preset gradually decreasing sequence of constants, for example 1, 10⁻¹, 10⁻², and so on.
Step 2: substitute the values of u_∈ and ∈ into the equation.
Step 3: numerically discretize the equation after substitution, and determine the solution u_∈ of the discretized equation with a numerical solver.
Numerical discretization and numerical solvers are common methods of solving equations and are not detailed here.
Step 4: determine whether the value of ∈ is less than a preset minimum value; if yes, take the determined solution u_∈ as the solution result of the equation; if no, update the values of u_∈ and ∈ and return to step 2.
When updating u_∈, the solution u_∈ determined in step 3 can be taken as the updated u_∈. The value of ∈ in the constant sequence can be determined according to the direction of deviation between the solved u_∈ and the substituted u_∈.
The derivation of the above formulas is explained in detail below.
Let E_s(ξ, η) and E_t(x, y) denote the irradiance distribution of the light-emitting component, i.e. the LED source, and the specified target irradiance distribution, respectively. As shown in Figure 5a, the goal of the present application is to find a ray mapping function x̄ = φ(ζ) that transforms the irradiance E_s into E_t, where ζ = (ξ, η) and x̄ = (x, y) are the Cartesian coordinates constrained to the source domain Ω_s and the target domain Ω_t. This is considered a special case of the L² Monge-Kantorovich problem. Assuming no transmission energy loss, φ should satisfy
∫∫_{Ω_s} E_s(ξ, η) dξ dη = ∫∫_{Ω_t} E_t(x, y) dx dy.    (1)
According to the mapping x̄ = φ(ζ), equation (1) can be expressed as
E_t(φ(ζ)) |det(∇φ(ζ))| = E_s(ζ).    (2)
Brenier's theorem states that the L² Monge-Kantorovich problem has a unique solution φ, and that the solution can be characterized as the gradient ∇u of a convex function u. Replacing φ in equation (2) with ∇u, we see that u is the solution of the standard Monge-Ampère equation:
det(D²u) = E_s(ζ) / E_t(∇u).    (3)
It has been observed that weak solutions of lower-order nonlinear partial differential equations can be approximated by sequences of higher-order quasilinear partial differential equations. To approximate the solution of the standard Monge-Ampère equation, which is a second-order nonlinear PDE, the biharmonic operator with its fourth-order partial derivatives is a good choice.
An approximate solution of equation (3) can therefore be computed from
−∈ Δ²u_∈ + det(D²u_∈) = E_s(ζ) / E_t(∇u_∈),    (4)
where ∈ > 0 and, if the limit exists, lim_{∈→0⁺} u_∈ is the weak solution. The interior points of Ω_s should satisfy equation (4), and the points on the boundary ∂Ω_s of Ω_s should be mapped onto the boundary ∂Ω_t of Ω_t.
According to ∇u_∈(∂Ω_s) = ∂Ω_t, the Neumann boundary condition can be expressed as
∇u_∈ · n = f on ∂Ω_s,    (5)
where f is the mathematical expression of ∂Ω_t. Combining equations (4) and (5), the ray mapping used to design the freeform lens can be computed from the following quasilinear PDE and Neumann boundary condition:
−∈ Δ²u_∈ + det(D²u_∈) = E_s(ζ) / E_t(∇u_∈) in Ω_s,  ∇u_∈ · n = f on ∂Ω_s.    (6)
Computing the ray mapping ∇u_∈ from equation (6) requires an effective numerical method, which is introduced in detail in this part; steps 1 to 4 above give the computational procedure for solving equation (6). The main idea of the proposed numerical method is to approximate u_∈ iteratively by updating ∈ in each iteration. Specifically, ∈ is set to a gradually decreasing sequence of constant values, for example 1, 10⁻¹, 10⁻², etc. In each iteration, the initial u_∈ is first provided by the output u_∈ of the previous iteration, or given manually (in the first iteration). The number of iterations depends on the number of values of ∈ in the sequence. We can start the iteration with ∈ = 1 to obtain an initial approximation of u_∈, which is the solution of equation (3). When ∈ → 0⁺, equation (4) equals equation (3); but this does not mean that the best approximate solution u_∈ is found by setting ∈ to 0 during the iteration.
The error between u and the numerical solution is bounded by equation (7), where u_∈^h denotes the numerical solution of equation (6) with grid size h. The final value of ∈ in equation (6) is related to h and is chosen to achieve an optimized convergence speed and to minimize the error; this relationship depends on the norm used. According to the experimental data obtained in the present application, the smallest global error is obtained when ∈ = h.
For the numerical discretization of equation (6), the quasilinear partial differential equation and the boundary condition BC are re-expressed as equation (8). The first- and second-order partial derivatives in equation (8) are discretized with the central finite-difference method in the interior of Ω_s, and with forward/backward finite-difference methods with second-order truncation error on the boundary ∂Ω_s. The biharmonic term Δ²u_∈ in equation (8) can be discretized with the thirteen-point stencil
Δ²u_∈ ≈ (1/h⁴)[20u_{i,j} − 8(u_{i+1,j} + u_{i−1,j} + u_{i,j+1} + u_{i,j−1}) + 2(u_{i+1,j+1} + u_{i+1,j−1} + u_{i−1,j+1} + u_{i−1,j−1}) + u_{i+2,j} + u_{i−2,j} + u_{i,j+2} + u_{i,j−2}],    (9)
where (ξ_i, η_j) is abbreviated as (i, j). However, when critical points are discretized with the thirteen-point stencil of equation (9), undefined points are introduced. Figure 5b shows an example of a thirteen-point stencil whose center lies in the critical region; in this case, some of the stencil points lie outside the source region Ω_s. The undefined values are approximated by equation (10) from the neighboring grid values, the grid size h in the two directions ξ and η, and the first-order partial derivatives on Ω_s, which are determined by the boundary condition in equation (8).
The numerical discretization of equation (8) yields a set of nonlinear equations, which can be expressed in the form
F(U_∈) = 0,    (11)
where U_∈ denotes the vector of the variable u_∈. Newton's method is chosen as the numerical solver to compute the output u_∈. Then, in the current iteration, ∈ is compared with ∈_min = h: if ∈ > h, the initial values u_∈ and ∈ are updated with the computed U_∈ and a smaller ∈; if ∈ ≤ h, the gradient of the numerical solution U_∈ of the current iteration is taken as the final surface gradient.
上面提出的射线映射方法需要使用光源LED的辐照度分布E s(ξ,η)。然而,通常被认为是朗伯光源的大功率LED由半球形空间中的发光强度分布I=I 0cosθ(lmsr -1)定义,其中θ表示光线的极角,I 0表示θ=0°时的发光强度。本实施例应用立体投影法将光源的光强度转换为在平面上定义的辐照度分布。该方法的主要思想是将沿着发射方向SP=x u,y u,z u的光能量映射到ξ-η平面上的投影坐标ζ=(ξ,η)处,如图5c所示。在ξ-η平面上的辐照度E s的最终形式为
Figure PCTCN2018119585-appb-000063
其中,ξ 22≤1。对于网格点ξ 22≥1,我们定义E s(ξ,η)=0。
基于计算得到的射线映射,在∑ L{x L,y L,z L}空间中的每一对坐标(ξ ij)都能映射到目标平面上∑ G{x G,y G,z G}空间中的点T′ i,j=(x′ i,y′ j,z′(x i,y j)),其中i和j表示光源的离散化指数。根据∑ G和∑ L之间的旋转矩阵R和平移矢量T,T′ i,j能够由∑ L中的T i,j表示,如图5d(2)所示。由
Figure PCTCN2018119585-appb-000064
表示来自光源的单位入射光线矢量,其中
Figure PCTCN2018119585-appb-000065
Figure PCTCN2018119585-appb-000066
是(ξ ij)的函数。本实施例采用易于实施的表面构建方法设计光源的初始光学表面。该方法的主要思想是首先构建一个具有点p 1,1,…,p 1,n的序列的曲线,如图5d(1)-①所示。然后生成的曲线用于 计算沿图5d(1)-②中的方向的表面点。
如图5d(1)所示,定义O i,j作为来自光学表面的单位向外射线,并且将它用公式表示为:
Figure PCTCN2018119585-appb-000067
其中,p i,j表示要在表面上构建的点。在图5d(1)-①中,考虑到期望的透镜体积,可以根据需要的透镜体积手动选择初始点p 1,1。因此,O 1,1是用式(13)计算得到的。在p i,j的法向量可以由Snell定律计算得到:
Figure PCTCN2018119585-appb-000068
其中,n 0表示围绕透镜的介质的折射率,n 1表示透镜的折射率。曲线上下一个点p 1,2的坐标被计算为光线I 1,2与由p 1,1和N 1,2定义的平面之间的交点。在获得图5d(a)-①中第一条曲线上的点后,可以通过使用第一条曲线上的点作为初始点来计算方向②的曲线的点。
After a freeform surface with the required lens volume has been constructed by the above method, accumulated errors mean that the normal vector N_{i,j} computed at p_{i,j} is not guaranteed to be consistent with the vectors from p_{i,j} to its neighboring points p_{i+1,j} and p_{i,j+1}, as shown in Fig. 5d(2). To address this problem and improve the illumination performance, the present application introduces an iterative optimization technique that corrects the constructed initial surface to better fit the normal vectors. In theory, if the surface mesh is sufficiently fine, a surface point p_{i,j} and the normal vector N_{i,j} at that point should satisfy the constraints

(p_{i+1,j} − p_{i,j}) · N_{i,j} = 0        (15)

(p_{i,j+1} − p_{i,j}) · N_{i,j} = 0        (16)

Suppose the surface is represented by N points. Substituting p_{i,j} = ρ_{i,j} I_{i,j} into Eqs. (15) and (16) yields N constraints F_1, …, F_N:

F_k(ρ) = ‖(ρ_{i+1,j} I_{i+1,j} − ρ_{i,j} I_{i,j}) · N_{i,j}‖ + ‖(ρ_{i,j+1} I_{i,j+1} − ρ_{i,j} I_{i,j}) · N_{i,j}‖ = 0,        (17)

where k = 1, 2, …, N and ρ_{i,j} is the distance between S and the surface point p_{i,j}. A nonlinear least-squares method is used to minimize F_1(ρ)² + … + F_N(ρ)², with the ρ_{i,j} as variables. The updated normal vectors N_{i,j} are computed from Eq. (14) using the ρ optimized in the current iteration and the ray mapping. The iteration is repeated to compute a new ρ until the computed surface points satisfy the convergence condition ‖ρ_t − ρ_{t−1}‖ < δ, where t is the current iteration number and δ is the stopping tolerance. Finally, the optical surface can be represented from the freeform surface points with a Non-Uniform Rational B-Spline (NURBS).
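The correction of the ray distances ρ_{i,j} against constraints (15)–(16) can be illustrated with a small linear toy problem. This is a sketch only: the normals are held fixed at N = (0, 0, 1) and the resulting linear least-squares system is solved directly with numpy, rather than running the patent's full nonlinear iteration. Rays aimed at the plane z = 1 should recover ρ_{i,j} such that all points p_{i,j} = ρ_{i,j} I_{i,j} lie on that plane:

```python
import numpy as np

m = 5  # 5 x 5 grid of rays from the source S at the origin
xs = np.linspace(-0.5, 0.5, m)
X, Y = np.meshgrid(xs, xs, indexing="ij")
D = np.stack([X, Y, np.ones_like(X)], axis=-1)
I = D / np.linalg.norm(D, axis=-1, keepdims=True)  # unit incident rays
Iz = I[..., 2]                                      # component along N = (0, 0, 1)

idx = np.arange(m * m).reshape(m, m)
rows, rhs = [], []
# Constraints (15)/(16) with p = rho * I and fixed normal (0, 0, 1):
# rho[i+1,j]*Iz[i+1,j] - rho[i,j]*Iz[i,j] = 0, and likewise along j.
for i in range(m):
    for j in range(m):
        for i2, j2 in ((i + 1, j), (i, j + 1)):
            if i2 < m and j2 < m:
                r = np.zeros(m * m)
                r[idx[i2, j2]] = Iz[i2, j2]
                r[idx[i, j]] = -Iz[i, j]
                rows.append(r); rhs.append(0.0)
# Anchor: fix the central ray distance (plays the role of the chosen volume).
r = np.zeros(m * m); r[idx[m // 2, m // 2]] = 1.0
rows.append(r); rhs.append(1.0 / Iz[m // 2, m // 2])

rho = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)[0].reshape(m, m)
P = rho[..., None] * I  # corrected surface points
print(P[..., 2].min(), P[..., 2].max())  # all points land on the plane z = 1
```

The anchor equation corresponds to choosing the initial point p_{1,1} by the desired lens volume; without it the constraints only fix the surface up to scale.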
Because of the point-source assumption, the illuminance uniformity degrades when an LED of extended size is used, especially when a small-volume optical lens is designed. This problem can be mitigated with a feedback-modification method. Let E_t(x, y) denote the illuminance distribution required on the target illumination region, and let Ẽ(x, y) denote the simulated illuminance distribution obtained after applying the freeform lens. The corrected illuminance distribution for the next iteration can be defined as

[Eq. (18), reproduced as an image in the original]

In each iteration, the lighting performance is checked for satisfactory illuminance uniformity. If it is satisfactory, the freeform lens design is complete; otherwise, the next iteration is executed to modify the freeform lens surface.
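Eq. (18) is reproduced as an image in the original; a widely used multiplicative feedback of the form E^{k+1}(x, y) = E^k(x, y) · E_t(x, y)/Ẽ^k(x, y) is assumed in the sketch below, with a toy "simulator" that attenuates the requested distribution by an unknown spatial factor. The loop stops once the simulated illuminance matches the desired one within a tolerance:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
E_target = np.full(n, 100.0)        # desired uniform illuminance on the target
atten = 0.7 + 0.25 * rng.random(n)  # unknown spatial attenuation of the toy system

def simulate(E_request):
    # Toy stand-in for the ray-traced illuminance produced by the freeform lens.
    return atten * E_request

E_request = E_target.copy()
for _ in range(20):
    E_sim = simulate(E_request)
    if np.max(np.abs(E_sim - E_target)) < 1e-6:
        break
    E_request = E_request * E_target / E_sim  # assumed multiplicative correction

print(np.max(np.abs(simulate(E_request) - E_target)))
```

For this purely multiplicative toy system the loop converges immediately; in a real design each iteration would re-simulate the modified lens surface, so several iterations are typically needed.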
In the present application, the applicant evaluated the performance of the lens-design method for the laparoscope. Figures 6(a) and (b) show the on-axis and off-axis experiments, which were carried out with optical design software to study the effectiveness of the optical design method in different application scenarios. Polymethyl methacrylate (PMMA) with a refractive index of 1.49 was used as the lens material, and an NCSWE17A LED with a luminous flux of 118 lm was used as the source. To verify that the method provided by the embodiments of the present application is flexible and can design freeform lenses for target illumination regions of different patterns, the applicant set both circular and square target regions in the on-axis illumination test. The detailed specifications are given in Table 1.

Table 1. Specifications for evaluating the freeform optical design method

[Table 1, reproduced as an image in the original]
The ray-mapping relationship described below is the specified mapping relationship.

Computation of the ray mapping. First, the luminous-intensity distribution of the LED (Fig. 6(c)) is converted into a normalized illuminance distribution (Fig. 6(d)). The computational domain of the LED, ξ ∈ [−1, 1] and η ∈ [−1, 1], is discretized with an 81 × 81 grid. According to the ray-mapping algorithm, the grid size h = 0.025 sets the minimum value of ε to 0.025. The sequence ε = 1, 0.5, 0.025 was chosen to approximate the numerical solution of the ray mapping. To verify the effectiveness of the ray-mapping generation method of this embodiment, the intermediate ray mappings computed with ε = 1, 0.5 and 0.025 are shown. The ray mapping computed with ε = 0.025 is used to generate the initial surface of the freeform lens for the LED.
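The continuation over the ε sequence 1, 0.5, 0.025 pairs naturally with Newton's method: each solve warm-starts the next. The scalar sketch below illustrates the strategy with a toy residual F(u; ε) = u³ + εu − 1 standing in for the discretized Eq. (8); it is not the actual Monge–Ampère-type system:

```python
def newton(F, dF, u0, tol=1e-12, max_iter=50):
    """Basic Newton iteration for a scalar equation F(u) = 0."""
    u = u0
    for _ in range(max_iter):
        step = F(u) / dF(u)
        u -= step
        if abs(step) < tol:
            break
    return u

u = 1.0  # initial guess for the largest epsilon
for eps in (1.0, 0.5, 0.025):  # epsilon sequence from the embodiment
    F = lambda x, e=eps: x**3 + e * x - 1.0
    dF = lambda x, e=eps: 3.0 * x**2 + e
    u = newton(F, dF, u)  # warm start: previous solution seeds the next solve
    print(eps, u, F(u))
```

Each intermediate solution is a good initial guess for the next, smaller ε, which is what makes the sequence ε = 1 → 0.5 → 0.025 cheap to traverse.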
Figure 6 shows the simulation setups for evaluating the freeform optical design method. (a) On-axis test: the LED axis coincides with the axis of the target illumination region; circular and square target regions are used. (b) Off-axis test: offsets Δd = 5 mm, 10 mm and 15 mm are introduced between the LED axis and the axis of the target region; only the circular target region is used. (c) The LED intensity distribution obtained from the LED datasheet. (d) The LED illuminance distribution converted with the present method.

Figure 7 shows the on-axis ray mappings computed for the circular and square target regions with ε = 1, 0.5, 0.025 on the 81 × 81 grid. For clear visualization, a 61 × 61 grid is shown in the insets.

Figure 8 shows the convergence speed of the ray-mapping generation method, characterized by the residual ‖F‖₂ of Eq. (11) versus the number of iterations. The residual ‖F‖₂ of Eq. (11) is in millimeters. Considering that a freeform optical lens can be fabricated at the sub-micron level (10⁻⁴ mm), the convergence threshold can conservatively be set at the nanometer level (10⁻⁷ mm). In all experiments, ‖F‖₂ reached 10⁻⁷ within 10 iterations. Panels (a)–(c) and (d)–(f) of Fig. 8 show the convergence for the circular and square regions, respectively, with ε = 1, 0.5 and 0.025.

In Fig. 8, "Residual value of ‖F‖₂" is the residual ‖F‖₂, "Circular target region" is the circular target illumination region, "Square target region" is the square target illumination region, and "Iteration" is the iteration number.
On-axis test of the freeform lens design. Figure 6(a) shows the simulation setup of the on-axis test. A circular target region with radius R = 80 mm and a square target region with side length 2R = 160 mm were used, and the illumination distance from the LED to the center of the target region was set to D = 100 mm. Figures 9(a) and (b) show the designed lens profiles with marked dimensions, and Figs. 9(c) and (d) show the simulated illuminance distributions on the target regions. Taking Fresnel losses into account, the optical efficiencies of the freeform lenses are 88.3% and 90.5%, respectively. The illuminance uniformity can be computed from Eq. (19)

[Eq. (19), reproduced as an image in the original]

where σ and μ are the standard deviation and mean of the collected illuminance data. Table 2 lists the optical performance of the on-axis tests in detail.

Table 2. Optical performance of the on-axis tests

[Table 2, reproduced as an image in the original]
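Eq. (19) is reproduced as an image in the original; given the description of σ and μ, a standard form is Uniformity = (1 − σ/μ) × 100%. The helper below assumes that form and computes the metric from sampled illuminance data:

```python
import numpy as np

def uniformity(illuminance):
    """Illuminance uniformity in percent, assumed as (1 - sigma/mu) * 100."""
    e = np.asarray(illuminance, dtype=float)
    return (1.0 - e.std() / e.mean()) * 100.0

print(uniformity([5000.0] * 10))             # perfectly uniform -> 100.0
print(uniformity([5400, 5500, 5600, 5500]))  # slightly non-uniform, just under 100
```

Under this form, a perfectly flat illuminance map scores 100% and larger relative scatter σ/μ lowers the score, matching the way the uniformity percentages are reported in the tables.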
Figure 9 shows the on-axis freeform lens designs for the two illumination patterns. (a) and (b) show the lens profiles for the circular and square regions, respectively. (c) and (d) show the illuminance uniformity achieved on the target plane by (a) and (b), respectively.

In Fig. 9, the units of the x and y axes are millimeters; lm denotes lumens; Min, Max and Ave denote the minimum, maximum and mean of the illuminance; Total Flux denotes the total luminous flux; Flux/Emitted Flux denotes the ratio of received to emitted luminous flux; and Incident Rays denotes the incident rays. The same annotations apply to Figs. 10, 12 and 13.
Off-axis test of the freeform lens design. Figure 6(b) illustrates the simulation setup of the off-axis test. The illumination region was set as a circular region with radius R = 80 mm, and the distance from the LED to the target plane was set to D = 100 mm. Axial offsets Δd = 5 mm, 10 mm and 15 mm were introduced to evaluate the performance when the LED's axis and the axis of the target region do not coincide. To construct the freeform optical surface in this more general situation, a transformation matrix is needed to convert the ray mapping from the global coordinates to the local coordinates of the LED. Figure 10 shows the designed lens profiles and the simulated illuminance distributions for each case. Because of the axial offset, the lenses are no longer symmetric; the embodiments of the present application therefore provide front and side views of the lenses, as shown in Figs. 10(a), (d) and (g). Figures 10(b), (e) and (h) show the simulated illuminance distributions on the circular target region. Taking Fresnel losses into account, the optical efficiencies of the freeform lenses are 88.06%, 87.74% and 88.15%, respectively. Figures 10(c), (f) and (i) show the illuminance uniformity along the horizontal and vertical directions in the illuminated region. Table 3 summarizes the optical performance of the off-axis tests.

In Fig. 10, Illuminance denotes the illuminance, Horizontal and vertical point locations denotes the horizontal and vertical point positions, Horizontal denotes the horizontal direction, and Vertical denotes the vertical direction. The same annotations apply to Figs. 12 and 13.

Table 3. Optical performance of the off-axis tests

[Table 3, reproduced as an image in the original]
Final design of the freeform LED lens. Referring to the configuration of the illumination device in Fig. 3I, the lens mounting position on the wing was set to L = 20.5 mm, and for the expanded mode the opening angle of the wings was set to β = 80°. In the design, the maximum radial length of the lens volume was set to ρ_max = 5.4 mm to ensure that the three lenses fit into the robotic camera. The initial illumination distance was set to D = 100 mm, and the radius R of the circular target region was set to 80 mm. Table 4 summarizes the specifications of the freeform lens design for the laparoscopic illumination device.

Table 4. Specifications of the illumination device settings

[Table 4, reproduced as an image in the original]

Figure 11 shows the three-dimensional (3D) design of the laparoscopic illumination device. Figure 11(a) shows three views of the freeform surface. Figure 11(b) shows the compactness of the lens, which satisfies the lens-volume limit. Figure 11(c) shows a lens and an LED integrated in one wing. Figure 11(d) shows the 3D structure of the assembled laparoscopic illumination device.

In Fig. 11, Front view, Side view and Top view denote the front, side and top views, and Restriction of lens volume denotes the volume limit of the lens.
Illumination performance on the target region. The performance of the developed illumination device was evaluated with the simulation settings in Table 4. Because of the symmetric arrangement of the three wings, a single LED was first energized, emitting light through its freeform lens. Figure 12(a) shows the illuminance distribution on the target region. Taking Fresnel losses into account, the optical efficiency of the designed freeform lens is 89.45%, meaning that 105.55 lm of the total 118 lm luminous flux is successfully projected onto the desired target region. The average illuminance provided by the single LED is 5473.8 lx. According to Eq. (19), the horizontal and vertical illuminance uniformities are 95.87% and 94.78%, respectively, as shown in Fig. 12(b).

Figure 12(c) shows the illuminance distribution on the target region when all the LEDs are powered. In this case, the total luminous flux provided by the illumination device is 354 lm, of which 316.58 lm falls on the target region, giving an optical efficiency of 89.43%. The average illuminance on the target region is 12,441 lx. Figure 12(d) shows that the horizontal and vertical illuminance uniformities are 96.33% and 96.79%, respectively. Figure 12(e) shows the illuminance distribution of the target region as a 3D profile. Table 5 summarizes the evaluation results of the illumination performance. It can clearly be seen that the laparoscopic illumination device developed in the embodiments of the present application meets all the design requirements in Table 6. In Fig. 12(e), Illuminance denotes the illuminance.
Table 5. Evaluation results of the illumination performance

[Table 5, reproduced as an image in the original]

Table 6. Design requirements of the laparoscopic illumination device

[Table 6, reproduced as an image in the original]
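The optical-efficiency figures reported above follow directly from the flux bookkeeping: efficiency = (flux on target) / (emitted flux). A quick check of the numbers quoted in this section:

```python
def optical_efficiency(flux_on_target, emitted_flux):
    """Fraction of the emitted luminous flux landing in the target region, in %."""
    return 100.0 * flux_on_target / emitted_flux

# Single LED: 105.55 lm of the 118 lm flux reaches the target region.
print(round(optical_efficiency(105.55, 118.0), 2))   # 89.45
# All three LEDs: 316.58 lm of 354 lm.
print(round(optical_efficiency(316.58, 354.0), 2))   # 89.43
```

Both values reproduce the 89.45% and 89.43% efficiencies reported for the single-LED and three-LED cases.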
Focused beam. In MIS, after the in-vivo laparoscopic system has been inserted into the abdominal cavity, the distance D between the camera and the target surgical region may be smaller than 100 mm. Although the wings of the illumination device can still provide good illumination of the region at β = 80°, the illuminance uniformity decreases and more energy is wasted outside the FOV.

The in-vivo laparoscopic illumination device proposed in the embodiments of the present application has a refocusing function: by adjusting the wing angle, the target region can be illuminated uniformly as the distance from the camera module to the target changes, thereby controlling the beam. In Fig. 13(a), the desired target region can be set at D = 60 mm. When the wing angle β is set to 80°, the illuminated region is the one in Fig. 13(a) with D = 100 mm and R = 80 mm; this value of β is optimal for D = 100 mm. To refocus the light onto the target region at D = 60 mm, the wing angle is reduced from β to β − Δβ, where the value of Δβ can be determined from the included angle θ. From the geometry of this setup, θ is computed to be 6°. Similarly, to illuminate a target region at D = 80 mm, the wing angle should be reduced from the initial β = 80° by θ = 3°.

Figures 13(b)–(e) show the illuminance distributions of the beam refocused onto the target plane at D = 60 mm and D = 80 mm. In the cases of Figs. 13(b) and (c), β was set to 74°. The average illuminance over the circular region with radius R = 48 mm is computed to be 45,823 lx; taking Fresnel losses into account, the optical efficiency is about 92%, and the horizontal and vertical illuminance uniformities are 98.29% and 98.22%, respectively. In the cases of Figs. 13(d) and (e), β was set to 77° to illuminate the target region at D = 80 mm. The average illuminance over the circular region with radius R = 64 mm is computed to be 24,172 lx; taking Fresnel losses into account, the optical efficiency is 90.9%, and the horizontal and vertical illuminance uniformities are 95.37% and 95.98%, respectively. Table 7 summarizes the illumination performance of the refocused beams.

Table 7. Illumination performance of the light-refocusing tests

[Table 7, reproduced as an image in the original]
It should be noted that, herein, relational terms such as first and second are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include" and any variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device comprising a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of additional identical elements in the process, method, article or device comprising that element.

The embodiments in this specification are described in a related manner; identical or similar parts of the embodiments may be referred to among one another, and each embodiment focuses on its differences from the others. In particular, the system embodiments are described relatively simply because they are substantially similar to the method embodiments; for relevant details, refer to the description of the method embodiments.

The above are only preferred embodiments of the present application and are not intended to limit its scope of protection. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present application shall fall within the scope of protection of the present application.

Claims (11)

  1. A laparoscope, comprising: a grasping component, an anchoring component, an illumination device and a camera module; wherein the camera module is connected to the anchoring component through the illumination device; the grasping component is placed outside the abdominal cavity during surgery and is used to fix the anchoring component on the abdominal wall inside the abdominal cavity by the attractive force between the grasping component and the anchoring component; and the anchoring component, the illumination device and the camera module enter the abdominal cavity during surgery through a trocar used for inserting surgical instruments.

  2. The laparoscope of claim 1, further comprising: a monitor for displaying images captured by the camera module.

  3. The laparoscope of claim 1, wherein the illumination device comprises: a wing component comprising at least three wings arranged uniformly in space, a wing-deployment mechanism, and a light-emitting component and a lens component on each wing, the lens component covering the outside of the light-emitting component; the wing-deployment mechanism is connected to the wing component and can cause the wing component to deploy; the wing component enters the abdominal cavity in a folded state, and when the illumination device is in the working state after entering the abdominal cavity, the wing component is in the deployed state.

  4. The laparoscope of claim 3, wherein the illumination device further comprises: a tilting mechanism; the tilting mechanism can cause the illumination device to tilt.

  5. The laparoscope of claim 4, wherein the camera module is fixed at the middle position of the wing component; when the wing component is in the deployed state, the camera module can capture images inside the abdominal cavity, and when the wing component is in the folded state, the camera module is inside the wing component.

  6. The laparoscope of claim 5, wherein the laparoscope further comprises: a user controller; the grasping component comprises: a controller circuit board; the controller circuit board is configured to receive a first control command sent by the user controller and control the illumination device to perform a first operation according to the first control command, and to receive a second control command sent by the user controller and control the camera module to perform a second operation according to the second control command; wherein the first operation comprises opening the wing component, closing the wing component, moving the illumination device, tilting the illumination device, and adjusting the light brightness of the illumination device; and the second operation comprises: starting image capture and stopping image capture.

  7. The laparoscope of claim 5, wherein the extent of the target illumination region of the illumination device in the abdominal cavity is not smaller than the extent of the image-capture region of the camera module in the abdominal cavity.

  8. The laparoscope of claim 3, wherein the lens component causes the light emitted by the light-emitting component to be mapped onto the target illumination region according to a specified mapping relationship; the specified mapping relationship is a mapping relationship such that the illumination uniformity of the illumination device on the target illumination region in the abdominal cavity is not less than a preset uniformity threshold and the illumination intensity is not less than a preset intensity threshold; and the specified mapping relationship is determined from the refractive index of the lens component, the specified volume of the lens component, the size of the light-emitting component, the intensity distribution of the light-emitting component, and the relative position between the light-emitting component and the target illumination region.

  9. The laparoscope of claim 8, wherein the specified mapping relationship is obtained from the surface gradient ∇u, and u is the solution of the following equation:

    [equation, reproduced as an image in the original]

    where ε is a constant coefficient, ζ = {(ξ, η) | ξ² + η² ≤ 1}, Ω_s is the source domain of the light-emitting component, ξ and η are the abscissa and ordinate of the projection plane of the light-emitting component, I₀ is the intensity distribution at the central axis of the light-emitting component, BC is the boundary condition, E_t is the preset illuminance distribution function of the target illumination region, and E_t is determined from the preset uniformity threshold and the preset intensity threshold.

  10. The laparoscope of claim 9, wherein the surface gradient ∇u is determined in the following way: taking a first initial value as the illuminance distribution function E_t of the target illumination region; substituting E_t into the equation of claim 9 to obtain the solution u; determining the simulated illuminance distribution function of the target illumination region from u; judging whether the difference between the simulated illuminance distribution function and E_t is smaller than a preset value; if so, computing the gradient of u to obtain ∇u; if not, computing a corrected illuminance distribution function, taking the corrected illuminance distribution function as the value of the illuminance distribution function E_t, and returning to the step of substituting E_t into the equation.

  11. The laparoscope of claim 10, wherein the solution u of the equation is obtained in the following way: taking a second initial value and a third initial value as the values of u and ε, respectively; substituting the values of u and ε into the equation; numerically discretizing the equation after substitution and determining the solution u of the discretized equation with a numerical solver; judging whether the value of ε is smaller than a preset minimum value; if so, taking the determined solution u as the solution of the equation; if not, updating the values of u and ε and returning to the step of substituting the values of u and ε into the equation.
PCT/CN2018/119585 2017-12-11 2018-12-06 一种腹腔镜 WO2019114605A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711310849.9A CN109893078B (zh) 2017-12-11 2017-12-11 一种腹腔镜
CN201711310849.9 2017-12-11

Publications (1)

Publication Number Publication Date
WO2019114605A1 true WO2019114605A1 (zh) 2019-06-20

Family

ID=66819935





Also Published As

Publication number Publication date
CN109893078A (zh) 2019-06-18
CN109893078B (zh) 2021-12-03


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18888277; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18888277; Country of ref document: EP; Kind code of ref document: A1)