CN110687669A - Tracking type naked eye 3D display system and method based on a stereo microscope - Google Patents

Tracking type naked eye 3D display system and method based on a stereo microscope

Info

Publication number
CN110687669A
Authority
CN
China
Prior art keywords
target
position information
module
naked eye
tracking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911106139.3A
Other languages
Chinese (zh)
Inventor
乔梦阳
叶磊
周峰
李焘然
龚健
韦晓孝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Xinzhiwei Technology Co Ltd
Original Assignee
Shenzhen Xinzhiwei Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Xinzhiwei Technology Co Ltd filed Critical Shenzhen Xinzhiwei Technology Co Ltd
Priority to CN201911106139.3A priority Critical patent/CN110687669A/en
Publication of CN110687669A publication Critical patent/CN110687669A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/0004Microscopes specially adapted for specific applications
    • G02B21/0012Surgical microscopes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/20Surgical microscopes characterised by non-optical aspects
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/06Means for illuminating specimens
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365Control or image processing arrangements for digital or video microscopes
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B11/00Filters or other obturators specially adapted for photographic purposes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Optics & Photonics (AREA)
  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Signal Processing (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Theoretical Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Computer Graphics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Microscopes, Condensers (AREA)

Abstract

The invention discloses a tracking type naked eye 3D display system and method based on a stereo microscope. The system is configured in the microscope and comprises at least one surgical field acquisition module, a tracking module and an image processing module. The surgical field acquisition modules are respectively arranged in a lens barrel of the microscope and are used for respectively acquiring target views at target angles and processing the target views to obtain target format images. The tracking module is arranged on the naked eye display device and is used for tracking and determining the position information of a target light source and determining target position information according to the position information. The image processing module is arranged in the naked eye display device and is used for processing the target format image, the target position information and the interleaving parameters of the display device to obtain a three-dimensional view corresponding to the target format image. According to the technical scheme of the embodiments of the invention, the image information acquired by the surgical field acquisition module is processed to obtain a three-dimensional view of the surgical site, so that the staff can work on the surgical site according to the three-dimensional view, which improves the accuracy and convenience of the operation.

Description

Tracking type naked eye 3D display system and method based on a stereo microscope
Technical Field
The embodiments of the invention relate to the technical field of medical treatment, and in particular to a tracking type naked eye 3D display system and method based on a stereo microscope.
Background
In the prior art, the staff rely only on an operating microscope during an operation: the surgeon and the assistant each observe the patient's lesion through an eyepiece of the operating microscope while operating. With this approach, both the chief surgeon and the assistant must remain bent over the eyepieces to observe the lesion, which is highly restrictive. In another approach, a photoelectric conversion module is arranged in the eyepiece and the picture captured by the microscope is shown on a display, so that both the surgeon and the assistant can operate while watching the display. However, owing to technical limitations, a conventional photoelectric conversion module can only output a single image path, i.e., a 2D image, and cannot determine the depth information corresponding to the image; the resulting uncertainty about the depth of the lesion during the operation may lead to low operating efficiency.
Disclosure of Invention
The invention provides a tracking type naked eye 3D display system and method based on a stereo microscope, aiming at improving the accuracy and convenience of an operation.
In a first aspect, an embodiment of the present invention provides a tracking type naked eye 3D display system based on a stereo microscope, where the system is applied to a microscope and includes:
at least one surgical field acquisition module, respectively arranged in a lens barrel of the microscope and used for respectively acquiring target views at target angles and processing the target views to obtain target format images;
a tracking module, arranged on the naked eye display device and used for tracking and determining the position information of a target light source and determining target position information according to the position information; and
an image processing module, arranged in the naked eye display device and used for processing the target format image, the target position information and the interleaving parameters of the display device to obtain a three-dimensional view corresponding to the target format image.
In a second aspect, an embodiment of the present invention further provides a tracking type naked eye 3D display method based on a stereo microscope, where the method includes:
at least one surgical field acquisition module respectively acquires a target view at a target angle and processes the target view to obtain a target format image;
the tracking module tracks and determines the position information of the target light source, and determines the target position information according to the position information;
and the image processing module processes the target format image, the target position information and the interleaving parameters of the display device to obtain a three-dimensional view corresponding to the target format image.
In a third aspect, an embodiment of the present invention further provides an apparatus, where the apparatus includes:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the tracking type naked eye 3D display method based on a stereo microscope according to any embodiment of the present invention.
In a fourth aspect, the present invention further provides a storage medium containing computer-executable instructions which, when executed by a computer processor, are configured to perform the tracking type naked eye 3D display method based on a stereo microscope according to any one of the embodiments of the present invention.
According to the technical scheme of the embodiments of the invention, at least one surgical field acquisition module is arranged in a lens barrel of the microscope and used for acquiring target views at target angles and processing them to obtain target format images; the tracking module is arranged on the naked eye display device and used for tracking and determining the position information of the target light source and determining the target position information from it; and the image processing module is arranged in the naked eye display device and used for processing the target format image, the target position information and the interleaving parameters of the display device to obtain a three-dimensional view corresponding to the target format image. This solves the problems of the prior art that operating on a patient only through the eyepiece of a microscope is highly restrictive, and that a photoelectric conversion module showing the surgical site on a display provides no depth information for that site, which affects the accuracy of the operation. By processing the image information collected by the surgical field acquisition module to obtain a three-dimensional view of the surgical site, the staff can work on the surgical site according to that three-dimensional view, improving the accuracy and convenience of the operation.
Drawings
In order to more clearly illustrate the technical solutions of the exemplary embodiments of the present invention, the drawings used in describing the embodiments are briefly introduced below. It should be clear that the drawings described here show only some of the embodiments of the invention, not all of them, and that a person skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of a tracking type naked eye 3D display system based on a stereo microscope according to a first embodiment of the present invention;
fig. 2 is a schematic structural diagram of another tracking type naked eye 3D display system based on a stereo microscope according to the first embodiment of the present invention;
FIG. 3 is a diagram illustrating image information displayed on a display screen by the image display system according to an embodiment of the present invention;
fig. 4 is a schematic flow chart of a tracking type naked eye 3D display method based on a stereo microscope according to a second embodiment of the present invention;
fig. 5 is a schematic structural diagram of an apparatus according to a third embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a schematic structural diagram of a tracking type naked eye 3D display system based on a stereo microscope according to a first embodiment of the present invention. The system is applied to microsurgery and can process the video images captured by the microscope to obtain a three-dimensional view corresponding to the images collected by the microscope, thereby improving the efficiency and accuracy of the operation. As shown in fig. 1, the system includes: at least one surgical field acquisition module 110, a tracking module 120, and an image processing module 130.
The surgical field acquisition modules 110 are respectively installed in a lens barrel of the microscope and are used for respectively acquiring target views at target angles and processing the target views to obtain target format images; the tracking module 120 is arranged on the naked eye display device and used for tracking and determining the position information of the target light source and determining the target position information according to the position information; and the image processing module 130 is arranged in the naked eye display device and is used for processing the target format image, the target position information and the interleaving parameters of the display device to obtain a three-dimensional view corresponding to the target format image.
According to the technical scheme of the embodiments of the invention, at least one surgical field acquisition module is arranged in a lens barrel of the microscope and used for acquiring target views at target angles and processing them to obtain target format images; the tracking module is arranged on the naked eye display device and used for tracking and determining the position information of the target light source and determining the target position information from it; and the image processing module is arranged in the naked eye display device and used for processing the target format image, the target position information and the interleaving parameters of the display device to obtain a three-dimensional view corresponding to the target format image. This solves the problems of the prior art that operating on a patient only through the eyepiece of a microscope is highly restrictive, and that a photoelectric conversion module showing the surgical site on a display provides no depth information for that site, which affects the accuracy of the operation. By processing the image information collected by the surgical field acquisition module to obtain a three-dimensional view of the surgical site, the staff can work on the surgical site according to that three-dimensional view, improving the accuracy and convenience of the operation.
The number of surgical field acquisition modules can be one, two or more; optionally, there are two. The surgical field acquisition modules are respectively arranged in a lens barrel of the microscope and used for acquiring the image information corresponding to the user's left and right eyes. The target angle is the angle at which the user views the surgical site through the eyepiece of the microscope. The surgical field acquisition modules can acquire two surgical-field image paths, which correspond to the image information seen by the user's left and right eyes through the eyepieces. The target format image may be in a three-dimensional display format: if the target view is to be displayed in three dimensions, the surgical field acquisition module may process the left and right views corresponding to the target angle after acquiring them, so as to obtain an image in a three-dimensional display format. Optionally, the target format may be a 2D + Z format, or view information in another format.
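As an illustration of how the left and right views could be combined into a 2D + Z target format image, the sketch below estimates a disparity-based Z channel with OpenCV's semi-global block matching. The application does not specify the conversion algorithm, so both the choice of method and the Python/OpenCV interface are assumptions made purely for illustration.

```python
# Minimal sketch (assumption): deriving a 2D+Z frame from the left/right views
# captured in the microscope barrels. The application does not name a
# stereo-matching algorithm; OpenCV's SGBM is used here only for illustration.
import cv2
import numpy as np

def to_2d_plus_z(left_bgr, right_bgr, num_disparities=64, block_size=9):
    """Return (color, z) where z is a normalized 8-bit depth-like channel."""
    left_gray = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)
    right_gray = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)

    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=num_disparities,  # must be a multiple of 16
        blockSize=block_size,
    )
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

    # Larger disparity means closer to the microscope; normalize into 8 bits.
    z = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    return left_bgr, z
```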
It should be noted that the surgical field acquisition modules can acquire image information of the surgical site in real time and process the image information acquired at the same moment to obtain an image in the target format.
It can be understood that the two surgical field acquisition modules are respectively arranged in a lens barrel of the microscope and used for respectively acquiring the view information corresponding to the user's left and right eyes to obtain a left view and a right view. The surgical field acquisition module can then process the left view and the right view to obtain image information in a three-dimensional format.
The tracking module is mainly used for locating the target light source. It is arranged on the display device, optionally at the upper edge of the display screen of the naked eye 3D display device, and may be a camera. After locating the target light source, the tracking module can determine the spatial position information corresponding to the target light source and then, from the relative position of the target light source and the pupils, determine the spatial position coordinates of the user's pupils, namely the target position information. The purpose of determining the user's pupil position information is to adjust, in real time, the image information presented to the target user during three-dimensional display, so as to avoid crosstalk in the image on the naked eye 3D display device while the patient is being operated on.
Optionally, the tracking module is a camera, and includes a single camera, two cameras, or multiple cameras; the tracking module is used for capturing light ray information of a preset wave band and determining the position information of the target light source according to the light ray information.
The tracking module can use one or more cameras; to improve the accuracy of locating the target light source, two or more cameras can be used. The light within the preset waveband is emitted by a light emitting module worn by the user, i.e., the light source. The advantage of determining the position information of the target light source by capturing light information in a preset waveband is that the light source position can be collected efficiently, which in turn makes it efficient to derive the pupil position from the relative position of the light source and the user's pupils, i.e., it improves the efficiency of determining the target position information.
The light in the preset waveband is emitted by a light emitting module. Optionally, the light emitting module is disposed on a wearable device, worn at a preset position on the target user, and configured to emit light information in the preset waveband so as to provide the position information of the target light source.
The light emitting module is integrated on the wearable device, and is optionally arranged on the head-mounted device. The preset position may be understood as being worn on the forehead of the target user with the target light source at a central position of the forehead.
Specifically, when the light emitting module worn on the user's forehead is working, i.e., emitting light in the preset waveband, the tracking module can determine the spatial position of the target light source, and the user's pupil positions can then be determined from that spatial position.
It should be noted that, if the tracking module is used to locate the position of the target light source, a filter may be disposed on the tracking module, that is, on the camera, so that the camera only responds to the target light source emitting light in the specific waveband. Optionally, an optical filter is disposed at the front end of the camera and used for filtering light in the preset waveband so as to determine the position information of the target light source.
The light information within the preset waveband may be light in the infrared band; optionally, the preset waveband is between 840 nm and 1100 nm. By locating a target light source that emits in a specific waveband, the position of the target light source can be captured more efficiently, so that the naked eye 3D display device can determine the position information of the user's pupils from the position information of the target light source more efficiently.
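Purely as an illustration of how a camera fitted with such a band-pass filter could locate the light source, the sketch below (Python/OpenCV, an assumption rather than the disclosed implementation) relies on the fact that, with the filter in place, the light emitting module is essentially the only bright region in the frame, so a fixed threshold and a centroid suffice.

```python
# Minimal sketch (assumption): with the 840-1100 nm band-pass filter mounted,
# the worn light emitting module is effectively the only bright blob in the
# frame, so a fixed threshold plus an image-moment centroid locates it.
import cv2

def find_marker_pixel(ir_frame_gray, threshold=200):
    """Return the (x, y) pixel centroid of the IR marker, or None."""
    _, mask = cv2.threshold(ir_frame_gray, threshold, 255, cv2.THRESH_BINARY)
    moments = cv2.moments(mask, binaryImage=True)
    if moments["m00"] == 0:  # no sufficiently bright region found
        return None
    return moments["m10"] / moments["m00"], moments["m01"] / moments["m00"]
```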
It should be noted that, since the image information corresponding to the surgical site needs to be acquired accurately and in real time during the operation, the image information displayed on the naked eye 3D display interface needs to be determined in real time from the user's pupil position information. Acquiring the pupil position information directly is inefficient; by locating the position information of the target light source instead, the image displayed on the display screen can track the target user's pupil position in real time, and the problem of image crosstalk is avoided.
It should be noted that a series of calculation formulas are preset in the tracking module to determine the pupil position information of the user according to the spatial position information of the target light source.
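Those formulas are not disclosed in the application. The sketch below assumes the simplest possible mapping, a fixed marker-to-pupil offset measured once for the wearer and applied in the tracking camera's coordinate frame, which only holds while the viewer roughly faces the display; the offset values are illustrative.

```python
# Minimal sketch (assumption): map the marker's 3D position to the two pupil
# positions with a fixed, per-wearer offset. Offsets are illustrative values in
# millimetres, expressed in the tracking camera's coordinate frame and valid
# only while the viewer roughly faces the display.
import numpy as np

LEFT_PUPIL_OFFSET = np.array([-32.0, 45.0, 15.0])
RIGHT_PUPIL_OFFSET = np.array([32.0, 45.0, 15.0])

def pupils_from_marker(marker_xyz):
    """marker_xyz: marker position (mm) in camera coordinates."""
    marker_xyz = np.asarray(marker_xyz, dtype=float)
    return marker_xyz + LEFT_PUPIL_OFFSET, marker_xyz + RIGHT_PUPIL_OFFSET
```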
When the tracking module adopts a color camera, the position of the mark can be determined by locating a specific pattern or color, without locating the target light source, and the user's pupil position information can then be determined from the relative position between the mark and the pupils.
Optionally, the tracking module includes two color cameras; and the two color cameras are used for capturing image information marked on the target user in advance and determining pupil coordinate information of the target user as target position information according to the image information.
The color camera can locate a preset color or marker and determine its position information.
Specifically, if the tracking module uses a color camera, a mark with a specific color or pattern may be placed in advance on a specific part of the user. For example, a red square is pasted between the eyebrows; the color camera can determine the spatial position information of the red square, and the pupil position information is then determined from the spatial position of the red square and the relative position of the pupils.
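A minimal sketch of how a color camera could locate such a red mark is given below, assuming HSV thresholding in OpenCV; the threshold values are illustrative and not taken from the application.

```python
# Minimal sketch (assumption): locating a red square marker with a color camera
# via HSV thresholding. The HSV limits are illustrative and would need tuning
# for the actual marker and lighting.
import cv2

def find_red_marker(frame_bgr):
    """Return the (x, y) pixel centroid of the red marker, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around hue 0 in OpenCV's 0-179 hue range, so combine two bands.
    mask = cv2.bitwise_or(
        cv2.inRange(hsv, (0, 120, 80), (10, 255, 255)),
        cv2.inRange(hsv, (170, 120, 80), (179, 255, 255)),
    )
    moments = cv2.moments(mask, binaryImage=True)
    if moments["m00"] == 0:
        return None
    return moments["m10"] / moments["m00"], moments["m01"] / moments["m00"]
```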
The image processing module is arranged in the naked eye display device. After receiving the image information in the target format, it displays the target format image three-dimensionally on the display interface according to the target format image information, the user's pupil position information and the interleaving parameter information of the naked eye 3D display device, i.e., information such as the spacing of the cylindrical lenses (prisms) on the display screen.
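The interleaving computation itself is not disclosed. Purely as an illustration, the sketch below performs a greatly simplified two-view, per-column interleave for a lenticular panel, in which the viewer's horizontal pupil position shifts the phase of the interleaving pattern; a real implementation would work at sub-pixel level with the display's calibrated parameters, and the parameter names here are assumptions.

```python
# Minimal sketch (assumption): two-view column interleave for a lenticular
# panel, with a viewer-dependent phase shift. lens_pitch_px is the lens pitch
# in pixels; mm_per_phase says how many millimetres of horizontal head motion
# correspond to one full cycle of the interleaving pattern (both illustrative).
import numpy as np

def interleave(left, right, lens_pitch_px, viewer_x_mm, mm_per_phase):
    """left/right: HxWx3 uint8 views; returns the interleaved HxWx3 frame."""
    h, w, _ = left.shape
    phase = viewer_x_mm / mm_per_phase
    cols = np.arange(w)
    # Decide, per pixel column, which view falls under the lens toward each eye.
    use_right = ((cols / lens_pitch_px + phase) % 1.0) >= 0.5
    return np.where(use_right[None, :, None], right, left)
```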
In this embodiment, the tracking module locates a target light source emitting light in a specific waveband, so the user's pupil position can be determined efficiently, and the image processing module can quickly determine the target format image information displayed on the naked eye 3D display device according to the user's pupil position information, thereby avoiding the image crosstalk that would otherwise be caused by user movement.
As a preferred implementation of the foregoing embodiment, fig. 2 is a schematic structural diagram of a tracking type naked eye 3D display system based on a stereo microscope according to an embodiment of the present invention. As shown in fig. 2, the system comprises a surgical field acquisition module, a naked eye 3D display module, a tracking module and an auxiliary module.
It should be noted that each module in the system structure diagram may be named according to its function, or may be named by other names as long as the corresponding function can be implemented.
The surgical field acquisition module 210 is installed in the lens barrels corresponding to the eyepieces of the surgical microscope; it can acquire two surgical-field pictures, process them into a 3D image, and transmit the image to the naked eye 3D display module. That is to say, after acquiring and processing the left and right views, the surgical field acquisition module 210 can directly send the processed target format image to the naked eye 3D display module. The tracking module 220 performs tracking and positioning: by tracking and locating the auxiliary module, it calculates the positions of the viewer's eyes from the positional relationship between the auxiliary module and the viewer's eyes. The tracking module is generally a camera, and the tracking camera may be a single camera, a dual camera or multiple cameras. The auxiliary module 230 is generally worn on the user's body and serves as a marker for assisted tracking; the tracking camera calculates the spatial position of the auxiliary module visually, and the user's pupil position information is obtained from it.
The auxiliary module can cooperate with the tracking module in at least two ways. In the first way, the tracking module is a camera that collects light of a specific waveband; optionally, a filter is arranged on the camera for collecting light information of that waveband, and tracking is performed by locating a light mark point of the specific waveband. In this case the auxiliary module is an optical mark-point device of the specific waveband, optionally a light emitting module emitting light in that waveband; the tracking module tracks and locates the target light source of the auxiliary module and uses a spatial calculation to determine the spatial coordinate information corresponding to the user's pupils. In the second way, the tracking module is a color camera and the auxiliary module is a mark with a specific pattern and color; the tracking module determines the user's pupil position coordinates by locating the image information of the specific mark or color and sends the coordinates to the 3D display module. The 3D display screen is a dedicated naked eye 3D screen that performs light-splitting display to form the naked eye 3D effect; the 3D display module can adopt technologies such as lenticular (cylindrical) lenses, slit barriers and LC lenses. The 3D display module determines the three-dimensional image information shown on the display device from the received target format image, the target position information of the pupils and the interleaving parameter information of the naked eye display device.
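With the dual-camera variant of the tracking module, calculating the spatial position of the auxiliary module "through vision" can be realized by standard stereo triangulation once the two cameras have been calibrated. The sketch below assumes calibrated 3x4 projection matrices P1 and P2 from a stereo calibration step that the application does not describe.

```python
# Minimal sketch (assumption): triangulating the marker seen by two calibrated
# tracking cameras. P1 and P2 are 3x4 projection matrices from a stereo
# calibration step not described in the application.
import cv2
import numpy as np

def triangulate_marker(P1, P2, pt_cam1, pt_cam2):
    """pt_cam1 / pt_cam2: (x, y) marker pixels in each camera; returns XYZ."""
    p1 = np.asarray(pt_cam1, dtype=np.float64).reshape(2, 1)
    p2 = np.asarray(pt_cam2, dtype=np.float64).reshape(2, 1)
    point_h = cv2.triangulatePoints(P1, P2, p1, p2)  # 4x1 homogeneous point
    return (point_h[:3] / point_h[3]).ravel()
```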
It should be noted that, with the technical solution of the embodiment of the present invention, the image of the surgical site displayed on the display screen may be as shown in fig. 3.
The image display system provided by the embodiment of the invention can execute the image display method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
It should be noted that, the units and modules included in the system are merely divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be realized; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the embodiment of the invention.
Example two
Fig. 4 is a schematic flow chart of a tracking type naked eye 3D display method based on a stereo microscope according to a second embodiment of the present invention. As shown in fig. 4, the method includes:
s410, at least one operative field acquisition module respectively acquires a target view of a target angle and processes the target view to obtain a target format image.
A surgical field acquisition module is arranged in each of the lens barrels corresponding to the two eyepieces of the microscope. The surgical field acquisition modules can acquire surgical-field pictures on the operating table, i.e., the views at the target angle corresponding to the target user; in other words, the two surgical field acquisition modules can respectively acquire the left and right views corresponding to the left and right eyes. After the left and right views are collected, they can be processed to obtain an image in a 3D format, which is sent to the image processing module of the naked eye 3D display device.
And S420, tracking and determining the position information of the target light source by the tracking module, and determining the target position information according to the position information.
Before the tracking module works, the target user needs to wear a head-mounted device that includes the target light source, or a mark of a specific color or pattern is placed on a specific part of the target user. The advantage of this arrangement is that the position of the mark point can be determined by locating it, and the user's pupil position coordinates can then be derived from the known relative position between the mark point and the user's pupils.
Specifically, the tracking module may collect light ray information of a specific waveband, calculate position information of the target light source according to the light ray information of the specific waveband, calculate coordinate information of a pupil in space according to corresponding position information between the target light source and the pupil of the user, and send the space coordinate information corresponding to the pupil of the user to the image processing module of the naked eye 3D display device.
S430, the image processing module processes the target format image, the target position information and the interleaving parameters of the display device to obtain a three-dimensional view corresponding to the target format image.
After the target format image and the spatial coordinates corresponding to the user's pupils are obtained, the image processing module can process the target format image in combination with the interleaving parameter information of the display device, optionally parameters such as the spacing of the prisms on the display device, to obtain a three-dimensional view corresponding to the target format; the view is displayed on the display screen, so that the doctor can operate using video image information that is updated in real time on the naked eye 3D display device.
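To make the flow of S410-S430 concrete, the sketch below chains the three modules for a single display frame. The module interfaces (grab_views, to_target_format, locate_pupils, render_3d, interleave_params) are hypothetical names chosen for illustration and are not interfaces defined in this application.

```python
# Minimal sketch (assumption): one display-frame pass through the three steps.
# All module interfaces below are hypothetical, not defined by the application.
def display_one_frame(acquisition, tracker, renderer, display):
    # S410: acquire the left/right views and convert them to the target format.
    left, right = acquisition.grab_views()
    target_image = acquisition.to_target_format(left, right)

    # S420: track the marker/light source and derive the pupil positions.
    left_pupil, right_pupil = tracker.locate_pupils()

    # S430: interleave according to pupil positions and display parameters.
    frame = renderer.render_3d(
        target_image, left_pupil, right_pupil, display.interleave_params
    )
    display.show(frame)
```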
It should be noted that the image information displayed on the naked eye 3D display screen can be updated in real time according to the pupil coordinate information of the target user; because the update is fast, the user does not perceive the change on the display screen, and the operation is not affected.
According to the technical scheme of the embodiments of the invention, at least one surgical field acquisition module is arranged in a lens barrel of the microscope and used for acquiring target views at target angles and processing them to obtain target format images; the tracking module is arranged on the naked eye display device and used for tracking and determining the position information of the target light source and determining the target position information from it; and the image processing module is arranged in the naked eye display device and used for processing the target format image, the target position information and the interleaving parameters of the display device to obtain a three-dimensional view corresponding to the target format image. This solves the problems of the prior art that operating on a patient only through the eyepiece of a microscope is highly restrictive, and that a photoelectric conversion module showing the surgical site on a display provides no depth information for that site, which affects the accuracy of the operation. By processing the image information collected by the surgical field acquisition module to obtain a three-dimensional view of the surgical site, the staff can work on the surgical site according to that three-dimensional view, improving the accuracy and convenience of the operation.
Example three
Fig. 5 is a schematic structural diagram of an apparatus according to a third embodiment of the present invention. FIG. 5 illustrates a block diagram of an exemplary device 50 suitable for use in implementing embodiments of the present invention. The device 50 shown in fig. 5 is only an example and should not bring any limitation to the function and scope of use of the embodiments of the present invention.
As shown in FIG. 5, device 50 is embodied in a general purpose computing device. The components of the device 50 may include, but are not limited to: one or more processors or processing units 501, a system memory 502, and a bus 503 that couples the various system components (including the system memory 502 and the processing unit 501).
Bus 503 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, enhanced ISA bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Device 50 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by device 50 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 502 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM)504 and/or cache memory 505. The device 50 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 506 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 5, commonly referred to as a "hard drive"). Although not shown in FIG. 5, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to the bus 503 by one or more data media interfaces. Memory 502 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 508 having a set (at least one) of program modules 507 may be stored, for instance, in memory 502, such program modules 507 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 507 generally perform the functions and/or methodologies of embodiments of the invention as described herein.
Device 50 may also communicate with one or more external devices 509 (e.g., keyboard, pointing device, display 510, etc.), with one or more devices that enable a user to interact with device 50, and/or with any devices (e.g., network card, modem, etc.) that enable device 50 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 511. Also, device 50 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via network adapter 512. As shown, the network adapter 512 communicates with the other modules of the device 50 over a bus 503. It should be appreciated that although not shown in FIG. 5, other hardware and/or software modules may be used in conjunction with device 50, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 501 executes various functional applications and data processing, for example, implementing an image display method provided by an embodiment of the present invention, by running a program stored in the system memory 502.
Example four
The fourth embodiment of the present invention further provides a storage medium containing computer-executable instructions which, when executed by a computer processor, are used to execute the tracking type naked eye 3D display method based on a stereo microscope.
The method comprises the following steps:
at least one surgical field acquisition module respectively acquires a target view at a target angle and processes the target view to obtain a target format image;
the tracking module tracks and determines the position information of the target light source, and determines the target position information according to the position information;
and the image processing module processes the target format image, the target position information and the interleaving parameters of the display device to obtain a three-dimensional view corresponding to the target format image.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A tracking type naked eye 3D display system based on a stereo microscope, characterized in that the system is configured in the microscope and comprises:
at least one surgical field acquisition module, respectively arranged in a lens barrel of the microscope and used for respectively acquiring target views at target angles and processing the target views to obtain target format images;
a tracking module, arranged on the naked eye display device and used for tracking and determining the position information of a target light source and determining target position information according to the position information; and
an image processing module, arranged in the naked eye display device and used for processing the target format image, the target position information and the interleaving parameters of the display device to obtain a three-dimensional view corresponding to the target format image.
2. The display system of claim 1, further comprising:
the light emitting module is arranged on the wearable device, worn at a preset position of a target user and used for emitting preset waveband light information so as to provide position information of the target light source.
3. The display system of claim 2, wherein the tracking module is a camera, including a single camera, a dual camera, or multiple cameras; the tracking module is used for capturing light ray information of a preset wave band and determining the position information of the target light source according to the light ray information.
4. The display system of claim 1, wherein the tracking module comprises two color cameras;
the two color cameras are used for capturing image information marked on a target user in advance, and determining pupil coordinate information of the target user according to the image information to serve as target position information.
5. The display system as claimed in claim 3 or 4, wherein the front end of the camera is provided with a filter for filtering light rays in a predetermined wavelength band to determine the position information of the target light source.
6. The display system of claim 5, wherein the predetermined wavelength range of light is between 840 nm and 1100 nm.
7. The display system of claim 1, further comprising:
and the image display module is arranged in the naked eye display device and used for acquiring the three-dimensional view and displaying the three-dimensional view on a display interface of the naked eye display device.
8. A tracking type naked eye 3D display method based on a stereo microscope, characterized by comprising:
at least one surgical field acquisition module respectively acquires a target view at a target angle and processes the target view to obtain a target format image;
the tracking module tracks and determines the position information of the target light source, and determines the target position information according to the position information;
and the image processing module processes the target format image, the target position information and the interleaving parameters of the display device to obtain a three-dimensional view corresponding to the target format image.
9. The method of claim 8,
and the image display module acquires the three-dimensional view and displays the three-dimensional view on a display interface of the naked eye display device.
10. An apparatus, the apparatus comprising:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the tracking type naked eye 3D display method based on a stereo microscope of any one of claims 8-9.
CN201911106139.3A 2019-11-13 2019-11-13 Tracking type naked eye 3D display system and method based on a stereo microscope Pending CN110687669A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911106139.3A CN110687669A (en) 2019-11-13 2019-11-13 Tracking type naked eye 3D display system and method based on a stereo microscope

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911106139.3A CN110687669A (en) 2019-11-13 2019-11-13 Tracking type naked eye 3D display system and method based on a stereo microscope

Publications (1)

Publication Number Publication Date
CN110687669A true CN110687669A (en) 2020-01-14

Family

ID=69116520

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911106139.3A Pending CN110687669A (en) 2019-11-13 2019-11-13 Tracking type naked eye 3D display system and method based on a stereo microscope

Country Status (1)

Country Link
CN (1) CN110687669A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114302130A (en) * 2021-12-06 2022-04-08 嘉兴智瞳科技有限公司 Intelligent microsurgery imaging device control method and system
CN116699819A (en) * 2023-08-04 2023-09-05 杭州安劼医学科技有限公司 3D surgical microscope optical system and 3D image presentation method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108175522A (en) * 2017-12-21 2018-06-19 上海瑞烁信息科技有限公司 A kind of wear-type three-dimensional electronic operating microscope system
CN108553073A (en) * 2018-05-25 2018-09-21 张家港康得新光电材料有限公司 Endoscopic surgery bore hole 3D rendering display system and display methods
CN108836236A (en) * 2018-05-11 2018-11-20 张家港康得新光电材料有限公司 Endoscopic surgery naked eye 3D rendering display system and display methods
CN108965856A (en) * 2018-07-27 2018-12-07 杭州行开科技有限公司 A kind of naked eye 3D display system and method based on 3D endoscope
CN109031642A (en) * 2018-09-14 2018-12-18 广州弥德科技有限公司 A kind of display methods and system and device of general stereoscopic micro- Glassless
CN109151445A (en) * 2018-09-26 2019-01-04 深圳市新致维科技有限公司 A kind of naked eye 3D display system and its display methods and computer memory device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108175522A (en) * 2017-12-21 2018-06-19 上海瑞烁信息科技有限公司 A kind of wear-type three-dimensional electronic operating microscope system
CN108836236A (en) * 2018-05-11 2018-11-20 张家港康得新光电材料有限公司 Endoscopic surgery naked eye 3D rendering display system and display methods
CN108553073A (en) * 2018-05-25 2018-09-21 张家港康得新光电材料有限公司 Endoscopic surgery bore hole 3D rendering display system and display methods
CN108965856A (en) * 2018-07-27 2018-12-07 杭州行开科技有限公司 A kind of naked eye 3D display system and method based on 3D endoscope
CN109031642A (en) * 2018-09-14 2018-12-18 广州弥德科技有限公司 A kind of display methods and system and device of general stereoscopic micro- Glassless
CN109151445A (en) * 2018-09-26 2019-01-04 深圳市新致维科技有限公司 A kind of naked eye 3D display system and its display methods and computer memory device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114302130A (en) * 2021-12-06 2022-04-08 嘉兴智瞳科技有限公司 Intelligent microsurgery imaging device control method and system
CN114302130B (en) * 2021-12-06 2023-03-17 嘉兴智瞳科技有限公司 Intelligent microsurgery imaging device control method and system
CN116699819A (en) * 2023-08-04 2023-09-05 杭州安劼医学科技有限公司 3D surgical microscope optical system and 3D image presentation method
CN116699819B (en) * 2023-08-04 2023-11-17 杭州安劼医学科技有限公司 3D surgical microscope optical system and 3D image presentation method

Similar Documents

Publication Publication Date Title
EP3533409B1 (en) Augmented reality navigation systems for use with robotic surgical systems
US11217028B2 (en) Surgeon head-mounted display apparatuses
RU2740259C2 (en) Ultrasonic imaging sensor positioning
US11980429B2 (en) Tracking methods for image-guided surgery
US7774044B2 (en) System and method for augmented reality navigation in a medical intervention procedure
WO2019238214A1 (en) Visualization of medical data depending on viewing-characteristics
WO2021089440A1 (en) Augmented reality headset for medical imaging
JP2019527002A (en) Stereo vision system that enables depth perception in the surgical field
CN110687669A (en) Tracking type naked eye 3D display system and method based on body type microscope
EP3461451B1 (en) Laser pointer system for radiotherapy
US20120249533A1 (en) Stereoscopic display apparatus
CN115624384B (en) Operation auxiliary navigation system, method and storage medium based on mixed reality technology
Salb et al. INPRES (intraoperative presentation of surgical planning and simulation results): augmented reality for craniofacial surgery
Cattari et al. Towards a Wearable Augmented Reality Visor for High-Precision Manual Tasks
CN115316919B (en) Dual-camera 3D optical fluorescence endoscope imaging system, method and electronic equipment
US20220142722A1 (en) Method and system for controlling dental machines
CN116687585A (en) Joint surgery assisting method, equipment and medium based on augmented reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200114