WO2021208603A1 - Microscope autofocus method, microscope system, medical device and storage medium - Google Patents

Microscope autofocus method, microscope system, medical device and storage medium

Info

Publication number
WO2021208603A1
WO2021208603A1 · PCT/CN2021/077828
Authority
WO
WIPO (PCT)
Prior art keywords
image
microscope
camera
light
lens
Prior art date
Application number
PCT/CN2021/077828
Other languages
English (en)
French (fr)
Inventor
Jun LIAO
Jianhua YAO
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited
Publication of WO2021208603A1
Priority to US 17/745,571 (published as US20220342195A1)


Classifications

    • G PHYSICS > G02 OPTICS > G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/28 Systems for automatic generation of focusing signals
    • G02B7/36 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B21/02 Objectives
    • G02B21/18 Arrangements with more than one light path, e.g. for comparing two specimens
    • G02B21/24 Base structure
    • G02B21/241 Devices for focusing
    • G02B21/244 Devices for focusing using image analysis techniques
    • G02B21/245 Devices for focusing using auxiliary sources, detectors
    • G02B21/248 Base structure objective (or ocular) turrets
    • G02B21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/361 Optical details, e.g. image relay to the camera or image sensor
    • G02B21/365 Control or image processing arrangements for digital or video microscopes
    • G02B27/10 Beam splitting or combining systems
    • G02B27/106 Beam splitting or combining systems for splitting or combining a plurality of identical beams or images, e.g. image replication
    • G02B27/14 Beam splitting or combining systems operating by reflection only
    • G02B27/28 Optical systems or apparatus for polarising
    • G02B27/283 Optical systems or apparatus for polarising used for beam splitting or combining

Definitions

  • This application relates to medical image processing technology, in particular to a microscope autofocus method, a microscope system, medical equipment, and storage media.
  • augmented reality technology and artificial intelligence have been proposed as additions to traditional optical microscope systems.
  • a camera is used to collect images of the sample to be observed, and real-time images are analyzed in combination with machine learning algorithms.
  • the ability of the camera to collect high-quality images is a prerequisite for the accuracy of the algorithms of the above-mentioned augmented reality microscope.
  • a defocused image of the sample loses a great deal of important optical information. It is therefore particularly important to ensure that the camera can collect an accurately focused image of the sample, so that inaccurate focus of the microscope image does not degrade the output of the model.
  • the embodiments of the present application provide a microscope auto-focusing method, a microscope system, medical equipment, and a storage medium.
  • the technical solutions of the embodiments of the present application are implemented as follows:
  • An embodiment of the present application provides a microscope system, which includes:
  • an objective lens, configured to direct the light from the sample to be observed into the first optical path, where it merges at the beam splitter with the light that the image projection assembly sends into the first optical path through the lens assembly;
  • a beam splitter assembly, including at least one beam splitter configured to separate and project light into different optical paths;
  • a lens assembly, including at least one lens configured to project the light generated by the observed sample, via the objective lens, into different optical paths, so that the light propagates along those paths;
  • an image projection assembly, arranged in the optical path corresponding to the light projected by the lens assembly and configured to perform image enhancement processing on the image of the sample to be observed;
  • a camera assembly, arranged in the first optical path and including a camera configured to photograph the sample to be observed in the microscope field of view, so as to form and output a sharply focused image taken through the first optical path;
  • an auxiliary focusing device, including an auxiliary focusing light source and an auxiliary focusing camera arranged in the second optical path and configured to determine a focal length matching the camera assembly; and
  • a focusing device, configured to adjust the focal length of the image light entering the camera assembly according to the defocus amount of the measurement sample image determined by the auxiliary focusing device.
  • the microscope system further includes:
  • an eyepiece and a trinocular tube; the eyepiece is sleeved in the trinocular tube and configured for observing the sample to be observed through the objective lens;
  • the trinocular tube is arranged at the end of the beam splitter away from the objective lens; the trinocular tube includes at least two channels and a tube lens, the channels being located at the end away from the beam splitter, with one of the channels communicating with the eyepiece, and the tube lens being located at the end close to the beam splitter.
  • the focusing device includes a shift drive assembly and a zoom lens, so that the sample to be observed in the microscope field of view can be photographed at different focal lengths.
  • the beam splitter assembly communicates respectively with the objective lens and with the tube lens of the trinocular tube, and the camera assembly is arranged in one of the channels of the trinocular tube;
  • the beam splitter assembly includes a beam splitter, and the lens assembly includes a lens disposed between the beam splitter and the image projection assembly;
  • the focusing device is located between the beam splitter and the camera assembly, and is configured to adjust the focal length of the image light entering the camera assembly according to the defocus amount of the measurement sample image determined by the auxiliary focusing device.
  • the image projection assembly further includes a first polarizer, the first polarizer is located between the lens assembly and the beam splitter, and is configured to polarize the corresponding light in the first optical path;
  • the camera assembly further includes a second polarizer located between the focusing device and the beam splitter and configured to perform polarization processing on the corresponding light collected by the camera assembly.
  • the auxiliary focusing light source is arranged in the Fourier back focal plane corresponding to the condenser lens assembly of the microscope system, and is configured to emit auxiliary focusing light to form the second optical path;
  • the beam splitter assembly includes a beam splitter arranged between the focusing device and the camera assembly, configured to reflect light in the second optical path to the auxiliary focusing camera;
  • the auxiliary focusing camera is arranged at an axially offset position relative to the conjugate plane of the camera assembly, and is configured to photograph, based on the light in the second optical path, an overlapping image matching the sample to be observed in the microscope field of view.
  • the auxiliary focusing light source is arranged in the Fourier back focal plane corresponding to the condenser lens assembly of the microscope system, and is configured to emit auxiliary focusing light to form the second optical path;
  • the auxiliary focusing camera and the image projection assembly are arranged opposite to each other along the beam splitter assembly, and are configured to shoot, based on the light in the second optical path, an overlapping image matching the sample to be observed in the microscope field of view.
  • the image projection assembly and the camera assembly operate using a time-division multiplexing mechanism.
  • the microscope system further includes:
  • at least one output interface device, coupled to the data processing unit of the microscope system to output the sharply focused image taken through the first optical path and the image of the sample to be observed that has undergone image enhancement processing.
  • the objective lens includes at least one of the following:
  • an achromatic objective lens, a plan achromatic objective lens, a plan semi-apochromatic objective lens, or a plan apochromatic objective lens;
  • the beam splitter includes at least one of the following:
  • a box beam splitter, a flat beam splitter, or a thin-film beam splitter.
  • the embodiment of the present application also provides a microscope auto-focusing method, the method includes:
  • acquiring a measurement sample taken by the auxiliary focusing camera in the second optical path of the microscope; calculating a corresponding image evaluation parameter according to the measurement sample and the corresponding image evaluation standard; looking up, according to the image evaluation parameter, the relationship between the image evaluation parameter and the defocus amount in a pre-stored calibration curve, and thereby determining the required defocus amount; and adjusting, according to the determined defocus amount, the focal length of the image light entering the camera assembly, so that the camera assembly shoots a sharply focused image through the first optical path.
  • the acquiring the measurement sample taken by the auxiliary focus camera in the second optical path of the microscope includes:
  • the collected light in the second optical path is processed based on the type of the auxiliary focusing camera, so as to obtain an overlapping image matching the sample to be observed in the microscope field of view.
  • the method further includes:
  • the sample to be observed in the microscope field of view is photographed through the light in the first optical path to form and output a clearly focused image taken through the first optical path.
  • An embodiment of the present application also provides a medical device, which includes:
  • a microscope system, which is the microscope system provided in the foregoing embodiments;
  • a memory, configured to store executable instructions; and
  • a processor, configured to implement the aforementioned microscope autofocus method when running the executable instructions stored in the memory.
  • An embodiment of the present application also provides a computer-readable storage medium that stores executable instructions, and the executable instructions are executed by a processor to implement the aforementioned microscope auto-focusing method.
  • In the above microscope system, the light from the sample to be observed enters the first optical path, and the light generated by the image projection assembly enters the first optical path through the lens assembly; the two merge at the beam splitter.
  • The beam splitter assembly includes at least one beam splitter configured to separate and project light into different optical paths.
  • The lens assembly includes at least one lens configured to project, via the objective lens, the light generated by the observed sample into different optical paths, so that the light propagates along those paths.
  • The image projection assembly is arranged in the optical path corresponding to the light projected by the lens assembly, and is configured to perform image enhancement processing on the image of the sample to be observed.
  • The camera assembly is arranged in the first optical path and includes a camera configured to photograph the sample to be observed in the microscope field of view, so as to form and output a sharply focused image taken through the first optical path.
  • The auxiliary focusing device includes an auxiliary focusing light source and an auxiliary focusing camera arranged in the second optical path, configured to determine a focal length matching the camera assembly; the focusing device adjusts the focal length of the image light entering the camera assembly according to the measured defocus amount of the sample image.
  • In this way, the focusing device enables the camera assembly of the microscope system to focus automatically, forming and outputting a sharply focused image taken through the first optical path, which saves focusing time and improves focusing accuracy.
  • FIG. 1 is a schematic diagram of a use environment of a microscope auto-focus method provided by an embodiment of the present application
  • FIG. 2 is a schematic diagram of the composition structure of a medical device provided by an embodiment of the application.
  • FIG. 3 is an optional structure of the microscope system in the embodiment of the application.
  • FIG. 4 is a schematic diagram of an optional process of the microscope auto-focusing method provided by the embodiment of the present application.
  • FIG. 5 is a schematic diagram of an optional structure of the microscope system provided by the embodiment of the present application.
  • FIG. 6 is a schematic diagram of an optional structure of the microscope system provided by the embodiment of the present application.
  • FIG. 7 is a schematic diagram of an optional structure of a microscope system provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of the relationship between the defocus amount and the distance between ghost images in an embodiment of the application.
  • FIG. 9 is a schematic diagram of fitting the relationship between the defocus amount and the distance between ghost images in this application.
  • Terminals: including but not limited to ordinary terminals and dedicated terminals, where an ordinary terminal maintains a long connection and/or a short connection with the transmission channel, and a dedicated terminal maintains a long connection with the transmission channel.
  • Client: a carrier that implements a specific function in a terminal; for example, a mobile client (APP) is a carrier of a specific function in a mobile terminal, such as a function for performing payment and consumption or purchasing a wealth-management product.
  • Lens assembly: a device combining at least one lens, which can be provided with a lens barrel and is configured for observing an enlarged optical image of an object such as a cell.
  • Computer-Aided Diagnosis (CAD): imaging, medical image processing technology, and other possible physiological and biochemical means, combined with computer analysis and calculation, used to assist in discovering lesions and to improve the accuracy of diagnosis.
  • FIG. 1 is a schematic diagram of the usage scenario of the microscope autofocus method provided by an embodiment of the present application.
  • the terminal (including the terminal 10-1 and the terminal 10-2) is provided with corresponding clients capable of performing different functions.
  • Through these clients, the terminals (including the terminal 10-1 and the terminal 10-2) obtain different slice images from the corresponding server 200 for browsing.
  • the terminal connects to the server 200 through the network 300.
  • the network 300 can be a wide area network or a local area network, or a combination of the two, and uses a wireless link to realize data transmission.
  • The slice image types that the terminals (including the terminal 10-1 and the terminal 10-2) acquire from the corresponding server 200 through the network 300 may be the same or different.
  • The terminals (including the terminal 10-1 and the terminal 10-2) may obtain, through the network 300, a pathological image or pathological video matching the target object from the corresponding server 200, or obtain only the pathological slices matching the current target from the corresponding server 200 through the network 300 for browsing.
  • the server 200 may store slice images corresponding to different target objects, and may also store auxiliary analysis information matching the slice images of the target object.
  • the neural network model in the field of artificial intelligence deployed by the server can use a camera on a traditional optical microscope to collect images of the sample to be observed, and combine machine learning algorithms to analyze real-time images.
  • Artificial Intelligence is a theory, method, technology and application system that uses digital computers or machines controlled by digital computers to simulate, extend and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain the best results.
  • artificial intelligence is a comprehensive technology of computer science, which attempts to understand the essence of intelligence and produce a new intelligent machine that can react in a similar way to human intelligence.
  • Artificial intelligence is to study the design principles and implementation methods of various intelligent machines, so that the machines have the functions of perception, reasoning and decision-making.
  • Artificial intelligence software technology mainly includes computer vision technology, speech processing technology, natural language processing technology, and machine learning/deep learning.
  • Viewing a patient's lesions under the microscope system can involve a variety of application scenarios, such as lung cancer cell screening, early cervical cancer screening, and screening of other cell sections.
  • the image processing method of the microscope system based on this embodiment can be deployed in a variety of application scenarios, thereby facilitating remote inspection and use by doctors.
  • The server 200 sends the pathology information of the same target object to the terminal (terminal 10-1 and/or terminal 10-2) through the network 300, so that the user of the terminal (terminal 10-1 and/or terminal 10-2) can analyze the pathology information of the target object.
  • the server 200 deploys the corresponding neural network model and is configured to analyze the clear image information output by the microscope system.
  • Image acquisition by the microscope system can be achieved in the following manner: obtain the measurement sample taken by the auxiliary focusing camera in the second optical path of the microscope; calculate the corresponding image evaluation parameter according to the measurement sample taken by the auxiliary focusing camera and the corresponding image evaluation standard; according to the image evaluation parameter, look up the relationship between the image evaluation parameter and the defocus amount in the pre-stored calibration curve, and thereby determine the required defocus amount; and, according to the determined defocus amount, adjust the focal length of the image light entering the camera assembly, so that the camera assembly shoots a sharply focused image through the first optical path.
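The four-step acquisition loop described above can be sketched as follows. This is an illustrative sketch only, not the application's implementation: the injected `capture`, `evaluate`, and `move_focus` callables are assumed stand-ins for the auxiliary focusing camera, the image evaluation standard, and the focusing device, and linear interpolation over the calibration curve is one plausible lookup method.

```python
import numpy as np

def lookup_defocus(offset_px, calib_offsets, calib_defocus):
    """Step 3: map the image evaluation parameter (ghost offset, px)
    to a defocus amount via the pre-stored calibration curve."""
    return float(np.interp(offset_px, calib_offsets, calib_defocus))

def autofocus_step(capture, evaluate, move_focus, calib_offsets, calib_defocus):
    """One iteration of the auxiliary-camera autofocus loop.

    capture()    -> measurement sample from the second optical path (step 1)
    evaluate(s)  -> image evaluation parameter for that sample (step 2)
    move_focus() -> focal-length correction of the focusing device (step 4)
    """
    sample = capture()                       # step 1: measurement sample
    offset_px = evaluate(sample)             # step 2: evaluation parameter
    defocus = lookup_defocus(offset_px, calib_offsets, calib_defocus)  # step 3
    move_focus(-defocus)                     # step 4: correct the focal length
    return defocus
```

A caller would wire these callables to the real auxiliary camera and zoom-lens driver; the sign convention of the correction is likewise an assumption.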
  • FIG. 2 is a schematic diagram of the composition structure of the medical device provided by an embodiment of the application. It should be understood that FIG. 2 shows only an exemplary structure of the medical device rather than the entire structure, and part or all of the structure shown in FIG. 2 can be implemented as required.
  • the medical device provided by the embodiment of the present application includes: at least one processor 201, a memory 202, a user interface 203, and at least one network interface 204.
  • the various components in the medical device 20 are coupled together through the bus system 205.
  • the bus system 205 is configured to implement connection and communication between these components.
  • the bus system 205 also includes a power bus, a control bus, and a status signal bus.
  • various buses are marked as the bus system 205 in FIG. 2.
  • the user interface 203 may include a display, a keyboard, a mouse, a trackball, a click wheel, keys, buttons, a touch panel, or a touch screen.
  • the memory 202 may be a volatile memory or a non-volatile memory, and may also include both volatile and non-volatile memory.
  • the memory 202 in the embodiment of the present application can store data to support the operation of the terminal (such as 10-1). Examples of such data include: any computer program configured to operate on a terminal (such as 10-1), such as an operating system and application programs.
  • the operating system includes various system programs, such as a framework layer, a core library layer, a driver layer, etc., which are configured to implement various basic services and process hardware-based tasks.
  • Applications can include various applications.
  • the microscope system provided in the embodiments of the present application may be implemented in a combination of software and hardware.
  • The microscope system provided in the embodiments of the present application may be implemented as a processor in the form of a hardware decoding processor, which is programmed to execute the image processing method of the microscope system provided in the embodiments of the present application.
  • For example, a processor in the form of a hardware decoding processor may adopt one or more application-specific integrated circuits (ASIC), DSPs, programmable logic devices (PLD), complex programmable logic devices (CPLD), field-programmable gate arrays (FPGA), or other electronic components.
  • the microscope system provided by the embodiment of the present application may be directly embodied as a combination of software modules executed by the processor 201, and the software modules may be located in a storage medium.
  • The processor 201 reads the executable instructions included in the software modules in the memory 202 and, in combination with the necessary hardware (for example, the processor 201 and other components connected to the bus 205), completes the microscope system image processing method provided by the embodiments of the present application.
  • The processor 201 may be an integrated circuit chip with signal processing capabilities, such as a general-purpose processor, a digital signal processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, where the general-purpose processor may be a microprocessor or any conventional processor.
  • The device provided in the embodiments of the application may directly use the processor 201 in the form of a hardware decoding processor for execution, for example, implemented by one or more ASICs, DSPs, PLDs, CPLDs, FPGAs, or other electronic components.
  • The memory 202 in the embodiment of the present application is configured to store various types of data to support the operation of the medical device 20. Examples of such data include any executable instructions configured to operate on the medical device 20; a program that implements the image processing method of the microscope system of the embodiments of the present application may be included in these executable instructions.
  • FIG. 2 shows the microscope system 2020 stored in the memory 202, which can be software in the form of programs and plug-ins and includes a series of modules. As an example of a program stored in the memory 202, the microscope system 2020 includes the following software modules:
  • The information processing module 2081 is configured to: obtain measurement samples taken by the auxiliary focusing camera in the second optical path of the microscope; calculate corresponding image evaluation parameters according to the measurement samples taken by the auxiliary focusing camera and the corresponding image evaluation standards; according to the image evaluation parameter, look up the relationship between the image evaluation parameter and the defocus amount in the pre-stored calibration curve, and thereby determine the required defocus amount; and, according to the determined defocus amount, adjust the focal length of the image light entering the camera assembly, so that the camera assembly shoots a sharply focused image through the first optical path.
  • FIG. 3 is an optional structure of the microscope system in the related art of this application.
  • The microscope 300 has a microscope body 301, a microscope body stage focusing knob 302, a microscope body stage 303, a microscope sample 304 to be observed, a microscope body objective lens 305, a trinocular tube 306, a camera 307, and an eyepiece 308.
  • a microscope stage 303 is arranged above the microscope body 301, a sample 304 to be observed is placed on the microscope body stage 303, and the microscope body stage focusing knobs 302 are arranged on both sides of the microscope body 301.
  • The microscope body objective lens 305 is located above the microscope body stage 303, a trinocular tube 306 is provided above the microscope body objective lens 305, and the trinocular tube 306 is connected to the camera 307 and the eyepiece 308 respectively.
  • Adjusting the microscope body stage focus knob 302 can adjust the microscope body stage 303 to rise or fall in the vertical direction, thereby changing the distance between the microscope body stage 303 and the microscope body objective lens 305 to achieve focusing.
  • the microscope body objective lens 305 can also be moved to change the distance between the microscope body stage 303 and the microscope body objective lens 305 to achieve focus adjustment.
  • The premise of the focus adjustment of the microscope 300 is the assumption that the eyepiece 308 end and the camera 307 end of the trinocular tube 306 of the microscope 300 are parfocal.
  • When this assumption does not hold, the image of the camera 307 and the image of the eyepiece 308 are mutually out of focus.
  • For example, parfocality of lenses of different magnifications may be poorly adjusted; the eyes of different users of the microscope 300 have different diopters, and when users of the microscope 300 change, the new user may not be aware of the eyepiece 308 diopter adjustment knob and may instead directly adjust the stage to refocus the sample.
  • FIG. 4 is an optional process schematic diagram of the microscope auto-focusing method provided by the embodiment of the present application.
  • the microscope auto-focusing method provided in the present application includes the following steps:
  • Step 401: Obtain a measurement sample taken by the auxiliary focusing camera in the second optical path of the microscope.
  • Step 402: Calculate corresponding image evaluation parameters according to the measurement samples taken by the auxiliary focusing camera and the corresponding image evaluation standards.
  • the image evaluation criterion may be the relative pixel offset of the ghost image collected by the auxiliary focus camera.
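The relative pixel offset between the two ghost copies can be estimated in several ways; the application does not specify an estimator. One plausible sketch, given here purely as an assumption, locates the secondary peak of the image's autocorrelation, whose distance from the central peak equals the displacement between the ghost copies.

```python
import numpy as np

def ghost_offset(img, min_shift=2):
    """Estimate the pixel offset between the two ghost copies in `img`.

    Computes the (circular) autocorrelation via FFT; a doubled (ghosted)
    image produces a secondary correlation peak at the ghost displacement.
    `min_shift` is an assumed exclusion radius around the trivial
    zero-shift peak.
    """
    img = img - img.mean()
    f = np.fft.fft2(img)
    acorr = np.fft.ifft2(f * np.conj(f)).real
    acorr = np.fft.fftshift(acorr)              # zero shift at the center
    cy, cx = np.array(acorr.shape) // 2
    # mask out the trivial zero-shift peak and its neighbourhood
    yy, xx = np.ogrid[:acorr.shape[0], :acorr.shape[1]]
    acorr[(yy - cy) ** 2 + (xx - cx) ** 2 < min_shift ** 2] = -np.inf
    py, px = np.unravel_index(np.argmax(acorr), acorr.shape)
    return float(np.hypot(py - cy, px - cx))
```

On a synthetic image formed as `base + np.roll(base, 6, axis=1)`, this estimator recovers a shift close to 6 pixels; on real auxiliary-camera frames, subpixel refinement of the peak position would likely be needed.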
  • Step 403 According to the image evaluation parameter, search for the relationship between the image evaluation parameter and the defocus amount in the pre-stored calibration curve, and then determine the required defocus amount;
  • the calibration curve can be pre-stored in the corresponding storage medium, so that the auto-focus function of the augmented reality microscope can be realized by calling the calibration curve.
  • The pre-stored calibration curve is a curve determined in advance from images collected at different defocus amounts and the corresponding image evaluation standard, for example the relative pixel offset of the image ghost collected by the auxiliary focusing camera; thus, by looking up the calibration curve, the relationship between the defocus amount and the image evaluation parameters at different degrees of defocus can be determined.
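A calibration curve of this kind can be built offline by, for example, fitting a least-squares model over (defocus amount, ghost offset) pairs collected at known stage positions, in the spirit of the fitted relationship of FIG. 9. The linear model and the sample values below are illustrative assumptions, not data from the application.

```python
import numpy as np

# Hypothetical calibration data: defocus amounts (um) set with the stage,
# and the ghost-image distance (px) measured at each position.
defocus_um = np.array([-20.0, -10.0, 0.0, 10.0, 20.0])
ghost_px = np.array([-8.1, -3.9, 0.1, 4.0, 8.0])

# Fit a first-order calibration curve: ghost_px ~ a * defocus_um + b.
a, b = np.polyfit(defocus_um, ghost_px, 1)

def defocus_from_ghost(offset_px):
    """Invert the fitted calibration curve to recover the defocus amount."""
    return (offset_px - b) / a
```

Whether a linear model suffices depends on the optics; a higher-order `np.polyfit` or a table lookup with interpolation would follow the same pattern.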
  • Step 404: Adjust the focal length of the image light entering the camera assembly according to the determined defocus amount, so that the camera assembly shoots a sharply focused image through the first optical path.
  • In this way, the camera assembly can shoot clear, focused images through the first optical path, avoiding the slow speed and poor accuracy of manual focusing in the prior art.
  • The structure of the microscope system differs from that of the microscope shown in FIG. 3; it is, for example, an augmented reality microscope (ARM, Augmented Reality Microscope).
  • When a doctor uses an augmented reality microscope to observe a slice, the diagnosis result based on the slice can be obtained at the same time; that is, the augmented reality microscope can superimpose the diagnosis result on the slice as augmented reality information.
  • the neural network model running in the server can make auxiliary diagnosis and treatment judgments on the patient area to help doctors make correct judgments on the pathological information of the lesion.
  • FIG. 5 is an optional schematic diagram of the microscope system provided by an embodiment of the present application, which includes an objective lens 115, a beam splitter 1112, an image projection assembly 1111, a camera assembly 117, and a trinocular tube 116.
  • The objective lens 115 has a first end 10a and a second end 10b arranged opposite each other, the first end 10a facing the sample to be observed.
  • The beam splitter 1112 is arranged at the second end 10b and communicates with the objective lens 115 and with the tube lens 1118 of the multi-channel tube, respectively; the camera assembly 117 is arranged in one of the channels of the trinocular tube.
  • The image projection assembly 1111 projects images into the corresponding field of view through the light transmitted by the lens 1115; the camera assembly 117 receives the light transmitted by the tube lens 1118; the camera assembly 117 includes a camera and a corresponding image output device, and is configured to transmit the captured image of the corresponding field of view to the server for processing or recognition.
  • The trinocular tube 116 is arranged at the end of the beam splitter 1112 away from the objective lens; the trinocular tube 116 includes channels and a tube lens 1118; there are at least two channels, located at the end away from the beam splitter 1112, and the tube lens 1118 is located at the end close to the beam splitter 1112; the camera assembly 117 receives the light output from the beam splitter 1112 through the tube lens 1118 to complete collection of the image of the corresponding field of view.
  • The image projection assembly 1111 further includes a first polarizer 1116; the first polarizer 1116 is located between the tube lens 1118 and the beam splitter 1112, and is configured to polarize the corresponding light in the first optical path.
  • The camera assembly further includes a second polarizer 1117; the second polarizer 1117 is located between the tube lens 1118 and the camera assembly 117, and is configured to polarize the corresponding light collected by the camera assembly 117.
  • The optical path of the microscope system 1100 is as follows: light from the objective lens 115 is transmitted to the beam splitter 1112; the beam splitter 1112 reflects part of the light to the tube lens 1118, and it is transmitted through the first polarizer 1116 to the photosensitive chip of the camera assembly 117.
  • The beam splitter 1112 transmits part of the light to the tube lens 1118; after passing through the tube lens 1118, the transmitted light is reflected to the trinocular tube 116, and the trinocular tube 116 transmits the light to the eyepiece 118.
  • The image of the sample 114 to be observed can thus be observed through the eyepiece 118.
  • The light generated by the image projection assembly 1111 travels along the lens 1115, is polarized by the second polarizer 1117, and after the beam splitter 1112 is blocked by the first polarizer 1116, so that it cannot reach the camera assembly 117 and the shooting of the camera assembly 117 is not affected.
  • Otherwise, the cumbersome trinocular parfocality adjustment must be repeated every time microscope observers with different eye diopters exchange the microscope system.
  • The camera cannot complete autofocus autonomously and cannot collect clear images.
  • Even when the user of the microscope sees a clear image through the eyepiece, the camera may collect an out-of-focus image, which cannot guarantee the correctness of the analysis results of the image algorithm executed in the server.
  • FIG. 6 is an optional structural diagram of the microscope system provided by an embodiment of the present application. The microscope system 600 specifically includes an objective lens 115, a beam splitter 1112, an image projection assembly 1111, a camera assembly 117, and a trinocular tube 116.
  • The objective lens 115 has a first end 10a and a second end 10b arranged opposite each other, with the first end 10a facing the sample to be observed; the beam splitter 1112 is arranged at the second end 10b and communicates with the objective lens 115 and with the tube lens 1118 of the multi-channel tube, respectively; the camera assembly 117 is arranged in one of the channels of the trinocular tube. The image projection assembly 1111 projects images into the corresponding field of view through the light transmitted by the lens 1115, and the camera assembly 117 receives the light transmitted by the tube lens 1118.
  • The trinocular tube 116 is arranged at the end of the beam splitter 1112 away from the objective lens; the trinocular tube 116 includes channels and a tube lens 1118; there are at least two channels, located at the end away from the beam splitter 1112, and the tube lens 1118 is located at the end close to the beam splitter 1112; the camera assembly 117 receives the light output from the beam splitter 1112 through the tube lens 1118 to complete collection of the image of the corresponding field of view.
  • The image projection assembly 1111 further includes a second polarizer 1117, located between the lens 1115 and the beam splitter 1112 and configured to polarize the corresponding light in the first optical path; the camera assembly further includes a first polarizer 1116, located between the tube lens 1118 and the camera assembly 117 and configured to polarize the corresponding light collected by the camera assembly 117.
  • Through the eyepiece, the observer can see both the image of the sample 114 to be observed and the image output from the image projection assembly 1111, whereas the camera assembly 117 should capture only the image of the sample 114 to be observed while ignoring the image output from the image projection assembly 1111.
  • The second polarizer 1117 converts the light output by the image projection assembly 1111 into polarized light, which can reach the human eye directly through the eyepiece 118 for observation, and the first polarizer 1116 can eliminate the polarized light output by the second polarizer 1117, so that the camera assembly 117 captures only the image of the sample 114 to be observed.
  • In an embodiment, the objective lens includes at least one of the following: an achromatic objective, a plan achromatic objective, a plan semi-apochromatic objective, or a plan apochromatic objective; the beam splitter includes at least one of the following: a cube beam splitter, a plate beam splitter, or a thin-film beam splitter.
  • For example, a combination of objective lenses with magnifications of 4.0X, 10.0X, 20.0X, 60.0X, and 100.0X can be provided for users to choose from.
  • The cube beam splitter, plate beam splitter, or thin-film beam splitter can be selected and adapted according to the type of augmented reality microscope, to suit different use environments.
  • The microscope system 1100 further includes an auxiliary focusing light source 1140 arranged in the Fourier back focal plane corresponding to the condenser lens group 1141, configured to emit auxiliary focusing light; the auxiliary focusing light source 1140 may be two identical infrared LED emitters. The auxiliary focusing camera 1143 is arranged at an axially offset position of the conjugate plane of the camera assembly 117. The second optical path formed by the light generated by the auxiliary focusing light source 1140 is shown at 1144 in the figure.
  • The optical path of the microscope system 1100 includes a first optical path and a second optical path. The first optical path is configured to project the light generated by the observed sample after it enters the optical path through the objective lens, so that the camera assembly photographs the sample to be observed in the microscope field of view, forming and outputting a sharply focused image captured through the first optical path; at the same time, the image projection assembly can use the light in the second optical path to perform image enhancement on the image of the sample to be observed.
  • The first optical path includes: light from the objective lens 115 is transmitted to the beam splitter 1112; the beam splitter 1112 reflects part of the light to the tube lens 1118, and it passes through the first polarizer 1116 and is transmitted to the photosensitive chip of the camera assembly 117.
  • The beam splitter 1112 transmits part of the light to the tube lens 1118; after passing through the tube lens 1118, the transmitted light is reflected to the trinocular tube 116.
  • The trinocular tube 116 transmits the light to the eyepiece 118, and the image of the sample 114 to be observed can be observed through the eyepiece 118.
  • The light generated by the image projection assembly 1111 travels along the lens 1115 and is polarized by the second polarizer 1117; after the beam splitter 1112 it is blocked by the first polarizer 1116, so that it cannot reach the camera assembly 117. The camera assembly 117 can therefore capture only the image of the sample 114 to be observed, and its shooting is not affected.
  • The polarization directions of the first polarizer 1116 and the second polarizer 1117 are perpendicular to each other.
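The effect of the two crossed polarizers can be checked with Malus's law, I = I0·cos²θ: light polarized by one polarizer arrives at the other rotated by 90°, so essentially none of it is transmitted. A minimal numeric sketch (the function name and intensity values are illustrative only, not from the patent):

```python
import math

def transmitted_intensity(i0, theta_deg):
    """Malus's law: intensity passed by a polarizer whose axis is at
    theta degrees to the incoming light's polarization direction."""
    return i0 * math.cos(math.radians(theta_deg)) ** 2

print(transmitted_intensity(1.0, 0))    # parallel axes: full transmission
print(transmitted_intensity(1.0, 90))   # crossed axes, as here: effectively zero
```

This is why the projection light, once polarized, is extinguished at the camera-side polarizer while the unpolarized sample light still passes.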
  • Before the camera assembly 117 takes a picture, its focal length needs to be adjusted first; specifically, the corresponding defocus amount parameter can be determined from the image in the second optical path.
  • The second optical path includes: light from the auxiliary focusing light source 1140 in the Fourier back focal plane passes through the object 1185 and reaches the beam splitter 1112.
  • The beam splitter 1112 transmits the light to the tube lens 1118; the infrared light emitted by the infrared LEDs serving as the auxiliary focusing light source is transmitted through the tube lens 1118 to the auxiliary focusing camera 1143, where it forms partially overlapping (ghost) images.
  • The focusing device 1121 is located between the beam splitter 1112 and the camera assembly 117, and is configured to drive the lens so that its focal length is adjusted, based on the focal length determined from the defocus amount of the overlapping images, to form a new focal length.
  • The focusing device 1121 can be an electric motor, such as an ultrasonic drive motor or another mechanical motor that can drive the lens group accordingly; it can also be a liquid zoom lens that zooms independently of the lens group, to suit different use environments.
  • Alternatively, the focusing device 1121, located between the beam splitter 1112 and the camera assembly 117, is configured to adjust focus through a focal length determined by a zoom lens based on the defocus amount of the overlapping images, forming a new focal length; the camera assembly 117 is configured to photograph the sample to be observed in the microscope field of view based on the new focal length, to form and output a sharply focused image captured through the first optical path.
  • The camera can also be used together with a camera adapter: a camera based on a photosensitive chip is inserted through the camera adapter into the camera interface on top of the trinocular observation tube, thereby connecting the camera to the trinocular observation tube.
  • A polarizer can also be embedded in the camera adapter; the embedded polarizer filters out light whose polarization state is perpendicular to the polarizer, avoiding interference with imaging.
  • FIG. 7 is an optional structural diagram of a microscope system provided by an embodiment of the present application.
  • The microscope system 700 specifically includes an objective lens 115, a beam splitter 1112, an image projection assembly 1111, a camera assembly 117, and a trinocular tube 116.
  • The objective lens 115 has a first end 10a and a second end 10b arranged opposite each other; the first end 10a faces the sample to be observed; the beam splitter 1112 is arranged at the second end 10b and communicates with the objective lens 115 and with the tube lens 1118 of the multi-channel tube, respectively.
  • The camera assembly 117 is arranged in one of the channels of the trinocular tube, and the image projection assembly 1111 projects images into the corresponding field of view through the light transmitted by the lens 1115.
  • The camera assembly 117 receives the light transmitted by the tube lens 1118, and includes a camera and a corresponding image output device configured to capture the image of the corresponding field of view.
  • The trinocular tube 116 is arranged at the end of the beam splitter 1112 away from the objective lens, and includes channels and a tube lens 1118; there are at least two channels, located at the end away from the beam splitter 1112; the tube lens 1118 is located at the end close to the beam splitter 1112; the camera assembly 117 receives the light output from the beam splitter 1112 through the tube lens 1118 to complete collection of the image of the corresponding field of view.
  • The image projection assembly 1111 further includes a second polarizer 1117, located between the lens 1115 and the beam splitter 1112 and configured to polarize the corresponding light in the first optical path; the camera assembly further includes a first polarizer 1116, located between the tube lens 1118 and the camera assembly 117 and configured to polarize the corresponding light collected by the camera assembly 117.
  • The naked eye of the microscope operator should be able to observe through the eyepiece 118 both the image of the sample 114 to be observed and the image output by the image projection assembly 1111, while the camera assembly 117 should photograph only the sample 114 to be observed.
  • The light output by the image projection assembly 1111 is converted into polarized light by the second polarizer 1117 and can reach the human eye directly through the eyepiece 118 for observation.
  • Because the polarization directions of the first polarizer 1116 and the second polarizer 1117 are perpendicular to each other, the first polarizer 1116 can eliminate the polarized light output by the second polarizer 1117, so that the camera assembly 117 captures only the image of the sample 114 to be observed.
  • The microscope system 1100 further includes an auxiliary focusing light source 1140 arranged in the Fourier back focal plane corresponding to the condenser lens group 1141, configured to emit auxiliary focusing light; the auxiliary focusing light source 1140 may be two identical infrared LED emitters. The auxiliary focusing camera 1143 is set at a position horizontally symmetrical to the image projection assembly 1111 and receives, through the lens 1119, the light in the second optical path refracted by the beam splitter 1112. The second optical path formed by the light generated by the auxiliary focusing light source 1140 is shown at 1144 in the figure.
  • The optical path of the microscope system 1100 includes a first optical path and a second optical path. The first optical path includes: light from the objective lens 115 is transmitted to the beam splitter 1112; the beam splitter 1112 reflects part of the light to the tube lens 1118, and it passes through the first polarizer 1116 and is transmitted to the photosensitive chip of the camera assembly 117.
  • The beam splitter 1112 transmits part of the light to the tube lens 1118; after passing through the tube lens 1118, the transmitted light is reflected to the trinocular tube 116.
  • The trinocular tube 116 transmits the light to the eyepiece 118, and the image of the sample 114 to be observed can be observed through the eyepiece 118.
  • The light generated by the image projection assembly 1111 travels along the lens 1115; since the polarization directions of the first polarizer 1116 and the second polarizer 1117 are perpendicular to each other, this light is blocked by the first polarizer 1116 after the beam splitter 1112 and cannot reach the camera assembly 117, so the camera assembly 117 captures only the image of the sample 114 to be observed, and the light output by the image projection assembly 1111 does not affect the shooting of the camera assembly 117.
  • Meanwhile, the naked eye of the microscope operator can observe through the eyepiece 118 both the image of the sample 114 to be observed and the image output by the image projection assembly.
  • Before the camera assembly 117 takes a picture, its focal length needs to be adjusted first; specifically, the corresponding out-of-focus parameter can be determined from the image in the second optical path.
  • The second optical path includes: light from the auxiliary focusing light source 1140 in the Fourier back focal plane passes through the object 1185 and reaches the beam splitter 1112.
  • The beam splitter 1112 transmits the light to the lens 1119; the infrared light emitted by the infrared LEDs serving as the auxiliary focusing light source is transmitted through the lens 1119 to the auxiliary focusing camera 1143, where it forms partially overlapping (ghost) images.
  • The focusing device 1121 is located between the beam splitter 1112 and the camera assembly 117, and is configured to drive the lens so that its focal length is adjusted, based on the focal length determined from the defocus amount of the overlapping images, to form a new focal length.
  • Alternatively, the focusing device 1121 is configured to adjust focus through a focal length determined by a zoom lens based on the defocus amount of the overlapping images, forming a new focal length; the camera assembly 117 is configured to photograph the sample to be observed in the microscope field of view based on the new focal length and to output a sharply focused image captured through the first optical path.
  • FIG. 8 is a schematic diagram of the relationship between the defocus amount and the distance between ghost images. The infrared light emitted by the infrared LEDs serving as the auxiliary focusing light source is transmitted to the auxiliary focusing camera 1143, where it forms partially overlapping images. The auxiliary focusing camera can be an ordinary industrial camera with its infrared filter removed, or a dedicated infrared camera capable of capturing infrared light.
  • The camera in the camera assembly is usually a color camera, and there is usually an infrared cut filter in front of it, so the infrared light generated by the infrared LEDs will not be captured by that camera.
  • As the defocus amount of the sample changes, the peak positions of the autocorrelation computed from the image collected by the auxiliary focusing camera change as well.
  • Let z[x] = s[x] + s[x-x0], where s[x] and s[x-x0] are the two ghost images separated by a distance x0.
  • Let R(·) denote the autocorrelation operation, and let 2δ[x] + δ[x-x0] + δ[x+x0] denote a sum of three δ functions.
  • Then R(z[x]) = R(s[x]) * (2δ[x] + δ[x-x0] + δ[x+x0]), i.e., the convolution of R(s[x]) with the three δ functions. This means that R(z[x]) forms three spikes: the middle peak is the highest, and the other two peaks lie on either side of it, each separated from it by x0. Therefore, if the distance between any two of the three spikes can be determined, x0 can be obtained, which is the distance between the two ghost images collected by the camera.
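The three-spike structure of R(z[x]) can be reproduced numerically. The sketch below is an illustration only (the synthetic signal, sizes, and the guard band around the central peak are assumptions, not part of the patent); it recovers x0 from the autocorrelation of a synthetic ghost pair:

```python
import numpy as np

def ghost_separation(z, guard=3):
    """Estimate the separation x0 of two overlapping ghost images from
    the autocorrelation R(z), which peaks at lags 0 and +/- x0."""
    z = z - z.mean()                      # remove the DC component
    r = np.correlate(z, z, mode="full")   # autocorrelation, length 2N-1
    center = len(z) - 1                   # index of lag 0
    r[center - guard: center + guard + 1] = -np.inf   # mask the central spike
    return abs(int(np.argmax(r)) - center)            # strongest side spike

# Synthetic ghost pair: a random texture plus a copy shifted by 25 pixels.
rng = np.random.default_rng(0)
s = rng.standard_normal(512)              # stand-in for the sample texture
z = s + np.roll(s, 25)
print(ghost_separation(z))                # recovers the shift: 25
```

A real implementation would apply the same idea per row (or in 2-D) to the infrared image from the auxiliary focusing camera.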
  • In FIG. 8, (a1), (b1), and (c1) show the images collected by the auxiliary focusing camera in the corresponding field of view when a rat kidney slice is at different defocus amounts, and (a2), (b2), and (c2) are the autocorrelation results corresponding to (a1), (b1), and (c1), respectively.
  • (a1)-(c1) in FIG. 8 respectively show the infrared images containing two ghosts collected by the auxiliary focusing camera in the microscope systems of the different embodiments described above.
  • When the sample is at different defocus amounts, the distance between the two ghost images differs.
  • From this, the relationship curve between the sample defocus amount and the distance between the two ghosts in the image is obtained, and the fitted curve is used as a reference table in the subsequent focusing process to realize autofocus.
  • Fig. 9 is a schematic diagram of the fitted relationship between the defocus amount and the distance between ghost images in this application: the relationship between the sample defocus amount and the ghosting of the image collected by the auxiliary focusing camera is fitted to a curve, as shown in Fig. 9.
  • When the defocus amount is small, the two ghosts nearly coincide and the three autocorrelation peaks lie very close together, so a peak top can be swamped or new, unexpected spurious peaks can appear, which makes it hard to find the peak position that needs to be determined. A bias is therefore introduced here to increase the distance between the three peaks.
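Fitting the calibration curve and then using it as a lookup table can be sketched as follows. The synthetic measurements and the assumed linear form are illustrative only; a real system would fit separations measured at known stage defocus values, possibly with a higher-order polynomial.

```python
import numpy as np

# Synthetic calibration run: ghost separation (pixels) measured at known
# stage defocus values (micrometers), with a little measurement noise.
defocus_um = np.linspace(-20, 20, 9)
rng = np.random.default_rng(1)
separation_px = 20.0 + 0.8 * defocus_um + rng.normal(0.0, 0.2, defocus_um.size)

# Fit separation -> defocus once; a first-order polynomial is assumed here.
lookup = np.poly1d(np.polyfit(separation_px, defocus_um, deg=1))

# During focusing, a measured separation maps straight to a defocus amount.
print(float(lookup(28.0)))   # close to +10 um for this synthetic curve
```

The fitted `lookup` object plays the role of the pre-stored calibration curve: it is computed once and consulted on every focusing step.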
  • In summary: the measurement sample captured by the auxiliary focusing camera in the second optical path of the microscope is obtained; the corresponding image evaluation parameter is calculated from the measurement sample captured by the auxiliary focusing camera and the corresponding image evaluation criterion; according to the image evaluation parameter, the relationship between the image evaluation parameter and the defocus amount is looked up in the pre-stored calibration curve, and the required defocus amount is determined; and according to the determined defocus amount, the focal length of the image light entering the camera assembly is adjusted, so that the camera assembly captures a sharply focused image through the first optical path.
  • In this way, the focusing device can automatically focus the camera assembly of the microscope system, forming and outputting a sharply focused image captured through the first optical path, which saves focusing time for the microscope system, improves focusing accuracy, and enables the user of the microscope system to obtain a higher-resolution image.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Microscopes, Condenser (AREA)

Abstract

The present application provides a microscope system, including: a beam splitter assembly configured to separate and project light in different optical paths; a lens assembly configured to project light generated by an observed sample during observation after the light enters different optical paths via the objective lens; a camera assembly configured to photograph the sample to be observed in the microscope field of view, so that the camera assembly captures a sharply focused image through a first optical path; an auxiliary focusing device configured to determine a focal length matching the camera assembly; and a focusing device configured to adjust the focal length of the image light entering the camera assembly according to the defocus amount of the sample image determined by the auxiliary focusing device. The present application also provides a microscope autofocus method, a medical device, and a storage medium. The present application can automatically focus the microscope system and form and output a sharply focused image captured through the first optical path, saving the focusing time of the microscope system and improving focusing accuracy.

Description

Microscope autofocus method, microscope system, medical device, and storage medium
Cross-reference to related applications
This application is based on, and claims priority to, Chinese patent application No. 202010284514.X filed on April 13, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to medical image processing technology, and in particular to a microscope autofocus method, a microscope system, a medical device, and a storage medium.
Background
With the research and progress of artificial intelligence technology, AI has been studied and applied in many fields. For example, in recent years augmented reality and artificial intelligence have been applied to traditional optical microscope systems: a camera on a conventional optical microscope collects images of the sample to be observed, and machine learning algorithms analyze the images in real time.
The accuracy of the algorithms of such an augmented reality microscope depends on the camera collecting high-quality images. A defocused image of the sample loses a great deal of important optical information, so it is particularly important to ensure that the camera can collect accurately focused images of the sample, reducing the impact of out-of-focus microscope images on the model's output.
Summary
In view of this, embodiments of the present application provide a microscope autofocus method, a microscope system, a medical device, and a storage medium. The technical solutions of the embodiments of the present application are implemented as follows:
An embodiment of the present application provides a microscope system, including:
an objective lens, configured to direct light from the sample to be observed into a first optical path, where it converges at a beam splitter with light generated by an image projection module that enters the first optical path via a lens assembly;
a beam splitter assembly, including at least one beam splitter, configured to separate and project light in different optical paths;
a lens assembly, including at least one lens, configured to project light generated by the observed sample during observation after the light enters different optical paths via the objective lens, so that the light propagates along the different optical paths;
an image projection assembly, arranged in the optical path of the light projected by the lens assembly, configured to perform image enhancement processing on the image of the sample to be observed;
a camera assembly, arranged in the first optical path and including a camera, configured to photograph the sample to be observed in the microscope field of view, to form and output a sharply focused image captured through the first optical path;
an auxiliary focusing device, including an auxiliary focusing light source and an auxiliary focusing camera, arranged in a second optical path, configured to determine a focal length matching the camera assembly;
a focusing device, configured to adjust the focal length of the image light entering the camera assembly according to the defocus amount of the sample image determined by the auxiliary focusing device.
In the above solution, the microscope system further includes:
an eyepiece and a trinocular tube, the eyepiece being sleeved on the trinocular tube and configured for observing the sample through the objective lens;
the trinocular tube is arranged at the end of the beam splitter away from the objective lens; the trinocular tube includes channels and a tube lens; there are at least two channels, located at the end away from the beam splitter, one of which communicates with the eyepiece; the tube lens is located at the end close to the beam splitter.
In the above solution, the focusing device includes a drive assembly and a zoom lens, so that the sample to be observed in the microscope field of view can be photographed at different focal lengths.
In the above solution,
the beam splitter assembly communicates with the objective lens and with the tube lens of the trinocular tube, respectively, and the camera assembly is arranged in one of the channels of the trinocular tube;
the beam splitter assembly includes one beam splitter, and the lens assembly includes one lens arranged between the beam splitter and the image projection assembly;
the focusing device is located between the beam splitter and the camera assembly, and is configured to adjust the focal length of the image light entering the camera assembly according to the defocus amount of the sample image determined by the auxiliary focusing device.
In the above solution,
the image projection assembly further includes a first polarizer, located between the lens assembly and the beam splitter, configured to polarize the corresponding light in the first optical path;
the camera assembly further includes a second polarizer, located between the focusing device and the beam splitter, configured to polarize the corresponding light collected by the camera assembly.
In the above solution,
the auxiliary focusing light source is arranged in the Fourier back focal plane corresponding to the condenser assembly of the microscope system, and is configured to emit auxiliary focusing light to form the second optical path;
the beam splitter assembly includes a beam splitter arranged between the focusing device and the camera assembly, configured to reflect the light in the second optical path to the auxiliary focusing camera;
the auxiliary focusing camera is arranged at an axially offset position of the conjugate plane of the camera assembly, and is configured to capture, based on the light in the second optical path, overlapping images matching the sample to be observed in the microscope field of view.
In the above solution,
the auxiliary focusing light source is arranged in the Fourier back focal plane corresponding to the condenser assembly of the microscope system, and is configured to emit auxiliary focusing light to form the second optical path;
the auxiliary focusing camera and the image projection assembly are arranged opposite each other across the beam splitter assembly, and the auxiliary focusing camera is configured to capture, based on the light in the second optical path, overlapping images matching the sample to be observed in the microscope field of view.
In the above solution, the image projection assembly and the camera assembly operate using a time-division multiplexing mechanism.
In the above solution, the microscope system further includes:
at least one output interface device, coupled to the data processing unit of the microscope system, to output the sharply focused image captured through the first optical path and the image of the sample to be observed after image enhancement processing.
In the above solution, the objective lens includes at least one of the following:
an achromatic objective, a plan achromatic objective, a plan semi-apochromatic objective, or a plan apochromatic objective;
the beam splitter includes at least one of the following:
a cube beam splitter, a plate beam splitter, or a thin-film beam splitter.
An embodiment of the present application also provides a microscope autofocus method, including:
obtaining a measurement sample captured by an auxiliary focusing camera in a second optical path of the microscope;
calculating a corresponding image evaluation parameter from the measurement sample captured by the auxiliary focusing camera and a corresponding image evaluation criterion;
according to the image evaluation parameter, looking up the relationship between the image evaluation parameter and the defocus amount in a pre-stored calibration curve, and thereby determining the required defocus amount;
according to the determined defocus amount, adjusting the focal length of the image light entering the camera assembly, so that the camera assembly captures a sharply focused image through the first optical path.
In the above solution, obtaining the measurement sample captured by the auxiliary focusing camera in the second optical path of the microscope includes:
collecting the light in the second optical path through the auxiliary focusing camera;
processing the collected light in the second optical path based on the type of the auxiliary focusing camera, so as to capture overlapping images matching the sample to be observed in the microscope field of view.
In the above solution, the method further includes:
based on the result of the focal length adjustment, photographing the sample to be observed in the microscope field of view with the light in the first optical path, to form and output a sharply focused image captured through the first optical path.
An embodiment of the present application also provides a medical device, including:
a microscope system, a memory, and a processor, the microscope system being the microscope system provided in the foregoing embodiments, where:
the memory is configured to store executable instructions;
the processor is configured to implement the aforementioned microscope autofocus method when running the executable instructions stored in the memory.
An embodiment of the present application also provides a computer-readable storage medium storing executable instructions which, when executed by a processor, implement the aforementioned microscope autofocus method.
The embodiments of the present application have the following beneficial effects:
In the embodiments of the present application, the objective lens is configured to direct light from the sample to be observed into the first optical path, where it converges at the beam splitter with light generated by the image projection module that enters the first optical path via the lens assembly; the beam splitter assembly, including at least one beam splitter, is configured to separate and project light in different optical paths; the lens assembly, including at least one lens, is configured to project light generated by the observed sample during observation after the light enters different optical paths via the objective lens, so that the light propagates along the different optical paths; the image projection assembly, arranged in the optical path of the light projected by the lens assembly, is configured to perform image enhancement processing on the image of the sample to be observed; the camera assembly, arranged in the first optical path and including a camera, is configured to photograph the sample to be observed in the microscope field of view, to form and output a sharply focused image captured through the first optical path; the auxiliary focusing device, including an auxiliary focusing light source and an auxiliary focusing camera and arranged in the second optical path, is configured to determine a focal length matching the camera assembly; and the focusing device is configured to adjust the focal length of the image light entering the camera assembly according to the defocus amount of the sample image determined by the auxiliary focusing device. In this way, the focusing device can automatically focus the camera assembly of the microscope system, forming and outputting a sharply focused image captured through the first optical path, which saves focusing time for the microscope system and improves focusing accuracy.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of the present application or the related art more clearly, the drawings needed in the description of the embodiments or the related art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic diagram of the use environment of the microscope autofocus method provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of the composition of the medical device provided by an embodiment of the present application;
FIG. 3 is an optional structure of the microscope system in an embodiment of the present application;
FIG. 4 is an optional process schematic diagram of the microscope autofocus method provided by an embodiment of the present application;
FIG. 5 is an optional structural schematic diagram of the microscope system provided by an embodiment of the present application;
FIG. 6 is an optional structural schematic diagram of the microscope system provided by an embodiment of the present application;
FIG. 7 is an optional structural schematic diagram of the microscope system provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of the relationship between the defocus amount and the distance between ghost images in an embodiment of the present application;
FIG. 9 is a schematic diagram of the fitted relationship between the defocus amount and the distance between ghost images in the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the drawings. The described embodiments should not be regarded as limiting the present application; all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the scope of protection of the present application.
In the following description, "some embodiments" describes subsets of all possible embodiments; it should be understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and they may be combined with one another without conflict.
Before the embodiments of the present application are described in further detail, the nouns and terms involved in the embodiments of the present application are explained; they apply to the following interpretations.
1) "In response to" indicates the condition or state on which an executed operation depends; when the condition or state is satisfied, the one or more executed operations may be performed in real time or with a set delay. Unless otherwise specified, there is no restriction on the order in which multiple operations are executed.
2) Terminal: includes, but is not limited to, ordinary terminals and dedicated terminals, where an ordinary terminal maintains a long and/or short connection with a transmission channel, and a dedicated terminal maintains a long connection with the transmission channel.
3) Client: a carrier of specific functions in a terminal; for example, a mobile client (APP) is a carrier of specific functions in a mobile terminal, such as a payment function or a function for purchasing financial products.
4) Lens assembly: a device combining at least one lens, which may be provided with a lens barrel, configured to produce a magnified optical image of an observed object such as a cell.
5) Field of view: the range that can be observed when viewing a magnified image of cells in a smear through the lens assembly.
6) Computer Aided Diagnosis (CAD): CAD uses imaging, medical image processing technology, and other possible physiological and biochemical means, combined with computer analysis and calculation, to assist in finding lesions and to improve diagnostic accuracy.
The microscope autofocus method provided by the present application is described below, taking observation of a lesion cell slice through a microscope as an example. Referring to FIG. 1, FIG. 1 is a schematic diagram of a usage scenario of the microscope autofocus method provided by an embodiment of the present application. Terminals (including terminal 10-1 and terminal 10-2) are provided with corresponding clients capable of performing different functions, through which the terminals obtain different slice images from the corresponding server 200 over the network 300 for browsing. The terminals connect to the server 200 through the network 300, which may be a wide area network, a local area network, or a combination of the two, using wireless links for data transmission. The types of slice images that the terminals obtain from the server 200 over the network 300 may be the same or different: for example, the terminals may obtain, over the network 300, pathological images or pathological videos matching a target object from the corresponding server 200, or obtain only pathological slices matching the current target for browsing. The server 200 may store slice images corresponding to different target objects, and may also store auxiliary analysis information matching the slice images of the target objects.
The neural network model in the field of artificial intelligence deployed on the server can use a camera on a traditional optical microscope to collect images of the sample to be observed and analyze the real-time images with machine learning algorithms. Artificial Intelligence (AI) is a theory, method, technology, and application system that uses digital computers, or machines controlled by digital computers, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain the best results.
Specifically, artificial intelligence is a comprehensive technology of computer science that attempts to understand the essence of intelligence and to produce a new kind of intelligent machine that can react in a manner similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of various intelligent machines, so that machines have the functions of perception, reasoning, and decision-making. Artificial intelligence software technology mainly includes computer vision, speech processing, natural language processing, and machine learning/deep learning.
It should be noted that the patient lesions viewed under the microscope system (the medical device in contact with the pathological cell slice of the target object) can involve a variety of application scenarios, such as lung cancer cell screening, early cervical cancer screening, and other cell slice screenings. The microscope system image processing method of this embodiment can be deployed in a variety of application scenarios, facilitating remote review and use by doctors.
The server 200 sends the pathological information of the same target object to the terminals (terminal 10-1 and/or terminal 10-2) through the network 300, so that the users of the terminals can analyze the pathological information of the target object. As an example, the server 200 deploys a corresponding neural network model configured to analyze the sharp image information output by the microscope system, where the microscope system can acquire images in the following way: obtaining the measurement sample captured by the auxiliary focusing camera in the second optical path of the microscope; calculating the corresponding image evaluation parameter from the measurement sample captured by the auxiliary focusing camera and the corresponding image evaluation criterion; according to the image evaluation parameter, looking up the relationship between the image evaluation parameter and the defocus amount in the pre-stored calibration curve, and thereby determining the required defocus amount; and, according to the determined defocus amount, adjusting the focal length of the image light entering the camera assembly, so that the camera assembly captures a sharply focused image through the first optical path.
Based on the result of the focal length adjustment, the sample to be observed in the microscope field of view is photographed, forming and outputting a sharply focused image captured through the first optical path.
下面对本申请实施例的医疗设备的结构做详细说明,医疗设备可以各种形式来实施,如带有显微镜系统图像处理功能的专用终端,也可以为带有显微镜系统图像处理功能的医疗设备或者云服务器,例如前述图1中的服务器200。图2为本申请实施例提供的医疗设备的组成结构示意图,可以理解,图2仅仅示出了医疗设备的示例性结构而非全部结构,根据需要可 以实施图2示出的部分结构或全部结构。
本申请实施例提供的医疗设备包括:至少一个处理器201、存储器202、用户接口203和至少一个网络接口204。医疗设备20中的各个组件通过总线系统205耦合在一起。可以理解,总线系统205配置为实现这些组件之间的连接通信。总线系统205除包括数据总线之外,还包括电源总线、控制总线和状态信号总线。但是为了清楚说明起见,在图2中将各种总线都标为总线系统205。
其中,用户接口203可以包括显示器、键盘、鼠标、轨迹球、点击轮、按键、按钮、触感板或者触摸屏等。
可以理解,存储器202可以是易失性存储器或非易失性存储器,也可包括易失性和非易失性存储器两者。本申请实施例中的存储器202能够存储数据以支持终端(如10-1)的操作。这些数据的示例包括:配置为在终端(如10-1)上操作的任何计算机程序,如操作系统和应用程序。其中,操作系统包含各种系统程序,例如框架层、核心库层、驱动层等,配置为实现各种基础业务以及处理基于硬件的任务。应用程序可以包含各种应用程序。
在一些实施例中,本申请实施例提供的显微镜系统可以采用软硬件结合的方式实现,作为示例,本申请实施例提供的显微镜系统可以是采用硬件译码处理器形式的处理器,其被编程以执行本申请实施例提供的显微镜系统图像处理方法。例如,硬件译码处理器形式的处理器可以采用一个或多个应用专用集成电路(ASIC,Application Specific Integrated Circuit)、DSP、可编程逻辑器件(PLD,Programmable Logic Device)、复杂可编程逻辑器件(CPLD,Complex Programmable Logic Device)、现场可编程门阵列(FPGA,Field-Programmable Gate Array)或其他电子元件。
作为本申请实施例提供的显微镜系统采用软硬件结合实施的示例,本 申请实施例所提供的显微镜系统可以直接体现为由处理器201执行的软件模块组合,软件模块可以位于存储介质中,存储介质位于存储器202,处理器201读取存储器202中软件模块包括的可执行指令,结合必要的硬件(例如,包括处理器201以及连接到总线205的其他组件)完成本申请实施例提供的显微镜系统图像处理方法。
作为示例,处理器201可以是一种集成电路芯片,具有信号的处理能力,例如通用处理器、数字信号处理器(DSP,Digital Signal Processor),或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等,其中,通用处理器可以是微处理器或者任何常规的处理器等。
作为本申请实施例提供的显微镜系统采用硬件实施的示例,本申请实施例所提供的装置可以直接采用硬件译码处理器形式的处理器201来执行完成,例如,被一个或多个应用专用集成电路(ASIC,Application Specific Integrated Circuit)、DSP、可编程逻辑器件(PLD,Programmable Logic Device)、复杂可编程逻辑器件(CPLD,Complex Programmable Logic Device)、现场可编程门阵列(FPGA,Field-Programmable Gate Array)或其他电子元件执行实现本申请实施例提供的显微镜系统图像处理方法。
本申请实施例中的存储器202配置为存储各种类型的数据以支持医疗设备20的操作。这些数据的示例包括：配置为在医疗设备20上操作的任何可执行指令，实现本申请实施例的显微镜系统图像处理方法的程序可以包含在可执行指令中。
在另一些实施例中，本申请实施例提供的显微镜系统可以采用软件方式实现。图2示出了存储在存储器202中的显微镜系统2020，其可以是程序和插件等形式的软件，并包括一系列的模块。作为存储器202中存储的程序的示例，显微镜系统2020中包括以下的软件模块：
信息处理模块2081,配置为获取显微镜第二光路中的辅助对焦相机所拍摄的测量样本;根据所述辅助对焦的相机所拍摄的测量样本以及相应的图像评价标准,计算对应的图像评价参数;根据所述图像评价参数,在预先存储的校准曲线中查找图像评价参数和离焦量的关系,进而确定所需要的离焦量;根据所确定的离焦量,调整进入照相机组件的图像光线的焦距,以实现照相机组件通过第一光路拍摄清晰聚焦的图像。
在介绍本申请所提供的显微镜自动对焦方法之前，首先对相关技术中的显微镜对焦过程进行说明。参考图3，图3为本申请的相关技术中显微镜系统的一种可选的结构。相关实施例中提供了一种显微镜300，该显微镜300具有显微镜机身301、显微镜机身载物台调焦旋钮302、显微镜机身载物台303、显微镜待观察样品304、机身物镜305、三目镜筒306、相机307以及目镜308。其中，显微镜机身301上方设置有显微镜机身载物台303，显微镜机身载物台303上放置有待观察样品304，所述显微镜机身301两侧设有显微镜机身载物台调焦旋钮302，所述机身物镜305位于显微镜机身载物台303的上方，在机身物镜305的上方还设有三目镜筒306，三目镜筒306分别与相机307和目镜308连接。调节所述显微镜机身载物台调焦旋钮302可以使显微镜机身载物台303在垂直方向上升或者下降，从而改变显微镜机身载物台303与机身物镜305之间的间距以实现调焦。当然，也可以使机身物镜305移动，从而改变显微镜机身载物台303与机身物镜305之间的间距以实现调焦。
其中，上述显微镜300调焦的前提是假设目镜308端与显微镜300三目镜筒306的相机307端是齐焦的。然而，由于调焦技术的限制，相机307的图像与目镜308端的图像往往是不齐焦的。例如：不同倍数物镜的齐焦没有调节好；不同显微镜300使用者的眼睛屈光度不同，且更换显微镜300使用者时，新的使用者没有调节目镜308屈光度旋钮的意识，而直接去调节载物台来使样品重新聚焦。这些原因都可能导致显微镜300的目镜308端与相机307端图像不齐焦，以至于人眼看到清晰图像时，相机307采集到的是离焦的图像，从而不能保证图像算法分析结果的正确性。而在显微镜图像自动分析领域中，相机能采集到高质量的图像是增强现实显微镜的算法准确性的保障。样品离焦的图像会丢失很多重要的光学信息，即使计算能力较强的后期算法也无法弥补。所以，保证相机能够采集到样品准确聚焦的图像尤为重要。
为了克服上述缺陷,参考图4,图4是本申请实施例提供的显微镜自动对焦方法的一个可选的过程示意图,本申请所提供的显微镜自动对焦方法包括以下步骤:
步骤401:获取显微镜第二光路中的辅助对焦的相机所拍摄的测量样本;
步骤402:根据所述辅助对焦的相机所拍摄的测量样本以及相应的图像评价标准,计算对应的图像评价参数。
其中,图像评价标准可以为辅助对焦相机所采集的图像重影的相对像素的偏移量。
步骤403:根据所述图像评价参数,在预先存储的校准曲线中查找图像评价参数和离焦量的关系,进而确定所需要的离焦量;
其中,校准曲线可以预先保存在相应的存储介质中,便于通过调用该校准曲线,可以实现增强现实显微镜的自动对焦功能,其中,该预先存储的校准曲线是根据预先采集的不同离焦量对应的图像与相应的图像评价标准(例如辅助对焦相机所采集的图像重影的相对像素的偏移量)所确定的一条曲线,所以通过查找该校准曲线可以确定离焦量和不同离焦程度图像评价参数之间的关系。
步骤404:根据所确定的离焦量,调整进入所述照相机组件的图像光线 的焦距,以实现照相机组件通过第一光路拍摄清晰聚焦的图像。
其中，通过本实施例所提供的自动对焦方法，根据所确定的离焦量调整进入照相机组件的图像光线的焦距时，无论目镜中所呈现的待观测图像是否清晰，都可以实现照相机组件通过第一光路拍摄清晰聚焦的图像，避免现有技术中手动调焦速度慢、精度差的缺陷。
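上述步骤401~404的流程可以用如下Python草图加以示意（仅为若干假设下的最简示意：其中AuxCamera、FocusDevice等接口、重影间距与离焦量的线性关系以及60微米偏置，均为演示而假设的内容，并非本申请实施例的实际实现）：

```python
class AuxCamera:
    """第二光路中的辅助对焦相机（假设的接口，仅作示意）。"""
    def __init__(self, true_defocus_um):
        self.true_defocus_um = true_defocus_um

    def capture_ghost_spacing(self):
        # 步骤401/402：拍摄测量样本并计算图像评价参数（重影间距）。
        # 这里假设重影间距与离焦量近似线性，并含 60 微米偏置（演示用假设值）
        return 0.8 * (self.true_defocus_um + 60.0)

class FocusDevice:
    """进入照相机组件光路中的对焦装置（假设的接口，仅作示意）。"""
    def __init__(self):
        self.compensation_um = 0.0

    def adjust(self, defocus_um):
        # 步骤404：按离焦量反向补偿，调整进入照相机组件的图像光线的焦距
        self.compensation_um = -defocus_um

def lookup_defocus(spacing_px):
    """步骤403：校准曲线的最简线性近似，由重影间距查找离焦量（微米）。"""
    return spacing_px / 0.8 - 60.0

def autofocus_step(camera, device):
    """按步骤401~404执行一次自动对焦，返回所确定的离焦量。"""
    spacing = camera.capture_ghost_spacing()
    defocus = lookup_defocus(spacing)
    device.adjust(defocus)
    return defocus

camera = AuxCamera(true_defocus_um=12.5)
device = FocusDevice()
print(autofocus_step(camera, device))   # 12.5
```

上述草图中，校准曲线被简化为一条直线；实际系统中该曲线由预先采集的标定数据拟合得到，并存储于存储介质中供查表使用。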
下面结合不同形态的显微镜系统，对本申请所提供的显微镜自动对焦方法进行说明。不同形态的显微镜系统，其结构与图3所示的显微镜并不相同。例如，增强现实显微镜（ARM，Augmented Reality Microscope）在便捷且准确地观察显微镜下的待观测样本时，能够同时获取到其他增强信息，以帮助观察者快速定位和量化感兴趣的特征。仅以应用于医疗诊断场景为例，医生使用增强现实显微镜观察切片时，能够同时获取到基于该切片的诊断结果，即增强现实显微镜能够将诊断结果作为增强现实信息叠加在切片上，方便医生在视场中实时读取结论；同时，服务器中运行的神经网络模型能够对病患区域做出辅助诊疗判断，以帮助医生对病灶的病理信息做出正确的判断。
下面继续结合不同显微镜系统的形态对本申请所提供的显微镜系统的结构进行说明。参考图5，图5是本申请实施例提供的显微镜系统的一个可选的结构示意图。该显微镜系统1100具体包括物镜115、分束器1112、图像投影组件1111、照相机组件117以及三目镜筒116。所述物镜115具有相对设置的第一端10a和第二端10b，所述第一端10a朝向待观察样品，所述分束器1112设置在所述第二端10b，所述分束器1112分别与所述物镜115和所述三目镜筒的管镜1118连通，所述照相机组件117设置于所述三目镜筒的其中一个通道中。其中，所述图像投影组件1111通过透镜1115中所传输的光线进行相应视野中的图像投影；所述照相机组件117接收管镜1118传输的光线，所述照相机组件117包括照相机和相应的图像输出装置，配置为将所拍摄的相应视野中的图像传输至服务器中，以对图像进行处理或者识别；所述三目镜筒116设置在所述分束器1112远离所述物镜115的一端，所述三目镜筒116包括通道和管镜1118，所述通道至少包括两个且位于远离所述分束器1112的一端，所述管镜1118位于靠近所述分束器1112的一端，所述照相机组件117通过所述管镜1118接收所述分束器1112输出的光线，以完成对应的视野中图像的采集。
这一过程中,由于所述照相机组件117与所述图像投影组件1111设置在不同的位置,为了避免光线的传播过程中的影响,因此,所述图像投影组件1111中还包括第一偏振片1116,所述第一偏振片1116位于所述管镜1118与所述分束器1112之间,配置为对所述第一光路中的相应光线进行偏振处理;所述照相机组件还包括第二偏振片1117,所述第二偏振片1117位于所述管镜1118与照相机组件117之间,配置为对所述照相机组件117所采集的相应光线进行偏振处理。
进一步地，显微镜系统1100的光路为：物镜115的光线传输到分束器1112，分束器1112将一部分光线反射到管镜1118，并通过所述第一偏振片1116传输到照相机组件117的感光芯片上；同时，分束器1112将另一部分光线传输到管镜1118，光线穿过管镜1118到达三目镜筒116，三目镜筒116将光传输到目镜118，可以通过目镜118观察待观察样品114的图像。同时，图像投影组件1111产生的光线沿着透镜1115传输，经过第二偏振片1117的偏振处理，并在分束器1112以及第一偏振片1116的作用下不能到达照相机组件117中，也就不会影响照相机组件117的拍摄。
但是这一过程中，不同屈光度的显微镜观测人员交换使用显微镜系统时，需要每次都重复一遍繁琐的三目齐焦调节。相机也不能够自主完成自动对焦，无法采集清晰的图像。同时，显微镜使用者通过目镜看到清晰图像时，相机采集到的是离焦的图像，从而不能保证服务器中所执行的图像算法分析结果的正确性。
为解决上述问题，进一步地，参考图6，图6是本申请实施例提供的显微镜系统的一个可选的结构示意图。该显微镜系统600具体包括物镜115、分束器1112、图像投影组件1111、照相机组件117以及三目镜筒116。所述物镜115具有相对设置的第一端10a和第二端10b，所述第一端10a朝向待观察样品，所述分束器1112设置在所述第二端10b，所述分束器1112分别与所述物镜115和所述三目镜筒的管镜1118连通，所述照相机组件117设置于所述三目镜筒的其中一个通道中。其中，所述图像投影组件1111通过透镜1115中所传输的光线进行相应视野中的图像投影；所述照相机组件117接收管镜1118传输的光线，所述照相机组件117包括照相机和相应的图像输出装置，配置为将所拍摄的相应视野中的图像传输至服务器中，以对图像进行处理或者识别；所述三目镜筒116设置在所述分束器1112远离所述物镜115的一端，所述三目镜筒116包括通道和管镜1118，所述通道至少包括两个且位于远离所述分束器1112的一端，所述管镜1118位于靠近所述分束器1112的一端，所述照相机组件117通过所述管镜1118接收所述分束器1112输出的光线，以完成对应的视野中图像的采集。由于所述照相机组件117与所述图像投影组件1111设置在不同的位置，为了避免光线在传播过程中的相互影响，所述图像投影组件1111中还包括第二偏振片1117，所述第二偏振片1117位于所述透镜1115与所述分束器1112之间，配置为对所述第一光路中的相应光线进行偏振处理；所述照相机组件还包括第一偏振片1116，所述第一偏振片1116位于所述管镜1118与照相机组件117之间，配置为对所述照相机组件117所采集的相应光线进行偏振处理。具体来说，显微镜操作人员的肉眼希望通过目镜118既能够观察待观察样品114的图像，又能够观察图像投影组件1111中所输出的图像；同时，照相机组件117仅希望能够拍摄待观察样品114的图像，并忽略图像投影组件1111中所输出的图像。通过第二偏振片1117可以将图像投影组件1111所输出的光线变为偏振光线，其可以直接通过目镜118到达人眼进行观察；但是由于第一偏振片1116和第二偏振片1117的偏振方向相互垂直，第一偏振片1116可以把第二偏振片1117所输出的偏振光线消除，使得照相机组件117仅可以拍摄待观察样品114的图像。
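上述正交偏振片对投影光线的消光作用，可以用马吕斯定律（Malus's law）简要说明（此处仅为补充性的光学原理说明，并非本申请实施例原文的内容）：

```latex
I = I_0 \cos^{2}\theta , \qquad \theta = 90^{\circ} \;\Rightarrow\; I = I_0 \cos^{2} 90^{\circ} = 0
```

其中 $I_0$ 为入射偏振光强度，$\theta$ 为入射光偏振方向与偏振片透光方向的夹角。当两偏振片偏振方向相互垂直（$\theta = 90^{\circ}$）时，透射光强理论上为零，故图像投影组件输出的偏振光线无法到达照相机组件。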
在本申请的一些实施例中,所述物镜包括至少以下之一:
消色差物镜、平场消色差物镜、平场半复消色差物镜,或者平场复消色差物镜,所述分束器包括至少以下之一:方体分束器、平板分束器或薄膜分束器。具体来说,考虑到在进行物体观测时,可能存在不同放大倍数的需求,例如针对同一观测物如细胞的轮廓和内核需要采用不同放大倍数,或者针对不同大小的观测物需要采用不同放大倍数,还可以提供具有不同放大倍数的物镜组合以供用户选择。例如,可以提供放大倍数为4.0X、10.0X、20.0X、60.0X和100.0X的物镜组合,供用户选择。同时方体分束器、平板分束器或薄膜分束器可以根据增强现实显微镜的种类进行选择适配,以适配不同的使用环境。
同时，为了实现自动对焦，显微镜系统1100还包括：辅助对焦光源1140，设置在聚光镜组件1141对应的傅里叶后焦面中，配置为发出辅助对焦光线，其中，辅助对焦光源1140可以为两个相同的红外LED光线发射器；辅助对焦相机1143，设置于照相机组件117的共轭平面的轴向偏置位置。其中，辅助对焦光源1140所产生的光线所形成的第二光路如图中1144所示。
进一步地，显微镜系统1100的光路包括：第一光路和第二光路。其中，第一光路配置为：被观测样品在观测时产生的光线经由所述物镜进入光路后，对所述光线进行投射，使得照相机组件对所述显微镜视野中的待观测样品进行拍照，以形成并输出通过第一光路拍摄的清晰聚焦的图像；同时还能实现图像投影组件利用第二光路中的光线对待观测样品的图像进行图像增强处理。
其中，第一光路包括：物镜115的光线传输到分束器1112，分束器1112将一部分光线反射到管镜1118，并通过第一偏振片1116传输到照相机组件117的感光芯片上；同时，分束器1112将另一部分光线传输到管镜1118，光线穿过管镜1118到达三目镜筒116，三目镜筒116将光传输到目镜118，可以通过目镜118观察待观察样品114的图像。同时，图像投影组件1111产生的光线沿着透镜1115传输，经过第二偏振片1117的偏振处理，并在分束器1112以及第一偏振片1116的作用下不能到达照相机组件117中，使得照相机组件117仅可以拍摄待观察样品114的图像，也就不会影响照相机组件117的拍摄，其中，第一偏振片1116和第二偏振片1117的偏振方向相互垂直。在照相机组件117对相应视野中的图像进行采集之前，首先需要对照相机组件117的焦距进行调整，具体可以通过第二光路中的图像确定相应的失焦量参数。
其中，第二光路包括：傅里叶后焦面中的辅助对焦光源1140的光线经过物镜115到达分束器1112，分束器1112将光线传输至管镜1118，作为辅助对焦光源的红外LED所发射的红外光线经过管镜1118传输至辅助对焦相机1143，并在辅助对焦相机1143处成像（部分重叠的影像）。
进一步地，在本申请的一些实施例中，对焦装置1121位于所述分束器1112与所述照相机组件117之间，配置为基于所述交叠影像的离焦量所确定的焦距驱动所述第一透镜进行焦距调整，形成新的焦距。具体来说，对焦装置1121可以是电动马达（如超声驱动马达）或其他机械马达，用来驱动相应的镜片组；也可以是液态变焦镜头，独立于镜片组进行液态变焦，以适配不同的使用环境。
在本申请的一些实施例中，对焦装置1121位于所述分束器1112与所述照相机组件117之间，配置为通过可变焦镜头基于所述交叠影像的离焦量所确定的焦距进行焦距调整，形成所述新的焦距；所述照相机组件117，配置为基于所述新的焦距对所述显微镜视野中的待观测样品进行拍照，形成并输出通过第一光路拍摄清晰聚焦的图像。
在本申请的一些实施例中，考虑到相机接口可能不统一，为了兼容多种相机，或是为了扩大或缩小视野，还可以将相机与相机适配器配合使用：基于感光芯片的相机通过相机适配器接入所述三目观察筒顶端的相机接口，从而实现相机与三目观察筒的连接。其中，相机适配器中还可以内嵌偏振片，该内嵌偏振片可以滤除偏振态与其透光方向相垂直的光线，避免干扰成像。
参考图7，图7是本申请实施例提供的显微镜系统的一个可选的结构示意图。该显微镜系统700具体包括物镜115、分束器1112、图像投影组件1111、照相机组件117以及三目镜筒116。所述物镜115具有相对设置的第一端10a和第二端10b，所述第一端10a朝向待观察样品，所述分束器1112设置在所述第二端10b，所述分束器1112分别与所述物镜115和所述三目镜筒的管镜1118连通，所述照相机组件117设置于所述三目镜筒的其中一个通道中。其中，所述图像投影组件1111通过透镜1115中所传输的光线进行相应视野中的图像投影；所述照相机组件117接收管镜1118传输的光线，所述照相机组件117包括照相机和相应的图像输出装置，配置为将所拍摄的相应视野中的图像传输至服务器中，以对图像进行处理或者识别；所述三目镜筒116设置在所述分束器1112远离所述物镜115的一端，所述三目镜筒116包括通道和管镜1118，所述通道至少包括两个且位于远离所述分束器1112的一端，所述管镜1118位于靠近所述分束器1112的一端，所述照相机组件117通过所述管镜1118接收所述分束器1112输出的光线，以完成对应的视野中图像的采集。由于所述照相机组件117与所述图像投影组件1111设置在不同的位置，为了避免光线在传播过程中的相互影响，所述图像投影组件1111中还包括第二偏振片1117，所述第二偏振片1117位于所述透镜1115与所述分束器1112之间，配置为对所述第一光路中的相应光线进行偏振处理；所述照相机组件还包括第一偏振片1116，所述第一偏振片1116位于所述管镜1118与照相机组件117之间，配置为对所述照相机组件117所采集的相应光线进行偏振处理。具体来说，显微镜操作人员的肉眼希望通过目镜118既能够观察待观察样品114的图像，又能够观察图像投影组件1111中所输出的图像；同时，照相机组件117仅希望能够拍摄待观察样品114的图像，并忽略图像投影组件1111中所输出的图像。通过第二偏振片1117可以将图像投影组件1111所输出的光线变为偏振光线，其可以直接通过目镜118到达人眼进行观察；但是由于第一偏振片1116和第二偏振片1117的偏振方向相互垂直，第一偏振片1116可以把第二偏振片1117所输出的偏振光线消除，使得照相机组件117仅可以拍摄待观察样品114的图像。
同时，为了实现自动对焦，显微镜系统700还包括：辅助对焦光源1140，设置在聚光镜组件1141对应的傅里叶后焦面中，配置为发出辅助对焦光线，其中，辅助对焦光源1140可以为两个相同的红外LED光线发射器；辅助对焦相机1143，设置于图像投影组件1111的水平对称位置，并通过透镜1119接收分束器1112所折射的第二光路中的光线。其中，辅助对焦光源1140所产生的光线所形成的第二光路如图中1144所示。
进一步地，显微镜系统700的光路包括：第一光路和第二光路。其中第一光路包括：物镜115的光线传输到分束器1112，分束器1112将一部分光线反射到管镜1118，并通过第一偏振片1116传输到照相机组件117的感光芯片上；同时，分束器1112将另一部分光线传输到管镜1118，光线穿过管镜1118到达三目镜筒116，三目镜筒116将光传输到目镜118，可以通过目镜118观察待观察样品114的图像。同时，图像投影组件1111产生的光线沿着透镜1115传输，经过第二偏振片1117的偏振处理，并在分束器1112以及第一偏振片1116的作用下不能到达照相机组件117中（其中，第一偏振片1116和第二偏振片1117的偏振方向相互垂直），使得照相机组件117仅可以拍摄待观察样品114的图像，也就不会出现图像投影组件1111输出的光线影响照相机组件117拍摄的情况；同时，显微镜操作人员的肉眼通过目镜118既能够观察待观察样品114的图像，又能够观察图像投影组件1111中所输出的图像。在照相机组件117对相应视野中的图像进行采集之前，首先需要对照相机组件117的焦距进行调整，具体可以通过第二光路中的图像确定相应的失焦量参数。
其中，第二光路包括：傅里叶后焦面中的辅助对焦光源1140的光线经过物镜115到达分束器1112，分束器1112将光线传输至透镜1119，作为辅助对焦光源的红外LED所发射的红外光线经过透镜1119传输至辅助对焦相机1143，并在辅助对焦相机1143处成像（部分重叠的影像）。
进一步地,在本申请的一些实施例中,对焦装置1121位于所述分束器1112与所述照相机组件117之间,配置为基于所述交叠影像的离焦量所确定的焦距驱动所述第一透镜进行焦距调整,形成新的焦距。
在本申请的一些实施例中,对焦装置1121位于所述分束器1112与所述照相机组件117之间,配置为通过可变焦镜头基于所述交叠影像的离焦量所确定的焦距进行焦距调整,形成所述新的焦距;所述照相机组件117,配置为基于所述新的焦距对所述显微镜视野中的待观测样品进行拍照形成并输出通过第一光路拍摄清晰聚焦的图像。
下面继续结合图4所示的方法以及图5至图7所示的不同显微镜系统形态，对本申请所提供的显微镜自动对焦方法进行说明。参考图8，图8为本申请实施例中离焦量与重影间距离关系示意图。其中，作为辅助对焦光源的红外LED所发射的红外光线传输至辅助对焦相机1143后，在辅助对焦相机1143处成像（部分重叠的影像）。辅助对焦相机可以是拆掉红外滤波片的普通工业相机，也可以是专门的红外相机，二者均能够捕捉红外光。照相组件中的相机通常是彩色相机，而彩色相机前通常有一块红外截止滤波片，故红外LED40产生的红外光并不会被相机捕捉到。当然，也可以在相机之前追加额外的红外截止滤波片来达到更好的滤除红外光的效果。
其中,不同离焦量的情况下,对辅助对焦相机采集到的图像进行自相关运算后获得的自相关结果的峰值位置将发生变化。
这一现象可以通过以下理论推导加以说明：
假设辅助对焦相机采集到的图像为z[x]=s[x]+s[x-x0],其中s[x]和s[x-x0]为间隔距离为x0的两个重影。z[x]也可以表示为这种形式:z[x]=s[x]*h[x]。其中‘*’代表卷积符号,h[x]=δ[x]+δ[x-x0]。
通过对z[x]做自相关运算，得到R(z[x])=R(s[x])*R(h[x])=R(s[x])*(2δ[x]+δ[x-x0]+δ[x+x0])，其中'R()'代表自相关运算符号，2δ[x]+δ[x-x0]+δ[x+x0]代表三个δ函数，R(s[x])*(2δ[x]+δ[x-x0]+δ[x+x0])表示R(s[x])和这三个δ函数的卷积。这也就意味着R(z[x])的结果中会形成三个尖峰：最高的峰位于中间，另外两个尖峰位于该最高峰两边，并分别与此峰相距x0。因此，只要能够确定经过运算所形成的这三个尖峰中任意两个之间的距离，就能得到x0，也就是相机采集到的两个重影之间的距离。
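上述推导可以用如下数值示例简要验证（仅为示意性草图：信号s[x]用随机序列模拟，x0=7等参数均为演示用假设值，并非实际测量数据）：

```python
import numpy as np

def ghost_offset(z):
    """对含重影的一维信号做自相关，返回旁瓣峰到中心峰的距离（即重影间距x0）。"""
    z = z - z.mean()                         # 去除直流分量，突出相关峰
    r = np.correlate(z, z, mode="full")      # R(z[x])，长度为 2N-1
    center = len(r) // 2                     # 中心峰（零位移）所在位置
    side = r.copy()
    side[center - 2:center + 3] = r.min()    # 屏蔽中心峰附近，只保留两侧旁瓣
    return abs(int(side.argmax()) - center)  # 旁瓣峰到中心峰的距离即为 x0

# 模拟：s[x] 为随机样本信号，重影间距 x0 = 7 像素
rng = np.random.default_rng(0)
s = rng.random(256)
x0 = 7
z = s + np.roll(s, x0)                       # z[x] = s[x] + s[x - x0]（循环移位近似）
print(ghost_offset(z))                       # 7
```

可以看到，自相关结果在零位移处形成最高峰，在±x0处形成两个旁瓣峰，由任意两峰的间距即可恢复重影间距。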
继续以通过显微镜对老鼠肾脏切片进行观察为例，对本申请所提供的显微镜自动对焦方法进行说明。其中，图8展示的是两个红外LED40在辅助对焦相机的表面形成的样本重影（图中展示的是老鼠肾脏切片在20倍物镜下的截图）。
继续参考图8，其中：(a1)、(b1)、(c1)所示是老鼠肾脏切片在不同离焦量时辅助对焦相机采集到的相应视野中的对焦图像；(a2)、(b2)、(c2)分别为对(a1)、(b1)、(c1)每张图所做自相关运算的结果。
图8中(a1)-(c1)分别表示了在前序不同实施例的显微镜系统中的辅助对焦相机所采集到的含有两个重影的红外图像。样本处于不同离焦量时,两个重影之间的距离便会不同。通过对辅助对焦相机采集到的图像进行一些运算(包括但不限于自相关运算)得到样本离焦量与图像中两个重影之间距离的关系曲线。从而以此拟合的曲线作为之后对焦过程的参考表来实现自动对焦。
继续参考图9，图9为本申请中离焦量与重影间距离关系拟合示意图，其展示了样本离焦量与辅助对焦相机采集到的图像中重影间距离之间的关系拟合曲线。
图9中曲线呈现单调递增趋势的原因是辅助对焦相机相对照相机组件设置有一定的偏置，此图中偏置为60微米，对应图中最中心的第6点。图9展示的是从样本离焦量-30微米（图9中30微米处，即第1点）到+30微米处（图9中90微米处，即第11点）采集的11张重影图像拟合出来的曲线。设置偏置的原因在于：如果不设置偏置，该拟合曲线理论上会接近“V”形；然而，在样品接近聚焦时，两个重影的距离会非常近，根据上述自相关计算方法，三个尖峰的距离也会靠得很近，从而导致峰顶的值被淹没或者出现新的意想不到的不相关的峰值，这不利于寻找需要确定的峰值位置。所以此处引入偏置就是为了拉开三个峰值之间的距离。
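图9所示的曲线拟合与查表过程，可以用如下Python草图加以示意（其中0.8像素/微米的斜率与60微米偏置均为演示用假设值，标定数据为假设的近似线性关系，并非实测数据）：

```python
import numpy as np

# 假设的标定数据（仅作示意）：样本离焦量从 -30 微米到 +30 微米，共 11 个采样点；
# 辅助对焦相机偏置 60 微米后，重影间距随离焦量单调递增
defocus_um = np.linspace(-30.0, 30.0, 11)
ghost_px = 0.8 * (defocus_um + 60.0)            # 假设近似线性：0.8 像素/微米

# 反向拟合校准曲线：由重影间距查找离焦量（二次多项式足以覆盖轻微非线性）
coeffs = np.polyfit(ghost_px, defocus_um, deg=2)

def lookup_defocus(spacing_px):
    """步骤403：根据重影间距在拟合的校准曲线上查找对应的离焦量（微米）。"""
    return float(np.polyval(coeffs, spacing_px))

# 测得重影间距 48 像素（= 0.8 × 60）时，查表得到的离焦量应接近零，
# 即样品已处于聚焦位置，无需再调整焦距
print(lookup_defocus(48.0))                     # 约为 0
```

实际系统中，该拟合曲线预先存储于存储介质中，对焦时仅需一次查表即可由重影间距得到离焦量。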
需要说明的是，将不同的指标信息值拟合成一条曲线，其中曲线的顶点就是离焦量为零时对应的位置。离焦量越接近零，代表图像越清晰，也就是对焦装置驱动照相机需要调节到的位置。
有益技术效果:
本申请实施例通过获取显微镜第二光路中的辅助对焦相机所拍摄的测量样本；根据所述辅助对焦的相机所拍摄的测量样本以及相应的图像评价标准，计算对应的图像评价参数；根据所述图像评价参数，在预先存储的校准曲线中查找图像评价参数和离焦量的关系，进而确定所需要的离焦量；根据所确定的离焦量，调整进入照相机组件的图像光线的焦距，以实现照相机组件通过第一光路拍摄清晰聚焦的图像。由此，可以实现对焦装置对显微镜系统的照相机组件进行自动对焦，形成并输出通过第一光路拍摄清晰聚焦的图像，节省了显微镜系统对焦的时间，提升了对焦的精确度。
以上所述,仅为本申请的实施例而已,并非配置为限定本申请的保护范围,凡在本申请的精神和原则之内所作的任何修改、等同替换和改进等,均应包含在本申请的保护范围之内。
工业实用性
本申请实施例中通过获取显微镜第二光路中的辅助对焦相机所拍摄的测量样本;根据所述辅助对焦的相机所拍摄的测量样本以及相应的图像评价标准,计算对应的图像评价参数;根据所述图像评价参数,在预先存储的校准曲线中查找图像评价参数和离焦量的关系,进而确定所需要的离焦量;根据所确定的离焦量,调整进入照相机组件的图像光线的焦距,以实现照相机组件通过第一光路拍摄清晰聚焦的图像。由此,可以实现对焦装置对显微镜系统的照相机组件进行自动对焦,形成并输出通过第一光路拍摄清晰聚焦的图像,节省了显微镜系统对焦的时间,提升了对焦的精确度,使得显微镜系统的使用者获得清晰度更高的图像。

Claims (15)

  1. 一种显微镜系统,其中,所述显微镜系统包括:
    物镜,配置为获取待观测样品的光线进入第一光路,并与图像投影模块产生的光线经由透镜组件进入所述第一光路后,在分束器处汇合;
    分束器组件,包括至少一个分束器,配置为分别对不同光路中的光线进行分离与投射;
    照相机组件,所述照相机组件设置于所述第一光路中,所述照相机组件包括照相机,配置为对所述显微镜视野中的待观测样品进行拍照,以形成并输出通过第一光路拍摄清晰聚焦的图像;
    辅助对焦装置,包括辅助对焦光源、辅助对焦相机,设置于第二光路中,配置为确定与所述照相机组件相匹配的焦距;
    对焦装置,配置为根据所述辅助对焦装置确定的待测样本图像的离焦量,调整进入所述照相机组件的图像光线的焦距。
  2. 根据权利要求1所述的显微镜系统,其中,所述显微镜系统还包括:
    透镜组件,包括至少一个透镜,配置为对被观测样品在观测时产生的光线经由所述物镜进入不同的光路后对所述光线进行投射,以实现所述光线沿不同光路传播;
    目镜、三目镜筒,所述目镜与所述三目镜筒套接,配置为通过所述物镜对待观测样品进行观察;
    图像投影组件，设置于所述透镜组件所投射光线的相应光路中，配置为对所述待观测样品的图像进行图像增强处理；
    所述三目镜筒设置在所述分束器远离所述物镜的一端,所述三目镜筒包括通道和管镜,所述通道至少包括两个且所述通道位于远离所述分束器的一端,其中一个通道与目镜连通,所述管镜位于靠近所述分束器的一端。
  3. 根据权利要求1所述的显微镜系统，其中，所述对焦装置包括位移驱动组件和可变焦镜头，以实现在不同焦距时，对所述显微镜视野中的待观测样品进行拍照。
  4. 根据权利要求2所述的显微镜系统,其中,
    所述分束器组件分别与所述物镜和所述三目镜筒的管镜连通,所述照相机组件设置于所述三目镜筒的其中一个通道中;
    所述分束器组件包括一个分束器,所述透镜组件包括一个透镜,设置于所述分束器与所述图像投影组件之间;
    所述对焦装置位于所述分束器与所述照相机组件之间,配置为根据所述辅助对焦装置确定的待测样本图像的离焦量,调整进入所述照相机组件的图像光线的焦距。
  5. 根据权利要求4所述的显微镜系统,其中,
    所述图像投影组件还包括第一偏振片,所述第一偏振片位于所述透镜组件与所述分束器之间,配置为对所述第一光路中的相应光线进行偏振处理;
    所述照相机组件还包括第二偏振片,所述第二偏振片位于所述对焦装置与所述分束器之间,配置为对所述照相机组件所采集的相应光线进行偏振处理。
  6. 根据权利要求4所述的显微镜系统,其中,
    所述辅助对焦光源设置于所述显微镜系统的聚光镜组件对应的傅里叶后焦面中,配置为发出辅助对焦光线,以形成所述第二光路;
    所述分束器组件包括一个设置于所述对焦装置和所述照相机组件之间的分束器,配置为将所述第二光路中的光线反射至所述辅助对焦相机中;
    所述辅助对焦相机设置于所述照相机组件的共轭平面的轴向偏置位置,配置为基于所述第二光路中的光线,拍摄与所述显微镜视野中的待观测样品相匹配的交叠影像。
  7. 根据权利要求4所述的显微镜系统,其中,
    所述辅助对焦光源设置于所述显微镜系统的聚光镜组件对应的傅里叶后焦面中,配置为发出辅助对焦光线,以形成所述第二光路;
    所述辅助对焦相机与所述图像投影组件沿所述分束器组件相对设置,配置为基于所述第二光路中的光线,拍摄与所述显微镜视野中的待观测样品相匹配的交叠影像。
  8. 根据权利要求1所述的显微镜系统,其中,所述图像投影组件和所述照相机组件采用时分复用机制运行。
  9. 根据权利要求1所述的显微镜系统,其中,所述显微镜系统还包括:
    至少一个输出接口设备，所述输出接口设备与所述显微镜系统的数据处理单元相耦合，以输出通过第一光路拍摄清晰聚焦的图像和经过图像增强处理的待观测样品的图像。
  10. 根据权利要求1所述的显微镜系统,其中,所述物镜包括至少以 下之一:
    消色差物镜、平场消色差物镜、平场半复消色差物镜,或者平场复消色差物镜;
    所述分束器包括至少以下之一:
    方体分束器、平板分束器或薄膜分束器。
  11. 一种显微镜自动对焦方法,所述方法由显微镜系统执行,所述显微镜自动对焦方法包括:
    获取显微镜第二光路中的辅助对焦相机所拍摄的测量样本;
    根据所述辅助对焦的相机所拍摄的测量样本以及相应的图像评价标准,计算对应的图像评价参数;
    根据所述图像评价参数,在预先存储的校准曲线中查找图像评价参数和离焦量的关系,进而确定所需要的离焦量;
    根据所确定的离焦量,调整进入照相机组件的图像光线的焦距,以实现照相机组件通过第一光路拍摄清晰聚焦的图像。
  12. 根据权利要求11所述的方法,其中,所述获取显微镜第二光路中的辅助对焦相机所拍摄的测量样本,包括:
    通过所述辅助对焦相机,采集所述第二光路中的光线;
    基于与所述辅助对焦相机的类型对所采集的所述第二光路中的光线进行处理,以实现拍摄与所述显微镜视野中的待观测样品相匹配的交叠影像。
  13. 根据权利要求11所述的方法,其中,所述方法还包括:
    基于所述焦距调整的结果,通过第一光路中的光线对所述显微镜视野中的待观测样品进行拍照,形成并输出通过第一光路拍摄清晰聚焦的图像。
  14. 一种医疗设备,其中,所述医疗设备包括:
    显微镜系统、存储器和处理器，所述显微镜系统为权利要求1至10任一项所述的显微镜系统；
    存储器,配置为存储可执行指令;
    处理器,配置为运行所述存储器存储的可执行指令时,实现权利要求11至13任一项所述的显微镜自动对焦方法。
  15. 一种计算机可读存储介质,存储有可执行指令,其中,所述可执行指令被处理器执行时实现权利要求11至13任一项所述的显微镜自动对焦方法。
PCT/CN2021/077828 2020-04-13 2021-02-25 显微镜自动对焦方法、显微镜系统、医疗设备和存储介质 WO2021208603A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/745,571 US20220342195A1 (en) 2020-04-13 2022-05-16 Microscope automatic focusing method, microscope system, medical device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010284514.X 2020-04-13
CN202010284514.XA CN111443476B (zh) 2020-04-13 2020-04-13 显微镜自动对焦方法、显微镜系统、医疗设备和存储介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/745,571 Continuation US20220342195A1 (en) 2020-04-13 2022-05-16 Microscope automatic focusing method, microscope system, medical device, and storage medium

Publications (1)

Publication Number Publication Date
WO2021208603A1 true WO2021208603A1 (zh) 2021-10-21

Family

ID=71651731

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/077828 WO2021208603A1 (zh) 2020-04-13 2021-02-25 显微镜自动对焦方法、显微镜系统、医疗设备和存储介质

Country Status (3)

Country Link
US (1) US20220342195A1 (zh)
CN (2) CN111443476B (zh)
WO (1) WO2021208603A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4296749A1 (en) * 2022-06-20 2023-12-27 Evident Corporation Microscope system

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
CN111443476B (zh) * 2020-04-13 2023-04-14 腾讯科技(深圳)有限公司 显微镜自动对焦方法、显微镜系统、医疗设备和存储介质
CN116095477B (zh) * 2022-08-16 2023-10-20 荣耀终端有限公司 对焦处理系统、方法、设备及存储介质
CN116794822B (zh) * 2023-07-05 2024-04-30 苏州欧米特光电科技有限公司 一种显微镜控制系统和方法

Citations (6)

Publication number Priority date Publication date Assignee Title
JP2012181341A (ja) * 2011-03-01 2012-09-20 Olympus Corp 顕微鏡装置
US20140152796A1 (en) * 2012-11-21 2014-06-05 Samsung Electronics Co., Ltd. Auto focus control apparatus, semiconductor inspecting apparatus and microscope
CN104932092A (zh) * 2015-06-15 2015-09-23 上海交通大学 基于偏心光束法的自动对焦显微镜及其对焦方法
CN108646396A (zh) * 2018-04-27 2018-10-12 合肥工业大学 自动对焦显微镜系统
CN110673325A (zh) * 2019-09-25 2020-01-10 腾讯科技(深圳)有限公司 显微镜系统、智能医疗设备、自动对焦方法和存储介质
CN111443476A (zh) * 2020-04-13 2020-07-24 腾讯科技(深圳)有限公司 显微镜自动对焦方法、显微镜系统、医疗设备和存储介质

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
CN107656364B (zh) * 2017-11-16 2020-10-23 宁波舜宇仪器有限公司 一种显微成像系统及其实时对焦方法
CN108254853B (zh) * 2018-01-17 2023-08-11 宁波舜宇仪器有限公司 一种显微成像系统及其实时对焦方法
CN108051897B (zh) * 2018-01-17 2023-06-23 宁波舜宇仪器有限公司 一种显微成像系统及实时对焦方法
CN110727093A (zh) * 2019-11-21 2020-01-24 宁波五维检测科技有限公司 多光谱显微自动聚焦装置及方法

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
JP2012181341A (ja) * 2011-03-01 2012-09-20 Olympus Corp 顕微鏡装置
US20140152796A1 (en) * 2012-11-21 2014-06-05 Samsung Electronics Co., Ltd. Auto focus control apparatus, semiconductor inspecting apparatus and microscope
CN104932092A (zh) * 2015-06-15 2015-09-23 上海交通大学 基于偏心光束法的自动对焦显微镜及其对焦方法
CN108646396A (zh) * 2018-04-27 2018-10-12 合肥工业大学 自动对焦显微镜系统
CN110673325A (zh) * 2019-09-25 2020-01-10 腾讯科技(深圳)有限公司 显微镜系统、智能医疗设备、自动对焦方法和存储介质
CN111443476A (zh) * 2020-04-13 2020-07-24 腾讯科技(深圳)有限公司 显微镜自动对焦方法、显微镜系统、医疗设备和存储介质

Cited By (1)

Publication number Priority date Publication date Assignee Title
EP4296749A1 (en) * 2022-06-20 2023-12-27 Evident Corporation Microscope system

Also Published As

Publication number Publication date
US20220342195A1 (en) 2022-10-27
CN111443476A (zh) 2020-07-24
CN111443476B (zh) 2023-04-14
CN116430568A (zh) 2023-07-14

Similar Documents

Publication Publication Date Title
WO2021208603A1 (zh) 显微镜自动对焦方法、显微镜系统、医疗设备和存储介质
US20170031146A1 (en) Imaging Assemblies With Rapid Sample Auto-Focusing
JP6287238B2 (ja) プレノプティック型検耳鏡
CN111308690B (zh) 一种光场电子内窥设备及其成像方法
CN106210520B (zh) 一种自动调焦电子目镜及系统
JP6862569B2 (ja) 仮想光線追跡方法および光照射野の動的リフォーカス表示システム
WO2021057422A1 (zh) 显微镜系统、智能医疗设备、自动对焦方法和存储介质
JP7104296B2 (ja) 測距カメラ
CN110488479A (zh) 一种增强现实显微镜、图像投影设备及图像处理系统
CN115830675B (zh) 一种注视点跟踪方法、装置、智能眼镜及存储介质
JP2017192086A (ja) 画像生成装置、画像観察装置、撮像装置および画像処理プログラム
Kagawa et al. A three‐dimensional multifunctional compound‐eye endoscopic system with extended depth of field
US8508589B2 (en) Imaging systems and associated methods thereof
CN108937909B (zh) 一种基于片层光的选层血流散斑成像装置及方法
CN111443477B (zh) 显微镜自动对焦方法、显微镜系统、医疗设备和存储介质
JP2681092B2 (ja) 結像光路中に設けられた光線分割器を有する手術用顕微鏡
US8593508B2 (en) Method for composing three dimensional image with long focal length and three dimensional imaging system
US20200074628A1 (en) Image processing apparatus, imaging system, image processing method and computer readable recoding medium
US20040169922A1 (en) Stereo microscopy
US20180017776A1 (en) Variable working distance microscope
Cruz et al. Automated urine microscopy using scale invariant feature transform
JP6069324B2 (ja) デュアル・サンプリング・レンズを備える単一軸の立体撮像装置
CN115316919B (zh) 双相机3d光学荧光内窥镜摄像系统、方法及电子设备
WO2022004305A1 (ja) 撮像支援装置、撮像装置、撮像支援方法、及びプログラム
JP2743307B2 (ja) 同軸マスターレンズ及びスレーブレンズを備えた撮影器材

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21788902

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 15/03/2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21788902

Country of ref document: EP

Kind code of ref document: A1