CN116679461B - Image sensor, imaging device and method - Google Patents

Image sensor, imaging device and method

Info

Publication number
CN116679461B
Authority
CN
China
Prior art keywords
optical signal
photosensitive pixel
obtaining
dimensional image
pixel array
Prior art date
Legal status
Active
Application number
CN202211198000.8A
Other languages
Chinese (zh)
Other versions
CN116679461A (en)
Inventor
李泠霏
葛宝梁
刘文可
孔云川
许俊豪
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202211198000.8A
Publication of CN116679461A
Application granted
Publication of CN116679461B
Active legal status
Anticipated expiration

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/50Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
    • G02B30/54Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels the 3D volume being generated by moving a 2D surface, e.g. by vibrating or rotating the 2D surface
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64Fluorescence; Phosphorescence
    • G01N21/6428Measuring fluorescence of fluorescent products of reactions or of fluorochrome labelled reactive substances, e.g. measuring quenching effects, using measuring "optrodes"
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/75Systems in which material is subjected to a chemical reaction, the progress or the result of the reaction being investigated
    • G01N21/77Systems in which material is subjected to a chemical reaction, the progress or the result of the reaction being investigated by observing the effect on a chemical indicator
    • G01N21/7703Systems in which material is subjected to a chemical reaction, the progress or the result of the reaction being investigated by observing the effect on a chemical indicator using reagent-clad optical fibres or optical waveguides
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/50Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B6/00Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • G02B6/24Coupling light guides
    • G02B6/42Coupling light guides with opto-electronic elements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64Fluorescence; Phosphorescence
    • G01N21/6428Measuring fluorescence of fluorescent products of reactions or of fluorochrome labelled reactive substances, e.g. measuring quenching effects, using measuring "optrodes"
    • G01N2021/6439Measuring fluorescence of fluorescent products of reactions or of fluorochrome labelled reactive substances, e.g. measuring quenching effects, using measuring "optrodes" with indicators, stains, dyes, tags, labels, marks

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Immunology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical Kinetics & Catalysis (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Plasma & Fusion (AREA)
  • Microscopes, Condensers (AREA)

Abstract

The application provides an image sensor, an imaging device and a method. The image sensor is arranged in an imaging device for microscopic samples and comprises a photosensitive pixel array and a surface functional layer. The photosensitive pixel array is coated on the inner wall surface of a channel of the imaging device and comprises at least two photosensitive pixel surfaces, and the surface functional layer is arranged on the photosensitive pixel array. The surface functional layer is used for obtaining a first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces and obtaining a second optical signal. The photosensitive pixel array is used for obtaining a two-dimensional image according to the second optical signal, so that a processor connected with the image sensor obtains a three-dimensional image according to a preset algorithm and the two-dimensional image. A miniaturized imaging device can thereby be provided, and simple and efficient 3D imaging can be achieved.

Description

Image sensor, imaging device and method
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image sensor, an imaging device, and a method.
Background
With the development of microscale technologies, demands for the detection and analysis of microscopic samples keep increasing. For a microscopic sample such as a biological sample, the microstructure is complex, and comprehensive morphology and structure information is difficult to obtain with two-dimensional imaging techniques.
At present, three-dimensional imaging of microscopic samples requires a complex optical system design and complex supporting algorithms; the cost is high and the imaging equipment is bulky, which limits its application. A miniaturized imaging device capable of simple and efficient imaging is therefore needed.
Disclosure of Invention
The application provides an image sensor, an imaging device and a method, which can provide a miniaturized imaging device and achieve simple and efficient 3D imaging.
In a first aspect, an image sensor arranged in an imaging device for microscopic samples is provided, comprising: a photosensitive pixel array and a surface functional layer, wherein the photosensitive pixel array is coated on the inner wall surface of a channel of the imaging device and comprises at least two photosensitive pixel surfaces, and the surface functional layer is arranged on the photosensitive pixel array; the surface functional layer is used for obtaining a first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces and obtaining a second optical signal; and the photosensitive pixel array is used for obtaining a two-dimensional image according to the second optical signal, so that a processor connected with the image sensor obtains a three-dimensional image according to a preset algorithm and the two-dimensional image.
Microscopic samples are samples whose individual shape and structure are microscopic, including samples on the order of a few micrometers to a few millimeters in size. The image sensor can be adapted to different scenarios, in particular to biomedical applications, where the microscopic sample may be a biological sample or the like. The image sensor is suitable for different imaging devices: it can be arranged in an imaging device together with equipment providing detection, analysis and similar functions, or it can be integrated in an imaging device that itself provides such functions.
Unlike a planar photosensitive pixel array, the photosensitive pixel array provided herein images microscopic samples in the channel, such as cells, from multiple coated faces, and can be viewed as a planar photosensitive pixel array that is "folded" to form multiple faces.
In one possible implementation, two adjacent ones of the at least two photosensitive pixel surfaces are not coplanar. The angle between the normals of adjacent photosensitive pixel surfaces is determined by the shape of the channel of the imaging device; for example, if the channel is rectangular, the photosensitive pixel array comprises four photosensitive pixel surfaces and the normals of adjacent surfaces form an angle of 90°. When the surface of the channel to be covered is curved, a photosensitive pixel surface that conforms to the curved surface may be formed. The response spectrum of the photosensitive pixel array covers visible to near-infrared light, with wavelengths typically in the range of 400 nm to 1550 nm. The material of the photosensitive pixel array can be selected according to its pixels, for example a conventional semiconductor material (including silicon, germanium, gallium arsenide and the like) or a novel material (such as an organic semiconductor, quantum dots, a two-dimensional material, an oxide semiconductor and the like).
The surface functional layer is a composite functional layer whose main functions include passivation and protection of the pixel surface, optical filtering, and refractive index matching to reduce the propagation loss of the optical waveguide mode. In different application scenarios, the composite function of the surface functional layer may comprise at least one of these functions. The material of the surface functional layer can be a dielectric material, a metal material or an organic polymer material, among others. The surface functional layer comprises at least a coating layer and may further comprise a filter layer; the coating layer provides passivation and protection of the pixel surface, while the filter layer removes unwanted light.
The preset algorithm may be an image reconstruction algorithm, including but not limited to methods based on deep learning, algorithms based on compressed sensing, algorithms based on super-resolution pixels, algorithms based on holographic interference, and the like.
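As an illustrative sketch only (not part of the original disclosure), one way such a preset reconstruction step could look, assuming a linear forward model that maps a voxel volume of the sample to the stacked coded 2D measurements; the function names, shapes and hyper-parameters below are hypothetical:

```python
import numpy as np

def reconstruct_3d(measurements, forward, adjoint, shape, n_iter=200, step=1e-2, l2=1e-3):
    """Gradient-descent solve of 0.5*||A x - y||^2 + 0.5*l2*||x||^2 for a voxel volume x."""
    x = np.zeros(shape)
    for _ in range(n_iter):
        residual = forward(x) - measurements   # data mismatch against the coded 2D measurements
        grad = adjoint(residual) + l2 * x      # gradient of the regularized objective
        x -= step * grad
    return np.clip(x, 0.0, None)               # voxel intensities are non-negative
```

Here `forward` and `adjoint` stand in for whatever linear operator the chosen reconstruction scheme (compressed sensing, physical model, learned model) actually uses.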
In a possible implementation, the image sensor further comprises a mask layer arranged on the photosensitive pixel array and used for modulating and encoding the second optical signal obtained by the surface functional layer to obtain a third optical signal, wherein the surface functional layer is arranged on the mask layer; the photosensitive pixel array is further configured to obtain the two-dimensional image according to the third optical signal, so that the processor obtains the three-dimensional image according to the preset algorithm and the two-dimensional image.
The mask layer and the surface functional layer are grown so as to have the same shape as the photosensitive pixel array. The mask layer modulates and encodes the intensity, phase, wavelength, polarization and the like of the optical signal; suitable mask structures include, but are not limited to, metasurfaces, random pattern masks, phase-coding masks and diffractive elements. The material of the mask layer may be a dielectric material, a liquid crystal material, a ferroelectric material, a metal material or an organic polymer material, among others.
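For illustration, a minimal sketch of how a random-pattern amplitude mask could code the per-face intensity of the second optical signal into the third optical signal; this is a pure-intensity approximation with hypothetical mask values and pixel counts, not the patented mask design:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_random_masks(n_faces, height, width, keep=0.5):
    """One random binary amplitude mask per photosensitive pixel face."""
    return (rng.random((n_faces, height, width)) < keep).astype(float)

def apply_mask(second_signal, masks):
    """Element-wise coding of the per-face second optical signal into the third optical signal."""
    return second_signal * masks

# hypothetical usage: four faces of a rectangular channel, 64x64 pixels per face
masks = make_random_masks(4, 64, 64)
second = rng.random((4, 64, 64))   # stand-in for the filtered per-face intensity
third = apply_mask(second, masks)
```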
In one possible implementation, illumination is provided for the microscopic sample in the channel, including fiber-coupled illumination and free-space illumination, which may be coherent, partially coherent or incoherent depending on the application scenario and the algorithm requirements.
In a second aspect, an imaging device is provided, comprising: a channel for carrying a microscopic sample; and an image sensor coated on the inner wall surface of the channel, the image sensor comprising a photosensitive pixel array and a surface functional layer arranged on the photosensitive pixel array, wherein the photosensitive pixel array comprises at least two photosensitive pixel surfaces; the surface functional layer is used for obtaining a first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces and obtaining a second optical signal; and the photosensitive pixel array is used for obtaining a two-dimensional image according to the second optical signal, so that a processor connected with the image sensor obtains a three-dimensional image according to a preset algorithm and the two-dimensional image.
The imaging device may be a microfluidic chip with the image sensor integrated on the inner wall surface of its channel, so that response-spectrum coverage of the microscopic sample and multi-faceted, lens-free on-chip acquisition of three-dimensional information are achieved under simple illumination, with a simple optical design and without having to control cell rolling.
In one possible implementation manner, the image sensor further includes a mask layer disposed on the photosensitive pixel array, and configured to modulate and encode the second optical signal obtained by the surface functional layer to obtain a third optical signal, where the surface functional layer is disposed on the mask layer; the photosensitive pixel array is further configured to obtain the two-dimensional image according to the third optical signal, so that the processor obtains the three-dimensional image according to the preset algorithm and the two-dimensional image.
In one possible implementation, two adjacent ones of the at least two photosensitive pixel faces are not coplanar.
In one possible implementation, the apparatus further includes an optical fiber for coupling monochromatic coherent light into the channel. If the microscopic sample in the channel has undergone staining pretreatment, the surface functional layer is specifically configured to obtain the second optical signal from a first optical signal of each of the at least two photosensitive pixel surfaces obtained in a single acquisition, where the first optical signal includes the optical signal of the monochromatic coherent light and a fluorescence signal, the second optical signal includes the fluorescence signal, and the fluorescence signal is excited by the monochromatic coherent light on the stained microscopic sample. The surface functional layer includes a coating layer and a filter layer: the coating layer provides surface protection of the photosensitive pixel array and refractive index matching of the first optical signal, and the filter layer filters out the optical signal of the monochromatic coherent light. If the photosensitive pixel array receives the second optical signal, the two-dimensional image is obtained according to the second optical signal.
Staining pretreatment marks the microscopic sample so that a fluorescence signal can be excited; the stained microscopic sample is labeled with a fluorescent substance, including but not limited to dye labeling, quantum dot labeling and the like.
In one possible implementation, the device further includes an optical fiber for coupling monochromatic coherent light into the channel. If the microscopic sample has undergone staining pretreatment and moves with the solution flowing in the channel, the surface functional layer is specifically configured to obtain the first optical signal of each of the at least two photosensitive pixel surfaces at least once during the movement of the microscopic sample, and to obtain one frame of the second optical signal from each acquired first optical signal, where the first optical signal includes the optical signal of the monochromatic coherent light and a fluorescence signal, the second optical signal includes the fluorescence signal, and the fluorescence signal is excited by the monochromatic coherent light on the stained microscopic sample. The surface functional layer includes a coating layer and a filter layer: the coating layer provides surface protection of the photosensitive pixel array and refractive index matching of the first optical signal, and the filter layer filters out the optical signal of the monochromatic coherent light. The mask layer modulates and encodes each frame of the second optical signal to obtain a corresponding frame of the third optical signal; if the photosensitive pixel array receives at least one frame of the third optical signal, the two-dimensional image is obtained according to the at least one frame of the third optical signal.
For example, take the imaging device to be a flow-type microfluidic chip performing fluorescence waveguide three-dimensional (3D) imaging, and take the microscopic sample to be a cell. For single-frame two-dimensional (2D) image acquisition while the cell flows through the channel, the surface functional layer of the image sensor comprises a filter layer and a coating layer: the coating layer provides surface passivation protection of the photosensitive pixel array and refractive index matching of the waveguide mode, and the filter layer removes the optical signal of the monochromatic coherent light, i.e. the excitation optical signal, before the first optical signal in the channel reaches the mask layer. Filtering out the excitation signal yields the second optical signal, which in this scenario is the fluorescence signal. The fluorescence signal can either be received directly by the photosensitive pixel array, or be modulated and encoded by the mask layer into a third optical signal that is received by the photosensitive pixel array. The photosensitive pixel array obtains and outputs a single-frame 2D imaging result, i.e. a 2D image, and the 3D image is then reconstructed by the preset algorithm. As another example, multi-frame 2D image acquisition is performed while the cell flows through the channel. The surface functional layer comprises a filter layer and a coating layer; the filter layer removes the excitation signal before the first optical signal reaches the mask layer so as to obtain the fluorescence signal, which is modulated and encoded by the mask layer into a third optical signal received by the photosensitive pixel array. The mask patterns of the image sensor differ and are randomly arranged at different spatial positions, which ensures that the mask pattern of each acquired frame has a certain degree of distinction and difference. The photosensitive pixel array obtains and outputs a multi-frame 2D imaging result, and the 3D image is then reconstructed by the preset algorithm.
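A hedged sketch of the multi-frame fluorescence flow described above, assuming the first optical signal is available per frame as a per-face, per-wavelength intensity array; `first_signals`, `masks` and `excitation_band` are hypothetical placeholders rather than elements of the disclosed device:

```python
import numpy as np

def acquire_fluorescence_frames(first_signals, masks, excitation_band):
    """first_signals: iterable of per-frame arrays (faces, H, W, n_wavelengths)
    containing excitation light plus fluorescence; masks: per-frame coding masks
    of shape (faces, H, W); excitation_band: boolean numpy array over the
    wavelength axis marking the excitation wavelengths to be filtered out."""
    frames = []
    for first, mask in zip(first_signals, masks):
        fluorescence = first[..., ~excitation_band].sum(axis=-1)  # filter layer: drop excitation light
        third = fluorescence * mask                               # mask layer: modulation coding
        frames.append(third)                                      # pixels integrate the coded intensity
    return np.stack(frames)                                       # (n_frames, faces, H, W) for 3D reconstruction
```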
In one possible implementation, the device further includes an optical fiber for coupling coherent light into the channel, the coherent light comprising monochromatic or polychromatic coherent light, and the processor is further configured to obtain light field distribution information in the channel. The surface functional layer is specifically configured to obtain the second optical signal from a first optical signal of each of the at least two photosensitive pixel surfaces obtained in a single acquisition, where the first optical signal includes the scattered optical signal of the coherent light, and the surface functional layer includes a coating layer that provides surface protection of the photosensitive pixel array and refractive index matching of the first optical signal. If the photosensitive pixel array receives the second optical signal, the two-dimensional image is obtained according to the second optical signal, and the processor obtains the three-dimensional image according to the light field distribution information, the preset algorithm and the two-dimensional image.
The light field distribution information can be obtained by light field simulation or by experimental measurement. The simulation may use commercial electromagnetic simulation software or self-written calculation code; the experimental measurement may be obtained by acquiring a blank background signal with the same hardware settings used for image acquisition while no sample flows in the microfluidic chip.
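For illustration, a minimal sketch of the experimental route, assuming the blank background frames have already been read out as per-face arrays; the helper names are hypothetical:

```python
import numpy as np

def estimate_light_field(blank_frames):
    """Average blank acquisitions (no sample in the channel, same hardware
    settings) to estimate the background light field per pixel face."""
    return np.stack(blank_frames).mean(axis=0)   # (faces, H, W)

def normalize_measurement(frame, light_field, eps=1e-6):
    """Relative measurement that could be passed to the reconstruction algorithm."""
    return frame / (light_field + eps)
```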
In one possible implementation, the device further includes an optical fiber for coupling coherent light into the channel, the coherent light comprising monochromatic or polychromatic coherent light, and the processor is further configured to obtain light field distribution information in the channel. The microscopic sample in the channel moves with the flow of the solution. The surface functional layer is specifically configured to obtain the first optical signal of each of the at least two photosensitive pixel surfaces at least once during the movement of the microscopic sample, and to obtain one frame of the second optical signal from each acquired first optical signal, where the first optical signal includes the scattered optical signal of the coherent light, and the surface functional layer includes a coating layer that provides surface protection of the photosensitive pixel array and refractive index matching of the first optical signal. The mask layer modulates and encodes each frame of the second optical signal to obtain a corresponding frame of the third optical signal. If the photosensitive pixel array receives at least one frame of the third optical signal, the two-dimensional image is obtained according to the at least one frame of the third optical signal, and the processor obtains the three-dimensional image according to the light field distribution information, the preset algorithm and the two-dimensional image.
For example, take the imaging device to be a flow-type microfluidic chip performing bright-field waveguide 3D imaging, and take the microscopic sample to be a cell. Light field distribution information is first obtained by light field simulation or experimental measurement, and single-frame 2D image acquisition is performed while the cell flows through the channel. The photosensitive pixel array obtains and outputs a single-frame 2D imaging result, i.e. a 2D image, and the 3D image is reconstructed by the preset algorithm in combination with the light field distribution information. As another example, light field distribution information is obtained by light field simulation or experimental measurement, and multi-frame 2D image acquisition is performed while the cell flows through the channel. The surface functional layer comprises a coating layer; the first optical signal passes through the coating layer to yield the second optical signal, which enters the mask layer, is modulated and encoded, and is then received by the photosensitive pixel array. The mask patterns of the image sensor differ and are randomly arranged at different spatial positions, which ensures that the mask pattern of each acquired frame has a certain degree of distinction and difference. The photosensitive pixel array obtains and outputs multiple 2D imaging results, and the 3D image is reconstructed by the preset algorithm in combination with the light field distribution information. Preset algorithms for bright-field waveguide 3D imaging include, but are not limited to: methods using an inverse imaging model plus regularization, algorithms based on physical models, methods based on deep learning, and the like.
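As an illustrative sketch of the inverse-imaging-model-plus-regularization route, assuming the light field distribution information has already been turned into a linear system matrix; the matrix sizes and the regularization weight are hypothetical:

```python
import numpy as np

def solve_inverse_model(A, y, alpha=1e-2):
    """argmin_x ||A x - y||^2 + alpha * ||x||^2 via the normal equations.

    A: (n_measurements, n_voxels) system matrix; each column is the stacked
       multi-face, multi-frame sensor response of one voxel, derived from the
       simulated or measured light field distribution.
    y: stacked coded 2D measurements.
    """
    n = A.shape[1]
    x = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)
    return np.clip(x, 0.0, None)   # non-negative voxel values of the 3D image
```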
In one possible implementation, the device further includes an illumination array for providing a light source for a micro-culture-dish array, wherein the micro-culture-dish array is arranged in the channel, the microscopic sample is placed in the micro-culture-dish array, and the image sensor is coated on the inner wall surface of the micro-culture-dish array. The processor is further configured to obtain light field distribution information in the micro-culture-dish array. The surface functional layer is specifically configured to obtain the second optical signal from the first optical signal of each of the at least two photosensitive pixel surfaces obtained in a single acquisition, where the first optical signal includes the optical signal provided by the light source, and the surface functional layer includes a coating layer that provides surface protection of the photosensitive pixel array and refractive index matching of the first optical signal. The mask layer modulates and encodes each frame of the second optical signal to obtain a corresponding frame of the third optical signal; the photosensitive pixel array obtains the two-dimensional image according to the received third optical signal, and the processor obtains the three-dimensional image according to the light field distribution information, the preset algorithm and the two-dimensional image.
The illumination array may provide a partially coherent light source, for example a light source at any wavelength in the 400 nm to 1550 nm band, to achieve partially coherent illumination.
For example, take the imaging device to be a non-flow microfluidic chip and the microscopic sample to be a cell. Light field distribution information is first obtained by light field simulation (or by other means), and single-frame image acquisition is performed on the cell placed in the non-flow microfluidic chip. Because the surface functional layer of the image sensor comprises only a coating layer for surface protection of the photosensitive pixel array, the image sensor can image the cell separately from a plurality of coated faces, for example the five coated faces (i.e. five photosensitive pixel surfaces) of the non-flow microfluidic chip, so that a single-frame acquisition obtains and outputs five 2D imaging results. The 3D image is then reconstructed by the preset algorithm in combination with the light field distribution information.
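A minimal sketch of this single-frame, five-face acquisition, with `read_face`, `light_field` and `reconstruct` as hypothetical placeholders for the sensor readout, the precomputed light field and the preset algorithm:

```python
import numpy as np

def single_frame_five_faces(read_face, light_field, reconstruct):
    """Collect one 2D image per coated face of the non-flow chip and hand the
    stacked measurement to the preset reconstruction algorithm."""
    faces = [read_face(i) for i in range(5)]      # five 2D images, one per coated face
    measurement = np.stack(faces)                 # (5, H, W) single-frame acquisition
    return reconstruct(measurement, light_field)  # 3D image of the cell
```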
In a third aspect, an imaging method adapted for use in an imaging device is provided, comprising: obtaining a first optical signal of a microscopic sample corresponding to each of at least two photosensitive pixel surfaces and obtaining a second optical signal, wherein the microscopic sample is carried in a channel of the imaging device, the imaging device further comprises an image sensor coated on the inner wall surface of the channel, the image sensor comprises a photosensitive pixel array and a surface functional layer arranged on the photosensitive pixel array, the photosensitive pixel array comprises the at least two photosensitive pixel surfaces, and the surface functional layer is used for obtaining the first optical signal and the second optical signal; and obtaining a two-dimensional image according to the second optical signal, and obtaining a three-dimensional image according to a preset algorithm and the two-dimensional image.
In one possible implementation manner, after the second optical signal is obtained, the method further includes: modulating and encoding the second optical signal to obtain a third optical signal, wherein the image sensor further comprises a mask layer arranged on the photosensitive pixel array, the surface functional layer is arranged on the mask layer, and the mask layer is used for modulating and encoding; and obtaining the two-dimensional image according to the third optical signal, and obtaining the three-dimensional image according to the preset algorithm and the two-dimensional image.
In one possible implementation, before obtaining the first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces, the method further includes performing staining pretreatment on the microscopic sample. Obtaining the first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces and obtaining the second optical signal includes: obtaining the second optical signal from a first optical signal of each of the at least two photosensitive pixel surfaces obtained in a single acquisition, wherein the first optical signal includes an optical signal of monochromatic coherent light and a fluorescence signal, the second optical signal includes the fluorescence signal, the monochromatic coherent light is coupled into the channel through an optical fiber of the imaging device, the fluorescence signal is excited by the monochromatic coherent light on the stained microscopic sample, the surface functional layer includes a coating layer and a filter layer, the coating layer provides surface protection of the photosensitive pixel array and refractive index matching of the first optical signal, and the filter layer filters out the optical signal of the monochromatic coherent light. Obtaining the two-dimensional image according to the second optical signal includes: if the second optical signal is received, obtaining the two-dimensional image according to the second optical signal.
In one possible implementation, before obtaining the first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces, the method further includes performing staining pretreatment on the microscopic sample. Obtaining the first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces and obtaining the second optical signal includes: obtaining the first optical signal of each of the at least two photosensitive pixel surfaces at least once during the movement of the microscopic sample, and obtaining one frame of the second optical signal from each acquired first optical signal, wherein the microscopic sample moves with the solution flowing in the channel, the first optical signal includes an optical signal of monochromatic coherent light and a fluorescence signal, the second optical signal includes the fluorescence signal, the monochromatic coherent light is coupled into the channel through an optical fiber of the imaging device, the fluorescence signal is excited by the monochromatic coherent light on the stained microscopic sample, the surface functional layer includes a coating layer and a filter layer, the coating layer provides surface protection of the photosensitive pixel array and refractive index matching of the first optical signal, and the filter layer filters out the optical signal of the monochromatic coherent light. Modulating and encoding the second optical signal to obtain the third optical signal includes: modulating and encoding each frame of the second optical signal to obtain a corresponding frame of the third optical signal. Obtaining the two-dimensional image according to the third optical signal includes: if at least one frame of the third optical signal is received, obtaining the two-dimensional image according to the at least one frame of the third optical signal.
In one possible implementation, the method further includes: obtaining light field distribution information in the channel. Obtaining the first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces and obtaining the second optical signal includes: obtaining the second optical signal from a first optical signal of each of the at least two photosensitive pixel surfaces obtained in a single acquisition, wherein the first optical signal includes a scattered optical signal of coherent light, the coherent light is coupled into the channel through an optical fiber of the imaging device to form the scattered optical signal, the coherent light includes monochromatic or polychromatic coherent light, and the surface functional layer includes a coating layer that provides surface protection of the photosensitive pixel array and refractive index matching of the first optical signal. Obtaining the two-dimensional image according to the second optical signal and obtaining the three-dimensional image according to the preset algorithm and the two-dimensional image includes: if the second optical signal is received, obtaining the two-dimensional image according to the second optical signal; and obtaining the three-dimensional image according to the light field distribution information, the preset algorithm and the two-dimensional image.
In one possible implementation, the method further includes: obtaining light field distribution information in the channel. Obtaining the first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces and obtaining the second optical signal includes: obtaining the first optical signal of each of the at least two photosensitive pixel surfaces at least once during the movement of the microscopic sample, and obtaining one frame of the second optical signal from each acquired first optical signal, wherein the microscopic sample moves with the solution flowing in the channel, the first optical signal includes a scattered optical signal of coherent light, the coherent light is coupled into the channel through an optical fiber of the imaging device to form the scattered optical signal, the coherent light includes monochromatic or polychromatic coherent light, and the surface functional layer includes a coating layer that provides surface protection of the photosensitive pixel array and refractive index matching of the first optical signal. Modulating and encoding the second optical signal to obtain the third optical signal includes: modulating and encoding each frame of the second optical signal to obtain a corresponding frame of the third optical signal. Obtaining the two-dimensional image according to the third optical signal and obtaining the three-dimensional image according to the preset algorithm and the two-dimensional image includes: if at least one frame of the third optical signal is received, obtaining the two-dimensional image according to the at least one frame of the third optical signal; and obtaining the three-dimensional image according to the light field distribution information, the preset algorithm and the two-dimensional image.
In one possible implementation, light field distribution information in a micro-culture-dish array is obtained, wherein the micro-culture-dish array is arranged in the channel, the microscopic sample is placed in the micro-culture-dish array, the image sensor is coated on the inner wall surface of the micro-culture-dish array, and an illumination array provides a light source for the micro-culture-dish array. Obtaining the first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces and obtaining the second optical signal includes: obtaining the second optical signal from a first optical signal of each of the at least two photosensitive pixel surfaces obtained in a single acquisition, wherein the first optical signal includes a scattered optical signal formed by the light source, and the surface functional layer includes a coating layer that provides surface protection of the photosensitive pixel array and refractive index matching of the first optical signal. Modulating and encoding the second optical signal to obtain the third optical signal includes: modulating and encoding each frame of the second optical signal to obtain a corresponding frame of the third optical signal. Obtaining the two-dimensional image according to the third optical signal and obtaining the three-dimensional image according to the preset algorithm and the two-dimensional image includes: obtaining the two-dimensional image according to the received third optical signal; and obtaining the three-dimensional image according to the light field distribution information, the preset algorithm and the two-dimensional image.
In a fourth aspect, there is provided a computer readable storage medium having instructions stored therein which, when run on a processor, implement the method of any one of the preceding aspects and some or all of the operations included in any one of the possible implementations of any one of the preceding aspects.
In a fifth aspect, there is provided a computer program product comprising instructions which, when run on a processor, implement the method of any one of the preceding aspects and some or all of the operations included in any one of the possible implementations of any one of the preceding aspects.
In a sixth aspect, a chip is provided, including: an interface circuit and a processor. The interface circuit is coupled to the processor for causing the chip to perform some or all of the operations included in the method of any one of the preceding aspects and any possible implementation of any one of the preceding aspects.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments of the present application will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of an image sensor according to an embodiment of the present application;
Fig. 2 is a schematic diagram of an imaging device according to an embodiment of the present application;
Fig. 3 is a schematic structural diagram of a microfluidic system according to an embodiment of the present application;
Fig. 4 is a schematic flow chart of a fluorescence waveguide type 3D imaging method according to an embodiment of the present application;
Fig. 5 is a schematic diagram of the fluorescence waveguide type 3D imaging reconstruction principle provided in an embodiment of the present application;
Fig. 6 is a schematic flow chart of a bright field waveguide type 3D imaging method according to an embodiment of the present application;
Fig. 7 is a schematic diagram of the bright field waveguide type 3D imaging reconstruction principle according to an embodiment of the present application;
Fig. 8 is a schematic structural diagram of a no-flow microfluidic system according to an embodiment of the present application;
Fig. 9 is a schematic flow chart of a 3D imaging method of a no-flow microfluidic system according to an embodiment of the present application;
Fig. 10 is a schematic diagram of the 3D imaging reconstruction principle in a no-flow microfluidic system according to an embodiment of the present application.
Detailed Description
In order to make those skilled in the art better understand the solutions in the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, but not all embodiments.
The term "and/or" is herein merely an association relationship describing an associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone.
The terms first and second and the like in the description and in the claims of embodiments of the present application are used for distinguishing between different objects and not necessarily for describing a particular sequential order of objects. For example, the first target object and the second target object, etc., are used to distinguish between different target objects, and are not used to describe a particular order of target objects.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present application, unless otherwise indicated, the meaning of "a plurality" means two or more. For example, the plurality of processing units refers to two or more processing units; the plurality of systems means two or more systems.
For ease of understanding, the following description will first explain the relevant terms or terminology used in the embodiments of the present application:
1. microfluidic chip
A microfluidic chip integrates basic operations in chemistry, biology and related fields, such as sample preparation, reaction, separation, detection, culture, sorting and lysis, onto a single chip; micro-channels form a network through which controlled fluid flows throughout the whole operation, so that the various functions of a conventional chemistry or biology laboratory can be carried out. According to whether the liquid in the channel flows, common microfluidic chips can be classified into flow-type microfluidic chips and non-flow-type microfluidic chips.
2. Lens-less imaging
Lens-less imaging is a technique for imaging without the aid of a lens: the photosensitive pixel array of the image sensor directly receives signals from the imaged object (sample). Since there is no lens between the imaged object (sample) and the photosensitive pixel surface (target surface) of the photosensitive pixel array, an image of the object generally cannot be obtained directly on the image sensor, or the directly obtained image is blurred; after the image sensor acquires the signals, a specific algorithm is used for reconstruction to obtain the image of the imaged object (sample).
3. Biological sample
A biological sample includes biological tissue, cells or macromolecules; for example, it may be a single cell, a multicellular cluster, biological tissue, a protein molecule, a bacterium, a virus, deoxyribonucleic acid (DNA), an exosome, or the like.
The embodiments of the present application provide an image sensor that may be disposed in an imaging device for microscopic samples. A microscopic sample is a sample whose individual shape and structure are microscopic, with dimensions on the order of several micrometers to several millimeters, such as a biological sample; in the following examples and figures the biological sample is exemplified as a cell, but is not limited thereto. Further, the imaging device provided in the embodiments of the present application may include a microfluidic chip system designed for preparing, reacting, separating, detecting, culturing, sorting or lysing different biological cells and tissues, i.e. the imaging device may be a microfluidic chip, or the imaging device may be a device designed with another biological detection and analysis system, such as a flow cytometer system. When these systems are combined with the sensor provided in the embodiments of the present application in the imaging device, a 3D image of a microscopic sample can be obtained accurately and efficiently. The image sensor of the embodiments of the present application is suitable for different scenarios, in particular biomedical applications, such as implantable sensing and detection, or blood detection on dialysis lines.
Fig. 1 is a schematic structural diagram of an image sensor provided in an embodiment of the present application. As shown in Fig. 1, the image sensor 10 includes a photosensitive pixel array 101, a mask layer 102 and a surface functional layer 103, where the photosensitive pixel array 101 includes at least two photosensitive pixel surfaces. In Fig. 1 the inner wall of the channel is a rectangular body, and the photosensitive pixel array 101 wrapped around the inner wall of the channel may include four photosensitive pixel surfaces that respectively cover the four surfaces of the inner wall of the channel. The mask layer 102 is disposed on the photosensitive pixel array 101, and the surface functional layer 103 is disposed on the mask layer 102; the shapes of the mask layer 102 and the surface functional layer 103 may be the same as that of the photosensitive pixel array 101, i.e. the shapes obtained by growth also wrap around the four surfaces of the inner wall of the channel. Further, an image sensor structure that includes only the photosensitive pixel array and the surface functional layer also falls within the scope of the embodiments of the present application; the embodiments take the image sensor 10 including the photosensitive pixel array 101, the mask layer 102 and the surface functional layer 103 as an example, but are not limited thereto.
The image sensor can adapt to different imaging devices: it can be arranged in an imaging device together with equipment providing detection, analysis and similar functions, or it can be integrated in an imaging device that itself provides such functions. Fig. 2 is a schematic structural diagram of an imaging device provided in an embodiment of the present application. As shown in Fig. 2, the imaging device is a microfluidic chip 1 on which an image sensor 10 is disposed; in some examples, the image sensor 10 may be integrated on a channel 20 of the microfluidic chip 1 through the process design. As shown in Fig. 2 (left view), the image sensor 10 includes a photosensitive pixel array 101, a mask layer 102 and a surface functional layer 103. As shown in Fig. 2 (right view), taking a flow-type microfluidic chip and a cell 30 as the microscopic sample as an example, the channel 20 is wrapped by the image sensor 10 and has a hollow structure. When the microfluidic chip 1 is used, the channel 20 serves as a flow channel, and the cell 30 can move with the flow of the solution in the channel 20. Taking the structure of Fig. 2 as an example, the channel 20 is a rectangular body whose inner wall 201, shown in Fig. 2 (right view), comprises four surfaces, each of which is wrapped by the image sensor 10; that is, the photosensitive pixel array 101 is integrated on the channel inner wall 201 and includes four photosensitive pixel surfaces, denoted in Fig. 2 as the first photosensitive pixel surface 1011, the second photosensitive pixel surface 1012, the third photosensitive pixel surface 1013 and the fourth photosensitive pixel surface 1014, arranged on the four surfaces of the channel inner wall 201 respectively. In this case, unlike a planar photosensitive pixel array, the photosensitive pixel array 101 images the cell 30 in the channel 20 from four coated faces and can be regarded as a "folded-up" planar photosensitive pixel array.
In the microfluidic chip 1, the surface functional layer 103 of the image sensor 10 obtains a first optical signal of the cell 30 corresponding to each photosensitive pixel surface: as shown in Fig. 2, the surface functional layer 103 corresponds to the first photosensitive pixel surface 1011, the second photosensitive pixel surface 1012, the third photosensitive pixel surface 1013 and the fourth photosensitive pixel surface 1014, so that four first optical signals of the cell 30 are obtained in one frame of shooting. Different structures are arranged in the surface functional layer 103 and configured with corresponding functions, and after passing through the surface functional layer 103 the first optical signal becomes the second optical signal. Because a first optical signal is obtained corresponding to each of the at least two photosensitive pixel surfaces, the second optical signal likewise comprises four optical signals, one per photosensitive pixel surface. Each frame of the second optical signal may reach the first photosensitive pixel surface 1011, the second photosensitive pixel surface 1012, the third photosensitive pixel surface 1013 and the fourth photosensitive pixel surface 1014 of the photosensitive pixel array 101 directly, or may first be modulated and encoded by the mask layer 102 into a third optical signal that is then received by the photosensitive pixel array 101. It should be noted that, in the embodiments of the present application, a single frame, a frame and each frame refer to the result of one shooting, and a single frame refers to the 2D image corresponding to one photosensitive pixel surface. Referring to the example of Fig. 2, given the shape of the channel 20, the photosensitive pixel surfaces of the image sensor 10 wrap the channel on four sides for imaging, so one frame may yield the 2D images corresponding to four photosensitive pixel surfaces, whereas a planar image sensor has only one photosensitive pixel surface in one plane and can obtain only a single 2D image.
For example, the second optical signal reaching the first photosensitive pixel surface 1011 is received by that surface to obtain a 2D image, or the third optical signal reaching the first photosensitive pixel surface 1011 is received by that surface to obtain a 2D image, and so on. The image sensor 10 can thus obtain 2D images of the cell 30 corresponding to the first photosensitive pixel surface 1011, the second photosensitive pixel surface 1012, the third photosensitive pixel surface 1013 and the fourth photosensitive pixel surface 1014; that is, a single-frame acquisition of the photosensitive pixel array on the microfluidic chip 1 yields four single 2D images with different viewing angles, the four viewing angles covering the upper, lower, left and right sides of the cell 30. Because the photosensitive pixel array 101 is at a certain distance from the cell 30, the viewing angles covered by the first photosensitive pixel surface 1011, the second photosensitive pixel surface 1012, the third photosensitive pixel surface 1013 and the fourth photosensitive pixel surface 1014 are large, and the obtained optical signals can essentially represent the 3D information of the cell 30, so that a processor connected with the image sensor 10 can obtain a 3D image from the 2D images according to a preset algorithm. Further, in single-frame 3D imaging, the photosensitive pixel array 101 may obtain four single 2D images with different viewing angles from the second optical signal without modulation encoding, or from the modulated and encoded second optical signal, and a 3D image is obtained from them. In multi-frame 3D imaging, the photosensitive pixel array 101 obtains multiple frames of four single 2D images with different viewing angles from multiple frames of the modulated and encoded second optical signal, so that a 3D image with higher imaging quality is obtained.
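For illustration only, a hypothetical data layout for the single-frame and multi-frame cases described above; the per-face resolution and frame count are arbitrary choices, not values from the disclosure:

```python
import numpy as np

H = W = 64                                  # hypothetical per-face resolution
single_frame = np.zeros((4, H, W))          # one 2D image per face: upper/lower/left/right views
multi_frame = np.zeros((10, 4, H, W))       # e.g. 10 frames captured as the cell flows past

# A single frame already spans four viewing angles around the channel, so a 3D
# image can be reconstructed from it alone; stacking frames with different mask
# patterns adds measurement diversity and improves reconstruction quality.
```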
In addition, the embodiments of the present application also provide an integration scheme of the image sensor with an imaging device such as a microfluidic chip: the image sensor is integrated on the inner wall surface of a channel of the microfluidic chip, so that response-spectrum coverage of the microscopic sample and lens-free on-chip acquisition of three-dimensional information are achieved under simple illumination, with a simple optical design and without controlling cell rolling. Optical signals of the microscopic sample can therefore be obtained from more viewing angles during 2D image acquisition, yielding an efficient, high-quality 3D imaging result.
In some examples, the main construction of the microfluidic chip 1 comprises a bottom substrate, a middle channel layer and a top cover sheet, often fabricated by patterning glass, polymer, quartz, silicon or similar materials. The microfluidic chip 1 may be fabricated by a complementary metal oxide semiconductor (CMOS) compatible silicon-based process, a silicon-based micro-electromechanical systems (Micro-ElectroMechanical System, MEMS) process, a glass, quartz or polymer process, or another novel process. The microfluidic chip 1 may be a microfluidic functional chip having a specific single function or multiple functions. According to its functions, the layout of the microfluidic chip 1 may mainly include an inlet region, an outlet region, a functional region and a channel region, where the channel region may include micro channels on the order of a hundred micrometers or less, the micro channels may include the channels 20 illustrated above, and the respective functional regions, the inlet region and the outlet region of the microfluidic chip 1 may communicate through the channels 20.
In some examples, the photosensitive pixel array 101 is disposed in the microfluidic chip 1 as shown in fig. 2. Since the channel 20 is a rectangular body, in the photosensitive pixel array 101 wrapping the channel 20 the first photosensitive pixel surface 1011, the second photosensitive pixel surface 1012, the third photosensitive pixel surface 1013 and the fourth photosensitive pixel surface 1014 are respectively located on the four inner wall surfaces of the channel 20, and the angle between the normals of adjacent photosensitive pixel surfaces may be 90°. For the microfluidic chip 1, the inner wall surfaces refer to the surfaces of the channel 20 close to the cell 30. Further, the first photosensitive pixel surface 1011, the second photosensitive pixel surface 1012, the third photosensitive pixel surface 1013 and the fourth photosensitive pixel surface 1014 respectively obtain optical signals of the cell 30 from the upper, lower, left and right angles shown in the right diagram of fig. 2, so as to obtain image information, which is equivalent to realizing 2D image acquisition of the cell 30 in the channel 20 at a plurality of angles, the four photosensitive pixel surfaces respectively yielding four single 2D images. Therefore, the photosensitive pixel array 101 can cover four viewing angles with single-frame sampling and, combined with a preset algorithm, single 2D images at four different viewing angles can be obtained from a single-frame acquisition to produce a 3D image, without the microfluidic chip 1 needing other adjustments, such as calling a complex algorithm or performing optical scheduling, to obtain 2D images of the cell 30 at different viewing angles. The photosensitive pixel array provided by the embodiment of the application is coated, in the form of coating surfaces, on the side of the channel close to the microscopic sample; the coating surfaces can be selected according to the geometric shape of the channel, and the design principle is that the photosensitive pixel surfaces should coat the channel of the microfluidic chip as much as possible, so as to realize 2D imaging of the microscopic sample in the channel at different angles.
When the photosensitive pixel array is applied to different scenes, its shape can be determined according to the geometric shape of the channel carrying the microscopic sample in the imaging device, the flow mode and other factors. For example, when the surfaces of the channel covered by the photosensitive pixel array are planes, the photosensitive pixel array can include two or more photosensitive pixel surfaces, and the coverage includes, but is not limited to, two-sided, three-sided, four-sided and five-sided coverage, chosen in combination with the geometric design of the channel and the imaging requirement of the microscopic sample, with each photosensitive pixel surface not coplanar with its adjacent photosensitive pixel surfaces. When the surface of the channel to be covered is curved, a photosensitive pixel surface conforming to the shape of the curved surface may be formed. The design of the different cladding surfaces, namely the photosensitive pixel surfaces, cladding one channel is consistent, and the number of cladding surfaces can be determined according to the channel or the actual requirement. The photosensitive pixel array provided by the embodiment of the application can be obtained based on a traditional photodiode design or other novel designs; its response spectrum covers visible to near-infrared light, and the wavelength can be taken in the range of 400 nm-1550 nm. The material of the photosensitive pixel array can be selected according to the pixels in the array, for example a traditional semiconductor material (including silicon, germanium, gallium arsenide and the like) or a novel material (such as an organic semiconductor, quantum dots, two-dimensional materials, oxide semiconductors and the like). Further, the pixel size can be between 0.1 μm and 40 μm according to different imaging requirements such as different imaging resolutions; for example, 9 μm can be selected. In addition, each photosensitive pixel surface is provided with a mask layer and a surface functional layer. The mask layer is used for modulation encoding of the intensity, phase, wavelength, polarization and the like of the optical signal, and includes, but is not limited to, metasurfaces, random pattern masks, phase encoding masks, diffraction elements and the like. The material of the mask layer may be selected from dielectric materials, liquid crystal materials, ferroelectric materials, metallic materials, organic polymer materials, or the like. The surface functional layer is a composite functional layer whose main realizable functions include: passivation protection of the pixel surface, light filtering, and refractive index matching to reduce the propagation loss of the optical waveguide mode. In different application scenarios, the composite function of the surface functional layer may comprise at least one of these functions. The material of the surface functional layer may be selected from dielectric materials, metal materials, organic polymer materials, or the like.
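As a minimal illustration (not part of the patent disclosure) of the transmittance-type modulation encoding described above, the following Python/NumPy sketch applies a random binary transmittance mask to the intensity of a second optical signal arriving at one photosensitive pixel surface to produce a coded third optical signal. The array sizes are hypothetical, and an intensity-only model does not capture the phase, wavelength or polarization encoding a real mask layer may also perform.

```python
import numpy as np

rng = np.random.default_rng(42)

# Second optical signal arriving at one photosensitive pixel surface (intensity only).
second_signal = rng.random((128, 128))

# Random binary transmittance mask, one value per pixel (0 = blocked, 1 = transmitted).
mask = (rng.random((128, 128)) > 0.5).astype(float)

# Transmittance-coded modulation: element-wise attenuation of the incident intensity.
third_signal = mask * second_signal
```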
In some examples, illumination may be provided for cells 30 in channel 20, including fiber-coupled illumination and free-space illumination, which may be coherent, partially coherent, or incoherent, depending on the application scenario and algorithm requirements.
In some examples, the preset algorithm used in obtaining the 3D image according to the preset algorithm and the 2D image corresponding to each photosensitive pixel surface may be an image reconstruction algorithm, including but not limited to compressed-sensing-based algorithms, deep-learning-based methods, pixel super-resolution algorithms, holographic-interference-based algorithms, and the like.
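As one example of what such a preset reconstruction algorithm could look like, the sketch below implements a generic compressed-sensing solver (ISTA, iterative soft-thresholding) for a linear measurement model y = A x with a sparsity prior. The sensing matrix A, the problem sizes and the sparsity level are placeholders chosen for illustration and are not taken from the patent.

```python
import numpy as np

def ista(A, y, lam=0.1, steps=200):
    """Minimize 0.5*||A x - y||^2 + lam*||x||_1 by iterative soft-thresholding."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - y)
        x = x - grad / L
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)   # soft threshold
    return x

# Toy example: A maps a sparse 3D volume (flattened) to stacked 2D measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((400, 1000))         # hypothetical sensing matrix
x_true = np.zeros(1000)
x_true[rng.choice(1000, 20, replace=False)] = 1.0
y = A @ x_true
x_rec = ista(A, y)
```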
Since the photosensitive pixel surfaces of the photosensitive pixel array provided in the embodiment of the application are integrated on the inner wall surface of the channel, the distance between the photosensitive pixel surface (imaging target surface) of the image sensor and the cell (imaging object) is well suited to lens-free imaging, that is, no optical lens is arranged between the photosensitive pixel surface and the cell. Lens-free imaging techniques include, but are not limited to, diffraction-type lens-free imaging, projection-type lens-free imaging, bright-field illumination lens-free imaging, fluorescence lens-free imaging, and the like. The imaging device provided by the embodiment of the application images with a lens-free technique, which, compared with lens-based imaging, has the advantages of a large field of view, high throughput, low hardware requirements, and the like.
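For the diffraction-type lens-free imaging mentioned here, a common reconstruction step is numerical back-propagation of the recorded diffraction pattern to the object plane. The sketch below uses the standard angular spectrum method; it is a generic illustration rather than the patent's procedure, the wavelength, pixel pitch and propagation distance are illustrative values only (the 9 μm pitch echoes the example pixel size above), and the zero-phase amplitude initialization is a simplification of real holographic phase retrieval.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex field by distance z using the angular spectrum method."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.exp(1j * 2 * np.pi / wavelength * z * np.sqrt(np.maximum(arg, 0.0)))
    H[arg < 0] = 0.0                         # suppress evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Back-propagate a recorded intensity frame to an assumed object plane (placeholder data).
hologram = np.random.rand(256, 256)
field_at_sensor = np.sqrt(hologram)          # amplitude estimate with zero phase
recovered = angular_spectrum_propagate(field_at_sensor, wavelength=532e-9,
                                       dx=9e-6, z=-50e-6)  # negative z: back-propagation
object_estimate = np.abs(recovered)
```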
The following examples illustrate the use of the image sensor provided in embodiments of the present application in a microfluidic system for 3D imaging and reconstruction of biological samples. Fig. 3 is a schematic structural diagram of a microfluidic system according to an embodiment of the present application, and as shown in fig. 3, the microfluidic system 2 includes an image sensor 10, a microfluidic chip 1, an optical fiber 80, and an incident illumination light source 40.
The image sensor 10 has a 3D cladding structure; its structure can refer to fig. 1, and the structure of the photosensitive pixel array 101 in the image sensor 10 can refer to fig. 2. Based on the rectangular body of the channel 20, the photosensitive pixel array 101 is a three-dimensional structure clad by four photosensitive pixel surfaces, adjacent surfaces being non-coplanar with a 90° angle between their normals. The structure of the microfluidic chip 1 can refer to fig. 2, and the incident illumination light source can provide incident light for the microfluidic chip 1, as shown in fig. 3. The following takes the microfluidic system 2 as a waveguide-type 3D imaging system, the microfluidic chip 1 as a flow-type microfluidic chip, and the microscopic sample as the cell 30 as an example. Since the cells 30 move with the flow of the solution in the channel 20 of the flow-type microfluidic chip, the photosensitive pixel array 101 is disposed on the upper, lower, left and right sides of the channel 20, respectively, on the principle that the movement of the cells 30 in the channel is not affected. The flow-type microfluidic chip may be any of various flow-type microfluidic functional chips, whose microfluidic control functions include, but are not limited to: microscopic sample analysis, microscopic sample sorting, microscopic sample separation, microscopic sample imaging, microscopic sample enrichment, microscopic sample reaction, microscopic sample lysis, microscopic sample preparation, impedance detection, and the like. In the flow-type microfluidic chip, the channel provided with the photosensitive pixel array can be a channel region located between other functional regions on the microfluidic chip, or an independent functional region for realizing cell imaging. For the waveguide-type 3D imaging system, as shown in fig. 3, illumination provided by the incident illumination light source 40 may be coupled into the micro-channel through an optical fiber 80, where the incident illumination light source 40 may be monochromatic coherent light or polychromatic coherent light, for example a laser source, a superluminescent diode (SuperLuminescent Diode, SLD) or a light emitting diode (Light Emitting Diode, LED), the wavelength may be in the range of 400 nm to 1550 nm, and the optical fiber 80 may be a single-mode or multimode optical fiber. In different scenarios, the imaging modes include fluorescence imaging and bright-field imaging, and the following embodiments describe waveguide-type 3D imaging system fluorescence imaging and waveguide-type 3D imaging system bright-field imaging, respectively.
In one possible example, the fluorescence-imaging waveguide-type 3D imaging system may be referred to as a fluorescence waveguide-type 3D imaging system. In this system, the microscopic sample to be imaged, such as the cell 30, needs to undergo a staining pretreatment, that is, the cell 30 is labeled with a fluorescent substance so that a fluorescence signal can be excited; the staining pretreatment includes, but is not limited to, dye labeling, quantum dot labeling, and the like. The light used to excite the fluorescent molecules on the cell 30 is monochromatic coherent light. Referring to fig. 3, the incident illumination light source 40 is monochromatic coherent light, which is introduced into the microfluidic chip 1 through an optical fiber 80 such as a single-mode or multimode optical fiber and excites the fluorescent-substance-labeled cells 30 in the channel 20 to emit fluorescence; the monochromatic coherent light can also be referred to as excitation light. The image sensor 10 wraps the channel on four sides based on the rectangular parallelepiped shape of the channel in fig. 3, and the fluorescence signals excited from the cells 30 are collected by the image sensor 10 on the four wrapping surfaces. In the fluorescence waveguide-type 3D imaging system, the surface functional layer of the image sensor 10 includes a filter layer and a coating layer: the filter layer is used to filter out the excitation light, and the coating layer is used on the one hand for passivation and protection of the pixel surface and on the other hand to enable the channel 20 to meet the waveguide condition for light field propagation.
Fig. 4 is a schematic flow chart of a fluorescence waveguide type 3D imaging method provided in an embodiment of the present application, where the method is applicable to the above fluorescence waveguide type 3D imaging system, and in a practical application scenario, according to different algorithms and imaging requirements, 3D imaging may be obtained through a single-frame 2D image or through multiple-frame 2D images, as shown in fig. 4, and the method includes:
S101, after dyeing pretreatment, the cells are sent into the flow-type microfluidic chip.
In the fluorescent waveguide type 3D imaging system, the dyeing pretreatment can be controlled by a flow type micro-fluidic chip, or can be controlled by other devices in the system before the cells enter the flow type micro-fluidic chip.
S102, the surface functional layer of the image sensor processes the collected first optical signals to obtain second optical signals.
Referring to the fluorescence imaging scenario in fig. 3, the monochromatic coherent light of the incident illumination light source is introduced into the channel through the optical fiber and excites the cell to emit fluorescence; the monochromatic coherent light is denoted as excitation light. At this time the channel contains not only the fluorescence signal but also the excitation light signal; that is, the first optical signal collected by the image sensor includes a fluorescence signal and an excitation light signal. When the first optical signal passes through the filter layer of the surface functional layer, the excitation light signal is filtered out, and the fluorescence signal, namely the second optical signal, is obtained.
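A toy spectral model of this filtering step is sketched below: the collected first optical signal is represented as a spectrum containing a narrow excitation line and a red-shifted fluorescence band, and a long-pass filter removes the excitation component to leave the second optical signal. The specific wavelengths (488 nm excitation, roughly 530 nm emission, 505 nm cut-on) are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Wavelength axis (nm) and a toy spectrum of the collected first optical signal:
# a narrow excitation line plus a broader red-shifted fluorescence band.
wavelengths = np.linspace(400, 700, 301)
excitation = 1.0 * np.exp(-((wavelengths - 488) / 2.0) ** 2)
fluorescence = 0.3 * np.exp(-((wavelengths - 530) / 15.0) ** 2)
first_signal = excitation + fluorescence

# Long-pass filter response: block the excitation line, transmit the emission band.
filter_transmission = np.where(wavelengths >= 505, 1.0, 0.0)
second_signal = first_signal * filter_transmission
```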
The 3D image may be obtained from a single-frame 2D image or from multi-frame 2D images. After S102, it may be determined whether the 3D imaging operation mode set by the system is the single-frame 2D image reconstruction mode; if so, S103 is performed, and if not, S105 is performed.
S103, the image sensor receives a second optical signal or a third optical signal, wherein the third optical signal is obtained by modulating and encoding the second optical signal through a mask layer.
The photosensitive pixel array of the image sensor may receive the second optical signal or the third optical signal. Masks in the mask layer include, but are not limited to, wavelength-encoded masks, transmittance-encoded masks and phase-encoded masks.
S104, acquiring a single-frame 2D image by a photosensitive pixel array of the image sensor according to the received optical signal.
If the photosensitive pixel array of the image sensor receives the second optical signal, it acquires a single 2D image according to the received second optical signal; if it receives the third optical signal, it acquires a single 2D image according to the third optical signal.
S105, modulating and encoding the second optical signal by the mask layer of the image sensor under different mask patterns to obtain a third optical signal.
Because the mask layer of the image sensor is disposed on the inner wall surface of the channel and has different mask patterns at different positions, the fluorescence signal of a cell whose position changes with the flow of the solution in the channel passes through the mask layer at different positions, so that multiple frames of the third optical signal are obtained, forming multi-frame 2D images.
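The multi-frame coding described here can be pictured with the following sketch: a spatially varying mask strip lines the channel wall, and as the cell drifts with the flow each captured frame sees a different local mask pattern. All sizes, the binary mask model and the fixed per-frame shift are hypothetical simplifications used only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

H, W, K = 128, 512, 5                        # sensor height, mask strip length, frames
mask_strip = (rng.random((H, W)) > 0.5).astype(float)   # position-dependent mask pattern
cell_image = rng.random((H, 128))            # fluorescence pattern of the moving cell

frames = []
for k in range(K):
    offset = k * 64                          # cell position shifts as the solution flows
    local_mask = mask_strip[:, offset:offset + 128]
    frames.append(local_mask * cell_image)   # each frame is coded by a different pattern
measurements = np.stack(frames)              # K differently coded 2D frames
```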
S106, the photosensitive pixel array of the image sensor acquires a multi-frame 2D image according to the received third optical signal.
S107 is performed after S104 or S106.
S107, the processor obtains a 3D image according to a preset algorithm and the acquired 2D image.
After S104, the processor connected to the image sensor may receive the single-frame 2D image acquired by the image sensor, obtain single-frame 2D image data, and reconstruct the 3D image by using the compressive-imaging 3D reconstruction algorithm. Because the image sensor has photosensitive pixel surfaces on the inner wall surfaces of the channel, a single-frame acquisition yields single 2D images on four surfaces, the three-dimensional image can be reconstructed from these four 2D images, and a 3D image can be obtained from a single-frame acquisition, so the image acquisition and processing are faster.
After S106, the processor connected to the image sensor may receive the multi-frame 2D images acquired by the image sensor, obtain multi-frame 2D image data in which each frame includes the data of four 2D images, and reconstruct the 3D image by using a preset algorithm, such as a compressive-imaging 3D reconstruction algorithm, so that the obtained 3D image has higher accuracy and higher quality.
Fig. 5 is a schematic diagram of the fluorescence waveguide-type 3D imaging reconstruction principle provided in an embodiment of the present application. As shown in fig. 5 (upper diagram), a 3D image is obtained by 3D imaging reconstruction, and the flow of signal acquisition and image reconstruction for fluorescence waveguide-type single-frame 3D imaging includes: single-frame 2D image acquisition is performed as the cell flows through the channel. Because the surface functional layer of the image sensor includes a filter layer and a coating layer, the coating layer provides surface passivation protection of the photosensitive pixel array and refractive index matching of the waveguide mode, while the filter layer removes the excitation light signal before the first optical signal in the channel passes through the mask layer, so that the second optical signal, which in this scene is the fluorescence signal, is obtained. The fluorescence signal may be received directly by the photosensitive pixel array, or it may be modulated and encoded by the mask layer to obtain a third optical signal that is received by the photosensitive pixel array. The photosensitive pixel array obtains and outputs a single-frame 2D imaging result, namely a 2D image, and the 3D image is then reconstructed by the compressive-imaging 3D reconstruction algorithm.
As shown in fig. 5 (lower diagram), the flow of signal acquisition and image reconstruction for fluorescence waveguide-type multi-frame 3D imaging includes: multi-frame 2D image acquisition is performed as the cell flows through the channel. The surface functional layer includes a filter layer and a coating layer; the filter layer removes the excitation light signal before the first optical signal in the channel passes through the mask layer, yielding the fluorescence signal, which is modulated and encoded by the mask layer and then received by the photosensitive pixel array. The mask patterns of the image sensor at different spatial positions are different and randomly arranged, which ensures that the mask patterns of successive frames during acquisition have a certain degree of distinction and difference. The photosensitive pixel array obtains and outputs a multi-frame 2D imaging result, and the 3D image is then reconstructed by the compressive-imaging 3D reconstruction algorithm.
The signal acquisition and image reconstruction of the 3D imaging provided by the embodiment of the application require neither controlled rolling of the cells nor rotating a light source or the image sensor around the imaging object, which reduces the difficulty of signal acquisition and improves the efficiency of image reconstruction.
In one possible example, the bright-field imaging of the waveguide-type 3D imaging system may be referred to as a bright-field waveguide-type 3D imaging system. In this system, the microscopic sample to be imaged, such as the cell 30, does not need to undergo a dyeing pretreatment. Referring to fig. 3, the incident illumination light source 40 may be either monochromatic coherent light or polychromatic coherent light, which is introduced into the microfluidic chip 1 through a single-mode or multimode optical fiber and forms scattered light on the cells 30; the image sensor 10 wraps the channel on four sides based on the rectangular shape of the channel in fig. 3, and the scattered light signals of the cells 30 are collected by the image sensor 10 on the four wrapping surfaces. In the bright-field waveguide-type 3D imaging system, the surface functional layer of the image sensor 10 includes a coating layer, and the scattered light signal is received by the photosensitive pixel array after being modulated and encoded by a mask layer, where the mask in the mask layer includes, but is not limited to, a wavelength-encoding mask, a transmittance-encoding mask and a phase-encoding mask.
Fig. 6 is a schematic flow chart of a bright-field waveguide-type 3D imaging method according to an embodiment of the present application. The method is suitable for the bright-field waveguide-type 3D imaging system described above, and its hardware still corresponds to fig. 3; the differences are that the surface functional layer on the image sensor does not include a filter layer, the microscopic samples such as cells do not need to undergo dyeing pretreatment, and light field distribution information of the scene needs to be obtained.
In a practical application scene, according to different algorithms and imaging requirements, 3D imaging can be obtained through a single-frame 2D image or through multiple frames of 2D images, as shown in fig. 6, and the method includes:
S201, the flow-type microfluidic chip obtains light field distribution information through light field simulation or experimental test.
In the bright-field waveguide-type 3D imaging system, the light field distribution information obtained through light field simulation can come from commercial electromagnetic simulation software or from independently written calculation code. The experimental test can be that the image sensor 10 collects blank background signals while no sample flows through the microfluidic chip, under the same hardware settings.
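Where the light field distribution is obtained by experimental test rather than simulation, one simple estimate is the average of several blank frames recorded with no sample in the channel, as in the sketch below; the random arrays stand in for real sensor frames and the frame count is arbitrary.

```python
import numpy as np

# Blank frames recorded while no sample is in the channel (placeholder data).
blank_frames = np.random.rand(20, 128, 128)

# Averaging suppresses frame-to-frame noise and yields the background light-field distribution.
light_field_background = blank_frames.mean(axis=0)
```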
S202, the surface functional layer of the image sensor processes the collected first optical signals to obtain second optical signals.
Referring to the scenario of fig. 3, coherent light of an incident illumination light source is introduced into a channel through an optical fiber, the coherent light is incident on a cell to generate scattering, a scattered light signal in the channel is collected once by an image sensor to obtain a first light signal, and the first light signal is subjected to refractive index matching through a coating layer of a surface functional layer to reduce loss in waveguide propagation of an optical field, increase collection efficiency of the scattered light, and obtain a second light signal.
The 3D image may be obtained from a single-frame 2D image or from multi-frame 2D images. After S202, it may be determined whether the 3D imaging operation mode set by the system is the single-frame 2D image reconstruction mode; if so, S203 is performed, and if not, S205 is performed.
S203, the image sensor receives a second optical signal or a third optical signal, wherein the third optical signal is obtained by modulating and encoding the second optical signal through a mask layer.
When the 3D imaging is obtained from a single frame of 2D image, the image sensor photosensitive pixel array may receive the second optical signal or the third optical signal.
S204, the photosensitive pixel array of the image sensor acquires a single-frame 2D image according to the received optical signal.
S205, modulating and encoding the second optical signal by the mask layer of the image sensor under different mask patterns to obtain a third optical signal.
Because the mask layer of the image sensor is disposed on the inner wall surface of the channel and has different mask patterns at different positions, the scattered light signal of a cell whose position changes with the flow of the solution in the channel passes through the mask layer at different positions, forming multi-frame 2D images.
S206, the photosensitive pixel array of the image sensor acquires a multi-frame 2D image according to the received third optical signal.
S207 is performed after S204 or S206.
S207, the processor obtains a 3D image according to the light field distribution information, a preset algorithm and the acquired 2D image.
After S204, the processor connected to the image sensor may receive the single-frame 2D image acquired by the image sensor, obtain single-frame 2D image data, and reconstruct the 3D image by using the compressive-imaging 3D reconstruction algorithm. Because the image sensor has photosensitive pixel surfaces on the inner wall surfaces of the channel, a single-frame acquisition yields single 2D images on four surfaces, the three-dimensional image can be reconstructed from these four 2D images, and a 3D image can be obtained from one acquisition, so the image acquisition and processing are faster.
After S206, the processor connected to the image sensor may receive the multi-frame 2D image acquired by the image sensor, obtain multi-frame 2D image data, where each frame includes data of four 2D images, and implement reconstruction of the 3D image by using a compression imaging 3D reconstruction algorithm to obtain a 3D image, so that the accuracy of the obtained 3D image is higher, and the quality of the obtained 3D image is higher.
Fig. 7 is a schematic diagram of the bright-field waveguide-type 3D imaging reconstruction principle provided in an embodiment of the present application; the corresponding system is shown in fig. 3. As shown in fig. 7 (upper diagram), 3D imaging reconstruction yields a 3D image, and the flow of signal acquisition and image reconstruction for bright-field waveguide-type single-frame 3D imaging includes: first, light field distribution information is obtained (for example, by light field simulation), and single-frame 2D image acquisition is performed as the cell flows through the channel. Because the surface functional layer of the image sensor only includes a coating layer, which provides surface protection of the photosensitive pixel array and refractive index matching of the first optical signal so as to reduce the loss of waveguide propagation of the light field in the channel 20, the second optical signal may be received directly by the photosensitive pixel array, or a third optical signal may be obtained through modulation encoding by the mask layer and received by the photosensitive pixel array. The photosensitive pixel array obtains and outputs a single-frame 2D imaging result, namely a 2D image, and the 3D image is reconstructed by the compressive-imaging 3D reconstruction algorithm combined with the light field distribution information.
As shown in fig. 7 (lower diagram), the flow of signal acquisition and image reconstruction for bright-field waveguide-type multi-frame 3D imaging includes: first, light field distribution information is obtained (for example, by light field simulation), and multi-frame 2D image acquisition is performed as the cell flows through the channel. The surface functional layer includes a coating layer; the first optical signal passes through the coating layer to obtain the second optical signal, the second optical signal enters the mask layer and is modulated and encoded to obtain the third optical signal, and the third optical signal is received by the photosensitive pixel array. The mask patterns of the image sensor at different spatial positions are different and randomly arranged, which ensures that the mask patterns of successive frames during acquisition have a certain degree of distinction and difference. The photosensitive pixel array obtains and outputs a multi-frame 2D imaging result, and the 3D image is reconstructed by the compressive-imaging 3D reconstruction algorithm combined with the light field distribution information.
Single-frame and multi-frame 3D reconstruction algorithms for the bright-field waveguide-type system include, but are not limited to: methods using an inverse imaging model plus regularization, algorithms based on physical models, algorithms based on data-driven neural networks, deep-learning-based methods, and the like.
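As a minimal illustration of the "inverse imaging model plus regularization" family listed above, the sketch below first normalizes the coded measurements by the background light field and then runs plain gradient descent on a Tikhonov-regularized least-squares objective. The forward matrix A, all dimensions, the step size and the regularization weight are hypothetical placeholders standing in for a calibrated physical model.

```python
import numpy as np

def reconstruct(measurements, background, A, lam=1e-2, steps=300, lr=1e-2):
    """Gradient descent on 0.5*||A x - y||^2 + lam*||x||^2 after background normalization."""
    y = (measurements / (background + 1e-9)).ravel()   # flat-field style normalization
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - y) + lam * x
        x -= lr * grad
    return x

# Toy dimensions: A is a hypothetical forward model from volume voxels to sensor pixels.
rng = np.random.default_rng(3)
A = rng.standard_normal((256, 512)) / 16.0
background = np.full((16, 16), 0.8)
measurements = np.abs(rng.standard_normal((16, 16)))
volume = reconstruct(measurements, background, A)
```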
In one possible example, the microfluidic system is a non-flow-type microfluidic 3D imaging system. Fig. 8 is a schematic structural diagram of a non-flow-type microfluidic system provided in an embodiment of the present application. As shown in fig. 8, the microfluidic system 3 includes a microfluidic chip 4, an illumination LED array 50 and a peripheral structure support 60; the microscopic sample is again taken to be the cell 30 as an example. The microfluidic chip 4 is a non-flow-type microfluidic chip and includes an image sensor 70, a micro-dish array 90, a microfluidic chip substrate and other functional regions. The microfluidic chip 4 may be any of various non-flow-type functional chips, including, but not limited to, an imaging microfluidic chip, an impedance detection microfluidic chip, a cell culture dish array, and the like. The structure of the image sensor 70 may refer to fig. 1; it includes a surface functional layer, a mask layer and a photosensitive pixel array, where the photosensitive pixel array is a three-dimensional structure clad by a plurality of photosensitive pixel surfaces. Since the micro-dish array 90 in the non-flow-type microfluidic chip is a five-sided closed micro-dish array structure, the photosensitive pixel array of the image sensor 70 clads the micro-dish array on five sides, coating the inner walls of the micro-dish array 90 on the side close to the cells 30, as shown in fig. 8 (right diagram), and the cells 30 are placed one by one in the micro-dish array 90 through other sorting or separation techniques. As shown in fig. 8 (left diagram), the illumination LED array 50 is disposed above the micro-dish array 90; it is a partially coherent light source whose wavelength may be any wavelength in the 400 nm to 1550 nm band, so as to provide partially coherent illumination. The illumination LED array 50 is fixed on the peripheral structure support 60, and in some usage scenarios of the microfluidic system 3, after the microfluidic chip 4 is manufactured, the illumination LED array 50 fixed on the peripheral structure support 60 is assembled with the microfluidic chip to form the microfluidic system 3. Since the microfluidic system 3 is a non-flow-type microfluidic 3D imaging system, a 3D image can be reconstructed from a single-frame 2D image.
Fig. 9 is a schematic flow chart of a 3D imaging method of a no-flow microfluidic system according to an embodiment of the present application, as shown in fig. 9, the method includes:
S301, the non-flow-type microfluidic chip obtains light field distribution information through light field simulation or experimental test.
The non-flow-type microfluidic chip can obtain the light field distribution information through simulation with commercial electromagnetic simulation software, or through simulation with independently written calculation code. The experimental test may be to let the image sensor 70 collect a blank background signal while no sample is present in the microfluidic chip, under the same hardware settings.
S302, the surface functional layer of the image sensor processes the collected first optical signal to obtain a second optical signal.
Referring to the scenario of fig. 8, the partially coherent light of the illumination LED array 50 is directed onto the cells 30 and produces scattering; the image sensor 70 collects the scattered light signal in the micro-dish array 90 in one acquisition to obtain the first optical signal, and the first optical signal passes through the coating layer of the surface functional layer for refractive index matching, reducing losses in waveguide propagation of the light field and increasing the collection efficiency of the scattered light, to obtain the second optical signal.
Because the cells 30 are fixedly placed in the micro-dish array 90 of the microfluidic chip, 3D imaging can be obtained from a single-frame 2D image.
S303, the image sensor receives a second optical signal or a third optical signal, wherein the third optical signal is obtained by modulating and encoding the second optical signal through a mask layer.
When the 3D imaging is obtained from a single frame of 2D image, the image sensor photosensitive pixel array may receive the second optical signal or the third optical signal.
S304, the photosensitive pixel array of the image sensor acquires a single-frame 2D image according to the received optical signal.
S305, the processor obtains a 3D image according to the light field distribution information, a preset algorithm and the acquired 2D image.
Fig. 10 is a schematic diagram of the 3D imaging reconstruction principle of the non-flow-type microfluidic system according to the embodiment of the present application. As shown in fig. 10, the non-flow-type microfluidic system first obtains light field distribution information through light field simulation, and then performs single-frame image acquisition of the cells placed in the non-flow-type microfluidic chip. Since the surface functional layer of the image sensor only includes a coating layer, which provides surface protection of the photosensitive pixel array, the image sensor images the cells from the five cladding surfaces (i.e., photosensitive pixel surfaces), acquires and outputs five 2D imaging results in a single frame, and the 3D image is then reconstructed by the compressive-imaging 3D reconstruction algorithm combined with the light field distribution information.
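For the five-sided cladding of the non-flow system, the single-frame input to the reconstruction step can be pictured as five background-normalized 2D views stacked together, as in the sketch below; the surface names, array sizes, constant background and the commented-out reconstruct_3d call are all hypothetical placeholders, not elements of the patent.

```python
import numpy as np

# Five single-frame 2D images, one per cladding (photosensitive pixel) surface.
views = {name: np.random.rand(64, 64)
         for name in ("bottom", "left", "right", "front", "back")}

# Normalize each view by the corresponding simulated background light field,
# then stack them as the input of a 3D reconstruction routine.
background = {name: np.full((64, 64), 0.9) for name in views}
stacked = np.stack([views[n] / background[n] for n in views])   # shape (5, 64, 64)
# reconstruct_3d(stacked) would be the preset compressive-imaging reconstruction step.
```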
In practical applications, the photosensitive pixel array provided by the application can likewise be wrapped around channels carrying microscopic samples in other microscopic sample imaging devices, such as biological detection and analysis equipment; given the diversity of such devices, they are not enumerated one by one in the embodiments of the present application.
The imaging device provided by the embodiment of the application can be applied to medical and biological instruments, in particular miniaturized, integrated instruments such as those for bedside detection, on-site rapid detection and micro total analysis systems. The on-chip imaging scheme for microscopic samples based on the microfluidic chip requires neither complex optical design (scanning the illumination direction, designing complex LED array illumination) nor mechanical design (such as a mechanical arm to rotate the sample, or a rotating light source), which greatly simplifies imaging and operation complexity, and can well achieve miniaturization and high efficiency of the microfluidic system while obtaining a 3D image.
Embodiments of the present application also provide a computer-readable storage medium having instructions stored therein that, when executed on a processor, implement some or all of the operations in any of the methods of any of the preceding embodiments.
Embodiments of the present application also provide a computer program product comprising a computer program which, when run on a processor, implements part or all of the operations of any of the methods described in any of the preceding embodiments.
The present application also provides an imaging system, including the flow-type microfluidic system and the non-flow-type microfluidic system provided in the foregoing embodiments, for implementing part or all of the operations in any of the methods described in any of the foregoing embodiments.
The present application also provides another imaging system, including at least one memory and at least one processor, the at least one memory storing instructions that are executable by the at least one processor to cause the imaging system to implement some or all of the operations of any of the methods of any of the previous embodiments.
The embodiment of the application also provides a chip, which comprises: an interface circuit and a processor. The interface circuit is connected to the processor for causing the chip to perform part or all of the operations of any of the methods described in any of the preceding embodiments.
The embodiment of the application also provides a chip system, which comprises: a processor coupled to a memory for storing programs or instructions which, when executed by the processor, cause the system-on-a-chip to perform part or all of the operations of any of the methods of any of the preceding embodiments.
Optionally, there may be one or more processors in the system-on-chip. The processor may be implemented in hardware or in software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general-purpose processor implemented by reading software code stored in a memory.
Optionally, there may be one or more memories in the system-on-chip. The memory may be integrated with the processor or may be separate from the processor; embodiments of the present application are not limited in this respect. For example, the memory may be a non-transitory memory, such as a ROM, which may be integrated on the same chip as the processor or provided separately on different chips; the type of memory and the manner of arranging the memory and the processor are not specifically limited in the embodiments of the present application.
The system-on-chip may be, for example, a field programmable gate array (Field Programmable Gate Array, FPGA), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a system on chip (SoC), a central processing unit (Central Processing Unit, CPU), a digital signal processor (digital signal processor, DSP), a microcontroller (micro controller unit, MCU), a programmable logic device (programmable logic device, PLD) or another integrated chip.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims of this application and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments described herein may be implemented in other sequences than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, e.g., the division of units is merely a logical service division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each service unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software business units.
The integrated units, if implemented in the form of software business units and sold or used as stand-alone products, may be stored in a computer readable storage medium. With such understanding, all or part of the technical solutions of the present application may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Those of skill in the art will appreciate that in one or more of the examples described above, the services described herein may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the services may be stored in a computer-readable medium or transmitted as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
The foregoing embodiments have been provided for the purpose of illustrating the objects, technical solutions and advantageous effects of the present application in further detail, and it should be understood that the foregoing embodiments are merely exemplary embodiments of the present application.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the corresponding technical solutions from the scope of the technical solutions of the embodiments of the present application.

Claims (16)

1. An image sensor, provided in an imaging device for microscopic samples, comprising:
the imaging device comprises a photosensitive pixel array and a surface functional layer, wherein the photosensitive pixel array is coated on the inner wall surface of a channel of the imaging device, and the surface functional layer is arranged on the photosensitive pixel array, wherein the photosensitive pixel array comprises at least two photosensitive pixel surfaces, and two adjacent photosensitive pixel surfaces in the at least two photosensitive pixel surfaces are not coplanar;
the surface functional layer is used for obtaining a first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces and obtaining a second optical signal;
the photosensitive pixel array is used for obtaining a two-dimensional image according to the second optical signal, so that a processor connected with the image sensor obtains a three-dimensional image according to a preset algorithm and the two-dimensional image.
2. The image sensor of claim 1, further comprising:
the mask layer is arranged on the photosensitive pixel array and used for modulating and encoding the second optical signal obtained by the surface functional layer to obtain a third optical signal, wherein the surface functional layer is arranged on the mask layer;
The photosensitive pixel array is further configured to obtain the two-dimensional image according to the third optical signal, so that the processor obtains the three-dimensional image according to the preset algorithm and the two-dimensional image.
3. An image forming apparatus, comprising:
a channel for carrying a microscopic sample;
the image sensor is coated on the inner wall surface of the channel and comprises a photosensitive pixel array and a surface functional layer arranged on the photosensitive pixel array, wherein the photosensitive pixel array comprises at least two photosensitive pixel surfaces, and two adjacent photosensitive pixel surfaces in the at least two photosensitive pixel surfaces are not coplanar;
the surface functional layer is used for obtaining a first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces and obtaining a second optical signal;
the photosensitive pixel array is used for obtaining a two-dimensional image according to the second optical signal, so that a processor connected with the image sensor obtains a three-dimensional image according to a preset algorithm and the two-dimensional image.
4. The apparatus of claim 3, wherein
the image sensor further comprises a mask layer arranged on the photosensitive pixel array and used for modulating and encoding the second optical signal obtained by the surface functional layer to obtain a third optical signal, wherein the surface functional layer is arranged on the mask layer;
The photosensitive pixel array is further configured to obtain the two-dimensional image according to the third optical signal, so that the processor obtains the three-dimensional image according to the preset algorithm and the two-dimensional image.
5. The apparatus according to claim 3 or 4, further comprising:
an optical fiber for coupling monochromatic coherent light into the channel through the optical fiber;
if the microscopic sample in the channel is subjected to dyeing pretreatment, and the imaging device is used for fluorescence imaging, the surface functional layer is specifically used for obtaining the second optical signal according to a first optical signal of each of the at least two photosensitive pixel surfaces obtained at one time, wherein the first optical signal comprises an optical signal of the monochromatic coherent light and a fluorescent signal, the second optical signal comprises the fluorescent signal, the fluorescent signal is excited by the monochromatic coherent light on the dyed microscopic sample, the surface functional layer comprises a coating layer and a filter layer, the coating layer is used for realizing surface protection of the photosensitive pixel array and refractive index matching of the first optical signal, and the filter layer is used for filtering the optical signal of the monochromatic coherent light;
And if the photosensitive pixel array receives the second optical signal, obtaining the two-dimensional image according to the second optical signal.
6. The apparatus as recited in claim 4, further comprising:
an optical fiber for coupling monochromatic coherent light into the channel through the optical fiber;
if the microscopic sample is subjected to dyeing pretreatment and the imaging device is used for fluorescence imaging, the microscopic sample moves along with solution flowing in the channel, the surface functional layer is specifically used for obtaining the first optical signal of each of the at least two photosensitive pixel surfaces at least once in the movement of the microscopic sample, and correspondingly obtaining a frame of the second optical signal according to the first optical signal obtained each time, wherein the first optical signal comprises an optical signal of the monochromatic coherent light and a fluorescence signal, the second optical signal comprises the fluorescence signal, the fluorescence signal is excited by the monochromatic coherent light on the microscopic sample subjected to dyeing pretreatment, the surface functional layer comprises a coating layer and a filter layer, and the coating layer is used for realizing surface protection of the photosensitive pixel array and refractive index matching of the first optical signal, and the filter layer is used for filtering the optical signal of the monochromatic coherent light;
The mask layer modulates and codes the second optical signal of each frame to correspondingly obtain a frame of the third optical signal;
and if the photosensitive pixel array receives at least one frame of the third optical signal, obtaining the two-dimensional image according to the at least one frame of the third optical signal.
7. The apparatus according to any one of claims 3 to 4, 6, further comprising:
an optical fiber for coupling coherent light into the channel through the optical fiber, wherein the coherent light comprises monochromatic coherent light or polychromatic coherent light;
the processor is further configured to obtain light field distribution information in the channel;
the surface functional layer is specifically configured to obtain the second optical signal according to a first optical signal of each of the at least two photosensitive pixel surfaces obtained at a time, where the first optical signal includes a scattered optical signal of the coherent light, and the surface functional layer includes a coating layer, where the coating layer is configured to implement surface protection of the photosensitive pixel array and refractive index matching of the first optical signal;
if the photosensitive pixel array receives the second optical signal, the two-dimensional image is obtained according to the second optical signal;
The processor is further configured to obtain the three-dimensional image according to the light field distribution information, the preset algorithm, and the two-dimensional image.
8. The apparatus as recited in claim 4, further comprising:
an optical fiber for coupling coherent light into the channel through the optical fiber, wherein the coherent light comprises monochromatic coherent light or polychromatic coherent light;
the processor is further configured to obtain light field distribution information in the channel;
if the micro sample moves along with the solution flowing in the channel, the surface functional layer is specifically configured to obtain the first optical signal of each of the at least two photosensitive pixel surfaces at least once during the movement of the micro sample, and correspondingly obtain a frame of the second optical signal according to the first optical signal obtained each time, where the first optical signal includes a scattered optical signal of the coherent light, and the surface functional layer includes a coating layer, where the coating layer is configured to implement surface protection of the photosensitive pixel array and refractive index matching of the first optical signal;
the mask layer carries out modulation coding on the second optical signal of each frame to correspondingly obtain a frame of third optical signal;
If the photosensitive pixel array receives at least one frame of the third optical signal, the two-dimensional image is obtained according to the at least one frame of the third optical signal;
the processor is further configured to obtain the three-dimensional image according to the light field distribution information, the preset algorithm, and the two-dimensional image.
9. The apparatus as recited in claim 4, further comprising:
the illumination array is used for providing a light source for the micro-culture dish array, wherein the micro-culture dish array is arranged in the channel, the micro-sample is placed in the micro-culture dish array, and the image sensor is coated and arranged on the inner wall surface of the micro-culture dish array;
the processor is also used for obtaining light field distribution information in the micro-culture dish array;
the surface functional layer is specifically configured to obtain the second optical signal according to the first optical signal of each of the at least two photosensitive pixel surfaces obtained at a time, where the first optical signal includes an optical signal provided by the light source, and the surface functional layer includes a coating layer, where the coating layer is configured to implement surface protection of the photosensitive pixel array and refractive index matching of the first optical signal;
The mask layer modulates and codes the second optical signal to obtain the third optical signal;
the photosensitive pixel array obtains the two-dimensional image according to the received third optical signal;
the processor is further configured to obtain the three-dimensional image according to the light field distribution information, the preset algorithm, and the two-dimensional image.
10. An imaging method, suitable for use in an imaging device, comprising:
obtaining a first optical signal of a microscopic sample corresponding to each of at least two photosensitive pixel surfaces to obtain a second optical signal, wherein the microscopic sample is carried in a channel of the imaging device, the imaging device further comprises an image sensor which is coated on the inner wall surface of the channel, the image sensor comprises a photosensitive pixel array and a surface functional layer which is arranged on the photosensitive pixel array, the photosensitive pixel array comprises at least two photosensitive pixel surfaces, two adjacent photosensitive pixel surfaces in the at least two photosensitive pixel surfaces are not coplanar, and the surface functional layer is used for obtaining the first optical signal and the second optical signal;
and obtaining a two-dimensional image according to the second optical signal, and obtaining a three-dimensional image according to a preset algorithm and the two-dimensional image.
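For illustration of the data flow in claim 10 only: the sketch below assumes two non-coplanar pixel faces whose readouts are tiled into one two-dimensional image, and stands in for the unspecified "preset algorithm" with a naive back-projection that spreads each face's sub-image over depth. The function names (capture_second_signal, second_signal_to_2d, preset_algorithm_3d) are hypothetical and not taken from the patent.

```python
import numpy as np

def capture_second_signal(first_signals):
    """Surface functional layer (sketch): pass each face's first optical signal
    through, yielding the second optical signal keyed by face index."""
    return {face: np.asarray(sig, dtype=float) for face, sig in first_signals.items()}

def second_signal_to_2d(second_signal):
    """Photosensitive pixel array (sketch): tile the per-face readouts side by
    side into a single two-dimensional image."""
    return np.hstack([second_signal[f] for f in sorted(second_signal)])

def preset_algorithm_3d(image_2d, n_faces, depth=8):
    """Stand-in for the unspecified 'preset algorithm': back-project each face's
    sub-image uniformly along depth and average into a coarse 3D estimate."""
    h, w = image_2d.shape
    face_w = w // n_faces
    volume = np.zeros((depth, h, face_w))
    for f in range(n_faces):
        sub = image_2d[:, f * face_w:(f + 1) * face_w]
        volume += np.repeat(sub[None, :, :], depth, axis=0)
    return volume / n_faces

# Toy usage with two non-coplanar 16x16 pixel faces.
rng = np.random.default_rng(1)
first = {0: rng.random((16, 16)), 1: rng.random((16, 16))}
image_2d = second_signal_to_2d(capture_second_signal(first))
volume_3d = preset_algorithm_3d(image_2d, n_faces=2)
```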
11. The method of claim 10, further comprising, after the obtaining of the second optical signal:
modulating and encoding the second optical signal to obtain a third optical signal, wherein the image sensor further comprises a mask layer arranged on the photosensitive pixel array, the surface functional layer is arranged on the mask layer, and the mask layer is used for modulating and encoding;
and obtaining the two-dimensional image according to the third optical signal, and obtaining the three-dimensional image according to the preset algorithm and the two-dimensional image.
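For illustration of the mask-layer step in claim 11 only: the patent does not disclose the coding pattern, so the sketch below models modulation encoding as multiplication by a known per-pixel transmission mask and recovers the two-dimensional image by dividing the encoded signal by that mask. The identifiers (mask_encode, decode_to_2d) are hypothetical.

```python
import numpy as np

def mask_encode(second_signal, mask):
    """Mask layer (sketch): modulate and encode the second optical signal with a
    known per-pixel transmission pattern, producing the third optical signal."""
    return second_signal * mask

def decode_to_2d(third_signal, mask, eps=1e-6):
    """Recover the two-dimensional image from the encoded third optical signal,
    assuming the mask pattern is known on the readout side."""
    return third_signal / np.maximum(mask, eps)

# Toy usage: a 32x32 second signal and a binary-like transmission mask.
rng = np.random.default_rng(2)
second = rng.random((32, 32))
mask = rng.integers(0, 2, size=(32, 32)) * 0.9 + 0.1   # transmissions of 0.1 or 1.0
third = mask_encode(second, mask)
image_2d = decode_to_2d(third, mask)
```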
12. The method of claim 10 or 11, further comprising, before the obtaining of the first optical signal of each of the at least two photosensitive pixel surfaces:
performing staining pretreatment on the microscopic sample;
the obtaining the first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces, and the obtaining the second optical signal includes:
obtaining the second optical signal according to a first optical signal of each of the at least two photosensitive pixel surfaces obtained at one time, wherein the first optical signal comprises an optical signal of monochromatic coherent light and a fluorescent signal, the second optical signal comprises the fluorescent signal, the monochromatic coherent light is coupled into the channel through an optical fiber of the imaging device, the fluorescent signal is excited by the monochromatic coherent light from the microscopic sample subjected to the staining pretreatment, the surface functional layer comprises a coating layer and a filter layer, the coating layer is used for realizing surface protection of the photosensitive pixel array and refractive index matching of the first optical signal, and the filter layer is used for filtering out the optical signal of the monochromatic coherent light;
The obtaining a two-dimensional image according to the second optical signal includes:
and if the second optical signal is received, obtaining the two-dimensional image according to the second optical signal.
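For illustration of the fluorescence path in claim 12 only: the sketch below models the first optical signal as two spectral channels, a strong coherent excitation channel and a weak fluorescence channel from the stained sample, and models the filter layer as near-total attenuation of the excitation channel. The intensity scales, attenuation factor, and the name filter_layer are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# First optical signal (sketch): an excitation channel from the monochromatic
# coherent light and an emission channel from the stained sample's fluorescence.
# The 5.0 / 0.2 scales and the transmission value below are illustrative.
excitation = rng.random((16, 16)) * 5.0     # strong coherent excitation background
fluorescence = rng.random((16, 16)) * 0.2   # weak fluorescent signal of interest

def filter_layer(excitation_ch, emission_ch, excitation_transmission=0.001):
    """Filter layer (sketch): block almost all of the excitation wavelength and
    pass the fluorescence, yielding the second optical signal."""
    return excitation_ch * excitation_transmission + emission_ch

second_signal = filter_layer(excitation, fluorescence)
image_2d = second_signal    # the pixel array reads out the fluorescence image
```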
13. The method of claim 11, further comprising, before the obtaining of the first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces:
performing staining pretreatment on the microscopic sample;
the obtaining the first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces, and the obtaining the second optical signal includes:
obtaining the first optical signal of each of the at least two photosensitive pixel surfaces at least once during the movement of the microscopic sample, and correspondingly obtaining a frame of the second optical signal according to the first optical signal obtained each time, wherein the microscopic sample moves along with the solution flowing in the channel, the first optical signal comprises an optical signal of monochromatic coherent light and a fluorescent signal, the second optical signal comprises the fluorescent signal, the monochromatic coherent light is coupled into the channel through an optical fiber of the imaging device, the fluorescent signal is excited by the monochromatic coherent light from the microscopic sample subjected to the staining pretreatment, the surface functional layer comprises a coating layer and a filter layer, the coating layer is used for realizing surface protection of the photosensitive pixel array and refractive index matching of the first optical signal, and the filter layer is used for filtering out the optical signal of the monochromatic coherent light;
The modulating and encoding the second optical signal to obtain a third optical signal includes:
modulating and encoding each frame of the second optical signal to correspondingly obtain a frame of the third optical signal;
the obtaining the two-dimensional image according to the third optical signal includes:
and if at least one frame of the third optical signal is received, obtaining the two-dimensional image according to the at least one frame of the third optical signal.
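For illustration of the per-frame acquisition in claim 13 only: the sketch below simulates a sample drifting with the flow between exposures, mask-encodes each second-signal frame into a third-signal frame, and forms the two-dimensional image from the decoded frames. The translation model, mask pattern, and the name capture_frames are illustrative assumptions, not the claimed mechanism.

```python
import numpy as np

rng = np.random.default_rng(4)
mask = rng.random((16, 16))                 # known mask-layer transmission pattern

def capture_frames(n_frames=5, size=16):
    """Per-frame acquisition while the sample flows (sketch): a bright blob drifts
    across the field between exposures; each second-signal frame is mask-encoded
    into one third-signal frame."""
    y, x = np.mgrid[0:size, 0:size]
    frames = []
    for t in range(n_frames):
        cx = 2 + 3 * t                      # sample position advances with the flow
        second = np.exp(-((x - cx) ** 2 + (y - size // 2) ** 2) / 4.0)
        frames.append(second * mask)        # mask layer: one third-signal frame per exposure
    return frames

third_frames = capture_frames()
image_2d = np.mean([f / np.maximum(mask, 1e-6) for f in third_frames], axis=0)
```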
14. The method of claim 10 or 11, further comprising:
acquiring light field distribution information in the channel;
the obtaining the first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces, and the obtaining the second optical signal includes:
obtaining the second optical signal according to a first optical signal of each of the at least two photosensitive pixel surfaces obtained at one time, wherein the first optical signal comprises a scattered optical signal of coherent light, the coherent light is coupled into the channel through an optical fiber of the imaging device to form the scattered optical signal of the coherent light, the coherent light comprises monochromatic coherent light or multi-color coherent light, the surface functional layer comprises a coating layer, and the coating layer is used for realizing surface protection of the photosensitive pixel array and refractive index matching of the first optical signal;
The obtaining a two-dimensional image according to the second optical signal, and obtaining a three-dimensional image according to a preset algorithm and the two-dimensional image includes:
if the second optical signal is received, the two-dimensional image is obtained according to the second optical signal;
and obtaining the three-dimensional image according to the light field distribution information, the preset algorithm and the two-dimensional image.
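For illustration of claim 14 only: neither the representation of the "light field distribution information in the channel" nor the "preset algorithm" is specified, so the sketch below assumes the former is a per-depth illumination profile and the latter simply distributes the measured scattering image over depth in proportion to that profile. The decay model and the names light_field_info and preset_algorithm_3d are hypothetical.

```python
import numpy as np

def light_field_info(depth_samples=8):
    """Hypothetical representation of the light field distribution information:
    relative illumination intensity at each depth above the sensor, here a simple
    exponential decay of the coherent light across the channel."""
    z = np.linspace(0.0, 1.0, depth_samples)
    return np.exp(-2.0 * z)

def preset_algorithm_3d(image_2d, depth_profile):
    """Stand-in for the unspecified 'preset algorithm': spread the measured 2D
    scattering image over depth in proportion to the illumination profile."""
    weights = depth_profile / depth_profile.sum()
    return weights[:, None, None] * image_2d[None, :, :]

# Toy usage: a 16x16 two-dimensional image from the scattered-light second signal.
rng = np.random.default_rng(5)
image_2d = rng.random((16, 16))
volume = preset_algorithm_3d(image_2d, light_field_info())
```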
15. The method of claim 11, further comprising:
acquiring light field distribution information in the channel;
the obtaining the first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces, and the obtaining the second optical signal includes:
obtaining the first optical signal of each of the at least two photosensitive pixel surfaces at least once during the movement of the microscopic sample, and correspondingly obtaining a frame of the second optical signal according to the first optical signal obtained each time, wherein the microscopic sample moves along with the solution flowing in the channel, the first optical signal comprises a scattered optical signal of coherent light, the coherent light is coupled into the channel through an optical fiber of the imaging device to form the scattered optical signal of the coherent light, the coherent light comprises monochromatic coherent light or multi-color coherent light, the surface functional layer comprises a coating layer, and the coating layer is used for realizing surface protection of the photosensitive pixel array and refractive index matching of the first optical signal;
The modulating and encoding the second optical signal to obtain a third optical signal includes:
modulating and encoding each frame of the second optical signal to correspondingly obtain a frame of the third optical signal;
the obtaining the two-dimensional image according to the third optical signal, and obtaining the three-dimensional image according to the preset algorithm and the two-dimensional image includes:
if at least one frame of the third optical signal is received, the two-dimensional image is obtained according to the at least one frame of the third optical signal;
and obtaining the three-dimensional image according to the light field distribution information, the preset algorithm and the two-dimensional image.
16. The method of claim 11, further comprising:
acquiring light field distribution information in a micro-culture dish array, wherein the micro-culture dish array is arranged in the channel, the microscopic sample is placed in the micro-culture dish array, the image sensor is coated on the inner wall surface of the micro-culture dish array, and an illumination array provides a light source for the micro-culture dish array;
the obtaining the first optical signal of the microscopic sample corresponding to each of the at least two photosensitive pixel surfaces, and the obtaining the second optical signal includes:
obtaining the second optical signal according to a first optical signal of each of the at least two photosensitive pixel surfaces obtained at one time, wherein the first optical signal comprises a scattered optical signal formed by the light source, and the surface functional layer comprises a coating layer which is used for realizing surface protection of the photosensitive pixel array and refractive index matching of the first optical signal;
The modulating and encoding the second optical signal to obtain a third optical signal includes:
modulating and encoding each frame of the second optical signal to correspondingly obtain the third optical signal;
the obtaining the two-dimensional image according to the third optical signal, and obtaining the three-dimensional image according to the preset algorithm and the two-dimensional image includes:
obtaining the two-dimensional image according to the received third optical signal;
and obtaining the three-dimensional image according to the light field distribution information, the preset algorithm and the two-dimensional image.
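For illustration of claim 16 only: the sketch below assumes each micro-culture dish has its own coated sensor region, that the illumination array lights a dish from a few angles, and that oblique illumination can be crudely modeled as a lateral shift of the sample's shadow before mask encoding. The well count, angles, and the name acquire_well are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
mask = rng.random((16, 16))                  # known mask-layer pattern over each well

def acquire_well(well_sample, led_angles):
    """Acquisition for one micro-culture dish (sketch): the illumination array
    lights the well from several angles; oblique light is crudely modeled as a
    lateral shift of the sample's shadow, and each frame is mask-encoded."""
    frames = []
    for angle in led_angles:
        shift = int(np.rint(3 * np.sin(angle)))
        second = np.roll(well_sample, shift, axis=1)
        frames.append(second * mask)         # mask layer encodes the second signal
    return frames

# Toy usage: a 2x2 micro-culture dish array and three illumination angles.
wells = [rng.random((16, 16)) for _ in range(4)]
led_angles = np.deg2rad([-20.0, 0.0, 20.0])
encoded = {i: acquire_well(w, led_angles) for i, w in enumerate(wells)}
images_2d = {i: np.mean([f / np.maximum(mask, 1e-6) for f in frames], axis=0)
             for i, frames in encoded.items()}
```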
CN202211198000.8A 2022-09-29 2022-09-29 Image sensor, imaging device and method Active CN116679461B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211198000.8A 2022-09-29 2022-09-29 Image sensor, imaging device and method

Publications (2)

Publication Number Publication Date
CN116679461A (en) 2023-09-01
CN116679461B (en) 2024-01-05

Family

ID=87777598

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant