CN116309237A - Deep high-resolution phase information extraction method - Google Patents

Deep high-resolution phase information extraction method

Info

Publication number
CN116309237A
Authority
CN
China
Prior art keywords
phase
sample
ray
images
phase information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111575717.5A
Other languages
Chinese (zh)
Inventor
杨杰成
苏婷
葛永帅
梁栋
刘新
郑海荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN202111575717.5A priority Critical patent/CN116309237A/en
Priority to PCT/CN2022/139815 priority patent/WO2023116584A1/en
Publication of CN116309237A publication Critical patent/CN116309237A/en
Pending legal-status Critical Current


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 - Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/0464 - Convolutional networks [CNN, ConvNet]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/60 - Image enhancement or restoration using machine learning, e.g. neural networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/70 - Denoising; Smoothing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10116 - X-ray image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20081 - Training; Learning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20084 - Artificial neural networks [ANN]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20092 - Interactive image processing based on input by user
    • G06T2207/20104 - Interactive definition of region of interest [ROI]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computational Linguistics (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Quality & Reliability (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Analysing Materials By The Use Of Radiation (AREA)

Abstract

The application discloses a deep high-resolution phase information extraction method, together with a corresponding apparatus, device and storage medium. The method comprises the following steps: deconstructing and separating the mutually overlapping positive and negative phase images of a sample through a deep convolutional neural network model based on an attention mechanism; establishing differential phase images based on the deconstructed and separated positive and negative phase images of the sample; and generating network training data through an X-ray phase difference microscopic imaging physical model, applying the trained model directly to experimental data processing, and extracting the absolute phase and differential phase information of the object. With this scheme, the absolute phase and differential phase information of a sample can be extracted quickly and accurately from X-ray phase microscopic imaging data in which the positive and negative phase images overlap; at the same time, the invention generates the network training data by numerical simulation, so no training data need to be collected on an experimental system, which greatly reduces the experimental difficulty and saves experimental time.

Description

Deep high-resolution phase information extraction method
Technical Field
The invention relates to the technical field of image processing, and in particular to a deep high-resolution phase information extraction method and a corresponding device, equipment and storage medium.
Background
Over the past decades, the development of X-ray phase imaging technology has widened the applicable scenarios of X-ray imaging. As an important complement to conventional absorption imaging, phase imaging makes it possible to observe the internal structure of weakly absorbing objects (e.g. biological samples) with higher sensitivity, because for light elements the X-ray phase shift has a larger interaction cross section than X-ray attenuation.
Among the various current X-ray phase-contrast imaging methods, grating-based X-ray Talbot(-Lau) interferometers have received increasing attention from researchers because of their greater tolerance of the light source conditions. Such a device can simultaneously acquire the absorption, small-angle scattering and differential phase information of a sample, and can be combined with computed tomography (CT) to obtain three-dimensional sample information based on the different contrasts. It can also be combined with an X-ray microscope having nanoscale spatial resolution to achieve multi-contrast, high-resolution imaging of the internal morphology of tiny objects (such as cells). In this research direction, several groups worldwide have explored experimental schemes for effectively integrating a grating interferometer with an X-ray transmission microscope. One notable development was the phase-contrast imaging achieved in 2017 by Takano et al. of Tohoku University, Japan, by combining a grating interferometer with a commercial X-ray nano-microscope system from ZEISS. Unlike a classical Talbot(-Lau) interferometer, this imaging system no longer obtains differential phase images; instead it obtains positive and negative phase images that overlap each other. Such overlap artifacts make it difficult to extract the structural information of the sample, which hinders the further use and popularization of this imaging method.
In order to extract phase information from overlapping phase images, the current mainstream solution is to use image post-processing algorithms. For example, Yashiro et al. proposed a phase-shift summation algorithm and a deconvolution algorithm. The phase-shift summation algorithm sums the differential signals of different interference orders according to the imaging physics of the grating interferometer and the X-ray transmission microscope, and thus extracts the phase image; the deconvolution algorithm divides out the interference-order-dependent term in the frequency domain via the Fourier transform and obtains the phase distribution by an inverse Fourier transform. Furthermore, Takano et al. proposed an iterative deconvolution method, which repeatedly corrects the sample image in the spatial and frequency domains using deconvolution to achieve phase information extraction; Wolf et al. proposed a maximum-likelihood reconstruction scheme based on total variation (TV) regularization and used an iterative optimization algorithm to achieve phase image reconstruction at low photon counts.
At present, a major problem with using the phase-shift summation algorithm and the deconvolution algorithm to extract the phase information of an object is that these algorithms are susceptible to noise. When the photon count is low, the noise level of the differential phase images acquired by the system increases, which leads to amplified noise in the phase signal extracted by the algorithms. In addition, the deconvolution algorithm is prone to periodic artifacts. The iterative deconvolution method can suppress part of the noise and reduce the periodic artifacts, but it can only be used with π/2 gratings, so its application scenarios are limited. The maximum-likelihood reconstruction scheme based on TV regularization can effectively suppress noise and artifacts, but suffers from a large computational burden and a low reconstruction speed.
Disclosure of Invention
In view of the foregoing drawbacks or shortcomings in the prior art, it is desirable to provide a deep high resolution phase information extraction method, apparatus, device, and storage medium thereof.
In a first aspect, an embodiment of the present application provides a deep high-resolution phase information extraction method, where the method includes:
deconstructing and separating the mutually overlapped positive and negative phase images of the sample through a deep convolutional neural network model based on an attention mechanism;
establishing differential phase images based on the deconstructed and separated positive and negative phase images of the sample;
generating network training data through an X-ray phase difference microscopic imaging physical model, applying the trained model directly to experimental data processing, and extracting the absolute phase and differential phase information of the object.
In one embodiment, the attention mechanism network model comprises: deep convolutional neural networks based on spatial attention and channel attention mechanisms.
In one embodiment, the sample positive and negative phase images are separated to a degree of
λd/(Mp)
Where λ is the X-ray wavelength, d is the distance from the phase grating to the detector, p is the equivalent period of the phase grating, and M is the multiple representing the amplified imaging of the sample through the zone plate.
In one embodiment, the generating of network training data through the X-ray phase difference microscopic imaging physical model includes: downloading a preset number of images from the natural-image open-source data set ImageNet; for a single picture, taking the data of one color channel, resizing the picture, and recording the data as x0; generating a binary mask image M with randomly varying position and shape using Matlab code, performing a point-wise multiplication with x0, and extracting the region of interest x_ROI = M·x0; for x_ROI, adjusting the value of each pixel and constraining the value range to a preset interval according to the real sample material, so as to simulate the absolute phase shift of the object; based on the physics of X-ray phase microscopic imaging, adding a noise model and generating overlapping phase images Φ and differential phase images Φ′ with different separation scales; and repeating the above steps until all pictures have been processed.
In a second aspect, an embodiment of the present application further provides a depth high resolution phase information extraction apparatus, where the apparatus includes:
the deconstructing and separating unit is used for deconstructing and separating the mutually overlapped positive and negative phase images of the sample through a deep convolution neural network model based on an attention mechanism;
the establishing unit is used for establishing differential phase images based on the deconstructed and separated positive and negative phase images of the sample;
the extraction unit is used for generating network training data through the X-ray phase difference microscopic imaging physical model, directly using the trained model for experimental data processing, and extracting absolute phase and differential phase information of an object.
In one embodiment, the attention mechanism network model comprises: deep convolutional neural networks based on spatial attention and channel attention mechanisms.
In one embodiment, the sample positive and negative phase images are separated to a degree of
λd/(Mp)
Where λ is the X-ray wavelength, d is the distance from the phase grating to the detector, p is the equivalent period of the phase grating, and M is the multiple representing the amplified imaging of the sample through the zone plate.
In one embodiment, the generating of network training data through the X-ray phase difference microscopic imaging physical model includes: downloading a preset number of images from the natural-image open-source data set ImageNet; for a single picture, taking the data of one color channel, resizing the picture, and recording the data as x0; generating a binary mask image M with randomly varying position and shape using Matlab code, performing a point-wise multiplication with x0, and extracting the region of interest x_ROI = M·x0; for x_ROI, adjusting the value of each pixel and constraining the value range to a preset interval according to the real sample material, so as to simulate the absolute phase shift of the object; based on the physics of X-ray phase microscopic imaging, adding a noise model and generating overlapping phase images Φ and differential phase images Φ′ with different separation scales; and repeating the above steps until all pictures have been processed.
In a third aspect, embodiments of the present application further provide a computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing a method as described in any of the embodiments of the present application when the program is executed.
In a fourth aspect, embodiments of the present application further provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method as described in any of the embodiments of the present application.
The invention has the beneficial effects that:
the depth high-resolution phase information extraction method provided by the invention can rapidly and accurately extract absolute phase and differential phase information of a sample from positive and negative overlapped X-ray phase microscopic imaging data; meanwhile, the invention generates the network training data in a numerical simulation mode, does not need to collect the training data on an experimental system, greatly reduces the experimental difficulty and saves the experimental time.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the following drawings, in which:
fig. 1 is a schematic flow chart of a deep high-resolution phase information extraction method according to an embodiment of the present application;
fig. 2 shows an exemplary block diagram of a deep high-resolution phase information extraction apparatus 200 according to one embodiment of the present application;
FIG. 3 shows a schematic diagram of a computer system suitable for use in implementing the terminal device of the embodiments of the present application;
FIG. 4 shows a flow chart of a method for extracting absolute phase and differential phase information of deep learning-based X-ray phase microscopy imaging according to an embodiment of the present application;
fig. 5 shows a schematic diagram of a U-Net network architecture based on spatial attention and channel attention mechanisms provided by an embodiment of the present application.
Detailed Description
In order that the above objects, features and advantages of the invention will be readily understood, a more particular description of the invention will be rendered by reference to the appended drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. The present invention may be embodied in many other forms than described herein and similarly modified by those skilled in the art without departing from the spirit of the invention, whereby the invention is not limited to the specific embodiments disclosed below.
In the description of the present invention, it should be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", "axial", "radial", "circumferential", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings are merely for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the device or element being referred to must have a specific orientation, be configured and operated in a specific orientation, and therefore should not be construed as limiting the present invention.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present invention, the meaning of "plurality" means at least two, for example, two, three, etc., unless specifically defined otherwise.
In the present invention, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured," and the like are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; either directly or indirectly, through intermediaries, or both, may be in communication with each other or in interaction with each other, unless expressly defined otherwise. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
In the present invention, unless expressly stated or limited otherwise, a first feature "up" or "down" a second feature may be the first and second features in direct contact, or the first and second features in indirect contact via an intervening medium. Moreover, a first feature being "above," "over" and "on" a second feature may be a first feature being directly above or obliquely above the second feature, or simply indicating that the first feature is level higher than the second feature. The first feature being "under", "below" and "beneath" the second feature may be the first feature being directly under or obliquely below the second feature, or simply indicating that the first feature is less level than the second feature.
It will be understood that when an element is referred to as being "fixed" or "disposed" on another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. The terms "vertical," "horizontal," "upper," "lower," "left," "right," and the like are used herein for illustrative purposes only and are not meant to be the only embodiment.
Referring to fig. 1, fig. 1 is a flow chart illustrating a deep high-resolution phase information extraction method according to an embodiment of the present application.
As shown in fig. 1, the method includes:
step 110, deconstructing and separating the mutually overlapped positive and negative phase images of the sample through a deep convolutional neural network model based on an attention mechanism;
step 120, establishing differential phase images based on the deconstructed and separated positive and negative phase images of the sample;
and step 130, generating network training data through an X-ray phase difference microscopic imaging physical model, applying the trained model directly to experimental data processing, and extracting the absolute phase and differential phase information of the object.
By adopting this technical scheme, the absolute phase and differential phase information of the sample can be extracted quickly and accurately from X-ray phase microscopic imaging data in which the positive and negative phase images overlap; at the same time, the invention generates the network training data by numerical simulation, so no training data need to be collected on an experimental system, which greatly reduces the experimental difficulty and saves experimental time.
In some embodiments, the attention mechanism network model in the present application includes: deep convolutional neural networks based on spatial attention and channel attention mechanisms.
Specifically, the network model provided by the invention is a deep convolutional neural network based on spatial attention and channel attention mechanisms. Taking a U-Net as an example (see fig. 5), the network input is the overlapping positive and negative phase image, and the outputs are an absolute phase image (positive or negative in this example) and a differential phase image. The network extracts deep image features through multiple layers of convolution and downsampling operations. From the extracted image features, the attention mechanism generates spatial and channel weight maps, which are multiplied with the features that are subsequently upsampled to the same dimensions. This operation lets the network focus on the true image phase information at particular spatial locations and strengthens the weights of particular channel features. Finally, the feature maps are symmetrically upsampled back to the original dimensions to extract the image phase information. The network is trained with the numerically simulated data set generated in the previous step, using the mean square error (MSE) between the network output and the ground truth as the loss function; the network parameters are initialized with Xavier initialization, and the network is trained with an Adam optimizer to obtain the updated network parameter model. The trained network model can be used directly for experimental data processing: X-ray phase microscopic imaging data are first acquired on the laboratory equipment and then fed into the trained network model to obtain the separated sample phase information.
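As an illustration of the channel and spatial attention operations described above, the following Python (PyTorch) sketch shows one possible attention gate of the kind that could re-weight the encoder features before they are merged with the upsampled decoder features. It is a minimal sketch under assumptions: the module name, the reduction ratio, the 7 × 7 spatial convolution and the use of channel-wise average and maximum maps are illustrative choices and are not specified by the present description.

import torch
import torch.nn as nn

class ChannelSpatialAttention(nn.Module):
    # Hypothetical attention gate: re-weights a feature map first per channel,
    # then per spatial position, before it is merged with decoder features.
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(2, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x):
        x = x * self.channel_gate(x)              # channel re-weighting
        avg_map = x.mean(dim=1, keepdim=True)     # per-pixel mean over channels
        max_map = x.amax(dim=1, keepdim=True)     # per-pixel max over channels
        w = self.spatial_gate(torch.cat([avg_map, max_map], dim=1))
        return x * w                              # spatial re-weighting

# Shape check: a 64-channel encoder feature map keeps its dimensions.
feats = torch.randn(1, 64, 128, 128)
print(ChannelSpatialAttention(64)(feats).shape)   # torch.Size([1, 64, 128, 128])

In a complete model, one such gate would typically act on each skip connection, after which the network would be trained with the MSE loss, Xavier initialization and Adam optimizer mentioned above.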
In some embodiments, the sample positive and negative phase images in the present application are separated by:
λd/(Mp)
where λ is the X-ray wavelength, d is the distance from the phase grating to the detector, p is the equivalent period of the phase grating, and M is the multiple representing the amplified imaging of the sample through the zone plate.
Specifically, the experimental device to which the invention refers is an X-ray phase microscopic imaging system based on a grating interferometer. Its main components are an X-ray source, an ellipsoidal condenser, a source grating, a zone plate, a phase grating and a detector. The imaging advantage of the system lies mainly in the phase imaging of small, weakly absorbing samples with microstructure, for which it can provide sample morphology information with higher contrast than absorption imaging. With this imaging system, microscopic imaging and phase imaging of the sample can be achieved simultaneously. Its working principle is as follows: the X-ray beam emitted by the source is converged by the ellipsoidal condenser and, after passing through the source grating, forms structured illumination on the sample. The beam carrying the sample information is then focused by the zone plate, forming an image of the source grating near its back focal plane and a magnified sample image at the more distant detection plane. A phase grating is introduced at a suitable position between the two, so that, together with the image of the source grating, a structure similar to a Talbot(-Lau) interferometer is formed. By stepping the phase grating in the direction perpendicular to the optical axis, the detector can record raw imaging data containing the phase information of the sample.
Taking into account a pure-phase sample with refractive index n = 1 − δ, the imaging process is physically modeled based on the principles of wave optics, and the phase information of the sample extracted by the phase-stepping operation can be deduced theoretically. It corresponds to the overall phase shift φ(x) = k∫δ(x, z)dz accumulated along the optical-axis direction z, and takes the specific form
Φ(x) = φ(x + λd/(2Mp)) − φ(x − λd/(2Mp))    (1)
i.e. the superposition of two mutually separated positive and negative phase images, where λ is the X-ray wavelength, k is the wave number, p is the equivalent period of the phase grating, M is the magnification at which the sample is imaged through the zone plate, and d is the distance from the phase grating to the detector. The degree of separation Δx between the two images can be expressed as
Δx = λd/(Mp)    (2)
The separation distance of the positive and negative phase images of the sample in the actual phase imaging system can be rapidly judged by using the formula (2).
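As a quick numerical illustration of formula (2), the following Python snippet estimates the separation distance for one set of system parameters; all parameter values are illustrative assumptions rather than values taken from the present description.

# Quick numerical check of formula (2): separation = lambda * d / (M * p).
# All parameter values below are illustrative assumptions.
wavelength = 1.24e-10   # lambda, X-ray wavelength in m (~10 keV)
d = 0.5                 # phase-grating-to-detector distance in m
p = 1.0e-6              # equivalent period of the phase grating in m
M = 50.0                # zone-plate magnification
separation = wavelength * d / (M * p)
print(f"positive/negative phase-image separation: {separation * 1e9:.0f} nm")  # -> 1240 nm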
In some embodiments, generating network training data through the X-ray phase difference microscopic imaging physical model in the present application includes: downloading a preset number of images from the natural-image open-source data set ImageNet; for a single picture, taking the data of one color channel, resizing the picture, and recording the data as x0; generating a binary mask image M with randomly varying position and shape using Matlab code, performing a point-wise multiplication with x0, and extracting the region of interest x_ROI = M·x0; for x_ROI, adjusting the value of each pixel and constraining the value range to a preset interval according to the real sample material, so as to simulate the absolute phase shift of the object; based on the physics of X-ray phase microscopic imaging, adding a noise model and generating overlapping phase images Φ and differential phase images Φ′ with different separation scales; and repeating the above steps until all pictures have been processed.
Specifically, the present invention uses deep learning to extract the absolute phase and differential phase information of an object from positive and negative overlapping phase images. Fig. 4 is a block diagram of the method, which comprises a phase information extraction flow implemented with a deep convolutional neural network based on an attention mechanism, and a method for generating X-ray phase microscopic imaging training data.
The specific steps of the training data generation are as follows: 1) Download the large-scale natural-image open-source data set ImageNet. 2) For a single picture, take the data of one color channel, reset the picture size (e.g. to 1024 × 1024), and record the data as x0. 3) Generate a binary mask image M with randomly varying position and shape using Matlab code, perform a point-wise multiplication with x0, and extract the region of interest x_ROI = M·x0. 4) For x_ROI, adjust the value of each pixel and constrain the value range to an interval chosen according to the real sample material (such as organic glass or graphite), so as to simulate the absolute phase shift of the object. 5) Based on the physics of X-ray phase microscopic imaging, add a noise model and generate overlapping phase images Φ and differential phase images Φ′ with different separation scales. Repeat steps 2)-5) until all pictures have been processed.
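For illustration, the following Python (numpy) sketch reproduces steps 2)-5) for a single picture. It is a simplified sketch under assumptions: the circular mask, the 0-3 rad phase interval, the pixel shear and the Gaussian noise level are placeholder choices, the overlap is modeled as the difference of two oppositely shifted copies of the phase map in the spirit of formula (1), and the mask generation is written in Python here rather than the Matlab mentioned above.

import numpy as np
from PIL import Image

def make_training_pair(image_path, shear_px=16, noise_sigma=0.05):
    # Steps 2)-5) for one natural image; all numeric choices are illustrative
    # assumptions, not values taken from the present description.
    # 2) take one colour channel and reset the picture size
    img = Image.open(image_path).convert("RGB").resize((1024, 1024))
    x0 = np.asarray(img, dtype=np.float64)[..., 0]
    # 3) binary mask with random position and size, point-wise multiplication
    yy, xx = np.mgrid[0:1024, 0:1024]
    cy, cx = np.random.randint(256, 768, size=2)
    radius = np.random.randint(128, 384)
    mask = ((yy - cy) ** 2 + (xx - cx) ** 2) < radius ** 2
    x_roi = mask * x0
    # 4) constrain pixel values to a preset absolute-phase interval (here 0-3 rad)
    phi = 3.0 * x_roi / max(x_roi.max(), 1e-12)
    # 5) overlapping phase image: two oppositely signed, mutually shifted copies,
    #    plus a finite-difference differential phase image, both with added noise
    phi_pos = np.roll(phi, shear_px // 2, axis=1)
    phi_neg = np.roll(phi, -(shear_px // 2), axis=1)
    overlap = phi_pos - phi_neg + np.random.normal(0.0, noise_sigma, phi.shape)
    diff = np.gradient(phi, axis=1) + np.random.normal(0.0, noise_sigma, phi.shape)
    return phi, overlap, diff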
Further, referring to fig. 2, fig. 2 shows an exemplary block diagram of a depth high resolution phase information extraction apparatus 200 according to one embodiment of the present application.
As shown in fig. 2, the apparatus includes:
a deconstructing and separating unit 210 for deconstructing and separating the sample positive and negative phase images overlapped with each other by a deep convolutional neural network model based on an attention mechanism;
a building unit 220, configured to build differential phase images based on the deconstructed and separated positive and negative phase images of the sample;
the extracting unit 230 is configured to generate network training data through the X-ray phase difference microscopic imaging physical model, directly use the trained model for experimental data processing, and extract absolute phase and differential phase information of the object.
It should be understood that the elements or modules depicted in apparatus 200 correspond to the various steps in the method described with reference to fig. 1. Thus, the operations and features described above with respect to the method are equally applicable to the apparatus 200 and the units contained therein, and are not described in detail herein. The apparatus 200 may be implemented in advance in a browser or other security application of the electronic device, or may be loaded into the browser or security application of the electronic device by means of downloading or the like. The corresponding units in the apparatus 200 may cooperate with units in an electronic device to implement the solutions of the embodiments of the present application.
Referring now to FIG. 3, a schematic diagram of a computer system 300 suitable for use in implementing a terminal device or server of an embodiment of the present application is shown.
As shown in fig. 3, the computer system 300 includes a Central Processing Unit (CPU) 301 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 302 or a program loaded from a storage section 308 into a Random Access Memory (RAM) 303. In the RAM 303, various programs and data required for the operation of the system 300 are also stored. The CPU 301, ROM 302, and RAM 303 are connected to each other through a bus 304. An input/output (I/O) interface 305 is also connected to bus 304.
The following components are connected to the I/O interface 305: an input section 306 including a keyboard, a mouse, and the like; an output portion 307 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, a speaker, and the like; a storage section 308 including a hard disk or the like; and a communication section 309 including a network interface card such as a LAN card, a modem, or the like. The communication section 309 performs communication processing via a network such as the internet. The drive 310 is also connected to the I/O interface 305 as needed. A removable medium 311 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is installed on the drive 310 as needed, so that a computer program read therefrom is installed into the storage section 308 as needed.
In particular, according to embodiments of the present disclosure, the process described above with reference to fig. 1 may be implemented as a computer software program. For example, embodiments of the present disclosure include a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program comprising program code for performing the deep high-resolution phase information extraction method of fig. 1. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 309, and/or installed from the removable medium 311.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units or modules described in the embodiments of the present application may be implemented by software, or may be implemented by hardware. The described units or modules may also be provided in a processor, for example, as: a processor includes a first sub-region generation unit, a second sub-region generation unit, and a display region generation unit. The names of these units or modules do not constitute a limitation of the unit or module itself in some cases, and for example, the display area generating unit may also be described as "a unit for generating a display area of text from the first sub-area and the second sub-area".
As another aspect, the present application also provides a computer-readable storage medium, which may be a computer-readable storage medium contained in the apparatus of the foregoing embodiments, or may be a stand-alone computer-readable storage medium that is not assembled into a device. The computer-readable storage medium stores one or more programs that are used by one or more processors to perform the deep high-resolution phase information extraction method described in the present application.
The foregoing description is only of the preferred embodiments of the present application and is presented as an explanation of the technical principles employed. It will be appreciated by persons skilled in the art that the scope of the invention referred to in this application is not limited to the specific combinations of the features described above, but is also intended to cover other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the invention, for example solutions in which the above features are replaced with (but not limited to) technical features having similar functions disclosed in the present application.

Claims (10)

1. A deep high resolution phase information extraction method, the method comprising:
deconstructing and separating the mutually overlapped positive and negative phase images of the sample through a deep convolutional neural network model based on an attention mechanism;
establishing differential phase images based on the deconstructed and separated positive and negative phase images of the sample;
generating network training data through an X-ray phase difference microscopic imaging physical model, applying the trained model directly to experimental data processing, and extracting the absolute phase and differential phase information of the object.
2. The deep high resolution phase information extraction method of claim 1, wherein the attention mechanism network model comprises:
deep convolutional neural networks based on spatial attention and channel attention mechanisms.
3. The method for extracting deep high resolution phase information according to claim 1, wherein the degree of separation of the positive and negative phase images of the sample is:
λd/(Mp)
where λ is the X-ray wavelength, d is the distance from the phase grating to the detector, p is the equivalent period of the phase grating, and M is the multiple representing the amplified imaging of the sample through the zone plate.
4. The method of claim 1, wherein generating network training data by using an X-ray phase difference microscopic imaging physical model comprises:
downloading a preset number of images from the natural-image open-source data set ImageNet;
for a single picture, taking the data of one color channel, resizing the picture, and recording the data as x0;
generating a binary mask image M with randomly varying position and shape using Matlab code, performing a point-wise multiplication with x0, and extracting the region of interest x_ROI = M·x0;
for x_ROI, adjusting the value of each pixel, constraining the value range to a preset interval according to the real sample material, and simulating the absolute phase shift of the object;
based on the physics of X-ray phase microscopic imaging, adding a noise model, generating overlapping phase images Φ and differential phase images Φ′ with different separation scales, and repeating the above steps until all pictures have been processed.
5. A deep high resolution phase information extraction apparatus, comprising:
the deconstructing and separating unit is used for deconstructing and separating the mutually overlapped positive and negative phase images of the sample through a deep convolution neural network model based on an attention mechanism;
the establishing unit is used for establishing differential phase images based on the deconstructed and separated positive and negative phase images of the sample;
the extraction unit is used for generating network training data through the X-ray phase difference microscopic imaging physical model, directly using the trained model for experimental data processing, and extracting absolute phase and differential phase information of an object.
6. The deep high resolution phase information extraction apparatus of claim 5, wherein the attention mechanism network model comprises:
deep convolutional neural networks based on spatial attention and channel attention mechanisms.
7. The deep high resolution phase information extraction apparatus of claim 5, wherein the degree of separation of the positive and negative phase images of the sample is:
λd/(Mp)
where λ is the X-ray wavelength, d is the distance from the phase grating to the detector, p is the equivalent period of the phase grating, and M is the multiple representing the amplified imaging of the sample through the zone plate.
8. The deep high resolution phase information extraction apparatus of claim 5, wherein generating network training data by X-ray phase difference microscopy imaging physical model comprises:
downloading a preset number of images from the natural-image open-source data set ImageNet;
for a single picture, taking the data of one color channel, resizing the picture, and recording the data as x0;
generating a binary mask image M with randomly varying position and shape using Matlab code, performing a point-wise multiplication with x0, and extracting the region of interest x_ROI = M·x0;
for x_ROI, adjusting the value of each pixel, constraining the value range to a preset interval according to the real sample material, and simulating the absolute phase shift of the object;
based on the physics of X-ray phase microscopic imaging, adding a noise model, generating overlapping phase images Φ and differential phase images Φ′ with different separation scales, and repeating the above steps until all pictures have been processed.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any of claims 1-4 when the program is executed by the processor.
10. A computer readable storage medium having stored thereon a computer program, wherein:
the computer program, when executed by a processor, implements the method of any of claims 1-4.
CN202111575717.5A 2021-12-20 2021-12-20 Deep high-resolution phase information extraction method Pending CN116309237A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111575717.5A CN116309237A (en) 2021-12-20 2021-12-20 Deep high-resolution phase information extraction method
PCT/CN2022/139815 WO2023116584A1 (en) 2021-12-20 2022-12-17 Deep high-resolution phase information extraction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111575717.5A CN116309237A (en) 2021-12-20 2021-12-20 Deep high-resolution phase information extraction method

Publications (1)

Publication Number Publication Date
CN116309237A true CN116309237A (en) 2023-06-23

Family

ID=86811805

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111575717.5A Pending CN116309237A (en) 2021-12-20 2021-12-20 Deep high-resolution phase information extraction method

Country Status (2)

Country Link
CN (1) CN116309237A (en)
WO (1) WO2023116584A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013130587A1 (en) * 2012-02-28 2013-09-06 The Board Of Regents Of The University Of Texas System Method and apparatus for extended phase correction in phase sensitive magnetic resonance imaging
EP3889885A1 (en) * 2020-04-01 2021-10-06 Koninklijke Philips N.V. Bias correction for dark-field imaging based on sliding window phase retrieval
CN112568923B (en) * 2020-12-10 2022-08-19 中国科学院深圳先进技术研究院 X-ray phase contrast image extraction method, device, terminal and storage medium
CN113379818B (en) * 2021-05-24 2022-06-07 四川大学 Phase analysis method based on multi-scale attention mechanism network

Also Published As

Publication number Publication date
WO2023116584A1 (en) 2023-06-29

Similar Documents

Publication Publication Date Title
Xie et al. Artifact removal using improved GoogLeNet for sparse-view CT reconstruction
Joshi et al. Maximum a posteriori estimation with Good’s roughness for three-dimensional optical-sectioning microscopy
Thüring et al. Non-linear regularized phase retrieval for unidirectional X-ray differential phase contrast radiography
Broxton et al. Wave optics theory and 3-D deconvolution for the light field microscope
Holmes et al. Richardson-Lucy/maximum likelihood image restoration algorithm for fluorescence microscopy: further testing
Yang et al. TV-based conjugate gradient method and discrete L-curve for few-view CT reconstruction of X-ray in vivo data
Xu et al. Investigation of discrete imaging models and iterative image reconstruction in differential X-ray phase-contrast tomography
Conchello et al. Enhanced three-dimensional reconstruction from confocal scanning microscope images. II. Depth discrimination versus signal-to-noise ratio in partially confocal images
Zhang et al. Task-oriented low-dose CT image denoising
Hendriksen et al. On-the-fly machine learning for improving image resolution in tomography
Tivnan et al. Physical modeling and performance of spatial-spectral filters for CT material decomposition
Yashiro et al. Exploring frontiers of 4D X-ray tomography
Massimi et al. Fast, non-iterative algorithm for quantitative integration of X-ray differential phase-contrast images
Sperl et al. A Fourier-domain algorithm for total-variation regularized phase retrieval in differential X-ray phase contrast imaging
Alsamadony et al. Deep learning driven noise reduction for reduced flux computed tomography
Guzzi et al. Automatic differentiation for inverse problems in X-ray imaging and microscopy
Alloo et al. Dark-field tomography of an attenuating object using intrinsic x-ray speckle tracking
Shen et al. Blind restoration of remote sensing images by a combination of automatic knife-edge detection and alternating minimization
Zhang et al. Underwater Image Restoration via Adaptive Color Correction and Contrast Enhancement Fusion
Rositi et al. Information-based analysis of X-ray in-line phase tomography with application to the detection of iron oxide nanoparticles in the brain
Coe et al. Computational modeling of optical projection tomographic microscopy using the finite difference time domain method
CN116309237A (en) Deep high-resolution phase information extraction method
Cha et al. Development of adaptive point-spread function estimation method in various scintillation detector thickness for X-ray imaging
Liu et al. Optimization of reconstructed quality of hard x-ray phase microtomography
Riedel et al. Comparing x-ray phase-contrast imaging using a talbot array illuminator to propagation-based imaging for non-homogeneous biomedical samples

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination