CN120298564B - Volume electron microscope data visualization method, device and storage medium for generating instantiated projections
- Publication number: CN120298564B (application CN202510773132.6A)
- Authority: CN (China)
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—Three-dimensional [3D] image rendering
- G06T15/04—Texture mapping
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/16—Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—Three-dimensional [3D] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/60—Rotation of whole images or parts thereof
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
Abstract
The invention discloses a method, an apparatus, and a storage medium for visualizing volume electron microscope data by generating instantiated projections, in the technical field of image processing. The method comprises: performing dot multiplication on a texture data block of the volume electron microscope data and the corresponding segmentation mask block to obtain mask-filtered instantiated texture data, and superposing marked subcellular structures on the mask-filtered instantiated texture data to obtain overlay-marked instantiated texture data; converting the overlay-marked instantiated texture data on an arbitrary-angle plane onto orthogonal planes by data rotation; obtaining an instantiated projection map on the current orthogonal plane with a projection function and normalizing it; and synthesizing the normalized projection maps after spatial proportion correction, so as to output a visualization result corresponding to the current ID.
Description
Technical Field
The present invention relates to the field of image processing technologies, and in particular to a method, an apparatus, and a storage medium for visualizing volume electron microscope data by generating instantiated projections.
Background
Neural tissue image sequences obtained by electron microscopy have high resolution but enormous data volume; how to efficiently extract and visualize neuron structure information from the segmented data is a key technical challenge.
Browsing or visualizing volume electron microscope data requires either traversing all layers or displaying the data through three-dimensional rendering. Taking a sequential image stack as an example, each picture is one layer, and a three-dimensional structure in the volume electron microscope data often spans multiple layers. To browse a structural object, it is therefore necessary to traverse every slice in which the structure exists. In electron microscope imaging of biological tissue such as nerve cells, the cell structures span so much of the data that only a small data block can be read for browsing, yet all layers of that small block must usually be accessed, which is inefficient.
The other approach is three-dimensional rendering: the surface of each object in a data block can be meshed and rendered by an algorithm. However, such mesh generation algorithms generally require targeted designs to handle different surface morphologies and, depending on the rendering tool and rendering settings, may produce different results, so they cannot simply be deployed for wide application. Moreover, rendering from a surface mesh completely discards the image texture, which is important for understanding the data object and for subsequent processing.
In general, the prior art lacks an efficient and intuitive way of visualizing the multiple objects present in volume electron microscope data.
Disclosure of Invention
Based on the technical problems in the background art, the invention provides a method, an apparatus, and a storage medium for visualizing volume electron microscope data by generating instantiated projections, which can quickly generate a visual format covering the morphology and texture of multiple objects in the volume electron microscope data, enabling fast retrieval, verification, and analysis of such data.
The invention provides a method for visualizing volume electron microscope data by generating instantiated projections, comprising the following steps:
performing dot multiplication on a texture data block of the volume electron microscope data and the corresponding segmentation mask block to obtain mask-filtered instantiated texture data, and superposing marked subcellular structures on the mask-filtered instantiated texture data to obtain overlay-marked instantiated texture data;
converting the overlay-marked instantiated texture data on an arbitrary-angle plane onto orthogonal planes by data rotation, and obtaining and normalizing an instantiated projection map on the current orthogonal plane with a projection function;
and synthesizing the normalized projection maps after spatial proportion correction, so as to output a visualization result corresponding to the current ID.
Further, the texture data block and the segmentation mask block have the same size and are aligned;
all non-zero ID values are extracted from the segmentation mask block by de-duplication to form a set S = {id_1, id_2, ..., id_n}, wherein each ID in the set S represents an instance object;
the voxels holding the k-th ID value id_k are set to 1 and all other voxels to 0, thereby creating the segmentation mask block of the k-th ID, M_k(x, y, z) ∈ {0, 1}.
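As an illustrative sketch only (not part of the claims), the de-duplication and per-ID binary mask construction described above might be implemented with NumPy as follows; the function name `instance_masks` and the use of 0 as the background ID are assumptions of the sketch:

```python
import numpy as np

def instance_masks(seg_block):
    """Extract all non-zero ID values from a segmentation mask block and
    build, for each ID, a binary mask (1 at that ID's voxels, 0 elsewhere)."""
    ids = np.unique(seg_block)          # de-duplication
    ids = ids[ids != 0]                 # drop the background ID
    return {int(k): (seg_block == k).astype(np.uint8) for k in ids}
```

Each returned mask is the M_k of one instance object and can be dot-multiplied with the texture block.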
Further, the marked subcellular structures are superposed on the mask-filtered instantiated texture data, specifically:
taking and marking the intersection region between the mask-filtered instantiated texture data and the configured subcellular structure segmentation marking mask;
if no subcellular structure is to be marked, all values of the subcellular structure segmentation marking mask are set to 0;
if a plurality of subcellular structures are to be marked, the instantiated texture data overlaid with the current subcellular structure segmentation marking mask is sequentially overlaid with the next subcellular structure segmentation marking mask.
Further, the objects are marked by a marking function; when a plurality of objects are marked, each marking operation is independent. The marking function is as follows:
V'_k(x, y, z) = f_mark(V_k(x, y, z)), if M_sub(x, y, z) = 1;
V'_k(x, y, z) = V_k(x, y, z), otherwise;
wherein V_k is the mask-filtered instantiated texture data of the k-th object, V'_k is the overlay-marked mask-filtered instantiated texture data, (x, y, z) are the three-dimensional coordinates of a voxel within the texture data block, f_mark is the marking function, and M_sub is the subcellular structure mask. For example, f_mark(v) = (L, 0, 0) gives a red mark and f_mark(v) = min(h · v, L) a highlight mark, wherein L is the mark luminance value, typically 255, and h is a highlight coefficient.
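The two example marking functions could be sketched as follows, assuming NumPy, a grayscale uint8 volume, L = 255, and a highlight coefficient h = 2; the function names are hypothetical:

```python
import numpy as np

def mark_red(vol, sub_mask, L=255):
    """Red mark: lift the grayscale volume to RGB and paint voxels inside
    the subcellular mask pure red with luminance L."""
    rgb = np.repeat(vol[..., None], 3, axis=-1).astype(np.uint8)
    rgb[sub_mask > 0] = (L, 0, 0)
    return rgb

def mark_highlight(vol, sub_mask, h=2.0, L=255):
    """Highlight mark: scale voxels inside the mask by h (> 1), clip at L."""
    out = vol.astype(np.float32)
    sel = sub_mask > 0
    out[sel] = np.minimum(out[sel] * h, L)
    return out.astype(np.uint8)
```

Either function plays the role of f_mark on the intersection region M_sub = 1.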
Further, when the overlay-marked instantiated texture data on an arbitrary-angle plane is converted onto orthogonal planes by data rotation, the orthogonal planes consist of the X'Y', X'Z', and Y'Z' planes;
the projection map on the current orthogonal plane is obtained with the projection function, specifically:
for the X'Y' plane projection, the mask-filtered instantiated texture data is summed along the Z' direction to obtain the instantiated projection map on the X'Y' orthogonal plane, so that the boundaries in the X' and Y' directions on all slices of the original three-dimensional volume electron microscope data are preserved while the thickness and texture in the Z' direction are displayed in gray scale;
for the X'Z' plane projection, the mask-filtered instantiated texture data is summed along the Y' direction to obtain the instantiated projection map on the X'Z' orthogonal plane, so that the boundaries in the X' and Z' directions on all slices of the original three-dimensional volume electron microscope data are preserved while the thickness and texture in the Y' direction are displayed in gray scale;
for the Y'Z' plane projection, the mask-filtered instantiated texture data is summed along the X' direction to obtain the instantiated projection map on the Y'Z' orthogonal plane, so that the boundaries in the Y' and Z' directions on all slices of the original three-dimensional volume electron microscope data are preserved while the thickness and texture in the X' direction are displayed in gray scale.
Further, the normalized projection maps are corrected for spatial proportion, specifically:
the projection maps on the X'Y' plane, the X'Z' plane, and the Y'Z' plane are each subjected to a scaling operation so as to be consistent with the actual physical space proportion.
Further, a GPU is used to accelerate the operations of the volume electron microscope data visualization method.
A volume electron microscope data visualization device for generating instantiated projections comprises a data preprocessing module, a projection map calculation module, and a visualization module;
the data preprocessing module is used for performing dot multiplication on a texture data block of the volume electron microscope data and the corresponding segmentation mask block to obtain mask-filtered instantiated texture data, and for superposing marked subcellular structures on the mask-filtered instantiated texture data to obtain overlay-marked instantiated texture data;
the projection map calculation module is used for converting the overlay-marked instantiated texture data on an arbitrary-angle plane onto orthogonal planes by data rotation, and for obtaining and normalizing a projection map on the current orthogonal plane with a projection function;
and the visualization module is used for synthesizing the normalized projection maps after spatial proportion correction, so as to output a visualization result corresponding to the current ID.
Further, the data preprocessing module comprises a mask superposition module, a first marking module, and a second marking module;
the mask superposition module is used for taking and marking the intersection region between the mask-filtered instantiated texture data and the configured subcellular structure segmentation marking mask;
if no subcellular structure is to be marked, the first marking module sets all values of the subcellular structure segmentation marking mask to 0;
if a plurality of subcellular structures are to be marked, the second marking module sequentially overlays the instantiated texture data overlaid with the current subcellular structure segmentation marking mask with the next subcellular structure segmentation marking mask.
A computer-readable storage medium has stored thereon programs to be invoked by a processor to perform the volume electron microscope data visualization method described above.
The method, apparatus, and storage medium for visualizing volume electron microscope data by generating instantiated projections have the advantage that a visual format covering the morphology and texture of multiple objects in the volume electron microscope data can be generated quickly, enabling fast retrieval, verification, and analysis of the data. In addition, the instantiated projection map contains both structure and texture projection information, is close to the visual effect of X-ray and CT imaging, provides an intuitive representation of neuron morphology, and helps researchers quickly understand complex three-dimensional structures. Projections of multiple objects can be computed simultaneously, and subcellular structures (synapses, mitochondria, etc.) may be combined to resolve the position, morphology, and spatial relationships of multiple three-dimensional objects in the instantiated projection map. The method is applicable to volume electron microscope data and segmentation results from different sources, and supports the visualization needs of various neuron morphologies.
Drawings
FIG. 1 is a schematic diagram of the structure of the present invention;
FIG. 2 is a schematic diagram showing input data, (a) voxel data of biological tissue, (b) segmentation mask, (c) rendering of a small number of objects, and (d) single object in slice;
Fig. 3 is an exemplary projection view and a three-view, wherein a is an exemplary projection view obtained by combining a segmented object in the volume electron microscope data and an electron microscope image and then projecting the combined object and the electron microscope image along a plurality of directions;
FIG. 4 is a schematic diagram of three practical variations of a segmentation projection map, (a) a segmentation projection map using segmentation alone as a projection, (b) a marker projection map using neuron segmentation projection followed by superposition of synaptic segmentations (red), and (c) a texture projection map using segmentation, texture dot multiplication followed by projection;
FIG. 5 is an exemplary projection of multiple objects within the same region, (a) a projection of a dendrite bifurcation, (b) a projection of a dendrite trunk, (c) a projection of an axon tip, and (d) a projection of a thin axon tip.
Detailed Description
In the following detailed description of the present invention, numerous specific details are set forth in order to provide a thorough understanding of the present invention. The invention may be embodied in many other forms than described herein and similarly modified by those skilled in the art without departing from the spirit or scope of the invention, which is therefore not limited to the specific embodiments disclosed below.
This embodiment uses technical terms such as volume electron microscope data, sequence images, and data blocks: each sequence image is one layer, and the layers combine into a volume block; the three-dimensional structures within the volume electron microscope data are referred to as objects or segmentation objects. The volume electron microscope data mentioned in this embodiment often contains multiple three-dimensional structural objects, and each object may span the data block at any angle.
As shown in figs. 1 to 5, the method for visualizing volume electron microscope data by generating instantiated projections provided by the invention comprises the following steps:
step one, performing dot multiplication on a texture data block of the volume electron microscope data and the corresponding segmentation mask block to obtain mask-filtered instantiated texture data, and superposing marked subcellular structures on the mask-filtered instantiated texture data to obtain overlay-marked instantiated texture data;
step two, converting the overlay-marked instantiated texture data on an arbitrary-angle plane onto orthogonal planes by data rotation, and obtaining and normalizing an instantiated projection map on the current orthogonal plane with a projection function;
step three, synthesizing the normalized projection maps after spatial proportion correction, so as to output a visualization result corresponding to the current ID.
According to steps one to three, a visual format covering the morphology and texture of multiple objects in the volume electron microscope data can be generated quickly, enabling fast retrieval, verification, and analysis of the data. In addition, the instantiated projection map contains both structure and texture projection information, is close to the visual effect of X-ray and CT imaging, provides an intuitive representation of neuron morphology, and helps researchers quickly understand complex three-dimensional structures. Projections of multiple objects can be computed simultaneously, and subcellular structures (synapses, mitochondria, etc.) may be combined to resolve the position, morphology, and spatial relationships of multiple three-dimensional objects in the instantiated projection map. The method is applicable to volume electron microscope data and segmentation results from different sources, and supports the visualization needs of various neuron morphologies.
It should be noted that the volume electron microscope data visualization method of this embodiment may be applied to neurons or to other structural objects that densely occupy the three-dimensional space of a tissue, such as glial cells and muscle cells, so as to visualize the morphology of neurons or of other cell types in the tissue; neurons are described below, and other cell types are handled analogously.
In one embodiment, the first step is specifically:
Texture data block T(x, y, z) (fig. 2(a)): a three-dimensional data block composed of a sequence of electron microscope images, in which the gray value of each pixel represents the electron microscope imaging brightness of the neural tissue. Because of the heavy-metal staining in sample preparation, lipid-containing areas such as cell membranes and organelle membranes have higher brightness. Here (x, y, z) is a three-dimensional data coordinate.
Segmentation mask block of neurons M(x, y, z) (fig. 2(b)): a three-dimensional data block of the same size as the texture data block, in which each neuron object is represented by a unique ID and the background is typically set to 0. The texture data block and the segmentation mask block have the same size and are aligned. The mask may be generated by manual labeling or by an existing automatic segmentation algorithm.
Subcellular structure segmentation marking mask M_sub(x, y, z): a three-dimensional data block of the same size as the texture data block; the background is usually set to 0, and subcellular structures to be marked, such as mitochondria and synapses, carry non-zero values. One type of subcellular structure is usually represented by one ID (the later examples use a single subcellular structure mark with ID 1). The mask may be generated by manual labeling or by an existing automatic segmentation algorithm.
The specific volume electron microscope data loading process comprises reading a three-dimensional electron microscope image sequence as the texture data block (pixel values range from 0 to 255 and are inverted: pixel value = 255 - original pixel value), and reading the data of the corresponding neuron segmentation mask block and subcellular structure segmentation marking mask;
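The stacking-and-inversion step might be sketched as below, assuming NumPy and a list of already-decoded 2-D uint8 slices (image I/O is omitted); the function name is hypothetical:

```python
import numpy as np

def load_texture_block(slices):
    """Stack an electron microscope image sequence into a texture data block
    and invert intensity (pixel value = 255 - original pixel value)."""
    vol = np.stack(slices, axis=0).astype(np.uint8)
    return (255 - vol).astype(np.uint8)
```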
In this embodiment, the neuron IDs are first extracted from the segmentation mask block of neurons: all non-zero ID values are extracted by de-duplication to form a set S = {id_1, id_2, ..., id_n}, each ID representing a neuron object. Each neuron ID is denoted id_k.
Then the neuron mask region in the three-dimensional texture data is preserved, specifically:
the voxels holding the k-th ID value id_k are set to 1 and all other voxels to 0, thereby creating the neuron segmentation mask block of the k-th ID, M_k(x, y, z). The neuron segmentation mask block M_k and the texture data block T are dot-multiplied to obtain the mask-filtered instantiated texture data V_k, wherein each element is the product of the mask and the value at the corresponding position of the three-dimensional data in the texture data block:
V_k(x, y, z) = M_k(x, y, z) · T(x, y, z);
In the mask-filtered instantiated texture data, the cell membrane of the neuron forms the boundary: the pixel values of the target neuron are preserved, and pixels outside the target neuron have value 0. This operation separates out the instantiated texture data of the target neuron.
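The dot-multiplication step can be sketched as follows, assuming NumPy; the function name is hypothetical:

```python
import numpy as np

def mask_filtered_texture(tex, seg, obj_id):
    """Mask-filtered instantiated texture for one object: the binary mask of
    the given ID dot-multiplied with the texture block, zero elsewhere."""
    m = (seg == obj_id).astype(tex.dtype)
    return m * tex
```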
Finally, marks (in three dimensions) are superposed on the mask-filtered instantiated texture data:
For the subcellular structures of the mask-filtered instantiated texture data V_k (specific structure segmentations inside the neuron segmentation object, such as synapses and mitochondria), the intersection region between V_k and the subcellular structure segmentation marking mask M_sub is taken and marked. If no subcellular structure is to be marked, all values of the subcellular structure segmentation marking mask M_sub are set to 0; if a plurality of subcellular structures are to be marked, the instantiated texture data overlaid with the current subcellular structure segmentation marking mask is sequentially overlaid with the next subcellular structure segmentation marking mask until all object marks are complete. The marking function may differ at each superposition, i.e., several subcellular structures of different types can be marked on the same neuron.
The marking processing function is f_mark. It may set the corresponding pixel to red (fig. 4(b)) or apply another highlighting operation (e.g., multiply the original pixel value by 2 with a threshold of 255; values exceeding 255 after doubling are set to 255):
V'_k(x, y, z) = f_mark(V_k(x, y, z)), if M_sub(x, y, z) = 1;
V'_k(x, y, z) = V_k(x, y, z), otherwise;
wherein V_k is the mask-filtered instantiated texture data of the k-th neuron, V'_k is the overlay-marked mask-filtered instantiated texture data, (x, y, z) are the three-dimensional voxel coordinates of the texture data block, f_mark is the marking function, and M_sub is the subcellular structure mask. In the two illustrated examples, f_mark(v) = (L, 0, 0) gives a red mark and f_mark(v) = min(h · v, L) a highlight mark, wherein L is the mark luminance value, typically 255, and h is the highlight coefficient, typically greater than 1.
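The sequential superposition of several marking masks could be sketched as below (NumPy assumed; the function name and the use of the highlight variant on every pass are illustrative simplifications, since the embodiment allows a different marking function per pass):

```python
import numpy as np

def overlay_marks(vol, sub_masks, h=2.0, L=255):
    """Superpose marking masks in sequence: each pass applies the highlight
    marking function on the intersection of the mask with the instance."""
    out = vol.astype(np.float32)
    for m in sub_masks:
        sel = (m > 0) & (out > 0)       # intersection region only
        out[sel] = np.minimum(out[sel] * h, L)
    return out.astype(np.uint8)
```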
As can be seen from the four biological neural tissue examples of fig. 1, there are many segmented objects and it is difficult to directly observe and analyze objects that span multiple three-dimensional slices of the data; this embodiment therefore proposes an instantiated projection map method to visualize each segmented object.
In one embodiment, if the three-dimensional instantiated texture data is to be projected from an arbitrary direction, the data must be reoriented by spatial rotation so that the projection direction aligns with a coordinate axis, after which the projection is completed along the new axis. That is, the instantiated projection map of the overlay-marked instantiated texture data on an arbitrary-angle plane is generated as follows:
(a1) Data rotation;
The overlay-marked mask-filtered instantiated texture data is V'_k. An arbitrary rotation is expressed as a combination of Euler angles (α, β, γ) about the three axes; the rotation matrix R = R_z(γ) · R_y(β) · R_x(α) represents an Euler rotation in three-dimensional space, which can be decomposed into the product of rotation matrices about the X, Y, and Z axes:
R_x(α) = [[1, 0, 0], [0, cos α, -sin α], [0, sin α, cos α]];
R_y(β) = [[cos β, 0, sin β], [0, 1, 0], [-sin β, 0, cos β]];
R_z(γ) = [[cos γ, -sin γ, 0], [sin γ, cos γ, 0], [0, 0, 1]];
Each voxel coordinate (x', y', z') in the rotated data block is converted into a physical coordinate, mapped through the inverse rotation matrix R⁻¹ back into the physical coordinate system, and the rotated physical coordinate is then mapped back into the original voxel coordinate space; finally, the value at that position in the original three-dimensional electron microscope data is obtained by interpolation, yielding the rotated volume data value:
p' = (s_x · x', s_y · y', s_z · z');
p = R⁻¹ · p';
(x, y, z) = (p_x / s_x, p_y / s_y, p_z / s_z);
V''_k(x', y', z') = Interp(V'_k, (x, y, z));
wherein p' represents the physical coordinate (three-dimensional) corresponding to the voxel coordinate (x', y', z') in the rotated data block, p = (p_x, p_y, p_z) is the coordinate obtained by mapping p' back into the physical coordinate system through the inverse rotation matrix R⁻¹, x, y, z are the three-axis coordinates, s_x, s_y, s_z are the voxel spacings (physical voxel sizes) of the X, Y, and Z axes respectively, and Interp denotes the interpolation function used to obtain the data value of V'_k at the non-integer coordinate (x, y, z).
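A compact sketch of step (a1), assuming NumPy, array axis order (x, y, z), rotation about the block centre, and nearest-neighbour interpolation in place of a generic Interp (all of these are assumptions of the sketch, not requirements of the method):

```python
import numpy as np

def euler_matrix(alpha, beta, gamma):
    """R = Rz(gamma) @ Ry(beta) @ Rx(alpha)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    return rz @ ry @ rx

def rotate_volume(vol, alpha, beta, gamma, spacing=(1.0, 1.0, 1.0)):
    """Resample vol after an Euler rotation about the block centre:
    output voxel -> physical coords -> inverse rotation -> input voxel,
    sampled with nearest-neighbour interpolation."""
    r_inv = euler_matrix(alpha, beta, gamma).T     # inverse of a rotation matrix
    s = np.asarray(spacing, dtype=float)
    centre = (np.array(vol.shape) - 1) / 2.0
    grids = np.meshgrid(*[np.arange(n) for n in vol.shape], indexing="ij")
    coords = np.stack(grids, axis=-1).astype(float)
    phys = (coords - centre) * s                   # voxel -> physical coordinates
    src = (phys @ r_inv.T) / s + centre            # rotate back, physical -> voxel
    idx = np.rint(src).astype(int)
    valid = np.all((idx >= 0) & (idx < np.array(vol.shape)), axis=-1)
    out = np.zeros_like(vol)
    out[valid] = vol[idx[..., 0][valid], idx[..., 1][valid], idx[..., 2][valid]]
    return out
```

A production implementation would normally use trilinear interpolation and a dedicated resampling routine; the structure (inverse mapping plus interpolation) is the same.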
(a2) projection after rotation;
The size and shape of the rotated data block are unchanged and its values come from the interpolation result, but the coordinates become (x', y', z'); the three orthogonal directions of the data block no longer follow the original directions, and X, Y, Z become X', Y', Z'. In this case, projections onto the orthogonal planes X'Y', X'Z', and Y'Z' are performed as follows:
for the X'Y' plane projection, the mask-filtered instantiated texture data is summed along the Z' direction to obtain the instantiated projection map on the X'Y' orthogonal plane, so that the boundaries in the X' and Y' directions on all slices of the original three-dimensional volume electron microscope data are preserved while the thickness and texture in the Z' direction are displayed in gray scale:
P_{X'Y'}(x', y') = Σ_{z'} V''_k(x', y', z');
for the X'Z' plane projection, the mask-filtered instantiated texture data is summed along the Y' direction to obtain the instantiated projection map on the X'Z' orthogonal plane, so that the boundaries in the X' and Z' directions on all slices of the original three-dimensional volume electron microscope data are preserved while the thickness and texture in the Y' direction are displayed in gray scale:
P_{X'Z'}(x', z') = Σ_{y'} V''_k(x', y', z');
for the Y'Z' plane projection, the mask-filtered instantiated texture data is summed along the X' direction to obtain the instantiated projection map on the Y'Z' orthogonal plane, so that the boundaries in the Y' and Z' directions on all slices of the original three-dimensional volume electron microscope data are preserved while the thickness and texture in the X' direction are displayed in gray scale:
P_{Y'Z'}(y', z') = Σ_{x'} V''_k(x', y', z');
wherein P_{X'Y'} is the instantiated projection map on the X'Y' plane, P_{X'Z'} is the instantiated projection map on the X'Z' plane, and P_{Y'Z'} is the instantiated projection map on the Y'Z' plane.
The projection computation for instantiated volume data is intended to make full use of the space occupied by the mask-filtered instantiated texture data; to highlight different visual features of interest, different forms of projection function may be employed. Taking the X'Y' plane projection of the rotated instantiated texture data V''_k as an example, implementations include but are not limited to the following:
(b1) maximum projection:
P_{X'Y'}(x', y') = max_{z'} V''_k(x', y', z');
used to highlight high-intensity signals, such as myelin and lysosome textures, within the space of the mask-filtered instantiated texture data.
(b2) minimum projection:
P_{X'Y'}(x', y') = min_{z'} V''_k(x', y', z');
suitable for structures with inverted display brightness, such as cavitation bubbles and low-density areas.
(b3) standard deviation projection:
P_{X'Y'}(x', y') = std_{z'} V''_k(x', y', z');
used to highlight regions with sharp signal variation and to enhance boundary and texture contrast.
(b4) median projection:
P_{X'Y'}(x', y') = median_{z'} V''_k(x', y', z');
insensitive to noise and suitable for more stable tissue structure analysis.
It should be noted that these projection methods, like the summation projection described above, are computationally efficient, achieve a perspective effect, and can be GPU-accelerated. All projection results can be mapped to the [0, 255] gray range by a normalization operation to meet conventional image display requirements.
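The alternative projection operators and the [0, 255] normalization could be sketched as follows (NumPy assumed; names are hypothetical):

```python
import numpy as np

# Projection operators selectable per visual feature of interest.
PROJECTIONS = {"sum": np.sum, "max": np.max, "min": np.min,
               "std": np.std, "median": np.median}

def project(vol, axis=0, mode="sum"):
    """Project a 3-D instantiated texture volume along one axis."""
    return PROJECTIONS[mode](vol, axis=axis)

def normalize_u8(proj):
    """Map a projection map to the [0, 255] gray range for display."""
    p = proj.astype(np.float64)
    rng = p.max() - p.min()
    if rng == 0:
        return np.zeros(p.shape, dtype=np.uint8)
    return np.rint(255 * (p - p.min()) / rng).astype(np.uint8)
```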
Preferably, when the projection directions are the normal orthogonal directions X, Y, Z, the three-dimensional instantiated texture data is projected by summing along the given coordinate axis and normalizing; that is, the instantiated projection maps of the overlay-marked instantiated texture data on the three orthogonal planes are generated as follows:
projecting on the XY orthogonal plane: the mask-filtered instantiated texture data is summed along the Z direction to obtain the instantiated projection map on the XY orthogonal plane, so that the boundaries in the X and Y directions on all slices of the original three-dimensional volume electron microscope data are preserved while the thickness and texture in the Z direction are displayed in gray scale:
P_{XY}(x, y) = Σ_z V'_k(x, y, z);
wherein P_{XY} is the instantiated projection map on the XY orthogonal plane.
projecting on the XZ orthogonal plane: the mask-filtered instantiated texture data is summed along the Y direction to obtain the instantiated projection map on the XZ orthogonal plane, so that the boundaries in the X and Z directions on all slices of the original three-dimensional volume electron microscope data are preserved while the thickness and texture in the Y direction are displayed in gray scale:
P_{XZ}(x, z) = Σ_y V'_k(x, y, z);
wherein P_{XZ} is the instantiated projection map on the XZ orthogonal plane.
projecting on the YZ orthogonal plane: the mask-filtered instantiated texture data is summed along the X direction to obtain the instantiated projection map on the YZ orthogonal plane, so that the boundaries in the Y and Z directions on all slices of the original three-dimensional volume electron microscope data are preserved while the thickness and texture in the X direction are displayed in gray scale:
P_{YZ}(y, z) = Σ_x V'_k(x, y, z);
wherein P_{YZ} is the instantiated projection map on the YZ orthogonal plane.
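Assuming array axis order (X, Y, Z) for the volume (an assumption of this sketch), the three summation projections might be written as:

```python
import numpy as np

def orthogonal_projections(vol):
    """Sum the mask-filtered instantiated texture data along each axis to get
    the instantiated projection maps on the XY, XZ, and YZ orthogonal planes."""
    p_xy = vol.sum(axis=2)   # sum along Z -> map over (x, y)
    p_xz = vol.sum(axis=1)   # sum along Y -> map over (x, z)
    p_yz = vol.sum(axis=0)   # sum along X -> map over (y, z)
    return p_xy, p_xz, p_yz
```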
According to the above projection scheme, the conversion from three-dimensional data to two-dimensional data is achieved; the schematic is shown in fig. 3A, where the instantiated projection map is obtained by combining a segmented object in the volume electron microscope data with the electron microscope image and then projecting along several directions. The instantiated projection map designed in this embodiment retains the boundary of the object while also preserving its texture features to some extent (fig. 3B; since the electron microscope image is used in the projection, if a mitochondrial mask is set, the mitochondrial region appears as a red or highlight mark, and even with no mitochondrial mask set, the mitochondrial texture can still be observed in the instantiated projection map).
As shown in Fig. 4, (a), (b), and (c) are three practical variants of the instantiated projection: projection using only the segmentation mask (segmentation projection), neuron segmentation projection with the synaptic segmentation superimposed in red (marker projection), and projection of the dot product of segmentation and texture (texture projection). The differences among them can be seen clearly in Fig. 4.
If the texture is not dot-multiplied in, directly projecting the neuron segmentation mask block yields a simple morphological projection; a projection obtained this way merely lacks some texture features, as shown in Fig. 4(a).
The instantiated projection obtained on each orthogonal plane is normalized to the range 0-255:

P_norm = 255 × (P − min(P)) / (max(P) − min(P));

wherein P denotes the instantiated projection on the current orthogonal plane, specifically corresponding to P_XY, P_XZ, or P_YZ. It will be appreciated that when X, Y, and Z are directly orthogonal, P corresponds directly to P_XY, P_XZ, or P_YZ, and P_norm accordingly denotes the normalized projection on that same plane.
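A minimal sketch of this 0-255 normalization (the function name and the guard for a constant image are our additions, not part of the embodiment):

```python
import numpy as np

def normalize_0_255(P):
    """Linearly rescale a projection to the 0-255 range as uint8."""
    lo, hi = P.min(), P.max()
    if hi == lo:                      # constant image: avoid division by zero
        return np.zeros_like(P, dtype=np.uint8)
    return (255.0 * (P - lo) / (hi - lo)).astype(np.uint8)
```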
In one embodiment, the third step is specifically:
The normalized projections on the XZ-plane projection and the YZ-plane projection are each subjected to a scaling (stretching) operation so that they are consistent with the actual physical space proportions:

P'_UZ = Resize(P_UZ, r), U ∈ {X, Y};

wherein P'_UZ is the projection after the scaling operation, Resize(·) denotes the scaling function, r is the anisotropic acquisition coefficient, and U takes X for the XZ plane and Y for the YZ plane; the two planes do not take the same parameters at the same time, so the formula finally covers both Z-containing planes.
When X, Y, and Z are directly orthogonal planes, X and Y typically have the same resolution in the electron microscope image, while the Z direction may differ depending on the acquisition resolution; in many cases the single-layer thickness, i.e., the Z-direction resolution value, is greater than that of the X and Y directions. The projections containing Z are therefore scaled:

P'_UZ = Resize(P_UZ, r);

wherein P'_UZ is the projection after the scaling operation, Resize(·) denotes the scaling function, r is the anisotropic acquisition coefficient (e.g., the ratio of the Z-direction resolution to the X- or Y-direction resolution), and U is X or Y, i.e., U takes X when the plane is XZ and Y when the plane is YZ. In addition, this embodiment does not exclude stretching or scaling the instantiated projection on the XY orthogonal plane as well.
After the view space-ratio correction is completed, two modes are available. One is to store the corrected three views as three channels of an image, with the blank regions arising from size differences between the views padded with the value 0. The other is to combine the three scaled projections into one integrated view V (Fig. 3(B)):

V = Concat(P_norm,XY, P'_XZ, P'_YZ);

wherein L_X, L_Y, and L_Z respectively denote the lengths (voxel numbers) along the X, Y, and Z directions, and P_norm,XY, P'_XZ, and P'_YZ are the three view projections after normalization and spatial-scale correction.
Preferably, when X, Y, and Z are directly orthogonal, the three scaled projections are combined into one integrated view V (Fig. 3(B)):

V = Concat(P_norm,XY, P'_XZ, P'_YZ);

wherein L_X, L_Y, and L_Z respectively denote the X-, Y-, and Z-direction lengths (voxel numbers). In the above formula, P_norm,XY refers to the normalized instantiated projection on the XY orthogonal plane, and P'_XZ and P'_YZ are the instantiated projections corrected by spatial scale.
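One possible way to assemble the three corrected views into a single image is shown below; the exact layout of Fig. 3(B) is not reproduced here, so this 2x2 montage with zero padding (and all shapes) is an assumption:

```python
import numpy as np

ny, nx, nz = 64, 64, 32           # L_Y, L_X, L_Z (after Z-scaling, nz is already stretched)
P_xy = np.random.rand(ny, nx)     # normalized XY projection, [y, x]
P_xz = np.random.rand(nz, nx)     # scale-corrected XZ projection, [z, x]
P_yz = np.random.rand(nz, ny)     # scale-corrected YZ projection, [z, y]

# 2x2 montage: XY top-left, YZ (transposed to [y, z]) top-right,
# XZ bottom-left, zero padding in the unused bottom-right corner.
top = np.hstack([P_xy, P_yz.T])
bottom = np.hstack([P_xz, np.zeros((nz, nz))])
V = np.vstack([top, bottom])      # integrated view, shape (ny + nz, nx + nz)
```

The zero block corresponds to the "blank region of the size difference padded with 0" described above.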
The integrated view is stored as an image file whose file name contains the corresponding neuron ID, which facilitates subsequent indexing by neuron.
Compared with operations such as three-dimensional rendering, the method of this embodiment for generating instantiated projections is simple: apart from step one, the remaining operations are simple batch matrix operations such as dot products and summations, so the matrix operations can be accelerated on a GPU, and a large number of instantiated projections can be generated rapidly. The three views of the instantiated projection can be extended to multiple views, which effectively improves the accuracy of the structural description. Similar to CT, three-dimensional data can be reconstructed from multiple view angles; likewise, the three-dimensional data can be reconstructed from multi-view instantiated projections, which shows that the description of the structure by the instantiated projection is stable and well determined.
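Because the operations are plain array reductions, a GPU port can be sketched by swapping NumPy for CuPy (this assumes the `cupy` package and a CUDA device are available; the sketch falls back to NumPy otherwise):

```python
# CuPy mirrors the NumPy API, so the projection sums need no code changes
# beyond the import: the same reductions then execute on the GPU.
try:
    import cupy as xp            # GPU path (assumption: cupy + CUDA available)
except ImportError:
    import numpy as xp           # CPU fallback, identical API for this sketch

T = xp.random.rand(32, 64, 64)   # mask-filtered texture volume, [z, y, x]
P_xy = T.sum(axis=0)             # with CuPy this reduction runs on the GPU
```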
As an embodiment:
Referring to Fig. 4, using a texture data block of neural tissue as an example, on the basis of existing voxel data and the segmentation of all the neuron fragments within it (the segmentation may be obtained by manual labeling or an automatic segmentation algorithm), a segmentation projection (Fig. 4(a)), a marker projection (Fig. 4(b)), and a texture projection (Fig. 4(c)) can be obtained by, respectively, projecting the plain segmentation mask, superimposing the synaptic segmentation projection on the neuron segmentation projection, and projecting the dot product of segmentation mask and texture.
The light and dark of the segmentation projection in Fig. 4(a) show the thickness variation of the segmented object: bright parts are thick and dim parts are thin. From the three views, the segmented object can be seen to run from the upper, deep, left side of the data block to the lower, shallow, left side, and the object carries many small dendritic spine structures branching off a dendrite. Since the texture projection has more texture detail than the segmentation projection, its brightness is no longer simple thickness information but also includes the projection of the internal structure of the segmented object, so structures such as mitochondria can be distinguished in it (Fig. 3(B)).
The marker projection of Fig. 4(b) shows whether synapses are formed on the dendrite, where red is the projection of the post-synaptic regions. It shows that multiple synapses are formed on the dendritic spines and the dendrite, while one suspected dendritic spine is not found to form a synapse. This view can thus be used, on the basis of the segmentation of an existing neuron object, to analyze the relationship between a neuron fragment and neuron substructures and to verify possible segmentation errors.
Fig. 5 shows instantiated projections of further fragments of the neural tissue block; from left to right, (a), (b), (c), and (d) are texture projections (with labeled subcellular structures overlaid) computed for several segmented objects in the same texture data block. From them it can be observed that the segmented objects are, respectively, a dendrite branch, a dendrite trunk, an axon terminal, and a thin axon terminal, all of which are different locations on a neuron. This example illustrates that the instantiated projection retains sufficient morphological features to distinguish where on the neuron a fragment belongs. Such a determination is often impossible from a single electron microscope image, because its field of view is too small; multiple slices must be observed in succession, and the lack of three-dimensional information can sometimes cause erroneous judgments. Once these segmented objects can be observed and distinguished, the instantiated projection can be used to find a desired object (search) among a large number of neuron segmentations: for example, to extract larger structures for further analysis, a dendrite trunk may be selected; to examine the connections between neuronal terminals, the axon terminals and thin axon terminals may be selected for further analysis.
After the labeled subcellular structures are superimposed, an inspection can also be performed: for example, subcellular structure markers may come from manual or automatic segmentation and may therefore contain errors. According to common knowledge in the art, synapses may be distributed on axon terminals; if a synapse marker appears on an axon terminal, the marker is likely correct, whereas if it appears on other fragments, it may be incorrect.
Such a judgment is often difficult to make from a single electron microscope picture, while three-dimensional rendering is complex, software-dependent, and prone to occlusion (for example, subcellular structures on the back or inside of a single neuron cannot be observed from a single viewing angle in a 3D rendering). Fig. 5 illustrates that the method of this embodiment can exhibit the morphological and texture features of different neuron fragments in an instantiated projection. The method generates instantiated projections efficiently with a small data volume, and the instantiated projections of all objects can be lightly compressed and stored as retrieval thumbnails, so that an analysis object of interest can be found quickly in the three-dimensional data block, enabling fast retrieval.
The foregoing is only a preferred embodiment of the present invention, but the scope of the present invention is not limited thereto; any equivalent substitution or modification made by a person skilled in the art according to the technical scheme of the present invention and its inventive concept, within the scope disclosed by the present invention, shall be covered by the scope of the present invention.
Claims (10)
1. A volume electron microscope data visualization method for generating instantiated projections, characterized by comprising the following steps:
Performing dot multiplication on texture data blocks of the volume electron microscope data and segmentation mask blocks corresponding to the volume electron microscope data to obtain mask-filtered instantiation texture data, and superposing a mark subcellular structure on the mask-filtered instantiation texture data to obtain superposition-marked instantiation texture data;
converting the instantiation texture data of the superposition mark on any angle plane into an orthogonal plane through data rotation, and obtaining and normalizing an instantiation projection diagram on the current orthogonal plane by utilizing a projection function;
And synthesizing the normalized projection images after space proportion correction, so as to output a visual result corresponding to the current ID.
2. The method of claim 1, wherein the texture data block and the segmentation mask block are the same size and aligned;

all non-zero ID values are extracted from the segmentation mask block by de-duplication to form a set S, wherein each ID in the set S represents an instance object;

the voxels with the i-th ID value are set to 1 and the others to 0, thereby creating the segmentation mask block of the i-th ID, i ∈ S.
3. The volume electron microscope data visualization method for generating instantiated projections of claim 1, wherein superimposing a labeled subcellular structure on the mask-filtered instantiated texture data specifically comprises:

taking the intersection region between the mask-filtered instantiated texture data and the set subcellular structure segmentation marking mask, and marking it;

if no subcellular structure to be marked exists, setting all values of the subcellular structure segmentation marking mask to 0;

if a plurality of subcellular structures to be marked exist, sequentially superimposing the instantiated texture data already superimposed with the current subcellular structure segmentation marking mask with the next subcellular structure segmentation marking mask.
4. The volume electron microscope data visualization method for generating instantiated projections of claim 3, wherein objects are marked by a marking function, each marking function being independent when a plurality of objects are marked, the marking functions being as follows:

T'_i(x, y, z) = F_red(T_i, M): v if M(x, y, z) = 1, otherwise T_i(x, y, z);

T'_i(x, y, z) = F_high(T_i, M): k · T_i(x, y, z) if M(x, y, z) = 1, otherwise T_i(x, y, z);

wherein T_i is the mask-filtered instantiated texture data of the i-th object, T'_i is the mask-filtered instantiated texture data after superimposed marking, (x, y, z) are the three-dimensional coordinates of a voxel within the texture data block, F_red and F_high are the marking functions, M is the subcellular structure mask, F_red is the red mark, F_high is the highlight mark, v is the mark intensity value, and k is the highlight coefficient.
5. The volume electron microscope data visualization method for generating instantiated projections of claim 1, wherein, in converting the superposition-marked instantiated texture data on an arbitrary-angle plane into orthogonal planes by data rotation, the orthogonal planes comprise the XY, XZ, and YZ planes;

obtaining the instantiated projection on the current orthogonal plane by using the projection function specifically comprises:

in the XY-plane projection, summing the mask-filtered instantiated texture data along the Z direction to obtain the instantiated projection on the XY orthogonal plane, so that the X- and Y-direction boundaries on all slices of the original three-dimensional volume electron microscope data are preserved and the Z-direction thickness and texture are displayed as gray scale;

in the XZ-plane projection, summing the mask-filtered instantiated texture data along the Y direction to obtain the instantiated projection on the XZ orthogonal plane, so that the X- and Z-direction boundaries on all slices of the original three-dimensional volume electron microscope data are preserved and the Y-direction thickness and texture are displayed as gray scale;

in the YZ-plane projection, summing the mask-filtered instantiated texture data along the X direction to obtain the instantiated projection on the YZ orthogonal plane, so that the Y- and Z-direction boundaries on all slices of the original three-dimensional volume electron microscope data are preserved and the X-direction thickness and texture are displayed as gray scale.
6. The volume electron microscope data visualization method for generating instantiated projections of claim 5, wherein the normalized projection is corrected by spatial scaling, specifically:

the normalized projections on the XZ-plane projection and the YZ-plane projection are each subjected to a scaling operation so that they are consistent with the actual physical space proportions.
7. The volume electron microscope data visualization method for generating instantiated projections of claim 1, wherein a GPU is used to accelerate the operations of the volume electron microscope data visualization method.
8. A volume electron microscope data visualization device for generating instantiated projections, characterized by comprising a data preprocessing module, a projection map calculation module, and a visualization module;
the data preprocessing module is used for performing dot multiplication on texture data blocks of the volume electron microscope data and segmentation mask blocks corresponding to the volume electron microscope data to obtain mask-filtered instantiated texture data, and superimposing a labeled subcellular structure on the mask-filtered instantiated texture data to obtain superposition-marked instantiated texture data;
the projection map calculation module is used for converting the instantiation texture data of the superposition mark on the arbitrary angle plane into an orthogonal plane through data rotation, and obtaining and normalizing the instantiation projection map on the current orthogonal plane by utilizing a projection function;
And the visualization module is used for synthesizing the normalized projection images after the spatial proportion correction so as to output a visualization result corresponding to the current ID.
9. The device for visualizing data in a volumetric electron microscope for generating an instantiated projection as defined in claim 8, wherein the data preprocessing module comprises a mask overlay module, a first marking module and a second marking module;
the mask superposition module is used for taking the intersection region between the mask-filtered instantiated texture data and the set subcellular structure segmentation marking mask and marking it;
If no subcellular structure to be marked exists, setting all values of a subcellular structure segmentation marking mask to be 0 through a first marking module;
if a plurality of subcellular structures to be marked exist, the second marking module sequentially superimposes the instantiation texture data superimposed with the current subcellular structure division marking mask with the next subcellular structure division marking mask.
10. A computer-readable storage medium, wherein a plurality of programs are stored on the computer-readable storage medium, the programs being called by a processor to execute the volume electron microscope data visualization method for generating instantiated projections according to any one of claims 1 to 7.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202510773132.6A CN120298564B (en) | 2025-06-11 | 2025-06-11 | Volume electron microscope data visualization method, device and storage medium for generating instantiated projections |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN120298564A CN120298564A (en) | 2025-07-11 |
| CN120298564B true CN120298564B (en) | 2025-09-09 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||