CN117710229A - Multi-mode-based imaging image fusion and superposition method and related device - Google Patents

Multi-mode-based imaging image fusion and superposition method and related device

Info

Publication number
CN117710229A
CN117710229A CN202311738173.9A CN202311738173A CN117710229A CN 117710229 A CN117710229 A CN 117710229A CN 202311738173 A CN202311738173 A CN 202311738173A CN 117710229 A CN117710229 A CN 117710229A
Authority
CN
China
Prior art keywords
image data
ultrasonic
imaging
imaging image
skin contact
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311738173.9A
Other languages
Chinese (zh)
Other versions
CN117710229B (en)
Inventor
李亚楠
丁毅
雷晓兵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Peninsula Medical Technology Co ltd
Original Assignee
Hunan Peninsula Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Peninsula Medical Technology Co ltd filed Critical Hunan Peninsula Medical Technology Co ltd
Priority to CN202311738173.9A priority Critical patent/CN117710229B/en
Publication of CN117710229A publication Critical patent/CN117710229A/en
Application granted granted Critical
Publication of CN117710229B publication Critical patent/CN117710229B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks
    • G06N3/0442Recurrent networks, e.g. Hopfield networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10132Ultrasound image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention discloses a multi-mode-based imaging image fusion and superposition method and a related device, wherein the method comprises the following steps: after energy treatment is carried out on the skin contact area by using an energy therapeutic instrument, imaging treatment is carried out on the skin contact area, and ultrasonic imaging image data and three-dimensional thermal imaging image data of the skin contact area are obtained; respectively carrying out image feature extraction processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data to obtain first extracted feature data and second extracted feature data; and carrying out multi-mode fusion superposition processing based on the first extracted feature data and the second extracted feature data to obtain fusion superposition ultrasonic image data. In the embodiment of the invention, the ultrasonic imaging data with temperature data information of the region after energy treatment can be rapidly obtained, and the subsequent evaluation and detection of the region after energy treatment are facilitated.

Description

Multi-mode-based imaging image fusion and superposition method and related device
Technical Field
The invention relates to the technical field of imaging image analysis, and in particular to a multi-mode-based imaging image fusion and superposition method and a related device.
Background
In some existing energy treatment devices, because a corresponding imaging device is not integrated, after a medical staff uses the energy treatment device to treat a designated area of a patient, the treatment result cannot be analyzed with a suitable device in time; the corresponding treatment evaluation can only be made from the working experience of the medical staff, so the evaluation result may be far from the actual result. Meanwhile, when the temperature of the treatment area is controlled empirically, the temperature easily becomes too high and the intended treatment effect cannot be achieved. Therefore, an imaging device that integrates ultrasonic imaging and thermal imaging functions on the treatment device is needed to generate ultrasonic imaging data carrying temperature data information, so as to facilitate subsequent evaluation and detection of the area after energy treatment.
Disclosure of Invention
The invention aims to overcome the defects of the prior art, and provides a multi-mode-based imaging image fusion and superposition method and a related device, which can rapidly acquire ultrasonic imaging data with temperature data information of an area after energy treatment, and are convenient for subsequent evaluation and detection of the area after energy treatment.
In order to solve the technical problems, the embodiment of the invention provides a multi-mode-based imaging image fusion and superposition method which is applied to treatment equipment, wherein an energy therapeutic instrument and imaging equipment are integrated on a treatment head of the treatment equipment; the method comprises the following steps:
after the treatment head is contacted with the skin and the energy treatment instrument is utilized to treat the skin contact area, the skin contact area is subjected to imaging treatment based on an ultrasonic imaging module in the imaging equipment, and ultrasonic imaging image data of the skin contact area are obtained;
imaging the skin contact area based on an ultrasonic temperature measurement module in the imaging equipment to obtain three-dimensional thermal imaging image data of the skin contact area;
respectively carrying out image feature extraction processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data to obtain first extracted feature data corresponding to the ultrasonic imaging image data and second extracted feature data corresponding to the three-dimensional thermal imaging image data;
and carrying out multi-mode fusion superposition processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data based on the first extracted feature data and the second extracted feature data to obtain fusion superposition ultrasonic image data.
Optionally, the imaging processing is performed on the skin contact area based on the ultrasonic imaging module in the imaging device to obtain ultrasonic imaging image data of the skin contact area, including:
transmitting a plurality of ultrasonic signals to the skin contact area based on an ultrasonic imaging module in the imaging device, and generating ultrasonic imaging image data of the skin contact area based on the received ultrasonic echo signals of the skin contact area.
Optionally, the ultrasound imaging module generates ultrasound imaging image data of the skin contact area based on the received multiple segments of ultrasound echo signals of the skin contact area, including:
the ultrasonic imaging module carries out filtering treatment on the multi-section ultrasonic echo signals to obtain filtered multi-section ultrasonic echo signals;
the ultrasound imaging module generates ultrasound imaging image data of the skin contact region based on the filtered multi-segment ultrasound echo signals.
Optionally, the imaging processing is performed on the skin contact area based on the ultrasonic thermometry module in the imaging device, so as to obtain three-dimensional thermal imaging image data of the skin contact area, including:
when the ultrasonic temperature measurement module in the imaging equipment carries out thermal imaging detection on the skin contact area, the three-dimensional thermal imaging processing is carried out on the skin contact area based on an echo time shifting algorithm, and three-dimensional thermal imaging image data of the skin contact area are obtained.
Optionally, the performing image feature extraction processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data to obtain first extracted feature data corresponding to the ultrasonic imaging image data and second extracted feature data corresponding to the three-dimensional thermal imaging image data respectively includes:
image segmentation processing is carried out on the ultrasonic imaging image data and the three-dimensional thermal imaging image data according to a preset segmentation rule, segmented ultrasonic imaging image data and segmented three-dimensional thermal imaging image data are obtained, and the size of each segmented block of the segmented ultrasonic imaging image data is consistent with that of each segmented region of the segmented three-dimensional thermal imaging image data;
encoding the split ultrasonic imaging image data and the split three-dimensional thermal imaging image data by using a FACET module to obtain encoded split ultrasonic imaging image data and encoded split three-dimensional thermal imaging image data;
performing feature extraction processing on the encoded segmented ultrasonic imaging image data and the encoded segmented three-dimensional thermal imaging image data based on a Bi-LSTM and an attention mechanism to obtain extracted features of the segmented ultrasonic imaging image data and extracted features of the segmented three-dimensional thermal imaging image data;
Respectively carrying out feature enhancement processing on the extracted features of the segmented ultrasonic imaging image data and the extracted features of the segmented three-dimensional thermal imaging image data to obtain enhanced extracted features of the segmented ultrasonic imaging image data and enhanced extracted features of the segmented three-dimensional thermal imaging image data;
and re-stitching the enhanced extracted features of the segmented ultrasonic imaging image data and the enhanced extracted features of the segmented three-dimensional thermal imaging image data according to segmentation rules to obtain first extracted feature data corresponding to the ultrasonic imaging image data and second extracted feature data corresponding to the three-dimensional thermal imaging image data.
Optionally, the performing multi-mode fusion and superposition processing on the ultrasound imaging image data and the three-dimensional thermal imaging image data based on the first extracted feature data and the second extracted feature data to obtain fusion and superposition ultrasound image data includes:
and carrying out multi-mode fusion superposition processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data by utilizing the first extracted feature data and the second extracted feature data based on an orthogonalization mechanism to obtain fusion superposition ultrasonic image data.
Optionally, the method further comprises: positioning a target tissue image area based on the fusion and superposition ultrasonic image data, and performing tissue deformation and temperature analysis treatment on the positioned target tissue image area to obtain a tissue analysis result of the target tissue image area;
tissue state information after energy treatment is obtained based on the tissue analysis results of the target tissue image region.
In addition, the embodiment of the invention provides an imaging graph analysis device based on multi-mode fusion, which is applied to treatment equipment, wherein an energy therapeutic instrument and an imaging device are integrated on a treatment head of the treatment equipment; the device comprises:
a first imaging module: used for performing imaging processing on the skin contact area based on an ultrasonic imaging module in the imaging device after the treatment head contacts the skin and the energy therapeutic instrument performs energy treatment on the skin contact area, so as to obtain ultrasonic imaging image data of the skin contact area;
a second imaging module: used for performing imaging processing on the skin contact area based on an ultrasonic temperature measurement module in the imaging device to obtain three-dimensional thermal imaging image data of the skin contact area;
a feature extraction module: used for respectively performing image feature extraction processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data to obtain first extracted feature data corresponding to the ultrasonic imaging image data and second extracted feature data corresponding to the three-dimensional thermal imaging image data;
a multi-mode fusion module: used for performing multi-mode fusion and superposition processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data based on the first extracted feature data and the second extracted feature data to obtain fusion superposition ultrasonic image data.
In addition, an embodiment of the present invention provides a therapeutic apparatus, including a processor and a memory, where the processor executes a computer program or code stored in the memory to implement any one of the above-mentioned methods for fusion and superposition of imaging maps.
In addition, an embodiment of the present invention provides a computer readable storage medium storing a computer program or code, which when executed by a processor, implements the method for fusion and superposition of imaging maps according to any of the above.
In the embodiment of the invention, the ultrasonic imaging data with temperature data information of the region after energy treatment can be quickly obtained by collecting the ultrasonic imaging image data and the three-dimensional thermal imaging image data of the skin contact region and fusing and superposing the ultrasonic imaging image data and the three-dimensional thermal imaging image data to form fused superposed ultrasonic image data, so that the subsequent evaluation and detection of the region after energy treatment are convenient, the designated treatment region is prevented from deviating from a normal value due to overhigh temperature, and the treatment effect is ensured.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings which are required in the description of the embodiments or the prior art will be briefly described, it being obvious that the drawings in the description below are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of a multi-modality-based image fusion overlay method in an embodiment of the invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, fig. 1 is a flow chart of a multi-modality-based image fusion and superposition method according to an embodiment of the present invention.
As shown in fig. 1, a multi-mode-based imaging image fusion superposition method is applied to treatment equipment, and an energy therapeutic instrument, a first imaging device and a second imaging device are integrated on a treatment head of the treatment equipment; the method comprises the following steps:
s11: after the treatment head is contacted with the skin and the energy treatment instrument is utilized to treat the skin contact area, the skin contact area is subjected to imaging treatment based on an ultrasonic imaging module in the imaging equipment, and ultrasonic imaging image data of the skin contact area are obtained;
in the implementation process of the invention, the imaging processing is performed on the skin contact area based on the ultrasonic imaging module in the imaging device to obtain ultrasonic imaging image data of the skin contact area, and the method comprises the following steps: transmitting a plurality of ultrasonic signals to the skin contact area based on an ultrasonic imaging module in the imaging device, and generating ultrasonic imaging image data of the skin contact area based on the received ultrasonic echo signals of the skin contact area.
Further, the ultrasound imaging module generates ultrasound imaging image data of the skin contact area based on the received multi-segment ultrasound echo signals of the skin contact area, comprising: the ultrasonic imaging module carries out filtering treatment on the multi-section ultrasonic echo signals to obtain filtered multi-section ultrasonic echo signals; the ultrasound imaging module generates ultrasound imaging image data of the skin contact region based on the filtered multi-segment ultrasound echo signals.
Specifically, when a user performs a treatment operation on a designated skin area by using a treatment head of a treatment device, after the treatment head is in contact with the skin and the contacted skin contact area is subjected to energy treatment by using an energy therapeutic instrument integrated on the treatment head, an ultrasonic imaging module in an imaging device integrated on the treatment head is used for imaging the skin contact area, so that ultrasonic imaging image data of the skin contact area can be obtained.
When an ultrasonic imaging module in the imaging device performs imaging processing, a plurality of sections of ultrasonic signals are transmitted to a skin contact area through the ultrasonic imaging module of the imaging device, meanwhile, the ultrasonic echo signals of the skin contact area are received by the ultrasonic imaging module of the imaging device, and then ultrasonic imaging image data are generated according to the received plurality of sections of ultrasonic echo signals.
When the multi-segment ultrasonic echo signals are used to generate the ultrasonic imaging image data of the skin contact area, the multi-segment ultrasonic echo signals first need to be demodulated to generate multiple segments of discontinuous demodulation data; then, for each segment of the discontinuous demodulation data, the data of a preset length counted from the start of the segment is filtered to generate filtered data, wherein an IIR filter is used for the filtering; the filtered data and the data of the segment that has not yet been filtered are then filtered recursively to generate segmented filtered demodulation data; interpolation is performed on the segmented filtered demodulation data to form interpolation-complemented demodulation data, and finally the interpolation-complemented demodulation data is used to generate the ultrasonic imaging image data of the skin contact area.
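The following Python sketch illustrates one possible reading of this segment-wise filtering and interpolation step. It is not the patented implementation: the IIR filter design (a Butterworth low-pass from SciPy), the preset head length, the gap width between segments and the linear interpolation scheme are all illustrative assumptions, since the text does not specify concrete parameters.

```python
import numpy as np
from scipy.signal import butter, lfilter

def filter_segments(echo_segments, head_len=64, order=4, cutoff=0.2):
    """Filter the first `head_len` samples of each demodulated segment with an
    IIR (Butterworth) filter, then continue filtering the rest of the segment
    recursively by carrying over the filter state."""
    b, a = butter(order, cutoff)                      # IIR filter coefficients
    zi_len = max(len(a), len(b)) - 1
    filtered = []
    for seg in echo_segments:
        head, state = lfilter(b, a, seg[:head_len], zi=np.zeros(zi_len))
        tail, _ = lfilter(b, a, seg[head_len:], zi=state)   # reuse filter state
        filtered.append(np.concatenate([head, tail]))
    return filtered

def interpolate_gaps(filtered_segments, gap=16):
    """Join the discontinuous filtered segments and fill the assumed gaps
    between them by linear interpolation."""
    samples, positions, pos = [], [], 0
    for seg in filtered_segments:
        positions.extend(range(pos, pos + len(seg)))
        samples.extend(seg)
        pos += len(seg) + gap                         # gap between segments
    full_axis = np.arange(pos - gap)
    return np.interp(full_axis, np.array(positions), np.array(samples))

# Usage with synthetic demodulated echo segments:
rng = np.random.default_rng(0)
segments = [rng.standard_normal(256) for _ in range(4)]
scan_line = interpolate_gaps(filter_segments(segments))  # filtered, gap-filled line
```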
S12: imaging the skin contact area based on an ultrasonic temperature measurement module in the imaging equipment to obtain three-dimensional thermal imaging image data of the skin contact area;
in the implementation process of the invention, the imaging processing is performed on the skin contact area based on the ultrasonic temperature measurement module in the imaging device to obtain three-dimensional thermal imaging image data of the skin contact area, and the method comprises the following steps: when the ultrasonic temperature measurement module in the imaging equipment carries out thermal imaging detection on the skin contact area, the three-dimensional thermal imaging processing is carried out on the skin contact area based on an echo time shifting algorithm, and three-dimensional thermal imaging image data of the skin contact area are obtained.
Specifically, ultrasonic waves are sent to the skin contact area through the ultrasonic temperature measurement module, and nondestructive temperature measurement is then achieved by means of echo time shifting. That is, assuming the depth of the skin contact area is z and the acoustic wave propagates at the speed c(T), the required travel time is:

t(z) = ∫₀^z dζ / c(ζ, θ(ζ))

If thermal expansion is taken into account, this becomes:

t_c(z) = ∫₀^z [1 + α(ζ)·θ(ζ)] / c(ζ, θ(ζ)) dζ

wherein θ(ζ) represents the temperature at depth ζ; c(ζ, θ(ζ)) represents the sound velocity at depth ζ and temperature θ(ζ); α(ζ) represents the thermal expansion coefficient at depth ζ; and t_c and the c therein denote the travel time and sound velocity when the dependence of the sound velocity on heat is taken into account.
By the method, the temperature data of the skin contact area can be calculated, and the three-dimensional thermal imaging image data of the skin contact area can be obtained by constructing the three-dimensional image.
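As a numerical illustration of the echo time-shift relations above, the following Python sketch discretizes the two travel-time integrals for an assumed temperature profile and an assumed linear sound-speed model; the sound-speed law, the thermal expansion coefficient and the heating profile are placeholders, not values from the patent.

```python
import numpy as np

def sound_speed(theta_c):
    """Assumed linear sound-speed model for soft tissue (m/s) as a function
    of temperature in degrees Celsius."""
    return 1540.0 + 1.0 * (theta_c - 37.0)

def travel_time(depth_m, theta_profile, alpha=1.5e-4, n=1000, with_expansion=False):
    """Discretize t(z) = integral_0^z d(zeta) / c(zeta, theta(zeta)); when
    with_expansion is True, weight each step by (1 + alpha * theta) as in the
    thermal-expansion-corrected form t_c(z)."""
    zeta = np.linspace(0.0, depth_m, n)
    theta = theta_profile(zeta)
    c = sound_speed(theta)
    weight = 1.0 + alpha * theta if with_expansion else np.ones_like(theta)
    integrand = weight / c
    d_zeta = zeta[1] - zeta[0]
    # trapezoidal rule over the depth grid
    return float(np.sum(0.5 * (integrand[:-1] + integrand[1:]) * d_zeta))

# Example: tissue heated around 5 mm depth after energy treatment.
profile = lambda z: 37.0 + 8.0 * np.exp(-((z - 0.005) / 0.002) ** 2)
t_plain = travel_time(0.01, profile)
t_expansion = travel_time(0.01, profile, with_expansion=True)
print(f"t(z)   = {t_plain * 1e6:.4f} us")
print(f"t_c(z) = {t_expansion * 1e6:.4f} us")
```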
S13: respectively carrying out image feature extraction processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data to obtain first extracted feature data corresponding to the ultrasonic imaging image data and second extracted feature data corresponding to the three-dimensional thermal imaging image data;
in the implementation process of the present invention, the image feature extraction processing performed on the ultrasonic imaging image data and the three-dimensional thermal imaging image data to obtain first extracted feature data corresponding to the ultrasonic imaging image data and second extracted feature data corresponding to the three-dimensional thermal imaging image data includes: performing image segmentation processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data according to a preset segmentation rule to obtain segmented ultrasonic imaging image data and segmented three-dimensional thermal imaging image data, wherein the size of each segmented block of the segmented ultrasonic imaging image data is consistent with that of each segmented region of the segmented three-dimensional thermal imaging image data; encoding the segmented ultrasonic imaging image data and the segmented three-dimensional thermal imaging image data by using a FACET module to obtain encoded segmented ultrasonic imaging image data and encoded segmented three-dimensional thermal imaging image data; performing feature extraction processing on the encoded segmented ultrasonic imaging image data and the encoded segmented three-dimensional thermal imaging image data based on a Bi-LSTM and an attention mechanism to obtain extracted features of the segmented ultrasonic imaging image data and extracted features of the segmented three-dimensional thermal imaging image data; respectively performing feature enhancement processing on the extracted features of the segmented ultrasonic imaging image data and the extracted features of the segmented three-dimensional thermal imaging image data to obtain enhanced extracted features of the segmented ultrasonic imaging image data and enhanced extracted features of the segmented three-dimensional thermal imaging image data; and re-stitching the enhanced extracted features of the segmented ultrasonic imaging image data and the enhanced extracted features of the segmented three-dimensional thermal imaging image data according to the segmentation rule to obtain the first extracted feature data corresponding to the ultrasonic imaging image data and the second extracted feature data corresponding to the three-dimensional thermal imaging image data.
Specifically, image segmentation processing is performed on the ultrasonic imaging image data and the three-dimensional thermal imaging image data according to a preset segmentation rule, where the preset segmentation rule can be set according to user requirements, for example evenly splitting each image into 4 or 8 image blocks, so as to obtain segmented ultrasonic imaging image data and segmented three-dimensional thermal imaging image data, with each segmented block of the segmented ultrasonic imaging image data consistent in size with the corresponding segmented region of the segmented three-dimensional thermal imaging image data. First, the segmented ultrasonic imaging image data and the segmented three-dimensional thermal imaging image data are each encoded by a FACET module to obtain encoded segmented ultrasonic imaging image data and encoded segmented three-dimensional thermal imaging image data; then, a Bi-LSTM and an attention mechanism are used to perform feature extraction processing on the encoded segmented ultrasonic imaging image data and the encoded segmented three-dimensional thermal imaging image data respectively, so that the extracted features of the segmented ultrasonic imaging image data and the extracted features of the segmented three-dimensional thermal imaging image data are obtained. In order to obtain better extracted features, feature enhancement processing is then performed on the extracted features of the segmented ultrasonic imaging image data and the extracted features of the segmented three-dimensional thermal imaging image data respectively, so as to obtain enhanced extracted features of the segmented ultrasonic imaging image data and enhanced extracted features of the segmented three-dimensional thermal imaging image data. Finally, the enhanced extracted features of the segmented ultrasonic imaging image data and the enhanced extracted features of the segmented three-dimensional thermal imaging image data are re-ordered and stitched according to the segmentation rule to obtain the first extracted feature data corresponding to the ultrasonic imaging image data and the second extracted feature data corresponding to the three-dimensional thermal imaging image data.
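A minimal PyTorch sketch of this per-patch extraction path is given below. The FACET module is not specified in the text, so a small convolutional encoder stands in for it; the patch counts, channel sizes, additive attention and the linear feature-enhancement layer are illustrative assumptions rather than the patented architecture.

```python
import torch
import torch.nn as nn

class PatchBiLSTMAttention(nn.Module):
    def __init__(self, in_ch=1, embed_dim=64, hidden=64):
        super().__init__()
        # Placeholder encoder standing in for the FACET module.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, embed_dim),
        )
        self.bilstm = nn.LSTM(embed_dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)               # additive attention scores
        self.enhance = nn.Linear(2 * hidden, 2 * hidden)   # feature enhancement

    def forward(self, patches):                            # patches: (B, N, C, H, W)
        b, n = patches.shape[:2]
        tokens = self.encoder(patches.flatten(0, 1)).view(b, n, -1)
        seq, _ = self.bilstm(tokens)                       # (B, N, 2*hidden)
        weights = torch.softmax(self.attn(seq), dim=1)     # attention over patches
        attended = seq * weights                           # weighted per-patch features
        return torch.relu(self.enhance(attended))          # enhanced patch features

# Split each modality into equal-size patches, run the same extractor on both,
# then re-stitch the per-patch features in the original patch order.
ultra_patches = torch.randn(2, 4, 1, 64, 64)    # 4 patches per ultrasound image
thermal_patches = torch.randn(2, 4, 1, 64, 64)  # matching 3-D thermal patches
extractor = PatchBiLSTMAttention()
first_features = extractor(ultra_patches)       # first extracted feature data
second_features = extractor(thermal_patches)    # second extracted feature data
```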
S14: based on the first extracted feature data and the second extracted feature data, carrying out multi-mode fusion superposition processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data to obtain fusion superposition ultrasonic image data;
in the implementation process of the present invention, the multi-mode fusion and superposition processing is performed on the ultrasound imaging image data and the three-dimensional thermal imaging image data based on the first extracted feature data and the second extracted feature data to obtain fusion and superposition ultrasound image data, including: and carrying out multi-mode fusion superposition processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data by utilizing the first extracted feature data and the second extracted feature data based on an orthogonalization mechanism to obtain fusion superposition ultrasonic image data.
Specifically, when the multi-mode fusion processing is performed, the independence between the first extracted feature data and the second extracted feature data may be reduced, which can lead to feature collapse; therefore, an orthogonalization mechanism is adopted to perform the multi-mode fusion and superposition processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data by using the first extracted feature data and the second extracted feature data, so as to obtain the fusion superposition ultrasonic image data.
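The following sketch shows one plausible interpretation of such an orthogonalization mechanism: the thermal-imaging features are projected onto the subspace orthogonal to the ultrasound features before a weighted fusion, so that the two modalities remain independent. The QR-based projection and the fusion weights are assumptions for illustration only, not the patented procedure.

```python
import numpy as np

def orthogonal_fusion(first_feat, second_feat, w_ultra=0.6, w_thermal=0.4):
    """first_feat, second_feat: (num_patches, dim) feature matrices for the
    ultrasound and thermal modalities, respectively."""
    # Orthonormal basis of the ultrasound feature subspace.
    q, _ = np.linalg.qr(first_feat.T)                # (dim, rank)
    # Remove the component of the thermal features lying in that subspace.
    second_orth = second_feat - (second_feat @ q) @ q.T
    # Weighted fusion of the (now decorrelated) modalities.
    return w_ultra * first_feat + w_thermal * second_orth

fused = orthogonal_fusion(np.random.randn(4, 128), np.random.randn(4, 128))
print(fused.shape)   # (4, 128): one fused feature vector per patch
```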
S15: positioning a target tissue image area based on the fusion and superposition ultrasonic image data, and performing tissue deformation and temperature analysis treatment on the positioned target tissue image area to obtain a tissue analysis result of the target tissue image area;
in the implementation process of the invention, the positioning of the target tissue image area based on the fusion superimposed ultrasonic image data, and the tissue deformation and temperature analysis processing of the positioned target tissue image area are carried out to obtain the tissue analysis result of the target tissue image area, comprising the following steps: positioning the target tissue image area based on the heat source position in the fusion superimposed ultrasonic image data to obtain a positioned target tissue image area; according to the surface temperature data and the heat source position, performing temperature analysis processing on the target tissue image area to obtain a tissue temperature analysis result of the target tissue image area; and carrying out deformation fitting treatment on the target tissue image area and the given tissue image area to obtain a deformation analysis result of the target tissue image area.
Specifically, the target tissue image area may be subjected to positioning processing according to the heat source position marked in the fused superimposed ultrasound image data, so as to obtain a positioned target tissue image area, in general, the heat source position marked in the fused imaging chart may be used as the positioned target tissue image area, and after determining the distance between the target tissue image area and the surface of the skin contact area, the distance may be represented by using the heat source depth; then, according to the surface temperature data and the heat source position, carrying out temperature analysis processing on the target tissue image area to obtain a tissue analysis result of the target tissue image area; the specific analytical formula is as follows:
Wherein T represents the temperature at any point on the surface of the skin contact area; T_m represents the temperature extremum of the surface of the skin contact area; d represents the depth value of the heat source, wherein a represents the distance from any point on the skin contact area surface to the point of the temperature extremum on the skin contact area surface; k represents the tissue thermal conductivity; and T_Q represents the temperature of the tissue in the target tissue image region, i.e., the temperature data of the heat source.
The temperature data of the target tissue image region, i.e., the temperature data of the heat source, can be calculated through the above formula. A target tissue image region is then extracted from the fused superimposed ultrasonic image data by using an edge extraction algorithm, and deformation fitting processing is performed between the target tissue image region and a given tissue image region to obtain a deformation analysis result of the target tissue image region. During the deformation fitting processing, geometric key points in the target tissue image region map are first extracted, deformation analysis processing is performed by fitting the extracted geometric key points to the geometric key points of the given tissue image region, and the deformation analysis result of the target tissue image region is recorded, wherein the geometric key points include bending turning points, curvature extreme points, crossing points and bifurcation points.
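The deformation fitting between matched geometric key points can be illustrated with a least-squares similarity (Procrustes) alignment, as sketched below; the transform model and the residual used as the deformation measure are assumptions, since the text does not specify the fitting method.

```python
import numpy as np

def fit_deformation(target_pts, reference_pts):
    """target_pts, reference_pts: (N, 2) matched key-point coordinates
    (turning points, curvature extrema, crossings, bifurcations)."""
    t_mean, r_mean = target_pts.mean(0), reference_pts.mean(0)
    t_c, r_c = target_pts - t_mean, reference_pts - r_mean
    # Least-squares rotation via SVD of the cross-covariance (Kabsch/Procrustes).
    u, s, vt = np.linalg.svd(t_c.T @ r_c)
    rot = vt.T @ u.T
    scale = s.sum() / (t_c ** 2).sum()
    aligned = scale * (t_c @ rot.T) + r_mean
    residual = np.linalg.norm(aligned - reference_pts, axis=1).mean()
    return residual   # mean key-point misfit after rotation-and-scale alignment

# Example with four matched key points:
target = np.array([[10, 12], [40, 15], [38, 44], [11, 41]], float)
reference = np.array([[9, 11], [41, 14], [39, 46], [10, 42]], float)
print(f"deformation residual: {fit_deformation(target, reference):.2f} px")
```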
S16: obtaining tissue state information after energy treatment based on the tissue analysis result of the target tissue image region.
In the implementation process of the invention, after the tissue temperature analysis result and the deformation analysis result of the target tissue image region are obtained, the tissue state information after energy treatment can be confirmed through the tissue temperature analysis result and the deformation analysis result.
In the embodiment of the invention, the ultrasonic imaging data with temperature data information of the region after energy treatment can be quickly obtained by collecting the ultrasonic imaging image data and the three-dimensional thermal imaging image data of the skin contact region and fusing and superposing the ultrasonic imaging image data and the three-dimensional thermal imaging image data to form fused superposed ultrasonic image data, so that the subsequent evaluation and detection of the region after energy treatment are convenient, the designated treatment region is prevented from deviating from a normal value due to overhigh temperature, and the treatment effect is ensured.
In a second embodiment, the imaging graph analysis device based on multi-mode fusion is applied to treatment equipment, and an energy therapeutic instrument and an imaging device are integrated on a treatment head of the treatment equipment; the device comprises:
A first imaging module: used for performing imaging processing on the skin contact area based on an ultrasonic imaging module in the imaging device after the treatment head contacts the skin and the energy therapeutic instrument performs energy treatment on the skin contact area, so as to obtain ultrasonic imaging image data of the skin contact area;
in the implementation process of the invention, the imaging processing is performed on the skin contact area based on the ultrasonic imaging module in the imaging device to obtain ultrasonic imaging image data of the skin contact area, and the method comprises the following steps: transmitting a plurality of ultrasonic signals to the skin contact area based on an ultrasonic imaging module in the imaging device, and generating ultrasonic imaging image data of the skin contact area based on the received ultrasonic echo signals of the skin contact area.
Further, the ultrasound imaging module generates ultrasound imaging image data of the skin contact area based on the received multi-segment ultrasound echo signals of the skin contact area, comprising: the ultrasonic imaging module carries out filtering treatment on the multi-section ultrasonic echo signals to obtain filtered multi-section ultrasonic echo signals; the ultrasound imaging module generates ultrasound imaging image data of the skin contact region based on the filtered multi-segment ultrasound echo signals.
Specifically, when a user performs a treatment operation on a designated skin area by using a treatment head of a treatment device, after the treatment head is in contact with the skin and the contacted skin contact area is subjected to energy treatment by using an energy therapeutic instrument integrated on the treatment head, an ultrasonic imaging module in an imaging device integrated on the treatment head is used for imaging the skin contact area, so that ultrasonic imaging image data of the skin contact area can be obtained.
When an ultrasonic imaging module in the imaging device performs imaging processing, a plurality of sections of ultrasonic signals are transmitted to a skin contact area through the ultrasonic imaging module of the imaging device, meanwhile, the ultrasonic echo signals of the skin contact area are received by the ultrasonic imaging module of the imaging device, and then ultrasonic imaging image data are generated according to the received plurality of sections of ultrasonic echo signals.
When the multi-segment ultrasonic echo signals are used to generate the ultrasonic imaging image data of the skin contact area, the multi-segment ultrasonic echo signals first need to be demodulated to generate multiple segments of discontinuous demodulation data; then, for each segment of the discontinuous demodulation data, the data of a preset length counted from the start of the segment is filtered to generate filtered data, wherein an IIR filter is used for the filtering; the filtered data and the data of the segment that has not yet been filtered are then filtered recursively to generate segmented filtered demodulation data; interpolation is performed on the segmented filtered demodulation data to form interpolation-complemented demodulation data, and finally the interpolation-complemented demodulation data is used to generate the ultrasonic imaging image data of the skin contact area.
A second imaging module: used for performing imaging processing on the skin contact area based on an ultrasonic temperature measurement module in the imaging device to obtain three-dimensional thermal imaging image data of the skin contact area;
in the implementation process of the invention, the imaging processing is performed on the skin contact area based on the ultrasonic temperature measurement module in the imaging device to obtain three-dimensional thermal imaging image data of the skin contact area, and the method comprises the following steps: when the ultrasonic temperature measurement module in the imaging equipment carries out thermal imaging detection on the skin contact area, the three-dimensional thermal imaging processing is carried out on the skin contact area based on an echo time shifting algorithm, and three-dimensional thermal imaging image data of the skin contact area are obtained.
Specifically, ultrasonic waves are sent to the skin contact area through the ultrasonic temperature measurement module, and nondestructive temperature measurement is then achieved by means of echo time shifting. That is, assuming the depth of the skin contact area is z and the acoustic wave propagates at the speed c(T), the required travel time is:

t(z) = ∫₀^z dζ / c(ζ, θ(ζ))

If thermal expansion is taken into account, this becomes:

t_c(z) = ∫₀^z [1 + α(ζ)·θ(ζ)] / c(ζ, θ(ζ)) dζ

wherein θ(ζ) represents the temperature at depth ζ; c(ζ, θ(ζ)) represents the sound velocity at depth ζ and temperature θ(ζ); α(ζ) represents the thermal expansion coefficient at depth ζ; and t_c and the c therein denote the travel time and sound velocity when the dependence of the sound velocity on heat is taken into account.
By the method, the temperature data of the skin contact area can be calculated, and the three-dimensional thermal imaging image data of the skin contact area can be obtained by constructing the three-dimensional image.
A feature extraction module: used for respectively performing image feature extraction processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data to obtain first extracted feature data corresponding to the ultrasonic imaging image data and second extracted feature data corresponding to the three-dimensional thermal imaging image data;
in the implementation process of the present invention, the image feature extraction processing performed on the ultrasonic imaging image data and the three-dimensional thermal imaging image data to obtain first extracted feature data corresponding to the ultrasonic imaging image data and second extracted feature data corresponding to the three-dimensional thermal imaging image data includes: performing image segmentation processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data according to a preset segmentation rule to obtain segmented ultrasonic imaging image data and segmented three-dimensional thermal imaging image data, wherein the size of each segmented block of the segmented ultrasonic imaging image data is consistent with that of each segmented region of the segmented three-dimensional thermal imaging image data; encoding the segmented ultrasonic imaging image data and the segmented three-dimensional thermal imaging image data by using a FACET module to obtain encoded segmented ultrasonic imaging image data and encoded segmented three-dimensional thermal imaging image data; performing feature extraction processing on the encoded segmented ultrasonic imaging image data and the encoded segmented three-dimensional thermal imaging image data based on a Bi-LSTM and an attention mechanism to obtain extracted features of the segmented ultrasonic imaging image data and extracted features of the segmented three-dimensional thermal imaging image data; respectively performing feature enhancement processing on the extracted features of the segmented ultrasonic imaging image data and the extracted features of the segmented three-dimensional thermal imaging image data to obtain enhanced extracted features of the segmented ultrasonic imaging image data and enhanced extracted features of the segmented three-dimensional thermal imaging image data; and re-stitching the enhanced extracted features of the segmented ultrasonic imaging image data and the enhanced extracted features of the segmented three-dimensional thermal imaging image data according to the segmentation rule to obtain the first extracted feature data corresponding to the ultrasonic imaging image data and the second extracted feature data corresponding to the three-dimensional thermal imaging image data.
Specifically, image segmentation processing is performed on the ultrasonic imaging image data and the three-dimensional thermal imaging image data according to a preset segmentation rule, where the preset segmentation rule can be set according to user requirements, for example evenly splitting each image into 4 or 8 image blocks, so as to obtain segmented ultrasonic imaging image data and segmented three-dimensional thermal imaging image data, with each segmented block of the segmented ultrasonic imaging image data consistent in size with the corresponding segmented region of the segmented three-dimensional thermal imaging image data. First, the segmented ultrasonic imaging image data and the segmented three-dimensional thermal imaging image data are each encoded by a FACET module to obtain encoded segmented ultrasonic imaging image data and encoded segmented three-dimensional thermal imaging image data; then, a Bi-LSTM and an attention mechanism are used to perform feature extraction processing on the encoded segmented ultrasonic imaging image data and the encoded segmented three-dimensional thermal imaging image data respectively, so that the extracted features of the segmented ultrasonic imaging image data and the extracted features of the segmented three-dimensional thermal imaging image data are obtained. In order to obtain better extracted features, feature enhancement processing is then performed on the extracted features of the segmented ultrasonic imaging image data and the extracted features of the segmented three-dimensional thermal imaging image data respectively, so as to obtain enhanced extracted features of the segmented ultrasonic imaging image data and enhanced extracted features of the segmented three-dimensional thermal imaging image data. Finally, the enhanced extracted features of the segmented ultrasonic imaging image data and the enhanced extracted features of the segmented three-dimensional thermal imaging image data are re-ordered and stitched according to the segmentation rule to obtain the first extracted feature data corresponding to the ultrasonic imaging image data and the second extracted feature data corresponding to the three-dimensional thermal imaging image data.
A multi-mode fusion module: used for performing multi-mode fusion and superposition processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data based on the first extracted feature data and the second extracted feature data to obtain fusion superposition ultrasonic image data;
in the implementation process of the present invention, the multi-mode fusion and superposition processing is performed on the ultrasound imaging image data and the three-dimensional thermal imaging image data based on the first extracted feature data and the second extracted feature data to obtain fusion and superposition ultrasound image data, including: and carrying out multi-mode fusion superposition processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data by utilizing the first extracted feature data and the second extracted feature data based on an orthogonalization mechanism to obtain fusion superposition ultrasonic image data.
Specifically, when the multi-mode fusion processing is performed, the independence between the first extracted feature data and the second extracted feature data may be reduced, which can lead to feature collapse; therefore, an orthogonalization mechanism is adopted to perform the multi-mode fusion and superposition processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data by using the first extracted feature data and the second extracted feature data, so as to obtain the fusion superposition ultrasonic image data.
An analysis and processing module: used for positioning a target tissue image region based on the fusion superposition ultrasonic image data, and performing tissue deformation and temperature analysis processing on the positioned target tissue image region to obtain a tissue analysis result of the target tissue image region;
in the implementation process of the invention, the positioning of the target tissue image area based on the fusion superimposed ultrasonic image data, and the tissue deformation and temperature analysis processing of the positioned target tissue image area are carried out to obtain the tissue analysis result of the target tissue image area, comprising the following steps: positioning the target tissue image area based on the heat source position in the fusion superimposed ultrasonic image data to obtain a positioned target tissue image area; according to the surface temperature data and the heat source position, performing temperature analysis processing on the target tissue image area to obtain a tissue temperature analysis result of the target tissue image area; and carrying out deformation fitting treatment on the target tissue image area and the given tissue image area to obtain a deformation analysis result of the target tissue image area.
Specifically, the target tissue image area may be subjected to positioning processing according to the heat source position marked in the fused superimposed ultrasound image data, so as to obtain a positioned target tissue image area, in general, the heat source position marked in the fused imaging chart may be used as the positioned target tissue image area, and after determining the distance between the target tissue image area and the surface of the skin contact area, the distance may be represented by using the heat source depth; then, according to the surface temperature data and the heat source position, carrying out temperature analysis processing on the target tissue image area to obtain a tissue analysis result of the target tissue image area; the specific analytical formula is as follows:
Wherein T represents the temperature at any point on the surface of the skin contact area; T_m represents the temperature extremum of the surface of the skin contact area; d represents the depth value of the heat source, wherein a represents the distance from any point on the skin contact area surface to the point of the temperature extremum on the skin contact area surface; k represents the tissue thermal conductivity; and T_Q represents the temperature of the tissue in the target tissue image region, i.e., the temperature data of the heat source.
The temperature data of the target tissue image region, i.e., the temperature data of the heat source, can be calculated through the above formula. A target tissue image region is then extracted from the fused superimposed ultrasonic image data by using an edge extraction algorithm, and deformation fitting processing is performed between the target tissue image region and a given tissue image region to obtain a deformation analysis result of the target tissue image region. During the deformation fitting processing, geometric key points in the target tissue image region map are first extracted, deformation analysis processing is performed by fitting the extracted geometric key points to the geometric key points of the given tissue image region, and the deformation analysis result of the target tissue image region is recorded, wherein the geometric key points include bending turning points, curvature extreme points, crossing points and bifurcation points.
An obtaining module: used for obtaining tissue state information after energy treatment based on the tissue analysis result of the target tissue image region.
In the implementation process of the invention, after the tissue temperature analysis result and the deformation analysis result of the target tissue image region are obtained, the tissue state information after energy treatment can be confirmed through the tissue temperature analysis result and the deformation analysis result.
In the embodiment of the invention, the ultrasonic imaging data with temperature data information of the region after energy treatment can be quickly obtained by collecting the ultrasonic imaging image data and the three-dimensional thermal imaging image data of the skin contact region and fusing and superposing the ultrasonic imaging image data and the three-dimensional thermal imaging image data to form fused superposed ultrasonic image data, so that the subsequent evaluation and detection of the region after energy treatment are convenient, the designated treatment region is prevented from deviating from a normal value due to overhigh temperature, and the treatment effect is ensured.
The embodiment of the invention provides a computer readable storage medium on which a computer program is stored; when the program is executed by a processor, the imaging image fusion and superposition method of any one of the above embodiments is implemented. The computer readable storage medium includes, but is not limited to, any type of disk including floppy disks, hard disks, optical disks, CD-ROMs and magneto-optical disks, ROMs (Read-Only Memories), RAMs (Random Access Memories), EPROMs (Erasable Programmable Read-Only Memories), EEPROMs (Electrically Erasable Programmable Read-Only Memories), flash memories, magnetic cards or optical cards. That is, a storage device includes any medium that stores or transmits information in a form readable by a device (e.g., a computer or a mobile phone), and may be a read-only memory, a magnetic disk, an optical disk, or the like.
The embodiment of the invention also provides a computer application program which runs on a computer and is used for executing the imaging image fusion superposition method of any one of the embodiments.
The embodiment of the invention also provides a treatment device which comprises a processor, a memory, an input unit, a display unit and other devices. Those skilled in the art will appreciate that the therapeutic device structural elements do not constitute a limitation on all devices, and may include more or fewer components, or may combine certain components. The memory may be used to store application programs and various functional modules, and the processor runs the application programs stored in the memory, thereby executing various functional applications of the device and data processing. The memory may be internal memory or external memory, or include both internal memory and external memory. The internal memory may include read-only memory (ROM), programmable ROM (PROM), electrically Programmable ROM (EPROM), electrically Erasable Programmable ROM (EEPROM), flash memory, or random access memory. The external memory may include a hard disk, floppy disk, ZIP disk, U-disk, tape, etc. The disclosed memory includes, but is not limited to, these types of memory. The memory disclosed herein is by way of example only and not by way of limitation.
The input unit is used for receiving input of signals and receiving keywords input by users. The input unit may include a touch panel and other input devices. The touch panel may collect touch operations on or near the user (e.g., the user's operation on or near the touch panel using any suitable object or accessory such as a finger, stylus, etc.), and drive the corresponding connection device according to a preset program; other input devices may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., play control keys, switch keys, etc.), a trackball, mouse, joystick, etc. The display unit may be used to display information input by a user or information provided to the user and various menus of the terminal device. The display unit may take the form of a liquid crystal display, an organic light emitting diode, or the like. The processor is a control center of the terminal device, connects various parts of the entire device using various interfaces and lines, performs various functions and processes data by running or executing software programs and/or modules stored in the memory, and calling data stored in the memory.
As one embodiment, the therapeutic apparatus includes: the system comprises one or more processors, a memory, and one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications configured to perform the imaging map fusion overlay method of any of the above embodiments.
In the embodiment of the invention, the ultrasonic imaging data with temperature data information of the region after energy treatment can be quickly obtained by collecting the ultrasonic imaging image data and the three-dimensional thermal imaging image data of the skin contact region and fusing and superposing the ultrasonic imaging image data and the three-dimensional thermal imaging image data to form fused superposed ultrasonic image data, so that the subsequent evaluation and detection of the region after energy treatment are convenient, the designated treatment region is prevented from deviating from a normal value due to overhigh temperature, and the treatment effect is ensured.
The multi-mode-based imaging image fusion and superposition method and the related device provided by the embodiments of the present invention have been described in detail above. Specific examples are used herein to illustrate the principle and implementation of the present invention, and the description of the above embodiments is intended only to help understand the method and its core idea. Meanwhile, those skilled in the art may make changes to the specific embodiments and the scope of application in accordance with the idea of the present invention. In view of the above, the contents of this description should not be construed as limiting the present invention.

Claims (10)

1. A multi-mode-based imaging image fusion and superposition method, characterized in that the method is applied to a treatment device, wherein an energy therapeutic instrument and an imaging device are integrated on a treatment head of the treatment device; the method comprises the following steps:
after the treatment head contacts the skin and the energy therapeutic instrument is used to perform energy treatment on the skin contact area, performing imaging processing on the skin contact area based on an ultrasonic imaging module in the imaging device to obtain ultrasonic imaging image data of the skin contact area;
performing imaging processing on the skin contact area based on an ultrasonic thermometry module in the imaging device to obtain three-dimensional thermal imaging image data of the skin contact area;
respectively performing image feature extraction processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data to obtain first extracted feature data corresponding to the ultrasonic imaging image data and second extracted feature data corresponding to the three-dimensional thermal imaging image data; and
performing multi-mode fusion and superposition processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data based on the first extracted feature data and the second extracted feature data to obtain fused and superposed ultrasonic image data.
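By way of illustration only (this does not form part of the claims), the following Python sketch shows one way the four claimed steps could be strung together, assuming the two modalities are already co-registered NumPy arrays covering the same skin contact area; all function names, shapes and the simple gradient-based feature map are assumptions, not the patented implementation.

```python
# Minimal sketch of the claimed pipeline; not the patented implementation.
# Assumes: us_img is a 2-D B-mode image, thermal_vol is a 3-D thermal volume whose
# in-plane shape matches us_img, and both cover the same (co-registered) area.
import numpy as np

def normalize(x: np.ndarray) -> np.ndarray:
    x = x.astype(float)
    return (x - x.min()) / (np.ptp(x) + 1e-8)

def extract_features(img: np.ndarray) -> np.ndarray:
    # Stand-in feature map: gradient magnitude (the claims use Bi-LSTM + attention).
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def fuse_and_superpose(us_img: np.ndarray, thermal_vol: np.ndarray) -> np.ndarray:
    thermal_2d = thermal_vol.max(axis=0)      # project the 3-D thermal data onto the imaging plane
    us_feat = extract_features(us_img)        # first extracted feature data
    th_feat = extract_features(thermal_2d)    # second extracted feature data
    # Feature agreement drives the superposition weight (placeholder for the
    # multi-mode fusion of claim 6).
    a, b = us_feat - us_feat.mean(), th_feat - th_feat.mean()
    w = float(np.clip(np.abs((a * b).sum()) /
                      (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8), 0.0, 1.0))
    return w * normalize(us_img) + (1.0 - w) * normalize(thermal_2d)
```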
2. The imaging image fusion and superposition method according to claim 1, wherein the performing imaging processing on the skin contact area based on the ultrasonic imaging module in the imaging device to obtain ultrasonic imaging image data of the skin contact area comprises:
transmitting a plurality of ultrasonic signals to the skin contact area based on the ultrasonic imaging module in the imaging device, and generating the ultrasonic imaging image data of the skin contact area based on the received multiple segments of ultrasonic echo signals of the skin contact area.
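A minimal sketch of how ultrasonic imaging image data might be formed from the received echo signals, using standard envelope detection and log compression; the sampling assumptions, dynamic range and array shapes below are illustrative only and are not taken from the patent.

```python
# Illustrative B-mode formation from per-scanline echo signals (envelope detection
# followed by log compression). Parameter values are assumptions for the sketch.
import numpy as np
from scipy.signal import hilbert

def echoes_to_bmode(rf_lines: np.ndarray, dynamic_range_db: float = 60.0) -> np.ndarray:
    """rf_lines: (num_scanlines, num_samples) echo signals from the skin contact area."""
    envelope = np.abs(hilbert(rf_lines, axis=1))            # demodulate each scanline
    envelope /= envelope.max() + 1e-12
    bmode_db = 20.0 * np.log10(envelope + 1e-12)            # log compression
    bmode_db = np.clip(bmode_db, -dynamic_range_db, 0.0)    # limit the dynamic range
    return (bmode_db + dynamic_range_db) / dynamic_range_db  # grey levels in [0, 1]
```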
3. The imaging image fusion and superposition method according to claim 2, wherein the generating the ultrasonic imaging image data of the skin contact area based on the received multiple segments of ultrasonic echo signals of the skin contact area comprises:
the ultrasonic imaging module performs filtering processing on the multiple segments of ultrasonic echo signals to obtain filtered multiple segments of ultrasonic echo signals; and
the ultrasonic imaging module generates the ultrasonic imaging image data of the skin contact area based on the filtered multiple segments of ultrasonic echo signals.
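One plausible reading of the filtering step is a zero-phase band-pass around the probe's centre frequency; the filter type, order, sampling rate and band edges below are assumptions for illustration, not values from the patent.

```python
# Illustrative band-pass filtering of the echo segments before image formation.
import numpy as np
from scipy.signal import butter, filtfilt

def filter_echo_segments(segments: np.ndarray, fs: float = 40e6,
                         band: tuple = (2e6, 8e6), order: int = 4) -> np.ndarray:
    """segments: (num_segments, num_samples) raw echo signals sampled at fs Hz."""
    nyq = fs / 2.0
    b, a = butter(order, [band[0] / nyq, band[1] / nyq], btype="bandpass")
    return filtfilt(b, a, segments, axis=1)   # zero-phase filtering of every segment
```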
4. The imaging image fusion and superposition method according to claim 1, wherein the performing imaging processing on the skin contact area based on the ultrasonic thermometry module in the imaging device to obtain three-dimensional thermal imaging image data of the skin contact area comprises:
when the ultrasonic thermometry module in the imaging device performs thermal imaging detection on the skin contact area, performing three-dimensional thermal imaging processing on the skin contact area based on an echo time-shift algorithm to obtain the three-dimensional thermal imaging image data of the skin contact area.
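Echo time-shift thermometry estimates the local temperature change from the apparent shift of echoes between a baseline and a post-treatment acquisition. The sketch below shows the idea for a single imaging plane (a stack of such planes would yield a three-dimensional map); the window size and the shift-to-temperature coefficient are tissue dependent, and the values used here are assumptions.

```python
# Illustrative echo time-shift temperature estimation for one imaging plane.
import numpy as np
from scipy.signal import correlate

def delta_temperature(pre: np.ndarray, post: np.ndarray, fs: float = 40e6,
                      k_deg_per_ns: float = 0.1, window: int = 128) -> np.ndarray:
    """pre/post: (num_scanlines, num_samples) echoes acquired before/after heating."""
    n_lines, n_samples = pre.shape
    n_win = n_samples // window
    dT = np.zeros((n_lines, n_win))
    for i in range(n_lines):
        for w in range(n_win):
            a = pre[i, w * window:(w + 1) * window].astype(float)
            b = post[i, w * window:(w + 1) * window].astype(float)
            xc = correlate(b - b.mean(), a - a.mean(), mode="full")
            lag = int(np.argmax(xc)) - (window - 1)      # echo time shift in samples
            dT[i, w] = k_deg_per_ns * (1e9 * lag / fs)   # assumed linear shift-to-temperature model
    return dT
```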
5. The imaging image fusion and superposition method according to claim 1, wherein the respectively performing image feature extraction processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data to obtain the first extracted feature data corresponding to the ultrasonic imaging image data and the second extracted feature data corresponding to the three-dimensional thermal imaging image data comprises:
performing image segmentation processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data according to a preset segmentation rule to obtain segmented ultrasonic imaging image data and segmented three-dimensional thermal imaging image data, wherein the size of each segmented block of the segmented ultrasonic imaging image data is consistent with the size of each segmented block of the segmented three-dimensional thermal imaging image data;
encoding the segmented ultrasonic imaging image data and the segmented three-dimensional thermal imaging image data by using a FACET module to obtain encoded segmented ultrasonic imaging image data and encoded segmented three-dimensional thermal imaging image data;
performing feature extraction processing on the encoded segmented ultrasonic imaging image data and the encoded segmented three-dimensional thermal imaging image data based on a Bi-LSTM and an attention mechanism to obtain extracted features of the segmented ultrasonic imaging image data and extracted features of the segmented three-dimensional thermal imaging image data;
respectively performing feature enhancement processing on the extracted features of the segmented ultrasonic imaging image data and the extracted features of the segmented three-dimensional thermal imaging image data to obtain enhanced extracted features of the segmented ultrasonic imaging image data and enhanced extracted features of the segmented three-dimensional thermal imaging image data; and
re-stitching the enhanced extracted features of the segmented ultrasonic imaging image data and the enhanced extracted features of the segmented three-dimensional thermal imaging image data according to the segmentation rule to obtain the first extracted feature data corresponding to the ultrasonic imaging image data and the second extracted feature data corresponding to the three-dimensional thermal imaging image data.
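A compact PyTorch sketch of the block-wise feature extraction described above. The internals of the FACET encoder are not specified in the claim, so a plain linear embedding stands in for it; the block size, embedding width and hidden size are assumptions chosen only for illustration.

```python
# Illustrative block-wise encoder + Bi-LSTM + attention feature extractor.
import torch
import torch.nn as nn

class BlockFeatureExtractor(nn.Module):
    def __init__(self, block_pixels: int, embed_dim: int = 128, hidden: int = 64):
        super().__init__()
        self.encode = nn.Linear(block_pixels, embed_dim)   # stand-in for the FACET encoding step
        self.bilstm = nn.LSTM(embed_dim, hidden, batch_first=True, bidirectional=True)
        self.score = nn.Linear(2 * hidden, 1)              # attention score per block

    def forward(self, blocks: torch.Tensor) -> torch.Tensor:
        # blocks: (batch, num_blocks, block_pixels) -- the segmented image, one row per block
        x = self.encode(blocks)
        h, _ = self.bilstm(x)                              # (batch, num_blocks, 2 * hidden)
        w = torch.softmax(self.score(h), dim=1)            # attention over the blocks
        return h * w                                       # attention-weighted (enhanced) block features

# Usage on an assumed 256x256 image cut into 16x16 blocks (256 blocks of 256 pixels):
# feats = BlockFeatureExtractor(block_pixels=256)(torch.randn(1, 256, 256))
# Re-stitching would then map the 256 block features back onto the 16x16 block grid.
```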
6. The imaging image fusion and superposition method according to claim 1, wherein the performing multi-mode fusion and superposition processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data based on the first extracted feature data and the second extracted feature data to obtain fused and superposed ultrasonic image data comprises:
performing, based on an orthogonalization mechanism, multi-mode fusion and superposition processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data by using the first extracted feature data and the second extracted feature data to obtain the fused and superposed ultrasonic image data.
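The claim does not spell out the orthogonalization mechanism. One plausible reading, sketched below, projects the thermal feature vector onto the ultrasonic feature vector and treats the orthogonal residual as the complementary information controlling the superposition weight; this interpretation, like the weighting scheme, is an assumption and not the patented mechanism.

```python
# Illustrative orthogonalization-based fusion weight (one possible interpretation).
import numpy as np

def orthogonal_fusion(us_img: np.ndarray, th_img: np.ndarray,
                      us_feat: np.ndarray, th_feat: np.ndarray,
                      alpha: float = 0.5) -> np.ndarray:
    """All arrays share the same spatial shape; the feature maps come from claim 5."""
    f1 = us_feat.ravel().astype(float)
    f2 = th_feat.ravel().astype(float)
    parallel = (f1 @ f2) / (f1 @ f1 + 1e-12) * f1       # thermal component explained by ultrasound features
    residual = f2 - parallel                             # orthogonal, i.e. complementary, component
    gain = np.linalg.norm(residual) / (np.linalg.norm(f2) + 1e-12)
    return (1.0 - alpha * gain) * us_img + alpha * gain * th_img
```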
7. The imaging image fusion and superposition method according to claim 1, further comprising:
positioning a target tissue image area based on the fused and superposed ultrasonic image data, and performing tissue deformation and temperature analysis processing on the positioned target tissue image area to obtain a tissue analysis result of the target tissue image area; and
obtaining tissue state information after the energy treatment based on the tissue analysis result of the target tissue image area.
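For the post-fusion analysis, a simple and entirely illustrative approach is to threshold the temperature channel to locate the treated tissue region and then report basic statistics for it; the temperature thresholds and the returned fields below are assumptions, not values taken from the patent.

```python
# Illustrative target-region analysis on the fused data.
import numpy as np

def analyze_treated_region(fused: np.ndarray, temperature: np.ndarray,
                           hot_threshold_c: float = 43.0,
                           overheat_threshold_c: float = 60.0) -> dict:
    """fused and temperature share the same spatial shape (one plane of the volume)."""
    mask = temperature >= hot_threshold_c                    # candidate treated tissue region
    if not mask.any():
        return {"state": "no treated region detected"}
    return {
        "area_px": int(mask.sum()),
        "mean_temperature_c": float(temperature[mask].mean()),
        "max_temperature_c": float(temperature[mask].max()),
        "mean_fused_intensity": float(fused[mask].mean()),   # crude proxy for tissue change
        "state": ("temperature above expected range"
                  if temperature[mask].max() > overheat_threshold_c
                  else "within expected range"),
    }
```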
8. A multi-mode fusion-based imaging image analysis apparatus, characterized in that the apparatus is applied to a treatment device, wherein an energy therapeutic instrument and an imaging device are integrated on a treatment head of the treatment device; the apparatus comprises:
a first imaging module, configured to, after the treatment head contacts the skin and the energy therapeutic instrument is used to perform energy treatment on the skin contact area, perform imaging processing on the skin contact area based on an ultrasonic imaging module in the imaging device to obtain ultrasonic imaging image data of the skin contact area;
a second imaging module, configured to perform imaging processing on the skin contact area based on an ultrasonic thermometry module in the imaging device to obtain three-dimensional thermal imaging image data of the skin contact area;
a feature extraction module, configured to respectively perform image feature extraction processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data to obtain first extracted feature data corresponding to the ultrasonic imaging image data and second extracted feature data corresponding to the three-dimensional thermal imaging image data; and
a multi-mode fusion module, configured to perform multi-mode fusion and superposition processing on the ultrasonic imaging image data and the three-dimensional thermal imaging image data based on the first extracted feature data and the second extracted feature data to obtain fused and superposed ultrasonic image data.
9. A treatment device, comprising a processor and a memory, wherein the processor runs a computer program or code stored in the memory to implement the imaging image fusion and superposition method according to any one of claims 1 to 7.
10. A computer-readable storage medium storing a computer program or code which, when executed by a processor, implements the imaging image fusion and superposition method according to any one of claims 1 to 7.
CN202311738173.9A 2023-12-18 2023-12-18 Multi-mode-based imaging image fusion and superposition method and related device Active CN117710229B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311738173.9A CN117710229B (en) 2023-12-18 2023-12-18 Multi-mode-based imaging image fusion and superposition method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311738173.9A CN117710229B (en) 2023-12-18 2023-12-18 Multi-mode-based imaging image fusion and superposition method and related device

Publications (2)

Publication Number Publication Date
CN117710229A (en) 2024-03-15
CN117710229B (en) 2024-06-21

Family

ID=90163389

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311738173.9A Active CN117710229B (en) 2023-12-18 2023-12-18 Multi-mode-based imaging image fusion and superposition method and related device

Country Status (1)

Country Link
CN (1) CN117710229B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150165235A1 (en) * 2012-09-03 2015-06-18 Kabushiki Kaisha Toshiba Medical image processing apparatus and radiation treatment apparatus
CN111340742A (en) * 2018-12-18 2020-06-26 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic imaging method and device and storage medium
CN111686379A (en) * 2020-07-23 2020-09-22 上海联影医疗科技有限公司 Radiation therapy system and method
CN113960111A (en) * 2021-09-15 2022-01-21 湖南大学 Three-dimensional thermal imaging system and method based on thermal imager and laser linkage scanning technology
CN114376606A (en) * 2022-01-18 2022-04-22 武汉联影医疗科技有限公司 Filtering method and system for ultrasonic imaging
CN115375595A (en) * 2022-07-04 2022-11-22 武汉联影智融医疗科技有限公司 Image fusion method, device, system, computer equipment and storage medium
CN116236280A (en) * 2023-02-02 2023-06-09 逸超医疗科技(北京)有限公司 Interventional therapy guiding method and system based on multi-mode image fusion
CN116975776A (en) * 2023-07-14 2023-10-31 湖北楚天高速数字科技有限公司 Multi-mode data fusion method and device based on tensor and mutual information

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
张琪; 武法提; 许文静: "Multimodal Data-Supported Learning Engagement Assessment: Status, Implications and Research Trends", 远程教育杂志 (Journal of Distance Education), no. 01, 20 January 2020 (2020-01-20) *
陈锐锋; 方路平; 潘清; 曹平; 高坤: "Design and Implementation of a Multi-modal Medical Image Fusion Ultrasound Examination System", 计算机工程 (Computer Engineering), no. 04, 15 April 2015 (2015-04-15) *

Also Published As

Publication number Publication date
CN117710229B (en) 2024-06-21

Similar Documents

Publication Publication Date Title
EP2400370B1 (en) Information processing device and information processing method
JP5711421B2 (en) Electronic device for collecting finger data and displaying finger movement trajectory and associated method
WO2016036304A1 (en) Pseudo random guided fingerprint enrolment
TR201807210T4 (en) A system and method for guiding a user during a shaving procedure.
JP2017538224A (en) Fingerprint registration by guidance based on the center point of attention
Mathiassen et al. Robust real-time needle tracking in 2-D ultrasound images using statistical filtering
CN117710229B (en) Multi-mode-based imaging image fusion and superposition method and related device
JP2009282706A (en) Vein authentication device and vein authentication method
CN111327888A (en) Camera control method and device, computer equipment and storage medium
CN103177245B (en) gesture recognition method and device
Sufyan et al. A novel and lightweight real-time continuous motion gesture recognition algorithm for smartphones
CN105117000A (en) Method and device for processing medical three-dimensional image
CN101650776A (en) Method and system for tracking position of human limbs
US20140150085A1 (en) User authentication based on a user's operation on a displayed three-dimensional model
JP2020044331A (en) Image acquisition method, related apparatus, and readable storage medium
JP2010117964A (en) Device and program for estimating confusion
CN112488982A (en) Ultrasonic image detection method and device
CN110809089B (en) Processing method and processing apparatus
JP5972082B2 (en) Information processing apparatus, display method, and program
CN111552941B (en) Terminal unlocking method and device, electronic equipment and readable storage medium
Morash et al. Determining the bias and variance of a deterministic finger-tracking algorithm
CN110244839A (en) Control method, electronic equipment and storage medium
CN117373135B (en) Sliding gesture recognition method and system based on vision and related equipment
US20170357328A1 (en) Quick command entry for wearable devices
JP3980341B2 (en) Eye position tracking method, eye position tracking device and program therefor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant