CN113965573A - Ultrasonic image processing method and device based on cloud computing and signal processing - Google Patents


Info

Publication number
CN113965573A
Authority
CN
China
Prior art keywords
signal data
ultrasonic
data
ultrasonic signal
cloud server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111397617.8A
Other languages
Chinese (zh)
Inventor
刘西耀
刘鑫
王自尊
李晓军
伍博文
Current Assignee
Chengdu Stork Healthcare Technology Co ltd
Original Assignee
Chengdu Stork Healthcare Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Stork Healthcare Technology Co ltd filed Critical Chengdu Stork Healthcare Technology Co ltd
Priority to CN202111397617.8A
Publication of CN113965573A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/10Protocols in which an application is distributed across nodes in the network
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/12Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10132Ultrasound image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Abstract

The invention relates to an ultrasound image processing method and device based on cloud computing and signal processing, comprising the following steps: in response to a user's first ultrasound detection instruction, sending a link request to a cloud server; when the cloud server responds to the link request, generating a first control instruction, controlling the ultrasound device to acquire ultrasound signal data, preprocessing the acquired data, and transmitting the preprocessed ultrasound signal data to the cloud server; and receiving and displaying first data fed back by the cloud server based on the ultrasound signal data. With the method of the invention, once the cloud server responds to the link request, the raw ultrasound signal data acquired by the ultrasound device can, after preprocessing, be uploaded directly to the cloud server, so that the cloud server receives signal data retaining the original dynamic range and can recover the phase information in it. This enables more comprehensive and accurate ultrasound signal processing and yields accurate lesion data.

Description

Ultrasonic image processing method and device based on cloud computing and signal processing
Technical Field
The invention relates to the field of ultrasonic image data processing, in particular to an ultrasonic image processing method and ultrasonic image processing equipment based on cloud computing and signal processing.
Background
Traditional desktop ultrasound equipment typically consists of an ultrasound front end, a host computer, a GPU processing card, a display, a keyboard, a supporting chassis, and other components. Because its hardware computing resources are abundant, it supports more high-end examination modes and produces very clear images; its drawbacks are high power consumption, large volume, and excessive weight, so it can be used only in fixed settings. In recent years, novel portable ultrasound such as laptop ultrasound or wireless handheld ultrasound has become popular. Constrained by size, these devices use pared-down hardware and reduced power consumption to achieve compactness and easy mobility, but the drawback is equally obvious: limited hardware resources lead to relatively poor imaging quality and no support for high-end examination modes.
With the development of ultrasound front-end technology, high-order imaging techniques such as plane-wave compound imaging, total-focusing imaging, synthetic aperture imaging, and Fourier imaging have been invented. Compared with traditional imaging modes, these high-order techniques achieve higher lateral resolution, but they place very high demands on computing power and can generally be realized only in large desktop ultrasound systems with sufficient hardware resources; today's lighter, more flexible portable ultrasound devices cannot integrate high-order imaging modes because of their limited hardware computing resources.
In the prior art, a cloud server and a local terminal are deployed on top of a portable ultrasound device to perform collaborative, interactive high-order imaging, i.e., cloud ultrasound. However, in the existing cloud ultrasound technology, the imaging process is completed on the ultrasound device, which uploads ultrasound image data to the cloud server for secondary image processing. The image data received by the cloud has thus already undergone beamforming and dynamic-range compression of the beamformed signals, so the amplitude and phase information of the raw ultrasound signal data is lost. Identifying a lesion from an image alone yields very limited lesion information, which can affect the final diagnosis.
For example, Chinese patent application No. 2021102662092 discloses a cloud-computing-based ultrasound image processing method and ultrasound system: when the cloud network is available, the ultrasound image data received by the scanning transceiver terminal is transmitted over the network to the cloud, which performs secondary image processing on it and feeds the result back to a display/interaction terminal; when the cloud network is unavailable, the display/interaction terminal processes the scanned ultrasound image data locally. In the cloud ultrasound technology of that method, both the cloud and the display/interaction terminal receive only ultrasound image data, in which the dynamic range and phase information of the raw signal data have been lost, and the lesion information obtainable from images alone is very limited.
Disclosure of Invention
The invention aims to overcome the defects that, in the existing cloud ultrasound technology, the cloud can only receive ultrasound image data already imaged by the ultrasound device, cannot obtain raw ultrasound signal data with its full dynamic range and phase information, and can therefore identify only limited lesion information. To this end, the invention provides an ultrasound image processing method and device based on cloud computing and signal processing that generate accurate ultrasound image data.
In order to achieve the above purpose, the invention provides the following technical scheme:
An ultrasound image processing method based on cloud computing and signal processing is applied to a terminal device that is communicatively connected to an ultrasound device and a cloud server. The method comprises the following steps:
in response to a first ultrasound detection instruction from a user (an instruction requiring the cloud service), sending a link request to the cloud server; when the cloud server responds to the link request, generating a first control instruction, controlling the ultrasound device to acquire ultrasound signal data, preprocess the acquired data, and transmit the preprocessed ultrasound signal data to the cloud server;
and receiving first data fed back by the cloud server based on the ultrasound signal data, and displaying the first data.
Specifically, in the present application, data interaction can take place between the terminal device and the ultrasound device and between the terminal device and the cloud server (i.e., the cloud), with communication modes including but not limited to Type-C, 10-gigabit optical fiber, WIFI, etc. The terminal device and the ultrasound device are usually local devices: the terminal device generates various control instructions for the ultrasound device in response to the user's instruction parameters, and the ultrasound device performs signal acquisition, signal preprocessing, signal transmission, and similar operations under the terminal device's control. The cloud server, as an application virtualization software platform with high computing power, can respond to link requests from multiple terminal devices simultaneously and can form multiple ultrasound networks with those terminal devices and ultrasound devices. When the cloud server responds to a terminal device's link request, it receives ultrasound signal data from the ultrasound device; the ultrasound signal data transmitted to the cloud server by the ultrasound device is raw signal data. The cloud server then performs high-order data processing on the ultrasound signal data to generate first data, which it feeds back to the terminal.
According to a specific embodiment, in the ultrasound image processing method based on cloud computing and signal processing, the method further includes:
when the cloud server cannot respond to the link request, generating a second control instruction and controlling the ultrasound device to transmit the preprocessed ultrasound signal data to the terminal device;
screening the received ultrasound signal data to obtain a screened subset of the ultrasound signal data, then sending a transmission request to the cloud server and transmitting the partial ultrasound signal data to the cloud server;
and receiving and displaying second data fed back by the cloud server based on the partial ultrasound signal data.
In practice, the volume of ultrasound signal data to be transmitted in the ultrasound network is large, and a poor network state may be unable to support its transmission. When the network between the cloud server and the terminal device and/or the ultrasound device is poor, the cloud server cannot respond to the terminal device's link request and sends back feedback that the request cannot be served. Furthermore, when the cloud server is linked to many terminal devices at once to provide computing services, its load may exceed a threshold (the cloud's computing-power threshold) so that it cannot provide fast computing service; in that case it likewise fails to respond to the link request because of overload and notifies the terminal device. When the cloud server cannot respond to the link request, the terminal device can direct the ultrasound device to transmit the preprocessed ultrasound signal data directly to the terminal device for caching, signal processing, and so on. The terminal device can then screen the received ultrasound signal data, relieving network-transmission and/or cloud-computing pressure by reducing the data volume, and selectively forward the screened data to the cloud server once it responds to a transmission request, lightening the cloud server's data-processing burden.
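The description says only that the terminal forwards "partial" data to reduce volume; the selection policy is not specified. As one hedged illustration in Python, the terminal might forward only every k-th cached frame (the function name and the every-k policy are assumptions, not the patent's actual method):

```python
import numpy as np

def screen_frames(frames, keep_every=4):
    """Illustrative screening step: forward only every `keep_every`-th
    frame of the cached channel data to the cloud, cutting network and
    cloud load by that factor. The policy is an assumption; the patent
    only states that a reduced subset ("partial" data) is forwarded.

    frames: (n_frames, ...) array of cached ultrasound frames
    """
    return frames[::keep_every]
```

A smarter policy (e.g. keeping frames flagged by a local quality metric) would drop into the same place; the point is only that data volume shrinks before the transmission request is made.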
According to a specific embodiment, in the ultrasound image processing method based on cloud computing and signal processing, the method further includes:
in response to a second ultrasound detection instruction from a user, generating a third control instruction, controlling the ultrasound device to acquire ultrasound signal data, perform beamforming on the acquired data, and transmit the beamformed signal data to the terminal device;
and receiving the beamformed signal data, generating basic ultrasound image data from it, and displaying the basic ultrasound image data.
In specific applications, if the user does not need the cloud computing service, the terminal device can complete basic imaging and other signal processing from the beamformed signal data transmitted by the ultrasound device.
In a further embodiment of the present invention, there is also provided an ultrasound image processing method based on cloud computing and signal processing, where the method is applied to an ultrasound device, and the ultrasound device is in communication connection with a terminal device and a cloud server, and the method includes:
responding to a first control instruction of the terminal equipment, acquiring ultrasonic signal data, preprocessing the ultrasonic signal data, and transmitting the preprocessed ultrasonic signal data to a cloud server.
According to a specific embodiment, in the ultrasound image processing method based on cloud computing and signal processing, the method further includes: responding to a second control instruction of the terminal equipment, acquiring ultrasonic signal data, preprocessing the ultrasonic signal data, and transmitting the preprocessed ultrasonic signal data to the terminal equipment;
and in response to a third control instruction from the terminal device, acquiring ultrasound signal data, performing beamforming on the acquired data, and transmitting the beamformed signal data to the terminal device.
According to a specific embodiment, in the ultrasound image processing method based on cloud computing and signal processing, the preprocessing includes: fixing the longitudinal depth of the acquired ultrasound signal data, and decimation-filtering the depth-fixed ultrasound signal data.
Specifically, the preprocessing operates on the channel signal data within the ultrasound signal data; the channel signal data is preprocessed to fit within the bandwidth limits of common commercial interconnect interfaces.
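The two preprocessing steps named above (fixing the longitudinal depth, then decimation filtering) could be sketched as follows. This is a minimal illustration: the moving-average anti-alias filter and all names are assumptions, not the device's actual filter design.

```python
import numpy as np

def preprocess_channel_data(channel_data, depth_samples, q, taps=8):
    """Sketch of the described preprocessing:
    (1) fix the longitudinal depth by keeping only the first
        `depth_samples` samples of each channel;
    (2) decimation-filter: a crude moving-average low-pass stands in
        for the anti-alias filter, then every q-th sample is kept.

    channel_data: (n_channels, n_samples) raw channel signals
    Returns an array of shape (n_channels, depth_samples // q)
    whose reduced size fits a bandwidth-limited transport link.
    """
    fixed = channel_data[:, :depth_samples]        # fixed depth window
    kernel = np.ones(taps) / taps                  # placeholder low-pass
    filtered = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, fixed)
    return filtered[:, ::q]                        # downsample by q
```

In a real device the anti-alias stage would be a properly designed FIR/IIR filter (e.g. `scipy.signal.decimate` wraps this pattern), but the volume reduction by a factor of q is the same.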
In a further embodiment of the present invention, there is also provided an ultrasound image processing method based on cloud computing and signal processing, where the method is applied to a cloud server, and the cloud server is in communication connection with an ultrasound device and a terminal device, and the method includes:
responding to a link request of the terminal equipment, receiving ultrasonic signal data transmitted by the ultrasonic equipment, generating first data according to the ultrasonic signal data, and transmitting the first data to the terminal equipment.
According to a specific embodiment, in the ultrasound image processing method based on cloud computing and signal processing, the method further includes:
responding to a transmission request of the terminal equipment, receiving partial ultrasonic signal data transmitted by the terminal equipment, generating second data according to the partial ultrasonic signal data, and transmitting the second data to the terminal equipment.
Specifically, the cloud server can provide various high-order data processing services for ultrasound signal data from the ultrasound device and, at the same time, various lightweight data processing services for partial ultrasound signal data from the terminal device. The data processing and computing services the cloud server can provide are determined by the application virtualization software algorithms configured on it and can be configured according to actual computing needs.
According to a specific embodiment, in the ultrasound image processing method based on cloud computing and signal processing, the generating second data according to the partial ultrasound signal data includes:
and performing feature estimation on the part of the ultrasonic signal data through a pre-trained second deep learning model to generate second data.
According to a specific embodiment, in the ultrasound image processing method based on cloud computing and signal processing, the generating first data according to the ultrasound signal data includes:
generating a high-order ultrasound image from the ultrasound signal data using a high-order imaging technique, extracting image features from the high-order ultrasound image with a pre-trained first deep learning model, and superimposing the extracted features onto the high-order ultrasound image to generate the first data;
the high-order imaging technique is one of plane-wave compound imaging, total-focusing compound imaging, synthetic aperture imaging, and Fourier imaging.
Preferably, the high-order data processing services provided by the cloud server are configured as high-order imaging, image feature extraction, and feature superposition, and the lightweight data processing service provided by the cloud server is configured as human-body parameter estimation.
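A minimal sketch of the coherent-compounding idea behind plane-wave compound imaging, one of the high-order techniques listed above: images beamformed from several transmit angles are averaged, which improves quality over any single-angle image. The function name and the use of a plain mean are assumptions; real implementations weight and phase-align the angle images.

```python
import numpy as np

def coherent_compound(angle_images):
    """Average beamformed images from several plane-wave transmit
    angles into one compounded image (a simplification: a plain
    unweighted mean over the angle axis).

    angle_images: (n_angles, H, W) beamformed images, real or complex
    Returns the (H, W) compounded image.
    """
    return np.mean(angle_images, axis=0)
```

Speckle that is uncorrelated across angles averages down while anatomical structure adds coherently, which is why compounding raises effective resolution and SNR.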
There is also provided in a further embodiment of the present invention an electronic device, including:
one or more processors;
a memory for storing one or more programs, which when executed by the one or more processors, cause the one or more processors to implement the ultrasound image processing method applied to the terminal device or the ultrasound image processing method applied to the cloud server.
In a further embodiment of the present invention, there is also provided a computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the ultrasound image processing method applied to a terminal device as described above, or the ultrasound image processing method applied to a cloud server as described above.
In a further embodiment of the present invention, an ultrasound device is further provided, where the ultrasound device is configured to, in response to a first control instruction of a terminal device, acquire ultrasound signal data, preprocess the ultrasound signal data, and transmit the preprocessed ultrasound signal data to a cloud server.
According to a specific embodiment, in the above ultrasound apparatus, the ultrasound apparatus includes: a channel signal compression module for compressing the channel signal,
the channel signal compression module is used to fix the longitudinal depth of the ultrasound signal data acquired by the ultrasound probe and to decimation-filter the depth-fixed ultrasound signal data.
According to a specific embodiment, in the above ultrasound apparatus, the ultrasound apparatus is further configured to: responding to a second control instruction of the terminal equipment, acquiring ultrasonic signal data, preprocessing the ultrasonic signal data, and transmitting the preprocessed ultrasonic signal data to the terminal equipment;
and in response to a third control instruction from the terminal device, acquiring ultrasound signal data, performing beamforming on the acquired data, and transmitting the beamformed signal data to the terminal device.
Compared with the prior art, the invention has the beneficial effects that:
1. With the ultrasound image processing method provided by the invention, once the cloud server responds to the link request, the raw ultrasound signal data acquired by the ultrasound device can, after preprocessing, be uploaded directly to the cloud server, so that the cloud server receives signal data retaining the original dynamic range and can recover the phase information in it, completing more comprehensive and accurate ultrasound signal processing and obtaining accurate lesion data;
2. The method exploits the cloud server's abundant computing resources, which are not constrained by local hardware: the cloud server takes over the imaging tasks of the ultrasound device and the terminal device, so high-order, accurate data processing is achieved while the hardware-resource requirements of both devices are reduced. The cloud server of the invention integrates high-order imaging computation, artificial-intelligence-based image feature processing, and big-data storage; it can be networked with multiple ultrasound devices and terminal devices, respond to their data requests simultaneously, perform high-order imaging, and balance the computational load of the whole ultrasound network;
3. The ultrasound device provided by the invention preprocesses and compresses the raw ultrasound signal data, reducing the data volume so as to meet the bandwidth limits of common commercial interconnect interfaces and the required data transmission rate.
Drawings
FIG. 1 is a schematic diagram of a plurality of ultrasound network architectures according to an embodiment of the present invention;
fig. 2a is a schematic block diagram of data transmission in an ultrasound network according to an embodiment of the present invention;
fig. 2b is a schematic block diagram of each device in the ultrasound network according to the embodiment of the present invention;
FIG. 3 is an analog front end of an ultrasound device of an exemplary embodiment of the present invention;
FIG. 4 is a schematic diagram of the basic beamforming of an ultrasound device of an exemplary embodiment of the present invention;
FIG. 5 is a schematic channel acquisition diagram of an ultrasound device in an exemplary embodiment of the present invention;
FIG. 6 is a schematic IQ demodulation diagram of an ultrasound apparatus in accordance with an exemplary embodiment of the present invention;
FIG. 7 is a schematic drawing of decimation filtering of an ultrasound device in an exemplary embodiment of the invention;
FIG. 8 is a schematic diagram of a transmit beam type of an ultrasound device in an exemplary embodiment of the present invention;
FIG. 9 is a block diagram of a basic imaging process of a terminal device in an exemplary embodiment of the invention;
FIG. 10 is a schematic diagram of a plane-wave compound focusing technique in accordance with an exemplary embodiment of the present invention;
FIG. 11 is a schematic diagram of plane-wave compound imaging in accordance with an exemplary embodiment of the present invention;
FIG. 12 is a schematic diagram of a fully focused compounding technique in accordance with an exemplary embodiment of the present invention;
FIG. 13 is a flowchart of a deep learning process according to an exemplary embodiment of the present invention;
FIG. 14 is a diagram of an ultrasonic baseband signal data format in accordance with an exemplary embodiment of the present invention;
fig. 15 is a schematic diagram of a command frame format transmitted from a terminal device to an ultrasound device according to an exemplary embodiment of the present invention;
fig. 16 is a diagram illustrating a data format of a beam signal according to an exemplary embodiment of the present invention;
FIG. 17 is a diagram of a channel signal data format in accordance with an exemplary embodiment of the present invention;
FIG. 18 is a diagram of an image data format in accordance with an exemplary embodiment of the present invention;
FIG. 19 is an imaging schematic of conventional focused beam synthesis of an exemplary embodiment of the present invention;
FIG. 20 is a schematic diagram of cloud-based plane-wave compound imaging in an exemplary embodiment of the invention.
Detailed Description
The present invention will be described in further detail with reference to test examples and specific embodiments. It should be understood that the scope of the above-described subject matter is not limited to the following examples, and any techniques implemented based on the disclosure of the present invention are within the scope of the present invention.
Example 1
Fig. 1 shows a schematic diagram of an ultrasound network architecture according to an exemplary embodiment of the present invention, where a cloud server, a plurality of ultrasound devices, and a terminal device form a plurality of ultrasound networks. Further, as shown in fig. 2a and 2b, the ultrasound device performs data interaction with the cloud server through a first network, performs data interaction with the terminal device through a second network, and performs data interaction with the cloud server through a third network;
The ultrasound device is a portable ultrasound scanning and control device. The terminal device is a commercial smart device such as a mobile phone, tablet, laptop, or desktop computer. The ultrasound cloud (cloud server) is an ultrasound application service program deployed on the cloud server. Each of the first through third networks is a wireless or wired communication network, including but not limited to Type-C, gigabit optical fiber, WIFI, etc. The functions of the ultrasound device include, but are not limited to, ultrasound transmission and reception, basic signal processing, signal compression, and wired or wireless communication. The functions of the terminal device include, but are not limited to, user interaction, image display, control of the ultrasound device, basic ultrasound data processing, and wired or wireless communication. The functions of the ultrasound cloud include, but are not limited to, ultrasound signal processing, high-order imaging, deep learning, data storage, software interaction, and wired or wireless communication. The data transmitted over the first network comprises ultrasound signal data, namely the raw channel signal data. The data transmitted over the second network comprises ultrasound device control commands, ultrasound signal data (including raw signal data and beamformed signal data), and the like. The data transmitted over the third network comprises ultrasound signal data (including raw signal data and beamformed signal data), image data, and the like.
Specifically, during actual processing, a user may send the terminal device a first ultrasound detection instruction (cloud service required) or a second ultrasound detection instruction (cloud service not required); the detection instruction sent by the user also carries the scanning mode set by the user. If the user sends a first ultrasound detection instruction, the terminal device responds by sending a link request to the cloud server. When the cloud server can respond to the link request, a first control instruction is generated; on receiving it, the ultrasound device starts ultrasound scanning, receives ultrasound echo signal data in real time, applies a certain amount of signal preprocessing, and transmits the processed ultrasound signal data to the cloud server over the first network. After the ultrasound cloud receives the ultrasound signal data, a series of processing steps produce first data including a high-quality ultrasound image, which is transmitted back to the terminal device over the third network and displayed; this is the cloud ultrasound mode. When the cloud server cannot respond to the link request because of the network state or a load exceeding the threshold, the terminal device sends a second control instruction to the ultrasound device, which then transmits the ultrasound signal data to the terminal device. The terminal device screens out a subset of the ultrasound signal data according to the data requirements and transmits it to the ultrasound cloud, which processes it and returns second data containing the characteristic parameters.
If the user sends a second ultrasound detection instruction, the basic imaging mode is used: the terminal device generates a third control instruction for the ultrasound device, controlling it to acquire ultrasound signal data, perform beamforming on it, and transmit the beamformed signal data to the terminal device, which completes basic imaging.
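The three-way dispatch just described (first, second, and third control instructions) can be summarized in a small control-flow sketch. All names and return values here are illustrative, not part of the patent's implementation:

```python
def choose_control_instruction(instruction, cloud_available):
    """Hypothetical terminal-side dispatch mirroring the three control
    instructions described above.

    instruction:     "cloud" for the first detection instruction
                     (cloud service wanted), "basic" for the second
    cloud_available: whether the cloud server answered the link request
    """
    if instruction == "basic":
        # third control instruction: device beamforms, terminal images
        return "beamform_and_send_to_terminal"
    if cloud_available:
        # first control instruction: preprocessed raw channel data
        # goes straight to the cloud for high-order imaging
        return "preprocess_and_send_to_cloud"
    # second control instruction: fall back to the terminal, which
    # caches/screens the data and forwards a subset when the cloud
    # later accepts a transmission request
    return "preprocess_and_send_to_terminal"
```

The key property is the graceful degradation: the raw-data path is preferred whenever the cloud answers the link request, and basic local imaging is always available.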
Specifically, the modules and functions included in the ultrasound device provided by the present invention specifically include:
1. energy converter
The transducer is an acoustic-electric conversion device: in the transmit state it converts the excitation voltage into ultrasound waves of the corresponding frequency radiated outward, and in the receive state it converts acoustic vibrations into voltage changes passed to the downstream processing circuitry.
2. Analog front end
As shown in fig. 3, the analog front end includes an analog circuit including a transmitting part and a receiving part, the transmitting part converts the transmitted information into a high-voltage pulse excitation signal, and the receiving part converts the echo voltage signal into a digital signal through an analog-to-digital converter (ADC) after amplification, filtering and conditioning.
(1) The transmitting circuit supports continuous voltage excitation: the digital transmit signal is converted into a continuous analog excitation voltage by a digital-to-analog converter (DAC), so that the transmitted ultrasonic signal has cleaner frequency components.
(2) The transmitting circuit supports high-voltage excitation, with the voltage range linearly adjustable from -100 V to +100 V.
(3) The low noise gain amplifier (LNA) of the receiving circuit supports a gain of +12 dB to +30 dB, adjustable in steps of 6 dB.
(4) The variable gain amplifier (VGA) of the receiving circuit supports a gain linearly adjustable from -12 dB to +42 dB.
(5) The anti-aliasing filter (AAF) of the receiving circuit is a high-pass filter whose -6 dB cut-off frequency is adjustable from 2 kHz to 2 MHz in steps of 5 kHz.
(6) The ADC of the receiving circuit supports resolutions of 12, 14 or 16 bits and sampling rates selectable from 40 MHz to 100 MHz.
3. Beam synthesis
The beam forming capability of the ultrasound device itself is the most basic technique, delay and sum (DAS). When the ultrasound cloud is not enabled, the ultrasound device and the terminal device can still form a conventional ultrasound image: the ultrasound device performs beam forming as shown in fig. 4, and the terminal device performs the imaging processing and display.
Assume that at time t the N channels receive input data x1(t), x2(t), …, xN(t). In the delay-and-sum calculation, a delay τi(t) is computed for each channel; each channel delays its own data by its τi(t), and all the delayed data are accumulated to form 1 beam-formed sample s(t).
s(t) = Σ_{i=1}^{channels} x_i(t - τ_i(t))
Where channels represent the number of receive channels.
Here τ_i(t) is computed by the beam forming algorithm; the ultrasound device 1 may select which beam forming techniques to integrate according to its actual hardware resource configuration.
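The delay-and-sum accumulation described above can be sketched in a few lines of Python; the function name and the use of whole-sample delays are illustrative assumptions, not part of the patent:

```python
import numpy as np

def delay_and_sum(channel_data, delays_samples):
    """Delay-and-sum (DAS): delay each channel by its tau_i, then accumulate.

    channel_data: (channels, samples) array of received data x_i(t).
    delays_samples: per-channel delays tau_i, rounded to whole samples.
    Returns one beam-formed trace s(t)."""
    channels, samples = channel_data.shape
    s = np.zeros(samples)
    for i in range(channels):
        d = int(delays_samples[i])
        # shift channel i right by d samples, then accumulate into s(t)
        s[d:] += channel_data[i, :samples - d]
    return s

# Two channels whose echoes align after a 1-sample delay on channel 1:
aligned = delay_and_sum(np.array([[0., 1, 0, 0],
                                  [1., 0, 0, 0]]), [0, 1])
```

In a real system the delays are fractional and implemented by interpolation; integer shifts are used here only to keep the accumulation step visible.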
4. Channel acquisition
As shown in fig. 5, the channel acquisition is to directly acquire all the original channel data into an internal data buffer to prepare for uploading the subsequent original channel data.
5. IQ demodulation
Both the beam-formed signal s(t) and the original channel signal ch(t) are band-pass signals with sampling rate Fs, center frequency Fc and bandwidth B. They need to be down-converted and filtered into a baseband signal (also called an analytic signal) with center frequency 0 and bandwidth B, which is convenient for subsequent processing. As shown in fig. 6, the input signal x(t) is mixed and low-pass filtered to form a real-part signal I(t) and an imaginary-part signal Q(t); these 2 signals together form the baseband signal, whose center frequency is 0 and bandwidth is B.
I(t) = LPF[x(t) · cos(2πFc·t)]
Q(t) = LPF[-x(t) · sin(2πFc·t)]
The instantaneous amplitude and phase of the original signal x (t) can be quickly obtained from the baseband signal:
A(t) = sqrt(I(t)^2 + Q(t)^2)
φ(t) = arctan(Q(t) / I(t))
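The mixing, low-pass filtering and amplitude/phase extraction described above can be sketched numerically; the moving-average filter is a crude stand-in for a properly designed FIR low-pass, and the factor of 2 that restores the original envelope scale is a common convention, both illustrative choices:

```python
import numpy as np

def iq_demodulate(x, fs, fc, taps=31):
    """Down-convert a real band-pass signal x(t) to baseband I(t), Q(t)."""
    t = np.arange(len(x)) / fs
    # mix down: the factor 2 restores the original envelope amplitude
    i_mixed = 2.0 * x * np.cos(2 * np.pi * fc * t)
    q_mixed = -2.0 * x * np.sin(2 * np.pi * fc * t)
    lpf = np.ones(taps) / taps            # crude moving-average low-pass
    i = np.convolve(i_mixed, lpf, mode="same")
    q = np.convolve(q_mixed, lpf, mode="same")
    amplitude = np.hypot(i, q)            # instantaneous amplitude A(t)
    phase = np.arctan2(q, i)              # instantaneous phase phi(t)
    return i, q, amplitude, phase

# A pure tone at Fc demodulates to amplitude ~1 and phase ~0 mid-signal:
fs, fc = 1000.0, 100.0
x = np.cos(2 * np.pi * fc * np.arange(1000) / fs)
_, _, amp, ph = iq_demodulate(x, fs, fc)
```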
6. Decimation filtering
After IQ demodulation, if M = Fs/B and M is a positive integer, the sampling rate can be reduced by decimation filtering as shown in fig. 7, thereby reducing the data volume. For M-fold decimation of a signal with input sampling rate Fs, the input signal is first low-pass filtered with the filter cut-off frequency set to Fs/(2M); the filtered signal then enters the decimator, which keeps 1 sample out of every M, so the data volume is reduced by a factor of M.
y(n) = LPF[x(n)], with the low-pass filter cut-off frequency set to Fs/(2M)
x_d(n) = y(M·n), giving an output sampling rate of Fs/M
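The filter-then-keep-one-in-M step can be sketched as follows; the length-M moving average is only a crude stand-in for a properly designed low-pass at Fs/(2M):

```python
import numpy as np

def decimate(x, m):
    """M-fold decimation: anti-alias low-pass filter, then keep every M-th sample."""
    h = np.ones(m) / m                       # crude anti-aliasing low-pass
    filtered = np.convolve(x, h, mode="same")
    return filtered[::m]                     # decimator: 1 sample kept per M

y = decimate(np.ones(16), 4)   # 16 samples in, 16/4 = 4 samples out
```

A production implementation would use a windowed-sinc or polyphase FIR filter instead of the moving average, but the data-reduction structure is the same.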
7. Transmit control
As shown in fig. 8, the transmitting part supports beam types such as focused waves, plane waves and spherical waves, and waveforms such as ordinary pulsed waves and coded excitation waves.
(1) Focused wave: the array elements transmit with delays such that their acoustic waves reach the focus F at the same time, concentrating energy in the region near F and forming a funnel-shaped sound field.
(2) Plane wave: all array elements transmit with delays such that the wave front of the sound waves is a plane; the wave-front plane may form an angle with the horizontal direction.
(3) Spherical wave: the array elements transmit such that the wave front is convex, forming a fan-shaped sound field.
8. Communication control
The communication control part is responsible for bridging the external communication interfaces, including Type-C, optical fiber, WIFI and the like: it receives data packets from the external interface and converts them into the device's uniform internal control protocol, or packages internal data and transmits it out through the external interface.
9. Communication peripheral
The communication peripheral is a common commercial interconnection transmission interface, including but not limited to Type-C, gigabit fiber, WIFI and other wired or wireless communication protocols.
Specifically, the modules and functions included in the terminal device provided by the present invention are as follows:
1. Communication peripheral: the external communication interface is a common commercial interconnection transmission interface, including but not limited to Type-C, gigabit fiber, WIFI and other wired or wireless communication interfaces.
2. Communication driver
The communication driver layer contains the following software functions:
(1) Driver programs for Windows, Linux, Android and iOS corresponding to the communication peripherals.
(2) Packing and unpacking programs for accessing the custom registers of the ultrasound device 1.
(3) An unpacking program for data packets returned by the ultrasound device 1.
(4) Packing and unpacking programs for data packets exchanged with the ultrasound cloud 3.
(5) A communication exception response program.
3. Scan control
The scanning control part is used for receiving upper layer commands and controlling the ultrasonic equipment 1 to scan, and comprises the following software functions:
(1) Supports B, C, D and M mode scanning control for common linear array, convex array, phased array, micro-convex, intracavity and other probes.
(2) Calculates the transmit/receive geometry of each scan line of the ultrasound device 1 according to the different scan modes and settings, forms scan table data, and downloads it to the ultrasound device 1.
(3) Receives and identifies the signal data returned by the ultrasound device 1 and passes it to the upper layer for imaging post-processing.
4. Basic imaging
The basic imaging processing section, as shown in fig. 9, includes basic ultrasonic signal post-processing, image processing and the like, and finally forms the displayed image. Before imaging begins, digital gain compensation adjusts the ultrasonic signal to a suitable amplitude range. For B-mode imaging, the signal then undergoes envelope detection, logarithmic compression, image enhancement and similar processing to form a realistic gray-scale image. For the other Doppler modes (C, D, M, etc.), the signal undergoes wall filtering, spectrum detection, Doppler parameter estimation and similar processing to obtain parameters such as blood flow velocity and energy variance, which are superimposed on the B-mode image data.
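The gain, envelope detection and logarithmic compression chain for one B-mode line can be sketched as below; the 60 dB dynamic range and the 8-bit gray mapping are illustrative choices, not values from the patent:

```python
import numpy as np

def bmode_line(i, q, digital_gain=1.0, dyn_range_db=60.0):
    """Digital gain, envelope detection from I/Q, log compression to 8-bit gray."""
    env = digital_gain * np.hypot(i, q)          # envelope detection
    env = np.maximum(env, 1e-12)                 # guard against log(0)
    db = 20.0 * np.log10(env / env.max())        # normalize peak to 0 dB
    db = np.clip(db, -dyn_range_db, 0.0)         # logarithmic compression window
    return np.round((db + dyn_range_db) * 255.0 / dyn_range_db).astype(np.uint8)

# A full-scale sample maps to 255; a sample 20 dB down maps 1/3 of the way in:
gray = bmode_line(np.array([1.0, 0.1]), np.array([0.0, 0.0]))
```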
5. User interface
The user interface is a graphical interface for the ultrasound system to interact with the user, and comprises the following functions:
(1) User management: login window, local case database, etc.
(2) Scanning and settings: selection of different modes, selection of different scanned body parts, selection of personalized image processing parameters, device activation/freeze switches, etc.
(3) Imaging and display: displays local basic imaging data in real time, or receives and displays the image data returned by the ultrasound cloud 3.

Specifically, the modules and functions included in the ultrasound cloud provided by the present invention are as follows:
1. communication peripheral
The external communication interface is a common commercial interconnection transmission interface of the cloud server, including but not limited to gigabit fiber, WIFI and other wired or wireless communication protocols.
2. Communication driver
The communication driver layer contains the following software functions:
(1) Driver programs for Windows, Linux and other systems corresponding to the communication peripherals.
(2) Packing and unpacking programs for data packets exchanged with the terminal device 2.
(3) An unpacking program for data packets returned by the ultrasound device 1.
(4) A communication exception response program.
3. High order imaging
High-order imaging technologies, including but not limited to high-end techniques such as plane wave composite imaging and full-focus composite imaging, require high computing power and storage capacity.
Plane wave composite imaging
Fig. 10 illustrates synthesizing the focal point F(x, z) in the plane coordinate system XOZ by the plane wave composite method. N plane waves with different angles are used: N sub-plane waves (dashed lines) at different angles are transmitted, and for each reception ordinary delay-and-sum beam forming is performed at the point (x, z), giving N sub-echo signal data corresponding to (x, z): w(1, x, z), w(2, x, z), …, w(N, x, z). The N sub-echo signal data are then accumulated to obtain the composite echo signal data F(x, z).
F(x, z) = Σ_{i=1}^{N} w(i, x, z)
Fig. 11 shows how complete imaging is performed with the plane wave composite technique. N plane waves at different angles are transmitted in sequence; the echo data received from each transmission is beam formed into a sub-image and stored, yielding N sub-images w(1), w(2), …, w(N). Finally the N sub-images are superimposed to form the synthesized image. The lateral and longitudinal resolutions of the synthesized image are significantly improved relative to the sub-images, and its image quality depends on the number of angles N.
The algorithm complexity is high because each plane wave transmission produces sub-echo data from which a complete sub-image is directly beam formed. Assume the image resolution is R × C. Under the traditional focused beam forming method, each transmission needs only R beam forming operations to form one receive line; with plane waves, however, each transmission's sub-echo data requires R × C beam forming operations to form a whole sub-image, and R × C sub-image data must also be stored. When all N sets of R × C sub-image data have been obtained, they are accumulated N times to finally obtain one R × C echo image.
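The final accumulation step is simple even though the per-angle beam forming that precedes it is expensive; a minimal sketch (function and array names are illustrative):

```python
import numpy as np

def compound(sub_images):
    """Compound N beam-formed sub-images: F(x, z) = sum over i of w(i, x, z)."""
    return np.sum(sub_images, axis=0)

# N = 25 per-angle sub-images, each already beam formed on the same (x, z) grid:
subs = np.full((25, 4, 4), 1.0)
image = compound(subs)
```

The storage cost discussed above comes from having to hold all N sub-images (the first axis here) before this sum can be taken.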
Fig. 12 illustrates synthesizing the focal point F(x, z) in the plane coordinate system XOZ by the full-focus composite method. The full-focus technique transmits a focused waveform to form a convex or concave wave front; the acoustic energy of transmit events near F(x, z) passes through the point (x, z), so the echoes that adjacent transmit events form at (x, z) can be used to improve the SNR of the beam-formed energy at that point.
N adjacent transmit events are used to compound the point (x, z). Each transmit event performs delay-and-sum beam forming at (x, z), giving N sub-echo signal data corresponding to (x, z): w(1, x, z), w(2, x, z), …, w(N, x, z). The N sub-echo signal data are then weighted and accumulated to obtain the composite echo signal data F(x, z).
F(x, z) = Σ_{i=1}^{N} q(i) · w(i, x, z)
where q(i) is a weighting function whose purpose is to lower the echo energy contribution of transmit events farther from F(x, z) and raise that of closer ones. The algorithm complexity and storage requirements are similar to plane wave compounding, but the computation is harder: the radius of the arc-shaped wave front generated by the focused wave changes continuously over time, so the delay calculation during beam forming requires a more complex geometric computation.
4. Deep learning
The deep learning module is provided with at least two deep learning models. The first is an image feature extraction model, which uses trained AI network parameters to perform deep learning processing on the input original image and finally outputs an image carrying special diagnostic parameters.
The network types of the deep learning module include, but are not limited to, UNet, YOLO, TransUNet, and the like.
As shown in fig. 13, the deep learning module works in 5 steps:
(1) During start-up initialization, the system loads the AI parameters from storage into the deep learning network.
(2) After the ultrasound device starts scanning, original images are placed into the image cache of the deep learning module.
(3) An original image is taken from the image cache and fed into the deep learning network for processing.
(4) When the deep learning network finishes processing, the special diagnostic parameters of the original image are output and converted into a visual representation.
(5) The converted visual diagnostic parameters are superimposed on the original image, and an image with special diagnostic annotations is output.
The second deep learning model performs parameter estimation. When the ultrasound network state is poor or the ultrasound cloud's algorithm load is too heavy, imaging can be done on the terminal device while a small portion of the signal data is uploaded to the ultrasound cloud for high-order processing such as human body parameter estimation, with the result fed back to the terminal device for display. For example, in color Doppler flow imaging the scanning mode is a B/C hybrid, and the returned signal data includes B-mode imaging signal data and C-mode Doppler signal data. The B-mode signal data can be imaged directly by the terminal device, while the smaller C-mode Doppler signal data is uploaded to the ultrasound cloud. Using its very high computing power, the cloud calculates the blood flow velocity, direction and other information, then uses an AI algorithm to match the various blood lesion signal data in the cloud database against the currently input Doppler signal data and automatically mark the lesion position, and finally transmits the obtained blood flow parameters, lesion position and other information back to the terminal device.
5. Interactive interface
The ultrasound cloud 3 is an ultrasound application service program deployed on a cloud server. It allows third-party software to access it, to process data using the ultrasound cloud 3, or to export data from the database inside the ultrasound cloud 3. The interactive interface software can set different access permissions for classified user login management.
Specifically, the first network, the second network and the third network in the ultrasound network provided by the present invention, and the format of the data transmitted over them, are as follows:
1. Communication protocol: the communication peripherals include but are not limited to USB 3.0/3.1, PCI Express 3.0/4.0, the MFi authentication protocol, Ethernet (TCP/IP), WIFI6 and other wired or wireless communication protocols, with a transmission speed of at least 500 MB/s.
2. Communication content:
(1) The terminal device 2 sends command frames to the ultrasound device 1 to control its scanning.
(2) The ultrasound device 1 transmits signal data to the terminal device 2 or the ultrasound cloud 3; the signal data is divided into beam signal data frames and channel signal data frames.
(3) The ultrasound cloud 3 transmits image data frames back to the terminal device 2.
3. Ultrasonic signal format
The ultrasound device 1 transmits ultrasonic signal data to the terminal device 2 or the ultrasound cloud 3. One kind of signal data is the data after beam forming, called beam signal data; the other is the original channel signal data, called channel signal data. The signal data finally transmitted are the baseband signal data I(t) and Q(t) after IQ demodulation and decimation filtering. Referring to fig. 14, for all signal data we define I(t) and Q(t) as 2 bytes each; one ultrasonic signal sample BB(t) occupies 4 bytes, with I(t) in the lower 2 bytes and Q(t) in the upper 2 bytes.
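The 4-byte BB(t) layout, I(t) in the lower 2 bytes and Q(t) in the upper 2 bytes, can be packed and unpacked as follows; treating the samples as signed int16 is an assumption consistent with the 2-byte fields:

```python
import numpy as np

def pack_bb(i_samples, q_samples):
    """Pack int16 I(t)/Q(t) pairs into 4-byte BB(t) words (I low, Q high)."""
    i_u = np.asarray(i_samples, dtype=np.int16).astype(np.uint16).astype(np.uint32)
    q_u = np.asarray(q_samples, dtype=np.int16).astype(np.uint16).astype(np.uint32)
    return (q_u << 16) | i_u

def unpack_bb(bb):
    """Recover signed I(t)/Q(t) from packed BB(t) words."""
    i = (bb & 0xFFFF).astype(np.uint16).astype(np.int16)
    q = (bb >> 16).astype(np.uint16).astype(np.int16)
    return i, q

bb = pack_bb([1, -2], [-3, 4])
i_back, q_back = unpack_bb(bb)
```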
4. Communication data format
Command frame
The terminal device 2 transmits control commands to the ultrasound device 1 to start and stop the device and to set the corresponding ultrasound scanning parameters, using the command frame format shown in fig. 15: a command frame consists of a frame header, frame information and command data; the frame header and frame information are 8 bytes each, and the command data is variable from 4 bytes to 16368 bytes. The frame header is the unique identifier of a command frame, and the frame information includes the command data length, register address, control bits and other information. The command data includes register setting data, scan table data and other data that control the ultrasound scan.
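A sketch of assembling such a command frame follows. The 8-byte magic value and the exact field split inside the 8-byte frame information are hypothetical, since the patent only specifies the lengths of the three parts:

```python
import struct

CMD_MAGIC = b"CMDFRAME"   # hypothetical 8-byte unique frame-header identifier

def build_command_frame(reg_addr, control, command_data):
    """Frame = 8-byte header + 8-byte frame info + 4..16368 bytes of command data."""
    if not 4 <= len(command_data) <= 16368:
        raise ValueError("command data must be 4..16368 bytes")
    # assumed frame-info split: data length (4B), register address (2B), control (2B)
    frame_info = struct.pack("<IHH", len(command_data), reg_addr, control)
    return CMD_MAGIC + frame_info + command_data

frame = build_command_frame(reg_addr=0x0010, control=1, command_data=b"\x01\x02\x03\x04")
```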
Beam signal data frame
The echo data of each of the M channels obtained in one ultrasound scan contains N samples. After beam forming, the M channels form 1 channel of data, which after IQ demodulation and decimation filtering becomes 1 channel of baseband signal BB(t), where 1 ≤ t ≤ N. The transmission format is shown in fig. 16: a beam signal data frame consists of a frame header, frame information and signal data; the frame header and frame information are 8 bytes each, and the signal data is variable from 4 bytes to N × 4 bytes. The frame header is the unique identifier of a beam signal data frame, and the frame information includes the signal data length, scan event number, scan mode, image frame line number and other information. The data of the frame is the N samples BB(t), 1 ≤ t ≤ N, each occupying 4 bytes containing I(t) and Q(t) as defined above.
Channel signal data frame
The echo data of each of the M channels obtained in one ultrasound scan contains N samples. Each channel's data is IQ demodulated and decimation filtered to form M channels of baseband data BB(t), where 1 ≤ t ≤ N. The format is shown in fig. 17: a channel signal data frame consists of a frame header, frame information and signal data; the frame header and frame information are 8 bytes each, and the signal data is variable from 4 bytes to M × N × 4 bytes. The frame header is the unique identifier of a channel signal data frame, and the frame information includes the signal data length, scan event number, channel count, scan mode, image frame line number and other information. The frame data is arranged as the data ch1_BB(t) of channel 1, then ch2_BB(t) of channel 2, and so on up to chM_BB(t) of channel M.
Image data frame
After the ultrasound cloud 3 completes high-quality imaging, it transmits image data frames back to the terminal device 2, with the format shown in fig. 18: an image data frame consists of a frame header, frame information and signal data; the frame header and frame information are 8 bytes each, and the signal data is variable from 1 byte to 131072 bytes. The frame header is the unique identifier of an image data frame, and the frame information includes the image data length, image frame number, image resolution, scan mode and other information. The frame data is image pixel data, with each pixel occupying 1 byte; the data length is related to the resolution.
Example 2
In a further embodiment of the present invention, an original channel signal data transmission technique based on compression is also provided, by which an ultrasound device can adapt to existing communication devices and quickly upload original channel data to the cloud server.
First, the communication rate required to transmit the original data can be calculated using actual parameters:
number of channels (channels) 32
Sampling rate (Fs) 50MHz
Number of sampling points (samples) 16384 (about 25cm)
Pulse Repetition Frequency (PRF) 3000 times/second
Here the pulse repetition frequency (PRF) is defined as the number of ultrasound transmit/receive events in 1 second. The raw channel RF data can typically be represented in 2 bytes, so the minimum speed required to transmit the raw channel data is:
speed_RF = 2 × samples × channels × PRF
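Plugging the example parameters above into the formula gives a direct arithmetic check of the rate quoted later in the text:

```python
channels = 32        # number of receive channels
samples = 16384      # sampling points per line (~25 cm depth)
prf = 3000           # transmit/receive events per second

# 2 bytes per raw RF sample
speed_rf = 2 * samples * channels * prf
print(speed_rf)      # 3145728000 bytes/s, i.e. about 3.15 GB/s
```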
the following table illustrates the maximum rates of commonly used commercial interconnect transport protocols:
(Table omitted: theoretical maximum rates of commonly used commercial interconnect transport protocols.)
the above-mentioned exemplary transmission protocols are only the theoretical maximum speed, and actually, the protocol overhead, channel quality, software delay, etc. need to be considered, and the design speed may not reach 80% of the theoretical speed. In order to solve the problem, original channel RF data needs to be compressed, and the method adopts a method of fixed depth data quantity plus decimation filtering to compress the data.
(1) Fixed depth data amount
Although the longitudinal data volume of ultrasound imaging grows as the sampling depth increases (in the example above, 25 cm corresponds to 16384 sampling points), the actual longitudinal resolution of the image does not need that many pixels to represent. Practical exploration shows that no matter how deep the scan, 512 longitudinal pixels are enough to preserve the longitudinal resolution of the original data, so the data of each channel can be reduced to 512 points.
(2) Decimation filtering
Reducing the signal to 512 points requires decimation filtering. From a signal processing perspective, if a signal is decimated by a factor of M, its spectrum is expanded by a factor of M; if the signal bandwidth is greater than Fs/M, the spectrum of the decimated signal will alias. Therefore the signal must be filtered before decimation: the baseband signal is low-pass filtered with the filter cut-off frequency set to Fs/(2M), filtering out the components that would cause aliasing.
The relationship between the sampling rate Fs, the effective signal bandwidth B and the maximum observation depth D is evaluated below:

Name           Sampling rate Fs    Effective signal bandwidth B    Maximum observation depth D
Linear array   50 MHz              8 MHz                           5 cm
Convex array   50 MHz              1.5 MHz                         25 cm
Phased array   50 MHz              2 MHz                           15 cm
For the linear array, about 4.7 cm corresponds to 3072 sampling points; the decimation rate M = 3072/512 = 6, and the bandwidth preserved after decimation is 50 MHz/M = 8.333 MHz, which is greater than 8 MHz, satisfying the condition of preserving the bandwidth B.
For the convex array, about 25.23 cm corresponds to 16384 sampling points; the decimation rate M = 16384/512 = 32, and the bandwidth preserved after decimation is 50 MHz/M = 1.5625 MHz, which is greater than 1.5 MHz, satisfying the condition of preserving the bandwidth B.
For the phased array, about 15.76 cm corresponds to 10240 sampling points; the decimation rate M = 10240/512 = 20, and the bandwidth preserved after decimation is 50 MHz/M = 2.5 MHz, which is greater than 2 MHz, satisfying the condition of preserving the bandwidth B.
The above calculations confirm that a fixed depth data amount plus decimation filtering can reduce the data volume while preserving the effective bandwidth of the signal. Taking 16384 samples as an example: after decimation filtering, each 2-byte RF sample becomes a 4-byte analytic sample (2 bytes of I(t) and 2 bytes of Q(t)), and the decimation rate M = 16384/512 = 32, so the data amount is greatly reduced after decimation.
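The three probe checks above can be reproduced numerically; the probe names, sample counts and the 512-point target are taken from the text, while the dict layout is just illustration:

```python
fs = 50e6                      # sampling rate, Hz
target_points = 512            # fixed longitudinal depth data amount
probes = {                     # name: (samples per line, effective bandwidth B in Hz)
    "linear array": (3072, 8e6),
    "convex array": (16384, 1.5e6),
    "phased array": (10240, 2e6),
}
for name, (samples, bandwidth_b) in probes.items():
    m = samples // target_points          # decimation rate M
    preserved = fs / m                    # bandwidth preserved after decimation
    assert preserved > bandwidth_b, name  # condition for keeping bandwidth B
```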
Then the required channel signal transmission rate after the down-extraction filtering is calculated again:
speed_baseband=4×decimated_samples×channels×PRF
It can be seen that transmitting the original channel RF signal data requires a transmission bandwidth of about 3.15 GB/s, while transmitting the decimation-filtered channel baseband signal data (4 × 512 × 32 × 3000 bytes/s) requires only about 196.6 MB/s, a 16-fold reduction. The latter can be matched by common commercial transmission protocols, so the compression by fixed depth data amount plus decimation filtering provided by the present invention allows existing transmission technology to carry the ultrasound channel data.
Example 3
Traditional portable ultrasound generally uses an FPGA device for beam forming. The following table shows the storage resources of the XC7K325T, an FPGA device commonly used in traditional portable ultrasound:

Name        Capacity
BlockRAM    1.73 MB
Let us calculate the computation and storage resources required for plane wave composite imaging. Define the image resolution as 512 × 256 and the number of compounding angles N = 25, with each baseband sample occupying 4 bytes. As described above, plane wave compounding needs to store N sub-images and combine them into 1 image, so the required storage capacity is:
S = 4 × 512 × 256 × 25 = 13,107,200 bytes ≈ 12.5 MB
This storage capacity far exceeds that of the FPGA device, so traditional ultrasound can only perform traditional focused beam forming. For a cloud computing server, however, a 12.5 MB storage requirement is very small. The FPGA can therefore be used for channel data acquisition, with the channel data transmitted to the cloud server for plane wave composite imaging, achieving the goal of computing power transfer.
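The storage comparison can be checked in a few lines; the 1.73 MB BlockRAM figure is the table value above, and taking MB as 2^20 bytes is an assumption:

```python
bytes_per_sample = 4          # 2-byte I + 2-byte Q baseband sample
rows, cols = 512, 256         # image resolution
n_angles = 25                 # number of compounding angles N

s = bytes_per_sample * rows * cols * n_angles    # storage for N sub-images
block_ram = 1.73 * 2**20                         # XC7K325T BlockRAM, ~1.73 MB
print(s, s / 2**20)           # 13107200 bytes = 12.5 MiB, far above BlockRAM
```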
Fig. 19 shows a schematic of conventional focused beam forming imaging, and fig. 20 a schematic of cloud-based plane wave composite imaging. Comparing the two, in the image formed by conventional focused synthesis the target points begin to diverge at 3 cm, the lateral resolution drops very quickly with depth, and the imaging quality is far inferior to the cloud-based plane wave composite technique. In cloud-based plane wave composite imaging (25 angles), even the target point at 6 cm remains very clear, with a focusing effect almost the same as at 1 cm, demonstrating that the lateral resolution does not significantly degrade with depth.
Meanwhile, the performance of the image feature extraction technology based on deep learning also depends on hardware computing resource allocation, storage capacity and the number of targets for training, and the performance difference of deep learning of the traditional palm ultrasound tablet computer and cloud service is compared as follows:
Name              Training image count    AI parameter size    Time to process one real-time image
Tablet computer   195                     20 MB                20 ms
Cloud server      595                     60 MB                1 ms
It can be seen that with the same AI network, the tablet computer, limited in storage and computing capability, can only marginally bear 20 MB of AI parameters trained from 195 images, and needs 20 ms to process one real-time image. The cloud server, with strong computing and storage power, can process one real-time image in only 1 ms even with the 60 MB of AI parameters trained from 595 images; its performance is greatly superior to the tablet computer configured in traditional handheld ultrasound devices.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (15)

1. An ultrasound image processing method based on cloud computing and signal processing is characterized in that the method is applied to a terminal device, the terminal device is in communication connection with an ultrasound device and a cloud server, and the method comprises the following steps:
responding to a first ultrasonic detection instruction of a user, and sending a link request to a cloud server; when the cloud server responds to the link request, generating a first control instruction, controlling the ultrasonic equipment to acquire ultrasonic signal data, preprocessing the acquired ultrasonic signal data, and transmitting the preprocessed ultrasonic signal data to the cloud server;
and receiving first data fed back by the cloud server according to the ultrasonic signal data, and displaying the first data.
2. The cloud computing and signal processing based ultrasound image processing method of claim 1, wherein the method further comprises:
when the cloud server cannot respond to the link request, generating a second control instruction, and controlling the ultrasonic equipment to transmit the preprocessed ultrasonic signal data to the terminal equipment;
screening the received ultrasonic signal data to obtain screened partial ultrasonic signal data, further sending a transmission request to a cloud server, and transmitting the partial ultrasonic signal data to the cloud server;
and receiving and displaying second data fed back by the cloud server according to the partial ultrasonic signal data.
3. The ultrasound image processing method based on cloud computing and signal processing according to claim 1 or 2, further comprising:
in response to a second ultrasound detection instruction from the user, generating a third control instruction that controls the ultrasound device to acquire ultrasound signal data, perform beamforming on the acquired ultrasound signal data, and transmit the beamformed signal data to the terminal device; and
receiving the beamformed signal data, generating basic ultrasound image data from the beamformed signal data, and displaying the basic ultrasound image data.
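As a rough illustration only (not part of the claims), the terminal-side control flow of claims 1–3 can be sketched as follows. All names, the quality-based screening rule, and the data layout are hypothetical; the claims do not specify how screening is performed.

```python
# Hypothetical sketch of the terminal-side flow in claims 1-2: try the cloud
# path first; if the link request is not answered, receive the preprocessed
# data locally and upload only a screened portion of it later.

def terminal_flow(cloud_reachable, frames):
    """Return (path_taken, data_destined_for_the_cloud)."""
    if cloud_reachable:
        # First control instruction: the ultrasound device streams the
        # preprocessed data directly to the cloud server; the terminal
        # only displays the fed-back first data.
        return "cloud", frames
    # Second control instruction: the terminal receives the preprocessed
    # data, screens it (here: a hypothetical quality threshold), and sends
    # a transmission request for the screened portion only.
    screened = [f for f in frames if f["quality"] >= 0.5]
    return "fallback", screened

frames = [{"id": 0, "quality": 0.9}, {"id": 1, "quality": 0.2},
          {"id": 2, "quality": 0.7}]
path, sent = terminal_flow(False, frames)
# path == "fallback"; only the two frames with quality >= 0.5 are uploaded
```

The design point of the fallback branch is bandwidth: when the link is degraded, the terminal forwards only a screened subset rather than the full preprocessed stream.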
4. An ultrasound image processing method based on cloud computing and signal processing, applied to an ultrasound device communicatively connected to a terminal device and a cloud server, the method comprising:
in response to a first control instruction from the terminal device, acquiring ultrasound signal data, preprocessing the ultrasound signal data, and transmitting the preprocessed ultrasound signal data to the cloud server.
5. The ultrasound image processing method based on cloud computing and signal processing according to claim 4, further comprising:
in response to a second control instruction from the terminal device, acquiring ultrasound signal data, preprocessing the ultrasound signal data, and transmitting the preprocessed ultrasound signal data to the terminal device; and
in response to a third control instruction from the terminal device, acquiring ultrasound signal data, performing beamforming on the acquired ultrasound signal data, and transmitting the beamformed signal data to the terminal device.
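The claims name beamforming but do not specify the beamformer. Purely as an illustration, the simplest common choice is delay-and-sum: each channel is shifted by its focal delay so that echoes from the focus align, then the channels are summed. The example below, with made-up data, is a minimal sketch of that idea, not the patent's implementation.

```python
import numpy as np

# Minimal delay-and-sum beamforming sketch (illustrative only).
# channel_data: (n_channels, n_samples) RF data; delays in whole samples.

def delay_and_sum(channel_data, delays_samples):
    n_ch, n_s = channel_data.shape
    beam = np.zeros(n_s)
    for ch in range(n_ch):
        d = int(delays_samples[ch])
        beam[: n_s - d] += channel_data[ch, d:]  # advance by the focal delay
    return beam / n_ch                           # average the aligned channels

# Example: the same echo arrives staggered across channels; after the
# per-channel delays it sums coherently into a single sharp peak.
rf = np.zeros((4, 32))
for ch, d in enumerate([0, 1, 2, 3]):
    rf[ch, 10 + d] = 1.0
out = delay_and_sum(rf, np.array([0, 1, 2, 3]))
# out[10] == 1.0: the four aligned echoes average coherently
```

Real beamformers use fractional (interpolated) delays and apodization weights; integer delays are used here only to keep the sketch short.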
6. The ultrasound image processing method based on cloud computing and signal processing according to claim 4 or 5, wherein the preprocessing comprises: fixing the longitudinal depth of the acquired ultrasound signal data, and performing decimation filtering on the depth-fixed ultrasound signal data.
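Under stated assumptions, the claim-6 preprocessing can be sketched as follows: "fixing the longitudinal depth" is read here as truncating or zero-padding each channel to a fixed sample count, and "decimation filtering" as low-pass filtering followed by downsampling. The boxcar filter and all parameter values are illustrative choices, not taken from the patent.

```python
import numpy as np

# Sketch of claim-6 preprocessing: fix the longitudinal depth (sample count
# per channel), then low-pass filter and downsample to cut uplink bandwidth.

def preprocess(channel_data, depth_samples, decim=4):
    n_ch, n_s = channel_data.shape
    fixed = np.zeros((n_ch, depth_samples))
    keep = min(n_s, depth_samples)
    fixed[:, :keep] = channel_data[:, :keep]     # fixed longitudinal depth
    kernel = np.ones(decim) / decim              # crude anti-alias low-pass
    smoothed = np.apply_along_axis(
        lambda x: np.convolve(x, kernel, mode="same"), 1, fixed)
    return smoothed[:, ::decim]                  # downsample by `decim`

data = np.random.default_rng(0).standard_normal((8, 1000))
out = preprocess(data, depth_samples=1024, decim=4)
# out.shape == (8, 256): each channel fixed to 1024 samples, then reduced 4x
```

The net effect is the channel-data compression that claim 14 assigns to the device's channel signal compression module: a fixed, and much smaller, payload per frame.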
7. An ultrasound image processing method based on cloud computing and signal processing, applied to a cloud server communicatively connected to an ultrasound device and a terminal device, the method comprising:
in response to a link request from the terminal device, receiving ultrasound signal data transmitted by the ultrasound device, generating first data from the ultrasound signal data, and transmitting the first data to the terminal device.
8. The ultrasound image processing method based on cloud computing and signal processing according to claim 7, further comprising:
in response to a transmission request from the terminal device, receiving the portion of the ultrasound signal data transmitted by the terminal device, generating second data from the portion of the ultrasound signal data, and transmitting the second data to the terminal device.
9. The ultrasound image processing method based on cloud computing and signal processing according to claim 8, wherein generating the second data from the portion of the ultrasound signal data comprises:
performing feature estimation on the portion of the ultrasound signal data with a pre-trained second deep learning model to generate the second data.
10. The ultrasound image processing method based on cloud computing and signal processing according to any one of claims 7 to 9, wherein generating the first data from the ultrasound signal data comprises:
generating a high-order ultrasound image from the ultrasound signal data using a high-order imaging technique, extracting image features from the high-order ultrasound image with a pre-trained first deep learning model, and overlaying the extracted features on the high-order ultrasound image to generate the first data;
wherein the high-order imaging technique is one of plane wave compound imaging, total focusing compound imaging, synthetic aperture imaging, and Fourier-domain imaging.
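To illustrate the first of the listed options: in coherent plane wave compounding, low-resolution frames beamformed from differently steered plane-wave transmissions are averaged coherently, and uncorrelated noise shrinks roughly as 1/sqrt(n_angles). The toy example below stands in for real beamformed frames with a constant image plus noise; it is a sketch of the principle, not the patent's pipeline.

```python
import numpy as np

# Illustrative sketch of coherent plane wave compounding (one of the
# "high-order imaging" options in claim 10): average the low-resolution
# frames obtained from each steering angle.

def compound(lowres_images):
    """lowres_images: (n_angles, H, W) beamformed frames."""
    return np.mean(lowres_images, axis=0)

rng = np.random.default_rng(1)
signal = np.ones((64, 64))               # stand-in for the true image
frames = np.stack([signal + 0.5 * rng.standard_normal((64, 64))
                   for _ in range(16)])  # 16 noisy single-angle frames
img = compound(frames)
# Residual noise std drops by ~1/sqrt(16) relative to a single frame
```

This is why plane wave compounding can approach focused-transmit image quality at very high frame rates: each transmit insonifies the whole field, and quality is recovered in the compounding step.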
11. An electronic device, comprising:
one or more processors; and
a memory storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1 to 3 or the method of any one of claims 7 to 10.
12. A computer-readable storage medium on which a computer program is stored, the program, when executed by a processor, implementing the method of any one of claims 1 to 4 or the method of any one of claims 7 to 10.
13. An ultrasound device, configured to: in response to a first control instruction from a terminal device, acquire ultrasound signal data, preprocess the ultrasound signal data, and transmit the preprocessed ultrasound signal data to a cloud server.
14. The ultrasound device according to claim 13, comprising a channel signal compression module,
wherein the channel signal compression module is configured to fix the longitudinal depth of the ultrasound signal data acquired by the ultrasound probe and to perform decimation filtering on the depth-fixed ultrasound signal data.
15. The ultrasound device according to claim 13 or 14, further configured to: in response to a second control instruction from the terminal device, acquire ultrasound signal data, preprocess the ultrasound signal data, and transmit the preprocessed ultrasound signal data to the terminal device; and
in response to a third control instruction from the terminal device, acquire ultrasound signal data, perform beamforming on the acquired ultrasound signal data, and transmit the beamformed signal data to the terminal device.
CN202111397617.8A 2021-11-23 2021-11-23 Ultrasonic image processing method and device based on cloud computing and signal processing Pending CN113965573A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111397617.8A CN113965573A (en) 2021-11-23 2021-11-23 Ultrasonic image processing method and device based on cloud computing and signal processing


Publications (1)

Publication Number Publication Date
CN113965573A true CN113965573A (en) 2022-01-21

Family

ID=79471515

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111397617.8A Pending CN113965573A (en) 2021-11-23 2021-11-23 Ultrasonic image processing method and device based on cloud computing and signal processing

Country Status (1)

Country Link
CN (1) CN113965573A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114727117A (en) * 2022-03-04 2022-07-08 上海深至信息科技有限公司 Packing method of ultrasonic scanning video

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103202712A (en) * 2012-01-17 2013-07-17 三星电子株式会社 Probe Device, Server, System For Diagnosing Ultrasound Image, And Method Of Processing Ultrasound Image
CN106770641A (en) * 2017-03-02 2017-05-31 张潮 A kind of portable intelligent nondestructive detection system and method for detection based on mobile terminal
CN107707449A (en) * 2017-08-18 2018-02-16 珠海市魅族科技有限公司 A kind of method of data transfer, relevant apparatus and storage medium
CN110477950A (en) * 2019-08-29 2019-11-22 浙江衡玖医疗器械有限责任公司 Ultrasonic imaging method and device
CN111307936A (en) * 2019-12-04 2020-06-19 广东工业大学 Ultrasonic detection method and device
CN211237737U (en) * 2020-03-27 2020-08-11 深圳开立生物医疗科技股份有限公司 Ultrasonic cloud platform system
CN111544038A (en) * 2020-05-12 2020-08-18 上海深至信息科技有限公司 Cloud platform ultrasonic imaging system
CN112336373A (en) * 2019-08-08 2021-02-09 深圳市恩普电子技术有限公司 Portable ultrasonic diagnosis system and method based on mobile terminal
CN112666561A (en) * 2020-12-01 2021-04-16 飞依诺科技(苏州)有限公司 Ultrasonic scanning system, equipment, method and terminal
CN112932533A (en) * 2021-01-27 2021-06-11 深圳华声医疗技术股份有限公司 Ultrasonic equipment scanning control system and ultrasonic equipment scanning control method
CN113056035A (en) * 2021-03-11 2021-06-29 深圳华声医疗技术股份有限公司 Ultrasonic image processing method and ultrasonic system based on cloud computing
WO2021208776A1 (en) * 2020-04-15 2021-10-21 南京超维景生物科技有限公司 Ultrasound system



Similar Documents

Publication Title
KR100508276B1 (en) Ultrasound scan conversion with spatial dithering
JP6243126B2 (en) Ultrasonic system and method
KR100748585B1 (en) Ultra sound system for constituting image with use of additional information
JP2002526143A (en) Handheld ultrasonic diagnostic equipment
KR20070019070A (en) Method of Compounding a Ultrasound Image
CN105726064A (en) Ultrasonic Diagnostic Device And Control Method
CN101448460A (en) Methods and apparatus for ultrasound imaging
US11715454B2 (en) Beamforming device, method of controlling the same, and ultrasound diagnostic apparatus
JP4469031B2 (en) System and method for imaging ultrasound scatterers
CN106210719A (en) Channel data based on Ultrasound beamforming device compresses
CN113965573A (en) Ultrasonic image processing method and device based on cloud computing and signal processing
US6514205B1 (en) Medical digital ultrasonic imaging apparatus capable of storing and reusing radio-frequency (RF) ultrasound pulse echoes
JP5566300B2 (en) Ultrasonic diagnostic apparatus and signal processing method of ultrasonic diagnostic apparatus
CN101744644A (en) Ultrasonic imaging apparatus and control method for ultrasonic imaging apparatus
JP5457788B2 (en) System and method for clutter filtering for improved adaptive beamforming
US6740034B2 (en) Three-dimensional ultrasound imaging system for performing receive-focusing at voxels corresponding to display pixels
JP2000149015A (en) Method for edge enhancement of image and imaging device
KR101555267B1 (en) Method And Apparatus for Beamforming by Using Unfocused Ultrasound
JP2010201110A (en) Ultrasonic diagnostic apparatus and method for controlling the ultrasonic diagnostic apparatus
JPH10309277A (en) Ultrasonic imaging system
JP2015033569A (en) Ultrasonic diagnostic device, medical image processor and medical image processing method
CN115768356A (en) Method and system for reducing data transmission in ultrasonic imaging
CN112890855A (en) Multi-beam p-order root compression coherent filtering beam synthesis method and device
EP4088665A1 (en) Dematerialized, multi-user system for the acquisition, generation and processing of ultrasound images
US11953591B2 (en) Ultrasound imaging system with pixel extrapolation image enhancement

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination