CN116058867A - Light-weight imaging ultrasonic system without fixed scanning probe and ultrasonic detection method - Google Patents


Info

Publication number
CN116058867A
CN116058867A (application CN202310017632.8A)
Authority
CN
China
Prior art keywords: dimensional, ultrasonic, wireless, data, probe
Prior art date
Legal status
Pending
Application number
CN202310017632.8A
Other languages
Chinese (zh)
Inventor
王江涛
齐昊
潘海林
王雪松
Current Assignee
Beijing Changchao Medical Technology Co ltd
East China Normal University
Original Assignee
Beijing Changchao Medical Technology Co ltd
East China Normal University
Priority date
Filing date
Publication date
Application filed by Beijing Changchao Medical Technology Co ltd, East China Normal University filed Critical Beijing Changchao Medical Technology Co ltd
Priority to CN202310017632.8A priority Critical patent/CN116058867A/en
Publication of CN116058867A publication Critical patent/CN116058867A/en
Pending legal-status Critical Current

Classifications

    • A — HUMAN NECESSITIES; A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B8/4444 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A61B8/461 Devices with special arrangements for interfacing with the operator or the patient; displaying means of special interest
    • A61B8/483 Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B8/5215 Devices using data or image processing specially adapted for diagnosis, involving processing of medical diagnostic data
    • A61B8/565 Details of data transmission or power supply involving data transmission via a network


Abstract

The invention provides a lightweight imaging ultrasound system with an unfixed scanning probe, comprising a wireless scanning probe, a terminal device and a three-dimensional imaging computing device. The wireless scanning probe wirelessly transmits ultrasound image data and the probe's acceleration and angular-velocity data to the terminal device and the three-dimensional imaging computing device; the terminal device provides the required two-dimensional ultrasound images to the three-dimensional imaging computing device; and the three-dimensional imaging computing device transmits the reconstructed three-dimensional ultrasound data back to the terminal device. The invention addresses the difficulty doctors have in interpreting two-dimensional ultrasound images: a real-time three-dimensional view lets them grasp tissue and organ structure more intuitively, trains them to scan and read two-dimensional ultrasound images more efficiently, and shortens the long training time of sonographers.

Description

Light-weight imaging ultrasonic system without fixed scanning probe and ultrasonic detection method
Technical Field
The invention belongs to the technical field of ultrasound, and relates to a light-weight imaging ultrasound system with an unfixed scanning probe and an ultrasound detection method.
Background
In the medical field, ultrasonic scanning is widely used in clinical diagnosis because of its unique advantages. Compared with computed tomography (CT) and magnetic resonance imaging (MRI), ultrasound imaging is non-invasive, radiation-free and low-cost. In a conventional ultrasound examination, a doctor places an ultrasound scanning probe on the patient's body and scans to obtain two-dimensional images of cross sections of tissues and organs. In this conventional approach, however, the doctor must mentally reconstruct the two-dimensional ultrasound images into the three-dimensional anatomy of the human body. To be able to do this, the doctor needs professional training, which takes a great deal of time and money. Compared with a two-dimensional ultrasound image, a three-dimensional ultrasound image presents the whole volume of the region of interest and gives the doctor a more intuitive spatial structure.
Early three-dimensional reconstruction of ultrasound images was carried out in three separate stages: acquisition, reconstruction and visualization. The typical flow is to collect a series of ultrasound images obtained by scanning the target position and then display them through a corresponding three-dimensional reconstruction algorithm. Because this technique separates the scanning, volume-reconstruction and visualization stages, it is non-real-time three-dimensional image reconstruction. In this mode the probe is usually translated, tilted and rotated by a micro-motor-driven mechanism to scan cross sections, and a series of two-dimensional ultrasound images of the examination area is acquired slowly and recorded in a computer. Because scanning is performed by a mechanical device that holds the probe and follows a preset scheme, the relative position and angle of each image are known; however, the scanning device is heavy, the scanning mode is inflexible, and three-dimensional reconstruction can only be performed after scanning, so the approach is not practical clinically.
At present, real-time three-dimensional reconstruction of two-dimensional ultrasound images is mainly achieved by adding a sensor to the ultrasound probe and scanning free-hand with the probe held in the hand. The sensor transmits the recorded real-time spatial information of the probe, such as its spatial position and scanning direction, back to a computer in real time; the computer combines the two-dimensional ultrasound images obtained in real time with the spatial information of the probe and then performs real-time three-dimensional reconstruction and visualization. Sensor positioning mainly relies on optical or electromagnetic tracking. For example, an optical locator can measure the three-dimensional spatial coordinates of the ultrasound probe, as in the three-dimensional ultrasound imaging method, storage medium and device based on a spatial positioning device (patent publication No. CN 113081033A); an electromagnetic tracking system can also be used, as in the hand-held unconstrained scanning wireless three-dimensional ultrasound real-time voxel imaging system (patent publication No. CN 111184535A), which uses an ultrasound probe carrying an electromagnetic sensor together with an electromagnetic transmitter that tracks the sensor's position and orientation. Acoustic positioning has also been used: three acoustic emitters are fixed on the ultrasound probe, a microphone array is fixed on the patient, and positioning information is obtained by measuring sound propagation times. More recently, the Kinect depth-camera sensor combined with an angle sensor has been used to localize the ultrasound probe, as in the three-dimensional ultrasound imaging method for reconstructing a two-dimensional ultrasound image set (patent publication No. CN 107582098A).
Different sensors impose different requirements on the usage scenario and therefore have their own drawbacks. The problem with optical tracking systems is that the markers mounted on the ultrasound probe are bulky, which makes scanning inconvenient, and the camera's line of sight must not be blocked. The problem with electromagnetic positioning is that nearby ferrous metal distorts the magnetic field and biases the position measurement. The problem with acoustic positioning is that the microphones must be placed above the patient so that the sound paths are unobstructed, the transmitters and microphones must be close enough to maintain a high signal-to-noise ratio, and the speed of sound varies with air humidity, which introduces error. The problem with the Kinect depth-camera device is that it is demanding on environment and illumination when used outdoors: strong natural light easily swamps the projected coded light so the method fails, and it is also affected by reflections from smooth surfaces. These methods therefore suffer from bulky equipment, wired constraints, and the inability to move and operate the ultrasound device truly freely.
In addition, a three-dimensional ultrasound reconstruction system based on an inertial navigation system (publication No. CN 114533111A) proposes using inertial navigation for ultrasound three-dimensional reconstruction. However, its position-calculation algorithm is too simple and the resulting position error is too large. Moreover, because the two-dimensional ultrasound images and the IMU sensor data are sampled discretely in time, gaps remain in three-dimensional space after the data are mapped into it; even if the sampling frequency is increased these gaps persist, making the reconstruction inaccurate. The method also has the drawback that the equipment is tethered by cables and cannot move and operate truly freely.
Disclosure of Invention
In order to solve the defects in the prior art, the invention provides a light-weight imaging ultrasonic system without a fixed scanning probe and an ultrasonic detection method.
Through three-dimensional image reconstruction, the invention can guide doctors to scan and read two-dimensional images without difficulty: it is free of the constraint of a scanning probe fixed to a mechanical arm, free of the limitation of abstract and hard-to-read two-dimensional images, and uses free-arm scanning to obtain the desired optimal two-dimensional images.
Specifically, the invention mounts a wireless inertial measurement unit on a wireless ultrasound probe, uses the inertial measurement unit and the wireless ultrasound probe to acquire ultrasound image data and spatial information, performs real-time three-dimensional reconstruction of the two-dimensional ultrasound images, and finally displays the two-dimensional ultrasound image and the reconstructed three-dimensional ultrasound image simultaneously and in real time on the screen of a mobile device. Because the two-dimensional ultrasound image and the corresponding three-dimensional ultrasound image are shown together on the mobile device, the three-dimensional image helps doctors understand the two-dimensional image more intuitively. This solves the problem that doctors find two-dimensional ultrasound images hard to interpret, lets them grasp tissue and organ structure through a real-time three-dimensional view, trains them to scan and read two-dimensional ultrasound images more efficiently, and shortens the long training time of sonographers.
Secondly, the invention scans the region of interest with the wireless ultrasound probe in a free scanning mode. The scanning mode of the imaging system is therefore flexible, light and compact, which overcomes the fixed or semi-fixed scanning modes and heavy scanning devices of traditional three-dimensional ultrasound imaging systems.
The invention provides a lightweight imaging ultrasound system with an unfixed scanning probe, comprising a wireless scanning probe, a terminal device and a three-dimensional imaging computing device. The wireless scanning probe wirelessly transmits the ultrasound image data and the probe's acceleration and angular-velocity data to the terminal device and the three-dimensional imaging computing device respectively; the terminal device provides the required two-dimensional ultrasound images to the three-dimensional imaging computing device, and the three-dimensional imaging computing device transmits the reconstructed three-dimensional ultrasound data back to the terminal device.
The wireless scanning probe comprises two modules: a wireless ultrasound probe module and a wireless inertial measurement unit module. Specifically, the invention obtains real-time spatial position information of the probe during ultrasound scanning by mounting a wireless inertial measurement unit on the wireless ultrasound probe. The scanning probe is a wireless ultrasound probe, which transmits wirelessly, is convenient to use, is not tethered by cables, and has low power consumption and long battery life. The wireless ultrasound probe module converts the encoded and decoded ultrasound echo signals into JPEG image data and transmits the image data to the terminal device through a USB module or a WIFI module.
The wireless inertial measurement unit is mounted on the wireless ultrasound probe and consists of a gyroscope and an accelerometer: the accelerometer detects the three-axis acceleration of the ultrasound probe and the gyroscope detects its angular velocity. The wireless inertial measurement unit module transmits the probe's three-dimensional attitude data to the three-dimensional imaging computing device through a wireless module. The advantage of an inertial measurement unit is that it can provide positioning data without any external aid; it places few demands on the environment, is unaffected by light or electromagnetic interference, and can even work under water.
The terminal device is a mobile display device running ultrasound imaging software and is used to visualize, simultaneously and in real time, the two-dimensional ultrasound images obtained by the ultrasound probe and the three-dimensional ultrasound images produced by the three-dimensional imaging computing device. The terminal device establishes a Socket connection with the three-dimensional imaging computing device and communicates over a mobile network: it transmits the two-dimensional ultrasound images to the three-dimensional imaging computing device in real time, the three-dimensional imaging computing device sends the reconstructed three-dimensional ultrasound image data back, and the terminal device displays the two-dimensional and three-dimensional ultrasound images synchronously in real time. The terminal device is a mobile device such as a mobile phone or a tablet, which is small, light and easy to operate. A minimal sketch of such a Socket link is given below.
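To illustrate this link, the following minimal sketch shows one possible way to frame two-dimensional ultrasound frames over a TCP Socket. The length-prefixed JPEG framing, the timestamp field and the helper names are assumptions made for the example; the patent does not specify the wire format.

```python
# Minimal sketch of the terminal-to-computer image link over a TCP socket.
# The length-prefixed JPEG framing, address and port are assumptions for
# illustration only; the patent does not specify the wire format.
import socket
import struct

def send_frame(sock: socket.socket, jpeg_bytes: bytes, timestamp_ms: int) -> None:
    """Send one two-dimensional ultrasound frame with its timestamp."""
    header = struct.pack("!QI", timestamp_ms, len(jpeg_bytes))  # 8-byte time, 4-byte length
    sock.sendall(header + jpeg_bytes)

def recv_frame(sock: socket.socket) -> tuple[int, bytes]:
    """Receive one frame on the three-dimensional imaging computing device."""
    header = _recv_exact(sock, 12)
    timestamp_ms, length = struct.unpack("!QI", header)
    return timestamp_ms, _recv_exact(sock, length)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    buf = bytearray()
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed while receiving")
        buf.extend(chunk)
    return bytes(buf)
```

In such a sketch the terminal side would open the connection with socket.create_connection((host, port)) before calling send_frame, and the computing device would accept connections with the usual bind/listen/accept sequence.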
The three-dimensional imaging computing device is a high-performance computer. The real-time angular velocity and acceleration of the ultrasound probe measured in three-dimensional space by the wireless inertial measurement unit are transmitted to the three-dimensional imaging computing device through a 433 MHz wireless transparent-transmission module; the three-dimensional imaging computing device then solves these data to obtain the attitude of the ultrasound probe, combines it with the two-dimensional ultrasound images obtained in real time from the terminal device to perform three-dimensional reconstruction, and at the same time transmits the reconstructed three-dimensional ultrasound image data to the terminal device.
To address the poor position-solving accuracy of existing methods and the gaps that appear when pixels are mapped to voxels after solving, the invention provides a complete position-solving algorithm and a three-dimensional reconstruction interpolation algorithm.
The principle by which the three-dimensional imaging computing device solves the attitude and position from the inertial measurement unit data is as follows:
Two coordinate systems are involved in the attitude and position solution: one is the IMU's (Inertial Measurement Unit's) own coordinate system b, and the other is the reference coordinate system R of the final reconstruction, so the information output by the IMU must be converted into the reference frame by solving.
The IMU comprises a triaxial accelerometer for measuring the change of triaxial acceleration during movement and a triaxial gyroscope for measuring the change of triaxial angular velocity.
(1) Three-axis attitude solution
The IMU outputs orientation information in the form of quaternions. The general form of a quaternion is a scalar part q0 (related to the rotation angle) followed by a vector part q = (q1, q2, q3) (related to the rotation axis):

$$Q = q_0 + q_1 i + q_2 j + q_3 k.$$

Like a complex number, a quaternion is composed of a real part plus three imaginary units i, j, k, which satisfy

$$i^2 = j^2 = k^2 = ijk = -1.$$

Every quaternion is a linear combination of 1, i, j and k. The units i, j and k can themselves be understood geometrically as rotations: the i rotation takes the positive Z-axis toward the positive Y-axis in the plane spanned by the Z- and Y-axes, the j rotation takes the positive X-axis toward the positive Z-axis in the plane spanned by the X- and Z-axes, and the k rotation takes the positive Y-axis toward the positive X-axis in the plane spanned by the Y- and X-axes.
As shown in fig. 1, given a rotation axis defined by a unit vector u = (u_x, u_y, u_z), the quaternion representing a rotation by angle θ about this axis u is

$$q = \cos\frac{\theta}{2} + u\,\sin\frac{\theta}{2} = \left(\cos\frac{\theta}{2},\ u_x\sin\frac{\theta}{2},\ u_y\sin\frac{\theta}{2},\ u_z\sin\frac{\theta}{2}\right).$$

The operation that rotates a vector v about the u-axis by the angle θ can therefore be expressed as a product of quaternions:

$$v' = q \otimes v \otimes q^{*},$$

where $q^{*} = q_0 - \mathbf{q}$ is the conjugate quaternion of q.
The attitude of the motion coordinate system b (the coordinate system of the ultrasound probe carrying the IMU) with respect to the reference coordinate system R (the coordinate system of the final three-dimensional reconstruction) is completely determined by the rotation axis u and the rotation angle θ. An attitude quaternion can therefore be constructed from the two parameters u and θ:

$$q_b^R = \cos\frac{\theta}{2} + u\,\sin\frac{\theta}{2}.$$

This formula states that the reference coordinate system R coincides with the motion coordinate system b after rotating about u by the angle θ, so the attitude of the motion coordinate system b can be transformed into an attitude in the reference frame R through the attitude quaternion; the coordinate transformation formula is

$$v^{R} = q_b^R \otimes v^{b} \otimes (q_b^R)^{*}.$$

Once the initial attitude is given, the attitude quaternion can be propagated recursively in real time, converting the rotational motion measured in the IMU's own coordinate system b into attitude information in the reference coordinate system R. The recursion of the attitude quaternion is

$$q_b^R(t) = q_b^R(t-1) \otimes \Delta q(t),$$

where t denotes the time step starting from 0 and Δq(t) is the incremental rotation built from the gyroscope output over one sampling interval. The attitude quaternion at the current moment is thus derived recursively from the attitude information at the previous moment, and the attitude of the IMU at every moment can be calculated.
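A compact numerical sketch of this attitude recursion is given below, assuming the gyroscope delivers body-frame angular velocity in rad/s at a fixed sampling period dt. The function names and the small-rotation update are illustrative only, not the patented implementation.

```python
# Sketch of the attitude-quaternion recursion from gyroscope output.
# Assumes body-frame angular velocity in rad/s at a fixed sample period dt;
# names and the incremental-rotation update are illustrative only.
import numpy as np

def quat_multiply(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Hamilton product of quaternions stored as (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def delta_quat(omega: np.ndarray, dt: float) -> np.ndarray:
    """Incremental rotation over one sample: angle |omega|*dt about axis omega/|omega|."""
    angle = np.linalg.norm(omega) * dt
    if angle < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])
    axis = omega / np.linalg.norm(omega)
    return np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))

def update_attitude(q: np.ndarray, omega: np.ndarray, dt: float) -> np.ndarray:
    """q(t) = q(t-1) ⊗ Δq(t), renormalised to keep the quaternion a unit quaternion."""
    q_new = quat_multiply(q, delta_quat(omega, dt))
    return q_new / np.linalg.norm(q_new)

def rotate_to_reference(q: np.ndarray, v_body: np.ndarray) -> np.ndarray:
    """Transform a body-frame vector into the reference frame: v^R = q ⊗ v^b ⊗ q*."""
    v_quat = np.concatenate(([0.0], v_body))
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_multiply(quat_multiply(q, v_quat), q_conj)[1:]
```

Renormalising after every update is a common design choice to stop numerical drift of the quaternion norm during long scans.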
(2) Solution of displacement
The triaxial accelerometer in the IMU gives the three-axis acceleration at the current moment, but the gravitational acceleration and the Coriolis acceleration caused by the earth's rotation must be compensated. Integrating the compensated acceleration gives the velocity: the velocity $v_k$ at time k follows from the velocity at time k−1, the integral of the acceleration, and an error-compensation term:

$$v_k = v_{k-1} + \int_{t_{k-1}}^{t_k} C_b^R(\tau)\, a(\tau)\, d\tau + \Delta v_{g/cor(k)},$$

where $\Delta v_{g/cor(k)}$ represents the effect of the local gravitational acceleration and the Coriolis acceleration at time k, which can be calculated from the local latitude, the earth's rotation rate and the output of the IMU gyroscope; $a(\tau)$ is the output of the IMU accelerometer; and $C_b^R$ is the attitude transformation matrix obtained from the attitude quaternion computed in step (1), which transforms the accelerometer output from the IMU's own coordinate system b into the reference coordinate system R. The subscript k denotes time.
(3) Positioning
The position $P_k$ at time k is obtained by integrating and accumulating the velocity:

$$P_k = P_{k-1} + \int_{t_{k-1}}^{t_k} v(\tau)\, d\tau.$$
(4) Complete flow
First, the initial attitude, position and velocity of the IMU are given; then the ultrasound probe carrying the IMU is moved, and attitude solving and displacement solving are propagated recursively in real time over time, so the current real-time position of the ultrasound probe is obtained and the positioning is completed. Unlike the simple integration of IMU data in existing methods, the attitude and position of the ultrasound probe are solved here by dynamic real-time recursion of the attitude quaternion, and the influence of the earth's gravity and rotation on the IMU output is taken into account, so the solved attitude and position are more accurate. A combined sketch of this dead-reckoning loop is given below.
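The dead-reckoning loop of steps (2)–(4) might be sketched as follows, reusing update_attitude and rotate_to_reference from the previous sketch as injected callables. Rectangular integration and a constant reference-frame gravity vector (with the Coriolis term omitted) are simplifying assumptions for illustration; the patent's full compensation also accounts for the earth's rotation.

```python
# Sketch of the velocity/position integration and the complete flow.
# Rectangular integration and a constant gravity vector are simplifying
# assumptions; the Coriolis compensation described in the text is omitted.
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # assumed reference-frame gravity (z-up)

def integrate_step(p, v, q, accel_body, dt, rotate_to_reference):
    """One strapdown step: v_k = v_{k-1} + (C_b^R a + g) dt, P_k = P_{k-1} + v_k dt."""
    accel_ref = rotate_to_reference(q, accel_body) + GRAVITY  # cancel gravity in the specific force
    v_new = v + accel_ref * dt
    p_new = p + v_new * dt
    return p_new, v_new

def dead_reckon(samples, dt, update_attitude, rotate_to_reference,
                q0=np.array([1.0, 0.0, 0.0, 0.0]),
                p0=np.zeros(3), v0=np.zeros(3)):
    """samples: iterable of (gyro [rad/s], accel [m/s^2]) pairs; returns the probe trajectory."""
    q, p, v = q0, p0.copy(), v0.copy()
    trajectory = []
    for omega, accel in samples:
        q = update_attitude(q, np.asarray(omega), dt)                      # step (1): attitude
        p, v = integrate_step(p, v, q, np.asarray(accel), dt, rotate_to_reference)  # steps (2)-(3)
        trajectory.append(p)
    return trajectory
```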
The invention also provides an ultrasonic detection method using the above imaging ultrasound system, comprising the following steps (an illustrative end-to-end sketch follows the list):
Step one, acquiring data by moving and rotating the wireless scanning probe; the data comprise the ultrasound data of the region to be examined and the acceleration and angular velocity measured by the inertial measurement unit.
Step two, transmitting the ultrasound data acquired by the wireless scanning probe to the terminal device.
Step three, the terminal device visualizes the two-dimensional ultrasound image from the ultrasound data and transmits the two-dimensional ultrasound image to the three-dimensional imaging computing platform.
Step four, the inertial measurement unit transmits the measured acceleration and angular velocity to the three-dimensional imaging computing platform.
Step five, the three-dimensional imaging computing platform performs three-dimensional reconstruction from the two-dimensional ultrasound images of step three and the acceleration and angular velocity of step four, combined with an interpolation algorithm. If the reconstruction is complete, the three-dimensional imaging computing platform transmits the three-dimensional ultrasound image data to the terminal device for visualization and the ultrasonic detection is finished; otherwise, the method returns to step one and repeats the subsequent steps.
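Purely for illustration, the five steps could be wired together as below. Every callable name here is a hypothetical placeholder for the components described above, not an API defined by the invention.

```python
# Hypothetical wiring of steps one to five across probe, terminal and compute platform.
# All callables are placeholders supplied by the caller; none are defined by the patent.
from typing import Callable, Iterable, Tuple
import numpy as np

def detection_loop(
    acquire: Callable[[], Tuple[np.ndarray, list]],        # step one: (2D frame, IMU samples)
    show_2d: Callable[[np.ndarray], None],                  # step three: terminal shows the 2D image
    send_to_compute: Callable[[np.ndarray, list], None],    # steps three/four: forward frame + IMU data
    reconstruct: Callable[[], np.ndarray],                   # step five: compute platform returns the volume
    show_3d: Callable[[np.ndarray], None],                   # terminal visualizes the reconstructed volume
    finished: Callable[[], bool],
) -> None:
    while not finished():
        frame, imu_samples = acquire()
        show_2d(frame)
        send_to_compute(frame, imu_samples)
        show_3d(reconstruct())
```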
The interpolation algorithm and visualization in the three-dimensional reconstruction process are described as follows:
(1) Interpolation algorithm
In step five, the three-dimensional computing device first synchronizes the two-dimensional ultrasound images with the IMU output using their timestamp information, then calculates the position of each two-dimensional ultrasound frame in three-dimensional space according to the solving process described above, and finally fills the empty voxels that received no mapped pixels using the interpolation algorithm to complete the three-dimensional reconstruction. A minimal sketch of the timestamp matching is given below.
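The timestamp matching could be sketched as follows; the assumption that both streams carry sorted millisecond timestamps is made only for the example.

```python
# Sketch of timestamp synchronization between 2D frames and IMU output.
# Assumes both streams carry millisecond timestamps and imu_ts is sorted ascending.
import bisect

def nearest_imu_sample(frame_ts: int, imu_ts: list[int], imu_samples: list):
    """Return the IMU sample whose timestamp is closest to frame_ts."""
    if not imu_ts:
        raise ValueError("no IMU samples available")
    i = bisect.bisect_left(imu_ts, frame_ts)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(imu_ts)]
    best = min(candidates, key=lambda j: abs(imu_ts[j] - frame_ts))
    return imu_samples[best]
```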
The method represents the reconstructed three-dimensional ultrasound data with voxels. A voxel is a cube of fixed size in three-dimensional space, the smallest unit commonly used to represent three-dimensional data, analogous to a pixel in a two-dimensional image. The two-dimensional ultrasound image is a gray-scale image in which every pixel has a definite gray value; based on the positioning information, the interpolation algorithm assigns the gray values of the pixels in the two-dimensional ultrasound images to the corresponding voxels according to a chosen strategy.
The interpolation algorithm of the three-dimensional reconstruction uses inverse-distance-squared weighting. The specific algorithm is:

$$I(V_T) = \frac{\sum_{i=1}^{n} W_i\, I(P_i)}{\sum_{i=1}^{n} W_i}, \qquad W_i = \frac{1}{d_i^{2}},$$

where $I(V_T)$ is the gray value of the target voxel, n is the number of pixels that fall within a predefined sphere region centred on $V_T$, $I(P_i)$ is the gray value of the i-th pixel in that region, and $W_i$ is the weight of the i-th pixel, inversely proportional to the square of its distance $d_i$ to the centre voxel $V_T$; the distance is obtained from the preceding position solution. By traversing every voxel, all voxels are eventually assigned a value. The algorithm is illustrated in fig. 2, where the three-dimensional space is projected onto a two-dimensional plane: the small squares represent voxels in three-dimensional space, the gray circular region represents the predefined sphere region, the straight lines crossing the voxels represent two-dimensional ultrasound images, and the black dots on the lines represent pixels.
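A direct, unoptimized sketch of this inverse-distance-squared interpolation for a single empty voxel is shown below; the NumPy data layout and the brute-force search over pixel positions are simplifications for clarity, not an optimized implementation of the invention.

```python
# Sketch of the inverse-distance-squared interpolation for one empty voxel.
# pixel_positions: (N, 3) reference-frame coordinates of mapped pixels;
# pixel_values: (N,) gray values; the brute-force search is a simplification.
import numpy as np

def interpolate_voxel(voxel_center: np.ndarray,
                      pixel_positions: np.ndarray,
                      pixel_values: np.ndarray,
                      radius: float):
    """Gray value I(V_T) from pixels inside the sphere of given radius, weighted by 1/d^2."""
    d2 = np.sum((pixel_positions - voxel_center) ** 2, axis=1)
    inside = d2 <= radius ** 2
    if not np.any(inside):
        return None                      # no pixel falls in the predefined sphere region
    d2 = np.maximum(d2[inside], 1e-12)   # avoid division by zero for coincident pixels
    weights = 1.0 / d2
    return float(np.sum(weights * pixel_values[inside]) / np.sum(weights))
```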
(2) Visualization of
The three-dimensional computing device communicates with the terminal device through a Socket over a mobile network and transmits the reconstructed three-dimensional voxel data to the terminal device in real time. The terminal device stores the incoming data in a buffer queue, performs volume rendering with a ray-casting algorithm, renders the three-dimensional voxel data in real time, and displays it together with the two-dimensional ultrasound image on the screen in picture-in-picture form. A sketch of such a buffer queue is given below.
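The terminal-side buffering might be sketched as a bounded queue feeding the renderer, as below. The queue size, the drop-oldest policy and the placeholder ray_cast callable are assumptions for the example; the actual ray-casting renderer is not shown.

```python
# Sketch of the terminal-side buffer queue for incoming 3D voxel blocks.
# The bounded deque and drop-oldest policy are assumptions; the ray-casting
# renderer itself is only represented by a placeholder callable.
from collections import deque
from typing import Callable, Optional
import numpy as np

class VoxelBuffer:
    def __init__(self, max_blocks: int = 8):
        self._queue: deque = deque(maxlen=max_blocks)  # oldest block dropped when full

    def push(self, voxel_block: np.ndarray) -> None:
        """Called by the socket receiver for each reconstructed voxel block."""
        self._queue.append(voxel_block)

    def render_latest(self, ray_cast: Callable[[np.ndarray], np.ndarray]) -> Optional[np.ndarray]:
        """Called by the display loop; ray_cast stands in for the volume renderer."""
        if not self._queue:
            return None
        return ray_cast(self._queue[-1])
```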
The beneficial effects of the invention include the following. The invention provides an unfixed, highly flexible wireless scanning probe formed by mounting a wireless inertial measurement unit on a wireless ultrasound probe. The wireless ultrasound probe integrates the host and the probe in a lightweight design; it is compact, easy to hold, free of cable constraints, and transmits images in real time. Positioning with a wireless inertial measurement unit means that measurement data are obtained purely from the internal sensors without external aid; in addition, the inertial measurement unit places few demands on the environment, is unaffected by light or electromagnetic interference, and can even work under water. The two-dimensional ultrasound image and the reconstructed three-dimensional ultrasound image are displayed in real time on a portable mobile device (such as a smartphone or tablet), which is small and easy to carry. The invention performs three-dimensional reconstruction of the ultrasound images in real time during scanning, overcoming the low efficiency and long duration of existing three-dimensional reconstruction processes. Because the wireless ultrasound probe integrates the host, the probe itself encodes and decodes the ultrasound echo signals directly into JPEG image data, so the device is much smaller than conventional ultrasound equipment; the probe's built-in WIFI module transmits the image data wirelessly to the terminal device, and the sensor used to obtain the probe's three-dimensional attitude is an inertial measurement unit that wirelessly transmits the probe's real-time acceleration and angular velocity to the three-dimensional computing platform. The invention greatly improves the efficiency of three-dimensional reconstruction of ultrasound images, helps doctors use the real-time three-dimensional view to understand the two-dimensional images, and eases the difficulty, when training sonographers, of scanning correctly and interpreting two-dimensional ultrasound images.
Drawings
FIG. 1 is a schematic diagram of the rotation-axis transformation used in the attitude solution of the present invention.
Fig. 2 is a schematic diagram of an interpolation algorithm used in the present invention.
Fig. 3 is a system block diagram of the imaging ultrasound system with an unfixed scanning probe of the present invention.
Fig. 4 is a block diagram of an ultrasonic probe and an inertial measurement unit used in the present invention.
Fig. 5 is an internal structural view of a wireless scanning probe with an inertial measurement unit of the present invention.
FIG. 6 is a flow chart of an ultrasonic detection method of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the following specific examples and drawings. Except where specifically noted below, the procedures, conditions and experimental methods for carrying out the invention are common knowledge in the art, and the invention is not particularly limited in this respect.
The invention provides a lightweight imaging ultrasound system with an unfixed scanning probe and an ultrasonic detection method. The wireless ultrasound probe processes the raw ultrasound data and converts them into encoded image data, which are transmitted to the terminal device for display; the terminal device forwards the obtained image data to the three-dimensional computing platform. The wireless inertial measurement unit in the probe transmits the probe attitude data obtained in real time to the three-dimensional computing platform, which uses all the received data to reconstruct the three-dimensional ultrasound image and finally sends the reconstructed three-dimensional data back to the terminal device; the terminal device can then display the two-dimensional and three-dimensional ultrasound images simultaneously.
The system structure of the invention is shown in fig. 3 and consists of the following parts: a wireless scanning probe, a terminal device and a three-dimensional imaging computing device. The external and internal structures of the wireless scanning probe are shown in figs. 4 and 5; it consists of two main parts, a wireless ultrasound probe and a wireless inertial measurement unit. A 433 MHz wireless transparent-transmission module is integrated in the wireless inertial measurement unit, and the wireless inertial measurement unit and the wireless ultrasound probe work together, powered by the battery in the wireless ultrasound probe.
The specific flow of the ultrasonic detection method of the present invention is shown in fig. 6, and is specifically described as follows:
step 1, scanning the interested human organ tissue by using a wireless ultrasonic probe provided with a sensor to collect ultrasonic signals. The scanning mode is to hold the ultrasonic probe, and the probe can be freely rotated and translated to repeatedly scan the interested part.
And 2, acquiring spatial information through a sensor arranged outside the wireless ultrasonic probe, wherein the selected sensor is an inertial measurement unit, the inertial measurement unit can transmit acceleration and angular velocity information of the wireless ultrasonic probe in the moving and rotating processes to the three-dimensional imaging computing equipment, and the data transmission mode is that the data are transmitted through a 433 wireless transparent transmission module.
And 3, transmitting the collected ultrasonic signals in the scanning process to the terminal equipment in real time by the wireless ultrasonic probe, wherein the transmission mode of the ultrasonic data is to transmit the ultrasonic signals through WIFI.
And 4, reconstructing an ultrasonic signal obtained from the ultrasonic probe into a two-dimensional ultrasonic image by the terminal equipment (a mobile phone or a tablet) in real time and visualizing the two-dimensional ultrasonic image at the terminal equipment.
And 5, transmitting the two-dimensional ultrasonic image to the three-dimensional imaging computing equipment through the mobile network in real time by the terminal equipment.
And 6, the three-dimensional imaging computing equipment calculates the acceleration and angular velocity information obtained by the inertial measurement unit to obtain the coordinate position of the ultrasonic probe in the three-dimensional space, and then reconstructs the three-dimensional ultrasonic image in real time by combining the two-dimensional ultrasonic image obtained in real time from the terminal equipment with the space information obtained by settlement.
And 7, transmitting the reconstructed three-dimensional ultrasonic image data to the terminal equipment when the three-dimensional imaging computing equipment passes through a network, and simultaneously visualizing the real-time two-dimensional ultrasonic image and the three-dimensional ultrasonic image by the terminal equipment to form contrast.
In this example, a wireless ultrasound probe is used in step 1. As shown in fig. 5, two large-scale integrated circuits A are controlled by one large-scale integrated circuit B to transmit high-voltage pulses to the ultrasound probe (128 array elements at the bottom of the probe) and to receive the echoes. The two large-scale integrated circuits A transmit the high-voltage driving pulses to the ultrasound probe and receive the acoustic signals reflected back through the probe; each circuit A drives half of the array elements and receives their reflected signals, and the switching between transmitting and receiving is controlled by the large-scale integrated circuit B, which provides at least 64 physical channels for analog front-end amplification and analog-to-digital conversion. The digital signal is converted into a brightness signal, encoded into JPEG by an internal codec, transferred to the WIFI module over a synchronous serial bus, and finally transmitted wirelessly by the WIFI module to the terminal device, which realizes and guarantees real-time wireless transmission. The system also supports wired transmission: if WIFI is not used, the large-scale integrated circuit B can transmit the JPEG data to the terminal device in real time over USB.
In this example, the scanning mode of the wireless ultrasound probe in step 1 is an unfixed, free scanning mode that combines translation and rotation of the probe. Translating the wireless ultrasound probe detects longitudinal sections at different positions in the target's three-dimensional space, but translation alone cannot further increase the detected information; rotating the ultrasound probe detects the target's three-dimensional space from different angles. Combining translation and rotation covers the three-dimensional imaging region.
In this example, the sensor used in step 2 to detect the spatial position of the wireless ultrasound probe is an inertial measurement unit whose main components are a gyroscope and an accelerometer: the accelerometer detects the three-axis acceleration of the wireless ultrasound probe and the gyroscope detects its angular velocity. The inertial measurement unit can provide positioning data without any external aid, places few demands on the environment, is unaffected by light or electromagnetic interference, can work under water, and therefore suits more application scenarios. As shown in fig. 5, the inertial measurement unit is connected to the 433 MHz wireless transparent-transmission module by an asynchronous serial bus; it transmits the measured data to the module in real time over this bus, and the module forwards the data to the three-dimensional imaging computing device over the 433 MHz radio band. A hedged sketch of reading such packets is given below.
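Reading the packets forwarded by the 433 MHz transparent-transmission module could look like the pyserial sketch below. The serial device name, baud rate and packet layout (a 0xAA header byte followed by six little-endian float32 values: ax, ay, az, gx, gy, gz) are purely assumptions for illustration; the module's real frame format is not specified here.

```python
# Sketch of reading IMU packets from the 433 MHz transparent-transmission module.
# Device name, baud rate and packet layout are assumptions for illustration only.
import struct
import serial  # pyserial

PACKET_FMT = "<6f"                       # assumed payload: ax, ay, az, gx, gy, gz
PACKET_LEN = struct.calcsize(PACKET_FMT)  # 24 bytes

def read_imu_packets(port: str = "/dev/ttyUSB0", baud: int = 115200):
    with serial.Serial(port, baud, timeout=1.0) as link:
        while True:
            if link.read(1) != b"\xaa":   # resynchronize on the assumed header byte
                continue
            payload = link.read(PACKET_LEN)
            if len(payload) != PACKET_LEN:
                continue
            ax, ay, az, gx, gy, gz = struct.unpack(PACKET_FMT, payload)
            yield (ax, ay, az), (gx, gy, gz)  # acceleration [m/s^2], angular velocity [rad/s]
```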
In this example, the three-dimensional imaging computing device in step 2 is a high-performance computer that receives the two-dimensional ultrasound image data from the terminal device and the data from the inertial measurement unit, performs three-dimensional reconstruction on the received data with the three-dimensional reconstruction algorithm, and finally transmits the result to the terminal device over the mobile network.
In this example, the terminal device in step 3 is a mobile device such as a mobile phone or a tablet computer; the wireless ultrasound probe transmits the acquired ultrasound data to the terminal device through a wireless transmission protocol, and the terminal device visualizes the two-dimensional and three-dimensional ultrasound images.
The protection scope of the present invention is not limited to the above embodiments. Variations and advantages that would occur to those skilled in the art without departing from the spirit and scope of the inventive concept are included in the invention, and the scope of protection is defined by the appended claims.

Claims (8)

1. A lightweight imaging ultrasound system with an unfixed scanning probe, the imaging ultrasound system comprising a wireless scanning probe, a terminal device, and a three-dimensional imaging computing device; wherein,
the wireless scanning probe transmits ultrasonic image data and acceleration and angular velocity data of the probe to the terminal equipment and the three-dimensional imaging computing equipment in a wireless mode;
the terminal equipment provides a required two-dimensional ultrasonic image for the three-dimensional imaging computing equipment;
the three-dimensional imaging computing device transmits the reconstructed three-dimensional ultrasonic data to the terminal device.
2. The non-stationary scanning probe lightweight imaging ultrasound system of claim 1, wherein said wireless scanning probe comprises a wireless ultrasound probe module and a wireless inertial measurement unit module; wherein,
acquiring real-time spatial position information of the probe in the ultrasonic scanning process by installing a wireless inertial measurement unit module on a wireless ultrasonic probe module;
the wireless ultrasonic probe module scans the ultrasonic echo signals to obtain image data, and transmits the image data to the terminal equipment;
the wireless inertial measurement unit module consists of a gyroscope and an accelerometer, the accelerometer is used for detecting triaxial acceleration signals of the wireless ultrasonic probe, and the gyroscope is used for detecting angular velocity signals;
the wireless inertial measurement unit module transmits the three-dimensional attitude data of the wireless ultrasonic probe to the three-dimensional imaging computing equipment through the wireless module.
3. The non-stationary scanning probe lightweight imaging ultrasound system of claim 1, wherein the terminal device is a mobile display device equipped with ultrasound imaging software for visualizing simultaneously and in real time two-dimensional ultrasound images obtained by the ultrasound probe and three-dimensional ultrasound images obtained by a three-dimensional imaging computing device;
the terminal equipment establishes connection between the two-dimensional ultrasonic image and the three-dimensional imaging computing equipment through a Socket and uses a mobile network to communicate, the two-dimensional ultrasonic image is transmitted to the three-dimensional imaging computing equipment in real time, the three-dimensional imaging computing equipment transmits the three-dimensional ultrasonic image data after reconstruction back to the terminal equipment, and the terminal equipment synchronously displays the two-dimensional ultrasonic image and the three-dimensional ultrasonic image in real time.
4. The non-stationary scanning probe lightweight imaging ultrasound system of claim 1, wherein the data of the ultrasound probe in three-dimensional space obtained by the wireless scanning probe are transmitted to the three-dimensional imaging computing device through a wireless transmission module; the three-dimensional imaging computing device then solves these data to obtain the attitude of the wireless scanning probe, performs three-dimensional reconstruction by combining it with the two-dimensional ultrasound images obtained in real time from the terminal device, and simultaneously transmits the reconstructed three-dimensional ultrasound image data to the terminal device.
5. The non-stationary scanning probe lightweight imaging ultrasound system of claim 1, wherein the attitude and position solving method performed by the three-dimensional imaging computing device is:
in the process of solving the attitude and position, two coordinate systems are involved: one is the IMU's own coordinate system b and the other is the reference coordinate system R of the final reconstruction; the information output by the IMU is converted into the reference frame by solving:
the IMU comprises a triaxial accelerometer and a triaxial gyroscope, wherein the accelerometer is used for measuring the change of triaxial acceleration in the motion process, and the gyroscope is used for measuring the change of triaxial angular velocity;
(1) Three-axis attitude solution
The IMU outputs orientation information in the form of quaternions; the general form of a quaternion is a rotation-angle scalar q0 followed by a rotation-axis vector q = (q1, q2, q3):

$$Q = q_0 + q_1 i + q_2 j + q_3 k,$$

wherein the imaginary units i, j, k satisfy

$$i^2 = j^2 = k^2 = ijk = -1;$$

each quaternion is a linear combination of 1, i, j and k; the geometric meaning of i, j and k is understood as rotations, wherein the i rotation takes the positive Z-axis toward the positive Y-axis in the plane spanned by the Z- and Y-axes, the j rotation takes the positive X-axis toward the positive Z-axis in the plane spanned by the X- and Z-axes, and the k rotation takes the positive Y-axis toward the positive X-axis in the plane spanned by the Y- and X-axes;
given a rotation axis defined by a unit vector u = (u_x, u_y, u_z), the quaternion representing a rotation by angle θ about this axis u is

$$q = \cos\frac{\theta}{2} + u\,\sin\frac{\theta}{2};$$

converting a vector v into its rotation about the u-axis by the angle θ is expressed by a product of quaternions:

$$v' = q \otimes v \otimes q^{*},$$

wherein $q^{*} = q_0 - \mathbf{q}$ is the conjugate quaternion of q;
the attitude of the coordinate system b relative to the reference frame R is determined by the two parameters rotation axis u and rotation angle θ; an attitude quaternion is constructed from the parameters u and θ:

$$q_b^R = \cos\frac{\theta}{2} + u\,\sin\frac{\theta}{2};$$

this formula states that the reference coordinate system R coincides with the motion coordinate system b after rotating about u by the angle θ, and the attitude of the coordinate system b is transformed into an attitude in the reference frame R through the attitude quaternion; the coordinate transformation formula is

$$v^{R} = q_b^R \otimes v^{b} \otimes (q_b^R)^{*};$$

after the initial attitude information is given, the attitude quaternion is propagated recursively in real time to convert the rotational attitude information of the IMU's own coordinate system b into rotational attitude information in the reference frame R; the recursion formula of the attitude quaternion is

$$q_b^R(t) = q_b^R(t-1) \otimes \Delta q(t),$$

wherein t denotes the time step starting from 0 and Δq(t) is the incremental rotation built from the gyroscope output over one sampling interval; the attitude quaternion at the current moment is derived recursively from the attitude information at the previous moment, and the attitude information of the IMU at every moment is further calculated;
(2) Solution of displacement
the triaxial accelerometer in the IMU gives the three-axis acceleration at the current moment, compensated for the earth's gravitational acceleration and the Coriolis acceleration caused by the earth's rotation; the acceleration is integrated to obtain the velocity, namely the velocity $v_k$ at time k follows from the velocity at time k−1, the integral of the acceleration and an error-compensation term:

$$v_k = v_{k-1} + \int_{t_{k-1}}^{t_k} C_b^R(\tau)\, a(\tau)\, d\tau + \Delta v_{g/cor(k)},$$

wherein $\Delta v_{g/cor(k)}$ denotes the influence of the local gravitational acceleration and the Coriolis acceleration at time k, calculated using the local latitude, the earth's rotation rate and the output of the IMU gyroscope; $a(\tau)$ is the output of the IMU accelerometer; $C_b^R$ is the attitude transformation matrix obtained from the attitude quaternion calculated in the first step, which transforms the acceleration output of the IMU from the coordinate system b to the reference frame R; and the subscript k denotes time;
(3) Positioning
the velocities are integrated and accumulated to obtain the position $P_k$ at time k:

$$P_k = P_{k-1} + \int_{t_{k-1}}^{t_k} v(\tau)\, d\tau.$$
6. An ultrasonic detection method using the lightweight imaging ultrasound system with an unfixed scanning probe, comprising the following steps:
step one, acquiring data by moving and rotating the wireless scanning probe, the data comprising the ultrasound data of the region to be examined and the acceleration and angular velocity measured by the inertial measurement unit;
step two, transmitting the ultrasonic data acquired by the wireless scanning probe to terminal equipment;
step three, the terminal equipment performs visualization of the two-dimensional ultrasonic image according to the ultrasonic data, and transmits the two-dimensional ultrasonic image to a three-dimensional imaging computing platform;
transmitting the acceleration and angular velocity information obtained by measurement to a three-dimensional imaging calculation platform by an inertia measurement unit;
step five, the three-dimensional imaging computing platform performs three-dimensional reconstruction according to the two-dimensional ultrasonic image obtained in the step three and the acceleration and angular velocity information obtained in the step four in combination with an interpolation algorithm; if the reconstruction is completed, the three-dimensional imaging computing platform transmits the three-dimensional ultrasonic image data to the terminal equipment for visualization, and the ultrasonic detection is completed; if the reconstruction is not completed, the method returns to the first step to carry out the subsequent steps again.
7. The ultrasonic testing method of claim 6, wherein the interpolation algorithm in the three-dimensional reconstruction process comprises:
in the fifth step, the three-dimensional computing device synchronizes the two-dimensional ultrasonic image with the information output by the IMU according to the timestamp information, then calculates the position of each frame of two-dimensional ultrasonic image in the three-dimensional space according to the resolving process, and then completes the final three-dimensional reconstruction by using an interpolation algorithm on the empty voxels which are not mapped in the three-dimensional space;
representing the reconstructed three-dimensional ultrasound data by voxels; the two-dimensional ultrasound image is a gray-scale image, and the interpolation algorithm assigns the gray values of pixels in the two-dimensional ultrasound image to the corresponding voxels according to the positioning information and a chosen strategy;
the interpolation algorithm completes interpolation with inverse-distance-squared weighting, described as:

$$I(V_T) = \frac{\sum_{i=1}^{n} W_i\, I(P_i)}{\sum_{i=1}^{n} W_i}, \qquad W_i = \frac{1}{d_i^{2}},$$

wherein $I(V_T)$ is the gray value of the target voxel, n is the number of pixels falling within a predefined sphere region centred on $V_T$, $I(P_i)$ denotes the gray value of the i-th pixel in the region, $W_i$ is the weight of the i-th pixel and is inversely proportional to the square of its distance $d_i$ to the centre voxel $V_T$, the distance being obtained by the previous position solution; all voxels are finally assigned values by traversing each voxel; when the three-dimensional space is mapped onto a two-dimensional plane, the small squares represent voxels in the three-dimensional space, the gray circular region represents the predefined sphere region, the straight lines passing through the voxels represent two-dimensional ultrasound images, and the black dots on the lines represent pixels.
8. The ultrasonic testing method of claim 6, wherein visualizing in the three-dimensional reconstruction process comprises:
the three-dimensional computing device communicates with the terminal device through a Socket over a mobile network and transmits the reconstructed three-dimensional voxel data to the terminal device in real time; the terminal device stores the data in a buffer queue, performs volume rendering with a ray-casting algorithm, finally renders the three-dimensional voxel data in real time, and displays it together with the two-dimensional ultrasound image on the screen in picture-in-picture form.
CN202310017632.8A 2023-01-06 2023-01-06 Light-weight imaging ultrasonic system without fixed scanning probe and ultrasonic detection method Pending CN116058867A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310017632.8A CN116058867A (en) 2023-01-06 2023-01-06 Light-weight imaging ultrasonic system without fixed scanning probe and ultrasonic detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310017632.8A CN116058867A (en) 2023-01-06 2023-01-06 Light-weight imaging ultrasonic system without fixed scanning probe and ultrasonic detection method

Publications (1)

Publication Number Publication Date
CN116058867A true CN116058867A (en) 2023-05-05

Family

ID=86179691

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310017632.8A Pending CN116058867A (en) 2023-01-06 2023-01-06 Light-weight imaging ultrasonic system without fixed scanning probe and ultrasonic detection method

Country Status (1)

Country Link
CN (1) CN116058867A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117045281A (en) * 2023-10-12 2023-11-14 深圳华声医疗技术股份有限公司 Ultrasound imaging system, control method, imaging controller, and storage medium
CN117045281B (en) * 2023-10-12 2024-01-26 深圳华声医疗技术股份有限公司 Ultrasound imaging system, control method, imaging controller, and storage medium
CN117481695A (en) * 2023-12-28 2024-02-02 江苏霆升科技有限公司 Intracardiac ultrasonic catheter three-dimensional modeling method and device based on IMU registration

Similar Documents

Publication Publication Date Title
CN116058867A (en) Light-weight imaging ultrasonic system without fixed scanning probe and ultrasonic detection method
CN104271046B (en) For tracking the method and system with guiding sensor and instrument
KR101468419B1 (en) Medical system and method for providing measurement information using three-dimensional calliper
CN100450445C (en) Real-time, freedom-arm, three-D ultrasonic imaging system and method therewith
CN1872001B (en) Systems, methods and apparatus for dual mammography image detection
US20090306509A1 (en) Free-hand three-dimensional ultrasound diagnostic imaging with position and angle determination sensors
CN101371786B (en) Method and system of X ray image three-dimensional reconstruction
CN100536789C (en) Mechanical scanning realtime three-dimension ultrasonic imaging system and method
CN102119865B (en) Ultrasonic diagnosis apparatus, medical image processing apparatus, and medical image diagnosis apparatus
CN113689577B (en) Method, system, equipment and medium for matching virtual three-dimensional model with entity model
CN109223030B (en) Handheld three-dimensional ultrasonic imaging system and method
Huang et al. Linear tracking for 3-D medical ultrasound imaging
CN101053517A (en) Method and system for tracking internal mini device
US20140187950A1 (en) Ultrasound imaging system and method
US20180322628A1 (en) Methods and system for shading a two-dimensional ultrasound image
CN115668389A (en) Acquisition system of ultrasonic image of human organ
CN101681516A (en) Systems and methods for labeling 3-d volume images on a 2-d display of an ultrasonic imaging system
WO2001028426A1 (en) 3-dimensional ultrasonic imaging
Goldsmith et al. An inertial-optical tracking system for portable, quantitative, 3D ultrasound
CN114429458A (en) Endoscope image processing method and device, readable medium and electronic equipment
CN203328720U (en) Non-contact three-dimensional ultrasound imaging system based on computer vision technology
CN116312122A (en) Virtual simulation system and method for medical ultrasonic operation training
CN111184535B (en) Handheld unconstrained scanning wireless three-dimensional ultrasonic real-time voxel imaging system
CN114581599A (en) Method for rapidly acquiring system by ultrasonic three-dimensional structure independent of external positioning equipment
Siang et al. A framework of position tracked freehand 3d ultrasound reconstruction using game controller and pixel nearest neighbour method for marching cubes volume visualization

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination