CN118161194A - Three-dimensional scanning imaging system and method for handheld probe - Google Patents
- Publication number: CN118161194A
- Application number: CN202410586839.1
- Authority
- CN
- China
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- A61B 8/483 — Diagnostic techniques involving the acquisition of a 3D volume of data (diagnosis using ultrasonic, sonic or infrasonic waves)
- A61B 5/0095 — Detecting, measuring or recording by applying light and detecting acoustic waves, i.e. photoacoustic measurements
- A61B 8/5261 — Devices using data or image processing for combining image data of a patient from different diagnostic modalities, e.g. ultrasound and X-ray
Abstract
The invention belongs to the technical field of biomedical imaging and discloses a three-dimensional scanning imaging system and method for a handheld probe. The invention exploits how the frame rates of different imaging modalities match those of different three-dimensional tracking systems: the pose information of the handheld detection module acquired by the binocular optical tracking module is matched with the photoacoustic images, the pose information acquired by the inertial measurement unit is matched with the ultrasonic images, and the hardware timing control module synchronizes the timing of all modules, thereby realizing photoacoustic-ultrasonic multimodal handheld-probe scanning imaging with high global accuracy and a high instantaneous frame rate. Combining the two positioning modes achieves mutual calibration of the three-dimensional coordinates, balancing global accuracy with instantaneous accuracy.
Description
Technical Field
The invention belongs to the technical field of biomedical imaging, and particularly relates to a three-dimensional scanning imaging system and method of a handheld probe.
Background
Three-dimensional reconstruction of ultrasound images visualizes a three-dimensional region of interest by reconstructing a series of two-dimensional ultrasound and optical images, which can be acquired by various scanning techniques such as mechanical scanning or handheld-probe scanning. Among these, handheld-probe scanning uses a freely moving ultrasonic probe and generates ultrasound images with the greatest flexibility, making it the scanning method best suited to clinicians' operating habits. Dedicated three-dimensional probes, by contrast, are costly, poorly adaptable to diverse scenarios, and infrequently used. Photoacoustic tomography is an emerging medical imaging technique developed in recent years. In terms of imaging information, photoacoustic imaging excels at providing physiological information directly relevant to tumor screening and diagnosis, such as the structure and distribution of hyperplastic blood vessels in biological tissue and the distribution of local hypoxia. Since the ultrasound transducer array and data acquisition card used by the photoacoustic modality can also be used for ultrasound imaging, the photoacoustic modality is easily fused with the ultrasound modality, providing both the ultrasound and photoacoustic information required for diagnosing body-surface lesions. Scanning with a two-dimensional ultrasonic probe and fusing photoacoustic tomography therefore offers a more economical route to three-dimensional ultrasound reconstruction.
Handheld three-dimensional scanning is realized mainly by tracking the motion trajectory of the handheld probe. Common probe positioning and tracking technologies include the inertial measurement unit and the binocular optical tracking system. The inertial measurement unit relies on a sensor built into the probe; it achieves six-degree-of-freedom pose measurement at a high frame rate with sensitive instantaneous perception, and localizes the probe's spatial position through integration, but because integration accumulates error over long measurements its global accuracy is low. The binocular optical tracking system has higher global accuracy, but, limited by factors such as exposure time, brightness, and computational efficiency, its acquisition frame rate is lower than that of the inertial measurement unit and the ultrasonic acquisition frame rate, causing loss of tracking and three-dimensional scanning accuracy when the handheld probe moves rapidly or rotates through large angles.
Disclosure of Invention
The invention aims to provide a three-dimensional scanning imaging system and method for a handheld probe so as to solve the technical problems.
The invention relates to a three-dimensional scanning imaging system and a three-dimensional scanning imaging method for a handheld probe, which concretely comprises the following technical scheme:
A three-dimensional scanning imaging system of a handheld probe comprises a hardware time sequence control module, a binocular optical tracking module, an inertial measurement unit, a handheld detection module and a data acquisition module;
the binocular optical tracking module is used for capturing the image data of the handheld detection module in real time and uploading the image data to the data acquisition module;
The inertial measurement unit moves synchronously with the handheld detection module; it is used for capturing, in real time, the raw measurement data generated during movement of the handheld detection module relative to its initial position, and uploading them to the data acquisition module;
the handheld detection module is used for realizing the emission of ultrasonic waves, the emission of pulse light, the detection of ultrasonic signals and the detection of photoacoustic signals;
The hardware timing control module is used for controlling the timing of the binocular optical tracking module, the inertial measurement unit, the handheld detection module and the data acquisition module, and for aligning the relative pose of the handheld detection module detected by the inertial measurement unit to the absolute pose of the handheld detection module detected by the binocular optical tracking module, so that all pose data of the handheld detection module are unified under the same coordinate system;
The data acquisition module is used for acquiring the image data captured by the binocular optical tracking module to obtain the first absolute pose of the handheld detection module; it simultaneously acquires the raw measurement data of the handheld detection module captured by the inertial measurement unit, converts them into six-degree-of-freedom relative pose data, and records them as the second relative pose of the handheld detection module.
Further, the binocular optical tracking module comprises a binocular camera and a plurality of positioning mark points, the binocular camera is fixed near the point to be detected, the plurality of positioning mark points are fixed on the surface of the handheld detection module, and the binocular camera shoots the positioning mark points on the surface of the handheld detection module, acquires image data of the handheld detection module and uploads the image data to the data acquisition module.
Further, the system also comprises an image processing module, wherein the image processing module is used for aligning the data frame numbers of the first absolute pose and the second relative pose, matching the first absolute pose with the corresponding photoacoustic signals, and matching the second relative pose with the corresponding ultrasonic signals.
Further, the system also comprises a pattern projection module mounted between the two cameras of the binocular camera and used for recording the relative positions of the body to be detected and the handheld detection module; the pattern projection module projects patterns onto the surfaces of the body to be detected and the handheld detection module, and binocular matching is performed on the projected pattern features by the binocular optical tracking module.
The invention also discloses a three-dimensional scanning method of the three-dimensional scanning imaging system of the handheld probe, which comprises the following steps:
S1: an operator moves a detection window of the handheld detection module along the surface of the part to be detected, performs ultrasonic and photoacoustic bimodal scanning, captures pose data of the handheld detection module in real time by the binocular optical tracking module and the inertial measurement unit, and uploads the pose data to the data acquisition module;
S2: the hardware timing control module controls the timing of the binocular optical tracking module, the inertial measurement unit, the handheld detection module and the data acquisition module, and unifies all pose data of the handheld detection module under the same coordinate system;
S3: the data acquisition module acquires the image data captured by the binocular optical tracking module, extracts the positioning-marker information in the images, and reconstructs the three-dimensional coordinates of the markers based on the binocular triangulation principle to obtain the first absolute pose of the handheld detection module; it acquires the raw measurement data of the handheld detection module captured by the inertial measurement unit, converts them into six-degree-of-freedom relative pose data, and records them as the second relative pose; finally, the first absolute pose and the second relative pose are transmitted to the image processing module;
S4: the image processing module aligns the data frame numbers of the first absolute pose and the second relative pose, matches the first absolute pose and the corresponding photoacoustic signal, and matches the second relative pose and the corresponding ultrasound signal.
Further, the step S1 includes the following specific steps:
s1.1: the handheld detection module emits ultrasonic waves and pulse light, detects ultrasonic signals and photoacoustic signals, and transmits the detected ultrasonic signals and photoacoustic signals to the data acquisition module;
S1.2: simultaneously with the step S1.1, the binocular optical tracking module captures the image data of the handheld detection module in real time by shooting the positioning mark points and uploads the image data to the data acquisition module;
S1.3: simultaneously with the step S1.1, the pattern projection module records the relative positions of the to-be-detected body and the handheld detection module, projects patterns on the surfaces of the to-be-detected body and the handheld detection module, and performs binocular matching based on the projected pattern features through the binocular optical tracking module;
S1.4: simultaneously with step S1.1, the inertial measurement unit captures in real time raw measurement data of the handheld detection module relative to its own initial position, which occurs during movement, and uploads the raw measurement data to the data acquisition module.
Further, the acquisition frame rate of the binocular optical tracking module covers the acquisition frame rate of the photoacoustic signals; the frame rate at which the inertial measurement unit acquires the pose transformation of the handheld detection module in real time is kept consistent with the ultrasonic signal acquisition frame rate of the ultrasonic imaging modality, and when the ultrasonic signal acquisition frame rate exceeds that of the inertial measurement unit, the rotation at the corresponding time is calculated by quaternion spherical linear interpolation and the translation at the corresponding time by linear interpolation.
Further, the S4 specifically includes:
The image processing module converts the 2D image data into 3D image data: each real-time 2D image frame is interpolated and fused into the three-dimensional voxel field by a voxel-based reconstruction method, specifically applying the following formula:

$$X_{3D} = T_{C\to V}\, T_{M\to C}\, T_{P\to M}\, T_{I\to P}\, X_{2D}$$

where $X_{3D}$ denotes the 3D image data, $X_{2D}$ the 2D image data, $T_{I\to P}$ the transformation from the 2D image coordinate system to the probe coordinate system, $T_{P\to M}$ the transformation from the probe coordinate system to the positioning-marker coordinate system, $T_{M\to C}$ the transformation from the positioning-marker coordinate system to the binocular camera coordinate system, and $T_{C\to V}$ the transformation from the binocular camera coordinate system to the voxel-field coordinate system.
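The chained coordinate transforms above can be sketched in code. This is a minimal illustration, not the patent's implementation: each stage is a 4x4 homogeneous matrix, and the placeholder transforms below are pure translations chosen so the chain is easy to verify by hand.

```python
# Sketch of the coordinate-transform chain mapping a 2D image pixel into the 3D
# voxel field. Matrix values are illustrative placeholders, not calibration data.

def mat_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(t, p):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    x, y, z = p
    v = [x, y, z, 1.0]
    out = [sum(t[i][k] * v[k] for k in range(4)) for i in range(4)]
    return out[0], out[1], out[2]

def translation(tx, ty, tz):
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

# Placeholder calibration: each stage is a pure translation for clarity.
T_img2probe    = translation(0.0, 0.0, 5.0)     # 2D image frame -> probe frame
T_probe2marker = translation(1.0, 0.0, 0.0)     # probe frame -> marker frame
T_marker2cam   = translation(0.0, 2.0, 0.0)     # marker frame -> camera frame
T_cam2voxel    = translation(-1.0, -2.0, -5.0)  # camera frame -> voxel field

chain = mat_mul(T_cam2voxel,
                mat_mul(T_marker2cam, mat_mul(T_probe2marker, T_img2probe)))
print(apply(chain, (3.0, 4.0, 0.0)))  # the placeholder translations cancel out
```

With these placeholder stages the composed translation sums to zero, so the pixel coordinates pass through unchanged; with real calibration matrices the same chain maps each pixel into the voxel field.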
Further, when the interior of the body to be detected is too dense, or the body cannot be penetrated for other reasons, the following steps are performed:
The first step: installing positioning mark points on a body to be tested, and recording the original six-degree-of-freedom gesture of the body to be tested in the overturning process by using a binocular optical tracking module;
Second step: perform handheld three-dimensional scanning of the body to be detected; when the image processing module converts the 2D image data into 3D image data based on the preprocessed images and data, the six-degree-of-freedom pose of the body during flipping is superposed based on the following formula:

$$X_{3D} = T_{C\to V}\, T_{C\to O}\, T_{M\to C}\, T_{P\to M}\, T_{I\to P}\, X_{2D}$$

where $X_{3D}$ denotes the 3D image data, $X_{2D}$ the 2D image data, $T_{I\to P}$ the transformation from the 2D image coordinate system to the probe coordinate system, $T_{P\to M}$ the transformation from the probe coordinate system to the positioning-marker coordinate system, $T_{M\to C}$ the transformation from the positioning-marker coordinate system to the binocular camera coordinate system, $T_{C\to V}$ the transformation from the binocular camera coordinate system to the voxel-field coordinate system, and $T_{C\to O}$ the transformation from the binocular camera coordinate system to the coordinate system of the body to be detected.
Further, the method also comprises the step of recording three-dimensional data of the current working scene, wherein the steps are as follows:
The pattern projection module projects a structured-light pattern, such as speckle or stripes, onto the surfaces of the handheld detection module and the body to be detected; binocular matching is performed on the projected pattern features, and three-dimensional point cloud data of the current environment are obtained by triangulation and stored.
The three-dimensional scanning imaging system and method of the handheld probe have the following advantages. The inertial measurement unit built into the handheld detection module has a high acquisition frequency matched to the high acquisition frequency of the ultrasonic array, realizing three-dimensional ultrasonic image reconstruction of the body to be detected. The binocular optical tracking module identifies and extracts the positioning marker points on the surface of the handheld detection module, and its frame rate suits the low emission frequency of the pulsed laser, realizing three-dimensional photoacoustic image reconstruction of the body to be detected. At the same time, the four devices are triggered synchronously under the control of the hardware timing control module, and the relative pose calculated by the inertial measurement unit is aligned to the absolute pose calculated by the binocular optical tracking module, so that all handheld-probe pose data are established under the same coordinate system, further realizing photoacoustic-ultrasonic dual-modality handheld-probe scanning imaging with high global accuracy and a high instantaneous frame rate. Combining the two positioning modes achieves mutual calibration of the three-dimensional coordinates, balancing global accuracy with instantaneous accuracy.
Drawings
FIG. 1 is a block diagram of a three-dimensional scanning imaging system of a hand-held probe according to the present invention;
FIG. 2 is a schematic diagram of the image processing module of the present invention aligning the data frame numbers of the first absolute pose and the second relative pose, matching the first absolute pose with the photoacoustic signals, and matching the second relative pose with the ultrasound signals;
FIG. 3 is a schematic representation of one embodiment of the present invention;
FIG. 4 is a schematic diagram of an embodiment of the invention for scanning a body under test that needs to be flipped;
The figure indicates: 1. a hardware timing control module; 2. a binocular optical tracking module; 201. a left monocular camera; 202. a right monocular camera; 3. a hand-held detection module; 4. an inertial measurement unit; 5. a data acquisition module; 6. an image processing module; 7. and a pattern projection module.
Detailed Description
For a better understanding of the objects, structures and functions of the present invention, a system and method for three-dimensional scanning imaging of a hand-held probe of the present invention will be described in further detail with reference to the accompanying drawings.
As shown in fig. 1, the three-dimensional scanning imaging system of the handheld probe of the invention comprises a hardware time sequence control module 1, a binocular optical tracking module 2, a handheld detection module 3, an inertial measurement unit 4, a data acquisition module 5, an image processing module 6 and a pattern projection module 7;
The binocular optical tracking module 2 is used for capturing image data of the handheld detection module 3 in real time and uploading it to the data acquisition module 5. Specifically, the binocular optical tracking module 2 comprises a binocular camera and a plurality of positioning marker points. The binocular camera is fixed near the site to be detected and comprises two monocular cameras, namely a left monocular camera 201 and a right monocular camera 202, separated by a fixed baseline. The positioning marker points are fixed on the surface of the handheld detection module 3; the binocular camera photographs them, acquires image data of the handheld detection module 3, and uploads the data to the data acquisition module 5. Preferably, the positioning marker points are glass-bead markers or retroreflective spheres.
The inertial measurement unit 4 is fixed inside the handheld detection module 3 and moves synchronously with it; it captures, in real time, the raw measurement data of the handheld detection module 3 relative to its initial position and uploads them to the data acquisition module 5. Preferably, the inertial measurement unit 4 is a nine-axis IMU comprising a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer. In this embodiment, an MPU9250 is used.
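The patent does not spell out how raw inertial samples become the six-degree-of-freedom relative pose; a common building block is integrating the gyroscope's angular rate into an orientation quaternion. The sketch below shows that step only, under assumed values (identity initial pose, constant rotation rate, 300 Hz sampling as in the embodiment):

```python
import math

# Illustrative sketch, not the patent's algorithm: integrate tri-axis gyroscope
# samples (rad/s) into an orientation quaternion (w, x, y, z).

def quat_mul(q, r):
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def integrate_gyro(q, omega, dt):
    """Advance orientation q by angular rate omega over dt seconds."""
    wx, wy, wz = omega
    angle = math.sqrt(wx*wx + wy*wy + wz*wz) * dt
    if angle == 0.0:
        return q
    ax, ay, az = wx*dt/angle, wy*dt/angle, wz*dt/angle  # unit rotation axis
    half = angle / 2.0
    dq = (math.cos(half), ax*math.sin(half), ay*math.sin(half), az*math.sin(half))
    return quat_mul(q, dq)

q = (1.0, 0.0, 0.0, 0.0)            # identity: probe at its initial pose
dt = 1.0 / 300.0                    # 300 Hz IMU sampling, as in the embodiment
for _ in range(300):                # one second of constant rotation about z
    q = integrate_gyro(q, (0.0, 0.0, math.pi / 2), dt)
print(q)                            # ~90 degrees about z after one second
```

A full pipeline would also fuse the accelerometer and magnetometer to bound gyro drift, which is exactly the accumulated error the binocular tracker corrects in this system.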
The handheld detection module 3 comprises a handheld probe, an ultrasonic array is arranged in the handheld probe, and the handheld detection module 3 is used for realizing the sending of ultrasonic waves, the sending of pulse light, the detection of ultrasonic signals and the detection of photoacoustic signals and transmitting the detected ultrasonic signals and photoacoustic signals to the data acquisition module 5. The pulse light is emitted through an optical fiber integrated on the handheld probe; the emission of ultrasonic waves, the detection of ultrasonic signals and the detection of photoacoustic signals are realized through one ultrasonic array or two separated ultrasonic arrays.
The hardware timing control module 1 is used for controlling the timing of the binocular optical tracking module 2, the inertial measurement unit 4, the handheld detection module 3 and the data acquisition module 5, so that the detection by the binocular optical tracking module 2, the detection by the inertial measurement unit 4, the emission of ultrasonic waves and pulsed light by the handheld detection module 3, the reception of ultrasonic signals, the detection of photoacoustic signals, and the data acquisition by the data acquisition module 5 proceed on a common timing, either synchronously or delayed by fixed intervals. It also aligns the relative pose of the handheld detection module 3 detected by the inertial measurement unit 4 to the absolute pose detected by the binocular optical tracking module 2, so that all pose data of the handheld detection module 3 are unified under the same coordinate system.
The data acquisition module 5 is used for acquiring the image data captured by the binocular optical tracking module 2, extracting the positioning-marker information in the images, and reconstructing the three-dimensional coordinates of the markers based on the binocular triangulation principle to obtain the first absolute pose of the handheld detection module 3; meanwhile, it acquires the raw measurement data of the handheld detection module 3 captured by the inertial measurement unit 4, converts them into six-degree-of-freedom relative pose data, and records them as the second relative pose of the handheld detection module 3; finally, the first absolute pose and the second relative pose are transmitted to the image processing module 6.
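The binocular triangulation principle mentioned above can be illustrated for the simplest case of an ideal rectified stereo pair, where depth follows Z = f·B/d with d the horizontal disparity of the same marker between the left and right images. Focal length, baseline and pixel coordinates below are made-up example values, not the patent's calibration:

```python
# Hedged sketch of binocular triangulation for a rectified stereo pair.
# Real systems triangulate from calibrated, possibly unrectified cameras.

def triangulate(f, baseline, xl, xr, yl, cx, cy):
    """Return the 3D marker position in the left-camera frame.

    f: focal length in pixels; baseline: camera separation in metres;
    (xl, yl)/(xr, .): marker pixel in left/right image; (cx, cy): principal point.
    """
    d = xl - xr                  # disparity in pixels
    z = f * baseline / d         # depth from similar triangles
    x = (xl - cx) * z / f
    y = (yl - cy) * z / f
    return x, y, z

# f = 800 px, baseline 0.1 m, principal point (320, 240)
p = triangulate(800.0, 0.1, 400.0, 360.0, 240.0, 320.0, 240.0)
print(p)  # marker about 2 m in front of the camera, offset to the right
```

Repeating this for every positioning marker yields the marker cloud from which the first absolute pose of the probe is derived.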
The image processing module 6 is configured to align the frame numbers of the data frames of the first absolute pose and the second relative pose, match the first absolute pose and the corresponding photoacoustic signals, and match the second relative pose and the corresponding ultrasound signals.
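Because all streams are started by a common hardware trigger, frame-number alignment reduces to integer rate ratios. A minimal sketch, assuming the example rates of the embodiment (camera 60 Hz, photoacoustic 20 Hz, IMU and ultrasound 300 Hz):

```python
# Sketch of frame-number alignment between pose streams and signal streams.
# Rates are the embodiment's example values; real systems may differ.

CAM_HZ, PA_HZ, IMU_HZ, US_HZ = 60, 20, 300, 300

def pa_to_camera_frame(pa_frame):
    """Photoacoustic frame k lines up with camera frame k * (60 / 20)."""
    return pa_frame * (CAM_HZ // PA_HZ)

def us_to_imu_frame(us_frame):
    """Ultrasound and IMU run at the same rate, so indices match 1:1."""
    return us_frame * (IMU_HZ // US_HZ)

pairs = [(k, pa_to_camera_frame(k)) for k in range(4)]
print(pairs)  # [(0, 0), (1, 3), (2, 6), (3, 9)]
```

Each photoacoustic frame is then matched with the (absolute) camera pose of the corresponding camera frame, and each ultrasound frame with the (relative) IMU pose of the same index.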
The pattern projection module 7 is arranged in the middle of the binocular camera and is used for recording the relative positions of the to-be-detected body and the handheld detection module 3, projecting patterns to the surfaces of the to-be-detected body and the handheld detection module 3, and performing binocular matching based on the projected pattern features through the binocular optical tracking module 2.
The invention discloses a three-dimensional scanning method of a handheld probe, which comprises the following steps of:
s1: the operator moves the detection window of the handheld detection module 3 along the surface of the part to be detected, performs ultrasonic and photoacoustic bimodal scanning, captures pose data of the handheld detection module 3 in real time by the binocular optical tracking module 2 and the inertial measurement unit 4, and uploads the pose data to the data acquisition module 5;
S1.1: the hand-held detection module 3 emits ultrasonic waves and pulse light, detects ultrasonic signals and photoacoustic signals, and transmits the detected ultrasonic signals and photoacoustic signals to the data acquisition module 5;
S1.2: simultaneously with the step S1.1, the binocular optical tracking module 2 captures the image data of the handheld detection module 3 in real time by shooting the positioning mark points and uploads the image data to the data acquisition module 5;
As shown in fig. 3, in one embodiment, four positioning mark points are attached to the surface of the handheld detection module 3, the binocular optical tracking module 2 establishes a certain three-dimensional space coordinate system, and records that the coordinates of the positioning mark points under the three-dimensional space coordinates are P1, P2, P3 and P4; the coordinates of the four locating mark points on the surface of the handheld detection module 3 change with the translation and rotation of the handheld detection module 3, such as P1', P2', P3', P4'. The binocular optical tracking module 2 detects image data of the four positioning mark points at a frequency of 60Hz and transmits the acquired data to the data acquisition module 5.
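The patent does not name the algorithm that turns the marker coordinates P1..P4 and their moved positions P1'..P4' into a probe pose; a standard choice is the Kabsch algorithm, which estimates the rotation R and translation t with P' = R·P + t by a least-squares fit. The marker layout and motion below are illustrative:

```python
import numpy as np

# Hedged sketch: recover the probe's rigid motion from tracked marker points
# using the Kabsch algorithm. Coordinates are made-up examples.

def rigid_transform(P, Q):
    """Least-squares rigid transform mapping point set P onto Q (rows = points)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                         # proper rotation (det = +1)
    t = cq - R @ cp
    return R, t

P = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)  # 90 deg about z
t_true = np.array([0.5, -0.2, 1.0])
Q = P @ R_true.T + t_true                      # markers after the probe moves

R, t = rigid_transform(P, Q)
print(np.allclose(R, R_true), np.allclose(t, t_true))
```

With noise-free correspondences the fit recovers the motion exactly; with real tracking noise it returns the least-squares best rigid pose, which is why at least three non-collinear markers (four here) are attached to the probe.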
S1.3: simultaneously with the step S1.1, the pattern projection module 7 records the relative positions of the to-be-detected body and the handheld detection module 3, the pattern projection module 7 projects patterns on the surfaces of the to-be-detected body and the handheld detection module 3, and binocular matching is carried out based on the projected pattern characteristics through the binocular optical tracking module 2;
s1.4: simultaneously with the step S1.1, the inertial measurement unit 4 captures the original measurement data of the handheld detection module 3 relative to the initial position thereof in real time during movement, and uploads the original measurement data to the data acquisition module 5;
S2: the hardware timing control module 1 controls the timing of the binocular optical tracking module 2, the inertial measurement unit 4, the handheld detection module 3 and the data acquisition module 5, so that the detection by the binocular optical tracking module 2, the detection by the inertial measurement unit 4, the emission of ultrasonic waves and pulsed light by the handheld detection module 3, the reception of ultrasonic signals, the detection of photoacoustic signals, and the data acquisition by the data acquisition module 5 proceed on a common timing, either synchronously or delayed by fixed intervals; the relative pose of the handheld probe detected by the inertial measurement unit 4 is aligned to the absolute pose detected by the binocular optical tracking module 2, so that all pose data of the handheld detection module 3 are unified under the same coordinate system;
In this embodiment, the data frame rate of the binocular camera is 60 Hz, the photoacoustic acquisition frame rate of the ultrasonic array in the handheld detection module 3 is 20 Hz, and the frame rates of the inertial measurement unit 4 and of the ultrasonic signals are 300 Hz. The acquisition frame rate of the binocular optical tracking module 2 thus covers the photoacoustic acquisition frame rate; the frame rate at which the inertial measurement unit 4 acquires the pose transformation of the handheld detection module 3 in real time matches the ultrasonic signal acquisition frame rate of the ultrasonic imaging modality, and when the ultrasonic acquisition frame rate exceeds that of the inertial measurement unit 4, a more accurate ultrasonic pose is calculated by interpolation.
Preferably, when the ultrasonic-signal acquisition frame rate (e.g., 600 Hz) is greater than the highest acquisition frame rate of the inertial measurement unit 4 (e.g., 300 Hz), the rotation at the corresponding time is calculated by quaternion spherical linear interpolation and the translation at the corresponding time by linear interpolation. The interpolation algorithm is as follows:
Based on the following formulas, given the time proportion of a certain ultrasonic frame within the interval between the two adjacent frames of the inertial measurement unit 4, the corresponding pose is obtained by interpolation:

$$q(t) = \frac{\sin((1-t)\theta)}{\sin\theta}\, q_0 + \frac{\sin(t\theta)}{\sin\theta}\, q_1, \qquad p(t) = (1-t)\, p_0 + t\, p_1$$

where $q_0$ and $q_1$ respectively represent the first and second rotational poses, $\theta$ represents the angle between the two rotational poses, $t$ represents the proportion of time elapsed within the interval, and $p_0$ and $p_1$ respectively represent the first and second translational poses.
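The slerp/lerp pose interpolation described above can be sketched in Python. This is a minimal illustration; the function names and the w-x-y-z quaternion convention are assumptions, not taken from the patent:

```python
import numpy as np

def slerp(q0, q1, t):
    """Quaternion spherical linear interpolation between two unit quaternions."""
    q0 = q0 / np.linalg.norm(q0)
    q1 = q1 / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    if dot < 0.0:                     # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:                  # nearly parallel: normalized lerp is stable
        q = (1.0 - t) * q0 + t * q1
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)            # angle between the two rotational poses
    s = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / s) * q0 + (np.sin(t * theta) / s) * q1

def lerp(p0, p1, t):
    """Linear interpolation of the translational pose."""
    return (1.0 - t) * np.asarray(p0, float) + t * np.asarray(p1, float)

def interpolate_pose(t_us, t0, t1, q0, q1, p0, p1):
    """Pose at an ultrasound-frame time t_us between IMU samples at t0 and t1."""
    t = (t_us - t0) / (t1 - t0)       # time proportion within the interval
    return slerp(np.asarray(q0, float), np.asarray(q1, float), t), lerp(p0, p1, t)
```

For example, interpolating halfway between the identity rotation and a 90° rotation yields a 45° rotation, and the translation is averaged.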
Preferably, according to the hardware characteristics of the system, the hardware timing control module 1 is designed to trigger the inertial measurement unit 4 and the ultrasonic transducer unit in the handheld detection module 3 with a 300 Hz pulse signal. Every fifth pulse (60 Hz) additionally triggers the binocular optical tracking module 2, and every fifteenth pulse (20 Hz) triggers the emission of the laser light source. The timing of all devices is thus fully synchronized and aligned, enabling a high-precision three-dimensional reconstruction.
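The divided-down trigger scheme above (300 Hz master clock, camera at every 5th pulse, laser at every 15th) can be sketched as follows. This is an illustrative model only; the device names are placeholders, not from the patent:

```python
def trigger_schedule(n_pulses, master_hz=300, camera_hz=60, laser_hz=20):
    """List which devices fire on each pulse of the master clock.

    The IMU and the ultrasonic transducer fire on every pulse; the camera on
    every (master_hz // camera_hz)-th pulse; the laser on every
    (master_hz // laser_hz)-th pulse, so all trigger edges stay aligned.
    """
    cam_div = master_hz // camera_hz   # 5
    las_div = master_hz // laser_hz    # 15
    schedule = []
    for i in range(n_pulses):
        devices = ["imu", "ultrasound"]
        if i % cam_div == 0:
            devices.append("camera")
        if i % las_div == 0:
            devices.append("laser")
        schedule.append((i / master_hz, devices))   # (time in seconds, devices)
    return schedule
```

Because 15 is a multiple of 5, every laser pulse coincides with a camera frame, which is what keeps the photoacoustic frames trackable.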
S3: the data acquisition module 5 acquires the image data shot by the binocular optical tracking module 2, extracts the positioning-marker information in the images, and reconstructs the three-dimensional coordinates of the positioning markers based on the binocular triangulation principle to obtain the first absolute pose of the handheld detection module 3; it also acquires the original measurement data of the handheld detection module 3 captured by the inertial measurement unit 4 and converts it into six-degree-of-freedom relative pose data, recorded as the second relative pose; finally, the first absolute pose and the second relative pose are transmitted to the image processing module 6;
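The marker-reconstruction step can be illustrated with a standard linear (DLT) triangulation of one positioning marker from the two camera views. This is a sketch assuming calibrated 3x4 projection matrices, not the patent's exact implementation:

```python
import numpy as np

def triangulate(P_left, P_right, uv_left, uv_right):
    """DLT triangulation of one marker point from a calibrated stereo pair.

    P_left, P_right: 3x4 camera projection matrices.
    uv_left, uv_right: pixel coordinates of the same positioning marker
    in the left and right images.  Returns the 3D point.
    """
    u1, v1 = uv_left
    u2, v2 = uv_right
    A = np.array([
        u1 * P_left[2] - P_left[0],
        v1 * P_left[2] - P_left[1],
        u2 * P_right[2] - P_right[0],
        v2 * P_right[2] - P_right[1],
    ])
    _, _, Vt = np.linalg.svd(A)        # null vector of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]
```

With at least three non-collinear markers triangulated this way, the rigid pose of the handheld detection module can then be fitted.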
S4: the image processing module 6 aligns the data frame numbers of the first absolute pose and the second relative pose, matches the first absolute pose and the corresponding photoacoustic signal, and matches the second relative pose and the corresponding ultrasound signal;
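Aligning pose frames to signal frames, as in step S4, can be sketched as a nearest-timestamp match. The `max_gap` tolerance is an illustrative parameter, not from the patent:

```python
import bisect

def match_frames(pose_times, signal_times, max_gap=None):
    """Pair each signal frame with the nearest pose frame by timestamp.

    Both time lists must be sorted.  Returns (signal_index, pose_index)
    pairs; pairs whose time gap exceeds max_gap (if given) are dropped.
    """
    pairs = []
    for i, ts in enumerate(signal_times):
        j = bisect.bisect_left(pose_times, ts)
        candidates = [k for k in (j - 1, j) if 0 <= k < len(pose_times)]
        best = min(candidates, key=lambda k: abs(pose_times[k] - ts))
        if max_gap is None or abs(pose_times[best] - ts) <= max_gap:
            pairs.append((i, best))
    return pairs
```

The same routine serves both pairings: the first absolute pose against photoacoustic frames and the second relative pose against ultrasound frames.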
the image processing module 6 transforms the 2D image data into 3D image data, and interpolates and fuses the real-time 3D image frames into three-dimensional voxels by a voxel reconstruction method, specifically applying the following formula:

$$X_{3D} = T_C^V \, T_M^C \, T_P^M \, T_I^P \, X_{2D}$$

where $X_{3D}$ represents the 3D image data, $X_{2D}$ represents the 2D image data, $T_I^P$ represents the transformation from the 2D image coordinate system to the probe coordinate system, $T_P^M$ the transformation from the probe coordinate system to the positioning-marker coordinate system, $T_M^C$ the transformation from the positioning-marker coordinate system to the binocular-camera coordinate system, and $T_C^V$ the transformation from the binocular-camera coordinate system to the voxel-field coordinate system.
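The chain of rigid transforms used in the voxel reconstruction can be sketched with 4x4 homogeneous matrices. This is a minimal illustration; the pixel-to-metric `scale` factor and the function names are assumptions, not from the patent:

```python
import numpy as np

def make_T(R=None, t=(0.0, 0.0, 0.0)):
    """Build a 4x4 homogeneous rigid transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = np.eye(3) if R is None else R
    T[:3, 3] = t
    return T

def image_to_voxel(T_cam_voxel, T_marker_cam, T_probe_marker, T_image_probe,
                   uv, scale):
    """Map a 2D image pixel (u, v) into the voxel-field frame via the chain
    T_C^V @ T_M^C @ T_P^M @ T_I^P."""
    # Pixel to a metric point in the plane of the probe's 2D image frame.
    x_img = np.array([uv[0] * scale, uv[1] * scale, 0.0, 1.0])
    chain = T_cam_voxel @ T_marker_cam @ T_probe_marker @ T_image_probe
    return (chain @ x_img)[:3]
```

Each 2D frame is pushed through this chain and its samples accumulated (with interpolation) into the voxel grid.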
In one embodiment, as shown in fig. 4, the method of the present invention can also be used to scan an object to be tested that must be flipped over, for example when a dense interior region or another obstruction prevents full penetration.
The first step: positioning mark points are arranged on the body to be measured, and the original six-degree-of-freedom gesture of the body to be measured in the overturning process can be recorded by using the binocular optical tracking module 2;
The second step: according to the three-dimensional scanning imaging method of a handheld probe described above, handheld three-dimensional scanning is performed on the body to be measured, and when the image processing module 6 converts the 2D image data into 3D image data based on the preprocessed images and data, the six-degree-of-freedom pose of the body to be measured during the flip is superimposed based on the following formula:

$$X_{3D} = T_C^V \, T_C^O \, T_M^C \, T_P^M \, T_I^P \, X_{2D}$$

where $X_{3D}$ represents the 3D image data, $X_{2D}$ represents the 2D image data, $T_I^P$ represents the transformation from the 2D image coordinate system to the probe coordinate system, $T_P^M$ the transformation from the probe coordinate system to the positioning-marker coordinate system, $T_M^C$ the transformation from the positioning-marker coordinate system to the binocular-camera coordinate system, $T_C^V$ the transformation from the binocular-camera coordinate system to the voxel-field coordinate system, and $T_C^O$ the transformation from the binocular-camera coordinate system to the object coordinate system.
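One way to superimpose the tracked object pose is a rigid correction computed from the object's marker pose recorded before and after the flip, so that post-flip frames land in the voxel frame established before the flip. This is a sketch of the idea, not the patent's exact formula:

```python
import numpy as np

def make_T(R, t):
    """4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def flip_correction(T_cam_obj_before, T_cam_obj_after):
    """Correction mapping camera-frame points observed after the flip back to
    where the same object-fixed points were observed before the flip."""
    return T_cam_obj_before @ np.linalg.inv(T_cam_obj_after)
```

Applying `flip_correction(...)` to every post-flip frame lets both scan passes fuse into a single consistent voxel volume.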
In one embodiment, the system of the invention can also record three-dimensional data of the current working scene for later review. The recording steps are as follows:
The pattern projection module 7, located between the two cameras of the binocular optical tracking module 2, projects a structured-light pattern such as speckle or fringes onto the surfaces of the handheld detection module 3 and the object to be detected; binocular matching is performed on the projected pattern features, e.g., by the SGBM stereo matching method or the multi-frequency phase-shift method. Three-dimensional point cloud data of the current environment are then obtained by triangulation and stored, so that the patient's orientation can be quickly verified at clinical follow-up.
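For a rectified stereo pair, the triangulation that turns one structured-light match into a 3D scene point can be sketched as follows (assuming equal focal lengths and a rectified pair; the parameter names are illustrative, not from the patent):

```python
def disparity_to_point(u, v, d, fx, cx, cy, baseline):
    """Recover a 3D point from a rectified stereo match.

    d = u_left - u_right is the disparity in pixels; depth follows the
    standard relation Z = fx * baseline / d, then X and Y are back-projected.
    """
    Z = fx * baseline / d
    X = (u - cx) * Z / fx
    Y = (v - cy) * Z / fx   # assumes fy == fx after rectification
    return (X, Y, Z)
```

Running this over every matched pixel of the projected pattern yields the stored environment point cloud.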
It will be understood that the application has been described in terms of several embodiments, and that various changes and equivalents may be made to these features and embodiments by those skilled in the art without departing from the spirit and scope of the application. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the application without departing from the essential scope thereof. Therefore, it is intended that the application not be limited to the particular embodiment disclosed, but that the application will include all embodiments falling within the scope of the appended claims.
Claims (10)
1. The three-dimensional scanning imaging system of the handheld probe is characterized by comprising a hardware time sequence control module (1), a binocular optical tracking module (2), an inertial measurement unit (4), a handheld detection module (3) and a data acquisition module (5);
The binocular optical tracking module (2) is used for capturing the image data of the handheld detection module (3) in real time and uploading the image data to the data acquisition module (5);
The inertial measurement unit (4) moves synchronously with the handheld detection module (3) and is used for capturing original measurement data of the handheld detection module (3) relative to the initial position in real time, which occurs in the movement, and uploading the original measurement data to the data acquisition module (5);
the handheld detection module (3) is used for realizing the emission of ultrasonic waves, the emission of pulse light, the detection of ultrasonic signals and the detection of photoacoustic signals;
The hardware timing control module (1) is used for controlling the timing of the binocular optical tracking module (2), the inertial measurement unit (4), the handheld detection module (3) and the data acquisition module (5), and for aligning the relative pose of the handheld detection module (3) detected by the inertial measurement unit (4) to the absolute pose of the handheld detection module (3) detected by the binocular optical tracking module (2), so that all pose data of the handheld detection module (3) are unified in the same coordinate system;
The data acquisition module (5) is used for acquiring image data shot by the binocular optical tracking module (2) to obtain a first absolute posture of the handheld detection module (3); and meanwhile, acquiring the original measurement data of the handheld detection module (3) captured by the inertial measurement unit (4), and obtaining a second relative posture of the handheld detection module (3).
2. The three-dimensional scanning imaging system of a hand-held probe according to claim 1, wherein the binocular optical tracking module (2) comprises a binocular camera and a plurality of positioning mark points, the binocular camera is fixed near a point to be detected, the plurality of positioning mark points are fixed on the surface of the hand-held detection module (3), the binocular camera shoots the positioning mark points on the surface of the hand-held detection module (3), acquires image data of the hand-held detection module (3), and uploads the image data to the data acquisition module (5).
3. The hand-held probe three-dimensional scanning imaging system according to claim 1, further comprising an image processing module (6), the image processing module (6) being adapted to align data frame numbers of the first absolute pose and the second relative pose, match the first absolute pose and the corresponding photoacoustic signal, and match the second relative pose and the corresponding ultrasound signal.
4. The three-dimensional scanning imaging system of a hand-held probe according to claim 2, further comprising a pattern projection module (7), wherein the pattern projection module (7) is arranged in the middle of the binocular camera and is used for recording the relative positions of the object to be detected and the hand-held detection module (3), the pattern projection module (7) projects patterns on the surfaces of the object to be detected and the hand-held detection module (3), and binocular matching is carried out based on the projected pattern features through the binocular optical tracking module (2).
5. A three-dimensional scanning method of a hand-held probe three-dimensional scanning imaging system according to any one of claims 1-4, comprising the steps of:
S1: an operator moves a detection window of the handheld detection module (3) along the surface of a part to be detected, performs ultrasonic and photoacoustic bimodal scanning, captures pose data of the handheld detection module (3) in real time by the binocular optical tracking module (2) and the inertial measurement unit (4), and uploads the pose data to the data acquisition module (5);
S2: the hardware time sequence control module (1) controls the time sequences of the binocular optical tracking module (2), the inertial measurement unit (4), the handheld detection module (3) and the data acquisition module (5), and the pose data of all the handheld detection modules (3) are unified under the same coordinate system;
S3: the data acquisition module (5) acquires the image data shot by the binocular optical tracking module (2), extracts the positioning-marker information in the images, and reconstructs the three-dimensional coordinates of the positioning markers based on the binocular triangulation principle to obtain the first absolute pose of the handheld detection module (3); it also acquires the original measurement data of the handheld detection module (3) captured by the inertial measurement unit (4) and converts it into six-degree-of-freedom relative pose data, recorded as the second relative pose; finally, the first absolute pose and the second relative pose are transmitted to the image processing module (6);
S4: the image processing module (6) aligns the data frame numbers of the first absolute pose and the second relative pose, matches the first absolute pose and the corresponding photoacoustic signal, and matches the second relative pose and the corresponding ultrasound signal.
6. The three-dimensional scanning method according to claim 5, wherein S1 comprises the following specific steps:
S1.1: the handheld detection module (3) emits ultrasonic waves and pulse light, detects ultrasonic signals and photoacoustic signals, and transmits the detected ultrasonic signals and photoacoustic signals to the data acquisition module (5);
S1.2: simultaneously with the step S1.1, the binocular optical tracking module (2) captures the image data of the handheld detection module (3) in real time by shooting the positioning mark points and uploads the image data to the data acquisition module (5);
S1.3: simultaneously with the step S1.1, the pattern projection module (7) records the relative positions of the to-be-detected body and the handheld detection module (3), the pattern projection module (7) projects patterns on the surfaces of the to-be-detected body and the handheld detection module (3), and binocular matching is carried out based on the projected pattern characteristics through the binocular optical tracking module (2);
S1.4: simultaneously with the step S1.1, the inertial measurement unit (4) captures the original measurement data of the handheld detection module (3) relative to its initial position in real time, and uploads the original measurement data to the data acquisition module (5).
7. The three-dimensional scanning method according to claim 5, characterized in that the acquisition frame rate of the binocular optical tracking module (2) covers the acquisition frame rate of the photoacoustic signals; the frame rate at which the inertial measurement unit (4) acquires the pose transformation of the handheld detection module (3) in real time matches the ultrasonic-signal acquisition frame rate of the ultrasound imaging mode, and when the ultrasonic-signal acquisition frame rate is greater than the acquisition frame rate of the inertial measurement unit (4), quaternion spherical linear interpolation is adopted to calculate the rotation at the corresponding time, and linear interpolation is adopted to calculate the translation at the corresponding time.
8. The three-dimensional scanning method according to claim 5, wherein S4 is specifically as follows:
the image processing module (6) converts the 2D image data into 3D image data, and the real-time 3D image frame is interpolated and fused into three-dimensional voxels through a voxel reconstruction method, and the following formula is specifically applied:
$$X_{3D} = T_C^V \, T_M^C \, T_P^M \, T_I^P \, X_{2D}$$

where $X_{3D}$ represents the 3D image data, $X_{2D}$ represents the 2D image data, $T_I^P$ represents the transformation from the 2D image coordinate system to the probe coordinate system, $T_P^M$ the transformation from the probe coordinate system to the positioning-marker coordinate system, $T_M^C$ the transformation from the positioning-marker coordinate system to the binocular-camera coordinate system, and $T_C^V$ the transformation from the binocular-camera coordinate system to the voxel-field coordinate system.
9. The method according to claim 5, wherein when a dense region inside the object to be measured or another obstruction prevents full penetration, the following steps are performed:
The first step: installing positioning markers on the body to be measured, and recording the original six-degree-of-freedom pose of the body to be measured during the flip by using the binocular optical tracking module (2);
And a second step of: carrying out handheld three-dimensional scanning on the body to be detected, and when the image processing module (6) converts 2D image data into 3D image data based on the preprocessed image and data, superposing six-degree-of-freedom gestures of the body to be detected in overturning based on the following formula:
$$X_{3D} = T_C^V \, T_C^O \, T_M^C \, T_P^M \, T_I^P \, X_{2D}$$

where $X_{3D}$ represents the 3D image data, $X_{2D}$ represents the 2D image data, $T_I^P$ represents the transformation from the 2D image coordinate system to the probe coordinate system, $T_P^M$ the transformation from the probe coordinate system to the positioning-marker coordinate system, $T_M^C$ the transformation from the positioning-marker coordinate system to the binocular-camera coordinate system, $T_C^V$ the transformation from the binocular-camera coordinate system to the voxel-field coordinate system, and $T_C^O$ the transformation from the binocular-camera coordinate system to the object coordinate system.
10. The three-dimensional scanning method according to claim 5, further comprising recording three-dimensional data of a current working scene, comprising the steps of:
The pattern projection module (7) projects speckle or fringe structured-light patterns onto the surfaces of the handheld detection module (3) and the object to be detected, binocular matching is carried out based on the features of the projected patterns, and three-dimensional point cloud data of the current environment are obtained by triangulation and stored.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410586839.1A CN118161194B (en) | 2024-05-13 | 2024-05-13 | Three-dimensional scanning imaging system and method for handheld probe |
Publications (2)
Publication Number | Publication Date |
---|---|
CN118161194A true CN118161194A (en) | 2024-06-11 |
CN118161194B CN118161194B (en) | 2024-07-23 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090024038A1 (en) * | 2007-07-16 | 2009-01-22 | Arnold Stephen C | Acoustic imaging probe incorporating photoacoustic excitation |
CN104736068A (en) * | 2012-10-19 | 2015-06-24 | 皇家飞利浦有限公司 | Ultrasound head frame for emergency medical services |
US20160163115A1 (en) * | 2014-12-08 | 2016-06-09 | Align Technology, Inc. | Intraoral scanning using ultrasound and optical scan data |
CN108731672A (en) * | 2018-05-30 | 2018-11-02 | 中国矿业大学 | Coalcutter attitude detection system and method based on binocular vision and inertial navigation |
WO2022002133A1 (en) * | 2020-07-01 | 2022-01-06 | 青岛小鸟看看科技有限公司 | Gesture tracking method and apparatus |
CN114533111A (en) * | 2022-01-12 | 2022-05-27 | 电子科技大学 | Three-dimensional ultrasonic reconstruction system based on inertial navigation system |
CN116058867A (en) * | 2023-01-06 | 2023-05-05 | 华东师范大学 | Light-weight imaging ultrasonic system without fixed scanning probe and ultrasonic detection method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11678804B2 (en) | Methods and systems for tracking and guiding sensors and instruments | |
EP3076892B1 (en) | A medical optical tracking system | |
US20170273665A1 (en) | Pose Recovery of an Ultrasound Transducer | |
KR100380819B1 (en) | Method of determining relative camera orientation position to create 3-d visual images | |
CN113543718B (en) | Apparatus and method for determining motion of an ultrasound probe including front-to-back directionality | |
WO2003053241A2 (en) | Device, system and method for image based size analysis | |
JP2018509204A (en) | Jaw movement tracking | |
JP6974354B2 (en) | Synchronized surface and internal tumor detection | |
CN111031918A (en) | X-ray imaging apparatus and control method thereof | |
WO2016039915A2 (en) | Systems and methods using spatial sensor data in full-field three-dimensional surface measurment | |
US12053326B2 (en) | Apparatus and method for automatic ultrasound segmentation for visualization and measurement | |
CN118161194B (en) | Three-dimensional scanning imaging system and method for handheld probe | |
US20220215562A1 (en) | Registration method and setup | |
KR102717121B1 (en) | Apparatus, method and system for displaying ultrasound image based on mixed reality | |
Van Bogart | Motion analysis technologies | |
TW202110404A (en) | Ultrasonic image system enables the processing unit to obtain correspondingly two-dimensional ultrasonic image when the ultrasonic probe is at different inclination angles | |
CN117838192A (en) | Method and device for three-dimensional B-type ultrasonic imaging based on inertial navigation module | |
TW202322766A (en) | Ultrasonic imaging system including an ultrasonic probe, a first characteristic pattern, a second characteristic pattern, a storage unit, an image capture unit, a display unit, and a processing unit | |
KR20210136355A (en) | Apparatus and method for generating 3d ultrasound image | |
JP2002000612A (en) | Ultrasonic device for medical treatment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||