CN115615427A - Ultrasonic probe navigation method, device, equipment and medium - Google Patents


Info

Publication number
CN115615427A
Authority
CN
China
Prior art keywords
pose
ultrasonic
change value
offset
ultrasonic probe
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211105204.2A
Other languages
Chinese (zh)
Inventor
刘佳
孙钦佩
杨叶辉
王晓荣
黄海峰
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202211105204.2A
Publication of CN115615427A
Legal status: Pending

Classifications

    • G01C 21/16: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165: Inertial navigation combined with non-inertial navigation instruments
    • G01C 21/20: Instruments for performing navigational calculations
    • A61B 8/4444: Constructional features of the ultrasonic diagnostic device related to the probe
    • A61B 8/5215: Devices using data or image processing specially adapted for diagnosis, involving processing of medical diagnostic data

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The present disclosure provides an ultrasound probe navigation method, apparatus, device and medium, relating to the technical field of data processing, and in particular to the fields of artificial intelligence and AI-assisted medicine. The scheme is implemented as follows: acquire a first ultrasonic sectional image collected by an ultrasonic probe at a first time and a second ultrasonic sectional image collected at a second time; determine a first pose change value based on a first pose offset and a second pose offset; correct the first pose change value based on the first ultrasonic sectional image, the second ultrasonic sectional image, and the poses of the ultrasonic probe at the first and second times, to obtain a corrected pose change value; and correct the second pose offset based on the corrected pose change value to generate navigation information for the ultrasonic probe. In this way, a high-quality ultrasonic sectional image can be obtained even when the operator lacks extensive experience.

Description

Ultrasonic probe navigation method, device, equipment and medium
Technical Field
The present disclosure relates to the field of data processing technology, and in particular, to the field of artificial intelligence and AI medical technology.
Background
Ultrasonic imaging is a medical imaging technology that uses high-frequency sound waves to acquire images in real time. Because an ultrasonic examination produces no ionizing radiation, it is suitable for the health examination of pregnant women and infants, and the technology plays an important role in the field of medical examination.
Disclosure of Invention
The present disclosure provides an ultrasound probe navigation method, apparatus, device and medium.
In a first aspect, the present disclosure provides an ultrasound probe navigation method, including:
acquiring, during scanning by an operator using an ultrasonic probe, a first ultrasonic sectional image collected by the ultrasonic probe at a first time and a second ultrasonic sectional image collected at a second time;
determining a first pose change value based on the first pose offset and the second pose offset; the first pose offset is a pose offset between a pose of the ultrasonic probe and a target pose when the first ultrasonic sectional image is acquired, the second pose offset is a pose offset between a pose of the ultrasonic probe and a target pose when the second ultrasonic sectional image is acquired, and the target pose is a standard pose when the ultrasonic probe scans a target scanning position;
correcting the first pose change value based on the first ultrasonic sectional image, the second ultrasonic sectional image and the poses of the ultrasonic probe at the first moment and the second moment to obtain a corrected pose change value;
and correcting the second pose offset based on the corrected pose change value, and generating navigation information of the ultrasonic probe based on the corrected second pose offset.
In a second aspect, the present disclosure provides an ultrasound probe navigation device comprising:
an acquisition module, wherein the acquisition module is used for acquiring, during scanning by an operator using an ultrasonic probe, a first ultrasonic sectional image collected by the ultrasonic probe at a first time and a second ultrasonic sectional image collected at a second time;
a determination module to determine a first pose change value based on the first pose offset and the second pose offset; wherein the first pose offset is a pose offset between a pose of the ultrasonic probe and a target pose when the first ultrasonic sectional image is acquired, the second pose offset is a pose offset between a pose of the ultrasonic probe and a target pose when the second ultrasonic sectional image is acquired, and the target pose is a standard pose when the ultrasonic probe scans a target scanning position;
a correction module, wherein the correction module is used for correcting the first pose change value based on the first ultrasonic sectional image, the second ultrasonic sectional image, and the poses of the ultrasonic probe at the first time and the second time, to obtain a corrected pose change value;
and a correction module used for correcting the second pose offset based on the corrected pose change value and generating the navigation information of the ultrasonic probe based on the corrected second pose offset.
In a third aspect, the present disclosure provides an electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method of the first aspect.
In a fourth aspect, the present disclosure provides a non-transitory computer readable storage medium having stored thereon computer instructions for causing a computer to perform the method of the first aspect.
In a fifth aspect, the present disclosure provides a computer program product comprising a computer program which, when executed by a processor, implements the method of the first aspect described above.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
fig. 1 is a schematic flow chart of an ultrasound probe navigation method provided by an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of another ultrasound probe navigation method provided by the embodiments of the present disclosure;
FIG. 3 is a schematic view of an ultrasound probe with an optical marker and IMU mounted thereto according to an embodiment of the present disclosure;
FIG. 4 is a schematic flow chart of obtaining a second pose change value according to an embodiment of the disclosure;
FIG. 5 is a schematic flow chart of obtaining a first pose offset and a second pose offset according to an embodiment of the disclosure;
FIG. 6 is a schematic flow chart of obtaining a third pose change value according to an embodiment of the present disclosure;
fig. 7 is a schematic flow chart of obtaining a pose change value according to an embodiment of the disclosure;
fig. 8 is a schematic flow chart diagram of another ultrasound probe navigation method provided by the embodiments of the present disclosure;
fig. 9 is an exemplary flow chart diagram of an ultrasound probe navigation method provided by an embodiment of the present disclosure;
fig. 10 is a schematic structural diagram of an ultrasound probe navigation device provided in an embodiment of the present disclosure;
fig. 11 is a block diagram of an electronic device for implementing an ultrasound probe navigation method of an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
In the related art, acquiring high-quality ultrasonic sectional images with ultrasound equipment depends heavily on skilled operation by a doctor. Ultrasonic sectional images are acquired by 2D dynamic imaging, and a comprehensive examination of the three-dimensional structure of a human organ requires dynamic images of the organ in different sectional directions. An operator of the ultrasound equipment must know how to place and move the ultrasonic probe to obtain sectional images in different directions, must judge the position of the probe from the currently scanned image, and must move the probe to obtain the required image. The operator therefore needs to understand the anatomical structure of the organ and the characteristics of the corresponding ultrasonic sectional images, and an operator with insufficient experience finds it difficult to obtain high-quality images.
It can be seen that obtaining high-quality ultrasonic sectional images currently depends on operator experience, and operators without professional training cannot obtain them.
The ultrasound device in the embodiments of the present disclosure may be a desktop ultrasound device, a portable ultrasound device, or a handheld ultrasound device.
Of these, the desktop ultrasound device is the largest and has the best imaging quality, but it is expensive.
The portable ultrasound device is of medium size, has good imaging quality, and is cheaper than the desktop device.
The handheld ultrasound device is the smallest and has average imaging quality, but it is inexpensive; its main application scenarios include bedside diagnosis, out-of-hospital first aid, and primary medical screening.
In order to solve the above technical problem, an embodiment of the present disclosure provides an ultrasound probe navigation method, which may be executed by an electronic device capable of communicating with the ultrasound device. The electronic device may be a smartphone, a desktop computer, or a tablet computer, and may be connected to the ultrasound device by wire or through technologies such as WiFi and Bluetooth.
As shown in fig. 1, the method includes:
s101, in the process that an operator scans by using the ultrasonic probe, a first ultrasonic sectional image acquired by the ultrasonic probe at a first moment and a second ultrasonic sectional image acquired by the ultrasonic probe at a second moment are acquired.
An operator places the ultrasonic probe on the surface of the skin of a human organ to be scanned, controls the ultrasonic probe to move to scan the human organ, and the ultrasonic probe can acquire an ultrasonic section image in real time in the scanning process.
In the scanning process of the ultrasonic probe by an operator, the electronic equipment can acquire the ultrasonic sectional image acquired by the ultrasonic probe once every fixed time, and the first time and the second time are adjacent two acquisition times. For example, the preset time period may be set to 2S.
S102, determine a first pose change value based on a first pose offset and a second pose offset.
The first pose offset is the pose offset between the pose of the ultrasonic probe when the first ultrasonic sectional image is collected and the target pose; the second pose offset is the pose offset between the probe pose when the second ultrasonic sectional image is collected and the target pose; and the target pose is the standard pose of the probe when scanning the target scanning position. The target scanning position is the preset end position of the scan.
The first pose change value is the difference between the first pose offset and the second pose offset.
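As a minimal numeric sketch of this step (assuming the pose offset is represented as a 6-DoF vector of translation and rotation components, a representation the disclosure does not specify), the first pose change value is the elementwise difference of the two offsets:

```python
import numpy as np

# Hypothetical 6-DoF pose offsets (x, y, z, roll, pitch, yaw) relative to the
# target pose; the vector representation and the values are illustrative
# assumptions, not taken from the disclosure.
D_tj = np.array([10.0, 5.0, 2.0, 0.10, 0.0, 0.2])   # first pose offset
D_tj1 = np.array([8.0, 4.0, 1.5, 0.05, 0.0, 0.1])   # second pose offset

# First pose change value: difference between the two pose offsets.
delta_P_img = D_tj - D_tj1
```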
The method of obtaining the first pose offset and the second pose offset will be described in detail below.
S103, correcting the first pose change value based on the first ultrasonic sectional image, the second ultrasonic sectional image and the poses of the ultrasonic probe at the first moment and the second moment to obtain a corrected pose change value.
In the embodiment of the disclosure, the first pose change value is corrected, so that a corrected pose change value of the ultrasonic probe between the first time and the second time can be obtained.
S104, correct the second pose offset based on the corrected pose change value, and generate navigation information of the ultrasonic probe based on the corrected second pose offset.
With this embodiment of the disclosure, the pose offsets between the probe pose at the two image-acquisition times and the target pose can be obtained, giving the first pose change value. The first pose change value is then corrected using the two ultrasonic sectional images and the probe poses at the two times, so that the corrected pose change value accurately represents the actual pose change of the probe. The second pose offset is corrected based on the corrected pose change value, and accurate navigation information is generated from the corrected second pose offset. An operator can then move the probe according to the navigation information and obtain a high-quality ultrasonic sectional image even without extensive experience.
In another embodiment of the present disclosure, as shown in fig. 2, in the step S103, the first pose change value is corrected based on the first ultrasonic sectional image, the second ultrasonic sectional image, and the poses of the ultrasonic probe at the first time and the second time, so as to obtain a corrected pose change value, which may specifically be implemented as:
and S1031, acquiring poses of the ultrasonic probe measured by an Inertial Measurement Unit (IMU) installed in the ultrasonic probe at a first time and a second time.
The IMU is installed on the ultrasonic probe, and the IMU can measure the pose of the ultrasonic probe in the scanning process in real time.
As shown in fig. 3, fig. 3 is a schematic view of an ultrasound probe with an IMU installed according to an embodiment of the present disclosure. The ultrasonic sectional images collected by the probe in fig. 3 at times t_1, t_2, …, t_N are I(t_1), I(t_2), …, I(t_N), and the IMU acquires the probe poses U(t_1), U(t_2), …, U(t_N) at those times.
S1032, calculating a difference value between the pose of the ultrasonic probe at the first moment and the pose of the ultrasonic probe at the second moment to obtain a second pose change value.
For example, if t_1 in fig. 3 is the first time and t_2 is the second time, the second pose change value is ΔP_IMU = U(t_1) − U(t_2).
S1033, determine a third pose change value based on the first ultrasonic sectional image, the second ultrasonic sectional image, and the poses of the ultrasonic probe at the first time and the second time.
The manner of determining the third pose change value will be described in the following embodiments.
S1034, correct the first pose change value based on the second pose change value and the third pose change value to obtain a corrected pose change value.
With this embodiment, the second pose change value is obtained from the probe poses measured by the IMU at the two times, and the third pose change value is obtained by combining the two ultrasonic sectional images with the IMU-measured poses at those times. Correcting the first pose change value based on the second and third pose change values fuses and complements the pose change values produced by three different motion-estimation approaches, so the corrected pose change value accurately represents the actual pose change of the probe. The second pose offset is then corrected using the corrected pose change value, and the corrected second pose offset yields accurate navigation information, allowing an operator without extensive experience to acquire high-quality ultrasonic sectional images.
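The disclosure does not specify the rule by which the three pose change values are fused. One plausible sketch, under the assumption of a fixed weighted average (both the rule and the weights below are illustrative assumptions, not taken from the source), is:

```python
import numpy as np

def fuse_pose_changes(dp_img, dp_imu, dp_ai, weights=(0.2, 0.3, 0.5)):
    """Fuse three pose-change estimates into one corrected pose change value.

    dp_img: first pose change value (from the first estimation model).
    dp_imu: second pose change value (from the IMU).
    dp_ai:  third pose change value (from the second estimation model).
    The weighted-average rule and the weights are assumptions for illustration;
    the disclosure only states that the first value is corrected using the
    other two.
    """
    w_img, w_imu, w_ai = weights
    return w_img * dp_img + w_imu * dp_imu + w_ai * dp_ai

dp_img = np.array([2.0, 1.0, 0.5])
dp_imu = np.array([1.8, 1.1, 0.6])
dp_ai = np.array([1.9, 1.0, 0.5])
corrected = fuse_pose_changes(dp_img, dp_imu, dp_ai)
```

In practice the weights could also be learned or derived from per-source uncertainty; a plain average is the simplest instance of such a fusion.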
For S1031 to S1032, fig. 4 is a schematic flowchart of acquiring the second pose change value according to an embodiment of the present disclosure.
S401, the IMU collects the pose of the ultrasonic probe in real time.
S402, acquire the pose U(t_j) of the ultrasonic probe measured by the IMU at the first time t_j.
S403, acquire the pose U(t_{j+1}) of the ultrasonic probe measured by the IMU at the second time t_{j+1}.
S404, calculate the second pose change value ΔP_IMU.
Here, ΔP_IMU = U(t_j) − U(t_{j+1}).
It is understood that the second pose change value for each pair of adjacent times can be obtained through the process shown in fig. 4.
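The per-interval IMU computation above can be sketched as follows (representing each pose as a 6-DoF vector is an assumption about the IMU output, not something the disclosure specifies):

```python
import numpy as np

def imu_pose_changes(poses):
    """Second pose change value for every pair of adjacent sampling times.

    `poses` is an (N, 6) array whose rows are the IMU-measured poses
    U(t_1) ... U(t_N); row j of the result is dP_IMU = U(t_j) - U(t_{j+1}),
    matching the formula in S404.
    """
    poses = np.asarray(poses, dtype=float)
    return poses[:-1] - poses[1:]

# Three consecutive IMU poses as an illustrative example.
U = np.array([[0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
              [1.0, 0.5, 0.0, 0.1, 0.0, 0.0],
              [2.0, 1.0, 0.2, 0.2, 0.0, 0.1]])
dP = imu_pose_changes(U)   # dP[0] = U(t_1) - U(t_2)
```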
In one embodiment of the present disclosure, the first pose offset and the second pose offset in the above embodiments are obtained as follows:
input the first ultrasonic sectional image into a first estimation model to obtain the first pose offset output by the first estimation model; and input the second ultrasonic sectional image into the first estimation model to obtain the second pose offset output by the first estimation model.
The first estimation model is a regression model trained on a first preset training set. The first preset training set comprises a plurality of sample ultrasonic sectional images, each labeled with the pose offset between the probe pose when that sample image was collected and the target pose; the probe pose is obtained with an optical or magnetic positioning and tracking device.
The first estimation model may be any type of regression model, for example, a logistic regression model.
Taking the case where an optical positioning and tracking device is used as an example, fig. 3 also shows an optical marker mounted on the ultrasonic probe. In the embodiment of the present disclosure, an ultrasonic probe fitted with the optical marker may be used in advance to collect multiple frames of sample ultrasonic sectional images, and the optical positioning and tracking device can acquire, through the marker, the probe pose at the time each sample frame is collected. An experienced doctor can review the sample frames and select a standard sectional image. The electronic device can then compute, for each sample image, the pose offset between the probe pose at its collection and the target pose at which the standard sectional image was collected, and use this offset as the sample image's label.
In this way, the first preset training set is obtained, and the first estimation model is trained on it. Specifically, each sample ultrasonic sectional image is input into the first estimation model to obtain the predicted pose offset the model outputs for it; a loss function value is calculated from the predicted pose offset and the sample's label; and the parameters of the first estimation model are adjusted based on the loss until the model converges, at which point training is complete.
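The training loop just described can be sketched as a plain gradient-descent regression. A linear model stands in for whatever regression architecture F_1 actually is, and all data shapes, the synthetic labels, and the hyperparameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical first preset training set: each sample ultrasonic sectional
# image is flattened to a feature vector; its label is the 6-DoF pose offset
# measured by the optical (or magnetic) positioning and tracking device.
n_samples, n_features, n_dof = 64, 32, 6
X = rng.normal(size=(n_samples, n_features))   # flattened sample images
W_true = rng.normal(size=(n_features, n_dof))
Y = X @ W_true                                 # pose-offset labels

# Gradient descent on mean-squared error until (approximate) convergence.
W = np.zeros((n_features, n_dof))
lr = 0.05
initial_loss = float(np.mean((X @ W - Y) ** 2))
for _ in range(1000):
    pred = X @ W                          # predicted pose offsets
    grad = X.T @ (pred - Y) / n_samples   # gradient of the squared error
    W -= lr * grad
final_loss = float(np.mean((X @ W - Y) ** 2))
```

The stopping criterion in the disclosure is model convergence; a fixed iteration count is used here only to keep the sketch deterministic.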
Fig. 5 is a schematic flowchart of obtaining the first pose offset and the second pose offset through the trained first estimation model; the method is described below with reference to fig. 5.
S501, the ultrasonic probe collects an ultrasonic sectional image in real time.
S502, acquire the first ultrasonic sectional image I(t_j) collected by the probe at the first time t_j.
S503, acquire the second ultrasonic sectional image I(t_{j+1}) collected by the probe at the second time t_{j+1}.
S504, input the first and second ultrasonic sectional images into the first estimation model F_1(·).
S505, obtain the first pose offset D(t_j) output by the first estimation model.
Here, D(t_j) = F_1(I(t_j)).
S506, obtain the second pose offset D(t_{j+1}) output by the first estimation model.
Here, D(t_{j+1}) = F_1(I(t_{j+1})).
It is understood that the first pose change value in the above embodiment is:
ΔP_IMG = D(t_j) − D(t_{j+1}).
With this embodiment, pose estimation is performed on the first and second ultrasonic sectional images separately by the first estimation model, yielding the first pose offset and the second pose offset. Because the first estimation model is trained in advance on the first preset training set, whose labels are obtained with an optical or magnetic positioning and tracking device and therefore accurately represent the probe's pose offset between collecting a sample image and collecting the standard sectional image, the trained model can accurately produce the first and second pose offsets. This improves the accuracy of the probe's motion estimation and makes the navigation information ultimately derived from the two offsets more accurate, guiding the operator to acquire high-quality ultrasonic sectional images.
In another embodiment of the present disclosure, in the step S1033, the determining the third pose change value based on the first ultrasonic sectional image, the second ultrasonic sectional image, and the poses of the ultrasonic probe at the first time and the second time may specifically be implemented as:
and inputting the first ultrasonic sectional image, the second ultrasonic sectional image and the poses of the ultrasonic probe at the first moment and the second moment into the second estimation model, and acquiring a third pose change value output by the second estimation model.
The second estimation model is a regression model trained on a second preset training set. The second preset training set comprises multiple groups of sample data, each labeled with the pose change value of the ultrasonic probe during the collection of that group. Each group comprises two ultrasonic sectional images collected at two adjacent times and the probe poses acquired by the IMU at those times; the pose change value is obtained with an optical or magnetic positioning and tracking device.
In the embodiments of the present disclosure, the second estimation model may be various types of regression models. For example, it may be a logistic regression model.
Similar to the acquisition of the first preset training set, the two adjacent-time ultrasonic sectional images in each group of sample data in the second preset training set can be collected by an ultrasonic probe fitted with an optical marker. The optical positioning and tracking device provides the probe poses at the two adjacent collection times; the difference between these two poses is the pose change value of the probe during the collection of that group, and this value serves as the group's label.
The second estimation model is then trained on the second preset training set: each group of sample data is input into the model to obtain its predicted pose change value, a loss function value is calculated between the predicted value and the group's label, and the model's parameters are adjusted based on the loss until the model converges, at which point training is complete.
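One way to picture the second model's input, under the assumption that the two images and the two IMU poses are flattened and concatenated into a single feature vector (the disclosure does not specify the input encoding), is:

```python
import numpy as np

def build_f2_input(img_a, img_b, pose_a, pose_b):
    """Assemble one input sample for the second estimation model F_2.

    The model consumes two ultrasonic sectional images collected at adjacent
    times together with the IMU poses at those times; flattening and
    concatenating them is an illustrative assumption about the encoding.
    """
    return np.concatenate([np.ravel(img_a), np.ravel(img_b),
                           np.asarray(pose_a, dtype=float),
                           np.asarray(pose_b, dtype=float)])

img_tj = np.zeros((4, 4))                              # stand-in for I(t_j)
img_tj1 = np.ones((4, 4))                              # stand-in for I(t_j+1)
pose_tj = np.array([0.0, 0.0, 0.0, 0.0, 0.0, 0.0])     # U(t_j)
pose_tj1 = np.array([1.0, 0.5, 0.0, 0.1, 0.0, 0.0])    # U(t_j+1)

x = build_f2_input(img_tj, img_tj1, pose_tj, pose_tj1)
# During training, the label paired with x would be the pose change value
# measured by the optical positioning and tracking device.
```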
As shown in fig. 6, fig. 6 is a schematic flowchart of obtaining the third pose change value through the trained second estimation model according to an embodiment of the present disclosure.
S601, the ultrasonic probe collects an ultrasonic sectional image in real time.
S602, acquiring the first ultrasonic sectional image I(t_j) collected by the ultrasonic probe at a first time t_j.
S603, acquiring the second ultrasonic sectional image I(t_{j+1}) collected by the ultrasonic probe at a second time t_{j+1}.
And S604, the IMU acquires the pose of the ultrasonic probe in real time.
S605, acquiring the pose U(t_j) of the ultrasonic probe collected by the IMU at the first time t_j.
S606, acquiring the pose U(t_{j+1}) of the ultrasonic probe collected by the IMU at the second time t_{j+1}.
Wherein S601-S603 and S604-S606 may be performed in parallel.
S607, inputting the first ultrasonic sectional image I(t_j), the second ultrasonic sectional image I(t_{j+1}), the pose U(t_j) at the first time t_j, and the pose U(t_{j+1}) at the second time t_{j+1} into the second estimation model F_2().
S608, obtaining the third pose change value ΔP_AI output by the second estimation model.
Wherein ΔP_AI = F_2({I(t_j), I(t_{j+1})}, {U(t_j), U(t_{j+1})}).
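The flow of fig. 6 can be sketched as the function below. `acquire_image` and `acquire_pose` stand in for the real-time acquisition steps S601/S604, and `f2` for the trained second estimation model; all three are assumed interfaces introduced only for illustration.

```python
def third_pose_change(acquire_image, acquire_pose, f2):
    """Sketch of fig. 6: gather the inputs at two adjacent moments and
    query the second estimation model F2 for the third pose change value."""
    # S602/S603: ultrasonic sectional images at the two adjacent moments.
    img_j = acquire_image()
    img_j1 = acquire_image()
    # S605/S606: IMU poses of the ultrasonic probe at the same two moments.
    u_j = acquire_pose()
    u_j1 = acquire_pose()
    # S607/S608: DP_AI = F2({I(t_j), I(t_j+1)}, {U(t_j), U(t_j+1)}).
    return f2((img_j, img_j1), (u_j, u_j1))
```

In a real system the image and pose acquisition run in parallel, as the text notes for S601-S603 and S604-S606; the sequential calls here are a simplification.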
With this embodiment of the disclosure, the third pose change value can be estimated by the second estimation model based on the first ultrasonic sectional image, the second ultrasonic sectional image, and the poses of the ultrasonic probe acquired by the IMU at the first and second times. Because the second estimation model is trained in advance on the second preset training set, whose labels are obtained with an optical positioning tracking device or a magnetic positioning tracking device and are therefore highly accurate, the labels accurately represent the pose change value of the ultrasonic probe, and the trained second estimation model can accurately predict the pose change value of the ultrasonic probe between the first and second moments. In addition, the second estimation model combines the pose of the ultrasonic probe acquired by the IMU with analysis of the ultrasonic sectional images, so the resulting pose change value is more accurate, the error of the ultrasonic probe motion estimation is reduced, more accurate navigation information can be obtained, and the operator is guided to acquire high-quality ultrasonic sectional images.
In another embodiment of the present disclosure, S1034 may correct the first pose change value based on the second pose change value and the third pose change value to obtain a corrected pose change value, which may specifically be implemented as follows:
filtering the first pose change value, the second pose change value, and the third pose change value based on an indirect Kalman filtering method to obtain the corrected pose change value.
As shown in fig. 7, fig. 7 is a flowchart of a method for acquiring a corrected pose change value according to an embodiment of the present disclosure.
S701, acquiring the first pose change value ΔP_IMG.
S702, acquiring the second pose change value ΔP_IMU.
S703, acquiring the third pose change value ΔP_AI.
The methods for obtaining the first pose change value, the second pose change value, and the third pose change value in S701-S703 are described in the foregoing embodiments and are not repeated here.
S704, filtering the first pose change value, the second pose change value, and the third pose change value through an indirect Kalman filtering technique to obtain the corrected pose change value ΔP_K.
Optionally, a Kalman filter implemented by an indirect method may be used to filter the first, second, and third pose change values, fusing them to reduce the error in the pose change estimate and obtain the corrected pose change value. The Kalman filter may be any Kalman filter implemented by an indirect method in the related art.
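A minimal stand-in for this fusion step is sketched below. It weights the three pose change estimates by assumed inverse measurement variances, which captures the measurement-update intuition of Kalman filtering; a true indirect (error-state) Kalman filter would additionally propagate an error-state covariance between time steps. The variance parameters are illustrative assumptions.

```python
import numpy as np

def fuse_pose_changes(dp_img, dp_imu, dp_ai,
                      var_img=1.0, var_imu=1.0, var_ai=1.0):
    """Fuse the first, second, and third pose change values into a single
    corrected pose change value DP_K by inverse-variance weighting.
    The variances are tuning parameters assumed for illustration."""
    estimates = np.stack([dp_img, dp_imu, dp_ai])
    weights = np.array([1.0 / var_img, 1.0 / var_imu, 1.0 / var_ai])
    weights /= weights.sum()        # normalize so the weights sum to 1
    return weights @ estimates      # weighted combination, per component
```

If all three estimates agree, the fused value equals them; a source assigned a smaller variance pulls the result towards its own estimate, mirroring how a Kalman update trusts less noisy measurements more.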
With this embodiment of the disclosure, the first pose change value obtained from the ultrasonic sectional images, the second pose change value obtained from the IMU, and the third pose change value obtained from the second estimation model are filtered by the indirect Kalman filtering technique; that is, the results of three different navigation techniques are fused and corrected. This overcomes the large navigation error that arises when ultrasonic sectional image analysis or the IMU technique is used alone, and improves the accuracy of ultrasonic probe navigation.
In another embodiment of the present disclosure, correcting the second pose offset based on the corrected pose change value in S104 may specifically be implemented as: calculating the difference between the first pose offset and the corrected pose change value to obtain the corrected second pose offset.
The corrected second pose offset is: D(t_{j+1})_K = D(t_j) - ΔP_K(t_j, t_{j+1}).
With this approach, after the accurate corrected pose change value is obtained, the corrected pose change value is subtracted from the pose offset at the first moment to obtain the theoretical pose offset at the second moment. Navigation information is then generated based on the corrected second pose offset, guiding the operator to move the scanning probe towards the target scanning position without relying on the operator's personal experience.
In another implementation of the present disclosure, as shown in FIG. 8, the method includes S801-S807.
S801 to S803 are the same as S101 to S103, and reference may be made to the related descriptions in S101 to S103 in the foregoing embodiments, which are not repeated herein.
S804, correcting the second pose offset based on the corrected pose change value.
S805, judging whether the corrected second pose offset is smaller than a first preset threshold.
The first preset threshold may be a threshold preset according to an actual scene.
If yes, executing S806; if the determination result is no, S807 is executed.
And S806, determining that the ultrasonic probe is moved to the target scanning position.
If the corrected second pose offset is smaller than the first preset threshold, the difference between the pose of the ultrasonic probe at the second moment and the target pose is small, and the second ultrasonic sectional image acquired at the second moment can be regarded as a standard ultrasonic sectional image; that is, the ultrasonic probe is determined to have been moved to the target scanning position and no longer needs to be moved.
It can be understood that when the corrected second pose offset is smaller than the first preset threshold, a prompt message to stop moving the ultrasonic probe can be sent to the operator.
S807, generating navigation information of the ultrasonic probe based on the corrected second pose offset, acquiring the ultrasonic sectional image collected by the ultrasonic probe at the next moment, taking the current second moment as the first moment and the next moment as the second moment, and returning to the step in S802 of determining the first pose change value based on the first pose offset and the second pose offset, until the corrected second pose offset is smaller than the first preset threshold.
If the corrected second pose offset is not smaller than the first preset threshold, the difference between the pose of the ultrasonic probe at the second moment and the target pose is still large; the ultrasonic probe is still far from the target scanning position and needs to be moved further.
With this embodiment of the disclosure, the generated navigation information can guide the operator to gradually move the ultrasonic probe to the target scanning position, complete the scan of the target scanning position, and obtain an ultrasonic sectional image of it. The operator does not need to judge from experience whether the ultrasonic probe has reached the target scanning position, which lowers the requirements on the operator.
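The iteration S801-S807 can be sketched as the loop below. `next_corrected_change` and `emit_navigation` are assumed stand-ins for the fusion step and the operator guidance described above, and the scalar offset is a simplification of the full pose offset.

```python
def navigate_until_arrival(initial_offset, next_corrected_change,
                           emit_navigation, threshold):
    """Repeat: correct the second pose offset with the latest corrected
    pose change value, D(t_{j+1})_K = D(t_j) - DP_K, then either stop
    (offset magnitude below the first preset threshold) or emit
    navigation information and continue to the next moment."""
    offset = initial_offset
    while True:
        offset = offset - next_corrected_change()  # corrected second pose offset
        if abs(offset) < threshold:                # S805/S806: target reached
            return offset
        emit_navigation(offset)                    # S807: guide the operator
```

Each loop iteration corresponds to sliding the time window forward, with the current second moment becoming the next first moment.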
In another embodiment of the present disclosure, the corrected second pose offset includes an offset distance and an offset angle, and generating the navigation information of the ultrasonic probe based on the corrected second pose offset in S104 may specifically be implemented as:
if the offset distance is larger than a second preset threshold value, converting the offset distance into a moving direction to obtain navigation information; and if the offset distance is smaller than or equal to a second preset threshold value, converting the offset angle into a rotating direction to obtain navigation information.
In this embodiment of the present disclosure, when the offset distance included in the corrected second pose offset is greater than the second preset threshold, that is, when the ultrasonic probe is still far from the target scanning position, the operator is preferentially guided to move the ultrasonic probe towards the target scanning position; the offset angle can therefore be ignored and only the offset distance is converted into a moving direction. If the offset distance included in the corrected second pose offset is smaller than or equal to the second preset threshold, the ultrasonic probe has already been moved close to the target scanning position, and the offset angle is converted into a rotation direction to guide the operator to adjust the pose of the ultrasonic probe and scan a high-quality ultrasonic sectional image.
If the offset distance is a positive number, the moving direction is upward movement; if the offset distance is negative, the moving direction is downward movement.
If the offset angle is positive, the rotating direction is rightward rotation; if the offset angle is negative, the rotation direction is left rotation.
For example, the second preset threshold may be set to 1 cm. If the offset distance included in the corrected second pose offset is +2 cm and the offset angle is +90 degrees, the offset distance is greater than the second preset threshold, so the offset distance of +2 cm is converted into an upward movement to obtain the navigation information.
If the offset distance included in the corrected second pose offset is +0.1 cm and the offset angle is +90 degrees, the offset distance is smaller than the second preset threshold, so the offset angle of +90 degrees is converted into a rightward rotation to obtain the navigation information.
In this embodiment of the disclosure, fine-grained navigation information may also be generated from the corrected second pose offset: if the offset distance is greater than the second preset threshold, the offset distance is converted into a moving direction and a moving distance to obtain the navigation information; if the offset distance is smaller than or equal to the second preset threshold, the offset angle is converted into a rotation direction and a rotation angle to obtain the navigation information.
For example, with the second preset threshold set to 1 cm, if the offset distance included in the corrected second pose offset is +2 cm and the offset angle is +90 degrees, the offset distance is greater than the second preset threshold, so the offset distance of +2 cm is converted into an upward movement of 2 cm to obtain the navigation information.
If the offset distance included in the corrected second pose offset is +0.1 cm and the offset angle is +90 degrees, the offset distance is smaller than the second preset threshold, so the offset angle of +90 degrees is converted into a rightward rotation of 90 degrees to obtain the navigation information.
With this embodiment of the disclosure, by setting the second preset threshold, navigation information for moving the ultrasonic probe is generated only when the offset distance included in the corrected second pose offset is greater than the second preset threshold, and navigation information for rotating the ultrasonic probe is generated only when that offset distance is smaller than or equal to the second preset threshold. In other words, the ultrasonic probe only needs to be rotated once it has been moved close to the target scanning position. This simplifies the operator's handling of the ultrasonic probe according to the navigation information and avoids the operator repeatedly rotating the probe while still far from the target scanning position.
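The conversion rules above can be sketched as follows. Comparing the magnitude of the offset distance against the threshold, and the exact message wording, are assumptions made for illustration; the disclosure fixes only the sign conventions and the threshold test.

```python
def to_navigation_info(offset_distance_cm, offset_angle_deg,
                       second_threshold_cm=1.0):
    """Turn the corrected second pose offset (offset distance, offset
    angle) into fine-grained navigation information. Positive distance
    means move up, negative means move down; positive angle means rotate
    right, negative means rotate left."""
    if abs(offset_distance_cm) > second_threshold_cm:
        # Far from the target scanning position: ignore the angle,
        # convert the offset distance into a moving direction and distance.
        direction = "up" if offset_distance_cm > 0 else "down"
        return f"move {direction} {abs(offset_distance_cm):g} cm"
    # Close to the target scanning position: convert the offset angle
    # into a rotation direction and angle.
    direction = "right" if offset_angle_deg > 0 else "left"
    return f"rotate {direction} {abs(offset_angle_deg):g} degrees"
```

With the worked examples from the text: an offset of (+2 cm, +90 degrees) yields a 2 cm upward move, while (+0.1 cm, +90 degrees) yields a 90-degree rightward rotation.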
As shown in fig. 9, fig. 9 is a schematic view of a navigation flow of an ultrasound probe according to an embodiment of the present disclosure, which is described below with reference to fig. 9.
S901, an operator places an ultrasonic probe on the surface of the skin of a human body target scanning organ.
S902, acquiring the ultrasonic sectional image I(t_0) at time t_0 through the ultrasonic probe.
S903, inputting the ultrasonic sectional image I(t_0) at time t_0 into the displacement deviation estimation model F_1().
Wherein the displacement deviation estimation model F_1() is the first estimation model in the above embodiments.
S904, obtaining the pose offset D(t_0) between the pose of the ultrasonic probe at time t_0 and the target pose, as output by the displacement deviation estimation model F_1.
S905, converting the pose offset D(t_0) into navigation information.
And S906, the operator controls the ultrasonic probe to move according to the navigation information.
S907, acquiring the ultrasonic sectional image I(t_1) at time t_1 and the poses U(t_0) and U(t_1) of the ultrasonic probe acquired by the IMU at times t_0 and t_1.
S908, inputting the ultrasonic sectional image at time t_1 into the displacement deviation estimation model F_1() to obtain the pose offset D(t_1) between the pose of the ultrasonic probe at time t_1 and the target pose.
S909, calculating the difference of the pose offsets at times t_0 and t_1 to obtain the first pose change value ΔP_IMG.
Wherein ΔP_IMG = D(t_0) - D(t_1).
S910, calculating the difference of the poses acquired by the IMU at times t_0 and t_1 to obtain the second pose change value ΔP_IMU.
Wherein ΔP_IMU = U(t_0) - U(t_1).
S911, inputting the ultrasonic sectional image I(t_0) at time t_0, the ultrasonic sectional image I(t_1) at time t_1, U(t_0), and U(t_1) into the estimation model F_2() to obtain the third pose change value ΔP_AI.
Wherein the estimation model F_2 is the second estimation model in the above embodiments.
ΔP_AI = F_2({I(t_0), I(t_1)}, {U(t_0), U(t_1)}).
S912, filtering the first pose change value ΔP_IMG, the second pose change value ΔP_IMU, and the third pose change value ΔP_AI through the indirect Kalman filtering technique to obtain the corrected pose change value ΔP_K(0) of the ultrasonic probe between times t_0 and t_1.
S913, correcting the pose offset D(t_1) between the pose of the ultrasonic probe at time t_1 and the target pose to obtain the corrected pose offset D(t_1)_K at time t_1.
Wherein D(t_1)_K = D(t_0) - ΔP_K(0).
S914, converting D(t_1)_K into navigation information.
And S915, controlling the ultrasonic probe to move by an operator according to the navigation information.
S916, acquiring the ultrasonic sectional images at times t_j and t_{j+1} and the corresponding poses of the ultrasonic probe acquired by the IMU.
Subsequently, the first, second, and third pose change values between times t_j and t_{j+1} can be obtained and corrected; these three pose change values are processed in the same way as the three pose change values between times t_0 and t_1, so the specific implementation flow is omitted here.
S917, obtaining the corrected pose change value ΔP_K(j) of the ultrasonic probe between times t_j and t_{j+1}.
S918, correcting the pose offset at time t_{j+1} to obtain the corrected pose offset D(t_{j+1})_K = D(t_j) - ΔP_K(j).
S919, judging whether D(t_{j+1})_K is smaller than the first preset threshold.
If yes, go to step S920; if the determination result is no, the process returns to S915.
And S920, stopping moving the ultrasonic probe and reaching the target scanning position.
The navigation scheme provided by the embodiments of the disclosure can be applied to an ultrasonic scanning system. It reduces the difficulty of using ultrasonic equipment, allows an operator to quickly master its use, and broadens the application scenarios of ultrasonic equipment, enabling operators in primary medical institutions to use it proficiently. It can also be applied to the training of ultrasonographers to shorten their training time.
Based on the same inventive concept, the embodiment of the present disclosure further provides an ultrasound probe navigation apparatus, as shown in fig. 10, the apparatus including:
the acquiring module 1001 is used for acquiring a first ultrasonic sectional image acquired by an ultrasonic probe at a first moment and a second ultrasonic sectional image acquired by the ultrasonic probe at a second moment in the scanning process of an operator by using the ultrasonic probe;
a determining module 1002 for determining a first pose change value based on the first pose offset and the second pose offset; the first pose offset is a pose offset between the pose of the ultrasonic probe and the pose of a target when the first ultrasonic sectional image is acquired, the second pose offset is a pose offset between the pose of the ultrasonic probe and the pose of the target when the second ultrasonic sectional image is acquired, and the pose of the target is a standard pose when the ultrasonic probe scans the scanning position of the target;
the correction module 1003 is configured to correct the first pose change value based on the first ultrasonic sectional image, the second ultrasonic sectional image, and the poses of the ultrasonic probe at the first time and the second time, to obtain a corrected pose change value;
and the correcting module 1004 is configured to correct the second pose deviation based on the corrected pose change value, and generate navigation information of the ultrasonic probe based on the corrected second pose deviation.
Optionally, the correction module 1003 is specifically configured to:
obtaining the poses of the ultrasonic probe at the first moment and the second moment as measured by an inertial measurement unit (IMU) installed in the ultrasonic probe;
calculating a difference value between the pose of the ultrasonic probe at the first moment and the pose of the ultrasonic probe at the second moment to obtain a second pose change value;
determining a third pose change value based on the first ultrasonic sectional image, the second ultrasonic sectional image and the poses of the ultrasonic probe at the first time and the second time;
and correcting the first pose change value based on the second pose change value and the third pose change value to obtain a corrected pose change value.
Optionally, the apparatus further comprises:
the judging module is used for judging whether the corrected second pose offset is smaller than a first preset threshold;
the determining module 1002 is further configured to determine that the ultrasonic probe has been moved to the target scanning position if the determination result of the judging module is yes;
the correction module 1003 is further configured to, if the determination result of the judging module is negative, execute the step of generating navigation information of the ultrasonic probe based on the corrected second pose offset, trigger the obtaining module 1001 to acquire the ultrasonic sectional image collected by the ultrasonic probe at the next moment, take the current second moment as the first moment and the next moment as the second moment, and trigger the determining module 1002 to execute the step of determining the first pose change value based on the first pose offset and the second pose offset, until the judging module determines that the corrected second pose offset is smaller than the first preset threshold.
Optionally, the determining module 1002 is further configured to obtain the first pose offset and the second pose offset by:
inputting the first ultrasonic sectional image into a first estimation model to obtain the first pose offset output by the first estimation model;
inputting the second ultrasonic sectional image into the first estimation model to obtain the second pose offset output by the first estimation model;
the first estimation model is a regression model trained on a first preset training set; the first preset training set comprises a plurality of sample ultrasonic sectional images together with, as labels, the pose offset between the pose of the ultrasonic probe and the target pose at the time each sample ultrasonic sectional image was collected; the pose of the ultrasonic probe is obtained with an optical positioning tracking device or a magnetic positioning tracking device.
Optionally, the correction module 1003 is specifically configured to:
inputting the first ultrasonic sectional image, the second ultrasonic sectional image, and the poses of the ultrasonic probe at the first moment and the second moment into a second estimation model, and obtaining the third pose change value output by the second estimation model;
the second estimation model is a regression model trained on a second preset training set; the second preset training set comprises multiple groups of sample data together with, as labels, the pose change value of the ultrasonic probe during the collection of each group of sample data; each group of sample data comprises two ultrasonic sectional images acquired at two adjacent moments and the poses of the ultrasonic probe acquired by the IMU at those two moments; the pose change value of the ultrasonic probe is obtained with an optical positioning tracking device or a magnetic positioning tracking device.
Optionally, the correction module 1003 is specifically configured to:
carrying out filtering processing on the first pose change value, the second pose change value, and the third pose change value based on an indirect Kalman filtering method to obtain the corrected pose change value.
Optionally, the correcting module 1004 is specifically configured to:
calculating the difference between the first pose offset and the corrected pose change value to obtain the corrected second pose offset.
Optionally, the corrected second pose offset includes an offset distance and an offset angle;
the correcting module 1004 is specifically configured to:
if the offset distance is larger than a second preset threshold value, converting the offset distance into a moving direction to obtain navigation information;
and if the offset distance is less than or equal to a second preset threshold value, converting the offset angle into a rotation direction to obtain navigation information.
In the technical solutions of the present disclosure, the collection, storage, use, processing, transmission, provision, and disclosure of the personal information involved all comply with the relevant laws and regulations and do not violate public order and good morals.
It should be noted that the ultrasound sectional image in the present embodiment is from a public data set.
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
FIG. 11 shows a schematic block diagram of an example electronic device 1100 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not intended to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 11, the device 1100 comprises a computing unit 1101, which may perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 1102 or a computer program loaded from a storage unit 1108 into a Random Access Memory (RAM) 1103. In the RAM 1103, various programs and data necessary for the operation of the device 1100 may also be stored. The calculation unit 1101, the ROM 1102, and the RAM 1103 are connected to each other by a bus 1104. An input/output (I/O) interface 1105 is also connected to bus 1104.
A number of components in device 1100 connect to I/O interface 1105, including: an input unit 1106 such as a keyboard, a mouse, and the like; an output unit 1107 such as various types of displays, speakers, and the like; a storage unit 1108 such as a magnetic disk, optical disk, or the like; and a communication unit 1109 such as a network card, a modem, a wireless communication transceiver, and the like. The communication unit 1109 allows the device 1100 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The computing unit 1101 can be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 1101 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and the like. The calculation unit 1101 performs the respective methods and processes described above, such as the ultrasound probe navigation method. For example, in some embodiments, the ultrasound probe navigation method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 1108. In some embodiments, part or all of the computer program may be loaded and/or installed onto device 1100 via ROM 1102 and/or communication unit 1109. When the computer program is loaded into RAM 1103 and executed by the computing unit 1101, one or more steps of the ultrasound probe navigation method described above may be performed. Alternatively, in other embodiments, the computing unit 1101 may be configured to perform the ultrasound probe navigation method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuitry, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on a chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The client-server relationship arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server combined with a blockchain.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders; this is not limited herein as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made, depending on design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (19)

1. An ultrasound probe navigation method comprising:
acquiring, during scanning by an operator using an ultrasonic probe, a first ultrasonic sectional image acquired by the ultrasonic probe at a first moment and a second ultrasonic sectional image acquired at a second moment;
determining a first pose change value based on a first pose offset and a second pose offset; wherein the first pose offset is a pose offset between a pose of the ultrasonic probe when the first ultrasonic sectional image is acquired and a target pose, the second pose offset is a pose offset between a pose of the ultrasonic probe when the second ultrasonic sectional image is acquired and the target pose, and the target pose is a standard pose of the ultrasonic probe when scanning a target scanning position;
correcting the first pose change value based on the first ultrasonic sectional image, the second ultrasonic sectional image and the poses of the ultrasonic probe at the first moment and the second moment to obtain a corrected pose change value;
and correcting the second pose offset based on the corrected pose change value, and generating navigation information of the ultrasonic probe based on the corrected second pose offset.
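For illustration only, the arithmetic described in claims 1 and 7 (the change value as the difference of the two offsets, and the corrected second offset as the difference of the first offset and the corrected change value) can be sketched as follows. The 6-element pose-vector layout and both function names are assumptions for this sketch, not part of the patent:

```python
import numpy as np

def first_pose_change(first_offset: np.ndarray, second_offset: np.ndarray) -> np.ndarray:
    """Pose change value between the two acquisitions, derived from the
    model-estimated offsets to the target pose (claim 1)."""
    return first_offset - second_offset

def corrected_second_offset(first_offset: np.ndarray, corrected_change: np.ndarray) -> np.ndarray:
    """Corrected second pose offset: difference between the first pose offset
    and the corrected pose change value (claim 7)."""
    return first_offset - corrected_change

# Assumed layout: (x, y, z translation, roll, pitch, yaw rotation).
o1 = np.array([5.0, 2.0, 0.0, 0.1, 0.0, 0.0])
o2 = np.array([3.0, 1.0, 0.0, 0.05, 0.0, 0.0])
change = first_pose_change(o1, o2)
print(corrected_second_offset(o1, change))
```

Note the consistency check this implies: if the correction step leaves the change value untouched, the corrected second offset collapses back to the model-estimated second offset.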
2. The method of claim 1, wherein the correcting the first pose change value based on the first ultrasonic sectional image, the second ultrasonic sectional image and the poses of the ultrasonic probe at the first time and the second time to obtain a corrected pose change value comprises:
acquiring poses of the ultrasonic probe measured by an Inertial Measurement Unit (IMU) installed in the ultrasonic probe at the first moment and the second moment;
calculating a difference value between the pose of the ultrasonic probe at the first moment and the pose of the ultrasonic probe at the second moment to obtain a second pose change value;
determining a third pose change value based on the first ultrasonic sectional image, the second ultrasonic sectional image and the poses of the ultrasonic probe at the first moment and the second moment;
and correcting the first pose change value based on the second pose change value and the third pose change value to obtain a corrected pose change value.
3. The method of claim 1, wherein after the correcting the second pose offset based on the corrected pose change value, the method further comprises:
judging whether the corrected second pose offset is smaller than a first preset threshold;
if so, determining that the ultrasonic probe has been moved to the target scanning position;
if not, executing the step of generating the navigation information of the ultrasonic probe based on the corrected second pose offset, acquiring an ultrasonic sectional image acquired by the ultrasonic probe at a next moment, taking the current second moment as the first moment and the next moment as the second moment, and returning to the step of determining the first pose change value based on the first pose offset and the second pose offset, until the corrected second pose offset is smaller than the first preset threshold.
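The iterative procedure of claim 3 amounts to a loop that re-estimates and corrects the offset until it falls below the first preset threshold. A minimal sketch, in which `capture_image`, `estimate_offset`, and `revise` are hypothetical stand-ins for the acquisition, estimation, and correction steps of claims 1 and 2:

```python
import numpy as np

def navigate(capture_image, estimate_offset, revise, threshold=1.0, max_steps=100):
    """Iterate the steps of claim 1 until the corrected second pose offset
    drops below the first preset threshold (claim 3)."""
    img1 = capture_image()
    off1 = estimate_offset(img1)
    for _ in range(max_steps):
        img2 = capture_image()
        off2 = estimate_offset(img2)
        change = off1 - off2                      # first pose change value
        corrected_change = revise(change, img1, img2)
        corrected_off2 = off1 - corrected_change  # corrected second pose offset
        if np.linalg.norm(corrected_off2) < threshold:
            return corrected_off2                 # probe reached the target position
        # the current second moment becomes the first moment for the next round
        img1, off1 = img2, corrected_off2
    return corrected_off2  # give up after max_steps iterations
```

In a real system the loop body would also emit the navigation instruction of claim 8 before the next acquisition; that step is omitted here to keep the control flow visible.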
4. The method of any of claims 1-3, wherein the first pose offset and the second pose offset are obtained by:
inputting the first ultrasonic sectional image into a first estimation model to obtain the first pose offset output by the first estimation model;
inputting the second ultrasonic sectional image into the first estimation model to obtain the second pose offset output by the first estimation model;
wherein the first estimation model is a regression model trained on a first preset training set, and the first preset training set comprises a plurality of sample ultrasonic sectional images and, for each sample ultrasonic sectional image, the pose offset between the pose of the ultrasonic probe at acquisition and the target pose; the pose of the ultrasonic probe is obtained by an optical positioning and tracking device or a magnetic positioning and tracking device.
5. The method of claim 2, wherein the determining a third pose change value based on the first ultrasound sectional image, the second ultrasound sectional image, and the pose of the ultrasound probe at the first time and the second time comprises:
inputting the first ultrasonic sectional image, the second ultrasonic sectional image and the poses of the ultrasonic probe at the first moment and the second moment into a second estimation model, and acquiring the third pose change value output by the second estimation model;
wherein the second estimation model is a regression model trained on a second preset training set, and the second preset training set comprises multiple groups of sample data and the pose change value of the ultrasonic probe during collection of each group of sample data; each group of sample data comprises two ultrasonic sectional images acquired at two adjacent moments and the poses of the ultrasonic probe measured by the IMU at the two adjacent moments; the pose change value of the ultrasonic probe is obtained by an optical positioning and tracking device or a magnetic positioning and tracking device.
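The sample-data layout that claim 5 describes for the second preset training set might be represented as follows; all field names are hypothetical, chosen only to mirror the claim text:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class SecondModelSample:
    """One group of sample data for the second estimation model (claim 5)."""
    image_t1: np.ndarray     # ultrasonic sectional image at the first of two adjacent moments
    image_t2: np.ndarray     # ultrasonic sectional image at the second moment
    imu_pose_t1: np.ndarray  # IMU-measured probe pose at the first moment
    imu_pose_t2: np.ndarray  # IMU-measured probe pose at the second moment
    pose_change: np.ndarray  # training label: change value from the optical/magnetic tracker
```

The key design point visible in the claim is that the IMU poses are model inputs while the tracker-measured change value serves only as the regression label, so the tracking device is not needed at inference time.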
6. The method of claim 2, wherein the correcting the first pose change value based on the second and third pose change values to obtain the corrected pose change value comprises:
and filtering the first pose change value, the second pose change value and the third pose change value based on an indirect Kalman filtering method to obtain the corrected pose change value.
7. The method of claim 1, wherein the correcting the second pose offset based on the corrected pose change value comprises:
calculating a difference value between the first pose offset and the corrected pose change value to obtain the corrected second pose offset.
8. The method of claim 1 or 7, wherein the corrected second pose offset comprises an offset distance and an offset angle; and
the generating navigation information of the ultrasonic probe based on the corrected second pose offset comprises:
if the offset distance is larger than a second preset threshold value, converting the offset distance into a moving direction to obtain the navigation information;
and if the offset distance is less than or equal to the second preset threshold, converting the offset angle into a rotating direction to obtain the navigation information.
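The distance-first, then-angle strategy of claim 8 might be sketched as below; the split of the offset vector into translation and rotation components, and the threshold value, are illustrative assumptions:

```python
import numpy as np

def navigation_info(offset, distance_threshold=2.0):
    """Convert the corrected second pose offset into a navigation instruction:
    translate while the offset distance exceeds the second preset threshold,
    otherwise correct the offset angle (claim 8)."""
    translation, rotation = offset[:3], offset[3:]
    distance = np.linalg.norm(translation)
    if distance > distance_threshold:
        # far from target: move along the unit direction of the translation offset
        return ("move", translation / distance)
    # close enough in position: rotate to reduce the remaining angular offset
    return ("rotate", np.sign(rotation))

offset = np.array([3.0, 4.0, 0.0, 0.2, -0.1, 0.0])
kind, direction = navigation_info(offset)
print(kind, direction)
```

Returning a unit direction rather than a raw distance matches the claim's phrasing of "converting the offset distance into a moving direction": the operator is told which way to move, not how far.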
9. An ultrasound probe navigation device comprising:
the ultrasonic scanning device comprises an acquisition module, a processing module and a processing module, wherein the acquisition module is used for acquiring a first ultrasonic sectional image acquired by an ultrasonic probe at a first moment and a second ultrasonic sectional image acquired by the ultrasonic probe at a second moment in the scanning process of an operator by using the ultrasonic probe;
a determination module, configured to determine a first pose change value based on a first pose offset and a second pose offset; wherein the first pose offset is a pose offset between a pose of the ultrasonic probe when the first ultrasonic sectional image is acquired and a target pose, the second pose offset is a pose offset between a pose of the ultrasonic probe when the second ultrasonic sectional image is acquired and the target pose, and the target pose is a standard pose of the ultrasonic probe when scanning a target scanning position;
a modification module, configured to correct the first pose change value based on the first ultrasonic sectional image, the second ultrasonic sectional image and the poses of the ultrasonic probe at the first moment and the second moment, to obtain a corrected pose change value;
and a correction module, configured to correct the second pose offset based on the corrected pose change value, and generate navigation information of the ultrasonic probe based on the corrected second pose offset.
10. The apparatus according to claim 9, wherein the modification module is specifically configured to:
acquiring poses of the ultrasonic probe measured by an Inertial Measurement Unit (IMU) installed in the ultrasonic probe at the first moment and the second moment;
calculating a difference value between the pose of the ultrasonic probe at the first moment and the pose of the ultrasonic probe at the second moment to obtain a second pose change value;
determining a third pose change value based on the first ultrasonic sectional image, the second ultrasonic sectional image and the poses of the ultrasonic probe at the first moment and the second moment;
and correcting the first pose change value based on the second pose change value and the third pose change value to obtain the corrected pose change value.
11. The apparatus of claim 9, the apparatus further comprising:
a judging module, configured to judge whether the corrected second pose offset is smaller than a first preset threshold;
the determination module is further configured to determine that the ultrasonic probe has been moved to the target scanning position if the judgment result of the judging module is yes;
the correction module is further configured to, if the judgment result of the judging module is negative, execute the step of generating the navigation information of the ultrasonic probe based on the corrected second pose offset, trigger the acquisition module to acquire an ultrasonic sectional image acquired by the ultrasonic probe at a next moment, take the current second moment as the first moment and the next moment as the second moment, and trigger the determination module to execute the step of determining the first pose change value based on the first pose offset and the second pose offset, until the judging module determines that the corrected second pose offset is smaller than the first preset threshold.
12. The apparatus according to any of claims 9-11, wherein the determination module is further configured to obtain the first pose offset and the second pose offset by:
inputting the first ultrasonic sectional image into a first estimation model to obtain the first pose offset output by the first estimation model;
inputting the second ultrasonic sectional image into the first estimation model to obtain the second pose offset output by the first estimation model;
wherein the first estimation model is a regression model trained on a first preset training set, and the first preset training set comprises a plurality of sample ultrasonic sectional images and, for each sample ultrasonic sectional image, the pose offset between the pose of the ultrasonic probe at acquisition and the target pose; the pose of the ultrasonic probe is obtained by an optical positioning and tracking device or a magnetic positioning and tracking device.
13. The apparatus according to claim 10, wherein the modification module is specifically configured to:
inputting the first ultrasonic sectional image, the second ultrasonic sectional image and the poses of the ultrasonic probe at the first moment and the second moment into a second estimation model, and acquiring the third pose change value output by the second estimation model;
wherein the second estimation model is a regression model trained on a second preset training set, and the second preset training set comprises multiple groups of sample data and the pose change value of the ultrasonic probe during collection of each group of sample data; each group of sample data comprises two ultrasonic sectional images acquired at two adjacent moments and the poses of the ultrasonic probe measured by the IMU at the two adjacent moments; the pose change value of the ultrasonic probe is obtained by an optical positioning and tracking device or a magnetic positioning and tracking device.
14. The apparatus according to claim 10, wherein the modification module is specifically configured to:
and filtering the first pose change value, the second pose change value and the third pose change value based on an indirect Kalman filtering method to obtain the corrected pose change value.
15. The apparatus according to claim 9, wherein the correction module is specifically configured to:
calculating a difference value between the first pose offset and the corrected pose change value to obtain the corrected second pose offset.
16. The apparatus according to claim 9 or 15, wherein the corrected second pose offset comprises an offset distance and an offset angle;
the correction module is specifically configured to:
if the offset distance is larger than a second preset threshold value, converting the offset distance into a moving direction to obtain the navigation information;
and if the offset distance is less than or equal to the second preset threshold, converting the offset angle into a rotating direction to obtain the navigation information.
17. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-8.
18. A non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1-8.
19. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-8.
CN202211105204.2A 2022-09-09 2022-09-09 Ultrasonic probe navigation method, device, equipment and medium Pending CN115615427A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211105204.2A CN115615427A (en) 2022-09-09 2022-09-09 Ultrasonic probe navigation method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN115615427A true CN115615427A (en) 2023-01-17

Family

ID=84859672

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211105204.2A Pending CN115615427A (en) 2022-09-09 2022-09-09 Ultrasonic probe navigation method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN115615427A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116152610A * 2023-04-04 2023-05-23 北京智源人工智能研究院 Intelligent heart ultrasonic probe pose estimation model training method and pose estimation method
CN116152610B * 2023-04-04 2023-06-23 北京智源人工智能研究院 Intelligent heart ultrasonic probe pose estimation model training method and pose estimation method
CN116671974A * 2023-06-06 2023-09-01 河北大学 Magnetic positioning system for ultrasonic inspection
CN116671974B * 2023-06-06 2024-02-06 河北大学 Magnetic positioning system for ultrasonic inspection
CN116549020A * 2023-07-11 2023-08-08 深圳微创心算子医疗科技有限公司 Ultrasonic detection method, training method and device for digital heart tangential plane network model
CN116549020B * 2023-07-11 2023-11-03 深圳微创心算子医疗科技有限公司 Ultrasonic detection method, training method and device for digital heart tangential plane network model

Similar Documents

Publication Publication Date Title
CN115615427A (en) Ultrasonic probe navigation method, device, equipment and medium
CN112215843B (en) Ultrasonic intelligent imaging navigation method and device, ultrasonic equipment and storage medium
CN111012377B (en) Echocardiogram heart parameter calculation and myocardial strain measurement method and device
CN101243475B (en) Method and apparatus featuring simple click style interactions according to a clinical task workflow
EP2672890A1 (en) Methods, systems, and media for determining carotid intima-media thickness
JP6376873B2 (en) Image processing apparatus, image processing method, and program
US11744554B2 (en) Systems and methods of determining dimensions of structures in medical images
EP3071113B1 (en) Method and apparatus for displaying ultrasound image
WO2015010745A1 (en) Multi-modal segmentation of image data
CN114387317B (en) CT image and MRI three-dimensional image registration method and device
US20150080735A1 (en) Method and apparatus for providing ultrasound information by using guidelines
WO2016128040A1 (en) Preview visualisation of tracked nerve fibers
JP7437192B2 (en) medical image processing device
CN101175442A (en) Imaging diagnosis device, measurement point setting method, and program
US20210113191A1 (en) Image data adjustment method and device
CN110910348A (en) Method, device, equipment and storage medium for classifying positions of pulmonary nodules
JP2021166578A (en) Ultrasound diagnosis device and ultrasound diagnosis system
CN115969414A (en) Method and system for using analytical aids during ultrasound imaging
Aydin et al. A hybrid image processing system for X-ray images of an external fixator
CN112515944B (en) Ultrasound imaging with real-time feedback for cardiopulmonary resuscitation (CPR) compressions
CN111292248B (en) Ultrasonic fusion imaging method and ultrasonic fusion navigation system
CN114787867A (en) Organ deformation compensation for medical image registration
US12016731B2 (en) Ultrasound credentialing system
US10163529B2 (en) Display processing method and apparatus
US20230206576A1 (en) Vessel displaying method, computer device and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination