CN115944317A - Ultrasonic scanning navigation system and ultrasonic scanning method - Google Patents

Ultrasonic scanning navigation system and ultrasonic scanning method

Publication number: CN115944317A
Application number: CN202310109287.0A
Authority: CN (China)
Original language: Chinese (zh)
Inventor: 谷晓林
Assignee (original and current): Lease Medical Technology Suzhou Co ltd
Legal status: Pending (the status is an assumption; Google has not performed a legal analysis)
Classification: Ultrasonic Diagnosis Equipment
Abstract

The invention relates to an ultrasonic scanning navigation system and an ultrasonic scanning method. The system comprises a sensor module, a data acquisition module, a three-dimensional reconstruction module, a target identification module, a scanning navigation module, and a visualization module. The method comprises the following steps: acquire, in real time, ultrasonic image information of the person to be measured and pose information of the sensor module; convert the two-dimensional ultrasonic images of the person to be measured into a three-dimensional volume image with an interpolation algorithm, using the ultrasonic image information and the pose information; construct a target scanning area; compute the area still to be scanned and the optimal scanning path with a physical model; and display both on the visualization module. The visualization module thus guides the operator through a comprehensive and accurate ultrasonic scan, simplifying the scanning process and avoiding missed diagnosis and misdiagnosis.

Description

Ultrasonic scanning navigation system and ultrasonic scanning method
Technical Field
The invention relates to the technical field of medical devices, and in particular to an ultrasonic scanning navigation system and an ultrasonic scanning method.
Background
Ultrasound is a safe, convenient, low-cost medical imaging technique that uses an ultrasound probe to transmit sound waves into the human body and receive the reflected signals, thereby revealing internal anatomical structures. Conventional ultrasonic scanning depends on the skill and experience of the operator: the operator only knows the pose of the ultrasound probe and the corresponding ultrasound image at the current moment, cannot accurately tell which parts have already been scanned and which remain to be scanned, and incomplete scans therefore occur easily, producing missed diagnoses. Furthermore, inexperienced or insufficiently trained operators are not competent to perform ultrasound screening, so the supply of ultrasound screening is limited and cannot meet the enormous market demand.
Disclosure of Invention
Therefore, the invention provides an ultrasonic scanning navigation system and an ultrasonic scanning method that guide the operator through the ultrasonic scanning process using a tracking sensor and visual navigation, simplifying the process and avoiding missed diagnosis and misdiagnosis.
To achieve this purpose, the invention mainly adopts the following technical scheme. An embodiment of the application provides an ultrasonic scanning navigation system, which comprises: a sensor module, used for acquiring ultrasonic image information of the person to be measured and pose information of the sensor module, wherein the ultrasonic image information and the pose information each include corresponding timestamps; a data acquisition module, connected with the sensor module and used for aligning the timestamps between the ultrasonic image information and the pose information; a three-dimensional reconstruction module, connected with the data acquisition module and used for obtaining, from the ultrasonic image information and the pose information in timestamp order, the three-dimensional coordinate system of the ultrasonic image information corresponding to each timestamp through coordinate transformation, and for computing, with an interpolation algorithm, the three-dimensional volume image corresponding to the ultrasonic image information; a target identification module, connected with the three-dimensional reconstruction module and used for performing image processing on the three-dimensional volume image, extracting three-dimensional boundary information of the person to be measured with a segmentation algorithm, and constructing a target scanning area of the person to be measured based on the three-dimensional boundary information; a scanning navigation module, connected with both the three-dimensional reconstruction module and the target identification module and used for inputting the three-dimensional volume image and the target scanning area into a physical model to obtain the area to be scanned, and for calculating the optimal scanning path between the current pose of the sensor module and the area to be scanned; and a visualization module, used for displaying the ultrasonic image information and pose information acquired by the sensor module, together with the target scanning area and the optimal scanning path.
In some embodiments, the sensor module comprises: an ultrasound probe and a tracking sensor, wherein: the ultrasonic probe is used for collecting ultrasonic image information of the person to be measured, the tracking sensor is used for collecting pose information of the sensor module, and the pose information comprises: orientation information of the sensor module and attitude information of the sensor module.
In some embodiments, the three-dimensional reconstruction module is further to: receiving the ultrasonic image information and the pose information, and performing first coordinate transformation from a two-dimensional coordinate system to a tracking sensor coordinate system on the ultrasonic image information by adopting an ultrasonic probe calibration method; performing a second coordinate transformation from the tracking sensor coordinate system to a world coordinate system according to the pose information; and performing third coordinate transformation from the world coordinate system to the three-dimensional coordinate system of the ultrasonic image information through coordinate operation to obtain the three-dimensional coordinate system of the ultrasonic image information corresponding to each timestamp.
In some embodiments, the visualization module is further configured to update the ultrasound image information, the pose information, the target scanning area, and the optimal scanning path in real time according to a timestamp.
An embodiment of the application provides an ultrasonic scanning method, applied to an ultrasonic scanning navigation system and comprising the following steps: acquiring ultrasonic image information of the person to be measured and pose information of the sensor module, wherein the ultrasonic image information and the pose information include corresponding timestamps; aligning the timestamps between the ultrasound image information and the pose information; obtaining, from the ultrasonic image information and the pose information in timestamp order, the three-dimensional coordinate system of the ultrasonic image information corresponding to each timestamp through coordinate transformation, and computing, with an interpolation algorithm, the three-dimensional volume image corresponding to the ultrasonic image information; performing image processing on the three-dimensional volume image, extracting three-dimensional boundary information of the person to be measured with a segmentation algorithm, and constructing a target scanning area of the person to be measured based on the three-dimensional boundary information; inputting the three-dimensional volume image and the target scanning area into a physical model to obtain the area to be scanned, and calculating the optimal scanning path between the current pose of the sensor module and the area to be scanned; displaying the ultrasonic image information and the pose information, together with the target scanning area and the optimal scanning path; and traversing the target scanning area by the operator according to the optimal scanning path.
In some embodiments, the sensor module comprises: an ultrasonic probe and a tracking sensor; the acquiring of the ultrasonic image information of the person to be measured and the pose information of the sensor module comprises: the ultrasonic probe collects ultrasonic image information of the person to be measured, and the tracking sensor collects pose information of the sensor module, wherein the pose information comprises: orientation information of the sensor module and attitude information of the sensor module.
In some embodiments, the obtaining, according to the ultrasound image information and the pose information and according to the timestamp sequence, a three-dimensional coordinate system of the ultrasound image information corresponding to each timestamp through coordinate transformation includes: performing first coordinate transformation from a two-dimensional coordinate system to a tracking sensor coordinate system on the ultrasonic image information by adopting an ultrasonic probe calibration method; performing a second coordinate transformation from the tracking sensor coordinate system to a world coordinate system according to the pose information; and performing third coordinate transformation from the world coordinate system to the three-dimensional coordinate system of the ultrasonic image information through coordinate operation to obtain the three-dimensional coordinate system of the ultrasonic image information corresponding to each timestamp.
In some embodiments, the method further comprises: and updating the ultrasonic image information, the pose information, the target scanning area and the optimal scanning path in real time according to the timestamp.
An embodiment of the present application provides an electronic device, which includes: a processor; and a memory in which is stored a computer program that is loaded and executed by the processor to implement the ultrasound scanning method described above.
The embodiment of the application provides a computer readable storage medium, on which at least one computer program is stored, and the computer program is loaded and executed by a processor to realize the ultrasonic scanning method.
The ultrasonic scanning method provided by the invention uses the tracking sensor of the ultrasonic scanning navigation system to acquire data from the person to be measured, determines the area to be scanned with an internal three-dimensional reconstruction algorithm, accurately indicates the scanned area and the area still to be scanned in a visual form, and provides the operator with optimal path information. The operator is thus guided to complete a comprehensive and thorough ultrasonic scan, the skill threshold for operators is lowered, and missed diagnoses are avoided.
Drawings
Fig. 1 is a schematic structural diagram of an ultrasound scanning navigation system provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of a sensor module;
fig. 3 is a schematic flow chart of an ultrasound scanning method according to an embodiment of the present application.
Detailed Description
In order that the above objects, technical solutions and advantages of the present application can be more clearly understood, the following detailed description is given with reference to the accompanying drawings and specific embodiments. It should be noted that the embodiments and features of the embodiments of the present application can be combined with each other without conflict, and the specific embodiments described below are only used for explaining the present application and are not used for limiting the present application.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application, and the described embodiments are merely some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
In the following description, suffixes such as "module", "component", or "unit" used to denote components are used only for convenience of description and have no specific meaning by themselves; thus "module", "component", and "unit" may be used interchangeably.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an ultrasound scanning navigation system 100 in the embodiment of the present application. In the embodiment shown in fig. 1, the ultrasound scanning navigation system 100 includes a sensor module 101, a data acquisition module 102, a three-dimensional reconstruction module 103, a target identification module 104, a scanning navigation module 105, and a visualization module 106. The components of the ultrasound scanning navigation system 100 may be connected in any manner, including wired or wireless connections. It will be understood by those skilled in the art that the specific structure shown in fig. 1 does not limit the ultrasound scanning navigation system 100: it may include more or fewer components than those shown, certain components are not essential to its constitution, and certain components may be omitted or combined as necessary without changing the essence of the present application.
The following describes the components of the ultrasound scanning navigation system 100 in detail with reference to fig. 1:
Fig. 2 is a schematic structural diagram of the sensor module. As shown in fig. 2, the sensor module 101 includes an ultrasonic probe 111 and a tracking sensor 112; the type and number of sensors in the sensor module 101 can be set according to actual detection requirements. The ultrasound probe 111 acquires ultrasound image information of the person to be measured, and the tracking sensor 112 acquires pose information of the sensor module 101. The ultrasonic probe 111 transmits ultrasonic waves into the person to be measured and receives the reflected echo signals, obtaining ultrasound image information at each moment. Each piece of ultrasound image information acquired by the probe carries the timestamp of the current moment, i.e. there is a mapping between the image information and its timestamp, and the image information itself is a two-dimensional image slice of the person to be measured at that moment. The ultrasonic probe 111 is also provided with an acceleration sensor, from which the velocity, acceleration, and displacement of the probe are obtained. In addition, the surface of the ultrasonic probe 111 carries a layer of flexible material (not shown), which reduces frictional resistance against the human body during measurement and eliminates the measurement errors that air coupling would otherwise cause through reflection loss of ultrasonic energy at the contact interface when the probe touches the body surface directly.
The tracking sensor 112 collects the pose information of the sensor module, which comprises the orientation information and the attitude information of the sensor module 101. The orientation information locates the sensor module 101 in space with 6 degrees of freedom: 3 degrees of freedom of position and 3 degrees of freedom of direction. The attitude information is the attitude of the sensor module at the current moment, for example parallel to the X-Y plane and perpendicular to the X-Z plane. The tracking sensor 112 may consist of one sensor or a combination of several, such as optical, electromagnetic, or inertial sensors. It may be fixed to the outer surface of the ultrasound probe 111 by a customized connector (as shown in fig. 2), or embedded in the probe as an integral unit. The pose information collected by the tracking sensor 112 also carries corresponding timestamps, i.e. pose information and timestamps are in one-to-one correspondence.
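The velocity and displacement that the acceleration sensor mentioned above yields can be recovered by numerically integrating its readings; a minimal sketch using the trapezoidal rule (the sampling interval and single-axis layout are illustrative assumptions, not values from the patent):

```python
def integrate(samples, dt):
    """Trapezoidal integration of a uniformly sampled signal.

    Returns the cumulative integral, starting from zero, with the same
    length as `samples`.
    """
    out = [0.0]
    for a, b in zip(samples, samples[1:]):
        out.append(out[-1] + 0.5 * (a + b) * dt)
    return out

# One-axis acceleration trace in m/s^2, sampled every dt seconds
accel = [0.0, 1.0, 1.0, 0.0]
velocity = integrate(accel, dt=0.1)         # m/s
displacement = integrate(velocity, dt=0.1)  # m
```

In practice such double integration drifts quickly, which is one reason the probe's motion is also tracked by the dedicated tracking sensor.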
Referring to fig. 1, the data acquisition module 102 is connected to the sensor module 101 through a data line or in a wireless manner (for example, a connection manner such as bluetooth, WIFI, 5G, etc.). The data acquisition module 102 receives ultrasound image information acquired by the ultrasound probe 111 and pose information acquired by the tracking sensor 112 in real time, and simultaneously acquires time stamps corresponding to the ultrasound image information and the pose information, and the data acquisition module 102 aligns the time stamps of the ultrasound image information and the pose information to realize time synchronization.
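The timestamp alignment performed by the data acquisition module can be sketched as nearest-timestamp pairing of the two streams (the pairing rule is an assumption for illustration; the patent does not specify how the alignment is computed):

```python
def align_streams(images, poses):
    """Pair each ultrasound frame with the pose whose timestamp is closest.

    Both inputs are lists of (timestamp, payload) tuples sorted by timestamp.
    Returns a list of (image_payload, pose_payload) pairs.
    """
    pairs = []
    j = 0
    for t_img, img in images:
        # advance the pose pointer while the next pose is at least as close in time
        while j + 1 < len(poses) and abs(poses[j + 1][0] - t_img) <= abs(poses[j][0] - t_img):
            j += 1
        pairs.append((img, poses[j][1]))
    return pairs

images = [(0.00, "frame0"), (0.04, "frame1"), (0.08, "frame2")]
poses = [(0.01, "pose0"), (0.05, "pose1"), (0.07, "pose2")]
matched = align_streams(images, poses)
```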
The three-dimensional reconstruction module 103 is connected to the data acquisition module 102. The three-dimensional reconstruction module 103 first transforms the ultrasound image information from its two-dimensional coordinate system to the tracking-sensor coordinate system using an ultrasound probe calibration method; the ultrasound image information is a two-dimensional image slice of the person to be measured, i.e. it has a coordinate matrix in a two-dimensional coordinate system, and the first coordinate transformation maps this matrix into the tracking-sensor coordinate system. The first transformation is denoted T_{S←I}. Subsequently, the three-dimensional reconstruction module 103 performs a second coordinate transformation, from the tracking-sensor coordinate system to the world coordinate system, according to the pose information acquired by the tracking sensor; this transformation is denoted T_{W←S}. Finally, the three-dimensional reconstruction module 103 performs a third coordinate transformation, from the world coordinate system to the three-dimensional coordinate system of the ultrasound image information, denoted T_{V←W}, obtaining the three-dimensional coordinates of the ultrasound image information corresponding to each timestamp. After these three coordinate transformations, the two-dimensional coordinates of the ultrasound images acquired by the probe are mapped into a three-dimensional coordinate system, and a three-dimensional volume image is computed from the resulting coordinates with an interpolation algorithm.
Here, the calculation may be performed using a variety of interpolation algorithms, for example, nearest neighbor interpolation, trilinear interpolation, and the like. Through three-dimensional reconstruction, the two-dimensional ultrasonic image information of the person to be measured acquired by the sensor module 101 is converted into a three-dimensional volume image so as to restore the actual ultrasonic scanning image of the person to be measured.
The target identification module 104 is connected with the three-dimensional reconstruction module 103; it performs image processing on the three-dimensional volume image obtained by the three-dimensional reconstruction module 103, extracts three-dimensional boundary information of the person to be measured using deep learning (neural network) and segmentation algorithms, and constructs a target scanning area of the person to be measured based on the three-dimensional boundary information. For example, if a breast area is to be scanned with the ultrasound scanning navigation system 100, the three-dimensional boundary of the breast is extracted with a segmentation algorithm such as a deep neural network or a level set, and the target scanning area, i.e. the area that the breast scan must cover, is constructed from this boundary. The target scanning area comprises the breast region and a safety margin around the breast.
The scanning navigation module 105 is connected with both the three-dimensional reconstruction module 103 and the target identification module 104. It inputs the three-dimensional volume image and the target scanning area into a physical model built in advance from prior knowledge; the model outputs the area to be scanned, and the module then calculates the optimal scanning path from the current pose of the sensor module to the area to be scanned.
The visualization module 106 is connected with the scanning navigation module 105, and is configured to display the ultrasound image information and the pose information acquired by the sensor module, and simultaneously display a target scanning area and an optimal scanning path; here, the visualization module 106 is further configured to update the ultrasound image information, the pose information, the target scanning area, and the optimal scanning path in real time according to the timestamp. The visualization module 106 may include a Display panel, which may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), and the like.
An ultrasound scanning method provided in an embodiment of the present application is applied to an ultrasound scanning navigation system, for example the ultrasound scanning navigation system 100 described above. Fig. 3 is a schematic flow diagram of the ultrasound scanning method; the scanning process is described below taking the ultrasound scanning navigation system 100 as an example. As shown in fig. 3, the method includes:
step S101: the method comprises the steps of collecting ultrasonic image information of a person to be measured and pose information of a sensor module, wherein the second ultrasonic image information and the pose information respectively comprise corresponding timestamps. Here, the sensor module 101 acquires ultrasound image information and sensor pose information, wherein an ultrasound probe 111 in the sensor module acquires ultrasound image information in real time, and a tracking sensor 112 in the sensor module acquires pose information in real time.
Step S102: aligning timestamps between the ultrasound image information and the pose information; here, the sensor module 101 sends the ultrasound image information and the pose information to the data acquisition module 102, and the data acquisition module 102 extracts corresponding timestamps in the ultrasound image information and the pose information and aligns the timestamps of the ultrasound image information and the pose information to achieve time synchronization.
Step S103: and according to the ultrasonic image information and the pose information and the sequence of the timestamps, obtaining a three-dimensional coordinate system of the ultrasonic image information corresponding to each timestamp through coordinate transformation, and calculating the three-dimensional coordinate system by adopting an interpolation algorithm to obtain a three-dimensional volume image corresponding to the ultrasonic image.
In the embodiment of the present application, the three-dimensional reconstruction module 103 performs three-dimensional reconstruction on a series of two-dimensional ultrasound images. From the two-dimensional ultrasound images acquired by the probe, the pose information acquired by the tracking sensor, and related reference information such as the relative position of the probe and the tracking sensor, it establishes the relations among the ultrasound-image two-dimensional coordinate system, the tracking-sensor coordinate system, the world coordinate system, and the ultrasound-image three-dimensional coordinate system. The two-dimensional ultrasound images are then mapped into real three-dimensional physical space through a series of coordinate transformations T, restoring the three-dimensional image of the person to be measured. The process can be expressed as:
V = T_{V←W} T_{W←S} T_{S←I} P    (1)
where V is the physical position of a voxel in the reconstructed three-dimensional volume image; P is the physical position of a pixel in the two-dimensional plane of an ultrasound image acquired by the ultrasound probe; T_{S←I} is the first coordinate transformation, from the ultrasound-image two-dimensional coordinate system I to the tracking-sensor coordinate system S; T_{W←S} is the second coordinate transformation, from the tracking-sensor coordinate system S to the world coordinate system W; and T_{V←W} is the third coordinate transformation, from the world coordinate system W to the ultrasound-image three-dimensional coordinate system V.
Here, each coordinate transformation T is a matrix composed of a rotation and a translation; T_{V←W}, T_{W←S}, and T_{S←I} can each be written in the form of equation (2):

    T = [ R  t ]
        [ 0  1 ]    (2)

where R is a 3×3 rotation matrix and t is a 3×1 translation vector, and T stands in turn for T_{V←W}, T_{W←S}, and T_{S←I} according to the transformation process above.
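The chain of equation (1) can be made concrete with plain 4×4 homogeneous matrices; the matrices below are toy translations standing in for calibrated values, purely for illustration:

```python
def mat_mul(A, B):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)] for i in range(4)]

def mat_vec(A, v):
    """Apply a 4x4 matrix to a homogeneous 4-vector."""
    return [sum(A[i][k] * v[k] for k in range(4)) for i in range(4)]

def translation(tx, ty, tz):
    """Homogeneous transform with identity rotation R and translation t = (tx, ty, tz)."""
    return [[1, 0, 0, tx],
            [0, 1, 0, ty],
            [0, 0, 1, tz],
            [0, 0, 0, 1]]

# Toy stand-ins for T_{S<-I}, T_{W<-S}, T_{V<-W}
T_SI = translation(1, 0, 0)  # image plane -> tracking-sensor frame
T_WS = translation(0, 2, 0)  # tracking-sensor frame -> world frame
T_VW = translation(0, 0, 3)  # world frame -> volume frame

P = [0.5, 0.5, 0.0, 1.0]     # pixel position in the 2D image plane (z = 0)
V = mat_vec(mat_mul(mat_mul(T_VW, T_WS), T_SI), P)  # V = T_{V<-W} T_{W<-S} T_{S<-I} P
```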
According to the formula (1), the specific process of three-dimensional reconstruction is as follows:
step S31: and receiving the ultrasonic image information and the pose information, and performing first coordinate transformation from a two-dimensional coordinate system to a tracking sensor coordinate system on the ultrasonic image information by adopting an ultrasonic probe calibration method. The ultrasound image information is a two-dimensional image slice of the person to be examined, i.e. the ultrasound image information has a coordinate matrix in a two-dimensional coordinate system, the first coordinate transformation T S←I I.e. transforming the coordinate matrix in the two-dimensional coordinate system into a coordinate matrix in the tracking sensor coordinate system. First coordinate transformation T S←I Can be obtained by using ultrasonic probe calibration method, and the first coordinate transformation T S←I The ultrasonic scanning and three-dimensional reconstruction process is kept constant.
Step S32: performing a second coordinate transformation T from the tracking sensor coordinate system to the world coordinate system according to pose information acquired by the tracking sensor W←S
Step S33: performing a third coordinate transformation T from the world coordinate system to a three-dimensional coordinate system of the ultrasound image information by coordinate calculation V←W . Here, the third coordinate transformation T is performed based on the world coordinate system obtained in step S32 V←W ,T V←W Can be obtained by coordinate operation, and the third coordinate is transformed by T V←W The ultrasonic scanning and three-dimensional reconstruction process is kept constant.
Through the above steps, all ultrasound image information acquired by the ultrasound probe is processed by traversing formula (1): the two-dimensional coordinate system of the ultrasound image information is converted into its three-dimensional coordinate system, and an interpolation and hole-filling algorithm such as nearest-neighbor or trilinear interpolation then forms three-dimensional grid data with a definite physical spacing, i.e. the reconstructed ultrasound three-dimensional volume image. After three-dimensional reconstruction, the pose of each ultrasound image is restored in a real three-dimensional coordinate system, i.e. the obtained ultrasound three-dimensional volume image is the three-dimensional ultrasound image of the actual object scanned by the probe.
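The simplest member of the interpolation family named above, nearest-neighbor binning, can be sketched as follows (grid size and samples are illustrative; a real reconstruction would also fill the holes between bins):

```python
def bin_points(points, shape):
    """Nearest-neighbor binning of (x, y, z, value) samples into a voxel grid.

    `shape` is (nx, ny, nz); samples that round to an index outside the grid
    are dropped. Returns a dict from voxel index tuples to the binned value.
    """
    volume = {}
    for x, y, z, value in points:
        idx = (round(x), round(y), round(z))
        if all(0 <= i < n for i, n in zip(idx, shape)):
            volume[idx] = value
    return volume

# Transformed pixel samples in volume coordinates, with intensities
samples = [(0.2, 0.1, 0.0, 10), (0.9, 1.1, 0.1, 20), (2.6, 0.0, 0.0, 30)]
vol = bin_points(samples, (3, 3, 3))  # the third sample rounds out of range
```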
Step S104: and carrying out image processing on the three-dimensional volume image, extracting three-dimensional boundary information of the person to be detected by adopting a segmentation algorithm, and constructing a target scanning area of the person to be detected based on the three-dimensional boundary information. Here, the target recognition module 104 receives the three-dimensional volume image, extracts the three-dimensional boundary information of the person to be measured by using a segmentation algorithm such as deep learning (neural network) and level set, and constructs a target scanning area, i.e., an area that the ultrasonic scanning must cover.
Step S105: inputting the three-dimensional volume image and the target scanning area into a physical model, outputting to obtain an area to be scanned, and calculating the best scanning path between the pose information of the sensor module and the area to be scanned at the current moment according to the area to be scanned. Here, the scanning navigation module 105 builds and trains a physical model in advance by using the prior knowledge, inputs the three-dimensional volume image and the target scanning area into the physical model, and outputs the area to be scanned.
Step S106: displaying the ultrasonic image information and pose information acquired by the ultrasonic probe, and simultaneously displaying the target scanning area and the optimal scanning path. Here, the ultrasonic probe, the ultrasonic image information, the target scanning area, the area to be scanned, and the optimal scanning path are displayed through the visualization module 106. The three-dimensional renderings of the ultrasonic probe and the ultrasonic images are updated in real time from the feedback of the tracking sensor and the ultrasonic probe; the target scanning area is highlighted and updated in real time from the feedback of the target identification module. The scanned area and the area to be scanned are highlighted in contrasting colors and updated in real time from the feedback of the scanning navigation module. The optimal path from the ultrasonic probe to the adjacent area to be scanned is highlighted and likewise updated in real time from the feedback of the scanning navigation module.
Step S107: the operator traverses the target scanning area along the optimal scanning path. Assisted by the content displayed on the visualization module, the operator traverses the target scanning area, achieving full coverage of the area to be scanned and avoiding missed scans and missed diagnoses.
Ultrasound is an effective means of breast screening and is better suited to dense breasts than X-ray (molybdenum-target) mammography. In breast ultrasound screening, the entire breast tissue is imaged by ultrasound so that all suspicious breast nodules are detected. The ultrasonic scanning navigation system provided by the embodiments of the present application can perform ultrasonic scanning of the breast; the scanning process includes:
When ultrasonic scanning is to be started, the ultrasonic scanning navigation system is opened by issuing a start instruction, for example via a start button, or via a sensing device built into the ultrasonic probe that detects the probe being picked up. The operator then picks up the ultrasonic probe and scans along the breast region of the person to be measured. The ultrasonic probe acquires the two-dimensional ultrasonic image information corresponding to each moment, and the tracking sensor acquires the pose information of the sensor module corresponding to each moment; the acquired ultrasonic image information and pose information are sent to the data acquisition module. The data acquisition module aligns the timestamps of the ultrasonic image information and the pose information and forwards the data to the three-dimensional reconstruction module, which performs three-dimensional reconstruction, converts the two-dimensional ultrasonic image information into a three-dimensional volume image, and displays it on the visualization module. The target identification module constructs the target scanning area of the person to be measured based on the three-dimensional volume image and displays it on the visualization module; the target scanning area comprises the breast region and a safety margin around the breast. The scanning navigation module then inputs the three-dimensional volume image and the target scanning area into the physical model to obtain the scanned area and the optimal scanning path, which are displayed on the visualization module. In this way, the operator conducts the ultrasonic scanning process under the optimal-path guidance of the visualization module until the target scanning area has been traversed.
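The timestamp-alignment step performed by the data acquisition module is not specified beyond "aligning". A common approach, sketched here as an assumption, is to pair each ultrasound frame with the pose sample whose timestamp is closest:

```python
import bisect

def align_streams(image_events, pose_events):
    """Pair each (timestamp, frame) with the pose whose timestamp is nearest.

    image_events / pose_events: lists of (timestamp, payload),
    timestamps sorted ascending. Returns a list of (frame, pose) pairs.
    """
    pose_ts = [t for t, _ in pose_events]
    pairs = []
    for t, frame in image_events:
        i = bisect.bisect_left(pose_ts, t)
        # Candidate neighbours: the pose just before and just after t.
        cand = [j for j in (i - 1, i) if 0 <= j < len(pose_ts)]
        j = min(cand, key=lambda k: abs(pose_ts[k] - t))
        pairs.append((frame, pose_events[j][1]))
    return pairs
```

Interpolating between the two neighboring poses instead of picking the nearest one would be a natural refinement when the pose stream is sampled more slowly than the image stream.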
According to the ultrasonic scanning navigation system and the ultrasonic scanning method provided by the embodiments of the present application, data of the person to be measured are acquired in real time by the sensor module, and the scanned and unscanned parts are accurately indicated in a visual manner, thereby guiding the ultrasound operator to complete a comprehensive and thorough scan, lowering the skill threshold for operators, and avoiding missed diagnoses.
Embodiments of the present application provide an electronic device including, but not limited to, a processor, a memory, and a computer program stored in the memory and executable on the processor. For example, the computer program is an ultrasound scanning navigation program. The computer program is loaded and executed by the processor to implement the ultrasound scanning method described above.
The processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
The memory may be used to store the computer programs and/or modules/units, and the processor implements the various functions of the electronic device by running or executing the computer programs and/or modules/units stored in the memory and by calling the data stored in the memory. In addition, the memory may include volatile and non-volatile memory, such as a hard disk, an internal memory, a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, a flash card, at least one magnetic disk storage device, a flash memory device, or another storage device.
If the integrated modules/units of the electronic device are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods of the embodiments described above may be realized by a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the method embodiments described above are realized.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and those skilled in the art can make various changes, modifications, substitutions and alterations without departing from the principle and spirit of the present invention, and the scope of the present invention is defined by the appended claims and their equivalents.

Claims (10)

1. An ultrasonic scanning navigation system, characterized in that the system comprises:
the sensor module is used for acquiring ultrasonic image information of a person to be measured and pose information of the sensor module, wherein the ultrasonic image information and the pose information respectively comprise corresponding timestamps;
the data acquisition module is connected with the sensor module and is used for aligning a timestamp between the ultrasonic image information and the pose information;
the three-dimensional reconstruction module is connected with the data acquisition module and used for obtaining a three-dimensional coordinate system of the ultrasonic image information corresponding to each timestamp through coordinate transformation according to the ultrasonic image information and the pose information and the sequence of the timestamps, and calculating the three-dimensional coordinate system by adopting an interpolation algorithm to obtain a three-dimensional volume image corresponding to the ultrasonic image information;
the target identification module is connected with the three-dimensional reconstruction module and used for carrying out image processing on the three-dimensional volume image, extracting three-dimensional boundary information of the person to be detected by adopting a segmentation algorithm and constructing a target scanning area of the person to be detected based on the three-dimensional boundary information;
the scanning navigation module is respectively connected with the three-dimensional reconstruction module and the target identification module and is used for inputting the three-dimensional volume image and the target scanning area into a physical model to obtain an area to be scanned and calculating the optimal scanning path between the pose information of the sensor module and the area to be scanned at the current moment according to the area to be scanned;
and the visualization module is used for displaying the ultrasonic image information and the pose information acquired by the sensor module and simultaneously displaying a target scanning area and an optimal scanning path.
2. The system of claim 1, wherein the sensor module comprises: an ultrasound probe and a tracking sensor, wherein:
the ultrasonic probe is used for collecting ultrasonic image information of the person to be measured, the tracking sensor is used for collecting pose information of the sensor module, and the pose information comprises: orientation information of the sensor module and attitude information of the sensor module.
3. The system of claim 2, wherein the three-dimensional reconstruction module is further configured to:
receiving the ultrasonic image information and the pose information, and performing first coordinate transformation from a two-dimensional coordinate system to a tracking sensor coordinate system on the ultrasonic image information by adopting an ultrasonic probe calibration method;
performing a second coordinate transformation from the tracking sensor coordinate system to a world coordinate system according to the pose information;
and performing third coordinate transformation from the world coordinate system to the three-dimensional coordinate system of the ultrasonic image information through coordinate operation to obtain the three-dimensional coordinate system of the ultrasonic image information corresponding to each timestamp.
4. The system of claim 3, wherein the visualization module is further configured to update the ultrasound image information, the pose information, the target scanning area, and the optimal scanning path in real time according to a timestamp.
5. An ultrasonic scanning method is applied to an ultrasonic scanning navigation system, and is characterized by comprising the following steps:
acquiring ultrasonic image information of a person to be measured and pose information of a sensor module, wherein the ultrasonic image information and the pose information comprise corresponding timestamps;
aligning timestamps between the ultrasound image information and the pose information;
obtaining a three-dimensional coordinate system of the ultrasonic image information corresponding to each timestamp through coordinate transformation according to the ultrasonic image information and the pose information and the sequence of the timestamps, and calculating the three-dimensional coordinate system by adopting an interpolation algorithm to obtain a three-dimensional volume image corresponding to the ultrasonic image;
performing image processing on the three-dimensional volume image, extracting three-dimensional boundary information of the person to be detected by adopting a segmentation algorithm, and constructing a target scanning area of the person to be detected based on the three-dimensional boundary information;
inputting the three-dimensional volume image and the target scanning area into a physical model, outputting to obtain an area to be scanned, and calculating the optimal scanning path between the pose information of the sensor module and the area to be scanned at the current moment according to the area to be scanned;
displaying the ultrasonic image information and the pose information, and simultaneously displaying the target scanning area and the optimal scanning path;
and traversing the target scanning area by the operator according to the optimal scanning path.
6. The method of claim 5, wherein the sensor module comprises: an ultrasonic probe and a tracking sensor; the acquiring of the ultrasonic image information of the person to be measured and the pose information of the sensor module comprises:
the ultrasonic probe collects ultrasonic image information of the person to be measured, and the tracking sensor collects pose information of the sensor module, wherein the pose information comprises: orientation information of the sensor module and attitude information of the sensor module.
7. The method according to claim 6, wherein obtaining the three-dimensional coordinate system of the ultrasound image information corresponding to each timestamp through coordinate transformation according to the order of the timestamps based on the ultrasound image information and the pose information comprises:
performing first coordinate transformation from a two-dimensional coordinate system to a tracking sensor coordinate system on the ultrasonic image information by adopting an ultrasonic probe calibration method;
performing a second coordinate transformation from the tracking sensor coordinate system to a world coordinate system according to the pose information;
and performing third coordinate transformation from the world coordinate system to the three-dimensional coordinate system of the ultrasonic image information through coordinate operation to obtain the three-dimensional coordinate system of the ultrasonic image information corresponding to each timestamp.
8. The method of claim 7, further comprising: and updating the ultrasonic image information, the pose information, the target scanning area and the optimal scanning path in real time according to the timestamp.
9. An electronic device, characterized in that the electronic device comprises:
a processor; and
a memory having stored therein a computer program that is loaded and executed by the processor to implement the ultrasound scanning method of any of claims 5 to 7.
10. A computer-readable storage medium, on which at least one computer program is stored, which is loaded and executed by a processor to implement an ultrasound scanning method according to any of claims 5 to 7.
CN202310109287.0A 2023-02-14 2023-02-14 Ultrasonic scanning navigation system and ultrasonic scanning method Pending CN115944317A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310109287.0A CN115944317A (en) 2023-02-14 2023-02-14 Ultrasonic scanning navigation system and ultrasonic scanning method


Publications (1)

Publication Number Publication Date
CN115944317A true CN115944317A (en) 2023-04-11

Family

ID=87287841

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310109287.0A Pending CN115944317A (en) 2023-02-14 2023-02-14 Ultrasonic scanning navigation system and ultrasonic scanning method

Country Status (1)

Country Link
CN (1) CN115944317A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117351037A (en) * 2023-12-04 2024-01-05 合肥合滨智能机器人有限公司 Rotary and parallel moving type equidistant breast scanning track planning method
CN117351037B (en) * 2023-12-04 2024-02-09 合肥合滨智能机器人有限公司 Rotary and parallel moving type equidistant breast scanning track planning method

Similar Documents

Publication Publication Date Title
US20220000449A1 (en) System and methods for at home ultrasound imaging
CN112215843B (en) Ultrasonic intelligent imaging navigation method and device, ultrasonic equipment and storage medium
US20180271484A1 (en) Method and systems for a hand-held automated breast ultrasound device
CN101449985B (en) Anatomical modeling from a 3-d image and surface mapping
US10881353B2 (en) Machine-guided imaging techniques
JP5858636B2 (en) Image processing apparatus, processing method thereof, and program
US10219782B2 (en) Position correlated ultrasonic imaging
CN102300505B (en) Ultrasonic diagnostic device and control program for displaying image data
CN100450445C (en) Real-time, freedom-arm, three-D ultrasonic imaging system and method therewith
US11642096B2 (en) Method for postural independent location of targets in diagnostic images acquired by multimodal acquisitions and system for carrying out the method
CN105407811A (en) Method and system for 3D acquisition of ultrasound images
CN104902839B (en) Use the registering system and method for ultrasonic probe
JP7362354B2 (en) Information processing device, inspection system and information processing method
US11712224B2 (en) Method and systems for context awareness enabled ultrasound scanning
CN106333700A (en) Medical imaging apparatus and method of operating same
JP7321836B2 (en) Information processing device, inspection system and information processing method
CN107106128A (en) Supersonic imaging device and method for splitting anatomical object
KR20170084435A (en) Ultrasound imaging apparatus and control method for the same
CN112386278A (en) Method and system for camera assisted ultrasound scan setup and control
CN115944317A (en) Ultrasonic scanning navigation system and ultrasonic scanning method
CN112603373A (en) Method and system for diagnosing tendon injury via ultrasound imaging
KR20200104103A (en) Ultrasound diagnosis apparatus for registrating an ultrasound image and other modality image and method for operating the same
CN109907801A (en) One kind can position ultrasound guided puncture method
KR20150031091A (en) Method and apparatus for providing ultrasound information using guidelines
CN110418610A (en) Determine guidance signal and for providing the system of guidance for ultrasonic hand-held energy converter

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination