CN109171817B - Three-dimensional breast ultrasonic scanning method and ultrasonic scanning system - Google Patents

Three-dimensional breast ultrasonic scanning method and ultrasonic scanning system

Info

Publication number
CN109171817B
CN109171817B
Authority
CN
China
Prior art keywords
image
ring
probe
ultrasound
breast
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811031335.4A
Other languages
Chinese (zh)
Other versions
CN109171817A (en)
Inventor
朱轲
梁浈
檀韬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Shenbo Medical Technology Co ltd
Original Assignee
Zhejiang Shenbo Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Shenbo Medical Technology Co ltd filed Critical Zhejiang Shenbo Medical Technology Co ltd
Priority to CN201811031335.4A priority Critical patent/CN109171817B/en
Publication of CN109171817A publication Critical patent/CN109171817A/en
Application granted granted Critical
Publication of CN109171817B publication Critical patent/CN109171817B/en

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 — Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 — Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0825 — Detecting organic movements or changes for diagnosis of the breast, e.g. mammography
    • A61B 8/48 — Diagnostic techniques
    • A61B 8/483 — Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B 8/52 — Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 — Devices involving processing of medical diagnostic data
    • A61B 8/5238 — Devices for combining image data of patient, e.g. merging several images from different acquisition modes into one image

Abstract

The application relates to the field of ultrasound scanning and discloses a three-dimensional breast ultrasound scanning method and an ultrasound scanning system with which the position of the nipple relative to a side-position ultrasound image can be determined accurately and automatically. The method comprises: placing a probe at the center of the breast for ultrasound scanning to obtain a first ultrasound image that covers the nipple position, while a camera shoots a first video image containing a marker ring; determining the nipple position from the first ultrasound image; placing the probe at a side position of the breast for ultrasound scanning to obtain a second ultrasound image, while the camera shoots a second video image containing the marker ring; and performing image analysis on the first and second video images to determine the relative position of the second ultrasound image with respect to the nipple position based on the position of the marker ring in the two video images.

Description

Three-dimensional breast ultrasonic scanning method and ultrasonic scanning system
Technical Field
The application relates to the field of ultrasound scanning, and in particular to breast ultrasound scanning technology.
Background
When an automated breast ultrasound scanning system is used today, the three-dimensional image from one automatic scan covers only part of the breast. In diagnosis or screening, to cover the entire breast, the operator typically places the ultrasound probe at each of three positions on the breast, obtaining three three-dimensional images that together cover the whole breast area. These three positions are typically AP (center of the breast), MED (medial side of the breast), and LAT (lateral side of the breast); FIG. 1 shows the three positions on the right breast. The resulting three-dimensional images are labeled with their corresponding positions to facilitate reading by the physician; this annotation is typically entered manually by the operator at the scanning workstation. Because the position of each image pixel relative to the nipple must be determined, after the scan is finished a worker manually marks the nipple position on the scanned image, so that when a lesion is found its position relative to the nipple can be determined from the ultrasound image.
The problem with the prior art is that when the ultrasound image is acquired at a side position, the nipple may not be covered in the image. In that case the approximate location of the nipple can only be estimated from experience, and if a lesion is found in a side-position image, its computed position relative to the nipple may be inaccurate.
Disclosure of Invention
The application aims to provide a three-dimensional breast ultrasound scanning method and an ultrasound scanning system that can automatically determine the scanning position of the current scan image and accurately and automatically determine the position of the nipple relative to a side-position ultrasound image.
In order to solve the above problems, the present application discloses a method for performing three-dimensional breast ultrasound scanning with an ultrasound scanning system. The system includes a manipulator and a camera; one end of the manipulator is provided with an ultrasound scanning probe, and the manipulator is provided with a marker ring that moves along with the probe. The method includes:
placing the probe at the center of the breast for ultrasound scanning to obtain a first ultrasound image that covers the nipple position, while the camera shoots a first video image containing the marker ring;
determining the nipple position from the first ultrasound image;
placing the probe at a side position of the breast for ultrasound scanning to obtain a second ultrasound image, while the camera shoots a second video image containing the marker ring;
performing image analysis on the first and second video images, and determining the relative position of the second ultrasound image with respect to the nipple position based on the position of the marker ring in the first and second video images.
In a preferred embodiment, the marker ring consists of a plurality of rings of different colors arranged side by side.
In a preferred embodiment, the image analysis of the first and second video images further comprises:
calculating the actual physical movement distance of the probe, where movement distance = (actual diameter of the marker ring / number of pixels along the marker ring's edge) × number of pixels the marker ring moves between the first and second video images;
calculating the relative displacement of the second ultrasound image and the nipple position from the actual physical movement distance of the probe.
In a preferred embodiment, the relative position includes: on which side of the breast the probe is located, and/or the relative displacement between the second ultrasound image and the nipple position.
In a preferred embodiment, the step of performing image analysis on the first and second video images is further followed by:
stitching the first and second ultrasound images according to the position of the marker ring in the first and second video images.
In a preferred embodiment, the step of performing image analysis on the first and second video images further includes:
determining a lesion location in the second ultrasound image;
determining a relative displacement vector of the lesion location and the nipple location based on the locations of the marker rings in the first and second video images.
In a preferred embodiment, the side position includes the medial side of the breast (MED) and the lateral side of the breast (LAT).
In a preferred embodiment, the second ultrasound image does not cover the nipple position.
In a preferred embodiment, the position of the camera and the shooting parameters are fixed.
The application also discloses an ultrasound scanning system, comprising:
a manipulator, a camera and a calculation module;
one end of the manipulator is provided with an ultrasound scanning probe, and the manipulator is provided with a marker ring that moves along with the probe;
the camera is used for shooting video images containing the marker ring;
the calculation module is used for identifying the position of the marker ring in the video images shot by the camera and calculating the displacement of the probe from the change in the marker ring's position.
Compared with the prior art, the technical solution of the application can accurately and automatically determine the position of the nipple relative to a side-position ultrasound image, including automatically determining on which side the current ultrasound image was scanned, and can thereby accurately determine the position of a lesion relative to the nipple.
Further, using multi-colored rings as the marker ring improves the accuracy with which the marker ring is correctly recognized.
Further, from the known size of the marker ring and the number of pixels it occupies in the video image, the physical length corresponding to each pixel can be calculated; from this, the moving distance of the probe and, further, the physical relative displacement between the second ultrasound image and the nipple position can be calculated.
Further, because the camera position is fixed, the background of the first and second video images does not move, and little computation is needed to calculate the relative movement of the marker ring.
The present specification describes a number of technical features distributed among the various technical solutions; listing all possible combinations of these features (i.e., all technical solutions) would make the specification excessively long. To avoid this, the technical features disclosed in the above summary of the invention, the technical features disclosed in the embodiments and examples below, and the technical features disclosed in the drawings may be freely combined with one another to form new technical solutions (all of which are considered described in this specification), unless such a combination is technically infeasible. For example, if one example discloses feature A+B+C and another discloses feature A+B+D+E, where C and D are equivalent means serving the same purpose so that technically only one of them would be used at a time, and E can technically be combined with C, then the solution A+B+C+D is not considered described (because it is infeasible), while the solution A+B+C+E is considered described.
Drawings
FIG. 1 is a schematic representation of the three positions of AP, MED and LAT during a prior art breast ultrasound scan of the right breast;
FIG. 2 is a schematic flow chart of a method for three-dimensional breast ultrasound scanning using an ultrasound scanning system in a first embodiment of the present application;
FIG. 3 is a schematic structural diagram of the main components of an ultrasound scanning system according to the first and second embodiments of the present application;
FIG. 4 is a schematic diagram of the computer processing flow, taking a red and green marker ring as an example, according to an embodiment of the present application.
Detailed Description
In the following description, numerous technical details are set forth in order to provide a better understanding of the present application. However, it will be understood by those skilled in the art that the technical solutions claimed in the present application may be implemented without these technical details and with various changes and modifications based on the following embodiments.
Description of partial concepts:
AP: physical position at the center of the breast
MED: physical position on the medial side of the breast (near the midline)
LAT: physical position on the lateral side of the breast (near the arm)
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
A first embodiment of the invention relates to a method of three-dimensional breast ultrasound scanning using an ultrasound scanning system. Fig. 2 is a schematic flow chart of this method. Fig. 3 shows the ultrasound scanning system used in the method, which comprises a manipulator 302 and a camera 301; one end of the manipulator is provided with an ultrasound scanning probe 304, and the manipulator is provided with a marker ring 303 that moves along with the probe.
The method for breast ultrasonic scanning comprises the following steps:
in step 201, a probe is placed at a central position of a breast for ultrasonic scanning, so as to obtain a first ultrasonic image, and the camera takes a first video image containing a marker ring, wherein the first ultrasonic image covers a nipple position.
Thereafter, step 202 is entered to determine the position of the nipple from the first ultrasound image. Optionally, the nipple position is determined automatically by image analysis or image recognition. Alternatively, a human-machine interface is provided and the operator marks the nipple position on the first ultrasound image with an input tool such as a mouse or a touch screen.
Thereafter, step 203 is entered: the probe is placed at a side position of the breast (e.g., the MED or LAT position) for an ultrasound scan to obtain a second ultrasound image, while the camera takes a second video image containing the marker ring. Usually the nipple position is not covered in the second ultrasound image. Optionally, in some application scenarios, the side position need not be a canonical MED or LAT position but can be any other side position that does not cover the nipple.
Thereafter, step 204 is entered for image analysis of the first and second video images to determine the relative position of the second ultrasound image with respect to the nipple position based on the position of the marker ring in the first and second video images. Optionally, the relative position includes which side of the breast the probe is on (e.g., MED position, or LAT position). Optionally, the relative position comprises a relative displacement of the second ultrasound image and the nipple position.
The above process is a three-dimensional breast ultrasound scanning workflow by which the position of the nipple relative to a side-position ultrasound image can be determined accurately and automatically. This makes it easier for the doctor to report the position of a lesion relative to the nipple accurately and effectively, including automatically determining on which side (e.g., the MED or LAT position) the current ultrasound image was scanned and, further, accurately determining the position of the lesion relative to the nipple. With this workflow, all scanned images contain nipple position information, which is a great help to the reading physician.
Preferably, the marker ring consists of a plurality of rings of different colors placed side by side. Using multi-colored rings as the marker ring improves the accuracy with which it is correctly identified. For example, with red and green: because red and green rarely appear in an examination room, and appear side by side even more rarely, a red-green two-color marker ring can be identified with high accuracy. As another example, other color combinations may be used, such as the three-color combination red, green and blue, or the combination yellow and violet. Alternatively, the marker ring may be a single-color ring, preferably in a color distinct from the other colors in the room.
Optionally, step 204 further comprises the following sub-steps:
Sub-step 1: calculate the actual physical movement distance of the probe, where movement distance = (actual diameter of the marker ring / number of pixels along the long edge of the marker ring in the image) × number of pixels the marker ring moves between the first and second video images. The marker ring appears as a rectangle in the video image, and one side of the rectangle (usually the longer side) corresponds to the diameter of the ring. The actual (physical) diameter of the marker ring can be measured in advance.
Sub-step 2: calculate the relative displacement of the second ultrasound image and the nipple position from the actual physical movement distance of the probe.
According to the known size of the marker ring and the number of pixels it occupies in the video image, the physical length corresponding to each pixel can be calculated; from this, the moving distance of the probe and hence the physical relative displacement between the second ultrasound image and the nipple position can be calculated. Because the camera position is fixed, the background of the first and second video images does not move, and the computation required to find the relative movement of the marker ring is small.
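For illustration only, here is a minimal Python sketch of this pixel-to-physical conversion; the function, its parameter names, and the numbers in the example are hypothetical and not taken from the patent:

```python
def probe_displacement_mm(ring_diameter_mm, ring_edge_pixels,
                          ring_center_ap, ring_center_side):
    """Convert the marker ring's pixel displacement between two video
    frames into the probe's physical displacement.

    ring_diameter_mm: pre-measured physical diameter of the marker ring.
    ring_edge_pixels: pixel length of the rectangle's long edge (the
        ring's diameter as seen by the fixed camera).
    ring_center_ap / ring_center_side: (x, y) pixel centers of the ring
        in the AP-position frame and the side-position frame.
    """
    mm_per_pixel = ring_diameter_mm / ring_edge_pixels
    dx = ring_center_side[0] - ring_center_ap[0]
    dy = ring_center_side[1] - ring_center_ap[1]
    # The camera is fixed, so the ring's motion in the image tracks the
    # probe's motion; scale pixels to millimeters.
    return dx * mm_per_pixel, dy * mm_per_pixel

# Example: a 30 mm ring spanning 120 px, shifted 200 px horizontally
print(probe_displacement_mm(30.0, 120, (400, 300), (600, 300)))
# -> (50.0, 0.0): the probe moved 50 mm horizontally
```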
Optionally, after step 204, the first and second ultrasound images may also be stitched based on the position of the marker ring in the first and second video images. In one embodiment, the ultrasound images of the three positions AP, MED, and LAT can be stitched together to form a complete ultrasound image of a breast.
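The patent does not prescribe a stitching algorithm; as a sketch only, once each image's physical offset from the AP scan is known, a simple offset-based paste onto a shared canvas could look as follows (assuming 2D coronal slices on a common millimeter grid; all names are hypothetical):

```python
import numpy as np

def stitch(slices, offsets_mm, mm_per_pixel):
    """Paste 2D ultrasound slices onto one canvas using the physical
    offsets recovered from marker-ring tracking.

    slices: list of 2D numpy arrays (e.g. the AP, MED and LAT slices).
    offsets_mm: per-slice (x, y) physical offset relative to the AP scan.
    """
    offs = [(int(round(x / mm_per_pixel)), int(round(y / mm_per_pixel)))
            for x, y in offsets_mm]
    # Shift offsets so canvas indices start at zero.
    mx = min(o[0] for o in offs)
    my = min(o[1] for o in offs)
    w = max(o[0] - mx + s.shape[1] for o, s in zip(offs, slices))
    h = max(o[1] - my + s.shape[0] for o, s in zip(offs, slices))
    canvas = np.zeros((h, w), dtype=slices[0].dtype)
    for (ox, oy), s in zip(offs, slices):
        x0, y0 = ox - mx, oy - my
        # Later slices simply overwrite any overlap in this sketch.
        canvas[y0:y0 + s.shape[0], x0:x0 + s.shape[1]] = s
    return canvas
```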
Optionally, step 204 may further include the following sub-steps (a sketch of the vector arithmetic follows these sub-steps):
determining the lesion location in the second ultrasound image;
determining a relative displacement vector between the lesion location and the nipple location based on the positions of the marker ring in the first and second video images.
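As a sketch of the vector arithmetic involved (assuming both offsets are expressed in the same millimeter coordinate frame; the names are hypothetical): the lesion's offset within the second image, added to that image's offset from the nipple, gives the lesion-to-nipple displacement vector.

```python
def lesion_to_nipple_mm(lesion_in_image2_mm, image2_to_nipple_mm):
    """Displacement vector from the nipple to the lesion.

    lesion_in_image2_mm: lesion position relative to the second
        ultrasound image's origin, in mm.
    image2_to_nipple_mm: the second image's origin relative to the
        nipple, in mm (derived from the marker-ring displacement).
    """
    return (lesion_in_image2_mm[0] + image2_to_nipple_mm[0],
            lesion_in_image2_mm[1] + image2_to_nipple_mm[1])

# Example: lesion 12 mm right of and 8 mm above the second image's
# origin, which itself lies 40 mm lateral to the nipple:
print(lesion_to_nipple_mm((12.0, -8.0), (40.0, 0.0)))  # -> (52.0, -8.0)
```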
The camera position and shooting parameters can be implemented in various ways. Optionally, both the position and the shooting parameters of the camera are fixed. Alternatively, both may vary, in which case they are obtained through sensors or other means and used in calculating the relative movement of the marker ring. Alternatively, the camera position is fixed but the shooting parameters may vary; in this case the shooting parameters in effect when the first and second video images were captured need to be obtained from the camera for calculating the relative movement of the marker ring. The shooting parameters include aperture, shutter, zoom, gain, and the like.
For a better understanding of this embodiment, an example of the present application is described below.
As shown in fig. 3, a high-definition camera capable of acquiring pictures in real time is fixed on a fixed metal rod on the right side, and a marker ring (a ring of one or more colors with known diameter and height) is fixed on a free rod that moves along with the automatic manipulator. The camera faces the marker ring horizontally, so that the marker ring appears as a single- or multi-colored rectangle in the camera's real-time image. A video acquisition program reads the images acquired by the camera and identifies the target marker using color, shape and position information together.
The specific procedure is as follows: the user selects the breast to be scanned (left or right) and then, following the system prompt, first scans the AP (mid-breast) position. The user places the manipulator directly over the breast with the nipple at the center of the manipulator. During scanning, the video tracking system records the position of the marker ring in the video image.
After the AP position scan is finished, the user moves the manipulator to scan at the MED or LAT position, so the manipulator changes position. Because the absolute physical position of the camera is unchanged, the marker ring is displaced horizontally along with the manipulator, and its position in the video image changes accordingly. With object tracking, the contour position and traversal distance of the marker ring in the video image can be calculated again. The scanning position can be determined automatically from the contour position, and the position of the nipple can be calculated automatically from the traversal distance and marked automatically in the image. Taking the left breast as an example: if the marker ring is shifted to the right compared with its position during the AP scan, the manipulator is to the left of the AP position and the scanned position is LAT; if the marker ring is shifted to the left, the manipulator is to the right of the AP position and the scanned position is MED. The position information for the right breast is obtained in the same way.
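A minimal sketch of this decision rule follows; the sign convention matches the left-breast example above, while the right-breast branch and the AP threshold are assumptions:

```python
def scan_side(breast, ring_shift_px):
    """Classify the scan position from the marker ring's horizontal
    shift relative to its position during the AP scan.

    breast: "left" or "right".
    ring_shift_px: horizontal shift of the ring in the video image,
        positive meaning shifted to the right.
    """
    if abs(ring_shift_px) < 10:  # hypothetical noise threshold: still at AP
        return "AP"
    if breast == "left":
        # Per the left-breast example: ring right -> manipulator left -> LAT
        return "LAT" if ring_shift_px > 0 else "MED"
    # Mirrored convention assumed for the right breast
    return "MED" if ring_shift_px > 0 else "LAT"

print(scan_side("left", 200))  # -> "LAT"
```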
Since the marker ring appears as a rectangle in the video image, the number of image pixels corresponding to a unit of physical size can be obtained from the pixel length of the rectangle's side and the actual diameter of the ring, and the physical distance actually moved by the manipulator can then be obtained proportionally from the distance the rectangle moves in the image (see the following formula).
moving distance = (actual diameter of the marker ring / number of pixels along the long edge of the marker ring) × number of pixels the marker ring moves in the image
Taking a red and green marker ring as an example, the computer processing flow is shown in fig. 4. The specific steps are as follows (a sketch of this pipeline appears after the steps):
First, the video port image is read and converted into the HSV color space (other color spaces, such as RGB, are of course also possible).
All green and all red rectangles are found and located separately.
The red and green rectangles that best match each other in size, position and shape are taken to be the marker ring.
It is then determined on which side of the breast the current ultrasound scan image lies (MED or LAT of the left or right breast), based on whether the left or right breast is being scanned, combined with whether the marker ring has shifted left or right.
Finally, the relevant physical displacements (such as the physical displacement of the ultrasound scanning probe, the displacement of the nipple relative to the current ultrasound scan image, and the displacement of a lesion in the current ultrasound scan image relative to the nipple) are calculated precisely from the pixel displacement, and the position of the nipple relative to the current ultrasound scan image is marked automatically.
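A minimal OpenCV sketch of the first three steps above, assuming a red and green two-color ring; the HSV thresholds and pairing tolerances are illustrative assumptions, not values from the patent:

```python
import cv2

def find_ring_rect(frame_bgr):
    """Locate the red+green marker ring in one video frame and return
    its bounding rectangle (x, y, w, h), or None if it is not found."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)

    # Illustrative hue/saturation/value thresholds; tune for the camera.
    green = cv2.inRange(hsv, (40, 80, 80), (85, 255, 255))
    # Red wraps around hue 0 in HSV, so combine two ranges.
    red = (cv2.inRange(hsv, (0, 80, 80), (10, 255, 255)) |
           cv2.inRange(hsv, (170, 80, 80), (180, 255, 255)))

    def largest_rect(mask):
        # OpenCV 4.x: findContours returns (contours, hierarchy).
        cnts, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
        return cv2.boundingRect(max(cnts, key=cv2.contourArea)) if cnts else None

    r, g = largest_rect(red), largest_rect(green)
    if r is None or g is None:
        return None
    # Accept the pair only if the rectangles are similar in width and
    # roughly level with each other, as side-by-side rings should be.
    if abs(r[2] - g[2]) < 0.3 * max(r[2], g[2]) and abs(r[1] - g[1]) < r[3]:
        x, y = min(r[0], g[0]), min(r[1], g[1])
        w = max(r[0] + r[2], g[0] + g[2]) - x
        h = max(r[1] + r[3], g[1] + g[3]) - y
        return (x, y, w, h)
    return None
```

The long edge of the returned rectangle gives the pixel length used in the distance formula above, and the rectangle's center is the position tracked between frames.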
From the physical spatial positions of the three scanned images (AP, LAT and MED) and the nipple markers, a physical overview of the entire scanned area of the breast can be obtained.
A second embodiment of the invention relates to an ultrasound scanning system. As shown in fig. 3, the ultrasound scanning system comprises a manipulator 302, a camera 301, and a calculation module.
One end of the manipulator is provided with an ultrasound scanning probe 304, and the manipulator is provided with a marker ring 303 that moves along with the probe.
The camera is used for shooting video images containing the marker ring.
The calculation module is used for identifying the position of the marker ring in the video images shot by the camera and calculating the displacement of the probe from the change in the marker ring's position. Based on the displacement of the probe, combined with image analysis of the ultrasound images, further information can be calculated, such as on which side (e.g., the MED or LAT position) the current ultrasound image was scanned, the relative displacement between an ultrasound image and the nipple position, and the relative displacement between a lesion in an ultrasound image and the nipple position. The calculation module, not shown in fig. 3, acquires data (including video images and ultrasound scan images) from the camera and the ultrasound scanning probe and performs the related calculations.
Preferably, the marker ring consists of a plurality of rings of different colors placed side by side. Using multi-colored rings as the marker ring improves the accuracy with which it is correctly identified.
The first embodiment describes a method that uses the system of the present embodiment; the technical details of the system given there apply equally here and are not repeated.
For the steps of the above method embodiments that are executed on a computer, embodiments of the invention also provide a computer storage medium storing computer-executable instructions which, when executed by a processor, implement the method embodiments of the invention. In addition, an embodiment of the invention provides an ultrasound scanning system comprising a memory storing computer-executable instructions and a processor configured to implement the steps of the method embodiments described above when executing the computer-executable instructions in the memory.
It is noted that, in the present patent application, relational terms such as first and second are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual such relationship or order between those entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to it. Without further limitation, an element defined by the phrase "comprises a" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element. In the present patent application, if it is stated that an action is performed according to a certain element, this means the action is performed according to at least that element and covers two cases: performing the action based on that element alone, and performing the action based on that element together with other elements. Expressions such as "a plurality of" mean two or more.
All documents mentioned in this application are incorporated by reference into this application as if each were individually incorporated by reference. Further, it should be understood that various changes or modifications can be made to the present application by those skilled in the art after reading the above teachings of the present application, and these equivalents also fall within the scope of the claimed application.

Claims (8)

1. A method for performing breast ultrasound scanning with an ultrasound scanning system, characterized in that the system comprises a manipulator and a camera, an ultrasound scanning probe is arranged at one end of the manipulator, a marker ring is arranged on the manipulator and moves along with the movement of the probe, the marker ring is a plurality of rings of different colors arranged side by side, and the camera is arranged horizontally, directly facing the marker ring, the method comprising the following steps:
placing the probe at the center of the breast for ultrasound scanning to obtain a first ultrasound image, while the camera shoots a first video image containing the marker ring, wherein the first ultrasound image covers the nipple position;
determining a nipple position from the first ultrasound image;
placing the probe at a side position of the breast for ultrasound scanning to obtain a second ultrasound image, while the camera shoots a second video image containing the marker ring;
performing image analysis on the first and second video images, and determining a relative position of the second ultrasound image with respect to the nipple position based on the position of the marker ring in the first and second video images, wherein this further comprises:
calculating an actual physical movement distance of the probe, wherein movement distance = (actual diameter of the marker ring / number of pixels along the marker ring's edge) × number of pixels the marker ring moves between the first and second video images; the marker ring appears as a rectangle in the video image, and one side length of the rectangle corresponds to the actual diameter of the marker ring;
calculating a relative displacement of the second ultrasound image and the nipple position according to the actual physical movement distance of the probe.
2. The method of claim 1, wherein the relative position comprises: on which side of the breast the probe is located, and/or the relative displacement between the second ultrasound image and the nipple position.
3. The method of claim 1, wherein the step of image analyzing the first and second video images is further followed by:
stitching the first and second ultrasound images according to the position of the marker ring in the first and second video images.
4. The method of claim 1, wherein the step of image analyzing the first and second video images further comprises:
determining a lesion location in the second ultrasound image;
determining a relative displacement vector of the lesion location and the nipple location based on the locations of the marker rings in the first and second video images.
5. The method of breast ultrasound scanning using an ultrasound scanning system according to any of claims 1 to 4, wherein the side position includes the medial side of the breast and the lateral side of the breast.
6. The method of breast ultrasound scanning using an ultrasound scanning system according to any of claims 1 to 4, wherein the second ultrasound image does not cover the nipple position.
7. The method of breast ultrasound scanning using an ultrasound scanning system according to any of claims 1 to 4, wherein the position of the camera and the shooting parameters are fixed.
8. An ultrasound scanning system, comprising:
a manipulator, a camera and a calculation module;
an ultrasound scanning probe is arranged at one end of the manipulator, and a marker ring is arranged on the manipulator and moves along with the movement of the probe; the marker ring is a plurality of rings of different colors arranged side by side;
the camera is arranged horizontally, directly facing the marker ring, and is used for shooting video images containing the marker ring;
the calculation module is used for identifying the position of the marker ring in the video images shot by the camera and calculating the displacement of the probe according to the change in the marker ring's position;
placing the probe at the central position of the breast for ultrasound scanning obtains a first ultrasound image, while the camera shoots a first video image containing the marker ring, wherein the first ultrasound image covers the nipple position;
a nipple position is determined from the first ultrasound image;
placing the probe at a side position of the breast for ultrasound scanning obtains a second ultrasound image, while the camera shoots a second video image containing the marker ring;
the calculation module is configured to calculate an actual physical movement distance of the probe and to calculate the relative displacement between the second ultrasound image and the nipple position according to that distance, wherein movement distance = (actual diameter of the marker ring / number of pixels along the marker ring's long edge) × number of pixels the marker ring moves between the first and second video images; the marker ring appears as a rectangle in the video images, and one side length of the rectangle corresponds to the actual diameter of the marker ring.
CN201811031335.4A 2018-09-05 2018-09-05 Three-dimensional breast ultrasonic scanning method and ultrasonic scanning system Active CN109171817B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811031335.4A CN109171817B (en) 2018-09-05 2018-09-05 Three-dimensional breast ultrasonic scanning method and ultrasonic scanning system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811031335.4A CN109171817B (en) 2018-09-05 2018-09-05 Three-dimensional breast ultrasonic scanning method and ultrasonic scanning system

Publications (2)

Publication Number Publication Date
CN109171817A CN109171817A (en) 2019-01-11
CN109171817B true CN109171817B (en) 2021-12-07

Family

ID=64914502

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811031335.4A Active CN109171817B (en) 2018-09-05 2018-09-05 Three-dimensional breast ultrasonic scanning method and ultrasonic scanning system

Country Status (1)

Country Link
CN (1) CN109171817B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111275617B (en) * 2020-01-09 2023-04-07 云南大学 Automatic splicing method and system for ABUS breast ultrasound panorama and storage medium
CN112075957B (en) * 2020-07-27 2022-05-17 深圳瀚维智能医疗科技有限公司 Mammary gland circular scanning track planning method and device and computer readable storage medium
CN114305502A (en) * 2020-09-29 2022-04-12 深圳迈瑞生物医疗电子股份有限公司 Mammary gland ultrasonic scanning method, device and storage medium
CN112704514B (en) * 2020-12-24 2021-11-02 重庆海扶医疗科技股份有限公司 Focus positioning method and focus positioning system
CN112603368A (en) * 2020-12-25 2021-04-06 上海深博医疗器械有限公司 Mammary gland ultrasonic navigation and diagnosis system and method

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200632716A (en) * 2005-03-11 2006-09-16 Hi Touch Imaging Tech Co Ltd Method of displaying an ultrasonic image
CN101347361B (en) * 2008-08-14 2012-03-07 浙江省中医院 Stent secondary release imbedding device under direct view by endoscope
US10610196B2 (en) * 2013-06-28 2020-04-07 Koninklijke Philips N.V. Shape injection into ultrasound image to calibrate beam patterns in real-time
US20160199009A1 (en) * 2013-08-10 2016-07-14 Needleways Ltd. Medical needle path display
CN105873521B (en) * 2014-01-02 2020-09-15 皇家飞利浦有限公司 Instrument alignment and tracking relative to ultrasound imaging plane
CN104758066B (en) * 2015-05-06 2017-05-10 中国科学院深圳先进技术研究院 Equipment for surgical navigation and surgical robot
CN104856720B * 2015-05-07 2017-08-08 东北电力大学 A kind of robot assisted ultrasonic scanning system based on RGB-D sensors
CN105769244B (en) * 2016-03-22 2020-04-03 上海交通大学 Calibration device for ultrasonic probe calibration
KR20180091282A (en) * 2017-02-06 2018-08-16 삼성메디슨 주식회사 Ultrasonic probe and controlling method of thereof
CN107854177A (en) * 2017-11-18 2018-03-30 上海交通大学医学院附属第九人民医院 A kind of ultrasound and CT/MR image co-registrations operation guiding system and its method based on optical alignment registration
CN107981888B (en) * 2017-12-21 2021-07-13 浙江深博医疗技术有限公司 Automatic mechanical positioning system for computer mammary gland scanning
CN108427421A (en) * 2018-04-26 2018-08-21 广东水利电力职业技术学院(广东省水利电力技工学校) A kind of intelligent distribution robot control system

Also Published As

Publication number Publication date
CN109171817A (en) 2019-01-11

Similar Documents

Publication Publication Date Title
CN109171817B (en) Three-dimensional breast ultrasonic scanning method and ultrasonic scanning system
US10706610B2 (en) Method for displaying an object
JP7004094B2 (en) Fish length measurement system, fish length measurement method and fish length measurement program
US20200268339A1 (en) System and method for patient positioning
KR101553283B1 (en) Information processing apparatus
US11576578B2 (en) Systems and methods for scanning a patient in an imaging system
JP5453000B2 (en) Method and apparatus for 3D digitizing an object
US20170084036A1 (en) Registration of video camera with medical imaging
EP2849157A2 (en) Image processing apparatus, image processing method, and computer-readable storage medium
JP6736414B2 (en) Image processing apparatus, image processing method and program
CN109426835A (en) Information processing unit, the control method of information processing unit and storage medium
JP2010219825A (en) Photographing device for three-dimensional measurement
US20160379368A1 (en) Method for determining an imaging specification and image-assisted navigation as well as device for image-assisted navigation
EP3800451A1 (en) Temperature measurement processing method and apparatus, and thermal imaging device
JP2010256253A (en) Image capturing device for three-dimensional measurement and method therefor
US20200281556A1 (en) X-ray Detector Pose Estimation in Medical Imaging
CN114022547A (en) Endoscope image detection method, device, equipment and storage medium
CN109171789A (en) A kind of calibration method and calibration system for diagnostic imaging equipment
JP2007252707A (en) Image analysis apparatus and program
JP2005140547A (en) 3-dimensional measuring method, 3-dimensional measuring device and computer program
JP4963492B2 (en) Image segmentation method, program and apparatus
JP7057086B2 (en) Image processing equipment, image processing methods, and programs
US20220409324A1 (en) Systems and methods for telestration with spatial memory
JP7309556B2 (en) Image processing system and its control method
US20150121276A1 (en) Method of displaying multi medical image and medical image equipment for performing the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant