CN113017683A - Ultrasonic measurement method, device and storage medium - Google Patents

Ultrasonic measurement method, device and storage medium

Info

Publication number
CN113017683A
CN113017683A (application CN201911358646.6A)
Authority
CN
China
Prior art keywords
point
image
preset target
distance
focus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911358646.6A
Other languages
Chinese (zh)
Inventor
温博
安兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Mindray Bio Medical Electronics Co Ltd
Original Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd filed Critical Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority to CN201911358646.6A priority Critical patent/CN113017683A/en
Publication of CN113017683A publication Critical patent/CN113017683A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0825 Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the breast, e.g. mammography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A61B 8/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/467 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
    • A61B 8/469 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means for selection of a region of interest
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data

Abstract

The application provides an ultrasonic measurement method, an ultrasonic measurement device and a storage medium. The method includes: controlling a probe to transmit ultrasonic waves to a target region of a measured object, receiving echoes of the ultrasonic waves, and acquiring ultrasonic echo signals based on the echoes; generating a breast ultrasound image based on the ultrasonic echo signals, and detecting a lesion and a preset target site in the breast ultrasound image, wherein the preset target site includes the skin and/or the nipple; and calculating the distance from the lesion to the preset target site based on the result of the detection. With the ultrasonic measurement method, device and storage medium of the application, the lesion and the preset target site in the breast ultrasound image can be detected automatically or semi-automatically, and the distance from the lesion to the preset target site can be calculated automatically, which simplifies the procedure for measuring that distance, improves measurement efficiency, and also improves the accuracy of the measurement result.

Description

Ultrasonic measurement method, device and storage medium
Technical Field
The present application relates to the field of ultrasound technology, and more particularly, to an ultrasound measurement method, apparatus, and storage medium.
Background
In modern medical imaging examination, ultrasound has become the most widely used and most frequently applied examination modality, and the one into which new techniques are adopted most rapidly, owing to its advantages of high reliability, speed, convenience, real-time imaging, and repeatability. The development of new ultrasound techniques has further promoted the application of ultrasound imaging in clinical diagnosis and treatment.
Currently, in actual clinical examination, a user often needs to manually determine the position of a lesion on an ultrasound image, or manually select measurement points on the image, in order to measure the distance from the lesion to a given site. This process is not only time-consuming and labor-intensive, but the accuracy of the measurement result is also difficult to guarantee, because manually determined positions or manually selected measurement points are prone to error.
Disclosure of Invention
The present application proposes an ultrasound measurement scheme that not only simplifies the procedure for measuring the distance from a lesion to the skin/nipple, but also improves the accuracy of the measurement result. The scheme is briefly described below; more details are given in the subsequent detailed description with reference to the accompanying drawings.
According to an aspect of the present application, there is provided an ultrasonic measurement method, the method including: controlling a probe to transmit ultrasonic waves to a target region of a measured object, receiving echoes of the ultrasonic waves, and acquiring ultrasonic echo signals based on the echoes; generating a breast ultrasound image based on the ultrasonic echo signals, and detecting a lesion and a preset target site in the breast ultrasound image, wherein the preset target site includes the skin and/or the nipple; and calculating the distance from the lesion to the preset target site based on the result of the detection.
In an embodiment of the present application, the detecting the lesion in the breast ultrasound image includes: detecting the region where the lesion is located and acquiring the boundary of the lesion.
In an embodiment of the present application, the preset target site is the skin, and the detecting the preset target site in the breast ultrasound image includes: detecting the position of the image top layer in the breast ultrasound image as the position of the preset target site; or detecting, within a preset region at the upper end of the breast ultrasound image, the position where the brightness is greater than a preset threshold as the position of the preset target site.
In an embodiment of the present application, the calculating the distance from the lesion to the preset target site based on the result of the detection includes: determining the point on the boundary of the lesion with the minimum depth-direction coordinate as a target point; and calculating the distance from the position of the target point to the position of the preset target site as the distance from the lesion to the preset target site.
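The skin-distance computation described above can be sketched as follows. This is a minimal illustration, not the patented implementation; the helper name, the (row, col) pixel convention with the skin at row 0, and the pixel spacing value are all assumptions made for the example.

```python
def lesion_to_skin_distance(boundary_points, mm_per_pixel):
    """Return the target point (shallowest boundary point) and its
    distance to the skin, taking the skin as the image top layer (row 0).

    boundary_points: iterable of (row, col) pixel coordinates of the
    lesion boundary; the row index is the depth-direction coordinate.
    """
    # Target point: the boundary point with the minimum depth coordinate.
    target_point = min(boundary_points, key=lambda p: p[0])
    # With the skin at depth 0, the distance is the depth of that point.
    return target_point, target_point[0] * mm_per_pixel

# Toy boundary: the shallowest point is (95, 55).
boundary = [(120, 40), (95, 55), (110, 70), (130, 55)]
point, dist_mm = lesion_to_skin_distance(boundary, mm_per_pixel=0.1)
```

This matches the minimum-distance convention of Fig. 3 when the lesion's topmost boundary point lies directly below the skin layer.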
In an embodiment of the present application, the preset target site is the nipple, and the detecting the preset target site in the breast ultrasound image includes: detecting the region where the preset target site is located, and acquiring the position of the center point of the preset target site as the position of the preset target site.
In an embodiment of the present application, the acquiring the position of the center point of the preset target site includes: projecting all points within the region along the vertical direction of the breast ultrasound image; and calculating the position of the midpoint of the line segment that the projection forms along the horizontal direction of the image top layer of the breast ultrasound image, as the position of the center point of the preset target site.
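The projection step above can be sketched as follows: every pixel of the detected nipple region is dropped vertically onto the image top layer, and the midpoint of the resulting horizontal segment is taken as the nipple center. The function name and the (row, col) convention are assumptions for illustration, not part of the patent.

```python
def nipple_center(region_points):
    """region_points: (row, col) pixels inside the detected nipple region.

    Projecting every point along the vertical direction onto the image
    top layer (row 0) yields a horizontal line segment spanning
    [min(col), max(col)]; its midpoint is returned as the nipple center.
    """
    cols = [c for _, c in region_points]
    mid_col = (min(cols) + max(cols)) / 2.0
    return (0, mid_col)  # the center lies on the image top layer

# Toy nipple region spanning columns 10..20 -> center at column 15.
region = [(3, 10), (4, 12), (5, 20)]
center = nipple_center(region)
```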
In an embodiment of the present application, the calculating the distance from the lesion to the preset target site based on the result of the detection includes: determining the point on the boundary of the lesion closest to the center point of the preset target site as a target point; and calculating the distance from the target point to the center point of the preset target site as the distance from the lesion to the preset target site.
In an embodiment of the present application, the determining the point on the boundary of the lesion closest to the center point of the preset target site as the target point includes: traversing all points on the boundary of the lesion to determine the point closest to the center point of the preset target site as the target point.
In an embodiment of the present application, the determining the point on the boundary of the lesion closest to the center point of the preset target site as the target point includes: determining a centroid point of the region enclosed by the boundary of the lesion, and connecting the centroid point with the center point of the preset target site to form a line segment; and determining, within a preset region near the intersection of the line segment and the boundary of the lesion, the point closest to the center point of the preset target site as the target point.
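The centroid-based variant above can be sketched as follows. Two simplifications are assumed for illustration (they are not specified by the patent): the centroid is approximated by the mean of the boundary points, and the intersection of the centroid-to-nipple segment with the boundary is approximated by the boundary point nearest to that segment, after which a small index window around it is searched.

```python
import math

def point_segment_distance(p, a, b):
    """Euclidean distance from point p to line segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def target_point_via_centroid(boundary, nipple_center, window=2):
    # Centroid of the enclosed region, approximated by the boundary mean.
    cx = sum(p[0] for p in boundary) / len(boundary)
    cy = sum(p[1] for p in boundary) / len(boundary)
    # Boundary point nearest the centroid->nipple segment stands in for
    # the exact intersection point.
    k = min(range(len(boundary)),
            key=lambda i: point_segment_distance(boundary[i], (cx, cy), nipple_center))
    # Search a small neighbourhood of that point for the boundary point
    # closest to the nipple center.
    idxs = [(k + d) % len(boundary) for d in range(-window, window + 1)]
    return min((boundary[i] for i in idxs),
               key=lambda p: math.hypot(p[0] - nipple_center[0],
                                        p[1] - nipple_center[1]))

# Toy square lesion boundary with the nipple on the top layer at column 15.
boundary = [(10, 10), (10, 20), (20, 20), (20, 10)]
target = target_point_via_centroid(boundary, (0, 15))
```

Compared with traversing every boundary point, restricting the search to a window near the intersection reduces the work on dense boundaries while finding the same point in the usual convex case.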
In an embodiment of the present application, the calculating the distance from the lesion to the preset target site based on the result of the detection includes: mapping the points on the boundary of the lesion to the image top layer of the breast ultrasound image to obtain quasi target points, and determining the quasi target point closest to the preset target site as the target point; and calculating the distance from the target point to the center point of the preset target site as the distance from the lesion to the preset target site.
In an embodiment of the present application, the preset target site is the nipple, and the detecting the preset target site in the breast ultrasound image includes: detecting the region where the preset target site is located, and acquiring a line segment delimiting the preset target site as the position of the preset target site.
In an embodiment of the present application, the calculating the distance from the lesion to the preset target site based on the result of the detection includes: calculating the minimum distance between the line segment and the figure formed by the boundary of the lesion as the distance from the lesion to the preset target site.
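The segment-to-boundary minimum distance above can be sketched as follows, under the assumption (made for this example, and reasonable for a dense pixel boundary) that the minimum over boundary *points* approximates the minimum to the full figure; an exact version would also test boundary edges against the segment endpoints.

```python
import math

def segment_to_boundary_distance(seg_a, seg_b, boundary):
    """Minimum distance between the nipple-delimiting segment seg_a-seg_b
    and the figure formed by the lesion boundary, approximated as the
    minimum point-to-segment distance over all boundary points."""
    def dist(p):
        ax, ay = seg_a; bx, by = seg_b; px, py = p
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:
            return math.hypot(px - ax, py - ay)
        # Clamp the projection parameter so the nearest point stays on
        # the segment, not the infinite line.
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))
    return min(dist(p) for p in boundary)

# Nipple segment on the top layer (row 0, columns 10..20); the nearest
# lesion boundary point is 5 pixels below it.
d = segment_to_boundary_distance((0, 10), (0, 20), [(5, 15), (8, 15)])
```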
In an embodiment of the present application, the preset target site is the nipple, and the detecting the preset target site in the breast ultrasound image includes: detecting the side of the breast ultrasound image bearing the image mark, and acquiring, among the points on that side, the position of the point on the image top layer of the breast ultrasound image as the position of the preset target site.
In an embodiment of the present application, the preset target site is the nipple, and the detecting the preset target site in the breast ultrasound image includes: determining that the marked end of the probe was aligned with the center of the nipple when the user used the probe, and acquiring the position of the point on the image top layer on the side of the breast ultrasound image bearing the image mark as the position of the preset target site; or determining that the unmarked end of the probe was aligned with the center of the nipple when the user used the probe, and acquiring the position of the point on the image top layer on the side opposite to the side bearing the image mark as the position of the preset target site.
In an embodiment of the present application, the calculating the distance from the lesion to the preset target site based on the result of the detection includes: mapping the points on the boundary of the lesion to the image top layer of the breast ultrasound image to obtain quasi target points, and determining the quasi target point closest to the preset target site as the target point; and calculating the distance from the target point to the preset target site as the distance from the lesion to the preset target site.
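The mapping-to-top-layer computation above (the body-surface distance of Figs. 4A/4B) can be sketched as follows. The helper name and the simplification of measuring along the flat top layer of the image are assumptions for this example.

```python
def distance_via_body_surface(boundary, nipple_point):
    """Map each lesion boundary point vertically to the image top layer
    (row 0) to form quasi target points, pick the one closest to the
    nipple position, and measure along the top layer.

    boundary: (row, col) points on the lesion boundary.
    nipple_point: (0, col) position of the nipple on the top layer.
    """
    quasi = [(0, col) for _, col in boundary]   # project to the top layer
    # Target point: the quasi target point closest to the nipple.
    target = min(quasi, key=lambda q: abs(q[1] - nipple_point[1]))
    return target, abs(target[1] - nipple_point[1])

# Toy lesion whose nearest projected edge is 4 columns from the nipple.
target, d = distance_via_body_surface([(50, 30), (60, 44)], (0, 40))
```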
In an embodiment of the present application, the preset target site is the nipple, and the method further includes: if a lesion is detected in the breast ultrasound image but no nipple region is detected, issuing a prompt that no nipple region is detected in the current section; or, if the detected distance from the lesion to the nipple in the breast ultrasound image exceeds a predefined threshold, prompting to enter a wide view or to use a stitched image for the measurement.
In an embodiment of the present application, the preset target site is the nipple, and the detecting the preset target site in the breast ultrasound image includes: receiving the position of the preset target site input by the user as the position of the preset target site.
In an embodiment of the present application, the detection of the lesion and/or the preset target site is fully automatic or semi-automatic, wherein the semi-automatic detection includes: receiving a predefined region of the lesion and/or the preset target site input by the user; and detecting the lesion and/or the preset target site based on the predefined region.
In an embodiment of the present application, the detection of the lesion and/or the preset target site is based on at least one of the following algorithms: a boundary segmentation algorithm, a target detection algorithm, a machine learning algorithm, and a deep learning algorithm.
In an embodiment of the present application, the method further includes: after calculating the distance from the lesion to the preset target site, displaying the result of the calculation.
In an embodiment of the present application, the method further includes: after calculating the distance from the lesion to the preset target site, generating an ultrasound report based on the result of the calculation.
In an embodiment of the present application, the method further includes: displaying the breast ultrasound image, wherein the result of the calculation is displayed on the breast ultrasound image.
According to another aspect of the present application, there is provided an ultrasonic measurement method, the method including: controlling a probe to transmit ultrasonic waves to a target region of a measured object, receiving echoes of the ultrasonic waves, and acquiring ultrasonic echo signals based on the echoes; generating an ultrasound image based on the ultrasonic echo signals, and detecting a lesion and a preset target site in the ultrasound image; and calculating the distance from the lesion to the preset target site based on the result of the detection.
According to still another aspect of the present application, there is provided an ultrasonic measurement apparatus, the apparatus including: an ultrasound probe, a transmit/receive sequence controller, and a processor, wherein: the transmit/receive sequence controller is configured to excite the ultrasound probe to transmit ultrasonic waves to a target region of a measured object, receive echoes of the ultrasonic waves, and acquire ultrasonic echo signals based on the echoes; and the processor is configured to generate a breast ultrasound image based on the ultrasonic echo signals, detect a lesion and a preset target site in the breast ultrasound image, and calculate the distance from the lesion to the preset target site based on the result of the detection, wherein the preset target site includes the skin and/or the nipple.
In an embodiment of the present application, the detecting, by the processor, the lesion in the breast ultrasound image includes: detecting the region where the lesion is located and acquiring the boundary of the lesion.
In an embodiment of the present application, the preset target site is the skin, and the detecting, by the processor, the preset target site in the breast ultrasound image includes: detecting the position of the image top layer in the breast ultrasound image as the position of the preset target site; or detecting, within a preset region at the upper end of the breast ultrasound image, the position where the brightness is greater than a preset threshold as the position of the preset target site.
In an embodiment of the present application, the calculating, by the processor, the distance from the lesion to the preset target site based on the result of the detection includes: determining the point on the boundary of the lesion with the minimum depth-direction coordinate as a target point; and calculating the distance from the position of the target point to the position of the preset target site as the distance from the lesion to the preset target site.
In an embodiment of the present application, the preset target site is the nipple, and the detecting, by the processor, the preset target site in the breast ultrasound image includes: detecting the region where the preset target site is located, and acquiring the position of the center point of the preset target site as the position of the preset target site.
In an embodiment of the present application, the acquiring, by the processor, the position of the center point of the preset target site includes: projecting all points within the region along the vertical direction of the breast ultrasound image; and calculating the position of the midpoint of the line segment that the projection forms along the horizontal direction of the image top layer of the breast ultrasound image, as the position of the center point of the preset target site.
In an embodiment of the present application, the calculating, by the processor, the distance from the lesion to the preset target site based on the result of the detection includes: determining the point on the boundary of the lesion closest to the center point of the preset target site as a target point; and calculating the distance from the target point to the center point of the preset target site as the distance from the lesion to the preset target site.
In an embodiment of the present application, the determining, by the processor, the point on the boundary of the lesion closest to the center point of the preset target site as the target point includes: traversing all points on the boundary of the lesion to determine the point closest to the center point of the preset target site as the target point.
In an embodiment of the present application, the determining, by the processor, the point on the boundary of the lesion closest to the center point of the preset target site as the target point includes: determining a centroid point of the region enclosed by the boundary of the lesion, and connecting the centroid point with the center point of the preset target site to form a line segment; and determining, within a preset region near the intersection of the line segment and the boundary of the lesion, the point closest to the center point of the preset target site as the target point.
In an embodiment of the present application, the calculating, by the processor, the distance from the lesion to the preset target site based on the result of the detection includes: mapping the points on the boundary of the lesion to the image top layer of the breast ultrasound image to obtain quasi target points, and determining the quasi target point closest to the preset target site as the target point; and calculating the distance from the target point to the center point of the preset target site as the distance from the lesion to the preset target site.
In an embodiment of the present application, the preset target site is the nipple, and the detecting, by the processor, the preset target site in the breast ultrasound image includes: detecting the region where the preset target site is located, and acquiring a line segment delimiting the preset target site as the position of the preset target site.
In an embodiment of the present application, the calculating, by the processor, the distance from the lesion to the preset target site based on the result of the detection includes: calculating the minimum distance between the line segment and the figure formed by the boundary of the lesion as the distance from the lesion to the preset target site.
In an embodiment of the present application, the preset target site is the nipple, and the detecting, by the processor, the preset target site in the breast ultrasound image includes: detecting the side of the breast ultrasound image bearing the image mark, and acquiring, among the points on that side, the position of the point on the image top layer of the breast ultrasound image as the position of the preset target site.
In an embodiment of the present application, the preset target site is the nipple, and the detecting, by the processor, the preset target site in the breast ultrasound image includes: determining that the marked end of the probe was aligned with the center of the nipple when the user used the probe, and acquiring the position of the point on the image top layer on the side of the breast ultrasound image bearing the image mark as the position of the preset target site; or determining that the unmarked end of the probe was aligned with the center of the nipple when the user used the probe, and acquiring the position of the point on the image top layer on the side opposite to the side bearing the image mark as the position of the preset target site.
In an embodiment of the present application, the calculating, by the processor, the distance from the lesion to the preset target site based on the result of the detection includes: mapping the points on the boundary of the lesion to the image top layer of the breast ultrasound image to obtain quasi target points, and determining the quasi target point closest to the preset target site as the target point; and calculating the distance from the target point to the preset target site as the distance from the lesion to the preset target site.
In an embodiment of the present application, the preset target site is the nipple, and the processor is further configured to: if a lesion is detected in the breast ultrasound image but no nipple region is detected, issue a prompt that no nipple region is detected in the current section; or, if the detected distance from the lesion to the nipple in the breast ultrasound image exceeds a predefined threshold, prompt to enter a wide view or to use a stitched image for the measurement.
According to a further aspect of the present application, an ultrasound measurement apparatus is provided, the apparatus comprising a memory and a processor, the memory having stored thereon a computer program for execution by the processor, the computer program, when executed by the processor, performing the ultrasound measurement method described above.
According to a further aspect of the present application, a storage medium is provided, on which a computer program is stored, which computer program, when executed, performs the above-described ultrasound measurement method.
According to a further aspect of the application, a computer program is provided for performing the above-mentioned ultrasound measurement method when the computer program is run by a computer or a processor.
With the ultrasonic measurement method and device of the application, the lesion and the skin and/or nipple in the breast ultrasound image can be detected automatically or semi-automatically, and the distance from the lesion to the skin and/or nipple is calculated automatically, which simplifies the procedure for measuring that distance, improves measurement efficiency, and also improves the accuracy of the measurement result.
Drawings
The above and other objects, features and advantages of the present application will become more apparent by describing in more detail embodiments of the present application with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. In the drawings, like reference numbers generally represent like parts or steps.
Fig. 1 shows a schematic block diagram of an exemplary ultrasonic measurement apparatus for implementing an ultrasonic measurement method according to an embodiment of the present application.
FIG. 2 shows a schematic flow diagram of an ultrasonic measurement method according to one embodiment of the present application.
Fig. 3 shows an exemplary schematic of the distance between the lesion and the skin.
Fig. 4A shows a schematic diagram of one example of the distance between a lesion and a nipple.
Fig. 4B shows a schematic diagram of another example of the distance between the lesion and the nipple.
Fig. 4C shows a schematic diagram of yet another example of the distance between the lesion and the nipple.
FIG. 5 shows a schematic flow diagram of an ultrasonic measurement method according to another embodiment of the present application.
FIG. 6 shows a schematic block diagram of an ultrasonic measurement device according to one embodiment of the present application.
FIG. 7 shows a schematic block diagram of an ultrasonic measurement device according to another embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, exemplary embodiments according to the present application are described in detail below with reference to the accompanying drawings. It should be understood that the described embodiments are only some, not all, of the embodiments of the present application, and that the present application is not limited by the example embodiments described herein. All other embodiments obtained by a person skilled in the art from the embodiments described herein without inventive effort shall fall within the scope of protection of the present application.
First, an exemplary ultrasonic measurement apparatus for implementing the ultrasonic measurement method of the embodiment of the present application is described with reference to fig. 1.
Fig. 1 is a block diagram illustrating an exemplary ultrasonic measurement apparatus 10 for implementing an ultrasonic measurement method according to an embodiment of the present disclosure. As shown in fig. 1, the ultrasonic measuring apparatus 10 may include an ultrasonic probe 100, a transmission/reception selection switch 101, a transmission/reception sequence controller 102, a processor 103, a display 104, and a memory 105. The transmission/reception sequence controller 102 may excite the ultrasound probe 100 to transmit an ultrasonic wave to a target object (measured object), and may also control the ultrasound probe 100 to receive an ultrasonic echo returned from the target object, thereby obtaining an ultrasonic echo signal/data. The processor 103 processes the ultrasound echo signals/data to obtain tissue related parameters and ultrasound images of the target object. Ultrasound images obtained by the processor 103 may be stored in the memory 105 and displayed on the display 104.
In this embodiment, the display 104 of the ultrasonic measurement apparatus 10 may be a touch display screen, a liquid crystal display screen, or the like; it may also be an independent display device, such as a liquid crystal display or a television, that is separate from the ultrasonic measurement apparatus 10; or it may be the display screen of an electronic device such as a mobile phone or a tablet computer.
In the embodiment of the present application, the memory 105 of the ultrasonic measurement apparatus 10 can be a flash memory card, a solid-state memory, a hard disk, or the like.
The embodiments of the present application further provide a computer-readable storage medium storing a plurality of program instructions; after the program instructions are called and executed by the processor 103, some or all of the steps of the ultrasonic measurement method in the embodiments of the present application, or any combination of those steps, may be performed.
In one embodiment, the computer readable storage medium may be memory 105, which may be a non-volatile storage medium such as a flash memory card, solid state memory, hard disk, or the like.
In the embodiment of the present application, the processor 103 of the ultrasonic measurement apparatus 10 may be implemented by software, hardware, firmware or a combination thereof, and may use an electric circuit, a single or multiple Application Specific Integrated Circuits (ASICs), a single or multiple general-purpose integrated circuits, a single or multiple microprocessors, a single or multiple programmable logic devices, or a combination of the foregoing electric circuits or devices, or other suitable electric circuits or devices, so that the processor 103 may perform the corresponding steps of the ultrasonic measurement method in each embodiment.
The ultrasonic measurement method in the present application, which can be performed by the aforementioned ultrasonic measurement apparatus 10, is described in detail below with reference to fig. 2 to 5.
FIG. 2 shows a schematic flow diagram of an ultrasonic measurement method 200 according to one embodiment of the present application. As shown in fig. 2, the ultrasonic measurement method 200 may include the steps of:
in step S210, the probe is controlled to transmit an ultrasonic wave to a target region of a measured object, receive an echo of the ultrasonic wave, and acquire an ultrasonic echo signal based on the echo of the ultrasonic wave.
In step S220, a breast ultrasound image is generated based on the ultrasound echo signal, and a lesion and a preset target portion in the breast ultrasound image are detected, where the preset target portion includes skin and/or nipple.
In step S230, a distance from the lesion to the preset target site is calculated based on the result of the detection.
In an embodiment of the present application, the object to be measured may be a person to be subjected to an ultrasound examination, and the target region of the object to be measured may be a region of a body part of the object to be subjected to the ultrasound examination. For example, the target region may be a breast region, and accordingly, the ultrasound examination is a breast ultrasound examination. In breast ultrasound examinations, it is often necessary to generate a breast ultrasound report that includes: (1) a record of patient information; (2) bilateral breast tissue total sonogram description; (3) sonographic depiction of meaningful abnormalities and lesions. Where the lesion sonogram description section explicitly records the location of the lesion, consistent, repeatable systematic localization is required, usually using lesion-to-skin distance and lesion-to-nipple distance. Therefore, in breast ultrasound examination, the distance from the lesion to the skin and/or the distance from the lesion to the nipple needs to be measured. Wherein the distance between the lesion and the skin may be as shown in fig. 3, generally referring to the minimum distance from the skin layer to the edge of the lesion. The distance between the lesion and the nipple may be as shown in fig. 4A to 4C, in which fig. 4A shows the distance from the center of the nipple to the edge of the lesion mapped to the body surface when the probe acquires an image with the side edge thereof aligned with the center of the nipple, fig. 4B shows the distance from the center of the nipple to the edge of the lesion mapped to the body surface when the probe acquires a breast image while covering the nipple and the lesion, and fig. 4C shows the spatial straight line minimum distance from the center of the nipple to the lesion. 
The lesion-to-skin distance and lesion-to-nipple distance are key values for locating and distinguishing lesions, and are crucial for follow-up and clinical protocol planning. However, the edge morphology of a breast lesion is usually irregular, and currently the user generally needs to select the measurement points manually, which introduces error and is cumbersome and time-consuming. The ultrasonic measurement scheme provided by the present application can solve this problem. In the following, the ultrasound measurement scheme of the present application is described mainly by taking breast ultrasound measurement as an example, because the scheme is well suited to the measurements made during breast ultrasound examination; however, it should be understood that this is only exemplary, and the ultrasound measurement scheme of the present application can also be used for ultrasound examination and measurement of any other part.
In an embodiment of the present application, the ultrasound echo signals are processed to obtain a breast ultrasound image. For example, the ultrasound echo signal may be processed, such as gain compensation, beam synthesis, quadrature demodulation, image enhancement, etc., to obtain a breast ultrasound image of the target region of the object.
Specifically, the ultrasonic device transmits ultrasonic signals: the transducer array elements of the probe convert electric signals into acoustic signals that are transmitted into the target object; the acoustic signals of the ultrasonic echoes are then converted back into electric signals by the transducer array elements of the probe; the signals are filtered and amplified at the front end by an analog circuit and converted into digital signals by an analog-to-digital converter; beamforming is then performed along each scan line on the data of each array element channel to obtain a radio frequency signal, which is quadrature-demodulated into in-phase/quadrature (I/Q) signal pairs and sent to the subsequent imaging processing module. Applying gain compensation to the received ultrasonic echo signals mitigates subsequent processing problems caused by the decrease of signal strength with depth. The processed signal is still an analog signal, so, to improve signal processing efficiency and reduce the complexity of the hardware platform, an analog-to-digital converter is required to convert the analog echo signal into a digital echo signal. After the analog-to-digital conversion is completed, digital beamforming may be performed on the channel data according to the delay differences caused by the different distances from the focal point to the channels; the data processing performed before digital beamforming may be collectively referred to as front-end processing. The data obtained after this stage may be referred to as radio frequency (RF) signal data: the signal still carries the probe's carrier frequency, which happens to fall within the radio frequency band used in the communications field.
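The delay-and-sum beamforming step described above can be sketched as follows. This is a minimal, single-scan-line illustration with hypothetical integer sample delays, not the apparatus's actual front-end implementation (real systems use fractional delays and apodization):

```python
import numpy as np

def delay_and_sum(channel_data, delays_samples):
    """Digital beamforming sketch: align each channel's echo by its focal-point
    delay and sum coherently.

    channel_data: array of shape (n_channels, n_samples)
    delays_samples: per-channel delay in whole samples (assumed values)
    """
    n_channels = channel_data.shape[0]
    out = np.zeros(channel_data.shape[1])
    for ch, delay in enumerate(delays_samples):
        # compensate the path-length delay from the focal point to this channel
        out += np.roll(channel_data[ch], -delay)
    return out / n_channels  # coherent sum of the aligned echoes
```

After alignment, echoes originating from the focal point add constructively while off-focus contributions tend to cancel.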
After the radio frequency signal data is acquired, the signal carrier is removed by in-phase/quadrature demodulation, the tissue structure information contained in the signal is extracted, and filtering is performed to remove noise; the resulting signal is a baseband signal. All processing from the radio frequency signal to the baseband signal may be collectively referred to as mid-end processing. Finally, the intensity of the baseband signal is obtained, and logarithmic compression and grayscale conversion are applied to its gray levels to obtain the breast ultrasound image; this processing may be collectively referred to as back-end processing. Thus, a frame of ultrasound image of the breast is obtained for display.
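The back-end processing chain (envelope intensity, logarithmic compression, grayscale conversion) can be sketched as below. The dynamic range value is an assumed display parameter, not one specified by the application:

```python
import numpy as np

def backend_process(iq, dynamic_range_db=60):
    """Back-end processing sketch: baseband I/Q -> envelope -> log compression
    -> 8-bit grayscale B-mode pixels."""
    envelope = np.abs(iq)                              # baseband signal intensity
    envelope = envelope / envelope.max()               # normalize to [0, 1]
    db = 20.0 * np.log10(np.maximum(envelope, 1e-12))  # logarithmic compression
    db = np.clip(db, -dynamic_range_db, 0.0)           # keep the display range
    gray = (db + dynamic_range_db) / dynamic_range_db * 255.0
    return gray.astype(np.uint8)                       # grayscale image frame
```

Each frame of such grayscale data corresponds to one displayable breast ultrasound image.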
In the embodiment of the present application, the lesion and the preset target portion in the breast ultrasound image of the target region of the object to be detected may be detected in a fully automatic detection manner or a semi-automatic detection manner.
The fully automatic detection mode is, for example, detecting the lesion and the preset target site in the ultrasound image based on a boundary segmentation algorithm or a target detection algorithm; viewed from another perspective, a fully automatic detection method detects the lesion and preset target site based on a machine learning or deep learning algorithm. In the machine learning or deep learning approach, images in which the lesion boundary or preset target site has been annotated by a doctor, together with the coordinates of the boundary or region-of-interest (ROI) box, are fed into a deep learning segmentation or target detection network (a convolutional neural network) for training; during training, the error between the predicted value (i.e., the lesion boundary or preset target site position output by the network) and the annotated position is computed and iteratively reduced, gradually approximating the annotations, to obtain a reference model for detecting the lesion or preset target site. Detecting the lesion and the preset target site in a fully automatic manner enables rapid detection with accurate results, improving detection efficiency and accuracy, and thus the efficiency and accuracy of the subsequent measurements based on the detection results.
In one example, the manner of semi-automatic detection may include: receiving a predefined region of the lesion and/or the preset target site input by a user; detecting the lesion and/or the predetermined target site based on the predefined area. That is, the semi-automatic detection method is that a user first determines an approximate region of a lesion and/or the preset target region according to an ultrasound image, and then further detects the lesion and/or the preset target region in the approximate region through at least one of a boundary segmentation algorithm, a target detection algorithm, a machine learning algorithm, a deep learning algorithm, and the like. In another example, the semi-automatic detection mode may be: the lesion and/or the predetermined target site is detected by the fully automatic detection method described above, and then the user verifies or corrects the detection result. The semi-automatic detection requires the participation of a user, but can further improve the accuracy of the detection result.
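The first semi-automatic mode (run automatic detection only inside a user-drawn region) can be sketched as follows; `detector` stands in for any of the boundary segmentation, target detection, or learning-based algorithms mentioned above and is hypothetical here:

```python
def detect_in_roi(image, roi, detector):
    """Semi-automatic detection sketch: the user supplies an approximate region
    roi = (row0, col0, row1, col1); the automatic detector is run only inside
    that region, and detected points are mapped back to full-image coordinates.
    """
    row0, col0, row1, col1 = roi
    sub_image = [row[col0:col1] for row in image[row0:row1]]  # crop the ROI
    local_points = detector(sub_image)                        # detect inside it
    # translate ROI-local coordinates back to the full image
    return [(r + row0, c + col0) for r, c in local_points]
```

Restricting the search to the user's region both speeds up detection and constrains it to the area the user considers relevant.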
The detection of a lesion and a predetermined target site in a breast ultrasound image will now be described by taking the breast ultrasound examination as an example. As described above, this is merely exemplary, and the ultrasound measurement method of the present application can also be used for detection of a lesion and a preset target site and measurement of a distance between the lesion and the preset target site in any other ultrasound examination.
In the embodiment of the present application, detecting the lesion may comprise detecting the region where the lesion is located and obtaining the boundary of the lesion. Based on the detected region of the lesion and its boundary, the distance from the lesion to a preset target site (skin and/or nipple) may be calculated.
For breast ultrasound, for example, the preset target site may include the skin and/or the nipple. The skin will be described first. In an embodiment of the application, when the preset target site is the skin, detecting the preset target site in the ultrasound image may include: detecting the position of the top layer (i.e., the uppermost edge) of the image in the ultrasound image as the position of the preset target site; or detecting, within a preset area at the upper end of the ultrasound image, the position whose brightness is greater than a preset threshold as the position of the preset target site. Generally, for a breast ultrasound image, the top layer of the image is the skin layer, so the position of the top layer of the image can be detected as the position of the skin. Alternatively, in a breast ultrasound image the bright area at the upper end of the image corresponds to the skin, so a preset area and a preset threshold can be set as required, and a position within the preset area at the upper end of the image whose brightness exceeds the preset threshold is detected as the position of the skin.
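The second, brightness-based variant can be sketched as below. The fraction of the image treated as the "preset area" and the brightness threshold are assumed values, not ones fixed by the application:

```python
def detect_skin_row(image, top_fraction=0.15, threshold=200):
    """Skin detection sketch: within a preset area at the upper end of the
    image, return the first row whose brightness exceeds a preset threshold.
    image: list of rows of pixel brightness values (0-255)."""
    top_rows = int(len(image) * top_fraction)  # preset area at the upper end
    for row in range(top_rows):
        if max(image[row]) > threshold:        # bright band = skin layer
            return row
    return 0  # fall back to the top layer of the image
```

The fallback corresponds to the first variant, which simply takes the image's top layer as the skin position.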
Accordingly, calculating the distance from the lesion to the preset target site (i.e., the skin) based on the result of the detection may include: determining the point with the smallest depth-direction coordinate on the boundary of the lesion as the target point; and calculating the distance from the position of the target point to the position of the preset target site as the distance from the lesion to the preset target site. Since in breast ultrasound examination the minimum distance between the skin layer and the edge of the lesion is generally taken as the lesion-to-skin distance, the point on the lesion boundary with the minimum distance from the skin position can be selected as the target point, and the distance between the target point and the skin position calculated as the lesion-to-skin distance. In general, the point with the smallest depth-direction coordinate on the lesion boundary (in an image region, the upper part has smaller depth-direction coordinates and the lower part larger ones) is the point on the boundary closest to the skin position; this point can therefore be selected as the target point, and its distance to the skin position calculated as the distance from the lesion to the skin.
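This target-point rule can be sketched directly; the pixel-to-millimeter calibration value is an assumption for illustration:

```python
def lesion_to_skin_distance(boundary, skin_row, mm_per_pixel=0.1):
    """Sketch: take the boundary point with the smallest depth coordinate (row)
    as the target point and measure its distance to the detected skin row.
    boundary: list of (row, col) points on the lesion boundary."""
    target = min(boundary, key=lambda p: p[0])  # smallest depth-direction coordinate
    distance_mm = (target[0] - skin_row) * mm_per_pixel
    return distance_mm, target
```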
In an embodiment of the present application, when the predetermined target region is a nipple, the distance from the lesion to the nipple may be defined as a distance between a position where an edge of the lesion is mapped to the body surface and a position of a center of the nipple (as shown in fig. 4A and 4B above), or may be defined as a spatial straight-line distance between an edge of the lesion and a center of the nipple (as shown in fig. 4C above), which will be described below.
First, fig. 4A shows a scene in which the image is acquired with the side of the probe aligned with the center of the nipple; in this scene, the leftmost or rightmost side of the image area can be approximated as the center position of the nipple. Generally, when the marked end of the probe is aligned with the center of the nipple, the position of the point on the top layer of the image on the side of the breast ultrasound image bearing the image mark (generally the leftmost edge of the image area, i.e., the point at the top left corner of the image) can be regarded as the position of the center of the nipple; conversely, when it is not the marked end of the probe that is aligned with the center of the nipple, the position of the point on the top layer of the image on the side opposite the image mark (generally the rightmost edge of the image area, i.e., the point at the top right corner) can be regarded as the position of the center of the nipple. It may be assumed by default that the user scans the measured object with the marked end of the probe aligned with the center of the nipple; alternatively, the user may specify whether this is the case, and the position of the center of the nipple is then obtained based on the user's selection.
Based on this, in an example that the default user scans the measured object with one end of the probe having the mark aligned with the center of the nipple, detecting a preset target portion in the breast ultrasound image may include: and detecting one side with an image mark in the breast ultrasonic image, and acquiring the position of a point on the top layer of the image of the breast ultrasonic image in the points on the one side as the position of the preset target part.
In an example where the user does not default to scan the measured object with one end of the probe having the mark aligned with the center of the nipple, detecting a preset target portion in the breast ultrasound image may include: determining that one end of the probe with the mark is aligned with the center of a nipple when the user uses the probe, and acquiring the position of a point on the top layer of the image of the breast ultrasonic image on the side with the image mark in the breast ultrasonic image as the position of the preset target part; or, determining that one end of the probe, which is not used by the user and has the mark, is aligned with the center of the nipple, and acquiring a position of a point on an image top layer of the breast ultrasound image on the opposite side of the side, which has the image mark, of the breast ultrasound image as the position of the preset target part.
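The branching above reduces to choosing one of the two top corners of the image area; a minimal sketch, assuming image coordinates of (row, col) with the marked side on the left:

```python
def nipple_position_from_marker(image_width, marker_end_at_nipple=True):
    """Sketch: the nipple center is the top-layer point on the marked side
    (top-left corner) when the marked end of the probe was aligned with the
    nipple, otherwise the top-layer point on the opposite side (top-right)."""
    if marker_end_at_nipple:
        return (0, 0)                  # top-left corner of the image area
    return (0, image_width - 1)        # top-right corner of the image area
```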
Based on the determined position of the preset target site (i.e., the nipple), calculating the distance from the lesion to the preset target site based on the result of the detection may include: mapping the points on the boundary of the lesion to the top layer of the breast ultrasound image to obtain quasi target points, and determining the quasi target point closest to the position of the preset target site as the target point; the distance from the target point to the preset target site is then calculated as the distance from the lesion to the preset target site, as shown by the dotted line in fig. 4A.
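The mapping-to-top-layer step amounts to dropping each boundary point's depth coordinate and comparing columns; a sketch, with an assumed pixel-to-millimeter calibration:

```python
def lesion_to_nipple_surface_distance(boundary, nipple_col, mm_per_pixel=0.1):
    """Sketch: map each boundary point (row, col) to the image top layer by
    keeping its column (quasi target points), then take the quasi target point
    closest to the nipple column as the target point."""
    quasi_cols = [col for _, col in boundary]           # projection onto top layer
    target_col = min(quasi_cols, key=lambda c: abs(c - nipple_col))
    distance_mm = abs(target_col - nipple_col) * mm_per_pixel
    return distance_mm, (0, target_col)                 # target point on top layer
```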
Figs. 4B and 4C show a scenario in which the breast image acquired by the probe covers both the nipple and the lesion; in this scenario, there is typically an acoustic shadow behind the nipple in the ultrasound image. Generally, when determining the nipple position, the doctor keeps the nipple within the scanned image during operation; owing to the tissue characteristics of the nipple, an obvious black acoustic shadow is formed behind it in the image. As shown in figs. 4B and 4C, the position with the largest acoustic shadow extent in the depth direction is taken as the approximate position of the nipple, while the smaller black areas on the left and right sides of the image are not the approximate nipple region. A vertical projection is then performed based on the approximate position of the nipple, and the midpoint of the corresponding horizontal line segment at the top layer of the image is taken as the center of the nipple. As shown in figs. 4B and 4C, the dark gray straight line at the upper edge of the image is the horizontal range of the nipple on the body surface, and the vertical dotted lines delimit the acoustic shadow area behind the nipple. Fig. 4B differs from fig. 4C in that the lesion-to-nipple distance shown in fig. 4B is defined as the distance between the position where the edge of the lesion is mapped to the body surface and the position of the center of the nipple, whereas the lesion-to-nipple distance shown in fig. 4C is defined as the spatial straight-line distance between the edge of the lesion and the center of the nipple.
Based on this, in the example shown in fig. 4B, detecting the preset target site (i.e., the nipple) in the breast ultrasound image may include: detecting the area where the preset target site is located, and acquiring the position of the center point of the preset target site as the position of the preset target site. In this embodiment, the area where the entire nipple is located may be detected, the boundary of the nipple segmented, and the position of the center point of the nipple determined as the position of the nipple. Illustratively, the area where the nipple is located can be detected through image processing, machine learning, deep learning, and the like; its typical image feature is an obvious (black) acoustic shadow behind it in the ultrasound image, spanning the near field, mid field, and far field. After the area is detected, the range of all points of the area in the horizontal direction of the image can be calculated; that is, all points of the area are projected along the vertical direction of the image, and the midpoint of the corresponding horizontal line segment at the top layer of the image is taken as the position of the center point of the nipple, i.e., the position of the nipple. Accordingly, calculating the distance from the lesion to the preset target site based on the detection result may include: mapping the points on the boundary of the lesion to the top layer of the breast ultrasound image to obtain quasi target points, and determining the quasi target point closest to the position of the preset target site as the target point; the distance from the target point to the center point of the preset target site is then calculated as the distance from the lesion to the preset target site, as indicated by the horizontal dotted line in fig. 4B.
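The shadow-projection localization of the nipple center can be sketched as below. The brightness threshold and the fraction of the depth direction that must be dark are assumed values standing in for whichever image-processing or learning-based detector the embodiment actually uses:

```python
import numpy as np

def nipple_center_from_shadow(image, shadow_threshold=30, min_depth_fraction=0.5):
    """Sketch: find columns whose dark (acoustic-shadow) pixels span most of the
    depth direction, project them to the image top layer, and return the
    midpoint of the resulting horizontal line segment as the nipple center."""
    dark = np.asarray(image) < shadow_threshold
    shadow_extent = dark.sum(axis=0)               # dark pixel count per column
    cols = np.where(shadow_extent >= min_depth_fraction * dark.shape[0])[0]
    if cols.size == 0:
        return None                                # no posterior acoustic shadow
    return (0, int((cols[0] + cols[-1]) // 2))     # midpoint on the top layer
```

Returning `None` corresponds to the case, discussed later, where no nipple image is detected in the current section.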
In the example shown in fig. 4C, the preset target site (i.e., the nipple) may be detected in the breast ultrasound image in the same manner as in the example of fig. 4B: the area where the nipple is located is detected (e.g., via the obvious acoustic shadow behind it), all points of the area are projected along the vertical direction of the image, and the midpoint of the corresponding horizontal line segment at the top layer of the image is taken as the position of the center point of the nipple, i.e., the position of the nipple. Accordingly, calculating the distance from the lesion to the preset target site based on the detection result may include: determining the point on the boundary of the lesion closest to the center point of the preset target site as the target point; the distance from the target point to the center point of the preset target site is then calculated as the distance from the lesion to the preset target site, as shown by the oblique dotted line in fig. 4C.
The target point may be determined by traversing all points on the boundary of the lesion to find the point closest to the nipple center point. Alternatively, the target point may be determined as follows: determine a centroid point of the area enclosed by the boundary of the lesion, connect the centroid point and the nipple center point to form a line segment, and determine the point closest to the nipple center point within a preset region near the intersection of this line segment with the boundary of the lesion as the target point. For example, if the center point of the nipple is denoted N and the centroid point of the lesion is denoted M, the point closest to N within the preset region near the intersection point (denoted X) of the line segment MN and the boundary of the lesion is the target point. The distance between the target point and the center point of the nipple is then calculated as the distance from the lesion to the nipple. The region near the intersection point X can be understood as follows: taking the intersection point X as the center, extend a certain distance along the lesion boundary to the left and to the right of X, obtaining a point X1 on the lesion boundary to the left of X and a point X2 on the lesion boundary to the right of X; the points from X1 to X2 on the lesion boundary constitute the nearby preset region, and the point in this region closest to N is selected as the target point.
The included angle a1 between segment NX1 and segment NX may be set equal to the included angle a2 between segment NX2 and segment NX. An angle A may thus be preset, with a1 and a2 each equal to half of A; this determines the respective distances to extend to the left and right of point X, yielding points X1 and X2 and hence the nearby preset region described above.
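The centroid-line construction above can be sketched by keeping only boundary points within half the preset angle A of the N-to-M direction (the X1..X2 neighborhood) and taking the closest of them to N. Points are (x, y) pairs, the centroid is approximated by the boundary mean, and the preset angle value is an assumption:

```python
import numpy as np

def target_point_via_centroid_line(boundary, nipple, preset_angle_deg=20):
    """Sketch: approximate centroid M of the lesion, take the direction of
    segment NM (N = nipple center), keep boundary points within A/2 of that
    direction, and return the kept point closest to N with its distance."""
    pts = np.asarray(boundary, dtype=float)
    n = np.asarray(nipple, dtype=float)
    m = pts.mean(axis=0)                               # centroid approximation
    ref = np.arctan2(m[1] - n[1], m[0] - n[0])         # direction of segment NM
    ang = np.arctan2(pts[:, 1] - n[1], pts[:, 0] - n[0])
    diff = np.abs((ang - ref + np.pi) % (2 * np.pi) - np.pi)  # wrapped angle gap
    mask = diff <= np.radians(preset_angle_deg / 2.0)  # a1 = a2 = A / 2
    candidates = pts[mask] if mask.any() else pts      # X1..X2 neighborhood
    dists = np.linalg.norm(candidates - n, axis=1)
    i = int(np.argmin(dists))
    return tuple(candidates[i]), float(dists[i])
```

Restricting the search to the angular neighborhood avoids traversing the full boundary while still landing near the true closest point for roughly convex lesions.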
Further, in the scenario where the image is acquired with the side of the probe aligned with the nipple center, if the distance from the lesion to the nipple is defined as the spatial straight-line distance between the edge of the lesion and the nipple center, rather than as the distance between the position where the edge is mapped to the body surface and the position of the nipple center, calculating the distance from the lesion to the preset target site based on the result of the detection may include: determining the point on the boundary of the lesion closest to the center point of the preset target site as the target point; and calculating the distance from the target point to the center point of the preset target site as the distance from the lesion to the preset target site. The target point may be determined as described above, which is not repeated here.
Alternatively, in yet another embodiment of the present application, when the preset target site is the nipple, detecting the preset target site in the ultrasound image may include: detecting the area where the preset target site is located, and acquiring a line segment defining the preset target site as the position of the preset target site. In this embodiment, the position of the nipple is defined as a line segment, and the position of this line segment is the position of the nipple. Accordingly, calculating the distance from the lesion to the preset target site based on the detection result may include: determining the point on the boundary of the lesion closest to the center point of the preset target site as the target point; and calculating the distance from the target point to the center point of the preset target site as the distance from the lesion to the preset target site.
In still another embodiment of the present application, when the preset target site is the nipple, the nipple may be defined as a line segment, so the detection result of the nipple position is the position of a line segment, while the detection result of the lesion position is the area where the lesion is located and its boundary. Based on this, calculating the distance from the lesion to the preset target site based on the detection result may include: calculating the minimum distance between the line segment and the figure formed by the boundary of the lesion as the distance from the lesion to the preset target site.
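The segment-to-boundary minimum distance can be sketched as below; for simplicity this sketch minimizes over the boundary points only, not the edges between them, which is a close approximation for a densely sampled boundary:

```python
import numpy as np

def point_to_segment(p, a, b):
    """Euclidean distance from 2-D point p to line segment ab."""
    p, a, b = (np.asarray(v, dtype=float) for v in (p, a, b))
    ab = b - a
    # parameter of the closest point on the infinite line, clamped to [0, 1]
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * ab)))

def lesion_to_nipple_segment_distance(boundary, seg_start, seg_end):
    """Sketch: minimum distance between the nipple line segment and the figure
    formed by the lesion boundary, taken over the boundary points."""
    return min(point_to_segment(p, seg_start, seg_end) for p in boundary)
```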
In another embodiment of the present application, when the preset target site is the nipple, the user (e.g., a doctor) may be prompted to select the current nipple position after the distance measurement is initiated; since the center of the nipple is at the top of the image, the closest distance between the doctor's click position and the lesion (e.g., mapped to the skin) can be calculated as the lesion-to-nipple distance. In this embodiment, detecting the preset target site in the breast ultrasound image may include: receiving the position of the preset target site input by the user as the position of the preset target site. Accordingly, calculating the distance from the lesion to the preset target site based on the detection result may include: mapping the points on the boundary of the lesion to the top layer of the breast ultrasound image to obtain quasi target points, determining the quasi target point closest to the position of the preset target site as the target point, and calculating the distance from the target point to the center point of the preset target site as the distance from the lesion to the preset target site; or determining the point on the boundary of the lesion closest to the center point of the preset target site as the target point, and calculating the distance from the target point to the center point of the preset target site as the distance from the lesion to the preset target site.
Furthermore, in the embodiment of the present application, if only a lesion is detected in the ultrasound image and no nipple is detected, a prompt such as "no nipple image is detected in the current section" may be displayed to suggest reselecting the section for the ultrasound image to be detected. When data is acquired with one end of the probe aligned with the center of the nipple, most ultrasound images do not show an obvious acoustic shadow behind the nipple, and a prompt may be given for the doctor to confirm. Further, if a lesion detected in the ultrasound image is far from the nipple (e.g., the distance between the two exceeds a preset threshold), the user may be prompted to enter a wide view or to take measurements using a stitched image, which may be a dual-screen stitched image.
Based on the above description, the ultrasonic measurement method according to the embodiment of the present application can automatically locate the lesion and the position of the preset target region (skin and/or nipple), and automatically calculate the distance from the lesion to the preset target region (skin and/or nipple), thereby simplifying the measurement process of the distance from the lesion to the region (skin and/or nipple), improving the measurement efficiency, and improving the accuracy of the measurement result.
In an embodiment of the present application, after calculating the distance from the lesion to the preset target region, the calculation result may be displayed, for example, on an ultrasound image for a user to view or record. Further, in the embodiment of the application, after the distance from the lesion to the preset target site is calculated, an ultrasound report can be generated based on the calculation result, so that the burden of writing documents on a user is further reduced, the work efficiency is improved, and the possibility of reporting errors is also reduced.
The above exemplarily illustrates an ultrasonic measurement method according to an embodiment of the present application. An ultrasonic measurement method according to another embodiment of the present application is described below with reference to fig. 5. FIG. 5 shows a schematic flow diagram of an ultrasonic measurement method 500 according to another embodiment of the present application. As shown in fig. 5, the ultrasonic measurement method 500 may include the steps of:
in step S510, the probe is controlled to emit an ultrasonic wave to a target region of a measured object, receive an echo of the ultrasonic wave, and acquire an ultrasonic echo signal based on the echo of the ultrasonic wave.
In step S520, an ultrasound image is generated based on the ultrasound echo signal, and a lesion and a preset target site in the ultrasound image are detected.
In step S530, a distance from the lesion to the predetermined target site is calculated based on the detection result.
Steps S510 to S530 in the ultrasound measurement method 500 according to the embodiment of the present application described with reference to fig. 5 are similar to steps S210 to S230 in the ultrasound measurement method 200 according to the embodiment of the present application described with reference to fig. 2, and both involve detecting a lesion and a preset target region in an ultrasound image and calculating a distance from the lesion to the preset target region based on a detection result, which may be referred to the foregoing description, and therefore, for brevity, are not repeated herein. Unlike the ultrasound measurement method 200 according to the embodiment of the present application described with reference to fig. 2, the ultrasound measurement method 500 according to the embodiment of the present application described with reference to fig. 5 is not only used for breast ultrasound examination, but also used for ultrasound examination of any other site, and can be used for measuring the distance between a lesion of any other site and a preset target site.
An ultrasonic measuring device provided according to another aspect of the present application is described below in conjunction with fig. 6. FIG. 6 shows a schematic block diagram of an ultrasonic measurement device 600 according to an embodiment of the present application. As shown in fig. 6, the ultrasonic measurement apparatus 600 may include a transmit/receive sequence controller 610, an ultrasonic probe 620, and a processor 630. The transmit/receive sequence controller 610 is configured to excite the ultrasonic probe 620 to transmit an ultrasonic wave to a target region of a measured object, receive an echo of the ultrasonic wave, and acquire an ultrasonic echo signal based on the echo of the ultrasonic wave. The processor 630 is configured to generate an ultrasound image of a breast based on the ultrasound echo signal, detect a lesion and a preset target site in the ultrasound image, and calculate a distance from the lesion to the preset target site based on a result of the detection, where the preset target site includes skin and/or a nipple.
In an embodiment of the present application, the measured object may be a person undergoing an ultrasound examination, and the target region of the measured object may be the region of the body part being examined. For example, the target region may be a breast region, in which case the ultrasound examination is a breast ultrasound examination. A breast ultrasound examination typically requires generating a breast ultrasound report that includes: (1) a record of patient information; (2) a description of the overall sonographic appearance of both breasts; (3) a sonographic description of meaningful abnormalities and lesions. The lesion description section explicitly records the location of each lesion, which requires consistent, repeatable, systematic localization, usually expressed as the lesion-to-skin distance and the lesion-to-nipple distance. Therefore, in breast ultrasound examination, the distance from the lesion to the skin and/or from the lesion to the nipple needs to be measured. As shown in fig. 3, the distance between the lesion and the skin generally refers to the minimum distance from the skin layer to the edge of the lesion. The distance between the lesion and the nipple may be as shown in figs. 4A to 4C: fig. 4A shows the distance from the nipple center to the lesion edge mapped to the body surface when the probe acquires an image with its side edge aligned with the nipple center; fig. 4B shows the same body-surface distance when the acquired breast image covers both the nipple and the lesion; and fig. 4C shows the minimum spatial straight-line distance from the nipple center to the lesion.
The lesion-to-skin distance and lesion-to-nipple distance are key values for locating and distinguishing lesions, and are crucial for follow-up and clinical protocol planning. In most cases, however, the edge morphology of a breast lesion is irregular, and the user currently has to select measurement points manually, which introduces a certain error and is cumbersome and time-consuming. The ultrasonic measurement scheme provided by the present application addresses this problem. In the following, the scheme is described mainly by taking breast ultrasound measurement as an example, because it is particularly well suited to measurements during breast ultrasound examination; it should be understood, however, that this is only exemplary, and the ultrasound measurement scheme of the present application can also be used for ultrasound examination and measurement of any other site.
In an embodiment of the present application, the processor 630 may include an image processing module that processes the ultrasonic echo signals to obtain a breast ultrasound image. Illustratively, the image processing module may perform gain compensation, beamforming, quadrature demodulation, image enhancement and similar processing on the ultrasonic echo signals to obtain a breast ultrasound image of the target region of the measured object.
Specifically, the transmit/receive sequence controller 610 excites the ultrasonic probe 620 to transmit ultrasonic signals via its transducer elements, converting electrical signals into acoustic signals transmitted towards the target object; the transducer elements then convert the acoustic echo signals back into electrical signals. These signals are filtered and amplified at the front end by an analog circuit of the image processing module and converted into digital signals by its analog-to-digital converter; because the echo at this point is still an analog signal, this conversion is needed to improve signal processing efficiency and reduce the complexity of the hardware platform. Gain compensation applied to the received echo signals reduces downstream processing problems caused by signal intensity decreasing with depth. After analog-to-digital conversion, the per-channel data is combined by digital beamforming according to the delay differences caused by the different distances from the focal point to each channel; the data processing performed before and including digital beamforming may be collectively referred to as front-end processing, and the data obtained after this stage may be referred to as radio-frequency (RF) signal data. Quadrature demodulation of the RF data then yields in-phase/quadrature signal pairs for imaging.
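The digital beamforming step described above can be illustrated by a minimal delay-and-sum sketch. The function name, the speed of sound, and the sampling rate below are assumed for illustration and are not values specified in this application; each channel is shifted by the delay difference caused by its distance to the focal point and the aligned channels are summed.

```python
import numpy as np

def delay_and_sum(channel_data, element_x, focus_x, focus_z, c=1540.0, fs=40e6):
    """Delay-and-sum receive beamforming for one focal point.

    channel_data: (n_elements, n_samples) digitized echo signals
    element_x:    (n_elements,) lateral element positions in metres
    focus_x, focus_z: focal point coordinates in metres
    c: assumed speed of sound in tissue (m/s); fs: sampling rate (Hz)
    """
    # distance from the focal point back to each element
    dist = np.sqrt((element_x - focus_x) ** 2 + focus_z ** 2)
    # delay difference relative to the nearest element, in samples
    delays = np.round((dist - dist.min()) / c * fs).astype(int)
    n_samples = channel_data.shape[1]
    out = np.zeros(n_samples)
    for ch, d in enumerate(delays):
        # shift each channel so echoes from the focus align, then sum
        out[: n_samples - d] += channel_data[ch, d:]
    return out
```

With the channels aligned, echoes originating at the focal point add coherently while off-focus echoes tend to cancel, which is the basis of the line-by-line image formation described in the text.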
The term "RF" indicates that the signal still carries the carrier at the probe's receive center frequency, which happens to fall within the radio-frequency band used in communications. After the RF signal data is acquired, the carrier is removed by in-phase/quadrature demodulation, the tissue-structure information contained in the signal is extracted, and filtering removes noise; the resulting signal is a baseband signal. All processing from the RF signal to the baseband signal may be collectively referred to as mid-end processing. Finally, the intensity of the baseband signal is obtained and subjected to logarithmic compression and grey-scale conversion to obtain the breast ultrasound image; the processing in this final stage may be collectively referred to as back-end processing. In this way, one frame of the breast ultrasound image is obtained for display.
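The mid-end and back-end chain just described (quadrature demodulation to baseband, envelope extraction, logarithmic compression, grey-scale conversion) can be sketched for a single scan line as follows. The centre frequency, the moving-average low-pass filter, and the 60 dB dynamic range are illustrative assumptions, not parameters from this application.

```python
import numpy as np

def rf_to_bmode(rf, fs=40e6, f0=5e6, dyn_range_db=60.0):
    """Convert one RF scan line to 8-bit B-mode grey levels:
    quadrature demodulation -> envelope -> log compression -> grey map.

    rf: (n_samples,) radio-frequency echo line
    fs: sampling rate; f0: assumed probe centre frequency
    """
    t = np.arange(rf.size) / fs
    # in-phase / quadrature mixing removes the carrier at f0
    i = rf * np.cos(2 * np.pi * f0 * t)
    q = -rf * np.sin(2 * np.pi * f0 * t)
    # crude low-pass (moving average) to suppress the 2*f0 component
    kernel = np.ones(16) / 16.0
    i = np.convolve(i, kernel, mode="same")
    q = np.convolve(q, kernel, mode="same")
    envelope = np.sqrt(i ** 2 + q ** 2)
    # logarithmic compression into the display dynamic range
    env_db = 20 * np.log10(envelope / (envelope.max() + 1e-12) + 1e-6)
    grey = np.clip((env_db + dyn_range_db) / dyn_range_db, 0.0, 1.0) * 255
    return grey.astype(np.uint8)
```

A production system would use a proper FIR low-pass and per-depth gain, but the stages map one-to-one onto the mid-end and back-end processing named in the text.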
In an embodiment of the present application, the processor 630 may detect the lesion and the predetermined target portion in the breast ultrasound image of the target region of the object by a fully automatic detection method or a semi-automatic detection method.
The fully automatic detection mode detects, for example, the lesion and the preset target site in the ultrasound image based on a boundary segmentation algorithm or a target detection algorithm; viewed from another angle, it detects them based on a machine learning or deep learning algorithm. In a machine learning or deep learning approach, images with the lesion boundary / preset target site annotated by a doctor, together with the bounding-box coordinates of the boundary / region of interest (ROI), are fed into a deep-learning segmentation or target detection network (a convolutional neural network) for training; during training, the error between the predicted value (the lesion boundary / preset target site position output by the network) and the annotated position is computed and iteratively minimized, gradually yielding a reference model for detecting the lesion / preset target site. Detecting the lesion and the preset target site fully automatically enables rapid detection and accurate results, improving both detection efficiency and accuracy, and thereby the efficiency and accuracy of the subsequent measurements based on the detection result.
In one example, the semi-automatic detection mode may include: receiving a predefined region of the lesion and/or the preset target site input by a user through a human-machine interaction device (not shown in fig. 6); and detecting the lesion and/or the preset target site within the predefined region. That is, the user first determines an approximate region of the lesion and/or the preset target site from the ultrasound image displayed on a display device (not shown in fig. 6), and the processor 630 then further detects the lesion and/or the preset target site within that region using at least one of a boundary segmentation algorithm, a target detection algorithm, a machine learning algorithm, and a deep learning algorithm. In another example, the semi-automatic detection mode may be: the processor 630 detects the lesion and/or the preset target site using the fully automatic detection mode described above, and the user then verifies or corrects the detection result via the human-machine interaction device. Semi-automatic detection requires user participation but can further improve the accuracy of the detection result.
The detection of a lesion and a predetermined target site in a breast ultrasound image will now be described by taking the breast ultrasound examination as an example. As described above, this is merely exemplary, and the ultrasound measuring apparatus of the present application can also be used for detection of a lesion and a preset target site and measurement of a distance between the lesion and the preset target site in any other ultrasound examination.
In the embodiment of the present application, the detection of the lesion position may be to detect a region where the lesion is located and obtain a boundary of the lesion. Based on the detected regional location of the lesion and its boundary, the processor 630 may calculate the distance of the lesion to a preset target site (skin and/or nipple).
For breast ultrasound, for example, the preset target site may include the skin and/or the nipple. Taking the skin as an example: in an embodiment of the present application, when the preset target site is the skin, the processor 630 detecting the preset target site in the ultrasound image may include: detecting the position of the top layer of the image in the ultrasound image as the position of the preset target site; or detecting, within a preset area at the upper end of the ultrasound image, the position where the brightness exceeds a preset threshold as the position of the preset target site. Generally, in a breast ultrasound image the top layer of the image is the skin layer, so the processor 630 can take the position of the image top layer as the position of the skin. Alternatively, since a brighter area at the upper end of the image corresponds to the skin, the processor 630 may set a preset area and a preset threshold as required and detect, within the preset area at the upper end of the image, the position whose brightness exceeds the threshold as the position of the skin.
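A minimal sketch of the brightness-threshold variant of skin detection, assuming an 8-bit B-mode image stored as a 2-D array with row 0 at the shallowest depth; the preset area size and brightness threshold are hypothetical values, not presets from this application.

```python
import numpy as np

def detect_skin_row(image, top_rows=40, threshold=120):
    """Return the depth (row index) of the skin line, taken as the first
    row inside the top `top_rows` of the image whose mean brightness
    exceeds `threshold`; fall back to row 0 (the image top layer).

    image: 2-D uint8 B-mode image, row 0 = shallowest depth.
    """
    region = image[:top_rows]
    row_means = region.mean(axis=1)
    bright = np.where(row_means > threshold)[0]
    # if no row is bright enough, treat the image top layer as the skin
    return int(bright[0]) if bright.size else 0
```

The fallback to row 0 corresponds to the first alternative in the text, where the image top layer itself is taken as the skin position.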
Accordingly, the processor 630 calculating, based on the detection result, the distance from the lesion to the preset target site may include: determining the point with the minimum depth-direction coordinate on the boundary of the lesion as the target point; and calculating the distance from the target point to the position of the preset target site as the distance from the lesion to the preset target site. Since breast ultrasound examination generally takes the minimum distance between the skin layer and the lesion edge as the lesion-to-skin distance, and the point with the smallest depth-direction coordinate on the lesion boundary is generally the boundary point closest to the skin position, the processor 630 may select that point as the target point and calculate its distance to the skin position as the lesion-to-skin distance.
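The target-point selection and lesion-to-skin distance calculation just described can be sketched as follows, assuming the lesion boundary is given as (row, column) pixel coordinates; the pixel pitch is a hypothetical value used only to convert pixels to millimetres.

```python
import numpy as np

def lesion_to_skin_distance(boundary, skin_row, pixel_mm=0.1):
    """Pick the boundary point with the smallest depth coordinate as the
    target point and return (distance_mm, target_point), the vertical
    distance from that point to the skin row.

    boundary: (n, 2) array of (row, col) lesion boundary points
    skin_row: depth index of the skin line; pixel_mm: assumed pixel pitch
    """
    boundary = np.asarray(boundary)
    target = boundary[np.argmin(boundary[:, 0])]  # minimum-depth point
    return float((target[0] - skin_row) * pixel_mm), tuple(target)
```

Because B-mode rows are ordered by depth, the minimum-depth boundary point is the point nearest the skin, matching the selection rule in the text.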
In an embodiment of the present application, when the predetermined target region is a nipple, the distance from the lesion to the nipple may be defined as a distance between a position where an edge of the lesion is mapped to the body surface and a position of a center of the nipple (as shown in fig. 4A and 4B, above), or may be defined as a spatial straight-line distance between the edge of the lesion and the center of the nipple (as shown in fig. 4C, above).
Fig. 4A shows the scenario in which images are acquired with the probe side edge aligned with the nipple center; in this scenario the leftmost or rightmost side of the image area can be approximated as the nipple center position. Generally, when the marked end of the probe is aligned with the nipple center, the position of the top-layer image point on the side of the breast ultrasound image carrying the image mark (generally the leftmost edge of the image area, i.e. the point at the top left corner) can be regarded as the nipple center position; conversely, when the unmarked end of the probe is aligned with the nipple center, the position of the top-layer image point on the side opposite the image mark (generally the rightmost edge of the image area, i.e. the point at the top right corner) can be regarded as the nipple center position. By default it may be assumed that the user scans the measured object with the marked end of the probe aligned with the nipple center; alternatively, the user may indicate whether the marked end of the probe is aligned with the nipple center, and the nipple center position is obtained based on that selection.
Based on this, in an example that the default user scans the measured object by aligning one end of the probe with the mark with the center of the nipple, the detecting, by the processor 630, of the preset target portion in the breast ultrasound image may include: and detecting one side with an image mark in the breast ultrasonic image, and acquiring the position of a point on the top layer of the image of the breast ultrasonic image in the points on the one side as the position of the preset target part.
In an example where it is not assumed by default that the user scans with the marked end of the probe aligned with the nipple center, the processor 630 detecting the preset target site in the breast ultrasound image may include: determining that the user aligned the marked end of the probe with the nipple center, and acquiring the position of the top-layer image point on the side of the breast ultrasound image carrying the image mark as the position of the preset target site; or determining that the user did not align the marked end of the probe with the nipple center, and acquiring the position of the top-layer image point on the side opposite the side carrying the image mark as the position of the preset target site.
Based on the determined location of the predetermined target site (i.e., nipple), processor 630 calculates a distance from the lesion to the predetermined target site based on the result of the detection, which may include: mapping points on the boundary of the focus to the top layer of the image of the breast ultrasonic image to obtain quasi target points, and determining points, which are closest to the preset target position, in the quasi target points to obtain target points; the distance from the target point to the preset target site is calculated as the distance from the lesion to the preset target site, as shown by the dotted line in fig. 4A.
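A sketch of this body-surface measurement, assuming the lesion boundary is given as (row, column) points and the nipple centre is a column index on the image top layer (e.g. column 0 for the image-mark side); the pixel pitch is a hypothetical value.

```python
import numpy as np

def lesion_to_nipple_surface_distance(boundary, nipple_col, pixel_mm=0.1):
    """Map every lesion boundary point straight up to the image top layer
    (keeping its column) to obtain quasi target points, then return the
    distance from the quasi target point nearest the nipple column.

    boundary: (n, 2) array of (row, col) points
    nipple_col: column index of the nipple centre on the image top layer
    pixel_mm: assumed pixel pitch
    """
    cols = np.asarray(boundary)[:, 1]  # vertical projection keeps columns
    target_col = cols[np.argmin(np.abs(cols - nipple_col))]
    return float(abs(target_col - nipple_col) * pixel_mm)
```

Projecting to the top layer discards the depth coordinate, so the result is the horizontal body-surface distance indicated by the dotted line in fig. 4A rather than a straight-line distance through tissue.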
Fig. 4B and 4C show the scenario in which the breast image acquired by the probe covers both the nipple and the lesion; in this scenario there is typically an acoustic shadow behind the nipple in the ultrasound image. To determine the nipple position, the doctor keeps the nipple within the scanned image during operation, and owing to the tissue characteristics of the nipple, an obvious black acoustic shadow forms behind it in the image. As shown in fig. 4B and 4C, the position with the largest shadow area in the depth direction is taken as the approximate nipple position, whereas the smaller black areas on the left and right sides of the image are not candidate nipple regions. A straight-line projection is then performed from the approximate nipple position, and the midpoint of the corresponding horizontal line segment at the image top layer is taken as the nipple center. In fig. 4B and 4C, the dark grey line at the upper edge of the image marks the horizontal extent of the nipple on the body surface, and the vertical dashed lines mark the acoustic shadow area behind the nipple. Fig. 4B differs from fig. 4C in that the lesion-to-nipple distance in fig. 4B is defined as the distance between the position where the lesion edge maps to the body surface and the nipple center position, while in fig. 4C it is defined as the spatial straight-line distance between the lesion edge and the nipple center.
Based on this, in the example shown in fig. 4B, the processor 630 detecting the preset target site (i.e. the nipple) in the breast ultrasound image may include: detecting the area where the preset target site is located, and acquiring the position of its center point as the position of the preset target site. In this embodiment, the processor 630 may detect the area where the whole nipple is located, segment the nipple boundary, and determine the position of the nipple center point as the nipple position. Illustratively, the processor 630 may detect the nipple area through image processing, machine learning, deep learning, or the like; the typical image feature is an obvious acoustic shadow (black) behind the nipple in the ultrasound image, spanning the near, middle, and far field. Once the area is detected, the horizontal range covered by all points in the area can be computed; that is, all points in the area are projected along the vertical direction of the image, and the midpoint of the resulting horizontal line segment at the image top layer is taken as the nipple center point, i.e. the nipple position.
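A rough sketch of this shadow-based localisation, assuming a 2-D 8-bit image; the darkness threshold and the fraction of the image depth that must be shadowed are illustrative presets, not values from this application.

```python
import numpy as np

def nipple_center_from_shadow(image, dark_threshold=30, min_rows=None):
    """Estimate the nipple centre column from the posterior acoustic
    shadow: find columns that stay dark over most of the image depth,
    then take the midpoint of that column range projected onto the
    image top layer (row 0).

    image: 2-D uint8 B-mode image.
    """
    h = image.shape[0]
    if min_rows is None:
        min_rows = int(0.6 * h)  # shadow must span most of the depth
    # number of dark pixels per column (shadow extends in depth direction)
    dark_counts = (image < dark_threshold).sum(axis=0)
    shadow_cols = np.where(dark_counts >= min_rows)[0]
    if shadow_cols.size == 0:
        return None  # no posterior shadow found: prompt the user instead
    # midpoint of the shadow's horizontal extent, on the top layer
    return int((shadow_cols[0] + shadow_cols[-1]) // 2)
```

Requiring the shadow to span most of the depth is one simple way to reject the smaller black areas at the left and right image edges mentioned in the text, which do not extend through the near, middle, and far field.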
Accordingly, the processor 630, based on the detection result, calculates a distance from the lesion to the predetermined target site, which may include: mapping points on the boundary of the focus to the top layer of the image of the breast ultrasonic image to obtain quasi target points, and determining points, which are closest to the preset target position, in the quasi target points to obtain target points; the distance from the target point to the center point of the preset target site is calculated as the distance from the lesion to the preset target site, as indicated by the horizontal dotted line in fig. 4B.
In the example shown in fig. 4C, the processor 630 detecting the preset target site (i.e. the nipple) in the breast ultrasound image may likewise include: detecting the area where the preset target site is located, and acquiring the position of its center point as the position of the preset target site. In this embodiment, the area where the whole nipple is located may be detected, the nipple boundary segmented, and the position of the nipple center point determined as the nipple position. Illustratively, the nipple area may be detected through image processing, machine learning, deep learning, or the like, the typical image feature being an obvious acoustic shadow (black) behind the nipple spanning the near, middle, and far field; once the area is detected, all points in the area are projected along the vertical direction of the image, and the midpoint of the resulting horizontal line segment at the image top layer is taken as the nipple center point, i.e. the nipple position.
Accordingly, the processor 630, based on the detection result, calculates a distance from the lesion to the predetermined target site, which may include: determining a point on the boundary of the focus, which is closest to the central point of the preset target part, as a target point; the distance from the target point to the center point of the preset target site is calculated as the distance from the lesion to the preset target site, as shown by the oblique dotted line in fig. 4C.
The processor 630 may determine the target point by traversing all points on the lesion boundary and selecting the point closest to the nipple center point. Alternatively, the target point may be determined as follows: determine the centroid point of the area enclosed by the lesion boundary, connect the centroid point and the nipple center point to form a line segment, and determine, within a preset region near the intersection of this segment with the lesion boundary, the point closest to the nipple center point as the target point. For example, if the nipple center point is denoted N, the lesion centroid point is denoted M, and the intersection of segment MN with the lesion boundary is denoted X, then the point nearest N within the preset region around X is the target point, and the distance between the target point and the nipple center point gives the lesion-to-nipple distance. The preset region near the intersection X can be understood as follows: taking X as the center, extend a certain distance along the lesion boundary to its left and right to obtain a point X1 on the boundary to the left of X and a point X2 on the boundary to the right of X; the boundary points from X1 to X2 constitute the nearby preset region, and the point in this region closest to N is selected as the target point.
The included angle a1 between segment NX1 and segment NX may be made equal to the included angle a2 between segment NX2 and segment NX. An angle A may therefore be preset, with a1 and a2 each equal to half of A; this determines how far to extend along the boundary to the left and right of point X, yielding points X1 and X2 and hence the nearby preset region described above.
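The centroid-line search with an angular neighbourhood can be approximated as follows. Here the preset region near X is modelled directly as the set of boundary points lying within half of a preset angle A of the N-to-M direction; this is one reasonable reading of the text, stated as an assumption rather than the patent's exact construction.

```python
import numpy as np

def target_point_near_centroid_line(boundary, nipple, half_angle_deg=10.0):
    """Sketch of the centroid-line search: with lesion centroid M and
    nipple centre N, approximate the preset region near the intersection
    of segment MN with the boundary by all boundary points within
    `half_angle_deg` (A/2 in the text) of the N->M direction, and return
    the one nearest N together with its distance.

    boundary: (n, 2) float array of (x, y) points; nipple: (x, y).
    """
    boundary = np.asarray(boundary, dtype=float)
    nipple = np.asarray(nipple, dtype=float)
    centroid = boundary.mean(axis=0)              # centroid point M
    vecs = boundary - nipple                      # N -> boundary vectors
    dists = np.linalg.norm(vecs, axis=1)
    dir_nm = (centroid - nipple) / np.linalg.norm(centroid - nipple)
    # angle of each boundary point away from the N->M direction
    cosines = (vecs @ dir_nm) / np.maximum(dists, 1e-12)
    angles = np.degrees(np.arccos(np.clip(cosines, -1.0, 1.0)))
    near = np.where(angles <= half_angle_deg)[0]
    if near.size == 0:
        near = np.array([np.argmin(angles)])      # fall back to best-aligned point
    best = near[np.argmin(dists[near])]
    return tuple(boundary[best]), float(dists[best])
```

Restricting the search to the cone around N-to-M avoids traversing the whole boundary, which is the stated motivation for the centroid-line variant.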
Further, in the scenario in which the image is acquired with the probe side edge aligned with the nipple center, if the lesion-to-nipple distance is defined as the spatial straight-line distance between the lesion edge and the nipple center rather than as the distance between the position where the lesion edge maps to the body surface and the nipple center position, the processor 630 calculating the distance from the lesion to the preset target site based on the detection result may include: determining the point on the lesion boundary closest to the center point of the preset target site as the target point; and calculating the distance from the target point to the center point of the preset target site as the distance from the lesion to the preset target site. The target point may be determined as described above, which is not repeated here.
Alternatively, in another embodiment of the present application, when the preset target site is a nipple, the detecting of the preset target site in the ultrasound image by the processor 630 may include: and detecting the area where the preset target part is located, and acquiring a line segment for defining the preset target part to serve as the position of the preset target part. In this embodiment, the position of the nipple is defined as a line segment, and the position of the line segment is the position of the nipple. Accordingly, the processor 630, based on the detection result, calculates a distance from the lesion to the predetermined target site, which may include: determining a point on the boundary of the focus, which is closest to the central point of the preset target part, as a target point; and calculating the distance from the target point to the central point of the preset target part to be used as the distance from the focus to the preset target part.
In still another embodiment of the present application, when the preset target portion is a nipple, the nipple may be defined as a line segment, and then the detection result of the nipple position may be a position of one line segment. The detection result of the focus position is the area where the focus is located and the boundary thereof. Based on this, the processor 630 calculates the distance from the lesion to the preset target site based on the detection result, and may include: and calculating the minimum distance between the line segment and a graph formed by the boundary of the focus to be used as the distance between the focus and the preset target part.
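The minimum distance between the nipple line segment and the figure formed by the lesion boundary can be sketched as below, assuming the boundary is sampled densely enough that point-to-segment distances suffice; the function names are illustrative.

```python
import numpy as np

def point_segment_distance(p, a, b):
    """Euclidean distance from point p to the line segment ab."""
    p, a, b = (np.asarray(v, dtype=float) for v in (p, a, b))
    ab = b - a
    # parameter of the foot of the perpendicular, clamped to the segment
    t = np.clip(np.dot(p - a, ab) / max(np.dot(ab, ab), 1e-12), 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * ab)))

def segment_to_boundary_distance(seg_a, seg_b, boundary):
    """Minimum distance between the nipple line segment (seg_a, seg_b)
    and the figure formed by the sampled lesion boundary points."""
    return min(point_segment_distance(p, seg_a, seg_b) for p in boundary)
```

For a sparsely sampled boundary one would also have to test boundary edges against the segment endpoints, but with dense sampling the point-to-segment minimum is an adequate approximation of the segment-to-figure distance named in the text.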
In another embodiment of the present application, when the predetermined target region is a nipple, the user (e.g., a doctor) may be prompted to select a current nipple position after the distance measurement is initiated, and since the center of the nipple is at the top of the image, the processor 630 may calculate the closest distance between the click position of the doctor and the lesion (e.g., mapped to the skin) as the lesion-to-nipple distance. In this embodiment, the detecting of the preset target site in the breast ultrasound image by the processor 630 may include: and receiving the position of the preset target part input by the user as the position of the preset target part. Accordingly, the processor 630, based on the detection result, calculates a distance from the lesion to the predetermined target site, which may include: mapping points on the boundary of the focus to the top layer of the image of the mammary gland ultrasonic image to obtain a quasi-target point, determining a point which is closest to the position of the preset target part in the quasi-target point to obtain a target point, and calculating the distance from the target point to the center point of the preset target part to be used as the distance from the focus to the preset target part; or, determining a point on the boundary of the lesion closest to the center point of the preset target site as a target point, and calculating a distance from the target point to the center point of the preset target site as a distance from the lesion to the preset target site.
Furthermore, in an embodiment of the present application, if only a lesion is detected in the ultrasound image and no nipple is detected, the processor 630 may prompt, for example by voice or text through the human-machine interaction interface, that "no nipple image is detected in the current section", prompting the user to reselect a section for detection. When data is acquired with one end of the probe aligned with the nipple center, most ultrasound images show no obvious acoustic shadow behind the nipple; in this case a prompt may be given for the doctor to confirm. Further, if the detected lesion is far from the nipple (e.g., the distance between the two exceeds a preset threshold), the user may be prompted to switch to a wide view or to take the measurement using a stitched image, for example a two-screen stitched image.
Based on the above description, the ultrasonic measurement device according to the embodiment of the present application can automatically locate the lesion and the preset target site (skin and/or nipple) and automatically calculate the distance from the lesion to the preset target site, which not only simplifies the measurement process and improves measurement efficiency, but also improves the accuracy of the measurement result.
In an embodiment of the present application, after the processor 630 calculates the distance from the lesion to the preset target site, the result may be displayed on a display device, for example overlaid on the ultrasound image, for the user to view or record. Further, after calculating the distance, the processor 630 may generate an ultrasound report based on the result, further reducing the user's documentation burden, improving work efficiency, and reducing the possibility of reporting errors.
An ultrasonic measurement device provided according to another embodiment of the present application is described below with reference to fig. 7. FIG. 7 shows a schematic block diagram of an ultrasonic measurement device 700 according to an embodiment of the present application. As shown in fig. 7, the ultrasonic measurement device 700 includes a memory 710 and a processor 720. The memory 710 stores program code for implementing the corresponding steps in the ultrasound measurement methods 200 and 500 according to embodiments of the present application. The processor 720 is configured to execute the program code stored in the memory 710 to perform the corresponding steps of the ultrasound measurement methods 200 and 500.
Furthermore, according to an embodiment of the present application, there is also provided a storage medium having stored thereon program instructions for executing the respective steps of the ultrasound measurement methods 200 and 500 of the embodiments of the present application when the program instructions are executed by a computer or a processor. The storage medium may include, for example, a memory card of a smart phone, a storage component of a tablet computer, a hard disk of a personal computer, a Read Only Memory (ROM), an Erasable Programmable Read Only Memory (EPROM), a portable compact disc read only memory (CD-ROM), a USB memory, or any combination of the above storage media.
In addition, according to an embodiment of the present application, a computer program is further provided, which may be stored on a cloud or local storage medium. When executed by a computer or processor, the computer program performs the respective steps of the ultrasound measurement method of the embodiments of the present application.
According to the ultrasonic measurement method and device, the focus and the preset target part (skin and/or nipple) in the ultrasonic image can be automatically or semi-automatically detected, the distance from the focus to the preset target part (skin and/or nipple) is automatically calculated, the measurement process of the distance from the focus to the part (skin and/or nipple) is simplified, the measurement efficiency is improved, and the accuracy of the measurement result is improved.
Although the example embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the above-described example embodiments are merely illustrative and are not intended to limit the scope of the present application thereto. Various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present application. All such changes and modifications are intended to be included within the scope of the present application as claimed in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the units is only one type of logical function division, and other division manners may be available in actual implementation, for example, a plurality of units or components may be combined or integrated into another device, or some features may be omitted, or not executed.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the description of exemplary embodiments of the present application, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
It will be appreciated by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations in which at least some of the features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the present application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functionality of some of the modules in an item analysis apparatus according to embodiments of the present application. The present application may also be embodied as apparatus programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present application may be stored on a computer readable medium or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second, third, etc., does not indicate any ordering; these words may be interpreted as names.
The above description covers only specific embodiments of the present application; the protection scope of the present application is not limited thereto. Any changes or substitutions that a person skilled in the art could readily conceive within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (41)

1. An ultrasonic measurement method, characterized in that the method comprises:
controlling a probe to emit ultrasonic waves to a target area of a measured object, receiving echoes of the ultrasonic waves, and acquiring ultrasonic echo signals based on the echoes of the ultrasonic waves;
generating a breast ultrasonic image based on the ultrasonic echo signal, and detecting a focus and a preset target part in the breast ultrasonic image, wherein the preset target part comprises skin and/or nipple; and
calculating a distance from the lesion to the preset target site based on the result of the detection.
2. The method of claim 1, wherein said detecting a lesion in said breast ultrasound image comprises:
detecting the area where the focus is located and obtaining the boundary of the focus.
3. The method of claim 1, wherein the predetermined target site is skin, and the detecting the predetermined target site in the breast ultrasound image comprises:
detecting the position of the image top layer in the breast ultrasonic image as the position of the preset target part; or
detecting the position of a preset area at the upper end of the breast ultrasonic image whose brightness is greater than a preset threshold value, as the position of the preset target part.
4. The method of claim 3, wherein calculating the distance from the lesion to the predetermined target site based on the detection result comprises:
determining a point with the minimum depth-direction coordinate on the boundary of the focus as a target point; and
calculating the distance from the position of the target point to the position of the preset target part as the distance from the focus to the preset target part.
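Claims 3 and 4 reduce to a simple geometric rule once the ultrasound image is treated as a pixel grid: take the lesion-boundary point with the smallest depth coordinate and measure its vertical offset from the detected skin line. A minimal sketch of one possible reading (the coordinate convention and function names are illustrative, not from the patent, and a real system would still convert pixels to millimetres):

```python
def lesion_to_skin_distance(boundary, skin_row=0):
    """Pick the boundary point with the smallest depth coordinate
    (the shallowest point) and return its vertical offset from the
    skin line detected at depth skin_row."""
    target = min(boundary, key=lambda p: p[0])  # p = (depth_row, col)
    return target[0] - skin_row

# Shallowest boundary point is at depth 25; skin detected at depth 5.
print(lesion_to_skin_distance([(40, 10), (25, 14), (60, 12)], skin_row=5))  # 20
```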
5. The method of claim 1, wherein the predetermined target site is a nipple, and the detecting the predetermined target site in the breast ultrasound image comprises:
and detecting the area where the preset target part is located, and acquiring the position of the central point of the preset target part to be used as the position of the preset target part.
6. The method of claim 5, wherein the obtaining the position of the center point of the preset target portion comprises:
projecting all points within the area along the vertical direction of the breast ultrasonic image; and
calculating the position of the midpoint of the horizontal line segment formed by the projection on the image top layer of the breast ultrasonic image, as the position of the central point of the preset target part.
7. The method of claim 5 or 6, wherein the calculating the distance from the lesion to the preset target site based on the detection result comprises:
determining a point on the boundary of the focus that is closest to the central point of the preset target part as a target point; and
calculating the distance from the target point to the central point of the preset target part as the distance from the focus to the preset target part.
8. The method of claim 7, wherein determining a point on the boundary of the lesion closest to a center point of the predetermined target site as a target point comprises:
and traversing all points on the boundary of the focus to determine a point closest to the central point of the preset target part as the target point.
9. The method of claim 7, wherein determining a point on the boundary of the lesion closest to a center point of the predetermined target site as a target point comprises:
determining a centroid point of the area enclosed by the boundary of the focus, and connecting the centroid point with the central point of the preset target part to form a line segment; and
determining, within a preset area near the intersection point of the line segment and the boundary of the focus, the point closest to the central point of the preset target part as the target point.
10. The method of claim 5 or 6, wherein the calculating the distance from the lesion to the preset target site based on the detection result comprises:
mapping points on the boundary of the focus to the image top layer of the breast ultrasonic image to obtain quasi target points, and determining the quasi target point closest to the position of the preset target part as the target point; and
calculating the distance from the target point to the central point of the preset target part as the distance from the focus to the preset target part.
11. The method of claim 1, wherein the predetermined target site is a nipple, and the detecting the predetermined target site in the breast ultrasound image comprises:
and detecting the area where the preset target part is located, and acquiring a line segment for defining the preset target part to serve as the position of the preset target part.
12. The method of claim 11, wherein calculating the distance from the lesion to the predetermined target site based on the detection result comprises:
and calculating the minimum distance between the line segment and a graph formed by the boundary of the focus to be used as the distance between the focus and the preset target part.
13. The method of claim 1, wherein the predetermined target site is a nipple, and the detecting the predetermined target site in the breast ultrasound image comprises:
and detecting one side with an image mark in the breast ultrasonic image, and acquiring the position of a point on the top layer of the image of the breast ultrasonic image in the points on the one side as the position of the preset target part.
14. The method of claim 1, wherein the predetermined target site is a nipple, and the detecting the predetermined target site in the breast ultrasound image comprises:
if it is determined that the marked end of the probe is aligned with the center of the nipple when the user uses the probe, acquiring the position of a point on the image top layer of the breast ultrasonic image on the side with the image mark as the position of the preset target part; or
if it is determined that the unmarked end of the probe is aligned with the center of the nipple when the user uses the probe, acquiring the position of a point on the image top layer of the breast ultrasonic image on the side opposite to the side with the image mark as the position of the preset target part.
15. The method of claim 13 or 14, wherein the calculating the distance from the lesion to the predetermined target site based on the detection result comprises:
mapping points on the boundary of the focus to the image top layer of the breast ultrasonic image to obtain quasi target points, and determining the quasi target point closest to the position of the preset target part as the target point; and
calculating the distance from the target point to the preset target part as the distance from the focus to the preset target part.
16. The method of claim 1, wherein the predetermined target site is a nipple, the method further comprising:
if only the focus is detected in the breast ultrasonic image and no nipple area is detected, issuing a prompt that no nipple area is detected in the current section; or
if it is detected that the distance from the focus to the nipple in the breast ultrasonic image exceeds a predefined threshold, prompting the user to enter a wide-view mode or to perform the measurement using a stitched image.
17. The method of claim 1, wherein the predetermined target site is a nipple, and the detecting the predetermined target site in the breast ultrasound image comprises:
and receiving the position of the preset target part input by the user as the position of the preset target part.
18. The method of claim 1, wherein the detection of the lesion and/or the predetermined target site is based on a fully automated detection or a semi-automated detection, wherein the semi-automated detection comprises:
receiving a predefined region of the lesion and/or the preset target site input by a user; and
detecting the lesion and/or the predetermined target site based on the predefined area.
19. The method according to claim 1 or 18, wherein the detection of the lesion and/or the predetermined target site is based on at least one of the following algorithms: a boundary segmentation algorithm, a target detection algorithm, a machine learning algorithm, and a deep learning algorithm.
20. The method of claim 1, further comprising:
after calculating the distance from the focus to the preset target site, displaying the calculated result.
21. The method according to claim 1 or 20, further comprising:
after calculating the distance from the lesion to the preset target site, generating an ultrasound report based on the result of the calculation.
22. The method according to claim 1 or 20, further comprising: displaying the breast ultrasonic image, wherein the calculated result is displayed on the breast ultrasonic image.
23. An ultrasonic measurement method, characterized in that the method comprises:
controlling a probe to emit ultrasonic waves to a target area of a measured object, receiving echoes of the ultrasonic waves, and acquiring ultrasonic echo signals based on the echoes of the ultrasonic waves;
generating an ultrasonic image based on the ultrasonic echo signal, and detecting a focus and a preset target part in the ultrasonic image; and
calculating a distance from the lesion to the preset target site based on the result of the detection.
24. An ultrasonic measuring device, comprising an ultrasonic probe, a transmission/reception sequence controller, and a processor, wherein:
the transmitting/receiving sequence controller is used for exciting the ultrasonic probe to transmit ultrasonic waves to a target area of a measured object, receiving echoes of the ultrasonic waves and acquiring ultrasonic echo signals based on the echoes of the ultrasonic waves;
the processor is used for generating a breast ultrasonic image based on the ultrasonic echo signal, detecting a focus and a preset target part in the breast ultrasonic image, and calculating the distance from the focus to the preset target part based on the detection result, wherein the preset target part comprises skin and/or nipple.
25. The apparatus of claim 24, wherein the processor detects a lesion in the breast ultrasound image, comprising:
detecting the area where the focus is located and obtaining the boundary of the focus.
26. The apparatus of claim 24, wherein the predetermined target site is skin, and the processor detects the predetermined target site in the breast ultrasound image comprises:
detecting the position of the image top layer in the breast ultrasonic image as the position of the preset target part; or
detecting the position of a preset area at the upper end of the breast ultrasonic image whose brightness is greater than a preset threshold value, as the position of the preset target part.
27. The apparatus of claim 26, wherein the processor calculates a distance of the lesion to the predetermined target site based on the result of the detection, comprising:
determining a point with the minimum depth-direction coordinate on the boundary of the focus as a target point; and
calculating the distance from the position of the target point to the position of the preset target part as the distance from the focus to the preset target part.
28. The apparatus of claim 24, wherein the predetermined target site is a nipple, and the processor detects the predetermined target site in the breast ultrasound image, comprising:
and detecting the area where the preset target part is located, and acquiring the position of the central point of the preset target part to be used as the position of the preset target part.
29. The apparatus of claim 28, wherein the processor obtains the position of the center point of the preset target portion, comprising:
projecting all points within the area along the vertical direction of the breast ultrasonic image; and
calculating the position of the midpoint of the horizontal line segment formed by the projection on the image top layer of the breast ultrasonic image, as the position of the central point of the preset target part.
30. The apparatus of claim 28 or 29, wherein the processor calculates a distance of the lesion to the predetermined target site based on the result of the detection, comprising:
determining a point on the boundary of the focus that is closest to the central point of the preset target part as a target point; and
calculating the distance from the target point to the central point of the preset target part as the distance from the focus to the preset target part.
31. The apparatus of claim 30, wherein the processor determines a point on the boundary of the lesion closest to a center point of the predetermined target site as a target point, comprising:
and traversing all points on the boundary of the focus to determine a point closest to the central point of the preset target part as the target point.
32. The apparatus of claim 30, wherein the processor determines a point on the boundary of the lesion closest to a center point of the predetermined target site as a target point, comprising:
determining a centroid point of the area enclosed by the boundary of the focus, and connecting the centroid point with the central point of the preset target part to form a line segment; and
determining, within a preset area near the intersection point of the line segment and the boundary of the focus, the point closest to the central point of the preset target part as the target point.
33. The apparatus of claim 28 or 29, wherein the processor calculates a distance of the lesion to the predetermined target site based on the result of the detection, comprising:
mapping points on the boundary of the focus to the image top layer of the breast ultrasonic image to obtain quasi target points, and determining the quasi target point closest to the position of the preset target part as the target point; and
calculating the distance from the target point to the central point of the preset target part as the distance from the focus to the preset target part.
34. The apparatus of claim 24, wherein the predetermined target site is a nipple, and the processor detects the predetermined target site in the breast ultrasound image, comprising:
and detecting the area where the preset target part is located, and acquiring a line segment for defining the preset target part to serve as the position of the preset target part.
35. The apparatus of claim 34, wherein the processor calculates a distance of the lesion to the predetermined target site based on the detection result, comprising:
and calculating the minimum distance between the line segment and a graph formed by the boundary of the focus to be used as the distance between the focus and the preset target part.
36. The apparatus of claim 24, wherein the predetermined target site is a nipple, and the processor detects the predetermined target site in the breast ultrasound image, comprising:
and detecting one side with an image mark in the breast ultrasonic image, and acquiring the position of a point on the top layer of the image of the breast ultrasonic image in the points on the one side as the position of the preset target part.
37. The apparatus of claim 24, wherein the predetermined target site is a nipple, and the processor detects the predetermined target site in the breast ultrasound image, comprising:
if it is determined that the marked end of the probe is aligned with the center of the nipple when the user uses the probe, acquiring the position of a point on the image top layer of the breast ultrasonic image on the side with the image mark as the position of the preset target part; or
if it is determined that the unmarked end of the probe is aligned with the center of the nipple when the user uses the probe, acquiring the position of a point on the image top layer of the breast ultrasonic image on the side opposite to the side with the image mark as the position of the preset target part.
38. The apparatus of claim 36 or 37, wherein the processor calculates a distance of the lesion to the predetermined target site based on the result of the detection, comprising:
mapping points on the boundary of the focus to the image top layer of the breast ultrasonic image to obtain quasi target points, and determining the quasi target point closest to the position of the preset target part as the target point; and
calculating the distance from the target point to the preset target part as the distance from the focus to the preset target part.
39. The apparatus of claim 24, wherein the predetermined target site is a nipple, the processor further configured to:
if only the focus is detected in the breast ultrasonic image and no nipple area is detected, issuing a prompt that no nipple area is detected in the current section; or
if it is detected that the distance from the focus to the nipple in the breast ultrasonic image exceeds a predefined threshold, prompting the user to enter a wide-view mode or to perform the measurement using a stitched image.
40. An ultrasonic measurement apparatus, characterized in that the apparatus comprises a memory and a processor, the memory having stored thereon a computer program for execution by the processor, the computer program, when executed by the processor, performing the ultrasonic measurement method of any one of claims 1-23.
41. A storage medium, characterized in that the storage medium has stored thereon a computer program which, when executed, performs the ultrasound measurement method of any of claims 1-23.
CN201911358646.6A 2019-12-25 2019-12-25 Ultrasonic measurement method, device and storage medium Pending CN113017683A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911358646.6A CN113017683A (en) 2019-12-25 2019-12-25 Ultrasonic measurement method, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911358646.6A CN113017683A (en) 2019-12-25 2019-12-25 Ultrasonic measurement method, device and storage medium

Publications (1)

Publication Number Publication Date
CN113017683A true CN113017683A (en) 2021-06-25

Family

ID=76458839

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911358646.6A Pending CN113017683A (en) 2019-12-25 2019-12-25 Ultrasonic measurement method, device and storage medium

Country Status (1)

Country Link
CN (1) CN113017683A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113393462A (en) * 2021-08-18 2021-09-14 浙江杜比医疗科技有限公司 Focal region segmentation method and device, electronic equipment and storage medium


Similar Documents

Publication Publication Date Title
US20210022704A1 (en) Ultrasound diagnostic system, ultrasound image generation apparatus, and ultrasound image generation method
JP6367425B2 (en) Ultrasonic diagnostic equipment
JP6295956B2 (en) Ultrasonic diagnostic apparatus and control method of ultrasonic diagnostic apparatus
RU2667617C2 (en) System and method of elastographic measurements
TWI473598B (en) Breast ultrasound image scanning and diagnostic assistance system
JP5002260B2 (en) Medical diagnostic imaging equipment
JP5949558B2 (en) Ultrasonic diagnostic apparatus and control method of ultrasonic diagnostic apparatus
JPWO2011052602A1 (en) Ultrasonic imaging apparatus, ultrasonic imaging method, and ultrasonic imaging program
WO2014024453A1 (en) Medical data processing device, medical data processing method, and ultrasound diagnostic device
CN112568933B (en) Ultrasonic imaging method, apparatus and storage medium
US20210169454A1 (en) Ultrasonic analysis apparatus, ultrasonic analysis method, and ultrasonic analysis program
KR20150068162A (en) Apparatus for integration of three dimentional ultrasound images and method thereof
US20150055841A1 (en) Orientation-Aware Average Intensity Histogram to Indicate Object Boundary Depth in Ultrasound Images
CN106659480A (en) Ultrasound diagnostic image generating device and method
JP7321836B2 (en) Information processing device, inspection system and information processing method
CN113017683A (en) Ultrasonic measurement method, device and storage medium
CN111956309B (en) Image acquisition method, device, equipment and medium
CN113693627A (en) Ultrasonic image-based focus processing method, ultrasonic imaging device and storage medium
CN114375179A (en) Ultrasonic image analysis method, ultrasonic imaging system, and computer storage medium
CN114245726A (en) Prostate elasticity measuring method and ultrasonic imaging system
WO2019056431A1 (en) Echo intensity processing method, device, computer readable medium and electronic apparatus
CN113768544A (en) Ultrasonic imaging method and equipment for mammary gland
CN114680936A (en) Blood vessel ultrasonic data processing method, device and storage medium
CN112754522A (en) Doppler calculus imaging method and ultrasonic imaging device
US20190183453A1 (en) Ultrasound imaging system and method for obtaining head progression measurements

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination