CN112294360A - Ultrasonic imaging method and device

Info

Publication number: CN112294360A
Application number: CN201910667500.3A
Authority: CN (China)
Prior art keywords: scanning, thyroid, ultrasonic, items, image
Legal status: Pending (the status listed is an assumption, not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 安兴, 丛龙飞, 温博
Current Assignee: Shenzhen Mindray Bio Medical Electronics Co Ltd
Original Assignee: Shenzhen Mindray Bio Medical Electronics Co Ltd
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority to CN201910667500.3A
Publication of CN112294360A


Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 - Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/52 - Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves

Abstract

An ultrasound imaging method and apparatus are provided. The method comprises the following steps: acquiring multi-frame ultrasound images of the thyroid; analyzing the multi-frame ultrasound images to determine whether a target of interest exists in them; when a target of interest exists, determining the attribute content of the target of interest and the corresponding thyroid scanning position; and displaying the attribute content of the target of interest at the corresponding position of a preset thyroid structure diagram according to the thyroid scanning position corresponding to the target of interest. The method provided by the embodiments of the invention can therefore display the attribute content of the target of interest intuitively.

Description

Ultrasonic imaging method and device
Technical Field
The invention relates to the field of medical technology, and in particular to an ultrasonic imaging method and device.
Background
The thyroid gland is a very important gland of the human body. The main physiological functions of the thyroid hormones it secretes are: 1. promoting metabolism, increasing oxygen consumption in most tissues and increasing heat production; 2. promoting growth and development, which is important for the development of the long bones, the brain and the reproductive organs, especially in infancy, where a deficiency of thyroid hormone can cause cretinism; 3. increasing the excitability of the central nervous system. In addition, thyroid hormones strengthen and regulate other hormones, accelerate the heart rate, strengthen cardiac contraction and increase cardiac output. The thyroid gland also produces calcitonin, which regulates the calcium balance of the body. The thyroid gland is therefore a very important endocrine organ, responsible for regulating the basic physiological activities of the human body.
Thyroid nodules are lumps within the thyroid gland that move up and down with the gland during swallowing. They are common in clinical practice, with the highest incidence in middle-aged women, typically 3-4 times that in men. The incidence of thyroid disease increases year by year; the cause of this rapid increase is not clear, but several factors are associated with the onset of thyroid cancer, such as radiation, obesity, and excessive or insufficient iodine intake. Thyroid nodules can be single or multiple; multiple nodules are more common than single nodules, but a single nodule carries a higher probability of being thyroid cancer. Thyroid nodules, lumps and goiter are the main symptoms of thyroid cancer. Numerous epidemiological studies have shown that the incidence of thyroid nodules is close to 50%, i.e., nearly half of the population has thyroid nodules. Of these, approximately 10% are malignant, i.e., thyroid cancer. Therefore, the diagnosis, treatment and prevention of thyroid nodules are particularly important.
The diagnosis of thyroid cancer currently relies mainly on palpation, medical imaging (ultrasound, CT), needle biopsy and intraoperative pathology. Ultrasound examination is noninvasive, inexpensive and repeatable, making it the preferred approach for routine thyroid examination, nodule diagnosis and preoperative nodule assessment; thyroid nodules are most commonly detected with ultrasound. However, imaging-based detection depends on the knowledge and experience of the examining physician and is highly subjective. How to improve the standardization of scanning, the efficiency of nodule detection and the accuracy of thyroid ultrasound examination has therefore long been a research hotspot in academia and industry. These studies mainly concern two aspects: (1) designing auxiliary detection equipment that fits the neck and covers the entire thyroid gland, guaranteeing standardized acquisition and complete data and avoiding missed diagnoses; (2) optimizing intelligent thyroid diagnosis algorithms, designing fully automatic and semi-automatic algorithms that analyze ultrasound images with image processing, machine learning and deep learning techniques, so as to continuously improve the detection rate and efficiency for thyroid nodules.
It should be noted that, although the above approaches improve thyroid nodule detection to some extent, there is still room for improvement. In particular, using auxiliary equipment to improve the nodule detection rate of thyroid scanning requires additional hardware, and its operation differs slightly from conventional ultrasound scanning, so physicians need a period of adaptation. In addition, although the various advanced algorithms improve the nodule detection rate and, more importantly, effectively improve physicians' diagnostic efficiency, missed diagnoses can still occur if the thyroid scan itself is not sufficiently comprehensive.
Disclosure of Invention
According to a first aspect of the present invention, there is provided a method of ultrasound imaging, the method comprising:
providing scanning guidance;
transmitting an ultrasonic beam to the thyroid according to the scanning guide to perform scanning;
receiving an ultrasonic echo returned from the thyroid, and acquiring an ultrasonic echo signal based on the ultrasonic echo;
processing the ultrasonic echo signal to obtain a multi-frame ultrasonic image of the thyroid;
analyzing the multi-frame ultrasound images to determine whether one or more targets of interest exist in the multi-frame ultrasound images of the thyroid;
when one or more targets of interest exist in the multi-frame ultrasound images of the thyroid, determining the attribute content of each target of interest, and determining the thyroid scanning position corresponding to each target of interest;
displaying the attribute content of each target of interest at the corresponding position of a preset thyroid structure diagram according to the thyroid scanning position corresponding to that target of interest; and
when a selection input for the attribute content of a target of interest displayed on the preset thyroid structure diagram is received, outputting the ultrasound image frame corresponding to the selected attribute content.
According to a second aspect of the present invention, there is provided an ultrasound image processing method including:
acquiring ultrasonic images of different sections of the thyroid;
analyzing the ultrasound images of the different sections to obtain targets of interest from the ultrasound images of one or more sections;
determining the scanning position information, on the thyroid, of each ultrasound image in which a target of interest is present, to serve as the thyroid scanning position of that target of interest, and determining the attribute content of the target of interest; and
displaying, in association, the thyroid scanning position and the attribute content of the same target of interest.
According to a third aspect of the present invention, there is provided an ultrasound imaging apparatus, capable of being used to implement the steps of the method of the aforementioned first aspect or any implementation thereof, the ultrasound imaging apparatus comprising:
an ultrasonic probe;
a transmission/reception controller, configured to excite the ultrasonic probe to transmit ultrasound to one or more target parts of an object under examination and to receive the ultrasound echoes returned from the one or more target parts to obtain ultrasound echo signals;
a memory for storing a program executed by the processor;
a processor, configured to:
process the ultrasound echo signals to obtain ultrasound images of the one or more target parts; and
analyze the ultrasound images of the one or more target parts to determine the attribute content of the one or more target parts and the scanning positions of the ultrasound images of the one or more target parts; and
a display, configured to display the attribute content corresponding to the one or more target parts at the corresponding positions of a preset structure diagram of the object under examination according to the scanning positions of the ultrasound images of the one or more target parts.
According to a fourth aspect of the present invention, there is provided an apparatus for ultrasound image processing, the apparatus being adapted to implement the steps of the method of the second aspect or any implementation thereof, the apparatus comprising:
an acquisition module, configured to acquire ultrasound images of different sections of the thyroid;
an analysis module, configured to analyze the ultrasound images of the different sections and obtain targets of interest from the ultrasound images of one or more sections;
a determining module, configured to determine the scanning position information of an ultrasound image containing a target of interest, to serve as the scanning position information of that target of interest, and to determine the attribute content of the target of interest; and
a display module, configured to display, in association, the scanning position information and the attribute content of the target of interest.
According to a fifth aspect of the present invention, there is provided an apparatus for ultrasound image processing, comprising a memory, a processor and a computer program stored in the memory and running on the processor, wherein the processor implements the steps of the method for ultrasound image processing according to the second aspect or any implementation manner thereof when executing the computer program.
According to a sixth aspect of the present invention, there is provided a computer storage medium having stored thereon a computer program which, when executed by a computer or a processor, carries out the steps of the method of ultrasound imaging according to the first aspect or any implementation thereof, or the steps of the method of ultrasound image processing according to the second aspect or any implementation thereof.
Therefore, with the method of the embodiments of the invention, the ultrasound images can be analyzed to determine targets of interest and obtain their attribute content, and the associated attribute content and the original ultrasound images can be displayed intuitively, so that the user can intuitively see where each target of interest was scanned and can quickly retrieve its image information. These operations require no excessive additional hardware, add no extra hardware cost, and are easy to extend.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent by describing in more detail embodiments of the present invention with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings, like reference numbers generally represent like parts or steps.
FIG. 1 is a schematic block diagram of an electronic device of an embodiment of the present invention;
FIG. 2 is a schematic block diagram of an ultrasound imaging apparatus of an embodiment of the present invention;
FIG. 3 is a schematic flow chart of a method of ultrasound image processing in accordance with an embodiment of the present invention;
FIG. 4 is a schematic flow chart of an ultrasound imaging method of an embodiment of the present invention;
FIG. 5 is a schematic view of a work interface of an embodiment of the present invention;
FIG. 6 is an enlarged view of area 300 of FIG. 5, including unscanned items;
FIG. 7 is an enlarged view of the area 300 of FIG. 5, including unscanned and scanned items;
FIG. 8 is a schematic illustration of a displayed ultrasound results image of an embodiment of the present invention;
FIG. 9 is an enlarged view of area 600 of FIG. 8;
FIG. 10 is another schematic flow chart diagram of an ultrasound imaging method of an embodiment of the present invention;
FIG. 11 is a schematic block diagram of an apparatus for ultrasound image processing in accordance with an embodiment of the present invention;
fig. 12 is another schematic block diagram of an apparatus for ultrasound image processing according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, exemplary embodiments according to the present invention will be described in detail below with reference to the accompanying drawings. It is to be understood that the described embodiments are merely a subset of embodiments of the invention and not all embodiments of the invention, with the understanding that the invention is not limited to the example embodiments described herein. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the invention described herein without inventive step, shall fall within the scope of protection of the invention.
Fig. 1 is a schematic block diagram of an electronic device according to an embodiment of the present invention, which can implement the method according to the embodiment of the present invention. The electronic device 10 shown in FIG. 1 includes one or more processors 102, one or more memory devices 104, an input device 106, an output device 108, and an image sensor 110, which are interconnected via a bus system 112 and/or other form of connection mechanism (not shown). It should be noted that the components and configuration of the electronic device 10 shown in FIG. 1 are exemplary only, and not limiting, and that the electronic device may have other components and configurations as desired.
The processor 102 may include a Central Processing Unit (CPU) 1021 and a Graphics Processing Unit (GPU) 1022, or other forms of processing units having data processing capability and/or instruction execution capability, such as a Field-Programmable Gate Array (FPGA) or an Advanced RISC Machine (ARM) processor, and the processor 102 may control other components in the electronic device 10 to perform desired functions.
The storage 104 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory 1041 and/or non-volatile memory 1042. The volatile Memory 1041 may include, for example, a Random Access Memory (RAM), a cache Memory (cache), and/or the like. The non-volatile Memory 1042 may include, for example, a Read-Only Memory (ROM), a hard disk, a flash Memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium and executed by processor 102 to implement the functions of ultrasound imaging or ultrasound image processing and/or various other desired functions in embodiments of the present invention (implemented by the processor) as described below. Various applications and various data, such as various data used and/or generated by the applications, may also be stored in the computer-readable storage medium.
The input device 106 may be a device used by a user to input instructions and may include one or more of a keyboard, a mouse, a microphone, a touch screen, and the like. The input device 106 may be any interface for receiving information. For example, in embodiments of the present invention, the input device 106 may receive input from a user.
The output device 108 may output various information (e.g., images or sounds) to an outside (e.g., a user), and may include one or more of a display, speakers, printer, and the like. The output device 108 may be any other device having an output function. For example, in an embodiment of the present invention, the output device 108 may display ultrasound images to a user operating the electronic device.
The image sensor 110 may image a target site and store the acquired image in the storage device 104 for use by other components. For example, in an embodiment of the present invention, the image sensor 110 may be an image acquisition device that acquires an ultrasound image of the thyroid gland. For example, it may comprise an ultrasound probe or the like.
It should be noted that the components and structure of the electronic device 10 shown in fig. 1 are only exemplary. Although the electronic device 10 shown in fig. 1 includes a plurality of different devices, those skilled in the art may modify or change them as needed; for example, some of the devices may not be necessary, and others may be present in greater numbers. The invention is not limited in this respect.
Illustratively, the electronic device 10 may be implemented as an ultrasound system for performing ultrasound imaging, although the invention is not limited thereto.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an ultrasound imaging apparatus according to an embodiment of the present invention, the ultrasound imaging apparatus includes an ultrasound probe 01, a transmitting circuit 02, a receiving circuit 03, a beam forming module 04, a processor 05, and a human-computer interaction device 06, where the transmitting circuit 02 and the receiving circuit 03 may be connected to the ultrasound probe 01 through a transmitting/receiving selection switch 07.
During ultrasound imaging, the transmitting circuit 02 sends a delay-focused transmit pulse of a certain amplitude and polarity to the ultrasonic probe 01 through the transmitting/receiving selection switch 07, exciting the ultrasonic probe 01 to transmit an ultrasound beam to a target tissue (for example, an organ, tissue or blood vessel in a human or animal body); in the embodiment of the present invention, the beam is transmitted to the thyroid. After a certain delay, the receiving circuit 03 receives the echo of the ultrasound beam through the transmitting/receiving selection switch 07 to obtain an ultrasound echo signal and sends it to the beam forming module 04. The beam forming module 04 performs focusing delay, weighting, channel summation and other processing on the ultrasound echo signal to obtain a beamformed ultrasound echo signal, which is then sent to the processor 05 for further processing to obtain the required ultrasound image. When thyroid ultrasound imaging is performed in the embodiment of the present invention, ultrasound beams are transmitted to the thyroid continuously, so that an ultrasound image sequence containing multiple frames of ultrasound images of the thyroid is obtained. Transmitting an ultrasound beam to the thyroid is not limited to transmitting the beam only to the thyroid organ itself; illustratively, transmitting an ultrasound beam to a region that includes the thyroid is considered consistent with transmitting an ultrasound beam to the thyroid as described herein.
The ultrasound probe 01 typically comprises an array of a plurality of array elements. Each time ultrasound is transmitted, all of the array elements of the ultrasonic probe 01, or a subset of them, participate in the transmission. Each participating array element is excited by the transmit pulse and transmits ultrasound, and the waves transmitted by the individual elements superpose during propagation to form the synthesized ultrasound beam transmitted toward the scanning target.
The human-computer interaction device 06 is connected to the processor 05, for example, the processor 05 may be connected to the human-computer interaction device 06 through an external input/output port, and the human-computer interaction device 06 may detect input information of a user, for example, the input information may be a control instruction for transmitting and receiving timing of the ultrasound waves, an operation input instruction for editing and labeling the ultrasound images, or the like, or may further include other instruction types. Generally, the operation instruction obtained when the user performs operation input such as editing, labeling, measuring and the like on the ultrasound image is used for measuring the target tissue. The human-computer interaction device 06 may include one or more of a keyboard, a mouse, a scroll wheel, a track ball, a mobile input device (such as a mobile device with a touch display screen, a mobile phone, etc.), a multifunctional knob, etc., so that the corresponding external input/output port may be a wireless communication module, a wired communication module, or a combination of the two. The external input/output port may also be implemented based on USB, bus protocols such as CAN, and/or wired network protocols, etc.
The human-computer interaction device 06 may comprise a display, which may display the ultrasound images obtained by the processor 05. In addition, while displaying the ultrasound image, the display can provide the user with a graphical interface for human-computer interaction; one or more controlled objects are arranged on the graphical interface, and the user uses the human-computer interaction device 06 to input operation instructions that control these controlled objects, so that the corresponding control operations are executed. For example, an icon displayed on the graphical interface can be operated through the human-computer interaction device to execute a specific function, such as labeling the ultrasound image. For example, scanning guidance for the thyroid is displayed on the display, instructing the user to scan the thyroid comprehensively according to the scanning guidance. In practice, the display may be a touch-screen display. In addition, the display in this embodiment may be a single display or a plurality of displays.
In the embodiment of the present invention, the processor 05 is configured to process the ultrasound echo signal processed by the beam forming module 04 to obtain an ultrasound image sequence containing multiple frames of ultrasound images of the thyroid. The processor 05 is further configured to analyze the multi-frame ultrasound images to determine whether one or more targets of interest exist in them; when one or more targets of interest exist in the multi-frame ultrasound images of the thyroid, the processor 05 is further configured to determine the attribute content of each target of interest and the thyroid scanning position corresponding to each target of interest. The human-computer interaction device 06 may then provide a preset thyroid structure diagram, and the processor 05 controls the display of the attribute content of each target of interest at the corresponding position of the preset thyroid structure diagram according to the thyroid scanning position corresponding to that target. The human-computer interaction device 06 is further configured to receive a selection input for the attribute content of a target of interest displayed on the preset thyroid structure diagram, and upon receiving the selection input, the processor 05 is further configured to control the display to output the original ultrasound image corresponding to the selected attribute content.
Illustratively, the transmission/reception selection switch 07 may also be referred to as a transmission/reception controller or the like, to which the present invention is not limited.
Based on the ultrasound imaging apparatus of the above embodiment, fig. 3 is a schematic flow chart of a method for processing an ultrasound image according to an embodiment of the present invention. The method shown in fig. 3 comprises:
s110, acquiring an ultrasonic image sequence of the thyroid;
s120, analyzing the multi-frame ultrasonic images included in the ultrasonic image sequence to judge whether one or more frames of ultrasonic images including the interested target exist in the ultrasonic image sequence;
s130, determining the attribute content of the thyroid at the interested target.
Illustratively, the ultrasound image sequence acquired in S110 may be acquired in real time or may be acquired from a storage device. The following detailed description is to be read in connection with specific embodiments.
Fig. 4 is a schematic flow chart of an ultrasonic imaging method according to an embodiment of the present invention. The method shown in fig. 4 includes:
s200, providing scanning guidance;
s210, emitting ultrasonic beams to the thyroid according to scanning directions to perform scanning;
s220, receiving an ultrasonic echo returned from the thyroid, and acquiring an ultrasonic echo signal based on the ultrasonic echo;
s230, processing the ultrasonic echo signal to obtain a multi-frame ultrasonic image of the thyroid;
s240, analyzing the multi-frame ultrasonic image to judge whether an interested target exists in the multi-frame ultrasonic image of the thyroid;
s250, when one or more interested targets exist in the multi-frame ultrasonic image of the thyroid, determining the attribute content of each interested target, and determining the thyroid scanning position corresponding to each interested target;
and S260, displaying the attribute content of the interested target at the corresponding position of a preset thyroid structure diagram according to the thyroid scanning position corresponding to each interested target.
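Purely as an illustrative sketch, the flow of S200 to S260 can be summarized in the following Python skeleton. All names here (thyroid_scan_pipeline, acquire_frames, detect_targets and so on) are hypothetical placeholders, not actual product code; the sketch only shows how the steps chain together, and the concrete implementations of the callables would come from the modules described below.

```python
from typing import Any, Callable, Dict, Iterable, List

def thyroid_scan_pipeline(
    scan_items: Iterable[Any],                          # S200: scan guidance, items in scanning order
    acquire_frames: Callable[[Any], List[Any]],         # S210-S230: transmit, receive, beamform, image
    detect_targets: Callable[[Any], List[Any]],         # S240: e.g. nodule detection in one frame
    extract_attributes: Callable[[Any, Any], Dict],     # S250: attribute content of one target
    locate_on_thyroid: Callable[[Any, int, int], Any],  # S250: thyroid scanning position
    show_on_diagram: Callable[[List[Dict]], None],      # S260: display on preset thyroid diagram
) -> List[Dict]:
    """Sketch of steps S200-S260; all callables are hypothetical placeholders."""
    results: List[Dict] = []
    for item in scan_items:
        frames = acquire_frames(item)
        for n, frame in enumerate(frames):
            for target in detect_targets(frame):
                results.append({
                    "attributes": extract_attributes(frame, target),
                    "scan_position": locate_on_thyroid(item, n, len(frames)),
                    "original_frame": frame,       # kept so the original image can be recalled later
                })
    show_on_diagram(results)
    return results
```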
Illustratively, S210 to S230 may be regarded as the process of obtaining an ultrasound image sequence by real-time acquisition. Specifically, the real-time acquisition may be performed with an ultrasonic probe, and the ultrasound images are obtained after the acquired signals are processed by the beam forming module and the processor. In S210, an ultrasound beam may be transmitted to the thyroid by the transmitting circuit 02 through the ultrasonic probe 01. In S220, the ultrasonic probe 01 may receive the ultrasound echo, which is converted into an ultrasound echo signal through the receiving circuit 03. In S230, the beam forming module may perform signal processing and then send the beamformed ultrasound echo signal to the processor 05 for further processing, so as to obtain multi-frame ultrasound images of the thyroid. Optionally, the multi-frame ultrasound images may be Brightness-Mode ultrasound images, referred to simply as B-mode ultrasound images.
An ultrasound imaging device may include a probe (also referred to as an ultrasound probe). For example, in S210 to S230, the user may place the ultrasound probe against a test site including the thyroid gland; an ultrasound beam is transmitted to the thyroid, and echoes representing the internal structure of the thyroid are then received. The grayscale image obtained by processing the ultrasound echo signals reflects the internal structure of the thyroid gland.
Illustratively, the user may be guided through the real-time acquisition process. That is, prompts are presented so that the user performs a standardized operation according to the prompts, thereby obtaining the desired ultrasound image sequence.
Optionally, the scanning guidance provided by the ultrasound imaging method may include: presenting a plurality of scan items of the thyroid and the scanning order of each scan item, so that the plurality of scan items are scanned one by one according to the scanning order. Each scan item comprises a scanning part and a scanning direction. The scanning part indicates a local region of the thyroid gland, for example the left side, the right side or the isthmus. The scanning direction represents the orientation of the ultrasound probe relative to the thyroid. For example, taking the median sagittal plane of the thyroid as a reference, the direction perpendicular to the median sagittal plane is one scanning direction, called transverse; the direction parallel to the median sagittal plane is another scanning direction, called longitudinal. Of course, other scanning directions can be defined according to other scanning requirements. Taking the above scanning parts and scanning directions as an example, in one embodiment the scan items presented by the present invention include five items, namely the left longitudinal position, the left transverse position, the right longitudinal position, the right transverse position and the isthmus position. The present invention defines a scanning order in order to ensure comprehensive scanning of the thyroid in a controllable manner and to avoid missed detection; any defined scanning order falls within the protection scope of the present invention as long as it covers all the scan items to be scanned. For example, the scanning order may be the left longitudinal position, left transverse position, right longitudinal position, right transverse position and isthmus position scanned in sequence; or the left transverse position, left longitudinal position, right transverse position, right longitudinal position and isthmus position scanned in sequence; the order of the left and right parts may be exchanged, or the isthmus may be scanned first. The scan items and the scanning order can be predetermined and stored in the imaging device; users may subsequently adjust the scan items and scanning order according to personal preference, a better scanning approach, and so on. A sketch of one way to represent these scan items is given below.
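As a non-authoritative illustration, the five scan items and one of the default scanning orders described above could be represented by a simple record such as the following. The type name ScanItem, the field names and the direction assumed for the isthmus item are all assumptions introduced here for illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ScanItem:
    order: int      # position in the default scanning sequence
    part: str       # local region of the thyroid: "left", "right" or "isthmus"
    direction: str  # probe orientation relative to the median sagittal plane

# One possible default order: left longitudinal, left transverse,
# right longitudinal, right transverse, isthmus.
DEFAULT_SCAN_ITEMS = [
    ScanItem(1, "left", "longitudinal"),
    ScanItem(2, "left", "transverse"),
    ScanItem(3, "right", "longitudinal"),
    ScanItem(4, "right", "transverse"),
    ScanItem(5, "isthmus", "transverse"),  # direction for the isthmus item is assumed here
]
```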
Thus, in S210, the scan item determined by the scanning order may be acquired, and according to the determined scan item, the ultrasound beam is transmitted to the corresponding scanning part in the corresponding scanning direction, so that the plurality of scan items are scanned one by one. After real-time imaging is started, scan items are prompted according to the preset scanning order, and the user scans the corresponding part in real time according to the prompted scan item. After that scan is finished, the next scan item is prompted according to the scanning order, until all items requiring scanning have been completed.
Therefore, guidance can be provided for the user operating the device, and the user can operate according to the guidance. The requirement on the user's experience level is thus not high; even an inexperienced user can complete the scan, no scan item is missed, and the scanning performed by any user is standardized.
When a user performs ultrasound scanning of the thyroid, a working interface (i.e., an ultrasound scanning working interface) may be presented on the display; for example, the working interface may be as shown in fig. 5. It should be noted that fig. 5 is only an example of a working interface for ultrasound scanning, and the actual interface may take other forms. When the user aligns the ultrasound probe with the thyroid, the ultrasound image of the corresponding aligned site can be viewed on the working interface.
The scan items and scanning order of the thyroid can be presented in a predetermined area of the working interface (i.e., the ultrasound scanning working interface). As an example, the predetermined area may be the lower left corner; it is understood that it may also be another area, such as the upper right corner or the lower right corner. Referring to FIG. 5, it is presented in the lower left corner area 300 of the working interface, and figure 6 shows an enlarged view of the area 300 in figure 5.
Referring to fig. 6, 5 scan items and their scanning order are shown in region 300. As shown in the figure, the sequence numbers near the 5 scan items represent the order: the two items on the right side of the figure are numbered 1 and 2, the two items on the left side are numbered 3 and 4, and the item in the middle is numbered 5. These 5 items cover the left longitudinal, left transverse, right longitudinal, right transverse and isthmus positions of the thyroid, which ensures coverage of the entire thyroid gland area. It can be understood that, to enable the user to view the scan items clearly, the 5 scan items and their scanning order are shown in fig. 6 on the basis of an image of the thyroid anatomical shape. Of course, the scan items and their scanning order may be shown in other image forms, or the scan items may be displayed directly in scanning order in a predetermined area, which the present invention does not limit. In addition, while the thyroid structure diagram is shown in figs. 6-7 as a two-dimensional plan view, a three-dimensional solid view may alternatively be used to represent it. Likewise, besides displaying the thyroid alone, the thyroid may also be displayed on a whole or partial human body model as the basis for displaying the scan items.
In this way, a standardized scanning guide can be provided for the user performing the ultrasound operation. It should be understood that although in the examples of fig. 5-7, 5 scan items and their scan order are shown in region 300; in practice, a greater number or a lesser number of scan items may be scanned, and the present invention is not limited in this regard.
After the user sees the guidance in area 300 of the working interface, the user can operate accordingly. Specifically, the user can move the ultrasound probe to scan the item with scanning order 1 in the scan diagram, then move the ultrasound probe to scan the item with scanning order 2, and so on in sequence until all 5 scan items have been completed, so that ultrasound image sequences of all 5 scan items are acquired. That is, the user performing the ultrasound operation can follow the standardized scanning guidance and thereby obtain ultrasound image sequences of each part of the thyroid gland.
It should be understood that the user can scan the items one by one in the scanning order shown in the figure, so that the entire thyroid gland area is covered and nothing is missed. In actual operation, however, the user may follow personal habit rather than the scanning order shown in the figure; for example, the user may scan the items numbered 3 and 4 first and then the items numbered 1 and 2. To prevent the user from missing any scan item, scanned and unscanned items may be represented differently in area 300. That is, the scanning guidance provided by the present invention may include presenting a plurality of scan items and, after a given scan item has been scanned according to the user input, distinguishing scanned items from unscanned items by different presentation manners. The reminder provided by the presentation manner guides the user to complete the scanning of all scan items. In particular, a preset graphic representing the thyroid may be provided, on which different presentation manners are used to distinguish the scanned items from the unscanned items among the plurality of scan items. Alternatively, the preset graphic representing the thyroid gland may be an anatomical map of the thyroid gland.
Illustratively, the different presentation manners may include: distinguishing the scanned items from the unscanned items among the plurality of scan items by different colors, different font sizes, different patterns, or different brightnesses of the same color.
Taking different colors as an example: referring to fig. 6, the numerals corresponding to the 5 positions have the same first color, which indicates that none of the 5 scan items has been scanned. After the user scans scan item 1 according to the guidance, the numeral 1 can be changed to a second color; after the user scans scan item 2 according to the guidance, the numeral 2 can be changed to the second color. As shown in fig. 7, the numerals 1 and 2 have a different color from the numerals 3-5, which means that scan items 1 and 2 have been scanned and scan items 3-5 have not. Specifically, when no region has been scanned, fig. 6 is presented, i.e., the presented scan items are all displayed in the first color. After a first scan item of the plurality of scan items is completed, the first scan item is displayed in a second color; after a second scan item is completed, the second scan item is likewise displayed in the second color. For example, the first scan item may be scan item 1 in fig. 6 and the second scan item may be scan item 2, after which the interface can be presented as in fig. 7. A sketch of this state tracking is given below.
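The color-based distinction between scanned and unscanned items can be kept in a small piece of interface state, for example as follows. This is only a sketch of the idea; the specific color values and function names are assumptions and not part of the invention.

```python
FIRST_COLOR = "gray"    # assumed color for items not yet scanned
SECOND_COLOR = "green"  # assumed color for items already scanned

def init_item_colors(orders):
    """All presented scan items start in the first color (nothing scanned yet, as in FIG. 6)."""
    return {order: FIRST_COLOR for order in orders}

def mark_scanned(item_colors, order):
    """After a scan item is completed, switch its color (as for items 1 and 2 in FIG. 7)."""
    item_colors[order] = SECOND_COLOR
    return item_colors

# Example: colors = init_item_colors([1, 2, 3, 4, 5]); mark_scanned(colors, 1)
```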
The scan items shown in figs. 6-7 are presented as icons, which may include a sequence identifier, a thyroid structure diagram and a scanning position identifier superimposed on the thyroid structure diagram. The sequence identifier indicates the scanning order of the scan item. The position at which the scanning position identifier is placed on the thyroid structure diagram represents the scanning part of the scan item, and its orientation on the diagram represents the scanning direction of the scan item. For example, the icons of the scan items in figs. 6-7 include a thyroid structure diagram and a probe identifier, and the probe identifier is arranged at different positions and orientations on the thyroid structure diagram to form the scanning position identifiers. For example, the probe identifier of scan item 1 is located at the right side of the thyroid structure diagram, which means that the corresponding scanning part is the left side of the thyroid; the probe identifier of scan item 1 is arranged in the vertical direction, and the scanning direction corresponding to this identifier is the longitudinal direction. When the icon of scan item 1 is presented on the working interface, the user can therefore know that this scan item corresponds to the left longitudinal position. In addition, the position of the icon on the preset graphic of the thyroid can also indicate the scanning part of the scan item represented by the icon; for example, scan item 1 is located on the right side of the preset graphic of the thyroid, which indicates that its scanning part is the left side of the thyroid.
In another implementation, not shown in the figures, the distinction may be made by font size. For example, a scan item displayed in a first font size represents an unscanned item, while a scan item displayed in a second font size represents a scanned item. The first font size can be larger than the second, so that the user can easily see the unscanned items; alternatively, the first font size may be smaller than the second.
It will be appreciated that other presentation means may be used to distinguish scanned items from unscanned items, and are not listed here.
In other embodiments not shown in the figures, instead of presenting all scan items as in figs. 6-7, the scan items may be presented one at a time according to their scanning order. Specifically, time-shared presentation means that when a given scan item needs to be scanned, that scan item is presented on the interface. For example, when scan item 1 needs to be scanned, scan item 1 is presented as an icon; after its scanning is finished, the next scan item determined by the scanning order, i.e. scan item 2, is presented on the interface. In this presentation manner, the user does not need to distinguish which items have been scanned, and simply completes the scanning of the presented scan item according to the interface prompt.
In other embodiments not shown in the figures, instead of presenting all scan items as in figs. 6-7, only the already-scanned items and the unscanned item currently to be scanned may be presented according to the scanning order. For example, after real-time imaging is started, since no scan item has yet been scanned, scan item 1 is presented on the interface; after the scanning of scan item 1 is completed, it remains on the interface and scan item 2 is presented as well. Further, the unscanned item currently to be scanned can be highlighted so as to be distinguished from the scanned items. In this presentation manner, the user does not need to distinguish which items have been scanned, and only needs to complete, according to the interface prompt, the scanning of the last presented scan item.
In another embodiment, unlike the examples shown in figs. 6-7, the individual scan items may be presented separately in the form of icons without providing a preset graphic of the thyroid. In another embodiment not shown in the figures, each scan item may be presented separately using only an icon that includes a thyroid structure diagram and a scanning position identifier. For example, 5 icons are presented on the ultrasound scanning working interface, each icon including a thyroid structure diagram and a scanning position identifier superimposed on it; the user can know the scanning part and scanning direction just by looking at the icon, and can know the scanning order of the scan items from the order of the icons from top to bottom, from left to right, or in any other direction. Further, the icons are presented in different presentation manners to distinguish the scanned items from the unscanned items, for example by displaying the scanning position identifiers in different colors or different sizes, displaying the scanning position identifiers of the scanned items in a faded form, or displaying the thyroid structure diagram in different colors. The specific implementation is not limited here; any embodiment that can distinguish the scanned items from the unscanned items falls within the scope of the present invention.
In this way, scanned and unscanned scan items can be distinguished in a specific region of the working interface by representing them in different presentation manners. From this distinction, the user can quickly see the unscanned items and move the ultrasound probe to complete their scanning. When scanning is performed in the scanning order, the different presentation manners help the user to quickly grasp the current scanning progress and determine which scan item is to be scanned next. Even if the user does not follow the indicated scanning order, every scan item of the thyroid can still be scanned without omission. That is, a user performing the ultrasound operation can obtain the ultrasound image sequences of each part of the thyroid gland even without following the standard scanning guidance exactly.
As such, in this implementation, the real-time acquired ultrasound image sequences can be obtained through S210 to S230; specifically, the acquired data includes ultrasound image sequences of the various parts of the thyroid gland. In the embodiment described in connection with figs. 5-7, 5 ultrasound image sequences may be included.
In another embodiment, the scanning guidance provided by the ultrasound imaging method may also include guiding the user to acquire a corresponding number of ultrasound image sequences according to the predetermined number of scan items. For example, if thyroid scanning is predetermined to consist of the 5 scan items in figs. 6-7, the user needs to acquire 5 ultrasound image sequences according to the scanning guidance, so as to obtain one sequence for each scan item. The guidance on the acquisition count can be realized by prompting a number on the scanning working interface, reminding the user in real time how many ultrasound image sequences still need to be acquired; alternatively, after the user finishes scanning, it can be checked whether the required number of ultrasound image sequences has been acquired. This form of scanning guidance prompts the user with the amount of scanning required and thus helps, to a certain extent, to avoid missed detection.
After acquiring the multi-frame ultrasound images in S230, the method may further include: during scanning according to the scanning order, judging whether the scan item reflected in the currently obtained ultrasound image corresponds to the current scan item determined by the scanning order; and/or, after scanning according to the scanning order is finished, judging whether the ultrasound images include the information of all scan items.
Considering that the user's image acquisition operation may be non-standard, or the acquisition position or direction may be wrong, judging whether the obtained ultrasound image corresponds to the scan item ensures that the image obtained from the current scan matches the position and direction of the current scan item. For example, the current scan item may be the right transverse position while the user has placed the probe at the left transverse position; the ultrasound image obtained in S230 would then be treated as an image of the right transverse position, which would affect subsequent analysis. In addition, a deviation in probe placement may mean that complete information of the scanned part is not obtained, causing information loss and likewise affecting subsequent analysis.
Illustratively, making the determination includes: comparing the multi-frame ultrasound images obtained in S230 under a given scan item with a pre-stored standard ultrasound image corresponding to that scan item. The pre-stored standard ultrasound image of the scan item may be a standard-section ultrasound image representative of that scanning part. When performing the comparison, a frame of the ultrasound image may be divided into a plurality of regions, and each region compared separately. In the comparison, the similarity between the two images (i.e., the ultrasound image obtained in S230 and the pre-stored standard image) may be calculated, and whether the ultrasound image obtained in S230 is correct may be judged from the resulting similarity. One possible realization is sketched below.
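The following sketch illustrates the region-wise comparison under the assumption that normalized cross-correlation of corresponding regions is used as the similarity measure; the patent does not prescribe a particular measure, and the grid size and threshold are likewise assumptions.

```python
import numpy as np

def region_similarity(acquired: np.ndarray, standard: np.ndarray, grid=(4, 4)) -> float:
    """Split both images into grid regions and average a per-region similarity score.

    Sketch only: normalized cross-correlation is one possible similarity measure;
    both images are assumed to be 2-D grayscale arrays of identical shape.
    """
    assert acquired.shape == standard.shape
    h, w = acquired.shape
    rows, cols = grid
    scores = []
    for i in range(rows):
        for j in range(cols):
            a = acquired[i * h // rows:(i + 1) * h // rows, j * w // cols:(j + 1) * w // cols]
            s = standard[i * h // rows:(i + 1) * h // rows, j * w // cols:(j + 1) * w // cols]
            a = a - a.mean()
            s = s - s.mean()
            denom = np.sqrt((a ** 2).sum() * (s ** 2).sum())
            scores.append(0.0 if denom == 0 else float((a * s).sum() / denom))
    return float(np.mean(scores))

# The acquired frame could then be accepted for the current scan item when
# region_similarity(frame, standard_image) exceeds a chosen threshold.
```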
For example, the ultrasound imaging apparatus may further be equipped with a navigation device arranged on the ultrasound probe, so that the spatial orientation of the acquisition, including position and direction, is obtained while the ultrasound probe acquires data. Accordingly, after acquiring the multi-frame ultrasound images in S230, the method may further include: judging whether the current scanning spatial orientation information corresponds to the current scanning part and scanning direction determined by the scanning order, so as to ensure that each scanning step yields the multi-frame ultrasound images of the corresponding scan item. It can also be judged whether the set of all recorded spatial orientation information covers all scanning parts and scanning directions of the thyroid, so that the ultrasound images of all required scan items are obtained before scanning stops. The advantage of the navigation device is that the spatial orientation of the ultrasound probe can be obtained accurately; compared with the image processing method, it adds some cost, but it is more accurate in judging position and direction. A corresponding check is sketched below.
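When a navigation device is available, the check can compare the recorded probe pose with the pose expected for the current scan item. The sketch below assumes the navigation data has already been reduced to a thyroid-part label and an angle relative to the median sagittal plane, which is purely an illustrative convention and not stated by the patent.

```python
def pose_matches_scan_item(measured_part: str, measured_angle_deg: float,
                           expected_part: str, expected_angle_deg: float,
                           angle_tolerance_deg: float = 15.0) -> bool:
    """Return True if the navigated probe pose corresponds to the expected scan item.

    Sketch under assumed conventions: the navigation data reports which thyroid
    part the probe is over and the probe angle relative to the median sagittal plane.
    """
    if measured_part != expected_part:
        return False
    return abs(measured_angle_deg - expected_angle_deg) <= angle_tolerance_deg

# Example: pose_matches_scan_item("right", 85.0, "right", 90.0) -> True
```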
Illustratively, the object of interest in S240 may be a thyroid nodule. That is, whether or not a nodule exists may be determined by analyzing the ultrasound image in S240. For convenience of description, the following examples illustrate nodules.
If it is determined in S240 that one or more targets of interest (e.g., nodules) exist in the multi-frame ultrasound images, then in S250 the attribute content of each target of interest may be further determined, and the thyroid scanning position corresponding to each target of interest may be determined; for example, the scanning position corresponding to the image in which a nodule is present and the attribute content of that nodule may be determined. The thyroid scanning position corresponding to each target of interest can be determined by determining the scanning position of the ultrasound image frame in which that target of interest appears.
In the embodiment of the present invention, a machine learning algorithm may be used to analyze the ultrasound images in S240, and the scanning positions and attribute contents of the nodules in the different sections are obtained in S250 through fully automatic or semi-automatic detection.
In one embodiment, when analysis finds that a target of interest exists in any frame of the multi-frame ultrasound images of the thyroid, the scanning part at which the ultrasound probe was positioned when that frame was scanned can be obtained correspondingly; the position of that scanning part on the thyroid is taken as the thyroid scanning position of the target of interest, and the attribute content of the target of interest is displayed at the corresponding position of the preset thyroid structure diagram. For example, if the analysis finds that a target of interest exists in a frame acquired from the right side part, and the scanning part corresponding to that frame is the right side part, then the right side part of the thyroid is taken as the thyroid scanning position of the target of interest; accordingly, the attribute content of the target of interest is displayed on the left side of the preset thyroid structure diagram (the structure diagram displays the scanning result in a mirror-image relationship). When the analysis finds that several targets of interest exist in the same part, the attribute contents of these targets of interest can be arranged within that part of the thyroid structure diagram in order of acquisition time, or according to other criteria.
In another embodiment, when analysis finds that a target of interest exists in any frame of the multi-frame ultrasound images of the thyroid, the scanning part at which the ultrasound probe was positioned when that frame was scanned can be obtained correspondingly, and the position within that scanning part at which the frame was acquired is further located according to the scanning sequence; this position is taken as the thyroid scanning position of the target of interest, and the attribute content of the target of interest is displayed at the corresponding position of the preset thyroid structure diagram. Compared with determining only the scanning part, this embodiment makes use of the timing information of the continuously acquired ultrasound image sequence to further locate the specific position of the ultrasound probe at the time of acquisition. For example, when performing a transverse scan of the right side of the thyroid, 120 frames are expected to be acquired from top to bottom to obtain the ultrasound image sequence; if a frame is the n-th in the scanning sequence, its position within the right side part is (scan range × n/120). When several targets of interest exist in the same scanning part, this finer positioning within the part based on the scanning sequence makes it easier to distinguish targets of interest shown in different frames, and is better suited to the use case of continuously scanning the thyroid to obtain a video file.
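The positioning described above amounts to linear interpolation of the frame index over the scanned range. As a worked sketch (the 120-frame count and top-to-bottom direction come from the example above; the 60 mm scan range and the function name are assumptions added for illustration):

```python
def position_in_part(frame_index: int, total_frames: int, scan_range_mm: float) -> float:
    """Distance of the n-th frame from the start of the scanned part.

    E.g. with total_frames = 120 and scan_range_mm = 60, frame 30 maps to
    60 * 30 / 120 = 15 mm from the top of the right side part.
    """
    return scan_range_mm * frame_index / total_frames

# position_in_part(30, 120, 60.0) -> 15.0
```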
The embodiment of the present invention does not limit the specific machine learning algorithm. Illustratively, the machine learning algorithm may be a convolutional neural network based algorithm for extracting calcifications of thyroid nodules from ultrasound images. The convolutional neural network can be obtained by training on a training data set; its input can be an ultrasound image or its feature vector, and its output can be a binarized vector, from which it is determined whether a calcification point is present, and in turn whether a nodule exists. The training data set may include a large number of ultrasound image samples, each with annotation information that may include calcifications and/or nodules. Those skilled in the art will appreciate that other machine learning algorithms may also be used, which are not described in detail here.
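The network architecture itself is not specified by the invention. A minimal sketch of a classifier of the kind described (input: a grayscale ultrasound patch, output: a two-element vector that is binarized to decide whether a calcification point is present) could look as follows in PyTorch; all layer sizes and the assumed 64x64 patch size are illustrative assumptions, not the patented design.

```python
import torch
import torch.nn as nn

class CalcificationNet(nn.Module):
    """Minimal illustrative CNN; layer sizes are assumptions, not the patented design."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, 2)  # assumes 64x64 input patches

    def forward(self, x):
        x = self.features(x)       # (N, 32, 16, 16) for 64x64 inputs
        x = torch.flatten(x, 1)
        return self.classifier(x)  # two logits: calcification present / absent

# Example of binarizing the output for a batch of 64x64 grayscale patches:
# probs = CalcificationNet()(torch.randn(8, 1, 64, 64)).softmax(dim=1)
# is_calcification = probs.argmax(dim=1) == 1
```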
For example, referring to the embodiment of fig. 5-7, after the ultrasound image at scan item 1 is acquired, the ultrasound image is analyzed to determine whether a nodule is present. If the machine learning algorithm determines that no nodule exists in the scanning item 1, the next scanning item (such as the scanning item 2) is processed similarly. If the nodule exists in the scanning item 1, the attribute content of the nodule is further determined and recorded.
For example, the preset thyroid structure map may be an image of the anatomical shape of the thyroid gland. Illustratively, the attribute content may include an attribute image, identification information of the object of interest, and the like. The attribute image may include a maximum diameter image, such as a cross-section maximum diameter image and a longitudinal-section maximum diameter image; alternatively, the attribute image may include a typical feature image; alternatively, the attribute image may include a plurality of section images; and so on. It is understood that the attribute image may be other images or any combination of those listed. The identification information of the object of interest may include the size of the object of interest, such as the size of a nodule.
Thus, by the intelligent nodule detection means provided by the embodiment of the invention, the thyroid can be comprehensively scanned, the nodules in the ultrasonic image can be detected as comprehensively as possible, and the attribute content of the nodules can be acquired.
Further, the visualized presentation may be performed in S260. In particular, the attribute content of a nodule may be displayed at the scanning position where the nodule was found. For example, referring to the embodiments of fig. 5 to 7, if it is determined that a nodule exists at scan item 1 and the attribute content of the nodule has been acquired in S250, that attribute content may be displayed in S260.
Illustratively, S260 may include: taking a preset thyroid structure diagram as a presented background; displaying the corresponding attribute content at the position of the object of interest in the foreground.
For example, the preset thyroid structure map may be an image of the anatomical shape of the thyroid gland, the object of interest may be a nodule, and the attribute content may include an attribute image and/or the size of the nodule. The image of the thyroid anatomical shape can then be used as the background of the presentation in S260; the position of the nodule is marked in the foreground, and the corresponding attribute image and/or size of the nodule is displayed at that position.
The position of the nodule may be marked in the form of an image. As shown in fig. 8, each image marking the position of a nodule corresponds to an attribute content. Taking the rightmost oval frame 600 in fig. 8 as an example, shown in fig. 9, the image 610 marks the scanning direction of the probe at the scanned part, and the image 620 is the attribute content of the nodule at that position.
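A possible rendering of the kind of overlay shown in figs. 8 and 9 (a sketch using matplotlib; the background file, pixel coordinates and dictionary keys are assumptions made for illustration):

```python
import matplotlib.pyplot as plt
import matplotlib.image as mpimg
from matplotlib.patches import Ellipse

def render_structure_diagram(diagram_path, findings):
    """Draw the preset thyroid structure diagram as the background and, in the
    foreground, an oval marker plus the attribute text of each finding at its
    thyroid scanning position. Each finding: {"x", "y", "label"} in pixels."""
    fig, ax = plt.subplots()
    ax.imshow(mpimg.imread(diagram_path))
    for f in findings:
        ax.add_patch(Ellipse((f["x"], f["y"]), width=40, height=25,
                             fill=False, edgecolor="red"))
        ax.annotate(f["label"], (f["x"], f["y"]), xytext=(6, 6),
                    textcoords="offset points", fontsize=8)
    ax.axis("off")
    return fig
```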
Thus, the display is intuitive and requires no additional operation by the user. Through the display in S260, the position and attribute content of the object of interest are provided to the user, and the intuitive presentation allows the user to grasp the image scanning result more quickly.
Exemplarily, after S260, the method may further include: receiving a selection input of the user for the attribute content of an object of interest displayed on the preset thyroid structure diagram, and displaying the original ultrasound image of that attribute content according to the selection input. For example, the user may double-click image 620, or right-click image 620 and choose to view the original ultrasound image, or use another input means; the original ultrasound image corresponding to image 620, for example the original ultrasound image acquired in S230, may then be displayed according to the selection input. The original ultrasound image may be displayed overlaying image 620 or at a position other than image 620, which the present invention does not limit.
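The look-up behind such a selection input can be very small; a sketch (the index structure and names here are assumed for illustration):

```python
def original_frame_for(marker_id, marker_to_frame, stored_frames):
    """Return the stored original ultrasound frame behind a selected attribute
    image: marker_to_frame maps a marker id to the frame number recorded when
    the object of interest was detected."""
    return stored_frames[marker_to_frame[marker_id]]
```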
Exemplarily, after S260, the method may further include: receiving an output instruction of a user, and outputting an ultrasonic report according to the output instruction, wherein the ultrasonic report comprises a preset thyroid structure diagram and attribute contents displayed on the foreground of the structure diagram. For example, the user may select a print area, click to print or click to generate a report, thereby inputting the output instruction; or the user can input the output instruction in other modes; it is further possible to output according to the output instruction.
As an example, the output ultrasound report may include the resulting image shown in fig. 8, i.e. the image in which the attribute content at each nodule is displayed. As another example, if the output instruction of the user indicates that only some of the attribute contents are to be displayed (fig. 8 includes 5 attribute contents, and the user may select only a few of them, e.g. 2 or 3), the output ultrasound report may display the attribute contents of only those nodules. As another example, if the output instruction of the user indicates that several of the attribute contents and several original ultrasound images are to be displayed, then, referring to fig. 7, the user may select several (e.g. 2) attribute contents and several (e.g. 3) original ultrasound images to be displayed. As yet another example, the output instruction of the user may indicate that the original ultrasound image at each nodule is to be displayed.
Illustratively, the ultrasound report may be a document in a format such as PDF, including the original ultrasound image and/or the attribute content. In addition, a preliminary diagnostic conclusion in text form can be included to provide a reference for the doctor's diagnosis. Of course, basic patient information may also be included, such as name, age, medical card number, and date of visit.
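One way such a report could be assembled (a sketch assuming the reportlab package; the layout, field names and file paths are illustrative, not part of this embodiment):

```python
from reportlab.lib.pagesizes import A4
from reportlab.pdfgen import canvas

def write_report(pdf_path, diagram_png, patient, conclusion):
    """Write a one-page PDF: patient header, the annotated thyroid structure
    diagram, and a preliminary text conclusion."""
    c = canvas.Canvas(pdf_path, pagesize=A4)
    width, height = A4
    c.setFont("Helvetica-Bold", 14)
    c.drawString(50, height - 50, "Thyroid Ultrasound Report")
    c.setFont("Helvetica", 10)
    c.drawString(50, height - 70,
                 f"Name: {patient['name']}   Age: {patient['age']}   Visit: {patient['date']}")
    c.drawImage(diagram_png, 50, height - 420, width=300, height=330)
    c.drawString(50, height - 440, f"Preliminary conclusion: {conclusion}")
    c.save()
```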
Therefore, the method provided by the embodiment of the invention can analyze the ultrasound image to determine lesions, assist the user in making a more comprehensive diagnosis of the condition by intuitively displaying the attribute content of the target feature image, and improve the user's diagnostic efficiency; it requires no excessive additional hardware devices or extra hardware cost, and is easy to extend. In addition, the method of the embodiment of the invention can provide a standardized scanning procedure to guide the user to scan item by item, so that the scan is more comprehensive and omissions are avoided. Furthermore, the method of the embodiment of the invention places low demands on the proficiency and specialization of the user's operation and does not require the user to change scanning habits.
As another implementation, the ultrasound image in S110 may be read from a storage medium (e.g., a storage device). The sequence of ultrasound images stored in the storage medium may be previously acquired, and the acquisition process may be the same as or different from the process described in the above S210 to S230, and will not be described herein again. In this implementation, the method described in fig. 2 may be understood as a post-processing procedure.
FIG. 10 is a schematic flow chart diagram of a method of ultrasound image processing in accordance with an embodiment of the present invention. The method shown in fig. 10 may include:
S310, acquiring ultrasound images of different sections of the thyroid;
S320, analyzing the ultrasound images of the different sections to obtain an object of interest from the ultrasound images of one or more sections;
S330, determining scanning position information of an ultrasound image in which the object of interest exists, as the scanning position information of the object of interest, and determining the attribute content of the object of interest; and
S340, displaying the thyroid scanning position and the attribute content of the same object of interest in association.
Illustratively, S310 may include reading ultrasound images of different sections of the thyroid gland from a storage medium. The ultrasonic images of different sections under the same scanning item form an ultrasonic image sequence.
Illustratively, the object of interest in S320 may refer to an image feature that is inconsistent with a conventional ultrasound image of the thyroid gland, and optionally may include an image feature of a thyroid nodule. That is, whether a nodule exists may be determined by analyzing the ultrasound image in S320. For convenience of description, the following takes a nodule as an example.
If the object of interest (e.g., a nodule) is obtained by analyzing the ultrasound images of different sections in S320, the scan position information and the attribute content of the object of interest may be further determined in S330. For example, the scan location of the nodule and the attribute content of the nodule may be determined.
In the embodiment of the invention, a machine learning algorithm can be adopted to analyze the ultrasound image in S320, and the thyroid scanning position and the attribute content are obtained in S330 through fully automatic or semi-automatic detection.
It can be understood that the process of analyzing the ultrasound image in this embodiment is similar to the process of S240 and S250 in the above embodiment, and the specific implementation process may refer to the description performed in conjunction with the embodiment of fig. 4, and is not described here again to avoid repetition.
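Put together, S310 to S340 amount to a small post-processing loop; the sketch below shows only the control flow, with the detection, positioning and description functions left abstract (all names are assumed):

```python
def post_process(stored_frames, detect, locate, describe):
    """S310-S340 as a loop over a stored image sequence: detect objects of
    interest, resolve each one's thyroid scanning position, and collect
    (position, attribute content) pairs ready for associated display."""
    results = []
    for index, frame in enumerate(stored_frames):      # S310: stored sections
        finding = detect(frame)                        # S320: analyse section
        if finding is None:
            continue
        position = locate(index, stored_frames)        # S330: scanning position
        results.append((position, describe(finding, frame)))  # S330: attribute content
    return results                                     # S340 displays these pairs
```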
Illustratively, the attribute content may include an attribute image, identification information of the object of interest, and the like. The attribute image may include a maximum diameter image, such as a cross-section maximum diameter image and a longitudinal-section maximum diameter image; alternatively, the attribute image may include a typical feature image; alternatively, the attribute image may include a plurality of section images; and so on. It is understood that the attribute image may be other images or any combination of those listed. The identification information of the object of interest may include the size of the object of interest, such as the size of a nodule.
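The attribute content enumerated above can be grouped into a small record; a sketch (the field names are chosen here for illustration only):

```python
from dataclasses import dataclass, field
from typing import List, Optional

import numpy as np

@dataclass
class AttributeContent:
    """Attribute content of one object of interest."""
    max_diameter_images: List[np.ndarray] = field(default_factory=list)  # cross/longitudinal sections
    typical_feature_image: Optional[np.ndarray] = None
    slice_images: List[np.ndarray] = field(default_factory=list)
    size_mm: Optional[float] = None  # identification info, e.g. nodule size
```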
Exemplarily, S340 may include: displaying, according to the thyroid scanning position of each object of interest, the attribute content of the corresponding object of interest at the corresponding position of a preset thyroid structure diagram. The preset thyroid structure diagram may be an image of the anatomical structure shape of the thyroid gland.
Optionally, displaying the attribute contents of the corresponding objects of interest at one or more thyroid parts of a preset thyroid structure diagram in S340 may include: taking the preset thyroid structure diagram as the presented background, and displaying the corresponding attribute content at the position of each object of interest in the foreground. In this way, objects of interest (such as nodules) in the ultrasound image can be presented intuitively.
For example, the preset thyroid structure map may be an image of the anatomical shape of the thyroid gland, the object of interest may be a nodule, and the attribute content may include an attribute image and the size of the nodule. The image of the thyroid anatomical shape can then be used as the background of the presentation in S340; the position of the nodule is marked in the foreground, and the corresponding attribute image and the size of the nodule are displayed at that position.
Taking a nodule as an example, as shown in fig. 8, the image that marks the position of a nodule corresponds to its attribute content. Taking the rightmost oval box 600 in fig. 8 as an example, shown in fig. 9, the image 610 marks the scanning position, and the image 620 is the attribute content of the nodule at that position.
Illustratively, S340 may include displaying the thyroid scanning position and the attribute content of the same object of interest in association in a chart. For example, one column of the chart represents the thyroid scanning position of the object of interest, another column represents the attribute content of the object of interest, and the thyroid scanning position and attribute content of the same object of interest are displayed in the same row. This display mode also allows the user to quickly learn detailed information about the object of interest (such as position information and attribute content), improving usability and convenience.
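Such a tabular association needs nothing more than one row per object of interest; a sketch (the column layout and keys are assumed):

```python
def association_table(findings):
    """One row per object of interest: thyroid scanning position in one
    column, a short attribute-content summary in the other."""
    rows = ["Thyroid scanning position | Attribute content"]
    for f in findings:
        rows.append(f"{f['position']:<26}| {f['summary']}")
    return "\n".join(rows)

print(association_table([{"position": "right side, cross section", "summary": "nodule, 6 mm"}]))
```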
In this way, objects of interest can also be determined by analysis for a pre-stored ultrasound image sequence, and the position and attribute content of each object of interest can be provided to the user, thereby facilitating review and diagnosis.
Exemplarily, after S340, the method may further include: receiving a selection input of the user for the attribute content, and displaying the original ultrasound image of that attribute content according to the selection input. For example, in conjunction with FIG. 7, the user may double-click image 620, or right-click image 620 and choose to view the original ultrasound image, or use another input means; the original ultrasound image corresponding to image 620, for example the original ultrasound image acquired in S230, may then be displayed according to the selection input. The original ultrasound image may be displayed overlaying image 620 or at a position other than image 620, which the present invention does not limit.
Exemplarily, after S340, the method may further include: and receiving an output instruction of a user, and outputting an ultrasonic report according to the output instruction, wherein the ultrasonic report comprises the preset thyroid structure diagram and the attribute content displayed on the foreground of the thyroid structure diagram. For example, the user may select a print area, click to print or click to generate a report, thereby inputting the output instruction; or the user can input the output instruction in other modes; it is further possible to output according to the output instruction.
Illustratively, the ultrasound report may be a document in a format such as PDF, including the original ultrasound image and/or the attribute content. In addition, a preliminary diagnostic conclusion in text form can be included to provide a reference for the doctor's diagnosis. Of course, basic patient information may also be included, such as name, age, medical card number, and date of visit.
Therefore, the method provided by the embodiment of the invention can analyze the ultrasound image to determine lesions, assist the user in making a more comprehensive diagnosis of the condition by intuitively displaying the attribute content of the target feature image, and improve the user's diagnostic efficiency, without additional hardware devices or extra hardware cost, and it is easy to extend.
Fig. 11 is a schematic block diagram of an apparatus for ultrasound image processing in an embodiment of the present invention. The apparatus 1100 shown in fig. 11 may include: an acquisition module 1110, an analysis module 1120, a determination module 1130, and a display module 1140.
An obtaining module 1110, configured to obtain ultrasound images of different sections of a thyroid gland;
the analysis module 1120 is used for analyzing the ultrasonic images of different sections to obtain an interested target from the ultrasonic images of one or more sections;
a determining module 1130, configured to determine scanning position information of an ultrasound image where an object of interest exists, as a thyroid scanning position of the object of interest, and determine attribute content of the object of interest; and
a display module 1140, configured to display the thyroid scanning position and the attribute content of the same object of interest in association.
Illustratively, the acquisition module 1110 may be specifically configured to acquire the ultrasound images from a storage medium.
Illustratively, the display module 1140 may be configured to: determine the thyroid part corresponding to an object of interest according to the thyroid scanning position of the object of interest, and display the attribute content of the corresponding object of interest at one or more thyroid parts of a preset thyroid structure diagram, in particular at the specific scanning position within that thyroid part.
In particular, the display module 1140 may be configured to: taking a preset thyroid structure diagram as a presented background; and displaying the corresponding attribute content at the position of the target of interest in the foreground.
By intuitively displaying the attribute content, accurate presentation of the ultrasonic information of the thyroid is facilitated.
Illustratively, the analysis module 1120 may be specifically configured to: and analyzing the ultrasonic image by adopting a machine learning algorithm. By adopting a machine learning algorithm based on big data, the analysis process is more intelligent, and the result is more accurate.
The apparatus 1100 shown in fig. 11 may further include an input module and an output module (not shown in the figure), for example. The input module can be used to receive the user's selection input for the attribute content; the output module may be used to display, via the display module 1140, the original ultrasound image at the object of interest according to the selection input.
Exemplarily, the input module may be further configured to receive an output instruction of a user; the output module can also be used for outputting an ultrasonic report according to the output instruction, wherein the ultrasonic report comprises a preset thyroid structure diagram and attribute contents displayed on the foreground of the structure diagram.
Optionally, the preset thyroid structure map is an image of the shape of the thyroid gland anatomy, and/or the attribute content includes one or more of the following combinations: a maximum diameter image, a representative feature image, a plurality of slice images, and identification information of the object of interest.
The apparatus 1100 shown in fig. 11 can implement the aforementioned steps of the method for processing an ultrasound image shown in fig. 10, and is not repeated here to avoid repetition.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In addition, another ultrasound image processing apparatus is provided in an embodiment of the present invention, which includes a memory, a processor, and a computer program stored in the memory and running on the processor, and when the processor executes the computer program, the processor implements the foregoing steps of the ultrasound image processing method shown in fig. 10.
As shown in fig. 12, the apparatus 1200 may include a memory 1210 and a processor 1220. The memory 1210 stores computer program codes for implementing respective steps in the method of ultrasound image processing according to an embodiment of the present invention. The processor 1220 is used to execute the computer program codes stored in the memory 1210 to perform the respective steps of the method for ultrasound image processing according to the embodiment of the present invention, and to implement the respective modules in the apparatus 1100 described in fig. 11 according to the embodiment of the present invention.
Illustratively, the following steps are performed when the computer program code is run by the processor 1220: acquiring ultrasonic images of different sections of the thyroid; analyzing the ultrasonic images of different sections to obtain an interested target from the ultrasonic images of one or more sections; determining scanning position information of an ultrasonic image with an interested target, taking the scanning position information as a thyroid scanning position of the interested target, and determining attribute content of the interested target; and correlating and displaying the thyroid scanning position and the attribute content of the same interested target.
In one embodiment, the program in memory 1210, when executed by processor 1220, causes ultrasound image processing apparatus 1200 to perform the steps of: determining a thyroid part corresponding to the interesting target according to the thyroid scanning position of the interesting target, and respectively displaying the attribute content of the corresponding interesting target at one or more thyroid parts of a preset thyroid structure diagram.
In one embodiment, the program in memory 1210, when executed by processor 1220, causes ultrasound image processing apparatus 1200 to perform the steps of: and analyzing the ultrasonic image by adopting a machine learning algorithm.
In one embodiment, the program in memory 1210, when executed by processor 1220, causes ultrasound image processing apparatus 1200 to perform the steps of: taking the preset thyroid structure chart as a presented background; the position of the object of interest is marked in the foreground and the corresponding attribute content is displayed at the position of the object of interest.
In one embodiment, the program in memory 1210, when executed by processor 1220, causes ultrasound image processing apparatus 1200 to perform the steps of: receiving user selection input for the attribute content; the original ultrasound image at the object of interest is displayed according to the selected input.
In one embodiment, the program in memory 1210, when executed by processor 1220, causes ultrasound image processing apparatus 1200 to perform the steps of: receiving an output instruction of a user; and outputting an ultrasonic report according to the output instruction, wherein the ultrasonic report comprises a preset thyroid structure chart and the attribute content displayed on the foreground of the structure chart.
Illustratively, the preset thyroid map is an image of the shape of the thyroid anatomy, and/or the attribute content includes one or more of the following in combination: a maximum diameter image, a representative feature image, a plurality of slice images, and identification information of the object of interest.
In addition, an embodiment of the present invention further provides an electronic device, which may include the apparatus 1100 shown in fig. 11 or the apparatus 1200 described in fig. 12. The electronic device may implement the method of ultrasound image processing shown in fig. 10. Alternatively, the electronic device may comprise the electronic device 10 shown in fig. 1. Alternatively, the electronic device may be a medical instrument.
In addition, the ultrasound imaging apparatus provided by the invention can also perform ultrasound scanning on different target parts and obtain ultrasound images of those different target parts. To display the ultrasound images of the different target parts more intuitively, a structure diagram of the object to be tested can be provided, the structure diagram at least covering the different target parts scanned by ultrasound. After the ultrasound images of the different target parts are obtained, the scanning results of the different target parts can be displayed at the corresponding positions of the structure diagram of the object to be tested according to the scanning positions of the different target parts. Specifically, based on the ultrasound image of each target part, the attribute content and the scanning position of that target part can be determined; the scanning position determines where on the structure diagram of the object to be tested to display, and the attribute content determines what is displayed on the structure diagram. By displaying the attribute content of each target part on the structure diagram of the object to be tested, the user can see at a glance which parts have been scanned, which serves as an effective prompt and helps avoid repeated scanning and missed detection.
For example, a body model map may be provided, and the attribute contents of different target parts such as the lung, thyroid, breast, liver, kidney, and heart are displayed at the corresponding parts of the body model. For specific parts, the display can further be arranged according to the anatomical structure of the part, such as the left lung, right lung, left breast, and right breast.
For example, a model map of a specific scanned region may be provided, such as the above-described thyroid structure map, and the attribute contents of different target regions, such as the left side, the right side, and the isthmus of the thyroid are respectively displayed at corresponding regions of the thyroid structure map. For example, a lung model map may be provided, and the attribute contents of different target portions, such as the left lung and the right lung, are respectively displayed at corresponding portions of the lung model map.
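Whatever model map is used, the display step reduces to a lookup from region name to diagram coordinates; a sketch with invented coordinates:

```python
# Assumed pixel anchors on a lung model map; a thyroid structure diagram or a
# whole-body model map would follow the same pattern with its own regions.
LUNG_MAP_ANCHORS = {"left lung": (120, 200), "right lung": (320, 200)}

def anchor_for(region: str, anchors=LUNG_MAP_ANCHORS):
    """Where on the model map the attribute content of a region is drawn."""
    return anchors[region]
```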
In addition, as in the foregoing embodiment, when the user clicks the attribute content on the structure diagram of the object to be tested, the original ultrasound image corresponding to the attribute content may be called out for display. Therefore, the display friendliness is improved, and a more convenient data calling mode is provided, so that a user does not need to manually input search information such as a scanned part to call original ultrasonic image data.
In addition, as in the foregoing embodiment, when the output instruction is received, the ultrasound report may be output according to the output instruction. When the ultrasonic report is output, the structure diagram of the object to be tested and the attribute content marked on the structure diagram can be output together, or the related original ultrasonic image can be automatically merged into the ultrasonic report and output together, and the like.
In addition, the attribute content may include a thumbnail of a typical image, a prompt for abnormal information found during image analysis, acquisition parameters of the image, quantitative measurements obtained during image analysis, the acquisition time of the image, the operator, and the like. Any content that the system obtains and can characterize by processing and analyzing the ultrasound images of each target part belongs to the attribute content.
In addition, the embodiment of the invention also provides a computer storage medium, and the computer storage medium is stored with the computer program. The steps of the ultrasound imaging method shown in fig. 4 described above may be implemented when the computer program is executed by a computer or a processor. For example, the computer storage medium is a computer-readable storage medium.
In one embodiment, the computer program instructions, when executed by a computer or processor, cause the computer or processor to perform the steps of: transmitting an ultrasonic beam to the thyroid for scanning; receiving an ultrasonic echo returned from the thyroid, and acquiring an ultrasonic echo signal based on the ultrasonic echo; processing the ultrasonic echo signal to obtain a multi-frame ultrasonic image of the thyroid; analyzing the multi-frame ultrasonic image to judge whether one or more interested targets exist in the multi-frame ultrasonic image of the thyroid; when one or more interested targets exist in the ultrasonic image of the thyroid, determining the attribute content of each interested target and the corresponding thyroid scanning position; and displaying the attribute content of the interested target at a corresponding position of a preset thyroid structure diagram according to the thyroid scanning position corresponding to the interested target.
In addition, the embodiment of the invention also provides a computer storage medium, and the computer storage medium is stored with the computer program. The steps of the method for ultrasound image processing shown in fig. 10 described above may be implemented when the computer program is executed by a computer or a processor. For example, the computer storage medium is a computer-readable storage medium.
In one embodiment, the computer program instructions, when executed by a computer or processor, cause the computer or processor to perform the steps of: acquiring ultrasonic images of different sections of the thyroid; analyzing the ultrasonic images of different sections to obtain an interested target from the ultrasonic images of one or more sections; determining the spatial position information of a section in which the interested target exists, wherein the spatial position information is used as the spatial position information of the interested target, and determining the attribute content of the interested target; and displaying the spatial position information of the interested target and the attribute content in an associated manner.
The computer storage medium may include, for example, a memory card of a smart phone, a storage component of a tablet computer, a hard disk of a personal computer, a Read Only Memory (ROM), an Erasable Programmable Read Only Memory (EPROM), a portable compact disc read only memory (CD-ROM), a USB memory, or any combination of the above storage media. The computer-readable storage medium may be any combination of one or more computer-readable storage media.
Therefore, the method provided by the embodiment of the invention can analyze the ultrasound image to determine the object of interest and obtain its attribute content, assist the user in making a more comprehensive diagnosis of the condition by intuitively displaying the attribute content of the object of interest, and improve the user's diagnostic efficiency; these operations require no additional hardware devices or extra hardware cost, and are easy to extend. In addition, the method of the embodiment of the invention can provide a standardized scanning procedure to guide the user to scan item by item, so that the scan is more comprehensive and omissions are avoided. Furthermore, the method of the embodiment of the invention places low demands on the proficiency and specialization of the user's operation and does not require the user to change scanning habits.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the foregoing illustrative embodiments are merely exemplary and are not intended to limit the scope of the invention thereto. Various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present invention. All such changes and modifications are intended to be included within the scope of the present invention as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the units is only one logical functional division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another device, or some features may be omitted, or not executed.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various inventive aspects. However, the method of the present invention should not be construed to reflect the intent: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where such features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some of the modules in an item analysis apparatus according to embodiments of the present invention. The present invention may also be embodied as apparatus programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names.
The above description is only for the specific embodiment of the present invention or the description thereof, and the protection scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and the changes or substitutions should be covered within the protection scope of the present invention. The protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (21)

1. A method of ultrasound imaging, the method comprising:
providing scanning guide;
transmitting an ultrasonic beam to the thyroid according to the scanning guide to perform scanning;
receiving an ultrasonic echo returned from the thyroid, and acquiring an ultrasonic echo signal based on the ultrasonic echo;
processing the ultrasonic echo signal to obtain a multi-frame ultrasonic image of the thyroid;
analyzing the multi-frame ultrasonic image to judge whether one or more interested targets exist in the multi-frame ultrasonic image of the thyroid;
when one or more interested targets exist in the multi-frame ultrasonic image of the thyroid, determining the attribute content of each interested target, and determining the thyroid scanning position corresponding to each interested target; and
displaying the attribute content of the interested target at a corresponding position of a preset thyroid structure diagram according to the thyroid scanning position corresponding to each interested target;
and when receiving a selection input of the attribute content of the interested target displayed on the preset thyroid structure diagram, outputting an original ultrasonic image corresponding to the selected attribute content.
2. The method of claim 1, wherein providing scanning guide comprises:
providing a plurality of scanning items of the thyroid and a scanning sequence of each scanning item, and carrying out ultrasonic scanning on the plurality of scanning items of the thyroid one by one according to the scanning sequence; wherein the scanning items comprise scanning positions and scanning directions.
3. The method of claim 2, wherein transmitting an ultrasound beam to the thyroid for scanning according to the scanning guide comprises:
acquiring scanning items determined according to the scanning sequence; and
and according to the determined scanning items, emitting ultrasonic beams to corresponding scanning parts according to corresponding scanning directions so as to scan the plurality of scanning items one by one.
4. The method of claim 2, wherein providing a plurality of scan items for the thyroid gland and a scan order for each scan item comprises:
respectively presenting each scanning item by adopting an icon; and
presenting the respective icons using different presentation styles to distinguish scanned items from unscanned items of the plurality of scanning items.
5. The method of claim 4, further comprising:
providing a preset graph representing the thyroid, and presenting each icon on the preset graph; and the display position of each icon on the preset graph corresponds to the scanning part included by each icon.
6. The method of claim 4, wherein the icon comprises a thyroid structure map and a scanning location identifier overlaid on the thyroid structure map; the setting position of the scanning position mark on the thyroid structure diagram represents the scanning position corresponding to the scanning item, and the setting direction of the scanning position mark on the thyroid structure diagram represents the scanning direction corresponding to the scanning item.
7. The method of claim 2, wherein providing a plurality of scan items for the thyroid gland and a scan order for each scan item comprises:
according to the scanning sequence of each scanning item, displaying each scanning item in a time-sharing manner by adopting an icon;
or, presenting one or more scanning items using icons, wherein the currently presented scanning items comprise all scanned items and the currently unscanned items following the scanned items, as determined according to the scanning sequence of each scanning item.
8. The method of claim 1, wherein providing scanning guide comprises:
presenting a plurality of scan items of the thyroid, the scan items including a scan location and a scan direction;
receiving an input scanning item;
according to the input scanning items, emitting ultrasonic beams to the corresponding scanning parts according to the corresponding scanning directions; and
distinguishing scanned items from unscanned items of the plurality of scanning items using different presentation styles, to guide completion of scanning of the plurality of scanning items.
9. The method of claim 4 or 8, wherein the different presentation styles comprise:
distinguishing the scanned items and the unscanned items in the plurality of scanning items by different colors, different font sizes, different patterns, or different intensities of the same color.
10. The method of claim 2, further comprising:
in the scanning process according to the scanning sequence, judging whether a scanning item included in the currently obtained ultrasonic image corresponds to a current scanning item determined according to the scanning sequence and/or judging whether the currently scanned spatial orientation information corresponds to a current scanning position and a scanning direction determined according to the scanning sequence;
and/or after scanning is finished according to the scanning sequence, judging whether the ultrasonic image comprises all information of all the scanning items, and/or judging whether the set of all scanned spatial azimuth information covers all the scanning positions and scanning directions of the thyroid.
11. The method of claim 2 or 8, wherein the plurality of scanning items includes a left side longitudinal-section position, a left side cross-section position, a right side longitudinal-section position, a right side cross-section position, and an isthmus cross-section position.
12. An ultrasound image processing method, comprising:
acquiring ultrasonic images of different sections of the thyroid;
analyzing the ultrasonic images of different sections to obtain an interested target from the ultrasonic images of one or more sections;
determining scanning position information corresponding to the ultrasonic image with the interested target on the thyroid, wherein the scanning position information is used as the thyroid scanning position of the interested target, and determining the attribute content of the interested target; and
and correlating and displaying the thyroid scanning position and the attribute content of the same interested target.
13. The method of claim 12, further comprising:
receiving user selection input aiming at the attribute content;
displaying an original ultrasound image at the object of interest according to the selected input.
14. The method of claim 12, wherein associating thyroid scanning location and attribute content showing the same object of interest comprises:
and respectively displaying the attribute contents of the corresponding interested targets at the corresponding positions of a preset thyroid structure diagram according to the thyroid scanning position of the interested target.
15. The method according to claim 1 or 14, wherein displaying the attribute content of the object of interest on a preset thyroid structure map comprises:
taking the preset thyroid structure chart as a presented background;
displaying the corresponding attribute content at the position of the object of interest in a foreground.
16. The method according to claim 1 or 12,
determining a thyroid scanning location of the object of interest comprises:
acquiring a corresponding scanned part when an ultrasonic image of an object of interest is scanned, an
Taking the position of the scanning part on the thyroid as the thyroid scanning position of the interested target;
or,
determining a thyroid scanning location of the object of interest comprises:
acquiring a corresponding scanned part when an ultrasonic image of an interested target exists in scanning,
acquiring scanning sequence when the scanning part is scanned to obtain an ultrasonic image with an interested target;
and positioning the position of the ultrasonic image with the interested target in the scanning position according to the scanning sequence, and taking the position as the thyroid scanning position of the interested target.
17. The method of claim 1 or 12, wherein determining the thyroid scan location of the object of interest comprises: and acquiring a spatial position when an ultrasonic image of the target of interest exists in scanning, and taking the spatial position as a thyroid scanning position of the target of interest.
18. The method of claim 1 or 12, further comprising:
receiving an output instruction of a user;
and outputting an ultrasonic report according to the output instruction, wherein the ultrasonic report comprises the preset thyroid structure diagram and the attribute content displayed on the foreground of the structure diagram.
19. The method according to claim 1 or 12, wherein the preset thyroid structure map is an image of the shape of the thyroid gland anatomy, and/or wherein the attribute content comprises one or more of the following in combination: a maximum diameter image, a representative feature image, a plurality of slice images, and identification information of the object of interest.
20. An ultrasound imaging apparatus, comprising:
an ultrasonic probe;
the transmission/reception controller is used for exciting the ultrasonic probe to emit ultrasonic beams to one or more target parts of a tested object and receiving ultrasonic echoes returned by the one or more target parts to obtain ultrasonic echo signals;
a memory for storing a program executed by the processor;
a processor to:
processing the ultrasonic echo signals to obtain ultrasonic images of the one or more target parts;
determining attribute content of the one or more target sites and a scanning position of the ultrasound images of the one or more target sites based on the ultrasound images of the one or more target sites;
and the display is used for displaying the attribute contents corresponding to the one or more target parts at the corresponding position of a preset structure diagram of the object to be detected according to the scanning position of the ultrasonic images of the one or more target parts.
21. The ultrasound imaging apparatus of claim 20, further comprising:
the input device is used for receiving selection input of attribute contents of one or more target parts displayed on the preset tested object structure diagram;
the display is further configured to display an ultrasound image corresponding to the selected attribute content according to the selected input.
CN201910667500.3A 2019-07-23 2019-07-23 Ultrasonic imaging method and device Pending CN112294360A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910667500.3A CN112294360A (en) 2019-07-23 2019-07-23 Ultrasonic imaging method and device

Publications (1)

Publication Number Publication Date
CN112294360A true CN112294360A (en) 2021-02-02

Family

ID=74329201

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910667500.3A Pending CN112294360A (en) 2019-07-23 2019-07-23 Ultrasonic imaging method and device

Country Status (1)

Country Link
CN (1) CN112294360A (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101040801A (en) * 2006-03-23 2007-09-26 株式会社东芝 Apparatus and method of displaying image scanning report
CN102203714A (en) * 2008-11-06 2011-09-28 皇家飞利浦电子股份有限公司 Breast ultrasound annotation user interface
US20110110576A1 (en) * 2009-10-07 2011-05-12 Hologic, Inc. Selective Display Of Computer-Aided Detection Findings With Associated Breast X-Ray Mammogram and/or Tomosynthesis Image Information
US20110182493A1 (en) * 2010-01-25 2011-07-28 Martin Huber Method and a system for image annotation
CN102915400A (en) * 2011-08-02 2013-02-06 西门子公司 Method and arrangement for computer-assisted representation and/or evaluation of medical examination data
US20130261447A1 (en) * 2012-04-02 2013-10-03 Fujifilm Corporation Ultrasound diagnostic apparatus
US20140018681A1 (en) * 2012-07-10 2014-01-16 National Taiwan University Ultrasound imaging breast tumor detection and diagnostic system and method
CN103565470A (en) * 2012-08-07 2014-02-12 香港理工大学 Ultrasonic image automatic annotating method and system based on three-dimensional virtual image
CN109069131A (en) * 2016-04-18 2018-12-21 皇家飞利浦有限公司 Ultrasonic system and method for breast tissue imaging
CN109310400A (en) * 2016-06-07 2019-02-05 皇家飞利浦有限公司 The ultrasonic system and method for breast ultrasound image are imaged and annotated for breast tissue
US20180000453A1 (en) * 2016-07-01 2018-01-04 YoR Labs Methods and Systems for Ultrasound Imaging
US20180161010A1 (en) * 2016-12-09 2018-06-14 Samsung Electronics Co., Ltd. Apparatus and method for processing ultrasound image
JP2019081111A (en) * 2019-03-11 2019-05-30 キヤノンメディカルシステムズ株式会社 Ultrasonic image diagnostic apparatus and program

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113842166A (en) * 2021-10-25 2021-12-28 上海交通大学医学院 Ultrasonic image acquisition method based on ultrasonic imaging equipment and related device

Similar Documents

Publication Publication Date Title
US8172753B2 (en) Systems and methods for visualization of an ultrasound probe relative to an object
US11389139B2 (en) Echo window artifact classification and visual indicators for an ultrasound system
US20120108960A1 (en) Method and system for organizing stored ultrasound data
CN109069131A (en) Ultrasonic system and method for breast tissue imaging
US10121272B2 (en) Ultrasonic diagnostic apparatus and medical image processing apparatus
CA2989910C (en) Obstetrical imaging at the point of care for untrained or minimally trained operators
US20220249202A1 (en) Multiple bone density displaying method for establishing implant procedure plan, and image processing device therefor
US11344371B2 (en) Visualization of three-dimensional image data on a two-dimensional image
CN115811961A (en) Three-dimensional display method and ultrasonic imaging system
US8636662B2 (en) Method and system for displaying system parameter information
CN113349897A (en) Ultrasonic puncture guiding method, device and equipment
US20120188240A1 (en) Medical image display apparatus, method and program
JP2009195283A (en) Ultrasonic image processor
CN112568933A (en) Ultrasonic imaging method, apparatus and storage medium
CN112294360A (en) Ultrasonic imaging method and device
JP2015100479A (en) Ultrasonic image processor
CN114375179A (en) Ultrasonic image analysis method, ultrasonic imaging system, and computer storage medium
JP6258026B2 (en) Ultrasonic diagnostic equipment
CN113516701A (en) Image processing method, image processing device, related equipment and storage medium
CN113693627A (en) Ultrasonic image-based focus processing method, ultrasonic imaging device and storage medium
CN114007513A (en) Ultrasonic imaging equipment, method and device for detecting B line and storage medium
CN114035713A (en) Ultrasonic scanning flow control method and system
CN111568469A (en) Method and apparatus for displaying ultrasound image and computer program product
JP7299100B2 (en) ULTRASOUND DIAGNOSTIC DEVICE AND ULTRASOUND IMAGE PROCESSING METHOD
US20240096086A1 (en) Information processing apparatus, information processing method, and information processing program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210202