CN111466948B - Ultrasonic scanning method and ultrasonic scanning device - Google Patents

Ultrasonic scanning method and ultrasonic scanning device

Info

Publication number
CN111466948B
CN111466948B (application number CN201910062019.1A)
Authority
CN
China
Prior art keywords
image
area
ultrasound
ultrasonic
prediction box
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910062019.1A
Other languages
Chinese (zh)
Other versions
CN111466948A (en)
Inventor
余春贤
陈国男
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Acer Inc
Original Assignee
Acer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Priority to CN201910062019.1A priority Critical patent/CN111466948B/en
Publication of CN111466948A publication Critical patent/CN111466948A/en
Application granted granted Critical
Publication of CN111466948B publication Critical patent/CN111466948B/en

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/54 - Control of the diagnostic device
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 - Detecting organic movements or changes, e.g. tumours, cysts, swellings

Abstract

An embodiment of the invention provides an ultrasonic scanning method and an ultrasonic scanning device. The method comprises the following steps: performing an ultrasonic scanning operation on a human body via an ultrasonic scanner to obtain an ultrasound image; analyzing, via an image recognition module, the ultrasound image to identify an organ pattern in the ultrasound image; and automatically generating guidance information according to the recognition result of the organ pattern, wherein the guidance information is used to guide the movement of the ultrasonic scanner to scan a target organ of the human body. The efficiency of conventional manual ultrasonic scanning can thereby be improved.

Description

Ultrasonic scanning method and ultrasonic scanning device
Technical Field
The present invention relates to ultrasonic scanning technologies, and in particular, to an ultrasonic scanning method and an ultrasonic scanning apparatus.
Background
An ultrasound scanning apparatus can scan organ images inside the human body using ultrasound so that the state of an organ can be evaluated from the resulting image. However, operating an ultrasonic scanning apparatus conventionally requires a professional, and interpreting the ultrasound images obtained from the scan also requires professional expertise. It is not easy for a person without specialized training to identify a particular organ in an ultrasound image. Moreover, even when a professional such as a doctor or a laboratory technician operates the ultrasonic scanning apparatus, the organ pattern may be misinterpreted under various conditions (for example, insufficient expertise or fatigue), which can result in inefficient examinations or incorrect examination results.
Disclosure of Invention
The present invention provides an ultrasonic scanning method and an ultrasonic scanning apparatus, which can automatically analyze an ultrasound image and provide guidance information for assisting the scan according to the analysis result, thereby alleviating the above problems.
An embodiment of the invention provides an ultrasonic scanning method for an ultrasonic scanning apparatus. The ultrasonic scanning method comprises the following steps: performing an ultrasonic scanning operation on a human body via an ultrasonic scanner to obtain an ultrasound image; analyzing, via an image recognition module, the ultrasound image to identify an organ pattern in the ultrasound image; and automatically generating guidance information according to the recognition result of the organ pattern, wherein the guidance information is used to guide the movement of the ultrasonic scanner to scan a target organ of the human body.
An embodiment of the present invention further provides an ultrasonic scanning apparatus, which includes an ultrasonic scanner and a processor. The ultrasonic scanner is configured to perform an ultrasonic scanning operation on a human body to obtain an ultrasound image. The processor is coupled to the ultrasonic scanner and configured to analyze the ultrasound image via an image recognition module to identify an organ pattern in the ultrasound image. The processor is further configured to automatically generate guidance information according to the recognition result of the organ pattern, and the guidance information is used to guide the movement of the ultrasonic scanner to scan a target organ of the human body.
Based on the above, after an ultrasound scanning operation is performed on a human body via an ultrasound scanner to obtain an ultrasound image, an image recognition module may be used to analyze the ultrasound image to identify an organ pattern in the ultrasound image. Guidance information may then be automatically generated according to the recognition result of the organ pattern. In particular, the guidance information can guide the movement of the ultrasound scanner to scan a target organ of the human body, thereby reducing the burden on professionals performing ultrasound scanning and/or enabling a person without professional training to easily operate the ultrasound scanning apparatus to perform a simple scan.
In order to make the aforementioned and other features and advantages of the invention more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
FIG. 1 is a schematic view of an ultrasound scanning apparatus according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating an ultrasound scanning operation in accordance with an embodiment of the present invention;
FIG. 3 is a schematic diagram of an ultrasound image and corresponding coordinate information, according to an embodiment of the present invention;
FIGS. 4 and 5 are schematic diagrams illustrating generation of guidance information according to a numerical relationship between areas of prediction boxes according to an embodiment of the present invention;
FIG. 6 is a flow chart illustrating a method of ultrasound scanning according to an embodiment of the present invention;
FIGS. 7 and 8 are flowcharts illustrating an ultrasonic scanning method according to an embodiment of the present invention.
Description of the reference numerals
10: ultrasonic scanning device
101: ultrasonic scanner
102: storage device
103: image recognition module
104: processor with a memory having a plurality of memory cells
105: input/output interface
21: human body
201: organ
31: Coordinate table
41, 42, 51, 52: Ultrasound image
410, 420, 510, 520: Prediction box
401, 501: Guidance information
S601 to S603, S701 to S704, S801 to S808: Steps
Detailed Description
FIG. 1 is a schematic view of an ultrasonic scanning apparatus according to an embodiment of the present invention. Referring to FIG. 1, the ultrasound scanning apparatus 10 includes an ultrasound scanner 101, a storage device 102, an image recognition module 103, a processor 104, and an input/output interface 105. The ultrasound scanner 101 is configured to perform an ultrasound scanning operation on a human body to obtain an ultrasound image. For example, the ultrasound scanner 101 may include a hand-held probe. The ultrasound scanner 101 can transmit ultrasonic waves and receive the ultrasonic waves reflected back by organs of the human body. An ultrasound image can be obtained from the reflected ultrasonic waves. In the following embodiments, a two-dimensional ultrasound image is used as an example. However, in another embodiment, the ultrasound image may also be a three-dimensional ultrasound image, and the invention is not limited thereto.
The storage device 102 is used for storing data. For example, the storage device 102 may include volatile storage media and non-volatile storage media. The volatile storage media may include a Random Access Memory (RAM). The non-volatile storage media may include a flash memory module, a Read Only Memory (ROM), a Solid State Drive (SSD), and/or a Hard Disk Drive (HDD). In addition, the number of the storage devices 102 may be one or more, and the invention is not limited thereto.
The image recognition module 103 is configured to perform image recognition on the obtained ultrasound image. For example, the image recognition module 103 may perform image recognition based on a Convolutional Neural Network (CNN) architecture or other type of image recognition architecture (or algorithm). The image recognition module 103 may be implemented in software or hardware. In one embodiment, the image recognition module 103 comprises a software module. For example, the program code of the image recognition module 103 may be stored in the storage device 102 and may be executed by the processor 104. In one embodiment, the image recognition module 103 includes hardware circuitry. For example, the image recognition module 103 may include a Graphics Processing Unit (GPU) or other programmable general purpose or special purpose microprocessor, digital signal processor, programmable controller, application specific integrated circuit, programmable logic device, or other similar device or combination of devices. In addition, the number of the image recognition modules 103 may be one or more, and the invention is not limited thereto.
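For illustration, the following is a minimal Python sketch of how such an image recognition module could expose CNN-based detections as prediction boxes. The class names PredictionBox and ImageRecognitionModule, the injected detector callable, and all field names are assumptions made for this sketch and are not specified by the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PredictionBox:
    """Bounding box of a recognized organ pattern, in pixel coordinates."""
    organ: str      # e.g. "liver", "heart" (illustrative labels)
    x_min: int
    y_min: int
    x_max: int
    y_max: int
    score: float    # detection confidence reported by the recognizer

    @property
    def area(self) -> int:
        """Area of the prediction box in pixels."""
        return max(0, self.x_max - self.x_min) * max(0, self.y_max - self.y_min)

class ImageRecognitionModule:
    """Thin wrapper around an object detector (CNN-based or otherwise)."""

    def __init__(self, detector):
        # `detector` is any callable that maps an ultrasound image to a list of
        # (organ, x_min, y_min, x_max, y_max, score) tuples.
        self._detector = detector

    def analyze(self, ultrasound_image) -> List[PredictionBox]:
        """Run the detector and return the recognized organ patterns as prediction boxes."""
        return [PredictionBox(*raw) for raw in self._detector(ultrasound_image)]
```

Because the back end is injected through the detector callable, any recognition architecture can be swapped in without changing the rest of the sketch.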
The processor 104 is coupled to the ultrasound scanner 101, the storage device 102, and the image recognition module 103. The processor 104 may be used to control the ultrasound scanner 101, the storage device 102, and the image recognition module 103. For example, the processor 104 may include a Central Processing Unit (CPU), a graphics processor or other programmable general or special purpose microprocessor, a digital signal processor, a programmable controller, an application specific integrated circuit, a programmable logic device, or other similar devices or combinations thereof. In one embodiment, the processor 104 may be used to control the overall or partial operation of the ultrasound scanning apparatus 10. In one embodiment, the image recognition module 103 may be implemented within the processor 104 in software, firmware, or hardware. In addition, the number of the processors 104 may be one or more, and the invention is not limited thereto.
The input/output interface 105 is coupled to the processor 104. The input/output interface 105 is used to receive signals and/or output signals. For example, the input/output interface 105 may include a screen, a touch pad, a mouse, a keyboard, physical buttons, a speaker, a microphone, a wired communication interface, and/or a wireless communication interface, and the type of the input/output interface 105 is not limited thereto.
FIG. 2 is a schematic diagram illustrating an ultrasound scanning operation according to an embodiment of the present invention. Referring to FIGS. 1 and 2, an operator can hold the ultrasonic scanner 101 and move it over the human body 21 to perform an ultrasound scanning operation. In addition, a gel may be applied between the ultrasonic scanner 101 and the human body 21 to facilitate the ultrasound scanning operation. Taking FIG. 2 as an example, an ultrasound image obtained via this ultrasound scanning operation may present a pattern of the organ 201 (also referred to as an organ pattern). For example, the organ 201 may be any of various human organs such as the heart, liver, and/or uterus, and the present invention is not limited thereto. The obtained ultrasound images may be stored in the storage device 102.
The processor 104 may analyze the ultrasound image via the image recognition module 103 to identify an organ pattern in the ultrasound image. The processor 104 may automatically generate guidance information according to the recognition result of the organ pattern. This guidance information may be used to guide the movement of the ultrasound scanner 101 to scan a particular organ (also referred to as a target organ) of the human body 21. For example, this guidance information may be output as an image through a screen in the input/output interface 105 and/or as sound through a speaker in the input/output interface 105. Alternatively, the guidance information may be output in other forms (such as vibration or a buzzer), and the invention is not limited thereto. In addition, the target organ may be any of various human organs such as the heart, liver, and/or uterus, and the present invention is not limited thereto. The operator can move the ultrasound scanner 101 according to this guidance information to continue scanning the target organ and/or to continue enlarging the pattern area of the target organ in the ultrasound image during the scan.
In one embodiment, the guidance information may include direction information. The operator can move the ultrasonic scanner 101 in a specific direction based on the direction information. In one embodiment, this guidance information may include information indicating whether the current moving direction of the ultrasound scanner 101 is correct. The operator can decide whether to change the moving direction of the ultrasonic scanner 101 based on the guidance information. Thereby, even without professional training in ultrasound scanning and/or analysis of ultrasound images, a general user can perform ultrasound scanning of a target organ and/or obtain an ultrasound image presenting a complete pattern (or a maximum pattern) of the target organ according to this guidance information. In addition, the guidance information can also be used to guide professionals such as doctors or laboratory technicians to complete ultrasound scanning of the target organ.
In one embodiment, the processor 104 may record coordinate information corresponding to a particular ultrasound image. This coordinate information may reflect the position of the ultrasound scanner 101 at the time the ultrasound image was acquired. For example, a sensor module may be provided in the ultrasonic scanner 101. Such a sensor module may include an optical sensor, a gyroscope, a gravity sensor (e.g., G-sensor), and the like to sense the current position, moving direction, moving distance, and/or the like of the ultrasound scanner 101. The processor 104 may obtain the coordinate information corresponding to a particular ultrasound image from the information provided by the sensor module.
FIG. 3 is a schematic diagram of an ultrasound image and corresponding coordinate information according to an embodiment of the present invention. Referring to FIG. 3, the coordinate table 31 records coordinates (x1, y1) to (x5, y5) corresponding to a plurality of ultrasound images numbered 1 to 5. Taking the ultrasound images numbered 1 and 2 as an example, when the ultrasound image numbered 1 is acquired, the ultrasound scanner 101 is located at the coordinates (x1, y1); when the ultrasound image numbered 2 is acquired, the ultrasound scanner 101 is located at the coordinates (x2, y2), and so on. After performing image recognition of an organ pattern on a certain ultrasound image, the processor 104 may generate the guidance information based on the recognition result of the organ pattern and the coordinate information corresponding to this ultrasound image in the coordinate table 31. In other words, the current position, the past positions, and/or the movement trajectory of the ultrasound scanner 101 may be considered when generating the guidance information. It should be noted that the five ultrasound images in the coordinate table 31 are only exemplary. In another embodiment, more or fewer ultrasound images and corresponding coordinate information may be recorded in the coordinate table 31. Alternatively, the coordinate table 31 may record the ultrasound images and the corresponding coordinate information in other forms. In addition, more information for generating the guidance information may be recorded in the coordinate table 31, depending on practical requirements.
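A minimal Python sketch of such a coordinate table follows. The CoordinateTable class and the example coordinate values are hypothetical placeholders standing in for the probe positions reported by the sensor module.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class CoordinateTable:
    """Maps an ultrasound image number to the scanner coordinates at capture time."""
    entries: Dict[int, Tuple[float, float]] = field(default_factory=dict)

    def record(self, image_number: int, x: float, y: float) -> None:
        """Store the probe position associated with one ultrasound image."""
        self.entries[image_number] = (x, y)

    def position_of(self, image_number: int) -> Tuple[float, float]:
        """Return the probe position recorded for the given image number."""
        return self.entries[image_number]

# Example mirroring the five rows of FIG. 3 (the coordinate values are placeholders):
table = CoordinateTable()
for number, (x, y) in enumerate([(0.0, 0.0), (1.2, 0.3), (2.5, 0.4), (3.1, 0.9), (4.0, 1.5)], start=1):
    table.record(number, x, y)
```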
In one embodiment, the image recognition module 103 may determine a prediction box in an ultrasound image. This prediction box reflects the range, in the ultrasound image, of the organ pattern identified by the image recognition module 103. Taking an ultrasound image containing a liver pattern as an example, after performing image recognition on the ultrasound image, the image recognition module 103 may determine a prediction box in the ultrasound image. The prediction box reflects the approximate range of the liver pattern in the ultrasound image.
In one embodiment, the prediction box covers the range of the organ pattern of the target organ. For example, assuming that a certain ultrasound image includes a plurality of organ patterns, the image recognition module 103 may determine a prediction box according to the organ pattern belonging to the target organ (also referred to as the target organ pattern) among the organ patterns, so that the prediction box (only) covers the range of the target organ pattern. Viewed from another perspective, after the target organ is decided, the image recognition module 103 may begin tracking the organ pattern of the target organ and ignore the remaining organ patterns that do not belong to the target organ. For example, assuming the determined target organ is the liver, the image recognition module 103 may begin tracking liver patterns that may appear in the ultrasound image and ignore the organ patterns of other organs (e.g., the kidney or heart) in the ultrasound image.
In an embodiment, the processor 104 may obtain the area of a prediction box in a certain ultrasound image. The size of this area may reflect the proportion of the ultrasound image occupied by the prediction box. The processor 104 may generate the guidance information based on this area. For example, after a plurality of ultrasound images are successively obtained, the processor 104 may generate the guidance information according to a numerical relationship between the areas of the prediction boxes in these ultrasound images. For example, such a numerical relationship may reflect the change in area between these prediction boxes.
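As a sketch of this step, the helper functions below pick out the target organ's prediction box and express its area as a proportion of the ultrasound image. They assume the PredictionBox class from the earlier sketch, and the function names are illustrative rather than part of the patent.

```python
from typing import List, Optional

def target_box(boxes: List["PredictionBox"], target_organ: str) -> Optional["PredictionBox"]:
    """Select the prediction box of the target organ, ignoring other organ patterns."""
    candidates = [b for b in boxes if b.organ == target_organ]
    return max(candidates, key=lambda b: b.score) if candidates else None

def area_fraction(box: "PredictionBox", image_width: int, image_height: int) -> float:
    """Proportion of the ultrasound image occupied by the prediction box."""
    return box.area / float(image_width * image_height)
```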
FIGS. 4 and 5 are schematic diagrams illustrating generation of guidance information according to a numerical relationship between the areas of prediction boxes according to an embodiment of the present invention. Referring to FIG. 4, it is assumed that after the ultrasound image 41 is obtained, the ultrasound image 42 is subsequently obtained by moving the ultrasound scanner 101. The prediction box 410 in the ultrasound image 41 is determined, and the prediction box 420 in the ultrasound image 42 is determined. The prediction boxes 410 and 420 may each encompass an organ pattern of at least a portion of the target organ (marked with diagonal lines in FIG. 4).
The processor 104 may compare the area of the prediction box 410 with the area of the prediction box 420 to obtain a numerical relationship between the two areas. In the present embodiment, the numerical relationship between the area of the prediction box 410 and the area of the prediction box 420 reflects that the area of the prediction box 410 is smaller than the area of the prediction box 420. Accordingly, the processor 104 may generate guidance information (also referred to as first guidance information) 401 to inform the operator of the ultrasound scanner 101 that the current scanning direction is correct and that scanning may continue.
In one embodiment, the fact that the area of the prediction box 410 is smaller than the area of the prediction box 420 may be regarded as a first numerical relationship between the prediction boxes 410 and 420. This first numerical relationship indicates that the area of the prediction box in the ultrasound image gradually increases (equivalently, the area of the pattern of the target organ in the ultrasound image gradually increases) as the ultrasound scanner 101 moves, as shown in FIG. 4. Therefore, the guidance information 401 can inform the operator to keep the current moving direction of the ultrasound scanner 101 and continue scanning without adjusting the moving direction. Thereby, the scanning position of the ultrasound scanner 101 can gradually approach the position of the target organ.
Referring to FIG. 5, it is assumed that after the ultrasound image 51 is obtained, the ultrasound image 52 is obtained by moving the ultrasound scanner 101. The prediction box 510 in the ultrasound image 51 is determined, and the prediction box 520 in the ultrasound image 52 is determined. The prediction boxes 510 and 520 may each encompass an organ pattern of at least a portion of the target organ (marked with diagonal lines in FIG. 5).
The processor 104 may compare the area of the prediction box 510 with the area of the prediction box 520 to obtain a numerical relationship between the two areas. In the present embodiment, the numerical relationship between the area of the prediction box 510 and the area of the prediction box 520 reflects that the area of the prediction box 510 is larger than the area of the prediction box 520. Accordingly, the processor 104 may generate guidance information (also referred to as second guidance information) 501 to alert the operator of the ultrasound scanner 101 that the current scanning direction is wrong and that the ultrasound scanner 101 may be moved in the opposite direction (or another direction).
In an embodiment, the fact that the area of the prediction box 510 is larger than the area of the prediction box 520 may be regarded as a second numerical relationship between the prediction boxes 510 and 520. This second numerical relationship indicates that the area of the prediction box in the ultrasound image gradually decreases (equivalently, the area of the pattern of the target organ in the ultrasound image gradually decreases) as the ultrasound scanner 101 moves, as shown in FIG. 5. Thus, the guidance information 501 may suggest that the operator change the moving direction of the ultrasound scanner 101. Thereby, the scanning position of the ultrasound scanner 101 can be prevented from moving continuously away from the position of the target organ. In one embodiment, the guidance information 401 and/or 501 may be generated with reference to the coordinate table 31 of FIG. 3 to obtain the previous moving direction of the ultrasound scanner 101 and provide a suggested moving direction.
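The decision described for FIGS. 4 and 5 can be summarized in a short sketch that compares the areas of two successive prediction boxes. The wording of the returned messages is illustrative and not taken from the patent.

```python
def guidance_from_areas(previous_area: float, current_area: float) -> str:
    """Compare the areas of two successive prediction boxes and return guidance text.

    First numerical relationship  (previous < current): keep the current moving direction.
    Second numerical relationship (previous > current): change the moving direction.
    """
    if current_area > previous_area:
        return "First guidance information: scanning direction is correct, keep moving."
    if current_area < previous_area:
        return "Second guidance information: scanning direction is wrong, change direction."
    return "Prediction box area unchanged; continue scanning."
```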
In one embodiment, the processor 104 may continuously record the areas of the prediction boxes in a plurality of ultrasound images. Based on a comparison of the areas of these prediction boxes, the processor 104 may obtain particular coordinate information (also referred to as target coordinate information). The target coordinate information corresponds to one of the ultrasound images (also referred to as a target image). The prediction box in the target image has the largest area among the prediction boxes of the recorded ultrasound images. For example, assuming that the target image is the ultrasound image numbered 3 in FIG. 3, the target coordinate information is (x3, y3), and the area of the prediction box in the ultrasound image numbered 3 is larger than the area of the prediction box in any of the ultrasound images numbered 1, 2, 4, and 5.
In an embodiment, the processor 104 may generate guidance information (also referred to as third guidance information) according to the target coordinate information. For example, the processor 104 may generate the third guidance information according to the target coordinate information and the current position of the ultrasound scanner 101. The third guidance information can be used to assist the operator in moving the ultrasound scanner 101 to the scanning position corresponding to the target coordinate information. After the ultrasound scanner 101 is moved to the scanning position corresponding to the target coordinate information, an ultrasound image having the largest prediction box (equivalent to the largest organ pattern of the target organ), such as the ultrasound image 42 of FIG. 4 or the ultrasound image 51 of FIG. 5, can be obtained again. In other words, in one embodiment, the third guidance information may be used to guide the ultrasound scanner 101 to a scanning position where the maximum organ pattern of the target organ can be obtained.
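A sketch of how the third guidance information could be derived is shown below. It assumes the recorded areas are kept per image number and that the CoordinateTable from the earlier sketch provides the probe position of each image; the returned dictionary layout is an assumption for illustration.

```python
from typing import Dict, Tuple

def third_guidance(areas_by_image: Dict[int, float],
                   coordinate_table: "CoordinateTable",
                   current_xy: Tuple[float, float]) -> dict:
    """Locate the target image (largest recorded prediction-box area) and suggest
    the displacement needed to move the scanner back to that scanning position."""
    target_image = max(areas_by_image, key=areas_by_image.get)
    tx, ty = coordinate_table.position_of(target_image)
    cx, cy = current_xy
    return {
        "target_image": target_image,
        "target_coordinates": (tx, ty),
        "suggested_move": (tx - cx, ty - cy),  # offset from the current position back to the target
    }
```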
FIG. 6 is a flowchart illustrating an ultrasound scanning method according to an embodiment of the present invention. Referring to FIG. 6, in step S601, an ultrasound scanning operation is performed on a human body via an ultrasound scanner to obtain an ultrasound image. In step S602, the ultrasound image is analyzed via an image recognition module to identify an organ pattern in the ultrasound image. In step S603, guidance information is automatically generated according to the recognition result of the organ pattern. The guidance information is used to guide the movement of the ultrasound scanner to scan a target organ of the human body.
FIGS. 7 and 8 are flowcharts illustrating an ultrasonic scanning method according to an embodiment of the invention. Referring to FIG. 7, in step S701, an ultrasound scanning operation is performed on a human body via an ultrasound scanner to obtain an ultrasound image. In step S702, the ultrasound image is analyzed via an image recognition module to identify an organ pattern in the ultrasound image. In step S703, it is determined whether an organ pattern appears in the ultrasound image. If no organ pattern is present in the ultrasound image, the procedure can return to step S701. If an organ pattern is present in the ultrasound image, in step S704, it is determined whether to track the organ pattern. For example, in step S704, whether to track the organ pattern may be determined according to a user operation. If it is determined to track the organ pattern, the organ pattern is the organ pattern of the target organ, and the process proceeds to step S801 of FIG. 8 to start tracking. Otherwise, if the organ pattern is not to be tracked, the operator may not yet have moved the ultrasound scanner to a scanning position covering the organ pattern of the target organ, so step S701 may be repeated.
Referring to FIG. 8, in step S801, the area of the prediction box in the ultrasound image and the coordinate information of the ultrasound image are recorded. In step S802, the areas of the recorded prediction boxes are compared. In step S803, it is determined whether the area of the prediction box in the ultrasound images gradually increases. If the area of the prediction box in the ultrasound images gradually increases, in step S804, first guidance information indicating that the scanning path (or scanning direction) should be maintained is generated. Alternatively, if the area of the prediction box in the ultrasound images does not gradually increase (e.g., it gradually decreases), in step S805, second guidance information indicating that the scanning path should be changed (or that the scanning direction is wrong) is generated. After steps S804 and S805, step S806 may be performed. In addition, in another embodiment, step S806 can be executed at any point in time, and the invention is not limited thereto.
In step S806, it is determined, according to a user operation, whether the operator intends to return to the scanning position at which the maximum organ pattern was measured. If no user operation reflecting an intention to return to that scanning position is received, the method may return to step S701 of FIG. 7 to continue scanning. If the received user operation reflects that the operator intends to return to the scanning position of the measured maximum organ pattern, in step S807, the previously recorded comparison results of the prediction box areas are queried to obtain the target coordinate information corresponding to the target image. In step S808, third guidance information is generated according to the target coordinate information and the current coordinate position of the ultrasound scanner.
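Tying the pieces together, the following sketch mirrors the flow of FIGS. 7 and 8 (steps S701 to S704 and S801 to S808). Every callable it takes (image acquisition, recognition, probe position, user prompts, output) is an injected placeholder, since the patent does not define a device API, and it reuses the helper sketches given earlier.

```python
def scan_loop(acquire_image, recognize, get_probe_xy,
              user_wants_tracking, user_wants_return, emit,
              target_organ: str, max_iterations: int = 100) -> None:
    """Illustrative control loop mirroring steps S701-S704 and S801-S808."""
    areas: dict = {}
    coords = CoordinateTable()
    previous_area = None
    for n in range(1, max_iterations + 1):
        image = acquire_image()                       # S701: obtain an ultrasound image
        boxes = recognize(image)                      # S702: identify organ patterns
        box = target_box(boxes, target_organ)         # S703: is the target organ pattern present?
        if box is None or not user_wants_tracking():  # S704: start tracking?
            previous_area = None
            continue
        x, y = get_probe_xy()
        coords.record(n, x, y)                        # S801: record area and coordinates
        areas[n] = box.area
        if previous_area is not None:                 # S802-S805: compare areas, emit guidance
            emit(guidance_from_areas(previous_area, box.area))
        previous_area = box.area
        if user_wants_return():                       # S806-S808: guide back to the largest pattern
            emit(third_guidance(areas, coords, (x, y)))
```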
The steps in FIGS. 6 to 8 have been described in detail above and are not repeated here. It is noted that the steps in FIGS. 6 to 8 can be implemented as program code or as circuits, and the invention is not limited thereto. In addition, the methods in FIGS. 6 to 8 can be used together with the above embodiments or used alone, and the invention is not limited thereto.
In summary, after an ultrasound scanning operation is performed on a human body via an ultrasound scanner to obtain an ultrasound image, an image recognition module may analyze the ultrasound image to identify an organ pattern in the ultrasound image. Guidance information may then be automatically generated according to the recognition result of the organ pattern. In particular, the guidance information can be used to guide the movement of the ultrasound scanner to scan a target organ of the human body, so that the burden on professionals performing ultrasound scanning can be reduced and/or a person without professional training can easily operate the ultrasound scanning apparatus to perform a simple scan.
Although the present invention has been described with reference to the above embodiments, it should be understood that various changes and modifications can be made therein by those skilled in the art without departing from the spirit and scope of the invention.

Claims (6)

1. An ultrasonic scanning method for an ultrasonic scanning apparatus, comprising:
performing an ultrasonic scanning operation on a human body via an ultrasonic scanner to obtain an ultrasonic image;
analyzing, via an image recognition module, the ultrasound image to identify an organ pattern in the ultrasound image; and
automatically generating guidance information according to the recognition result of the organ pattern, wherein the guidance information is used to guide the movement of the ultrasonic scanner to scan a target organ of the human body,
wherein the step of automatically generating the guidance information according to the recognition result of the organ pattern comprises:
generating the guidance information according to an area of a prediction box in the ultrasound image,
wherein the prediction box reflects a range of the organ pattern identified by the image recognition module in the ultrasound image,
wherein the ultrasound image includes a first image and a second image, the prediction box includes a first prediction box in the first image and a second prediction box in the second image, and the step of automatically generating the guidance information according to the recognition result of the organ pattern includes:
generating the guidance information according to a numerical relationship between an area of the first prediction box and an area of the second prediction box,
wherein the numerical relationship comprises a first numerical relationship and a second numerical relationship, wherein the first numerical relationship reflects that the area of the first prediction box is smaller than the area of the second prediction box, and wherein the second numerical relationship reflects that the area of the first prediction box is larger than the area of the second prediction box.
2. The ultrasonic scanning method according to claim 1, wherein the step of automatically generating the guidance information according to the recognition result of the organ pattern includes:
generating the guidance information according to the recognition result of the organ pattern and coordinate information,
wherein the coordinate information reflects a position of the ultrasound scanner at a time of acquiring the ultrasound image.
3. The ultrasound scanning method of claim 1, wherein the step of generating the guidance information according to the numerical relationship between the area of the first prediction box and the area of the second prediction box comprises:
obtaining target coordinate information according to the numerical relationship, wherein the target coordinate information corresponds to a target image in the ultrasound image, and when the numerical relationship between the area of the first prediction box and the area of the second prediction box is the first numerical relationship, the target image is the second image; and
generating the guidance information according to the target coordinate information.
4. An ultrasonic scanning device comprising:
an ultrasonic scanner for performing an ultrasonic scanning operation on a human body to obtain an ultrasonic image;
a processor coupled to the ultrasound scanner and configured to analyze the ultrasound image via an image recognition module to identify an organ pattern in the ultrasound image,
wherein the processor is further configured to automatically generate guidance information according to a recognition result of the organ pattern, and the guidance information is configured to guide movement of the ultrasonic scanner to scan a target organ of the human body,
wherein the operation of the processor automatically generating the guidance information according to the recognition result of the organ pattern comprises:
generating the guidance information according to an area of a prediction box in the ultrasound image,
wherein the prediction box reflects a range of the organ pattern identified by the image recognition module in the ultrasound image,
wherein the ultrasound image comprises a first image and a second image, the prediction box comprises a first prediction box in the first image and a second prediction box in the second image, and the operation of the processor automatically generating the guidance information according to the recognition result of the organ pattern comprises:
generating the guidance information according to a numerical relationship between an area of the first prediction box and an area of the second prediction box,
wherein the numerical relationship comprises a first numerical relationship and a second numerical relationship, wherein the first numerical relationship reflects that the area of the first prediction box is smaller than the area of the second prediction box, and wherein the second numerical relationship reflects that the area of the first prediction box is larger than the area of the second prediction box.
5. The ultrasound scanning device of claim 4, wherein the operation of the processor automatically generating the guidance information from the recognition of the organ pattern comprises:
generating the guidance information according to the recognition result of the organ pattern and coordinate information,
wherein the coordinate information reflects a position of the ultrasound scanner at a time of acquiring the ultrasound image.
6. The ultrasound scanning device of claim 4, wherein the operation of the processor generating the guidance information according to the numerical relationship between the area of the first prediction box and the area of the second prediction box comprises:
obtaining target coordinate information according to the numerical relationship, wherein the target coordinate information corresponds to a target image in the ultrasound image, and when the numerical relationship between the area of the first prediction box and the area of the second prediction box is the first numerical relationship, the target image is the second image; and
generating the guidance information according to the target coordinate information.
CN201910062019.1A 2019-01-23 2019-01-23 Ultrasonic scanning method and ultrasonic scanning device Active CN111466948B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910062019.1A CN111466948B (en) 2019-01-23 2019-01-23 Ultrasonic scanning method and ultrasonic scanning device

Publications (2)

Publication Number Publication Date
CN111466948A CN111466948A (en) 2020-07-31
CN111466948B true CN111466948B (en) 2023-04-07

Family

ID=71743850

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910062019.1A Active CN111466948B (en) 2019-01-23 2019-01-23 Ultrasonic scanning method and ultrasonic scanning device

Country Status (1)

Country Link
CN (1) CN111466948B (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6263447B2 (en) * 2014-06-30 2018-01-17 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Ultrasonic diagnostic apparatus and program

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002041776A1 (en) * 2000-11-24 2002-05-30 Feinberg David A Ultrasound within mri scanners for guidance of mri pulse sequences
JP2006271588A (en) * 2005-03-29 2006-10-12 Hitachi Medical Corp Ultrasonic apparatus
CN101271500A (en) * 2008-05-14 2008-09-24 宏碁股份有限公司 Biological recognition starter and its method
WO2015097690A1 (en) * 2013-12-24 2015-07-02 Ge Medical Systems Israel, Ltd. Systems and methods for controlling motion of detectors having moving detector heads
WO2015119338A1 (en) * 2014-02-04 2015-08-13 한국디지털병원수출사업협동조합 Method for guiding scan position of three-dimensional ultrasound probe and ultrasound diagnostic system employing same
CN107261346A (en) * 2016-03-31 2017-10-20 通用电气公司 For using the method and system for being ultrasonically formed obstruction
TW201800057A (en) * 2016-06-20 2018-01-01 蝴蝶網路公司 Automated image acquisition for assisting a user to operate an ultrasound device
WO2018166789A1 (en) * 2017-03-16 2018-09-20 Koninklijke Philips N.V. Optimal scan plane selection for organ viewing

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Lamb G, Taylor I. An assessment of ultrasound scanning in the recognition of colorectal liver metastases. Ann R Coll Surg Engl. 1982-11-30; full text *
闫士举; 钱理为; 葛斌. Research and implementation of an XRII image-guided surgery system. Chinese Journal of Biomedical Engineering. 2010. *
代慧. Research on target area extraction methods for high-resolution SAR images. China Master's Theses Full-text Database, Information Science and Technology. 2018-04-15; I136-2466 *

Also Published As

Publication number Publication date
CN111466948A (en) 2020-07-31

Similar Documents

Publication Publication Date Title
US20210177373A1 (en) Ultrasound system with an artificial neural network for guided liver imaging
US9773305B2 (en) Lesion diagnosis apparatus and method
CN103919573A (en) Lesion Diagnosis Apparatus And Method
KR100954989B1 (en) Ultrasound diagnostic apparatus and method for measuring size of target object
US9730675B2 (en) Ultrasound imaging system and an ultrasound imaging method
CN109846513A (en) Ultrasonic imaging method, system and image measuring method, processing system and medium
CN109754396A (en) Method for registering, device, computer equipment and the storage medium of image
CN110087550A (en) A kind of ultrasound pattern display method, equipment and storage medium
US11317895B2 (en) Ultrasound diagnosis apparatus and method of operating the same
JP2022549281A (en) Method, system and computer readable storage medium for registering intraoral measurements
US11903760B2 (en) Systems and methods for scan plane prediction in ultrasound images
US20210350529A1 (en) Gating machine learning predictions on medical ultrasound images via risk and uncertainty quantification
WO2022098859A1 (en) Robust segmentation through high-level image understanding
CN116407154A (en) Ultrasonic diagnosis data processing method and device, ultrasonic equipment and storage medium
CN114299015A (en) Method and device for determining scoliosis angle
US11253228B2 (en) Ultrasonic scanning method and ultrasonic scanning device
CN111466948B (en) Ultrasonic scanning method and ultrasonic scanning device
CN104706379B (en) Diagnostic ultrasound equipment
CN113274051B (en) Ultrasonic auxiliary scanning method and device, electronic equipment and storage medium
EP2740408A1 (en) Ultrasound diagnostic method and ultrasound diagnostic apparatus using volume data
CN114782364A (en) Image detection method, device and system and detection equipment
WO2021120059A1 (en) Measurement method and measurement system for three-dimensional volume data, medical apparatus, and storage medium
CN113133781A (en) Ultrasonic diagnostic apparatus and operating method for ultrasonic diagnostic apparatus
WO2021117349A1 (en) Information processing device, information processing system, information processing method, and information processing program
US20230196580A1 (en) Ultrasound diagnostic apparatus and ultrasound image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant