CN108652663B - Ultrasonic image processing method and device, storage medium and ultrasonic imaging equipment - Google Patents

Ultrasonic image processing method and device, storage medium and ultrasonic imaging equipment

Info

Publication number
CN108652663B
CN108652663B CN201810469754.XA
Authority
CN
China
Prior art keywords
ultrasonic
image
depth range
currently displayed
ultrasonic image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810469754.XA
Other languages
Chinese (zh)
Other versions
CN108652663A (en)
Inventor
许龙
何丹妮
廖静秋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sonoscape Medical Corp
Original Assignee
Sonoscape Medical Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sonoscape Medical Corp filed Critical Sonoscape Medical Corp
Priority to CN201810469754.XA priority Critical patent/CN108652663B/en
Publication of CN108652663A publication Critical patent/CN108652663A/en
Application granted granted Critical
Publication of CN108652663B publication Critical patent/CN108652663B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4411 Device being modular
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/467 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
    • A61B 8/469 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means for selection of a region of interest
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The embodiments of the application disclose an ultrasound image processing method and apparatus, a computer-readable storage medium, and an ultrasound imaging device. Regardless of the depth range to which the currently displayed ultrasound image belongs, the image is divided into N segments according to that depth range, and a one-to-one correspondence is established between N preset time gain compensation parameter adjustment keys and the N divided segments. As a result, even when the currently displayed ultrasound image is a local image, every one of the N time gain compensation parameter adjustment keys can adjust it, so the user can roughly determine by visual inspection, rather than by blind trial, which key corresponds to the image at the depth to be adjusted. This simplifies user operation and improves adjustment precision.

Description

Ultrasonic image processing method and device, storage medium and ultrasonic imaging equipment
Technical Field
The present application relates to the field of medical image processing technologies, and in particular, to an ultrasound image processing method and apparatus, a computer-readable storage medium, and an ultrasound imaging device.
Background
Ultrasound images are commonly used in the medical field. While the ultrasound imaging apparatus scans the irradiation object, it displays to the user (generally a medical worker) images at different depth positions on a given section of the irradiation object (hereinafter referred to as ultrasound images).
In practice, a user sometimes needs to zoom in on an ultrasound image to see details. In the magnified image, it may be necessary to adjust the Time Gain Compensation (TGC) parameter so that the display effect better matches the user's needs. To this end, the user is typically allowed to adjust the TGC parameters separately for images within different depth ranges of the ultrasound image.
However, the inventor has found that when TGC parameters need to be adjusted after the ultrasound image is enlarged, the currently displayed image is only a part of the image before enlargement. The user must therefore select, from the multiple TGC parameter adjustment keys, the key corresponding to the depth to which the currently displayed image belongs. Since the user does not know which depth range of the pre-enlargement image the displayed image corresponds to, finding the right key takes many attempts, making the user operation cumbersome.
Disclosure of Invention
The application aims to provide an ultrasonic image processing method, an ultrasonic image processing device, a computer readable storage medium and an ultrasonic imaging device, so as to simplify the operation of a user in adjusting TGC parameters.
To this end, the application provides the following technical solutions:
an ultrasonic image processing method is applied to an ultrasonic imaging device and comprises the following steps:
determining a first depth range to which a currently displayed ultrasonic image belongs;
dividing the currently displayed ultrasonic image into N sections according to the first depth range;
establishing a one-to-one corresponding relation between preset N time gain compensation parameter adjusting keys and N sections of ultrasonic images obtained by division; the N time gain compensation parameter adjustment keys are all time gain compensation parameter adjustment keys provided by the ultrasonic imaging equipment.
The above method, preferably, further comprises:
when an adjusting instruction of a first time gain compensation parameter adjusting key is received, determining a second depth range of the ultrasonic image corresponding to the first time gain compensation parameter adjusting key, and adjusting the ultrasonic image in the second depth range.
The method preferably further includes, before determining the first depth range to which the currently displayed ultrasound image belongs:
monitoring whether an ultrasonic image displayed by the ultrasonic imaging device is enlarged or reduced;
when it is detected that the ultrasound image displayed by the ultrasound imaging device has been zoomed in or out, determining the image displayed after zooming to be the currently displayed ultrasound image.
The method preferably further includes, before determining the first depth range to which the currently displayed ultrasound image belongs:
zooming in or zooming out the region of interest of the ultrasonic image displayed by the ultrasonic imaging device;
correspondingly, the currently displayed ultrasound image is a locally zoomed-in or zoomed-out image, and determining the first depth range to which it belongs specifically includes: determining the first depth range to which the locally zoomed image belongs, where the first depth range is a part of the maximum depth range scanned by the ultrasound imaging device.
An ultrasonic image processing device applied to an ultrasonic imaging device comprises:
the determining module is used for determining a first depth range to which the currently displayed ultrasonic image belongs;
a dividing module, configured to divide the currently displayed ultrasound image into N segments according to the first depth range;
the mapping module is used for establishing a one-to-one correspondence relationship between preset N time gain compensation parameter adjusting keys and the N sections of ultrasonic images obtained by division; the N time gain compensation parameter adjustment keys are all time gain compensation parameter adjustment keys provided by the ultrasonic imaging equipment.
The above apparatus, preferably, further comprises:
and the adjusting module is used for determining a second depth range of the ultrasonic image corresponding to the first time gain compensation parameter adjusting key when an adjusting instruction of the first time gain compensation parameter adjusting key is received, and adjusting the ultrasonic image in the second depth range.
The above apparatus, preferably, further comprises:
the monitoring module is used for monitoring whether the ultrasound image displayed by the ultrasound imaging device is enlarged or reduced, and, when it detects that the image has been zoomed in or out, determining the image displayed after zooming to be the currently displayed ultrasound image.
The above apparatus, preferably, further comprises:
the zooming module is used for zooming in or zooming out the region of interest of the ultrasonic image displayed by the ultrasonic imaging equipment;
correspondingly, the currently displayed ultrasonic image is a local enlarged or reduced image;
the determining module is specifically configured to determine a first depth range to which a local zoom-in or zoom-out map in the ultrasound image belongs, where the first depth range is a part of a maximum depth range obtained by scanning with the ultrasound imaging device.
A computer readable storage medium having stored therein instructions which, when run on an ultrasound imaging apparatus, cause the ultrasound imaging apparatus to execute the ultrasound image processing method described above.
An ultrasound imaging device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor, when executing the computer program, implements the ultrasound image processing method described above.
According to the above scheme, the ultrasound image processing method and apparatus, computer-readable storage medium, and ultrasound imaging device provided by the application divide the currently displayed ultrasound image into N segments according to the depth range to which it belongs, regardless of what that depth range is, and establish a one-to-one correspondence between N preset time gain compensation parameter adjustment keys and the N divided segments. As a result, even when the currently displayed image is a local image, every one of the N keys can adjust it, so the user can roughly determine by visual inspection, rather than by blind trial, which key corresponds to the image at the depth to be adjusted, simplifying the user's operation. In addition, when the currently displayed ultrasound image is a partial image, mapping all the time gain compensation parameter adjustment keys to it is equivalent to subdividing that part again, which improves the adjustment precision.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of an implementation of an ultrasound image processing method according to an embodiment of the present application;
fig. 2 is an exemplary diagram of a mapping relationship between N TGC parameter adjustment keys and an ultrasound image when a currently displayed ultrasound image is an image within a maximum depth range according to an embodiment of the present application;
fig. 3 is an exemplary diagram of a mapping relationship between a TGC parameter adjustment key and an ultrasound image when a currently displayed ultrasound image is an image in a local depth range in the prior art;
fig. 4 is an exemplary diagram of a mapping relationship between TGC parameter adjustment keys and an ultrasound image when a currently displayed ultrasound image is an image in a local depth range according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an ultrasound image processing apparatus according to an embodiment of the present application.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances such that embodiments of the application described herein may be practiced otherwise than as specifically illustrated.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings. It is obvious that the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art without inventive effort based on these embodiments fall within the scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of an implementation of an ultrasound image processing method according to an embodiment of the present application, which may include:
step S11: the depth range to which the currently displayed ultrasound image belongs (denoted as the first depth range for convenience of description) is determined.
The first depth range is at least a part of the maximum depth range scanned when the ultrasound imaging device scans the irradiation object. That is, in this embodiment, the currently displayed ultrasound image may be an image within the maximum depth range of the irradiation object (for example, the abdomen or thorax of a human body), or an image within a partial depth range of it, that is, a local part of the irradiation object. Generally, zooming in on the ultrasound image within the maximum depth range yields an ultrasound image of such a local part.
For example, if the maximum depth from the body surface to the inside of the body is 11cm when the thorax of the human body is scanned, the maximum depth range scanned by the ultrasonic imaging device when the thorax of the human body is scanned is 0-11cm, and the currently displayed ultrasonic image may be an image in the range of 0-11cm, or may be any part of an image in the range of 0-11cm, for example, an image in the range of 3-8 cm.
Step S12: and dividing the currently displayed ultrasonic image into N sections according to the first depth range to obtain N sections of ultrasonic images. N is a positive integer, which is the number of all time gain compensation parameter adjustment keys provided by the ultrasound imaging apparatus, and the value of N may be 6 or 8, or may be a larger value, such as 10.
The depth range of each section of ultrasonic image is a sub-range of the first depth range, and the depth ranges of the N sections of ultrasonic images are combined to obtain the first depth range.
When the currently displayed ultrasound image is divided, the currently displayed ultrasound image may be divided into N segments on average, or divided into N segments non-uniformly.
Specifically, when dividing, there may be two optional implementations:
the first method is as follows:
the currently displayed ultrasonic image is directly and averagely divided into N sections according to a first depth range, and the difference between the maximum depth value and the minimum depth value of each section of ultrasonic image is the same.
The second method comprises the following steps:
determining the region of interest of the user, and a third depth range to which the region of interest belongs, according to a region-of-interest mark placed on the currently displayed ultrasound image by the user; according to the third depth range, evenly dividing the region of interest into M segments of first-type ultrasound images, and dividing the region of the currently displayed ultrasound image outside the region of interest into P segments of second-type ultrasound images, where M < N, P < N, M + P = N, and the difference between the maximum and minimum depth values of a first-type ultrasound image is smaller than that of a second-type ultrasound image. For example, let the maximum depth value of a first-type ultrasound image be d1max and the minimum depth value be d1min, and let the maximum depth value of a second-type ultrasound image be d2max and the minimum depth value be d2min; then (d1max - d1min) < (d2max - d2min).
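Method two can be sketched as follows. The way the P coarse segments are split between the areas above and below the region of interest (proportionally to their lengths) is an illustrative choice; the patent does not fix this detail, and the names are assumptions:

```python
def divide_with_roi(depth_min, depth_max, roi_min, roi_max, m, p):
    """Evenly divide the user's region of interest (ROI) into m fine
    first-type segments and the rest of the displayed depth range into p
    coarser second-type segments, with m + p = N keys in total."""
    outside = (roi_min - depth_min) + (depth_max - roi_max)
    # Allocate the p coarse segments above/below the ROI by relative length.
    p_above = round(p * (roi_min - depth_min) / outside) if outside else 0
    p_below = p - p_above
    segments = []
    for lo, hi, count in ((depth_min, roi_min, p_above),
                          (roi_min, roi_max, m),
                          (roi_max, depth_max, p_below)):
        if count:
            step = (hi - lo) / count
            segments += [(lo + i * step, lo + (i + 1) * step)
                         for i in range(count)]
    return segments

# Depth 0-12cm, ROI 3-6cm, N = 8 keys: 5 fine 0.6cm segments inside the ROI,
# 3 coarse 3cm segments (one above the ROI, two below) covering the rest.
segments = divide_with_roi(0.0, 12.0, 3.0, 6.0, m=5, p=3)
```

Note that each first-type (ROI) segment is narrower than each second-type segment, satisfying the (d1max - d1min) < (d2max - d2min) condition above.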
Step S13: and establishing a one-to-one correspondence relationship between the preset N time gain compensation TGC parameter adjusting keys and the N sections of the ultrasonic images obtained by division. Wherein, the N time gain compensation parameter adjusting keys are all time gain compensation parameter adjusting keys provided by the ultrasonic imaging equipment.
The TGC parameter adjustment keys can be virtual keys or physical keys, for example sliders moving on slide bars. The sliding direction determines the direction of adjustment: if the TGC parameter increases when the slider slides along the slide bar in a first direction, it decreases when the slider slides in a second direction. The sliding distance determines the adjustment amount: generally, the larger the sliding distance of the slider on the slide bar, the larger the adjustment of the TGC parameter, and conversely, the smaller the sliding distance, the smaller the adjustment.
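The slider-to-adjustment mapping described above can be sketched as follows. The linear scale `gain_per_unit` and the direction labels are assumed for illustration; the patent only requires that direction gives the sign and distance the magnitude:

```python
def slider_to_delta(distance, direction, gain_per_unit=0.5):
    """Convert a slider movement on the slide bar into a TGC adjustment
    amount: the sliding direction determines the sign, and the magnitude
    grows with the distance slid (here, linearly)."""
    sign = 1.0 if direction == "first" else -1.0
    return sign * distance * gain_per_unit

delta_up = slider_to_delta(4, "first")     # 2.0: slide 4 units, gain rises
delta_down = slider_to_delta(4, "second")  # -2.0: same distance, gain falls
```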
As shown in fig. 2, fig. 2 is a diagram illustrating an example of the mapping relationship between the N TGC parameter adjustment keys (shown in fig. 2-a) and the ultrasound image (shown in fig. 2-B) when the currently displayed ultrasound image is an image within the maximum depth range. In this example, there are 8 TGC parameter adjustment keys, and the depth of the ultrasound image gradually increases from top to bottom.
Fig. 3 illustrates the mapping between the TGC parameter adjustment keys (fig. 3-A) and the ultrasound image (fig. 3-B) in the prior art, when the currently displayed ultrasound image is an image within a local depth range (the image displayed after the ultrasound image shown in fig. 2 is enlarged). In the prior art, only some of the TGC parameter adjustment keys are mapped to the currently displayed ultrasound image: whichever keys corresponded to the local depth range in the example of fig. 2 (here, the 3rd through 6th keys) correspond to the image in fig. 3, while the remaining keys still correspond to depths of the pre-enlargement image that are no longer displayed.
Fig. 4 illustrates the mapping between the TGC parameter adjustment keys (fig. 4-A) and the ultrasound image (fig. 4-B, identical to fig. 3-B) in the present embodiment, when the currently displayed ultrasound image is an image within a local depth range (the image displayed after the ultrasound image shown in fig. 2 is enlarged). In the present application, all TGC parameter adjustment keys are mapped to the currently displayed ultrasound image: regardless of which keys corresponded to that local depth range in the example of fig. 2, after the image is enlarged every key corresponds to a segment of the image displayed after enlargement.
In the embodiment of the application, regardless of the depth range to which the currently displayed ultrasound image belongs, the image is divided into N segments according to that depth range, and the preset N time gain compensation parameter adjustment keys are put into one-to-one correspondence with the N divided segments. As a result, even when the currently displayed image is a local image, every one of the N TGC parameter adjustment keys can adjust it. The user can therefore roughly determine by visual inspection, rather than by blind trial, which key corresponds to the image at the depth to be adjusted, which simplifies the user's operation.
In addition, when the currently displayed ultrasonic image is a partial image, all the TGC parameter adjustment keys are mapped to the currently displayed ultrasonic image, which means that the partial region of the currently displayed ultrasonic image is subdivided again, so that the adjustment accuracy is improved.
The following explains an implementation idea of the present application by taking an example of averagely dividing a currently displayed ultrasound image into N segments according to a first depth range.
Assuming that N is 8, that is, 8 TGC parameter adjustment keys are preset, and the depth range to which the currently displayed ultrasound image belongs is 0-12cm, the currently displayed ultrasound image is evenly divided into 8 segments whose depth ranges are respectively: [0-1.5cm), [1.5cm-3cm), [3cm-4.5cm), [4.5cm-6cm), [6cm-7.5cm), [7.5cm-9cm), [9cm-10.5cm), and [10.5cm-12cm].
Assuming that the 8 TGC parameter adjustment keys are TGC1, TGC2, TGC3, TGC4, TGC5, TGC6, TGC7, and TGC8, respectively, the TGC parameter adjustment keys and the divided 8-segment ultrasound images can be associated as follows:
TGC1 corresponds to ultrasound images with a depth range of [0-1.5cm), TGC2 to [1.5cm-3cm), TGC3 to [3cm-4.5cm), TGC4 to [4.5cm-6cm), TGC5 to [6cm-7.5cm), TGC6 to [7.5cm-9cm), TGC7 to [9cm-10.5cm), and TGC8 to [10.5cm-12cm].
Assuming that the user performs an enlarging operation on the currently displayed ultrasound image as required, and the depth range to which the enlarged image belongs is 7cm-11cm, the ultrasound image in the depth range 7cm-11cm is evenly divided into 8 segments whose depth ranges are respectively: [7cm-7.5cm), [7.5cm-8cm), [8cm-8.5cm), [8.5cm-9cm), [9cm-9.5cm), [9.5cm-10cm), [10cm-10.5cm), and [10.5cm-11cm]. The following correspondence can then be established between the 8 TGC parameter adjustment keys and the 8 divided ultrasound images:
TGC1 corresponds to ultrasound images with a depth range of [7cm-7.5cm), TGC2 to [7.5cm-8cm), TGC3 to [8cm-8.5cm), TGC4 to [8.5cm-9cm), TGC5 to [9cm-9.5cm), TGC6 to [9.5cm-10cm), TGC7 to [10cm-10.5cm), and TGC8 to [10.5cm-11cm].
Assuming that the user performs a zoom-out operation on the currently displayed ultrasound image as required, and the depth range to which the zoomed-out image belongs is 6cm-12cm, the ultrasound image in the depth range 6cm-12cm is evenly divided into 8 segments whose depth ranges are respectively: [6cm-6.75cm), [6.75cm-7.5cm), [7.5cm-8.25cm), [8.25cm-9cm), [9cm-9.75cm), [9.75cm-10.5cm), [10.5cm-11.25cm), and [11.25cm-12cm]. The following correspondence can then be established between the 8 TGC parameter adjustment keys and the 8 divided ultrasound images:
TGC1 corresponds to an ultrasound image with a depth range of [6cm-6.75cm), TGC2 corresponds to an ultrasound image with a depth range of [6.75cm-7.5cm), TGC3 corresponds to an ultrasound image with a depth range of [7.5cm-8.25cm), TGC4 corresponds to an ultrasound image with a depth range of [8.25cm-9cm), TGC5 corresponds to an ultrasound image with a depth range of [9cm-9.75cm), TGC6 corresponds to an ultrasound image with a depth range of [9.75cm-10.5cm), TGC7 corresponds to an ultrasound image with a depth range of [10.5cm-11.25cm), and TGC8 corresponds to an ultrasound image with a depth range of [11.25cm-12cm ].
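The three correspondence tables above (full range, zoomed in, zoomed out) can be reproduced with a short sketch. The names `KEYS` and `map_keys` are illustrative, not from the patent:

```python
KEYS = [f"TGC{i}" for i in range(1, 9)]  # all 8 adjustment keys on the device

def map_keys(depth_min, depth_max):
    """Establish the one-to-one correspondence between all 8 keys and the
    8 evenly divided segments of the currently displayed depth range."""
    step = (depth_max - depth_min) / len(KEYS)
    return {key: (depth_min + i * step, depth_min + (i + 1) * step)
            for i, key in enumerate(KEYS)}

full     = map_keys(0.0, 12.0)  # before zooming: TGC1 -> [0, 1.5cm), ...
zoom_in  = map_keys(7.0, 11.0)  # after zooming in: TGC1 -> [7cm, 7.5cm), ...
zoom_out = map_keys(6.0, 12.0)  # after zooming out: TGC1 -> [6cm, 6.75cm), ...
```

Because the same N keys are always remapped to whatever is currently displayed, zooming in automatically narrows each key's depth range, which is the source of the improved adjustment precision described above.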
In an optional embodiment, after the one-to-one correspondence is established between the preset N time gain compensation parameter adjustment keys and the N divided ultrasound images, if an adjustment instruction from one of the keys (for convenience of description, recorded as the first time gain compensation parameter adjustment key) is received, the second depth range of the ultrasound image corresponding to that key is determined, and the ultrasound image within the second depth range is adjusted.
Wherein, the first time gain compensation parameter adjustment key is any one of the N time gain compensation parameter adjustment keys.
After the user operates the first time gain compensation parameter adjustment key, an adjustment instruction is generated, wherein the adjustment instruction carries the identification of the first time gain compensation parameter adjustment key and the adjustment amount of the time gain compensation parameter generated by the user operating the first time gain compensation parameter adjustment key.
The ultrasound image belonging to the second depth range in the currently displayed ultrasound image is then adjusted according to the adjustment amount of the time gain compensation parameter carried in the adjustment instruction, changing the display effect of that part of the image.
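The handling of an adjustment instruction can be sketched as follows. The dict-based `tgc_curve` (depth in cm mapped to gain) and the instruction fields are illustrative stand-ins for the device's real TGC data:

```python
def apply_adjustment(instruction, key_to_range, tgc_curve):
    """Determine the second depth range mapped to the operated key and shift
    the gain of every TGC sample whose depth falls inside that range by the
    adjustment amount carried in the instruction."""
    lo, hi = key_to_range[instruction["key"]]
    return {depth: gain + (instruction["delta"] if lo <= depth < hi else 0.0)
            for depth, gain in tgc_curve.items()}

curve = {7.2: 0.0, 8.0: 0.0, 9.1: 0.0}   # gains at three sample depths (cm)
ranges = {"TGC1": (7.0, 7.5)}             # key -> second depth range
curve = apply_adjustment({"key": "TGC1", "delta": 2.0}, ranges, curve)
```

Only the sample at 7.2cm (inside TGC1's second depth range) is shifted; the samples at 8.0cm and 9.1cm are untouched.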
How to adjust the image belonging to the second depth range in the currently displayed ultrasound image can follow existing implementations; since it is not the focus of the present application, it is not described in detail here.
In an optional embodiment, before determining the first depth range to which the currently displayed ultrasound image belongs, the method may further include:
acquiring a mapping instruction;
in response to the mapping instructions, a first depth range to which the currently displayed ultrasound image belongs is determined.
In the embodiment of the present application, after the ultrasound image is zoomed in or zoomed out, the user may manually trigger the ultrasound image processing method provided by the present application to establish a mapping relationship between preset N time gain compensation parameter adjustment keys and N segments of ultrasound images in the currently displayed ultrasound image.
Manual triggering requires additional user operations. To further simplify the user's operation, the application provides another implementation:
in an optional embodiment, before determining the first depth range to which the currently displayed ultrasound image belongs, the method may further include:
whether the ultrasound image displayed by the ultrasound imaging device is enlarged or reduced is monitored. This covers two situations: the whole ultrasound image is enlarged or reduced, or only a part of it is.
When it is detected that the displayed ultrasound image has been enlarged or reduced, the image displayed after the zoom operation is determined to be the currently displayed ultrasound image.
That is, in the embodiment of the present application, whether the user performs a zoom operation on the ultrasound image displayed by the ultrasound imaging device is monitored in real time. When the user enlarges or reduces the displayed ultrasound image, the image displayed after zooming (that is, a partial image within the maximum depth range scanned by the ultrasound imaging device) is determined as the currently displayed ultrasound image. In this way, the mapping relationship between the N preset time gain compensation parameter adjustment keys and the N currently displayed segments of the ultrasound image can be established in real time.
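The real-time re-mapping described above can be sketched as follows. Equal-width segmentation and the function name are assumptions for illustration; the patent only requires that the displayed image be divided into N segments:

```python
def build_tgc_key_mapping(displayed_depth_range, n_keys):
    """Divide the depth range [start, end) of the currently displayed
    ultrasound image into `n_keys` equal segments and map each of the
    device's TGC parameter adjustment keys (indexed 0..n_keys-1) to one
    segment. Re-running this after every zoom keeps the mapping current."""
    start, end = displayed_depth_range
    step = (end - start) / n_keys
    return {k: (start + k * step, start + (k + 1) * step)
            for k in range(n_keys)}
```

When an adjustment instruction later arrives for key k, the second depth range that key controls is simply the k-th entry of this mapping.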
In an optional embodiment, before determining the first depth range to which the currently displayed ultrasound image belongs, the method may further include:
the region of interest of the ultrasound image displayed by the ultrasound imaging device is enlarged or reduced, where the region of interest may be marked manually by the user;
and the image displayed after being enlarged or reduced is determined as the currently displayed ultrasound image. That is, the currently displayed ultrasound image is a locally enlarged or reduced image. After the region of interest is zoomed, the ultrasound imaging device may display either the whole region of interest or only a part of it, so the locally enlarged or reduced image may cover the whole region of interest or only a part of it.
In this case, determining the first depth range to which the currently displayed ultrasound image belongs may specifically include: determining the first depth range to which the locally enlarged or reduced image belongs, where the first depth range is a part of the maximum depth range scanned by the ultrasound imaging device.
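Under the assumption that the zoomed view's vertical bounds are known as fractions of the full-depth image, the first depth range of the locally enlarged or reduced image can be recovered as a proportional slice of the maximum scanned depth range (names and parameterization are illustrative, not taken from the patent):

```python
def roi_first_depth_range(max_depth_range, top_frac, bottom_frac):
    """Return the first depth range occupied by a zoomed region of
    interest, given its vertical bounds `top_frac` / `bottom_frac` as
    fractions (0..1) of the image covering the maximum depth range
    scanned by the device."""
    start, end = max_depth_range
    span = end - start
    return (start + top_frac * span, start + bottom_frac * span)
```

The returned range is, by construction, a part of the maximum depth range, as the embodiment requires.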
Corresponding to the method embodiment, the present application further provides an ultrasound image processing apparatus. A schematic structural diagram of the ultrasound image processing apparatus provided in the present application is shown in fig. 5; the apparatus may include:
a determination module 51, a partitioning module 52 and a mapping module 53, wherein:
the determining module 51 is configured to determine a first depth range to which the currently displayed ultrasound image belongs;
the dividing module 52 is configured to divide the currently displayed ultrasound image into N segments according to a first depth range;
the mapping module 53 is configured to establish a one-to-one correspondence relationship between preset N time gain compensation parameter adjustment keys and the N divided ultrasound images; the N time gain compensation parameter adjusting keys are all time gain compensation parameter adjusting keys provided by the ultrasonic imaging equipment.
The ultrasound image processing apparatus provided by the present application divides the currently displayed ultrasound image into N segments according to the depth range to which it belongs, no matter what that depth range is, and establishes a one-to-one correspondence between the N preset time gain compensation parameter adjustment keys and the N divided segments. Therefore, even when the currently displayed ultrasound image is a partial image, every one of the N TGC parameter adjustment keys can adjust it, and the user can roughly determine by visual inspection which TGC parameter adjustment key corresponds to the depth to be adjusted, rather than attempting blindly, which simplifies user operation.
In addition, when the currently displayed ultrasound image is a partial image, mapping all of the TGC parameter adjustment keys onto it means that this partial region is subdivided more finely, which improves adjustment accuracy.
In an optional embodiment, the ultrasound image processing apparatus provided in the present application may further include:
and the adjusting module is used for determining a second depth range of the ultrasonic image corresponding to the first time gain compensation parameter adjusting key when an adjusting instruction of the first time gain compensation parameter adjusting key is received, and adjusting the ultrasonic image in the second depth range.
In an optional embodiment, the ultrasound image processing apparatus provided in the present application may further include:
the monitoring module is configured to monitor whether the ultrasound image displayed by the ultrasound imaging device is enlarged or reduced, and, when it is detected that the displayed ultrasound image has been enlarged or reduced, to determine the image displayed after zooming as the currently displayed ultrasound image.
In an optional embodiment, the ultrasound image processing apparatus provided in the present application may further include:
the zooming module is used for zooming in or zooming out the region of interest of the ultrasonic image displayed by the ultrasonic imaging equipment;
accordingly, the currently displayed ultrasound image is a partially enlarged or reduced image.
The determining module 51 may be specifically configured to determine the first depth range to which the locally enlarged or reduced image belongs, where the first depth range is a part of the maximum depth range scanned by the ultrasound imaging device.
In addition, a computer-readable storage medium is provided, in which instructions are stored; when the instructions are run on an ultrasound imaging device, they cause the ultrasound imaging device to execute the ultrasound image processing method disclosed in the embodiments of the present application.
An embodiment of the present application further provides an ultrasound imaging apparatus, including: a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor, when executing the computer program, implements the ultrasound image processing method disclosed in the embodiments of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
It should be understood that the features of the embodiments and of the claims may be combined with one another to solve the technical problems described above.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (6)

1. An ultrasound image processing method, applied to an ultrasound imaging device, characterized by comprising:
determining a first depth range to which a currently displayed ultrasonic image belongs;
dividing the currently displayed ultrasonic image into N sections according to the first depth range;
establishing a one-to-one corresponding relation between preset N time gain compensation parameter adjusting keys and N sections of ultrasonic images obtained by division; the N time gain compensation parameter adjusting keys are all time gain compensation parameter adjusting keys provided by the ultrasonic imaging equipment;
before determining the first depth range to which the currently displayed ultrasound image belongs, the method further comprises the following steps:
monitoring in real time whether the ultrasound image displayed by the ultrasound imaging device is enlarged or reduced, and when it is detected that the displayed ultrasound image has been enlarged or reduced, determining the image displayed after being enlarged or reduced as the currently displayed ultrasound image;
or, monitoring in real time the enlargement or reduction of a region of interest of the ultrasound image displayed by the ultrasound imaging device, in which case the currently displayed ultrasound image is a locally enlarged or reduced image, and the determining a first depth range to which the currently displayed ultrasound image belongs specifically comprises: determining a first depth range to which the locally enlarged or reduced image belongs, wherein the first depth range is a part of the maximum depth range scanned by the ultrasound imaging device.
2. The method of claim 1, further comprising:
when an adjusting instruction of a first time gain compensation parameter adjusting key is received, determining a second depth range of the ultrasonic image corresponding to the first time gain compensation parameter adjusting key, and adjusting the ultrasonic image in the second depth range.
3. An ultrasound image processing device applied to an ultrasound imaging apparatus, comprising:
the determining module is used for determining a first depth range to which the currently displayed ultrasonic image belongs;
a dividing module, configured to divide the currently displayed ultrasound image into N segments according to the first depth range;
the mapping module is used for establishing a one-to-one correspondence relationship between preset N time gain compensation parameter adjusting keys and the N sections of ultrasonic images obtained by division; the N time gain compensation parameter adjusting keys are all time gain compensation parameter adjusting keys provided by the ultrasonic imaging equipment;
wherein, this ultrasonic image processing apparatus still includes:
a monitoring module, configured to monitor in real time whether the ultrasound image displayed by the ultrasound imaging device is enlarged or reduced, and when it is detected that the displayed ultrasound image has been enlarged or reduced, to determine the image displayed after being enlarged or reduced as the currently displayed ultrasound image;
or a zooming module, configured to enlarge or reduce a region of interest of the ultrasound image displayed by the ultrasound imaging device, in which case the currently displayed ultrasound image is a locally enlarged or reduced image, and the determining module is specifically configured to determine a first depth range to which the locally enlarged or reduced image belongs, wherein the first depth range is a part of the maximum depth range scanned by the ultrasound imaging device.
4. The apparatus of claim 3, further comprising:
and the adjusting module is used for determining a second depth range of the ultrasonic image corresponding to the first time gain compensation parameter adjusting key when an adjusting instruction of the first time gain compensation parameter adjusting key is received, and adjusting the ultrasonic image in the second depth range.
5. A computer-readable storage medium having stored therein instructions that, when executed on an ultrasound imaging device, cause the ultrasound imaging device to perform the ultrasound image processing method of any of claims 1-2.
6. An ultrasound imaging apparatus, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the ultrasound image processing method of any one of claims 1-2.
CN201810469754.XA 2018-05-16 2018-05-16 Ultrasonic image processing method and device, storage medium and ultrasonic imaging equipment Active CN108652663B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810469754.XA CN108652663B (en) 2018-05-16 2018-05-16 Ultrasonic image processing method and device, storage medium and ultrasonic imaging equipment

Publications (2)

Publication Number Publication Date
CN108652663A CN108652663A (en) 2018-10-16
CN108652663B true CN108652663B (en) 2022-04-01

Family

ID=63779849

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810469754.XA Active CN108652663B (en) 2018-05-16 2018-05-16 Ultrasonic image processing method and device, storage medium and ultrasonic imaging equipment

Country Status (1)

Country Link
CN (1) CN108652663B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110327076B (en) * 2019-07-05 2022-08-16 深圳开立生物医疗科技股份有限公司 Blood flow gain adjusting method, device, equipment and readable storage medium
CN110322413A (en) * 2019-07-05 2019-10-11 深圳开立生物医疗科技股份有限公司 Gain adjustment method, apparatus, device and storage medium for ultrasonic blood flow images
CN113126871A (en) * 2019-12-30 2021-07-16 无锡祥生医疗科技股份有限公司 Gain adjustment method, system and storage medium
CN111110272B (en) * 2019-12-31 2022-12-23 深圳开立生物医疗科技股份有限公司 Ultrasonic image measurement information display method, device and equipment and readable storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3742437B2 (en) * 1994-03-15 2006-02-01 フクダ電子株式会社 Ultrasonic diagnostic equipment
CN103181778B (en) * 2011-12-31 2016-05-25 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic imaging method and system
US9024902B2 (en) * 2012-03-26 2015-05-05 General Electric Company Ultrasound device and method thereof
KR101630761B1 (en) * 2012-09-24 2016-06-15 삼성전자주식회사 Ultrasound apparatus and method for providing information using the ultrasound apparatus
KR102245193B1 (en) * 2014-03-13 2021-04-28 삼성메디슨 주식회사 Medical diagnostic apparatus and operating method thereof
KR102426784B1 (en) * 2015-05-29 2022-07-29 삼성전자주식회사 Ultrasound apparatus and method for displaying ultrasoudn images



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant