CN108309347B - Parameter measurement method based on ultrasound image and ultrasound imaging system


Info

Publication number: CN108309347B (grant); CN108309347A (application, Chinese)
Application number: CN201710032665.4A
Authority: CN (China)
Inventor: 温博
Assignee (original and current): Shenzhen Mindray Bio Medical Electronics Co Ltd
Legal status: Active
Related application: CN202110300246.0A, published as CN113367722A

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215: involving processing of medical diagnostic data
    • A61B 8/5223: for extracting a diagnostic or physiological parameter from medical diagnostic data


Abstract

The invention provides a parameter measurement method based on an ultrasound image and an ultrasound imaging system, which can speed up the marking of a target position in the ultrasound image and improve accuracy. The method comprises: superimposing and displaying, on the ultrasound image, a first straight line and a second straight line that are perpendicular to each other, the first and second straight lines intersecting to form an intersection point; receiving a first instruction generated when a first operation is input using a human-computer interaction device; and updating the display of the first straight line and/or the second straight line according to the first instruction, so that the display of the first straight line and/or the second straight line is linked with the input of the first operation, the second straight line and the first straight line remaining perpendicular throughout the updating process.

Description

Parameter measurement method based on ultrasound image and ultrasound imaging system
Technical Field
The present invention relates to ultrasound imaging apparatus, and more particularly to a method and system for performing parameter measurement on an ultrasound image.
Background
Female pelvic floor dysfunction (FPFD) comprises a group of chronic conditions affecting women's daily life, including urinary incontinence, fecal incontinence, recurrent urinary tract infection, and pelvic organ prolapse. Being repeatable, inexpensive, simple, and free of contraindications, ultrasound examination is gradually replacing MRI as the means of evaluating pelvic floor diseases. In recent years pelvic floor ultrasound has developed and matured, becoming the preferred imaging examination for FPFD patients.
In ultrasonic pelvic floor examination, it is often necessary to assess the position and mobility of the patient's pelvic floor organs under various typical motion states. The pubic symphysis is a non-synostotic cartilaginous structure that joins the two lateral pubic bones; its position changes very little at rest, during the Valsalva maneuver, or during anal contraction. The pubic symphysis is therefore an important reference marker for pelvic floor ultrasound, and regardless of the probe used, it is common practice in the field to take a reference line associated with the pubic symphysis as the basis for distance measurement.
At present, when doctors measure the position or descent distance of a pelvic floor organ, an ordinary distance caliper is generally used: a horizontal line is drawn through the postero-inferior margin of the pubic symphysis from two points input by the user, and the vertical distance from a target structure input by the user on the ultrasound image (e.g., the bladder neck or the lowest point of the uterus) to that horizontal line is then measured, relying solely on the visual horizontal and vertical relationship of the lines. This operation is inconvenient, and the measurement accuracy has no reference basis. Given doctors' new requirements for parameter measurement in actual practice, a more convenient measurement mode needs to be provided.
Disclosure of Invention
Based on this, it is necessary to provide an ultrasound-image-based parameter measurement method and an ultrasound imaging system that address the inconvenience of operation in the prior art.
In one embodiment, a method for measuring parameters based on an ultrasound image is provided, which includes:
acquiring an ultrasound image, wherein the ultrasound image comprises a target tissue and is obtained by receiving, with an ultrasound probe, an ultrasound signal from the target tissue;
displaying the ultrasound image;
superimposing and displaying, on the ultrasound image, a first straight line and a second straight line perpendicular to each other, wherein the first straight line and the second straight line intersect to form an intersection point;
receiving a first instruction generated when a first operation is input using a human-computer interaction device; and
updating the display of the first straight line and/or the second straight line according to the first instruction, so that the display of the first straight line and/or the second straight line is linked with the input of the first operation, the second straight line and the first straight line remaining perpendicular throughout the updating process.
In one embodiment, a method for measuring parameters based on an ultrasound image is provided, which includes:
acquiring an ultrasound image, wherein the ultrasound image comprises a target tissue and is obtained by receiving, with an ultrasound probe, an ultrasound signal from the target tissue;
displaying the ultrasound image;
superimposing and displaying a third straight line on the ultrasound image;
superimposing and displaying a movable cursor on the third straight line;
receiving a ninth instruction generated when a ninth operation is input using a human-computer interaction device;
linking the display of the movable cursor with the ninth operation according to the ninth instruction;
receiving at least one tenth instruction generated when at least one tenth operation is input using the human-computer interaction device;
extracting a corresponding interface operation position according to the tenth instruction; and
recording the at least one interface operation position associated with the tenth instruction.
In one embodiment, an ultrasound imaging system is provided, comprising:
a probe;
a transmitting circuit and a receiving circuit, configured to excite the probe to transmit ultrasound beams to a target tissue, receive echoes of the ultrasound beams, and obtain ultrasound echo signals;
an image processing module, configured to obtain an ultrasound image from the ultrasound echo signals; and
a display, configured to display the ultrasound image;
wherein the image processing module is further configured to superimpose and display, on the ultrasound image, a first straight line and a second straight line perpendicular to each other, the first straight line and the second straight line intersecting to form an intersection point; receive a first instruction generated when a first operation is input using a human-computer interaction device; and update the display of the first straight line and/or the second straight line according to the first instruction, so that the display of the first straight line and/or the second straight line is linked with the input of the first operation, the second straight line and the first straight line remaining perpendicular throughout the updating process.
In one embodiment, an ultrasound imaging system is provided, comprising:
a probe;
a transmitting circuit and a receiving circuit, configured to excite the probe to transmit ultrasound beams to a target tissue, receive echoes of the ultrasound beams, and obtain ultrasound echo signals;
an image processing module, configured to obtain an ultrasound image from the ultrasound echo signals; and
a display, configured to display the ultrasound image;
wherein the image processing module is further configured to:
superimpose and display a third straight line on the ultrasound image;
superimpose and display a movable cursor on the third straight line;
receive a ninth instruction generated when a ninth operation is input using a human-computer interaction device;
link the display of the movable cursor with the ninth operation according to the ninth instruction;
receive at least one tenth instruction generated when at least one tenth operation is input using the human-computer interaction device;
extract a corresponding interface operation position according to the tenth instruction; and
record the at least one interface operation position associated with the tenth instruction.
Based on the above embodiments, the invention provides a relatively simple manual measurement operation mode that can be used to mark the measurement items of pelvic floor ultrasound.
Drawings
FIG. 1 is a system architecture diagram of an ultrasound imaging system according to some embodiments;
FIG. 2 is a schematic flow chart diagram of an embodiment;
FIG. 3 is a schematic view of a graphical interface in a single-window display;
FIG. 4 is a schematic view of an image interface in a dual-window display;
FIG. 5 is a schematic diagram of one embodiment of an image overlay display;
FIG. 6 is a schematic view of translating a first line and a second line in one embodiment;
FIG. 7 is a schematic view of rotating a first line in one embodiment;
FIGS. 8 and 9 are schematic views of measurement indicia in various embodiments;
FIG. 10 is an operational diagram illustrating the continuous performance of a combined measurement of multiple target locations, in accordance with one embodiment;
FIG. 11 is a schematic flow chart diagram of another embodiment;
FIG. 12 is a diagram illustrating translating a fourth line in one embodiment;
FIG. 13 is a schematic view of rotating a fourth line in one embodiment;
FIG. 14 is an operational schematic diagram illustrating the continuous performance of a combined measurement of multiple target locations, in one embodiment.
Detailed Description
The present invention will be described in further detail below with reference to the detailed description and the accompanying drawings, wherein like elements in different embodiments are given like reference numerals. In the following description, numerous details are set forth to provide a better understanding of the present application. However, those skilled in the art will readily recognize that some of these features may be omitted, or replaced by other elements, materials, or methods, in different instances. In some instances, certain operations related to the present application are not shown or described in detail in order to avoid obscuring the core of the application with excessive description; for those skilled in the art, such operations need not be described at length, as they can be fully understood from the description in the specification and the general knowledge in the art. Furthermore, the features, operations, or characteristics described in the specification may be combined in any suitable manner to form various embodiments. The steps or actions in the method descriptions may also be interchanged or reordered in a manner apparent to those of ordinary skill in the art; the various sequences in the specification and drawings are therefore only for describing particular embodiments and do not imply a required order unless it is otherwise stated that a certain order must be followed. Ordinal labels such as "first" and "second" are used herein only to distinguish the described objects and do not carry any sequential or technical meaning.
The ultrasound imaging system shown in FIG. 1 comprises: a probe 101, a transmitting circuit 101, a transmit/receive selection switch 102, a receiving circuit 104, a beamforming module 105, a signal processing module 116, and an image processing module 126. In the ultrasound imaging process, the transmitting circuit 101 sends delay-focused transmit pulses of a certain amplitude and polarity to the probe 101 through the transmit/receive selection switch 102. Excited by the transmit pulses, the probe transmits ultrasound waves to a target tissue (for example, an organ, tissue, or blood vessel in a human or animal body), receives, after a certain delay, ultrasound echoes carrying information about the target tissue reflected from the target region, and converts the ultrasound echoes back into electric signals. The receiving circuit 104 receives the electric signals generated by the probe, obtains ultrasound echo signals, and sends them to the beamforming module 105. The beamforming module 105 performs focus delay, weighting, channel summation, and other processing on the ultrasound echo signals, and then sends them to the signal processing module 116 for related signal processing. The ultrasound echo signals processed by the signal processing module 116 are sent to the image processing module 126. The image processing module 126 processes the signals differently according to the imaging mode required by the user to obtain ultrasound image data of different modes, and then performs logarithmic compression, dynamic range adjustment, digital scan conversion, and the like to form ultrasound images of different modes, such as a B image, a C image, a D image, a Doppler blood flow image, an elasticity image containing the elastic properties of tissue, or other types of two-dimensional or three-dimensional ultrasound images. The elasticity image may be obtained by transmitting ultrasound waves to detect the characteristics of shear waves inside the target tissue, or by transmitting ultrasound waves to detect the deformation of the target tissue under an external force; the shear waves may be produced by external vibration or generated by transmitting ultrasound waves into the target tissue.
In some embodiments of the present invention, the signal processing module 116 and the image processing module 126 may be integrated on one main board 106, or either or both of them may be integrated in one or more processor/controller chips.
The ultrasound imaging system further includes an external input/output port 108 disposed on the main board 106. The ultrasound imaging system can be connected with a human-computer interaction device through the external input/output port 108 and receives, through this port, instruction signals input via the human-computer interaction device, including control instructions for the ultrasound transmit/receive timing, operation input instructions for editing and annotating the ultrasound image, output instructions for reminding the user, and other instruction types. Generally, the operation instructions obtained when the user performs input operations such as editing, annotating, and measuring on the ultrasound image are used for measurement on the target tissue. The human-computer interaction device may include one or more of a keyboard, a mouse, a scroll wheel, a trackball, a mobile input device (a mobile device with a touch screen, a mobile phone, etc.), a multifunctional knob, and the like; the corresponding external input/output port 108 may accordingly be a wireless communication module, a wired communication module, or a combination of the two. The external input/output port 108 may also be implemented based on USB, bus protocols such as CAN, and/or wired network protocols.
In addition, the ultrasound imaging system may further comprise a display 107 for displaying ultrasound image data from the image processing module. The display 107 may be a touch screen display. The ultrasound imaging system can also be connected with another display through the external input/output port to form a dual-screen display system. The display in this embodiment may be a single display or a plurality of displays; the number of displays is not limited. The ultrasound image data (ultrasound images) may be displayed on one display, simultaneously on several displays, or synchronously in parts on several displays; this embodiment is not limited in this respect. While displaying the ultrasound images, the display also provides the user with a graphical interface for human-computer interaction. One or more controlled objects are arranged on the graphical interface, and the user is provided with the human-computer interaction device to input operation instructions that control these objects, thereby executing the corresponding control operations. For example, icons displayed on the graphical interface can be operated with the human-computer interaction device to execute specific functions. By tracking the output of the human-computer interaction device, the interface operation position corresponding to the graphical interface can be obtained, and specific functions can be performed based on it, such as updating the display of the controlled object being positioned. The interface operation position referred to herein is the position on the display interface corresponding to the user's operation input on an interface object via the human-computer interaction device. The "position" referred to herein includes orientation information, coordinate information, and/or angle information; for example, the position of a line can be characterized by the coordinates of all pixel points on the line, or by angle information relative to a reference line. A position change may accordingly be a change in coordinate information or a change in angle information. Interface objects include the cursor, the first straight line, the second straight line, measurement marks, and other controlled interface objects. There are many ways to locate the interface operation position based on the human-computer interaction device, and different tracking algorithms are used for different devices; for example, with a trackball, the interface operation position is located by tracking the direction and speed of the rolling input.
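As a concrete illustration of the last point, the following minimal Python sketch shows one way trackball deltas might be mapped to an interface operation position; the names (`InterfacePosition`, `track_position`) and the speed-gain model are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class InterfacePosition:
    """Pixel coordinates of the interface operation position on the display."""
    x: float
    y: float

def track_position(pos: InterfacePosition, dx: float, dy: float,
                   speed_gain: float = 1.0,
                   bounds: tuple[int, int] = (1024, 768)) -> InterfacePosition:
    """Update the interface operation position from one trackball report.

    dx, dy     - raw trackball deltas (direction and magnitude of the roll)
    speed_gain - scales deltas so faster rolls move the position further
    bounds     - display size; the position is clamped to stay on screen
    """
    x = min(max(pos.x + dx * speed_gain, 0), bounds[0] - 1)
    y = min(max(pos.y + dy * speed_gain, 0), bounds[1] - 1)
    return InterfacePosition(x, y)

# Usage: one roll of the trackball moves the tracked position
pos = track_position(InterfacePosition(100, 100), dx=12, dy=-4, speed_gain=1.5)
```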
Referring to FIG. 2, a flow chart of a method for parameter measurement in an ultrasound image is provided. The implementation of the parameter measurement method of this embodiment is described in detail below in conjunction with FIG. 1.
In step S210, the image processing module 126 in the ultrasound imaging system acquires an ultrasound image, where the ultrasound image includes the target tissue.
As shown in FIG. 1, the ultrasound image may be obtained by receiving an ultrasound signal from the target tissue with the ultrasound probe 101. The ultrasound signal in this embodiment is not limited to the ultrasound echo signal explained above with reference to FIG. 1; it may also be an ultrasound signal generated in the target tissue by, for example, photoacoustic imaging. Further, the target tissue herein includes, but is not limited to, pelvic floor tissue, i.e., one or more anatomical tissue structures within the female pelvic cavity, such as the uterus, labia, perineum, pelvic bones, pubic symphysis, and the like. Ultrasound images containing pelvic floor tissue include, but are not limited to, anterior pelvic ultrasound images and posterior pelvic ultrasound images, and may also include middle pelvic ultrasound images. Commonly used measurement items of pelvic floor ultrasound are divided into anterior pelvic and posterior pelvic parts. Parameter measurement based on anterior and middle pelvic ultrasound images is mainly completed on a mid-sagittal plane image of the human body obtained with a translabial or transperineal probe. Parameter measurement based on posterior pelvic ultrasound images may use an intra-anal probe to acquire images or, more conveniently, a transperineal or transvaginal probe to acquire static three-dimensional or four-dimensional images, from which an appropriate section is selected on the axial plane for the relevant measurement. For parameter measurement based on the posterior pelvic ultrasound image, the description of this embodiment assumes the second case above, that is, a posterior pelvic ultrasound image acquired with a transperineal or transvaginal probe.
In step S220, the image processing module 126 in the ultrasound imaging system outputs the ultrasound image to a display for display.
See the description of the display 107 above. The manner of displaying the ultrasound image in this embodiment is not limited; for example, the ultrasound image may be displayed on several displays simultaneously, on only one display, or synchronously in parts on several displays to enlarge the viewing area of the ultrasound image. Further, in one embodiment, the image processing module 126 may transmit the ultrasound image to the display wirelessly or over a wired connection, and the display may be a touch screen on a mobile device. Further, in one embodiment, the ultrasound image is displayed on a first layer, while a separate software interface layer displays non-image data such as annotations, labels, text, cursors, and measurement scales. This software interface layer is referred to as the second layer, and the region of the second layer overlapping the first layer is set to be transparent, so that the ultrasound image is not blocked, visibility is enhanced, and the interface remains friendly. Further, the entire second layer may be set to be transparent.
In addition, in one embodiment, as shown in FIG. 3, an area of the display interface 300 for displaying an ultrasound image may be set up independently as an image display window 301, in which a video image or a single frame image is displayed; outside the image display window, the interface may also display non-image data such as annotations, labels, and text. Here, 302 is the ultrasound image of the target tissue after image processing. Of course, there may be one or more image display windows on the same display interface to achieve a same-screen multi-window display for comparing and browsing images. As shown in FIG. 4, two image display windows 401 and 402 are displayed on the display interface 400 in a dual-window manner, where the ultrasound images of the target tissue after image processing are 411 and 412, respectively. In some embodiments, an image display window is defined by a range of image pixel positions; for example, if the current display interface is 1478 × 2897 pixels, a 200 × 400 region set at a corresponding position is one image display window, and an image display window of the same size can be provided side by side, with one window displaying a single still frame and the other displaying a moving video. A video image comprises multiple frames of ultrasound images.
In step S230, the image processing module 126 superimposes and displays, on the ultrasound image shown on the display 107, a first straight line and a second straight line perpendicular to each other, the first straight line and the second straight line intersecting to form an intersection point. For example, by default, the first line is positioned horizontally and the second line vertically.
For example, in FIG. 3, a first straight line 303 is superimposed on the ultrasound image 302 in the image display window 301 and is initially disposed horizontally; a second straight line 304 is superimposed so as to intersect the first straight line perpendicularly, the second straight line 304 intersecting the first straight line 303 to form an intersection point 305. In addition, the first straight line 303 and the second straight line 304 can be displayed in visually distinct ways, so that they can still be told apart after an adjustment operation is performed on the first straight line. In FIG. 3 the first and second straight lines are displayed with different line types, but they may also be distinguished by color. The line type here may be one of a solid line, a dashed line, a dash-dot line, and the like. In one embodiment, in FIG. 3, the first straight line 303 and the second straight line 304 cross the image display window 301 of the ultrasound image, extending beyond the boundary 311 of the image display window 301 by a preset distance; this oversized marking provides a better interaction experience and lets the user position a point on the ultrasound image more accurately.
To present the first line and the second line effectively as a measurement scale and improve the interaction experience, the first line and the second line float over the displayed ultrasound image without blocking it. In some examples, as shown in FIG. 5, the first line 512 and the second line 513 are disposed on the second layer 502, the ultrasound image 511 is displayed on the first layer 501 below the second layer 502, and the second layer 502 is a transparent layer. When an adjustment operation is performed on the first straight line 512, the attributes of the first straight line on the second layer 502 are changed, where the attributes include parameters such as size, line type, color, and visibility; adjusting the attributes changes the display result of the first or second straight line, including its display position and display form. The first and second layers are superimposed to form the rightmost display of FIG. 5, i.e., the first line 512 and the second line 513 superimposed on the ultrasound image 511. Further, the ultrasound image 511 is displayed in the image display window 503. Of course, the area in which the first straight line 512 and the second straight line 513 can be displayed and operated on the second layer 502 may be equal to or larger than the area of the single image display window 503 on the first layer 501. This area may be, for example, a rectangle or square whose length and width are the lengths of the first straight line 512 and the second straight line 513, or a circle whose diameter is the longer of the two lines.
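The following is a minimal sketch of this two-layer arrangement, assuming a hypothetical drawing surface with a `draw_line` call; the class names and the attribute set (line type, color, visibility) mirror the attributes named above but are illustrative, not an implementation from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class LineGraphic:
    """A scale line on the overlay, with the display attributes named above."""
    cx: float                 # intersection (anchor) x
    cy: float                 # intersection (anchor) y
    angle_deg: float = 0.0    # orientation; 0 = horizontal
    line_type: str = "solid"  # solid / dashed / dash-dot
    color: str = "yellow"
    visible: bool = True

class DebugSurface:
    """Stand-in drawing surface; a real system would rasterize instead."""
    def draw_line(self, g: LineGraphic) -> None:
        print(f"line at ({g.cx}, {g.cy}) angle={g.angle_deg} "
              f"type={g.line_type} color={g.color}")

@dataclass
class OverlayLayer:
    """Second layer: fully transparent itself, so the ultrasound image on
    the first layer below is never blocked; only visible graphics draw."""
    graphics: list[LineGraphic] = field(default_factory=list)

    def render(self, surface: DebugSurface) -> None:
        for g in self.graphics:
            if g.visible:
                surface.draw_line(g)

# Usage: a horizontal first line and a vertical second line on the overlay
layer = OverlayLayer([LineGraphic(320, 240, 0.0, "solid"),
                      LineGraphic(320, 240, 90.0, "dashed")])
layer.render(DebugSurface())
```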
As shown in FIG. 4, in a multi-window display, for example the two image display windows 401 and 402 of FIG. 4 with ultrasound images 411 and 412, the lines superimposed on the ultrasound images 411 and 412 are, respectively, a first straight line 431 and a second straight line 421 located in the image display window 401, and a first straight line 432 and a second straight line 442 located in the image display window 402. To avoid confusion when measuring images in the multi-window display mode, in one embodiment the first line and/or the second line can be adjusted, and measurement performed on the ultrasound image, only after the corresponding image display window has been activated. For example, to adjust the first straight line 431 and the second straight line 421 in the image display window 401, the image display window 401 is activated; to adjust the first straight line 432 and the second straight line 442 in the image display window 402, the image display window 402 is activated. The first line and/or the second line is adjustable only within the activated image display window.
The current image display window is activated during adjustment of the first straight line; the activation may be implemented as follows. First, the image processing module determines the interface operation position corresponding to a first or second operation input with the human-computer interaction device. Second, it determines the image display window associated with that interface operation position and takes it as the current image display window. Then it activates the current image display window. In this embodiment, the window in which the current operation takes place can be identified by tracking the interface operation position corresponding to the first or second operation, so that this image display window is activated automatically, without a separate activation action through a key or other human-computer interaction device; this improves the user experience and makes the measurement operation friendlier.
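A minimal sketch of this hit-test-based activation follows, assuming rectangular windows addressed by pixel ranges as in the windowing description above; `WindowRect` and `activate_window` are illustrative names, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class WindowRect:
    """Image display window as a pixel range on the display interface."""
    x: int
    y: int
    width: int
    height: int

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px < self.x + self.width and
                self.y <= py < self.y + self.height)

def activate_window(windows: list[WindowRect], px: float, py: float) -> int | None:
    """Return the index of the window under the interface operation position.

    Mirrors the automatic activation described above: the window containing
    the tracked operation position becomes the current window.
    """
    for i, w in enumerate(windows):
        if w.contains(px, py):
            return i
    return None  # position fell outside every image display window

# Usage: two side-by-side windows, as in the dual-window display of FIG. 4
left, right = WindowRect(0, 0, 400, 600), WindowRect(400, 0, 400, 600)
current = activate_window([left, right], 512, 300)   # -> 1 (right window)
```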
In step S240, the image processing module 126 receives a first instruction generated when a first operation is input using the human-computer interaction device. In step S250, the display of the first straight line and/or the second straight line is updated according to the first instruction, so that it is linked with the input of the first operation: whenever the input of the first operation changes, the display of the first straight line and/or the second straight line changes accordingly, and throughout the update the second straight line and the first straight line remain perpendicular. For example, when the first instruction is generated by continuously inputting the first operation with a trackball, the display of the first straight line and/or the second straight line changes continuously with it.
In this embodiment, the intersection point formed by the first and second straight lines, or the first straight line itself, can serve as the controlled cursor, so that no separate cursor icon needs to be displayed on the interface as a controlled object apart from the two lines; the first and second straight lines thus double as scale and cursor. When an operation is input with the human-computer interaction device, it can by default act on the intersection point of the first and second straight lines, which simplifies the interface display and is convenient for the user.
In the process from step S240 to step S250, the first instruction generated when the first operation is input with the human-computer interaction device can realize overall movement of the first and second straight lines, independent movement of the first or the second straight line, rotation of the first straight line, and the like. Specific examples follow.
For example, in one embodiment, the process from step S240 to step S250 realizes the overall movement of the first and second straight lines in the following manner:
First, the image processing module receives a third instruction generated when a third operation is input with a human-computer interaction device; in this embodiment the third operation may be an operation on a trackball, and the input may be continuous.
Second, the image processing module monitors the second interface operation position corresponding to the third operation and tracks its changes, the second interface operation position being any position within the image display window.
Then, the intersection position of the first and second straight lines is assigned the tracking result of the second interface operation position, and the first and second straight lines are redrawn based on the reassigned intersection. In this way the input of the human-computer interaction device and the overall movement of the first and second straight lines are linked.
As shown in FIG. 6, the display interface 600 contains an image display window 601 in which an ultrasound image 608 is displayed; a first straight line 602 is displayed at position 614 and a second straight line 604 at position 611, with the intersection 603 of the two lines at position 612. The intersection 603 serves as the controlled object of the human-computer interaction. The image processing module receives a third instruction generated when a third operation is input with a human-computer interaction device such as a trackball, the third operation controlling the intersection 603; it monitors the second interface operation position corresponding to the third operation, tracks its change (to position 612 in FIG. 6), assigns the position of the intersection 603 of the first straight line 602 and the second straight line 604 to the tracked second interface operation position, and redraws the first straight line 602 and the second straight line 604 based on the changed intersection position. The first and second straight lines thus move as a whole and can follow the input of the third operation to any position in the image display window. In this embodiment, the overall movement of the two lines may be performed by moving to the target position in one step, or in two steps: for example, the intersection is first moved along the second straight line to adjust the position of the first straight line on the second straight line, and then moved along the first straight line to adjust the position of the second straight line on the first straight line, so that the intersection reaches the target position; this also falls within the scope of this embodiment. Obviously, the third operation may consist of a single operation input or the continuous input of several operations, and the second interface operation position may comprise one interface operation position or several, which may be continuous or discontinuous.
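A minimal sketch of this overall movement, under the assumption of a crosshair object holding the intersection coordinates; the `Crosshair` class and its method names are illustrative, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class Crosshair:
    """First and second straight lines; they always intersect at (cx, cy).

    angle_deg is the first line's orientation; the second line is drawn at
    angle_deg + 90, so the pair stays perpendicular by construction.
    """
    cx: float
    cy: float
    angle_deg: float = 0.0

    def move_to(self, px: float, py: float) -> None:
        """Overall movement: assign the intersection to the tracked
        interface operation position and redraw both lines there."""
        self.cx, self.cy = px, py

# Usage: a trackball report moves the whole crosshair in one step
ch = Crosshair(cx=200, cy=150)
ch.move_to(240, 180)    # both lines follow the intersection
```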
For another example, in one embodiment, the following manner is adopted to implement the translation of the first straight line in the process from step S240 to step S250:
First, the image processing module receives a fourth instruction generated when a fourth operation is input with a human-computer interaction device; in this embodiment the fourth operation may be an operation on a trackball.
Second, the image processing module monitors the third interface operation position corresponding to the fourth operation and tracks its change in the vertical direction only, its horizontal position remaining unchanged.
Third, the position of the intersection of the first and second straight lines is assigned the tracking result of the third interface operation position, and the first straight line is redrawn based on the assigned intersection.
As shown in FIG. 8, in panel A, a first straight line 81 is displayed at position 83 on the display interface, and a second straight line 82 is displayed intersecting the first straight line 81 perpendicularly, the intersection lying at the bladder neck. In panel B, the image processing module receives a fourth instruction generated when a fourth operation is input with a human-computer interaction device such as a trackball, monitors the third interface operation position corresponding to the fourth operation, tracks its change in the vertical direction while its horizontal position remains unchanged (i.e., the intersection in panel B moves along the second straight line to position 85), assigns the position of the intersection of the first and second straight lines to the tracking result, and redraws the first straight line 81 at position 85 based on the assigned intersection; the second straight line 82 stays fixed during the movement. In this embodiment the intersection of the two lines is the controlled object, controlled by the fourth operation. This presents a measurement mode in which the first straight line moves gradually along the second straight line, improving measurement precision and letting the second straight line serve as the measurement reference, which eases the user's operation and improves the operating experience. The user is shown a clear, step-by-step position change, which helps the doctor judge accuracy during the measurement operation and reduces erroneous inputs; see the progression through panels A, B, and C of FIG. 8. Obviously, the fourth operation may consist of a single operation input or the continuous input of several operations, and the third interface operation position may comprise one interface operation position or several, continuous or discontinuous.
Similarly, in one embodiment, translation of the second straight line is realized in the process from step S240 to step S250 as follows.
First, the image processing module receives a fifth instruction generated when a fifth operation is input with a human-computer interaction device; in this embodiment the fifth operation may be an operation on a trackball. Second, the image processing module monitors the fourth interface operation position corresponding to the fifth operation and tracks its change in the horizontal direction only, its vertical position remaining unchanged. Third, the position of the intersection of the first and second straight lines is assigned the tracking result of the fourth interface operation position, and the second straight line is redrawn based on the assigned intersection. Translating the second straight line alone is not described in further detail here, being analogous to moving the first straight line in FIG. 8; the fifth operation may be the same kind of operation as the fourth, and the fifth instruction the same kind of instruction as the fourth. Obviously, the fifth operation may also consist of a single operation input or the continuous input of several operations, and the fourth interface operation position may comprise one interface operation position or several, continuous or discontinuous. A sketch of both axis-constrained translations follows.
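This sketch assumes the default horizontal/vertical orientation of the two lines; `Crosshair` and its method names are illustrative, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class Crosshair:
    """Horizontal first line and vertical second line meeting at (cx, cy)."""
    cx: float
    cy: float

    def translate_first_line(self, py: float) -> None:
        """Fourth operation: track only the vertical change of the interface
        operation position; the horizontal coordinate stays fixed, so the
        horizontal first line slides along the second line."""
        self.cy = py

    def translate_second_line(self, px: float) -> None:
        """Fifth operation: track only the horizontal change, so the
        vertical second line slides along the first line."""
        self.cx = px

# Usage: two-step overall movement, as the text describes
ch = Crosshair(cx=100, cy=100)
ch.translate_first_line(140)   # intersection moves along the second line
ch.translate_second_line(160)  # then along the first line, reaching (160, 140)
```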
The above embodiment can translate the second straight line, and combining the translation of the first straight line with that of the second straight line realizes the overall movement of both, as follows.
The process from step S240 to step S250 realizes the overall movement of the first and second straight lines in the following manner:
First, the image processing module receives a fourth instruction generated when a fourth operation is input with a human-computer interaction device; in this embodiment the fourth operation may be an operation on a trackball.
Second, the image processing module monitors the third interface operation position corresponding to the fourth operation and tracks its change in the vertical direction only, its horizontal position remaining unchanged.
Third, the position of the intersection of the first and second straight lines is assigned the tracking result of the third interface operation position, and the first straight line is redrawn based on the assigned intersection.
Then, the image processing module receives a fifth instruction generated when a fifth operation is input with the human-computer interaction device; in this embodiment the fifth operation may likewise be an operation on the trackball.
Next, the image processing module monitors the fourth interface operation position corresponding to the fifth operation and tracks its change in the horizontal direction only, its vertical position remaining unchanged.
Finally, the position of the intersection of the first and second straight lines is assigned the tracking result of the fourth interface operation position, and the second straight line is redrawn based on the assigned intersection.
The fourth and fifth instructions output by the two successively input operations may carry different instruction information; for example, the two inputs may be distinguished by rolling the trackball in different directions.
For example, in one embodiment, the rotation of the first line and the second line is realized in the following manner in the process from step S240 to step S250:
First, the image processing module receives a sixth instruction generated when a sixth operation is input with a human-computer interaction device.
Second, the image processing module extracts rotation indication information from the sixth instruction.
Then, the image processing module rotates the first straight line or the second straight line about the intersection point according to the extracted rotation indication information.
As shown in FIG. 7, the display interface 700 contains an image display window 701 in which an ultrasound image 708 is displayed; a first straight line 702 is displayed at position 71 and a second straight line 704 at position 73, the intersection of the two lines being 703. The first straight line 702 serves as the controlled object of the human-computer interaction. The image processing module receives a sixth instruction generated when a sixth operation is input with a human-computer interaction device such as a knob, the sixth operation controlling the first straight line 702, and extracts from the sixth instruction rotation indication information that includes angle information for adjusting the angle of the first straight line 702 relative to its initial position 71 or of the second straight line 704 relative to its initial position 73. The image processing module rotates the first straight line 702 or the second straight line 704 about the intersection of the two lines according to the extracted rotation indication information, thereby realizing rotation adjustment of the first and/or second straight line. Moreover, as the first straight line 702 rotates, the second straight line may rotate in synchrony with it, for example from position 73 to position 74, ensuring that the first straight line always intersects the second perpendicularly.
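A minimal sketch of this rotation about the intersection, with perpendicularity enforced by construction (the second line is always drawn at +90 degrees to the first); the class and its angle model are illustrative assumptions, not the patent's implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Crosshair:
    """Perpendicular line pair; angle_deg is the first line's orientation."""
    cx: float
    cy: float
    angle_deg: float = 0.0

    def rotate_first_line(self, delta_deg: float) -> None:
        """Sixth operation (e.g. a knob tick): rotate the first line about
        the intersection; since the second line is always drawn at +90
        degrees, it rotates synchronously and the pair stays perpendicular."""
        self.angle_deg = (self.angle_deg + delta_deg) % 180.0

    def endpoints(self, which: str, half_len: float):
        """Endpoints of one line for redrawing, rotated about (cx, cy)."""
        a = math.radians(self.angle_deg + (0.0 if which == "first" else 90.0))
        dx, dy = math.cos(a) * half_len, math.sin(a) * half_len
        return (self.cx - dx, self.cy - dy), (self.cx + dx, self.cy + dy)

# Usage: a knob tick rotates the first line from position 71 toward 72;
# the second line follows from 73 to 74 automatically.
ch = Crosshair(cx=300, cy=200, angle_deg=0.0)
ch.rotate_first_line(12.5)
p1, p2 = ch.endpoints("first", half_len=150)
```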
To ensure that the first and second straight lines give the user an absolute measurement reference and to improve accuracy, in one embodiment the first and second straight lines always intersect perpendicularly. For example, in steps S240 and S250 above, the image processing module 126 receives the first instruction generated when the first operation is input with the human-computer interaction device and updates the display of the first straight line according to it, keeping the two lines perpendicular in one of the following ways:
as shown in FIG. 7, during adjustment of the first straight line 702 the second straight line 704 follows the position change of the first straight line 702 and always intersects it perpendicularly; in FIG. 7, when the first straight line 702 rotates from position 71 to position 72, the second straight line 704 synchronously rotates from position 73 to position 74; or,
while the first straight line 702 is adjusted from position 71 to position 72, the second straight line 704 remains at position 73 throughout the movement, and only when the first straight line 702 reaches the target or desired position is the position of the second straight line automatically changed so as to intersect the first straight line perpendicularly.
The target position mentioned herein is the position to which the user intends to adjust an interface object when performing an adjustment input on it through the human-computer interaction device, for example a relevant anatomical structure of the target tissue on the ultrasound image, such as the lower margin of the pubic symphysis or the central axis of the pubic symphysis. When the user performs an input operation on an interface object (including a controlled object) through the human-computer interaction device, a corresponding interface operation position is produced on the display interface. The interface operation position may be a pixel coordinate position or a neighborhood region formed by several pixels.
Through the above embodiments, a measurement mode in which the first and/or second straight line is linked with the user's input operation is achieved, which improves the operating experience and makes it easy for the user to position a point on the image accurately for measurement.
The above embodiments respectively realize rotation of the first straight line, translation of the first straight line, and overall movement of the first and second straight lines. The third, fourth, fifth, and sixth operations may each be continuous instruction input by the user through the human-computer interaction device, and the corresponding third, fourth, fifth, and sixth instructions may therefore be continuous inputs. Continuous input here denotes the user continuously entering instructions with the human-computer interaction device over a period of time, for example continuously rolling a trackball, or continuously entering position-change information on a keyboard; through such continuous input, the interface object displayed on the screen presents a continuously moving display effect, improving the interaction experience. Accordingly, in steps S240 and S250, the image processing module receives the continuous input related to the first instruction and updates the display of the first and/or second straight line so that the display is linked with the continuous input.
Referring to FIG. 2, the ultrasound-image-based parameter measurement method in this embodiment may further include steps S260, S270, and S280, which serve to confirm the position after the linkage operation on the first straight line and to obtain a measurement instruction, thereby enabling the calculation of ultrasound image measurement items.
In step S260, the image processing module receives a second instruction generated when a second operation is input using the human-computer interaction device. In step S270, the image processing module extracts the corresponding first interface operation position according to the second instruction. In step S280, the image processing module records the first interface operation position associated with the second instruction. This is illustrated below in conjunction with FIGS. 8 and 9.
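A minimal sketch of this confirm-and-record flow of steps S260 to S280, assuming positions are pixel tuples and confirmation arrives as a key event; `MeasurementSession` and its method are illustrative names, not from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class MeasurementSession:
    """Records interface operation positions confirmed by the second
    instruction (e.g. a Set key press), for later measurement calculation."""
    confirmed: list[tuple[float, float]] = field(default_factory=list)

    def on_second_instruction(self, px: float, py: float) -> None:
        """Steps S260 to S280: extract the current interface operation
        position and record it as a confirmed measurement point."""
        self.confirmed.append((px, py))

# Usage: confirm the bladder neck, then the lower margin of the pubic
# symphysis (coordinates are illustrative pixel values)
session = MeasurementSession()
session.on_second_instruction(210, 340)
session.on_second_instruction(210, 460)
```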
FIGS. 8 and 9 show embodiments in which operation instructions are input using the first and second straight lines. The oval in the figures indicates the pubic symphysis and the black dot the bladder neck. The image processing module 126 receives a first instruction generated when a first operation is input with the human-computer interaction device, updates the first straight line 81 according to the first instruction, and records the position determined for measurement according to a second instruction generated when a second operation is input with the human-computer interaction device.
In FIG. 8, panel A, the intersection of the first and second straight lines lies at the bladder neck. When the initial display positions of the two lines are not at the desired or target position (for example, when the intersection is not at the bladder neck), the intersection of the first straight line 81 and the second straight line can be updated to the desired or target position according to the first instruction generated when a first operation is input with the human-computer interaction device, as described above.
In the transition from panel A to panel B, after the intersection of the first and second straight lines has been moved to the desired or target position, the user needs to mark that position and record it for measurement calculation on the ultrasound image. The user therefore inputs a second operation with the human-computer interaction device to confirm the position; the second operation can be input through hardware such as a key, commonly the Set key on the keyboard of an ultrasound device. The image processing module receives the second instruction generated when the second operation is input and extracts the corresponding first interface operation position, i.e., position 83, according to the second instruction; this first interface operation position serves as measurement data for the ultrasound device and facilitates the generation of subsequent measurement instructions.
When the intersection of the first and second straight lines is then moved to position 85 by translating the first straight line downward, a position mark 87 is placed at the initial position (namely the bladder neck position) where the intersection used to lie, so that the user can still see the marked position on the interface while the first straight line or the intersection moves. The measurement operation then continues: according to the received fourth instruction, the third interface operation position corresponding to the associated fourth operation is monitored and its change tracked; in FIG. 8 the change is tracked in the vertical direction while the horizontal position remains unchanged, i.e., the intersection in panel B moves along the second straight line toward position 84. When the user judges that the first straight line has reached position 84 and passes through another target position (for example, the lower margin of the pubic symphysis), the user inputs the second operation with the human-computer interaction device to confirm the position; the image processing module receives the second instruction generated by this input and extracts the corresponding first interface operation position, i.e., position 84, according to the second instruction. The image processing module records the first interface operation position 84 associated with the second instruction, representing the position of the lower margin of the pubic symphysis determined with the first straight line. After the measurement operation is completed, receipt of the second instruction input by the user indicates, in FIG. 8, that the user has confirmed the position of at least one reference line or one target position to be measured. At this point the first straight line 81 is shortened, hidden, or deleted; when it is hidden or deleted, a position mark 88 is placed at the corresponding position, yielding panel D. In one embodiment there is only one first straight line in the same image display window. From the positions of the position marks 87 and 88, the corresponding measurement instruction can be obtained.
Fig. 9 shows an embodiment mainly directed to the rotation operation of the first straight line 91. Both the translation operation and the rotation operation belong to the movement operation; therefore, the first instruction for updating the display of the first straight line includes a sixth instruction for rotating the first straight line, and the movement of the first straight line comprises: the image processing module receives the sixth instruction generated when a sixth operation is input with the human-computer interaction device, and extracts rotation indication information, such as the angle information of the rotation from position 93 to the target position 94, according to the sixth instruction; the display of the first straight line 91 is updated to the interface operation position 94 in linkage with the sixth instruction, so that the first straight line 91 rotates from its current position 93 to the target position 94, as shown in the process from A to B in fig. 9. During the rotation, the first straight line 91 is rotated around the intersection of the first and second straight lines, in accordance with the angle information contained in the extracted rotation indication information, until it coincides with the reference line (the red dotted line 910 in fig. 9). Of course, the second straight line may also be rotated in the manner described above. The second straight line 92 is automatically adjusted to be perpendicular to the first straight line 91 when the rotation reaches the target position 94; alternatively, the perpendicular relationship of the second straight line 92 is adjusted automatically as the position of the first straight line 91 changes before the target position is reached, so that the first straight line 91 remains perpendicular to the second straight line 92 throughout the rotation operation, the second straight line 92 rotating from position 95, or synchronously with the first straight line, to position 96. When the first straight line 91 is rotated according to the sixth instruction, it is rotated about the intersection of the first straight line 91 and the second straight line 92. If the position of the bladder neck is to be marked, then once the position of the reference line has been determined by bringing the first straight line into coincidence with the red reference line 910, the position is confirmed by inputting the second operation through the human-computer interaction device (for example, confirmation through key input); the image processing module receives the second instruction generated when the second operation is input with the human-computer interaction device and extracts the corresponding first interface operation position, namely position 94, according to the second instruction. The image processing module records the first interface operation position 94 associated with the second instruction, indicating that the user has confirmed the position of at least one reference line or a target position to be measured, such as the position of the pubic symphysis central axis determined using the first straight line.
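The rotation about the intersection can be sketched with elementary 2-D geometry. The snippet below, a hedged illustration rather than the patent's implementation, rotates both lines through the same angle about their intersection so the perpendicular relationship is preserved throughout, as described for positions 93 to 94.

```python
# Minimal sketch: rotate the first line about the intersection per the
# extracted angle information, rotating the second line in step so the
# two lines remain perpendicular.
import math

def rotate_point(p, center, angle_rad):
    """Rotate point p about center by angle_rad (counterclockwise)."""
    dx, dy = p[0] - center[0], p[1] - center[1]
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (center[0] + c * dx - s * dy, center[1] + s * dx + c * dy)

def rotate_cross(first_line, second_line, center, angle_deg):
    """Rotate both lines (given as endpoint pairs) about their intersection."""
    a = math.radians(angle_deg)
    rot = lambda seg: tuple(rotate_point(p, center, a) for p in seg)
    return rot(first_line), rot(second_line)

first = ((0.0, 200.0), (800.0, 200.0))       # horizontal first line
second = ((300.0, 0.0), (300.0, 600.0))      # vertical second line
first, second = rotate_cross(first, second,
                             center=(300.0, 200.0), angle_deg=-15.0)
```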
When the first straight line is then translated to continue marking the bladder neck, the position that has been marked can be recorded by drawing a position mark 97 (diagram C in fig. 9) on the second straight line, recording that location as the determined central axis of the pubic symphysis after the first straight line has moved, so that the user can still view the marked location on the image. To continue marking the bladder neck position, a fourth instruction generated when a fourth operation is input with the human-computer interaction device is received, and the third interface operation position corresponding to the fourth operation is monitored; the third interface operation position is unchanged in the horizontal direction, and diagram C shows that the intersection of the first and second straight lines always moves along the second straight line while the change of the third interface operation position in the vertical direction is tracked. The position of the intersection of the first and second straight lines is assigned as the tracking result of the third interface operation position, and the first straight line 91 is redrawn based on the assigned intersection, presenting a translation display effect of the first straight line along the second straight line. When the first straight line reaches position 95 and passes through the bladder neck, the image processing module receives the second instruction generated when the second operation is input with the human-computer interaction device, for example a confirmation instruction entered by the user through a key click. The image processing module then extracts the corresponding first interface operation position according to the second instruction and records the first interface operation position associated with the second instruction, indicating that the user has confirmed the position of at least one reference line or a target position to be measured; for example, the position of the bladder neck is determined using the first straight line. The position that has been marked can then be recorded by drawing the position mark 98 on the second straight line, for use in obtaining the measurement instruction for performing measurement calculation with respect to the ultrasound image. The position mark may appear based on one operation input of the user or based on two operation inputs of the user, which is not limited in this embodiment; that is, the second instruction generated when the second operation is input with the human-computer interaction device may be entered as a single instruction or as multiple instructions. For example, the marking of the position mark on the ultrasound image may be realized by one operation input through a key, or by two consecutive operation inputs. The distance relationship between the bladder neck position and the central axis of the pubic symphysis can be obtained from the position marks 97 and 98 and used to calculate the corresponding measurement items.
As shown in figs. 8 and 9, in one embodiment, after recording the interface operation position associated with the second instruction, the image processing module further performs the following:
position marks (87, 88, 97, 98) are drawn, the position marks (87, 88, 97, 98) being located on the second straight line (82, 92). A position mark may be a line segment, or an icon such as a cross may be used for position identification.
Further, in one embodiment, the image processing module eliminates the redundant portion of the second straight line to prevent the markings from obscuring the ultrasound image, displaying only portions (89, 99) of the second straight line that lie between two position marks and/or between a position mark and the first straight line.
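One plausible way to display only these portions, sketched below under the assumption of a vertical second straight line, is to collect the y coordinates of the position marks and of the first line's crossing and draw only the spans between consecutive pairs; the helper name is illustrative.

```python
# Hedged sketch: keep only the second-line segments between adjacent
# position marks and between a mark and the first line, instead of
# drawing the full-height line, to reduce occlusion of the image.
def visible_second_line_segments(marker_ys, first_line_y):
    ys = sorted(marker_ys + [first_line_y])
    return [(ys[i], ys[i + 1]) for i in range(len(ys) - 1)]

segments = visible_second_line_segments(marker_ys=[140.0, 260.0],
                                        first_line_y=200.0)
# -> [(140.0, 200.0), (200.0, 260.0)]: only these spans are drawn
```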
The red dotted line 910 in fig. 9 provides a position prompt mark for the central axis of the pubic symphysis. It is obtained by automatic image segmentation and provided automatically by the system, giving the user a reference when performing coordinate positioning or reference-line positioning during image measurement, which has practical clinical value. Accordingly, in some variations of this embodiment, a position prompt mark may also be displayed on the ultrasound image. The position prompt mark can be obtained by automatic image segmentation in the system, or during an automatic scanning process, or in a combined system-and-manual mode.
In the embodiments provided above, point-to-point distance measurement, point-to-line measurement, and combined measurement of the two can all be realized by operating the first and second straight lines, which is simple and easy to perform. In addition, an absolute horizontal reference line can be obtained, improving measurement accuracy. After measurement is finished, only short lines remain at the two ends of the distance caliper; for example, the manner of displaying the position marks together with only a portion of the second straight line reduces occlusion of the image. The distance-endpoint reference line can move on either side of the short line at the distance starting point, so distances can be measured on both sides, meeting the needs of pelvic floor applications. The two perpendicular cursor lines (the first and second straight lines) differ in color or line type, so the user knows in which direction to rotate through the minimum angle to quickly bring the measuring cursor into coincidence with the reference line.
In the embodiment shown in fig. 9, when the first straight line 91 moves to the pubic symphysis central-axis position at position 94, the attribute of the first straight line 91 may be altered when the interface operation position associated with the second instruction is recorded according to the input of the second instruction, the first straight line being located at that interface operation position. For example, the line type of the first straight line 91 is changed to prompt the user that the position of the reference line has been confirmed. Changing the attribute of the first straight line includes at least one of the following: setting the operation attribute of the first straight line to immovable, and setting the line type of the first straight line to a preset line type. See in particular the measurement process shown in fig. 10.
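A small state sketch of the attribute change, with invented names, might look as follows: once the reference position is recorded, the line is frozen against further movement and given a preset line type so the confirmation is visible.

```python
# Illustrative sketch (all names are assumptions): lock the confirmed
# reference line and change its line type as a visual cue.
from dataclasses import dataclass

@dataclass
class LineGraphic:
    y: float
    movable: bool = True
    line_type: str = "dashed"

def mark_as_reference(line: LineGraphic, preset_type: str = "dotted") -> None:
    line.movable = False           # operation attribute set to immovable
    line.line_type = preset_type   # preset line type signals confirmation

ref = LineGraphic(y=200.0)
mark_as_reference(ref)
```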
To facilitate the measurement operation, after the position marks (87, 88, 97, 98) have been obtained by marking at least once during the movement of the first straight line shown in figs. 8 and 9, the first straight line can move not only on one side of a position mark on the second straight line but also on both sides of it; that is, during the translation of the first straight line, the intersection of the first and second straight lines can be moved to any position on the first straight line or the second straight line.
Fig. 10 shows a process of performing measurement by using various operation and display modes of the first straight line and the second straight line, and a specific flow is as follows.
The measurement operation mode is entered through key input or another mode-selection operation, and the horizontally placed first straight line 102 and the vertically placed second straight line 101 are displayed superimposed on the ultrasound image, as shown in fig. 10(1). Different colors or line types may be used to distinguish the first straight line from the second straight line. The ultrasound image 120 is displayed in the image display window 100.
As shown in fig. 10(2), the first straight line and/or the second straight line are moved according to the received first instruction, as described above, so that the intersection of the first and second straight lines is located at the target position, for example the lower edge of the pubic symphysis; for the specific operation, refer to the related descriptions of figs. 6, 7, 8 and 9.
As shown in fig. 10(3), assuming that the intersection of the first straight line 102 and the second straight line 101 in fig. 10(2) has moved to the first target position (e.g., the lower edge of the pubic symphysis) and the first straight line is also located at the second target position (e.g., the pubic symphysis central axis), the first interface operation position associated with the second instruction, i.e., the second target position, may be recorded upon receipt of the second instruction as a reference for generating the measurement instruction. Therefore, in fig. 10(3), after receiving the second instruction, the image processing module records the first interface operation position where the first straight line is currently located, for example the position of the pubic symphysis central axis. The first straight line at this position is taken, or configured, as the reference line, and the attributes of the first straight line are modified to characterize the first straight line 102 as marked as the reference line. Altering the attributes of the first straight line includes at least: configuring the first straight line to be immovable, and setting an attribute of the first straight line to one of the preset attributes. The attribute settings may be changes to color, line type, and the like, e.g., changing the line type of the first straight line 102 from a dashed line to a dotted line. Further, in one embodiment, a movable cursor 130 is drawn at the intersection of the first straight line 102 and the second straight line 101. The movable cursor can be moved to any position on either side of the first straight line 102. To facilitate operation of the movable cursor 130, the image processing module shortens the second straight line, or hides it, or deletes it, or changes it into a varying line whose length follows the position of the movable cursor, forming the effect shown in fig. 10(3).
As shown in fig. 10(4), the movable cursor 130 can be moved to any position on either side of the first straight line 102; in the figure the movable cursor 130 has moved to position BN. To clearly reflect the relative positional relationship between position BN and the reference line and implement measurement based on the reference line, in one embodiment a vertical line 141 is drawn from the interface operation position BN, where the movable cursor 130 is currently located, to the first straight line 102, so that the vertical line 141 changes position as the movable cursor 130 moves. To realize this, in one embodiment the process from step S260 to step S280 includes: the image processing module receives a seventh instruction generated when a seventh operation is input with the human-computer interaction device; the seventh instruction can be input through a trackball. The image processing module links the display of the movable cursor 130 with the seventh operation according to the seventh instruction. When the user moves the trackball, the linkage effect of the movable cursor 130 on the ultrasound image is that, while the display of the movable cursor is linked with the seventh operation according to the seventh instruction, the vertical line 141 from the interface operation position of the movable cursor 130 to the first straight line 102 is drawn, and as the movable cursor 130 moves, the vertical line changes accordingly.
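Because the reference line may have been rotated to the pubic symphysis axis, the vertical line 141 is, in general, the perpendicular from the cursor to the reference line. A minimal sketch using plain 2-D projection, with assumed names, follows.

```python
# Sketch: redraw the perpendicular segment from the movable cursor to
# the reference line whenever the trackball (seventh operation) moves.
def perpendicular_foot(p, a, b):
    """Foot of the perpendicular from point p onto the line through
    distinct points a and b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    return (ax + t * dx, ay + t * dy)

def on_trackball_move(cursor, ref_line):
    """Seventh-instruction handler: endpoints of the vertical line."""
    foot = perpendicular_foot(cursor, *ref_line)
    return (cursor, foot)

segment = on_trackball_move(cursor=(412.0, 310.0),
                            ref_line=((0.0, 200.0), (800.0, 200.0)))
```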
As shown in fig. 10(5), the display of the movable cursor 130 is linked with the seventh operation until the cursor moves to the target position BN. At this time, an eighth instruction generated when an eighth operation is input with the human-computer interaction device is received; the eighth operation may be input in the same way as the aforementioned second operation, so the eighth instruction may also be the same as the aforementioned second instruction, e.g., obtained through key input. The corresponding interface operation position is extracted according to the eighth instruction, and the interface operation position associated with the eighth instruction is recorded, indicating that the user has determined, through this operation, that the measurement target position BN needs to be marked on the ultrasound image. After recording, a position mark 131 may be drawn at position BN; the position mark based on the eighth instruction is located at the interface operation position associated with the eighth instruction. The movable cursor 130 is then moved to position BL by moving the trackball; during this process the position mark 131 remains displayed at the already-marked position BN, and the vertical line 141 between the position mark 131 and the first straight line 102 remains displayed. As the movable cursor 130 is moved to position BL with the trackball, a vertical line 142 from the current interface operation position of the movable cursor to the first straight line 102 is drawn synchronously. If it is determined that position BL needs to be measured and recorded, the interface operation position associated with the eighth instruction is recorded according to the eighth instruction input by the eighth operation, indicating that the user has determined that the measurement target position BL needs to be marked on the ultrasound image.
As shown in fig. 10(6), the display of the movable cursor 130 is linked with the seventh operation according to the seventh instruction until the cursor moves to the target position UT. During this process, the position mark 131 remains displayed at the already-marked position BN, and the vertical line 141 between the position mark 131 and the first straight line 102 is displayed. A position mark 132 may also be drawn at position BL to display that target position as marked, while the vertical line 142 between the position mark 132 and the first straight line 102 is displayed. While the display of the movable cursor 130 is linked with the seventh operation according to the seventh instruction, a vertical line 143 from the current interface operation position of the movable cursor 130 to the first straight line 102 may be drawn automatically as the cursor moves to position UT.
In this process, each position mark represents the measurement of one measurement item; therefore, when the next measurement item is to be measured, marking can be carried out directly by moving the movable cursor according to the seventh instruction and recording the position according to the eighth instruction, and the cyclic measurement operation for the next item can be switched to by key input. When all measurement items have been handled, a measurement-end input can be given by double-click or another key operation. As can be seen from the embodiment shown in fig. 10, the vertical line is always perpendicular to the first straight line 102, so the vertical line can be understood as a deformation of the second straight line, e.g., changing the second straight line into a varying line whose length changes with the position of the movable cursor. When the movable cursor moves to any position on either side of the first straight line, the vertical lines (141, 142, 143) from the current interface operation position of the movable cursor to the first straight line are drawn continuously, so that the user can see more clearly the reference-based vertical measurement datum needed during the measurement process. Further, in one embodiment, the vertical lines (141, 142) between the position marks and the first straight line are displayed after the interface operation position associated with the second instruction, or the interface operation position associated with the eighth instruction, is recorded.
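The per-item cycle can be summarized in code: each confirmed cursor position becomes one measurement item whose value is, for the distance items, the perpendicular distance to the reference line. The sketch below is illustrative only; labels and helper names are assumptions.

```python
# Hedged sketch of the marking cycle: one recorded position per
# measurement item, each reduced to its distance from the reference line.
import math

def perpendicular_distance(p, a, b):
    """Distance from point p to the line through points a and b."""
    ax, ay = a; bx, by = b; px, py = p
    num = abs((by - ay) * px - (bx - ax) * py + bx * ay - by * ax)
    return num / math.hypot(bx - ax, by - ay)

def record_items(confirmed_positions, ref_line, labels):
    """One entry per position mark, e.g. BN, BL, UT in fig. 10."""
    a, b = ref_line
    return {lab: perpendicular_distance(p, a, b)
            for lab, p in zip(labels, confirmed_positions)}

items = record_items([(412.0, 310.0), (520.0, 330.0), (610.0, 150.0)],
                     ref_line=((0.0, 200.0), (800.0, 200.0)),
                     labels=["BN", "BL", "UT"])
```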
In the activated current image display window, the display of the first straight line is linked with the first operation, or the display of the movable cursor is linked with the seventh operation; for the manner of activation, refer to the related description above, which is not repeated here. This avoids interference between ultrasound image operations in the dual-screen display mode.
In step S290, the image processing module 126 obtains a measurement instruction; in step S300, a measurement item of the target tissue is calculated according to the measurement instruction to obtain a calculation result; in step S310, the image processing module 126 outputs the calculation result using the display 107, a printer, or the like.
The related measurement instructions are obtained through the operations on the first straight line or the second straight line and on the movable cursor. For example, the measurement instruction is obtained after the first interface operation position associated with the second instruction has been recorded.
In this embodiment, the measurement instruction may be obtained by the system according to the measurement operation performed by the user on the ultrasound image. The measurement instruction is used to calculate measurement items related to the target tissue; the medical meaning of some of these items is described in detail below, taking pelvic ultrasound images as an example.
There are numerous academic and clinical measurements for anterior pelvic compartment ultrasound images, most of which are associated with diagnosing pelvic organ prolapse (POP) and urinary incontinence. The measurement items that may be involved in the present embodiment include, but are not limited to, the following:
1) Posterior urethrovesical angle, or retrovesical angle (RVA): the angle between the proximal urethra and the trigone at the posterior bladder base.
2) Urethral inclination angle (UTA, also abbreviated UI).
3) Pubourethral angle (PUA): the angle between the central axis of the pubic symphysis and the line connecting the lower edge of the pubic symphysis with the bladder neck.
4) Bladder neck-symphysis distance (BSD), or pubourethral distance (PUD).
5) Pubovesical angle (PVA): the angle between the central axis of the pubic symphysis and the line connecting the lower edge of the pubic symphysis with the lowest point of the posterior bladder wall.
6) Pubovesical distance (PVD), or bladder descent distance (BL desc.).
7) Urethral rotation angle (URA).
8) Bladder neck descent distance, or bladder neck mobility (BND).
9) Bladder wall descent distance (BWD), etc.
These measurements depend in large part on establishing an appropriate reference coordinate system. For example, PUD and PVD require a uniform reference based on the pubic symphysis and the location of its central axis, relying on a reference coordinate system with the central axis of the pubic symphysis as the X-axis.
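A sketch of such a reference frame, under the assumption that the inferior symphysis margin is taken as the origin and the central axis as the X-axis, is given below; the point values are invented for illustration.

```python
# Hedged sketch: express an image point in a frame whose X-axis runs
# along the central axis of the pubic symphysis.
import math

def to_reference_frame(p, origin, axis_point):
    """origin: inferior symphysis margin; axis_point: another point on
    the symphysis central axis; p: image point to transform."""
    ux, uy = axis_point[0] - origin[0], axis_point[1] - origin[1]
    n = math.hypot(ux, uy)
    ux, uy = ux / n, uy / n                # unit vector along the X-axis
    dx, dy = p[0] - origin[0], p[1] - origin[1]
    return (dx * ux + dy * uy,             # coordinate along the axis
            -dx * uy + dy * ux)            # perpendicular offset from it

bn_local = to_reference_frame((412.0, 310.0),
                              origin=(300.0, 200.0),
                              axis_point=(800.0, 200.0))
# abs(bn_local[1]) is the bladder neck's distance from the X-axis
```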
The user may select various measurements, but each measurement needs to be associated with corresponding anatomical locations. For example, to measure the pubourethral angle PUA between the pubic symphysis and the UVJ (urethrovesical junction, i.e., the bladder neck), the distance BSD of the UVJ from the X-axis, the urethral inclination angle UTA, and the retrovesical angle RVA, it is necessary first to determine (a) the inferior margin of the pubic symphysis (SP), (b) the central axis of the pubic symphysis, (c) the urethrovesical junction (UVJ), (d) the proximal end of the urethra, and (e) the posterior bladder wall. The determination of these positions may be obtained based on the operation inputs of steps S240 to S280 described above. In the above embodiments, recording the interface operation position associated with an input instruction characterizes the determination of one or more target positions by operating the first straight line and/or the second straight line. For example, when the first straight line passes through the lower edge of the pubic symphysis, the urethrovesical junction, the proximal end of the urethra, and so on, the interface operation position associated with the second instruction is recorded according to the second instruction, completing an input of the measurement instruction. When the movable cursor is located at the lower edge of the pubic symphysis, the urethrovesical junction, the proximal end of the urethra, and so on, the interface operation position associated with the eighth instruction is recorded according to the eighth instruction, completing an input of the measurement instruction. And when the first straight line coincides with the central axis of the pubic symphysis, the interface operation position associated with the second instruction is recorded according to the second instruction, completing an input of the measurement instruction.
Based on the input of the measurement operations performed on the ultrasound image, the calculation results of the respective measurement items can be computed and output in accordance with the definitions of the anatomical medical parameters related to the target tissue.
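As a worked illustration of two of the listed items (with invented coordinates, not clinical data): PUA is the angle between the symphysis central axis and the line from the inferior symphysis margin to the bladder neck, and BSD is the bladder neck's distance from that axis.

```python
# Illustrative computation of PUA and BSD from marked positions.
import math

def angle_between(u, v):
    """Angle in degrees between 2-D vectors u and v."""
    dot = u[0] * v[0] + u[1] * v[1]
    return math.degrees(math.acos(dot / (math.hypot(*u) * math.hypot(*v))))

sp_lower = (300.0, 200.0)     # inferior margin of the pubic symphysis
axis_dir = (1.0, 0.0)         # unit vector along the symphysis central axis
uvj = (412.0, 310.0)          # urethrovesical junction (bladder neck)

uvj_vec = (uvj[0] - sp_lower[0], uvj[1] - sp_lower[1])
pua = angle_between(axis_dir, uvj_vec)                           # degrees
bsd = abs(axis_dir[0] * uvj_vec[1] - axis_dir[1] * uvj_vec[0])   # cross product
```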
A variant of the embodiment shown in fig. 10 offers a simpler way of measuring and calibrating multiple target positions; the specific flow is shown in fig. 11.
In step S110, the image processing module 126 in the ultrasound imaging system acquires an ultrasound image, which includes the target tissue. This step is the same as the aforementioned step S210.
In step S120, the image processing module 126 in the ultrasound imaging system outputs the ultrasound image to a display for displaying the ultrasound image. This step is the same as the aforementioned step S220.
In step S130, the image processing module 126 displays a third straight line superimposed on the ultrasound image using the display 107, and displays a movable cursor superimposed on the third straight line. For the manner of moving the third straight line in this step, refer to the descriptions of the movement operations of the first straight line above; the third straight line may be identical to the first straight line described above, or to the first straight line after it has been rotated and/or moved to the reference line. For example, refer to the operations performed according to the first or second instruction in figs. 6 to 10, such as translation and rotation of the first straight line, which are not repeated here.
Further, in one embodiment, displaying the third straight line on the ultrasound image and displaying the movable cursor superimposed on the third straight line includes:
the image processing module displays a horizontally placed fourth straight line on the ultrasonic image, and displays the movable cursor on the fourth straight line in an overlapping mode. The image processing module receives an eleventh instruction generated when the eleventh operation is input by the human-computer interaction device, updates the display of the fourth straight line according to the eleventh instruction to obtain a third straight line, and enables the display of the fourth straight line to be linked with the input of the eleventh operation in the updating process. The fourth straight line is the first straight line in fig. 6 to 9, and regarding the process of obtaining the third straight line by updating the display of the fourth straight line according to the eleventh instruction, reference may be made to the related description of updating the display of the first straight line according to the first instruction in fig. 6 to 9. Specific examples are as follows.
In one embodiment, the receiving an eleventh instruction generated when the eleventh operation is input by using the human-computer interaction device, and updating the display of the fourth straight line to obtain the third straight line according to the eleventh instruction includes:
the image processing module receives a twelfth instruction generated when a twelfth operation is input by using the human-computer interaction device,
the image processing module monitors a fifth interface operation position corresponding to the twelfth operation, wherein the fifth interface operation position is any position in the image display window;
the image processing module tracks the change of the operation position of the fifth interface, an
And the image processing module assigns the position of the movable cursor as a tracking result of the operation position of the fifth interface, and redraws a fourth straight line based on the assigned movable cursor.
In this embodiment, the twelfth operation is the same as the third operation mentioned above and the twelfth instruction is the same as the third instruction; all the related embodiments concerning the third operation and the third instruction can be applied here, except that the input concerning the second straight line is omitted. For example, as shown in fig. 12, the display interface 811 contains an image display window 812; an ultrasound image 813 is displayed within the image display window 812, and a fourth straight line 814 is displayed at position 820 without the second straight line 604. Position 821 in fig. 12 corresponds to the intersection 603 of the first straight line 602 and the second straight line 604 in fig. 6, and a movable cursor 815 is drawn at position 821. The movable cursor 815 serves as the controlled object for human-computer interaction. The image processing module receives a twelfth instruction generated when a twelfth operation is input with a human-computer interaction device such as a trackball, the twelfth operation being used to control the movable cursor 815; the module monitors the fifth interface operation position corresponding to the twelfth operation, tracks its change to position 818 in fig. 12, changes the position of the movable cursor to the fifth interface operation position 818, and redraws the fourth straight line 814 at position 819 based on the changed cursor position. The fourth straight line and the movable cursor are thus translated together, and the fourth straight line can be moved as a whole to any position in the image display window following the input of the twelfth operation.
In one embodiment, the receiving an eleventh instruction generated when the eleventh operation is input by using the human-computer interaction device, and updating the display of the fourth straight line to obtain the third straight line according to the eleventh instruction may further include: the image processing module receives a thirteenth instruction generated when a thirteenth operation is input by using the human-computer interaction device; the image processing module extracts rotation indication information according to the thirteenth instruction; and the image processing module rotates the fourth straight line according to the extracted rotation indication information by taking the position of the movable cursor as a center.
As shown in fig. 13, the display interface 900 contains an image display window 901; an ultrasound image 908 is displayed within the image display window 901, a fourth straight line 902 is displayed at position 905, and a movable cursor 904 is drawn at position 903 on the fourth straight line 902. The fourth straight line 902 serves as the controlled object for human-computer interaction. The image processing module receives a thirteenth instruction generated when a thirteenth operation is input with a human-computer interaction device such as a knob, the thirteenth operation being used to control the movable cursor, and extracts rotation indication information, including angle information for adjusting the angle of the fourth straight line 902 relative to the initial position 905, according to the thirteenth instruction. The image processing module then rotates the fourth straight line 902 about the position of the movable cursor according to the extracted rotation indication information.
Obtaining the third straight line may require combining the operations of translating and rotating the fourth straight line, or may require only one of the two.
In step S140, the image processing module receives a ninth instruction generated when a ninth operation is input with the human-computer interaction device; in step S150, the display of the movable cursor is linked with the ninth operation according to the ninth instruction. In this step, the manner of linking the display of the movable cursor with the ninth operation can be the same as the manner of linking it with the seventh operation described for fig. 10; the seventh and ninth operations may be the same, and the seventh and ninth instructions may also be the same. In one embodiment, the display of the movable cursor in the activated current image display window is linked to the ninth operation; refer to the description above: "In the activated current image display window, the display of the first straight line is linked with the first operation, or the display of the movable cursor is linked with the seventh operation."
For the manner of operating the movable cursor in this step, refer also to the specific description of fig. 10, with the third straight line taking the role of the first straight line: for example, the movable cursor can move to any position on either side of the third straight line, and a vertical line is drawn from the interface operation position where the movable cursor is currently located to the third straight line, so that the vertical line follows the position of the cursor; both behaviors can be used in this embodiment.
In step S160, the image processing module receives at least one tenth instruction generated when a human-computer interaction device is used to input at least one tenth operation; in step S170, the image processing module extracts a corresponding interface operation position according to the tenth instruction; in step S180, the image processing module records at least one interface operation position associated with the tenth instruction. The process of step S160 to step S180 can be referred to the related description about (3) to (6) in fig. 10, and is the same as the process of the image processing module receiving the eighth instruction generated when the eighth operation is input by using the human-computer interaction device, extracting the corresponding interface operation position according to the eighth instruction, and recording the interface operation position associated with the eighth instruction. The tenth operation may be the same as the eighth operation, and the tenth instruction may be the same as the eighth instruction. See, in particular, the description below in connection with fig. 14.
Fig. 14 shows a process of performing measurement using the third straight line and the various operation and display modes of the movable cursor described above; the specific flow is as follows.
The measurement operation mode is entered through key input or another mode-selection operation, and the horizontally placed fourth straight line, with the movable cursor set on it, is displayed superimposed on the ultrasound image, as shown in fig. 14(A). The ultrasound image 220 is displayed in the image display window 200. Combining the translation of the fourth straight line performed according to the twelfth instruction (fig. 12) and the rotation of the fourth straight line performed according to the thirteenth instruction (fig. 13), the fourth straight line and the movable cursor are brought to the target positions in fig. 14(A); for example, the movable cursor is located at the first target position, such as the lower edge of the pubic symphysis, and the fourth straight line is located at the second target position, such as the pubic symphysis central axis, thereby obtaining the third straight line. Obtaining the third straight line may combine the translation operation of fig. 12 with the rotation operation of fig. 13, or may use only one of them.
As shown in fig. 14(B), the movable cursor 230 can be moved to any position on either side of the third straight line 202; in the figure the movable cursor 230 has moved to position BN. To clearly reflect the relative positional relationship between position BN and the reference line and implement measurement based on the reference line, in one embodiment a vertical line 242 is drawn from the interface operation position BN, where the movable cursor 230 is currently located, to the third straight line 202, so that the vertical line 242 changes position as the movable cursor 230 moves. To realize this, in one embodiment the process from step S140 to step S180 includes: the image processing module receives the ninth instruction generated when the ninth operation is input with the human-computer interaction device; the ninth instruction can be input through a trackball. The image processing module links the display of the movable cursor 230 with the ninth operation according to the ninth instruction. When the user moves the trackball, the linkage effect of the movable cursor 230 on the ultrasound image is that, while the display of the movable cursor is linked with the ninth operation according to the ninth instruction, the vertical line 242 from the interface operation position of the movable cursor 230 to the third straight line 202 is drawn, and as the movable cursor 230 moves, the vertical line changes accordingly.
As shown in fig. 14(C), the display of the movable cursor 230 is linked with the ninth operation until the cursor moves to the target position BN. At this time, a tenth instruction generated when a tenth operation is input with the human-computer interaction device is received; the tenth operation may be input in the same way as the aforementioned second operation, so the tenth instruction may also be the same as the aforementioned second instruction, e.g., obtained through key input. The corresponding interface operation position is extracted according to the tenth instruction, and the interface operation position associated with the tenth instruction is recorded, indicating that the user has determined, through this operation, that the measurement target position BN needs to be marked on the ultrasound image. After recording, a position mark 232 may be drawn at position BN; the position mark based on the tenth instruction is located at the interface operation position associated with the tenth instruction. The movable cursor 230 is then moved to position BL with the trackball; during this process the position mark 232 remains displayed at the already-marked position BN, and the vertical line 242 between the position mark 232 and the third straight line 202 remains displayed. As the movable cursor 230 is moved to position BL with the trackball, a vertical line from the current interface operation position of the movable cursor to the third straight line 202 is drawn synchronously. If it is determined that position BL needs to be measured and recorded, the interface operation position associated with the tenth instruction is recorded according to the tenth instruction input by the tenth operation, indicating that the user has determined that the measurement target position BL needs to be marked on the ultrasound image.
As shown in fig. 14(D), the display of the movable cursor 230 is linked with the ninth operation according to the ninth instruction until the cursor moves to the target position UT. During this process, the position mark 232 remains displayed at the already-marked position BN, while the vertical line 242 between the position mark 232 and the third straight line 202 is displayed. A position mark may also be drawn at position BL to display that target position as marked, with a vertical line between that position mark and the third straight line 202. While the display of the movable cursor 230 is linked with the ninth operation according to the ninth instruction, a vertical line 243 from the current interface operation position of the movable cursor 230 to the third straight line 202 may be drawn automatically as the cursor moves to position UT.
In this process, each position mark represents the measurement of one measurement item; therefore, when the next measurement item is to be measured, marking can be carried out directly by moving the movable cursor according to the ninth instruction and recording the position according to the tenth instruction, and the cyclic measurement operation for the next item can be switched to by key input. When all measurement items have been handled, a measurement-end input can be given by double-click or another key operation. When the movable cursor moves to any position on either side of the third straight line, the vertical lines (242, 243) from the current interface operation position of the movable cursor to the third straight line are drawn continuously, so that the user can see more clearly the reference-based vertical measurement datum needed during the measurement process. Further, in one embodiment, the vertical lines between the position marks and the third straight line are displayed after the interface operation position associated with the tenth instruction is recorded.
In the activated current image display window, the display of the third straight line is linked with the eleventh operation, or the display of the movable cursor is linked with the ninth operation; for the manner of activation, refer to the related description above, which is not repeated here. This avoids interference between ultrasound image operations in the dual-screen display mode. As can be seen from the embodiment shown in fig. 14, the marking of at least one target position can be carried out continuously by means of the movable cursor. In the process from step S140 to step S180, the image processing module receives the ninth instruction generated when the ninth operation is input with the human-computer interaction device, links the display of the movable cursor with the ninth operation according to the ninth instruction, receives at least one tenth instruction generated when at least one tenth operation is input with the human-computer interaction device, extracts the corresponding interface operation position according to the tenth instruction, and records at least one interface operation position associated with the tenth instruction, so that several consecutive target positions can be marked in coordination. Furthermore, the ninth operation in this embodiment may be a continuous input; therefore, in one embodiment, the image processing module receives the ninth instruction generated when the ninth operation is continuously input with the human-computer interaction device and links the continuous display of the movable cursor with the ninth operation according to the ninth instruction.
In step S190, the image processing module 126 obtains a measurement instruction; in step S191, a measurement item of the target tissue is calculated according to the measurement instruction, and a calculation result is obtained; in step S192, the image processing module 126 outputs the calculation result using the display 107, the printer, or the like. Steps S190, S191, and S192 may be the same as steps S290, S300, and S310, and will not be described in detail herein.
Figs. 2 and 11 each provide only one execution order of the steps; various modifications can be obtained by adjusting the order of the steps in figs. 2 and 11 on the basis of the foregoing. The steps are not limited to being executed only in the order shown in figs. 2 and 11: steps can be interchanged and their execution order changed as long as the basic logic is satisfied, and one or more steps can be executed repeatedly before the final step or steps are executed. Such modifications belong to variations of the embodiments provided herein.
The image processing module and the signal processing module may be integrated on one circuit board or divided among several circuit boards; likewise, the computer program of fig. 2 or fig. 11 executed on the image processing module may be implemented with one or more processors. Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general hardware platform, or by hardware, the former being the better implementation in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product carried on a non-volatile computer-readable storage medium (such as a ROM, magnetic disk, optical disc, hard disk, or server cloud space), including several instructions for enabling a terminal device (which may be a mobile phone, a computer, a server, or a network device) to execute the system structures and methods of the embodiments of the present invention. For example, a computer-readable storage medium stores a computer program which, when executed by a processor, implements the embodiments described above on the basis of the flow shown in fig. 2 or fig. 11.
This embodiment provides a convenient and rapid parameter measurement scheme based on two-dimensional or three-dimensional ultrasound, suitable for measuring pelvic floor parameters. The scheme provides a more flexible measurement input mode and can meet the need for a convenient manual method of measuring pelvic floor organ position and descent distance. The default reference line of the method is a horizontal line, so the physician does not need to determine the horizontal line manually; in addition, the method supports distance measurements extending to either side of the reference line. After measurement is finished, only short lines are displayed at the two ends of the distance caliper, greatly reducing occlusion of the image. The embodiment is an improved parallel-line distance measuring method that is well suited to pelvic floor applications, where distances are measured mainly from a horizontal reference line. Further optimized, the embodiment can be used for combined measurement of prolapse of several pelvic floor organs, saving physicians' time and improving efficiency.
The above examples show only some embodiments and are described in relative detail, but they should not therefore be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the inventive concept, all of which fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be determined by the appended claims.

Claims (48)

1. A method of ultrasound image based parameter measurement, comprising:
acquiring an ultrasound image, wherein the ultrasound image comprises target tissue, and the ultrasound image is obtained by receiving an ultrasound signal from the target tissue by an ultrasound probe;
displaying an ultrasonic image;
superposing and displaying a first straight line and a second straight line which are perpendicular to each other on the ultrasonic image, wherein the first straight line and the second straight line are intersected to generate an intersection point;
receiving a first instruction generated when a first operation is input using a human-computer interaction device, and
updating the display of the first straight line and/or the second straight line according to the first instruction, so that the display of the first straight line and/or the second straight line is linked with the input of the first operation, and the second straight line and the first straight line are vertically intersected all the time in the updating process;
the updating the display of the first straight line and/or the second straight line according to the first instruction, and the linkage between the display of the first straight line and/or the second straight line and the input of the first operation comprises:
receiving a third instruction generated when a third operation is input using the human-computer interaction device,
monitoring a second interface operation position corresponding to the third operation, wherein the second interface operation position is any position in the image display window,
tracking the change of the second interface operation position, assigning the position of the intersection point as a tracking result of the second interface operation position, and redrawing the first straight line and the second straight line based on the assigned intersection point; or,
the updating the display of the first straight line and/or the second straight line according to the first instruction, and the linkage between the display of the first straight line and/or the second straight line and the input of the first operation at least comprises one of the following modes:
receiving a fourth instruction generated when a fourth operation is input using the human-computer interaction device,
monitoring a third interface operation position corresponding to the fourth operation,
tracking a change in the vertical direction of the third interface operation position,
assigning the position of the intersection point on the second straight line as a tracking result of the third interface operation position, and redrawing the first straight line based on the assigned intersection; and
receiving a fifth instruction generated when a fifth operation is input using the human-computer interaction device,
monitoring a fourth interface operation position corresponding to the fifth operation,
tracking a change in the horizontal direction of the fourth interface operation position,
and assigning the position of the intersection point on the first straight line as a tracking result of the fourth interface operation position, and redrawing the second straight line based on the assigned intersection point.
2. The method of claim 1, wherein the first line and the second line are displayed differently.
3. The method of claim 1, wherein the first line and the second line are disposed above a second layer, the ultrasound image is displayed on the first layer below the second layer, and the second layer is a transparent layer.
4. The method of claim 1, further comprising:
receiving a second instruction generated when a second operation is input using the human-computer interaction device;
extracting a corresponding first interface operation position according to the second instruction; and
and recording the first interface operation position associated with the second instruction.
5. The method of claim 1, wherein updating the display of the first line and/or the second line in accordance with the first instruction and linking the display of the first line and/or the second line with the input of the first operation further comprises:
receiving a sixth instruction generated when a human-computer interaction device is used for inputting a sixth operation;
extracting rotation indication information according to the sixth instruction; and
and rotating the first straight line and/or the second straight line by taking the intersection point as a center according to the extracted rotation indication information.
6. The method of claim 4, wherein the recording the first interface operation position associated with the second instruction comprises:
changing a property of the first line, the first line being located at an interface operation position associated with the second instruction, wherein the changing the property of the first line includes at least one of:
configuring the first line to be immovable, an
And setting the attribute of the first straight line as a preset attribute.
7. The method of claim 6, wherein the modifying the property of the first line comprises:
drawing a movable cursor at the intersection of the first line and the second line.
8. The method of claim 6, wherein the modifying the property of the first line comprises:
and adjusting the second straight line to be hidden, or deleting the second straight line, or changing the second straight line into a change line with the length capable of changing along with the position of the movable cursor.
9. The method of claim 7, wherein the movable cursor is movable to any position on both sides of the first line.
10. The method of claim 7, further comprising:
receiving a seventh instruction generated when a seventh operation is input using the human-computer interaction device,
according to the seventh instruction, the display of the movable cursor is linked with the seventh operation,
receiving an eighth instruction generated when an eighth operation is input using the human-computer interaction device,
extracting the corresponding interface operation position according to the eighth instruction, and
recording the interface operation position associated with the eighth instruction.
11. The method of claim 1, wherein a position indication mark is displayed on the ultrasound image.
12. The method of claim 9, wherein when the movable cursor moves to any position on either side of the first straight line, a vertical line is drawn from the interface operation position where the movable cursor is currently located to the first straight line.
13. The method of claim 4, wherein after said recording of the first interface operation position associated with the second instruction, the method further comprises:
drawing a position mark, wherein the position mark is positioned on the second straight line.
14. The method of claim 10, wherein the recording the first interface operating position associated with the second instruction or the recording the interface operating position associated with the eighth instruction comprises:
drawing a position mark, wherein the position mark is positioned on the second straight line or the position mark is positioned at an interface operation position associated with the eighth instruction.
15. The method of claim 1, wherein the position of the intersection point is arbitrarily moved on the first line or the second line.
16. The method of claim 13, wherein after said recording of the first interface operation position associated with the second instruction, the method further comprises:
a portion of a second line is displayed, the portion of the second line being located between two position markers and/or between a position marker and the first line.
17. The method of claim 1, wherein the displaying of the first line is interlocked with the first operation in an active current image display window.
18. A method of ultrasound image based parameter measurement, comprising:
acquiring an ultrasound image, wherein the ultrasound image comprises target tissue, and the ultrasound image is obtained by receiving an ultrasound signal from the target tissue by an ultrasound probe;
displaying an ultrasonic image;
displaying a third straight line on the ultrasonic image in an overlapping manner;
displaying a movable cursor on the third straight line in an overlapping manner;
receiving a ninth instruction generated when a ninth operation is input using the human-computer interaction device;
according to the ninth instruction, enabling the display of the movable cursor to be linked with the ninth operation;
receiving at least one tenth instruction generated when the human-computer interaction equipment is used for inputting at least one tenth operation;
extracting a corresponding interface operation position according to the tenth instruction;
recording at least one interface operation position associated with the tenth instruction.
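Not claim language: one plausible downstream use of the positions recorded in claim 18 is a signed distance from each recorded point to the third straight line (as in reference-line measurements). The sketch below assumes a point-and-direction representation of the line; all names are illustrative:

```python
import numpy as np

def point_line_distance(p, a, d):
    """Signed distance from point p to the line through a with direction d
    (positive on one side of the line, negative on the other)."""
    d = np.asarray(d, float) / np.linalg.norm(d)
    n = np.array([-d[1], d[0]])  # unit normal of the line
    return float(np.dot(np.asarray(p, float) - np.asarray(a, float), n))

# Third straight line: through (0, 50) at 30 degrees to the horizontal.
theta = np.deg2rad(30.0)
a, d = np.array([0.0, 50.0]), np.array([np.cos(theta), np.sin(theta)])

# Interface operation positions recorded via tenth instructions.
for p in [np.array([80.0, 30.0]), np.array([120.0, 140.0])]:
    print(round(point_line_distance(p, a, d), 2))  # -57.32, then 17.94
```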
19. The method of claim 18, wherein the movable cursor is movable to any position on either side of the third straight line.
20. The method of claim 19, wherein, when the movable cursor moves to any position on either side of the third straight line, a perpendicular line is drawn from the interface operation position of the movable cursor at the current time to the third straight line.
21. The method of claim 18, further comprising, after recording the interface operation position associated with the tenth instruction:
drawing a position marker at the interface operation position associated with the tenth instruction; and
displaying a perpendicular line between the position marker and the third straight line.
22. The method of claim 18, wherein the display of the movable cursor is linked with the ninth operation within the activated current image display window.
23. The method of claim 18, wherein said displaying a third straight line superimposed on the ultrasound image and displaying a movable cursor superimposed on the third straight line comprises:
displaying a horizontally placed fourth straight line on the ultrasound image;
displaying a movable cursor superimposed on the fourth straight line;
receiving an eleventh instruction generated when an eleventh operation is input through the human-computer interaction device; and
updating the display of the fourth straight line according to the eleventh instruction to obtain the third straight line, the display of the fourth straight line being linked with the input of the eleventh operation during the update.
24. The method of claim 23, wherein said receiving an eleventh instruction generated when an eleventh operation is input through the human-computer interaction device, and updating the display of the fourth straight line according to the eleventh instruction to obtain the third straight line comprises:
receiving a twelfth instruction generated when a twelfth operation is input through the human-computer interaction device;
monitoring a fifth interface operation position corresponding to the twelfth operation, wherein the fifth interface operation position is any position in the image display window;
tracking changes of the fifth interface operation position; and
assigning the tracking result of the fifth interface operation position to the position of the movable cursor, and redrawing the fourth straight line based on the updated movable cursor.
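Claim 24's drag behaviour, assigning the tracked pointer position to the movable cursor and redrawing the fourth straight line through it, can be sketched as below. Editorial illustration only; it assumes the fourth line is horizontal and re-anchored at the cursor:

```python
class FourthLine:
    """A horizontal line that passes through the movable cursor and is
    redrawn as the twelfth operation drags the pointer."""
    def __init__(self, cursor=(0.0, 0.0)):
        self.cursor = cursor

    def on_drag(self, x, y):
        self.cursor = (x, y)   # assign the tracked position to the cursor...

    def y_at(self, x):
        return self.cursor[1]  # ...and redraw: horizontal line through cursor

line = FourthLine()
for pos in [(10, 20), (15, 33)]:  # pointer positions during the drag
    line.on_drag(*pos)
print(line.y_at(100))  # 33 — the line now passes through the last cursor position
```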
25. The method of claim 23, wherein said receiving an eleventh instruction generated when an eleventh operation is input through the human-computer interaction device, and updating the display of the fourth straight line according to the eleventh instruction to obtain the third straight line comprises:
receiving a thirteenth instruction generated when a thirteenth operation is input through the human-computer interaction device;
extracting rotation indication information according to the thirteenth instruction; and
rotating the fourth straight line about the position of the movable cursor according to the extracted rotation indication information.
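The rotation of claim 25 is a standard 2-D rotation of the line's direction vector; the movable cursor is the centre and stays on the line. A minimal sketch (editorial, not part of the claims):

```python
import math

def rotate_direction(d, angle_rad):
    """Rotate a 2-D direction vector counter-clockwise by angle_rad."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * d[0] - s * d[1], s * d[0] + c * d[1])

# The fourth line starts horizontal through the cursor; a thirteenth
# instruction carrying +15 degrees of rotation indication updates it.
cursor, direction = (64.0, 64.0), (1.0, 0.0)
direction = rotate_direction(direction, math.radians(15.0))
print(direction)  # ~(0.966, 0.259); the line still passes through the cursor
```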
26. An ultrasound imaging system, characterized in that the ultrasound imaging system comprises:
a probe;
a transmitting circuit and a receiving circuit, configured to excite the probe to transmit ultrasonic beams to target tissue, receive echoes of the ultrasonic beams, and obtain ultrasonic echo signals;
an image processing module, configured to obtain an ultrasound image according to the ultrasonic echo signals; and
a display, configured to display the ultrasound image;
wherein the image processing module is further configured to display, superimposed on the ultrasound image, a horizontally placed first straight line and a vertically placed second straight line, the first straight line and the second straight line intersecting to generate an intersection point, and to receive a first instruction generated when a first operation is input through a human-computer interaction device;
the image processing module updates the display of the first straight line and/or the second straight line according to the first instruction, the display of the first straight line and/or the second straight line being linked with the input of the first operation by:
receiving a third instruction generated when a third operation is input through the human-computer interaction device,
monitoring a second interface operation position corresponding to the third operation, wherein the second interface operation position is any position in the image display window, and
tracking changes of the second interface operation position, assigning the tracking result of the second interface operation position to the position of the intersection point, and redrawing the first straight line and the second straight line based on the updated intersection point; or,
the image processing module updates the display of the first straight line and/or the second straight line according to the first instruction, the display of the first straight line and/or the second straight line being linked with the input of the first operation in at least one of the following ways:
receiving a fourth instruction generated when a fourth operation is input through the human-computer interaction device,
monitoring a third interface operation position corresponding to the fourth operation,
tracking changes of the third interface operation position in the vertical direction, and
assigning the tracking result of the third interface operation position to the position of the intersection point on the second straight line, and redrawing the first straight line based on the updated intersection point; and
receiving a fifth instruction generated when a fifth operation is input through the human-computer interaction device,
monitoring a fourth interface operation position corresponding to the fifth operation,
tracking changes of the fourth interface operation position in the horizontal direction, and
assigning the tracking result of the fourth interface operation position to the position of the intersection point on the first straight line, and redrawing the second straight line based on the updated intersection point.
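The third/fourth/fifth-instruction behaviours of claim 26 describe a crosshair whose intersection point is moved freely, vertically only, or horizontally only. A compact editorial sketch of that state machine (class and method names are assumptions, not claim language):

```python
class Crosshair:
    """A horizontal first line and a vertical second line meeting at
    `intersection`. A free drag moves both lines; a vertical-only drag
    moves the horizontal line; a horizontal-only drag moves the vertical one."""
    def __init__(self, x=0.0, y=0.0):
        self.intersection = [x, y]

    def drag_free(self, x, y):        # third instruction: move freely
        self.intersection = [x, y]

    def drag_vertical(self, _x, y):   # fourth instruction: only y is tracked
        self.intersection[1] = y      # horizontal first line redrawn at y

    def drag_horizontal(self, x, _y): # fifth instruction: only x is tracked
        self.intersection[0] = x      # vertical second line redrawn at x

ch = Crosshair(100, 100)
ch.drag_vertical(250, 140)  # pointer moved to (250, 140); only y applies
print(ch.intersection)      # [100, 140]
```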
27. The ultrasound imaging system of claim 26, wherein the first straight line and the second straight line are displayed with different display properties.
28. The ultrasound imaging system of claim 26, wherein the image processing module is further configured to:
receive a second instruction generated when a second operation is input through the human-computer interaction device;
extract a corresponding first interface operation position according to the second instruction; and
record the first interface operation position associated with the second instruction.
29. The ultrasound imaging system of claim 26, wherein the image processing module updates the display of the first straight line and/or the second straight line according to the first instruction, the display of the first straight line and/or the second straight line being linked with the input of the first operation by:
receiving a sixth instruction generated when a sixth operation is input through the human-computer interaction device;
extracting rotation indication information according to the sixth instruction; and
rotating the first straight line and/or the second straight line about the intersection point according to the extracted rotation indication information.
30. The ultrasound imaging system of claim 28, wherein the image processing module records the first interface operation position associated with the second instruction by:
changing a property of the first straight line, the first straight line being located at the interface operation position associated with the second instruction, wherein changing the property of the first straight line includes at least one of:
configuring the first straight line to be immovable, and
setting the property of the first straight line to a preset property.
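Claim 30's property change, freezing the first straight line and applying a preset display property once a position is recorded, can be modelled as a small state object. Sketch only; the field names and preset values are assumptions:

```python
from dataclasses import dataclass

@dataclass
class LineProps:
    """Display state of the first straight line."""
    movable: bool = True
    color: str = "yellow"
    style: str = "dashed"

def record_first_position(props: LineProps) -> LineProps:
    props.movable = False  # configure the first line to be immovable
    props.color = "green"  # apply the preset property set
    props.style = "solid"
    return props

print(record_first_position(LineProps()))
# LineProps(movable=False, color='green', style='solid')
```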
31. The ultrasound imaging system of claim 30, wherein changing the property of the first straight line displayed at the interface operation position comprises:
drawing a movable cursor at the intersection of the first straight line and the second straight line.
32. The ultrasound imaging system of claim 31, wherein the movable cursor is movable to any position on either side of the first straight line.
33. The ultrasound imaging system of claim 31, wherein the image processing module is further configured to:
receive a seventh instruction generated when a seventh operation is input through the human-computer interaction device;
link the display of the movable cursor with the seventh operation according to the seventh instruction;
receive an eighth instruction generated when an eighth operation is input through the human-computer interaction device;
extract the corresponding interface operation position according to the eighth instruction; and
record the interface operation position associated with the eighth instruction.
34. The ultrasound imaging system of claim 32, wherein, when the movable cursor moves to any position on either side of the first straight line, the image processing module draws a perpendicular line from the interface operation position of the movable cursor at the current time to the first straight line.
35. The ultrasound imaging system of claim 28, wherein the image processing module, after recording the first interface operation position associated with the second instruction, is further configured to:
draw a position marker, wherein the position marker is located on the second straight line.
36. The ultrasound imaging system of claim 33, wherein the image processing module, after recording the first interface operation position associated with the second instruction or recording the interface operation position associated with the eighth instruction, is further configured to:
draw a position marker, wherein the position marker is located on the second straight line or at the interface operation position associated with the eighth instruction.
37. The ultrasound imaging system of claim 34, wherein the intersection point is arbitrarily movable along the first straight line or the second straight line.
38. The ultrasound imaging system of claim 28, wherein the image processing module, after recording the first interface operation position associated with the second instruction, is further configured to:
display a portion of the second straight line, the portion being located between two position markers and/or between a position marker and the first straight line.
39. The ultrasound imaging system of claim 26, wherein the display of the first straight line is linked with the first operation within the activated current image display window.
40. The ultrasound imaging system of claim 33, wherein the image processing module, after recording the first interface operation position associated with the second instruction or recording the interface operation position associated with the eighth instruction, is further configured to:
draw a position marker at the interface operation position associated with the eighth instruction; and
display a perpendicular line between the position marker and the first straight line.
41. An ultrasound imaging system, characterized in that the ultrasound imaging system comprises:
a probe;
a transmitting circuit and a receiving circuit, configured to excite the probe to transmit ultrasonic beams to target tissue, receive echoes of the ultrasonic beams, and obtain ultrasonic echo signals;
an image processing module, configured to obtain an ultrasound image according to the ultrasonic echo signals; and
a display, configured to display the ultrasound image;
wherein the image processing module is further configured to:
display a third straight line superimposed on the ultrasound image;
display a movable cursor superimposed on the third straight line;
receive a ninth instruction generated when a ninth operation is input through the human-computer interaction device;
link the display of the movable cursor with the ninth operation according to the ninth instruction;
receive at least one tenth instruction generated when at least one tenth operation is input through the human-computer interaction device;
extract a corresponding interface operation position according to the tenth instruction; and
record at least one interface operation position associated with the tenth instruction.
42. The ultrasound imaging system of claim 41, wherein the movable cursor is movable to any position on either side of the third straight line.
43. The ultrasound imaging system of claim 41, wherein, when the movable cursor moves to any position on either side of the third straight line, the image processing module draws a perpendicular line from the interface operation position of the movable cursor at the current time to the third straight line.
44. The ultrasound imaging system of claim 41, wherein the image processing module, after recording the interface operation position associated with the tenth instruction, is further configured to:
draw a position marker at the interface operation position associated with the tenth instruction; and
display a perpendicular line between the position marker and the third straight line.
45. The ultrasound imaging system of claim 41, wherein the display of the movable cursor is linked with the ninth operation within the activated current image display window.
46. The ultrasound imaging system of claim 41, wherein the image processing module displays the third straight line superimposed on the ultrasound image and the movable cursor superimposed on the third straight line by:
displaying a horizontally placed fourth straight line on the ultrasound image;
displaying a movable cursor superimposed on the fourth straight line;
receiving an eleventh instruction generated when an eleventh operation is input through the human-computer interaction device; and
updating the display of the fourth straight line according to the eleventh instruction to obtain the third straight line, the display of the fourth straight line being linked with the input of the eleventh operation during the update.
47. The ultrasound imaging system of claim 46, wherein the image processing module receives the eleventh instruction generated when the eleventh operation is input through the human-computer interaction device, and updates the display of the fourth straight line according to the eleventh instruction to obtain the third straight line by:
receiving a twelfth instruction generated when a twelfth operation is input through the human-computer interaction device;
monitoring a fifth interface operation position corresponding to the twelfth operation, wherein the fifth interface operation position is any position in the image display window;
tracking changes of the fifth interface operation position; and
assigning the tracking result of the fifth interface operation position to the position of the movable cursor, and redrawing the fourth straight line based on the updated movable cursor.
48. The ultrasound imaging system of claim 46, wherein the image processing module receives the eleventh instruction generated when the eleventh operation is input through the human-computer interaction device, and updates the display of the fourth straight line according to the eleventh instruction to obtain the third straight line by:
receiving a thirteenth instruction generated when a thirteenth operation is input through the human-computer interaction device;
extracting rotation indication information according to the thirteenth instruction; and
rotating the fourth straight line about the position of the movable cursor according to the extracted rotation indication information.
CN201710032665.4A 2017-01-16 2017-01-16 Parameter measuring method based on ultrasonic image and ultrasonic imaging system Active CN108309347B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110300246.0A CN113367722A (en) 2017-01-16 2017-01-16 Parameter measuring method based on ultrasonic image and ultrasonic imaging system
CN201710032665.4A CN108309347B (en) 2017-01-16 2017-01-16 Parameter measuring method based on ultrasonic image and ultrasonic imaging system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710032665.4A CN108309347B (en) 2017-01-16 2017-01-16 Parameter measuring method based on ultrasonic image and ultrasonic imaging system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202110300246.0A Division CN113367722A (en) 2017-01-16 2017-01-16 Parameter measuring method based on ultrasonic image and ultrasonic imaging system

Publications (2)

Publication Number Publication Date
CN108309347A CN108309347A (en) 2018-07-24
CN108309347B true CN108309347B (en) 2021-04-09

Family

ID=62891217

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110300246.0A Pending CN113367722A (en) 2017-01-16 2017-01-16 Parameter measuring method based on ultrasonic image and ultrasonic imaging system
CN201710032665.4A Active CN108309347B (en) 2017-01-16 2017-01-16 Parameter measuring method based on ultrasonic image and ultrasonic imaging system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202110300246.0A Pending CN113367722A (en) 2017-01-16 2017-01-16 Parameter measuring method based on ultrasonic image and ultrasonic imaging system

Country Status (1)

Country Link
CN (2) CN113367722A (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111012326B (en) * 2018-10-09 2022-07-05 深圳市理邦精密仪器股份有限公司 Pelvic floor calibration method, device and computer-readable storage medium
CN110570423A (en) * 2019-09-20 2019-12-13 深圳开立生物医疗科技股份有限公司 pelvic floor measuring method and system and ultrasonic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110301454A1 (en) * 2009-02-13 2011-12-08 Hitachi Medical Corporation Medical image display method, medical image diagnostic apparatus, and medical image display device
CN102405078A (en) * 2009-04-20 2012-04-04 皇家飞利浦电子股份有限公司 A control apparatus for controlling a therapeutic apparatus
CN103096807A (en) * 2010-09-10 2013-05-08 富士胶片株式会社 Ultrasound diagnostic device and method
CN103356236A (en) * 2012-04-02 2013-10-23 富士胶片株式会社 An ultrasound diagnostic apparatus
JP2015100479A (en) * 2013-11-22 2015-06-04 日立アロカメディカル株式会社 Ultrasonic image processor

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000139920A (en) * 1998-11-06 2000-05-23 Matsushita Electric Ind Co Ltd Image diagnosis device
JP4414720B2 (en) * 2003-10-14 2010-02-10 オリンパス株式会社 Ultrasonic diagnostic equipment
KR101194289B1 (en) * 2010-09-14 2012-10-24 삼성메디슨 주식회사 3d ultrasound system for 3d modeling of tissue and method for operating 3d ultrasound system
JP2016086880A (en) * 2014-10-30 2016-05-23 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Ultrasound image display apparatus and control program therefor

Also Published As

Publication number Publication date
CN113367722A (en) 2021-09-10
CN108309347A (en) 2018-07-24

Similar Documents

Publication Publication Date Title
CN105243676B (en) Method for displaying ultrasonic image and ultrasonic equipment used by same
JP5702922B2 (en) An ultrasound system for visualizing an ultrasound probe on an object
KR102185726B1 (en) Method and ultrasound apparatus for displaying a ultrasound image corresponding to a region of interest
CN114027880B (en) Method for measuring parameters in ultrasonic image and ultrasonic imaging system
KR101630761B1 (en) Ultrasound apparatus and method for providing information using the ultrasound apparatus
US20170090571A1 (en) System and method for displaying and interacting with ultrasound images via a touchscreen
EP2783635B1 (en) Ultrasound system and method of providing direction information of object
CN108309354B (en) Ultrasonic pelvic floor detection guiding method and ultrasonic imaging system
CN102028500B (en) Ultrasonic diagnosis apparatus, ultrasonic image processing apparatus, ultrasonic image processing method
US20140164965A1 (en) Ultrasound apparatus and method of inputting information into same
JP2017136451A (en) Ultrasonic diagnostic device
US20110208052A1 (en) Breast ultrasound annotation user interface
CN112741648B (en) Method and system for multi-mode ultrasound imaging
KR102388130B1 (en) Apparatus and method for displaying medical image
KR20150003560A (en) The method and apparatus for changing user interface based on user motion information
CN109475343A (en) Ultrasonic elasticity measures display methods and system
EP3666193A1 (en) Ultrasound imaging apparatus, method of controlling the same, and computer program product
CN108309347B (en) Parameter measuring method based on ultrasonic image and ultrasonic imaging system
US20180146954A1 (en) Method of ultrasound apparatus parameters configuration and an ultrasound apparatus of using the same
US20200229795A1 (en) Method and systems for color flow imaging of arteries and veins
CN111053572A (en) Method and system for motion detection and compensation in medical images
CN111093511B (en) Medical image display device and tracking line processing method
JPWO2017104263A1 (en) Ultrasonic observation apparatus, processing apparatus, operation method of ultrasonic observation apparatus, and operation program of ultrasonic observation apparatus
US20190183453A1 (en) Ultrasound imaging system and method for obtaining head progression measurements
JP7310276B2 (en) Medical image display device, ultrasonic diagnostic imaging device, display control method, and display control program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20180724
Assignee: Shenzhen Mindray Animal Medical Technology Co.,Ltd.
Assignor: SHENZHEN MINDRAY BIO-MEDICAL ELECTRONICS Co.,Ltd.
Contract record no.: X2022440020009
Denomination of invention: Ultrasound Image-Based Parameter Measurement Method and Ultrasound Imaging System
Granted publication date: 20210409
License type: Common License
Record date: 20220804