CN109069105B - Ultrasonic medical detection equipment, imaging control method, imaging system and controller - Google Patents



Publication number
CN109069105B
Authority
CN
China
Prior art keywords
extension line
icon
contact
display screen
touch display
Prior art date
Legal status
Active
Application number
CN201780024747.2A
Other languages
Chinese (zh)
Other versions
CN109069105A (en)
Inventor
刘智光
周述文
何绪金
Current Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Original Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd filed Critical Shenzhen Mindray Bio Medical Electronics Co Ltd
Publication of CN109069105A
Application granted
Publication of CN109069105B

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]

Abstract

The invention provides an ultrasonic medical detection device, an imaging control method, an imaging system and a controller. The device superimposes an icon on an ultrasound image (S218), displays an extension line extending from an end of the icon in a predetermined direction (S220), monitors the movement of the contact between an input object and the touch display screen (S224), determines the operation position on the touch display screen corresponding to the movement of the contact (S226), and updates the display of the extension line and the icon so that the extension line passes through the operation position (S228). This makes operation more convenient for the user and greatly improves the user experience.

Description

Ultrasonic medical detection equipment, imaging control method, imaging system and controller
Technical Field
The invention relates to an ultrasonic imaging control method and an imaging system that use a touch display screen.
Background
In ultrasound imaging systems, it is often necessary to adjust the angle of an object on a screen. In the prior art, the object whose angle needs adjustment is usually clicked with a finger or a stylus and then dragged on the screen to adjust its angle.
However, when there are many objects on the screen, or the object to be selected is small, it is difficult to select the target object: when the user clicks on the screen with a finger or stylus, the target is not easy to hit and another object may be selected by mistake. Moreover, when the target to be adjusted is small, even a small stroke of the finger or stylus can rotate the target by a large angle, so it is difficult for the user to finely control the size of the adjusted angle.
Disclosure of Invention
Based on this, it is necessary to provide an ultrasonic medical detection apparatus, an imaging control method, an imaging system, and a controller, aiming at the problem of inconvenient operation in the prior art.
In one embodiment, an ultrasonic medical examination apparatus is provided, comprising:
a probe;
the transmitting circuit and the receiving circuit are used for exciting the probe to transmit ultrasonic beams to a detection object and receiving echoes of the ultrasonic beams to obtain ultrasonic echo signals;
the image processing module is used for obtaining an ultrasonic image according to the ultrasonic echo signal;
a touch display screen;
a first memory storing a computer program that runs on a processor; and
a first processor that implements the following steps when executing the program:
displaying the ultrasound image on the touch display screen,
an icon is superimposed and displayed on the ultrasound image,
displaying an extension line extending in a predetermined direction from an end of the icon,
monitoring movement of an input object in contact with the touch display screen,
determining that the movement of the contact corresponds to an operational position on the touch display screen,
updating the display of the extension line and the icon so that the extension line passes through the operation position,
identifying a termination position of the contact, and
displaying the extension line and the icon so that the extension line passes through the termination position, wherein the relative positional relationship between the icon and the extension line is always maintained.
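The geometric core of these steps, redrawing the extension line so that it always passes through the current operation position while the icon keeps its pose relative to the line, can be sketched as follows. This is a minimal illustration; the function name and point representation are assumptions, not taken from the patent.

```python
import math

def extension_line_angle(anchor, operation_position):
    """Angle (degrees) of the extension line from its anchor at the
    icon's end, chosen so the line passes through the touch operation
    position. Rotating the icon by the same angle keeps the relative
    position of icon and line unchanged."""
    dx = operation_position[0] - anchor[0]
    dy = operation_position[1] - anchor[1]
    return math.degrees(math.atan2(dy, dx))

# As the contact moves, the display is updated with the new angle;
# when the contact ends, the angle at the termination position is kept.
angle = extension_line_angle((100.0, 100.0), (200.0, 200.0))
```

Because the user can grab the extension line far from the icon, a small angular change corresponds to a large fingertip movement, which is what makes fine angle control possible.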
In one embodiment, an ultrasound imaging control method is provided, which includes:
exciting a probe to emit an ultrasonic beam to a detection object;
receiving the echo of the ultrasonic beam to obtain an ultrasonic echo signal;
obtaining an ultrasonic image according to the ultrasonic echo signal;
displaying the ultrasound image on the touch display screen,
an icon is superimposed and displayed on the ultrasound image,
displaying an extension line extending in a predetermined direction from an end of the icon,
monitoring movement of an input object in contact with the touch display screen,
determining that the movement of the contact corresponds to an operational position on the touch display screen,
updating the display of the extension line and the icon so that the extension line passes through the operation position,
identifying a termination position of the contact, and
displaying the extension line and the icon so that the extension line passes through the termination position, wherein the relative positional relationship between the icon and the extension line is always maintained.
In one embodiment, an ultrasound imaging system is provided, comprising: an ultrasonic medical detection device and an intelligent controller; wherein:
the ultrasonic medical detection apparatus includes:
a probe;
the transmitting circuit and the receiving circuit are used for exciting the probe to transmit ultrasonic beams to a detection object, receiving echoes of the ultrasonic beams and obtaining ultrasonic echo signals;
the image processing module is used for obtaining an ultrasonic image according to the ultrasonic echo signal; and
the first communication module is electrically connected with the image processing module and is used for transmitting the ultrasonic image data to the intelligent controller and/or receiving a control signal input by the intelligent controller so as to set ultrasonic imaging parameters required for obtaining the ultrasonic image;
the intelligent controller includes:
a touch display screen;
the second communication module is used for receiving the ultrasonic image data transmitted by the first communication module and/or sending a control signal to the first communication module;
a second memory storing a computer program that runs on a processor; and
a second processor that implements the following steps when executing the program:
displaying the ultrasound image on the touch display screen,
an icon is superimposed and displayed on the ultrasound image,
displaying an extension line extending in a predetermined direction from an end of the icon,
monitoring movement of an input object in contact with the touch display screen,
determining that the movement of the contact corresponds to an operational position on the touch display screen,
updating the display of the extension line and the icon so that the extension line passes through the operation position,
identifying a termination position of the contact,
displaying the extension line and the icon so that the extension line passes through the termination position, wherein the relative positional relationship between the icon and the extension line is always maintained, and
generating a control signal containing ultrasonic imaging parameters according to the icon, and/or generating image data containing the ultrasonic image and remark information;
and outputting the control signal to the first communication module through the second communication module, or outputting the image data.
In one embodiment, there is provided an intelligent controller comprising:
a touch display screen;
the second communication module is used for receiving ultrasonic image data transmitted by the ultrasonic medical detection equipment and/or sending a control signal to the ultrasonic medical detection equipment;
a second memory storing a computer program that runs on a processor; and
a second processor that implements the following steps when executing the program:
displaying the ultrasound image on the touch display screen,
an icon is superimposed and displayed on the ultrasound image,
displaying an extension line extending in a predetermined direction from an end of the icon,
monitoring movement of an input object in contact with the touch display screen,
determining that the movement of the contact corresponds to an operational position on the touch display screen,
updating the display of the extension line and the icon so that the extension line passes through the operation position,
identifying a termination position of the contact,
displaying the extension line and the icon so that the extension line passes through the termination position, wherein the relative positional relationship between the icon and the extension line is always maintained, and
generating a control signal containing ultrasonic imaging parameters according to the icon, and/or generating image data containing the ultrasonic image and remark information;
and outputting the control signal to the first communication module through the second communication module, or outputting the image data.
Drawings
FIG. 1 is a system architecture diagram of an ultrasonic medical detection device according to some embodiments;
FIG. 2 is a system architecture diagram of an ultrasonic medical detection device according to some embodiments;
FIG. 3 is a system architecture diagram of an ultrasound detection system according to some embodiments;
FIG. 4 is a flow chart of the ultrasound imaging control method in the embodiments shown in FIG. 1 or FIG. 2;
FIG. 5 shows an operational input on a graphical user interface for an icon 510 superimposed on an ultrasound image 501 in some embodiments;
FIG. 6 shows a rotation operation on a graphical user interface for an icon 610 superimposed on an ultrasound image 601 in some embodiments;
FIG. 7 shows a panning operation on a graphical user interface for an icon 710 superimposed on an ultrasound image 701 in some embodiments;
FIG. 8 is a schematic diagram of another implementation flow of the ultrasound imaging control method in the embodiments shown in FIG. 1 or FIG. 2.
Detailed Description
The present invention will be described in further detail with reference to the following detailed description and accompanying drawings. Wherein like elements in different embodiments are numbered with like associated elements. In the following description, numerous details are set forth in order to provide a better understanding of the present application. However, those skilled in the art will readily recognize that some of the features may be omitted or replaced with other elements, materials, methods in different instances. In some instances, certain operations related to the present application have not been shown or described in detail in order to avoid obscuring the core of the present application from excessive description, and it is not necessary for those skilled in the art to describe these operations in detail, so that they may be fully understood from the description in the specification and the general knowledge in the art.
Furthermore, the features, operations, or characteristics described in the specification may be combined in any suitable manner to form various embodiments. Likewise, the various steps or actions in the method descriptions may be transposed or reordered in ways apparent to those skilled in the art. Thus, the various sequences in the specification and drawings are for the purpose of describing certain embodiments only and do not imply a required order unless it is otherwise indicated that a particular sequence must be followed.
The numbering of components, e.g., "first", "second", etc., is used herein only to distinguish the objects described and carries no sequential or technical meaning. The terms "connected" and "coupled", as used in this application and unless otherwise indicated, include both direct and indirect connections (couplings).
Fig. 1 shows a schematic structural diagram of an ultrasonic medical detection apparatus 100 in an embodiment, and a specific structure is as follows. The ultrasonic medical inspection apparatus 100 shown in fig. 1 mainly includes: a probe 101, a transmission circuit 103, a transmission/reception selection switch 102, a reception circuit 104, a beam forming module 105, a signal processing module 116, and an image processing module 126. In the ultrasound imaging process, the transmission circuit 103 transmits a delay-focused transmission pulse having a certain amplitude and polarity to the probe 101 through the transmission/reception selection switch 102. The probe 101 is excited by the transmission pulse, transmits an ultrasonic wave (which may be any one of a plane wave, a focused wave, or a divergent wave) to a detection object (for example, an organ, a tissue, a blood vessel, etc. in a human body or an animal body, not shown in the figure), receives an ultrasonic echo reflected from a target region with information of the detection object after a certain delay, and converts the ultrasonic echo back into an electric signal again. The receiving circuit 104 receives the electric signals generated by the conversion of the probe 101, obtains ultrasonic echo signals, and sends the ultrasonic echo signals to the beam forming module 105. The beam forming module 105 performs focusing delay, weighting, channel summation and other processing on the ultrasonic echo signal, and then sends the ultrasonic echo signal to the signal processing module 116 for related signal processing. The ultrasonic echo signals processed by the signal processing module 116 are sent to the image processing module 126. 
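The "focusing delay, weighting, channel summation" performed by the beam forming module 105 is, in essence, delay-and-sum beamforming. The sketch below is a minimal illustration under simplifying assumptions (integer sample delays, per-channel scalar apodization weights); names and shapes are illustrative, not taken from the patent.

```python
import numpy as np

def delay_and_sum(channel_data, delays, weights):
    """channel_data: (n_channels, n_samples) echo samples per element.
    delays: per-channel focusing delays in whole samples.
    weights: per-channel apodization weights.
    Returns one beamformed scan line of n_samples."""
    out = np.zeros(channel_data.shape[1])
    for ch, (d, w) in enumerate(zip(delays, weights)):
        # Align this channel's echoes to the focal point, then accumulate.
        out += w * np.roll(channel_data[ch], -d)
    return out
```

After alignment, echoes originating at the focal point add coherently across channels while off-axis echoes do not, which is what concentrates the receive sensitivity along the scan line.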
The image processing module 126 performs different processing on the signals according to the imaging mode required by the user to obtain ultrasound image data of different modes, and then performs processing such as logarithmic compression, dynamic range adjustment and digital scan conversion to form ultrasound images of different modes, such as a B image, a C image, a D image, etc., or other types of two-dimensional or three-dimensional ultrasound images. The transmitting circuit and the receiving circuit excite the probe to transmit ultrasonic beams to the detection object according to the ultrasound imaging parameter settings, and receive the echoes of these beams to obtain ultrasonic echo signals, thereby producing the desired ultrasound image data for displaying the tissue structure within the detection object. The ultrasound imaging parameters referred to herein are all parameters that the user can select autonomously during the imaging of the ultrasound tissue image, such as TGC (Time Gain Compensation), acoustic frequency, pulse repetition frequency (PRF), ultrasound type, and dynamic range, among others.
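The logarithmic compression and dynamic range adjustment mentioned above can be sketched as mapping echo envelope amplitudes onto a display range. This is a common formulation, not the patent's; the 60 dB default and the normalization to the strongest echo are assumptions.

```python
import numpy as np

def log_compress(envelope, dynamic_range_db=60.0):
    """Map envelope amplitudes to [0, 1] gray levels over the given
    dynamic range, relative to the strongest echo in the frame."""
    env = np.asarray(envelope, dtype=float)
    # Amplitude ratio in decibels; the floor avoids log10(0).
    db = 20.0 * np.log10(np.maximum(env, 1e-12) / env.max())
    # Echoes more than dynamic_range_db below the maximum map to black.
    return np.clip((db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)
```

Lowering `dynamic_range_db` suppresses weak echoes and raises contrast, which is why dynamic range is listed among the user-adjustable imaging parameters.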
In some embodiments of the present invention, the signal processing module 116 and the image processing module 126 in fig. 1 may be integrated on one main board 106, or one or more of these modules may be integrated on a single processor/controller chip.
The obtained ultrasound image may be output to the display controller 170 and then displayed. The display controller 170 is connected to the image processing module through an input/output interface to implement data transmission. The display controller 170 may include a first touch display screen 130, a processor 140, and a first memory 160. The processor 140 invokes computer program instructions recorded on the first memory 160 to display the ultrasound image on the first touch screen 130 and/or to form a graphical user interface on the touch screen. In one embodiment, a Graphical User Interface (GUI) is displayed on the first touch display 130 and graphical controls such as those mentioned above with respect to ultrasound imaging parameter adjustments, various function buttons, and the like, involved in ultrasound image imaging procedures are presented. Control instructions of corresponding operations of the graphical controls generated by the operation of the input object on the touch display can be obtained based on a Graphical User Interface (GUI), and the control instructions of the information about the ultrasonic imaging parameters and the like can be transmitted to the ultrasonic medical detection equipment in a wired or wireless mode and used for controlling the operation of the probe, the transmitting circuit, the receiving circuit and the like so as to obtain a desired ultrasonic image. For the display of the ultrasound image, for example, the ultrasound image may be displayed on two display screens, respectively, or displayed on the same display screen in a split-screen manner. The ultrasonic image may be displayed on the touch display screen, and a Graphical User Interface (GUI) for inputting a user operation instruction may be displayed. 
Of course, a display area for displaying the ultrasound image may also be set based on a Graphical User Interface (GUI), and then the ultrasound image is edited by a gesture input of a user, the editing including: adjusting image size, adjusting definition, performing annotation, and the like.
Based on the graphical user interface displayed on the touch display screen, the processor 140 may invoke the gesture detection module 113 stored in the memory 160 to detect a control instruction obtained by the user performing a contact operation on the graphical user interface through the input object. In various embodiments, the touch screen display device includes a touch screen display having a Graphical User Interface (GUI), one or more processors, memory, and one or more modules, programs, or sets of instructions stored in the memory for performing various functions, which collectively enable detection of and derivation of associated control instructions based on Graphical User Interface (GUI) manipulation input. In various embodiments, these functions may include parameter adjustment, information entry, etc. of the subject (e.g., patient tissue) to obtain medical test data, image review, pathology database construction, retrieval and maintenance, patient profile information construction, display and management, patient catalog information construction, display and management, etc. The means, programs or instructions for executing may be included in a computer program product configured for execution by one or more processors. In some of the embodiments of the present invention, a user interacts with a graphical user interface primarily through gesture inputs on a touch display screen. Gesture input herein may include any type of user gesture input that may be detected by a device by direct contact or proximity to a touch display screen. 
For example, the gesture input may be an action of a user selecting one location, multiple locations, and/or multiple consecutive locations on the touch display screen using a finger of a right or left hand (e.g., index finger, thumb, etc.), or an input object detectable by touching the display screen (e.g., a stylus, a pen dedicated to touching the display screen), and may include an operation action like contact, release of touch, tap of touch, long contact, rotational spread, and the like. Here, the long contact corresponds to one gesture input of moving a finger, thumb, stylus pen, or the like in a predetermined direction or a variable direction while maintaining a continuous contact state with the touch display screen, for example, a gesture operation action such as a touch drag, flick, wipe, slide, sweep, or the like. As can be seen, the gesture input is realized by the contact of the input object with the touch display screen, the contact with the touch display screen may include the direct contact with the touch display screen by a finger, a thumb, a stylus pen or the like, or the indirect contact with the touch display screen, and the gesture input close to the touch display screen without direct contact refers to a gesture operation action on a spatial position close to the touch display screen. The graphical user interface refers to the overall design of human-computer interaction, operation logic and interface beauty of software, and can comprise one or more soft keyboards and a plurality of graphical control objects. The soft keyboard may include a number of icons (or soft keys). This may allow the user to select one or more icons in the soft keyboard and thus select one or more corresponding symbols for input. The gesture detection module 113 may detect gesture input interacting between an input object and the touch display screen. 
The gesture detection module 113 includes various program modules for performing various operations related to gesture input detection, such as determining whether contact has occurred, determining whether the gesture input is continuously input, determining whether a predetermined gesture corresponds to, determining an operation position corresponding to the gesture input, determining whether the operation position corresponding to the gesture input moves to an edge position of a corresponding display area, determining whether the gesture input has been interrupted (e.g., whether contact has stopped), determining movement of the gesture input and tracking a movement trajectory of the gesture input, and so on. Determining the motion of the gesture input may include determining a rate of motion (magnitude), a speed of motion (magnitude and direction), and/or an acceleration of motion (change in magnitude and/or direction), a trajectory of motion, etc., of the operation location to which the gesture input corresponds. These operations may be applied to a single operation location (e.g., gesture input by one finger), or to multiple simultaneous operation locations (e.g., "multi-touch," i.e., gesture input by multiple fingers). In some embodiments, the gesture detection module 113 is used to detect motion of one or more input objects on the surface of the touch display screen or at spatial locations proximate to the touch display screen. The gesture detection module 113 is stored in the memory, and is invoked by one or more processors to monitor the gesture input, and obtain an operation input instruction of the user.
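Determining the rate, direction, and trajectory of a contact's motion, as the gesture detection module does, reduces to differencing successive operation positions. A minimal sketch follows; the `(t, x, y)` sample format and the function name are assumptions for illustration.

```python
import math

def motion_metrics(track):
    """track: chronologically ordered (t, x, y) samples of one contact.
    Returns (path_length, average_speed, net_direction_deg)."""
    length = 0.0
    for (_, x0, y0), (_, x1, y1) in zip(track, track[1:]):
        length += math.hypot(x1 - x0, y1 - y0)  # accumulate trajectory length
    duration = track[-1][0] - track[0][0]
    speed = length / duration if duration > 0 else 0.0
    # Net direction from first to last sample, in degrees.
    direction = math.degrees(math.atan2(track[-1][2] - track[0][2],
                                        track[-1][1] - track[0][1]))
    return length, speed, direction
```

The same samples support the other checks the module performs, e.g. whether the contact has stopped (no new samples) or has reached the edge of a display area (position test against the area bounds).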
Of course, in the embodiment shown in fig. 1, the processor 140 and the first memory 160 may be disposed on the main board 106, may be disposed separately from the main board 106, or may be integrally mounted with the touch display screen to form a separate display controller 170, i.e., to implement the display of the ultrasound image, or to implement the control instruction for obtaining the user input based on the ultrasound image. In one embodiment, the signal processing module 116 and/or the image processing module 126 of fig. 1, together with the processor 140, are collectively disposed on one or more processors for performing data processing of ultrasound images, and the monitoring of the gesture inputs and the generation of the graphical user interface, collectively referred to as a first processor. As shown in fig. 3, a conventional ultrasonic medical testing apparatus is provided, which includes a display 1, a control key operation area 3, a display support arm 2, a main body 4 and a foot control 5. The display 1 may be the same as the first touch display 130 described above, and the host 4 includes the motherboard 106 described above, or further includes a processor 140 and a first memory 160. Therefore, the portion involving all data processing is collectively referred to as the first processor.
Fig. 2 provides a schematic structural diagram of another embodiment. As shown in fig. 2, the ultrasonic medical detection apparatus 200 includes: a probe 201, a transmission circuit 203, a transmission/reception selection switch 202, a reception circuit 204, a beam forming module 205, a signal processing module 216, and an image processing module 226. In this embodiment, the functions and implementations of the probe 201, the transmitting circuit 203, the transmitting/receiving selection switch 202, the receiving circuit 204, the beam forming module 205, the signal processing module 216 and the image processing module 226 are the same as those of the probe 101, the transmitting circuit 103, the transmitting/receiving selection switch 102, the receiving circuit 104, the beam forming module 105, the signal processing module 116 and the image processing module 126 in the embodiment shown in fig. 1; refer to the foregoing descriptions, which will not be repeated here. In some embodiments of the present invention, the signal processing module 216 and the image processing module 226 in fig. 2 may be integrated on one main board 206, or one or more of these modules may be integrated on a single processor/controller chip. The difference from the embodiment shown in fig. 1 is that the ultrasonic medical detection apparatus 200 further includes: a first communication module 215, electrically connected to the image processing module 226, which transmits the ultrasound image data obtained by the image processing module 226 to the intelligent controller 270 and/or receives a control signal input by the intelligent controller 270 to set the ultrasound imaging parameters used in the ultrasound imaging process. Setting the ultrasound imaging parameters includes updating, adjusting, or initializing them.
The intelligent controller 270 in the present embodiment includes: a second touch display 230, a second processor 240, a second memory 260, and a second communication module 214. The second memory 260 stores computer programs, such as the gesture detection module 213, running on the second processor 240, in which the gesture detection module 213 has the same function as the gesture detection module 113 in the embodiment shown in fig. 1, and the description thereof will not be repeated. The second touch screen 230 is implemented with the same function as the first touch screen 130, but the specific product parameters may not be the same, and the terms "first" and "second" are only used to distinguish entities in different application scenarios in the embodiments, and the following description about the method steps or the description of a single application scenario may be equally understood as a touch screen in the conventional sense, so that the description elsewhere herein may be simply referred to as a touch screen. The second communication module 214 receives ultrasound image data transmitted from the first communication module 215 and/or sends control signals, such as control signals containing ultrasound imaging parameter setting information, to the first communication module 215. The smart controller 270 includes the display controller mentioned in fig. 1, but may also include computer devices with touch screens such as various smart terminal devices, e.g., IPAD, cell phone, etc. For example, the intelligent controller 270 in this embodiment may also be the IPAD terminal controller 6 in fig. 3. The communication mode of the first communication module 215 and the second communication module 214 may adopt a wifi protocol, a bluetooth transmission protocol, a mobile communication network protocol, and the like. The ultrasonic medical testing device 200 and the intelligent controller 270 form an ultrasonic imaging system.
In the embodiment shown in fig. 3, an ultrasound imaging system or an ultrasound medical examination apparatus 100 is provided which integrates two display screens, the display 1 and the IPAD terminal controller 6. The IPAD terminal controller 6 can be used for generating a graphical user interface to obtain a user instruction about adjusting ultrasonic imaging parameters, or editing an ultrasonic image and the like; and may also be used to display ultrasound images. The IPAD terminal controller 6 also includes a second touch screen display 230. Of course, the same function can be realized by using an intelligent mobile terminal such as a smart phone for the IPAD terminal controller 6 equivalent to the intelligent controller 270.
Based on the structural schematic diagram of the ultrasonic medical detection apparatus (100, 200) provided in fig. 1, fig. 2 or fig. 3, the following will describe in detail the setting manner of the ultrasonic imaging parameters in conjunction with the hardware environment provided in fig. 1, fig. 2 or fig. 3.
When parameters are adjusted by sliding on a touch screen (that is, a touch display screen), the adjustment interfaces of the parameters are displayed on the touch screen and change as the finger or stylus slides. When a target object is clicked on the screen with a finger or stylus, the finger can cover the target object, so the target is not easy to hit and other objects are easily mis-selected. Moreover, when the target to be adjusted is small, even a small stroke of the finger or stylus can rotate the target by a large angle, making it difficult for the user to finely control the size of the adjusted angle. Therefore, the device disclosed in this embodiment is an ultrasound imaging system or ultrasonic medical detection device whose imaging parameters can be adjusted, and whose ultrasound images can be edited, through touch control. By interacting with a graphical user interface on a touch display screen, the user can adjust or edit the image data and ultrasound imaging parameters more intuitively, which makes the ultrasound device more convenient to operate, improves the user experience, and enables accurate positioning adjustment of an operation object on the graphical interface.
Therefore, the present embodiment provides the following control manner of the ultrasonic imaging parameters shown in fig. 4. Fig. 4 provides a schematic flow chart of the ultrasound imaging control method in the embodiment shown in fig. 1, fig. 2 or fig. 3.
In step S210 of fig. 4, the transmitting circuit and the receiving circuit (103 and 104, 203 and 204) excite the probe (201, 101) to transmit an ultrasonic beam to the inspection object according to the set ultrasonic imaging parameters, and in step S212, the probe (201, 101) receives an echo of the ultrasonic beam, obtaining an ultrasonic echo signal. In one embodiment, the ultrasound imaging parameters in this embodiment include position information of the sampling gate.
In step S214 of fig. 4, an ultrasound image is obtained according to the ultrasound echo signal by the first processor (including the image processing module 126 in fig. 1) or the image processing module (226), for example, in the embodiment of fig. 1, the ultrasound image is obtained according to the ultrasound echo signal by the image processing module (126), and in the embodiment of fig. 2, the ultrasound image is obtained according to the ultrasound echo signal by the image processing module (226). Also provided within the ultrasound medical detection apparatus of fig. 1 is a first memory for storing a computer program running on a processor, such as the gesture detection module 113 described above. While a second memory is provided in the intelligent controller 270 in fig. 2 for storing a computer program running on a processor, such as the gesture detection module 213 described above. The ultrasound image may be an ultrasound image of the different modes described above, such as a B image, a C image, a D image, etc., or other types of two-dimensional ultrasound images or three-dimensional ultrasound images. Similarly, the ultrasound images referred to herein may be still frame images or dynamic video images.
In step S216 of fig. 4, the first processor or the second processor 240 of fig. 2 outputs the obtained ultrasound image to the touch display screen for display. For example, an image display area for ultrasound image display is set on a graphical user interface layer formed on the touch display screen. In one embodiment, the graphical user interface comprises at least two interface layers: the ultrasound image is displayed on a first interface layer of the touch display screen, a transparently arranged second interface layer is superimposed on the first interface layer, and the icon is arranged on the second interface layer. This arrangement allows data other than the image data to float over the ultrasound image for display without blocking the ultrasound image, lets the user observe the change of the ultrasound image caused by adjustment of the ultrasound imaging parameters, and allows information such as edited annotations to be saved and transmitted together with the ultrasound image.
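The patent describes no concrete implementation of this layering, but the idea can be sketched schematically. The following Python fragment (the class and attribute names are hypothetical, not from the patent) models a first layer holding the ultrasound image and a transparent second layer whose items float above the image without replacing its pixels:

```python
class LayeredInterface:
    # Illustrative sketch of the two-layer graphical user interface:
    # the ultrasound image sits on the first interface layer, while icons
    # and extension lines are drawn on a transparent second layer.
    def __init__(self, image):
        self.image_layer = image   # first interface layer (ultrasound image)
        self.overlay = []          # second, transparently arranged layer

    def add_overlay_item(self, item):
        # Icons, extension lines, annotations, etc. go on the overlay.
        self.overlay.append(item)

    def render(self):
        # Overlay items are composited on top of the unchanged image.
        return {"image": self.image_layer, "overlay": list(self.overlay)}
```

Because the overlay is kept separate from the image layer, the image data itself is never modified, which is what allows annotations to be saved or transmitted together with, yet independently of, the ultrasound image.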
In step S218 of fig. 4, the first processor or the second processor 240 of fig. 2 displays an icon superimposed on the displayed ultrasound image. This icon includes one of a probe marker, a comment marker, a sample line, a sample gate, and the like, for example, as shown by arrow 510 on graphical user interface 500 in FIG. 5, and sample gate 610 on graphical user interface 600 in FIG. 6. In one embodiment, the icon may be displayed on the second interface layer.
The shape of the icon is not limited and in one embodiment the icon is a sampling gate, such as sampling gate 610 in fig. 6.
In step S220 of fig. 4, the first processor or the second processor 240 of fig. 2 displays an extension line extending in a predetermined direction from an end of the icon on the first touch display 130 or the second touch display 230.
An extension line is displayed superimposed over the ultrasound image displayed by the touch screen display (130, 230), for example, in one of the embodiments, the extension line may be displayed on the second interface layer.
When the extension line is displayed, a controllable sensitive area is correspondingly arranged at the same time; that is, sensitive areas corresponding to the extension lines are arranged on the touch display screens (130, 230), and the extension lines are associated one-to-one with the sensitive areas on the touch display screen. The extension line corresponds to a sensitive area on the touch display screen (130, 230) used for gesture recognition. The sensitive area refers to the position on the graphical user interface corresponding to the manipulated icon or indicator (such as an extension line, an icon, etc.). A position on the graphical user interface is usually located by establishing a rectangular interface coordinate system; for example, if (X, Y) represents the coordinates of a certain pixel point in that coordinate system, then a set of (X, Y) points constitutes a corresponding display area or sensitive area. The sensitive area corresponding to the extension line includes a preset neighborhood around the position where the extension line is displayed; when the contact of the input object with the touch display screen falls within this neighborhood, the extension line is selected by default and an operation of adjusting the extension line is activated. For example, in fig. 5, an extension line 511 extending in a predetermined direction from the end of the icon 510 corresponds to a sensitive area 512 on the interface. In fig. 6, an extension line 611 extending in a predetermined direction from the end of the icon 610 corresponds to a sensitive area 612 on the interface.
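The sensitive area described above, a preset pixel neighborhood around the displayed extension line, can be illustrated with a simple hit test. In this Python sketch the function names and the 20-pixel tolerance are illustrative assumptions; it checks whether a contact point falls within the neighborhood of the line segment:

```python
import math

def point_segment_distance(px, py, x1, y1, x2, y2):
    # Distance from contact point (px, py) to the segment (x1, y1)-(x2, y2).
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return math.hypot(px - x1, py - y1)
    # Project the point onto the segment and clamp to its endpoints.
    t = ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.hypot(px - (x1 + t * dx), py - (y1 + t * dy))

def in_sensitive_area(contact, line_start, line_end, tolerance=20):
    # The sensitive area is modeled as the set of pixels within `tolerance`
    # pixels of the displayed extension line; a contact inside it selects
    # the line and activates the adjustment operation.
    px, py = contact
    x1, y1 = line_start
    x2, y2 = line_end
    return point_segment_distance(px, py, x1, y1, x2, y2) <= tolerance
```

A contact near the line (for example 10 px away with a 20 px tolerance) activates the extension line, while a distant contact does not, which is the behavior the one-to-one association between line and sensitive area provides.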
Of course, besides setting the preset neighborhood around the displayed extension line as the sensitive area, the sensitive area may also be set at a position on the graphical user interface other than where the extension line is displayed, so that the user can conveniently manipulate the extension line from outside the ultrasound image without blocking its display, achieving more accurate adjustment. Contact of an input object with the touch display screen, monitored in a sensitive area away from the position where the extension line is displayed, may then be associated with manipulation of the extension line.
The "position" referred to herein includes orientation information, coordinate information, and/or angle information, etc., and for example, regarding the display position of the icon or the extension line on the graphical user interface, the coordinate information of the pixel point where the icon or the extension line is located may be used for characterization.
The extension line may be a dashed line, but is not limited to this; it may be any extended control object used when an object needs to be adjusted. In one embodiment, the extension line is a straight line.
The extension line starts from the end of the icon, and the end can be located at any point on the icon, for example, the end of the arrow in fig. 5, the middle point of the sampling gate (611) in fig. 6, and the like.
The extension line facilitates the user to accurately position the icon during position adjustment, and in order to prevent the ultrasound image from being obstructed as much as possible during the adjustment, in one embodiment, the extension line may extend from the end of the icon to the outside of the display area of the ultrasound image. For example, extension 511 in FIG. 5 may extend outside of display area 501 of the ultrasound image, and extension 611 in FIG. 6 may extend outside of display area 601 of the ultrasound image.
The extension line is obtained by extending from the end of the icon along a predetermined direction, where the predetermined direction may be any one direction with the end of the icon as a starting point, and may also be a specific preset direction, or the predetermined direction is determined based on the initial input of the user, for example, in one embodiment, the first processor or the second processor 240 in fig. 2 displays the extension line obtained by extending from the end of the icon along the predetermined direction on the first touch display 130 or the second touch display 230 in the following manner: firstly, detecting the contact of an input object (502, 602) and a touch display screen (130, 230) to obtain an initial operation position (513, 613); generating an extension line (511, 611) extending from an end of the icon (510, 610) to the initial operating position (513, 613); and, displaying the extension line (511, 611).
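The case where the predetermined direction is determined by the user's initial input can be sketched as a small geometric helper. In this Python fragment (the function name and optional `length` parameter are illustrative assumptions) the line starts at the icon's end and extends toward the initial operation position:

```python
import math

def extension_line_from_contact(icon_end, initial_contact, length=None):
    # Generate an extension line starting at the icon's end and extending
    # toward the initial operation position; an explicit `length` lets the
    # line extend further, e.g. beyond the image display area.
    ex, ey = icon_end
    cx, cy = initial_contact
    dx, dy = cx - ex, cy - ey
    d = math.hypot(dx, dy)
    if d == 0:
        return (ex, ey), (ex, ey)   # degenerate: contact on the icon end
    if length is None:
        length = d
    ux, uy = dx / d, dy / d         # unit vector of the predetermined direction
    return (ex, ey), (ex + ux * length, ey + uy * length)
```

With the icon end at (0, 0) and an initial contact at (3, 4), a requested length of 10 yields an end point at (6, 8), i.e. the same direction extended twice as far.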
Of course, to provide a more friendly interactive experience, a prompt may be simultaneously displayed on the graphical user interface indicating the direction of rotation or translation. For example, a prompt 503 in FIG. 5 to prompt a left rotation; the prompt 603 in fig. 6 is used to prompt a right rotation and the prompt 604 is used to prompt a left rotation.
This step may be followed by: the first processor or the second processor 240 in fig. 2 records the correspondence between sensitive areas and extension lines; the recorded correspondence may also be stored in a memory (160, 260). The correspondence between a sensitive area and an extension line may be recorded by recording the pixel area range of the sensitive area on the touch display screen, this pixel area range corresponding to the display position of the extension line, which facilitates subsequent quick lookup and gesture recognition based on user input. The corresponding sensitive area changes as the position of the extension line changes; for example, in fig. 6, the sensitive area corresponding to the extension line 611 changes as the input object 602 moves from the position 613 to the position 614. In one embodiment, the sensitive area of the extension line is set at the final position after the display of the extension line is determined, and whether an input operation is performed on the extension line is determined by judging whether the initial contact of the input object with the touch display screen falls in the sensitive area, thereby avoiding misoperation.
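The recorded correspondence, and its refresh when the line moves, can be sketched as a small registry. In this Python fragment (class name, `margin` value, and bounding-box representation are illustrative assumptions; the patent only requires that a pixel area range be recorded per line):

```python
class SensitiveAreaRegistry:
    # Records the one-to-one correspondence between extension lines and
    # their sensitive pixel-area ranges, and refreshes the recorded area
    # whenever a line's display position changes.
    def __init__(self, margin=20):
        self.margin = margin       # neighborhood width around the line
        self.areas = {}            # line id -> (xmin, ymin, xmax, ymax)

    def update(self, line_id, start, end):
        # Record a bounding box around the line, expanded by `margin` px.
        (x1, y1), (x2, y2) = start, end
        self.areas[line_id] = (min(x1, x2) - self.margin,
                               min(y1, y2) - self.margin,
                               max(x1, x2) + self.margin,
                               max(y1, y2) + self.margin)

    def hit(self, line_id, x, y):
        # True when an initial contact falls inside the recorded area.
        xmin, ymin, xmax, ymax = self.areas[line_id]
        return xmin <= x <= xmax and ymin <= y <= ymax
```

Calling `update` again after the extension line moves replaces the stored range, so a contact that used to hit the old position no longer activates the line, matching the behavior described for fig. 6.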
In step S224 of fig. 4, the first processor or the second processor 240 of fig. 2 invokes the gesture detection module (113, 213) to monitor the motion of the contact of the input object with the touch display screen (130, 230). In step S226 of fig. 4, the first processor or the second processor 240 in fig. 2 calls the gesture detection module (113, 213) to determine the operation position on the touch display screen corresponding to the motion of the contact. The operation position on the interface mentioned herein refers to the position on the display interface corresponding to the user's operation input on an interface object (e.g., an icon or an extension line) using the human-computer interaction device; this position can be represented by the coordinate position of a rectangular coordinate system or by the angle information of a polar coordinate system. The determined operation position may be a single pixel point position, or an area block formed by a plurality of pixel point positions.
In order to improve the user experience of the interaction operation and avoid misoperation, the following steps can be further included before the motion of the contact of the input object and the touch display screen (130, 230) is monitored:
First, as shown in fig. 6, the contact of the input object (602) with the touch display screen (130, 230) is detected, and an initial operation position (613) is obtained. Next, it is determined whether the initial operation position (613) is located in the sensitive area 612; when it is, that is, when the contact of the input object (602) with the touch display screen (130, 230) is detected within the sensitive area 612, this indicates that the user is operating the extension line. Then, the motion of the contact of the input object with the touch display screen (130, 230) is tracked and monitored, and the operation position on the touch display screen corresponding to the motion of the contact is determined; here the motion of the contact corresponds to the change of the operation position on the touch display screen from the initial operation position 613 to the operation position 614. The operation position determined in step S226 may be a single operation position (position 614 in fig. 6) or a plurality of operation positions changing continuously from the initial operation position 613 to the operation position 614.
Further, in step S224, the first processor or the second processor 240 in fig. 2 calls the gesture detection module (113, or 213) to monitor the motion of the contact of the input object with the touch display screen (130, or 230), which may be a continuous contact of the input object with the touch display screen (130, or 230), such as the aforementioned long contact. For example, in one embodiment, step S224 monitors the continuous contact (which can be understood as a continuously moving contact) of the input object with the touch display screen (130 or 230) in the sensitive area, and the first processor or the second processor 240 in fig. 2 can recognize, through the gesture detection module (113 or 213), a series of continuously changing operation positions on the touch display screen (130 or 230) as the contact moves. Thus, in step S226, the first processor or the second processor 240 in fig. 2 determines a plurality of operation positions of the contact on the touch display screen (130, or 230), and a plurality of continuously changing operation positions can be obtained. Of course, the plurality of continuously changing operation positions may be aligned along the direction of movement of the contact.
Next, in step S228 of fig. 4, the first processor or the second processor 240 of fig. 2 updates the display of the extension line and the icon so that the extension line passes through the operation position determined above. Updating, as referred to herein, means deleting the display of the extension line and the icon at the original position and changing their positions to the corresponding operation positions, so that the extension line and the icon change following the change of the monitored operation position. For example, in fig. 6, when the contact of the input object 602 with the touch display screen changes from the operation position 613 to the operation position 614, the display of the extension line and the icon changes accordingly. During the update, the relative positional relationship between the icon 610 and the extension line 611 is always maintained, and the extension line always passes through the operation position determined in step S226, so that the display of the icon 610 and the extension line 611 is always updated and adjusted together, achieving the effect that the icon changes along with the adjustment of the extension line. Reference herein to the extension line "passing through" a position means that the extension line passes through at least one pixel location within the termination position or the operation position.
For example, in one embodiment, the steps S224 to S228 include:
the first processor or the second processor 240 in fig. 2 calls the gesture detection module to monitor the motion of the contact of the input object with the touch display screen (130 or 230) within the sensitive area (which may also be a continuous contact), and determines a plurality of continuously changing operation positions on the touch display screen corresponding to the motion of the contact; the display of the extension line and the icon is then updated so that the display position of the extension line on the touch display screen changes through the plurality of continuously changing operation positions in sequence, so that the extension line and the icon change following the change of the monitored operation positions.
In addition, in some embodiments, before the process of step S224 to step S228, the following steps may be further included:
the first processor or the second processor 240 in fig. 2 invokes the gesture detection module to detect whether the contact between the input object and the touch display screen (130 or 230) is located in the sensitive area where the extension line is located, and when the contact between the input object and the touch display screen (130 or 230) is located in the sensitive area where the extension line is located, the processes of the steps S224 to S228 are performed. On the contrary, when the contact of the input object with the touch display screen (130, or 230) is not in the sensitive area where the extension line is located, the processes of the steps S224 to S228 are not performed. The process from step S224 to step S228 can be described with reference to the related descriptions. The embodiment can ensure the accuracy of control signal input and the positioning accuracy of icon adjustment by tracking and monitoring the input operation of the contact of the input object and the touch display screen (130 or 230) on the extension line. For example, as shown in fig. 6, first, the first processor or the second processor 240 in fig. 2 calls the gesture detection module to detect whether the contact (i.e., the contact at the operation position 613) of the input object 602 with the touch display screen (130 or 230) is located in the sensitive area 612 where the extension line is located, if so, the process from step S224 to step S228 is executed to start tracking and monitoring the movement of the contact of the input object 602 with the touch display screen (130 or 230) along the direction (the direction indicated by the indicator 603) or the direction (the direction indicated by the indicator 604). Otherwise, the process from step S224 to step S228 is not performed.
The processes of steps S224 to S228 above allow the extension line to change its display position with the continuous contact of the input object with the touch display screen. To improve the visual effect of the extension line following the continuous contact, in the process of updating the display of the extension line and the icon to the operation position, the change speed of the extension line between two operation positions on the graphical user interface may be calculated according to a visual display movement speed, and the display movement of the extension line and the icon between the two operation positions may be adjusted based on this change speed, thereby presenting a continuous display movement effect.
Further, in one embodiment, the process of updating the display of the extension line and the icon in the step S228 to enable the extension line to pass through the operation position may be one of the following manners:
1. as shown in fig. 6, rotating the icon 610 and the extension line 611 with a position on the icon 610 as a center causes the extension line 611 to pass through the operation position (613, 614), that is, redrawing the extension line 611 with a position on the icon 610 as a start point and passing through the operation position, and at the same time, drawing the icon in the original relative positional relationship of the icon 610 and the extension line 611. The extension line 611 is rotated centering on a position on the icon 610, and the direction of the rotation can be obtained according to the moving direction of recognizing the above-mentioned contact.
2. As shown in FIG. 7, translating the extension 711 and icon 710 causes the extension 711 to pass through the operational position (713, 714, 715). That is, the icon 710 is translated along with the extension line 711 and the extension line 711 is sequentially updated at the changed passing operation positions (713, 714, 715).
Of course, in the above process, the previous display of the extension line needs to be cleared, and the extension line displayed is guaranteed to be updated.
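The two update modes, rotation about a point on the icon and joint translation, reduce to elementary plane geometry. The following Python sketch (function names are illustrative; the patent specifies only the behavior) computes the new end point of the extension line for each mode:

```python
import math

def rotate_about(pivot, point, theta):
    # Rotate `point` about `pivot` by angle theta (radians).
    px, py = pivot
    x, y = point[0] - px, point[1] - py
    c, s = math.cos(theta), math.sin(theta)
    return (px + x * c - y * s, py + x * s + y * c)

def rotate_to_pass_through(pivot, line_end, target):
    # Mode 1: rotate the extension line about a fixed position on the icon
    # so that the redrawn line passes through the new operation position.
    old = math.atan2(line_end[1] - pivot[1], line_end[0] - pivot[0])
    new = math.atan2(target[1] - pivot[1], target[0] - pivot[0])
    return rotate_about(pivot, line_end, new - old)

def translate_to_pass_through(icon_pos, line_start, line_end, old_pos, new_pos):
    # Mode 2: translate icon and extension line together by the contact's
    # displacement, preserving their relative positional relationship.
    dx, dy = new_pos[0] - old_pos[0], new_pos[1] - old_pos[1]
    shift = lambda p: (p[0] + dx, p[1] + dy)
    return shift(icon_pos), shift(line_start), shift(line_end)
```

In mode 1 the pivot stays fixed and only the line's angle changes; in mode 2 every point of icon and line shifts by the same vector, so the icon-line relationship is trivially preserved in both cases.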
In order to ensure that the above-mentioned 1 st and 2 nd operation functions can be simultaneously realized on the same graphical user interface without additional control key operations, the process can be performed by using the flowchart shown in fig. 8. In fig. 8, steps S810, S812, S814, S816, S818, S820, S824, and S826 are respectively the same as the above-described steps S210, S212, S214, S216, S218, S220, S224, and S226, and will not be described in detail here. In the embodiment shown in fig. 8, the process of updating the extension line and the display of the icon in the above step S228 so that the extension line passes through the above operation position is implemented by the following steps, respectively.
In step S827, the first processor or the second processor 240 in fig. 2 recognizes the moving direction of the contact of the input object with the touch display screen, and then performs step S828 to determine the angle condition between the moving direction and the extension line, thereby updating the display of the extension line and the icon such that the extension line passes through the operation position. For example, as shown in fig. 8 and 6, step S828 is executed to determine that the angle between the movement direction (the direction indicated by the prompt 604) and the extension line 611 satisfies the first preset condition, and step S8291 is then executed to rotate the icon 610 and the extension line 611 around a position on the icon 610 so that the extension line 611 passes through the operation position 614; that is, the extension line 611 that previously passed through the operation position 613 is cleared, and the extension line is redrawn around the same position on the icon 610.
For another example, as shown in fig. 8 and 7, an ultrasound image 701 is displayed on the graphical user interface 700 shown in fig. 7, and an extension line 711 of an icon 710 is controlled by an input operation of the input object 702. The contact of the input object 702 with the touch display screen is monitored and an initial operation position 713 is obtained. When the contact is in the sensitive area 712, that is, when the initial operation position 713 falls in the sensitive area 712 (or coincides with it), this indicates that the user is operating the extension line. The motion of the contact of the input object 702 with the touch display screen (which may be a continuous contact) is then tracked and monitored, and the operation position on the touch display screen corresponding to the motion of the contact is determined (714, or a plurality of changing operation positions between 713 and 714). The motion direction of the contact is identified, and when the included angle between the motion direction (the direction indicated by the prompt 703) and the extension line 711 satisfies the second preset condition, step S8292 is executed to translate the extension line 711 and the icon 710 so that the extension line 711 passes through the operation position (714, or a plurality of changing operation positions between 713 and 714); the original display of the extension line (i.e., the extension line 711 indicated by the dashed line at position 713) needs to be cleared at the same time. The changes of the extension line 711, the input object 702 and the icon 710 corresponding to the contact at different times are characterized in fig. 7 by the different line types of the extension line 711, the different outline line types of the input object 702, and the different outlines of the icon 710.
In one embodiment, the second predetermined condition is that the included angle is equal to zero, and the first predetermined condition is that the included angle is greater than zero. Of course, the embodiment is not limited thereto, and the second preset condition may be that the included angle is located in a smaller angle range, and the first preset condition is that the included angle is located in a larger angle range. When the included angle between the moving direction of the contact and the extension line is determined to be larger than zero, it can also be understood that the movement of the contact of the input object and the touch display screen is determined to deviate from the extending direction of the extension line. When the included angle between the moving direction of the contact and the extension line is determined to be equal to zero, it can also be understood that the movement of determining the contact between the input object and the touch display screen is performed along the extending direction of the extension line.
For recognizing the movement direction of the contact, the gesture detection module may calculate a movement speed and a corresponding direction when tracking and monitoring the movement of the contact on the touch screen, for example, determine the movement direction of the contact by using a connection line between two operation positions on the touch screen corresponding to the movement of the contact.
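The direction-from-two-positions rule and the angle test of step S828 can be sketched together. In this Python fragment the function names and the 10-degree threshold are illustrative assumptions; the patent only requires that a small (second preset condition) versus large (first preset condition) included angle select translation versus rotation:

```python
import math

def movement_direction(p_prev, p_curr):
    # Direction of the contact's motion, taken from the line joining two
    # successive operation positions on the touch display screen.
    return math.atan2(p_curr[1] - p_prev[1], p_curr[0] - p_prev[0])

def angle_between(dir_angle, line_angle):
    # Smallest included angle between the motion direction and the current
    # extension direction of the line, in [0, pi/2] (lines are undirected).
    d = abs(dir_angle - line_angle) % math.pi
    return min(d, math.pi - d)

def choose_update(p_prev, p_curr, line_angle, tol=math.radians(10)):
    # An included angle near zero (second preset condition) means motion
    # along the line: translate. A larger angle (first preset condition)
    # means motion deviating from the line: rotate.
    a = angle_between(movement_direction(p_prev, p_curr), line_angle)
    return "translate" if a <= tol else "rotate"
```

For a horizontal extension line, a horizontal drag selects translation and a vertical drag selects rotation, which is exactly the dispatch between steps S8292 and S8291.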
In order to move the icon to any position by operating the extension line, in the process of performing the above step S826 (or step S226 in fig. 4) to step S8292, monitoring the motion of the contact of the input object with the touch display screen, determining the operation position on the touch display screen corresponding to the motion of the contact, and translating the extension line and the icon through the operation position may be implemented in the following manner, as shown in fig. 7.
1. Monitoring a first moving portion of said contact, said first moving portion proceeding along an extension direction in which said extension line is currently located.
For example, in fig. 7, the first moving portion is the process of moving from the operation position 713 to the operation position 714, i.e., 713 → 714. The movement of the first moving portion (713 → 714) proceeds in the direction indicated by the prompt 703. Of course, referring to the foregoing embodiments, in order to avoid misoperation, this embodiment further includes: detecting the contact of the input object (702) with the touch display screen (130, 230) to obtain an initial operation position (713); next, determining whether the initial operation position (713) is located in the sensitive area 712, where the contact of the input object (702) with the touch display screen (130, 230) being detected within the sensitive area 712 indicates that the user is performing an input operation on the extension line; then, tracking and monitoring the first motion portion of the contact of the input object with the touch display screen (130, 230).
2. And determining a first operation position of the first motion part of the contact on the touch display screen. In this embodiment, the first operation position may be any operation position located in the extending direction in which the extension line is currently located. For example 714 in fig. 7, or a plurality of varying operating positions between 713 and 714.
3. Translating the icon 710 with the extension 711 causes the extension 711 to pass through a first operating position, such as in fig. 7 translating the icon 710 with the extension 711 causes the extension 711 to pass through an operating position 714, or a plurality of varying operating positions between 713 and 714. If the extension 711 is translated so that the extension 711 passes through a plurality of changed operation positions between 713 and 714 when translated, a changed display effect of the icon 710 together with the extension 711 following the movement of the input object 702 can be achieved.
4. A second moving part of the contact is monitored, which deviates from the extension direction in which the extension line 711 is currently located. In this embodiment, the second moving portion may be a process of moving from the operation position 714 to the operation position 715 in fig. 7, 714 → 715. And the extension line 711 may be currently extending in the direction indicated by the cue 704.
5. The second motion part of the contact is determined to be associated with a second operation position on the touch display screen, but of course, in this embodiment, the second operation position may be any operation position on the touch display screen, for example, 715 in fig. 7, or a plurality of changed operation positions between 714 and 715.
6. Translating the icon 710 with the extension 711 causes the extension 711 to pass through the second operating position, e.g., translating the icon 710 with the extension 711 causes the extension 711 to pass through operating position 715, or a plurality of varying operating positions between 714 and 715, as in fig. 7. If the extension 711 is translated such that the extension 711 passes through a plurality of changed operation positions 714 to 715 during the translation, a changed display effect of the icon 710 together with the extension 711 following the movement of the input object 702 can be realized.
In the embodiment shown in fig. 7, the input object is in contact with the touch screen at all times without any break in the movement from the operation position 713 to the operation position 715.
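Since both the first moving portion (along the line) and the second moving portion (deviating from it) translate the icon and extension line together, the whole unbroken drag reduces to accumulating contact displacements. A minimal Python sketch (names are illustrative) of steps 1 through 6:

```python
def follow_contact(icon, line_end, positions):
    # The icon and the extension line translate together through each
    # successive operation position of one unbroken contact, whether the
    # motion follows the line's current direction (first moving portion)
    # or deviates from it (second moving portion).
    for prev, curr in zip(positions, positions[1:]):
        dx, dy = curr[0] - prev[0], curr[1] - prev[1]
        icon = (icon[0] + dx, icon[1] + dy)
        line_end = (line_end[0] + dx, line_end[1] + dy)
    return icon, line_end
```

Applying each intermediate displacement in turn, rather than only the net displacement, is what produces the changing display effect of the icon and line following the input object's movement.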
In step S230 of fig. 4, the first processor or the second processor 240 of fig. 2 invokes the gesture detection module to recognize an end position generated when the input object is released from contact with the touch display screen (130, or 230).
In one embodiment, the first processor or the second processor 240 of fig. 2 may identify the termination position generated by the release of the contact at least in the following manner.
The disengagement of the contact between the input object and the touch display screen (130, or 230) is monitored, and the operation position at which the contact was located before the disengagement is taken as the termination position. For example, as shown in fig. 6, the input object 602 makes contact with the touch display screen (130, or 230), and the processor detects the movement of the contact from the operation position 613 to the operation position 614, thereby also continuously updating the display of the extension line so that it continuously rotates past the operation position 614. At this point, the input object is lifted at the operation position 614; the processor monitors the disengagement of the contact between the input object 602 and the touch display screen (130, or 230), and takes the operation position 614 at which the contact was located before the disengagement as the termination position. For another example, in fig. 7, the input object 702 makes contact with the touch display screen (130, or 230), and the processor detects that the contact moves continuously from the operation position 713 to the operation position 715, thereby also continuously updating the display of the extension line so that it continuously translates past the operation position 715. At this time, the input object is lifted at the operation position 715; the processor monitors the disengagement of the contact between the input object 702 and the touch display screen (130, or 230), and takes the operation position 715 at which the contact was located before the disengagement as the termination position. Of course, monitoring that the contact of the input object with the touch display screen (130, or 230) moves beyond a predetermined range, which may be a certain interface area range containing the ultrasound image, may also be regarded as releasing the contact.
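Both release conditions, lift-off and leaving a predetermined range, can be folded into one tracking loop. In this Python sketch (names are illustrative; `None` models lift-off of the input object, and `bounds` is the assumed predetermined range):

```python
def track_contact(events, bounds):
    # Follow (x, y) contact events and return the termination position:
    # the last operation position held before the contact was released
    # or moved outside the predetermined range `bounds`.
    last = None
    xmin, ymin, xmax, ymax = bounds
    for ev in events:
        if ev is None:              # lift-off of the input object
            return last
        x, y = ev
        if not (xmin <= x <= xmax and ymin <= y <= ymax):
            return last             # leaving the range also counts as release
        last = ev
    return last
```

In either branch the position reported is the one the contact occupied just before release, which is the termination position used in step S232 to fix the final display of the extension line and icon.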
In step S232 of FIG. 4, the first processor or the second processor 240 of FIG. 2 determines the display of the extension line and the icon such that the extension line passes through the termination position described above. Here, an extension line "passing through" a position means that the extension line covers at least one pixel within the termination position or the operating position. Throughout this process, the relative positional relationship between the icon and the extension line remains unchanged. This relative positional relationship includes the angle between the icon and the extension line, the position of the icon on the extension line, and other coordinate relationships between the two. In this embodiment the icon and the extension line are linked: when the extension line rotates, the icon rotates with it, and when the extension line translates, the icon translates with it. The processor determines the display position of the extension line and the icon on the graphical user interface; that is, while updating the icon and the extension line, it fixes them at a display position on the graphical user interface that causes the extension line to pass through the termination position.
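The two update modes described above reduce to simple plane geometry. The sketch below is an assumption-laden illustration (the function names and the pivot-plus-angle line representation are not from the patent): rotation re-aims a line anchored at a pivot so it passes through a target position, and translation shifts the line perpendicular to itself by exactly the offset needed to reach the target; because the icon is moved by the same rotation or translation, the icon-to-line relationship is preserved.

```python
import math

def rotate_line(pivot, target):
    """New angle (radians) of a line anchored at pivot so it passes through target.
    Rotating the icon about the same pivot by the same angular change keeps the
    relative positional relationship unchanged."""
    return math.atan2(target[1] - pivot[1], target[0] - pivot[0])

def translate_line(pivot, angle, target):
    """Translation vector that moves the line (pivot, angle) so it passes through
    target, shifting perpendicular to the line; applying the same vector to the
    icon keeps the icon-to-line relationship unchanged."""
    dx, dy = math.cos(angle), math.sin(angle)        # unit direction of the line
    vx, vy = target[0] - pivot[0], target[1] - pivot[1]
    t = vx * dx + vy * dy                            # component along the line
    # What remains after removing the along-line component is the perpendicular
    # offset the whole line (and icon) must move by.
    return (vx - t * dx, vy - t * dy)
```

For a horizontal line anchored at the origin, reaching the point (5, 3) requires no rotation of the icon's position along the line, only a vertical shift of 3.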
Steps S230 and S232 in FIG. 4 are executed in the same manner as steps S830 and S832 in FIG. 8, and are not described again here.
In addition, when the rotation or translation of the extension line and the icon is to be driven by input on the graphical user interface, the processor can, while the input object remains in contact with the touch screen, distinguish between determining the second operating position of the foregoing embodiment (translation) and determining the operating position through which the extension line passes during rotation. It does so by evaluating the included angle between the direction of motion of the contact and the direction in which the extension line currently extends. For example, referring to the description of FIG. 8 above, the second predetermined condition (translation) may be set so that this included angle falls within a smaller angular range greater than zero, and the first predetermined condition (rotation) so that the included angle falls within a larger angular range that does not overlap the first. In this way, rotation and translation of the extension line and the icon can both be performed within a single uninterrupted contact. Alternatively, the rotation and translation operations may be performed across separate contacts by executing steps S230 and S232, or steps S830 and S832, described above.
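The angle-based discrimination described above can be sketched as follows. The folding of the angle into [0, 90] degrees and the 15-degree boundary are assumptions for illustration; the patent only requires two distinct angular ranges, not these specific values.

```python
import math

def included_angle_deg(motion, line_dir):
    """Acute angle (degrees) between the contact's motion vector and the
    extension line's current direction; the line is unoriented, so the
    result is folded into [0, 90]."""
    cross = abs(motion[0] * line_dir[1] - motion[1] * line_dir[0])
    dot = abs(motion[0] * line_dir[0] + motion[1] * line_dir[1])
    return math.degrees(math.atan2(cross, dot))

def classify_gesture(motion, line_dir, threshold_deg=15.0):
    """Small included angles translate the line and icon, larger ones rotate
    them. The 15-degree boundary is illustrative, not from the patent."""
    angle = included_angle_deg(motion, line_dir)
    return "translate" if angle < threshold_deg else "rotate"
```

Motion nearly parallel to the line thus translates it, while motion across the line rotates it, all within one continuous contact.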
In one embodiment, after step S232 or step S832 is performed, the method may further include the following step: the first processor or the second processor 240 in FIG. 2 cancels the display of the extension line on the ultrasound image. Once the icon has been adjusted via the extension line, the extension line is no longer displayed, ensuring an unobstructed view of the ultrasound image interface.
In one embodiment, after step S232 or step S832 is performed, the method may further include the following step: the first processor or the second processor 240 in FIG. 2 updates the configured ultrasound imaging parameters according to the updated position of the icon. For example, in the embodiment shown in FIG. 5, the processor may set the ultrasound imaging parameters according to the updated position of the sampling gate. As another example, in FIG. 7 the processor may update the remark information of the ultrasound image according to the repositioned icon (e.g., a probe identifier or annotation text), where the remark information includes the probe position, sampling gate position, annotation text, sampling line angle, sampling gate angle, and the like, on or for the ultrasound image.
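As a concrete illustration of deriving imaging parameters from an icon's updated position, the sketch below maps a sampling gate's screen coordinates to depth and lateral offset. The linear screen-to-tissue mapping and the parameter names are assumptions; a real scan-converted image would require the inverse of its actual geometry transform.

```python
def gate_to_params(gate_xy, image_origin_xy, px_per_mm):
    """Convert the sampling gate's screen position into imaging parameters.
    Hypothetical linear mapping: y below the image origin is depth, x is
    lateral offset; parameter names are invented for this sketch."""
    depth_mm = (gate_xy[1] - image_origin_xy[1]) / px_per_mm
    lateral_mm = (gate_xy[0] - image_origin_xy[0]) / px_per_mm
    return {"gate_depth_mm": depth_mm, "gate_lateral_mm": lateral_mm}
```

A gate dragged to screen position (150, 300) on an image whose origin is at (100, 100), at 10 pixels per millimetre, would correspond to a gate 20 mm deep and 5 mm lateral.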
The updated ultrasound imaging parameters are then transmitted to the probe, and a new ultrasound image is obtained according to them; or the ultrasound image and the remark information are stored and transmitted to a remote device.
Further, for example, in the embodiment shown in FIG. 2, the second processor in the intelligent controller 270 generates, based on the icon, a control signal containing the updated ultrasound imaging parameters, and/or generates image data containing the ultrasound image and the remark information derived from the icon. The control signal is output to the first communication module 215 through the second communication module 214, so that the first processor controls the ultrasonic scanning of the target tissue by the probe and updates the corresponding ultrasound imaging parameters; or the image data containing the ultrasound image and the remark information is output to the first communication module 215 through the second communication module 214 to be displayed by the first processor or output to the host computer for storage.
FIGS. 4 and 8 each present only one possible order of execution of the steps. Variations may be obtained by reordering the steps on the basis of the foregoing description: the steps need not be executed strictly in the order shown, steps may be interchanged or reordered provided the underlying logic is satisfied, and one or more steps may be executed repeatedly before the final step or steps are performed. All such variations fall within the embodiments provided herein.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software together with a necessary general-purpose hardware platform, or by hardware alone; in many cases the former is the preferable implementation. Based on this understanding, the technical solutions of the present invention may be embodied as a software product carried on a non-volatile computer-readable storage medium (such as a ROM, magnetic disk, optical disc, hard disk, or server cloud space) and comprising several instructions that enable a terminal device (which may be a mobile phone, a computer, a server, or a network device) to execute the system structures and methods of the embodiments of the present invention. For example, a computer program stored on a computer-readable storage medium, when executed by a processor, may be used at least to implement the foregoing embodiments based on the flow of steps S214 to S232 in FIG. 4 or steps S814 to S832 in FIG. 8.
In the embodiments above, the extension line allows the user to operate on a point that the icon does not occlude, and it increases the distance from the finger's contact point on the screen to the icon, so that positional attributes of the icon, such as its angle, can be adjusted more finely.
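The fineness gain from the longer lever arm can be quantified with a small-angle approximation: one pixel of tangential finger motion at distance r from the rotation center changes the line's angle by roughly 1/r radians, so a contact point farther out along the extension line gives proportionally finer angular control. A sketch of this relationship (the function name is illustrative):

```python
import math

def angle_per_pixel_deg(radius_px):
    """Approximate change in line angle (degrees) caused by one pixel of
    tangential finger motion at distance radius_px from the rotation center
    (small-angle approximation: delta_theta ~= 1 / r radians)."""
    return math.degrees(1.0 / radius_px)
```

Touching 200 pixels from the center thus yields ten times finer angular steps than touching 20 pixels from it.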
The above examples present only some embodiments; their description is specific and detailed, but it should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, all of which fall within the scope of the present invention. The protection scope of this patent is therefore defined by the appended claims.

Claims (37)

1. An ultrasonic medical examination apparatus, characterized in that the apparatus comprises:
a probe;
the transmitting circuit and the receiving circuit are used for exciting the probe to transmit ultrasonic beams to a detection object and receiving echoes of the ultrasonic beams to obtain ultrasonic echo signals;
the image processing module is used for obtaining an ultrasonic image according to the ultrasonic echo signal;
a touch display screen;
a first memory storing a computer program running on a processor; and,
a first processor that implements the following steps when executing the program:
displaying the ultrasound image on the touch display screen,
an icon is superimposed and displayed on the ultrasound image,
displaying an extension line extending in a predetermined direction from an end of the icon,
monitoring movement of an input object in contact with the touch display screen,
determining that the movement of the contact corresponds to an operational position on the touch display screen,
updating the display of the extension line and the icon so that the extension line passes through the operation position,
identifying a termination position at which the contact is released, and,
determining the display of the extension line and the icon such that the extension line passes through the termination position, wherein the relative positional relationship between the icon and the extension line remains unchanged throughout.
2. The ultrasonic medical examination device of claim 1, wherein the first processor when executing the program effects the display of an extension line extending in a predetermined direction from an end of the icon in the following manner:
detecting the contact of an input object and the touch display screen to obtain an initial contact position;
generating an extension line extending from an end of the icon to the initial contact position; and,
displaying the extension line.
3. The ultrasonic medical detection device of claim 1, wherein the icon comprises one of a probe marker, an annotation marker, a sampling line, and a sampling gate.
4. The ultrasonic medical examination device of claim 1, wherein the first processor when executing the program performs the updating the display of the extension line and icon to cause the extension line to pass through the operative position by at least one of:
rotating the icon and the extension line with a position on the icon as a center so that the extension line passes through the operation position, and,
translating the extension line and icon to pass the extension line through the operational position.
5. The ultrasonic medical examination device of claim 4, wherein the first processor, when executing the program, further comprises, before the rotating the icon and extension line about a position on the icon to pass the extension line through the operational position:
identifying a direction of movement of the contact, and
determining that the motion of the contact deviates from the direction in which the extension line currently extends, or determining that an included angle between the direction of motion of the contact and the extension line satisfies a first predetermined condition, wherein the first predetermined condition is that the included angle is within a first angle range.
6. The ultrasonic medical examination device of claim 4, wherein the first processor when executing the program further comprises, before the translating the extension line and icon causes the extension line to pass through the operative position:
identifying a direction of movement of the contact, and
determining that the contact moves along the direction in which the extension line extends, or determining that an included angle between the direction of motion of the contact and the extension line satisfies a second predetermined condition, wherein the second predetermined condition is that the included angle is within a second angle range different from the first angle range.
7. The ultrasonic medical examination device of claim 4, wherein the first processor, when executing the program, effects the monitoring of movement of an input object in contact with the touch display screen, the determining that the movement of the contact corresponds to an operational position on the touch display screen, and the translating of the extension line and the icon such that the extension line passes through the operational position, in the following manner:
monitoring a first motion portion of the contact, the first motion portion proceeding along an extension direction in which the extension line is currently located,
determining that a first motion portion of the contact is associated with a first operational position on the touch display screen,
translating the icon along with the extension line such that the extension line passes through a first operational position,
monitoring a second moving part of the contact, the second moving part deviating from an extension direction in which the extension line is currently located,
determining that a second motion portion of the contact is associated with a second operational position on the touch display screen,
and translating the icon together with the extension line to enable the extension line to pass through the second operation position, wherein the second operation position is any one operation position on the touch display screen.
8. The ultrasonic medical testing device of claim 1, wherein the extension line extends beyond a display area of the ultrasonic image.
9. The ultrasonic medical detection device of claim 1, wherein the first processor when executing the program further comprises, after the monitoring the motion of the input object in contact with the touch display screen:
displaying a cue indicating a direction of rotation or translation.
10. The ultrasonic medical detection device of claim 1, wherein the first processor when executing the program further performs the steps of:
and displaying the ultrasonic image on a first interface layer of the touch display screen, superposing a second interface layer which is arranged in a transparent manner above the first interface layer, and arranging the icon and the extension line on the second interface layer.
11. An ultrasound imaging control method, characterized in that the method comprises:
exciting a probe to emit an ultrasonic beam to a detection object;
receiving the echo of the ultrasonic beam to obtain an ultrasonic echo signal;
obtaining an ultrasonic image according to the ultrasonic echo signal;
displaying the ultrasound image on a touch display screen,
an icon is superimposed and displayed on the ultrasound image,
displaying an extension line extending in a predetermined direction from an end of the icon,
monitoring movement of an input object in contact with the touch display screen,
determining that the movement of the contact corresponds to an operational position on the touch display screen,
updating the display of the extension line and the icon so that the extension line passes through the operation position,
identifying a termination position at which the contact is released, and,
determining the display of the extension line and the icon such that the extension line passes through the termination position, wherein the relative positional relationship between the icon and the extension line remains unchanged throughout.
12. The ultrasound imaging control method according to claim 11, wherein the displaying an extension line extending in a predetermined direction from an end of the icon includes:
detecting the contact of an input object and the touch display screen to obtain an initial contact position;
generating an extension line extending from an end of the icon to the initial contact position; and,
displaying the extension line.
13. The ultrasound imaging control method of claim 11, wherein the icon comprises one of a probe marker, an annotation marker, a sampling line, and a sampling gate.
14. The ultrasound imaging control method according to claim 11, wherein the updating of the display of the extension line and the icon to cause the extension line to pass through the operation position includes one of:
rotating the icon and the extension line with a position on the icon as a center so that the extension line passes through the operation position, and,
translating the extension line and icon to pass the extension line through the operational position.
15. The ultrasound imaging control method according to claim 14, further comprising, before the rotating the icon and the extension line with one position on the icon as a center so that the extension line passes through the operation position:
identifying a direction of movement of the contact, and
determining that the motion of the contact deviates from the direction in which the extension line currently extends, or determining that an included angle between the direction of motion of the contact and the extension line satisfies a first predetermined condition, wherein the first predetermined condition is that the included angle is within a first angle range.
16. The ultrasound imaging control method according to claim 14, further comprising, before the translating the extension line and icon causing the extension line to pass the operational position:
identifying a direction of movement of the contact, and
determining that the contact moves along the direction in which the extension line extends, or determining that an included angle between the direction of motion of the contact and the extension line satisfies a second predetermined condition, wherein the second predetermined condition is that the included angle is within a second angle range different from the first angle range.
17. The ultrasound imaging control method according to claim 14, wherein the monitoring movement of the input object in contact with the touch display screen, determining that the movement of the contact corresponds to an operational position on the touch display screen, and translating the extension line and the icon such that the extension line passes through the operational position comprises:
monitoring a first motion portion of the contact, the first motion portion proceeding along an extension direction in which the extension line is currently located,
determining that a first motion portion of the contact is associated with a first operational position on the touch display screen,
translating the icon along with the extension line such that the extension line passes through a first operational position,
monitoring a second moving part of the contact, the second moving part deviating from an extension direction in which the extension line is currently located,
determining that a second motion portion of the contact is associated with a second operational position on the touch display screen,
and translating the icon together with the extension line to enable the extension line to pass through the second operation position, wherein the second operation position is any one operation position on the touch display screen.
18. The ultrasound imaging control method according to claim 11, wherein the extension line extends outside a display area of the ultrasound image.
19. The ultrasonic imaging control method according to claim 11, wherein the ultrasonic image is displayed on a first interface layer of the touch display screen, a second interface layer provided in a transparent manner is superimposed on the first interface layer, and the icon and the extension line are provided on the second interface layer.
20. An ultrasound imaging system, characterized in that the system comprises: an ultrasonic medical detection apparatus and an intelligent controller; wherein,
the ultrasonic medical detection apparatus includes:
a probe;
the transmitting circuit and the receiving circuit are used for exciting the probe to transmit ultrasonic beams to a detection object, receiving echoes of the ultrasonic beams and obtaining ultrasonic echo signals;
the image processing module is used for obtaining an ultrasonic image according to the ultrasonic echo signal; and,
the first communication module is electrically connected with the image processing module and is used for transmitting the ultrasonic image data to the intelligent controller and/or receiving a control signal input by the intelligent controller so as to set ultrasonic imaging parameters required for obtaining the ultrasonic image;
the intelligent controller includes:
a touch display screen;
the second communication module is used for receiving the ultrasonic image data transmitted by the first communication module and/or sending a control signal to the first communication module;
a second memory storing a computer program running on a processor; and,
a second processor that implements the following steps when executing the program:
displaying the ultrasound image on the touch display screen,
an icon is superimposed and displayed on the ultrasound image,
displaying an extension line extending in a predetermined direction from an end of the icon,
monitoring movement of an input object in contact with the touch display screen,
determining that the movement of the contact corresponds to an operational position on the touch display screen,
updating the display of the extension line and the icon so that the extension line passes through the operation position,
identifying a termination position at which the contact is released, and,
determining the display of the extension line and the icon such that the extension line passes through the termination position, wherein the relative positional relationship between the icon and the extension line remains unchanged throughout, and,
generating a control signal containing ultrasonic imaging parameters according to the icon, and/or generating image data containing the ultrasonic image and remark information;
and outputting the control signal to the first communication module through the second communication module, or outputting the image data.
21. The ultrasound imaging system of claim 20, wherein the second processor, when executing the program, effects the displaying an extension line extending in a predetermined direction from an end of the icon in the following manner:
detecting the contact of an input object and the touch display screen to obtain an initial contact position;
generating an extension line extending from an end of the icon to the initial contact position; and,
displaying the extension line.
22. The ultrasound imaging system of claim 20, wherein the icon comprises one of a probe marker, an annotation marker, a sampling line, and a sampling gate.
23. The ultrasound imaging system of claim 20, wherein the second processor, when executing the program, performs the updating the display of the extension line and icon to cause the extension line to pass through the operative position using at least one of:
rotating the icon and the extension line with a position on the icon as a center so that the extension line passes through the operation position, and,
translating the extension line and icon to pass the extension line through the operational position.
24. The ultrasound imaging system of claim 20, wherein the second processor, when executing the program, further comprises, before rotating the icon and extension line about a position on the icon to pass the extension line through the operational position:
identifying a direction of movement of the contact, and
determining that the motion of the contact deviates from the direction in which the extension line currently extends, or determining that an included angle between the direction of motion of the contact and the extension line satisfies a first predetermined condition, wherein the first predetermined condition is that the included angle is within a first angle range.
25. The ultrasound imaging system of claim 23, wherein the second processor, when executing the program, further comprises, prior to the translating the extension line and icon causing the extension line to pass through the operative position:
identifying a direction of movement of the contact, and
determining that the contact moves along the direction in which the extension line extends, or determining that an included angle between the direction of motion of the contact and the extension line satisfies a second predetermined condition, wherein the second predetermined condition is that the included angle is within a second angle range different from the first angle range.
26. The ultrasound imaging system of claim 23, wherein the second processor, when executing the program, effects the monitoring of movement of an input object in contact with the touch display screen, the determining that the movement of the contact corresponds to an operational position on the touch display screen, and the translating of the extension line and the icon such that the extension line passes through the operational position, in the following manner:
monitoring a first motion portion of the contact, the first motion portion proceeding along an extension direction in which the extension line is currently located,
determining that a first motion portion of the contact is associated with a first operational position on the touch display screen,
translating the icon along with the extension line such that the extension line passes through a first operational position,
monitoring a second moving part of the contact, the second moving part deviating from an extension direction in which the extension line is currently located,
determining that a second motion portion of the contact is associated with a second operational position on the touch display screen,
and translating the icon together with the extension line to enable the extension line to pass through the second operation position, wherein the second operation position is any one operation position on the touch display screen.
27. The ultrasound imaging system of claim 20, wherein the extension line extends outside a display area of the ultrasound image.
28. The ultrasound imaging system of claim 20, wherein the second processor, when executing the program, further performs the steps of:
and displaying the ultrasonic image on a first interface layer of the touch display screen, superposing a second interface layer which is arranged in a transparent manner above the first interface layer, and arranging the icon and the extension line on the second interface layer.
29. An intelligent controller, characterized in that the intelligent controller comprises:
a touch display screen;
the second communication module is used for receiving ultrasonic image data transmitted by the ultrasonic medical detection equipment and/or sending a control signal to the ultrasonic medical detection equipment;
a second memory storing a computer program running on a processor; and,
a second processor that implements the following steps when executing the program:
displaying the ultrasound image on the touch display screen,
an icon is superimposed and displayed on the ultrasound image,
displaying an extension line extending in a predetermined direction from an end of the icon,
monitoring movement of an input object in contact with the touch display screen,
determining that the movement of the contact corresponds to an operational position on the touch display screen,
updating the display of the extension line and the icon so that the extension line passes through the operation position,
identifying a termination position at which the contact is released, and,
determining the display of the extension line and the icon such that the extension line passes through the termination position, wherein the relative positional relationship between the icon and the extension line remains unchanged throughout, and,
generating a control signal containing ultrasonic imaging parameters according to the icon, and/or generating image data containing the ultrasonic image and remark information;
and outputting the control signal to the ultrasonic medical detection device through the second communication module, or outputting the image data.
30. The intelligent controller according to claim 29, wherein the second processor, when executing the program, implements the displaying of the extension line extending in the predetermined direction from the end of the icon in the following manner:
detecting the contact of an input object and the touch display screen to obtain an initial contact position;
generating an extension line extending from an end of the icon to the initial contact position; and,
displaying the extension line.
31. The intelligent controller of claim 29, wherein the icon comprises one of a probe marker, an annotation marker, a sampling line, and a sampling gate.
32. The intelligent controller according to claim 29, wherein the second processor, when executing the program, performs the updating the display of the extension line and icon to cause the extension line to pass through the operational position using at least one of:
rotating the icon and the extension line with a position on the icon as a center so that the extension line passes through the operation position, and,
translating the extension line and icon to pass the extension line through the operational position.
33. The intelligent controller of claim 29, wherein the second processor, when executing the program, further comprises, before rotating the icon and the extension line about a position on the icon to pass the extension line through the operating position:
identifying a direction of movement of the contact, and
determining that the motion of the contact deviates from the direction in which the extension line currently extends, or determining that an included angle between the direction of motion of the contact and the extension line satisfies a first predetermined condition, wherein the first predetermined condition is that the included angle is within a first angle range.
34. The intelligent controller of claim 32, wherein the second processor, when executing the program, further comprises, prior to the translating the extension line and icon causing the extension line to pass through the operational position:
identifying a direction of movement of the contact, and
determining that the contact moves along the direction in which the extension line extends, or determining that an included angle between the direction of motion of the contact and the extension line satisfies a second predetermined condition, wherein the second predetermined condition is that the included angle is within a second angle range different from the first angle range.
35. The intelligent controller of claim 32, wherein the second processor, when executing the program, implements the monitoring of movement of an input object in contact with the touch display screen, the determining that the movement of the contact corresponds to an operational position on the touch display screen, and the translating of the extension line and the icon such that the extension line passes through the operational position, in the following manner:
monitoring a first motion portion of the contact, the first motion portion proceeding along the extension direction in which the extension line currently lies,
determining that the first motion portion of the contact is associated with a first operation position on the touch display screen,
translating the icon together with the extension line so that the extension line passes through the first operation position,
monitoring a second motion portion of the contact, the second motion portion deviating from the extension direction in which the extension line currently lies,
determining that the second motion portion of the contact is associated with a second operation position on the touch display screen,
and translating the icon together with the extension line so that the extension line passes through the second operation position, the second operation position being any operation position on the touch display screen.
36. The intelligent controller of claim 29, wherein the extension line extends outside a display area of the ultrasound image.
37. The intelligent controller of claim 29, wherein the second processor, when executing the program, further performs the steps of:
and displaying the ultrasonic image on a first interface layer of the touch display screen, superposing a second interface layer which is arranged in a transparent manner above the first interface layer, and arranging the icon and the extension line on the second interface layer.
CN201780024747.2A 2017-02-07 2017-02-07 Ultrasonic medical detection equipment, imaging control method, imaging system and controller Active CN109069105B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/073045 WO2018145244A1 (en) 2017-02-07 2017-02-07 Ultrasound medical detection device, imaging control method, imaging system, controller

Publications (2)

Publication Number Publication Date
CN109069105A CN109069105A (en) 2018-12-21
CN109069105B true CN109069105B (en) 2021-08-24

Family

ID=63107647

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780024747.2A Active CN109069105B (en) 2017-02-07 2017-02-07 Ultrasonic medical detection equipment, imaging control method, imaging system and controller

Country Status (2)

Country Link
CN (1) CN109069105B (en)
WO (1) WO2018145244A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210015448A1 (en) * 2019-07-15 2021-01-21 GE Precision Healthcare LLC Methods and systems for imaging a needle from ultrasound imaging data

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5331335A (en) * 1990-08-10 1994-07-19 Fujitsu Limited Panning control device for controlling area of display image in computer aided design system
CN102440804A (en) * 2011-09-17 2012-05-09 无锡祥生医学影像有限责任公司 Ultrasonic diagnostic apparatus with touch screen and image amplification method thereof
CN102460364A (en) * 2009-06-10 2012-05-16 高通股份有限公司 User interface methods providing continuous zoom functionality
CN103356236A (en) * 2012-04-02 2013-10-23 富士胶片株式会社 An ultrasound diagnostic apparatus
CN103677616A (en) * 2012-09-18 2014-03-26 华硕电脑股份有限公司 Operating method of electronic device
CN104731449A (en) * 2009-05-19 2015-06-24 索尼公司 Digital image processing device and associated methodology of performing touch-based image scaling
CN105892857A (en) * 2016-03-31 2016-08-24 深圳市菲森科技有限公司 Image positioning method and device
CN105898189A (en) * 2014-05-06 2016-08-24 无锡威莱斯电子有限公司 Wireless reversing image system with adjustable reversing auxiliary lines

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101630761B1 (en) * 2012-09-24 2016-06-15 삼성전자주식회사 Ultrasound apparatus and method for providing information using the ultrasound apparatus
WO2014142468A1 (en) * 2013-03-13 2014-09-18 Samsung Electronics Co., Ltd. Method of providing copy image and ultrasound apparatus therefor
CN104407692B (en) * 2014-09-30 2018-09-07 深圳市亿思达科技集团有限公司 Hologram image interactive display method, control method and system based on ultrasound

Also Published As

Publication number Publication date
CN109069105A (en) 2018-12-21
WO2018145244A1 (en) 2018-08-16

Similar Documents

Publication Publication Date Title
US11801035B2 (en) Systems and methods for remote graphical feedback of ultrasound scanning technique
US10558350B2 (en) Method and apparatus for changing user interface based on user motion information
US9526473B2 (en) Apparatus and method for medical image searching
AU2013200054B2 (en) Touch free operation of devices by use of depth sensors
US9996160B2 (en) Method and apparatus for gesture detection and display control
CN109069108B (en) Ultrasonic medical detection equipment, transmission control method, imaging system and terminal
CN101179997B (en) Stylus-aided touchscreen control of ultrasound imaging devices
US20230329676A1 (en) Methods and apparatus for performing measurements on an ultrasound image
US11602332B2 (en) Methods and systems for multi-mode ultrasound imaging
EP3673813B1 (en) Ultrasound diagnosis apparatus
US20200129156A1 (en) Methods and apparatus for collecting color doppler ultrasound data
US20180210632A1 (en) Method and ultrasound imaging system for adjusting an ultrasound image with a touch screen
CN109069105B (en) Ultrasonic medical detection equipment, imaging control method, imaging system and controller
CN109069104B (en) Ultrasonic medical detection equipment, imaging control method, imaging system and controller
US10146908B2 (en) Method and system for enhanced visualization and navigation of three dimensional and four dimensional medical images
WO2017190360A1 (en) Medical detection system and control method therefor
CN102389322A (en) Ultrasonic diagnostic equipment with touch screen and color blood flow mode regulation method thereof
US20190114812A1 (en) Method and ultrasound imaging system for emphasizing an ultrasound image on a display screen
JP2011104109A (en) Ultrasonic diagnostic apparatus
US20200261062A1 (en) Ultrasound diagnostic apparatus, recording medium, and console guide display method
WO2018154714A1 (en) Operation input system, operation input method, and operation input program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant