WO2018145244A1 - Ultrasonic medical detection device, imaging control method, imaging system, and controller - Google Patents

Ultrasonic medical detection device, imaging control method, imaging system, and controller

Info

Publication number
WO2018145244A1
WO2018145244A1 (PCT/CN2017/073045)
Authority
WO
WIPO (PCT)
Prior art keywords
extension line
icon
contact
touch display
display screen
Prior art date
Application number
PCT/CN2017/073045
Other languages
English (en)
French (fr)
Inventor
Liu Zhiguang
Zhou Shuwen
He Xujin
Original Assignee
Shenzhen Mindray Bio-Medical Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Mindray Bio-Medical Electronics Co., Ltd.
Priority to CN201780024747.2A (CN109069105B)
Priority to PCT/CN2017/073045
Publication of WO2018145244A1

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 — Diagnosis using ultrasonic, sonic or infrasonic waves
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]

Definitions

  • The present invention relates to an ultrasonic imaging control method and imaging system employing a touch display screen.
  • Conventionally, an object whose angle needs to be adjusted is selected by hand or with a stylus and, once pressed, dragged on the screen to adjust its angle.
  • an ultrasonic medical testing apparatus comprising:
  • a transmitting circuit and a receiving circuit configured to excite the probe to emit an ultrasonic beam to the detecting object, and receive an echo of the ultrasonic beam to obtain an ultrasonic echo signal
  • An image processing module configured to obtain an ultrasound image according to the ultrasound echo signal
  • the first memory storing a computer program running on the processor
  • the first processor when the first processor executes the program, implements the following steps:
  • an ultrasound imaging control method comprising:
  • the excitation probe emits an ultrasonic beam to the detection object
  • an ultrasound imaging system comprising: an ultrasound medical detection device and an intelligent controller; wherein
  • the ultrasonic medical testing device includes:
  • a transmitting circuit and a receiving circuit configured to excite the probe to emit an ultrasonic beam to the detecting object, receive an echo of the ultrasonic beam, and obtain an ultrasonic echo signal
  • An image processing module configured to obtain an ultrasound image according to the ultrasound echo signal
  • a first communication module electrically connected to the image processing module, configured to transmit the ultrasound image data to the intelligent controller, and/or receive a control signal input by the intelligent controller to set to obtain the ultrasound image Required ultrasound imaging parameters;
  • the intelligent controller includes:
  • a second communication module configured to receive ultrasound image data transmitted from the first communication module, and/or send a control signal to the first communication module
  • the second processor when the second processor executes the program, implements the following steps:
  • an intelligent controller that includes:
  • a second communication module configured to receive ultrasound image data transmitted from the ultrasonic medical detection device, and/or to send a control signal to the ultrasonic medical detection device;
  • the second processor when the second processor executes the program, implements the following steps:
  • FIG. 1 is a schematic diagram of a system architecture of an ultrasonic medical testing device in accordance with some embodiments
  • FIG. 2 is a schematic diagram of a system architecture of an ultrasonic medical testing device in accordance with some embodiments
  • FIG. 3 provides a schematic diagram of a system architecture of an ultrasound detection system in accordance with some embodiments
  • FIG. 4 is a schematic flow chart of the ultrasonic imaging control method in the embodiment shown in FIG. 1 or FIG. 2;
  • FIG. 5 provides an embodiment of operational input on a graphical user interface for an icon 510 superimposed on an ultrasound image 501 in some embodiments;
  • FIG. 6 provides an embodiment of a rotational operation of an icon 610 superimposed on an ultrasound image 601 on a graphical user interface in some embodiments
  • FIG. 7 provides an embodiment of a panning operation for an icon 710 superimposed on an ultrasound image 701 on a graphical user interface in some embodiments;
  • FIG. 8 is a schematic diagram showing another implementation flow of the ultrasonic imaging control method in the embodiment shown in FIG. 1 or FIG. 2.
  • FIG. 1 is a schematic view showing the structure of an ultrasonic medical detecting apparatus 100 in an embodiment, and the specific structure is as follows.
  • the ultrasonic medical testing apparatus 100 shown in FIG. 1 mainly includes a probe 101, a transmitting circuit 103, a transmitting/receiving selection switch 102, a receiving circuit 104, a beam combining module 105, a signal processing module 116, and an image processing module 126.
  • the transmitting circuit 103 transmits a delayed-focused transmission pulse having a certain amplitude and polarity to the probe 101 through the transmission/reception selection switch 102.
  • The probe 101 is excited by the transmission pulse to emit an ultrasonic wave (which may be a plane wave, a focused wave, or a divergent wave) toward a detection object (for example, an organ, tissue, or blood vessel in a human or animal body, not shown), and after a certain delay receives an ultrasonic echo carrying information about the detection object reflected from the target area, reconverting the echo into an electrical signal.
  • the receiving circuit 104 receives the electrical signals generated by the conversion of the probe 101, obtains ultrasonic echo signals, and sends the ultrasonic echo signals to the beam combining module 105.
  • The beam combining module 105 performs processing such as focus delay, weighting, and channel summation on the ultrasonic echo signals, and then sends them to the signal processing module 116 for related signal processing.
  • the ultrasonic echo signals processed by the signal processing module 116 are sent to the image processing module 126.
  • The image processing module 126 processes the signals differently according to the imaging mode required by the user to obtain ultrasonic image data of different modes, and then forms ultrasonic images of different modes through logarithmic compression, dynamic range adjustment, digital scan conversion, and the like, such as B images, C images, D images, or other types of two-dimensional or three-dimensional ultrasound images.
  • In other words, the transmitting and receiving circuits excite the probe to emit an ultrasonic beam to the detection object according to the ultrasonic imaging parameter settings, and receive the echo of the beam to obtain an ultrasonic echo signal, thereby obtaining the desired ultrasonic image data, which is displayed to reveal the internal structure of the detection object.
  • The ultrasound imaging parameters mentioned herein refer to all parameters the user can select during imaging of ultrasound tissue images, such as TGC (time gain compensation), acoustic frequency, pulse repetition frequency (PRF), ultrasonic wave type, and dynamic range.
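The parameter set described above can be sketched as a simple container. This is an illustrative sketch only; the field names and units (e.g. `tgc_db`, `prf_hz`) are assumptions for the example, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class UltrasoundImagingParams:
    """Illustrative container for user-selectable imaging parameters."""
    tgc_db: list              # time-gain-compensation curve, dB per depth zone
    acoustic_freq_mhz: float  # transmit acoustic frequency
    prf_hz: float             # pulse repetition frequency
    wave_type: str            # "plane", "focused", or "divergent"
    dynamic_range_db: float   # display dynamic range

# A control signal from the intelligent controller could carry such a record
# to set or update the imaging parameters before the probe is excited.
params = UltrasoundImagingParams(
    tgc_db=[0.0, 2.0, 4.0, 6.0],
    acoustic_freq_mhz=5.0,
    prf_hz=1000.0,
    wave_type="focused",
    dynamic_range_db=60.0,
)
```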
  • The signal processing module 116 and the image processing module 126 of FIG. 1 may be integrated on one motherboard 106, or one or more of these modules may instead be integrated and implemented on a single processor/controller chip.
  • the obtained ultrasound image can be output to the display controller 170 for display.
  • the display controller 170 is connected to the image processing module through an input and output interface to implement data transmission.
  • The display controller 170 can include a first touch display screen 130, a processor 140, and a first memory 160.
  • The processor 140 invokes computer program instructions stored in the first memory 160 to display the ultrasound image on the first touch display 130 and/or to form a graphical user interface on the touch display.
  • A graphical user interface is displayed on the first touch display 130, presenting graphical controls such as the ultrasound imaging parameter adjustments involved in the imaging process, various function keys, and the like.
  • Based on the graphical user interface (GUI), control instructions for the corresponding graphical controls, generated by operating an input object on the touch display, can be obtained; these control commands, carrying information such as ultrasonic imaging parameters, can be transmitted by wire or wirelessly to the ultrasonic medical testing device, where they control the operation of the probe, the transmitting circuit, the receiving circuit, and so on, to obtain the desired ultrasonic image.
  • the ultrasound image may be displayed on the two display screens, respectively, or on the same display screen.
  • An ultrasound image can be displayed on the touch display together with a graphical user interface (GUI) for the user's command input.
  • A display area for the ultrasound image can also be set on the interface, allowing the user to edit the image through gesture input; such editing includes adjusting the image size, adjusting sharpness, adding annotations, and the like.
  • the processor 140 can call the gesture detection module 113 stored in the memory 160 to detect a control command obtained by the user performing a contact operation on the graphical user interface through the input object.
  • A touch display having a graphical user interface (GUI), one or more processors, memory, and one or more modules, programs, or instruction sets stored in memory for performing various functions are included; together these implement GUI-based manipulation input detection and obtain the relevant control instructions.
  • these functions may include parameter adjustment, information input, etc.
  • an object (e.g., a patient's tissue)
  • The modules, programs, or instructions for executing these functions may be included in a computer program product configured for execution by one or more processors.
  • the user interacts with the graphical user interface primarily through gesture input on the touch display.
  • the gesture input herein may include any type of user gesture input that the device can detect by directly touching the touch display or proximity to the touch display.
  • The gesture input may be made with a finger of the right or left hand (e.g., an index finger or thumb), or with an input object detectable by the touch display screen (e.g., a stylus or a dedicated touch-screen pen), selecting one position, multiple positions, and/or multiple consecutive positions on the touch display screen, and may include operational actions such as touch, touch release, touch tap, long contact, rotation, spread, and the like.
  • Long contact corresponds to a gesture input in which a finger, thumb, or stylus is moved in a predetermined or variable direction while remaining in continuous contact with the touch display screen, such as touch-and-drag, flick, wipe, slide, and sweep gesture operations.
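A minimal sketch of how a detector might distinguish a tap, a long contact, and a drag from a single contact's duration and displacement. The thresholds `TAP_MAX_SECONDS` and `DRAG_MIN_PIXELS` are assumed values for illustration, not taken from the patent:

```python
import math

TAP_MAX_SECONDS = 0.3   # assumed threshold: shorter contacts count as taps
DRAG_MIN_PIXELS = 10.0  # assumed threshold: smaller movement is not a drag

def classify_gesture(down_xy, up_xy, duration_s):
    """Classify a single-contact gesture from its start point, end point,
    and contact duration."""
    dist = math.hypot(up_xy[0] - down_xy[0], up_xy[1] - down_xy[1])
    if dist >= DRAG_MIN_PIXELS:
        return "drag"        # continuous contact with movement (slide/sweep)
    if duration_s <= TAP_MAX_SECONDS:
        return "tap"         # brief contact with little movement
    return "long_press"      # long contact with little movement
```

A real gesture module would of course classify incrementally as touch events arrive rather than only at touch-up, but the thresholds play the same role.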
  • The gesture input is realized by contact of the input object with the touch display screen, where "contact" may include direct contact (by a finger, thumb, or stylus) or proximity to the touch display screen without direct contact.
  • A gesture input in proximity to the touch display screen without direct contact refers to a gesture operation performed at a spatial position close to the touch screen display.
  • the above graphical user interface refers to an overall design of human-computer interaction, operation logic, and interface aesthetics of the software, which may include one or more soft keyboards and multiple graphic control objects.
  • A soft keyboard can include a number of icons (or soft keys), allowing the user to select one or more icons and thus input one or more corresponding symbols.
  • the gesture detection module 113 can detect a gesture input that interacts between the input object and the touch display screen.
  • The gesture detection module 113 performs various operations for gesture input detection, such as determining whether a contact has occurred, whether the gesture input is continuous, whether it corresponds to a predetermined gesture, the operation position corresponding to the gesture input, whether that operation position has moved to the edge of the corresponding display area, whether the gesture input has been interrupted (e.g., whether the contact has stopped), and the movement of the gesture input, including tracking its movement trajectory.
  • Determining the motion of the gesture input may include determining the speed (magnitude), velocity (magnitude and direction), and acceleration (change in magnitude and/or direction) of the operation position corresponding to the gesture input, as well as its motion trajectory. These operations can be applied to a single operation position (e.g., a gesture input made with one finger) or to multiple simultaneous operation positions (e.g., "multi-touch", i.e., gesture input made with multiple fingers).
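The speed and direction of the operation position can be estimated from consecutive touch samples. A hedged sketch follows; the `(t, x, y)` sample format is an assumption made for the example:

```python
import math

def motion_metrics(trajectory):
    """Estimate the speed (magnitude) and direction of motion from the last
    two (t, x, y) touch samples of a gesture's trajectory."""
    (t0, x0, y0), (t1, x1, y1) = trajectory[-2], trajectory[-1]
    dt = t1 - t0
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    speed = math.hypot(vx, vy)       # magnitude of the velocity
    direction = math.atan2(vy, vx)   # direction in radians (screen coords)
    return speed, direction

# Two samples one time unit apart, moving 3 px right and 4 px down:
speed, direction = motion_metrics([(0, 0.0, 0.0), (1, 3.0, 4.0)])  # speed = 5.0
```

Acceleration could be estimated the same way, by differencing successive velocity estimates over the trajectory.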
  • The gesture detection module 113 is configured to detect motion of one or more input objects on the surface of the touch screen display or at a spatial location proximate to it. The gesture detection module 113 is stored in the memory, and the gesture inputs described above are monitored through calls by one or more processors to obtain the user's operation input instructions.
  • The processor 140 and the first memory 160 may be disposed on the main board 106, disposed independently of it, or integrated with the touch display screen to form an independent display controller 170, which both displays the ultrasonic image and obtains the user's control commands based on that image.
  • In some embodiments, the signal processing module 116 and/or the image processing module 126 of FIG. 1 are disposed together with the processor 140, so that data processing of the ultrasound image, monitoring of the gesture input described above, and generation of the graphical user interface are performed on one or more processors, collectively referred to as the first processor.
  • Also shown is a common ultrasonic medical testing apparatus, which comprises a display 1, a control key operating area 3, a display support arm 2, a host 4, and a foot-switch control 5.
  • The display 1 may be the same as the first touch display 130 described above, and the host 4 includes the main board 106 described above, or further includes the processor 140 and the first memory 160; the parts involved in all data processing are therefore collectively referred to as the first processor.
  • the ultrasonic medical testing apparatus 200 includes a probe 201, a transmitting circuit 203, a transmitting/receiving selection switch 202, a receiving circuit 204, a beam combining module 205, a signal processing module 216, and an image processing module 226.
  • The functions and implementations of the probe 201, the transmitting circuit 203, the transmit/receive selection switch 202, the receiving circuit 204, the beam combining module 205, the signal processing module 216, and the image processing module 226 are the same as those of the probe 101, the transmitting circuit 103, the transmit/receive selection switch 102, the receiving circuit 104, the beam combining module 105, the signal processing module 116, and the image processing module 126 in the embodiment shown in FIG. 1, and are not described again here.
  • The signal processing module 216 and the image processing module 226 of FIG. 2 may be integrated on one motherboard 206, or one or more of these modules may instead be integrated and implemented on a single processor/controller chip. The difference from the embodiment shown in FIG. 1 is as follows.
  • The ultrasonic medical detecting apparatus 200 further includes a first communication module 215 electrically connected to the image processing module 226, for transmitting the ultrasonic image data obtained by the image processing module 226 to the intelligent controller 270, and/or for receiving control signals input by the intelligent controller 270 to set the ultrasound imaging parameters used in the ultrasound imaging process.
  • the operation of setting the ultrasound imaging parameters includes updating the ultrasound imaging parameters, adjusting the ultrasound imaging parameters, or initializing the settings of the ultrasound imaging parameters.
  • the intelligent controller 270 in this embodiment includes a second touch display screen 230, a second processor 240, a second memory 260, and a second communication module 214.
  • the second memory 260 stores a computer program running on the second processor 240, such as the gesture detection module 213.
  • The gesture detection module 213 in this embodiment has the same function as the gesture detection module 113 in the embodiment shown in FIG. 1, and that description is not repeated here.
  • The second touch display screen 230 has the same function as the first touch display screen 130, though the specific product parameters may differ; the prefixes "first" and "second" serve only to distinguish entities in different application scenarios of the embodiments. When the description below concerns method steps or a single application scenario, either screen can equivalently be interpreted as a touch display in the traditional sense, so the remainder of this text may simply refer to "the touch display".
  • the second communication module 214 receives the ultrasound image data transmitted from the first communication module 215 and/or transmits a control signal, such as a control signal containing ultrasound imaging parameter setting information, to the first communication module 215.
  • The intelligent controller 270 includes the display controller mentioned in FIG. 1, but may also be a computer device with a touch display screen, such as various smart terminal devices like an iPad or a mobile phone.
  • The intelligent controller 270 in this embodiment may also be the iPad terminal controller 6 shown in the figure.
  • The first communication module 215 and the second communication module 214 can communicate using a wireless data transmission protocol such as a Wi-Fi protocol, a Bluetooth transmission protocol, or a mobile communication network protocol.
  • the ultrasonic medical testing device 200 and the intelligent controller 270 constitute an ultrasonic imaging system.
  • Also provided is an ultrasound imaging system, or an ultrasonic medical testing device 100, which integrates two display screens: the display 1 and the iPad terminal controller 6.
  • The iPad terminal controller 6 can be used to generate a graphical user interface to obtain the user's ultrasound imaging parameter adjustment commands or to edit ultrasound images; it can also be used to display ultrasound images.
  • The iPad terminal controller 6 also includes the second touch display screen 230.
  • An intelligent mobile terminal such as a smartphone can likewise serve as the smart controller 270, equivalent to the iPad terminal controller 6.
  • The adjustment interfaces for these parameters are displayed on the touch screen, and the displayed interface changes as the finger or stylus slides.
  • When there are many objects on the screen or the object to be selected is small, it is difficult to select the target object.
  • When a finger or stylus taps the screen, the finger covers the target object, so it is not easy to hit the target precisely, and other objects are easily selected by mistake.
  • When the angle to be adjusted is small, even a short stroke of the finger or stylus can rotate the target through a large angle, so it is difficult for the user to finely control the size of the adjustment.
  • The device disclosed in this embodiment is an ultrasonic imaging system, or an ultrasonic medical detecting device, capable of adjusting the imaging parameters of the device or editing the ultrasonic image by touch control. Through interaction between the user and the graphical user interface on the touch display screen, the image data and ultrasound imaging parameters obtained by the ultrasound imaging device can be adjusted or edited more intuitively, increasing the convenience of operating the ultrasound device, improving the user experience, and achieving precise positioning adjustment on the graphical interface.
  • FIG. 4 is a flow chart showing the ultrasonic imaging control method in the embodiment shown in FIG. 1, FIG. 2 or FIG.
  • In step S210 of FIG. 4, the transmitting circuit and the receiving circuit (103 and 104, or 203 and 204) excite the probe (101, 201) to transmit an ultrasonic beam to the detection object according to the set ultrasonic imaging parameters, and in step S212 the probe (101, 201) receives the echo of the ultrasonic beam to obtain an ultrasonic echo signal.
  • the ultrasound imaging parameters in this embodiment include position information of the sampling gate.
  • The first processor (including the image processing module 126 in FIG. 1) or the image processing module 226 obtains an ultrasound image based on the ultrasound echo signal; for example, in the embodiment of FIG. 1 the image processing module 126 obtains the ultrasound image from the ultrasound echo signal, while in the embodiment of FIG. 2 this is done by the image processing module 226.
  • a first memory is provided for storing a computer program running on the processor, such as the gesture detecting module 113 described above.
  • a second memory is provided in the intelligent controller 270 of FIG. 2 for storing a computer program running on the processor, such as the gesture detection module 213 described above.
  • the ultrasound image herein may be a different mode of ultrasound image as previously described, such as a B image, a C image, a D image, etc., or other types of two-dimensional ultrasound images or three-dimensional ultrasound images.
  • the ultrasound image mentioned herein may be a static frame image or a dynamic video image.
  • the first processor or the second processor 240 of FIG. 2 outputs the obtained ultrasonic image to the touch display screen for display.
  • an image display area for ultrasound image display is set on a graphical user interface layer formed on the touch display screen.
  • The graphical user interface includes at least two interface layers: an ultrasound image is displayed on the first interface layer of the touch screen display, a transparently disposed second interface layer is superimposed on the first, and the icon is set on the second interface layer.
  • This arrangement allows data other than the image data to be displayed on top of the ultrasound image without obscuring the display of the image itself, and enables the user to observe changes in the ultrasound image caused by adjustments to the ultrasound imaging parameters, or to save and transfer information such as edited annotations together with the ultrasound image.
  • the first processor or the second processor 240 of FIG. 2 superimposes the display icon over the above-described displayed ultrasound image.
  • This icon includes one of a probe marker, a comment marker, a sample line, a sampling gate, and the like, for example the arrow 510 displayed on the graphical user interface 500 in FIG. 5 and the sampling gate 610 displayed on the graphical user interface 600 in FIG. 6.
  • an icon can be displayed on the second interface layer.
  • the shape of the icon is not limited.
  • In one embodiment, the icon is a sampling gate, such as the sampling gate 610 in FIG. 6.
  • In step S220 of FIG. 4, the first processor or the second processor 240 of FIG. 2 displays, on the first touch display screen 130 or the second touch display screen 230, an extension line extending from the end of the icon in a predetermined direction.
  • The extension line is superimposed over the ultrasound image displayed on the touch display screen (130, 230); for example, in one embodiment the extension line can be displayed on the second interface layer.
  • A manipulable sensitive area is set correspondingly; that is, the touch display screen (130, 230) is provided with a sensitive area corresponding to the extension line, so that the extension line is associated with a sensitive area on the touch display screen (130, 230) used for gesture recognition.
  • A sensitive area refers to the position corresponding to a manipulable icon or indicator (e.g., an extension line or icon) on the graphical user interface. Positions on the graphical user interface are usually located by establishing a rectangular interface coordinate system: for example, if (X, Y) denotes the coordinates of a pixel in this Cartesian coordinate system, then a set of (X, Y) points constitutes the corresponding display area or sensitive area.
  • The sensitive area corresponding to the extension line includes a preset neighborhood range around the position where the extension line is displayed. When the contact of the input object with the touch display screen falls within this neighborhood, the extension line is selected by default and the extension line adjustment operation is activated. For example:
  • the extension line 511 extending in a predetermined direction from the end of the icon 510 corresponds to the sensitive area 512 on the interface;
  • the extension line 611 extending in a predetermined direction from the end of the icon 610 corresponds to the sensitive area 612 on the interface.
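One way to realize the neighborhood-based sensitive area is a hit test on the distance from the contact point to the extension line segment. A sketch follows, where the 15-pixel margin is an assumed neighborhood width, not a value from the patent:

```python
import math

def point_to_segment_distance(p, a, b):
    """Shortest distance from point p to the segment ab (all (x, y) tuples)."""
    ax, ay = a; bx, by = b; px, py = p
    abx, aby = bx - ax, by - ay
    ab2 = abx * abx + aby * aby
    if ab2 == 0.0:                       # degenerate segment: a == b
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamping to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / ab2))
    cx, cy = ax + t * abx, ay + t * aby  # closest point on the segment
    return math.hypot(px - cx, py - cy)

def in_sensitive_area(contact, line_start, line_end, margin=15.0):
    """True if the contact falls within the margin-wide neighborhood
    of the extension line."""
    return point_to_segment_distance(contact, line_start, line_end) <= margin
```

A touch-down whose distance to the line is within the margin would then select the extension line and activate the adjustment operation.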
  • In some embodiments, the sensitive area may also cover parts of the graphical user interface other than the position where the extension line is displayed, so that the user can conveniently manipulate the extension line outside the ultrasound image without obscuring the display of the image, for more precise adjustment.
  • The sensitive area outside the location where the extension line is displayed monitors the contact of the input object with the touch display and can be associated with manipulation of the extension line.
  • The "position" mentioned herein includes orientation information, coordinate information, and/or angle information; for example, the display position of an icon or extension line on the graphical user interface may be characterized by the coordinate information of the pixels at which it is located.
  • The extension line may be a dashed line, but is not limited to one; it may serve as an extension control object whenever any object needs adjustment.
  • the above extension line is a straight line.
  • The extension line starts from the end of the icon, and this end can be located at any point on the icon; for example, it can be the end of the arrow in FIG. 5 or the middle point of the sampling gate (611) in FIG. 6.
  • The extension line makes it convenient for the user to accurately position the icon when adjusting its position.
  • The extension line can extend from the end of the icon to outside the display area of the ultrasound image.
  • the extension line 511 in FIG. 5 can be extended beyond the display area 501 of the ultrasound image
  • the extension line 611 in FIG. 6 can be extended beyond the display area 601 of the ultrasound image.
  • The predetermined direction may be any direction starting from the end of the icon, a specific preset direction, or a direction determined from the user's initial input, as in one of the embodiments below.
  • The first processor or the second processor 240 of FIG. 2 causes the first touch display screen 130 or the second touch display screen 230 to display a line extending in the predetermined direction from the end of the icon, obtaining the extension line.
  • a prompt can be displayed on the graphical user interface at the same time, the prompt being used to indicate the direction of rotation or translation.
  • The prompt 503 in FIG. 5 prompts rotation to the left; the prompt 603 in FIG. 6 prompts rotation to the right, and the prompt 604 prompts rotation to the left.
  • The first processor or the second processor 240 in FIG. 2 associates the sensitive area with the extension line and records the correspondence between them.
  • The recorded correspondence between the sensitive area and the extension line may also be stored in the memory (160, 260).
  • The correspondence may be recorded as a mapping from the range of the sensitive area's pixel region on the touch display screen to the display position of the extension line, to facilitate subsequent quick lookup and gesture recognition based on user input.
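The recorded correspondence can be held as a simple lookup table from each extension line to its sensitive region, so a touch-down resolves quickly to the line it targets. A sketch with hypothetical names (`register_line`, axis-aligned rectangles) standing in for whatever representation an implementation actually uses:

```python
# Hypothetical registry: extension line id -> sensitive rectangle
# (x_min, y_min, x_max, y_max) in touch-display pixel coordinates.
sensitive_areas = {}

def register_line(line_id, rect):
    """Record the correspondence between an extension line and its sensitive area."""
    sensitive_areas[line_id] = rect

def lookup(contact):
    """Return the id of the extension line whose sensitive area contains the
    contact position, or None if the touch misses every registered area."""
    x, y = contact
    for line_id, (x0, y0, x1, y1) in sensitive_areas.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return line_id
    return None

register_line("extension_611", (0.0, 0.0, 100.0, 20.0))
```

When the line moves, re-registering it with a new rectangle keeps the stored correspondence consistent with the display.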
  • When the position of the extension line changes, the corresponding sensitive area changes accordingly.
  • For example, as the input object 602 moves from position 613 to position 614, the extension line 611 moves and its corresponding sensitive area in the figure changes as well.
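Re-deriving the extension line as the contact moves can be sketched as follows: the line is rotated about the icon's end so that it points toward the current contact, and the new end point (from which the sensitive area would be recomputed) is returned. The anchor coordinates and line length below are illustrative values, not taken from the figures:

```python
import math

def rotate_extension_line(anchor, contact, length):
    """Recompute the extension line's end point so that the line points from
    the icon anchor toward the current contact position, preserving length."""
    angle = math.atan2(contact[1] - anchor[1], contact[0] - anchor[0])
    end = (anchor[0] + length * math.cos(angle),
           anchor[1] + length * math.sin(angle))
    return angle, end

# As the contact moves (e.g., from position 613 to 614), the line and hence
# its sensitive area are re-derived from the new end point:
angle, end = rotate_extension_line((100.0, 100.0), (200.0, 100.0), 80.0)
# angle = 0.0, end = (180.0, 100.0)
```

Because the user drags on the extension line (or its sensitive area) away from the icon, a small finger movement corresponds to a small angle change, which is the fine-control benefit the embodiment describes.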
  • The sensitive area of the extension line is set according to the final position determined after the extension line is displayed, and whether an input operation targets the extension line is determined by checking whether the initial contact of the input object with the touch display screen lies within the sensitive area, so as to avoid misoperation.
In step S224 of FIG. 4, the first processor, or the second processor 240 of FIG. 2, invokes the gesture detection module (113, 213) to monitor the motion of the contact between the input object and the touch display screen (130, 230). In step S226 of FIG. 4, the processor invokes the gesture detection module (113, 213) to determine the operation position on the touch display screen to which the motion of the contact corresponds. The operation position on the interface, as used in this document, refers to the position on the display interface corresponding to the user's input on an interface object (such as an icon or extension line) via the human-machine interaction device; the position may be expressed as coordinates in a Cartesian coordinate system or as angle information in a polar coordinate system. The determined operation position may be a single pixel position or a block of several pixel positions. First, the contact of the input object (602) with the touch display screen (130, 230) is detected to obtain an initial operating position (613); second, it is determined whether the initial operating position (613) is within the sensitive area 612. When the initial operating position (613) is within the sensitive area 612, that is, when the contact of the input object (602) with the touch display screen (130, 230) is detected within the sensitive area 612, this indicates that the input object 602 has contacted the touch display screen (130, 230) inside the sensitive area 612 and that the user is operating on the extension line. The operation position determined in step S226 may be a single operation position (position 614 in FIG. 6), or a plurality of continuously varying operation positions from the initial operation position 613 to the operation position 614.
In step S224, the first processor, or the second processor 240 in FIG. 2, invokes the gesture detection module (113 or 213) to monitor the contact of the input object with the touch display screen (130 or 230). The motion may be a continuous contact of the input object with the touch display (130 or 230), such as the long contact mentioned above. Step S224 monitors the continuous contact of the input object with the touch display screen (130 or 230) within the sensitive area (which can also be understood as a continuously moving contact). During this continuous or moving contact, the first processor, or the second processor 240 in FIG. 2, can recognize, through the gesture detection module (113 or 213), a series of continuously changing operation positions on the touch display (130 or 230). In step S226, the processor determines a plurality of operation positions of the contact on the touch display screen (130 or 230), obtaining a plurality of continuously changing operation positions, which may be arranged in a sequence along the direction of movement of the contact.
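One way to obtain such an ordered sequence of continuously changing operation positions is to reduce the raw touch samples to positions spaced along the movement. The sampling format and the `min_step` spacing below are illustrative assumptions:

```python
import math

def track_contact(samples, min_step=2.0):
    """Reduce raw touch samples [(x, y), ...] to a sequence of
    continuously changing operation positions: keep a new position
    only once it has moved at least `min_step` pixels from the
    last kept one, preserving the order of movement."""
    kept = [samples[0]]
    for (x, y) in samples[1:]:
        lx, ly = kept[-1]
        if math.hypot(x - lx, y - ly) >= min_step:
            kept.append((x, y))
    return kept
```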
In step S228 of FIG. 4, the first processor, or the second processor 240 of FIG. 2, updates the display of the extension line and the icon so that the extension line passes through the determined operation position. The update mentioned here means deleting the display of the extension line and the icon at the original position and redrawing them at the position corresponding to the current operation position, so that the extension line and the icon change as the monitored operation position changes, for example as in FIG. 6. In one embodiment, the above steps S224 to S228 include: the first processor, or the second processor 240 of FIG. 2, invokes the gesture detection module to monitor the motion (which may also be a continuous contact) of the contact of the input object with the touch display screen (130 or 230) within the sensitive area, determines the plurality of continuously changing operation positions to which the moving contact corresponds on the touch display screen, and updates the display of the extension line and the icon so that the display position of the extension line on the touch display screen changes through the plurality of continuously changing operation positions in sequence; the extension line and the icon thereby change as the monitored operation position changes.
In one embodiment, the method may further include: the first processor, or the second processor 240 in FIG. 2, invokes the gesture detection module to detect whether the contact of the input object with the touch display screen (130 or 230) is within the sensitive area where the extension line is located. When the contact is within that sensitive area, the processes of the above steps S224 to S228 are performed; when it is not, they are not performed. For a description of the process from step S224 to step S228, refer to the related description above. By tracking the contact between the input object and the touch display screen (130 or 230) in this way, the input operation on the extension line can be confirmed, ensuring the accuracy of the control signal input and the positioning accuracy of the icon adjustment. For example, the first processor, or the second processor 240 in FIG. 2, invokes the gesture detection module to detect the contact of the input object 602 with the touch display screen (130 or 230) (i.e., the operation position 613); steps S224 to S228 are then performed, tracking the movement of the contact of the input object 602 with the touch display screen (130 or 230) in the direction indicated by the prompt 603 or the direction indicated by the prompt 604. Otherwise, the processes of the above steps S224 to S228 are not performed.
The process of the above steps S224 to S228 causes the extension line to change its display position in accordance with the continuous contact of the input object with the touch display screen. To improve the visual effect of the extension line following the continuous contact when updating the display of the extension line and the icon to the above operation positions, the speed of change of the extension line between two operation positions on the graphical user interface can be calculated from a visual display moving speed, and the display of the extension line and the icon is moved between the two operation positions based on that speed, presenting a continuous display movement effect.
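The continuous movement effect between two operation positions can be sketched by interpolating intermediate display positions at a fixed visual speed; the per-frame speed value below is an illustrative assumption:

```python
import math

def interpolate_positions(p0, p1, speed_px_per_frame=4.0):
    """Generate intermediate display positions between two operation
    positions so the extension line appears to move continuously
    rather than jumping from p0 to p1."""
    x0, y0 = p0
    x1, y1 = p1
    dist = math.hypot(x1 - x0, y1 - y0)
    steps = max(1, int(dist / speed_px_per_frame))
    return [(x0 + (x1 - x0) * i / steps, y0 + (y1 - y0) * i / steps)
            for i in range(1, steps + 1)]
```

Redrawing the extension line and icon at each returned position in turn yields the continuous display movement described above.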
The process of updating the display of the extension line and the icon in step S228 so that the extension line passes through the above operation position may adopt one of the following ways. In the first way, the icon 610 and the extension line 611 are rotated about a position on the icon 610 so that the extension line 611 passes through the operation position (613, 614); that is, an extension line 611 starting from a position on the icon 610 and passing through the operation position is redrawn, and the icon is drawn at the same time according to the original relative positional relationship between the icon 610 and the extension line 611. When the extension line 611 is rotated about a position on the icon 610, the direction of rotation can be obtained from the recognized direction of movement of the above contact. In the second way, the extension line 711 and the icon 710 are translated so that the extension line 711 passes through the operation position (713, 714, 715); that is, the icon 710 is translated along with the extension line 711, and the extension line 711 is updated in sequence through the changing operation positions (713, 714, 715).
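The two update modes just described, rotation about a position on the icon and rigid translation, can be sketched geometrically. The point lists standing in for the icon and extension-line vertices, and the convention that the first point defines the line's current direction, are assumptions for illustration:

```python
import math

def rotate_about(pivot, points, target):
    """Rotate `points` (icon + extension-line vertices) about `pivot`
    so that the ray from the pivot toward the first point passes
    through `target` (the operation position)."""
    px, py = pivot
    cx, cy = points[0]
    cur = math.atan2(cy - py, cx - px)          # current line direction
    des = math.atan2(target[1] - py, target[0] - px)  # desired direction
    d = des - cur
    cos_d, sin_d = math.cos(d), math.sin(d)
    return [(px + (x - px) * cos_d - (y - py) * sin_d,
             py + (x - px) * sin_d + (y - py) * cos_d) for (x, y) in points]

def translate_through(points, line_point, target):
    """Translate icon + extension line rigidly so that `line_point`
    (a point on the extension line) lands on `target`."""
    dx, dy = target[0] - line_point[0], target[1] - line_point[1]
    return [(x + dx, y + dy) for (x, y) in points]
```

In both modes all vertices move by the same rigid transform, which is what keeps the relative positional relationship between the icon and the extension line unchanged.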
In one embodiment, the following steps are performed to update the display of the extension line and the icon in the above step S228 so that the extension line passes through the above operation position. In step S827, the first processor, or the second processor 240 in FIG. 2, identifies the moving direction of the contact of the input object with the touch display screen; step S828 is then performed to determine the angle condition between the moving direction and the extension line, and the display of the extension line and the icon is updated accordingly so that the extension line passes through the above operation position, for example as shown in FIG. 8 and FIG. 6. If step S828 determines that the angle between the moving direction (the direction indicated by the prompt 604) and the extension line 611 satisfies the first preset condition, step S8291 is performed: the icon 610 and the extension line 611 are rotated about a position on the icon 610 so that the extension line 611 passes through the operation position 614; at the same time, the extension line 611 that started from the same position on the icon 610 and passed through the operation position 613 is cleared. In FIG. 7, an ultrasound image 701 is displayed, and an extension line 711 of the icon 710 is controlled by an input operation of the input object 702. Monitoring the contact of the input object 702 with the touch display screen first obtains an initial operation position 713. When the contact is within the sensitive area 712, i.e., the initial operation position 713 is within the sensitive area 712 (or the initial operation position 713 coincides with the sensitive area 712), this indicates that the user is operating on the extension line. The movement of the contact of the input object 702 with the touch display screen (which may also be a continuous contact) is then tracked, and the plurality of changing operation positions on the touch display screen up to position 714 are determined, while the original display of the extension line (i.e., the extension line 711 indicated by the dashed line at position 713) is removed.
In one embodiment, the second preset condition is that the included angle is equal to zero, and the first preset condition is that the included angle is greater than zero. The embodiment is not limited thereto: the second preset condition may be that the angle lies within a smaller angle range, and the first preset condition may be that the angle lies within a larger angle range. The moving direction can be calculated by using the gesture detection module to track and monitor the motion of the contact on the touch screen; for example, the direction of motion of the above contact can be determined from the line connecting two operation positions on the touch display screen to which the motion of the contact corresponds.
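Deriving the movement direction from two successive operation positions and testing the angle condition against the extension line might look like the following sketch. The threshold splitting the "smaller" and "larger" angle ranges is an assumed value, not one given in the patent:

```python
import math

def movement_direction(p_prev, p_curr):
    """Unit vector of contact movement, from the line connecting
    two successive operation positions."""
    dx, dy = p_curr[0] - p_prev[0], p_curr[1] - p_prev[1]
    n = math.hypot(dx, dy)
    return (dx / n, dy / n)

def choose_update(direction, line_dir, tol_deg=10.0):
    """Included angle (0..90 degrees) between movement direction and
    the extension line: 'translate' when the angle is within the
    smaller range (second preset condition), 'rotate' otherwise
    (first preset condition). `line_dir` must be a unit vector."""
    dot = abs(direction[0] * line_dir[0] + direction[1] * line_dir[1])
    ang = math.degrees(math.acos(min(1.0, dot)))
    return "translate" if ang <= tol_deg else "rotate"
```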
In one embodiment, following step S826 (or step S226 in FIG. 4), step S8292 is implemented in the following manner: the movement of the contact between the input object and the touch display screen is monitored, the operation positions on the touch display screen to which the motion of the contact corresponds are determined, and the extension line and the icon are translated so that the extension line passes through the above operation positions, as shown in FIG. 7. The above-described first moving portion is the process of moving from the operation position 713 to the operation position 714, i.e., 713→714; the movement of the first moving portion (713→714) proceeds in the direction indicated by the prompt 703. The embodiment further includes: first, detecting the contact between the input object (702) and the touch display screen (130, 230) to obtain an initial operation position (713); second, determining whether the initial operation position (713) is within the sensitive area 712. When the initial operation position (713) is within the sensitive area 712, that is, when the contact of the input object (702) with the touch display screen (130, 230) is detected within the sensitive area 712, this indicates that the input object 702 has contacted the touch display screen (130, 230) inside the sensitive area 712 and that the user is operating on the extension line. The first moving portion of the contact of the input object with the touch display screen (130, 230) is then tracked.
The first operation position may be any operation position along the direction in which the extension line currently extends, for example 714 in FIG. 7, or one of a plurality of changing operation positions between 713 and 714. The second moving portion may be the process of moving from the operation position 714 to the operation position 715 in FIG. 7, i.e., 714→715; the extension line 711 may be moved from its current direction in the direction indicated by the prompt 704. The second operation position may be any operation position on the touch display screen, for example 715 in FIG. 7, or one of a plurality of changing operation positions between 714 and 715. The above-described icon 710 is translated along with the extension line 711 so that the extension line 711 passes through the second operation position, for example the operation position 715, or the plurality of changing operation positions between 714 and 715. If, during translation, the extension line 711 is moved so that it passes through the plurality of changing operation positions between 714 and 715, a display effect in which the icon 710 and the extension line 711 follow the movement of the input object 702 can be realized. In this embodiment, the input object remains in contact with the touch display screen throughout the movement from the operation position 713 to the operation position 715.
In step S230 of FIG. 4, the first processor, or the second processor 240 in FIG. 2, invokes the above-described gesture detection module to recognize the termination position produced by releasing the contact between the input object and the touch display screen (130 or 230). The first processor, or the second processor 240 of FIG. 2, can at least recognize the termination position generated by the release of the contact: the disengagement of the contact between the input object and the touch display screen (130 or 230) is monitored, and the operation position at which the contact was located before disengagement is taken as the termination position. For example, while the input object 602 is in contact with the touch display screen (130 or 230), the processor detects that the contact moves from the operation position 613 to the operation position 614 and therefore continuously updates the display of the extension line, rotating it until it passes through the operation position 614; when the processor detects the disengagement of the contact between the input object 602 and the touch display screen (130 or 230), the operation position 614 at which the contact was located before disengagement is taken as the termination position. Similarly, while the input object 702 is in contact with the touch display screen (130 or 230), the processor detects that the contact moves continuously from the operation position 713 to the operation position 715 and therefore continuously updates the display of the extension line, translating it until it passes through the operation position 715; the input object is then lifted at the operation position 715, the processor detects the disengagement of the contact between the input object 702 and the touch display screen (130 or 230), and the operation position 715 at which the contact was located before disengagement is taken as the termination position. In addition, the contact between the input object and the touch display screen (130 or 230) moving beyond a predetermined range may also be regarded as releasing the contact, where the predetermined range may be a certain interface area containing the ultrasound image.
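Release detection as described, taking the last operation position before disengagement (or before the contact leaves the predetermined range) as the termination position, can be sketched as follows; the `(kind, position)` event format is an assumption:

```python
def termination_position(events, in_range=lambda p: True):
    """Consume touch events [('move', (x, y)), ..., ('up', None)] and
    return the termination position: the last operation position before
    the contact is released or leaves the predetermined range."""
    last = None
    for kind, pos in events:
        if kind == "up":
            return last
        if kind == "move":
            if not in_range(pos):   # leaving the range counts as release
                return last
            last = pos
    return last
```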
In step S232 of FIG. 4, the first processor, or the second processor 240 of FIG. 2, determines the display of the extension line and the icon so that the extension line passes through the above termination position. Here, the "passing" of an extension line means that the extension line passes through at least one pixel position within the termination position or the operation position. The relative positional relationship between the icon and the extension line mentioned herein includes the angular relationship between the icon and the extension line, the position of the icon relative to the extension line, and the like. The icon and the extension line are linked: the icon rotates when the extension line rotates, and the icon moves when the extension line translates. Determining the display of the extension line and the icon means determining where the extension line and the icon are displayed on the graphical user interface, or, during the update of the icon and the extension line, fixing on the graphical user interface the display position at which the extension line passes through the above termination position. Step S230 and step S232 in FIG. 4 above are the same as steps S830 and S832 in FIG. 8 and are not described again here.
During the above process, the input object can be kept in contact with the touch screen. The determination of the second operation position in the above embodiment is performed by determining the angle condition between the moving direction of the contact and the extending direction of the extension line; the same condition is used to determine whether to rotate the icon and the extension line. For example, the second preset condition may be set to an angle lying within a first angle range that is small but greater than zero, and the first preset condition may be set to an angle lying within a second, larger angle range different from that of the second preset condition. This also enables simultaneous rotation and translation of the extension line and the icon during uninterrupted contact monitoring. Alternatively, the rotation and translation operations of the extension line and the icon can be implemented under intermittent contact monitoring by performing the above steps S230 and S232, or steps S830 and S832.
After the adjustment, the first processor, or the second processor 240 in FIG. 2, cancels the display of the extension line on the ultrasound image. This ensures that after the adjustment of the icon has been carried out through the extension line, the extension line is no longer displayed and the ultrasound image interface is displayed in full. The first processor, or the second processor 240 in FIG. 2, then updates the ultrasound imaging parameter settings according to the updated position of the icon; for example, in the embodiment shown in FIG. 5, the processor can set the ultrasound imaging parameters based on the sampling gate at its updated position. Alternatively, the processor may update the remark information of the ultrasound image according to the icon at its updated position (e.g., probe identifier, annotation text, etc.), where the remark information includes the probe position, sampling gate position, annotation text, sampling line angle, sampling gate angle, and other information edited on or for the ultrasound image. The ultrasound imaging parameters after the updated settings are transmitted to the probe, and an ultrasound image is obtained based on the new ultrasound imaging parameters; the ultrasound image and the remark information can be stored and transmitted to a remote device. In the embodiment of FIG. 2, the second processor in the intelligent controller 270 generates, according to the above icon, a control signal containing the updated imaging parameters, and/or generates image data containing the ultrasound image and the remark information, the remark information being obtained through the above icon. The control signal is output by the second communication module 214 to the first communication module 215 and is used by the first processor to update the corresponding ultrasound imaging parameters for controlling the ultrasound scan of the target tissue; or the image data containing the ultrasound image and the remark information is output by the second communication module 214 to the first communication module 215 for display by the first processor, or output to an upper computer for storage.
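As one illustration of setting imaging parameters from the updated sampling-gate position, the sketch below maps the gate's screen coordinates and angle to a sample depth and steering angle. The linear pixel-to-depth mapping and the parameter names are assumptions for illustration, not values or an API from the patent:

```python
def gate_to_parameters(gate_px, gate_angle_deg, image_rect, depth_mm):
    """Map the sampling gate's updated on-screen position and angle to
    illustrative ultrasound imaging parameters.

    image_rect: (x0, y0, width, height) of the ultrasound display area.
    depth_mm:   imaging depth represented by the full image height.
    """
    x0, y0, w, h = image_rect
    gx, gy = gate_px
    depth = (gy - y0) / h * depth_mm   # vertical pixel position -> depth in mm
    lateral = (gx - x0) / w            # normalized lateral position in [0, 1]
    return {"sample_depth_mm": depth,
            "lateral_pos": lateral,
            "steer_angle_deg": gate_angle_deg}
```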
FIG. 4 and FIG. 8 each provide only one order of execution for the steps; various modifications can also be obtained by adjusting the individual steps of FIG. 4 and FIG. 8 on the basis of the foregoing. The above steps are not limited to being performed only in the order of FIG. 4 and FIG. 8: where the basic logic permits, the steps may be interchanged, the order of execution may be changed, and one or more of the steps may be repeated before the final step or steps are performed; all of these are variants in accordance with the embodiments provided herein.
The technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product carried on a non-transitory computer-readable storage medium (e.g., ROM, magnetic disk, optical disk, hard disk, server cloud space), comprising a number of instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, etc.) to execute the system structures and methods of the various embodiments of the present invention. In one embodiment, a computer-readable storage medium is provided, having stored thereon a computer program which, when executed by a processor, can be used at least to implement steps S214 to S232 in FIG. 4, or the steps from step S814 in FIG. 8, as mentioned in the foregoing.
By means of the extension line, the user can conveniently select and adjust the icon without occluding it, and the distance from the finger's contact point to the icon is increased, so that the position of the icon and similar information can be adjusted more finely.


Abstract

An ultrasound medical detection device, an imaging control method, an imaging system, and a controller. The device is configured to superimpose and display an icon on the ultrasound image (S218), display an extension line extending from the end of the icon in a predetermined direction (S220), monitor the motion of the contact of an input object with the touch display screen (S224), determine the operation position on the touch display screen to which the motion of the contact corresponds (S226), and update the display of the extension line and the icon so that the extension line passes through the operation position (S228). This improves the convenience of user operation and greatly enhances the user experience.

Description

Ultrasound medical detection device, imaging control method, imaging system, and controller

Technical Field

The present invention relates to an ultrasound imaging control method and an imaging system having a touch display screen.
Background Art

In an ultrasound imaging system, it is often necessary to adjust the angle of an object on the screen. In the prior art, the object whose angle needs to be adjusted is usually selected by hand or with a stylus and, while pressed, dragged on the screen to adjust its angle.

However, when there are many objects on the screen or the object to be selected is small, it is difficult to select the target object. When selecting with a finger or stylus on the screen, the finger blocks the target object, making it hard to hit the target object and easy to select another object by mistake. Moreover, when adjusting the angle of a small target, a small stroke of the finger or stylus can rotate the target through a large angle, so it is difficult for the user to finely control the size of the adjusted angle.
Summary of the Invention

In view of this, it is necessary to provide an ultrasound medical detection device, an imaging control method, an imaging system, and a controller that address the inconvenience of operation existing in the prior art.
In one of the embodiments, an ultrasound medical detection device is provided, comprising:
a probe;
a transmitting circuit and a receiving circuit, configured to excite the probe to transmit an ultrasound beam to a detection object and to receive echoes of the ultrasound beam to obtain ultrasound echo signals;
an image processing module, configured to obtain an ultrasound image from the ultrasound echo signals;
a touch display screen;
a first memory storing a computer program to be run on a processor; and
a first processor which, when executing the program, implements the following steps:
displaying the ultrasound image on the touch display screen;
superimposing and displaying an icon on the ultrasound image;
displaying an extension line extending from the end of the icon in a predetermined direction;
monitoring the motion of a contact of an input object with the touch display screen;
determining the operation position on the touch display screen to which the motion of the contact corresponds;
updating the display of the extension line and the icon so that the extension line passes through the operation position;
recognizing the termination position produced by releasing the contact; and
determining the display of the extension line and the icon so that the extension line passes through the termination position, wherein the relative positional relationship between the icon and the extension line is maintained unchanged throughout.
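The step sequence above can be sketched as a minimal event-driven loop; the event representation and return value are illustrative assumptions, not part of the claimed device:

```python
def control_loop(events, icon_end, direction=(1.0, 0.0)):
    """Sketch of the claimed steps: start an extension line from the
    icon end along a predetermined direction, follow the contact's
    operation positions while it moves, and on release return the
    termination position the line is fixed to pass through."""
    line_through = (icon_end[0] + direction[0], icon_end[1] + direction[1])
    last = None
    for kind, pos in events:
        if kind == "move":          # monitor contact / determine position
            line_through = pos      # update line through the position
            last = pos
        elif kind == "up":          # release -> termination position
            return last if last is not None else line_through
    return line_through
```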
In one of the embodiments, an ultrasound imaging control method is provided, comprising:
exciting a probe to transmit an ultrasound beam to a detection object;
receiving echoes of the ultrasound beam to obtain ultrasound echo signals;
obtaining an ultrasound image from the ultrasound echo signals;
displaying the ultrasound image on the touch display screen;
superimposing and displaying an icon on the ultrasound image;
displaying an extension line extending from the end of the icon in a predetermined direction;
monitoring the motion of a contact of an input object with the touch display screen;
determining the operation position on the touch display screen to which the motion of the contact corresponds;
updating the display of the extension line and the icon so that the extension line passes through the operation position;
recognizing the termination position produced by releasing the contact; and
determining the display of the extension line and the icon so that the extension line passes through the termination position, wherein the relative positional relationship between the icon and the extension line is maintained unchanged throughout.
In one of the embodiments, an ultrasound imaging system is provided, comprising an ultrasound medical detection device and an intelligent controller, wherein
the ultrasound medical detection device comprises:
a probe;
a transmitting circuit and a receiving circuit, configured to excite the probe to transmit an ultrasound beam to a detection object, receive echoes of the ultrasound beam, and obtain ultrasound echo signals;
an image processing module, configured to obtain an ultrasound image from the ultrasound echo signals; and
a first communication module electrically connected to the image processing module, configured to transmit the ultrasound image data to the intelligent controller and/or to receive a control signal input by the intelligent controller for setting the ultrasound imaging parameters required to obtain the ultrasound image;
the intelligent controller comprises:
a touch display screen;
a second communication module, configured to receive the ultrasound image data transmitted from the first communication module and/or to send a control signal to the first communication module;
a second memory storing a computer program to be run on a processor; and
a second processor which, when executing the program, implements the following steps:
displaying the ultrasound image on the touch display screen;
superimposing and displaying an icon on the ultrasound image;
displaying an extension line extending from the end of the icon in a predetermined direction;
monitoring the motion of a contact of an input object with the touch display screen;
determining the operation position on the touch display screen to which the motion of the contact corresponds;
updating the display of the extension line and the icon so that the extension line passes through the operation position;
recognizing the termination position produced by releasing the contact;
determining the display of the extension line and the icon so that the extension line passes through the termination position, wherein the relative positional relationship between the icon and the extension line is maintained unchanged throughout; and
generating, according to the icon, a control signal containing ultrasound imaging parameters and/or generating image data containing the ultrasound image and remark information; and
outputting the control signal to the first communication module through the second communication module, or outputting the image data.
In one of the embodiments, an intelligent controller is provided, comprising:
a touch display screen;
a second communication module, configured to receive ultrasound image data transmitted from an ultrasound medical detection device and/or to send a control signal to the ultrasound medical detection device;
a second memory storing a computer program to be run on a processor; and
a second processor which, when executing the program, implements the following steps:
displaying the ultrasound image on the touch display screen;
superimposing and displaying an icon on the ultrasound image;
displaying an extension line extending from the end of the icon in a predetermined direction;
monitoring the motion of a contact of an input object with the touch display screen;
determining the operation position on the touch display screen to which the motion of the contact corresponds;
updating the display of the extension line and the icon so that the extension line passes through the operation position;
recognizing the termination position produced by releasing the contact;
determining the display of the extension line and the icon so that the extension line passes through the termination position, wherein the relative positional relationship between the icon and the extension line is maintained unchanged throughout; and
generating, according to the icon, a control signal containing ultrasound imaging parameters and/or generating image data containing the ultrasound image and remark information; and
outputting the control signal to the first communication module through the second communication module, or outputting the image data.
Brief Description of the Drawings

FIG. 1 is a schematic diagram of the system architecture of an ultrasound medical detection device according to some embodiments;
FIG. 2 is a schematic diagram of the system architecture of an ultrasound medical detection device according to some embodiments;
FIG. 3 is a schematic diagram of the system architecture of an ultrasound detection system according to some embodiments;
FIG. 4 is a schematic flowchart of the ultrasound imaging control method of the present embodiment shown in FIG. 1 or FIG. 2;
FIG. 5 shows an embodiment of operation input, on a graphical user interface, directed at an icon 510 superimposed on an ultrasound image 501 according to some embodiments;
FIG. 6 shows an embodiment of a rotation operation, on a graphical user interface, directed at an icon 610 superimposed on an ultrasound image 601 according to some embodiments;
FIG. 7 shows an embodiment of a translation operation, on a graphical user interface, directed at an icon 710 superimposed on an ultrasound image 701 according to some embodiments;
FIG. 8 is a schematic diagram of another implementation flow of the ultrasound imaging control method of the present embodiment shown in FIG. 1 or FIG. 2.
Detailed Description of the Embodiments

The present invention is further described in detail below through specific embodiments in conjunction with the accompanying drawings. Similar elements in different embodiments use associated similar reference numerals. In the following embodiments, many details are described so that the present application can be better understood. However, those skilled in the art will readily recognize that some of these features may be omitted in different cases, or may be replaced by other elements, materials, or methods. In some cases, certain operations related to the present application are not shown or described in the specification, in order to avoid the core parts of the present application being overwhelmed by excessive description; for those skilled in the art, a detailed description of these related operations is not necessary, as they can fully understand them from the description in the specification and the general technical knowledge in the field.

In addition, the features, operations, or characteristics described in the specification may be combined in any suitable manner to form various embodiments. At the same time, the steps or actions in the described methods may also be reordered or adjusted in a manner obvious to those skilled in the art. Therefore, the various orders in the specification and drawings are only for the purpose of clearly describing a certain embodiment and are not meant to be required orders, unless it is otherwise stated that a certain order must be followed.

The ordinal numbers used herein for components, such as "first" and "second", are used only to distinguish the described objects and do not carry any sequential or technical meaning. "Connection" and "coupling" in the present application, unless otherwise specified, include both direct and indirect connection (coupling).
FIG. 1 is a schematic structural diagram of an ultrasound medical detection device 100 in one embodiment, with the specific structure as follows. The ultrasound medical detection device 100 shown in FIG. 1 mainly includes: a probe 101, a transmitting circuit 103, a transmit/receive selection switch 102, a receiving circuit 104, a beamforming module 105, a signal processing module 116, and an image processing module 126. In the ultrasound imaging process, the transmitting circuit 103 sends delay-focused transmit pulses of a certain amplitude and polarity to the probe 101 through the transmit/receive selection switch 102. The probe 101, excited by the transmit pulses, transmits ultrasound waves (which may be any of plane waves, focused waves, or divergent waves) to the detection object (for example, organs, tissues, blood vessels, etc. in a human or animal body, not shown in the figure), receives, after a certain delay, ultrasound echoes reflected from the target region and carrying information about the detection object, and reconverts these ultrasound echoes into electrical signals. The receiving circuit 104 receives the electrical signals generated by the probe 101, obtains the ultrasound echo signals, and sends them to the beamforming module 105. The beamforming module 105 performs processing such as focusing delay, weighting, and channel summation on the ultrasound echo signals, and then sends them to the signal processing module 116 for related signal processing. The ultrasound echo signals processed by the signal processing module 116 are sent to the image processing module 126. The image processing module 126 processes the signals differently according to the imaging mode required by the user, obtains ultrasound image data of different modes, and then forms ultrasound images of different modes through processing such as logarithmic compression, dynamic range adjustment, and digital scan conversion, such as B images, C images, D images, etc., or other types of two-dimensional or three-dimensional ultrasound images. The above transmitting and receiving circuits excite the probe to transmit ultrasound beams to the detection object according to the settings of the ultrasound imaging parameters and receive the echoes of the ultrasound beams to obtain ultrasound echo signals, thereby obtaining the desired ultrasound image data for display, showing the tissue structure inside the detection object. The ultrasound imaging parameters mentioned herein refer to all parameters that the user can autonomously select during the imaging of ultrasound tissue images, for example TGC (Time Gain Compensation), acoustic frequency, pulse repetition frequency (PRF), ultrasound wave type, dynamic range, and so on.
In some embodiments of the present invention, the signal processing module 116 and the image processing module 126 in FIG. 1 may be integrated on one main board 106, or one or more (herein "or more" includes the stated number) of the modules may be integrated and implemented on one processor/controller chip.
The obtained ultrasound image may be output to the display controller 170 for display. The display controller 170 is connected to the image processing module through an input/output interface for data transmission. The display controller 170 may include a first touch display screen 130, a processor 140, and a first memory 160. The processor 140 calls the computer program instructions recorded in the first memory 160 to display the ultrasound image on the first touch display screen 130 and/or to form a graphical user interface on the touch display screen. In one embodiment, a graphical user interface (GUI) is displayed on the first touch display 130, presenting graphical controls such as those for adjusting the ultrasound imaging parameters involved in the imaging process mentioned above, and various function keys. Based on the graphical user interface (GUI), control instructions for the corresponding operations on the graphical controls, produced by the operations of the input object on the touch display, can be obtained; these control instructions concerning ultrasound imaging parameters and other information can be transmitted to the ultrasound medical detection device in a wired or wireless manner and used to control the operation of the probe, the transmitting circuit, the receiving circuit, etc., so as to obtain the desired ultrasound image. As for the display of the ultrasound image, for example, ultrasound images may be displayed on two display screens respectively, or displayed in split screen on the same display screen. The touch display screen can display both the ultrasound image and a graphical user interface (GUI) for the input of user operation instructions. Of course, a display area for displaying the ultrasound image can also be set up based on the graphical user interface (GUI), and the ultrasound image can then be edited through the user's gesture input; editing includes operations such as resizing the image, adjusting sharpness, and adding annotations.
Based on the graphical user interface displayed on the touch display screen, the processor 140 can call the gesture detection module 113 stored in the memory 160 to detect the control instructions obtained when the user performs contact operations on the graphical user interface through an input object. Various embodiments include a touch display screen with a graphical user interface (GUI), one or more processors, a memory, and one or more modules, programs, or instruction sets stored in the memory for performing various functions; together, these implement GUI-based manipulation input detection and obtain the related control instructions. In various embodiments, these functions may include parameter adjustment and information input for the detection object (e.g., the patient's tissue) to obtain medical detection data, image browsing, pathology database construction, retrieval, and maintenance, patient file information construction, display, and management, patient directory information construction, display, and management, and so on. The modules, programs, or instructions for executing these functions may be included in a computer program product configured for execution by one or more processors. In some embodiments of the present invention, the user interacts with the graphical user interface mainly through gesture input on the touch display screen. Gesture input here may include any type of user gesture input that the device can detect through direct contact with, or proximity to, the touch display screen. For example, a gesture input may be an action of the user selecting one position, multiple positions, and/or multiple consecutive positions on the touch display screen using a finger of the right or left hand (e.g., index finger, thumb, etc.) or an input object detectable by the touch display screen (e.g., a stylus or a dedicated touch-screen pen), and may include operation actions such as contact, release of a touch, a touch tap, long contact, and rotational spreading. Here, a long contact corresponds to a gesture input of moving a finger, thumb, or stylus in a predetermined or variable direction while it remains in continuous contact with the touch display screen, for example gesture operation actions such as touch-and-drag, flick, swipe, slide, and sweep. It can thus be seen that gesture input is realized through the contact of the input object with the touch display screen; contact with the touch display screen may include direct contact of a finger, thumb, or stylus with the touch display screen, or non-direct-contact proximity to the touch display screen, where a gesture input approaching the touch display screen without direct contact refers to a gesture operation action at a spatial position close to the touch display screen. The above graphical user interface refers to the overall design of the software's human-machine interaction, operation logic, and interface aesthetics, and may include one or more soft keyboards and multiple graphical control objects. A soft keyboard may include a certain number of icons (or soft keys), allowing the user to select one or more icons in the soft keyboard and thereby select one or more corresponding symbols for input. The gesture detection module 113 can detect the gesture input through which the input object interacts with the touch display screen. The gesture detection module 113 includes various program modules for performing the various operations related to gesture input detection, such as determining whether contact has occurred, determining whether the gesture input continues, determining whether it corresponds to a predetermined gesture, determining the operation position corresponding to the gesture input, determining whether the operation position corresponding to the gesture input has moved to the edge of the corresponding display area, determining whether the gesture input has been interrupted (e.g., whether the contact has stopped), and determining the movement of the gesture input and tracking its movement trajectory. Determining the motion of the gesture input may include determining the rate of motion (magnitude), the velocity (magnitude and direction), and/or the acceleration (change in magnitude and/or direction) of the operation position corresponding to the gesture input, as well as its motion trajectory. These operations may be applied to a single operation position (e.g., a gesture input made with one finger) or to multiple simultaneous operation positions (e.g., "multi-touch", i.e., gesture input made with multiple fingers). In some embodiments, the gesture detection module 113 is used to detect the motion of one or more input objects on the surface of the touch display screen or at spatial positions close to it. The gesture detection module 113 is stored in the memory and, when called by one or more processors, implements the monitoring of the above gesture inputs and obtains the user's operation input instructions.
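As a small sketch of one decision the gesture detection module (113, 213) is described as making, the function below distinguishes a tap from a long contact (drag) using timestamped samples; the thresholds and the sample format are illustrative assumptions:

```python
import math

def classify_gesture(samples, tap_max_px=5.0, tap_max_s=0.3):
    """Classify a contact as 'tap' or 'long_contact' from timestamped
    samples [(t, x, y), ...]: a tap stays within a small radius and
    ends quickly; anything else is treated as a long (moving) contact."""
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    moved = math.hypot(x1 - x0, y1 - y0)
    if moved <= tap_max_px and (t1 - t0) <= tap_max_s:
        return "tap"
    return "long_contact"
```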
Of course, in the embodiment shown in FIG. 1, the processor 140 and the first memory 160 may be arranged on the main board 106, may be arranged independently of the main board 106, or may be integrated with the touch display screen to form an independent display controller 170, realizing both the display of the ultrasound image and the acquisition of user-input control instructions based on the ultrasound image. In one embodiment, the signal processing module 116 and/or the image processing module 126 in FIG. 1, together with the processor 140, are arranged on one or more processors that perform the data processing of the ultrasound image as well as the above-mentioned monitoring of gesture input and generation of the graphical user interface; these are collectively referred to as the first processor. As shown in FIG. 3, a common ultrasound medical detection device is provided, which includes a display 1, a control key operation area 3, a display support arm 2, a main unit 4, and a foot control 5. The display 1 may be the same as the above-mentioned first touch display 130, and the main unit 4 includes the above-mentioned main board 106, or further includes the processor 140 and the first memory 160. Therefore, all parts involved in data processing are collectively referred to as the first processor.
FIG. 2 provides a schematic structural diagram of another embodiment. As shown in FIG. 2, the ultrasound medical detection device 200 includes: a probe 201, a transmitting circuit 203, a transmit/receive selection switch 202, a receiving circuit 204, a beamforming module 205, a signal processing module 216, and an image processing module 226. In this embodiment, the functions and implementations of the probe 201, transmitting circuit 203, transmit/receive selection switch 202, receiving circuit 204, beamforming module 205, signal processing module 216, and image processing module 226 are the same as those of the probe 101, transmitting circuit 103, transmit/receive selection switch 102, receiving circuit 104, beamforming module 105, signal processing module 116, and image processing module 126 in the embodiment shown in FIG. 1; reference may be made to the foregoing description, which is not repeated here. In some embodiments of the present invention, the signal processing module 216 and the image processing module 226 in FIG. 2 may be integrated on one main board 206, or one or more of the modules may be integrated and implemented on one processor/controller chip. The difference from the embodiment shown in FIG. 1 is that the ultrasound medical detection device 200 further includes: a first communication module 215 electrically connected to the image processing module 226, configured to transmit the ultrasound image data obtained by the image processing module 226 to the intelligent controller 270, and/or to receive control signals input by the intelligent controller 270 for setting the ultrasound imaging parameters used in the ultrasound imaging process. The operation of setting the ultrasound imaging parameters includes operations such as updating the ultrasound imaging parameters, adjusting the ultrasound imaging parameters, or initializing their settings. The intelligent controller 270 in this embodiment includes: a second touch display screen 230, a second processor 240, a second memory 260, and a second communication module 214. The second memory 260 stores the computer program run on the second processor 240, for example the gesture detection module 213; in this embodiment, the gesture detection module 213 has the same function as the gesture detection module 113 in the embodiment shown in FIG. 1 and is not described again here. The second touch display screen 230 implements the same functions as the first touch display screen 130, although the specific product parameters may differ; the prefixes "first" and "second" are used only to distinguish entities in different application scenarios when describing the embodiments, and in the method steps below, or when a single application scenario is described, they can equally be understood as a touch display screen in the conventional sense, so elsewhere herein they may also simply be called the touch display screen. The second communication module 214 receives the ultrasound image data transmitted from the first communication module 215 and/or sends control signals to the first communication module 215, for example control signals containing ultrasound imaging parameter setting information. The intelligent controller 270 includes the display controller mentioned in FIG. 1, but may also include various intelligent terminal devices, for example an IPAD, a mobile phone, or other computer devices with a touch display screen. For example, the intelligent controller 270 in this embodiment may also be the IPAD terminal controller 6 in FIG. 3. The communication between the first communication module 215 and the second communication module 214 may use wireless data transmission protocols such as the Wi-Fi protocol, the Bluetooth transmission protocol, or mobile communication network protocols. The ultrasound medical detection device 200 and the intelligent controller 270 constitute an ultrasound imaging system.
The embodiment shown in FIG. 3 provides an ultrasound imaging system, or an ultrasound medical detection device 100, that integrates two display screens: the display 1 and the IPAD terminal controller 6. The IPAD terminal controller 6 can be used to generate a graphical user interface for obtaining the user's instructions for adjusting the ultrasound imaging parameters, or for processing such as editing the ultrasound image; it can also be used to display the ultrasound image. The IPAD terminal controller 6 likewise includes the second touch display screen 230. Of course, for the IPAD terminal controller 6, which is equivalent to the intelligent controller 270, the same functions can also be realized with an intelligent mobile terminal such as a smartphone.
Based on the schematic structural diagrams of the ultrasound medical detection device (100, 200) provided in FIG. 1, FIG. 2, or FIG. 3 above, the manner of setting the ultrasound imaging parameters will be described in detail below in conjunction with the hardware environment provided in FIG. 1, FIG. 2, or FIG. 3.
When parameters are adjusted by sliding on the touch screen (i.e., the touch display screen), the adjustment interfaces of these parameters are displayed on the touch screen at the same time, and the displayed adjustment interface changes as the finger or stylus slides. When there are many objects on the screen or the object to be selected is small, it is difficult to select the target object; when selecting with a finger or stylus on the screen, the finger blocks the target object, making it hard to hit the target object and easy to select another object by mistake. Moreover, when adjusting the angle of a small target, a small stroke of the finger or stylus can rotate the target through a large angle, so it is difficult for the user to finely control the size of the adjusted angle. The device disclosed in this embodiment is therefore an ultrasound imaging system, or ultrasound medical detection device, in which the imaging parameters of the ultrasound medical detection device can be adjusted, or the ultrasound image edited, through touch control. Through the user's interaction with the graphical user interface on the touch display screen, the image data obtained by the ultrasound imaging device and the ultrasound imaging parameters can be adjusted or edited more intuitively, which increases the convenience of operating the ultrasound device, improves the user experience, and realizes precise positioning adjustment of the manipulated objects on the graphical interface.
This embodiment therefore provides the control manner of the ultrasound imaging parameters shown in FIG. 4 below. FIG. 4 provides a schematic flowchart of the ultrasound imaging control method of the present embodiment shown in FIG. 1, FIG. 2, or FIG. 3.
In step S210 of FIG. 4, the transmitting circuit and the receiving circuit (103 and 104, 203 and 204) excite the probe (201, 101) to transmit an ultrasound beam to the detection object according to the set ultrasound imaging parameters, and in step S212 the probe (201, 101) receives the echoes of the above ultrasound beam to obtain ultrasound echo signals. In one embodiment, the ultrasound imaging parameters in this embodiment include the position information of the sampling gate.
In step S214 of FIG. 4, the first processor (including the image processing module 126 in FIG. 1) or the image processing module (226) obtains an ultrasound image from the ultrasound echo signals; for example, in the embodiment of FIG. 1 the image processing module (126) obtains the ultrasound image from the ultrasound echo signals, and in the embodiment of FIG. 2 the image processing module (226) does so. Meanwhile, the ultrasound medical detection device in FIG. 1 also provides a first memory for storing the computer program run on the processor, for example the above-mentioned gesture detection module 113, while in the intelligent controller 270 in FIG. 2 a second memory is provided for storing the computer program run on the processor, for example the above-mentioned gesture detection module 213. The ultrasound image herein may be an ultrasound image of the different modes described above, such as a B image, C image, D image, etc., or other types of two-dimensional or three-dimensional ultrasound images. Likewise, the ultrasound image mentioned herein may be a static frame image or a dynamic video image.
在图4的步骤S216中,第一处理器或者图2中的第二处理器240将获得的超声图像输出至触摸显示屏上进行显示。例如,在触摸显示屏上形成的图形用户界面层上设置用于超声图像显示的图像显示区域。在其中一个实施例中,图形用户界面至少包含两层界面层,在触摸显示屏的第一界面层上显示超声图像,在第一界面层的上方叠加透明设置的第二界面层,并将图标设置在第二界面层上。这样的设置方式可以让除图像数据之外的其余数据悬浮于超声图像之上显示,不遮挡超声图像本身的显示,并能够令用户观察到因为基于超声成像参数的调节而带来的超声图像的变化,或者将编辑的批注等信息一同与超声图像保存和传输。
在图4的步骤S218中,第一处理器或者图2中的第二处理器240在上述显示的超声图像之上叠加显示图标。此图标包括探头标记符、注释标记、采样线和采样门等中的其中之一,例如,如图5中图形用户界面500上显示的箭头510,图6中图形用户界面600上显示的采样门610。在其中一个实施例中,图标可以显示在第二界面层上。
所述图标的形状不限,在其中一个实施例中,图标为采样门,如图6中的采样门610。
在图4的步骤S220中,第一处理器或者图2中的第二处理器240在第一触摸显示屏130或第二触摸显示屏230上显示从上述图标的端部沿预定方向延伸的延长线。
在触摸显示屏(130,230)显示的超声图像之上叠加显示延长线,例如,在其中一个实施例中,延长线可以显示在第二界面层上。
在显示延长线的同时对应设置一可操控的敏感区,即在触摸显示屏(130,230)上设置有与延长线对应的敏感区,延长线与触摸显示屏上一敏感区一一对应关联。延长线对应在触摸显示屏(130,230)上有可供手势识别的敏感区。敏感区是指在图形用户界面上与被操控图标或指示符(例如延长线、图标等)关联对应的位置,通常在图形用户界面上的位置通过建立界面直角坐标系来定位,例如,假设用(X,Y)来表示界面上的某个像素点位置在该直角坐标系中的坐标,那么,(X,Y)的集合则构成了相应的显示区域或者敏感区。与延长线对应的敏感区包括在延长线显示所在的位置上的预设邻域范围,当输入对象与触摸显示屏的接触位于该邻域范围时,默认选中了延长线,并激活了对延长线进行调节的操作。例如,图5中,图标510的端部沿预定方向延伸的延长线511,其对应于界面上的敏感区为512。图6中,图标610的端部沿预定方向延伸的延长线611,其对应于界面上的敏感区为612。当然,除了将延长线显示所在的位置上的预设邻域范围设置为敏感区之外,还可以在图形用户界面上除延长线显示所在的位置之外设置敏感区,这样的话可以方便用户在超声图像之外对延长线进行操控,而不会遮挡超声图像的显示,达到更加精确调节的目的。在延长线显示所在的位置之外的敏感区监测输入对象与触摸显示屏的接触,可以关联到对延长线的操控执行。
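为便于理解上述敏感区的命中判断,下面给出一段极简的Python示意(其中的函数名与邻域半径参数radius均为说明性假设,并非本实施例的限定实现):当接触点到延长线线段的最短距离不超过预设邻域半径时,视为接触落在敏感区内、选中了延长线。

```python
import math

def dist_point_to_segment(p, a, b):
    # 点p到线段ab的最短距离(像素)
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:          # 线段退化为一点
        return math.hypot(px - ax, py - ay)
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))        # 将垂足限制在线段范围内
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def in_sensitive_zone(touch, line_start, line_end, radius=10.0):
    # 敏感区 = 延长线显示位置的预设邻域范围(半径radius像素)
    return dist_point_to_segment(touch, line_start, line_end) <= radius
```

例如,接触点(5,1)落在线段(0,0)-(10,0)的10像素邻域内,即命中敏感区;而(5,30)则落在敏感区之外。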
本文提到的“位置”包含方位信息、坐标信息、和/或角度信息等等,例如,关于图标或者延长线在图形用户界面上的显示位置,可以用图标或者延长线所在的像素点的坐标信息来表征。
上述延长线可以是虚线,但不限于虚线,也可以是其他任何在需要进行对象调节时采用的延长控制对象。上述延长线为直线。
上述延长线始于图标的端部,该端部可以位于图标上的任意一点位置,例如,可以是图5中箭头的尾端,也可以是图6中采样门(610)的中间点等等。
上述延长线便于用户在对图标进行位置调整时能够精确定位,为了在调整的过程中尽量不遮挡超声图像,所以,在其中一个实施例中,上述延长线可以从图标的端部开始延伸至超声图像的显示区域之外。例如,图5中延长线511可延长至超声图像的显示区域501之外,图6中延长线611可延长至超声图像的显示区域601之外。
从上述图标的端部沿预定方向延伸获得上述延长线,这里的预定方向可以是以上述图标的端部为起点的任意一个方向,当然也可以是某个特殊的预置方向,或者基于用户的初始输入来确定预定方向,例如在其中一个实施例中,第一处理器或者图2中的第二处理器240在第一触摸显示屏130或第二触摸显示屏230上实现显示从上述图标的端部沿预定方向延伸获得延长线的过程采用以下方式:首先,检测输入对象(502,602)与触摸显示屏(130,230)的接触,获得初始的操作位置(513,613);生成从图标(510,610)的端部延伸至初始的操作位置(513,613)的延长线(511,611);和,显示延长线(511,611)。
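上述“从图标端部经初始操作位置方向延伸、并可延伸至超声图像显示区域之外”的延长线生成过程,可用如下Python草图理解(函数名、步长step与长度上限max_len均为说明性假设,并非本实施例的限定实现):

```python
import math

def make_extension_line(icon_end, touch, image_rect, step=8.0, max_len=4000.0):
    # image_rect = (x0, y0, x1, y1):超声图像显示区域
    # 延长线始于图标端部icon_end,沿icon_end->touch方向延伸,
    # 直至端点落在图像显示区域之外(或达到长度上限max_len)
    ex, ey = icon_end
    dx, dy = touch[0] - ex, touch[1] - ey
    d = math.hypot(dx, dy)
    if d == 0:                       # 接触点与图标端部重合时直接返回
        return icon_end, touch
    ux, uy = dx / d, dy / d
    x0, y0, x1, y1 = image_rect
    t = d
    while t < max_len:
        px, py = ex + ux * t, ey + uy * t
        if not (x0 <= px <= x1 and y0 <= py <= y1):
            return icon_end, (px, py)
        t += step
    return icon_end, (ex + ux * max_len, ey + uy * max_len)
```

例如,图标端部位于图像区域中央、初始接触点在其右侧时,生成的延长线终点会落在图像显示区域的右边界之外。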
当然为了提供更加友好的交互体验,可以同时在图形用户界面上显示提示符,所述提示符用于指示旋转或平移的方向。例如,图5中的提示符503,用于提示向左旋转;图6中的提示符603用于提示向右旋转,提示符604用于提示向左旋转。
在此步骤之后还可以包括:第一处理器或者图2中的第二处理器240关联记录敏感区与延长线的对应关系,例如还可以将记录的敏感区与延长线的对应关系存储在存储器(160,260)中。记录敏感区与延长线的对应关系的方式,可以是记录敏感区在触摸显示屏上的像素区域范围,对应于延长线的显示位置,便于后续根据用户的输入进行快速查找和手势识别,随着延长线位置的改变,其对应的敏感区也将随之改变,例如图6中,输入对象602从位置613移动到位置614的过程中延长线611在图中对应的敏感区也发生了变化。在其中一个实施例中,延长线的敏感区以延长线的显示确定之后的最终位置来设置,通过判断输入对象与触摸显示屏的初始接触是否在敏感区内来确定是否对延长线进行了输入操作,避免误操作。
在图4的步骤S224中,第一处理器或者图2中的第二处理器240调用手势检测模块(113,213)监测输入对象与触摸显示屏(130,230)的接触的运动。在图4中的步骤S226中,第一处理器或者图2中的第二处理器240调用手势检测模块(113,213)确定上述接触的运动对应到触摸显示屏上的操作位置。本文中提到的界面上的操作位置是指用户利用人机交互设备对界面对象(例如图标、延长线)进行操作输入时对应于显示界面上的位置,该位置可以用直角坐标系的坐标位置来表示,也可以用极坐标系下的角度信息来表示。确定的操作位置可以是一个像素点位置,也可以是多个像素点位置构成的区域块。
为了提高交互操作的用户体验和避免误操作,在上述监测输入对象与触摸显示屏(130,230)的接触的运动之前还可以包括以下步骤:
首先,如图6所示,检测输入对象(602)与触摸显示屏(130,230)的接触,获得初始的操作位置(613);其次,判断初始的操作位置(613)是否位于敏感区612内,当初始的操作位置(613)位于敏感区612内时,即检测到输入对象602在上述敏感区612内与触摸显示屏(130,230)发生了接触,表明用户对延长线进行了操作。再次,跟踪监测输入对象与触摸显示屏(130,230)的接触的运动,确定上述接触的运动对应到触摸显示屏上的操作位置,该操作位置从初始的操作位置613变化到操作位置614。在步骤S226中确定的操作位置可以是一个操作位置(图6中的位置614),也可以是从初始的操作位置613变化到操作位置614之间的多个连续变化的操作位置。
更进一步地,在步骤S224中,第一处理器或者图2中的第二处理器240调用手势检测模块(113,或213)来监测输入对象与触摸显示屏(130,或230)的接触的运动,可以是输入对象与触摸显示屏(130,或230)的持续接触,例如前文中提到的长接触。例如,在其中一个实施例中,上述步骤S224中监测输入对象在敏感区内与触摸显示屏(130,或230)的持续接触(或者也可以理解为是不断运动的接触),持续接触或者接触运动时第一处理器或者图2中的第二处理器240通过手势检测模块(113,或213)可以识别出在触摸显示屏(130,或230)上一系列连续变化的操作位置。于是,在步骤S226中,第一处理器或者图2中的第二处理器240确定上述接触在触摸显示屏(130,或230)上的多个操作位置,可以获得多个连续变化的操作位置。当然,这多个连续变化的操作位置可以沿上述接触的运动方向排列变化。
接着,在图4的步骤S228中,第一处理器或者图2中的第二处理器240更新延长线和图标的显示使延长线经过上述确定的操作位置。本文中提到的更新是指删除延长线和图标在原先位置的显示,而将其位置变更为相应的操作位置,使得延长线和图标跟随监测到的操作位置的变化而变化。例如图6中,输入对象602与触摸显示屏的接触从操作位置613变化到操作位置614时,延长线和图标的显示也随之改变,保持更新状态,其中更新的过程中,图标610和延长线611之间的相对位置关系始终维持不变,延长线也始终经过步骤S226确定的操作位置,从而使得图标610和延长线611的显示总是一起发生更新调整,实现图标随着延长线的调节而变化的效果。本文提到的延长线的“经过”是指延长线经过终止位置或者操作位置内至少一个像素位置。
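步骤S228中“图标与延长线整体联动、相对位置关系维持不变”的更新方式,可用如下Python草图理解(函数与参数命名均为说明性假设):对图标各顶点与延长线两端施加同一位移,使延长线的锚点移动到确定的操作位置。

```python
def rigid_update(icon_pts, line, target):
    # icon_pts: 图标各顶点列表;line = (起点, 终点)
    # 将延长线终点平移至操作位置target,
    # 图标与延长线施加同一位移,二者相对位置关系保持不变
    ax, ay = line[1]
    dx, dy = target[0] - ax, target[1] - ay
    move = lambda p: (p[0] + dx, p[1] + dy)
    return [move(p) for p in icon_pts], (move(line[0]), move(line[1]))
```

由于图标与延长线应用的是同一位移,更新前后延长线起点相对图标各顶点的偏移量不变,即实现了“图标随延长线的调节而变化”的联动效果。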
例如,在其中一个实施例中,上述步骤S224至步骤S228包括:
第一处理器或者图2中的第二处理器240调用手势检测模块监测输入对象在敏感区内与触摸显示屏(130,或230)的接触的运动(也可以是持续接触),确定上述接触的运动在触摸显示屏上对应的多个连续变化的操作位置;和,更新延长线和图标的显示使延长线在触摸显示屏上的显示位置依次变化地经过多个连续变化的操作位置,使得延长线和图标跟随监测到的操作位置的变化而变化。
此外,在其中一些实施例中,在上述步骤S224至步骤S228的过程之前还可以包括:
上述第一处理器或者图2中的第二处理器240调用手势检测模块检测输入对象与触摸显示屏(130,或230)的接触是否位于延长线所在的敏感区内,当输入对象与触摸显示屏(130,或230)的接触在延长线所在的敏感区内时,执行上述步骤S224至步骤S228的过程。反之,当输入对象与触摸显示屏(130,或230)的接触不在延长线所在的敏感区内时,则不执行上述步骤S224至步骤S228的过程。有关步骤S224至步骤S228的过程描述可参见前文相关说明。本实施例可以通过跟踪监测输入对象与触摸显示屏(130,或230)的接触对延长线进行的输入操作,来确保控制信号输入的准确性,保证对图标调节的定位准确性。例如,如图6所示,首先,上述第一处理器或者图2中的第二处理器240调用手势检测模块检测输入对象602与触摸显示屏(130,或230)的接触(即操作位置613处的接触)是否位于延长线所在的敏感区612内,若是,则执行上述步骤S224至步骤S228的过程,开始跟踪监测输入对象602与触摸显示屏(130,或230)的接触沿指示符603指示的方向或者指示符604指示的方向的运动。反之,则不执行上述步骤S224至步骤S228的过程。
上述步骤S224至步骤S228的过程使得延长线可以随输入对象与触摸显示屏的持续接触来变化显示位置,为了提高延长线随持续接触的可视化效果,在将延长线和图标的显示更新至上述操作位置的过程中,可以按照可视化的显示移动速度来计算延长线在图形用户界面上两个操作位置之间的变化速度,并基于该变化速度来调整延长线和图标在两个操作位置之间的显示移动,从而呈现连续的显示移动效果。
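上述按可视化显示移动速度在两个操作位置之间呈现连续移动效果的计算,可用如下Python草图理解(speed、fps等参数均为说明性假设):

```python
import math

def interpolate_positions(p0, p1, speed=600.0, fps=60):
    # 按移动速度speed(像素/秒)与刷新率fps,
    # 生成从p0到p1连续变化的中间显示位置序列(含终点p1)
    dist = math.hypot(p1[0] - p0[0], p1[1] - p0[1])
    step = speed / fps               # 每帧移动的像素数
    n = max(1, math.ceil(dist / step))
    return [(p0[0] + (p1[0] - p0[0]) * i / n,
             p0[1] + (p1[1] - p0[1]) * i / n) for i in range(1, n + 1)]
```

将延长线和图标依次更新到该序列中的各个位置,即可在两个操作位置之间呈现匀速的连续移动效果,而不是瞬间跳变。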
更进一步地,在其中一个实施例中,上述步骤S228中更新延长线和图标的显示使延长线经过上述操作位置的过程可以采用以下方式之一:
1、如图6所示,以图标610上的一个位置为中心旋转图标610和延长线611使延长线611经过操作位置(613,614),也就是说,重新绘制以图标610上的一个位置为起点、且经过操作位置的延长线611,并同时按照图标610和延长线611原始的相对位置关系绘制图标。以图标610上的一个位置为中心旋转延长线611,而旋转的方向可以按照识别上述接触的运动方向来获得。
2、如图7所示,平移延长线711和图标710使延长线711经过操作位置(713,714,715)。也就是说,使图标710随同延长线711一起平移并使延长线711依次更新变化的经过操作位置(713,714,715)。
当然,在上述过程中需要清除延长线的在先显示,以保证显示的是更新后的延长线。
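上述第1种方式中,以图标上的一个位置为中心旋转图标和延长线、使延长线经过操作位置,其旋转角的计算可用如下Python草图理解(函数命名为说明性假设,坐标系为界面直角坐标系):

```python
import math

def rotation_to_target(center, line_end, target):
    # 旋转角 = 操作位置相对旋转中心的方位角 - 延长线端点的当前方位角,
    # 以该角度旋转图标与延长线,即可使延长线经过操作位置
    a0 = math.atan2(line_end[1] - center[1], line_end[0] - center[0])
    a1 = math.atan2(target[1] - center[1], target[0] - center[0])
    return a1 - a0

def rotate_point(center, p, theta):
    # 将点p绕center旋转theta弧度,图标各顶点与延长线端点均按此变换
    cx, cy = center
    dx, dy = p[0] - cx, p[1] - cy
    c, s = math.cos(theta), math.sin(theta)
    return (cx + dx * c - dy * s, cy + dx * s + dy * c)
```

对图标各顶点与延长线端点应用同一旋转变换,即可在旋转过程中维持二者的相对位置关系不变。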
为了保证在同一个图形用户界面上能同时实现上述第1种和第2种两种操作功能,而不需要附加额外的控制按键操作,则可以采用如图8所示的流程图来进行。在图8中步骤S810、步骤S812、步骤S814、步骤S816、步骤S818、步骤S820、步骤S824和步骤S826分别与上述步骤S210、步骤S212、步骤S214、步骤S216、步骤S218、步骤S220、步骤S224和步骤S226的过程相同,在此不再赘述。在图8所示的实施例中,采用以下步骤分别实现上述步骤S228中更新延长线和图标的显示使延长线经过上述操作位置的过程。
在步骤S827中,第一处理器或者图2中的第二处理器240识别上述输入对象与触摸显示屏的接触的运动方向,接着执行步骤S828,根据运动方向来确定运动方向与延长线之间的夹角条件,从而更新延长线和图标的显示使延长线经过上述操作位置。例如,如图8和图6所示,执行步骤S828,确定上述运动方向(提示符604所指示的方向)与延长线611之间的夹角满足第一预设条件,则执行步骤S8291,以图标610上的一个位置为中心旋转图标610和延长线611使延长线611经过操作位置614,即同时需要清除以图标610上的同一个位置为起点、且经过操作位置613的延长线611。
又例如,如图8和图7所示,在图7所示的图形用户界面700上,显示超声图像701,图标710的延长线711受控于输入对象702的输入操作。监测输入对象702与触摸显示屏的接触,首先获得一个初始的操作位置713,当该接触在敏感区712内时,即初始的操作位置713在敏感区712内(或者初始的操作位置713与敏感区712重合),表明用户在对延长线进行操作,此时再跟踪监测到输入对象702与触摸显示屏的接触的运动(也可为持续接触),确定该接触的运动对应到触摸显示屏上的操作位置(714,或者713到714之间的多个变化的操作位置),识别上述接触的运动方向,确定上述运动方向(提示符703所指示的方向)与延长线711之间的夹角满足第二预设条件,则执行步骤S8292,平移延长线711和图标710使延长线711经过操作位置(714,或者713到714之间的多个变化的操作位置),即同时需要清除延长线的原始显示(即通过713位置的虚线表示的延长线711)。图7中通过延长线711的不同线型、以及输入对象702的不同线型轮廓、和图标710的不同轮廓线,来表征延长线711、输入对象702和图标710在不同时刻时对应于上述接触的运动的变化情况。
在其中一个实施例中,上述第二预设条件为夹角等于零,上述第一预设条件为夹角大于零。当然,本实施例也不限于此,还可以是第二预设条件为夹角位于一个较小的角度范围,而第一预设条件为夹角位于一个较大的角度范围。确定上述接触的运动方向与延长线之间的夹角满足夹角大于零时,也可以理解为是,确定上述输入对象与触摸显示屏的接触的运动偏离了延长线当前所在的延伸方向。而,确定上述接触的运动方向与延长线之间的夹角满足夹角等于零时,也可以理解为是,确定上述输入对象与触摸显示屏的接触的运动是沿延长线当前所在的延伸方向进行。
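上述依据夹角条件区分平移与旋转的判断逻辑,可用如下Python草图理解(角度阈值translate_max_deg对应文中“较小的第二角度范围”,具体数值为说明性假设):

```python
import math

def motion_angle_to_line(motion_vec, line_vec):
    # 接触的运动方向与延长线延伸方向之间的夹角(度),取0~90度
    cross = abs(motion_vec[0] * line_vec[1] - motion_vec[1] * line_vec[0])
    dot = abs(motion_vec[0] * line_vec[0] + motion_vec[1] * line_vec[1])
    return math.degrees(math.atan2(cross, dot))

def choose_update(motion_vec, line_vec, translate_max_deg=5.0):
    # 夹角落在较小的第二角度范围 -> 满足第二预设条件,平移;
    # 否则 -> 视为满足第一预设条件,旋转
    ang = motion_angle_to_line(motion_vec, line_vec)
    return "translate" if ang <= translate_max_deg else "rotate"
```

沿延长线方向滑动(夹角接近零)被识别为平移,偏离延长线方向滑动(夹角较大)被识别为旋转,两种操作无需附加控制按键即可区分。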
对于识别上述接触的运动方向,利用手势检测模块跟踪监测上述接触在触摸屏上的运动时可以计算运动速度、以及相应的方向,例如,利用上述接触的运动对应到触摸显示屏上的两个操作位置之间的连线,来确定上述接触的运动方向。
为了能通过操作延长线将图标移动到任意一个位置,在执行上述步骤S826(或图4中的步骤S226)至步骤S8292的过程中采用以下方式实现上述监测输入对象与所述触摸显示屏的接触的运动,确定上述接触的运动对应到触摸显示屏上的操作位置,以及平移延长线和图标使延长线经过上述操作位置,参见图7所示。
1、监测上述接触的第一运动部分,上述第一运动部分沿上述延长线当前所在的延伸方向进行。
例如,图7中,上述第一运动部分即从操作位置713运动到操作位置714的过程,713→714。上述第一运动部分(713→714)的运动沿提示符703的指示方向进行。当然,参见前文的实施例,为了避免误操作,本实施例中还包括:检测输入对象(702)与触摸显示屏(130,230)的接触,获得初始的操作位置(713);其次,判断初始的操作位置(713)是否位于敏感区712内,当初始的操作位置(713)位于敏感区712内时,即检测到输入对象702在上述敏感区712内与触摸显示屏(130,230)发生了接触,表明用户对延长线进行了输入操作。再次,跟踪监测输入对象与触摸显示屏(130,230)的接触的第一运动部分。
2、确定上述接触的第一运动部分关联在上述触摸显示屏上的第一操作位置。本实施例中,第一操作位置可以为位于上述延长线当前所在的延伸方向上的任意一个操作位置。例如,图7中的714,或者713到714之间的多个变化的操作位置。
3、将上述图标710连同上述延长线711平移使得上述延长线711经过第一操作位置,例如图7中,将上述图标710连同上述延长线711平移使得上述延长线711经过操作位置714,或者713到714之间的多个变化的操作位置。若平移时上述延长线711依次经过713到714之间的多个变化的操作位置,那么就可以实现上述图标710连同上述延长线711跟随输入对象702的移动而变化的显示效果。
4、监测上述接触的第二运动部分,上述第二运动部分偏离上述延长线711当前所在的延伸方向。本实施例中,上述第二运动部分可以为图7中从操作位置714运动到操作位置715的过程,714→715。而上述第二运动部分可以沿提示符704所示的方向运动。
5、确定上述接触的第二运动部分关联在上述触摸显示屏上的第二操作位置,当然,在本实施例中,上述第二操作位置可以为触摸显示屏上的任意一个操作位置,例如,图7中的715,或者714到715之间的多个变化的操作位置。
6、将上述图标710连同上述延长线711平移使得上述延长线711经过上述第二操作位置,例如图7中,将上述图标710连同上述延长线711平移使得上述延长线711经过操作位置715,或者714到715之间的多个变化的操作位置。若平移时上述延长线711依次经过714到715之间的多个变化的操作位置,那么就可以实现上述图标710连同上述延长线711跟随输入对象702的移动而变化的显示效果。
在图7所示的实施例中,从操作位置713移动到操作位置715的过程中不曾间断,输入对象一直与触摸显示屏接触。
在图4的步骤S230中,第一处理器或者图2中的第二处理器240调用上述手势检测模块识别释放输入对象与触摸显示屏(130,或230)接触时产生的终止位置。
在其中一个实施例中,第一处理器或者图2中的第二处理器240至少可以通过以下方式实现识别释放上述接触产生的终止位置。
监测输入对象与触摸显示屏(130,或230)之间接触的脱离,将上述接触在脱离前位于的操作位置作为终止位置。例如,如图6所示,输入对象602与触摸显示屏(130,或230)之间接触,处理器检测两者的接触从操作位置613移动到操作位置614,因此也将延长线的显示不断更新,用以连续旋转延长线使其经过操作位置614。此时,在操作位置614输入对象脱离,处理器监测输入对象602与触摸显示屏(130,或230)之间接触的脱离,将上述接触在脱离前位于的操作位置614作为终止位置。又例如,图7中,输入对象702与触摸显示屏(130,或230)之间接触,处理器检测两者的接触从操作位置713连续移动到操作位置715,因此也将延长线的显示不断更新,用以连续平移延长线使其经过操作位置715。此时,在操作位置715输入对象脱离,处理器监测输入对象702与触摸显示屏(130,或230)之间接触的脱离,将上述接触在脱离前位于的操作位置715作为终止位置。当然,更进一步地,当监测到输入对象与触摸显示屏(130,或230)之间的接触移出预定范围之外时,也可以视为释放上述接触,这里的预定范围可以是包含超声图像的一定界面区域范围。
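上述识别释放接触产生终止位置的过程(含接触脱离、以及接触移出预定范围视为释放两种情形),可用如下Python草图理解(事件的表示方式为说明性假设):

```python
def track_termination(events, bounds=None):
    # events: ("move", (x, y)) 或 ("up",) 组成的事件序列
    # bounds = (x0, y0, x1, y1):预定范围,接触移出该范围也视为释放
    last = None
    for ev in events:
        if ev[0] == "move":
            x, y = ev[1]
            if bounds is not None:
                x0, y0, x1, y1 = bounds
                if not (x0 <= x <= x1 and y0 <= y <= y1):
                    return last      # 接触移出预定范围,视为释放
            last = (x, y)
        elif ev[0] == "up":
            return last              # 脱离接触,脱离前的操作位置即终止位置
    return last
```

返回的终止位置即步骤S232中延长线最终需要经过的位置。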
在图4的步骤S232中,第一处理器或者图2中的第二处理器240确定延长线和图标的显示使延长线经过上述终止位置。本文提到的延长线的“经过”是指延长线经过终止位置或者操作位置内至少一个像素位置。当然在此过程中,图标和延长线之间的相对位置关系始终维持不变。本文中提到的图标和延长线之间的相对位置关系包括图标和延长线之间的夹角关系、图标位于延长线的位置等等两者的坐标位置关系。本文的实施例中图标和延长线是联动的,当延长线旋转时图标旋转,当延长线平移时图标随之平移。确定延长线和图标的显示,在于确定延长线和图标在图形用户界面上的显示位置,或者说是在图标和延长线的更新过程中将延长线和图标在图形用户界面上固定到上述可以使延长线经过上述终止位置的显示位置。
上述图4中的步骤S230和步骤S232与图8中的步骤S830和步骤S832的执行过程相同,在此不再累述。
此外,当需要通过在图形用户界面上输入相应操作即可实现上述旋转或者平移延长线和图标的操作过程时,可以在保持输入对象与触摸屏接触状态的过程中,通过判断接触的运动方向与延长线当前所在的延伸方向之间的夹角条件来区分是执行上述实施例中第二操作位置的确定,还是用于确定旋转图标和延长线时延长线所要经过的操作位置。例如,参见前文中关于图8的相关说明,将第二预设条件设置为夹角(即接触的运动方向与延长线当前所在的延伸方向之间的夹角)位于一个较小但大于零的第一角度范围,而第一预设条件设置为夹角位于一个较大且不同于第二预设条件的第二角度范围。这样也可以在不间断的接触监测过程中,同时实现对延长线和图标的旋转和平移操作。当然,利用执行上述步骤S230和步骤S232、或者步骤S830和步骤S832可以在间断的接触监测中实现对延长线和图标的旋转和平移操作。
在其中一个实施例中,在上述各个实施例执行完步骤S232、或者步骤S832之后,还可以包括以下步骤:第一处理器或者图2中的第二处理器240取消超声图像上的延长线的显示。保证在通过延长线实施对图标的调整之后,不再显示延长线,保证超声图像界面的完整显示。
在其中一个实施例中,在上述各个实施例执行完步骤S232、或者步骤S832之后,还可以包括以下步骤:第一处理器或者图2中的第二处理器240依据图标更新后的位置,更新设置超声成像参数。例如图5所示的实施例中,处理器可以依据更新位置后的采样门来设置超声成像参数。还比如,图7中处理器可以依据更新位置后的图标(例如,探头标识、批注文本等等)来更新超声图像的备注信息,此处备注的信息包括:探头位置、采样门位置、批注文本、采样线角度、采样门角度等等在超声图像上或者针对超声图像进行的编辑信息。
更新设置后的超声成像参数被传输至探头,并依据新的超声成像参数获得超声图像。或者,将超声图像以及备注信息存储,并传输给远端设备。
此外,还例如图2所示的实施例中,智能控制器270中的第二处理器根据上述图标生成含有更新设置的超声成像参数的控制信号,和/或,生成含有超声图像以及备注信息的图像数据,备注信息由上述图标获得。通过第二通信模块214将该控制信号输出至第一通信模块215,用以通过第一处理器来控制探头对目标组织的超声扫描,更新相应的超声成像参数;或者通过第二通信模块214将含有超声图像以及备注信息的图像数据输出至第一通信模块215,用以通过第一处理器来显示,或者输出至上位机存储。
图4和图8分别提供的仅仅是一种步骤间的流程执行顺序,还可以基于前文中对图4和图8中的各个步骤进行调整顺序获得各种变形方案,上述各个步骤不限于仅按照图4和图8的顺序执行,步骤间在满足基本逻辑的情况下可以相互置换,更改执行顺序,还可以重复执行其中的一个或多个步骤后,再执行最后一个或多个步骤,这些方案均属于依据本文提供的实施例进行的变形方案。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本发明的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品承载在一个非易失性计算机可读存储载体(如ROM、磁碟、光盘、硬盘、服务器云空间)中,包括若干指令用以使得一台终端设备(可以是手机,计算机,服务器,或者网络设备等)实现本发明各个实施例所述的系统结构和方法。例如,一种计算机可读存储介质,其上存储有计算机程序,该计算机程序被处理器执行时至少可以用于实现前文中提到的基于图4中步骤S214至步骤S232或者图8中步骤S814至步骤S832所示流程的各个实施例。
本实施例中由于延长线的存在,既可方便用户点选延长线、在不遮挡图标的情况下进行调节,又由于增大了从手指与屏幕的接触点到图标的距离,因此可以更精细地调节图标的角度等位置信息。
以上实施例仅表达了几种实施方式,其描述较为具体和详细,但并不能因此而理解为对本发明专利范围的限制。应当指出的是,对于本领域的普通技术人员来说,在不脱离本发明构思的前提下,还可以做出若干变形和改进,这些都属于本发明的保护范围。因此,本发明专利的保护范围应以所附权利要求为准。

Claims (37)

  1. 一种超声医学检测设备,其特征在于,所述设备包括:
    探头;
    发射电路和接收电路,用于激励所述探头向检测对象发射超声波束,并接收所述超声波束的回波,获得超声回波信号;
    图像处理模块,用于根据所述超声回波信号获得超声图像;
    触摸显示屏;
    第一存储器,所述第一存储器存储处理器上运行的计算机程序;和,
    第一处理器,所述第一处理器执行所述程序时实现以下步骤:
    在所述触摸显示屏上显示所述超声图像,
    在所述超声图像上叠加显示图标,
    显示从所述图标的端部沿预定方向延伸的延长线,
    监测输入对象与所述触摸显示屏的接触的运动,
    确定所述接触的运动对应到触摸显示屏上的操作位置,
    更新所述延长线和图标的显示使所述延长线经过所述操作位置,
    识别释放所述接触产生的终止位置,和,
    确定所述延长线和图标的显示使所述延长线经过所述终止位置,其中,所述图标和所述延长线之间的相对位置关系始终维持不变。
  2. 根据权利要求1所述的超声医学检测设备,其特征在于,所述第一处理器执行所述程序时采用以下方式实现所述显示从所述图标的端部沿预定方向延伸的延长线:
    检测输入对象与所述触摸显示屏的接触,获得初始接触位置;
    生成从所述图标的端部延伸至所述初始接触位置的延长线;和,
    显示所述延长线。
  3. 根据权利要求1所述的超声医学检测设备,其特征在于,所述图标包括探头标记符、注释标记、采样线和采样门中的其中之一。
  4. 根据权利要求1所述的超声医学检测设备,其特征在于,所述第一处理器执行所述程序时至少采用以下步骤之一来实现所述更新所述延长线和图标的显示使所述延长线经过所述操作位置:
    以所述图标上的一个位置为中心旋转所述图标和延长线使所述延长线经过所述操作位置,和,
    平移所述延长线和图标使所述延长线经过所述操作位置。
  5. 根据权利要求4所述的超声医学检测设备,其特征在于,所述第一处理器执行所述程序时在所述重新绘制以所述图标上的一个位置为起点且经过所述操作位置的延长线之前还包括:
    识别所述接触的运动方向,和
    确定所述接触的运动偏离所述延长线当前所在的延伸方向,或者确定所述接触的运动方向与延长线之间的夹角满足第一预设条件,第一预设条件为所述夹角位于第一角度范围。
  6. 根据权利要求4或5所述的超声医学检测设备,其特征在于,所述第一处理器执行所述程序时在所述平移所述延长线和图标使所述延长线经过所述操作位置之前还包括:
    识别所述接触的运动方向,和
    确定所述接触的运动沿所述延长线当前所在的延伸方向进行,或者确定所述接触的运动方向与延长线之间的夹角满足第二预设条件,第二预设条件为所述夹角位于不同于第一角度范围的第二角度范围。
  7. 根据权利要求4所述的超声医学检测设备,其特征在于,所述第一处理器执行所述程序时采用以下方式实现所述监测输入对象与所述触摸显示屏的接触的运动,确定所述接触的运动对应到触摸显示屏上的操作位置,以及平移所述延长线和图标使所述延长线经过所述操作位置:
    监测所述接触的第一运动部分,所述第一运动部分沿所述延长线当前所在的延伸方向进行,
    确定所述接触的第一运动部分关联在所述触摸显示屏上的第一操作位置,
    将所述图标连同所述延长线平移使得所述延长线经过第一操作位置,
    监测所述接触的第二运动部分,所述第二运动部分偏离所述延长线当前所在的延伸方向,
    确定所述接触的第二运动部分关联在所述触摸显示屏上的第二操作位置,
    将所述图标连同所述延长线平移使得所述延长线经过所述第二操作位置,所述第二操作位置为触摸显示屏上的任意一个操作位置。
  8. 根据权利要求1所述的超声医学检测设备,其特征在于,所述延长线延伸至超声图像的显示区域之外。
  9. 根据权利要求1所述的超声医学检测设备,其特征在于,所述第一处理器执行所述程序时在所述监测输入对象在所述敏感区内与所述触摸显示屏的接触之后还包括:
    显示提示符,所述提示符用于指示旋转或平移的方向。
  10. 根据权利要求1所述的超声医学检测设备,其特征在于,所述第一处理器还执行所述程序时实现以下步骤:
    在所述触摸显示屏的第一界面层上显示所述超声图像,在第一界面层的上方叠加透明设置的第二界面层,并将所述图标和延长线设置在所述第二界面层上。
  11. 一种超声成像控制方法,其包括:
    激励探头向检测对象发射超声波束;
    接收所述超声波束的回波,获得超声回波信号;
    根据所述超声回波信号获得超声图像;
    在触摸显示屏上显示所述超声图像,
    在所述超声图像上叠加显示图标,
    显示从所述图标的端部沿预定方向延伸的延长线,
    监测输入对象与所述触摸显示屏的接触的运动,
    确定所述接触的运动对应到触摸显示屏上的操作位置,
    更新所述延长线和图标的显示使所述延长线经过所述操作位置,
    识别释放所述接触产生的终止位置,和,
    确定所述延长线和图标的显示使所述延长线经过所述终止位置,其中,所述图标和所述延长线之间的相对位置关系始终维持不变。
  12. 根据权利要求11所述的超声成像控制方法,其特征在于,所述显示从所述图标的端部沿预定方向延伸的延长线包括:
    检测输入对象与所述触摸显示屏的接触,获得初始接触位置;
    生成从所述图标的端部延伸至所述初始接触位置的延长线;和,
    显示所述延长线。
  13. 根据权利要求11所述的超声成像控制方法,其特征在于,所述图标包括探头标记符、注释标记、采样线和采样门中的其中之一。
  14. 根据权利要求11所述的超声成像控制方法,其特征在于,所述更新所述延长线和图标的显示使所述延长线经过所述操作位置包括以下步骤之一:
    以所述图标上的一个位置为中心旋转所述图标和延长线使所述延长线经过所述操作位置,和,
    平移所述延长线和图标使所述延长线经过所述操作位置。
  15. 根据权利要求14所述的超声成像控制方法,其特征在于,在所述重新绘制以所述图标上的一个位置为起点且经过所述操作位置的延长线之前还包括:
    识别所述接触的运动方向,和
    确定所述接触的运动偏离所述延长线当前所在的延伸方向,或者确定所述接触的运动方向与延长线之间的夹角满足第一预设条件,第一预设条件为所述夹角位于第一角度范围。
  16. 根据权利要求14或15所述的超声成像控制方法,其特征在于,在所述平移所述延长线和图标使所述延长线经过所述操作位置之前还包括:
    识别所述接触的运动方向,和
    确定所述接触的运动沿所述延长线当前所在的延伸方向进行,或者确定所述接触的运动方向与延长线之间的夹角满足第二预设条件,第二预设条件为所述夹角位于不同于第一角度范围的第二角度范围。
  17. 根据权利要求14所述的超声成像控制方法,其特征在于,所述监测输入对象与所述触摸显示屏的接触的运动,确定所述接触的运动对应到触摸显示屏上的操作位置,以及平移所述延长线和图标使所述延长线经过所述操作位置包括:
    监测所述接触的第一运动部分,所述第一运动部分沿所述延长线当前所在的延伸方向进行,
    确定所述接触的第一运动部分关联在所述触摸显示屏上的第一操作位置,
    将所述图标连同所述延长线平移使得所述延长线经过第一操作位置,
    监测所述接触的第二运动部分,所述第二运动部分偏离所述延长线当前所在的延伸方向,
    确定所述接触的第二运动部分关联在所述触摸显示屏上的第二操作位置,
    将所述图标连同所述延长线平移使得所述延长线经过所述第二操作位置,所述第二操作位置为触摸显示屏上的任意一个操作位置。
  18. 根据权利要求11所述的超声成像控制方法,其特征在于,所述延长线延伸至超声图像的显示区域之外。
  19. 根据权利要求11所述的超声成像控制方法,其特征在于,在所述触摸显示屏的第一界面层上显示所述超声图像,在第一界面层的上方叠加透明设置的第二界面层,并将所述图标和延长线设置在所述第二界面层上。
  20. 一种超声成像系统,其特征在于,所述系统包括:超声医学检测设备和智能控制器;其中,
    所述超声医学检测设备包括:
    探头;
    发射电路和接收电路,用于激励所述探头向检测对象发射超声波束,接收所述超声波束的回波,获得超声回波信号;
    图像处理模块,用于根据所述超声回波信号获得超声图像;和,
    与图像处理模块电连接的第一通信模块,用于将所述超声图像数据传输至所述智能控制器,和/或接收所述智能控制器输入的控制信号用以设置获得所述超声图像所需要的超声成像参数;
    所述智能控制器包括:
    触摸显示屏,
    第二通信模块,用于接收来自所述第一通信模块传送的超声图像数据,和/或向所述第一通信模块发送控制信号;
    第二存储器,所述存储器存储处理器上运行的计算机程序;和,
    第二处理器,所述第二处理器执行所述程序时实现以下步骤:
    在所述触摸显示屏上显示所述超声图像,
    在所述超声图像上叠加显示图标,
    显示从所述图标的端部沿预定方向延伸的延长线,
    监测输入对象与所述触摸显示屏的接触的运动,
    确定所述接触的运动对应到触摸显示屏上的操作位置,
    更新所述延长线和图标的显示使所述延长线经过所述操作位置,
    识别释放所述接触产生的终止位置,和,
    确定所述延长线和图标的显示使所述延长线经过所述终止位置,其中,所述图标和所述延长线之间的相对位置关系始终维持不变,以及,
    根据所述图标生成含有超声成像参数的控制信号,和/或生成含有所述超声图像以及备注信息的图像数据;
    通过所述第二通信模块输出所述控制信号至所述第一通信模块,或者输出所述图像数据。
  21. 根据权利要求20所述的超声成像系统,其特征在于,所述第二处理器执行所述程序时采用以下方式实现所述显示从所述图标的端部沿预定方向延伸的延长线:
    检测输入对象与所述触摸显示屏的接触,获得初始接触位置;
    生成从所述图标的端部延伸至所述初始接触位置的延长线;和,
    显示所述延长线。
  22. 根据权利要求20所述的超声成像系统,其特征在于,所述图标包括探头标记符、注释标记、采样线和采样门中的其中之一。
  23. 根据权利要求20所述的超声成像系统,其特征在于,所述第二处理器执行所述程序时至少采用以下步骤之一来实现所述更新所述延长线和图标的显示使所述延长线经过所述操作位置:
    以所述图标上的一个位置为中心旋转所述图标和延长线使所述延长线经过所述操作位置,和,
    平移所述延长线和图标使所述延长线经过所述操作位置。
  24. 根据权利要求20所述的超声成像系统,其特征在于,所述第二处理器执行所述程序时在所述重新绘制以所述图标上的一个位置为起点且经过所述操作位置的延长线之前还包括:
    识别所述接触的运动方向,和
    确定所述接触的运动偏离所述延长线当前所在的延伸方向,或者确定所述接触的运动方向与延长线之间的夹角满足第一预设条件,第一预设条件为所述夹角位于第一角度范围。
  25. 根据权利要求23或24所述的超声成像系统,其特征在于,所述第二处理器执行所述程序时在所述平移所述延长线和图标使所述延长线经过所述操作位置之前还包括:
    识别所述接触的运动方向,和
    确定所述接触的运动沿所述延长线当前所在的延伸方向进行,或者确定所述接触的运动方向与延长线之间的夹角满足第二预设条件,第二预设条件为所述夹角位于不同于第一角度范围的第二角度范围。
  26. 根据权利要求23所述的超声成像系统,其特征在于,所述第二处理器执行所述程序时采用以下方式实现所述监测输入对象与所述触摸显示屏的接触的运动,确定所述接触的运动对应到触摸显示屏上的操作位置,以及平移所述延长线和图标使所述延长线经过所述操作位置:
    监测所述接触的第一运动部分,所述第一运动部分沿所述延长线当前所在的延伸方向进行,
    确定所述接触的第一运动部分关联在所述触摸显示屏上的第一操作位置,
    将所述图标连同所述延长线平移使得所述延长线经过第一操作位置,
    监测所述接触的第二运动部分,所述第二运动部分偏离所述延长线当前所在的延伸方向,
    确定所述接触的第二运动部分关联在所述触摸显示屏上的第二操作位置,
    将所述图标连同所述延长线平移使得所述延长线经过所述第二操作位置,所述第二操作位置为触摸显示屏上的任意一个操作位置。
  27. 根据权利要求20所述的超声成像系统,其特征在于,所述延长线延伸至超声图像的显示区域之外。
  28. 根据权利要求20所述的超声成像系统,其特征在于,所述第二处理器还执行所述程序时实现以下步骤:
    在所述触摸显示屏的第一界面层上显示所述超声图像,在第一界面层的上方叠加透明设置的第二界面层,并将所述图标和延长线设置在所述第二界面层上。
  29. 一种智能控制器,其特征在于,所述智能控制器包括:
    触摸显示屏;
    第二通信模块,用于接收来自超声医学检测设备传送的超声图像数据,和/或向所述超声医学检测设备发送控制信号;
    第二存储器,所述存储器存储处理器上运行的计算机程序;和,
    第二处理器,所述第二处理器执行所述程序时实现以下步骤:
    在所述触摸显示屏上显示所述超声图像,
    在所述超声图像上叠加显示图标,
    显示从所述图标的端部沿预定方向延伸的延长线,
    监测输入对象与所述触摸显示屏的接触的运动,
    确定所述接触的运动对应到触摸显示屏上的操作位置,
    更新所述延长线和图标的显示使所述延长线经过所述操作位置,
    识别释放所述接触产生的终止位置,和,
    确定所述延长线和图标的显示使所述延长线经过所述终止位置,其中,所述图标和所述延长线之间的相对位置关系始终维持不变,以及,
    根据所述图标生成含有超声成像参数的控制信号,和/或生成含有所述超声图像以及备注信息的图像数据;
    通过所述第二通信模块输出所述控制信号至所述超声医学检测设备,或者输出所述图像数据。
  30. 根据权利要求29所述的智能控制器,其特征在于,所述第二处理器执行所述程序时采用以下方式实现所述显示从所述图标的端部沿预定方向延伸的延长线:
    检测输入对象与所述触摸显示屏的接触,获得初始接触位置;
    生成从所述图标的端部延伸至所述初始接触位置的延长线;和,
    显示所述延长线。
  31. 根据权利要求29所述的智能控制器,其特征在于,所述图标包括探头标记符、注释标记、采样线和采样门中的其中之一。
  32. 根据权利要求29所述的智能控制器,其特征在于,所述第二处理器执行所述程序时至少采用以下步骤之一来实现所述更新所述延长线和图标的显示使所述延长线经过所述操作位置:
    以所述图标上的一个位置为中心旋转所述图标和延长线使所述延长线经过所述操作位置,和,
    平移所述延长线和图标使所述延长线经过所述操作位置。
  33. 根据权利要求29所述的智能控制器,其特征在于,所述第二处理器执行所述程序时在所述重新绘制以所述图标上的一个位置为起点且经过所述操作位置的延长线之前还包括:
    识别所述接触的运动方向,和
    确定所述接触的运动偏离所述延长线当前所在的延伸方向,或者确定所述接触的运动方向与延长线之间的夹角满足第一预设条件,第一预设条件为所述夹角位于第一角度范围。
  34. 根据权利要求32或33所述的智能控制器,其特征在于,所述第二处理器执行所述程序时在所述平移所述延长线和图标使所述延长线经过所述操作位置之前还包括:
    识别所述接触的运动方向,和
    确定所述接触的运动沿所述延长线当前所在的延伸方向进行,或者确定所述接触的运动方向与延长线之间的夹角满足第二预设条件,第二预设条件为所述夹角位于不同于第一角度范围的第二角度范围。
  35. 根据权利要求32所述的智能控制器,其特征在于,所述第二处理器执行所述程序时采用以下方式实现所述监测输入对象与所述触摸显示屏的接触的运动,确定所述接触的运动对应到触摸显示屏上的操作位置,以及平移所述延长线和图标使所述延长线经过所述操作位置:
    监测所述接触的第一运动部分,所述第一运动部分沿所述延长线当前所在的延伸方向进行,
    确定所述接触的第一运动部分关联在所述触摸显示屏上的第一操作位置,
    将所述图标连同所述延长线平移使得所述延长线经过第一操作位置,
    监测所述接触的第二运动部分,所述第二运动部分偏离所述延长线当前所在的延伸方向,
    确定所述接触的第二运动部分关联在所述触摸显示屏上的第二操作位置,
    将所述图标连同所述延长线平移使得所述延长线经过所述第二操作位置,所述第二操作位置为触摸显示屏上的任意一个操作位置。
  36. 根据权利要求29所述的智能控制器,其特征在于,所述延长线延伸至超声图像的显示区域之外。
  37. 根据权利要求29所述的智能控制器,其特征在于,所述第二处理器还执行所述程序时实现以下步骤:
    在所述触摸显示屏的第一界面层上显示所述超声图像,在第一界面层的上方叠加透明设置的第二界面层,并将所述图标和延长线设置在所述第二界面层上。
PCT/CN2017/073045 2017-02-07 2017-02-07 超声医学检测设备及成像控制方法、成像系统、控制器 WO2018145244A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780024747.2A CN109069105B (zh) 2017-02-07 2017-02-07 超声医学检测设备及成像控制方法、成像系统、控制器
PCT/CN2017/073045 WO2018145244A1 (zh) 2017-02-07 2017-02-07 超声医学检测设备及成像控制方法、成像系统、控制器

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/073045 WO2018145244A1 (zh) 2017-02-07 2017-02-07 超声医学检测设备及成像控制方法、成像系统、控制器

Publications (1)

Publication Number Publication Date
WO2018145244A1 true WO2018145244A1 (zh) 2018-08-16

Family

ID=63107647

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/073045 WO2018145244A1 (zh) 2017-02-07 2017-02-07 超声医学检测设备及成像控制方法、成像系统、控制器

Country Status (2)

Country Link
CN (1) CN109069105B (zh)
WO (1) WO2018145244A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210015448A1 (en) * 2019-07-15 2021-01-21 GE Precision Healthcare LLC Methods and systems for imaging a needle from ultrasound imaging data

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114036981B (zh) * 2021-11-01 2024-04-26 中国海洋大学 基于声波成像的手势识别方法

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104042236A (zh) * 2013-03-13 2014-09-17 三星电子株式会社 提供复制图像的方法及其所用的超声设备
US20150051491A1 (en) * 2012-09-24 2015-02-19 Samsung Medison Co., Ltd. Ultrasound apparatus and information providing method of the ultrasound apparatus
CN104407692A (zh) * 2014-09-30 2015-03-11 深圳市亿思达科技集团有限公司 基于超声波的全息图像交互式显示方法、控制方法及系统

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2666538B2 (ja) * 1990-08-10 1997-10-22 富士通株式会社 パニング制御システム
JP5326802B2 (ja) * 2009-05-19 2013-10-30 ソニー株式会社 情報処理装置、画像拡大縮小方法及びそのプログラム
US8823749B2 (en) * 2009-06-10 2014-09-02 Qualcomm Incorporated User interface methods providing continuous zoom functionality
CN102440804A (zh) * 2011-09-17 2012-05-09 无锡祥生医学影像有限责任公司 触摸屏超声诊断仪及其图像放大方法
JP5907780B2 (ja) * 2012-04-02 2016-04-26 富士フイルム株式会社 超音波診断装置
TWI480792B (zh) * 2012-09-18 2015-04-11 Asustek Comp Inc 電子裝置的操作方法
CN105898189A (zh) * 2014-05-06 2016-08-24 无锡威莱斯电子有限公司 倒车辅助线可调节的无线倒车影像系统
CN105892857B (zh) * 2016-03-31 2020-06-30 深圳市菲森科技有限公司 图像定位方法及装置



Also Published As

Publication number Publication date
CN109069105A (zh) 2018-12-21
CN109069105B (zh) 2021-08-24


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17895764

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17895764

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14/02/2020)
