WO2018145264A1 - Ultrasonic medical detection device, imaging control method, imaging system, and controller

Ultrasonic medical detection device, imaging control method, imaging system, and controller

Info

Publication number
WO2018145264A1
Authority
WO
WIPO (PCT)
Prior art keywords
information indicators
information
contact
indication
indication identifier
Application number
PCT/CN2017/073099
Other languages
English (en)
French (fr)
Inventor
周述文
刘智光
何绪金
Original Assignee
深圳迈瑞生物医疗电子股份有限公司
Application filed by 深圳迈瑞生物医疗电子股份有限公司 filed Critical 深圳迈瑞生物医疗电子股份有限公司
Priority to PCT/CN2017/073099 priority Critical patent/WO2018145264A1/zh
Priority to CN201780024746.8A priority patent/CN109069104B/zh
Publication of WO2018145264A1 publication Critical patent/WO2018145264A1/zh

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves

Definitions

  • the present invention relates to an ultrasound imaging control method and imaging system for an ultrasonic medical detection device with a touch display screen.
  • in the process of imaging an imaging target, an ultrasound imaging system often needs some of its imaging parameters to be adjusted in order to obtain the desired image.
  • the operator can adjust these parameters via the control panel of the ultrasound imaging system or via the touch screen.
  • the control panel provides controls such as adjustment buttons for the imaging parameters.
  • the parameter adjustment buttons are usually displayed on the touch screen, and the operator usually touches the buttons on the touch screen with a finger or a stylus, thereby adjusting the parameters to the desired level.
  • an ultrasonic medical testing apparatus comprising:
  • a transmitting circuit and a receiving circuit configured to excite the probe to emit an ultrasonic beam to the detecting object, and receive an echo of the ultrasonic beam to obtain an ultrasonic echo signal
  • An image processing module configured to obtain an ultrasound image according to the ultrasound echo signal
  • the first memory storing a computer program running on the processor
  • a first processor which, when executing the program, implements the following steps:
  • the indicator is displayed on the touch display.
  • an ultrasound imaging control method comprising:
  • Transmitting an ultrasonic beam to the detection object according to the ultrasound imaging parameters, receiving an echo of the ultrasonic beam to obtain an ultrasonic echo signal, and obtaining an ultrasound image according to the ultrasonic echo signal;
  • the indicator is displayed on the touch display.
  • an ultrasound imaging system comprising: an ultrasound medical detection device and an intelligent controller; wherein
  • the ultrasonic medical testing device includes:
  • a transmitting circuit and a receiving circuit configured to excite the probe to emit an ultrasonic beam to the detecting object, receive an echo of the ultrasonic beam, and obtain an ultrasonic echo signal
  • An image processing module configured to obtain an ultrasound image according to the ultrasound echo signal
  • a first communication module electrically connected to the image processing module, configured to transmit the ultrasound image data to the intelligent controller, and/or to receive a control signal input by the intelligent controller to obtain the ultrasound imaging parameters required for the ultrasound image;
  • the intelligent controller includes:
  • a second communication module configured to receive ultrasound image data transmitted from the first communication module, and/or send a control signal to the first communication module
  • a second processor which, when executing the program, implements the following steps:
  • the indicator is displayed on the touch display.
  • an intelligent controller comprising:
  • a second communication module configured to receive ultrasound image data transmitted from the ultrasonic medical detection device, and/or to send a control signal to the ultrasonic medical detection device;
  • a second processor which, when executing the program, implements the following steps:
  • Displaying a plurality of information indicators in a preset order on the touch display screen, where each information indicator corresponds to a discrete alternative.
  • the indicator is displayed on the touch display.
  • a control signal containing the parameter value is output by the second communication module.
  • FIG. 1 is a schematic diagram of a system architecture of an ultrasonic medical testing device in accordance with some embodiments
  • FIG. 2 is a schematic diagram of a system architecture of an ultrasonic medical testing device in accordance with some embodiments
  • FIG. 3 is a schematic flow chart of the ultrasonic imaging control method in the embodiment shown in FIG. 1 or FIG. 2;
  • Figure 4 provides a display embodiment of the graphical user interface of the embodiment of Figure 3;
  • Figure 5 provides another display embodiment of the graphical interface of the example of Figure 3;
  • FIG. 6 is a modified embodiment of a graphical user interface in the embodiment of FIG. 3;
  • FIGS. 7 and 8 are modified embodiments of a plurality of information indicators in a circular arrangement in the embodiment of FIG. 3.
  • FIG. 1 is a schematic view showing the structure of an ultrasonic medical detecting apparatus 100 in an embodiment, and the specific structure is as follows.
  • the ultrasonic medical testing apparatus 100 shown in FIG. 1 mainly includes a probe 101, a transmitting circuit 103, a transmitting/receiving selection switch 102, a receiving circuit 104, a beam combining module 105, a signal processing module 116, and an image processing module 126.
  • the transmitting circuit 103 transmits a delayed-focused transmission pulse having a certain amplitude and polarity to the probe 101 through the transmission/reception selection switch 102.
  • the probe 101 is excited by a transmission pulse to emit an ultrasonic wave (which may be any one of a plane wave, a focused wave, or a divergent wave) to a detection object (for example, an organ, tissue, or blood vessel in a human or animal body, not shown), receives, after a certain delay, the ultrasonic echo carrying information of the detection object reflected from the target area, and reconverts the ultrasonic echo into an electrical signal.
  • the receiving circuit 104 receives the electrical signals generated by the conversion of the probe 101, obtains ultrasonic echo signals, and sends the ultrasonic echo signals to the beam combining module 105.
  • the beam synthesis module 105 performs processing such as focus delay, weighting, and channel summation on the ultrasonic echo signals, and then sends the ultrasonic echo signals to the signal processing module 116 for related signal processing.
  • the ultrasonic echo signals processed by the signal processing module 116 are sent to the image processing module 126.
  • the image processing module 126 performs different processing on the signals according to the different imaging modes required by the user, obtains ultrasound image data of different modes, and then forms ultrasound images of different modes through logarithmic compression, dynamic range adjustment, digital scan conversion, and the like, such as B images, C images, D images, etc., or other types of two-dimensional or three-dimensional ultrasound images.
  • the transmitting circuit and the receiving circuit excite the probe to emit an ultrasonic beam to the detection object according to the ultrasound imaging parameter settings, and receive the echo of the ultrasonic beam to obtain an ultrasonic echo signal, thereby obtaining the desired ultrasound image data for displaying the internal structure of the detection object. The ultrasound imaging parameters mentioned herein refer to all parameters that can be selected by the user during imaging of the ultrasound tissue image, such as TGC (time gain compensation), acoustic frequency, pulse repetition frequency (PRF), ultrasound type, dynamic range, and so on.
  • the ultrasound imaging parameters of the ultrasound imaging system include at least one discontinuous ultrasound imaging parameter, i.e., a parameter that does not change continuously but is divided into a plurality of levels or gear positions with a certain gap between them. When such a parameter is adjusted, it only takes values at these levels or gear positions, and never takes values between them.
  • These discontinuously varying parameters are referred to herein as discrete parameters or discrete values, and they form multiple discrete alternatives on the graphical user interface.
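  • A minimal sketch of this notion of a gear-level parameter is given below (the class and helper names are illustrative assumptions, not part of the embodiment): a discrete parameter only ever takes one of its listed alternatives, and any requested value is snapped to the nearest one.

```python
# Minimal sketch (hypothetical names): a discontinuous ("gear-level") imaging
# parameter that may only take values from a fixed list of discrete alternatives.
from dataclasses import dataclass
from typing import Sequence


@dataclass
class DiscreteImagingParameter:
    name: str
    alternatives: Sequence[float]  # the discrete alternatives shown on the GUI

    def snap(self, requested: float) -> float:
        """Return the discrete alternative closest to the requested value."""
        return min(self.alternatives, key=lambda v: abs(v - requested))


# Example: a TGC-like parameter whose levels are 1.5, 2.5, 3.5, ...
tgc = DiscreteImagingParameter("TGC", [1.5, 2.5, 3.5, 4.5, 5.5])
print(tgc.snap(3.1))  # -> 3.5; values between gear positions are never used
```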
  • the signal processing module 116 and the image processing module 126 of FIG. 1 may be integrated on one motherboard 106, or one or more of these modules may be integrated on and implemented by one processor/controller chip.
  • the acquired ultrasound image can be output to the first touch display 130 for display.
  • the first touch display screen 130 is connected to the image processing module through an input/output interface (the input/output interface can be implemented by means of wired communication or wireless communication) to implement data transmission.
  • a first processor 140 and a first memory 160 may also be included.
  • the first processor 140 invokes computer program instructions stored in the first memory 160 to display the ultrasound image on the first touch display 130 and/or to form a graphical user interface on the touch display screen.
  • a graphical user interface is displayed on the first touch display 130 and graphical controls such as the ultrasound imaging parameter adjustments involved in the ultrasound image imaging process, various functional keys, and the like are presented.
  • based on the graphical user interface (GUI), control instructions corresponding to operations performed on the graphical controls by the input object on the touch display can be obtained, and control commands carrying information such as ultrasound imaging parameters can be transmitted, by wire or wirelessly, to the ultrasonic medical detection apparatus 100 and used to control the operation of the probe, the transmitting circuit, the receiving circuit, and the like, so as to obtain the desired ultrasound image.
  • the ultrasound image may be displayed on the two display screens, respectively, or on the same display screen.
  • An ultrasound image can be displayed on the touch display screen, or a graphical user interface (GUI) can be displayed on it for the user to input commands.
  • the first processor 140 can invoke the gesture detection module 113 stored in the memory 160 to detect a control command obtained by the user performing a contact operation on the graphical user interface through the input object.
  • a touch display screen having a graphical user interface (GUI), one or more processors, a memory, and one or more modules, programs, or instruction sets stored in the memory for performing various functions are included, which together implement GUI-based manipulation input detection and obtain the relevant control instructions.
  • these functions may include parameter adjustment, information input, measurements relating to an object (e.g., a patient's tissue), building, displaying, and managing patient directory information, and more.
  • the modules, programs, or instructions for executing these may be included in a computer program product configured for execution by one or more processors.
  • the user interacts with the graphical user interface primarily through gesture input on the touch display.
  • the gesture input herein may include any type of user gesture input that the device can detect by directly touching the touch display or proximity to the touch display.
  • the gesture input may be made with a finger of the right or left hand (e.g., an index finger or thumb), or with an input object that can be detected by the touch display screen (e.g., a stylus or a dedicated touch-screen pen), by selecting one location, multiple locations, and/or multiple consecutive locations on the touch display screen; the operation actions may include contact, touch release, touch tap, long contact, rotational deployment, and the like.
  • the long contact corresponds to a gesture input in which a finger, thumb, or stylus is moved in a predetermined or variable direction while being kept in continuous contact with the touch display screen, for example touch-drag gesture operations such as dragging, flicking, wiping, sliding, sweeping, and the like.
  • the gesture input is realized by the contact of the input object with the touch display screen, and the contact with the touch display screen may include direct contact with the touch display screen, such as a finger, a thumb, or a stylus pen, or proximity to the touch display screen without direct contact.
  • a gesture input that is in proximity to the touch display screen without direct contact refers to a gesture operation action at a spatial position close to the touch screen display.
  • the above graphical user interface refers to an overall design of human-computer interaction, operation logic, and interface aesthetics of the software, which may include one or more soft keyboards and multiple graphic control objects.
  • a soft keyboard can include a number of icons (or soft keys). This allows the user to select one or more icons in the soft keyboard and thus select one or more corresponding symbols for input.
  • the gesture detection module 113 can detect a gesture input that interacts between the input object and the touch display screen.
  • the gesture detection module 113 includes various operations for performing gesture input detection, such as determining whether a contact has occurred, determining whether the gesture input is continuously input, determining whether to correspond to the predetermined gesture, determining an operation position corresponding to the gesture input, determining Whether the corresponding operation position of the gesture input moves to the edge position of the corresponding display area, determines whether the gesture input has been interrupted (eg, whether the contact has stopped), determines the movement of the gesture input, and tracks the movement trajectory of the gesture input, and the like.
  • Determining the motion of the gesture input may include determining a rate of motion (magnitude), a velocity of motion (magnitude and direction), and an acceleration of motion (a change in magnitude and/or direction) of the operation position corresponding to the gesture input, its motion trajectory, and the like. These operations can be applied to a single operation position (e.g., a gesture input implemented by one finger), or to multiple simultaneous operation positions (e.g., "multi-touch", i.e., gesture input implemented by multiple fingers).
  • the gesture detection module 113 is used to detect the motion of one or more input objects on the touch display surface or at a spatial location close to the touch display.
  • the gesture detection module 113 is stored on the memory, and the above-mentioned gesture input is monitored by the call of one or more processors to obtain an operation input instruction of the user.
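  • The following sketch illustrates, in very reduced form, the kind of bookkeeping such a gesture detection module performs (the class, callback names, and velocity formula are assumptions for illustration, not the module's actual interface): whether a contact has occurred, where the operation position is, how it moves, and when it is released.

```python
# Illustrative sketch only (hypothetical callback names): track contact, operation
# position, movement velocity, and release of a gesture input.
class GestureDetector:
    def __init__(self):
        self.contact = False
        self.last_pos = None   # (x, y) of the most recent operation position
        self.last_time = None
        self.trajectory = []   # movement trajectory of the gesture input

    def on_touch_down(self, x, y, t):
        self.contact = True
        self.last_pos, self.last_time = (x, y), t
        self.trajectory = [(x, y)]

    def on_touch_move(self, x, y, t):
        if not self.contact:
            return None
        dt = max(t - self.last_time, 1e-6)
        vx = (x - self.last_pos[0]) / dt  # velocity: magnitude and direction
        vy = (y - self.last_pos[1]) / dt
        self.last_pos, self.last_time = (x, y), t
        self.trajectory.append((x, y))
        return vx, vy

    def on_touch_up(self):
        self.contact = False              # the contact has stopped (released)
        return self.trajectory


det = GestureDetector()
det.on_touch_down(100, 200, t=0.00)
print(det.on_touch_move(103, 212, t=0.05))  # -> (60.0, 240.0) px/s drag velocity
print(det.on_touch_up())                    # -> [(100, 200), (103, 212)]
```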
  • the first processor 140 and the first memory 160 may be disposed on the main board 106, may be disposed independently of the main board 106, or may be integrated with the touch display screen 130 to form an independent display controller, which both displays the ultrasound image and obtains the control commands input by the user based on the ultrasound image.
  • the signal processing module 116 and/or the image processing module 126 of FIG. 1, together with the first processor 140, may be uniformly configured on one or more processors to perform the data processing of the ultrasound image, the above-mentioned gesture input monitoring, and the generation of the graphical user interface.
  • the ultrasonic medical testing apparatus 200 includes a probe 201, a transmitting circuit 203, a transmitting/receiving selection switch 202, a receiving circuit 204, a beam combining module 205, a signal processing module 216, and an image processing module 226.
  • the functions and implementations of the probe 201, the transmitting circuit 203, the transmit/receive selection switch 202, the receiving circuit 204, the beam combining module 205, the signal processing module 216, and the image processing module 226 are the same as those of the probe 101, the transmitting circuit 103, the transmitting/receiving selection switch 102, the receiving circuit 104, the beam synthesizing module 105, the signal processing module 116, and the image processing module 126 in the embodiment shown in FIG. 1, and are not described again here.
  • the signal processing module 216 and the image processing module 226 of FIG. 2 may be integrated on one motherboard 206, or one or more of these modules may be integrated on and implemented by one processor/controller chip. The difference from the embodiment shown in FIG. 1 is that the ultrasonic medical detecting apparatus 200 further includes a first communication module 215 electrically connected to the image processing module 226, for transmitting the ultrasound image data obtained by the image processing module 226 to the intelligent controller 270, and/or for receiving a control signal input by the intelligent controller 270 that is used to set the ultrasound imaging parameters used in the ultrasound imaging process.
  • the operation of setting the ultrasound imaging parameters includes updating the ultrasound imaging parameters, adjusting the ultrasound imaging parameters, or initializing the settings of the ultrasound imaging parameters.
  • the intelligent controller 270 in this embodiment includes a second touch display screen 230, a second processor 240, a second memory 260, and a second communication module 214.
  • the second memory 260 stores a computer program running on the second processor 240, such as the gesture detection module 213.
  • the gesture detection module 213 in this embodiment has the same function as the gesture detection module 113 in the embodiment shown in FIG. 1, and the description is not repeated here.
  • the second touch display screen 230 has the same function as the first touch display screen 130, although the specific product parameters may differ; the prefixes "first" and "second" are used only to distinguish entities in different application scenarios of the embodiments. In the description of the method steps below for a single application scenario, either can equivalently be understood as a touch display screen in the traditional sense, so in the remainder of the text both may simply be referred to as the touch display screen.
  • the second communication module 214 receives the ultrasound image data transmitted from the first communication module 215 and/or transmits a control signal, such as a control signal containing ultrasound imaging parameter setting information, to the first communication module 215.
  • the intelligent controller 270 includes the display controller mentioned in connection with FIG. 1, but may also be a computer device with a touch display screen, such as various smart terminal devices, e.g., an iPad, a mobile phone, and the like.
  • the communication between the first communication module 215 and the second communication module 214 may adopt a wireless data transmission protocol such as the Wi-Fi protocol, the Bluetooth transmission protocol, a mobile communication network protocol, or the like.
  • the ultrasonic medical testing device 200 and the intelligent controller 270 constitute an ultrasonic imaging system.
  • the device disclosed in this embodiment is an ultrasonic imaging system whose imaging parameters can be adjusted by touch control; through the graphical user interface on the touch display screen, the user more intuitively and interactively obtains the ultrasound image data and ultrasound imaging parameters, which increases the convenience of operating the ultrasound device and enhances the user experience.
  • the adjustment interface of these parameters is simultaneously displayed on the touch screen and the displayed adjustment interface changes with the sliding of the finger or the stylus.
  • when the adjusted parameter is such a discontinuous parameter, it can only take discontinuous values, so the adjustment interface displayed during the adjustment exhibits a "jumping" phenomenon, which affects the user's experience of adjusting and using the parameters.
  • the ultrasonic parameter selection method provided in this embodiment avoids this problem well: when the user adjusts the parameters by operating on the touch screen, the adjustment identifier moves with the user's touch, providing a better adjustment experience.
  • FIG. 3 is a flow chart showing the ultrasonic imaging control method in the embodiment shown in FIG. 1 or FIG. 2.
  • Figure 4 provides a display embodiment of the graphical user interface of the embodiment of Figure 3.
  • FIG. 5 provides another display embodiment of the graphical interface of the example of FIG. 3, and FIG. 5 differs from FIG. 4 in that the information indicators are arranged in different directions.
  • Figure 6 is a variant embodiment of the second region of the embodiment of Figure 3.
  • In step S210 of FIG. 3, the transmitting circuit and the receiving circuit (103 and 104, or 203 and 204) excite the probe (101, 201) to emit an ultrasonic beam to the detection object according to the set ultrasound imaging parameters, and in step S212 the probe (101, 201) receives the echo of the ultrasonic beam to obtain an ultrasonic echo signal.
  • an ultrasound image is obtained from the ultrasound echo signal by an image processing module: in the embodiment of FIG. 1, by the image processing module (126), and in the embodiment of FIG. 2, by the image processing module (226).
  • a first memory is provided for storing a computer program running on the processor, such as the gesture detecting module 113 described above.
  • a second memory is provided in the intelligent controller 270 of FIG. 2 for storing a computer program running on the processor, such as the gesture detection module 213 described above.
  • the ultrasound image herein may be a different mode of ultrasound image as previously described, such as a B image, a C image, a D image, etc., or other types of two-dimensional ultrasound images or three-dimensional ultrasound images.
  • the ultrasound image mentioned herein may be a static frame image or a dynamic video image.
  • the first processor 140 or the second processor 240 of FIG. 2 establishes a plurality of discrete alternatives based on the ultrasound imaging parameters.
  • the ultrasound imaging parameter in this embodiment may be one of TGC (time gain compensation), acoustic frequency, pulse repetition frequency (PRF), ultrasound type, and dynamic range.
  • multiple discrete alternatives obtained for TGC can be 1.5, 2.5, 3.5, ..., and so on, respectively.
  • the first processor or the second processor 240 of FIG. 2 displays a plurality of information indicators 305 in a preset sequence interval on the first touch display screen 130 or the second touch display screen 230.
  • an information indicator 305 is associated with a discrete alternative.
  • a plurality of information indicators 305 can be placed within region 301 (shown in Figure 4). Here the spacing arrangement can be equally spaced.
  • the information indicator 305 may be presented as interface text, as a drawn scale mark, or in a similar manner.
  • the preset arrangement order in this embodiment may be arranged in a straight line, arranged along a curve or arranged in a circle, or the like.
  • a plurality of information indicators 305 corresponding to a plurality of discrete alternatives of the TGC are displayed along a line in a graphical user interface of the touch display.
  • a plurality of information indicators 305 are presented on the graphical user interface in a scale prompt and text prompts to alert the user to information of discrete alternatives, such as numerical values.
  • a plurality of information indicators 305 are displayed within an area 301 on the graphical user interface, which may be any pixel area on the graphical user interface, for example a pixel area located adjacent to the ultrasound image display area or within the ultrasound image display area. Of course, the area 301 can also be superimposed on the ultrasound image for display.
  • the graphical user interface includes at least two layers of interface layers, and the ultrasound image is displayed on the first interface layer of the touch screen display, and the transparent setting is superimposed on the first interface layer. Two interface layers, and a plurality of information indicators 305 are disposed on the second interface layer.
  • a floating window is displayed on the touch display screen that can be moved to any one of the pixel area locations on the graphical user interface.
  • the above-mentioned area 301 is disposed in the floating window, and a plurality of information indicators 305 are disposed in the area 301, so that the user can visually understand the plurality of discrete alternatives of the relevant ultrasound imaging parameters.
  • the above floating window is displayed on the touch display screen based on the received adjustment trigger signal from the user.
  • Another embodiment of the graphical user interface is shown in FIG. 5.
  • a graphical user interface 400 is displayed on the touch display screen, and an ultrasound image is displayed within area 402 on the interface.
  • a plurality of information indicators 415 are displayed in a predetermined order within the area 416, one information indicator 415 being associated with a discrete alternative.
  • Region 416 can be any one of the pixel regions on the graphical user interface, for example, can be located adjacent to the ultrasound image display region, or as a pixel region within the ultrasound image display region.
  • the information indicator 415 may be presented as interface text, as a drawn scale mark, or in a similar manner.
  • a plurality of display areas are arranged along the direction in which the information indicators are arranged, with adjacent display areas spaced apart; for example, the plurality of display areas may be arranged at equal intervals.
  • each display area is used to display an information indicator.
  • the display area refers to a position on the graphical user interface corresponding to an information indicator. When the indication identifier is located in a display area, the information indicator corresponding to that display area is selected. Each display area may cover one or more pixel positions.
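  • As a sketch of how such display areas might be laid out and queried (the pixel values and helper names below are assumptions): each discrete alternative gets one equally spaced area, and the alternative whose area contains the indication identifier is the selected one.

```python
# Sketch with assumed pixel units: equally spaced display areas along the
# arrangement direction, one per information indicator / discrete alternative.
def layout_display_areas(start_px, spacing_px, alternatives):
    """Return (area_start, area_end, alternative) for each information indicator."""
    areas = []
    for i, alt in enumerate(alternatives):
        lo = start_px + i * spacing_px
        areas.append((lo, lo + spacing_px, alt))
    return areas


def selected_alternative(identifier_px, areas):
    """The alternative whose display area currently contains the indication identifier."""
    for lo, hi, alt in areas:
        if lo <= identifier_px < hi:
            return alt
    return None


areas = layout_display_areas(start_px=100, spacing_px=40,
                             alternatives=[20, 40, 60, 80, 100])
print(selected_alternative(215, areas))  # -> 60
```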
  • step S220 of FIG. 3 the first processor 140 or the second processor 240 of FIG. 2 displays an indication flag on a touch display screen such as the first touch display screen 130 or the second touch display screen 230.
  • the first processor 140 or the second processor 240 displays an indication flag (412, 302) within the predetermined area (301, 416).
  • the indication identifier is indicated by a black square and arrow; of course, other shapes such as squares, triangles, and the like may also be used, for example the indication identifier 712 is represented by the hollow frame shown in FIG. 7.
  • the indication flag (412, 302) may be arranged along the information indicator.
  • the predetermined area here may be the area 301 or the area 416, that is, an area in which a plurality of information indicators are displayed.
  • the operation position on the interface mentioned herein refers to the position on the display interface when the user inputs the operation of the interface object (for example, the indication mark) by using the human-machine interaction device.
  • the "location” referred to herein includes orientation information, coordinate information, and/or angle information, etc., for example, regarding the display position of the indication identifier on the graphical user interface, may be characterized by coordinate information indicating the pixel point where the identification is located, It can also be characterized by a positional mark taken along the direction in which the information indicators are arranged.
  • the indicator flag can also be placed above the second interface layer.
  • the area 416 can also be placed in the floating window together with the area 301, and the indicator is also displayed in the floating window.
  • the first processor 140 or the second processor 240 in FIG. 2 may record the correspondence between the display area (309 or 418) and the discrete alternative, for example by storing this correspondence in the first memory. Each discrete alternative is associated with one display area (309 or 418) on the touch display (300 or 400). The correspondence may be recorded as the pixel range of the display area (309 or 418) on the touch display screen together with the value of the corresponding discrete alternative, which makes subsequent lookup based on user input quick and convenient.
  • In step S224 of FIG. 3, the first processor 140 or the second processor 240 of FIG. 2 invokes the gesture detection module (113, 213) to monitor the contact of the input object on the touch display screen (e.g., on the first touch display screen 130 or the second touch display screen 230).
  • the predetermined area herein may include a manipulable movement area of the indication identifier, or a manipulable movement area of the plurality of information indicators; for example, the predetermined area may be the area 301 or the area 416, may be an interface area containing the area 301 or the area 416, or may be an area other than the area 301 or the area 416.
  • an ultrasound image 606 is displayed on the graphical user interface 600, a plurality of information indicators 601 are displayed in spaced-apart intervals within the region 612, and an indicator 602 is displayed within the region 612, at region 612.
  • a plurality of display areas 613 are also disposed therein, and each display area 613 correspondingly displays an information indicator 601.
  • the first processor 140 or the second processor 240 in FIG. 2 invokes the gesture detection module (113, 213) to monitor the contact between the input object 604 and the touch display screen (130 or 230) in the third area 608.
  • the third area 608 is disposed in an area outside the area 612.
  • the touch operation in the third area 608 affects the relative positional relationship between the plurality of information indicators 601 and the indication marks 602 in the area 612.
  • One touch operation includes one contact of the input object on the touch display screen and release of the contact.
  • the method further includes:
  • before moving the display of the indication identifier or of the plurality of information indicators according to the contact, the first processor 140 or the second processor 240 of FIG. 2 may, according to the initial contact of the input object with the touch display screen, enlarge the relevant part of the plurality of information indicators to show more detail, allowing the user to more accurately locate the particular discrete value to which the ultrasound imaging parameter is to be adjusted.
  • an ultrasound image 702 is displayed on the graphical user interface 700
  • a plurality of information indicators 715 are arranged on the graphical user interface 700 at circular intervals, and an information indicator is displayed in each of the plurality of spaced-apart display areas 718.
  • the first processor 140 or the second processor 240 in FIG. 2 recognizes that the contact of the input object 703 on the touch display screen corresponds to a first operation position on the touch display screen; 7191 in FIG. 7 is the first operation position of the initial contact.
  • the first processor 140 or the second processor 240 in FIG. 2, based on at least a portion of the plurality of information indicators 715 that is found, establishes a plurality of sub-level discrete alternatives between the two discrete alternatives corresponding to two adjacent information indicators. For example, when the initial contact position 7191 falls between the information indicators "35" and "40", sub-level discrete alternatives are expanded between the two discrete alternatives "35" and "40" corresponding to these two information indicators. After the input object 703 contacts the touch display screen, the plurality of information indicators 715 within the boxed area on the interface 700 transition to the display within the dashed box 721 pointed to by the indication line 720.
  • for example, the sub-level discrete alternatives "37" and "39" are expanded between "35" and "40". Similarly, sub-level discrete alternatives can be expanded in sequence between the remaining information indicators 715.
  • the information indicators of the plurality of sub-levels are displayed on the touch display screen in a preset order, each sub-level information indicator being associated with one sub-level discrete alternative. In this way, the discrete alternatives originally indicated by the plurality of information indicators are locally enlarged as a result of the contact of the input object with the touch display.
  • whether such enlargement is desired can be judged by the contact time: if the contact time of the input object with the touch display screen at the same operation position exceeds a certain threshold, it is considered that the user wishes to enlarge at least a part of the plurality of information indicators, which triggers the enlargement effect shown in FIG. 7; conversely, if the contact time at the same operation position does not exceed the threshold, no such wish is recognized, and the display of the dashed box 721 shown in FIG. 7 is not produced.
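  • A hedged sketch of this long-press zoom logic follows (the threshold value and the even spacing of the sub-levels are assumptions; the embodiment only speaks of "a certain threshold" and of sub-level alternatives such as "37" and "39"):

```python
# Sketch: expand sub-level discrete alternatives between two adjacent alternatives
# when the contact dwells at the same operation position beyond a threshold.
LONG_PRESS_S = 0.5  # assumed threshold; the embodiment only says "a certain threshold"


def expand_sublevels(lower, upper, steps=4):
    """Insert evenly spaced sub-level alternatives between two adjacent alternatives."""
    step = (upper - lower) / (steps + 1)
    return [round(lower + i * step, 2) for i in range(1, steps + 1)]


def maybe_zoom(contact_seconds, lower, upper):
    if contact_seconds > LONG_PRESS_S:
        return expand_sublevels(lower, upper)  # e.g. between "35" and "40"
    return []                                  # no enlargement; box 721 is not shown


print(maybe_zoom(0.8, 35, 40))  # -> [36.0, 37.0, 38.0, 39.0]
```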
  • In step S226 of FIG. 3, the first processor 140 or the second processor 240 of FIG. 2, based on the contact of the input object monitored on the touch display screen (e.g., the first touch display screen 130 or the second touch display screen 230), moves the display of the indication identifier or of the plurality of information indicators, so that the positional relationship between the indication identifier and the plurality of information indicators changes.
  • the first processor 140 or the second processor 240 of FIG. 2 tracks the motion of the contact, determines the second operation position on the touch display screen associated with the motion of the contact, and, according to the change of the second operation position, moves the display of the indication identifier or of the plurality of information indicators, so that the positional relationship between the indication identifier and the plurality of information indicators changes.
  • the first processor 140 or the second processor 240 of FIG. 2 determines the above-mentioned operation position associated with the contact on the touch display screen (130 or 230), i.e., the second operation position.
  • the second operational location may be located within region 301, 416, or 612, or within region 608 (i.e., an interface region 608 disposed on the graphical user interface other than region 612 in FIG. 6).
  • the input object 604 does not slide directly in the area 612, but instead slides in a control area (ie, the third area 608) on the touch screen (130, or 230).
  • the indication identifier 602 slides correspondingly in the area 612; that is, the movement of the input object 604 in the control area 608 controls the sliding of the indication identifier 602 in the area 612, thereby achieving adjustment of the ultrasound imaging parameters.
  • depending on the contact of the input object 604 in the third area 608, a second operation position 615 corresponding to the area 612 can be obtained.
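  • A minimal sketch of this indirect control idea is given below (the coordinate ranges and the linear mapping are assumptions; the embodiment only requires that contact in the third area produces a corresponding operation position in the indicator area):

```python
# Sketch with assumed pixel ranges: a contact position in the separate control area
# (the "third area") is mapped proportionally to an operation position inside the
# area that holds the information indicators.
def map_control_to_indicator_area(y_in_control, control_top, control_bottom,
                                  area_top, area_bottom):
    """Linearly map a contact Y position in the control area into the indicator area."""
    t = (y_in_control - control_top) / (control_bottom - control_top)
    t = min(max(t, 0.0), 1.0)  # clamp so the identifier never leaves the area
    return area_top + t * (area_bottom - area_top)


# A drag at y=450 in a control area spanning 400..600 lands 25% of the way down
# an indicator area spanning 100..300, i.e. at y=150.
print(map_control_to_indicator_area(450, 400, 600, 100, 300))
```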
  • an operation position corresponding to the area 301 can be obtained according to the sliding contact, thereby obtaining the second operation positions (3111, 3112).
  • In step S226, the first processor or the second processor 240 in FIG. 2 invokes the gesture detection module (113, 213) to track the monitored contact between the input object and the touch display screen.
  • In the above-mentioned step S226, the continuous contact of the input object on the touch display screen (130 or 230) is monitored; through this continuous contact, the first processor 140 or the second processor 240 in FIG. 2 can identify, by means of the gesture detection module, a series of continuously varying positions on the touch display screen (130 or 230). Then, in the embodiments shown in FIG. 4 and FIG. 5, the first processor or the second processor 240 in FIG. 2 determines the plurality of operation positions on the touch display screen (130 or 230) associated with the contact, thereby obtaining a plurality of continuously changing second operation positions.
  • the continuously changing second operational positions may be arranged in a sequence along the direction in which the information indicators are arranged.
  • the first processor 140 or the second processor 240 in FIG. 2 moves the display indication identifier or the plurality of information indicators according to the change of the second operation position, thereby causing the position between the indication identifier and the plurality of information indicators The relationship has changed.
  • Regarding moving the display of the indication identifier or of the plurality of information indicators: as shown in FIG. 4, when the contact of the input object 308 with the touch display screen moves from the operation position 3111 to the operation position 3112, the display of the indication identifier 302 moves from the operation position 3111 to the operation position 3112. As shown in FIG. 8, the display of the plurality of information indicators 715 is moved from the first operation position 7191 to the second operation position 7192, i.e., the display result shown by the dashed box 732 pointed to by the indication line 730 in FIG. 8. Compared with the display result in the box 731 in FIG. 7, the plurality of information indicators 715, which are arranged in a circle, are deflected clockwise by an angle within the dashed box 732, so that the indication identifier 712 changes from facing the information indicator "20" to pointing between the information indicators "75" and "80".
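  • For the circular arrangement, the clockwise deflection can be thought of as an angular offset applied to every information indicator, as in the sketch below (screen coordinates with y growing downward and the helper names are assumptions, not the embodiment's actual computation):

```python
# Hedged sketch for the circular arrangement: the angle swept between two operation
# positions around the circle centre is applied to all information indicators, so
# they rotate together while the indication identifier stays fixed.
import math


def deflection_angle(center, p_from, p_to):
    """Angle (radians) swept from p_from to p_to around center; positive is
    clockwise in screen coordinates (y grows downward)."""
    a0 = math.atan2(p_from[1] - center[1], p_from[0] - center[0])
    a1 = math.atan2(p_to[1] - center[1], p_to[0] - center[0])
    return a1 - a0


def rotate_indicators(angles, delta):
    """Shift the angular position of every indicator by the same deflection."""
    return [a + delta for a in angles]


center = (200.0, 200.0)
delta = deflection_angle(center, (260.0, 200.0), (200.0, 260.0))
print(round(math.degrees(delta), 1))          # -> 90.0 degrees of clockwise deflection
print(rotate_indicators([0.0, 0.79], delta))  # new angular positions of two indicators
```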
  • Regarding the manner of updating the display: updating means deleting the display of the indication identifier at its original position and redisplaying it at a position related to the second operation position, so that the indication identifier or the plurality of information indicators change in response to changes in the second operation position.
  • the meaning of "change” in this document can also be understood as changing, transforming, or replacing the display position of an interface object on an interface.
  • the above steps S224 to S226 include: the first processor 140 or the second processor 240 in FIG. 2 invokes the gesture detection module to monitor the continuous contact of the input object on the touch display screen (130 or 230), determines the plurality of operation positions corresponding to the continuous contact to obtain a plurality of continuously changing second operation positions, and sequentially moves the display of the indication identifier to the plurality of continuously changing second operation positions, so that the indication identifier changes following the change of the second operation position. As shown in FIG. 4, the first processor monitors the continuous contact by invoking the gesture detection module and obtains a set of continuously changing second operation positions (3111, 3112), which change sequentially along the direction in which the information indicators are arranged and may cross at least one display area 309 in that direction. According to the identified plurality of continuously changing second operation positions (3111, 3112), the display of the indication identifier 302 is sequentially moved to these positions, so that the indication identifier changes as the second operation position changes.
  • updating the display of the indication identifier at the second operation position comprises: the first processor or the second processor 240 in FIG. 2 invoking the gesture detection module to identify the change of the second operation position along the direction in which the information indicators are arranged, and, according to that change, sequentially updating the position of the indication identifier along the arrangement direction of the information indicators to the position of the second operation position in that direction, thereby moving the display of the indication identifier following the change of the second operation position described above.
  • For example, the plurality of information indicators 305 in FIG. 4 are arranged in the vertical direction (i.e., the Y direction). The positions 3111 and 3112 may be represented as (x1, y1) and (x2, y2), respectively; the change of the second operation position in the vertical (Y) direction is identified (y1 to y2), and, according to the identified change (y1 to y2), the position of the indication identifier (302) in the vertical (Y) direction is updated to the changed vertical position y2 of the second operation position, i.e., the position of the indication identifier in the Cartesian coordinate system becomes (x1, y2).
  • the change in the second operational position may be only a change in the Y direction, that is, the position 3111 and the position 3112 may be expressed as (x1, y1), (x1, y2), respectively.
  • Here x and y may represent not only coordinate position values but also coordinate ranges, or the coordinates of the center point of a range.
  • In this way, the positional change of the second operation position in the horizontal direction (i.e., the X direction) can be ignored, and the movement of the indication identifier can be restricted so that, as the input object moves, the indication identifier moves only along the arrangement direction of the information indicators to an operation position within the predetermined area.
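  • A one-function sketch of this axis restriction follows (the "axis" argument and function name are illustrative assumptions):

```python
# Minimal sketch: when the information indicators are arranged vertically, only the
# Y component of the second operation position updates the indication identifier,
# so horizontal jitter of the finger is ignored.
def update_identifier_position(identifier_xy, operation_xy, axis="y"):
    x_id, y_id = identifier_xy
    x_op, y_op = operation_xy
    if axis == "y":
        return (x_id, y_op)  # e.g. (x1, y1) -> (x1, y2)
    return (x_op, y_id)      # for a horizontally arranged row of indicators


print(update_identifier_position((50, 120), (63, 180)))  # -> (50, 180)
```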
  • the method may further include:
  • the first processor 140 or the second processor 240 in FIG. 2 invokes the gesture detection module to detect whether the contact of the input object with the touch display screen (130 or 230) is within the display area where the indication identifier is located. When the contact of the input object with the touch display screen (130 or 230) is within the display area where the indication identifier is located, the processes of the above steps S226 to S234 are performed. Conversely, when the contact of the input object with the touch display screen (130 or 230) is not within the display area where the indication identifier is located, the display position of the indication identifier is not updated and the processes of the above steps S226 to S234 are not performed. For a description of the processes from step S226 to step S234, refer to the related description above.
  • By tracking the contact between the input object and the touch display screen (130 or 230) in this way, the input operation on the indication identifier is confirmed, which ensures the accuracy of the control signal input and the reliability of the selection of the discrete alternative.
  • the first processor or the second processor 240 in FIG. 2 invokes the gesture detection module to detect whether the contact of the input object 403 with the touch display screen (130 or 230) is within the operation position where the indication identifier 412 is located in the area 416, that is, whether the operation position 411 of the contact of the input object 403 on the touch display screen (130 or 230) coincides with the operation position where the indication identifier 412 is located. If so, the processes of the above steps S226 to S230 are performed, and the movement of the contact of the input object 403 with the touch display screen (130 or 230) in the direction 404 or the direction 405 begins to be tracked. Otherwise, the display position of the indication identifier is not updated according to the detected contact of the input object 403 with the touch display screen (130 or 230), that is, the processes of the above steps S226 to S234 are not performed.
  • the indication identifier may change its display position according to the continuous contact of the input object with the touch display screen; the display of the indication identifier is updated to the second operation position.
  • In addition, a moving speed of the indication identifier between two operation positions on the graphical user interface may be calculated as a visual display moving speed, and the displayed movement of the indication identifier between the two operation positions may be adjusted based on this moving speed to present a continuous display movement effect.
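  • The continuous-movement effect can be sketched as a simple interpolation between two operation positions at a chosen display speed (the speed value and function name are assumptions):

```python
# Minimal sketch: step the identifier between two operation positions at a fixed
# display speed so it appears to glide rather than jump.
def interpolate_positions(start, end, speed_px_per_frame):
    """Return intermediate positions for a continuous display movement effect."""
    pos, positions = start, []
    direction = 1 if end >= start else -1
    while abs(end - pos) > speed_px_per_frame:
        pos += direction * speed_px_per_frame
        positions.append(pos)
    positions.append(end)
    return positions


print(interpolate_positions(120, 180, 15))  # -> [135, 150, 165, 180]
```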
  • Of course, what is manipulated may not be the indication identifier but the plurality of information indicators. As shown in FIG. 8, the first processor 140 or the second processor 240 in FIG. 2 invokes the gesture detection module to identify the direction of change of the second operation position generated by the contact of the input object 703 with the touch display screen, and moves the plurality of information indicators in that direction according to the detected change direction. For example, in FIG. 8 it is detected that the second operation position changes from 7191 to 7192, the change direction being a clockwise movement; the plurality of information indicators 715 are therefore moved clockwise, so that the relative positional relationship between the indication identifier 712 and the plurality of information indicators changes.
  • step S228 of FIG. 3 the first processor 140 of FIG. 1 or the second processor 240 of FIG. 2 invokes a gesture detection module to detect release of contact of the input object with the touch display screen (130, or 230).
  • the first processor 140 of FIG. 1 or the second processor of FIG. 2 implements detecting the release of the contact at least by one of the following ways.
  • In one manner, the first processor 140 of FIG. 1 or the second processor of FIG. 2 detects that the release of the contact occurs at a third operation position on the touch display screen. For example, the disengagement of the contact between the input object and the touch display screen (130 or 230) is monitored on the touch display screen, and the operation position at which the contact was located before disengagement is taken as the third operation position. As shown in FIG. 5, the input object 403 is in contact with the touch display screen (130 or 230); the processor detects that the contact moves from the operation position 4121 to the operation position 4122, and the display of the indication identifier 412 is therefore continuously updated for continuous movement to the operation position 4122. When the disengagement of the contact between the input object 403 and the touch display screen (130 or 230) is detected at the operation position 4122, the operation position 4122 where the contact was located before the disengagement is recognized as the third operation position.
  • Continuous contact between the input object 703 and the touch display screen (130 or 230) in direction 704 or direction 705 is also monitored in a predetermined area (the area in which the plurality of information indicators are displayed in FIG. 7).
  • the processor detects that the contact association between the two moves from the first operational position 7191 to the second operational position 7192 within the predetermined area, thus also continuously updating the display of the indication identification 712 for continuous movement to the second operational position 7192.
  • When the processor detects that the contact between the input object 703 and the touch display screen (130 or 230) is disengaged, the operation position 7192 where the contact was located before the disengagement is identified as the third operation position.
  • At this time, the relative positional relationship between the plurality of information indicators and the indication identifier jumps from the display result indicated by the dashed box 732 to the display result indicated by the dashed box 733, as indicated by the indication line 740.
  • the "first” or “second” and “third” are used to distinguish the operating positions in different situations for distinguishing in the description
  • the operating position includes the first operating position and the second operating position.
  • the third operation position; the second operation position may include a first operation position, where the first operation position is an operation position generated by the initial contact between the input object 403 and the touch display screen (130, or 230), belonging to the second operation The starting point for the change in position.
  • the predetermined area may be the above area 416 or 301, or the above-described third area (608 in Fig. 6).
  • the gray arrow in FIG. 6 identifies the historical display location of the indicator 602 and the black arrow identifies the current display location of the indicator 602.
  • Continuous contact between the input object 604 and the touch display screen (130 or 230) in direction 603 or direction 605 is monitored within the third area 608, and the processor detects that the operation position associated with this contact within the area 612 moves from the operation position 615 to the operation position 616, thus also continuously updating the display of the indication identifier 602 for continuous movement to the operation position 616.
  • When the processor detects that the contact between the input object 604 and the touch display screen (130 or 230) moves outside the third area 608, the last operation position 616 of the contact within the area 612 is identified as the third operation position.
  • In step S230 of FIG. 3, upon release of the contact, the first processor or the second processor 240 in FIG. 2 adjusts the relative position between the indication identifier and the plurality of information indicators according to the relative positional relationship between them, so that the indication identifier faces one of the plurality of information indicators.
  • In some embodiments, the above step S230 is implemented in the following manner: the first processor or the second processor 240 in FIG. 2 sets a plurality of spaced display areas on the touch display screen, each display area being used to display one information indicator; recognizes that the release of the contact of the input object with the touch display screen produces a third operation position on the touch display screen; finds the display area associated with the third operation position; and displays the indication identifier in the found display area, so that the indication identifier faces one of the plurality of information indicators. As shown in FIG. 5, a plurality of spaced-apart display areas 418 are disposed on the touch display screen, each display area 418 being configured to display an information indicator 415. The release of the contact between the input object 403 and the touch display screen produces a third operation position 4122 on the screen; the display area 4181 associated with the third operation position 4122 is found, and the indication identifier 412 is displayed in the found display area 4181. It can thus be seen that, when the contact of the input object 403 with the touch display screen is released, the indication identifier 412 jumps from a position between the information indicator "60" and the information indicator "80" to the position directly facing the information indicator "80". As shown in FIG. 6, a plurality of spaced-apart display areas 613 are disposed on the touch display screen.
  • Each display area 613 is configured to display an information indicator 601, and the release of the contact between the input object 604 and the touch display screen likewise produces a third operation position.
  • In step S230, when searching for the display area associated with the third operation position, the first processor or the second processor in FIG. 2 may proceed in one of the following manners.
  • In one manner, the display area 4181 closest to the third operation position 4122 is searched for, and the display area closest to the third operation position is taken as the display area associated with the third operation position.
  • In another manner, the display area 6131 that the display of the indication identifier finally traversed as a result of the contact of the input object 604 in the third area 608 is looked up; that is, based on the third operation position, the display area last crossed in the area 612 by the contact of the input object 604 with the touch display screen is found.
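  • A sketch of the snap-on-release behaviour described by the first manner follows (the area centres and values are illustrative assumptions): the display area whose centre is closest to the third operation position wins, so the identifier always ends up facing exactly one information indicator.

```python
# Sketch: on release of the contact, snap to the display area closest to the third
# operation position.
def snap_to_nearest_area(release_y, area_centers):
    """area_centers maps each discrete alternative to the centre Y of its display area."""
    alt, center_y = min(area_centers.items(), key=lambda kv: abs(kv[1] - release_y))
    return alt, center_y


area_centers = {20: 120, 40: 160, 60: 200, 80: 240, 100: 280}
print(snap_to_nearest_area(232, area_centers))  # -> (80, 240): identifier faces "80"
```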
As another example, in one embodiment the above step S230 is implemented in the following manner: the first processor, or the second processor 240 in FIG. 2, sets a plurality of spaced display areas on the touch display screen, each display area being used to display one information indicator; recognizes the third operation position on the touch display screen at which the contact between the input object and the touch display screen is released; finds one of the plurality of information indicators located near the indication identifier; and displays that information indicator at the position directly facing the indication identifier. For example, as shown by the dashed box 733 in FIG. 8, a plurality of spaced display areas 718 are provided on the touch display screen, each display area 718 displaying one information indicator 715. The release of the contact between the input object 703 and the touch display screen is recognized as occurring at the third operation position 7192; one of the information indicators near the indication identifier 712, such as "80" or "75", is found and displayed at the position directly facing the indication identifier 712. It can thus be seen that when the contact of the input object 703 with the touch display screen is released, the display jumps from the state at the moment of release, in which the indication identifier 712 points between the information indicators "75" and "80", to the state in which the indication identifier 712 directly faces the information indicator "80".
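For the circular arrangement of FIG. 8 it is the ring of information indicators that moves while the indication identifier stays fixed; on release, the ring is rotated so the nearest indicator faces the pointer. Below is a minimal sketch of that snapping rotation, assuming evenly spaced indicators and a fixed pointer at angle 0; the function name and the dial values are invented for illustration only.

```python
def snap_rotation(labels, current_rotation_deg):
    """Rotate the ring of indicators so the nearest one faces the fixed pointer at angle 0.

    labels               -- indicator texts laid out clockwise at equal angular steps
    current_rotation_deg -- rotation of the ring at the moment the contact is released
    """
    step = 360.0 / len(labels)
    nearest_index = round(-current_rotation_deg / step) % len(labels)
    snapped_rotation = -nearest_index * step
    return snapped_rotation, labels[nearest_index]

# A dial of values 0, 5, 10, ..., 355 (illustrative values only).
labels = [str(v) for v in range(0, 360, 5)]
rotation, label = snap_rotation(labels, current_rotation_deg=-78.0)
print(rotation, label)   # -> -80.0 80: the ring jumps so that "80" faces the pointer
```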
In step S232 of FIG. 3, the first processor, or the second processor of FIG. 2, determines the discrete candidate associated with the information indicator that the indication identifier directly faces. In the embodiment of FIG. 5, the indication identifier directly faces the information indicator "80", so the discrete candidate is determined to be the scale "80"; in the embodiment of FIG. 6, the indication identifier directly faces the information indicator "12", so the discrete candidate is determined to be the scale "12". Each processor may determine this discrete candidate by looking up the recorded relationship between the display area occupied by the information indicator and the discrete candidate.
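That lookup can be as simple as a table recorded when the display areas are laid out. The sketch below assumes a plain dictionary keyed by a hypothetical display-area id; this is one possible structure, not one prescribed by the patent.

```python
# Association recorded when the display areas are laid out (cf. step S220):
# display-area id -> discrete candidate of the ultrasound imaging parameter.
area_to_candidate = {
    "area_60": 60,
    "area_80": 80,
    "area_100": 100,
}

def candidate_for_area(area_id: str) -> int:
    """Discrete candidate associated with the indicator that the identifier ends up facing."""
    return area_to_candidate[area_id]

print(candidate_for_area("area_80"))   # -> 80, i.e. the scale "80" of the FIG. 5 example
```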
Then, in step S234 of FIG. 3, the first processor, or the second processor in FIG. 2, adjusts the parameter value of the corresponding ultrasound imaging parameter according to the determined discrete candidate associated with the information indicator that the indication identifier directly faces, and uses the adjusted parameter value to obtain the aforementioned ultrasound image. Based on the determined discrete candidate, the processor resets the ultrasound imaging parameter used for the ultrasound scan.
In addition, in the embodiment shown in FIG. 2 for example, the second processor obtains the parameter value of the ultrasound imaging parameter according to the determined discrete candidate associated with the information indicator that the indication identifier directly faces; the intelligent controller 270 then generates a control signal containing that parameter value and outputs it through the second communication module 214 to the first communication module 215, so that the first processor controls the ultrasonic scanning of the target tissue and the formation of the ultrasound image, thereby updating the displayed ultrasound image. The ultrasound imaging parameters required for the aforementioned ultrasound image can thus be obtained from the above control signal.
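The patent does not fix a wire format for this control signal, so the sketch below only illustrates how the intelligent-controller side could serialize the chosen parameter value and hand it to a transport; the JSON layout, the function names, and the plain TCP socket standing in for the wifi/bluetooth link between the second and first communication modules are all assumptions.

```python
import json
import socket

def build_control_signal(parameter: str, value: float) -> bytes:
    """Serialize one ultrasound imaging parameter update (hypothetical message layout)."""
    message = {"type": "set_parameter", "parameter": parameter, "value": value}
    return json.dumps(message).encode("utf-8")

def send_to_first_communication_module(payload: bytes, host: str, port: int) -> None:
    """Hand the control signal to the transport; a plain TCP socket stands in here for the
    wifi/bluetooth link between the second and first communication modules."""
    with socket.create_connection((host, port), timeout=2.0) as conn:
        conn.sendall(payload)

payload = build_control_signal("TGC", 80)
# send_to_first_communication_module(payload, host="192.0.2.10", port=7000)  # example address only
```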
FIG. 3 provides only one possible order of execution among the steps; various modifications may be obtained by adjusting the order of the steps in FIG. 3, and the above steps are not limited to being performed only in the order shown. Provided the basic logic is satisfied, the steps may be interchanged, their execution order may be changed, and one or more of them may be executed repeatedly before the last one or more steps are performed; all such schemes are variations derived from the embodiments provided herein.
The technical solution of the present invention, in essence or in the part that contributes to the prior art, may be embodied in the form of a software product carried on a non-volatile computer-readable storage medium (e.g., ROM, magnetic disk, optical disc, hard disk, or server cloud space), comprising a number of instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, etc.) to execute the system structures and methods of the various embodiments of the present invention. For example, a computer-readable storage medium has a computer program stored thereon which, when executed by a processor, can at least be used to implement the various embodiments based on the flow of steps S216 through S234 of FIG. 3 mentioned above.
When a discontinuous parameter is adjusted according to the present invention, the finger or stylus presses the parameter adjustment identifier on the touch display screen and slides it, and the identifier keeps following the finger or stylus throughout the slide. After the finger or stylus is released, the parameter level or gear closest to the current position of the identifier is found automatically and the parameter is automatically adjusted to that level or gear. In this way, the visual feedback on the display interface during parameter adjustment is improved, parameter adjustment is made more convenient, and the user experience is improved.
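Putting the pieces together, the press-follow-release behavior described above can be sketched as a small controller with three touch callbacks; the class and callback names are invented and the snapping levels are example values only.

```python
class SliderController:
    """Drag-and-snap behavior of the parameter adjustment identifier (illustrative only)."""

    def __init__(self, levels):
        self.levels = levels             # pixel positions of the parameter levels/gears
        self.indicator_pos = levels[0]
        self.dragging = False

    def on_touch_down(self, pos):
        # Only start dragging when the contact lands on the identifier itself.
        self.dragging = abs(pos - self.indicator_pos) < 20

    def on_touch_move(self, pos):
        if self.dragging:
            self.indicator_pos = pos     # identifier keeps following the finger/stylus

    def on_touch_up(self, pos):
        if self.dragging:
            # Snap to the closest level/gear once the finger or stylus is released.
            self.indicator_pos = min(self.levels, key=lambda lv: abs(lv - pos))
            self.dragging = False
        return self.indicator_pos

slider = SliderController(levels=[0, 50, 100, 150, 200])
slider.on_touch_down(2)
slider.on_touch_move(130)
print(slider.on_touch_up(130))   # -> 150, the nearest gear position
```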

Abstract

An ultrasonic medical detection device, an imaging control method, an imaging system, and a controller. The device is configured to establish a plurality of discrete candidates according to an ultrasound imaging parameter; display a plurality of information indicators (305) arranged at intervals in a preset order on a touch display screen (300), each information indicator (305) being associated with one discrete candidate; display an indication identifier (302); detect contact between an input object and the touch display screen (300); adjust the positional relationship between the indication identifier (302) and the plurality of information indicators (305); and reset the ultrasound imaging parameter accordingly. The device improves the convenience of user operation and greatly enhances the user experience.

Description

超声医学检测设备及成像控制方法、成像系统、控制器 技术领域
本发明涉及带有触摸显示屏的超声成像控制方法及成像系统。
背景技术
超声成像系统在对成像目标进行成像的过程中,通常需要对一些成像参数进行调节,以获得期望的图像。操作者可以通过超声成像系统的控制面板或者通过触摸屏调节这些参数。控制面板上提供相关成像参数的调节按钮等控件,当操作者通过触摸屏调节这些参数时,通常在触摸屏上显示参数调节按键,操作者通常用手指或者触笔等设备在触摸屏上触碰这些按键,从而使参数调节到期望的水平。
发明内容
基于此,有必要针对现有技术中存在的操作不便问题,提供一种超声医学检测设备及成像控制方法、成像系统、控制器。
在其中一个实施例中,提供了一种超声医学检测设备,所述设备包括:
探头;
发射电路和接收电路,用于激励所述探头向检测对象发射超声波束,并接收所述超声波束的回波,获得超声回波信号;
图像处理模块,用于根据所述超声回波信号获得超声图像;
触摸显示屏;
第一存储器,所述第一存储器存储处理器上运行的计算机程序;和,
第一处理器,所述第一处理器执行所述程序时实现以下步骤:
根据超声成像参数建立多个离散备选项,
在触摸显示屏上按预置顺序间隔排列显示多个信息指示符,一个信息指示符与一个离散备选项关联对应,
在触摸显示屏上显示指示标识,
监测输入对象在所述触摸显示屏上的接触,
根据所述接触移动所述指示标识或所述多个信息指示符的显示,使所述指示标识与所述多个信息指示符之间的位置关系发生变化,
检测所述接触的释放,
根据所述接触释放时所述指示标识与所述多个信息指示符之间的相对位置关系,调节所述指示标识和所述多个信息指示符之间的相对位置,使所述指示标识正对所述多个信息指示符中的其中一个信息指示符,
确定与所述指示标识正对的信息指示符所关联对应的离散备选项,和,
依据确定的与所述指示标识正对的信息指示符所关联对应的离散备选项,调整所述超声成像参数对应的参数值,并使用所述参数值获得所述超声图像。
在其中一个实施例中,提供了一种超声成像控制方法,其包括:
依据超声成像参数向检测对象发射超声波束,接收所述超声波束的回波,获得超声回波信号,根据所述超声回波信号获得超声图像;
根据超声成像参数建立多个离散备选项,
在触摸显示屏上按预置顺序间隔排列显示多个信息指示符,一个信息指示符与一个离散备选项关联对应,
在触摸显示屏上显示指示标识,
监测输入对象在所述触摸显示屏上的接触,
根据所述接触移动所述指示标识或所述多个信息指示符的显示,使所述指示标识与所述多个信息指示符之间的位置关系发生变化,
检测所述接触的释放,
根据所述接触释放时所述指示标识与所述多个信息指示符之间的相对位置关系,调节所述指示标识和所述多个信息指示符之间的相对位置,使所述指示标识正对所述多个信息指示符中的其中一个信息指示符,
确定与所述指示标识正对的信息指示符所关联对应的离散备选项,和,
依据确定的与所述指示标识正对的信息指示符所关联对应的离散备选项,调整所述超声成像参数对应的参数值,并使用所述参数值获得所述超声图像。
在其中一个实施例中,提供了一种超声成像系统,所述系统包括:超声医学检测设备和智能控制器;其中,
所述超声医学检测设备包括:
探头;
发射电路和接收电路,用于激励所述探头向检测对象发射超声波束,接收所述超声波束的回波,获得超声回波信号;
图像处理模块,用于根据所述超声回波信号获得超声图像;和,
与图像处理模块电连接的第一通信模块,用于将所述超声图像数据传输 至所述智能控制器,和/或接收所述智能控制器输入的控制信号用以获得所述超声图像所需要的超声成像参数;
所述智能控制器包括:
触摸显示屏,
第二通信模块,用于接收来自所述第一通信模块传送的超声图像数据,和/或向所述第一通信模块发送控制信号;
第二存储器,所述存储器存储处理器上运行的计算机程序;和,
第二处理器,所述第二处理器执行所述程序时实现以下步骤:
根据超声成像参数建立多个离散备选项,
在触摸显示屏上按预置顺序间隔排列显示多个信息指示符,一个信息指示符与一个离散备选项关联对应,
在触摸显示屏上显示指示标识,
监测输入对象在所述触摸显示屏上的接触,
根据所述接触移动所述指示标识或所述多个信息指示符的显示,使所述指示标识与所述多个信息指示符之间的位置关系发生变化,
检测所述接触的释放,
根据所述接触释放时所述指示标识与所述多个信息指示符之间的相对位置关系,调节所述指示标识和所述多个信息指示符之间的相对位置,使所述指示标识正对所述多个信息指示符中的其中一个信息指示符,
确定与所述指示标识正对的信息指示符所关联对应的离散备选项,
依据确定的与所述指示标识正对的信息指示符所关联对应的离散备选项,获得超声成像参数对应的参数值,和,
通过所述第二通信模块输出含有所述参数值的控制信号至所述第一通信模块。
在其中一个实施例中,提供了一种智能控制器,所述智能控制器包括:
触摸显示屏;
第二通信模块,用于接收来自超声医学检测设备传送的超声图像数据,和/或向所述超声医学检测设备发送控制信号;
第二存储器,所述存储器存储处理器上运行的计算机程序;和,
第二处理器,所述第二处理器执行所述程序时实现以下步骤:
根据超声成像参数建立多个离散备选项,
在触摸显示屏上按预置顺序间隔排列显示多个信息指示符,一个信息指 示符与一个离散备选项关联对应,
在触摸显示屏上显示指示标识,
监测输入对象在所述触摸显示屏上的接触,
根据所述接触移动所述指示标识或所述多个信息指示符的显示,使所述指示标识与所述多个信息指示符之间的位置关系发生变化,
检测所述接触的释放,
根据所述接触释放时所述指示标识与所述多个信息指示符之间的相对位置关系,调节所述指示标识和所述多个信息指示符之间的相对位置,使所述指示标识正对所述多个信息指示符中的其中一个信息指示符,
确定与所述指示标识正对的信息指示符所关联对应的离散备选项,
依据确定的与所述指示标识正对的信息指示符所关联对应的离散备选项,获得超声成像参数对应的参数值,和,
通过所述第二通信模块输出含有所述参数值的控制信号。
附图说明
图1为提供了依照一些实施例的超声医学检测设备的系统架构示意图;
图2为提供了依照一些实施例的超声医学检测设备的系统架构示意图;
图3提供了图1或图2所示的本实施例中超声成像控制方法的流程示意图;
图4提供了图3实施例中图形用户界面的一种显示实施例;
图5提供了图3实例中图形界面的另一种显示实施例;
图6为图3实施例中图形用户界面的一种变形实施例;
图7和图8为图3实施例中多个信息指示符呈圆形排列的一种变形实施例。
具体实施方式
下面通过具体实施方式结合附图对本发明作进一步详细说明。其中不同实施方式中类似元件采用了相关联的类似的元件标号。在以下的实施方式中,很多细节描述是为了使得本申请能被更好的理解。然而,本领域技术人员可以毫不费力的认识到,其中部分特征在不同情况下是可以省略的,或者可以由其他元件、材料、方法所替代。在某些情况下,本申请相关的一些操作并没有在说明书中显示或者描述,这是为了避免本申请的核心部分被过多的描 述所淹没,而对于本领域技术人员而言,详细描述这些相关操作并不是必要的,他们根据说明书中的描述以及本领域的一般技术知识即可完整了解相关操作。
另外,说明书中所描述的特点、操作或者特征可以以任意适当的方式结合形成各种实施方式。同时,方法描述中的各步骤或者动作也可以按照本领域技术人员所能显而易见的方式进行顺序调换或调整。因此,说明书和附图中的各种顺序只是为了清楚描述某一个实施例,并不意味着是必须的顺序,除非另有说明其中某个顺序是必须遵循的。
本文中为部件所编序号本身,例如“第一”、“第二”等,仅用于区分所描述的对象,不具有任何顺序或技术含义。而本申请所说“连接”、“联接”,如无特别说明,均包括直接和间接连接(联接)。
图1给出了一个实施例中超声医学检测设备100的结构示意图,具体结构如下所示。图1所示的超声医学检测设备100主要包括:探头101、发射电路103、发射/接收选择开关102、接收电路104、波束合成模块105、信号处理模块116和图像处理模块126。在超声成像过程中,发射电路103将经过延迟聚焦的具有一定幅度和极性的发射脉冲通过发射/接收选择开关102发送到探头101。探头101受发射脉冲的激励,向检测对象(例如,人体或者动物体内的器官、组织、血管等等,图中未示出)发射超声波(可以是平面波、聚焦波或发散波中的任何一种),经一定延时后接收从目标区域反射回来的带有检测对象的信息的超声回波,并将此超声回波重新转换为电信号。接收电路104接收探头101转换生成的电信号,获得超声回波信号,并将这些超声回波信号送入波束合成模块105。波束合成模块105对超声回波信号进行聚焦延时、加权和通道求和等处理,然后将超声回波信号送入信号处理模块116进行相关的信号处理。经过信号处理模块116处理的超声回波信号送入图像处理模块126。图像处理模块126根据用户所需成像模式的不同,对信号进行不同的处理,获得不同模式的超声图像数据,然后经对数压缩、动态范围调整、数字扫描变换等处理形成不同模式的超声图像,如B图像,C图像,D图像等等,或者其他类型的二维超声图像或三维超声图像。上述发射电路和接收电路激励探头根据超声成像参数的设定向检测对象发射超声波束,并接收超声波束的回波,获得超声回波信号,从而获得期望的超声图像数据,用以进行显示,展现检测对象内部的组织结构。本文中提到的超声成 像参数涉及所有在超声组织图像的成像过程中可供用户进行自主选择的参数,例如,TGC(Time Gain Compensate,时间增益补偿),声波频率,脉冲重复频率(pulse recurrence frequency,PRF),超声波类型,和动态范围等等。在其中一个实施例中,在超声成像系统的超声成像参数中至少包含一个不连续的超声成像参数,即这些参数不是连续变化的,而是分为多个水平或者档位,这些水平或者档位之间存在一定的间隔。这些参数在调节时,将只在这些水平和档位处取值,而这些水平或者档位之间的值将不会取值。本文将这些不连续变化的参数称之为离散量或离散值,这些离散值将在图形用户界面上形成多个离散备选项。在本发明的其中一些实施例中,图1中的信号处理模块116和图像处理模块126可以集成在一个主板106上,或者其中的一个或两个以上(本文中以上包括本数)的模块集成在一个处理器/控制器芯片上实现。
获得超声图像可以输出至第一触摸显示屏130进行显示。第一触摸显示屏130通过输入输出接口(输入输出接口可以采用有线通信或者无线通信的方式来实现)与图像处理模块连接实现数据传输。此外,还可以包括第一处理器140以及第一存储器160。第一处理器140调用第一存储器160上记载的计算机程序指令从而将超声图像显示在第一触摸显示屏130上,和/或在触摸显示屏上形成图形用户界面。在其中一个实施例中,在第一触摸显示器130上显示图形用户界面(GUI),并展现诸如前文提到的有关超声图像成像过程中涉及的超声成像参数调节、各种功能按键等图形控件。基于图形用户界面(GUI)可以获得因输入对象在触摸显示器上的操作而产生的对图形控件进行的相应操作的控制指令,这些关于超声成像参数等信息的控制指令可以通过有线或者无线的方式传输给超声医学检测设备100,并用于控制探头、发射电路、接收电路等的工作,用于获得期望得到的超声图像。针对超声图像的显示,例如,超声图像可以分别显示在两个显示器屏幕上,或者在同一个显示器屏幕上进行分屏显示。在触摸显示屏上即可以显示超声图像,也可以显示用户操作指令输入的图形用户界面(GUI)。
基于在触摸显示屏上显示的图形用户界面,第一处理器140可以调用存储器160中存储的手势检测模块113,来检测用户通过输入对象在图形用户界面上执行接触操作而获得的控制指令。在多个实施例中,包含具有带有图形用户界面(GUI)的触摸显示屏、一个或多个处理器、存储器、和存储在存储器中用于执行多种功能的一个或多个模块、程序或指令集,由它们共同实现了基于图形用户界面(GUI)的操控输入检测并获得相关控制指令。在多个实施 例中,这些功能可以包括对检测对象(例如,病人的组织)进行参数调节、信息输入等以获得医疗检测数据、图像浏览、病理数据库构建、检索和维护、病人档案信息构建、显示和管理、病人目录信息构建、显示和管理、等等。用于执行这些模块、程序或指令可以包括在为供一个或多个处理器执行而配置的计算机程序产品中。在本发明的其中一些实施例中,用户主要在触摸显示屏上通过手势输入与图形用户界面进行交互。这里的手势输入可以包括通过直接接触触摸显示屏或接近触摸显示屏使设备可以检测的任何类型的用户手势输入。例如,手势输入可以是用户使用右手或左手的手指(例如,食指、拇指等)、或者可以通过触摸显示屏可检测的输入对象(例如,手写笔、触摸显示屏专用笔)在触摸显示屏上选择一个位置、多个位置、和/或多个连续位置的动作,可以包括类似接触、触摸的释放、触摸的轻拍、长接触、旋转展开等操作动作。这里,长接触对应于在手指、拇指、手写笔等与触摸显示屏保持持续接触状态下沿着预定方向或可变的方向移动手指、拇指、手写笔的一种手势输入,例如,像触摸拖动、轻拂、擦过、滑动、扫掠等那样的手势操作动作。可见,手势输入通过输入对象与触摸显示屏的接触来实现,与触摸显示屏的接触可以包括手指、拇指、或手写笔等直接与触摸显示屏接触,或非直接接触地接近触摸显示屏,而非直接接触地接近触摸显示屏的手势输入是指在接近触摸显示屏的空间位置上的手势操作动作。而上述图形用户界面是指对软件的人机交互、操作逻辑、界面美观的整体设计,其可以包括一个或多个软键盘、以及多个图形控件对象。软键盘可以包括一定数量的图标(或软键)。这可以使用户可以选择软键盘中的一个或多个图标,并因此选择一个或多个相应符号进行输入。手势检测模块113可以检测输入对象与触摸显示屏之间进行交互的手势输入。手势检测模块113包括用于执行与手势输入检测相关的各种操作,譬如,确定是否发生了接触、确定手势输入是否持续输入、确定是否与预定手势对应、确定手势输入所对应的操作位置、确定手势输入对应的操作位置是否移动到相应显示区域的边缘位置、确定手势输入是否已中断(如,接触是否已停止)、确定手势输入的移动并跟踪手势输入的移动轨迹等等各个步骤的各种程序模块。确定手势输入的运动可以包括确定手势输入所对应的操作位置的运动速率(幅度)、运动速度(幅度和方向)、和/或运动加速度(幅度和/或方向的变化)、运动轨迹等等。这些操作可以应用于单个操作位置(例如,一个手指所实现的手势输入)、或多个同时操作位置(例如,“多触摸”,即多个手指所实现的手势输入)。在一些实施例中,手势检测模 块113用于检测触摸显示屏表面上或在接近触摸显示屏的空间位置上的一个或多个输入对象的运动。手势检测模块113存储在存储器上,并通过一个或多个处理器的调用来实现上述手势输入的监测,获得用户的操作输入指令。
当然,在图1所示的实施例中,第一处理器140和第一存储器160可以设置在主板106上,也可以独立于主板106设置,或者与触摸显示屏130集成安装在一起形成独立的显示控制器,即实现超声图像的显示,也可以实现基于超声图像而获得用户输入的控制指令。在其中一个实施例中,图1中的信号处理模块116和/或图像处理模块126,连同第一处理器140可以统一设置在一个或多个处理器上执行超声图像的数据处理,以及上述手势输入的监测和图形用户界面的生成。
图2提供了另一个实施例的结构示意图。如图2所示,超声医学检测设备200包括:探头201、发射电路203、发射/接收选择开关202、接收电路204、波束合成模块205、信号处理模块216和图像处理模块226。在本实施例中,探头201、发射电路203、发射/接收选择开关202、接收电路204、波束合成模块205、信号处理模块216和图像处理模块226所实现的功能和实现方式与图1所示实施例中的探头101、发射电路103、发射/接收选择开关102、接收电路104、波束合成模块105、信号处理模块116和图像处理模块126相同,可参见前文说明在此不再累述。在本发明的其中一些实施例中,图2中的信号处理模块216和图像处理模块226可以集成在一个主板206上,或者其中的一个或两个以上(本文中以上包括本数)的模块集成在一个处理器/控制器芯片上实现。与图1所示实施例不同之处在于,超声医学检测设备200还包括:与图像处理模块226电连接的第一通信模块215,用于将图像处理模块226获得的超声图像数据传输至智能控制器270,和/或接收智能控制器270输入的控制信号用以设置在超声成像过程中使用的超声成像参数。设置超声成像参数的操作包括更新超声成像参数、调整超声成像参数、或初始化超声成像参数的设置等操作。本实施例中的智能控制器270包括:第二触摸显示屏230,第二处理器240,第二存储器260和第二通信模块214。第二存储器260存储第二处理器240上运行的计算机程序,例如手势检测模块213,本实施例中手势检测模块213和图1所示实施例中的手势检测模块113功能相同,在此不再累述。第二触摸显示屏230与第一触摸显示屏130的实现功能相同,但是具体的产品参数可能不相同,冠之“第一”和“第二”仅用于在表述实施例时区分不同的应用场景内的实体,下文关于方法步骤或者描述 单一应用场景时也可等同理解为就是传统意义上的触摸显示屏,因此本文其他地方也可简称为触摸显示屏。第二通信模块214接收来自第一通信模块215传送的超声图像数据,和/或向第一通信模块215发送控制信号,例如含有超声成像参数设置信息的控制信号。智能控制器270包括图1中提到的显示控制器,但是也可以包含诸如各种智能终端设备,例如IPAD、手机等等带有触摸显示屏的计算机设备。第一通信模块215和第二通信模块214的通信方式可以采用wifi协议、蓝牙传输协议、移动通信网络协议等等无线数据传输协议。超声医学检测设备200和智能控制器270构成一个超声成像系统。
基于上述图1或图2所提供的超声医学检测设备(100,200)的结构示意图,以下将结合图1或者图2提供的硬件环境详细描述一下有关超声成像参数的设置方式。
本实施例中公开的设备是一种可通过触摸控制来对超声医学检测设备的成像参数进行调节的超声成像系统,其通过用户与触摸显示屏上的图形用户界面来更加直观与超声成像设备进行交互获得图像数据及超声成像参数,增加了用户操作超声设备的便利性,提升用户体验。当在触摸屏(即触摸显示屏)上用滑动方式调节参数时,在触摸屏上会同时显示这些参数的调节界面并且显示的调节界面会随着手指或者触笔的滑动而变化。当调节的参数是这种不连续的参数时,这些参数只能取不连续的值,因此调节时显示的调节界面会出现“跳动”的现象,影响用户对参数的调节和使用体验。而且,用户调节时难以正好将调节标识精确地定位到参数调节界面上的期望的水平或者档位的位置,不便于用户对参数的调节,使用不便。而本实施例中提供的超声参数选择方式可以很好的避免这一问题的产生,当用户在触摸屏上操作参数的调节时,可以更好的体验到调节标识随用户触摸而移动的调节效果,并不会出现因标识的随意跳动而导致调节定位不准确的问题出现,可以方便用户精确定位到期望的水平或档位位置。因此本实施例中提供了如下图3所示的超声成像参数的控制方式。图3提供了图1或图2所示的本实施例中超声成像控制方法的流程示意图。图4提供了图3实施例中图形用户界面的一种显示实施例。图5提供了图3实例中图形界面的另一种显示实施例,图5和图4的不同之处在于信息指示符的排列方向不同。图6为图3实施例中第二区域的一种变形实施例。
在图3的步骤S210中,发射电路和接收电路(103和104,203和204)激励探头(201,101),根据设定的超声成像参数向检测对象发射超声波束, 并在步骤S212中,激励探头(201,101)接收上述超声波束的回波,获得超声回波信号。
在图3的步骤S214中,利用图像处理模块根据超声回波信号获得超声图像,例如,图1的实施例中通过图像处理模块(126)来根据超声回波信号获得超声图像,图2的实施例中通过图像处理模块(226)来根据超声回波信号获得超声图像。同时图1中的超声医学检测设备内,还提供第一存储器,用于存储处理器上运行的计算机程序,例如上述手势检测模块113。而在图2中的智能控制器270中提供第二存储器,用于存储处理器上运行的计算机程序,例如上述手势检测模块213。本文的超声图像可以是前文所述的不同模式的超声图像,如B图像,C图像,D图像等等,或者其他类型的二维超声图像或三维超声图像。同样的,本文提到的超声图像可以是静态帧图像,也可以是动态视频图像。
在图3的步骤S216中,第一处理器140或者图2中的第二处理器240根据超声成像参数建立多个离散备选项。关于超声成像参数的解释参见前文相关说明。在其中一个实施例中,本实施例中的超声成像参数为TGC(Time Gain Compensate,时间增益补偿),声波频率,脉冲重复频率(pulse recurrence frequency,PRF),超声波类型,和动态范围中的其中之一。例如,针对TGC获得的多个离散备选项,可以分别为1.5,2.5,3.5,……,等等。
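As a concrete illustration of building the discrete candidates in step S216, the sketch below generates a simple list of gear values; the range and step are made-up example values, and only the 1.5, 2.5, 3.5 pattern comes from the TGC example above.

```python
def build_discrete_candidates(start: float, stop: float, step: float) -> list:
    """Discrete candidates for a non-continuous ultrasound imaging parameter."""
    candidates, value = [], start
    while value <= stop + 1e-9:
        candidates.append(round(value, 3))
        value += step
    return candidates

# Example: TGC-like gears 1.5, 2.5, 3.5, ... (illustrative range only)
print(build_discrete_candidates(1.5, 6.5, 1.0))   # -> [1.5, 2.5, 3.5, 4.5, 5.5, 6.5]
```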
在图3的步骤S218中,第一处理器或者图2中的第二处理器240在第一触摸显示屏130或第二触摸显示屏230上按预置顺序间隔排列显示多个信息指示符305(如图4所示),一个信息指示符305与一个离散备选项关联对应。多个信息指示符305可以设置在区域301(如图4所示)内。这里间隔排列可以是等间距排列。
信息指示符305的表现形式可以是界面文本的提示方式,也可以是绘制刻度指示符的提示方式,等等。本实施例中的预置的排列顺序可以是沿直线排列、沿曲线的排列或者沿圆形排列等等。在其中一个实施例中,如图4所示的实施例中,TGC的多个离散备选项对应的多个信息指示符305沿着直线排列显示在触摸显示屏的图形用户界面上。多个信息指示符305以刻度提示符和文本提示的方式展现在图形用户界面上,用于提醒用户离散备选项的信息,例如数值。多个信息指示符305显示在图形用户界面上的区域301内,区域301可以是图形用户界面上的任意一个像素区域,例如,可以位于超声图像显示区域的附近,或者至于超声图像显示区域内的一个像素区域。当然, 区域301也可以叠加显示在超声图像之上,例如,图形用户界面至少包含两层界面层,在触摸显示屏的第一界面层上显示超声图像,在第一界面层的上方叠加透明设置的第二界面层,并将多个信息指示符305设置在第二界面层上。这样的设置方式可以让信息指示符305悬浮于超声图像之上,不遮挡超声图像的显示,并能够令用户观察到因为基于超声成像参数的调节而带来的超声图像的变化。更进一步地,在其中一个实施例中,在触摸显示屏上显示一个悬浮窗,该悬浮窗可以移动到图形用户界面上的任意一个像素区域位置。在悬浮窗内设置上述区域301,并在区域301内设置多个信息指示符305,便于用户可视化了解到相关超声成像参数的多个离散备选项。基于接收到的来自用户的调节触发信号,在触摸显示屏上显示上述悬浮窗。
图5中给出了另一个图形用户界面的实施例。在触摸显示屏上显示图形用户界面400,界面上的区域402内显示超声图像。在区域416内按预置顺序排列显示多个信息指示符415,一个信息指示符415与一个离散备选项关联对应。区域416可以是图形用户界面上的任意一个像素区域,例如,可以位于超声图像显示区域的附近,或者至于超声图像显示区域内的一个像素区域。信息指示符415的表现形式可以是界面文本的提示方式,也可以是绘制刻度指示符的提示方式,等等。
此外,在其中一个实施例中,如图4或者图5所示,多个显示区(309,418)沿信息指示符的排列方向设置,并且相邻显示区之间间隔设置,例如,多个显示区等间距排列。间隔设置的多个显示区中,每个显示区用以显示一个信息指示符。这样的目的使得在指示标识位于显示区时能够准确指示到信息指示符,准确提示用户其选择的离散备选项的刻度信息。显示区是指在图形用户界面上与信息指示符关联对应的位置,当指示标识位于显示区则表征选中了与该显示区关联对应的显示信息指示符,每个显示区可以为1个或多个像素位置。
在图3的步骤S220中,第一处理器140或者图2中的第二处理器240在触摸显示屏(如第一触摸显示屏130或第二触摸显示屏230)上显示指示标识。如图4或图5所示,第一处理器140或者第二处理器240在预定区域(301,416)内显示一个指示标识(412,302)。图4和图5中,指示标识采用黑色方块和箭头来表示,当然还可以采用其他方框、三角标识等等任意形状的标示符,例如采用图7所示的镂空框来表示指示标识712。
如图4或图5所示,指示标识(412,302)可以沿着信息指示符的排列方 向移动到预定区域内的任意一个操作位置,这里的预定区域可以为区域301或者区域416,即显示有多个信息指示符的区域。本文中提到的界面上的操作位置是指用户利用人机交互设备对界面对象(例如指示标识)进行操作输入时对应于显示界面上的位置。本文提到的“位置”包含方位信息、坐标信息、和/或角度信息等等,例如,关于指示标识在图形用户界面上的显示位置,可以用指示标识所在的像素点的坐标信息来表征,也可以用沿信息指示符的排列方向所占的位置标记来表征。指示标识也可以设置在第二界面层之上。同样的,区域416也可以与区域301一同设置在上述悬浮窗内,将指示标识也显示在悬浮窗内。
在上述步骤S220之后还可以包括:第一处理器140或者图2中的第二处理器240关联记录显示区(309或418)与离散备选项的对应关系,例如还可以将记录的显示区(309或418)与离散备选项的对应关系存储在第一存储器中。一个离散备选项关联对应于触摸显示屏(300或400)上的一个显示区(309或418)。记录显示区(309或418)与离散备选项的对应关系的方式,可以是记录显示区(309或418)在触摸显示屏上的像素区域范围,对于离散备选项的值的对应关系,便于后续根据用户的输入进行快速查找。
在图3的步骤S224中,第一处理器140或者图2中的第二处理器240调用手势检测模块(113,213)来监测输入对象在触摸显示屏(如第一触摸显示屏130或第二触摸显示屏230)上的接触。
例如,监测输入对象在预定区域内与触摸显示屏的接触。这里的预定区域可以包括指示标识的可操控移动区域,或者多个信息指示符可操控移动区域,例如,预定区域可以为区域301或区域416;或者也可以为包含区域301或区域416的界面区域,或者还可以为除区域301或区域416之外的区域。
还例如,如图6所示,在图形用户界面600上显示超声图像606,区域612内按照预置顺序间隔排列显示多个信息指示符601,并且在区域612内显示指示标识602,在区域612内还设置有多个显示区613,每个显示区631对应显示一个信息指示符601。其中,在步骤S226中,第一处理器140或者图2中的第二处理器240调用手势检测模块(113,213)来监测输入对象604在第三区域608与触摸显示屏(130,或230)的接触。第三区域608设置在区域612之外的区域。而在第三区域608内的触控操作,会影响区域612中多个信息指示符601和指示标识602之间的相对位置关系。一次触控操作包括输入对象在触摸显示屏上的一次接触及该次接触的释放。
在其中一个实施例中,在上述步骤S224之后还包括:
第一处理器140或者图2中的第二处理器240在根据上述接触移动所述指示标识或所述多个信息指示符的显示之前,根据输入对象与触摸显示屏的初次接触,能够放大关于多个信息指示符的更多细节,便于用户更加精确定位到期望将超声成像参数调整到的某一具体的离散值。
例如图7所示,在图形用户界面700上显示超声图像702,多个信息指示符715沿圆形间隔排列在图形用户界面700上,并且多个间隔排列的显示区718中每个显示区718显示一个信息指示符。第一处理器140或者图2中的第二处理器240识别输入对象703在触摸显示屏上的接触对应在触摸显示屏上的第一操作位置,图7中7191为第一操作位置的初始接触位置,然后,查找与第一操作位置(如初始接触位置)关联的多个信息指示符715中的至少一部分;其次,对查找到的多个信息指示符715中的至少一部分进行放大处理,以便用户可以清晰的了解到信息指示符所指示的离散备选项的更多细节。更进一步地,在其中一个实施例中,如图7所示,第一处理器140或者图2中的第二处理器240基于查找到的多个信息指示符715中的至少一部分,在两个信息指示符对应的两个离散备选项之间建立多个子级的离散备选项。例如,初始接触位置7191选中在信息指示符“35”和“40”之间,那么将放大展开这两个信息指示符所对应的两个离散备选项“35”和“40”之间的多个子级的离散备选项。界面700上的方框区域内的多个信息指示符715在输入对象703接触触摸显示屏后转变为指示线720所指向的虚线框721内的显示结果。“35”和“40”之间将继续展开为多个子级的离散备选项“37”、“39”,此外,其余部分的多个信息指示符715也会顺次展开多个子级的离散备选项。多个子级的信息指示符按预置顺序间隔排列显示在触摸显示屏上,一个子级的信息指示符与一个子级的离散备选项关联对应。因此,原先的多个信息指示符所指代离散备选项将因输入对象与触摸显示屏的接触发生同比例放大显示。当然,为了使产生放大效果的接触诱因区别下面移动指示标识或多个信息指示符的接触诱因,那么可以通过接触的时间来判断,若输入对象与触摸显示屏的接触时间在同一操作位置超过一定阈值,那么认为期望同比放大多个信息指示符的至少一部分,则诱因产生图7所示的放大效果,反之,输入对象与触摸显示屏的接触时间在同一操作位置未超过一定阈值,则不认期望同比放大多个信息指示符的至少一部分,将不导致图7所示的虚线框721的显示结果发生。
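Below is a minimal sketch of the sub-level expansion described above, i.e. inserting finer-grained candidates between two neighboring discrete candidates once the qualifying long contact is detected. The sub-step of 2 reproduces the "37"/"39" example between "35" and "40" from FIG. 7; everything else is assumed.

```python
def expand_sub_levels(candidates, lower, upper, sub_step):
    """Insert sub-level discrete candidates between two neighboring candidates."""
    expanded = []
    for value in candidates:
        expanded.append(value)
        if value == lower:
            sub = value + sub_step
            while sub < upper:
                expanded.append(sub)    # sub-level candidates shown after zooming in
                sub += sub_step
    return expanded

print(expand_sub_levels([30, 35, 40, 45], lower=35, upper=40, sub_step=2))
# -> [30, 35, 37, 39, 40, 45]
```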
在图3的步骤S226中,第一处理器140或者图2中的第二处理器240根据在触摸显示屏(如第一触摸显示屏130或第二触摸显示屏230)上监测到的输入对象的接触,移动上述指示标识或上述多个信息指示符的显示,使指示标识与多个信息指示符之间的位置关系发生变化。
例如,在其中一个实施例中,第一处理器140或者图2中的第二处理器240跟踪监测上述接触的运动,确定上述接触的运动关联在触摸显示屏上的第二操作位置,根据第二操作位置的变化,移动显示指示标识或多个信息指示符,从而使得指示标识与多个信息指示符之间的位置关系发生变化。
第一处理器140或者图2中的第二处理器240确定上述与触摸显示屏(130,或230)接触关联在触摸显示屏上的操作位置,即第二操作位置。第二操作位置可以位于区域301,416,或者612内,也可以位于区域608内(即图6中设置在图形用户界面上除区域612之外的一个界面区域608)。在其中一个实施例中,在图6显示的实例中,输入对象604不是直接在区域612中滑动,而是在触摸屏(130,或230)上一个控制区域(即第三区域608)中滑动。当输入对象604在控制区域608中滑动时,相应的指示标识602对应地在区域612中滑动,即通过输入对象604在控制区域608中的运动控制指示标识602在区域612中滑动,从而实现对超声成像参数的调节。因此,根据输入对象604在第三区域608中的接触,可以获得对应于区域612中的第二操作位置615。
在另一些实施例中,在图4显示的实施例中,输入对象308在区域301中滑动接触时,根据该滑动接触可以得到对应于区域301内的操作位置,从而获得第二操作位置(3111,3112)。
更进一步地,如图4和图5所示,在步骤S226中,第一处理器或者图2中的第二处理器240调用手势检测模块(113,213)来跟踪监测输入对象与触摸显示屏(130,或230)的接触的运动,该接触的运动产生输入对象与触摸显示屏的持续接触结果,例如前文中提到的长接触。例如,在其中一个实施例中,上述步骤S226中监测输入对象在触摸显示屏(130,或230)上的持续接触,持续接触时第一处理器140或者图2中的第二处理器240通过手势检测模块可以识别出在触摸显示屏(130,或230)上一些系列连续变化的位置。于是,在步骤S228中,如图4、图5或图6所示的实施例中,第一处理器或者图2中的第二处理器240确定上述接触在触摸显示屏(130,或230)上的多个操作位置,可以获得多个连续变化的第二操作位置。当然,在多个 连续变化的第二操作位置可以沿信息指示符的排列方向排列变化。
接着,第一处理器140或者图2中的第二处理器240根据第二操作位置的变化,移动显示指示标识或多个信息指示符,从而使得指示标识与多个信息指示符之间的位置关系发生变化。例如,在移动显示指示标识或多个信息指示符的过程中,如图4所示,当输入对象308与触摸显示屏的接触从第一操作位置3111移动到第一操作位置3112时,将指示标识302的显示从第一操作位置3111移动到第一操作位置3112;如图5所示,当输入对象403与触摸显示屏的接触从第一操作位置4121移动到第一操作位置4122时,将指示标识412的显示从第一操作位置4121移动到第一操作位置4122,使得指示标识412偏转指示在信息指示符“60”和“80”之间;如图6所示,当输入对象604与触摸显示屏的接触从第一操作位置615移动到第一操作位置616时,将指示标识602的显示从第一操作位置615移动到第一操作位置616,使得指示标识602偏离其中一个信息指示符601。还比如,如图8所示,当输入对象703与触摸显示屏的接触从第一操作位置7191移动到第一操作位置7192,将多个信息指示符715的显示从第一操作位置7191移动到第一操作位置7192,即如图8中指示线730指向的虚线框732所示的显示结果,多个信息指示符715排列成圆圈,相比图8中方框731内的显示结果,图8中虚线框732内的显示结果顺时针偏转了一定角度,使得指示标识712从正对指示信息指示符“20”、偏转到指示在信息指示符“75”和“80”之间。
上述根据上述接触移动指示标识或多个信息指示符的显示的过程中,采用更新显示的方式,即更新是指删除指示标识在原先位置的显示,而将其位置变更为与第二操作位置相关的位置,使得指示标识或者多个信息指示符跟随第二操作位置的变化而变化。本文中的“变更”的含义也可以理解为改变、变换、或替换界面对象在界面上的显示位置。
例如,在其中一个实施例中,上述步骤S224至步骤S226包括:第一处理器140或者图2中的第二处理器240调用手势检测模块监测输入对象在触摸显示屏(130,或230)上的持续接触,确定上述持续接触在触摸显示屏上对应的多个操作位置,获得多个连续变化的第一操作位置;和,将指示标识的显示依次移动至多个连续变化的第一操作位置,使得指示标识跟随第二操作位置的变化而变化。如图4所示,当输入对象308沿方向306或者方向307在触摸显示屏的预定区域内持续移动,从而产生与触摸显示屏的持续接触,而第一处理器通过调用手势检测模块监测持续接触,可以获得一组多个连续 变化的第二操作位置(3111,3112),这一组多个连续变化的第二操作位置(3111,3112)沿信息指示符的排列方向顺序变化,并可能沿信息指示符的排列方向跨过至少一个显示区309,根据对多个连续变化的第二操作位置(3111,3112)的识别,将指示标识302的显示依次移动至多个连续变化的第二操作位置,使得指示标识跟随第二操作位置的变化而变化。
更进一步地,在其中一个实施例中,将指示标识的显示依次更新在第二操作位置包括:第一处理器或者图2中的第二处理器240调用手势检测模块识别第二操作位置在信息指示符的排列方向的位置变化,根据第二操作位置在信息指示符的排列方向上的位置变化,将指示标识在信息指示符的排列方向上的位置,依次变更为第二操作位置在信息指示符的排列方向上的位置,从而实现根据上述第二操作位置的变化移动显示指示标识。
例如,假设用(X,Y)来表示上的某个像素点位置在该直角坐标系的坐标,那么,图4中多个信息指示符305沿垂方向(即Y方向)排列,当图3中第二操作位置沿方向307从位置3111变化到位置3112时,位置3111和位置3112可以分别表示为(x1,y1),(x2,y2),其中识别第二操作位置沿垂直方向(即Y方向)的位置变化(y1→y2),然后根据识别的位置变化(y1→y2),将指示标识(302)在垂直方向(即Y方向)的位置,更新为第二操作位置在垂直方向的变化位置y2,即指示标识在直角坐标系中的位置位于(x1,y2)。当然,第二操作位置的变化可能只是沿Y方向的变化,即位置3111和位置3112可以分别表示为(x1,y1),(x1,y2)。本实施例中x\y表示的可以不仅仅是一个坐标位置值,还可以是一定的坐标范围、或者还可以是一定范围中的中心点坐标位置。同样地,在图5中,识别第二操作位置沿水平方向(即X方向)的位置变化,并进行将指示标识412的显示更新在第二操作位置4121的操作。采用这种显示方式可以限制指示标识的活动,使其随输入对象的移动变化仅在第二区域内进行移动,例如,沿着信息指示符的排列方向移动到预定区域内的任意一个操作位置。
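The coordinate handling in this paragraph amounts to projecting the contact position onto the arrangement axis, so the identifier can only move along the direction in which the indicators are arranged. A short sketch under that assumption follows; the axis handling and function name are illustrative only.

```python
def constrained_indicator_position(indicator_pos, touch_pos, axis="y"):
    """Move the indication identifier only along the arrangement direction of the indicators.

    indicator_pos -- current (x, y) of the identifier
    touch_pos     -- (x, y) of the second operation position
    axis          -- "y" for a vertical arrangement (FIG. 4), "x" for a horizontal one (FIG. 5)
    """
    x, y = indicator_pos
    tx, ty = touch_pos
    return (x, ty) if axis == "y" else (tx, y)

# Vertical arrangement: only the y coordinate of the contact is taken over.
print(constrained_indicator_position((40, 120), (63, 180), axis="y"))   # -> (40, 180)
```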
此外,在其中一些实施例中,在上述步骤S226至步骤S234的过程之前还可以包括:
上述第一处理器140或者图2中的第二处理器240调用手势检测模块检测输入对象与触摸显示屏(130,或230)的接触是否位于指示标识所在的操作位置内,当输入对象与触摸显示屏(130,或230)的接触在指示标识所在的显示区域内时,执行上述步骤S226至步骤S234的过程。反之,当输入对 象与触摸显示屏(130,或230)的接触不在指示标识所在的显示区域内时,则不更新指示标识的显示位置,不执行上述步骤S226至步骤S234的过程。有关步骤S226至步骤S234的过程描述可参见前文相关说明。本实施例可以通过跟踪监测输入对象与触摸显示屏(130,或230)的接触对指示标识进行的输入操作,来确保控制信号输入的准确性,保证离散备选项的选择可靠性。例如,如图5所示,首先,上述第一处理器或者图2中的第二处理器240调用手势检测模块在区域416内检测输入对象403与触摸显示屏(130,或230)的接触是否位于指示标识412所在的操作位置内,也就是判断输入对象403与触摸显示屏(130,或230)的接触在触摸显示屏(130,或230)上的操作位置411是否与指示标识412所在的操作位置重合,若是,则执行上述步骤S226至步骤S230的过程,开始跟踪监测输入对象403与触摸显示屏(130,或230)的接触沿方向404或者方向405的运动。反之,则不根据检测到的输入对象403与触摸显示屏(130,或230)的接触更新指示标识的显示位置,即不执行上述步骤S226至步骤S234的过程。
上述步骤S226至步骤S234的过程中,指示标识可以随输入对象与触摸显示屏的持续接触来变化显示位置,为了提高指示标识随持续接触的可视化效果,在将指示标识的显示更新至第二操作位置的过程中,可以按照可视化的显示移动速度来计算指示标识在图形用户界面上两个操作位置之间的移动速度,并基于该移动速度来调整指示标识在两个第一操作位置之间的显示移动,从而呈现连续的显示移动效果。
此外,在其中一个实施例中,被操控的可以不是指示标识,而是多个信息指示符,于是如图8所示,上述第一处理器140或者图2中的第二处理器240调用手势检测模块识别上述输入对象703接触触摸显示屏产生的第二操作位置的变化方向;根据检测的变化方向,沿变化方向移动多个信息指示符。例如图8中,检测到第二操作位置从7191变换到7192,变化方向是顺时针移动,因此沿顺时针移动多个信息指示符715使得指示标识712与多个信息指示符之间的相对位置关系发生改变。
在图3的步骤S228中,图1中的第一处理器140或者图2中的第二处理器240调用手势检测模块检测输入对象与触摸显示屏(130,或230)接触的释放。
在其中一个实施例中,图1中的第一处理器140或者图2中的第二处理器至少通过以下方式之一实现检测上述接触的释放。
1、在触摸显示屏上监测输入对象与触摸显示屏(130,或230)之间接触的脱离,也就是说,输入对象不再对触摸显示屏进行输入操作。
2、监测在输入对象403与触摸显示屏(130,或230)的接触是否位于预定区域之外,当上述接触位于预定区域之外时,则认为上述接触已经释放,输入对象不再对触摸显示屏进行输入操作。
在检测到输入对象与触摸显示屏(130,或230)接触产生释放时,图1中的第一处理器140或者图2中的第二处理器会检测到上述接触的释放在触摸显示屏上的第三操作位置。例如,在触摸显示屏上监测输入对象与触摸显示屏(130,或230)之间接触的脱离,将上述接触在脱离前位于的操作位置作为第三操作位置。如图5所示,在区域416内输入对象403与触摸显示屏(130,或230)之间接触,处理器检测两者的接触从第一操作位置4121移动到第一操作位置4122,因此也将指示标识412的显示不断更新,用以连续移动至第一操作位置4122。此时,在第一操作位置4122处检测到输入对象403与触摸显示屏(130,或230)之间接触的脱离,识别将上述接触在脱离前位于的操作位置4122作为第三操作位置。还如图8中在预定区域(图8中为显示多个信息指示符的区域范围)内监测输入对象703与触摸显示屏(130,或230)之间沿方向704或者方向705的持续接触,处理器检测两者的接触关联在预定区域内是从第一操作位置7191移动到第二操作位置7192,因此也将指示标识712的显示不断更新,用以连续移动至第二操作位置7192。此时,当在指示标识712的显示在第二操作位置7192时,处理器检测到输入对象403与触摸显示屏(130,或230)之间接触发生脱离,识别将上述接触在脱离前位于的操作位置7192作为第三操作位置,此时多个信息指示符与指示标识之间的相对位置关系将从虚线框732所指示的显示结果,按照指示线740所指示的,跳变到显示框733所指示的显示结果。此外,本文虽然用“第一”或者“第二”、“第三”区分了不同情况下的操作位置,用以在描述时进行区分,因此,操作位置包括第一操作位置、第二操作位置和第三操作位置;二第二操作位置可以包括第一操作位置,第一操作位置为输入对象403与触摸显示屏(130,或230)之间发生初次接触产生的操作位置,属于第二操作位置变化的起点。
还比如,检测在输入对象与触摸显示屏(130,或230)的接触是否位于预定区域之外,当上述接触位于预定区域之外时,识别将上述接触对应在显示多个信息指示符的区域范围内的最后一个操作位置作为所述第三操作位置。 预定区域可以为上述区域416或301,或者上述第三区域(图6中的608)。例如,图6中灰色箭头标识指示标识602的历史显示位置,黑色箭头标识指示标识602的当前显示位置。在第三区域608内监测输入对象604与触摸显示屏(130,或230)之间沿方向603或者方向605的持续接触,处理器检测两者的接触关联在区域612内是从第一操作位置615移动到第一操作位置616,因此也将指示标识602的显示不断更新,用以连续移动至第一操作位置616。此时,当在指示标识602的显示在第一操作位置616时,处理器检测到输入对象403与触摸显示屏(130,或230)之间接触位于第三区域608之外,识别将上述接触对应在区域612内的最后一个操作位置616作为第三操作位置。
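Either way of detecting the release described above comes down to deciding, sample by sample, whether the contact has lifted or has left the predetermined region, and remembering the last position inside that region as the third operation position. A compact sketch under those assumptions (the event format and region representation are invented):

```python
def third_operation_position(events, region):
    """Return the third operation position from a stream of touch samples.

    events -- list of (x, y, is_down) samples; is_down becomes False once the contact lifts
    region -- (x_min, y_min, x_max, y_max) of the predetermined region
    """
    x_min, y_min, x_max, y_max = region
    last_inside = None
    for x, y, is_down in events:
        inside = x_min <= x <= x_max and y_min <= y <= y_max
        if not is_down or not inside:        # contact lifted or left the region: released
            return last_inside
        last_inside = (x, y)
    return last_inside

events = [(10, 40, True), (10, 70, True), (10, 95, True), (10, 130, True)]
print(third_operation_position(events, region=(0, 0, 50, 100)))   # -> (10, 95)
```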
在图3的步骤S230中,第一处理器或者图2中的第二处理器240根据上述接触释放时指示标识与多个信息指示符之间的相对位置关系,调节指示标识和所述多个信息指示符之间的相对位置,使指示标识正对多个信息指示符中的其中一个信息指示符。
例如,在其中一个实施例中,通过以下方式实现上述步骤230:第一处理器或者图2中的第二处理器240在触摸显示屏上设置多个间隔排列的显示区,每个显示区用以显示一个信息指示符,识别输入对象与触摸显示屏接触的释放产生在触摸显示屏上的第三操作位置;查找与上述第三操作位置相关联的显示区;和,将指示标识显示在查找到的显示区,从而实现使指示标识正对多个信息指示符中的其中一个信息指示符。如图5所示,在触摸显示屏上设置多个间隔排列的显示区418,每个显示区418用以显示一个信息指示符415,识别输入对象403与触摸显示屏接触的释放产生在触摸显示屏上的第三操作位置4122;查找与上述第三操作位置4122相关联的显示区4181;和,将指示标识412显示在查找到的显示区4181。因此可以看出,当输入对象403与触摸显示屏的接触释放时,从上述接触释放时、指示标识412指示在信息指示符“60”和信息指示符“80”之间的位置,跳变到指示标识412正对指示信息指示符“80”的位置。还如图6所示,在触摸显示屏上设置多个间隔排列的显示区613,每个显示区613用以显示一个信息指示符601,识别输入对象604与触摸显示屏接触的释放产生在触摸显示屏上的第三操作位置616;查找与上述第三操作位置616相关联的显示区6131;和,将指示标识602显示在查找到的显示区6131。因此可以看出,当输入对象604与触摸显示屏的接触释放时,从上述接触释放时、指示标识602指示偏离多个信息指示符中其中之一的位置,跳变到指示标识602正对指示其中一个信息指示符的位置 (指示标识602正对指示刻度“12”)。
上述步骤230中,第一处理器或者图2中的第二处理器在查找与上述第三操作位置相关联的显示区时采用以下方式之一实现。
1、例如图5所示,查找与第三操作位置4122距离最近的显示区4181,将距离第三操作位置最近的显示区作为与上述第三操作位置相关联的显示区。
2、例如图6所示,查找输入对象604与第三区域608的接触致使指示标识的显示最后跨过的显示区6131。也就是说,根据第三操作位置查找输入对象604与触摸显示屏的持续接触在区域612内最后跨过的显示区。
还比如,在其中一个实施例中,通过以下方式实现上述步骤230:第一处理器或者图2中的第二处理器240在触摸显示屏上设置多个间隔排列的显示区,每个显示区用以显示一个信息指示符;识别输入对象与触摸显示屏接触的释放产生在触摸显示屏上的第三操作位置;查找位于指示标识附近的多个信息指示符中之一;和,将查找到的多个信息指示符中之一显示在与指示标识正对的位置。例如,如图8所示的虚线框733,在触摸显示屏上设置多个间隔排列的显示区718,每个显示区718用以显示一个信息指示符715;识别输入对象703与触摸显示屏接触的释放产生在触摸显示屏上的第三操作位置7192;查找位于指示标识712附近的多个信息指示符中之一,如信息指示符“80”,或者信息指示符“75”;和,将信息指示符“80”或信息指示符“75”显示在与指示标识712正对的位置,图8中在虚线框。因此可以看出,当输入对象703与触摸显示屏的接触释放时,从上述接触释放时、指示标识712指示在信息指示符“75”和“80”中间的位置,跳变到指示标识712正对指示信息指示符“80”。
在图3的步骤S232中,第一处理器或者图2中的第二处理器确定与上述指示标识正对的信息指示符所关联对应的离散备选项。
图5实施例中,上述指示标识正对的信息指示符“80”,确定离散备选项为刻度“80”;图6实施例中,上述指示标识正对的信息指示符“12”,确定的离散备选项为刻度“12”。其中,各个处理器可以通过查找信息指示符所占的显示区与离散备选项之间的关系,来确定上述指示标识正对的信息指示符所关联对应的离散备选项。
然后,在图3的步骤S234中,第一处理器或者图2中的第二处理器依据确定的与指示标识正对的信息指示符所关联对应的离散备选项,调整前述超声成像参数对应的参数值,并使用调整后的参数值获得前述超声图像。根据 确定的离散备选项,由处理器来重新设定超声扫描时所采用的超声成像参数。
此外,还例如图2所示的实施例中,第二处理器依据确定的与指示标识正对的信息指示符所关联对应的离散备选项,获得超声成像参数对应的参数值,然后智能控制器270生成含有该参数值的控制信号,并通过第二通信模块214将该控制信号输出至第一通信模块215,用以通过第一处理器来控制探头对目标组织的超声扫描和超声图像的形成,从而更新超声图像的显示结果。根据上述控制信号可以获得前述超声图像所需要的超声成像参数。
图3分别提供的仅仅是一种步骤间的流程执行顺序,还可以基于前文中对图3中的各个步骤进行调整顺序获得各种变形方案,上述各个步骤不限于仅按照图3的顺序执行,步骤间在满足基本逻辑的情况下可以相互置换,更改执行顺序,还可以重复执行其中的一个或多个步骤后,在执行最后一个或多个步骤,这些方案均属于依据本文提供的实施例进行的变形方案。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本发明的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品承载在一个非易失性计算机可读存储载体(如ROM、磁碟、光盘、硬盘、服务器云空间)中,包括若干指令用以使得一台终端设备(可以是手机,计算机,服务器,或者网络设备等)执行本发明各个实施例的系统结构和方法。例如,一种计算机可读存储介质,其上存储有计算机程序,该计算机程序被处理器执行时至少可以用于实现前文中提到的基于图3中步骤S216至步骤S234所示流程的各个实施例。
本发明在调节不连续型参数时,在触摸显示屏上,手指或者触笔按住参数调节标识滑动,滑动过程中该参数调节标识一直跟随手指或者触笔滑动。松开手指或者触笔之后,自动寻找与当前参数调节标识的位置最接近的参数水平或者档位,并自动将参数调节到该水平或者档位。这样,提高了参数调节过程中显示界面上的视觉反馈,方便了参数调节,提高了用户的体验。
以上实施例仅表达了几种实施方式,其描述较为具体和详细,但并不能因此而理解为对本发明专利范围的限制。应当指出的是,对于本领域的普通技术人员来说,在不脱离本发明构思的前提下,还可以做出若干变形和改进,这些都属于本发明的保护范围。因此,本发明专利的保护范围应以所附权利要求为准。

Claims (31)

  1. 一种超声医学检测设备,其特征在于,所述设备包括:
    探头;
    发射电路和接收电路,用于激励所述探头向检测对象发射超声波束,并接收所述超声波束的回波,获得超声回波信号;
    图像处理模块,用于根据所述超声回波信号获得超声图像;
    触摸显示屏;
    第一存储器,所述第一存储器存储处理器上运行的计算机程序;和,
    第一处理器,所述第一处理器执行所述程序时实现以下步骤:
    根据超声成像参数建立多个离散备选项,
    在触摸显示屏上按预置顺序间隔排列显示多个信息指示符,一个信息指示符与一个离散备选项关联对应,
    在触摸显示屏上显示指示标识,
    监测输入对象在所述触摸显示屏上的接触,
    根据所述接触移动所述指示标识或所述多个信息指示符的显示,使所述指示标识与所述多个信息指示符之间的位置关系发生变化,
    检测所述接触的释放,
    根据所述接触释放时所述指示标识与所述多个信息指示符之间的相对位置关系,调节所述指示标识和所述多个信息指示符之间的相对位置,使所述指示标识正对所述多个信息指示符中的其中一个信息指示符,
    确定与所述指示标识正对的信息指示符所关联对应的离散备选项,和,
    依据确定的与所述指示标识正对的信息指示符所关联对应的离散备选项,调整所述超声成像参数对应的参数值,并使用所述参数值获得超声图像。
  2. 根据权利要求1所述的超声医学检测设备,其特在于,所述第一处理器执行所述程序时在所述根据所述接触移动所述指示标识或所述多个信息指示符的显示之前还包括:
    识别所述接触对应在触摸显示屏上的第一操作位置;
    查找与所述第一操作位置关联的所述多个信息指示符中的至少一部分;和,
    放大查找到的所述多个信息指示符中的至少一部分。
  3. 根据权利要求2所述的超声医学检测设备,其特在于,所述第一处理 器执行所述程序时在所述放大查找到的所述多个信息指示符中的至少一部分之后还包括:
    基于查找到的所述多个信息指示符中的至少一部分,在两个信息指示符对应的两个离散备选项之间建立多个子级的离散备选项;和,
    在触摸显示屏上按预置顺序间隔排列显示多个子级的信息指示符,一个子级的信息指示符与一个子级的离散备选项关联对应。
  4. 根据权利要求1所述的超声医学检测设备,其特在于,所述第一处理器执行所述程序时采用以下步骤实现所述根据所述接触移动所述指示标识或所述多个信息指示符的显示,使所述指示标识与所述多个信息指示符之间的位置关系发生变化:
    跟踪监测所述接触的运动;
    确定所述接触的运动关联在触摸显示屏上的第二操作位置;和,
    根据所述第二操作位置的变化,移动显示所述指示标识或多个信息指示符。
  5. 根据权利要求1所述的超声医学检测设备,其特在于,所述第一处理器执行所述程序时采用以下步骤实现所述根据所述接触释放时所述指示标识与所述多个信息指示符之间的相对位置关系,调节所述指示标识和所述多个信息指示符之间的相对位置,使所述指示标识正对所述多个信息指示符中的其中一个信息指示符:
    在触摸显示屏上设置多个间隔排列的显示区,每个显示区用以显示一个信息指示符;
    识别所述接触的释放产生在触摸显示屏上的第三操作位置;
    查找与所述第三操作位置相关联的显示区、或者位于所述指示标识附近的所述多个信息指示符中之一;和,
    将所述指示标识显示在查找到的显示区,或者将查找到的所述多个信息指示符中之一显示在与所述指示标识正对的位置。
  6. 根据权利要求5所述的超声医学检测设备,其特在于,所述第一处理器还至少采用以下方式之一实现所述查找与所述第三操作位置相关联的显示区:
    查找与所述第三操作位置距离最近的显示区;和,
    查找所述接触的运动最后跨过的显示区。
  7. 根据权利要求1所述的超声医学检测设备,其特在于,所述指示标识 可沿着信息指示符的排列方向移动到预定区域内的任意一个操作位置。
  8. 根据权利要求4所述的超声医学检测设备,其特在于,所述第一处理器执行所述程序时采用以下步骤实现所述根据所述第二操作位置的变化,移动显示所述指示标识或多个信息指示符:
    识别所述第二操作位置在信息指示符的排列方向上的位置变化;和,
    根据所述位置变化,将所述指示标识在信息指示符的排列方向上的显示位置,依次变更为所述第二操作位置在信息指示符的排列方向上的位置。
  9. 根据权利要求4所述的超声医学检测设备,其特在于,所述第一处理器执行所述程序时采用以下步骤实现所述根据所述第二操作位置的变化,移动显示所述指示标识或多个信息指示符:
    识别所述第二操作位置的变化方向;和,
    沿所述变化方向移动所述多个信息指示符。
  10. 一种超声成像控制方法,其包括:
    向检测对象发射超声波束,接收所述超声波束的回波,获得超声回波信号,根据所述超声回波信号获得超声图像;
    根据超声成像参数建立多个离散备选项,
    在触摸显示屏上按预置顺序间隔排列显示多个信息指示符,一个信息指示符与一个离散备选项关联对应,
    在触摸显示屏上显示指示标识,
    监测输入对象在所述触摸显示屏上的接触,
    根据所述接触移动所述指示标识或所述多个信息指示符的显示,使所述指示标识与所述多个信息指示符之间的位置关系发生变化,
    检测所述接触的释放,
    根据所述接触释放时所述指示标识与所述多个信息指示符之间的相对位置关系,调节所述指示标识和所述多个信息指示符之间的相对位置,使所述指示标识正对所述多个信息指示符中的其中一个信息指示符,
    确定与所述指示标识正对的信息指示符所关联对应的离散备选项,和,
    依据确定的与所述指示标识正对的信息指示符所关联对应的离散备选项,调整所述超声成像参数对应的参数值,并使用所述参数值获得超声图像。
  11. 根据权利要求10所述的超声成像控制方法,其特征在于,所述根据所述接触移动所述指示标识或所述多个信息指示符的显示之前还包括:
    识别所述接触对应在触摸显示屏上的第一操作位置;
    查找与所述第一操作位置关联的所述多个信息指示符中的至少一部分;和,
    放大查找到的所述多个信息指示符中的至少一部分。
  12. 根据权利要求11所述的超声成像控制方法,其特征在于,在所述放大查找到的所述多个信息指示符中的至少一部分之后还包括:
    基于查找到的所述多个信息指示符中的至少一部分,在两个信息指示符对应的两个离散备选项之间建立多个子级的离散备选项;和,
    在触摸显示屏上按预置顺序间隔排列显示多个子级的信息指示符,一个子级的信息指示符与一个子级的离散备选项关联对应。
  13. 根据权利要求10所述的超声成像控制方法,其特征在于,所述根据所述接触移动所述指示标识或所述多个信息指示符的显示,使所述指示标识与所述多个信息指示符之间的位置关系发生变化包括:
    跟踪监测所述接触的运动;
    确定所述接触的运动关联在触摸显示屏上的第二操作位置;和,
    根据所述第二操作位置的变化,移动显示所述指示标识或多个信息指示符。
  14. 根据权利要求10所述的超声成像控制方法,其特征在于,所述根据所述接触释放时所述指示标识与所述多个信息指示符之间的相对位置关系,调节所述指示标识和所述多个信息指示符之间的相对位置,使所述指示标识正对所述多个信息指示符中的其中一个信息指示符:
    在触摸显示屏上设置多个间隔排列的显示区,每个显示区用以显示一个信息指示符;
    识别所述接触的释放产生在触摸显示屏上的第三操作位置;
    查找与所述第三操作位置相关联的显示区、或者位于所述指示标识附近的所述多个信息指示符中之一;和,
    将所述指示标识显示在查找到的显示区,或者将查找到的所述多个信息指示符中之一显示在与所述指示标识正对的位置。
  15. 根据权利要求14所述的超声成像控制方法,其特征在于,采用以下方式之一实现所述查找与所述第三操作位置相关联的显示区:
    查找与所述第三操作位置距离最近的显示区;和,
    查找所述接触的运动最后跨过的显示区。
  16. 根据权利要求10所述的超声成像控制方法,其特在于,所述指示标识可沿着信息指示符的排列方向移动到预定区域内的任意一个操作位置。
  17. 根据权利要求10所述的超声成像控制方法,其特在于,所述根据所述第二操作位置的变化,移动显示所述指示标识或多个信息指示符包括:
    识别所述第二操作位置在信息指示符的排列方向上的位置变化,根据所述位置变化,将所述指示标识在信息指示符的排列方向上的显示位置,依次变更为所述第二操作位置在信息指示符的排列方向上的位置;或者,
    识别所述第二操作位置的变化方向,沿所述变化方向移动所述多个信息指示符。
  18. 一种超声成像系统,其特征在于,所述系统包括:超声医学检测设备和智能控制器;其中,
    所述超声医学检测设备包括:
    探头;
    发射电路和接收电路,用于激励所述探头向检测对象发射超声波束,接收所述超声波束的回波,获得超声回波信号;
    图像处理模块,用于根据所述超声回波信号获得超声图像;和,
    与图像处理模块电连接的第一通信模块,用于将所述超声图像数据传输至所述智能控制器,和/或接收所述智能控制器输入的控制信号用以获得所述超声图像所需要的超声成像参数;
    所述智能控制器包括:
    触摸显示屏,
    第二通信模块,用于接收来自所述第一通信模块传送的超声图像数据,和/或向所述第一通信模块发送控制信号;
    第二存储器,所述存储器存储处理器上运行的计算机程序;和,
    第二处理器,所述第二处理器执行所述程序时实现以下步骤:
    根据超声成像参数建立多个离散备选项,
    在触摸显示屏上按预置顺序间隔排列显示多个信息指示符,一个信息指示符与一个离散备选项关联对应,
    在触摸显示屏上显示指示标识,
    监测输入对象在所述触摸显示屏上的接触,
    根据所述接触移动所述指示标识或所述多个信息指示符的显示,使所述 指示标识与所述多个信息指示符之间的位置关系发生变化,
    检测所述接触的释放,
    根据所述接触释放时所述指示标识与所述多个信息指示符之间的相对位置关系,调节所述指示标识和所述多个信息指示符之间的相对位置,使所述指示标识正对所述多个信息指示符中的其中一个信息指示符,
    确定与所述指示标识正对的信息指示符所关联对应的离散备选项,
    依据确定的与所述指示标识正对的信息指示符所关联对应的离散备选项,获得超声成像参数对应的参数值,和,
    通过所述第二通信模块输出含有所述参数值的控制信号至所述第一通信模块。
  19. 根据权利要求18所述的超声成像系统,其特在于,所述第二处理器执行所述程序时在所述根据所述接触移动所述指示标识或所述多个信息指示符的显示之前还包括:
    识别所述接触对应在触摸显示屏上的第一操作位置;
    查找与所述第一操作位置关联的所述多个信息指示符中的至少一部分;和,
    放大查找到的所述多个信息指示符中的至少一部分。
  20. 根据权利要求19所述的超声成像系统,其特在于,所述第二处理器执行所述程序时在所述放大查找到的所述多个信息指示符中的至少一部分之后还包括:
    基于查找到的所述多个信息指示符中的至少一部分,在两个信息指示符对应的两个离散备选项之间建立多个子级的离散备选项;和,
    在触摸显示屏上按预置顺序间隔排列显示多个子级的信息指示符,一个子级的信息指示符与一个子级的离散备选项关联对应。
  21. 根据权利要求18所述的超声成像系统,其特在于,所述第二处理器执行所述程序时采用以下步骤实现所述根据所述接触移动所述指示标识或所述多个信息指示符的显示,使所述指示标识与所述多个信息指示符之间的位置关系发生变化:
    跟踪监测所述接触的运动;
    确定所述接触的运动关联在触摸显示屏上的第二操作位置;和,
    根据所述第二操作位置的变化,移动显示所述指示标识或多个信息指示符。
  22. 根据权利要求18所述的超声成像系统,其特在于,所述第二处理器执行所述程序时采用以下步骤实现所述根据所述接触释放时所述指示标识与所述多个信息指示符之间的相对位置关系,调节所述指示标识和所述多个信息指示符之间的相对位置,使所述指示标识正对所述多个信息指示符中的其中一个信息指示符:
    在触摸显示屏上设置多个间隔排列的显示区,每个显示区用以显示一个信息指示符;
    识别所述接触的释放产生在触摸显示屏上的第三操作位置;
    查找与所述第三操作位置相关联的显示区、或者位于所述指示标识附近的所述多个信息指示符中之一;和,
    将所述指示标识显示在查找到的显示区,或者将查找到的所述多个信息指示符中之一显示在与所述指示标识正对的位置。
  23. 根据权利要求22所述的超声成像系统,其特在于,所述第二处理器执行所述程序时采用以下方式之一实现所述查找与所述第三操作位置相关联的显示区:
    查找与所述第三操作位置距离最近的显示区;和,
    查找所述接触的运动最后跨过的显示区。
  24. 根据权利要求21所述的超声成像系统,其特在于,所述第二处理器执行所述程序时采用以下方式实现所述根据所述第二操作位置的变化,移动显示所述指示标识或多个信息指示符:
    识别所述第二操作位置在信息指示符的排列方向上的位置变化,根据所述位置变化,将所述指示标识在信息指示符的排列方向上的显示位置,依次变更为所述第二操作位置在信息指示符的排列方向上的位置;或者,
    识别所述第二操作位置的变化方向,沿所述变化方向移动所述多个信息指示符。
  25. 一种智能控制器,其特征在于,所述智能控制器包括:
    触摸显示屏;
    第二通信模块,用于接收来自超声医学检测设备传送的超声图像数据,和/或向所述超声医学检测设备发送控制信号;
    第二存储器,所述存储器存储处理器上运行的计算机程序;和,
    第二处理器,所述第二处理器执行所述程序时实现以下步骤:
    根据超声成像参数建立多个离散备选项,
    在触摸显示屏上按预置顺序间隔排列显示多个信息指示符,一个信息指示符与一个离散备选项关联对应,
    在触摸显示屏上显示指示标识,
    监测输入对象在所述触摸显示屏上的接触,
    根据所述接触移动所述指示标识或所述多个信息指示符的显示,使所述指示标识与所述多个信息指示符之间的位置关系发生变化,
    检测所述接触的释放,
    根据所述接触释放时所述指示标识与所述多个信息指示符之间的相对位置关系,调节所述指示标识和所述多个信息指示符之间的相对位置,使所述指示标识正对所述多个信息指示符中的其中一个信息指示符,
    确定与所述指示标识正对的信息指示符所关联对应的离散备选项,
    依据确定的与所述指示标识正对的信息指示符所关联对应的离散备选项,获得超声成像参数对应的参数值,和,
    通过所述第二通信模块输出含有所述参数值的控制信号。
  26. 根据权利要求25所述的智能控制器,其特在于,所述第二处理器执行所述程序时在所述根据所述接触移动所述指示标识或所述多个信息指示符的显示之前还包括:
    识别所述接触对应在触摸显示屏上的第一操作位置;
    查找与所述第一操作位置关联的所述多个信息指示符中的至少一部分;和,
    放大查找到的所述多个信息指示符中的至少一部分。
  27. 根据权利要求26所述的智能控制器,其特在于,所述第二处理器执行所述程序时在所述放大查找到的所述多个信息指示符中的至少一部分之后还包括:
    基于查找到的所述多个信息指示符中的至少一部分,在两个信息指示符对应的两个离散备选项之间建立多个子级的离散备选项;和,
    在触摸显示屏上按预置顺序间隔排列显示多个子级的信息指示符,一个子级的信息指示符与一个子级的离散备选项关联对应。
  28. 根据权利要求25所述的智能控制器,其特在于,所述第二处理器执行所述程序时采用以下步骤实现所述根据所述接触移动所述指示标识或所述多个信息指示符的显示,使所述指示标识与所述多个信息指示符之间的位置 关系发生变化:
    跟踪监测所述接触的运动;
    确定所述接触的运动关联在触摸显示屏上的第二操作位置;和,
    根据所述第二操作位置的变化,移动显示所述指示标识或多个信息指示符。
  29. 根据权利要求25所述的智能控制器,其特在于,所述第二处理器执行所述程序时采用以下步骤实现所述根据所述接触释放时所述指示标识与所述多个信息指示符之间的相对位置关系,调节所述指示标识和所述多个信息指示符之间的相对位置,使所述指示标识正对所述多个信息指示符中的其中一个信息指示符:
    在触摸显示屏上设置多个间隔排列的显示区,每个显示区用以显示一个信息指示符;
    识别所述接触的释放产生在触摸显示屏上的第三操作位置;
    查找与所述第三操作位置相关联的显示区、或者位于所述指示标识附近的所述多个信息指示符中之一;和,
    将所述指示标识显示在查找到的显示区,或者将查找到的所述多个信息指示符中之一显示在与所述指示标识正对的位置。
  30. 根据权利要求29所述的智能控制器,其特在于,所述第二处理器执行所述程序时采用以下方式之一实现所述查找与所述第三操作位置相关联的显示区:
    查找与所述第三操作位置距离最近的显示区;和,
    查找所述接触的运动最后跨过的显示区。
  31. 根据权利要求28所述的智能控制器,其特在于,所述第二处理器执行所述程序时采用以下方式实现所述根据所述第二操作位置的变化,移动显示所述指示标识或多个信息指示符:
    识别所述第二操作位置在信息指示符的排列方向上的位置变化,根据所述位置变化,将所述指示标识在信息指示符的排列方向上的显示位置,依次变更为所述第二操作位置在信息指示符的排列方向上的位置;或者,
    识别所述第二操作位置的变化方向,沿所述变化方向移动所述多个信息指示符。
PCT/CN2017/073099 2017-02-08 2017-02-08 超声医学检测设备及成像控制方法、成像系统、控制器 WO2018145264A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2017/073099 WO2018145264A1 (zh) 2017-02-08 2017-02-08 超声医学检测设备及成像控制方法、成像系统、控制器
CN201780024746.8A CN109069104B (zh) 2017-02-08 2017-02-08 超声医学检测设备及成像控制方法、成像系统、控制器

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/073099 WO2018145264A1 (zh) 2017-02-08 2017-02-08 超声医学检测设备及成像控制方法、成像系统、控制器

Publications (1)

Publication Number Publication Date
WO2018145264A1 true WO2018145264A1 (zh) 2018-08-16

Family

ID=63106988

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/073099 WO2018145264A1 (zh) 2017-02-08 2017-02-08 超声医学检测设备及成像控制方法、成像系统、控制器

Country Status (2)

Country Link
CN (1) CN109069104B (zh)
WO (1) WO2018145264A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116058871B (zh) * 2023-03-24 2023-07-14 深圳鲲为科技有限公司 超声检查的处理方法及超声检查设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009075846A (ja) * 2007-09-20 2009-04-09 Fujifilm Corp 輪郭抽出装置及びプログラム
CN104545996A (zh) * 2013-10-24 2015-04-29 三星麦迪森株式会社 超声诊断设备以及由其执行的时间增益补偿设置方法
CN104970823A (zh) * 2014-04-01 2015-10-14 三星麦迪森株式会社 使用预存灰度数据和图像调节超声图像亮度的方法和系统
CN105662470A (zh) * 2012-09-24 2016-06-15 三星电子株式会社 超声设备以及超声设备的信息提供方法
CN105686798A (zh) * 2014-12-12 2016-06-22 三星麦迪森株式会社 成像设备及其控制方法

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7380216B2 (en) * 2000-11-30 2008-05-27 International Business Machines Corporation Zoom-capable scrollbar
JP2006026256A (ja) * 2004-07-21 2006-02-02 Matsushita Electric Ind Co Ltd 超音波診断装置
WO2012081182A1 (ja) * 2010-12-13 2012-06-21 パナソニック株式会社 電子機器
KR101290145B1 (ko) * 2011-05-31 2013-07-26 삼성전자주식회사 터치 스크린 제어 방법 및 장치, 컴퓨터에 의해 독출될 수 있는 기록 매체, 그리고 단말장치
CN102178548B (zh) * 2011-06-10 2013-01-02 无锡祥生医学影像有限责任公司 触摸屏超声诊断仪及其参数调节方法
WO2014142468A1 (en) * 2013-03-13 2014-09-18 Samsung Electronics Co., Ltd. Method of providing copy image and ultrasound apparatus therefor


Also Published As

Publication number Publication date
CN109069104A (zh) 2018-12-21
CN109069104B (zh) 2021-04-27

Similar Documents

Publication Publication Date Title
EP2921115B1 (en) Ultrasound apparatus and method of measuring ultrasound image
US7834850B2 (en) Method and system for object control
JP6371475B2 (ja) 視線入力装置、視線入力方法、および、視線入力プログラム
KR101167248B1 (ko) 터치 인터랙션을 사용하는 초음파 진단 장치
US20140098049A1 (en) Systems and methods for touch-based input on ultrasound devices
US20230274513A1 (en) Content creation in augmented reality environment
US9996160B2 (en) Method and apparatus for gesture detection and display control
WO2014181876A1 (ja) 入力支援装置、入力支援方法、および、プログラム
EP2818115B1 (en) Ultrasonic diagnostic apparatus and method of operating the same
CN109069108B (zh) 超声医学检测设备、传输控制方法以及成像系统和终端
CN111078018A (zh) 显示器的触摸控制方法、终端设备及存储介质
KR20180098499A (ko) 복수의 디스플레이부를 이용한 정보 제공 방법 및 이를 위한 초음파 장치
US20190272090A1 (en) Multi-touch based drawing input method and apparatus
US20230329676A1 (en) Methods and apparatus for performing measurements on an ultrasound image
US20200129156A1 (en) Methods and apparatus for collecting color doppler ultrasound data
US11793482B2 (en) Ultrasound imaging apparatus, method of controlling the same, and computer program product
US11602332B2 (en) Methods and systems for multi-mode ultrasound imaging
WO2018145264A1 (zh) 超声医学检测设备及成像控制方法、成像系统、控制器
CN109069105B (zh) 超声医学检测设备及成像控制方法、成像系统、控制器
WO2017190360A1 (zh) 一种医疗检测系统及其控制方法
JP6256545B2 (ja) 情報処理装置、その制御方法、及びプログラム、並びに、情報処理システム、その制御方法、及びプログラム
US20190114812A1 (en) Method and ultrasound imaging system for emphasizing an ultrasound image on a display screen
US11570017B2 (en) Batch information processing apparatus, batch information processing method, and program
JP2015095127A (ja) タッチパネル式表示装置を備えた音声調整卓、その表示装置の制御方法、及び制御プログラム
CN116149482A (zh) 手势交互方法、装置、电子设备及可存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17896145

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17896145

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 18/02/2020)
