CN108037891B - Annotation method, system, equipment and computer storage medium - Google Patents
- Publication number
- CN108037891B CN108037891B CN201711435035.8A CN201711435035A CN108037891B CN 108037891 B CN108037891 B CN 108037891B CN 201711435035 A CN201711435035 A CN 201711435035A CN 108037891 B CN108037891 B CN 108037891B
- Authority
- CN
- China
- Prior art keywords
- annotation
- target
- entry
- vocabulary
- touch screen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
Abstract
The invention discloses an annotation method, an annotation system, annotation equipment and a computer storage medium, which are applied to the touch screen of an ultrasonic instrument. The method comprises the following steps: displaying the picture image of the main display in real time; under external triggering, controlling a target annotation entry to a first target position of the picture image; and mapping the target annotation entry to a second target position on the main display corresponding to the first target position. Because the picture image displayed on the touch screen is smaller than the image on the main display, the operating distance for moving the annotation entry to the first target position on the touch screen is shorter than the distance for moving it directly to the second target position on the main display. Moreover, compared with the prior art, the operator can complete the whole labeling process directly on the touch screen without switching between the touch screen and the main display. The annotation method, system, equipment and computer storage medium disclosed by the invention therefore improve annotation efficiency to a certain extent.
Description
Technical Field
The invention relates to the technical field of intelligent ultrasonic instrument annotation, in particular to an annotation method, an annotation system, annotation equipment and a computer storage medium.
Background
During use of an ultrasound instrument, a physician sometimes needs to annotate an image acquired by the ultrasound instrument.
One existing annotation method works as follows: an annotation menu is displayed on the touch screen of the ultrasonic instrument; a cursor is moved to a target position on the image displayed on the main display; and the target annotation entry selected on the touch screen is marked at the cursor's target position.
However, in this conventional method the operator must first move the cursor on the main display to the target position and then, on the touch screen, select and confirm the target annotation entry for that position on the main display. The operator therefore performs many steps with a large range of limb movement, so the annotation efficiency of the conventional method is low.
In summary, how to improve the annotation efficiency of the existing annotation method is a problem to be urgently solved by those skilled in the art.
Disclosure of Invention
The invention aims to provide an annotation method that can, to a certain extent, solve the technical problem of improving the annotation efficiency of the existing annotation method. The invention also provides a corresponding annotation system, equipment and computer storage medium.
In order to achieve the above purpose, the invention provides the following technical scheme:
an annotation method applied to a touch screen of an ultrasonic instrument comprises the following steps:
displaying the picture image of the main display in real time;
under external triggering, controlling a target annotation entry to a first target position of the picture image;
mapping the target annotation entry to a second target position on the main display corresponding to the first target position.
Preferably, the controlling a target annotation entry to a first target position of the picture image comprises:
controlling a cursor to the first target position of the picture image;
and labeling the target annotation entry at the first target position of the picture image.
Preferably, before the controlling a target annotation entry to a first target position of the picture image, the method further comprises:
determining the target annotation entry in the annotation menu.
Preferably, the controlling a target annotation entry to a first target position of the picture image comprises:
determining an annotation entry box;
controlling the annotation entry box to the first target position;
and receiving externally input annotation content, and adding the annotation content into the annotation entry box to form the target annotation entry.
Preferably, before the controlling a target annotation entry to a first target position of the picture image, the method further comprises:
determining an annotation entry box;
receiving externally input annotation content;
and adding the annotation content into the annotation entry box to form the target annotation entry.
Preferably, the determining an annotation entry box comprises:
constructing the annotation entry box under external triggering.
Preferably, the determining an annotation entry box comprises:
selecting an annotation entry box in the annotation menu.
Preferably, the receiving externally input annotation content comprises:
receiving annotation content input by external voice.
An annotation system applied to a touch screen of an ultrasonic instrument comprises:
a display module, used for displaying the picture image of the main display in real time;
a control module, used for controlling a target annotation entry to a first target position of the picture image under external triggering;
a mapping module, used for mapping the target annotation entry to a second target position on the main display corresponding to the first target position.
An annotation device, comprising:
a memory for storing a computer program;
a processor for implementing the steps of an annotation method as described in any of the above when said computer program is executed.
A computer storage medium having a computer program stored thereon, which, when executed by a processor, carries out the steps of an annotation method as set out in any one of the preceding claims.
The annotation method provided by the invention is applied to the touch screen of an ultrasonic instrument. It first displays the picture image of the main display in real time; then, under external triggering, controls the target annotation entry to a first target position of the picture image; and finally maps the target annotation entry to a second target position on the main display corresponding to the first target position. Because the image displayed on the touch screen is smaller than the image actually displayed on the main display, the stroke of moving the annotation entry to the first target position on the touch screen is shorter than the stroke of moving it directly to the second target position on the main display; and since this movement is controlled by the operator, the operator's operating distance on the touch screen is correspondingly shorter. In summary, the annotation method provided by the invention solves, to a certain extent, the technical problem of improving the annotation efficiency of the existing annotation method. The annotation system, equipment and computer storage medium provided by the invention also solve the corresponding technical problems.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only embodiments of the present invention; those skilled in the art can obtain other drawings from the provided drawings without creative effort.
Fig. 1 is a flowchart of an annotation method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an annotation system according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an annotation device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort shall fall within the protection scope of the present invention.
The subject that executes each step of the annotation method provided by the embodiments of the present invention may be the annotation system provided by the embodiments of the present invention; since both the method and the system are applied to the touch screen of an ultrasonic instrument, the executing subject may also be that touch screen itself. For convenience of description, the executing subject of each step is taken below to be the touch screen of the ultrasonic instrument, referred to simply as the touch screen. In addition, the main display in the present invention refers to the main display of the ultrasonic instrument.
Referring to fig. 1, fig. 1 is a flowchart illustrating an annotation method according to an embodiment of the invention.
An annotation method applied to a touch screen of an ultrasonic instrument can comprise the following steps:
step S101: and displaying the picture image of the main display in real time.
In practical application, since the image on the main display changes in real time, the touch screen needs to display the picture image of the main display in real time. The display position and size of the picture image on the touch screen can be determined according to actual needs; for example, the touch screen can display the picture image in its middle area and display its other functions in the areas outside the middle area.
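As an illustrative sketch of one way such a layout could be computed (the function name and the specific resolutions are assumptions for illustration, not part of the patent), the touch screen could fit the mirrored image into its middle area while preserving the main display's aspect ratio:

```python
def fit_centered(main_w, main_h, area_w, area_h):
    """Scale the main-display image to fit the touch screen's middle
    area while preserving aspect ratio, and centre it in that area.
    Returns (x, y, w, h) of the mirrored picture image."""
    scale = min(area_w / main_w, area_h / main_h)
    w, h = int(main_w * scale), int(main_h * scale)
    x = (area_w - w) // 2
    y = (area_h - h) // 2
    return x, y, w, h

# Example: a 1920x1080 main display mirrored into a 640x360 middle area
print(fit_centered(1920, 1080, 640, 360))  # -> (0, 0, 640, 360)
```

Because the resulting rectangle is strictly smaller than the main display, any drag inside it covers a shorter physical distance than the equivalent drag on the main display, which is the efficiency gain the patent describes.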
Step S102: under external triggering, controlling the target annotation entry to the first target position of the picture image.
When the image on the main display needs to be labeled, the touch screen, under external triggering, controls the target annotation entry to the first target position of the picture image. The target annotation entry is an entry consisting of an annotation entry box and target annotation content. The shape of the annotation entry box may be a closed figure such as a rectangle, circle or ellipse, or an open figure with only one side, and can be determined according to actual needs. The target annotation content may be any one or combination of English letters, Chinese characters, punctuation marks, figures and the like, also determined according to actual needs. The first target position is the labeling position of the target annotation entry on the picture image; both the target annotation entry and the first target position may be determined according to actual needs.
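A minimal sketch of how such an entry could be modelled in software (the class and field names are illustrative assumptions; the patent only specifies that an entry combines an entry box with annotation content and a position):

```python
from dataclasses import dataclass

@dataclass
class AnnotationEntry:
    """A target annotation entry: an entry box plus annotation content."""
    box_shape: str          # e.g. "rectangle", "circle", "ellipse", "open"
    content: str            # letters, Chinese characters, punctuation, ...
    x: float = 0.0          # first target position on the picture image
    y: float = 0.0

entry = AnnotationEntry(box_shape="rectangle", content="liver lesion")
entry.x, entry.y = 120.0, 85.5   # moved to the first target position
```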
In practical application, the operator can directly drag the target annotation entry to the first target position; correspondingly, the touch screen moves the target annotation entry in real time along the externally dragged track. In a specific application scenario, during the movement of the target annotation entry, the touch screen may decide whether the entry's current position is the first target position by measuring how long the entry stays there. The dwell time can be determined according to actual needs, for example 10 seconds or 20 seconds: with a 10-second threshold, once the touch screen determines that the target annotation entry has stayed at a certain position for 10 seconds, it marks the entry's position, and that position becomes the first target position. Of course, the touch screen may also simply take the position where the movement stops as the first target position. It should be noted that the touch screen still allows the target annotation entry to be moved again, so the outside can readjust its position.
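The dwell-time rule described above can be sketched as follows (the class name, the 10-second threshold and the movement radius are assumptions for illustration, configurable as the text says):

```python
import time

class DwellDetector:
    """Marks a dragged entry's position as the first target position
    once the entry has stayed within a small radius for long enough."""
    def __init__(self, threshold=10.0, radius=5.0, clock=time.monotonic):
        self.threshold = threshold    # dwell time in seconds
        self.radius = radius          # how far the entry may jitter
        self.clock = clock
        self._anchor = None           # (x, y, time first seen there)

    def update(self, x, y):
        """Feed the entry's current position during the drag; returns
        (x, y) of the first target position once the dwell threshold
        is reached, otherwise None."""
        now = self.clock()
        if self._anchor is None:
            self._anchor = (x, y, now)
            return None
        ax, ay, since = self._anchor
        if (x - ax) ** 2 + (y - ay) ** 2 > self.radius ** 2:
            self._anchor = (x, y, now)   # entry moved: restart the clock
            return None
        if now - since >= self.threshold:
            return (ax, ay)
        return None
```

`update` returns `None` while the entry is still moving and the anchored position once the entry has stayed put for the configured dwell time; feeding further positions after that models the re-adjustment the text mentions.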
Step S103: mapping the target annotation entry to a second target position on the main display corresponding to the first target position.
After the touch screen has controlled the target annotation entry to the first target position, the second target position at which the entry is actually marked on the main display can be determined from the first target position on the picture image, and the target annotation entry is then mapped to that second target position on the main display. In this way, the outside labels the picture image displayed on the touch screen, and the corresponding labeling is completed on the main display.
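The mapping from the first target position on the touch screen to the second target position on the main display is a straightforward change of coordinates; a sketch under assumed names (the patent does not prescribe a formula, only the correspondence):

```python
def map_to_main_display(x1, y1, mirror_rect, main_size):
    """Map a first target position (x1, y1) inside the mirrored picture
    image to the corresponding second target position on the main display.
    mirror_rect: (mx, my, mw, mh) of the mirrored image on the touch screen
    main_size:   (W, H) resolution of the main display"""
    mx, my, mw, mh = mirror_rect
    W, H = main_size
    # Normalise within the mirrored image, then scale up to the main display.
    x2 = (x1 - mx) / mw * W
    y2 = (y1 - my) / mh * H
    return x2, y2

# A point at the centre of a 640x360 mirror maps to the centre of 1920x1080
print(map_to_main_display(320, 180, (0, 0, 640, 360), (1920, 1080)))  # (960.0, 540.0)
```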
The annotation method provided by the invention is applied to the touch screen of an ultrasonic instrument. It first displays the picture image of the main display in real time; then, under external triggering, controls the target annotation entry to a first target position of the picture image; and finally maps the target annotation entry to a second target position on the main display corresponding to the first target position. Because the image displayed on the touch screen is smaller than the image actually displayed on the main display, the stroke of moving the annotation entry to the first target position on the touch screen is shorter than the stroke of moving it directly to the second target position on the main display; and since this movement is controlled by the operator, the operator's operating distance on the touch screen is correspondingly shorter. In summary, the annotation method provided by the invention solves, to a certain extent, the technical problem of improving the annotation efficiency of the existing annotation method.
In the above embodiment, the outside performs labeling by directly dragging the target annotation entry. In a specific application scenario, the outside may instead determine the first target position first and then add the label there: for example, the operator first moves a cursor on the touch screen to the first target position on the picture image, and then clicks the target annotation entry to label it at that position, completing the labeling process. Correspondingly, step S102, in which the touch screen controls the target annotation entry to the first target position of the picture image, may specifically be:
step S1021: controlling a cursor to a first target position of the picture image;
step S1022: and labeling the target annotation entry to a first target position of the picture image.
In practical application, the touch screen may store the annotation entries in the form of an annotation menu. Correspondingly, the touch screen may provide an annotation key, which may be a virtual key on the touch screen; the touch screen displays or hides the annotation menu when the annotation key is clicked externally. For example, when the annotation menu is displayed, clicking the annotation key hides it; correspondingly, when the menu is hidden, clicking the key displays it. When the picture image occupies the middle area of the touch screen, the annotation menu can be displayed in the areas outside the middle area; in a specific application scenario, the touch screen may also directly display each annotation entry of the annotation menu in those outer areas according to a certain display rule. Certainly, the display area of the picture image can be reserved on the touch screen, so that the annotation entries are displayed outside that area as soon as the touch screen starts up, which is convenient for subsequent annotation. Correspondingly, before controlling the target annotation entry to the first target position of the picture image, the touch screen may determine the target annotation entry in the annotation menu; in this case the target annotation entry is one of the entries in the annotation menu.
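The show/hide behaviour of the virtual annotation key described above amounts to a simple toggle; a sketch with assumed class and method names:

```python
class AnnotationMenu:
    """Annotation menu shown or hidden by the virtual annotation key."""
    def __init__(self, entries):
        self.entries = list(entries)
        self.visible = False        # hidden until the key is first clicked

    def on_annotation_key(self):
        """Clicking the key shows the menu when hidden, hides it when shown."""
        self.visible = not self.visible
        return self.visible

menu = AnnotationMenu(["liver", "kidney", "spleen"])
print(menu.on_annotation_key())  # True  (menu displayed)
print(menu.on_annotation_key())  # False (menu hidden again)
```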
In practical application, the target annotation entry may not be in the annotation menu; in that case, the touch screen needs to generate the target annotation entry under external triggering. In this embodiment, the touch screen combines the process of generating the target annotation entry with the process of controlling it to the first target position of the picture image. Correspondingly, step S102, in which the touch screen controls the target annotation entry to the first target position of the picture image, may specifically include the following steps:
step S1121: determining an annotation vocabulary entry box;
step S1122: controlling the annotation vocabulary entry box to a first target position;
step S1123: and receiving annotation content input from the outside, and adding the annotation content into the annotation vocabulary entry frame to form a target annotation vocabulary entry.
In the above embodiment, the touch screen combines the generation of the target annotation entry with its movement to the first target position of the picture image. In this embodiment, the touch screen instead generates the target annotation entry before step S102 controls it to the first target position; this generation process may specifically include the following steps:
step S1221: determining an annotation vocabulary entry box;
step S1222: receiving annotation content input from the outside;
step S1223: and adding the annotation content into the annotation vocabulary entry frame to form a target annotation vocabulary entry.
In the two embodiments above, the annotation entry box needs to be determined when the touch screen generates the target annotation entry, and there are various ways to determine it. For example, the touch screen can generate a corresponding annotation entry box under external triggering; in a specific application scenario, different types of annotation entry boxes may be stored in the annotation menu, in which case the touch screen selects the required annotation entry box from the menu. In addition, the receiving of externally input annotation content in step S1123 and step S1222 may specifically be: the touch screen receives annotation content input by external voice. Of course, the touch screen may also receive externally typed annotation content and the like; the invention is not limited in detail here.
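Steps S1221–S1223 — determine a box, receive content, combine them into the target annotation entry — can be sketched as follows (the list of box types and the input callback are illustrative assumptions; the content callback stands in for either typed input or a speech-to-text result):

```python
BOX_TYPES = ["rectangle", "circle", "ellipse", "single-side"]  # stored in the menu

def build_target_entry(box_type, receive_content):
    """Form a target annotation entry from a chosen entry box and
    externally supplied content (typed or transcribed from voice)."""
    if box_type not in BOX_TYPES:
        raise ValueError(f"unknown annotation entry box: {box_type}")
    content = receive_content()          # e.g. keyboard input or speech-to-text
    return {"box": box_type, "content": content}

# Example with a lambda standing in for the external content source:
result = build_target_entry("rectangle", lambda: "left kidney")
print(result)  # {'box': 'rectangle', 'content': 'left kidney'}
```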
The invention also provides an annotation system which has the corresponding effect of the annotation method provided by the embodiment of the invention. Referring to fig. 2, fig. 2 is a schematic structural diagram of an annotation system according to an embodiment of the present invention.
The annotation system provided by the embodiment of the invention is applied to a touch screen of an ultrasonic instrument, and can comprise:
a display module 101, configured to display the picture image of the main display in real time;
a control module 102, configured to control a target annotation entry to a first target position of the picture image under external triggering;
a mapping module 103, configured to map the target annotation entry to a second target position on the main display corresponding to the first target position.
In an annotation system provided in an embodiment of the present invention, the control module may include:
a first control unit, used for controlling the cursor to the first target position of the picture image;
a first labeling unit, used for labeling the target annotation entry at the first target position of the picture image.
In an annotation system provided in an embodiment of the present invention, the annotation system may further include:
a first determining module, used for determining the target annotation entry in the annotation menu before the control module controls the target annotation entry to the first target position of the picture image.
In an annotation system provided in an embodiment of the present invention, the control module may include:
a determining unit, configured to determine an annotation entry box;
a second control unit, used for controlling the annotation entry box to the first target position;
a receiving unit, used for receiving externally input annotation content and adding the annotation content into the annotation entry box to form the target annotation entry.
In an annotation system provided in an embodiment of the present invention, the annotation system may further include:
a second determining module, used for determining an annotation entry box before the control module controls the target annotation entry to the first target position of the picture image;
a receiving module, used for receiving externally input annotation content;
an adding module, used for adding the annotation content into the annotation entry box to form the target annotation entry.
In an annotation system provided in an embodiment of the present invention, the determining unit and the second determining module may both include:
a construction subunit, used for constructing the annotation entry box under external triggering.
In an annotation system provided in an embodiment of the present invention, the determining unit and the second determining module may both include:
a selection subunit, used for selecting an annotation entry box in the annotation menu.
In an annotation system provided in an embodiment of the present invention, the receiving unit and the receiving module may both include:
a receiving subunit, used for receiving annotation content input by external voice.
The invention also provides an annotation device and a computer storage medium, which have the corresponding effects of the annotation method provided by the embodiment of the invention. Referring to fig. 3, fig. 3 is a schematic structural diagram of an annotating device according to an embodiment of the present invention.
An annotation device provided in an embodiment of the present invention may include:
a memory 201 for storing a computer program;
a processor 202 for implementing the steps of an annotation method as described in any of the above embodiments when executing a computer program.
In an embodiment of the present invention, a computer storage medium is provided, where a computer program is stored on the computer storage medium, and when the computer program is executed by a processor, the steps of an annotation method described in any one of the above embodiments are implemented.
For a description of a relevant part in an annotation system, an annotation device, and a computer storage medium provided in the embodiments of the present invention, reference is made to detailed descriptions of a corresponding part in an annotation method provided in the embodiments of the present invention, and details are not repeated here. In addition, parts of the above technical solutions provided in the embodiments of the present invention that are consistent with the implementation principles of the corresponding technical solutions in the prior art are not described in detail, so as to avoid redundant description.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (10)
1. An annotation method, applied to a touch screen of an ultrasound instrument, comprising:
displaying a picture image of a main display in real time, wherein the picture image displayed by the touch screen is smaller than the picture image displayed by the main display;
under external triggering, controlling a target annotation entry to a first target position of the picture image, wherein annotation entries are stored in the form of an annotation menu, and the annotation menu is displayed or hidden when a virtual annotation key is clicked externally;
mapping the target annotation entry to a second target position on the main display corresponding to the first target position;
wherein the controlling a target annotation entry to a first target position of the picture image comprises:
judging whether the stay time of the target annotation entry at the first target position is longer than a preset time, and if so, calibrating the target annotation entry at the first target position.
2. The method of claim 1, wherein before the controlling a target annotation entry to a first target position of the picture image, the method further comprises:
determining the target annotation entry in the annotation menu.
3. The method of claim 1, wherein the controlling a target annotation entry to a first target position of the picture image comprises:
determining an annotation entry box;
controlling the annotation entry box to the first target position;
and receiving externally input annotation content, and adding the annotation content into the annotation entry box to form the target annotation entry.
4. The method of claim 1, wherein before the controlling a target annotation entry to a first target position of the picture image, the method further comprises:
determining an annotation entry box;
receiving externally input annotation content;
and adding the annotation content into the annotation entry box to form the target annotation entry.
5. The method of claim 3 or 4, wherein the determining an annotation entry box comprises:
constructing the annotation entry box under external triggering.
6. The method of claim 3 or 4, wherein the determining an annotation entry box comprises:
selecting an annotation entry box in the annotation menu.
7. The method of claim 3 or 4, wherein the receiving externally input annotation content comprises:
receiving annotation content input by external voice.
8. An annotation system, applied to a touch screen of an ultrasound instrument, comprising:
the display module is used for displaying the screen image of a main display in real time, wherein the screen image displayed by the touch screen is smaller than the screen image displayed by the main display;
the control module is used for controlling a target annotation entry to a first target position of the screen image under an external trigger, wherein the annotation entries are stored in the form of an annotation menu, and the annotation menu is displayed or hidden when a virtual annotation button is externally clicked;
the mapping module is used for mapping the target annotation entry to a second target position on the main display corresponding to the first target position;
wherein the controlling a target annotation entry to a first target position of the screen image comprises:
determining whether the dwell time of the target annotation entry at the first target position exceeds a preset duration, and if so, fixing the target annotation entry at the first target position.
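Claim 8 describes two operations that can be sketched concretely: (a) mapping a position on the smaller touch-screen image to the corresponding position on the larger main display, and (b) fixing the annotation entry only once its dwell time at the position exceeds a preset threshold. The function names, the linear scaling, and the 0.5 s default threshold below are all illustrative assumptions, not details stated in the patent.

```python
# Illustrative sketch of the claim-8 mapping and dwell-time check.
# The patent does not specify the mapping; a uniform linear scale between
# the two image sizes is assumed here for the sake of the example.

def map_to_main_display(pos, touch_size, main_size):
    """Scale an (x, y) position on the touch-screen image to the
    corresponding position on the main-display image."""
    sx = main_size[0] / touch_size[0]
    sy = main_size[1] / touch_size[1]
    return (pos[0] * sx, pos[1] * sy)


def should_fix(dwell_seconds, threshold_seconds=0.5):
    """Fix (anchor) the annotation entry once it has stayed at the
    first target position longer than the preset duration."""
    return dwell_seconds > threshold_seconds


# Usage: a first target position on an 800x600 touch image maps to the
# second target position on a 1920x1080 main display.
second_target = map_to_main_display((200, 150), (800, 600), (1920, 1080))
# second_target == (480.0, 270.0)
```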
9. An annotation device, comprising:
a memory for storing a computer program;
a processor for implementing the steps of the annotation method according to any one of claims 1 to 7 when executing the computer program.
10. A computer storage medium, wherein the computer storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the annotation method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711435035.8A CN108037891B (en) | 2017-12-26 | 2017-12-26 | Annotation method, system, equipment and computer storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108037891A CN108037891A (en) | 2018-05-15 |
CN108037891B true CN108037891B (en) | 2022-04-01 |
Family
ID=62101282
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711435035.8A Active CN108037891B (en) | 2017-12-26 | 2017-12-26 | Annotation method, system, equipment and computer storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108037891B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108962357A (en) * | 2018-06-28 | 2018-12-07 | 深圳开立生物医疗科技股份有限公司 | A kind of annotation method, system, equipment and computer readable storage medium |
CN109192282B (en) * | 2018-08-10 | 2021-09-14 | 飞依诺科技(苏州)有限公司 | Editing method and device for medical image annotation, computer equipment and storage medium |
CN109343777B (en) * | 2018-09-11 | 2020-05-05 | 北京市劳动保护科学研究所 | Labeling method and system |
CN109857318A (en) * | 2018-12-26 | 2019-06-07 | 深圳开立生物医疗科技股份有限公司 | Ultrasound image processing method, equipment and storage medium based on compuscan |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105487793A (en) * | 2014-07-09 | 2016-04-13 | 深圳市理邦精密仪器股份有限公司 | Portable ultrasound user interface and resource management systems and methods |
WO2017060791A1 (en) * | 2015-10-08 | 2017-04-13 | Koninklijke Philips N.V. | Apparatuses, methods, and systems for annotation of medical images |
CN107003404A (en) * | 2014-10-15 | 2017-08-01 | 三星电子株式会社 | The method and ultrasonic equipment of information are provided using multiple displays |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7777731B2 (en) * | 2006-10-13 | 2010-08-17 | Siemens Medical Solutions Usa, Inc. | System and method for selection of points of interest during quantitative analysis using a touch screen display |
CN106843797A (en) * | 2017-03-13 | 2017-06-13 | 广州视源电子科技股份有限公司 | The edit methods and device of a kind of image file |
- 2017-12-26: CN CN201711435035.8A patent/CN108037891B/en active Active
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108037891B (en) | Annotation method, system, equipment and computer storage medium | |
US11494244B2 (en) | Multi-window control method and electronic device supporting the same | |
US11262895B2 (en) | Screen capturing method and apparatus | |
US9324305B2 (en) | Method of synthesizing images photographed by portable terminal, machine-readable storage medium, and portable terminal | |
US10929013B2 (en) | Method for adjusting input virtual keyboard and input apparatus | |
US9495802B2 (en) | Position identification method and system | |
JP5458161B1 (en) | Electronic apparatus and method | |
CN108876934B (en) | Key point marking method, device and system and storage medium | |
US9001036B2 (en) | Systems and methods of camera-based fingertip tracking | |
EP2733629A1 (en) | System for associating tag information with images supporting image feature search | |
CN108553894B (en) | Display control method and device, electronic equipment and storage medium | |
US20180121076A1 (en) | Drawing processing method, drawing program, and drawing device | |
US9354711B2 (en) | Dynamic hand-gesture-based region of interest localization | |
US9146667B2 (en) | Electronic device, display system, and method of displaying a display screen of the electronic device | |
CN109918685B (en) | Computer-aided translation method, device, computer equipment and storage medium | |
JPWO2015083290A1 (en) | Electronic device and method for processing handwritten document information | |
CN109976614B (en) | Method, device, equipment and medium for marking three-dimensional graph | |
US20150212713A1 (en) | Information processing apparatus, information processing method, and computer-readable recording medium | |
WO2021197094A1 (en) | Translation text display | |
KR20150106823A (en) | Gesture recognition apparatus and control method of gesture recognition apparatus | |
US20230259697A1 (en) | Annotation Display Method and Electronic Device | |
US20160062601A1 (en) | Electronic device with touch screen and method for moving application functional interface | |
CN106598610B (en) | Refreshing implementation method and device based on webpage application | |
CN105867798A (en) | Touch screen recording method and device | |
KR102171327B1 (en) | Method for proving translation service and terminal device using the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |