US20150005630A1 - Method of sharing information in ultrasound imaging


Info

Publication number
US20150005630A1
Authority
US
United States
Prior art keywords
ultrasound
ultrasound image
information
image
annotation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/320,971
Other languages
English (en)
Inventor
Jongwoo JUNG
Eun-ho YANG
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020140078390A external-priority patent/KR102207255B1/ko
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. reassignment SAMSUNG ELECTRONICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: JUNG, JONGWOO, Yang, Eun-ho
Publication of US20150005630A1 publication Critical patent/US20150005630A1/en
Priority to US16/031,669 priority Critical patent/US20180317890A1/en


Classifications

    • A61B 8/565: Details of data transmission or power supply involving data transmission via a network
    • A61B 8/13: Tomography
    • A61B 8/465: Displaying means of special interest adapted to display user selection data, e.g. icons or menus
    • A61B 8/468: Interfacing arrangements characterised by special input means allowing annotation or message recording
    • A61B 8/469: Interfacing arrangements characterised by special input means for selection of a region of interest
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • G16H 30/20: ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H 30/40: ICT specially adapted for processing medical images, e.g. editing

Definitions

  • Apparatuses and methods consistent with exemplary embodiments relate to sharing information about ultrasound images between an ultrasound apparatus and an external device.
  • An ultrasound diagnostic apparatus transmits ultrasound signals from the surface of an object toward a predetermined portion inside the object, and obtains a tomogram of soft tissue or an image of blood flow by using information carried by the ultrasound signals reflected from the internal regions of the object.
  • An ultrasound diagnostic apparatus is compact, inexpensive, and highly reliable, and carries no risk of X-ray exposure; thus, it is widely used together with other diagnostic imaging apparatuses such as an X-ray diagnostic apparatus, a computed tomography (CT) scanner, a magnetic resonance imaging (MRI) apparatus, and a nuclear medicine imaging apparatus.
  • Exemplary embodiments may address at least the above problems and/or disadvantages and other disadvantages not described above. Also, the exemplary embodiments are not required to overcome the disadvantages described above, and an exemplary embodiment may not overcome any of the problems described above.
  • One or more exemplary embodiments provide an ultrasound apparatus sharing ultrasound information about an ultrasound image with a receiving device according to a sharing level of the receiving device, and an information sharing method of the ultrasound apparatus.
  • One or more exemplary embodiments also provide a medical expert device capable of sending and receiving information about an ultrasound image to and from an ultrasound apparatus, and a communication method of the medical expert device.
  • According to an aspect of an exemplary embodiment, there is provided a method of sharing information with an external device by an ultrasound apparatus, the method including: acquiring an ultrasound image of an object by transmitting an ultrasound signal to the object and receiving an ultrasound response signal from the object; identifying a sharing level of the external device, which is set in advance, for sharing the ultrasound image; and transmitting ultrasound information about the ultrasound image to the external device according to the sharing level that is set in advance.
  • the ultrasound information may be displayed on the external device in real-time.
  • the external device may include a first device and a second device, and the transmitting may include: transmitting first ultrasound information corresponding to a first sharing level of the first device to the first device; and transmitting second ultrasound information corresponding to a second sharing level of the second device to the second device.
  • the first device may include a patient device
  • the second device may include a medical expert device.
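The per-device sharing described above can be sketched as a filter that selects which fields of the ultrasound information a device is permitted to receive. The field names, level names, and sample values below are illustrative assumptions, not part of the disclosed method.

```python
# Hypothetical sketch: select ultrasound information fields by sharing level.
# Field and level names are invented for illustration.
ULTRASOUND_INFO = {
    "image": "us_image_bytes",
    "annotations": ["cyst", "liver"],
    "roi_marks": [(120, 80, 40, 40)],        # (x, y, w, h) rectangles
    "analysis": "technician notes",
    "measurements": {"lesion_mm": 12.4},
    "patient": {"id": "P-001", "name": "Jane Doe"},
}

# Sharing level -> fields that level may receive.
LEVEL_FIELDS = {
    "patient": {"image", "annotations", "roi_marks"},
    "medical_expert": {"image", "annotations", "roi_marks",
                       "analysis", "measurements", "patient"},
}

def info_for_device(sharing_level):
    """Return only the fields permitted for the given sharing level."""
    allowed = LEVEL_FIELDS[sharing_level]
    return {k: v for k, v in ULTRASOUND_INFO.items() if k in allowed}

first_info = info_for_device("patient")         # for the first (patient) device
second_info = info_for_device("medical_expert") # for the second (expert) device
```

Under this sketch, the patient device never receives the technician's analysis or the patient record, while the medical expert device receives everything.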
  • the transmitting of the ultrasound information may include generating the ultrasound information by adding at least one annotation related to all or some of the ultrasound image to the ultrasound image.
  • the adding of the at least one annotation to the ultrasound image may include: analyzing the ultrasound image; recommending at least one annotation related to all or some of the ultrasound image based on a result of the analyzing; and displaying the at least one recommended annotation on the ultrasound image.
  • the adding of the at least one annotation to the ultrasound image may include: displaying an annotation list that is set in advance; receiving a selection of at least one annotation from the annotation list; and displaying the selected at least one annotation on the ultrasound image.
  • the displaying of the selected at least one annotation on the ultrasound image may include: receiving a drag and drop input that includes dragging the selected at least one annotation to a region where the ultrasound image is displayed and dropping the at least one annotation; and displaying the selected at least one annotation on the ultrasound image based on the drag and drop input.
  • the method may further include: when a touch input of touching the at least one annotation displayed on the ultrasound image for a predetermined length of time or longer is received, activating the touched annotation to be moved; receiving a drag input of the user on the touched at least one annotation; and moving the touched at least one annotation according to the drag input of the user.
  • the method may further include: providing an activated annotation list about at least one annotation added to the ultrasound image on a predetermined region of a screen.
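The annotation-recommendation step above (analyze the image, then recommend annotations related to all or some of it) could be sketched as a lookup from detected image features to candidate annotation strings. The feature names and mapping are hypothetical; a real system would derive them from actual image analysis.

```python
# Hypothetical sketch: recommend annotations from image-analysis results.
# Feature names and the mapping are invented for illustration.
RECOMMENDATIONS = {
    "hypoechoic_mass": ["cyst", "nodule"],
    "fluid": ["ascites"],
}

def recommend_annotations(detected_features):
    """Map features found by image analysis to candidate annotations."""
    candidates = []
    for feature in detected_features:
        candidates.extend(RECOMMENDATIONS.get(feature, []))
    return candidates

# The recommended annotations would then be displayed on the ultrasound
# image for the user to accept, move, or discard.
suggested = recommend_annotations(["hypoechoic_mass"])
```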
  • the transmitting of the ultrasound information may include: selecting a region of interest from the ultrasound image; indicating an identification mark on the region of interest; and transmitting the ultrasound information including information about the region of interest that is indicated by the identification mark to the external device.
  • the indicating of the identification mark may include: adding a color to the region of interest.
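Indicating the region of interest by adding a color could be sketched as blending a tint over the ROI pixels of a grayscale image. The blend factor and tint color below are arbitrary assumptions.

```python
# Hypothetical sketch: tint a rectangular ROI of a grayscale ultrasound image.
def mark_roi(gray, roi, tint=(255, 0, 0), alpha=0.3):
    """Return an RGB image with the ROI pixels blended toward `tint`.

    gray: 2D list of 0-255 intensity values; roi: (x, y, width, height).
    """
    x, y, w, h = roi
    # Promote grayscale to RGB so a color can be applied.
    rgb = [[(v, v, v) for v in row] for row in gray]
    for j in range(y, y + h):
        for i in range(x, x + w):
            rgb[j][i] = tuple(
                int((1 - alpha) * c + alpha * t)
                for c, t in zip(rgb[j][i], tint)
            )
    return rgb

image = [[100] * 4 for _ in range(4)]   # toy 4x4 grayscale image
marked = mark_roi(image, (1, 1, 2, 2))  # tint the central 2x2 region
```

Pixels outside the ROI keep their original gray value, so the colored region stands out when the shared image is displayed on the receiving device.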
  • the transmitting of the second ultrasound information may include generating the second ultrasound information including at least one of analysis information about the ultrasound image generated by an ultrasound technician, measurement information with respect to the ultrasound image, and patient information.
  • the method may further include: receiving a message about the ultrasound image from the external device; and displaying the received message on a screen.
  • the method may further include: providing a chatting window for communicating with the medical expert device on a screen; receiving a user input about the ultrasound image through the chatting window; transmitting the user input to the medical expert device; and receiving a response message to the user input from the medical expert device.
  • the method may further include: transmitting a request for confirmation about the second ultrasound information to the medical expert device; and receiving a confirmation message of the second ultrasound information from the medical expert device.
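The confirmation exchange above could be modeled as two small messages between the ultrasound apparatus and the medical expert device. The message shapes and field names below are assumptions for illustration, not a disclosed wire format.

```python
# Hypothetical sketch: confirmation request/response message shapes.
def make_confirmation_request(image_id, info):
    """Apparatus side: ask the expert device to confirm the ultrasound info."""
    return {"type": "confirm_request", "image_id": image_id, "info": info}

def handle_confirmation_request(request, approve):
    """Expert-device side: answer a confirmation request."""
    return {
        "type": "confirm_response",
        "image_id": request["image_id"],
        "confirmed": approve,
    }

req = make_confirmation_request("us-42", {"lesion_mm": 12.4})
resp = handle_confirmation_request(req, approve=True)
```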
  • the method may further include: receiving control information from the medical expert device; and executing a control command corresponding to the control information.
  • the control information may include at least one of a control command for selecting and displaying another ultrasound image, a control command for expanding or reducing the ultrasound image, a control command for storing the ultrasound image, a control command for three-dimensional (3D) rendering the ultrasound image, a control command for adding an annotation or a body marker, a control command for measuring the region of interest, and a control command for correcting the analysis information about the ultrasound image.
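On the receiving side, such control information could be routed through a dispatch table keyed by command name. The command names and handlers here are illustrative only; they are not the patent's actual command set.

```python
# Hypothetical sketch: dispatch remote control commands on the ultrasound side.
class UltrasoundApparatus:
    def __init__(self):
        self.zoom = 1.0
        self.stored = []

    def expand(self, factor=2.0):
        """Expand (or, with factor < 1, reduce) the displayed image."""
        self.zoom *= factor

    def store(self, image_id):
        """Store the current ultrasound image."""
        self.stored.append(image_id)

    def execute(self, command, **kwargs):
        """Run the handler for a received control command."""
        handlers = {"expand_image": self.expand, "store_image": self.store}
        handlers[command](**kwargs)  # unknown commands raise KeyError

apparatus = UltrasoundApparatus()
apparatus.execute("expand_image", factor=1.5)
apparatus.execute("store_image", image_id="us-42")
```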
  • the method may further include: providing a list of devices that are capable of communicating with the ultrasound apparatus; and receiving a selection of the external device from the list of devices.
  • the method may further include: requesting a server for an authentication of the selected external device; and receiving information of a sharing level of the external device from the server, when the authentication has succeeded.
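The device-selection and authentication flow could be sketched as a lookup against a server-side registry that returns a sharing level only when authentication succeeds. The registry, tokens, and level names are invented for illustration.

```python
# Hypothetical sketch: mock server-side authentication returning a sharing level.
REGISTRY = {
    "device-1": {"token": "abc", "sharing_level": "patient"},
    "device-2": {"token": "xyz", "sharing_level": "medical_expert"},
}

def authenticate(device_id, token):
    """Return the device's sharing level, or None if authentication fails."""
    entry = REGISTRY.get(device_id)
    if entry is None or entry["token"] != token:
        return None
    return entry["sharing_level"]

# The ultrasound apparatus would call this after the user selects an
# external device from the list of reachable devices.
level = authenticate("device-2", "xyz")
```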
  • According to an aspect of another exemplary embodiment, there is provided a method of communicating with at least one ultrasound apparatus by a medical expert device, the method including: displaying an ultrasound image list including at least one ultrasound image acquired by the at least one ultrasound apparatus on a first region of a screen; receiving a selection of an ultrasound image from the ultrasound image list; displaying the selected ultrasound image on a second region of the screen; and communicating with an ultrasound apparatus that acquires the selected ultrasound image.
  • the displaying of the ultrasound image list may include displaying the ultrasound image list including a real-time ultrasound image transmitted from the at least one ultrasound apparatus and an ultrasound image stored in advance.
  • the displaying of the ultrasound image list may further include indicating an identification mark on the real-time ultrasound image to distinguish the real-time ultrasound image from the ultrasound image stored in advance.
  • the displaying of the selected ultrasound image may include displaying at least one of analysis information by an ultrasound technician about the selected ultrasound image, measurement information with respect to the selected ultrasound image, and patient information, according to a sharing level of the medical expert device.
  • the displaying of the selected ultrasound image may include displaying a pointer of the ultrasound apparatus and a pointer of the medical expert device on the selected ultrasound image.
  • the communicating may include: providing a chatting window for communicating with the ultrasound apparatus on a third region of the screen; receiving an input about the selected ultrasound image through the chatting window; and transmitting the received input to the ultrasound apparatus.
  • the communicating may include: receiving a request for confirmation about the selected ultrasound image from the ultrasound apparatus; receiving confirmation information about the selected ultrasound image; and transmitting the confirmation information to the ultrasound apparatus.
  • the transmitting of the confirmation information may include: displaying a graphical user interface (GUI) for receiving a confirmation input of the selected ultrasound image; and receiving a confirmation input about the selected ultrasound image through the GUI.
  • the communicating may include transmitting control information for controlling the ultrasound apparatus to the ultrasound apparatus.
  • the transmitting of the control information may include transmitting to the ultrasound apparatus at least one of a control command for selecting and displaying another ultrasound image, a control command for expanding or reducing the ultrasound image, a control command for storing the ultrasound image, a control command for 3D rendering the ultrasound image, a control command for adding an annotation or a body marker, a control command for measuring the region of interest, and a control command for correcting the analysis information about the ultrasound image.
  • the method may further include communicating with a patient device that displays the selected ultrasound image.
  • the communicating with the patient device may include: receiving a description by a medical expert about the selected ultrasound image; and transmitting the description by the medical expert to the patient device.
  • the description by the medical expert may be displayed on the patient device in real-time.
  • According to an aspect of another exemplary embodiment, there is provided a method of communicating with a patient device by a medical expert device, the method including: displaying an ultrasound image of a patient on a screen of the medical expert device; receiving a description by a medical expert about the ultrasound image; and transmitting the description by the medical expert to the patient device displaying the ultrasound image, wherein the description by the medical expert is displayed on the patient device.
  • the description by the medical expert may be displayed on the patient device in real-time, or stored in the patient device.
  • According to an aspect of another exemplary embodiment, there is provided an ultrasound apparatus including: an ultrasound image obtainer for acquiring an ultrasound image of an object by transmitting an ultrasound signal to the object and receiving an ultrasound response signal from the object; a controller for identifying a sharing level of an external device, which is set in advance, for sharing the ultrasound image; and a communicator for transmitting ultrasound information about the ultrasound image to the external device, according to the sharing level that is set in advance.
  • the external device may include a first device and a second device, and the controller generates first ultrasound information corresponding to a first sharing level of the first device, and second ultrasound information corresponding to a second sharing level of the second device.
  • the ultrasound apparatus may further include an image processor for adding at least one annotation relating to all or some of the ultrasound image to the ultrasound image.
  • the ultrasound apparatus may further include: a user input unit for receiving a selection of at least one annotation from the annotation list displayed on the screen; and a display for displaying the selected at least one annotation on the ultrasound image.
  • the user input unit may receive a drag and drop input that includes dragging the selected at least one annotation to a region where the ultrasound image is displayed and dropping the at least one annotation, and the display may display the selected at least one annotation on the ultrasound image based on the drag and drop input.
  • when a touch input of touching the at least one annotation displayed on the ultrasound image for a predetermined length of time or longer is received, the controller may activate the touched at least one annotation to be moved, the user input unit may receive a drag input of the user on the touched at least one annotation, and the display may move the touched at least one annotation according to the drag input of the user.
  • the ultrasound apparatus may further include a display for providing an activated annotation list about at least one annotation added to the ultrasound image on a predetermined region of the screen.
  • the controller may generate the ultrasound information by selecting a region of interest from the ultrasound image and adding a color to the region of interest.
  • the second ultrasound information may include at least one of analysis information by an ultrasound technician about the ultrasound image, measurement information with respect to the ultrasound image, and patient information.
  • the communicator may receive a message about the ultrasound image from the external device, and the controller may control the display to display the received message on a screen.
  • the ultrasound apparatus may further include: a display for providing a chatting window for communicating with the second device on a screen; and a user input unit for receiving a user input about the ultrasound image through the chatting window, wherein the communicator may transmit the user input to the second device and receive a response message to the user input from the second device.
  • the communicator may transmit a request for confirmation about the second ultrasound information to the second device and receive a confirmation message of the second ultrasound information from the second device.
  • the communicator may receive control information from the medical expert device, and the controller may execute a control command corresponding to the control information.
  • the ultrasound apparatus may further include: a display for providing a list of devices that are capable of communicating with the ultrasound apparatus; and a user input unit for receiving a selection of the external device from the list of devices.
  • the communicator may request a server for an authentication of the selected external device, and receive information of a sharing level of the external device from the server, when the authentication has succeeded.
  • According to an aspect of another exemplary embodiment, there is provided a medical expert device including: a display for displaying an ultrasound image list including at least one ultrasound image acquired by at least one ultrasound apparatus on a first region of a screen, and displaying an ultrasound image selected from the ultrasound image list on a second region of the screen; a user input unit for receiving a selection of the ultrasound image from the ultrasound image list; a communicator for communicating with an ultrasound apparatus that acquires the selected ultrasound image; and a controller for controlling the display, the user input unit, and the communicator.
  • the ultrasound image list may include a real-time ultrasound image transmitted from the at least one ultrasound apparatus and an ultrasound image stored in advance.
  • the controller may indicate an identification mark on the real-time ultrasound image to distinguish the real-time ultrasound image from the ultrasound image stored in advance.
  • the display may display at least one of analysis information by an ultrasound technician about the selected ultrasound image, measurement information with respect to the selected ultrasound image, and patient information, according to a sharing level of the medical expert device.
  • the display may display a pointer of the ultrasound apparatus and a pointer of the medical expert device on the selected ultrasound image.
  • the display may provide a chatting window for communicating with the ultrasound apparatus on a third region of the screen, the user input unit may receive an input about the selected ultrasound image through the chatting window, and the communicator may transmit the received input to the ultrasound apparatus.
  • the communicator may receive a request for confirmation about the selected ultrasound image from the ultrasound apparatus, and transmit the confirmation information to the ultrasound apparatus.
  • the display may display a graphical user interface (GUI) for receiving a confirmation input of the selected ultrasound image, and receive a confirmation input about the selected ultrasound image through the GUI.
  • the communicator may transmit control information for controlling the ultrasound apparatus to the ultrasound apparatus.
  • the user input unit may receive a description by a medical expert about the selected ultrasound image, and the communicator may transmit the description by the medical expert to the patient device.
  • According to an aspect of another exemplary embodiment, there is provided a method of sharing information in a medical imaging apparatus, the method including: acquiring a medical image of an object; selecting an external device for sharing information about the medical image; identifying a sharing level of the external device that is selected; and transmitting the information about the medical image to the external device based on the sharing level.
  • the sharing level may include information representing an authority of a user of the external device for checking the information about the medical image.
  • the external device may include a first device and a second device, and the transmitting of the information may include: transmitting first medical image information corresponding to a first sharing level of the first device to the first device; and transmitting second medical image information corresponding to a second sharing level of the second device to the second device.
  • FIG. 1 is a block diagram of an information sharing system according to an exemplary embodiment
  • FIG. 2 is a block diagram of an information sharing system according to another exemplary embodiment
  • FIG. 3 is a flowchart illustrating an information sharing method, according to an exemplary embodiment
  • FIG. 4 is a flowchart illustrating a method of generating ultrasound information, according to an exemplary embodiment
  • FIG. 5 is a flowchart illustrating a method of generating ultrasound information corresponding to a sharing level of a device, according to an exemplary embodiment
  • FIG. 6 is a diagram showing a graphical user interface (GUI) for adding an annotation, according to an exemplary embodiment
  • FIG. 7 is a diagram showing a GUI for marking a region of interest according to an exemplary embodiment
  • FIG. 8 is a diagram showing a GUI for adding an annotation and a color, according to an exemplary embodiment
  • FIG. 9 is a diagram showing a screen of a device according to an exemplary embodiment.
  • FIG. 10 is a flowchart illustrating a communication method according to an exemplary embodiment
  • FIGS. 11A and 11B are diagrams showing a GUI for confirming an ultrasound image, according to an exemplary embodiment
  • FIG. 12 is a diagram showing a screen providing a chatting window, according to an exemplary embodiment
  • FIG. 13 is a diagram showing a screen displaying a message, according to an exemplary embodiment
  • FIG. 14 is a flowchart illustrating a communication method, according to an exemplary embodiment
  • FIG. 15 is a diagram showing a screen providing a list of ultrasound images, according to an exemplary embodiment
  • FIG. 16 is a diagram showing a screen displaying a chatting window and a pointer, according to an exemplary embodiment
  • FIG. 17 is a diagram showing an example of a GUI for remotely controlling the ultrasound apparatus, according to an exemplary embodiment
  • FIGS. 18A and 18B are diagrams showing an example of a GUI for adding a body marker to the ultrasound image remotely, according to an exemplary embodiment
  • FIG. 19 is a flowchart illustrating a communication method, according to an exemplary embodiment.
  • FIG. 20 shows screens of the ultrasound apparatus, the first device, and the second device according to an exemplary embodiment
  • FIG. 21 is a block diagram showing an ultrasound apparatus according to an exemplary embodiment
  • FIG. 22 is a block diagram showing an ultrasound apparatus according to another exemplary embodiment
  • FIG. 23 is a block diagram showing a device according to an exemplary embodiment
  • FIG. 24 illustrates a communication method according to an exemplary embodiment
  • FIG. 25 is a flowchart illustrating an information sharing method of a medical imaging apparatus according to an exemplary embodiment
  • FIG. 26 is a diagram showing an X-ray image obtained by an X-ray imaging apparatus according to an exemplary embodiment
  • FIG. 27 is a diagram showing a bone suppression image, in which a region of interest is marked, according to an exemplary embodiment.
  • FIG. 28 is a diagram showing an X-ray image including an annotation according to an exemplary embodiment.
  • The term "ultrasound image" used herein refers to an image of an object acquired by using ultrasound signals.
  • the object may denote a human or a body part.
  • the object may include an organ such as the liver, heart, brain, breast, or abdomen, a fetus, or nuchal translucency (NT).
  • the ultrasound image may be at least one of a brightness mode (B mode) image representing the magnitude of an ultrasound echo signal reflected by an object as brightness, a color mode (C mode) image representing the velocity of a moving object as color by using the Doppler effect, a Doppler mode (D mode) image representing a moving object as a spectrum by using the Doppler effect, a motion mode (M mode) image representing the movement of an object at a fixed location over time, and an elastic mode image representing, as an image, the difference between the reactions of an object when compression is and is not applied; however, an exemplary embodiment is not limited thereto.
  • the ultrasound image may be a two-dimensional (2D) image, a 3D image, or a four-dimensional (4D) image.
  • FIG. 1 is a block diagram of an information sharing system according to an exemplary embodiment.
  • the information sharing system may include an ultrasound apparatus 3000 and an external device (for example, at least one of a first device 1000 and a second device 2000).
  • the first device 1000 may be a display apparatus for providing an ultrasound image.
  • the first device 1000 may be a device for a patient, i.e., a patient device, which displays an ultrasound image for viewing by the patient, an authorized family member, an authorized friend, etc.
  • the first device 1000 may receive ultrasound information about an ultrasound image from the ultrasound apparatus 3000 .
  • the first device 1000 may receive the ultrasound information directly from the ultrasound apparatus 3000 , or via a server.
  • the ultrasound information received by the first device 1000 may be ultrasound information corresponding to a sharing level (i.e., authority) of the first device 1000.
  • the ultrasound information corresponding to the sharing level of the first device 1000 may include ultrasound image information, caption (or annotation) information for illustrating all or part of the ultrasound image, and identification mark information for identifying a region of interest (ROI) in the ultrasound image; however, an exemplary embodiment is not limited thereto.
  • the sharing level of the first device 1000 may be set by an information sharing system or an authorized user, in advance.
  • the second device 2000 may be a device of a medical expert, which may perform a data communication with the ultrasound apparatus 3000 , i.e., a medical expert device.
  • a medical expert may be an expert who can analyze or confirm an ultrasound image transmitted from the ultrasound apparatus 3000 by using the second device 2000 .
  • the medical expert may include a doctor, a technician, a medical technologist, a nurse, or a radiologist; however, an exemplary embodiment is not limited thereto.
  • the medical expert according to an exemplary embodiment may be different from a technician who acquires an ultrasound image by using the ultrasound apparatus 3000 .
  • For example, a first technician may acquire an ultrasound image of an object by using the ultrasound apparatus 3000, and a second technician or a doctor may check, by using the second device 2000, the ultrasound image acquired by the first technician or the analysis information generated by the first technician.
  • the second device 2000 may receive ultrasound information about the ultrasound image from the ultrasound apparatus 3000 , and may display the ultrasound information on a screen thereof.
  • the second device 2000 may receive the ultrasound information corresponding to a sharing level (i.e., authority) of the second device 2000 .
  • the second device 2000 and the first device 1000 may have sharing levels different from each other. Therefore, the ultrasound information corresponding to the sharing level of the second device 2000 may be different from ultrasound information corresponding to the sharing level of the first device 1000 .
  • the ultrasound information corresponding to the sharing level of the second device 2000 may include analysis information generated by the technician, information about a patient, and measurement information; however, an exemplary embodiment is not limited thereto.
  • the second device 2000 may be a device having an authority to control the ultrasound apparatus 3000 remotely. That is, the second device 2000 may transmit control information for controlling the ultrasound apparatus 3000 based on an input of the medical expert (for example, a doctor).
  • the second device 2000 may transmit to the ultrasound apparatus 3000 a control command for selecting and displaying another ultrasound image, a control command for expanding or reducing the ultrasound image, a control command for storing the ultrasound image, a control command for performing a 3D rendering of the ultrasound image, a control command for adding an annotation or a body marker, a control command for measuring a region of interest, and a control command for correcting analysis information about the ultrasound image.
  • the ultrasound apparatus 3000 of an exemplary embodiment is a device that acquires ultrasound image data of an object by using ultrasound waves and shares ultrasound information about the ultrasound image with an external device.
  • the ultrasound apparatus 3000 of an exemplary embodiment may include a mobile terminal or a stationary terminal.
  • the mobile terminal may include a laptop computer, a personal digital assistant (PDA), a tablet personal computer (PC), and a smartphone.
  • the ultrasound apparatus 3000 of an exemplary embodiment may transmit the ultrasound information to a receiving device (for example, the first device 1000 or the second device 2000 ) according to the sharing level of the receiving device. That is, the ultrasound apparatus 3000 may determine a type and/or amount of pieces of information to be transmitted to the receiving device, according to the sharing level of the receiving device.
  • the ultrasound apparatus 3000 may transmit different ultrasound information to the patient display device and the medical expert display device.
  • an amount of ultrasound information transferred to the patient display device may be less than that of ultrasound information transferred to the medical expert display device.
  • an annotation or a color may be added to the ultrasound information that is transmitted to the patient display apparatus.
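The level-dependent transmission described above can be sketched as a simple filtering step: the apparatus keeps only the pieces of ultrasound information that the receiving device's sharing level allows. The field names, level names, and sample values below are hypothetical illustrations, not details from the source.

```python
# Hypothetical full set of ultrasound information held by the apparatus.
FULL_INFO = {
    "image": "<ultrasound image data>",
    "annotation": "head of embryo",
    "roi_mark": "skin-color overlay on region of interest",
    "measurement": {"head_girth_mm": 92},
    "analysis": "technician result report",
    "patient": {"name": "J. Doe", "history": "..."},
}

# Assumed mapping from sharing level to the fields that level may receive;
# the patient level receives fewer, more patient-friendly fields.
SHARING_LEVELS = {
    "patient": {"image", "annotation", "roi_mark", "measurement"},
    "medical_expert": {"image", "measurement", "analysis", "patient"},
}

def info_for_device(sharing_level):
    """Return only the pieces of ultrasound information allowed at this level."""
    allowed = SHARING_LEVELS[sharing_level]
    return {key: value for key, value in FULL_INFO.items() if key in allowed}
```

A patient display device would then receive the annotated image without the technician's analysis or the patient record, while the medical expert device receives the fuller set.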
  • the ultrasound apparatus 3000 of an exemplary embodiment may transmit the ultrasound information in various ways.
  • the ultrasound apparatus 3000 may encode ultrasound information about an ultrasound image by using a mirroring technology such as Miracast, and may transmit the encoded ultrasound information to the first device 1000 or the second device 2000 via Wi-Fi Direct (WFD).
  • An encoding algorithm according to an exemplary embodiment may include MPEG-2, MPEG-4, H.264, or advanced video coding (AVC); however, exemplary embodiments are not limited thereto.
  • the ultrasound apparatus 3000 may transmit the ultrasound information to the first device 1000 or the second device 2000 via wireless fidelity (Wi-Fi), Bluetooth, ultra wideband (UWB), or IEEE 1394 communication, but exemplary embodiments are not limited thereto.
  • the ultrasound apparatus 3000 may include a touch screen.
  • the touch screen may be configured to detect a touch input location, a touched area, and a touch input pressure.
  • the touch screen may detect a proximity touch, as well as a real-touch input.
  • the real-touch input denotes a case where the screen is actually touched by a touch tool (for example, a finger, an electronic pen, etc.)
  • the proximity touch input denotes a case where a touch tool approaches the screen to within a predetermined distance without actually touching the screen.
  • the ultrasound apparatus 3000 may sense a touch gesture of a user on an ultrasound image via the touch screen.
  • Touch gestures of a user may include a tap, a touch and hold, a double tap, a drag, panning, a flick, a drag and drop, a swipe, a pinch, etc.
  • “Tap” is an operation in which the user touches the screen by using a finger or an electronic pen and then lifts the finger or the electronic pen from the screen without moving it on the screen.
  • “Touch and hold” is an operation in which the user touches the screen by using a finger or an electronic pen and maintains the touch input for a length of time (for example, two seconds) or longer. That is, a time difference between a touch-in time and a touch-out time is equal to or greater than the length of time (for example, two seconds).
  • a visual, audible, or tactile feedback signal may be provided when the touch input is maintained for the length of time or longer.
  • the length of time may vary.
  • “Double tap” is an operation in which the user touches the screen twice by using a finger or an electronic pen.
  • “Drag” is an operation in which the user touches the screen by using a finger or an electronic pen and then moves the finger or the electronic pen to another position on the screen while continuously touching the screen. An object is moved, or the panning operation that will be described below is performed, by the drag operation.
  • “Panning” is an operation in which the user performs the drag operation without selecting an object. Since the user does not select a certain object in the panning operation, a page itself moves in the screen or a group of objects moves in the page, without moving the certain object in the page.
  • “Flick” is an operation in which the user drags a finger or an electronic pen at a certain speed (for example, 100 pixel/s) or faster.
  • the drag operation (or panning operation) and the flick operation may be distinguished from each other based on whether the velocity of the finger or the electronic pen is the certain speed (for example, 100 pixel/s) or greater.
  • “Drag and drop” is an operation in which the user drags an object to a location on the screen and drops it there by using a finger or an electronic pen.
  • “Pinch” is an operation in which the user touches the screen with two fingers and then moves the fingers in different directions.
  • the pinch operation is a gesture for expanding (pinch open) or reducing (pinch close) an object or a page, and the expansion or reduction value may be determined by the distance between the two fingers.
  • “Swipe” is an operation in which the user moves a finger or an electronic pen a certain distance in a horizontal or vertical direction while touching an object on the screen. Movement in a diagonal direction is not recognized as a swipe event.
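The kinematic distinctions above can be sketched in code: a flick is separated from a drag or pan by the 100 pixel/s speed threshold given in the text, a swipe moves along exactly one axis, and the pinch scale follows from the change in distance between the two fingers. The event representation (a displacement and a duration) is an assumption for illustration.

```python
FLICK_SPEED = 100  # pixel/s threshold from the text separating drag/pan from flick

def classify_move_gesture(dx, dy, duration_s):
    """Classify a completed touch movement as 'flick', 'swipe', or 'drag'."""
    distance = (dx ** 2 + dy ** 2) ** 0.5
    speed = distance / duration_s if duration_s > 0 else float("inf")
    if speed >= FLICK_SPEED:
        return "flick"
    # a swipe moves along exactly one axis; diagonal movement is not a swipe
    if (dx == 0) != (dy == 0):
        return "swipe"
    return "drag"

def pinch_scale(start_distance, end_distance):
    """Expansion (>1, pinch open) or reduction (<1, pinch close) factor,
    determined by the change in distance between the two fingers."""
    return end_distance / start_distance
```

A real touch framework would also track touch-down and touch-up events and the touch-and-hold timeout, but the speed-based split between drag and flick is the essential rule stated in the text.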
  • the ultrasound apparatus 3000 of an exemplary embodiment may provide some or all of buttons included in a control panel of a general ultrasound apparatus through the touch screen as a graphical user interface (GUI).
  • FIG. 2 is a block diagram of an information sharing system according to another exemplary embodiment.
  • the information sharing system may further include a server 4000 .
  • the server 4000 may be a server for managing information related to ultrasound images.
  • the server 4000 may include a server of a medical institution (for example, a hospital).
  • the server 4000 of an exemplary embodiment may store information about patients, and may store information relating to ultrasound images captured by the ultrasound apparatus 3000 .
  • the server 4000 may store information about sharing levels of devices that may share information with the ultrasound apparatus 3000 .
  • information such as identification information of a device (for example, a device ID, a Mac address, and an equipment name, etc.), the sharing level, types of information that may be shared, and an amount of data that may be shared may be stored in the server 4000 in a table format.
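The table described above might be stored on the server 4000 roughly as follows. The column names, identifiers, and values are illustrative assumptions; the source specifies only that identification information, sharing level, sharable information types, and a sharable data amount are kept in a table format.

```python
# Hypothetical sharing-level table as the server 4000 might store it.
DEVICE_TABLE = [
    {"device_id": "DEV-001", "mac": "AA:BB:CC:00:11:22",
     "name": "patient tablet", "sharing_level": 1,
     "shared_types": ["image", "annotation", "roi_mark"],
     "max_mb_per_image": 5},
    {"device_id": "DEV-002", "mac": "AA:BB:CC:33:44:55",
     "name": "doctor workstation", "sharing_level": 3,
     "shared_types": ["image", "measurement", "analysis", "patient"],
     "max_mb_per_image": 50},
]

def lookup_sharing_level(device_id):
    """Return the sharing level registered for a device, or None if the
    device is unknown (unknown devices share nothing)."""
    for row in DEVICE_TABLE:
        if row["device_id"] == device_id:
            return row["sharing_level"]
    return None
```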
  • the server 4000 of an exemplary embodiment may transmit information about real-time ultrasound images received from the ultrasound apparatus 3000 to the first device 1000 or the second device 2000 .
  • the real-time ultrasound image may refer to an ultrasound image that is captured by the ultrasound apparatus 3000 within a predetermined time period (for example, within five minutes) from the present or at a time of receiving the ultrasound image.
  • the server 4000 may receive, from the second device 2000 , a request to transmit a certain ultrasound image that is stored in advance, and may transmit the requested ultrasound image to the second device 2000 .
  • FIG. 3 is a flowchart illustrating an information sharing method of an ultrasound apparatus 3000 , according to an exemplary embodiment.
  • the ultrasound apparatus 3000 may acquire an ultrasound image about an object.
  • the ultrasound apparatus 3000 of an exemplary embodiment may directly generate the ultrasound image, or may receive the ultrasound image from outside.
  • the ultrasound apparatus 3000 transmits an ultrasound signal to the object and receives an ultrasound response signal from the object to generate the ultrasound image.
  • the ultrasound apparatus 3000 may receive an ultrasound image from an external server or an external device.
  • the ultrasound apparatus 3000 may identify a sharing level of an external device for sharing the ultrasound image. For example, the ultrasound apparatus 3000 may identify information about a first sharing level corresponding to identification information of the first device 1000 and/or a second sharing level corresponding to identification information of the second device 2000 .
  • the ultrasound apparatus 3000 may receive information about the first sharing level and/or information about the second sharing level from the server 4000 . According to another exemplary embodiment, the ultrasound apparatus 3000 may read the information about the first sharing level and/or the information about the second sharing level from a memory.
  • the first sharing level of the first device 1000 and the second sharing level of the second device 2000 may be different from each other.
  • the sharing level of the first device 1000 may be lower than the sharing level of the second device 2000 .
  • a lower sharing level denotes that the number of types and/or the amount of sharable information is relatively small.
  • the ultrasound apparatus 3000 may transmit ultrasound information to the external device according to the sharing level. That is, the ultrasound apparatus 3000 may transmit first ultrasound information corresponding to the first sharing level to the first device 1000 , or may transmit second ultrasound information corresponding to the second sharing level to the second device 2000 .
  • the ultrasound apparatus 3000 may share only the ultrasound image with the first device 1000 , and may share additional information in addition to the ultrasound image with the second device 2000 .
  • the additional information may include analysis information of a technician, measurement information, and patient information; however, the exemplary embodiment is not limited thereto.
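Operation S330 as described above can be sketched as building a per-device payload: the lower-level patient device receives only the ultrasound image, while the expert device also receives the additional information. The numeric levels and the threshold below are assumptions for illustration.

```python
def build_payload(sharing_level, image, additional):
    """Assemble the ultrasound information to transmit to a device.
    sharing_level: assumed numeric level (1 = patient, 2+ = medical expert).
    additional: dict of analysis, measurement, and patient information."""
    payload = {"image": image}
    if sharing_level >= 2:  # assumed: level 2 and above may see additional info
        payload.update(additional)
    return payload
```

For example, a level-1 patient device would receive `{"image": ...}` only, while a level-2 expert device would also receive the technician's analysis.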
  • the external device may display the ultrasound information transmitted from the ultrasound apparatus 3000 in real-time.
  • the displaying of the ultrasound information in real-time may denote displaying of the ultrasound information on the screen within a predetermined time period (for example, one minute) from a point of time when the ultrasound information is received.
  • the first device 1000 may display first ultrasound information and the second device 2000 may display second ultrasound information in real-time.
  • the first ultrasound information and the second ultrasound information may include at least the ultrasound image, which has been scanned by the technician, and the patient and the medical expert may share the ultrasound image in real-time.
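The real-time criterion above (display within a predetermined period, one minute in the example, after reception) can be expressed as a simple timestamp check; the function name and the use of plain second offsets are illustrative.

```python
def is_real_time(received_at_s, displayed_at_s, window_s=60):
    """True if the ultrasound information is displayed within the
    predetermined window (default one minute) after it is received."""
    return 0 <= displayed_at_s - received_at_s <= window_s
```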
  • the ultrasound apparatus 3000 may encode the first ultrasound information by using an encoding code that is negotiated in advance or an encoding code that is set in advance for securing the first ultrasound information, and may transmit the encoded first ultrasound information to the first device 1000 .
  • the ultrasound apparatus 3000 may encode the second ultrasound information by using an encoding code that is negotiated in advance or an encoding code that is set in advance for securing the second ultrasound information, and may transmit the encoded second ultrasound information to the second device 2000 .
  • the method of sharing the ultrasound information by the ultrasound apparatus 3000 according to the sharing level will be described in more detail with reference to FIGS. 4 through 13 .
  • hereinafter, it is assumed that the first device 1000 is a patient device and the second device 2000 is a medical expert device.
  • FIG. 4 is a flowchart illustrating a method of generating ultrasound information, according to an exemplary embodiment.
  • the ultrasound apparatus 3000 may acquire an ultrasound image of an object.
  • the ultrasound apparatus 3000 may acquire a B mode image, a C mode image, a D mode image, an M mode image, an elastic mode image, a 2D ultrasound image, a 3D ultrasound image, and a 4D ultrasound image of the object; however, an exemplary embodiment is not limited thereto. Since operation S 410 corresponds to operation S 310 shown in FIG. 3 , detailed descriptions thereof will not be provided here.
  • the ultrasound apparatus 3000 may select the first device 1000 and the second device 2000 with which the information will be shared. According to an exemplary embodiment, the ultrasound apparatus 3000 may select the first device 1000 and the second device 2000 connected to the ultrasound apparatus 3000 based on system settings.
  • the ultrasound apparatus 3000 may select the first device 1000 and the second device 2000 based on a user input.
  • the ultrasound apparatus 3000 may provide a device list including identification information of the devices that may communicate with the ultrasound apparatus 3000 on a screen.
  • the ultrasound apparatus 3000 may receive the user input for selecting the first device 1000 and the second device 2000 in the device list.
  • the user of the ultrasound apparatus 3000 of an exemplary embodiment may be a technician, a medical professional, or an ordinary person; however, the exemplary embodiment is not limited thereto.
  • hereinafter, the technician is assumed to be the user of the ultrasound apparatus 3000 .
  • the ultrasound apparatus 3000 may request the server 4000 to authenticate the first device 1000 and the second device 2000 .
  • the server 4000 may perform authentication operations regarding the first device 1000 and the second device 2000 .
  • the server 4000 may verify whether each of the first device 1000 and the second device 2000 is a device that may share the information with the ultrasound apparatus 3000 . Through the authentication, exposure of the ultrasound information relating to personal medical diagnosis to the unspecified public may be prevented.
  • the server 4000 may transmit the information about the sharing levels of the first device 1000 and the second device 2000 to the ultrasound apparatus 3000 . That is, the ultrasound apparatus 3000 may receive the information about the first sharing level of the first device 1000 and the information about the second sharing level of the second device 2000 .
  • the ultrasound apparatus 3000 may generate the first ultrasound information corresponding to the first sharing level of the first device 1000 .
  • the ultrasound apparatus 3000 may generate the first ultrasound information including an ultrasound image, annotation for describing the ultrasound image, an identification mark on a region of interest, and measurement information according to the sharing level of the patient device.
  • the annotation of an exemplary embodiment may include an annotation representing the object (for example, a head of an embryo, liver, heart, carotid, etc.), and an annotation describing a type of the ultrasound image (for example, a Doppler image, an M mode image, an elastic mode image, etc.); however, the exemplary embodiments are not limited thereto.
  • the identification mark on the region of interest may include adding of a color to the region of interest (for example, adding of skin color to the embryo image), adding of a boundary to the region of interest, and adding of a pattern to the region of interest; however, the exemplary embodiments are not limited thereto.
  • the measurement information may include information obtained from measuring the region of interest.
  • the measurement information may include a head girth and a neck girth of an embryo, a head size, a tumor size, etc.; however, the exemplary embodiments are not limited thereto.
  • the ultrasound apparatus 3000 may generate the second ultrasound information corresponding to the second sharing level of the second device 2000 .
  • when the second device 2000 is a medical expert device, the ultrasound apparatus 3000 may generate the second ultrasound information including an ultrasound image, measurement information about the ultrasound image, analysis information of the technician about the ultrasound image, and patient information according to the sharing level of the medical expert device.
  • the measurement information provided to the medical expert device may include more detailed information than that provided to the patient device.
  • the measurement information provided to the medical expert device may include a maximum speed in a sample volume, an inclination in an M mode image, etc., in addition to the head girth and a neck girth of the embryo, head size, and tumor size, etc.
  • the analysis information of the technician may include a result report generated by the technician after performing an ultrasound examination.
  • the patient information of an exemplary embodiment may include a medical history of the patient, health state information of the patient, body features of the patient, and an image history of the patient; however, the exemplary embodiments are not limited thereto.
  • the ultrasound apparatus 3000 may transmit different pieces of the ultrasound information about one ultrasound image respectively to the patient device and the medical expert device according to the sharing level.
  • an order of performing operations S 410 to S 470 may be changed, and some of the operations may be omitted.
  • the first ultrasound information shared by the ultrasound apparatus 3000 with the first device 1000 will be described in detail with reference to FIGS. 5 through 9 .
  • the patient device will be described as an example of the first device 1000 .
  • FIG. 5 is a flowchart illustrating a method of generating the first ultrasound information by the ultrasound apparatus 3000 to correspond to the sharing level of the first device 1000 , according to an exemplary embodiment.
  • the ultrasound apparatus 3000 may acquire an ultrasound image of an object. Since operation S 510 corresponds to operation S 310 shown in FIG. 3 , detailed descriptions thereof will be omitted here.
  • the ultrasound apparatus 3000 may add, to the ultrasound image, at least one annotation relating to all or a portion of the ultrasound image.
  • the ultrasound apparatus 3000 may analyze the ultrasound image to recommend at least one annotation.
  • the ultrasound apparatus 3000 may add at least one annotation to the ultrasound image based on a selection of the user (for example, the technician).
  • the ultrasound apparatus 3000 may display an annotation list that is set in advance.
  • the ultrasound apparatus 3000 receives the user's selection of at least one annotation in the annotation list, and then, may display the selected annotation on the ultrasound image. The adding of the annotation by the ultrasound apparatus 3000 will be described below with reference to FIG. 6 .
  • the ultrasound apparatus 3000 may select a region of interest in the ultrasound image.
  • the ultrasound apparatus 3000 may select the region of interest automatically by analyzing the ultrasound image, or based on the user input.
  • the ultrasound apparatus 3000 may detect an edge from the ultrasound image and recommend the region of interest automatically. Alternatively, the ultrasound apparatus 3000 may display a graphical user interface (GUI) allowing the user to set the region of interest directly on the screen.
  • the ultrasound apparatus 3000 may display an identification mark on the region of interest.
  • the ultrasound apparatus 3000 may add a color or a pattern in the region of interest, or may change a thickness of a boundary on the region of interest. This will be described in more detail with reference to FIGS. 7 and 8 .
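Adding a color to the region of interest, as described above, amounts to tagging the pixels inside the ROI with an overlay value. The sketch below uses a rectangular ROI and a 2D mask purely for illustration; a real implementation would work on the actual image buffer and arbitrary ROI shapes.

```python
def mark_roi(width, height, roi, color):
    """Return a width x height overlay mask: pixels inside the ROI
    rectangle roi=(x0, y0, x1, y1) carry the identification color
    (e.g. a skin color for an embryo image); all others carry None."""
    x0, y0, x1, y1 = roi
    return [[color if (x0 <= x < x1 and y0 <= y < y1) else None
             for x in range(width)]
            for y in range(height)]
```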
  • the ultrasound apparatus 3000 may generate first ultrasound information including at least one of the annotation and the identification mark on the region of interest.
  • the ultrasound apparatus 3000 may generate the first ultrasound information including an ultrasound image, to which an annotation for the patient's comprehension is added.
  • the ultrasound apparatus 3000 may generate the first ultrasound information in which the color or the pattern is added to the region of interest, or the first ultrasound information including the annotation and the region of interest on which the identification mark is displayed.
  • the ultrasound apparatus 3000 may transmit the first ultrasound information to the first device 1000 .
  • the ultrasound apparatus 3000 may transmit the first ultrasound information to the first device 1000 via short distance communication.
  • the short distance communication may be Wireless Local Area Network (WLAN) (e.g., Wireless Fidelity (Wi-Fi)), Bluetooth, Bluetooth Low Energy (BLE), UWB, ZigBee, and WFD; however, the exemplary embodiments are not limited thereto.
  • the first device 1000 receives the first ultrasound information from the ultrasound apparatus 3000 , and may display the received first ultrasound information on a screen.
  • the patient may identify the first ultrasound information relating to the ultrasound image acquired by the ultrasound apparatus 3000 via the first device 1000 .
  • an order of performing operations S 510 to S 570 may be changed, and some operations may be omitted.
  • FIG. 6 is a diagram showing a GUI for adding an annotation, according to an exemplary embodiment.
  • the ultrasound apparatus 3000 may analyze the information about the ultrasound image to recommend annotations for describing a part or all of the ultrasound image 641 (for example, a recommended annotation 1, a recommended annotation 2, and a recommended annotation 3).
  • the ultrasound apparatus 3000 may analyze the information about the ultrasound image, and may display a recommended annotation for describing the object or a recommended annotation for representing a location of an embryo head at corresponding locations.
  • a method of analyzing the ultrasound image by the ultrasound apparatus 3000 is known to those skilled in the art, and thus, detailed descriptions thereof will not be provided here.
  • the ultrasound apparatus 3000 may move, delete, or edit a recommended annotation 610 based on a user input. For example, when a touch input for touching the recommended annotation 610 for a predetermined time or longer is received, the ultrasound apparatus 3000 may display a delete icon 640 for deleting the recommended annotation 610 . In this case, the user may select the delete icon 640 for deleting the recommended annotation 610 .
  • the ultrasound apparatus 3000 may enable the recommended annotation 610 to be moved.
  • the user may drag the recommended annotation 610 to change the display location of the recommended annotation 610 .
  • the ultrasound apparatus 3000 may provide an annotation list 630 that is set in advance on a predetermined region 642 of the screen for the user's selection.
  • the annotation list 630 of an exemplary embodiment may be displayed on a region that is different from the region 648 where the ultrasound image is displayed.
  • the annotation included in the annotation list 630 may be arranged based on usage times.
  • the annotations in the annotation list 630 may be arranged in an order from most frequently used by the user.
  • the annotation list 630 of an exemplary embodiment may include annotations relating to the current ultrasound mode and set in advance, or annotations relating to the target or to the object and set in advance. For example, if the target is an embryo, annotations relating to the embryo may be included in the annotation list 630 .
  • the annotation list 630 of an exemplary embodiment may include annotations relating to the identification information of the technician or identification information of the patient.
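The frequency-based ordering of the annotation list described above can be sketched as a sort over per-annotation usage counts, most frequently used first. The annotation names and counts are illustrative assumptions.

```python
# Hypothetical usage counts accumulated as the technician applies annotations.
USAGE_COUNTS = {"Head of embryo": 12, "Liver": 3, "Carotid": 7, "M mode": 1}

def ordered_annotation_list(usage_counts):
    """Arrange annotations in order from most frequently used by the user."""
    return sorted(usage_counts, key=usage_counts.get, reverse=True)
```

Filtering this list by the current ultrasound mode or by the object being scanned (e.g. keeping only embryo-related annotations) would be a straightforward extension.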
  • the ultrasound apparatus 3000 of an exemplary embodiment may receive a selection of the user (for example, the technician) on at least one annotation from the annotation list 630 and may display the selected annotation on the ultrasound image.
  • the ultrasound apparatus 3000 may sense a drag and drop input which involves dragging a second annotation 631 from the annotation list 630 to the region where the ultrasound image is displayed and dropping the second annotation 631 onto a certain location.
  • the ultrasound apparatus 3000 may display the second annotation 631 at the certain point where the drop input is sensed, based on the drag and drop input.
  • the ultrasound apparatus 3000 may provide an activated annotation list 620 including identification information of the annotations displayed on the ultrasound image on a predetermined region of the screen.
  • the activated annotation may denote an annotation that is currently displayed on the ultrasound image.
  • the recommended annotation 1, the recommended annotation 2, the recommended annotation 3, and the second annotation 631 displayed on the ultrasound image 641 may be activated annotations.
  • the user may identify the annotations that are added to the current ultrasound image via the activated annotation list 620 .
  • the ultrasound apparatus 3000 may receive a user input for selecting at least one activated annotation from the activated annotation list 620 .
  • the ultrasound apparatus 3000 may activate an edit mode of the selected activated annotation.
  • the user may touch an activated annotation 621 from the activated annotation list 620 for a predetermined time, tap the activated annotation 621 , or double-tap the activated annotation 621 .
  • the ultrasound apparatus 3000 may activate an edit mode of the activated annotation 621 , and may correct the activated annotation 621 displayed on the ultrasound image.
  • FIG. 7 is a diagram showing a GUI for displaying an identification mark on a region of interest, according to an exemplary embodiment.
  • the ultrasound apparatus 3000 of an exemplary embodiment may provide a template list 710 for selecting a region of interest 700 .
  • the template may be a figure that is set in advance and used to select the region of interest 700 .
  • the template list 710 according to an exemplary embodiment may include a circle, a square, a pentagon, etc.
  • the user may select the region of interest 700 by changing a location and a size of the circle on the ultrasound image.
  • the user may select the region of interest 700 by directly drawing a line on the ultrasound image by using a touch tool (for example, a finger, and an electronic pen), a mouse, or a trackball.
  • the ultrasound apparatus 3000 of an exemplary embodiment may provide a color list 720 for adding a color to the region of interest 700 .
  • the ultrasound apparatus 3000 according to another exemplary embodiment may provide a pattern list 730 so that the user may add a pattern to the region of interest 700 .
  • the user (for example, the technician) may add the color or the pattern to the region of interest 700 so as to help the patient's comprehension.
  • FIG. 8 is a diagram showing a GUI for adding an annotation and a color, according to an exemplary embodiment.
  • the ultrasound apparatus 3000 may provide an activated annotation list 810 , an annotation list 820 , and a palette tool 830 .
  • the user (for example, the technician) may add an annotation for describing the ultrasound image.
  • the user (for example, the technician) may select the region of interest 800 in the ultrasound image, and may indicate an identification mark on the region of interest 800 by using the palette tool 830 .
  • the ultrasound apparatus 3000 of an exemplary embodiment may transmit first ultrasound information corresponding to the sharing level of the first device 1000 to the first device 1000 . If the first device 1000 is the patient device, the sharing level information of the first device 1000 may be set in advance to share the annotation and the identification mark on the ultrasound image, as well as the ultrasound image, with the first device 1000 . Therefore, the ultrasound apparatus 3000 of an exemplary embodiment may transmit the first ultrasound information including the annotation and the identification mark on the region of interest for helping the patient's comprehension of the ultrasound image to the first device 1000 .
  • FIG. 9 is a diagram showing a screen of the first device 1000 , according to an exemplary embodiment.
  • the first device 1000 of an exemplary embodiment may display the first ultrasound information transmitted from the ultrasound apparatus 3000 on the screen.
  • the first ultrasound information displayed on the screen may include an ultrasound image 910 , an annotation 920 for describing the ultrasound image 910 , and a region of interest 930 indicated with an identification mark.
  • the first device 1000 may display skin color on the image of the embryo so that the user may identify the embryo easily.
  • the first device 1000 may display information about a head girth of the embryo, a location of the head, etc., as an annotation so that the patient may easily understand the image.
  • the patient may understand the ultrasound image captured by the ultrasound apparatus 3000 through the first ultrasound information displayed on the first device 1000 .
  • FIG. 10 is a flowchart illustrating a communication method between the ultrasound apparatus 3000 and the second device 2000 , according to an exemplary embodiment.
  • the ultrasound apparatus 3000 may acquire an ultrasound image about an object. Since operation S 1010 corresponds to operation S 310 shown in FIG. 3 , detailed descriptions thereof will not be provided here.
  • the ultrasound apparatus 3000 may generate second ultrasound information corresponding to the sharing level of the second device 2000 .
  • the ultrasound apparatus 3000 may generate the second ultrasound information including an ultrasound image, measurement information of the ultrasound image, analysis information of the technician about the ultrasound image, and the patient information. Since operation S 1020 corresponds to operation S 470 of FIG. 4 , detailed descriptions thereof will not be provided.
  • the ultrasound apparatus 3000 may transmit the second ultrasound information to the second device 2000 .
  • the ultrasound apparatus 3000 may transmit the second ultrasound information to the second device 2000 via short distance communication.
  • Examples of the short distance communication may include Wi-Fi, near field communication (NFC), Bluetooth, Bluetooth low energy (BLE), ZigBee, Wi-Fi Direct (WFD), and ultra wideband (UWB); however, the exemplary embodiments are not limited thereto.
  • the second device 2000 receives the second ultrasound information transmitted from the ultrasound apparatus 3000 , and may display the second ultrasound information on a screen.
  • thus, the medical expert (for example, a doctor) may check the second ultrasound information displayed on the second device 2000 .
  • the ultrasound apparatus 3000 may transmit a request for confirmation of the second ultrasound information to the second device 2000 .
  • the ultrasound apparatus 3000 may request the medical expert's confirmation of the ultrasound image or of the analysis information of the technician included in the second ultrasound information.
  • the second device 2000 may transmit a message for confirming the second ultrasound information to the ultrasound apparatus 3000 in response to the request for confirmation.
  • the second device 2000 may receive confirmation information of the ultrasound image from the medical expert, and may transmit the received confirmation information to the ultrasound apparatus 3000 .
  • when receiving the confirmation message, the ultrasound apparatus 3000 may indicate that the ultrasound image displayed on the screen has been confirmed by the medical expert.
  • an order of performing operations S 1010 to S 1060 may be changed, and some of the operations may be omitted.
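The exchange in operations S1010 through S1060 can be sketched as a simple request/response handshake. This is a minimal in-memory sketch under assumed message shapes; real devices would communicate over Wi-Fi, BLE, or similar.

```python
# Minimal sketch of the S1010-S1060 exchange between the ultrasound
# apparatus and the second device. Class and message names are illustrative.

class SecondDevice:
    def __init__(self):
        self.received = None

    def receive_info(self, info):          # receive and display second info
        self.received = info

    def confirm(self):                     # medical expert confirms the image
        return {"type": "confirmation", "image_id": self.received["image_id"]}

class UltrasoundApparatus:
    def __init__(self, device):
        self.device = device
        self.confirmed_ids = set()

    def send_second_info(self, info):      # transmit second ultrasound info
        self.device.receive_info(info)

    def request_confirmation(self):        # request and record confirmation
        message = self.device.confirm()
        if message["type"] == "confirmation":
            self.confirmed_ids.add(message["image_id"])

device = SecondDevice()
apparatus = UltrasoundApparatus(device)
apparatus.send_second_info({"image_id": 1, "image": "<pixels>"})
apparatus.request_confirmation()
```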
  • FIGS. 11A and 11B are diagrams showing a GUI for confirming the ultrasound image according to the exemplary embodiment. It is assumed that the second device 2000 is the medical expert device.
  • the ultrasound apparatus 3000 and the second device 2000 may respectively include confirmation buttons 1110 and 1120 for confirming the ultrasound image.
  • the confirmation buttons 1110 and 1120 of an exemplary embodiment may be a GUI.
  • the confirmation buttons 1110 and 1120 of an exemplary embodiment may display the confirmation processing status by using a predetermined color, a predetermined number, and/or a predetermined pattern. For example, when the technician 1122 requests confirmation of an ultrasound image 1, the confirmation button 1110 may be displayed as a red ① button on the ultrasound apparatus 3000 . The confirmation button 1120 may be displayed as a blue ② button on the second device 2000 . When the medical expert 1124 selects the blue confirmation button 1120 to confirm the ultrasound image 1, the confirmation button 1110 on the ultrasound apparatus 3000 and the confirmation button 1120 on the second device 2000 may be changed to be displayed as green ③ buttons, as shown in FIG. 11B .
  • the technician may additionally confirm the ultrasound image 1 after the medical expert confirms the ultrasound image 1.
  • the confirmation button 1110 on the ultrasound apparatus 3000 and the confirmation button 1120 on the second device 2000 may be changed to be displayed as the green ③ buttons.
  • since the ultrasound image used to diagnose the disease is confirmed only when both the technician and the medical expert confirm it, confirmation of a wrong ultrasound image due to a mistake of the technician or the medical expert may be prevented.
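The dual-confirmation rule above can be sketched as a small state tracker: an image reaches the "green" state only once both parties have confirmed it. The class name and the exact color mapping are a simplification of FIGS. 11A and 11B, not taken literally from the specification.

```python
# Sketch of the dual-confirmation rule: an ultrasound image counts as fully
# confirmed only when both the technician and the medical expert confirm it.
# Names and color mapping are illustrative.

class ConfirmationButton:
    def __init__(self):
        self.technician_confirmed = False
        self.expert_confirmed = False

    def technician_confirm(self):
        self.technician_confirmed = True

    def expert_confirm(self):
        self.expert_confirmed = True

    @property
    def color(self):
        if self.technician_confirmed and self.expert_confirmed:
            return "green"   # (3): confirmed by both parties
        if self.technician_confirmed:
            return "red"     # (1): awaiting the medical expert
        if self.expert_confirmed:
            return "blue"    # (2): awaiting the technician
        return "gray"        # not yet confirmed by anyone

button = ConfirmationButton()
button.technician_confirm()            # technician requests confirmation
state_after_request = button.color
button.expert_confirm()                # medical expert confirms
state_after_expert = button.color
```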
  • FIG. 12 is a diagram showing a screen on which a chatting window is provided by the ultrasound apparatus 3000 , according to an exemplary embodiment.
  • the ultrasound apparatus 3000 may provide a chatting window 1200 through which communication with the medical expert having the second device 2000 may be performed.
  • the ultrasound apparatus 3000 of an exemplary embodiment may receive a user input about the ultrasound image (for example, a question of the technician) via the chatting window 1200 .
  • the ultrasound apparatus 3000 may transmit the user input (for example, the question of the technician) to the second device 2000 , and may receive a response message with respect to the user input from the second device 2000 .
  • the ultrasound apparatus 3000 may receive inquiry information about a status of the patient (for example, arrhythmia, dyspnea, and posture of the patient) during the examination through the chatting window 1200 from the technician, and may transmit the inquiry information about the status of the patient to the second device 2000 .
  • the ultrasound apparatus 3000 may receive, from the second device 2000 , an order of the medical expert about an action to be performed in consideration of the status of the patient, and may display the ordered action on the chatting window 1200 .
  • FIG. 13 is a diagram showing a screen displaying a message in the ultrasound apparatus 3000 , according to an exemplary embodiment.
  • the ultrasound apparatus 3000 of an exemplary embodiment may receive a message about the ultrasound image from the second device 2000 .
  • the ultrasound apparatus 3000 may display the received message on the screen.
  • the ultrasound apparatus 3000 may display the message (for example, a message ‘CT scan needed’) transmitted from the second device 2000 as a pop-up window 1300 .
  • the ultrasound apparatus 3000 may also receive a voice message and may output the voice message through a speaker.
  • FIG. 14 is a flowchart illustrating a communication method of the second device 2000 , according to an exemplary embodiment.
  • the second device 2000 may display an ultrasound image list on a first region of the screen.
  • the second device 2000 may display an ultrasound image list including at least one ultrasound image acquired by at least one ultrasound apparatus on the first region of the screen.
  • the ultrasound image list according to an exemplary embodiment may include thumbnail images of the ultrasound images.
  • the ultrasound image list of an exemplary embodiment may include real-time ultrasound images transmitted from at least one ultrasound apparatus, and ultrasound images that are stored in advance.
  • the real-time ultrasound image may denote an ultrasound image acquired by the ultrasound apparatus 3000 within a predetermined time period prior to the current time or the time point when the ultrasound image is received by the second device 2000 .
  • the real-time ultrasound image may be an image acquired by the ultrasound apparatus 3000 within 10 minutes from the time point when the ultrasound image is received by the second device 2000 . That is, when the predetermined time is 10 minutes, the ultrasound image acquired 3 minutes earlier is a real-time image, whereas the ultrasound image acquired 15 minutes earlier may not be a real-time image.
  • the ultrasound image stored in advance may be an image acquired by the ultrasound apparatus 3000 prior to the predetermined time period and stored.
  • the ultrasound image may be stored in the ultrasound apparatus 3000 , the server 4000 , and/or the second device 2000 . Therefore, the second device 2000 may read the ultrasound images stored in advance from a memory of the ultrasound apparatus 3000 , the server 4000 , or the second device 2000 .
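The real-time versus stored distinction above reduces to a single timestamp comparison. The sketch below assumes the 10-minute window given as the example of the predetermined time period.

```python
# Sketch of the real-time vs. stored-in-advance classification: an image is
# "real-time" if it was acquired within a predetermined window (10 minutes
# in the example) before the second device receives it.

from datetime import datetime, timedelta

REALTIME_WINDOW = timedelta(minutes=10)  # the "predetermined time period"

def is_realtime(acquired_at: datetime, received_at: datetime) -> bool:
    return received_at - acquired_at <= REALTIME_WINDOW

now = datetime(2014, 6, 30, 12, 0)
# Per the example: acquired 3 minutes earlier -> real-time;
# acquired 15 minutes earlier -> stored in advance.
```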
  • the second device 2000 may indicate an identification mark on the real-time ultrasound image in order to distinguish the real-time ultrasound image and the ultrasound image stored in advance from each other.
  • the second device 2000 may add an indicator such as ‘LIVE’ or an exclamation mark (!) to the real-time ultrasound image in the ultrasound image list, make a boundary of the real-time ultrasound image bold, or display the boundary of the real-time ultrasound image in a predetermined color.
  • the second device 2000 may display the real-time ultrasound image to be larger than the ultrasound image stored in advance or may locate the real-time ultrasound image at a center portion in the ultrasound image list.
  • the second device 2000 may receive a selection of an ultrasound image from the ultrasound image list.
  • the second device 2000 may sense a gesture of the medical expert, such as a tap, a double-tap, a flick, or a drag on an ultrasound image included in the ultrasound image list, to receive a selection of one ultrasound image from the ultrasound image list.
  • the medical expert may select one ultrasound image from the ultrasound image list by using a mouse or a trackball, or by a voice input.
  • the second device 2000 may display the selected ultrasound image on a second region of the screen.
  • the second device 2000 of an exemplary embodiment may display the selected ultrasound image on the second region in a predetermined size. That is, the second device 2000 may display the ultrasound image to be greater than the thumbnail image included in the ultrasound image list on the second region.
  • the second device 2000 may further display at least one of the analysis information of the technician about the ultrasound image displayed on the second region, measurement information of the ultrasound image displayed on the second region, and the patient information according to the sharing level.
  • the second device 2000 may communicate with the ultrasound apparatus 3000 that acquired the selected ultrasound image.
  • the second device 2000 may provide a chatting window for communicating with the ultrasound apparatus 3000 on a third region of the screen.
  • the second device 2000 may receive an input of the medical expert about the ultrasound image displayed on the second region, through the chatting window.
  • the second device 2000 may transmit the input of the medical expert received through the chatting window to the ultrasound apparatus 3000 in a chatting session.
  • the second device 2000 may receive a request for confirming the selected ultrasound image from the ultrasound apparatus 3000 , and may transmit the confirmation information to the ultrasound apparatus 3000 when the confirmation information about the selected ultrasound image is input.
  • the second device 2000 of an exemplary embodiment may transmit control information for controlling the ultrasound apparatus 3000 to the ultrasound apparatus 3000 .
  • the control information of an exemplary embodiment may include at least one of a control command for selecting and displaying another ultrasound image that is different from the ultrasound image displayed on the second region, a control command for expanding or reducing the ultrasound image displayed on the second region, a control command for storing the ultrasound image displayed on the second region, a control command for 3D rendering the ultrasound image displayed on the second region, a control command for adding an annotation or a body marker to the ultrasound image displayed on the second region, a control command for measuring a region of interest, and a control command for correcting analysis information of the ultrasound image displayed on the second region.
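On the apparatus side, applying the control information listed above amounts to dispatching on a command type. The sketch below is illustrative only: the command names, message format, and handler bodies are assumptions, not the specification's protocol.

```python
# Sketch of dispatching control commands received from the second device.
# Command names and the state dictionary are hypothetical.

def handle_control_info(command: dict, state: dict) -> dict:
    """Apply one control command to the apparatus display state."""
    kind = command["kind"]
    if kind == "select_image":
        state["displayed_image"] = command["image_id"]
    elif kind == "zoom":                       # expand or reduce
        state["zoom"] = state.get("zoom", 1.0) * command["factor"]
    elif kind == "add_annotation":
        state.setdefault("annotations", []).append(command["text"])
    elif kind == "add_body_marker":
        state["body_marker"] = command["marker"]
    else:
        raise ValueError(f"unknown control command: {kind}")
    return state

state = {"displayed_image": 1}
state = handle_control_info({"kind": "zoom", "factor": 2.0}, state)
state = handle_control_info({"kind": "add_annotation", "text": "ROI"}, state)
```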
  • hereinafter, a screen for displaying a plurality of ultrasound images acquired by a plurality of ultrasound apparatuses in the second device 2000 will be described with reference to FIGS. 15 through 18 .
  • FIG. 15 is a diagram showing a screen providing an ultrasound image list by the second device 2000 , according to an exemplary embodiment.
  • the screen of the second device 2000 may be divided into a first region 1510 for displaying an ultrasound image list and a second region 1520 displaying a main ultrasound image.
  • the ultrasound image list may include the real-time ultrasound images acquired by the ultrasound apparatus 3000 and/or the ultrasound images stored in advance.
  • the second device 2000 of an exemplary embodiment may distinguish the ultrasound images stored in advance and the real-time ultrasound images from each other by a mark ‘LIVE’ 1512 added to the real-time ultrasound images.
  • an ultrasound image 1511 and an ultrasound image 1512 are the real-time ultrasound images.
  • An ultrasound image 3, an ultrasound image 4, an ultrasound image 5, and an ultrasound image 6 may be the ultrasound images that are stored in advance and/or the ultrasound images that are not real-time.
  • the second device 2000 may display the ultrasound image 1511 on the second region 1520 as a main ultrasound image.
  • FIG. 15 shows the touch-and-drag gesture as an example of selecting the ultrasound image; however, the exemplary embodiments are not limited thereto.
  • the doctor may select the ultrasound image by using a tap gesture, a double tap gesture, a voice input, or a physical selection button.
  • the second device 2000 of an exemplary embodiment may communicate bi-directionally with the ultrasound apparatus 3000 that acquired the ultrasound image 1511 . This will be described below with reference to FIG. 16 .
  • the second device 2000 may display a GUI for receiving a confirmation input of the ultrasound image 1511 , that is, a confirmation button 1530 , on the screen.
  • the second device 2000 may receive a confirmation input of the medical expert about the ultrasound image 1511 via the confirmation button 1530 .
  • when the confirmation input is received, a color and/or a pattern of the confirmation button 1530 may be changed.
  • the second device 2000 may transmit a confirmation message of the ultrasound image 1511 to the ultrasound apparatus 3000 that acquired the ultrasound image 1511 .
  • FIG. 16 is a diagram showing a screen for displaying a chatting window and a pointer in the second device 2000 , according to an exemplary embodiment.
  • the medical expert is a doctor.
  • the second device 2000 of an exemplary embodiment may provide a chatting window 1610 for communicating with the ultrasound apparatus 3000 on a third region 1612 of the screen.
  • the second device 2000 may receive an input of the doctor about an ultrasound image 1600 via the chatting window 1610 , and may transmit the received input to the ultrasound apparatus 3000 .
  • the second device 2000 of an exemplary embodiment may receive an image message, a voice message, and a text message of the technician from the ultrasound apparatus 3000 , and may transmit an image message, a voice message, and a text message of the medical expert to the ultrasound apparatus 3000 via the chatting window 1610 .
  • the doctor may identify a result report generated by the technician after performing the ultrasound examination and may confirm that the report may be finished in the current status, or may input a message requesting correction of a result value of the report in the chatting window 1610 .
  • a message requesting a re-examination may be input to the chatting window 1610 .
  • the doctor may request the technician to perform an examination such as CT or MRI.
  • the doctor may inquire about the technician's opinion regarding the measurement values or may ask the technician about an operation status of the patient via the chatting window 1610 .
  • the second device 2000 may display a first pointer 1630 of the ultrasound apparatus 3000 and a second pointer 1620 of the second device 2000 on the ultrasound image 1600 displayed on the second device 2000 .
  • the second device 2000 may receive information about a location of the first pointer 1630 of the ultrasound apparatus 3000 from the ultrasound apparatus 3000 .
  • the ultrasound apparatus 3000 may display the first pointer 1630 of the ultrasound apparatus 3000 and the second pointer 1620 of the second device 2000 on the ultrasound image 1600 displayed on the ultrasound apparatus 3000 . Therefore, the technician and the doctor may identify areas of interest of each other by locating the pointer on a point of interest via the bi-directional chatting, in real time.
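One way to realize the shared-pointer behavior described above is for each side to broadcast its pointer position in coordinates normalized to the image, so the pointers remain aligned even when the two screens render the image at different sizes. The message format below is a hypothetical sketch, not the specification's protocol.

```python
# Sketch of mirroring pointer positions between the ultrasound apparatus and
# the second device. Message fields and device identifiers are illustrative.

def make_pointer_message(device_id: str, x: float, y: float) -> dict:
    # Coordinates normalized to the image (0.0-1.0) stay valid regardless
    # of each screen's display size.
    assert 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0
    return {"type": "pointer", "device": device_id, "x": x, "y": y}

def apply_pointer_message(pointers: dict, message: dict) -> dict:
    """Update the locally rendered pointer table from a received message."""
    pointers[message["device"]] = (message["x"], message["y"])
    return pointers

pointers = {}
pointers = apply_pointer_message(pointers, make_pointer_message("apparatus", 0.4, 0.6))
pointers = apply_pointer_message(pointers, make_pointer_message("second_device", 0.7, 0.2))
```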
  • FIG. 17 is a diagram showing an example of a GUI for remotely controlling the ultrasound apparatus 3000 by the second device 2000 , according to an exemplary embodiment.
  • the second device 2000 of an exemplary embodiment may sense a pinch input on a main ultrasound image 1710 .
  • the second device 2000 may transmit a control command for expanding or reducing the main ultrasound image 1710 to the ultrasound apparatus 3000 based on the pinch input.
  • when a distance between the fingers touching the screen increases, a control command for expanding the main ultrasound image 1710 according to the distance between the fingers may be transmitted to the ultrasound apparatus 3000 .
  • when the distance between the fingers decreases, a control command for reducing the main ultrasound image 1710 may be transmitted to the ultrasound apparatus 3000 .
  • the second device 2000 remotely controls the ultrasound apparatus 3000 to expand or reduce the main ultrasound image 1710 and to transmit the expanded or reduced ultrasound image to the second device 2000 in real-time, while maintaining the resolution.
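The pinch-to-zoom mapping above can be sketched by taking the zoom factor as the ratio of the final to the initial distance between the two touch points. The function name and command format are illustrative assumptions.

```python
# Sketch of turning a pinch gesture into an expand/reduce control command.
# The command dictionary is a hypothetical message format.

import math

def pinch_to_zoom_command(start_points, end_points):
    """Map a two-finger pinch to a zoom command for the apparatus."""
    def distance(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    factor = distance(*end_points) / distance(*start_points)
    action = "expand" if factor > 1.0 else "reduce"
    return {"kind": "zoom", "action": action, "factor": factor}

# Fingers moving apart (10 px -> 20 px) yields an expand command.
cmd = pinch_to_zoom_command([(0, 0), (10, 0)], [(0, 0), (20, 0)])
```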
  • the second device 2000 may transmit a control command for adjusting a direction of a probe to the ultrasound apparatus 3000 based on the medical expert's input.
  • the second device 2000 may include a remote controller and may remotely control the probe direction of the ultrasound apparatus 3000 according to the medical expert's input, by manipulating a remote control button 1720 displayed on the screen to an upper/lower/left/right position, or another position therebetween.
  • FIGS. 18A and 18B are diagrams showing an example of a GUI for adding a body marker to the ultrasound image remotely by the second device 2000 , according to an exemplary embodiment.
  • the body marker according to an exemplary embodiment may be a figure for identifying an object or a location onto which ultrasound waves are scanned.
  • the body marker of an exemplary embodiment may include a figure representing an object to which the ultrasound wave is scanned and a figure representing a location of the probe contacting the object.
  • An example of the body marker may be an arm shape, a liver shape, or a uterus shape.
  • the second device 2000 may receive an input of the medical expert for adding a body marker 1810 to the ultrasound image.
  • the second device 2000 may transmit to the ultrasound apparatus 3000 a control command for adding the body marker 1810 selected by the medical expert to the ultrasound image displayed on the ultrasound apparatus 3000 .
  • the ultrasound apparatus 3000 receives the control command for adding the body marker from the second device 2000 , and may add a body marker 1820 to the ultrasound image displayed on the screen.
  • the medical expert may control the ultrasound apparatus 3000 via the second device 2000 from a remote distance.
  • FIGS. 18A and 18B show an example of a GUI for adding the body marker; however, the exemplary embodiments are not limited thereto.
  • the second device 2000 may remotely control the ultrasound apparatus 3000 to add an annotation, to re-measure the region of interest, or to perform 3D image rendering.
  • the second device 2000 may control the ultrasound apparatus 3000 to display a certain ultrasound image, and thus, the medical expert may select and display a desired ultrasound image from the thumbnail images or review window displayed on the ultrasound apparatus 3000 , and then, check the selected ultrasound image. The medical expert may expand another ultrasound image while reviewing the selected ultrasound image.
  • the medical expert may control the ultrasound apparatus 3000 remotely to store the ultrasound image again, or may directly correct the report generated by the technician.
  • FIG. 19 is a flowchart illustrating a communication method between the second device 2000 and the patient device, according to an exemplary embodiment.
  • it is assumed that the second device 2000 is the medical expert device and the first device 1000 is the patient device.
  • the second device 2000 may display an ultrasound image on the screen.
  • the ultrasound image displayed on the second device 2000 may be an ultrasound image acquired by the ultrasound apparatus 3000 in real-time, or an ultrasound image stored in a memory in advance.
  • the second device 2000 may receive a description, from the medical expert, about the ultrasound image displayed on the screen.
  • the description by the medical expert may include representing a region of interest (for example, a tumor, an embryo, a head of the embryo, a hand of the embryo, etc.), a description about the region of interest, an analysis result of the measurement value (for example, head girth, the number of fingers of an embryo, etc.), or existence of lesions; however, the exemplary embodiments are not limited thereto.
  • the second device 2000 may transmit the description of the medical expert to the first device 1000 .
  • the first device 1000 may be a patient device displaying the ultrasound image that is the same as that displayed on the second device 2000 .
  • the second device 2000 may transmit the description by the medical expert to the first device 1000 through wired or wireless communication. Otherwise, the second device 2000 may transmit the description by the medical expert to the first device 1000 via the server 4000 .
  • the second device 2000 may transmit the description by the medical expert about the ultrasound image input from the medical expert to the ultrasound apparatus 3000 .
  • the second device 2000 may transmit the description by the medical expert simultaneously to both the first device 1000 and the ultrasound apparatus 3000 , or may transmit them sequentially.
  • the first device 1000 may display the received description by the medical expert on the ultrasound image which is displayed on the screen of the first device 1000 .
  • the description by the medical expert may be displayed on the first device 1000 in real-time or may be stored in the first device 1000 .
  • displaying of the description by the medical expert on the first device 1000 in real-time denotes that the description by the medical expert is displayed on the first device 1000 within a predetermined time period after the input of the description to the second device 2000 .
  • FIG. 20 shows screens of the ultrasound apparatus 3000 , the first device 1000 , and the second device 2000 , according to an exemplary embodiment.
  • the ultrasound apparatus 3000 may transmit the ultrasound image displayed on the screen 2010 to the first device 1000 and the second device 2000 in real-time.
  • the ultrasound apparatus 3000 may display a GUI for adding an annotation or color to the ultrasound image for the user of the first device 1000 , and a confirmation button to request the confirmation of the ultrasound image from the second device 2000 .
  • the ultrasound image displayed on the screen that the user (for example, the technician) of the ultrasound apparatus 3000 watches may be displayed on the first device 1000 and the second device 2000 in real-time.
  • the information about the ultrasound image displayed on the first device 1000 and the second device 2000 may vary depending on the sharing levels of the first device 1000 and the second device 2000 .
  • the ultrasound image, the annotation for describing the ultrasound image, and the identification mark of the region of interest included in the ultrasound image may be displayed on the screen of the first device 1000 according to the sharing level of the first device 1000 .
  • the ultrasound image, the information about the patient, the measurement information with respect to the ultrasound image, the chatting window for communicating with the user of the ultrasound apparatus 3000 , the confirmation button for confirming the ultrasound image, and the GUI for remotely controlling the ultrasound apparatus 3000 may be displayed on the screen of the second device 2000 according to the sharing level of the second device 2000 .
  • the second device 2000 may display information about a non-real time ultrasound image, as well as the real-time ultrasound image, on the screen thereof.
  • the second device 2000 displays the ultrasound image list including the real-time ultrasound image and the ultrasound image that is stored in advance on the first region of the screen, and may display one ultrasound image selected from the ultrasound image list on the second region of the screen.
  • the second device 2000 may transmit the input description 2001 to the first device 1000 .
  • the first device 1000 may display the description 2001 transmitted from the second device 2000 on the ultrasound image.
  • FIGS. 21 and 22 are block diagrams showing an ultrasound apparatus 3000 , according to an exemplary embodiment.
  • the ultrasound apparatus 3000 of an exemplary embodiment may include an ultrasound image obtainer 3100 , a communicator 3200 , and a controller 3300 .
  • the ultrasound apparatus 3000 may include the illustrated elements and other elements, or may include only some of the illustrated elements.
  • the ultrasound apparatus 3000 may further include a user input unit 3400 , a memory 3500 , and a display 3600 .
  • the above components may be connected to each other via a bus 3700 .
  • the ultrasound image obtainer 3100 of an exemplary embodiment may acquire ultrasound image data about an object.
  • the ultrasound image data may be 2D ultrasound image data or 3D ultrasound image data about the object.
  • the ultrasound image obtainer 3100 may include a probe 20 , an ultrasound transceiver 3110 , and an image processor 3120 .
  • the probe 20 sends an ultrasound signal to an object 10 according to a driving signal applied from the ultrasound transceiver 3110 , and receives an echo signal reflected by the object 10 .
  • the probe 20 may include a plurality of transducers that vibrate according to an electric signal to generate ultrasound waves, that is, acoustic energy.
  • the probe 20 may be connected to a main body of the ultrasound apparatus 3000 via a wire or wirelessly, and the ultrasound apparatus 3000 may include a plurality of probes 20 according to an exemplary embodiment.
  • the probe 20 may include at least one of a one-dimensional (1D) probe, a 1.5-dimensional (1.5D) probe, a 2D (matrix) probe, and a 3D probe.
  • a transmitter 3111 applies a driving signal to the probe 20 , and may include a pulse generator 3113 , a transmission delay unit 3114 , and a pulser 3115 .
  • the pulse generator 3113 generates pulses for forming a transmission ultrasound according to a predetermined pulse repetition frequency (PRF), and the transmission delay unit 3114 applies a delay time for determining transmission directionality to the pulses.
  • Each of the pulses to which the delay time is applied corresponds to each of a plurality of piezoelectric vibrators included in the probe 20 .
  • the pulser 3115 applies the driving signal (or the driving pulse) to the probe 20 at timings corresponding to the pulses to which the delay time is applied.
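The transmission delays applied per element can be sketched as follows: each transducer is delayed so that all pulses arrive at the focal point simultaneously, with elements closer to the focus firing later. The array geometry and speed-of-sound value are illustrative.

```python
# Sketch of the transmit-delay computation performed by the transmission
# delay unit: align pulse arrival times at a chosen focal point.

import math

SPEED_OF_SOUND = 1540.0  # m/s, a typical soft-tissue value

def transmit_delays(element_positions_m, focus_m):
    """Per-element delays (seconds) focusing at point (x, z) in meters."""
    distances = [math.hypot(x - focus_m[0], 0.0 - focus_m[1])
                 for x in element_positions_m]
    farthest = max(distances)
    # Elements closer to the focus fire later, so all pulses coincide.
    return [(farthest - d) / SPEED_OF_SOUND for d in distances]

elements = [-0.005, 0.0, 0.005]                  # 3 elements, 5 mm pitch
delays = transmit_delays(elements, (0.0, 0.03))  # focus 3 cm deep, on axis
```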
  • a receiver 3112 processes an echo signal transmitted from the probe 20 to generate ultrasound data, and may include an amplifier 3116 , an analog-to-digital converter (ADC) 3117 , a reception delay unit 3118 , and a combiner 3119 .
  • the amplifier 3116 amplifies the echo signal for each of the channels, and the ADC 3117 converts the amplified analog echo signal into a digital signal.
  • the reception delay unit 3118 applies a delay time for determining a reception directionality to the echo signal that is converted to a digital signal, and the combiner 3119 combines echo signals processed by the reception delay unit 3118 to output the ultrasound data to a data processor 3121 .
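The receive path above, per-channel delays followed by combining, is classic delay-and-sum beamforming, sketched below with integer sample delays. The sample data and delay values are illustrative only.

```python
# Sketch of the receive path: the reception delay unit shifts each channel's
# digitized samples, and the combiner sums the aligned channels
# (delay-and-sum beamforming). Data and delays are illustrative.

def delay_and_sum(channels, sample_delays):
    """Shift each channel by its delay (in samples) and sum them."""
    length = max(len(ch) + d for ch, d in zip(channels, sample_delays))
    out = [0.0] * length
    for ch, d in zip(channels, sample_delays):
        for i, sample in enumerate(ch):
            out[i + d] += sample
    return out

# Two channels carrying the same echo, offset by one sample: after the
# delay aligns them, the echo adds coherently.
ch_a = [0.0, 1.0, 0.0]
ch_b = [1.0, 0.0, 0.0]
beamformed = delay_and_sum([ch_a, ch_b], [0, 1])
```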
  • the image processor 3120 generates ultrasound images by performing a scan conversion of the ultrasound data generated by the ultrasound transceiver 3110 .
  • the ultrasound image may include a gray scale image obtained by scanning the object in an amplitude (A) mode, a brightness (B) mode, and a motion (M) mode, and a Doppler image representing a moving object by using the Doppler effect.
  • the Doppler image may include a bloodstream Doppler image (or color Doppler image) representing flow of blood, a tissue Doppler image representing movement of tissues, and a spectral Doppler image representing velocity of the object as a waveform.
  • a B mode processor 3123 extracts B mode components from the ultrasound data, and processes the B mode components.
  • An image generator 3122 may generate an ultrasound image in which an intensity of a signal is represented by brightness based on the B mode components extracted by the B mode processor 3123 .
  • a Doppler processor 3124 extracts Doppler components from the ultrasound data, and the image generator 3122 may generate a Doppler image in which movement of the object is represented by a color or a waveform based on the extracted Doppler components.
  • the image generator 3122 may generate a 3D ultrasound image by performing a volume rendering process of volume data, and may generate an elastic mode image in which a transformation degree of the object 10 according to a pressure is imaged.
  • the image generator 3122 may represent various pieces of additional information as a text or a graphic on the ultrasound image. For example, the image generator 3122 may add at least one annotation relating to all or some of the ultrasound image to the ultrasound image. That is, the image generator 3122 may analyze the ultrasound image, and may recommend at least one annotation relating to all or some of the ultrasound image based on a result of the analysis. The image generator 3122 may add at least one annotation selected by the user to the ultrasound image.
  • the image processor 3120 may extract a region of interest from the ultrasound image by using an image processing algorithm.
  • the image processor 3120 may add a color, a pattern, or a boundary to the region of interest.
  • the communicator 3200 may include one or more components enabling communications between the ultrasound apparatus 3000 and the first device 1000 , the ultrasound apparatus 3000 and the second device 2000 , and the ultrasound apparatus 3000 and the server 4000 .
  • the communicator 3200 may include a short range communication module 3210 , a wired communication module 3220 , a mobile communication module 3230 , etc.
  • the short distance communication module 3210 refers to a module for short distance communication. Examples of short distance communication technologies may include wireless LAN (WLAN, e.g., Wi-Fi), Bluetooth, Bluetooth low energy (BLE), ultra wideband (UWB), ZigBee, near field communication (NFC), Wi-Fi Direct (WFD), and infrared data association (IrDA).
  • the wired communication module 3220 is a communication module using an electric signal or an optical signal.
  • as a wired communication technology according to an exemplary embodiment, a pair cable, a coaxial cable, an optical fiber cable, an Ethernet cable, etc. may be used.
  • the mobile communication module 3230 transmits and receives wireless signals to and from at least one of a base station, an external device, and a server on a mobile communication network.
  • the wireless signal may include various types of data according to a voice call signal, a video call signal, or text/multimedia message transmission.
  • the communicator 3200 is connected to a network 30 by using a wire or wirelessly to communicate with an external device (for example, the first device 1000 or the second device 2000 ) or the server 4000 .
  • the communicator 3200 may transmit and receive data to and from a hospital server or other medical equipment in the hospital, which are connected to the communicator 3200 via a picture archiving and communication system (PACS).
  • the communicator 3200 may perform data communication according to digital imaging and communications in medicine (DICOM).
  • the communicator 3200 may transmit data relating to diagnosis of the object, for example, ultrasound images, ultrasound data, Doppler data, etc., of the object 10 , via the network 30 , and may receive medical images captured by other medical apparatuses such as a CT, an MRI, an X-ray, etc.
  • the communicator 3200 may receive information about a medical history or treatment schedule of a patient from the server 4000 , to be used in the diagnosis of the object 10 .
  • the communicator 3200 may transmit first ultrasound information about the ultrasound image to the first device 1000 and second ultrasound information about the ultrasound image to the second device 2000 according to the sharing levels that are set in advance.
  • the communicator 3200 may receive a message about the ultrasound image from the second device 2000 .
  • the message may include a voice message, a text message, or a video message.
  • the communicator 3200 may transmit a user input to the second device 2000 , and may receive a response message from the second device 2000 .
  • the communicator 3200 may transmit a request for confirmation of the second ultrasound information to the second device 2000 , and may receive a confirmation message of the second ultrasound information from the second device 2000 .
  • the communicator 3200 may receive control information from the second device 2000 .
  • the control information may include at least one of a control command for selecting and displaying a different ultrasound image from a currently displayed image, a control command for expanding or reducing the ultrasound image, a control command for storing the ultrasound image, a control command for 3D rendering the ultrasound image, a control command for adding an annotation or a body marker, a control command for measuring a region of interest, and a control command for correcting analysis information of the ultrasound image; however, the exemplary embodiments are not limited thereto.
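The control commands listed above lend themselves to a simple dispatch table. The sketch below is a hypothetical illustration of how the controller 3300 might route commands received from the second device 2000; the command identifiers, handler functions, and state fields are assumptions for illustration, not names from the patent.

```python
# Illustrative handlers for a few of the control commands named above.
# The state dict stands in for the ultrasound apparatus's display state.

def zoom_image(state, factor=2.0):
    """Expand or reduce the displayed ultrasound image."""
    state["zoom"] = state.get("zoom", 1.0) * factor
    return state

def store_image(state):
    """Store the currently displayed ultrasound image."""
    state.setdefault("stored", []).append(state["current_image"])
    return state

def add_annotation(state, text="annotation"):
    """Add an annotation to the displayed ultrasound image."""
    state.setdefault("annotations", []).append(text)
    return state

# Dispatch table mapping command identifiers to handlers.
COMMAND_HANDLERS = {
    "EXPAND_IMAGE": zoom_image,
    "STORE_IMAGE": store_image,
    "ADD_ANNOTATION": add_annotation,
}

def execute_control_command(state, command, **kwargs):
    """Execute a control command transmitted from the medical expert device."""
    handler = COMMAND_HANDLERS.get(command)
    if handler is None:
        raise ValueError(f"unsupported control command: {command}")
    return handler(state, **kwargs)
```

A dispatch table keeps the set of supported commands explicit, which matches the patent's closed-but-extensible list ("the exemplary embodiments are not limited thereto").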
  • the communicator 3200 may request the server 4000 for authentication of the first device 1000 and the second device 2000 , and if the authentication succeeds, the communicator 3200 may receive sharing level information of the first device 1000 and the second device 2000 from the server 4000 .
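The authentication exchange just described can be sketched as a lookup against the server 4000. The registry contents, device identifiers, and level numbers below are mocked assumptions; only the flow (authenticate first, then receive the sharing level) comes from the text.

```python
# Mock of the server 4000's device registry. In practice this would be a
# remote service; here it is a dict so the flow is self-contained.
MOCK_REGISTRY = {
    "device-1000": {"authenticated": True, "sharing_level": 1},  # patient device
    "device-2000": {"authenticated": True, "sharing_level": 2},  # medical expert device
}

def request_authentication(device_id, registry=MOCK_REGISTRY):
    """Return the device's sharing level if authentication succeeds, else None."""
    record = registry.get(device_id)
    if record is None or not record["authenticated"]:
        return None
    return record["sharing_level"]
```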
  • the controller 3300 controls operations of the ultrasound apparatus 3000 . That is, the controller 3300 may control the ultrasound image obtainer 3100 , the communicator 3200 , the user input unit 3400 , the memory 3500 , and the display 3600 .
  • the controller 3300 may identify the sharing level of each of the first and second devices 1000 and 2000 for sharing the ultrasound image.
  • the controller 3300 may generate first ultrasound information corresponding to a first sharing level of the first device 1000 , and second ultrasound information corresponding to a second sharing level of the second device 2000 .
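Generating per-device ultrasound information reduces to filtering one full information set by the fields each sharing level permits. The field names and the level-to-field mapping below are assumptions for illustration; the patent specifies only that a lower level receives a smaller type and/or amount of information.

```python
# Assumed mapping from sharing level to permitted fields.
# Level 1: patient device; level 2: medical expert device.
FIELDS_BY_LEVEL = {
    1: {"ultrasound_image", "annotations", "roi_marks"},
    2: {"ultrasound_image", "annotations", "roi_marks",
        "measurements", "technician_analysis", "patient_info"},
}

def build_ultrasound_info(full_info, sharing_level):
    """Keep only the fields permitted at the given sharing level."""
    allowed = FIELDS_BY_LEVEL[sharing_level]
    return {k: v for k, v in full_info.items() if k in allowed}
```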
  • the controller 3300 may enable the touched annotation to move.
  • the controller 3300 may select a region of interest from the ultrasound image, and may indicate an identification mark on the region of interest.
  • the controller 3300 may execute a control command corresponding to the control information transmitted from the second device 2000 .
  • the user input unit 3400 is a unit used by the user (for example, the technician) to input data for controlling the ultrasound apparatus 3000 .
  • the user input unit 3400 may be a keypad, a dome switch, a touch pad (a contact capacitance type, a pressure resistive type, an infrared sensing type, a surface ultrasound transfer type, an integral tension measurement type, a piezo effect type, etc.), a trackball, or a jog switch; however, the exemplary embodiments are not limited thereto.
  • the user input unit 3400 may further include an electrocardiogram (ECG) measuring module, a respiratory sensor module, a voice recognition sensor, a gesture recognition sensor, a fingerprint recognition sensor, an iris recognition sensor, a depth sensor, a distance sensor, etc.
  • the user input unit 3400 of an exemplary embodiment may detect a proximity touch, as well as a real-touch.
  • the user input unit 3400 may sense a touch input on the ultrasound image (for example, a touch-and-hold, a tap, a double tap, or a flicking operation).
  • the user input unit 3400 may sense a drag input from a point where a touch input is sensed.
  • the user input unit 3400 may sense a multiple touch input (for example, a pinch operation) on at least two or more points included in the ultrasound image.
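Distinguishing the touch inputs mentioned above (tap, touch-and-hold, drag) typically comes down to duration and travel distance of one touch sequence. The sketch below classifies a list of (timestamp, x, y) samples; the thresholds are arbitrary assumptions, not values from the patent.

```python
# Assumed thresholds (seconds, pixels) for classifying a touch sequence.
TAP_MAX_DURATION = 0.3
DRAG_MIN_DISTANCE = 10.0

def classify_touch(samples):
    """Classify one touch sequence given as [(t, x, y), ...] from down to up."""
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    distance = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    if distance >= DRAG_MIN_DISTANCE:
        return "drag"            # finger moved: drag input
    if t1 - t0 <= TAP_MAX_DURATION:
        return "tap"             # short, stationary contact
    return "touch-and-hold"      # long, stationary contact
```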
  • the user input unit 3400 may receive a selection of at least one annotation included in the annotation list displayed on the screen.
  • the user input unit 3400 may receive a drag and drop input that involves dragging the selected annotation to the region where the ultrasound image is displayed and dropping the annotation, and may sense a drag input of the user onto the annotation displayed on the ultrasound image.
  • the user input unit 3400 may receive a user input about the ultrasound image through the chatting window.
  • the memory 3500 may store programs for processing and controlling the controller 3300 , and may store input/output data (for example, the annotation list set in advance, the ultrasound images, the patient information, probe information, body markers, etc.).
  • the memory 3500 may include a storage medium including at least one of a flash memory, a hard disk, a micro multimedia card, a card-type memory (e.g., a secure digital (SD) or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk.
  • the ultrasound apparatus 3000 may operate web storage or a cloud server for performing the storing function of the memory 3500 on the Internet.
  • the display 3600 displays and outputs the information processed by the ultrasound apparatus 3000 .
  • the display 3600 displays the ultrasound images, or may display a user interface (UI) or GUI related to a control panel.
  • the display 3600 may display a list of devices that may communicate with the ultrasound apparatus 3000 , an annotation list about the plurality of annotations that are set in advance, and an activated annotation list about the annotations displayed on the screen.
  • the display 3600 may display an annotation selected from the annotation list on the ultrasound image.
  • the display 3600 may display the selected annotation based on the drag and drop input about the annotation included in the annotation list.
  • the display 3600 may move the displayed location of the annotation according to the drag input of the user.
  • the display 3600 may display an activated annotation list about at least one annotation added to the ultrasound image on a predetermined region of the screen.
  • the display 3600 of an exemplary embodiment may display a message received from the second device 2000 and may provide a chatting window for communicating with the second device 2000 on the screen thereof.
  • the display 3600 may be used as an input device as well as an output device.
  • the display 3600 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED), a flexible display, a 3D display, and an electrophoretic display.
  • the ultrasound apparatus 3000 may include two or more displays 3600 .
  • FIG. 23 is a block diagram showing a second device 2000 according to an exemplary embodiment.
  • the second device 2000 of an exemplary embodiment may include a display 210 , a user input unit 220 , a communicator 230 , and a controller 240 .
  • the second device 2000 may include more or fewer components than those shown in FIG. 23.
  • the second device 2000 may include a mobile terminal and/or a stationary terminal.
  • Examples of the mobile terminal may include a laptop computer, a personal digital assistant (PDA), a tablet PC, etc.
  • the display 210 displays and outputs information processed by the second device 2000 .
  • the display 210 may display an ultrasound image list including at least one ultrasound image acquired by at least one ultrasound apparatus 3000 on a first region of the screen, and may display an ultrasound image selected from the ultrasound image list on a second region of the screen.
  • the ultrasound image list of an exemplary embodiment may include real-time ultrasound images transmitted from at least one ultrasound apparatus and ultrasound images stored in advance.
  • alternatively, the ultrasound image list may include only real-time ultrasound images transmitted from the ultrasound apparatus 3000 , or only ultrasound images stored in advance.
  • the display 210 may display at least one of analysis information of the technician about the ultrasound image, measurement information with respect to the ultrasound image, and patient information, in addition to the ultrasound image, according to a sharing level of the second device 2000 .
  • the display 210 may indicate a pointer of the ultrasound apparatus 3000 and a pointer of the second device 2000 , and may provide a chatting window for communicating with the ultrasound apparatus 3000 on a third region of the screen.
  • the display 210 may display a GUI for receiving a confirmation input about the ultrasound image from the medical expert (for example, a doctor).
  • the user input unit 220 is a unit for inputting data for controlling the second device 2000 from the medical expert (for example, the doctor).
  • the user input unit 220 may be a keypad, a dome switch, a touch pad (a contact capacitance type, a pressure resistive type, an infrared sensing type, a surface ultrasound transfer type, an integral tension measurement type, a piezo effect type, etc.), a trackball, or a jog switch; however, the exemplary embodiments are not limited thereto.
  • when a touch pad is layered with a display panel, the combined structure is referred to as a touch screen.
  • the user input unit 220 of an exemplary embodiment may detect a proximity touch, as well as a real-touch.
  • the user input unit 220 may sense a touch input on the ultrasound image (for example, a touch-and-hold, a tap, a double tap, or a flicking operation).
  • the user input unit 220 may sense a drag input from a point where a touch input is sensed.
  • the user input unit 220 may sense a multiple touch input (for example, a pinch operation) on at least two or more points included in the ultrasound image.
  • the user input unit 220 may receive a selection of an ultrasound image from the ultrasound image list.
  • the user input unit 220 may receive an input of the user about the ultrasound image through the chatting window, and may receive a confirmation input about the ultrasound image through the GUI for receiving the confirmation input.
  • the user input unit 220 may receive a description by the medical expert about the ultrasound image.
  • the description by the medical expert may include representing a region of interest (for example, a tumor, an embryo, a head of the embryo, a hand of the embryo, etc.), a description about the region of interest, an analysis result of the measurement value (for example, head girth, the number of fingers of an embryo, etc.), or existence of lesions; however, the exemplary embodiments are not limited thereto.
  • the communicator 230 may include one or more components enabling communications between the ultrasound apparatus 3000 and the first device 1000 , the ultrasound apparatus 3000 and the second device 2000 , and the ultrasound apparatus 3000 and the server 4000 .
  • the communicator 230 may include a short distance communication module, a mobile communication module, a wired internet module, a wireless internet module, etc.
  • the short distance communication module refers to a module for short distance communication.
  • Examples of the short-distance communication technology include wireless LAN (WLAN, e.g., Wi-Fi), Bluetooth, Bluetooth Low Energy (BLE), ultra-wideband (UWB), ZigBee, near field communication (NFC), Wi-Fi Direct (WFD), and Infrared Data Association (IrDA); however, the exemplary embodiments are not limited thereto.
  • the mobile communication module transmits and receives wireless signals to and from at least one of a base station, an external device, and a server on a mobile communication network.
  • the wireless internet module is a module for accessing wireless internet, and may be built into or provided outside the second device 2000 .
  • the wired internet module is a module for accessing wired internet.
  • the communicator 230 may communicate with the ultrasound apparatus 3000 that acquires the ultrasound image selected by the medical expert (for example, the doctor). For example, the communicator 230 may transmit an input about the ultrasound image to the ultrasound apparatus 3000 . The communicator 230 may receive a request for confirmation about the selected ultrasound image from the ultrasound apparatus 3000 , and may transmit the confirmation information about the selected ultrasound image to the ultrasound apparatus 3000 .
  • the communicator 230 may transmit control information for controlling the ultrasound apparatus 3000 to the ultrasound apparatus 3000 .
  • the communicator 230 may transmit to the ultrasound apparatus 3000 at least one of a control command for selecting and displaying another ultrasound image, a control command for expanding or reducing the ultrasound image, a control command for storing the ultrasound image, a control command for 3D rendering the ultrasound image, a control command for adding an annotation or a body marker, a control command for measuring a region of interest, and a control command for correcting analysis information of the ultrasound image.
  • the communicator 230 may transmit the description by the medical expert to the first device 1000 .
  • the controller 240 controls overall operations of the second device 2000 . That is, the controller 240 may control the display 210 , the user input unit 220 , and the communicator 230 .
  • the controller 240 may indicate an identification mark on the real-time ultrasound image for distinguishing the real-time ultrasound image from the ultrasound images that are stored in advance.
  • the above-described exemplary embodiments may be implemented with at least one processor and include a computer-readable medium including program instructions for executing various operations realized by a computer.
  • the computer-readable medium may include program instructions, a data file, and a data structure, separately or cooperatively.
  • the program instructions and the media may be specially designed and constructed for the purposes of the exemplary embodiments, or may be of a kind well known to those skilled in the computer software arts.
  • Examples of the computer-readable media include magnetic media (e.g., hard disks, floppy disks, and magnetic tapes), optical media (e.g., CD-ROMs or DVD), magneto-optical media (e.g., floptical disks), and hardware devices (e.g., ROMs, RAMs, or flash memories, etc.) that are specially configured to store and perform program instructions.
  • the media may also be transmission media such as optical or metallic lines, wave guides, etc. specifying the program instructions, data structures, etc.
  • Examples of the program instructions include both machine code, such as that produced by a compiler, and files containing high-level language code that may be executed by the computer using an interpreter.
  • the ultrasound apparatus may transmit appropriate ultrasound information to a receiving device according to a sharing authority of the receiving device (for example, the patient device and the medical expert device).
  • the sharing authority level may be determined so that excessive exposure of the ultrasound information may be prevented.
  • the technician of the ultrasound apparatus 3000 may acquire the ultrasound image of the patient, in operation S 2410 , and may perform initial analysis and processing of the ultrasound image, in operation S 2411 .
  • the ultrasound apparatus 3000 may provide the first device 1000 with the ultrasound information that the patient may easily comprehend, in operation S 2412 .
  • the patient may be provided with a display of the ultrasound image in which the features of interest are signified by color and annotated and/or explained by text.
  • the ultrasound apparatus 3000 may provide the second device 2000 of the medical expert (for example, the doctor) with more detailed information.
  • the doctor may be additionally provided with detailed medical history of the patient, report of the technician performing the imaging procedure of the patient, etc.
  • the operations S 2412 and S 2414 may be performed as a single operation in which the transmitted information is filtered by the different sharing authority levels assigned to the first device 1000 and the second device 2000 .
  • the exchange of the above information may be performed in real time or substantially contemporaneously with the imaging procedure performed by the technician who operates the ultrasound.
  • the doctor may provide expeditious feedback and may direct appropriate operations, for example, during the imaging procedure or while a patient is being transferred to a hospital in an emergency; as a result, the number of ultrasound images taken and the time required for diagnosis and the imaging procedure may be minimized.
  • the doctor located in a remote location may check the image and remotely direct the ultrasound apparatus to re-take the image, change the scan direction, take a different image, perform calibration, change imaging parameters, etc. (operations S 2420 and S 2422 ).
  • the doctor may control the scan direction and movement of the ultrasound probe, by operating the remote controller provided via the doctor's device.
  • the remote location may be a location in a different or isolated geographic locality from that of the ultrasound apparatus and the patient.
  • the patients may better comprehend the images and may diagnose themselves, i.e., a self-diagnosis is possible.
  • the method of sharing information about ultrasound images between the ultrasound apparatus 3000 and at least one external device according to a sharing level is described as an example; however, exemplary embodiments are not limited thereto.
  • other medical imaging apparatuses in addition to the ultrasound apparatus 3000 may share information about medical images with the external device according to a sharing level of the external device.
  • a method of sharing information about medical images by the medical imaging apparatus with at least one external device will be described with reference to FIG. 25 .
  • FIG. 25 is a flowchart illustrating an information sharing method of a medical imaging apparatus according to an exemplary embodiment.
  • the medical imaging apparatus may acquire a medical image of an object.
  • the medical imaging apparatus may directly generate the medical image or may receive the medical image from outside.
  • the medical imaging apparatus may include a magnetic resonance imaging (MRI) apparatus, a computerized tomography (CT) apparatus, an X-ray imaging apparatus, an angiography apparatus, and the like; however, exemplary embodiments are not limited thereto.
  • An MRI apparatus is an apparatus for acquiring a sectional image of a part of an object by expressing, in a contrast comparison, the strength of a magnetic resonance (MR) signal with respect to a radio frequency (RF) signal generated in a magnetic field having a specific strength.
  • the CT apparatus may express an inner structure (e.g., an organ such as a kidney, a lung, etc.) of the object without an overlap therebetween.
  • the CT apparatus may obtain a plurality of pieces of image data with a thickness not more than 2 mm several tens to several hundreds of times per second and then may process the plurality of pieces of image data, so that the CT apparatus may provide a relatively accurate cross-sectional image of the object.
  • An X-ray imaging apparatus is an apparatus for imaging internal organic structure of an object by transmitting an X-ray through the object.
  • An angiography apparatus is an apparatus for imaging blood vessels (arteries and veins) by transmitting an X-ray while a contrast medium is injected through a thin tube, about 2 mm in diameter, called a catheter.
  • the medical imaging apparatus may be realized in various formats.
  • the medical imaging apparatus recited in the present specification may be configured as a mobile terminal type, as well as a stationary terminal type.
  • the mobile terminal may include smartphones, laptop computers, personal digital assistants (PDAs), tablet PCs, and the like.
  • the medical image may include an MRI, a CT image, an ultrasound image, and an X-ray image; however, the exemplary embodiments are not limited thereto.
  • the medical image may be a two-dimensional image, a three-dimensional image, or a four-dimensional image; however, the exemplary embodiments are not limited thereto.
  • the medical image is assumed as a two-dimensional X-ray image for the convenience of description.
  • the medical imaging apparatus may select an external device, with which information about a medical image will be shared.
  • the medical imaging apparatus may select a first device 1000 and a second device 2000 that are connected to the medical imaging apparatus based on system settings.
  • the medical imaging apparatus may select the first device 1000 and the second device 2000 based on a user input.
  • the medical imaging apparatus may provide a device list including identification information of devices that may communicate with the medical imaging apparatus on a screen.
  • the medical imaging apparatus may receive a user input for the first device 1000 and the second device 2000 in the device list.
  • the user of the medical imaging apparatus according to the present exemplary embodiment may be an ultrasound technician, a radiological technologist, an ambulance attendant, or an ordinary person; however, the exemplary embodiments are not limited thereto.
  • Information about the medical image may include at least one piece of information from among information about the medical image itself, annotation information for describing all or some of the medical image, identification mark information for identifying a region of interest in the medical image, analyzing information generated by the radiological technologist or the ultrasound technician, information about the object, and measuring information; however, the exemplary embodiments are not limited thereto.
  • the medical imaging apparatus may identify a sharing level of the selected external device.
  • the sharing level may include information representing authority of a user of the external device for checking the information about the medical image.
  • the sharing level may be set in advance by the information sharing system or a user.
  • the medical imaging apparatus may identify information about a first sharing level corresponding to identification information of the first device 1000 and/or a second sharing level corresponding to the identification information of the second device 2000 .
  • the medical imaging apparatus may receive information about the first sharing level and/or information about the second sharing level from a server 4000 . Also, according to another exemplary embodiment, the medical imaging apparatus may read the information about the first sharing level and/or the information about the second sharing level stored in the memory.
  • the first sharing level of the first device 1000 and the second sharing level of the second device 2000 may be different from each other.
  • for example, the sharing level of the first device 1000 (that is, the object device) may be lower than the sharing level of the second device 2000 (that is, the medical expert device).
  • a relatively low sharing level denotes that the type and/or amount of information that may be shared is relatively small.
  • the medical imaging apparatus may transmit information about the medical image to the external device based on the sharing level.
  • the medical imaging apparatus may transmit first medical image information corresponding to the first sharing level to the first device 1000 , or may transmit second medical image information corresponding to the second sharing level to the second device 2000 .
  • the medical imaging apparatus may generate the first medical image information and the second medical image information.
  • the medical imaging apparatus may generate the first medical image information corresponding to the first sharing level of the first device 1000 .
  • the medical imaging apparatus may generate the first medical image information including the medical image, annotation for describing the medical image, an identification mark on the region of interest, and measuring information according to the first sharing level of the object device.
  • the medical imaging apparatus may generate the second medical image information corresponding to the second sharing level of the second device 2000 .
  • the medical imaging apparatus may generate the second medical image information including the medical image, measuring information of the medical image, analyzing information of the medical image by the radiological technologist, and object information according to the second sharing level of the medical expert device.
  • the medical imaging apparatus may share the first medical image information including the annotation for describing the medical image, the identification mark on the region of interest, and the measuring information with the first device 1000 , and may share the second medical image information including the medical image, the measuring information of the medical image, the analyzing information of the medical image by the radiological technologist, and the object information with the second device 2000 .
  • the medical imaging apparatus may encode the first medical image information and/or the second medical image information with an encoding scheme that is negotiated or set in advance, for security, and may transmit the encoded first medical image information and/or second medical image information to the first device 1000 or the second device 2000 .
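The patent leaves the pre-negotiated encoding scheme unspecified. As a stdlib-only sketch of the idea, the snippet below signs the serialized payload with a pre-shared key (HMAC-SHA256) so the receiving device can detect tampering; a real deployment would additionally encrypt the channel (e.g., TLS). The key value and payload fields are illustrative assumptions.

```python
import hashlib
import hmac
import json

# Illustrative pre-shared key standing in for the scheme "negotiated in advance".
PRE_SHARED_KEY = b"negotiated-in-advance"

def encode_payload(info, key=PRE_SHARED_KEY):
    """Serialize the medical image information and attach an integrity tag."""
    body = json.dumps(info, sort_keys=True).encode()
    tag = hmac.new(key, body, hashlib.sha256).hexdigest()
    return {"body": body.decode(), "tag": tag}

def verify_payload(message, key=PRE_SHARED_KEY):
    """Recompute the tag on the receiving device and compare in constant time."""
    expected = hmac.new(key, message["body"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])
```

Note this provides integrity and authenticity only, not confidentiality; the hedge above matters because patient data would also require encryption in practice.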
  • FIG. 26 is a diagram showing an X-ray image acquired by an X-ray imaging apparatus according to an exemplary embodiment.
  • the X-ray imaging apparatus may acquire an X-ray chest image 2600 of an object by using the X-ray.
  • the X-ray imaging apparatus may select an external device with which information about the X-ray chest image 2600 will be shared.
  • the X-ray imaging apparatus may select an object device for describing the X-ray chest image 2600 to the object.
  • the X-ray imaging apparatus may identify a sharing level of the selected object device.
  • the X-ray imaging apparatus may generate information about the X-ray chest image 2600 that will be shared with the object device according to the sharing level.
  • for example, if the sharing level of the object device is a first level, the X-ray imaging apparatus generates a bone suppression image that is obtained by deleting bones from the X-ray chest image 2600 , and marks a region of interest on the bone suppression image. This will be described with reference to FIG. 27.
  • the X-ray imaging apparatus may generate a bone suppression image 2700 by deleting bones from the X-ray chest image 2600 so that the object may clearly see the lesion.
  • the X-ray imaging apparatus selects a region that is suspected of containing a nodule on the bone suppression image 2700 as a region of interest 2710 , and displays a mark for the region of interest 2710 .
  • the X-ray imaging apparatus may select the region of interest 2710 automatically or based on a user input.
  • the region of interest 2710 is marked by a circle; however, the exemplary embodiments are not limited thereto.
  • the X-ray imaging apparatus may transmit information about the bone suppression image 2700 and information about the region of interest 2710 to the object device according to the sharing level of the object device (for example, the first level).
  • the object device may display the bone suppression image 2700 including the region of interest 2710 on a screen.
  • the X-ray imaging apparatus may additionally generate information about annotation for describing the region of interest 2710 (for example, text, font, and display location information). This will be described with reference to FIG. 28 .
  • the X-ray imaging apparatus may generate a bone suppression image 2800 by deleting bones from the X-ray chest image 2600 so that the object may clearly see the lesion.
  • the X-ray imaging apparatus selects a part suspected to be a nodule as a region of interest 2810 on the bone suppression image 2800 , and may generate an annotation 2820 about the region of interest 2810 .
  • the X-ray imaging apparatus may generate the annotation 2820 automatically or based on a user input.
  • the X-ray imaging apparatus may transmit information about the bone suppression image 2800 , information about the region of interest 2810 , and information about the annotation 2820 to the object device according to the sharing level of the object device (for example, the second level).
  • the object device may display the bone suppression image 2800 including the region of interest 2810 and the annotation 2820 on a screen.
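The two X-ray scenarios above differ only in what the payload contains: a first-level object device receives the bone suppression image with the region-of-interest mark, while a second-level device additionally receives the annotation (text, font, display location). The sketch below assembles such a level-dependent payload; all field names, coordinates, and text are illustrative assumptions.

```python
def build_xray_payload(sharing_level):
    """Build the information shared with the object device for the X-ray example."""
    payload = {
        "image": "bone_suppression_2700",
        # Region-of-interest mark (the circle shown in FIG. 27).
        "roi": {"shape": "circle", "center": (120, 88), "radius": 30},
    }
    if sharing_level >= 2:
        # Second level additionally includes the annotation describing the ROI
        # (text, font, and display location information, as in FIG. 28).
        payload["annotation"] = {
            "text": "Suspected nodule",
            "font": "sans-serif 12pt",
            "location": (150, 60),
        }
    return payload
```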

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
US14/320,971 2013-07-01 2014-07-01 Method of sharing information in ultrasound imaging Abandoned US20150005630A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/031,669 US20180317890A1 (en) 2013-07-01 2018-07-10 Method of sharing information in ultrasound imaging

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20130076594 2013-07-01
KR10-2013-0076594 2013-07-01
KR10-2014-0078390 2014-06-25
KR1020140078390A KR102207255B1 (ko) 2013-07-01 2014-06-25 Information sharing method of ultrasound apparatus, communication method of medical expert device, and information sharing system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/031,669 Continuation US20180317890A1 (en) 2013-07-01 2018-07-10 Method of sharing information in ultrasound imaging

Publications (1)

Publication Number Publication Date
US20150005630A1 true US20150005630A1 (en) 2015-01-01

Family

ID=51228271

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/320,971 Abandoned US20150005630A1 (en) 2013-07-01 2014-07-01 Method of sharing information in ultrasound imaging
US16/031,669 Pending US20180317890A1 (en) 2013-07-01 2018-07-10 Method of sharing information in ultrasound imaging

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/031,669 Pending US20180317890A1 (en) 2013-07-01 2018-07-10 Method of sharing information in ultrasound imaging

Country Status (3)

Country Link
US (2) US20150005630A1 (fr)
EP (3) EP3298966A1 (fr)
WO (1) WO2015002409A1 (fr)

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160299664A1 (en) * 2015-04-07 2016-10-13 Olympus America, Inc. Diagram based visual procedure note writing tool
US20170069354A1 (en) * 2015-09-08 2017-03-09 Canon Kabushiki Kaisha Method, system and apparatus for generating a position marker in video images
CN107135193A (zh) * 2016-02-26 2017-09-05 Lg电子株式会社 Wireless device
CN107239203A (zh) * 2016-03-29 2017-10-10 北京三星通信技术研究有限公司 Image management method and apparatus
CN107358015A (zh) * 2016-05-10 2017-11-17 三星麦迪森株式会社 Method of displaying an ultrasound image and ultrasound diagnostic apparatus
JP2017209339A (ja) * 2016-05-26 2017-11-30 東芝メディカルシステムズ株式会社 Medical diagnostic system and medical diagnostic apparatus
US9973928B2 (en) * 2014-04-01 2018-05-15 Sony Corporation Authentication with ultrasound
US20180137119A1 (en) * 2016-11-16 2018-05-17 Samsung Electronics Co., Ltd. Image management method and apparatus thereof
US20190138689A1 (en) * 2017-11-06 2019-05-09 International Business Machines Corporation Medical image manager with automated synthetic image generator
US20190200965A1 (en) * 2016-09-12 2019-07-04 Supersonic Imagine Ultrasound imaging method and an apparatus implementing said method
EP3364881A4 (fr) * 2015-10-20 2019-07-10 Samsung Medison Co., Ltd. Appareil d'imagerie par ultrasons et son procédé de commande
US10390786B2 (en) * 2015-01-05 2019-08-27 Canon Medical Systems Corporation X-ray diagnostic apparatus
US20190302997A1 (en) * 2018-03-27 2019-10-03 Konica Minolta, Inc. Medical image display apparatus and recording medium
CN110418311A (zh) * 2018-04-28 2019-11-05 华为技术有限公司 Interconnection method, apparatus, and terminal based on multiple terminals
US10561373B2 (en) 2017-01-31 2020-02-18 International Business Machines Corporation Topological evolution of tumor imagery
US20200069291A1 (en) * 2018-08-29 2020-03-05 Butterfly Network, Inc. Methods and apparatuses for collection of ultrasound data
WO2020086899A1 (fr) * 2018-10-25 2020-04-30 Butterfly Network, Inc. Methods and apparatus for collecting color doppler ultrasound data
US10646199B2 (en) 2015-10-19 2020-05-12 Clarius Mobile Health Corp. Systems and methods for remote graphical feedback of ultrasound scanning technique
CN111326263A (zh) * 2020-02-03 2020-06-23 深圳安泰创新科技股份有限公司 Annotation track display method, apparatus, device, and computer-readable storage medium
US20200375578A1 (en) * 2018-03-01 2020-12-03 Fujifilm Corporation Acoustic wave diagnostic apparatus and control method of acoustic wave diagnostic apparatus
USD904428S1 (en) * 2019-03-07 2020-12-08 Fujifilm Sonosite, Inc. Display screen or portion thereof with a graphical user interface
USD904427S1 (en) * 2019-03-07 2020-12-08 Fujifilm Sonosite, Inc. Display screen or portion thereof with a graphical user interface
CN112447276A (zh) * 2019-09-03 2021-03-05 通用电气精准医疗有限责任公司 Method and system for prompting data donation for artificial intelligence tool development
US10993703B2 (en) * 2016-09-23 2021-05-04 Konica Minolta, Inc. Ultrasound diagnosis apparatus and computer readable recording medium
US20210330296A1 (en) * 2020-04-27 2021-10-28 Butterfly Network, Inc. Methods and apparatuses for enhancing ultrasound data
US11189375B1 (en) * 2020-05-27 2021-11-30 GE Precision Healthcare LLC Methods and systems for a medical image annotation tool
US20220239887A1 (en) * 2021-01-22 2022-07-28 Valeo Comfort And Driving Assistance Shared viewing of video among multiple users
US11464484B2 (en) 2018-09-19 2022-10-11 Clarius Mobile Health Corp. Systems and methods of establishing a communication session for live review of ultrasound scanning
US20220338840A1 (en) * 2021-04-23 2022-10-27 Fujifilm Healthcare Corporation Ultrasound diagnostic system and ultrasound diagnostic apparatus
US20220361840A1 (en) * 2021-04-23 2022-11-17 Fujifilm Sonosite, Inc. Displaying blood vessels in ultrasound images
US20230016097A1 (en) * 2021-07-19 2023-01-19 GE Precision Healthcare LLC Imaging System and Method Employing Visual Skin Markers
US11564663B2 (en) * 2017-06-26 2023-01-31 Samsung Medison Co., Ltd. Ultrasound imaging apparatus and control method thereof
US20230038965A1 (en) * 2020-02-14 2023-02-09 Koninklijke Philips N.V. Model-based image segmentation
US11690602B2 (en) * 2018-02-27 2023-07-04 Bfly Operations, Inc. Methods and apparatus for tele-medicine
WO2023189510A1 (fr) * 2022-03-30 2023-10-05 テルモ株式会社 Image processing device, image processing system, image display method, and image processing program
US11890137B2 (en) * 2018-10-26 2024-02-06 Philips Image Guided Therapy Corporation Intraluminal ultrasound imaging with automatic and assisted labels and bookmarks
US11900593B2 (en) 2021-04-23 2024-02-13 Fujifilm Sonosite, Inc. Identifying blood vessels in ultrasound images
US11896425B2 (en) 2021-04-23 2024-02-13 Fujifilm Sonosite, Inc. Guiding instrument insertion
US11983822B2 (en) 2022-09-02 2024-05-14 Valeo Comfort And Driving Assistance Shared viewing of video with prevention of cyclical following among users

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160049385A (ko) * 2014-10-27 2016-05-09 삼성메디슨 주식회사 Ultrasound apparatus and information input method of the ultrasound apparatus
JP2019185674A (ja) * 2018-04-17 2019-10-24 大日本印刷株式会社 Image transmission method, image capture system, and computer program

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6210327B1 (en) * 1999-04-28 2001-04-03 General Electric Company Method and apparatus for sending ultrasound image data to remotely located device
US6475146B1 (en) * 2001-09-24 2002-11-05 Siemens Medical Solutions Usa, Inc. Method and system for using personal digital assistants with diagnostic medical ultrasound systems
US20020173721A1 (en) * 1999-08-20 2002-11-21 Novasonics, Inc. User interface for handheld imaging devices
US20050043620A1 (en) * 2003-08-20 2005-02-24 Siemens Medical Solutions Usa, Inc. Diagnostic medical ultrasound system communication network architecture and method
US20060155578A1 (en) * 2005-01-10 2006-07-13 George Eisenberger Privacy entitlement protocols for secure data exchange, collection, monitoring and/or alerting
US20080263048A1 (en) * 2007-04-16 2008-10-23 Kelley Wise File Access Management System
US20110022414A1 (en) * 2009-06-30 2011-01-27 Yaorong Ge Method and apparatus for personally controlled sharing of medical image and other health data
US20120157844A1 (en) * 2010-12-16 2012-06-21 General Electric Company System and method to illustrate ultrasound data at independent displays
US20140275851A1 (en) * 2013-03-15 2014-09-18 eagleyemed, Inc. Multi-site data sharing platform

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100319912B1 (ko) * 1995-06-30 2002-04-22 윤종용 Remote medical diagnosis system and remote medical image transmission and reception method
US5715823A (en) * 1996-02-27 1998-02-10 Atlantis Diagnostics International, L.L.C. Ultrasonic diagnostic imaging system with universal access to diagnostic information and images
US20040204649A1 (en) * 2003-04-11 2004-10-14 Sankaralingam Ramraj Method and system for migrating presets between medical diagnostic imaging system platforms
JP4435530B2 (ja) * 2003-10-08 2010-03-17 株式会社東芝 Medical image collection processing system and medical image collection processing method
US20080039722A1 (en) * 2006-08-11 2008-02-14 General Electric Company System and method for physiological signal exchange between an ep/hemo system and an ultrasound system
JP5737823B2 (ja) * 2007-09-03 2015-06-17 株式会社日立メディコ Ultrasound diagnostic apparatus
KR101562972B1 (ko) * 2009-03-26 2015-10-26 삼성전자 주식회사 Image sharing apparatus and method for providing images differentiated according to sharing level
JP2012231228A (ja) * 2011-04-25 2012-11-22 Yoshiaki Sasaki System for providing fetal images and newborn images

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6210327B1 (en) * 1999-04-28 2001-04-03 General Electric Company Method and apparatus for sending ultrasound image data to remotely located device
US20020173721A1 (en) * 1999-08-20 2002-11-21 Novasonics, Inc. User interface for handheld imaging devices
US6475146B1 (en) * 2001-09-24 2002-11-05 Siemens Medical Solutions Usa, Inc. Method and system for using personal digital assistants with diagnostic medical ultrasound systems
US20050043620A1 (en) * 2003-08-20 2005-02-24 Siemens Medical Solutions Usa, Inc. Diagnostic medical ultrasound system communication network architecture and method
US20060155578A1 (en) * 2005-01-10 2006-07-13 George Eisenberger Privacy entitlement protocols for secure data exchange, collection, monitoring and/or alerting
US20080263048A1 (en) * 2007-04-16 2008-10-23 Kelley Wise File Access Management System
US20110022414A1 (en) * 2009-06-30 2011-01-27 Yaorong Ge Method and apparatus for personally controlled sharing of medical image and other health data
US20120157844A1 (en) * 2010-12-16 2012-06-21 General Electric Company System and method to illustrate ultrasound data at independent displays
US20140275851A1 (en) * 2013-03-15 2014-09-18 eagleyemed, Inc. Multi-site data sharing platform

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Koutelakis et al., "Application of Multiprotocol Medical Imaging Communications and an Extended DICOM WADO Service in a Teleradiology Architecture," International Journal of Telemedicine and Applications, Volume 2012, June 13, 2011. *

Cited By (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9973928B2 (en) * 2014-04-01 2018-05-15 Sony Corporation Authentication with ultrasound
US10390786B2 (en) * 2015-01-05 2019-08-27 Canon Medical Systems Corporation X-ray diagnostic apparatus
US11354007B2 (en) * 2015-04-07 2022-06-07 Olympus America, Inc. Diagram based visual procedure note writing tool
US20160299664A1 (en) * 2015-04-07 2016-10-13 Olympus America, Inc. Diagram based visual procedure note writing tool
US20170069354A1 (en) * 2015-09-08 2017-03-09 Canon Kabushiki Kaisha Method, system and apparatus for generating a position marker in video images
US10646199B2 (en) 2015-10-19 2020-05-12 Clarius Mobile Health Corp. Systems and methods for remote graphical feedback of ultrasound scanning technique
US11801035B2 (en) 2015-10-19 2023-10-31 Clarius Mobile Health Corp. Systems and methods for remote graphical feedback of ultrasound scanning technique
EP3364881A4 (fr) * 2015-10-20 2019-07-10 Samsung Medison Co., Ltd. Appareil d'imagerie par ultrasons et son procédé de commande
US11219429B2 (en) 2015-10-20 2022-01-11 Samsung Medison Co., Ltd. Ultrasound imaging apparatus and controlling method for the same
US10431183B2 (en) * 2016-02-26 2019-10-01 Lg Electronics Inc. Wireless device displaying images and matching resolution or aspect ratio for screen sharing during Wi-Fi direct service
CN107135193A (zh) * 2016-02-26 2017-09-05 Lg电子株式会社 Wireless device
CN107239203A (zh) * 2016-03-29 2017-10-10 北京三星通信技术研究有限公司 Image management method and apparatus
CN107358015A (zh) * 2016-05-10 2017-11-17 三星麦迪森株式会社 Method of displaying an ultrasound image and ultrasound diagnostic apparatus
JP2017209339A (ja) * 2016-05-26 2017-11-30 東芝メディカルシステムズ株式会社 Medical diagnostic system and medical diagnostic apparatus
US20170347056A1 (en) * 2016-05-26 2017-11-30 Toshiba Medical Systems Corporation Medical diagnostic apparatus and medical diagnostic system
US10609322B2 (en) * 2016-05-26 2020-03-31 Canon Medical Systems Corporation Medical diagnostic apparatus and medical diagnostic system
US20190200965A1 (en) * 2016-09-12 2019-07-04 Supersonic Imagine Ultrasound imaging method and an apparatus implementing said method
US10993703B2 (en) * 2016-09-23 2021-05-04 Konica Minolta, Inc. Ultrasound diagnosis apparatus and computer readable recording medium
US20180137119A1 (en) * 2016-11-16 2018-05-17 Samsung Electronics Co., Ltd. Image management method and apparatus thereof
US10561373B2 (en) 2017-01-31 2020-02-18 International Business Machines Corporation Topological evolution of tumor imagery
US11172889B2 (en) 2017-01-31 2021-11-16 International Business Machines Corporation Topological evolution of tumor imagery
US10653363B2 (en) 2017-01-31 2020-05-19 International Business Machines Corporation Topological evolution of tumor imagery
US11564663B2 (en) * 2017-06-26 2023-01-31 Samsung Medison Co., Ltd. Ultrasound imaging apparatus and control method thereof
US10719580B2 (en) * 2017-11-06 2020-07-21 International Business Machines Corporation Medical image manager with automated synthetic image generator
US10733265B2 (en) 2017-11-06 2020-08-04 International Business Machines Corporation Medical image manager with automated synthetic image generator
US20190138689A1 (en) * 2017-11-06 2019-05-09 International Business Machines Corporation Medical image manager with automated synthetic image generator
US11690602B2 (en) * 2018-02-27 2023-07-04 Bfly Operations, Inc. Methods and apparatus for tele-medicine
US20200375578A1 (en) * 2018-03-01 2020-12-03 Fujifilm Corporation Acoustic wave diagnostic apparatus and control method of acoustic wave diagnostic apparatus
US11766246B2 (en) * 2018-03-01 2023-09-26 Fujifilm Corporation Acoustic wave diagnostic apparatus and control method of acoustic wave diagnostic apparatus
US10852939B2 (en) * 2018-03-27 2020-12-01 Konica Minolta, Inc. Medical image display apparatus and recording medium
JP2019170422A (ja) * 2018-03-27 2019-10-10 コニカミノルタ株式会社 Medical image display device and program
US20190302997A1 (en) * 2018-03-27 2019-10-03 Konica Minolta, Inc. Medical image display apparatus and recording medium
JP7013994B2 (ja) * 2018-03-27 2022-02-01 コニカミノルタ株式会社 Medical image display device and program
CN110418311A (zh) * 2018-04-28 2019-11-05 华为技术有限公司 Interconnection method, apparatus, and terminal based on multiple terminals
US20200069291A1 (en) * 2018-08-29 2020-03-05 Butterfly Network, Inc. Methods and apparatuses for collection of ultrasound data
US11464484B2 (en) 2018-09-19 2022-10-11 Clarius Mobile Health Corp. Systems and methods of establishing a communication session for live review of ultrasound scanning
WO2020086899A1 (fr) * 2018-10-25 2020-04-30 Butterfly Network, Inc. Methods and apparatus for collecting color doppler ultrasound data
US11890137B2 (en) * 2018-10-26 2024-02-06 Philips Image Guided Therapy Corporation Intraluminal ultrasound imaging with automatic and assisted labels and bookmarks
USD904428S1 (en) * 2019-03-07 2020-12-08 Fujifilm Sonosite, Inc. Display screen or portion thereof with a graphical user interface
USD904427S1 (en) * 2019-03-07 2020-12-08 Fujifilm Sonosite, Inc. Display screen or portion thereof with a graphical user interface
CN112447276A (zh) * 2019-09-03 2021-03-05 通用电气精准医疗有限责任公司 Method and system for prompting data donation for artificial intelligence tool development
CN111326263A (zh) * 2020-02-03 2020-06-23 深圳安泰创新科技股份有限公司 Annotation track display method, apparatus, device, and computer-readable storage medium
US20230038965A1 (en) * 2020-02-14 2023-02-09 Koninklijke Philips N.V. Model-based image segmentation
US20210330296A1 (en) * 2020-04-27 2021-10-28 Butterfly Network, Inc. Methods and apparatuses for enhancing ultrasound data
US11587668B2 (en) * 2020-05-27 2023-02-21 GE Precision Healthcare LLC Methods and systems for a medical image annotation tool
US11189375B1 (en) * 2020-05-27 2021-11-30 GE Precision Healthcare LLC Methods and systems for a medical image annotation tool
US20220037001A1 (en) * 2020-05-27 2022-02-03 GE Precision Healthcare LLC Methods and systems for a medical image annotation tool
US20210375435A1 (en) * 2020-05-27 2021-12-02 GE Precision Healthcare LLC Methods and systems for a medical image annotation tool
US20220239887A1 (en) * 2021-01-22 2022-07-28 Valeo Comfort And Driving Assistance Shared viewing of video among multiple users
US11924393B2 (en) * 2021-01-22 2024-03-05 Valeo Comfort And Driving Assistance Shared viewing of video among multiple users
US20220361840A1 (en) * 2021-04-23 2022-11-17 Fujifilm Sonosite, Inc. Displaying blood vessels in ultrasound images
US20220338840A1 (en) * 2021-04-23 2022-10-27 Fujifilm Healthcare Corporation Ultrasound diagnostic system and ultrasound diagnostic apparatus
US11900593B2 (en) 2021-04-23 2024-02-13 Fujifilm Sonosite, Inc. Identifying blood vessels in ultrasound images
US11896425B2 (en) 2021-04-23 2024-02-13 Fujifilm Sonosite, Inc. Guiding instrument insertion
US20230016097A1 (en) * 2021-07-19 2023-01-19 GE Precision Healthcare LLC Imaging System and Method Employing Visual Skin Markers
WO2023189510A1 (fr) * 2022-03-30 2023-10-05 テルモ株式会社 Image processing device, image processing system, image display method, and image processing program
US11983822B2 (en) 2022-09-02 2024-05-14 Valeo Comfort And Driving Assistance Shared viewing of video with prevention of cyclical following among users

Also Published As

Publication number Publication date
EP2821014B1 (fr) 2021-09-01
EP2821014A1 (fr) 2015-01-07
WO2015002409A1 (fr) 2015-01-08
US20180317890A1 (en) 2018-11-08
EP3984465A1 (fr) 2022-04-20
EP3298966A1 (fr) 2018-03-28

Similar Documents

Publication Publication Date Title
US20180317890A1 (en) Method of sharing information in ultrasound imaging
US10459627B2 (en) Medical image display apparatus and method of providing user interface
US20220008040A1 (en) Ultrasound apparatus and method of displaying ultrasound images
KR102207255B1 (ko) Information sharing method of ultrasound apparatus, communication method of medical expert device, and information sharing system
US10228785B2 (en) Ultrasound diagnosis apparatus and method and computer-readable storage medium
US9891784B2 (en) Apparatus and method of displaying medical image
US10768797B2 (en) Method, apparatus, and system for generating body marker indicating object
US20150160821A1 (en) Method of arranging medical images and medical apparatus using the same
US20190357881A1 (en) Ultrasonic diagnosis device and method of diagnosing by using the same
KR102273831B1 (ko) Method of displaying a medical image and medical imaging apparatus therefor
US20150148676A1 (en) Method and ultrasound apparatus for marking tumor on ultrasound elastography image
CN106551707B (zh) 显示超声图像的设备和方法
US10922875B2 (en) Ultrasound system and method of displaying three-dimensional (3D) image
EP2842495A1 (fr) Procédé de génération de marqueur de corps et appareil de diagnostic ultrasonore l'utilisant
US20150160844A1 (en) Method and apparatus for displaying medical images
US10163228B2 (en) Medical imaging apparatus and method of operating same
US20150248573A1 (en) Method and apparatus for processing medical images and computer-readable recording medium
KR102418975B1 (ko) Ultrasound image providing apparatus and ultrasound image providing method
EP3342347A1 (fr) Procédé et appareil d'affichage d'images médicales
KR101643166B1 (ko) Ultrasound apparatus and control method thereof
KR101643322B1 (ko) Medical image arrangement method and medical device therefor

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JUNG, JONGWOO;YANG, EUN-HO;REEL/FRAME:033221/0262

Effective date: 20140630

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION