EP3034005A1 - Method, apparatus and system for generating body marker indicating object - Google Patents
- Publication number: EP3034005A1 (application EP15163220.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- body marker
- marker
- controller
- image
- user input
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06F3/04845 — GUI interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/0482 — GUI interaction with lists of selectable items, e.g. menus
- G06F3/04842 — GUI selection of displayed objects or displayed text elements
- G06F3/04883 — GUI input of data by handwriting on a touch-screen or digitiser, e.g. gestures or text
- A61B8/00 — Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/14 — Echo-tomography
- A61B8/461 — Diagnostic devices with displaying means of special interest
- A61B8/463 — Displaying multiple images, or images and diagnostic data, on one display
- A61B8/465 — Displaying user selection data, e.g. icons or menus
- A61B8/467 — Diagnostic devices characterised by special input means
- A61B8/468 — Input means allowing annotation or message recording
- A61B8/0866 — Foetal diagnosis; pre-natal or peri-natal diagnosis of the baby
- A61B8/0891 — Diagnosis of blood vessels
- A61B8/4405 — Device mounted on a trolley
- A61B8/4427 — Device portable or laptop-like
- A61B8/4472 — Wireless probes
- A61B8/565 — Data transmission or power supply via a network
- G06T11/60 — Editing figures and text; combining figures or text
- G06T7/00 — Image analysis
- G06T2200/24 — Indexing scheme for image data processing involving graphical user interfaces [GUIs]
- G06T2210/41 — Indexing scheme for image generation or computer graphics: medical
- G16H30/20 — ICT for handling medical images, e.g. DICOM, HL7 or PACS
- G16H40/63 — ICT for the operation of medical equipment or devices: local operation
- Y10S128/916 — Ultrasound 3-D imaging
Definitions
- One or more exemplary embodiments relate to a method, an apparatus, and a system for generating a body marker indicating an object.
- Ultrasound diagnosis apparatuses transmit ultrasound signals generated by transducers of a probe to an object and receive echo signals reflected from the object, thereby obtaining at least one image of an internal part of the object (e.g., soft tissues or blood flow).
- Ultrasound diagnosis apparatuses are used for medical purposes including observing the interior of an object, detecting foreign substances, and diagnosing damage to the object.
- Such ultrasound diagnosis apparatuses are highly stable, display images in real time, and, unlike X-ray apparatuses, involve no exposure to radiation. Therefore, ultrasound diagnosis apparatuses are widely used together with other image diagnosis apparatuses, including computed tomography (CT) apparatuses, magnetic resonance imaging (MRI) apparatuses, and the like.
- Conventionally, a body marker is added to an ultrasound image when a user selects one of a set of pre-generated body markers.
- However, it may be difficult to accurately determine the shape or orientation of an object shown in the ultrasound image from such a body marker.
- Furthermore, selecting one of the pre-generated body markers may require a large amount of time.
- One or more exemplary embodiments include a method, an apparatus, and a system for generating a body marker indicating an object, as well as a non-transitory computer-readable recording medium having recorded thereon a program which, when executed by a computer, performs the method.
- A method of generating a body marker includes selecting a first body marker from among a plurality of prestored body markers based on an object shown in a medical image, generating a second body marker by modifying the first body marker according to a user input, and displaying the second body marker.
- The second body marker may include a body marker generated by flipping the first body marker about a vertical axis.
- The second body marker may include a body marker generated by rotating the first body marker about a central axis of the first body marker.
- The second body marker may include a body marker generated by rotating the first body marker in a clockwise direction.
- The second body marker may include a body marker generated by rotating the first body marker in a counterclockwise direction.
- The selecting may include receiving a user input for selecting the first body marker that corresponds to the object from among the plurality of prestored body markers, and selecting the first body marker based on the received user input.
- The selecting may include selecting a portion of the object shown in the medical image, and selecting the first body marker based on the selected portion of the object.
- The plurality of prestored body markers may be sorted into application groups.
- The first body marker and the second body marker may be 2-dimensional or 3-dimensional.
- The user input may include a user gesture input on a touch screen.
- The displaying may include displaying the first and second body markers on a single screen.
- Also provided is a non-transitory computer-readable recording medium having recorded thereon a program which, when executed by a computer, performs the method above.
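The modifications listed above can be illustrated with a minimal sketch. This is not the patent's implementation: the body marker is modeled here as a 2-D list of pixel rows, and the gesture labels are hypothetical names chosen for the example.

```python
def generate_second_marker(first_marker, user_input):
    """Derive a second body marker from a prestored first marker.

    The marker is modeled as a 2-D list of pixel rows; the gesture
    labels below are hypothetical, not taken from the patent text.
    """
    if user_input == "flip_vertical":        # mirror about a vertical axis
        return [row[::-1] for row in first_marker]
    if user_input == "flip_horizontal":      # mirror about a horizontal axis
        return first_marker[::-1]
    if user_input == "rotate_cw":            # rotate 90 degrees clockwise
        return [list(col) for col in zip(*first_marker[::-1])]
    if user_input == "rotate_ccw":           # rotate 90 degrees counterclockwise
        return [list(col) for col in zip(*first_marker)][::-1]
    raise ValueError("unknown modification: %s" % user_input)
```

Because the first marker is left unchanged, both the first and the derived second marker remain available for display on a single screen, as the summary above describes.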
- An apparatus for generating a body marker includes a display that displays a medical image showing an object, and a controller that selects a first body marker from among a plurality of prestored body markers based on the object and generates a second body marker by modifying a shape of the first body marker according to a user input.
- The display displays the second body marker.
- The second body marker may include a body marker generated by flipping the first body marker about a vertical axis.
- The second body marker may include a body marker generated by flipping the first body marker about a horizontal axis.
- The second body marker may include a body marker generated by rotating the first body marker in a clockwise direction.
- The second body marker may include a body marker generated by rotating the first body marker in a counterclockwise direction.
- The apparatus may further include an input unit that receives a user input for selecting the first body marker that corresponds to the object from among the plurality of prestored body markers, and the controller may select the first body marker based on the received user input.
- The apparatus may further include an image processor that selects a portion of the object from the medical image, and the controller may select the first body marker based on the selected portion of the object.
- The plurality of prestored body markers may be sorted into application groups.
- The first body marker and the second body marker may be 2-dimensional or 3-dimensional.
- The user input may include a user gesture input on a touch screen.
- The display may display the first and second body markers on a single screen.
- An "ultrasound image" refers to an image of an object, or an image of a region of interest included in the object, that is obtained using ultrasound waves.
- The region of interest is a region of the object on which a user wants to focus, for example, a lesion.
- An "object" may be a human, an animal, or a part of a human or animal.
- For example, the object may be an organ (e.g., the liver, heart, womb, brain, breast, or abdomen), a blood vessel, or a combination thereof.
- Alternatively, the object may be a phantom.
- A phantom is a material having a density, an effective atomic number, and a volume that are approximately the same as those of an organism.
- For example, the phantom may be a spherical phantom having properties similar to those of the human body.
- A "user" may be, but is not limited to, a medical expert, for example, a medical doctor, a nurse, a medical laboratory technologist, or a medical imaging expert, or a technician who repairs medical apparatuses.
- FIGS. 1A and 1B are diagrams illustrating an ultrasound diagnosis system 1000 according to an exemplary embodiment.
- A probe 20 may be wired to an ultrasound imaging device 100 in the ultrasound diagnosis system 1000.
- The probe 20, which transmits and receives ultrasound waves, may be connected to a main body of the ultrasound diagnosis system 1000, i.e., the ultrasound imaging device 100, via a cable 110.
- Alternatively, the probe 20 may be wirelessly connected to the ultrasound imaging device 100 in an ultrasound diagnosis system 1001.
- In this case, the probe 20 and the ultrasound imaging device 100 may be connected via a wireless network.
- For example, the probe 20 may be connected to the ultrasound imaging device 100 via a millimeter wave (mmWave) wireless network, receive an echo signal via a transducer, and transmit the echo signal to the ultrasound imaging device 100 in a 60 GHz frequency band.
- The ultrasound imaging device 100 may generate ultrasound images of various modes by using the echo signal received in the 60 GHz frequency band, and display the generated ultrasound images.
- The millimeter wave wireless network may use, but is not limited to, a wireless communication method according to the Wireless Gigabit Alliance (WiGig) standard.
- FIG. 2 is a block diagram illustrating an ultrasound diagnosis system 1002 according to an exemplary embodiment.
- The ultrasound diagnosis system 1002 may include a probe 20 and an ultrasound imaging device 100.
- The ultrasound imaging device 100 may include an ultrasound transceiver 1100, an image processor 1200, a communication module 1300, a display 1400, a memory 1500, an input unit 1600, and a controller 1700, which may be connected to one another via buses 1800.
- The ultrasound imaging device 100 may be a cart-type apparatus or a portable apparatus.
- Examples of portable ultrasound diagnosis apparatuses may include, but are not limited to, a picture archiving and communication system (PACS) viewer, a smartphone, a laptop computer, a personal digital assistant (PDA), and a tablet PC.
- The probe 20 may transmit ultrasound waves to an object 10 (or a region of interest in the object 10) in response to a driving signal applied by the ultrasound transceiver 1100, and receive echo signals reflected by the object 10 (or the region of interest in the object 10).
- The probe 20 includes a plurality of transducers, which oscillate in response to electric signals and generate acoustic energy, that is, ultrasound waves.
- The probe 20 may be connected to a main body of the ultrasound diagnosis system 1002 by wire or wirelessly, and the ultrasound diagnosis system 1002 may include a plurality of probes 20.
- A transmitter 1110 supplies a driving signal to the probe 20.
- The transmitter 1110 includes a pulse generator 1112, a transmission delaying unit 1114, and a pulser 1116.
- The pulse generator 1112 generates pulses for forming transmission ultrasound waves based on a predetermined pulse repetition frequency (PRF), and the transmission delaying unit 1114 delays the pulses by the delay times necessary for determining transmission directionality.
- The delayed pulses correspond respectively to a plurality of piezoelectric vibrators included in the probe 20.
- The pulser 1116 applies a driving signal (or a driving pulse) to the probe 20 at a timing corresponding to each delayed pulse.
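The role of the transmission delaying unit 1114 can be illustrated with a simplified focusing calculation: elements farther from the focal point fire first so that all wavefronts arrive at the focus simultaneously. This sketch assumes a linear array, a single focal point, and a nominal soft-tissue sound speed; it is not the patent's implementation.

```python
import math

SPEED_OF_SOUND = 1540.0  # m/s, a typical soft-tissue value (assumption)

def transmit_delays(element_positions, focus_x, focus_z):
    """Per-element firing delays (seconds) that focus the transmit beam.

    element_positions: lateral positions (m) of the array elements.
    (focus_x, focus_z): focal point; focus_z is the depth in meters.
    The farthest element gets zero delay; nearer elements fire later.
    """
    dists = [math.hypot(x - focus_x, focus_z) for x in element_positions]
    farthest = max(dists)
    return [(farthest - d) / SPEED_OF_SOUND for d in dists]
```

For an aperture focused on-axis, the edge elements (farthest from the focus) get zero delay and the center element fires last, which is the usual convex delay profile for transmit focusing.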
- A receiver 1120 generates ultrasound data by processing echo signals received from the probe 20.
- The receiver 1120 may include an amplifier 1122, an analog-to-digital converter (ADC) 1124, a reception delaying unit 1126, and a summing unit 1128.
- The amplifier 1122 amplifies echo signals in each channel, and the ADC 1124 performs analog-to-digital conversion on the amplified echo signals.
- The reception delaying unit 1126 delays the digital echo signals output by the ADC 1124 by the delay times necessary for determining reception directionality, and the summing unit 1128 generates ultrasound data by summing the echo signals processed by the reception delaying unit 1126.
- In some embodiments, the receiver 1120 may not include the amplifier 1122; that is, if the sensitivity of the probe 20 or the bit-processing capability of the ADC 1124 is enhanced, the amplifier 1122 may be omitted.
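The amplify/convert/delay/sum chain described above amounts to delay-and-sum receive beamforming. The toy sketch below operates on already-digitized samples with integer sample delays; the names and values are illustrative, not taken from the patent.

```python
def delay_and_sum(channel_samples, delays, gain=1.0):
    """Toy receive beamformer mirroring the amplifier/delay/sum chain.

    channel_samples: one list of digitized echo samples per channel.
    delays: per-channel delays in whole samples, standing in for the
    reception delaying unit's output. All values are illustrative.
    """
    n = len(channel_samples[0])
    out = [0.0] * n
    for samples, d in zip(channel_samples, delays):
        for i in range(n):
            j = i - d                        # shift this channel by its delay
            if 0 <= j < n:
                out[i] += gain * samples[j]  # amplify and sum across channels
    return out
```

With the correct delays, echoes from the steered direction add coherently (the aligned impulse below sums across channels), while off-axis signals tend to cancel.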
- The image processor 1200 generates an ultrasound image by scan-converting the ultrasound data generated by the ultrasound transceiver 1100.
- The ultrasound image may be not only a grayscale image obtained by scanning the object 10 in an amplitude (A) mode, a brightness (B) mode, or a motion (M) mode, but also a Doppler image showing a movement of the object 10 via the Doppler effect.
- The Doppler image may be a blood flow Doppler image showing the flow of blood (also referred to as a color Doppler image), a tissue Doppler image showing a movement of tissue, or a spectral Doppler image showing a moving speed of the object 10 as a waveform.
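All three Doppler modes rest on the standard Doppler equation, v = c·f_d / (2·f_0·cos θ), relating the measured frequency shift f_d to the axial flow speed v for transmit frequency f_0 and beam-to-flow angle θ. A one-line sketch (the numeric values in the test are illustrative, not from the patent):

```python
import math

def doppler_velocity(f_shift_hz, f0_hz, angle_deg, c=1540.0):
    """Flow speed (m/s) from a measured Doppler frequency shift.

    Standard Doppler equation v = c * fd / (2 * f0 * cos(theta));
    c defaults to a nominal soft-tissue sound speed (assumption).
    """
    return c * f_shift_hz / (2.0 * f0_hz * math.cos(math.radians(angle_deg)))
```

For example, a 1 kHz shift at a 5 MHz transmit frequency and a 60-degree insonation angle corresponds to roughly 0.31 m/s, a plausible blood flow speed.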
- A B mode processor 1212 extracts B mode components from the ultrasound data and processes them.
- An image generator 1220 may generate an ultrasound image indicating signal intensities as brightness, based on the B mode components extracted by the B mode processor 1212.
- Similarly, a Doppler processor 1214 may extract Doppler components from the ultrasound data, and the image generator 1220 may generate a Doppler image indicating a movement of the object 10 as colors or waveforms based on the extracted Doppler components.
- The image generator 1220 may generate a three-dimensional (3D) ultrasound image by volume-rendering volume data, and may also generate an elasticity image by imaging the deformation of the object 10 due to pressure. Furthermore, the image generator 1220 may display various pieces of additional information in an ultrasound image by using text and graphics. In addition, the generated ultrasound image may be stored in the memory 1500.
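One common step in turning B mode components into displayed brightness is log compression of the echo envelope. The sketch below assumes an 8-bit display and a 60 dB dynamic range; these are conventional choices for illustration, not values from the patent, and it stands in for only one stage of the image generator's pipeline.

```python
import math

def log_compress(envelope, dynamic_range_db=60.0):
    """Map echo envelope amplitudes to 8-bit B mode brightness values.

    Amplitudes are normalized to the frame maximum and log-compressed
    over an assumed 60 dB display dynamic range (illustrative values).
    """
    peak = max(envelope)
    out = []
    for a in envelope:
        if a <= 0:
            out.append(0)
            continue
        db = 20.0 * math.log10(a / peak)              # 0 dB at the peak
        level = (db + dynamic_range_db) / dynamic_range_db
        out.append(int(round(255 * max(0.0, min(1.0, level)))))
    return out
```

Echoes at the frame peak map to full brightness, and anything 60 dB or more below it maps to black, compressing the huge dynamic range of raw echoes into a viewable grayscale.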
- The image processor 1200 may select a portion of the object 10 shown in the ultrasound image.
- The display 1400 displays the generated ultrasound image.
- The display 1400 may display not only an ultrasound image, but also various pieces of information processed by the ultrasound imaging device 100, on a screen image via a graphical user interface (GUI).
- The ultrasound imaging device 100 may include two or more displays 1400 according to embodiments.
- The display 1400 may display at least one body marker from among the body markers stored in the memory 1500, and may display a second body marker generated according to a user input.
- The communication module 1300 is connected, by wire or wirelessly, to a network 30 to communicate with an external device or a server. Also, when the probe 20 is connected to the ultrasound imaging device 100 via a wireless network, the communication module 1300 may communicate with the probe 20.
- The communication module 1300 may exchange data with a hospital server or another medical apparatus in a hospital that is connected thereto via a PACS. Furthermore, the communication module 1300 may perform data communication according to the Digital Imaging and Communications in Medicine (DICOM) standard.
- The communication module 1300 may transmit or receive data related to diagnosis of the object 10, e.g., an ultrasound image, ultrasound data, and Doppler data of the object 10, via the network 30, and may also transmit or receive medical images captured by another medical apparatus, e.g., a computed tomography (CT) apparatus, a magnetic resonance imaging (MRI) apparatus, or an X-ray apparatus. Furthermore, the communication module 1300 may receive information about a diagnosis history or medical treatment schedule of a patient from a server and utilize the received information to diagnose the patient. The communication module 1300 may also perform data communication not only with a server or a medical apparatus in a hospital, but also with a portable terminal of a medical doctor or a patient.
- The communication module 1300 is connected to the network 30, by wire or wirelessly, to exchange data with a server 32, a medical apparatus 34, or a portable terminal 36.
- The communication module 1300 may include one or more components for communication with external devices.
- For example, the communication module 1300 may include a local area communication module 1310, a wired communication module 1320, and a mobile communication module 1330.
- The local area communication module 1310 refers to a module for local area communication within a predetermined distance.
- Examples of local area communication techniques according to an embodiment may include, but are not limited to, wireless LAN, Wi-Fi, Bluetooth, ZigBee, Wi-Fi Direct (WFD), ultra-wideband (UWB), infrared data association (IrDA), Bluetooth low energy (BLE), and near field communication (NFC).
- The wired communication module 1320 refers to a module for communication using electric signals or optical signals. Examples of wired communication techniques according to an embodiment include communication via a twisted-pair cable, a coaxial cable, an optical fiber cable, or an Ethernet cable.
- The mobile communication module 1330 transmits or receives wireless signals to or from at least one of a base station, an external terminal, and a server on a mobile communication network.
- The wireless signals may be voice call signals, video call signals, or various types of data for transmission and reception of text/multimedia messages.
- the memory 1500 stores various data processed by the ultrasound imaging device 1000.
- the memory 1500 may store medical data related to diagnosis of the object 10, such as ultrasound data and an ultrasound image that are input or output, and may also store algorithms or programs which are to be executed in the ultrasound imaging device 1002.
- the memory 1500 may store pre-generated body markers and a body marker generated by the controller 1700.
- the memory 1500 may be any of various storage media, e.g., a flash memory, a hard disk drive, EEPROM, etc. Furthermore, the ultrasound imaging device 1002 may utilize web storage or a cloud server that performs the storage function of the memory 1500 online.
- the input unit 1600 refers to a unit via which a user may input data for controlling the ultrasound imaging device 1002.
- the input unit 1600 may include hardware components, such as a keyboard, a mouse, a touch pad, a touch screen, and a jog switch, and a software module for driving the hardware components.
- the exemplary embodiments are not limited thereto, and the input unit 1600 may further include any of various input units including an electrocardiogram (ECG) measuring module, a respiration measuring module, a voice recognition sensor, a gesture recognition sensor, a fingerprint recognition sensor, an iris recognition sensor, a depth sensor, a distance sensor, etc.
- ECG electrocardiogram
- the input unit 1600 may receive a user input for selecting a first body marker from among the body markers stored in the memory 1500.
- the controller 1700 may control all operations of the ultrasound imaging device 1000. In other words, the controller 1700 may control operations among the probe 20, the ultrasound transceiver 1100, the image processor 1200, the communication module 1300, the display 1400, the memory 1500, and the input unit 1600 shown in FIG. 1 .
- the controller 1700 may select a first body marker from among prestored body markers based on the object 10.
- the prestored body markers are the body markers that are preset and stored in the memory 1500, regardless of a shape or a location of the object 10 shown on the ultrasound image.
- the controller 1700 may generate a second body marker by reshaping or modifying an orientation of the first body marker according to a user input.
- the user input is input via the input unit 1600, and includes a gesture performed by the user, for example, tapping, touch and hold, double tapping, dragging, panning, flicking, drag and drop, pinching, and stretching.
- a first body marker may be selected from the body markers stored in the memory 1500 according to a user input received by the input unit 1600. An example of the first body marker being selected according to the user input will be described in detail with reference to FIGS. 15 , 16A, and 16B . As another example, a first body marker may be selected from the body markers stored in the memory 1500 based on a portion of the object 10 selected from the ultrasound image by the image processor 1200. An example of the first body marker being selected based on the portion of the object 10 selected from the ultrasound image will be described in detail with reference to FIGS. 17 , 18A, and 18B .
- All or some of the probe 20, the ultrasound transceiver 1100, the image processor 1200, the communication module 1300, the display 1400, the memory 1500, the input unit 1600, and the controller 1700 may be implemented as software modules. Furthermore, at least one selected from the ultrasound transceiver 1100, the image processor 1200, and the communication module 1300 may be included in the controller 1700. However, the exemplary embodiments are not limited thereto.
- FIG. 3 is a block diagram illustrating a wireless probe 2000 according to an exemplary embodiment.
- the wireless probe 2000 may include a plurality of transducers, and, according to embodiments, may include some or all of the components of the ultrasound transceiver 1100 shown in FIG. 2 .
- the wireless probe 2000 includes a transmitter 2100, a transducer 2200, and a receiver 2300. Since descriptions thereof are given above with reference to FIG. 2 , detailed descriptions thereof will be omitted here.
- the wireless probe 2000 may selectively include a reception delaying unit 2330 and a summing unit 2340.
- the wireless probe 2000 may transmit ultrasound signals to the object 10, receive echo signals from the object 10, generate ultrasound data, and wirelessly transmit the ultrasound data to the ultrasound diagnosis system 1002 shown in FIG. 2 .
- FIG. 4 is a block diagram illustrating an apparatus 101 for generating a body marker, according to an exemplary embodiment.
- the apparatus 101 may include a controller 1701 and a display 1401.
- One or both of the controller 1701 and the display 1401 may be implemented as software modules, but are not limited thereto.
- One of the controller 1701 and the display 1401 may be implemented as hardware.
- the display 1401 may include an independent control module.
- the controller 1701 may be the same as the controller 1700 of FIG. 2
- the display 1401 may be the same as the display 1400 of FIG. 2
- the apparatus 101 may further include the ultrasound transceiver 1100, the image processor 1200, the communication module 1300, the memory 1500, and the input unit 1600 shown in FIG. 2 .
- the controller 1701 may select a first body marker from prestored body markers based on an object shown in a medical image.
- the body markers may be stored in a memory (not shown) of the apparatus 101.
- a body marker refers to a figure added to a medical image (e.g., an ultrasound image) so that a viewer of the medical image (e.g., a user) may easily recognize an object shown in the medical image.
- the first body marker refers to a body marker that is pre-generated and stored in the apparatus 101. That is, the first body marker refers to a body marker generated in advance by a manufacturer or a user, regardless of a current shape or location of the object shown in the medical image.
- Since the first body marker does not reflect the current shape or location of the object, when the first body marker is added to the medical image, the viewer of the medical image may be unable to accurately recognize the shape, position, or orientation of the object. Also, since the first body marker is selected from the prestored body markers, a large amount of time may be required for the user to select an appropriate body marker for the object. Hereinafter, the first body marker will be described in detail with reference to FIGS. 5A to 5C .
- FIGS. 5A to 5C are diagrams illustrating a first body marker according to an exemplary embodiment.
- a screen 3110 displays an example of prestored body markers 3120.
- the controller 1701 may select a first body marker from any one of the body markers 3120 based on an object shown in a medical image.
- the body markers 3120 may be sorted into application groups 3130.
- An application refers to a diagnosis type determined based on a body part or an organ of a human or an animal.
- the application groups 3130 may include an abdomen group, a small part group including the chest, mouth, sexual organs, and the like, a vascular group, a musculoskeletal group, an obstetrics group, a gynecology group, a cardiac group, a brain group, a urology group, and a veterinary group.
- the application groups 3130 are not limited thereto.
- Body parts and organs of a human or an animal may be sorted into a plurality of application groups based on a predetermined standard.
- the display 1401 may display the body markers 3120, which are sorted into the application groups 3130, on the screen 3110. For example, the user may select (e.g., click or tap) a 'Vascular' icon from icons representing the application groups 3130 displayed on the screen 3110. Then, from among the body markers 3120, the display 1401 may display body markers included in the vascular group on the screen 3110.
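The group-based selection described above can be sketched as a simple lookup from an application group to its prestored markers. The group and marker names below are illustrative placeholders, not taken from the patent:

```python
# Illustrative sketch of prestored body markers sorted into application
# groups, as on the screen 3110 described above. All names are placeholders.
PRESTORED_MARKERS = {
    "Abdomen": ["liver marker", "kidney marker"],
    "Vascular": ["carotid marker", "aorta marker"],
    "Obstetrics": ["fetus marker", "uterus marker"],
}

def markers_for_group(group_name):
    """Return the body markers displayed when the user selects a group icon."""
    return PRESTORED_MARKERS.get(group_name, [])
```

Selecting the 'Vascular' icon would then display only the markers returned by `markers_for_group("Vascular")`.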
- the body markers 3120 are generated and stored in advance. Accordingly, the user has to examine the body markers 3120 and select a first body marker that is the most appropriate for an object in a medical image. Therefore, it may require a large amount of time for the user to select the first body marker.
- the first body marker selected from the body markers 3120 may not include accurate information about the object. This will be described in detail with reference to FIG. 5B .
- FIG. 5B illustrates an example of a medical image 3220 displayed on a screen 3210 and a first body marker 3230 added to the medical image 3220.
- FIG. 5B is described assuming that the medical image 3220 is an ultrasound image showing a portion of blood vessels.
- the first body marker 3230 may be added to the medical image 3220.
- the first body marker 3230 is selected from pre-generated body markers. Therefore, by viewing only the first body marker 3230, the viewer may only be able to recognize that the medical image 3220 obtained is a captured image of blood vessels, without knowing which portion of the blood vessels is shown in the medical image 3220.
- the first body marker 3230 cannot be rotated or otherwise modified. Depending on the direction of the blood vessels 3250 shown in the first body marker 3230, the first body marker 3230 may be unable to accurately indicate a direction of the blood vessels 3240 shown in the medical image 3220. For example, referring to FIG. 5B , the blood vessels 3240 in the medical image 3220 are oriented with respect to an x-axis, whereas the blood vessels 3250 in the first body marker 3230 are oriented with respect to a y-axis. Therefore, the first body marker 3230 does not provide accurate information about the object.
- FIG. 5C illustrates an example of a medical image 3320 displayed on a screen 3310 and a first body marker 3330 added to the medical image 3320.
- FIG. 5C is described assuming that the medical image 3320 is an ultrasound image showing a fetus.
- a fetus 3340 of the ultrasound image 3320 is shown lying parallel to the x-axis and facing downward.
- a fetus 3350 of the first body marker 3330 is shown lying parallel to the x-axis and facing upward. Therefore, the first body marker 3330 does not provide accurate information about the fetus 3340 shown in the ultrasound image 3320.
- the first body marker which is a pre-generated body marker, may not accurately represent an object included in a medical image.
- the controller 1701 may generate a second body marker by modifying a shape, an orientation, or a position of the first body marker. Therefore, the controller 1701 may generate a body marker that accurately shows information about an object.
- the second body marker will be described in detail with reference to FIG. 6 .
- FIG. 6 is a diagram illustrating a second body marker 3430 according to an exemplary embodiment.
- FIG. 6 illustrates an example of a medical image 3420 displayed on a screen 3410 and a second body marker 3430 added to the medical image 3420.
- FIG. 6 is described assuming that the medical image 3420 is an ultrasound image showing a fetus.
- a fetus 3440 of the ultrasound image 3420 is shown lying parallel to the x-axis and facing downward.
- a fetus 3450 of the second body marker 3430 is also shown lying parallel to the x-axis and facing downward.
- the second body marker 3430 accurately shows an orientation and a facing direction of the fetus 3440 of the ultrasound image 3420.
- the first body marker 3330 does not accurately show a facing direction of the fetus 3340 of the ultrasound image 3320. Therefore, the viewer may be unable to obtain accurate information about the fetus 3340 shown in the ultrasound image 3320 based on only the first body marker 3330.
- the second body marker 3430 accurately shows the orientation and location of the fetus 3440 of the ultrasound image 3420, and thus, the viewer may obtain accurate information about the fetus 3440 shown in the ultrasound image 3420 based on only the second body marker 3430.
- the controller 1701 may generate a second body marker by changing a shape of a first body marker according to a user input.
- the controller 1701 may modify the shape, orientation, or position of the first body marker according to a user input received via an input unit (not shown) in the apparatus 101.
- the input unit in the apparatus 101 may be the same as the input unit 1600 described with reference to FIG. 2 .
- the user input may include clicking a predetermined point on a screen or inputting a drag gesture from a point on the screen to another point on the screen.
- the user input may include a user gesture input on the touch screen.
- the gesture may include, for example, tapping, touch and hold, double tapping, dragging, scrolling, flicking, drag and drop, pinching, and stretching.
- the controller 1701 may generate a second body marker by flipping the first body marker about a vertical axis according to the user input.
- the controller 1701 may generate a second body marker by rotating the first body marker about a central axis of the first body marker according to the user input.
- the controller 1701 may generate a second body marker by rotating the first body marker in a clockwise or counterclockwise direction according to the user input.
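Treating a 2-dimensional body marker as a set of outline points, the flip and rotation operations above can be sketched with elementary plane transforms. This is a hedged illustration of the idea, not the patent's implementation:

```python
import math

def flip_vertical_axis(points):
    """Flip marker outline points about the vertical axis (mirror left-right)."""
    return [(-x, y) for x, y in points]

def rotate_ccw(points, degrees):
    """Rotate marker outline points counterclockwise about the origin,
    taken here as the marker's central axis."""
    r = math.radians(degrees)
    c, s = math.cos(r), math.sin(r)
    return [(x * c - y * s, x * s + y * c) for x, y in points]
```

A clockwise rotation is simply `rotate_ccw(points, -degrees)`.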
- the display 1401 may display a medical image showing an object on a screen. Also, the display 1401 may display the second body marker generated by the controller 1701 on the screen. The display 1401 may display the first body marker and the second body marker on a single screen.
- the controller 1701 may generate the second body marker by changing the shape of the first body marker according to the user input.
- the display 1401 may display the first body marker and a predetermined guide image on the screen.
- the first body marker and the predetermined guide image displayed by the display 1401 will be described in detail with reference to FIGS. 7A and 7B .
- FIGS. 7A and 7B are diagrams illustrating images used for generating a second body marker, according to an exemplary embodiment.
- FIG. 7A illustrates an example of a first body marker 4120 and first to third guide images 4130, 4140, and 4150 displayed on a screen 4110.
- the user may provide a user input to the apparatus 101 based on the first body marker 4120 and the first to third guide images 4130, 4140, and 4150 displayed on the screen 4110.
- the user may select a point on the guide images 4130, 4140, and 4150 according to a predetermined rule, or input a drag gesture from a point on the guide images 4130, 4140, and 4150 to another point thereon.
- the controller 1701 may modify a shape, an orientation, or a position of the first body marker 4120 according to the user input.
- in FIGS. 7A to 14B , the user input is assumed to be a gesture performed by the user.
- the user input is not limited thereto.
- the user may provide the user input to the apparatus 101 by using various hardware components, such as a keyboard or a mouse.
- the controller 1701 may rotate a portion of the first body marker 4120 about the central axis thereof.
- the controller 1701 may rotate a portion of the first body marker 4120 toward the second guide image 4140.
- the controller 1701 may rotate a portion of the first body marker 4120 in a clockwise or counterclockwise direction.
- the first body marker 4120 may be displayed on a screen 4160.
- the first to third guide images 4130, 4140, and 4150 of FIG. 7A may be omitted from the screen 4160.
- the user may also provide the user input, as described above with reference to FIG. 7A , to the apparatus 101, and the controller 1701 may modify a shape, an orientation, or a position of the first body marker 4120 according to the user input.
- in FIGS. 8A to 14B , it is assumed that the guide images 4130, 4140, and 4150 are not displayed on a screen.
- FIGS. 8A and 8B are diagrams illustrating an example of the controller 1701 generating a second body marker, according to an exemplary embodiment.
- the controller 1701 may generate a second body marker 4230 by rotating a portion of a first body marker 4220 according to a user input.
- FIG. 8A illustrates an example of a screen 4210 displaying the first body marker 4220.
- the user may select (e.g., tap) an outer point of the first body marker 4220, and the controller 1701 may rotate the portion of the first body marker 4220 to the point selected by the user.
- FIG. 8B illustrates an example of the second body marker 4230 generated by rotating the portion of the first body marker 4220.
- the controller 1701 may rotate the portion of the first body marker 4220 such that the head of the fetus is located toward a point 4240 selected by the user. In this case, the controller 1701 may rotate the portion of the first body marker 4220 in a clockwise or counterclockwise direction.
- the second body marker 4230 may be the same as the first body marker 4220 flipped about a horizontal axis.
- the controller 1701 may generate the second body marker 4230 by rotating the portion of the first body marker 4220 toward the point 4240 selected by the user. Also, the controller 1701 may store the second body marker 4230 in the memory of the apparatus 101.
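The tap-to-rotate behavior can be sketched as computing the angle from the marker's center to the tapped point and rotating the marker's heading to match. The function names are illustrative assumptions, not the patent's terminology:

```python
import math

def angle_to_point(center, tap):
    """Angle in degrees (counterclockwise from the +x axis) from the
    marker center to the tapped point."""
    return math.degrees(math.atan2(tap[1] - center[1], tap[0] - center[0]))

def rotation_toward(current_heading_deg, center, tap):
    """Shortest signed rotation that points the marker's head (e.g., the
    fetus's head) toward the tapped point; positive is counterclockwise."""
    delta = angle_to_point(center, tap) - current_heading_deg
    return (delta + 180.0) % 360.0 - 180.0
```

A negative result corresponds to rotating the portion of the marker clockwise, a positive result counterclockwise, whichever path to the tapped point 4240 is shorter.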
- FIGS. 9A and 9B are diagrams illustrating another example of the controller 1701 generating a second body marker, according to an exemplary embodiment.
- the controller 1701 may generate a second body marker 4330 by rotating a portion of a first body marker 4320 in a clockwise or counterclockwise direction according to a user input.
- FIG. 9A illustrates an example of a screen 4310 displaying the first body marker 4320.
- the user may input a drag gesture from an outer point on the first body marker 4320 to another outer point thereon in a clockwise or counterclockwise direction, and the controller 1701 may rotate the portion of the first body marker 4320 in a clockwise or counterclockwise direction according to the drag gesture input by the user.
- FIG. 9B illustrates the second body marker 4330 generated by rotating the portion of the first body marker 4320 in a clockwise direction.
- the first body marker 4320 represents a fetus
- the user inputs a drag gesture in a clockwise direction from a point 4340 toward which the head of the fetus is directed to another point 4350 on the first body marker 4320.
- the controller 1701 may rotate the portion of the first body marker 4320 in a clockwise direction such that the head of the fetus is pointed toward the point 4350 where the drag gesture stops.
- the controller 1701 may generate the second body marker 4330 by rotating the portion of the first body marker 4320 in a clockwise direction toward the point 4350 selected by the user. Also, the controller 1701 may store the second body marker 4330 in the memory of the apparatus 101.
- the controller 1701 may generate the second body marker 4330 by rotating the portion of the first body marker 4320 in a clockwise direction.
- the controller 1701 may generate a second body marker by rotating a first body marker in a counterclockwise direction toward a point where the drag gesture stops.
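The clockwise/counterclockwise drag behavior can be sketched as the signed angular sweep of the drag around the marker center. This is an illustrative assumption about how the gesture might be interpreted:

```python
import math

def drag_sweep(center, start, end):
    """Signed angular sweep in degrees of a drag gesture around the marker
    center: positive = counterclockwise, negative = clockwise."""
    a0 = math.atan2(start[1] - center[1], start[0] - center[0])
    a1 = math.atan2(end[1] - center[1], end[0] - center[0])
    sweep = math.degrees(a1 - a0)
    return (sweep + 180.0) % 360.0 - 180.0
```

A negative sweep would rotate the fetus's head clockwise toward the point where the drag stops, as in FIG. 9B; a positive sweep would rotate it counterclockwise.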
- FIGS. 10A to 10D are diagrams illustrating another example of the controller 1701 generating a second body marker, according to an exemplary embodiment.
- the controller 1701 may generate second body markers 4440, 4450, and 4460 by rotating a portion of a first body marker 4420 about a central axis of the first body marker 4420 according to a user input.
- FIG. 10A illustrates an example of a screen 4410 displaying the first body marker 4420.
- the user may select (e.g., tap) a point 4430 through which the central axis of the first body marker 4420 passes, and the controller 1701 may rotate the portion of the first body marker 4420 about the central axis thereof according to the selected point.
- FIGS. 10B to 10D illustrate that the first body marker 4420 is rotated about the central axis thereof in a counterclockwise direction
- the exemplary embodiments are not limited thereto.
- the controller 1701 may rotate the portion of the first body marker 4420 about the central axis thereof in the counterclockwise direction based on a predetermined rule.
- FIG. 10B illustrates an example of the second body marker 4440 generated by rotating the portion of the first body marker 4420 by 90° about the central axis thereof.
- when the first body marker 4420 represents the fetus and the user taps the point 4430 through which the central axis of the first body marker 4420 passes, the controller 1701 may rotate the portion of the first body marker 4420 by 90° about the central axis thereof in a counterclockwise direction.
- FIG. 10C illustrates an example of the second body marker 4450 generated by rotating the portion of the second body marker 4440 of the FIG. 10B by 90° about a central axis of the second body marker 4440.
- the second body marker 4440 represents the fetus
- the controller 1701 may rotate the portion of the second body marker 4440 by 90° about the central axis thereof in a counterclockwise direction.
- FIG. 10D illustrates an example of the second body marker 4460 generated by rotating the portion of the second body marker 4450 of FIG. 10C by 90° about a central axis of the second body marker 4450.
- the controller 1701 may rotate the portion of the second body marker 4450 by 90° about the central axis thereof in a counterclockwise direction.
- the controller 1701 may generate the second body markers 4440, 4450, and 4460 by rotating the portion of the first body marker 4420 about the central axis thereof. Also, the controller 1701 may store the second body markers 4440, 4450, and 4460 in the memory of the apparatus 101.
- the controller 1701 may rotate the portion of the first body marker 4420 about the central axis thereof according to taps performed by the user.
- the taps may be continuous or discontinuous, but are not limited thereto.
- the controller 1701 may rotate the first body marker 4420 about the central axis thereof in a clockwise direction. For example, when the user inputs a drag gesture in a clockwise direction from a first point to a second point, in which the distance between the first point and the second point is 1 mm, the controller 1701 may rotate the first body marker 4420 by 20° in a clockwise direction.
- the exemplary embodiments are not limited thereto. The distance of the drag gesture and a rotation degree of the first body marker 4420 may vary.
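The distance-to-rotation mapping in the example above (1 mm of drag → 20° of rotation) can be sketched as a simple linear scale; as noted, the ratio itself may vary:

```python
def rotation_from_drag(distance_mm, degrees_per_mm=20.0):
    """Map the length of a clockwise drag gesture to a rotation angle,
    using the illustrative 20 degrees per 1 mm ratio from the example above."""
    return distance_mm * degrees_per_mm
```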
- FIGS. 11A and 11B are diagrams illustrating another example of the controller 1701 generating a second body marker, according to an exemplary embodiment.
- the controller 1701 may generate a second body marker 4540 by flipping a first body marker 4520 about the vertical axis according to a user input.
- FIG. 11A illustrates an example of a screen 4510 displaying the first body marker 4520.
- the controller 1701 may flip the first body marker 4520 about the vertical axis.
- FIG. 11B illustrates an example of the second body marker 4540 generated by flipping the first body marker 4520 about the vertical axis.
- the controller 1701 may flip the first body marker 4520 about the vertical axis to show a mirror image of the palm of the right hand.
- the controller 1701 may generate the second body marker 4540 by flipping the first body marker 4520 about the vertical axis. Also, the controller 1701 may store the second body marker 4540 in the memory of the apparatus 101.
- FIGS. 12A and 12B are diagrams illustrating another example of the controller 1701 generating a second body marker, according to an exemplary embodiment.
- the controller 1701 may generate a second body marker 4640 by flipping over a first body marker 4620 (i.e., rotating the first body marker 4620 by 180° about a central axis thereof in a clockwise or counterclockwise direction) according to a user input.
- FIG. 12A illustrates an example of a screen 4610 displaying the first body marker 4620.
- the controller 1701 may flip over the first body marker 4620.
- FIG. 12B illustrates an example of the second body marker 4640 generated by flipping over the first body marker 4620.
- the controller 1701 may flip over the first body marker 4620 to display the back of the right hand.
- the controller 1701 may generate the second body marker 4640 by flipping over the first body marker 4620. Also, the controller 1701 may store the second body marker 4640 in the memory of the apparatus 101.
- the controller 1701 may generate a second body marker that represents a single object by using a first body marker that represents a single object.
- a body marker may represent a plurality of objects. For example, when a medical image is obtained by capturing the uterus of a pregnant woman having twins, a body marker added to the medical image has to represent the twins (that is, a plurality of objects).
- the controller 1701 may generate a second body marker that represents a plurality of objects by using a first body marker that represents a single object.
- examples of the controller 1701 generating a second body marker that represents a plurality of objects will be described with reference to FIGS. 13A to 14B .
- FIGS. 13A and 13B are diagrams illustrating another example of the controller 1701 generating a second body marker, according to an exemplary embodiment.
- FIG. 13A illustrates an example of a screen 4710 displaying a first body marker 4720 that represents a single object. For convenience of description, it is assumed that the first body marker 4720 represents a fetus.
- the controller 1701 may change the number of objects represented by a body marker, according to a user input. For example, when the user selects (e.g., taps) an icon 4730 displayed at a predetermined location on the screen 4710, the display 1401 may display a pop-up window 4740 showing the number of objects that may be represented by a body marker. Accordingly, the user may set the number of objects to be represented by the body marker.
- when the user sets the number of objects to two, the controller 1701 may generate a second body marker 4750 that includes two fetuses, as shown in FIG. 13B . Also, the controller 1701 may store the second body marker 4750 in the memory of the apparatus 101.
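Generating a marker for several objects from a single-object marker can be sketched as placing offset copies of the single outline. This is a hedged illustration; the spacing value is an assumption:

```python
def multiply_marker(points, count, spacing=1.5):
    """Build a multi-object marker (e.g., twins) by placing `count`
    horizontally offset copies of a single-fetus outline."""
    return [[(x + i * spacing, y) for x, y in points] for i in range(count)]
```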
- the controller 1701 may modify the shapes, orientations, or positions of a plurality of objects included in a body marker. For example, when a medical image is obtained by capturing the uterus of a pregnant woman having twins, each fetus may be oriented in a different direction. Therefore, the controller 1701 may modify the shapes, orientations, or positions of the plurality of objects in the body marker to thus generate a body marker that accurately shows information about the objects.
- An example of the controller 1701 generating a second body marker by modifying the shapes, orientations, or positions of a plurality of objects included in a body marker will be described with reference to FIGS. 14A and 14B .
- FIGS. 14A and 14B are diagrams illustrating another example of the controller 1701 generating a second body marker, according to an exemplary embodiment.
- FIG. 14A illustrates an example of a screen 4810 displaying a first body marker 4820 that includes a plurality of objects.
- the first body marker 4820 includes twins.
- the controller 1701 may generate the first body marker 4820 by using a body marker that includes a single fetus.
- the controller 1701 may select an object from among objects included in a body marker. For example, when the user selects (e.g., taps) a fetus 4833 from fetuses 4831 and 4833 displayed on the screen 4810, the controller 1701 may determine the fetus 4833 as an object to be reshaped, reoriented, or repositioned. In this case, the display 1401 may display the fetuses 4831 and 4833 such that the fetus 4833 that is selected is distinguished from the fetus 4831 that is not selected (for example, change in thickness or line color). Thus, the user may easily recognize the selected fetus 4833.
- the controller 1701 may modify a shape, an orientation, or a position of the fetus 4833 according to a user input.
- the user input is the same as described above with reference to FIGS. 8A to 12B .
- the controller 1701 may rotate the fetus 4833 such that the head is directed toward the point 4850 where the drag gesture stops.
- the controller 1701 may generate a second body marker 4860 in which a location of the fetus 4833 of FIG. 14A is modified. Also, the controller 1701 may store the second body marker 4860 in the memory of the apparatus 101.
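Modifying only the selected fetus can be sketched as applying a rotation about that object's own centroid while leaving the other objects untouched. This is an illustrative sketch, not the patent's implementation:

```python
import math

def rotate_selected(marker_objects, selected_index, degrees):
    """Rotate only the selected object's outline about its centroid;
    the other objects in the marker are returned unchanged."""
    out = []
    for i, pts in enumerate(marker_objects):
        if i != selected_index:
            out.append(list(pts))
            continue
        # Centroid of the selected object's outline points.
        cx = sum(x for x, _ in pts) / len(pts)
        cy = sum(y for _, y in pts) / len(pts)
        r = math.radians(degrees)
        c, s = math.cos(r), math.sin(r)
        # Rotate each point counterclockwise about the centroid.
        out.append([(cx + (x - cx) * c - (y - cy) * s,
                     cy + (x - cx) * s + (y - cy) * c) for x, y in pts])
    return out
```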
- second body markers described above with reference to FIGS. 7A to 14B are 2-dimensional, the exemplary embodiments are not limited thereto. In other words, the controller 1701 may generate a 3-dimensional second body marker.
- FIG. 15 is a block diagram illustrating an apparatus 102 for generating a body marker, according to another exemplary embodiment.
- the apparatus 102 may include a controller 1702, a display 1402, and an input unit 1601. All or some of the controller 1702, the display 1402, and the input unit 1601 may be implemented as software modules, but are not limited thereto. Some of the controller 1702, the display 1402, and the input unit 1601 may be implemented as hardware. Also, each of the display 1402 and the input unit 1601 may include an independent control module.
- the controller 1702 may be the same as the controller 1701 of FIG. 4
- the display 1402 may be the same as the display 1401 of FIG. 4
- the input unit 1601 may be the same as the input unit 1600 of FIG. 2 . If the apparatus 102 is a component included in an ultrasound imaging device, then, in addition to the controller 1702, the display 1402, and the input unit 1601, the apparatus 102 may further include the ultrasound transceiver 1100, the image processor 1200, the communication module 1300, and the memory 1500 shown in FIG. 2 .
- the input unit 1601 may receive a user input for selecting a first body marker corresponding to an object in a medical image.
- the user input refers to an input selecting the first body marker from prestored body markers.
- the controller 1702 may select the first body marker based on the user input transmitted from the input unit 1601. Also, the controller 1702 may generate a second body marker by changing a shape of the first body marker according to a user input received after the user input for selecting the first body marker. Since examples of the controller 1702 generating the second body marker have been described above in detail with reference to FIGS. 8A to 14B , descriptions of the examples will not be repeated.
- FIGS. 16A and 16B are diagrams illustrating an example of an input unit receiving a user input for selecting a first body marker, according to an exemplary embodiment.
- a screen 5110 displays a plurality of body markers 5120 prestored in the apparatus 102.
- the body markers 5120 may be sorted into application groups 5130, as described above with reference to FIG. 5A .
- the user may select a body marker 5140 from among the body markers 5120 displayed on the screen 5110.
- If the input unit 1601 includes hardware components, such as a keyboard, a mouse, a trackball, and a jog switch, and a software module for driving the hardware components, the user may click the body marker 5140 from among the body markers 5120.
- If the input unit 1601 includes a touch screen and a software module for driving the touch screen, the user may tap the body marker 5140 from among the body markers 5120.
- the controller 1702 may select a first body marker based on the user input.
- In other words, the body marker 5140 selected by the user becomes the first body marker.
- the display 1402 may display a first body marker 5160 on a screen 5150.
- the controller 1702 may select a body marker determined by the user as the first body marker. However, the exemplary embodiments are not limited thereto. In other words, the controller 1702 may select the first body marker based on a shape of an object shown in a medical image.
- Hereinafter, an example of the controller selecting a first body marker based on a shape of an object will be described with reference to FIGS. 17 , 18A, and 18B .
- FIG. 17 is a block diagram illustrating an apparatus 103 for generating a body marker, according to another exemplary embodiment.
- the apparatus 103 may include a controller 1703, a display 1403, and an image processor 1201. All or some of the controller 1703, the display 1403, and the image processor 1201 may be implemented as software modules, but are not limited thereto. Some of the controller 1703, the display 1403, and the image processor 1201 may be implemented as hardware. Also, each of the display 1403 and the image processor 1201 may include an independent control module.
- the controller 1703 may be the same as the controller 1701 of FIG. 4 , the display 1403 may be the same as the display 1401 of FIG. 4 , and the image processor 1201 may be the same as the image processor 1200 of FIG. 2 .
- the apparatus 103 may further include the ultrasound transceiver 1100, the communication module 1300, the memory 1500, and the input unit 1600 shown in FIG. 2 .
- the image processor 1201 selects a portion of an object from a medical image. For example, the image processor 1201 may detect outlines of the object in the medical image, connect the detected outlines, and thus select the portion of the object. The image processor 1201 may select the portion of the object by using various methods, for example, a thresholding method, a K-means algorithm, a compression-based method, a histogram-based method, edge detection, a region-growing method, a partial differential equation-based method, and a graph partitioning method. Since these methods are well-known to one of ordinary skill in the art, detailed descriptions thereof will be omitted.
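- As an illustration of the thresholding method mentioned above, the following sketch selects a bright region from a toy grayscale image. The image values, the threshold, and the function names are hypothetical and are not taken from the exemplary embodiments:

```python
# Illustrative sketch only: thresholding-based selection of an object
# region from a grayscale image (pixel values 0-255, plain nested lists).

def threshold_mask(image, threshold):
    """Return a binary mask: True where a pixel belongs to the object."""
    return [[pixel >= threshold for pixel in row] for row in image]

def object_pixels(mask):
    """Collect (row, col) coordinates of the selected portion of the object."""
    return [(r, c)
            for r, row in enumerate(mask)
            for c, inside in enumerate(row) if inside]

# Tiny 4x4 "image": a bright 2x2 object on a dark background.
image = [
    [10, 12, 11, 9],
    [11, 200, 210, 10],
    [12, 205, 220, 11],
    [9, 10, 12, 10],
]
mask = threshold_mask(image, threshold=128)
print(object_pixels(mask))  # -> [(1, 1), (1, 2), (2, 1), (2, 2)]
```

A practical implementation would pick the threshold adaptively (e.g., from the image histogram) rather than hard-coding it as done here.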
- the controller 1703 may select a first body marker based on information about the shape of the object. For example, from among a plurality of body markers stored in the apparatus 103, the controller 1703 may select a first body marker having a shape that is the most similar to the object.
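- The most-similar selection described above could, for instance, be sketched by ranking stored markers with a Dice overlap score. The metric, the binary-mask representation, and the marker names below are assumptions for illustration; the exemplary embodiments do not specify a similarity measure:

```python
# Illustrative sketch only: pick the prestored body marker whose binary
# silhouette best overlaps the segmented object, using the Dice
# coefficient as a hypothetical similarity metric.

def dice(mask_a, mask_b):
    """Dice similarity between two equally sized binary masks (flat 0/1 lists)."""
    overlap = sum(a and b for a, b in zip(mask_a, mask_b))
    total = sum(mask_a) + sum(mask_b)
    return 2.0 * overlap / total if total else 0.0

def select_first_marker(object_mask, stored_markers):
    """Return the name of the stored marker most similar to the object."""
    return max(stored_markers,
               key=lambda name: dice(object_mask, stored_markers[name]))

# Hypothetical 3x3 marker silhouettes flattened to length-9 lists.
stored = {
    "arm": [1, 1, 1, 0, 0, 0, 0, 0, 0],
    "leg": [0, 0, 0, 1, 1, 1, 1, 1, 1],
}
segmented_object = [0, 0, 0, 1, 1, 0, 1, 1, 0]
print(select_first_marker(segmented_object, stored))  # -> leg
```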
- the controller 1703 may generate a second body marker by modifying the shape, orientation, or position of the first body marker according to a user input received after the first body marker is selected. Since examples of the controller 1703 generating the second body marker have been described above in detail with reference to FIGS. 8A to 14B , the examples will not be repeatedly described.
- FIGS. 18A and 18B are diagrams illustrating an example of an image processor selecting a portion of an object from a medical image, according to an exemplary embodiment.
- a screen 5210 displays a medical image 5220 showing an object 5230.
- the image processor 1201 may select a portion of the object 5230 from the medical image 5220.
- the image processor 1201 may use any one of the methods described with reference to FIG. 17 to select the portion of the object 5230 from the medical image 5220.
- the controller 1703 may select a first body marker based on the shape of the object 5230. For example, from among the plurality of body markers stored in the apparatus 103, the controller 1703 may select a body marker 5240 having a shape that is the most similar to the object 5230 as the first body marker. Also, as shown in FIG. 16B , the display 1402 may display the first body marker 5240 on a screen 5250.
- FIG. 19 is a flowchart illustrating a method of generating a body marker, according to an exemplary embodiment.
- the method of generating the body marker includes operations sequentially performed by the ultrasound diagnosis systems 1000, 1001, and 1002 respectively shown in FIGS. 1A, 1B and 2 or the apparatuses 100, 101, 102, and 103 respectively shown in FIGS. 4 , 15 , and 17 . Therefore, all of the above-described features and elements of the ultrasound diagnosis systems 1000, 1001, and 1002 respectively shown in FIGS. 1A, 1B and 2 and the apparatuses 100, 101, 102, and 103 respectively shown in FIGS. 4 , 15 , and 17 apply to the method of FIG. 19 .
- a controller selects a first body marker from among a plurality of prestored body markers based on an object shown in a medical image.
- the plurality of body markers may be stored in a memory of an apparatus for generating a body marker.
- the controller generates a second body marker by changing a shape of the first body marker according to a user input.
- the controller may generate the second body marker by flipping the first body marker about a vertical axis according to the user input.
- the controller may generate the second body marker by rotating the first body marker about a central axis thereof according to the user input.
- the controller may generate the second body marker by rotating the first body marker in a clockwise or counterclockwise direction according to the user input.
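- The flipping and rotating operations above can be sketched for a 2-dimensional marker stored as a small pixel grid. The grid representation and function names are assumptions for illustration; an actual implementation would transform the marker image itself:

```python
# Illustrative sketch only: generating a "second body marker" from a
# first one by flipping or rotating a 2-D pixel-grid representation.

def flip_vertical_axis(marker):
    """Mirror the marker left-to-right (flip about a vertical axis)."""
    return [list(reversed(row)) for row in marker]

def rotate_clockwise(marker):
    """Rotate the marker 90 degrees clockwise."""
    return [list(col) for col in zip(*marker[::-1])]

def rotate_counterclockwise(marker):
    """Rotate the marker 90 degrees counterclockwise."""
    return [list(col) for col in zip(*marker)][::-1]

first_marker = [
    [1, 0],
    [1, 1],
]
print(flip_vertical_axis(first_marker))  # -> [[0, 1], [1, 1]]
print(rotate_clockwise(first_marker))    # -> [[1, 1], [1, 0]]
```

Rotation about a central axis in 3-D (as for a 3-dimensional marker) would generalize these 2-D transforms to a rotation matrix applied to the marker's voxels.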
- a display displays the second body marker.
- the display may display the first body marker and the second body marker on a single screen.
- a body marker that corresponds to a current location or direction of an object shown in a medical image may be generated. Also, the user may spend less time selecting the body marker that corresponds to the current location or direction of the object from among prestored body markers.
- exemplary embodiments can also be implemented through computer-readable code/instructions in/on a medium, e.g., a computer-readable medium, to control at least one processing element to implement any above described exemplary embodiment.
- the medium can correspond to any medium/media permitting the storage and/or transmission of the computer-readable code.
- the computer-readable code can be recorded/transferred on a medium in a variety of ways, with examples of the medium including recording media, such as magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical recording media (e.g., CD-ROMs, or DVDs), and transmission media such as Internet transmission media.
- the medium may be such a defined and measurable structure including or carrying a signal or information, such as a device carrying a bitstream according to one or more exemplary embodiments.
- the media may also be a distributed network, so that the computer-readable code is stored/transferred and executed in a distributed fashion.
- the processing element could include a processor or a computer processor, and processing elements may be distributed and/or included in a single device.
Description
- This application claims the benefit of Korean Patent Application No. 10-2014-0180499, filed on December 15, 2014 .
- One or more exemplary embodiments relate to a method, an apparatus, and a system for generating a body marker indicating an object.
- Ultrasound diagnosis apparatuses transmit ultrasound signals generated by transducers of a probe to an object and receive echo signals reflected from the object, thereby obtaining at least one image of an internal part of the object (e.g., soft tissues or blood flow). In particular, ultrasound diagnosis apparatuses are used for medical purposes including observation of the interior of an object, detection of foreign substances, and diagnosis of damage to the object. Such ultrasound diagnosis apparatuses provide high stability, display images in real time, and are safe due to the lack of radioactive exposure, compared to X-ray apparatuses. Therefore, ultrasound diagnosis apparatuses are widely used together with other image diagnosis apparatuses including a computed tomography (CT) apparatus, a magnetic resonance imaging (MRI) apparatus, and the like.
- In general, a body marker is added to an ultrasound image when a user selects one of pre-generated body markers. However, it may be difficult to accurately determine a shape or an orientation of an object shown in the ultrasound image based on the body marker shown in the ultrasound image. Also, it may require a large amount of time for the user to select one of the pre-generated body markers.
- One or more exemplary embodiments include a method, an apparatus, and a system for generating a body marker indicating an object, as well as a non-transitory computer-readable recording medium having recorded thereon a program which, when executed by a computer, performs the method.
- Additional aspects will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the presented exemplary embodiments.
- According to one or more exemplary embodiments, a method of generating a body marker includes selecting a first body marker from among a plurality of prestored body markers based on an object shown in a medical image, generating a second body marker by modifying the first body marker according to a user input, and displaying the second body marker.
- The second body marker may include a body marker generated by flipping the first body marker about a vertical axis.
- The second body marker may include a body marker generated by rotating the first body marker about a central axis of the first body marker.
- The second body marker may include a body marker generated by rotating the first body marker in a clockwise direction.
- The second body marker may include a body marker generated by rotating the first body marker in a counterclockwise direction.
- The selecting may include receiving a user input for selecting the first body marker that corresponds to the object from among a plurality of prestored body markers, and selecting the first body marker based on the received user input.
- The selecting may include selecting a portion of the object shown in the medical image, and selecting the first body marker based on the selected portion of the object.
- The plurality of prestored body markers may be sorted into application groups.
- The first body marker and the second body marker may be 2-dimensional or 3-dimensional.
- The user input may include a user gesture input on a touch screen.
- The displaying may include displaying the first and second body markers on a single screen.
- According to one or more exemplary embodiments, a non-transitory computer-readable recording medium has recorded thereon a program which, when executed by a computer, performs the method above.
- According to one or more exemplary embodiments, an apparatus for generating a body marker includes a display displaying a medical image showing an object, and a controller selecting a first body marker from among a plurality of prestored body markers based on the object, and generating a second body marker by modifying a shape of the first body marker according to a user input. The display displays the second body marker.
- The second body marker may include a body marker generated by flipping the first body marker about a vertical axis.
- The second body marker may include a body marker generated by flipping the first body marker about a horizontal axis.
- The second body marker may include a body marker generated by rotating the first body marker in a clockwise direction.
- The second body marker may include a body marker generated by rotating the first body marker in a counterclockwise direction.
- The apparatus may further include an input unit receiving a user input for selecting the first body marker that corresponds to the object from among a plurality of prestored body markers, and the controller may select the first body marker based on the received user input.
- The apparatus may further include an image processor selecting a portion of the object from the medical image, and the controller may select the first body marker based on the selected portion of the object.
- The plurality of prestored body markers may be sorted into application groups.
- The first body marker and the second body marker may be 2-dimensional or 3-dimensional.
- The user input may include a user gesture input on a touch screen.
- The display may display the first and second body markers on a single screen.
- Exemplary embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which reference numerals denote structural elements.
- FIGS. 1A and 1B are diagrams illustrating an ultrasound diagnosis system according to an exemplary embodiment;
- FIG. 2 is a block diagram illustrating an ultrasound diagnosis system according to an exemplary embodiment;
- FIG. 3 is a block diagram illustrating a wireless probe according to an exemplary embodiment;
- FIG. 4 is a block diagram illustrating an apparatus for generating a body marker, according to an exemplary embodiment;
- FIGS. 5A to 5C are diagrams illustrating a first body marker according to an exemplary embodiment;
- FIG. 6 is a diagram illustrating a second body marker according to an exemplary embodiment;
- FIGS. 7A and 7B are diagrams illustrating images used for generating a second body marker, according to an exemplary embodiment;
- FIGS. 8A and 8B are diagrams illustrating an example of a controller generating a second body marker, according to an exemplary embodiment;
- FIGS. 9A and 9B are diagrams illustrating another example of a controller generating a second body marker, according to an exemplary embodiment;
- FIGS. 10A to 10D are diagrams illustrating another example of a controller generating a second body marker, according to an exemplary embodiment;
- FIGS. 11A and 11B are diagrams illustrating another example of a controller generating a second body marker, according to an exemplary embodiment;
- FIGS. 12A and 12B are diagrams illustrating another example of a controller generating a second body marker, according to an exemplary embodiment;
- FIGS. 13A and 13B are diagrams illustrating another example of a controller generating a second body marker, according to an exemplary embodiment;
- FIGS. 14A and 14B are diagrams illustrating another example of a controller generating a second body marker, according to an exemplary embodiment;
- FIG. 15 is a block diagram illustrating an apparatus for generating a body marker, according to another exemplary embodiment;
- FIGS. 16A and 16B are diagrams illustrating an example of an input unit receiving a user input for selecting a first body marker, according to an exemplary embodiment;
- FIG. 17 is a block diagram illustrating an apparatus for generating a body marker, according to another exemplary embodiment;
- FIGS. 18A and 18B are diagrams illustrating an example of an image processor selecting a portion of an object from a medical image, according to an exemplary embodiment; and
- FIG. 19 is a flowchart illustrating a method of generating a body marker, according to an exemplary embodiment.
- The terms used in this specification are those general terms currently widely used in the art in consideration of functions regarding the inventive concept, but the terms may vary according to the intention of those of ordinary skill in the art, precedents, or new technology in the art. Also, some terms may be arbitrarily selected by the applicant, and in this case, the meaning of the selected terms will be described in detail in the detailed description of the present specification. Thus, the terms used in the specification should be understood not as simple names but based on the meaning of the terms and the overall description of the inventive concept. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items. Expressions such as "at least one of," when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
- Throughout the specification, it will also be understood that when a component "includes" an element, unless there is another opposite description thereto, it should be understood that the component does not exclude another element and may further include another element. In addition, terms such as "... unit," "... module," or the like refer to units that perform at least one function or operation, and the units may be implemented as hardware or software or as a combination of hardware and software.
- Throughout the specification, an "ultrasound image" refers to an image of an object or an image of a region of interest included in the object, which is obtained using ultrasound waves. The region of interest is a region in the object which a user wants to focus on, for example, a lesion. Furthermore, an "object" may be a human, an animal, or a part of a human or animal. For example, the object may be an organ (e.g., the liver, heart, womb, brain, breast, or abdomen), a blood vessel, or a combination thereof. Also, the object may be a phantom. The phantom means a material having a density, an effective atomic number, and a volume that are approximately the same as those of an organism. For example, the phantom may be a spherical phantom having properties similar to a human body.
- Throughout the specification, a "user" may be, but is not limited to, a medical expert, for example, a medical doctor, a nurse, a medical laboratory technologist, or a medical imaging expert, or a technician who repairs medical apparatuses.
- Exemplary embodiments will now be described more fully hereinafter with reference to the accompanying drawings.
- FIGS. 1A and 1B are diagrams illustrating an ultrasound diagnosis system 1000 according to an exemplary embodiment.
- Referring to FIG. 1A , a probe 20 may be wired to an ultrasound imaging device 100 in the ultrasound diagnosis system 1000. In other words, the probe 20, which transmits and receives ultrasound, may be connected to a main body of the ultrasound diagnosis system 1000, i.e., the ultrasound imaging device 100, via a cable 110.
- Referring to FIG. 1B , the probe 20 may be wirelessly connected to the ultrasound imaging device 100 in an ultrasound diagnosis system 1001. In other words, the probe 20 and the ultrasound imaging device 100 may be connected via a wireless network. For example, the probe 20 may be connected to the ultrasound imaging device 100 via a millimeter wave (mmWave) wireless network, receive an echo signal via a transducer, and transmit the echo signal in a 60 GHz frequency range to the ultrasound imaging device 100. Also, the ultrasound imaging device 100 may generate an ultrasound image of various modes by using the echo signal received in the 60 GHz frequency range, and display the generated ultrasound image. The millimeter wave wireless network may use, but is not limited to, a wireless communication method according to the Wireless Gigabit Alliance (WiGig) standard.
- FIG. 2 is a block diagram illustrating an ultrasound diagnosis system 1002 according to an exemplary embodiment.
- Referring to FIG. 2 , the ultrasound diagnosis system 1002 may include a probe 20 and an ultrasound imaging device 100. The ultrasound imaging device 100 may include an ultrasound transceiver 1100, an image processor 1200, a communication module 1300, a display 1400, a memory 1500, an input unit 1600, and a controller 1700, which may be connected to one another via buses 1800.
- The ultrasound imaging device 100 may be a cart type apparatus or a portable type apparatus. Examples of portable ultrasound diagnosis apparatuses may include, but are not limited to, a picture archiving and communication system (PACS) viewer, a smartphone, a laptop computer, a personal digital assistant (PDA), and a tablet PC.
- The probe 20 may transmit ultrasound waves to an object 10 (or a region of interest in the object 10) in response to a driving signal applied by the ultrasound transceiver 1100 and receive echo signals reflected by the object 10 (or the region of interest in the object 10). The probe 20 includes a plurality of transducers, which oscillate in response to electric signals and generate acoustic energy, that is, ultrasound waves. Furthermore, the probe 20 may be wired or wirelessly connected to a main body of the ultrasound diagnosis system 1002, and the ultrasound diagnosis system 1002 may include a plurality of probes 20.
- A transmitter 1110 supplies a driving signal to the probe 20. The transmitter 1110 includes a pulse generator 1112, a transmission delaying unit 1114, and a pulser 1116. The pulse generator 1112 generates pulses for forming transmission ultrasound waves based on a predetermined pulse repetition frequency (PRF), and the transmission delaying unit 1114 delays the pulses by delay times necessary for determining transmission directionality. The delayed pulses respectively correspond to a plurality of piezoelectric vibrators included in the probe 20. The pulser 1116 applies a driving signal (or a driving pulse) to the probe 20 based on timing corresponding to each of the delayed pulses.
- A receiver 1120 generates ultrasound data by processing echo signals received from the probe 20. The receiver 1120 may include an amplifier 1122, an analog-to-digital converter (ADC) 1124, a reception delaying unit 1126, and a summing unit 1128. The amplifier 1122 amplifies echo signals in each channel, and the ADC 1124 performs analog-to-digital conversion with respect to the amplified echo signals. The reception delaying unit 1126 delays the digital echo signals output by the ADC 1124 by delay times necessary for determining reception directionality, and the summing unit 1128 generates ultrasound data by summing the echo signals processed by the reception delaying unit 1126. In some embodiments, the receiver 1120 may not include the amplifier 1122. In other words, if the sensitivity of the probe 20 or the capability of the ADC 1124 to process bits is enhanced, the amplifier 1122 may be omitted.
- The image processor 1200 generates an ultrasound image by scan-converting ultrasound data generated by the ultrasound transceiver 1100. The ultrasound image may be not only a grayscale ultrasound image obtained by scanning the object 10 in an amplitude (A) mode, a brightness (B) mode, and a motion (M) mode, but also a Doppler image showing a movement of the object 10 via a Doppler effect. The Doppler image may be a blood flow Doppler image showing flow of blood (also referred to as a color Doppler image), a tissue Doppler image showing a movement of tissue, or a spectral Doppler image showing a moving speed of the object 10 as a waveform.
- A B mode processor 1212 extracts B mode components from ultrasound data and processes the B mode components. An image generator 1220 may generate an ultrasound image indicating signal intensities as brightness based on the extracted B mode components.
- Similarly, a Doppler processor 1214 may extract Doppler components from ultrasound data, and the image generator 1220 may generate a Doppler image indicating a movement of the object 10 as colors or waveforms based on the extracted Doppler components.
- According to an embodiment, the image generator 1220 may generate a three-dimensional (3D) ultrasound image via volume-rendering with respect to volume data and may also generate an elasticity image by imaging deformation of the object 10 due to pressure. Furthermore, the image generator 1220 may display various pieces of additional information in an ultrasound image by using text and graphics. In addition, the generated ultrasound image may be stored in the memory 1500.
- Also, the image processor 1200 may select a portion of the object 10 shown in the ultrasound image.
- The display 1400 displays the generated ultrasound image. The display 1400 may display not only an ultrasound image, but also various pieces of information processed by the ultrasound imaging device 100 on a screen image via a graphical user interface (GUI). In addition, the ultrasound imaging device 100 may include two or more displays 1400 according to embodiments.
- Also, the display 1400 may display at least one body marker from among the body markers stored in the memory 1500, and the display 1400 may display a second body marker that is generated according to a user input.
- The communication module 1300 is wired or wirelessly connected to a network 30 to communicate with an external device or a server. Also, when the probe 20 is connected to the ultrasound imaging device 100 via a wireless network, the communication module 1300 may communicate with the probe 20.
- The communication module 1300 may exchange data with a hospital server or another medical apparatus in a hospital, which is connected thereto via a PACS. Furthermore, the communication module 1300 may perform data communication according to the digital imaging and communications in medicine (DICOM) standard.
- The communication module 1300 may transmit or receive data related to diagnosis of the object 10, e.g., an ultrasound image, ultrasound data, and Doppler data of the object 10, via the network 30 and may also transmit or receive medical images captured by another medical apparatus, e.g., a computed tomography (CT) apparatus, a magnetic resonance imaging (MRI) apparatus, or an X-ray apparatus. Furthermore, the communication module 1300 may receive information about a diagnosis history or medical treatment schedule of a patient from a server and utilize the received information to diagnose the patient. Furthermore, the communication module 1300 may perform data communication not only with a server or a medical apparatus in a hospital, but also with a portable terminal of a medical doctor or patient.
- The communication module 1300 is wired or wirelessly connected to the network 30 to exchange data with a server 32, a medical apparatus 34, or a portable terminal 36. The communication module 1300 may include one or more components for communication with external devices. For example, the communication module 1300 may include a local area communication module 1310, a wired communication module 1320, and a mobile communication module 1330.
- The local area communication module 1310 refers to a module for local area communication within a predetermined distance. Examples of local area communication techniques according to an embodiment may include, but are not limited to, wireless LAN, Wi-Fi, Bluetooth, ZigBee, Wi-Fi Direct (WFD), ultra wideband (UWB), infrared data association (IrDA), Bluetooth low energy (BLE), and near field communication (NFC).
- The wired communication module 1320 refers to a module for communication using electric signals or optical signals. Examples of wired communication techniques according to an embodiment may include communication via a twisted pair cable, a coaxial cable, an optical fiber cable, and an Ethernet cable.
- The mobile communication module 1330 transmits or receives wireless signals to or from at least one selected from a base station, an external terminal, and a server on a mobile communication network. The wireless signals may be voice call signals, video call signals, or various types of data for transmission and reception of text/multimedia messages.
- The memory 1500 stores various data processed by the ultrasound imaging device 100. For example, the memory 1500 may store medical data related to diagnosis of the object 10, such as ultrasound data and an ultrasound image that are input or output, and may also store algorithms or programs which are to be executed in the ultrasound imaging device 100.
- Also, the memory 1500 may store pre-generated body markers and a body marker generated by the controller 1700.
- The memory 1500 may be any of various storage media, e.g., a flash memory, a hard disk drive, EEPROM, etc. Furthermore, the ultrasound imaging device 100 may utilize web storage or a cloud server that performs the storage function of the memory 1500 online.
- The input unit 1600 refers to a unit via which a user may input data for controlling the ultrasound imaging device 100. The input unit 1600 may include hardware components, such as a keyboard, a mouse, a touch pad, a touch screen, and a jog switch, and a software module for driving the hardware components. However, the exemplary embodiments are not limited thereto, and the input unit 1600 may further include any of various input units including an electrocardiogram (ECG) measuring module, a respiration measuring module, a voice recognition sensor, a gesture recognition sensor, a fingerprint recognition sensor, an iris recognition sensor, a depth sensor, a distance sensor, etc.
- Also, the input unit 1600 may receive a user input for selecting a first body marker from among the body markers stored in the memory 1500.
- The controller 1700 may control all operations of the ultrasound imaging device 100. In other words, the controller 1700 may control operations among the probe 20, the ultrasound transceiver 1100, the image processor 1200, the communication module 1300, the display 1400, the memory 1500, and the input unit 1600 shown in FIG. 2 .
- The controller 1700 according to an exemplary embodiment may select a first body marker from among prestored body markers based on the object 10. The prestored body markers are the body markers that are preset and stored in the memory 1500, regardless of a shape or a location of the object 10 shown on the ultrasound image.
- Also, the controller 1700 may generate a second body marker by reshaping or modifying an orientation of the first body marker according to a user input. The user input is input via the input unit 1600 and includes a gesture performed by the user, for example, tapping, touch and hold, double tapping, dragging, panning, flicking, drag and drop, pinching, and stretching.
- For example, a first body marker may be selected from the body markers stored in the memory 1500 according to a user input received by the input unit 1600. An example of the first body marker being selected according to the user input will be described in detail with reference to FIGS. 15 , 16A, and 16B . As another example, a first body marker may be selected from the body markers stored in the memory 1500 based on a portion of the object 10 selected from the ultrasound image by the image processor 1200. An example of the first body marker being selected based on the portion of the object 10 selected from the ultrasound image will be described in detail with reference to FIGS. 17 , 18A, and 18B .
- All or some of the probe 20, the ultrasound transceiver 1100, the image processor 1200, the communication module 1300, the display 1400, the memory 1500, the input unit 1600, and the controller 1700 may be implemented as software modules. Furthermore, at least one selected from the ultrasound transceiver 1100, the image processor 1200, and the communication module 1300 may be included in the controller 1700. However, the exemplary embodiments are not limited thereto.
- FIG. 3 is a block diagram illustrating a wireless probe 2000 according to an exemplary embodiment.
- As described above with reference to FIG. 2 , the wireless probe 2000 may include a plurality of transducers, and, according to embodiments, may include some or all of the components of the ultrasound transceiver 1100 shown in FIG. 2 .
- The wireless probe 2000 according to the embodiment shown in FIG. 3 includes a transmitter 2100, a transducer 2200, and a receiver 2300. Since descriptions thereof are given above with reference to FIG. 2 , detailed descriptions thereof will be omitted here. In addition, according to embodiments, the wireless probe 2000 may selectively include a reception delaying unit 2330 and a summing unit 2340.
- The wireless probe 2000 may transmit ultrasound signals to the object 10, receive echo signals from the object 10, generate ultrasound data, and wirelessly transmit the ultrasound data to the ultrasound diagnosis system 1002 shown in FIG. 2 .
FIG. 4 is a block diagram illustrating anapparatus 101 for generating a body marker, according to an exemplary embodiment. - Referring to
FIG. 4 , theapparatus 101 may include acontroller 1701 and adisplay 1401. One or both of thecontroller 1701 and thedisplay 1401 may be implemented as software modules, but are not limited thereto. One of thecontroller 1701 and thedisplay 1401 may be implemented as hardware. Also, thedisplay 1401 may include an independent control module. - Also, the
controller 1701 may be the same as thecontroller 1700 ofFIG. 2 , and thedisplay 1401 may be the same as thedisplay 1400 ofFIG. 2 . If theapparatus 101 is a component included in an ultrasound imaging device, then, in addition to thecontroller 1701 and thedisplay 1401, theapparatus 101 may further include theultrasound transceiver 1100, theimage processor 1200, thecommunication module 1300, thememory 1500, and theinput unit 1600 shown inFIG. 2 . - The
controller 1701 may select a first body marker from prestored body markers based on an object shown in a medical image. The body markers may be stored in a memory (not shown) of theapparatus 101. - A body marker refers to a figure added to a medical image (e.g., an ultrasound image) so that a viewer of the medical image (e.g., a user) may easily recognize an object shown in the medical image.
- The first body marker according to an exemplary embodiment refers to a body marker that is pre-generated and stored in the
apparatus 101. That is, the first body marker refers to a body marker generated in advance by a manufacturer or the user, regardless of the current shape or location of the object shown in the medical image. - Since the first body marker does not reflect the current shape or the location of the object, when the first body marker is added to the medical image, the viewer of the medical image may be unable to accurately recognize the shape, position, or orientation of the object. Also, since the first body marker is selected from the prestored body markers, a large amount of time may be required for a user to select an appropriate body marker for the object. Hereinafter, the first body marker will be described in detail with reference to
FIGS. 5A to 5C . -
FIGS. 5A to 5C are diagrams illustrating a first body marker according to an exemplary embodiment. - Referring to
FIG. 5A , ascreen 3110 displays an example ofprestored body markers 3120. Thecontroller 1701 may select a first body marker from any one of thebody markers 3120 based on an object shown in a medical image. - The
body markers 3120 may be sorted intoapplication groups 3130. An application refers to a diagnosis type determined based on a body part or an organ of a human or an animal. - For example, the
application groups 3130 may include an abdomen group, a small part group including the chest, mouth, sexual organs, and the like, a vascular group, a musculoskeletal group, an obstetrics group, a gynecology group, a cardiac group, a brain group, a urology group, and a veterinary group. However, theapplication groups 3130 are not limited thereto. Body parts and organs of a human or an animal may be sorted into a plurality of application groups based on a predetermined standard. - The
display 1401 may display thebody markers 3120, which are sorted into theapplication groups 3130, on thescreen 3110. For example, the user may select (e.g., click or tap) a 'Vascular' icon from icons representing theapplication groups 3130 displayed on thescreen 3110. Then, from among thebody markers 3120, thedisplay 1401 may display body markers included in the vascular group on thescreen 3110. - The
body markers 3120 are generated and stored in advance. Accordingly, the user has to examine thebody markers 3120 and select a first body marker that is the most appropriate for an object in a medical image. Therefore, it may require a large amount of time for the user to select the first body marker. - Also, the first body marker selected from the
body markers 3120 may not include accurate information about the object. This will be described in detail with reference toFIG. 5B . -
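The sorting of prestored body markers into application groups, as described above with reference to FIG. 5A, amounts to a simple lookup table. A minimal sketch, with illustrative group and marker names (none are taken from the patent):

```python
# Hypothetical catalog of prestored body markers sorted into application
# groups, as in FIG. 5A. Group and marker names are illustrative only.
MARKER_CATALOG = {
    "vascular": ["carotid", "femoral artery"],
    "obstetrics": ["fetus", "uterus"],
    "musculoskeletal": ["shoulder", "knee"],
}

def markers_for_group(group):
    """Return the prestored body markers of one application group."""
    return MARKER_CATALOG.get(group, [])
```

Selecting a group (e.g., tapping the 'Vascular' icon) would then display only that group's markers, from which the user picks the first body marker.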
FIG. 5B illustrates an example of amedical image 3220 displayed on ascreen 3210 and afirst body marker 3230 added to themedical image 3220. For convenience of description,FIG. 5B is described assuming that themedical image 3220 is an ultrasound image showing a portion of blood vessels. - In order to facilitate understanding of a viewer of the
medical image 3220, thefirst body marker 3230 may be added to themedical image 3220. Thefirst body marker 3230 is selected from pre-generated body markers. Therefore, by viewing only thefirst body marker 3230, the viewer may only be able to recognize that themedical image 3220 obtained is a captured image of blood vessels, without knowing which portion of the blood vessels is shown in themedical image 3220. - The
first body marker 3230 cannot be rotated or otherwise modified. Depending on a direction of blood vessels 3250 shown in the first body marker 3230, the first body marker 3230 may be unable to accurately indicate a direction of blood vessels 3240 shown in the medical image 3220. For example, referring to FIG. 5B , the blood vessels 3240 in the medical image 3220 are oriented along the x-axis, whereas the blood vessels 3250 in the first body marker 3230 are oriented along the y-axis. Therefore, the first body marker 3230 does not provide accurate information about the object. -
FIG. 5C illustrates an example of amedical image 3320 displayed on ascreen 3310 and afirst body marker 3330 added to themedical image 3320. For convenience of description,FIG. 5C is described assuming that themedical image 3320 is an ultrasound image showing a fetus. - A
fetus 3340 of theultrasound image 3320 is shown lying parallel to the x-axis and facing downward. However, afetus 3350 of thefirst body marker 3330 is shown lying parallel to the x-axis and facing upward. Therefore, thefirst body marker 3330 does not provide accurate information about thefetus 3340 shown in theultrasound image 3320. - As described above with reference to
FIGS. 5A to 5C , the first body marker, which is a pre-generated body marker, may not accurately represent an object included in a medical image. - The
controller 1701 according to an exemplary embodiment may generate a second body marker by modifying a shape, an orientation, or a position of the first body marker. Therefore, the controller 1701 may generate a body marker that accurately shows information about an object. Hereinafter, the second body marker will be described in detail with reference to FIG. 6 . -
FIG. 6 is a diagram illustrating asecond body marker 3430 according to an exemplary embodiment. -
FIG. 6 illustrates an example of a medical image 3420 displayed on a screen 3410 and a second body marker 3430 added to the medical image 3420. For convenience of description, FIG. 6 is described assuming that the medical image 3420 is an ultrasound image showing a fetus. - A
fetus 3440 of theultrasound image 3420 is shown lying parallel to the x-axis and facing downward. Afetus 3450 of thesecond body marker 3430 is also shown lying parallel to the x-axis and facing downward. In other words, thesecond body marker 3430 accurately shows an orientation and a facing direction of thefetus 3440 of theultrasound image 3420. - As described above with reference to
FIG. 5C , thefirst body marker 3330 does not accurately show a facing direction of thefetus 3340 of theultrasound image 3320. Therefore, the viewer may be unable to obtain accurate information about thefetus 3340 shown in theultrasound image 3320 based on only thefirst body marker 3330. However, as shown inFIG. 6 , thesecond body marker 3430 accurately shows the orientation and location of thefetus 3440 of theultrasound image 3420, and thus, the viewer may obtain accurate information about thefetus 3440 shown in theultrasound image 3420 based on only thesecond body marker 3430. - Referring back to
FIG. 4 , thecontroller 1701 may generate a second body marker by changing a shape of a first body marker according to a user input. For example, thecontroller 1701 may modify the shape, orientations, or positions of the first body marker according to a user input received via an input unit (not shown) in theapparatus 101. The input unit in theapparatus 101 may be the same as theinput unit 1600 described with reference toFIG. 2 . - For example, in the case that the input unit includes hardware components, such as a keyboard, a mouse, a trackball, and a jog switch, and a software module for driving the hardware components, the user input may include clicking a predetermined point on a screen or inputting a drag gesture from a point on the screen to another point on the screen. As another example, in the case that the input unit includes a touch screen and a software module for driving the touch screen, the user input may include a user gesture input on the touch screen. The gesture may include, for example, tapping, touch and hold, double tapping, dragging, scrolling, flicking, drag and drop, pinching, and stretching.
- For example, the
controller 1701 may generate a second body marker by flipping the first body marker about a vertical axis according to the user input. As another example, thecontroller 1701 may generate a second body marker by rotating the first body marker about a central axis of the first body marker according to the user input. As another example, thecontroller 1701 may generate a second body marker by rotating the first body marker in a clockwise or counterclockwise direction according to the user input. - The
display 1401 may display a medical image showing an object on a screen. Also, thedisplay 1401 may display the second body marker generated by thecontroller 1701 on the screen. Thedisplay 1401 may display the first body marker and the second body marker on a single screen. - The
controller 1701 may generate the second body marker by changing the shape of the first body marker according to the user input. In order to help the user to accurately input data, thedisplay 1401 may display the first body marker and a predetermined guide image on the screen. Hereinafter, the first body marker and the predetermined guide image displayed by thedisplay 1401 will be described in detail with reference toFIGS. 7A and 7B . -
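The flip and rotation operations mentioned above can be sketched as plane transforms of a marker outline. This is an illustrative model only; the patent does not specify how a body marker is represented internally:

```python
import math

# A body marker modeled as a 2-D outline: a list of (x, y) points.
# This representation is an assumption made for illustration.

def flip_about_vertical_axis(points):
    """Mirror the marker outline about the vertical (y) axis."""
    return [(-x, y) for (x, y) in points]

def rotate_in_plane(points, degrees):
    """Rotate the marker outline about the origin; positive angles are
    counterclockwise, negative angles clockwise."""
    t = math.radians(degrees)
    c, s = math.cos(t), math.sin(t)
    return [(x * c - y * s, x * s + y * c) for (x, y) in points]
```

A second body marker would then be the first marker's outline passed through one or more of these transforms before being redrawn on the screen.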
FIGS. 7A and 7B are diagrams illustrating images used for generating a second body marker, according to an exemplary embodiment. -
FIG. 7A illustrates an example of afirst body marker 4120 and first tothird guide images screen 4110. The user may provide a user input to theapparatus 101 based on thefirst body marker 4120 and the first tothird guide images screen 4110. - The user may select a point on the
guide images guide images controller 1701 may modify a shape, orientations, or positions of thefirst body marker 4120 according to the user input. - For convenience of description, the user input is assumed as a gesture performed by the user in
FIGS. 7A to 14B . However, the user input is not limited thereto. As described above, the user may provide the user input to theapparatus 101 by using various hardware components, such as a keyboard or a mouse. - For example, when the user taps the
first guide image 4130 or inputs a drag gesture along points surrounding thefirst guide image 4130 in a clockwise or counterclockwise direction, thecontroller 1701 may rotate a portion of thefirst body marker 4120 about the central axis thereof. - As another example, when the user taps the
second guide image 4140, thecontroller 1701 may rotate a portion of thefirst body marker 4120 toward thesecond guide image 4140. - As another example, when the user taps the
third guide image 4150 or inputs a drag gesture in a clockwise or counterclockwise direction along points surrounding thethird guide image 4150, thecontroller 1701 may rotate a portion of thefirst body marker 4120 in a clockwise or counterclockwise direction. - Alternatively, as shown in
FIG. 7B , only thefirst body marker 4120 may be displayed on ascreen 4160. In other words, the first tothird guide images FIG. 7A may be omitted from thescreen 4160. In the case ofFIG. 7B , the user may also provide the user input, as described above with reference toFIG. 7A , to theapparatus 101, and thecontroller 1701 may modify a shape, orientations, or positions of thefirst body marker 4120 according to the user input. - Hereinafter, examples of the
controller 1701 generating a second body marker by modifying a shape of a first body marker will be described in detail with reference toFIGS. 8A to 14B . InFIGS. 8A to 14B , it is assumed that theguide images -
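The gesture-driven edits described with reference to FIGS. 7A and 7B can be modeled as updates to a small marker state. A sketch under assumed names (none of these identifiers come from the patent):

```python
from dataclasses import dataclass

# Illustrative marker state driven by the gestures described above.
@dataclass
class MarkerState:
    heading_deg: float = 0.0  # in-plane rotation (third guide image)
    axis_deg: float = 0.0     # rotation about the central axis (first guide image)
    flipped: bool = False     # whether flipped about the vertical axis

def tap_central_axis(state: MarkerState) -> None:
    """Assume each tap on the central axis steps the marker 90° about it."""
    state.axis_deg = (state.axis_deg + 90) % 360

def drag_around_marker(state: MarkerState, delta_deg: float) -> None:
    """A clockwise or counterclockwise drag rotates the marker in-plane."""
    state.heading_deg = (state.heading_deg + delta_deg) % 360
```

Rendering the second body marker would then read this state and apply the corresponding transforms to the first body marker's image.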
FIGS. 8A and 8B are diagrams illustrating an example of thecontroller 1701 generating a second body marker, according to an exemplary embodiment. - For example, the
controller 1701 may generate asecond body marker 4230 by rotating a portion of afirst body marker 4220 according to a user input. -
FIG. 8A illustrates an example of ascreen 4210 displaying thefirst body marker 4220. The user may select (e.g., tap) an outer point of thefirst body marker 4220, and thecontroller 1701 may rotate the portion of thefirst body marker 4220 to the point selected by the user. -
FIG. 8B illustrates an example of thesecond body marker 4230 generated by rotating the portion of thefirst body marker 4220. For example, when thefirst body marker 4220 represents a fetus, thecontroller 1701 may rotate the portion of thefirst body marker 4220 such that the head of the fetus is located toward apoint 4240 selected by the user. In this case, thecontroller 1701 may rotate the portion of thefirst body marker 4220 in a clockwise or counterclockwise direction. - According to a rotation degree of a portion of the
second body marker 4230, thesecond body marker 4230 may be the same as when thefirst body marker 4220 is flipped about a horizontal axis. - Accordingly, the
controller 1701 may generate thesecond body marker 4230 by rotating the portion of thefirst body marker 4220 toward thepoint 4240 selected by the user. Also, thecontroller 1701 may store thesecond body marker 4230 in the memory of theapparatus 101. -
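The rotation in FIGS. 8A and 8B, where the head of the fetus is turned toward a tapped point, can be computed from the angle of the target point relative to the marker's center. A minimal sketch; the coordinate convention and function name are assumptions:

```python
import math

def head_angle_toward(center, target):
    """Angle in degrees (counterclockwise from the +x axis, in [0, 360))
    that points the marker's head from `center` toward `target`."""
    dx = target[0] - center[0]
    dy = target[1] - center[1]
    return math.degrees(math.atan2(dy, dx)) % 360
```

The controller could rotate the marker by the difference between this angle and the head's current angle, clockwise or counterclockwise, whichever is shorter.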
FIGS. 9A and 9B are diagrams illustrating another example of thecontroller 1701 generating a second body marker, according to an exemplary embodiment. - For example, the
controller 1701 may generate asecond body marker 4330 by rotating a portion of afirst body marker 4320 in a clockwise or counterclockwise direction according to a user input. -
FIG. 9A illustrates an example of ascreen 4310 displaying thefirst body marker 4320. The user may input a drag gesture from an outer point on thefirst body marker 4320 to another outer point thereon in a clockwise or counterclockwise direction, and thecontroller 1701 may rotate the portion of thefirst body marker 4320 in a clockwise or counterclockwise direction according to the drag gesture input by the user. -
FIG. 9B illustrates the second body marker 4330 generated by rotating the portion of the first body marker 4320 in a clockwise direction. For example, it is assumed that the first body marker 4320 represents a fetus, and the user inputs a drag gesture in a clockwise direction from a point 4340 toward which the head of the fetus is directed to another point 4350 on the first body marker 4320. In this case, the controller 1701 may rotate the portion of the first body marker 4320 in a clockwise direction such that the head of the fetus points toward the point 4350 where the drag gesture stops. - Accordingly, the
controller 1701 may generate thesecond body marker 4330 by rotating the portion of thefirst body marker 4320 in a clockwise direction toward thepoint 4350 selected by the user. Also, thecontroller 1701 may store thesecond body marker 4330 in the memory of theapparatus 101. - As described above with reference to
FIGS. 9A to 9B , thecontroller 1701 may generate thesecond body marker 4330 by rotating the portion of thefirst body marker 4320 in a clockwise direction. However, when the user inputs a drag gesture in a counterclockwise direction, thecontroller 1701 may generate a second body marker by rotating a first body marker in a counterclockwise direction toward a point where the drag gesture stops. -
FIGS. 10A to 10D are diagrams illustrating another example of thecontroller 1701 generating a second body marker, according to an exemplary embodiment. - For example, the
controller 1701 may generatesecond body markers first body marker 4420 about a central axis of thefirst body marker 4420 according to a user input. -
FIG. 10A illustrates an example of ascreen 4410 displaying thefirst body marker 4420. The user may select (e.g., tap) apoint 4430 through which the central axis of thefirst body marker 4420 passes, and thecontroller 1701 may rotate the portion of thefirst body marker 4420 about the central axis thereof according to the selected point. - Although
FIGS. 10B to 10D illustrate that the first body marker 4420 is rotated about the central axis thereof in a counterclockwise direction, the exemplary embodiments are not limited thereto. In other words, the controller 1701 may rotate the portion of the first body marker 4420 about the central axis thereof in a clockwise or counterclockwise direction based on a predetermined rule. -
FIG. 10B illustrates an example of the second body marker 4440 generated by rotating the portion of the first body marker 4420 by 90° about the central axis thereof. For example, it is assumed that the first body marker 4420 represents the fetus, and the user taps the point 4430 through which the central axis of the first body marker 4420 passes. In this case, the controller 1701 may rotate the portion of the first body marker 4420 by 90° about the central axis thereof in a counterclockwise direction. -
FIG. 10C illustrates an example of thesecond body marker 4450 generated by rotating the portion of thesecond body marker 4440 of theFIG. 10B by 90° about a central axis of thesecond body marker 4440. For example, it is assumed that thesecond body marker 4440 represents the fetus, and the user taps thepoint 4430 through which the central axis of thesecond body marker 4440 passes. In this case, thecontroller 1701 may rotate the portion of thesecond body marker 4440 by 90° about the central axis thereof in a counterclockwise direction. -
FIG. 10D illustrates an example of thesecond body marker 4460 generated by rotating the portion of thesecond body marker 4450 ofFIG. 10C by 90° about a central axis of thesecond body marker 4450. For example, it is assumed that thesecond body marker 4450 represents the fetus, and the user taps thepoint 4430 through which the central axis of thesecond body marker 4450 passes. In this case, thecontroller 1701 may rotate the portion of thesecond body marker 4450 by 90° about the central axis thereof in a counterclockwise direction. - Accordingly, the
controller 1701 may generate thesecond body markers first body marker 4420 about the central axis thereof. Also, thecontroller 1701 may store thesecond body markers apparatus 101. - As described above with reference to FIGS.
10A to 10D , the controller 1701 may rotate the portion of the first body marker 4420 about the central axis thereof according to taps performed by the user. The taps may be continuous or discontinuous, but are not limited thereto. - For example, it is assumed that the user inputs a drag gesture in a clockwise direction from a point on the
first body marker 4420 where the central axis of thefirst body marker 4420 passes through to another point on thefirst body marker 4420. Based on a distance of the drag gesture, thecontroller 1701 may rotate thefirst body marker 4420 about the central axis thereof in a clockwise direction. For example, when the user inputs a drag gesture in a clockwise direction from a first point to a second point, in which the distance between the first point and the second point is 1 mm, thecontroller 1701 may rotate thefirst body marker 4420 by 20° in a clockwise direction. However, the exemplary embodiments are not limited thereto. The distance of the drag gesture and a rotation degree of thefirst body marker 4420 may vary. -
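The drag-distance-to-rotation mapping in the example above can be written as a single conversion. The 20° per millimeter ratio is the example figure from the text, which notes the ratio may vary; the sign convention is an assumption:

```python
# Example ratio from the text above: a 1 mm drag rotates the marker 20°.
# The text notes this ratio may vary, so it is a tunable parameter here.
DEGREES_PER_MM = 20.0

def drag_to_axis_rotation(distance_mm, clockwise=True):
    """Convert a drag distance into a rotation about the central axis.
    Assumed sign convention: clockwise rotations are negative."""
    degrees = distance_mm * DEGREES_PER_MM
    return -degrees if clockwise else degrees
```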
FIGS. 11A and 11B are diagrams illustrating another example of thecontroller 1701 generating a second body marker, according to an exemplary embodiment. - For example, the
controller 1701 may generate asecond body marker 4540 by flipping afirst body marker 4520 about the vertical axis according to a user input. -
FIG. 11A illustrates an example of ascreen 4510 displaying thefirst body marker 4520. When the user selects (e.g., tap) anouter point 4530 on thefirst body marker 4520, thecontroller 1701 may flip thefirst body marker 4520 about the vertical axis. -
FIG. 11B illustrates an example of thesecond body marker 4540 generated by flipping thefirst body marker 4520 about the vertical axis. For example, when thefirst body marker 4520 represents the palm of the right hand of a subject, thecontroller 1701 may flip thefirst body marker 4520 about the vertical axis to show an inverse image of the palm of the right hand. - Accordingly, the
controller 1701 may generate thesecond body marker 4540 by flipping thefirst body marker 4520 about the vertical axis. Also, thecontroller 1701 may store thesecond body marker 4540 in the memory of theapparatus 101. -
FIGS. 12A and 12B are diagrams illustrating another example of thecontroller 1701 generating a second body marker, according to an exemplary embodiment. - For example, the
controller 1701 may generate asecond body marker 4640 by flipping over a first body marker 4620 (i.e., rotating thefirst body marker 4620 by 180° about a central axis thereof in a clockwise or counterclockwise direction) according to a user input. -
FIG. 12A illustrates an example of ascreen 4610 displaying thefirst body marker 4620. When the user selects (e.g., tap) apoint 4630 through which a central axis of thefirst body marker 4620 passes, thecontroller 1701 may flip over thefirst body marker 4620. -
FIG. 12B illustrates an example of thesecond body marker 4640 generated by flipping over thefirst body marker 4620. For example, when thefirst body marker 4620 represents the palm of the right hand, thecontroller 1701 may flip over thefirst body marker 4620 to display the back of the right hand. - Accordingly, the
controller 1701 may generate thesecond body marker 4640 by flipping over thefirst body marker 4620. Also, thecontroller 1701 may store thesecond body marker 4640 in the memory of theapparatus 101. - As described above with reference to
FIGS. 8A to 12B , thecontroller 1701 may generate a second body marker that represents a single object by using a first body marker that represents a single object. However, a body marker may represent a plurality of objects. For example, when a medical image is obtained by capturing the uterus of a pregnant woman having twins, a body marker added to the medical image has to represent the twins (that is, a plurality of objects). - The
controller 1701 according to an exemplary embodiment may generate a second body marker that represents a plurality of objects by using a first body marker that represents a single object. Hereinafter, examples of thecontroller 1701 generating a second body marker that represents a plurality of objects will be described with reference toFIGS. 13A to 14B . -
FIGS. 13A and 13B are diagrams illustrating another example of thecontroller 1701 generating a second body marker, according to an exemplary embodiment. -
FIG. 13A illustrates an example of ascreen 4710 displaying afirst body marker 4720 that represents a single object. For convenience of description, it is assumed that thefirst body marker 4720 represents a fetus. - The
controller 1701 may change the number of objects represented by a body marker, according to a user input. For example, when the user selects (e.g., taps) anicon 4730 displayed at a predetermined location on thescreen 4710, thedisplay 1401 may display a pop-upwindow 4740 showing the number of objects that may be represented by a body marker. Accordingly, the user may set the number of objects to be represented by the body marker. - When the user has set the body marker to represent 2 objects, the
controller 1701 may generate asecond body marker 4750 that includes 2 fetuses, as shown inFIG. 13B . Also, thecontroller 1701 may store thesecond body marker 4750 in the memory of theapparatus 101. - The
controller 1701 may modify shapes, orientations, or positions of a plurality of objects included in a body marker. For example, when a medical image is obtained by capturing the uterus of a pregnant woman having twins, each fetus may be oriented in different directions. Therefore, thecontroller 1701 may modify the shape, orientations, or positions of the plurality of objects in the body marker to thus generate a body marker that accurately shows information about the objects. - Hereinafter, an example of the
controller 1701 generating a second body marker by modifying the shape, orientations, or positions according to a plurality of objects included in a body marker will be described with reference toFIGS. 14A and 14B . -
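Generating a marker that represents several objects, as in FIGS. 13A and 13B, can be sketched by duplicating a single-object outline; the spacing and the outline representation are illustrative assumptions:

```python
# Sketch of building a twin (or larger) marker from a single-object
# outline, where an outline is a list of (x, y) points.
def make_multi_marker(single_outline, count, spacing=1.5):
    """Duplicate a single-object outline `count` times, side by side,
    so each copy can later be rotated or repositioned independently."""
    return [
        [(x + i * spacing, y) for (x, y) in single_outline]
        for i in range(count)
    ]
```

Each entry of the returned list stands for one object, so a per-object edit (such as turning one fetus's head) would transform only the selected entry.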
FIGS. 14A and 14B are diagrams illustrating another example of thecontroller 1701 generating a second body marker, according to an exemplary embodiment. -
FIG. 14A illustrates an example of ascreen 4810 displaying afirst body marker 4820 that includes a plurality of objects. For convenience of description, it is assumed that thefirst body marker 4820 includes twins. Also, as described above with reference toFIGS. 13A and 13B , thecontroller 1701 may generate thefirst body marker 4820 by using a body marker that includes a single fetus. - Based on a user input, the
controller 1701 may select an object from among objects included in a body marker. For example, when the user selects (e.g., taps) afetus 4833 fromfetuses screen 4810, thecontroller 1701 may determine thefetus 4833 as an object to be reshaped, reoriented, or repositioned. In this case, thedisplay 1401 may display thefetuses fetus 4833 that is selected is distinguished from thefetus 4831 that is not selected (for example, change in thickness or line color). Thus, the user may easily recognize the selectedfetus 4833. - The
controller 1701 may modify a shape, orientations, or positions of thefetus 4833 according to a user input. The user input is the same as described above with reference toFIGS. 8A to 12B . For example, it is assumed that the user inputs a drag gesture in a clockwise direction from apoint 4840 toward which the head of thefetus 4833 is directed to anotherpoint 4850. In this case, thecontroller 1701 may rotate thefetus 4833 such that the head is directed toward thepoint 4850 where the drag gesture stops. - Referring to
FIG. 14B , thecontroller 1701 may generate asecond body marker 4860 in which a location of thefetus 4833 ofFIG. 14A is modified. Also, thecontroller 1701 may store thesecond body marker 4860 in the memory of theapparatus 101. - Although second body markers described above with reference to
FIGS. 7A to 14B are 2-dimensional, the exemplary embodiments are not limited thereto. In other words, thecontroller 1701 may generate a 3-dimensional second body marker. -
FIG. 15 is a block diagram illustrating anapparatus 102 for generating a body marker, according to another exemplary embodiment. - Referring to
FIG. 15 , theapparatus 102 may include acontroller 1702, adisplay 1402, and aninput unit 1601. All or some of thecontroller 1702, thedisplay 1402, and theinput unit 1601 may be implemented as software modules, but are not limited thereto. Some of thecontroller 1702, thedisplay 1402, and theinput unit 1601 may be implemented as hardware. Also, each of thedisplay 1402 and theinput unit 1601 may include an independent control module. - Also, the
controller 1702 may be the same as thecontroller 1701 ofFIG. 4 , thedisplay 1402 may be the same as thedisplay 1401 ofFIG. 4 , and theinput unit 1601 may be the same as theinput unit 1600 ofFIG. 2 . If theapparatus 102 is a component included in an ultrasound imaging device, then, in addition to thecontroller 1702, thedisplay 1402, and theinput unit 1601, theapparatus 102 may further include theultrasound transceiver 1100, theimage processor 1200, thecommunication module 1300, and thememory 1500 shown inFIG. 2 . - Operations of the
display 1402 are the same as operations of corresponding components described above with reference toFIGS. 4 to 14B . Therefore, detailed description of thedisplay 1402 will not be repeated. - The
input unit 1601 may receive a user input for selecting a first body marker corresponding to an object in a medical image. The user input refers to an input selecting the first body marker from prestored body markers. - The
controller 1702 may select the first body marker based on the user input transmitted from theinput unit 1601. Also, thecontroller 1702 may generate a second body marker by changing a shape of the first body marker according to a user input received after the user input for selecting the first body marker. Since examples of thecontroller 1702 generating the second body marker have been described above in detail with reference toFIGS. 8A to 14B , descriptions of the examples will not be repeated. - Hereinafter, an example of the
input unit 1601 receiving a user input and thecontroller 1702 selecting a first body marker will be described in detail with reference toFIGS. 16A and 16B . -
FIGS. 16A and 16B are diagrams illustrating an example of an input unit receiving a user input for selecting a first body marker, according to an exemplary embodiment. - Referring to
FIG. 16A , ascreen 5110 displays a plurality ofbody markers 5120 prestored in theapparatus 102. In this case, thebody markers 5120 may be sorted intoapplication groups 5130, as described above with reference toFIG. 5A . - The user may select a
body marker 5140 from among the body markers 5120 displayed on the screen 5110. For example, when the input unit 1601 includes hardware components, such as a keyboard, a mouse, a trackball, and a jog switch, and a software module for driving the hardware components, the user may click the body marker 5140 from among the body markers 5120. As another example, when the input unit 1601 includes a touch screen and a software module for driving the touch screen, the user may tap the body marker 5140 from among the body markers 5120. - The
controller 1702 may select a first body marker based on the user input. In other words, the body marker 5140 selected by the user is selected as the first body marker. Then, as shown in FIG. 16B, the display 1402 may display a first body marker 5160 on a screen 5150. - As described above with reference to
FIGS. 16A and 16B, the controller 1702 may select a body marker determined by the user as the first body marker. However, the exemplary embodiments are not limited thereto. For example, the controller 1702 may select the first body marker based on a shape of an object shown in a medical image. Hereinafter, an example of the controller 1702 selecting a first body marker based on a shape of an object will be described with reference to FIGS. 17, 18A, and 18B. -
FIG. 17 is a block diagram illustrating an apparatus 103 for generating a body marker, according to another exemplary embodiment. - Referring to
FIG. 17, the apparatus 103 may include a controller 1703, a display 1403, and an image processor 1201. All or some of the controller 1703, the display 1403, and the image processor 1201 may be implemented as software modules, but are not limited thereto. Some of the controller 1703, the display 1403, and the image processor 1201 may be implemented as hardware. Also, each of the display 1403 and the image processor 1201 may include an independent control module. - Also, the
controller 1703 may be the same as the controller 1701 of FIG. 4, the display 1403 may be the same as the display 1401 of FIG. 4, and the image processor 1201 may be the same as the image processor 1200 of FIG. 2. If the apparatus 103 is a component included in an ultrasound imaging device, then, in addition to the controller 1703, the display 1403, and the image processor 1201, the apparatus 103 may further include the ultrasound transceiver 1100, the communication module 1300, the memory 1500, and the input unit 1600 shown in FIG. 2. - Operations of the
display 1403 are the same as the operations of the corresponding components described above with reference to FIGS. 4 to 14B. Therefore, a detailed description of the display 1403 will not be repeated. - The
image processor 1201 selects a portion of an object from a medical image. For example, the image processor 1201 may detect outlines of the object in the medical image, connect the detected outlines, and thus select the portion of the object. The image processor 1201 may select the portion of the object by using various methods, for example, a thresholding method, a K-means algorithm, a compression-based method, a histogram-based method, edge detection, a region-growing method, a partial differential equation-based method, and a graph partitioning method. Since these methods are well-known to one of ordinary skill in the art, detailed descriptions thereof will be omitted. - The
controller 1703 may select a first body marker based on information about the shape of the object. For example, from among a plurality of body markers stored in the apparatus 103, the controller 1703 may select a first body marker having a shape that is the most similar to that of the object. - Also, the
controller 1703 may generate a second body marker by modifying the shape, orientation, or position of the first body marker according to a user input received after the first body marker is selected. Since examples of the controller 1703 generating the second body marker have been described above in detail with reference to FIGS. 8A to 14B, the examples will not be described again. - Hereinafter, an example of the
image processor 1201 selecting the portion of the object from the medical image and the controller 1703 selecting the first body marker will be described in detail with reference to FIGS. 18A and 18B. -
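By way of illustration only, the thresholding-based selection of a portion of an object, one of the methods enumerated above, may be sketched as follows. The representation of the medical image as a 2-D NumPy intensity array, the threshold value, and the function name are assumptions introduced here for illustration and are not part of the disclosure.

```python
import numpy as np

def select_object_portion(image, threshold=0.5):
    """Select the portion of an object from a medical image by simple
    thresholding. `image` is a 2-D array of pixel intensities in [0, 1]."""
    # Pixels brighter than the threshold are taken as belonging to the object.
    mask = image > threshold
    # The bounding box of the mask approximates the selected portion.
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    top = int(np.argmax(rows))
    bottom = len(rows) - 1 - int(np.argmax(rows[::-1]))
    left = int(np.argmax(cols))
    right = len(cols) - 1 - int(np.argmax(cols[::-1]))
    return mask, (top, bottom, left, right)
```

A real implementation could equally use edge detection, region growing, or any of the other methods listed above; the bounding box here merely stands in for the "portion of the object".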
FIGS. 18A and 18B are diagrams illustrating an example of an image processor selecting a portion of an object from a medical image, according to an exemplary embodiment. - Referring to
FIG. 18A, a screen 5210 displays a medical image 5220 showing an object 5230. The image processor 1201 may select a portion of the object 5230 from the medical image 5220. For example, the image processor 1201 may use any one of the methods described with reference to FIG. 17 to select the portion of the object 5230 from the medical image 5220. - The
controller 1703 may select a first body marker based on the shape of the object 5230. For example, from among the plurality of body markers stored in the apparatus 103, the controller 1703 may select a body marker 5240 having a shape that is the most similar to that of the object 5230 as the first body marker. Also, as shown in FIG. 18B, the display 1403 may display the first body marker 5240 on a screen 5250. -
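The controller's choice of the stored body marker whose shape is most similar to the segmented object may be sketched, for illustration only, as a search over binary shape masks. The intersection-over-union similarity measure, the dictionary of marker masks, and the function names are assumptions; the disclosure does not fix a particular similarity metric.

```python
import numpy as np

def iou(mask_a, mask_b):
    """Intersection-over-union of two binary shape masks."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return inter / union if union else 0.0

def select_first_body_marker(object_mask, stored_markers):
    """Return the name of the prestored marker whose mask is most
    similar to the segmented object's mask."""
    return max(stored_markers, key=lambda name: iou(object_mask, stored_markers[name]))
```

Any shape descriptor (Hu moments, contour matching, a learned embedding) could replace IoU; the point is only that the marker maximizing the similarity score is taken as the first body marker.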
FIG. 19 is a flowchart illustrating a method of generating a body marker, according to an exemplary embodiment. - Referring to
FIG. 19, the method of generating the body marker includes operations sequentially performed by the ultrasound diagnosis systems of FIGS. 1A, 1B, and 2 or by the apparatuses of FIGS. 4, 15, and 17. Therefore, all of the above-described features and elements of the ultrasound diagnosis systems of FIGS. 1A, 1B, and 2 and of the apparatuses of FIGS. 4, 15, and 17 also apply to the method of FIG. 19. - In
operation 6100, a controller selects a first body marker from among a plurality of prestored body markers based on an object shown in a medical image. The plurality of body markers may be stored in a memory of an apparatus for generating a body marker. - In
operation 6200, the controller generates a second body marker by changing a shape of the first body marker according to a user input. For example, the controller may generate the second body marker by flipping the first body marker about a vertical axis according to the user input. As another example, the controller may generate the second body marker by rotating the first body marker about a central axis thereof according to the user input. Alternatively, the controller may generate the second body marker by rotating the first body marker in a clockwise or counterclockwise direction according to the user input. - In
operation 6300, a display displays the second body marker. The display may display the first body marker and the second body marker on a single screen. - As described above, according to one or more of the above exemplary embodiments, a body marker that corresponds to a current location or direction of an object shown in a medical image may be generated. Also, the user may spend less time selecting the body marker that corresponds to the current location or direction of the object from among prestored body markers.
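The shape changes of operation 6200 (flipping about a vertical or horizontal axis, and rotating clockwise or counterclockwise) may be sketched as array transforms, assuming, for illustration only, that a body marker is represented as a 2-D bitmap; the input strings and the function name are likewise assumptions introduced here.

```python
import numpy as np

def generate_second_body_marker(first_marker, user_input):
    """Derive a second body marker from the first by the shape
    changes described in operation 6200. The marker is modeled as a
    2-D bitmap; `user_input` names the requested transform."""
    if user_input == "flip_vertical_axis":       # mirror left-right
        return np.fliplr(first_marker)
    if user_input == "flip_horizontal_axis":     # mirror top-bottom
        return np.flipud(first_marker)
    if user_input == "rotate_clockwise":         # 90-degree clockwise turn
        return np.rot90(first_marker, k=-1)
    if user_input == "rotate_counterclockwise":  # 90-degree counterclockwise turn
        return np.rot90(first_marker, k=1)
    return first_marker                          # unrecognized input: no change
```

Rotation about the central axis of the marker, being a 3-D operation, is omitted from this 2-D sketch.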
- In addition, other exemplary embodiments can also be implemented through computer-readable code/instructions in/on a medium, e.g., a computer-readable medium, to control at least one processing element to implement any above described exemplary embodiment. The medium can correspond to any medium/media permitting the storage and/or transmission of the computer-readable code.
- The computer-readable code can be recorded/transferred on a medium in a variety of ways, with examples of the medium including recording media, such as magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical recording media (e.g., CD-ROMs, or DVDs), and transmission media such as Internet transmission media. Thus, the medium may be such a defined and measurable structure including or carrying a signal or information, such as a device carrying a bitstream according to one or more exemplary embodiments. The media may also be a distributed network, so that the computer-readable code is stored/transferred and executed in a distributed fashion. Furthermore, the processing element could include a processor or a computer processor, and processing elements may be distributed and/or included in a single device.
- It should be understood that the exemplary embodiments described herein should be considered in a descriptive sense only and not for purposes of limitation. Descriptions of features or aspects within each exemplary embodiment should typically be considered as available for other similar features or aspects in other exemplary embodiments.
- While one or more exemplary embodiments have been described with reference to the figures, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope as defined by the following claims.
Claims (15)
- A method of generating a body marker, the method comprising:
  selecting a first body marker from among a plurality of prestored body markers based on an object shown in a medical image;
  generating a second body marker by modifying the first body marker according to a user input; and
  displaying the second body marker.
- The method of claim 1, wherein the second body marker comprises a body marker generated by flipping the first body marker about a vertical axis.
- The method of claim 1, wherein the second body marker comprises a body marker generated by flipping the first body marker about a horizontal axis.
- The method of claim 1, wherein the second body marker comprises a body marker generated by rotating the first body marker about a central axis of the first body marker.
- The method of claim 1, wherein the second body marker comprises a body marker generated by rotating the first body marker in a clockwise direction or a counterclockwise direction.
- The method of claim 1, wherein the selecting comprises receiving a user input for selecting the first body marker that corresponds to the object from among a plurality of prestored body markers, and selecting the first body marker based on the received user input.
- The method of claim 1, wherein the selecting comprises selecting a portion of the object shown in the medical image, and selecting the first body marker based on the selected portion of the object.
- A non-transitory computer-readable recording medium having recorded thereon a program, which, when executed by a computer, performs the method of claim 1.
- An apparatus for generating a body marker, the apparatus comprising:
  a display displaying a medical image showing an object; and
  a controller selecting a first body marker from among a plurality of prestored body markers based on the object, and generating a second body marker by modifying a shape of the first body marker according to a user input,
  wherein the display displays the second body marker.
- The apparatus of claim 9, wherein the second body marker comprises a body marker generated by flipping the first body marker about a vertical axis.
- The apparatus of claim 9, wherein the second body marker comprises a body marker generated by flipping the first body marker about a horizontal axis.
- The apparatus of claim 9, wherein the second body marker comprises a body marker generated by rotating the first body marker about a central axis of the first body marker.
- The apparatus of claim 9, wherein the second body marker comprises a body marker generated by rotating the first body marker in a clockwise direction or a counterclockwise direction.
- The apparatus of claim 9, further comprising an input unit receiving a user input for selecting the first body marker that corresponds to the object from among a plurality of prestored body markers, and
wherein the controller selects the first body marker based on the received user input. - The apparatus of claim 9, further comprising an image processor selecting a portion of the object from the medical image, and
wherein the controller selects the first body marker based on the selected portion of the object.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140180499A KR102388132B1 (en) | 2014-12-15 | 2014-12-15 | Method, apparatus and system for generating a body marker which indicates an object |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3034005A1 true EP3034005A1 (en) | 2016-06-22 |
EP3034005B1 EP3034005B1 (en) | 2021-10-20 |
Family
ID=53264448
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP15163220.5A Active EP3034005B1 (en) | 2014-12-15 | 2015-04-10 | Method, apparatus and system for generating body marker indicating object |
Country Status (4)
Country | Link |
---|---|
US (1) | US10768797B2 (en) |
EP (1) | EP3034005B1 (en) |
KR (1) | KR102388132B1 (en) |
CN (1) | CN105686848A (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109922737A (en) * | 2016-11-11 | 2019-06-21 | 皇家飞利浦有限公司 | Imaging device and associated equipment, system and method in wireless lumen |
US20180189992A1 (en) * | 2017-01-04 | 2018-07-05 | Clarius Mobile Health Corp. | Systems and methods for generating an ultrasound multimedia product |
CN107693047A (en) * | 2017-10-18 | 2018-02-16 | 飞依诺科技(苏州)有限公司 | Based on the body mark method to set up symmetrically organized and system in ultrasonic imaging |
JP7171291B2 (en) * | 2018-07-26 | 2022-11-15 | キヤノンメディカルシステムズ株式会社 | Ultrasound diagnostic equipment and image processing program |
USD947220S1 (en) * | 2018-08-22 | 2022-03-29 | Sonivate Medical, Inc. | Display screen with a graphical user interface for an ultrasound system |
CN109567861B (en) * | 2018-10-25 | 2022-06-07 | 中国医学科学院北京协和医院 | Ultrasound imaging method and related apparatus |
WO2021024754A1 (en) * | 2019-08-02 | 2021-02-11 | ソニー株式会社 | Endoscope system, control device, and control method |
CN111603194B (en) * | 2020-06-02 | 2023-09-19 | 上海联影医疗科技股份有限公司 | Method, system and computer readable storage medium for breast tomogram display |
US11579968B2 (en) | 2020-08-26 | 2023-02-14 | Micron Technology, Inc. | Efficient management of failed memory blocks in memory sub-systems |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000175910A (en) * | 1998-12-21 | 2000-06-27 | Toshiba Iyo System Engineering Kk | Ultrasonograph |
US20110208052A1 (en) * | 2008-11-06 | 2011-08-25 | Koninklijke Philips Electronics N.V. | Breast ultrasound annotation user interface |
US20140104311A1 (en) * | 2012-10-12 | 2014-04-17 | Infinitt Healthcare Co., Ltd. | Medical image display method using virtual patient model and apparatus thereof |
EP2783635A1 (en) * | 2013-03-28 | 2014-10-01 | Samsung Medison Co., Ltd. | Ultrasound system and method of providing direction information of object |
US20140324475A1 (en) * | 2012-08-31 | 2014-10-30 | Kabushiki Kaisha Toshiba | Medical reading report preparing apparatus and medical image diagnostic apparatus |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS56112226A (en) * | 1980-02-12 | 1981-09-04 | Tokyo Shibaura Electric Co | Ultrasonic blood stream measuring apparatus |
US5367498A (en) * | 1990-07-11 | 1994-11-22 | Yoshida Takashi | Lateral direction detection sonar |
JP3091503B2 (en) * | 1991-03-22 | 2000-09-25 | 株式会社日立メディコ | Ultrasonic tomograph |
US20020173721A1 (en) * | 1999-08-20 | 2002-11-21 | Novasonics, Inc. | User interface for handheld imaging devices |
US7615008B2 (en) * | 2000-11-24 | 2009-11-10 | U-Systems, Inc. | Processing and displaying breast ultrasound information |
US6675038B2 (en) * | 2001-05-14 | 2004-01-06 | U-Systems, Inc. | Method and system for recording probe position during breast ultrasound scan |
GB2395880B (en) * | 2002-11-27 | 2005-02-02 | Voxar Ltd | Curved multi-planar reformatting of three-dimensional volume data sets |
US6991605B2 (en) * | 2002-12-18 | 2006-01-31 | Siemens Medical Solutions Usa, Inc. | Three-dimensional pictograms for use with medical images |
DE10323008A1 (en) * | 2003-05-21 | 2004-12-23 | Siemens Ag | Automatic fusion of 2D fluoroscopic C-frame X-ray images with preoperative 3D images using navigation markers, by use of a projection matrix based on a 2D fluoroscopy image and a defined reference navigation system |
CA2576646C (en) * | 2003-07-29 | 2011-02-08 | Ntd Laboratories, Inc. | System and method for assessing fetal abnormality based on landmarks |
JP2008515519A (en) * | 2004-10-08 | 2008-05-15 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Ultrasound imaging system using body marker annotation |
US8591420B2 (en) * | 2006-12-28 | 2013-11-26 | Kabushiki Kaisha Toshiba | Ultrasound imaging apparatus and method for acquiring ultrasound image |
JP5022716B2 (en) * | 2007-01-24 | 2012-09-12 | 株式会社東芝 | Ultrasonic diagnostic apparatus and control program for ultrasonic diagnostic apparatus |
EP2255730A4 (en) | 2008-03-03 | 2014-12-10 | Konica Minolta Inc | Ultrasonograph |
JP2009297072A (en) * | 2008-06-10 | 2009-12-24 | Toshiba Corp | Ultrasonic diagnostic apparatus and medical image processing apparatus |
CN101477435A (en) * | 2008-12-26 | 2009-07-08 | 明基电通有限公司 | Image operation method and its portable electronic device |
KR101182880B1 (en) * | 2009-01-28 | 2012-09-13 | 삼성메디슨 주식회사 | Ultrasound system and method for providing image indicator |
US20160066887A1 (en) * | 2009-01-28 | 2016-03-10 | Samsung Medison Co., Ltd. | Image indicator provision in ultrasound system |
EP2387949A1 (en) * | 2010-05-17 | 2011-11-23 | Samsung Medison Co., Ltd. | Ultrasound system for measuring image using figure template and method for operating ultrasound system |
JP5269851B2 (en) * | 2010-09-27 | 2013-08-21 | 富士フイルム株式会社 | Image editing apparatus, image editing method and program thereof |
WO2012040827A2 (en) * | 2010-10-01 | 2012-04-05 | Smart Technologies Ulc | Interactive input system having a 3d input space |
US9146674B2 (en) * | 2010-11-23 | 2015-09-29 | Sectra Ab | GUI controls with movable touch-control objects for alternate interactions |
US20130137988A1 (en) * | 2011-11-28 | 2013-05-30 | Samsung Electronics Co., Ltd. | Method and Apparatus for the Augmentation of Physical Examination over Medical Imaging Data |
KR20130107882A (en) | 2012-03-23 | 2013-10-02 | 삼성메디슨 주식회사 | Apparatus and method for providing data to user of ultrasound device |
KR101446780B1 (en) * | 2012-06-01 | 2014-10-01 | 삼성메디슨 주식회사 | The method and apparatus for displaying an ultrasound image and an information related the image |
JP6012288B2 (en) * | 2012-06-27 | 2016-10-25 | 株式会社日立製作所 | Ultrasonic diagnostic equipment |
JP2014064637A (en) * | 2012-09-25 | 2014-04-17 | Fujifilm Corp | Ultrasonic diagnostic device |
EP2719336B1 (en) * | 2012-10-12 | 2021-03-17 | Samsung Medison Co., Ltd. | Method for displaying ultrasound image using doppler data and ultrasound medical apparatus thereto |
JP6173686B2 (en) | 2012-12-25 | 2017-08-02 | 東芝メディカルシステムズ株式会社 | Ultrasonic diagnostic equipment |
JP6011378B2 (en) | 2013-02-05 | 2016-10-19 | コニカミノルタ株式会社 | Ultrasound diagnostic imaging equipment |
2014
- 2014-12-15 KR KR1020140180499A patent/KR102388132B1/en active IP Right Grant
2015
- 2015-04-10 EP EP15163220.5A patent/EP3034005B1/en active Active
- 2015-05-18 US US14/714,735 patent/US10768797B2/en active Active
- 2015-07-29 CN CN201510455280.XA patent/CN105686848A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
KR102388132B1 (en) | 2022-04-19 |
KR20160072618A (en) | 2016-06-23 |
EP3034005B1 (en) | 2021-10-20 |
US20160170618A1 (en) | 2016-06-16 |
US10768797B2 (en) | 2020-09-08 |
CN105686848A (en) | 2016-06-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3034005B1 (en) | Method, apparatus and system for generating body marker indicating object | |
US10861161B2 (en) | Method and apparatus for displaying image showing object | |
US10809878B2 (en) | Method and apparatus for displaying ultrasound image | |
EP3132749A1 (en) | Ultrasound diagnosis apparatus for analyzing plaque and method of operating the same | |
US10163228B2 (en) | Medical imaging apparatus and method of operating same | |
US11033247B2 (en) | Ultrasound system and method of providing guide for improved HPRF doppler image | |
US20160054901A1 (en) | Method, apparatus, and system for outputting medical image representing object and keyboard image | |
EP2892024A2 (en) | Method and medical imaging apparatus for displaying medical images | |
EP3040030B1 (en) | Ultrasound image providing apparatus and method | |
US20160066887A1 (en) | Image indicator provision in ultrasound system | |
US20160157829A1 (en) | Medical imaging apparatus and method of generating medical image | |
US10441249B2 (en) | Ultrasound diagnosis apparatus and method of operating the same | |
US10849599B2 (en) | Method and apparatus for generating body marker | |
US10383599B2 (en) | Ultrasound diagnostic apparatus, operating method thereof, and computer-readable recording medium | |
KR102416511B1 (en) | Method and apparatus for generating a body marker | |
US11291429B2 (en) | Medical imaging apparatus and method of generating medical image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20161214 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20191129 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: A61B 8/08 20060101ALI20210429BHEP Ipc: G06F 3/0484 20130101ALI20210429BHEP Ipc: G06F 3/0482 20130101ALI20210429BHEP Ipc: A61B 8/00 20060101AFI20210429BHEP |
|
INTG | Intention to grant announced |
Effective date: 20210527 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602015074208 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 1439251 Country of ref document: AT Kind code of ref document: T Effective date: 20211115 |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20211020 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1439251 Country of ref document: AT Kind code of ref document: T Effective date: 20211020 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220120 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220220 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220221 Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220120 Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220121 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602015074208 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20220721 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20220410 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20220430 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220410 Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220430 Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220410 Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220430 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220430 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220410 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20150410 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: IT Payment date: 20240307 Year of fee payment: 10 Ref country code: FR Payment date: 20240306 Year of fee payment: 10 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211020 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20240305 Year of fee payment: 10 |