JP6538130B2 - Image processing apparatus and program - Google Patents


Info

Publication number
JP6538130B2
Authority
JP
Japan
Prior art keywords
image data
control unit
display control
imaging
probe mark
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2017168796A
Other languages
Japanese (ja)
Other versions
JP2017225850A (en)
Inventor
治郎 樋口
尚之 中沢
高橋 正美
平間 信
和俊 貞光
正敏 西野
紀久 菊地
篤司 鷲見
文康 坂口
淳 中井
義徳 後藤
洋一 小笠原
Original Assignee
キヤノンメディカルシステムズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by キヤノンメディカルシステムズ株式会社
Priority to JP2017168796A
Publication of JP2017225850A
Application granted
Publication of JP6538130B2
Legal status: Active
Anticipated expiration


Description

  Embodiments of the present invention relate to an image processing apparatus and a program.

  Conventionally, in an ultrasound diagnostic imaging apparatus, imaging may be performed at a plurality of imaging positions in one examination. For example, in a health check, the ultrasound diagnostic imaging apparatus performs imaging at a plurality of imaging positions, such as the liver and the kidney, as part of an examination of the abdomen.

JP 2011-136044 A

  The problem to be solved by the present invention is to provide an image processing apparatus and a program capable of easily determining the correspondence between ultrasound image data captured in an examination and the imaging position of that ultrasound image data.

An image processing apparatus according to an embodiment includes a storage unit and a display control unit. The storage unit stores ultrasound image data in association with imaging position information in which a probe mark is arranged on a body mark at the imaging position where the ultrasound image data was captured. The display control unit receives a display request for the ultrasound image data and displays the ultrasound image data and the imaging position information on a display unit. When the display control unit receives from the operator an operation indicating that viewing of the ultrasound image data has been completed, the display control unit displays information indicating that viewing of the ultrasound image data has been completed, together with the ultrasound image data and the imaging position information.

FIG. 1 is a diagram for explaining the overall configuration of the ultrasound diagnostic imaging apparatus according to the first embodiment. FIG. 2 is a diagram for explaining the operation unit according to the first embodiment. FIG. 3 is a diagram (part 1) for explaining the first embodiment. FIG. 4 is a flowchart showing the processing procedure of the ultrasound diagnostic imaging apparatus according to the first embodiment. FIG. 5 is a diagram (part 2) for explaining the first embodiment. FIG. 6 is a diagram (part 3) for explaining the first embodiment. FIG. 7 is a diagram (part 4) for explaining the first embodiment. FIG. 8 is a diagram (part 5) for explaining the first embodiment. FIG. 9 is a diagram (part 6) for explaining the first embodiment. FIG. 10 is a diagram (part 7) for explaining the first embodiment. FIG. 11 is a diagram (part 8) for explaining the first embodiment. FIG. 12 is a diagram (part 9) for explaining the first embodiment. FIG. 13 is a diagram (part 10) for explaining the first embodiment. FIG. 14 is a diagram (part 11) for explaining the first embodiment. FIG. 15 is a diagram (part 1) for explaining a modification of the first embodiment. FIG. 16 is a diagram (part 2) for explaining a modification of the first embodiment. FIG. 17 is a diagram (part 3) for explaining a modification of the first embodiment. FIG. 18 is a diagram (part 4) for explaining a modification of the first embodiment. FIG. 19 is a diagram for explaining the overall configuration of the image processing apparatus according to the second embodiment. FIG. 20 is a flowchart showing the processing procedure of the image processing apparatus according to the second embodiment. FIG. 21 is a diagram (part 1) for explaining the second embodiment. FIG. 22 is a diagram (part 2) for explaining the second embodiment. FIG. 23 is a diagram (part 1) for explaining a modification of the second embodiment. FIG. 24 is a diagram (part 2) for explaining a modification of the second embodiment.

  Hereinafter, an image processing apparatus and a program according to an embodiment will be described with reference to the drawings.

First Embodiment
First, the configuration of the ultrasound diagnostic imaging apparatus 100 according to the first embodiment will be described. FIG. 1 is a diagram for explaining the overall configuration of the ultrasound diagnostic imaging apparatus 100 according to the first embodiment. As shown in FIG. 1, the ultrasound diagnostic imaging apparatus 100 according to the present embodiment includes an ultrasound probe 1, a monitor 2, an operation unit 3, and an apparatus main body 10.

  The ultrasonic probe 1 has a plurality of piezoelectric vibrators, and the plurality of piezoelectric vibrators generate ultrasonic waves based on a drive signal supplied from a transmission unit 11 of an apparatus main body 10 described later. In addition, the ultrasonic probe 1 receives a reflected wave from the subject P and converts it into an electric signal. In addition, the ultrasonic probe 1 includes a matching layer and an acoustic lens provided in the piezoelectric vibrator, and a backing material that prevents the propagation of ultrasonic waves from the piezoelectric vibrator to the rear. The ultrasonic probe 1 is detachably connected to the apparatus main body 10.

  When an ultrasonic wave is transmitted from the ultrasonic probe 1 to the subject P, the transmitted ultrasonic wave is successively reflected by discontinuous surfaces of acoustic impedance in the body tissue of the subject P and is received as a reflected wave signal by the plurality of piezoelectric vibrators of the ultrasonic probe 1. The amplitude of the received reflected wave signal depends on the difference in acoustic impedance at the discontinuous surface where the ultrasonic wave is reflected. When the transmitted ultrasonic pulse is reflected by a moving surface such as blood flow or the heart wall, the reflected wave signal undergoes a frequency shift (Doppler shift) that depends, through the Doppler effect, on the velocity component of the moving body with respect to the ultrasonic transmission direction.

  The present embodiment is applicable both when the subject P is scanned in two dimensions with the ultrasonic probe 1 being a one-dimensional ultrasonic probe in which a plurality of piezoelectric vibrators are arranged in a line, and when the subject P is scanned in three dimensions with the ultrasonic probe 1 being a probe that mechanically swings the plurality of piezoelectric vibrators of a one-dimensional ultrasonic probe, or a two-dimensional ultrasonic probe in which a plurality of piezoelectric vibrators are arranged two-dimensionally in a grid.

  The monitor 2 displays a graphical user interface (GUI) that allows the operator of the ultrasound diagnostic imaging apparatus 100 to input various setting requests using the operation unit 3, and displays ultrasound images generated in the apparatus main body 10.

  The operation unit 3 displays a GUI for receiving input of various instructions from the operator. The various instructions accepted by the operation unit 3 include the setting of scanning condition parameters and the setting of the imaging plan described in detail below. FIG. 2 is a diagram for explaining the operation unit 3 according to the first embodiment.

  As shown in FIG. 2, a TCS (Touch Command Screen) 3a capable of displaying a GUI is disposed in the operation unit 3. The TCS 3a can display a GUI independently of the monitor 2. For this reason, the operator can set and change, for example, scanning condition parameters by operating the GUI displayed on the TCS 3a while checking an ultrasound image displayed on the monitor 2.

  Although not illustrated in FIG. 2, hardware operation devices are also disposed in the operation unit 3, and the operator inputs various instructions by operating operation devices to which functions linked with the GUI are assigned while looking at the GUI displayed on the TCS 3a. The operation devices are, for example, a trackball, a changeover switch, a button switch, a toggle switch, a keyboard, and a pedal switch.

  Further, the TCS 3a may display software switches as part of the GUI and may receive input by touches on the software switches. In this case, the operator inputs various instructions by directly touching the GUI displayed on the TCS 3a. Functions are assigned to the software switches and to the hardware operation devices by an operator, a service person, or the like.

  The operation unit 3 also receives various instructions from the operator of the ultrasound diagnostic imaging apparatus 100 and transfers the received settings and instructions to the apparatus main body 10. Although FIG. 2 illustrates the case where the operation unit 3 is integrated with the apparatus main body 10, the operation unit 3 may be provided as a portable device separate from the apparatus main body 10.

  The apparatus main body 10 is a device that generates an ultrasound image based on the reflected waves received by the ultrasonic probe 1. As shown in FIG. 1, the apparatus main body 10 includes a transmission unit 11, a reception unit 12, a B-mode processing unit 13, a Doppler processing unit 14, an image generation unit 15, a display control unit 16, an internal storage unit 17, an image memory 18, and a control unit 19.

  The transmission unit 11 includes a trigger generation circuit, a transmission delay circuit, a pulser circuit, and the like, and supplies a drive signal to the ultrasonic probe 1. The pulser circuit repeatedly generates rate pulses for forming transmission ultrasonic waves at a predetermined rate frequency. The transmission delay circuit applies, to each rate pulse generated by the pulser circuit, the delay time of each piezoelectric vibrator required to focus the ultrasonic waves generated from the ultrasonic probe 1 into a beam and to determine the transmission directivity. The trigger generation circuit applies a drive signal (drive pulse) to the ultrasonic probe 1 at a timing based on the rate pulse. That is, the transmission delay circuit arbitrarily adjusts the transmission direction from the surface of the piezoelectric vibrators by changing the delay time given to each rate pulse.

  The transmission unit 11 has a function capable of instantaneously changing the transmission frequency, the transmission drive voltage, and the like in order to execute a predetermined scan sequence based on an instruction from the control unit 19 described later. In particular, the change of the transmission drive voltage is realized by a linear-amplifier-type transmission circuit capable of switching its value instantaneously or by a mechanism that electrically switches among a plurality of power supply units.

  The reception unit 12 includes an amplifier circuit, an A/D converter, an adder, and the like, and performs various kinds of processing on the reflected wave signals received by the ultrasonic probe 1 to generate reflected wave data. The amplifier circuit amplifies the reflected wave signal for each channel and performs gain correction processing. The A/D converter A/D-converts the gain-corrected reflected wave signals and gives the digital data the delay times necessary to determine the reception directivity. The adder adds the reflected wave signals processed by the A/D converter to generate reflected wave data. The addition processing of the adder emphasizes the reflection components from the direction corresponding to the reception directivity of the reflected wave signals.
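  The reception unit is described above only at the circuit level (per-channel amplification, A/D conversion, delay, and addition). As a rough, hedged illustration of how per-channel delays followed by summation emphasize echoes from one direction, the following NumPy sketch shows plain delay-and-sum beamforming; the element pitch, sampling rate, sound speed, and plane-wave steering model are assumptions for illustration and are not taken from the patent.

```python
import numpy as np

def delay_and_sum(rf, pitch=0.3e-3, fs=40e6, c=1540.0, angle_deg=0.0):
    """Sum per-channel RF signals after applying steering delays.

    rf        : array of shape (n_channels, n_samples), one row per element
    pitch     : element spacing in meters (assumed value)
    fs        : sampling frequency in Hz (assumed value)
    c         : speed of sound in tissue in m/s
    angle_deg : steering angle; echoes from this direction are emphasized
    """
    n_ch, n_samp = rf.shape
    # Element positions along the array, centered at 0.
    x = (np.arange(n_ch) - (n_ch - 1) / 2.0) * pitch
    # Plane-wave steering delay for each element, converted to samples.
    tau = x * np.sin(np.deg2rad(angle_deg)) / c
    shifts = np.round(tau * fs).astype(int)
    # Apply each channel's delay, then add: the adder emphasizes the
    # reflection component arriving from the chosen direction.
    # (Wrap-around edge effects of np.roll are ignored for brevity.)
    out = np.zeros(n_samp)
    for ch in range(n_ch):
        out += np.roll(rf[ch], -shifts[ch])
    return out / n_ch

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rf = rng.standard_normal((64, 2048))        # stand-in for received echoes
    line = delay_and_sum(rf, angle_deg=10.0)     # one beamformed scan line
    print(line.shape)
```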

  Thus, the transmission unit 11 and the reception unit 12 control transmission directivity and reception directivity in transmission and reception of ultrasonic waves.

  Here, when the ultrasonic probe 1 is capable of three-dimensional scanning, the transmission unit 11 can cause the ultrasonic probe 1 to transmit a three-dimensional ultrasonic beam to the subject P, and the reception unit 12 can generate three-dimensional reflected wave data from the three-dimensional reflected wave signals received by the ultrasonic probe 1.

  The B-mode processing unit 13 receives the reflected wave data from the reception unit 12, performs logarithmic amplification, envelope detection processing, and the like, and generates data ("B-mode data") in which the signal intensity is represented by brightness.

  The Doppler processing unit 14 performs frequency analysis of velocity information on the reflected wave data received from the reception unit 12, extracts blood flow, tissue, and contrast agent echo components due to the Doppler effect, and generates data ("Doppler data") in which moving body information such as average velocity, variance, and power is extracted at multiple points. Specifically, the Doppler processing unit 14 generates Doppler data in which the average velocity, the variance value, the power value, and the like are extracted at multiple points as the motion information of the moving body. More specifically, the Doppler processing unit 14 generates color Doppler data for generating a color Doppler image indicating the movement of blood flow, and tissue Doppler data for generating a tissue Doppler image indicating the movement of tissue.
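  The patent does not state how the average velocity, variance, and power are estimated from the reflected wave data. A common choice in color Doppler processing is the lag-one autocorrelation (Kasai) estimator, sketched below under that assumption; the IQ ensemble layout, pulse repetition frequency, and center frequency are illustrative.

```python
import numpy as np

def doppler_estimates(iq, prf=4000.0, f0=5e6, c=1540.0):
    """Estimate per-sample moving-body information from an IQ ensemble.

    iq  : complex array of shape (n_pulses, n_samples); demodulated echoes
          from repeated transmissions along one scan line (assumed input).
    prf : pulse repetition frequency in Hz (assumed value).
    f0  : transmit center frequency in Hz (assumed value).
    Returns (mean velocity [m/s], variance estimate, power) over the samples.
    """
    # Lag-one autocorrelation across the pulse (slow-time) dimension.
    r1 = np.mean(iq[1:] * np.conj(iq[:-1]), axis=0)
    r0 = np.mean(np.abs(iq) ** 2, axis=0)
    # Mean Doppler frequency from the phase of R(1); convert to velocity.
    f_d = np.angle(r1) * prf / (2.0 * np.pi)
    velocity = f_d * c / (2.0 * f0)
    # Normalized spectral-width (variance) estimate and echo power.
    variance = 1.0 - np.abs(r1) / np.maximum(r0, 1e-12)
    power = r0
    return velocity, variance, power

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    iq = rng.standard_normal((8, 256)) + 1j * rng.standard_normal((8, 256))
    v, var, p = doppler_estimates(iq)
    print(v.shape, var.shape, p.shape)
```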

  The B-mode processing unit 13 and the Doppler processing unit 14 according to the present embodiment can process both two-dimensional reflected wave data and three-dimensional reflected wave data. That is, the B-mode processing unit 13 can also generate two-dimensional B-mode data from two-dimensional reflected wave data, and generate three-dimensional B-mode data from three-dimensional reflected wave data. The Doppler processing unit 14 can also generate two-dimensional Doppler data from two-dimensional reflected wave data, and generate three-dimensional Doppler data from three-dimensional reflected wave data.

  The image generation unit 15 generates an ultrasound image for display using the data generated by the B-mode processing unit 13 and the Doppler processing unit 14. That is, the image generation unit 15 generates, from the B-mode data generated by the B-mode processing unit 13, a B-mode image in which the intensity of the reflected wave is represented by luminance. In addition, from the Doppler data generated by the Doppler processing unit 14, the image generation unit 15 generates a Doppler image (a color Doppler image or a tissue Doppler image) representing moving body information (blood flow motion information or tissue motion information) as an average velocity image, a variance image, a power image, or a combination image of these.

  Here, in general, the image generation unit 15 converts (scan-converts) a scanning line signal sequence of the ultrasonic scan into a scanning line signal sequence of a video format typified by television, and generates an ultrasound image as a display image. Specifically, the image generation unit 15 performs coordinate conversion in accordance with the ultrasonic scanning form of the ultrasonic probe 1 to generate the ultrasound image as a display image. That is, the data generated by the B-mode processing unit 13 and the Doppler processing unit 14 is ultrasound image data before the scan conversion processing, and the data generated by the image generation unit 15 is ultrasound image data for display after the scan conversion processing. The data generated by the B-mode processing unit 13 and the Doppler processing unit 14 is also referred to as raw data.
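  As a hedged sketch of the scan conversion described above, the snippet below back-maps each pixel of a Cartesian display grid to a (depth, beam angle) sample of a sector scan and copies the nearest sample. The sector geometry and grid size are assumptions, and a practical implementation would normally interpolate rather than use nearest-neighbour lookup.

```python
import numpy as np

def scan_convert(lines, depth=0.15, sector_deg=60.0, out_px=400):
    """Convert sector-scan data (n_beams, n_samples) to a Cartesian image.

    lines      : sample values per beam; beam 0 is the leftmost scan line
    depth      : imaging depth in meters (assumed value)
    sector_deg : total opening angle of the sector (assumed value)
    out_px     : width/height of the output image in pixels
    """
    n_beams, n_samples = lines.shape
    half = np.deg2rad(sector_deg) / 2.0
    # Cartesian pixel grid: x spans the sector width, z spans the depth.
    x = np.linspace(-depth * np.sin(half), depth * np.sin(half), out_px)
    z = np.linspace(0.0, depth, out_px)
    xx, zz = np.meshgrid(x, z)
    # Back-map every pixel to (radius, angle) in the acquisition geometry.
    r = np.sqrt(xx ** 2 + zz ** 2)
    theta = np.arctan2(xx, zz)
    # Convert to fractional beam/sample indices and look up nearest sample.
    beam_idx = np.round((theta + half) / (2 * half) * (n_beams - 1)).astype(int)
    samp_idx = np.round(r / depth * (n_samples - 1)).astype(int)
    inside = (np.abs(theta) <= half) & (r <= depth)
    image = np.zeros((out_px, out_px))
    image[inside] = lines[beam_idx[inside], samp_idx[inside]]
    return image

if __name__ == "__main__":
    demo = np.tile(np.linspace(0, 1, 512), (128, 1))   # fake scan-line data
    img = scan_convert(demo)
    print(img.shape)
```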

  In addition to the scan conversion, the image generation unit 15 performs various kinds of image processing, for example, image processing (smoothing processing) that regenerates an average-value image of brightness using a plurality of image frames after scan conversion, and image processing (edge enhancement processing) using a differential filter within an image. Further, the image generation unit 15 generates a composite image in which character information of various parameters, a scale, and the like are combined with the ultrasound image data.

  Furthermore, the image generation unit 15 performs coordinate conversion on the three-dimensional B-mode data generated by the B-mode processing unit 13 to generate three-dimensional B-mode image data. Further, the image generation unit 15 performs coordinate conversion on the three-dimensional Doppler data generated by the Doppler processing unit 14 to generate three-dimensional Doppler image data. The image generation unit 15 generates “three-dimensional B-mode image data and three-dimensional Doppler image data” as “three-dimensional ultrasound image data (volume data)”.

  Further, the image generation unit 15 performs rendering processing on the volume data in order to generate various kinds of two-dimensional image data for displaying the volume data on the monitor 2. The rendering processing performed by the image generation unit 15 includes, for example, processing of generating MPR image data from the volume data by a multi-planar reconstruction method (MPR: Multi Planar Reconstruction). The rendering processing performed by the image generation unit 15 also includes, for example, volume rendering (VR) processing that generates two-dimensional image data reflecting three-dimensional information.

  Further, the image generation unit 15 is equipped with a storage memory for storing image data and is capable of performing three-dimensional image reconstruction processing and the like. In addition, after diagnosis, for example, the operator can call up an image recorded during an examination from the storage memory of the image generation unit 15.

  The display control unit 16 includes a first display control unit 16a and a second display control unit 16b, and causes the monitor 2 and the TCS 3a of the operation unit 3 to display various types of information for supporting imaging by the operator. The details of the first display control unit 16a and the second display control unit 16b will be described later.

  The internal storage unit 17 stores various data such as a control program for performing ultrasonic wave transmission / reception, image processing and display processing, diagnostic information (for example, patient ID, doctor's finding, etc.), diagnostic protocol, various body marks and the like. The internal storage unit 17 is also used, for example, to store images stored in the image memory 18 as needed. Also, data stored in the internal storage unit 17 can be transferred to an external peripheral device via an interface (not shown).

  The image memory 18 stores the ultrasonic image and the composite image generated by the image generation unit 15. The image memory 18 can also store data (raw data) generated by the B-mode processing unit 13 and the Doppler processing unit 14. The image memory 18 can also store data subjected to various image processing by the display control unit 16.

  The control unit 19 controls the entire processing of the ultrasound diagnostic imaging apparatus 100. Specifically, based on various setting requests such as the imaging plan input by the operator via the operation unit 3 and on various control programs and various data read from the internal storage unit 17, the control unit 19 controls the processing of the transmission unit 11, the reception unit 12, the B-mode processing unit 13, the Doppler processing unit 14, the image generation unit 15, and the display control unit 16. The control unit 19 also controls the monitor 2 and the TCS 3a of the operation unit 3 so as to display the ultrasound images and composite images stored in the image memory 18 and the GUI with which the operator specifies various kinds of processing.

  The overall configuration of the ultrasound diagnostic imaging apparatus 100 according to the present embodiment has been described above. In such a configuration, the ultrasound diagnostic imaging apparatus 100 according to the present embodiment captures an ultrasound image by ultrasound transmission / reception. Here, the operator, for example, sets various conditions using the operation unit 3, applies the ultrasonic probe 1 to the subject P, and transmits and receives ultrasonic waves. Then, when the apparatus main body 10 generates ultrasound image data, the operator can view the ultrasound image displayed on the monitor 2.

  In imaging by the ultrasound diagnostic imaging apparatus 100, the first display control unit 16a supports imaging by the operator by causing the TCS 3a to display initial-value imaging positions (hereinafter, "preset information"). Here, the preset information will be described first. The ultrasound diagnostic imaging apparatus 100 may perform imaging at a plurality of imaging positions in one examination. For example, in one health check, the ultrasound diagnostic imaging apparatus 100 may perform imaging at three positions in the abdomen, and in another health check, it may perform imaging at two positions in the abdomen and one position in the chest. In order to assist the operator in imaging at such a plurality of imaging positions, the internal storage unit 17 of the ultrasound diagnostic imaging apparatus 100 stores in advance, as "preset information", a plurality of patterns of information sets each indicating the plurality of imaging positions and the position and orientation of the ultrasonic probe at each imaging position. The internal storage unit 17 stores each piece of preset information in association with an identifier. Before starting imaging, the operator selects, as an imaging plan, the preset information corresponding to the combination of the examination and the imaging positions to be imaged from the plurality of patterns of preset information, and inputs the identifier of the selected preset information via the operation unit 3.
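  A minimal sketch of how the preset information described above might be organized: each pattern holds, under an identifier, the probe position and orientation for every scheduled imaging position, and the imaging plan is simply a lookup by that identifier. The field names and example coordinates are illustrative, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class ProbeMark:
    """Position and orientation of the ultrasonic probe at one imaging position."""
    position: tuple          # (x, y) on the body mark, normalized coordinates
    orientation_deg: float   # rotation of the probe mark
    order: int               # imaging order shown next to the mark

@dataclass
class PresetInfo:
    """One pattern of preset information, selectable as an imaging plan."""
    preset_id: str
    body_mark: str                        # which body mark to display
    probe_marks: list = field(default_factory=list)

# Illustrative preset: an abdominal examination with three imaging positions.
PRESETS = {
    "abdomen-3": PresetInfo(
        preset_id="abdomen-3",
        body_mark="abdomen",
        probe_marks=[
            ProbeMark(position=(0.35, 0.40), orientation_deg=0.0,  order=1),
            ProbeMark(position=(0.55, 0.45), orientation_deg=30.0, order=2),
            ProbeMark(position=(0.60, 0.65), orientation_deg=90.0, order=3),
        ],
    ),
}

def select_plan(preset_id: str) -> PresetInfo:
    """Look up the preset information the operator selected as the imaging plan."""
    return PRESETS[preset_id]

if __name__ == "__main__":
    plan = select_plan("abdomen-3")
    print(plan.body_mark, len(plan.probe_marks))
```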

  Then, the first display control unit 16a causes the TCS 3a to display the preset information selected by the operator in the imaging plan. FIG. 3 is a diagram (part 1) for explaining the first embodiment. The example shown in FIG. 3 describes the case where preset information with three imaging positions is selected. As shown in FIG. 3, the first display control unit 16a causes the TCS 3a to display preset information in which the probe marks 51 to 53 are arranged on the body mark 50. Here, the body mark 50 is information schematically indicating the subject P, and the probe marks 51 to 53 are information for presenting to the operator the position and orientation of the ultrasonic probe 1 at each imaging position.

  Subsequently, the operator starts imaging. The operator applies the ultrasonic probe 1 to the imaging position of the subject P with reference to the preset information displayed on the TCS 3a, and selects, on the TCS 3a, the probe mark corresponding to the imaging position being imaged. Then, the second display control unit 16b causes the probe mark selected on the TCS 3a to be displayed at the corresponding position on the body mark 62 displayed on the monitor 2. FIG. 3 shows the case where the probe mark 51 is selected by the operator on the TCS 3a. In this case, the second display control unit 16b causes the probe mark 63 to be displayed on the body mark 62. The monitor 2 displays the image data 60 captured by the ultrasound diagnostic imaging apparatus 100 at the imaging position indicated by the probe mark 51. The probe mark selected on the TCS 3a remains selected as the probe mark of the imaging position being imaged until the end of imaging at the imaging position corresponding to that probe mark is received.

  When the second display control unit 16b receives the end of imaging at any of the imaging positions scheduled for imaging, it displays on the TCS 3a, on the probe mark corresponding to that imaging position, end information indicating that imaging at that imaging position has been completed. Because the ultrasound diagnostic imaging apparatus 100 supports imaging by the operator in this manner, the operator can avoid missing imaging at any of the imaging positions to be imaged in the examination.

  Next, details of the processing by the ultrasound diagnostic imaging apparatus 100 will be described using FIG. 4. FIG. 4 is a flowchart showing the processing procedure of the ultrasound diagnostic imaging apparatus 100 according to the first embodiment. Note that the ultrasound diagnostic imaging apparatus 100 executes the following processing in response to accepting the selection of preset information as an imaging plan. As shown in FIG. 4, the first display control unit 16a displays a body mark and probe marks as preset information on the TCS 3a of the operation unit 3 based on the imaging plan (step S101).

  FIG. 5 is a diagram (part 2) for explaining the first embodiment. FIG. 5 describes the case where preset information having three imaging positions scheduled for imaging in an examination is selected as the imaging plan. As shown in FIG. 5, the first display control unit 16a displays on the TCS 3a a body mark 50 and three probe marks (probe marks 51 to 53) superimposed on the body mark 50. Here, the body mark 50 is information schematically showing the subject P, and the probe marks 51 to 53 are information that presents to the operator the position and orientation of the ultrasonic probe 1 at each imaging position scheduled to be imaged.

  In addition, the first display control unit 16a further displays order information indicating the imaging order of the imaging positions to be imaged in the examination. For example, the first display control unit 16a displays the numbers (1) to (3) in the vicinity of the probe marks 51 to 53. These numbers are order information indicating the imaging order; in the example shown in FIG. 5, they indicate that the first imaging is performed at the imaging position indicated by the probe mark 51, the second imaging at the imaging position indicated by the probe mark 52, and the third imaging at the imaging position indicated by the probe mark 53. In this way, before the start of the examination, the first display control unit 16a displays on the TCS 3a, for each imaging position to be imaged in the examination, the probe marks 51 to 53 indicating the position and orientation of the ultrasonic probe 1 at that imaging position, at the corresponding positions on the body mark 50. Here, although the case where there are three imaging positions is described, the number of imaging positions can be changed arbitrarily.

  In addition, the first display control unit 16a further displays a probe mark 54 outside the body mark 50 so that it does not overlap the body mark 50. The probe mark 54 is a probe mark that is not included in the preset information selected at the time of imaging planning, and is a preliminary probe mark used when imaging is performed at a position other than the imaging positions displayed as the preset information. For example, when the operator, while checking the image data displayed on the monitor 2 during imaging, finds a region suspected of being a tumor or the like, additional imaging may be performed at a position other than the imaging positions scheduled at the time of imaging planning. When such additional imaging is performed at a position other than the imaging positions indicated by the preset information, the probe mark 54 displayed outside the body mark 50 is used. The first display control unit 16a further displays (4) as order information in the vicinity of the probe mark 54. Further, in the example illustrated in FIG. 5, the first display control unit 16a displays the probe marks 51 to 54 as white rectangular areas.

  Returning to FIG. 4, subsequent to step S101, the ultrasound diagnostic imaging apparatus 100 starts imaging (step S102). The ultrasound diagnostic imaging apparatus 100 thereby displays the captured ultrasound image on the monitor 2 in real time. Then, the second display control unit 16b determines whether a touch operation has been received on the TCS 3a (step S103). When it determines that a touch operation has not been received on the TCS 3a (No in step S103), the second display control unit 16b continues to determine whether a touch operation has been received on the TCS 3a. On the other hand, when the second display control unit 16b determines that a touch operation has been received on the TCS 3a (Yes in step S103), it determines whether the target of the touch operation is a probe mark corresponding to an imaging position scheduled to be imaged (step S104). When the second display control unit 16b determines that the target of the touch operation is a probe mark corresponding to an imaging position scheduled to be imaged (Yes in step S104), the process proceeds to step S107.

  On the other hand, when the second display control unit 16b does not determine that the target of the touch operation is a probe mark corresponding to an imaging position scheduled to be imaged (No in step S104), it accepts movement of the preliminary probe mark (step S105). In the example shown in FIG. 5, when performing additional imaging at a position other than the imaging positions indicated by the preset information, the operator selects the probe mark 54 displayed outside the body mark 50 on the TCS 3a and moves the probe mark 54 to the position on the body mark 50 corresponding to the position to be imaged. As a result, the second display control unit 16b adds the position of the probe mark 54 to the imaging positions scheduled for imaging (step S106). After step S106, the second display control unit 16b proceeds to step S103.

  Subsequently, in step S107, the second display control unit 16b accepts selection of a probe mark on the TCS 3a (step S107). When there is a probe mark that has been added to the imaging positions scheduled for imaging, the second display control unit 16b can also accept selection of the added probe mark. In the example illustrated in FIG. 5, when the operator directly touches any one of the probe marks 51 to 53 on the TCS 3a, the second display control unit 16b receives the selection of the touched probe mark.

  Then, the second display control unit 16b displays the probe mark selected on the TCS 3a on the monitor 2 (step S108). FIG. 6 is a diagram (part 3) for explaining the first embodiment. FIG. 6 shows the ultrasound image displayed on the monitor 2 when the second display control unit 16b receives the selection of the probe mark 51 shown in FIG. 5 on the TCS 3a. FIG. 6 shows the case where the monitor 2 is used by dividing the display area into a first display area 2a, a second display area 2b, and a third display area 2c. Further, FIG. 6 shows a case where the ultrasound diagnostic imaging apparatus 100 captures B-mode image data.

  As illustrated in FIG. 6, the second display control unit 16b displays an ultrasound image including the B-mode image 60 and the imaging position information 61 in the second display area 2b of the monitor 2. Here, the imaging position information 61 is information in which a probe mark associated with the imaging position being imaged is arranged on the body mark displayed on the monitor 2. For example, the imaging position information 61 includes a body mark 62 and a probe mark 63. In the imaging position information 61 displayed on the monitor 2, the body mark 62 corresponds to the body mark 50 displayed on the TCS 3a, and the probe mark 63 corresponds to the probe mark 51 displayed on the TCS 3a. That is, the imaging position information 61 displayed on the monitor 2 corresponds to the body mark 50 and, of the imaging positions scheduled for imaging displayed on the TCS 3a, the probe mark indicating the imaging position currently being imaged.

  Returning to FIG. 4, when imaging, the operator may have difficulty imaging at a planned imaging position depending on the physique of the subject P, and may change the position or orientation of the ultrasonic probe 1. In such a case, in order to record the changed imaging position, the operator performs an operation to reflect the changed position and orientation of the ultrasonic probe on the probe mark displayed on the TCS 3a. That is, the operator performs an operation to change the position or orientation of the probe mark corresponding to the imaging position being imaged among the probe marks displayed on the TCS 3a. Therefore, the second display control unit 16b determines whether an operation to change at least one of the position and orientation of the probe mark has been received on the TCS 3a (step S109). When it determines that an operation to change at least one of the position and orientation of the probe mark has not been received (No in step S109), the second display control unit 16b proceeds to step S112. On the other hand, when it determines that an operation to change at least one of the position and orientation of the probe mark has been received (Yes in step S109), the second display control unit 16b changes the position and/or orientation of the probe mark (step S110).

  FIG. 7 is a diagram (part 4) for explaining the first embodiment. FIG. 7 shows the case where the probe mark 51 is selected on the TCS 3a in the state shown in FIG. 5 and an operation to change the orientation of the probe mark 51 is received. As shown in FIG. 7, the second display control unit 16b further displays, on the body mark 50 of the TCS 3a, a probe mark 51a in which the orientation of the probe mark 51 has been changed. When the second display control unit 16b receives an operation to translate the probe mark 51, it causes the TCS 3a to display the probe mark 51a at the position to which the probe mark 51 has been translated.

  Returning to FIG. 4, the second display control unit 16b displays the changed probe mark on the monitor 2 (step S111). FIG. 8 is a diagram (part 5) for explaining the first embodiment. FIG. 8 shows the monitor display when the orientation of the probe mark 51 on the TCS 3a has been changed as shown in FIG. 7. As shown in FIG. 8, the second display control unit 16b further displays, on the body mark 62, a probe mark 64 corresponding to the probe mark 51a on the TCS 3a shown in FIG. 7.

  Returning to FIG. 4, the second display control unit 16b subsequently determines whether a store of the ultrasound image data being captured has been received (step S112). When it determines that a store has not been received (No in step S112), the second display control unit 16b proceeds to step S109. On the other hand, when it determines that a store has been received (Yes in step S112), the second display control unit 16b determines whether the imaging position being imaged is an imaging position of the preset information (step S113). In other words, the second display control unit 16b determines whether the probe mark corresponding to the imaging position being imaged is a probe mark included in the preset information selected at the time of imaging planning or a probe mark added later to the imaging positions scheduled for imaging. For example, when the preset information selected at the time of imaging planning is the preset information shown in FIG. 5, the second display control unit 16b determines that the imaging position being imaged is an imaging position of the preset information if the probe mark corresponding to that imaging position is one of the probe marks 51 to 53. Conversely, in the case of the preset information shown in FIG. 5, when the probe mark corresponding to the imaging position being imaged is a probe mark other than the probe marks 51 to 53, the second display control unit 16b determines that the imaging position is different from the imaging positions of the preset information.

  When the second display control unit 16b determines that the imaging position being imaged is an imaging position of the preset information (Yes in step S113), it changes the color of the probe mark to blue as end information indicating that imaging at that imaging position has been completed (step S114). For example, as shown in FIG. 7, the second display control unit 16b changes the color of the probe mark 51a to blue (shown by dots in FIG. 7).

  On the other hand, when the second display control unit 16b determines that the imaging position being imaged is not an imaging position of the preset information (No in step S113), it changes the color of the probe mark to red as unscheduled end information indicating that imaging has ended at an imaging position different from the imaging positions of the preset information (step S115). FIG. 9 is a diagram (part 6) for explaining the first embodiment. FIG. 9 shows a case where, after imaging at the imaging position indicated by the probe mark 51 has been completed, the probe mark 54 is added to the imaging positions scheduled for imaging in step S106 and imaging is performed at the imaging position indicated by that probe mark. The example shown in FIG. 9 shows the case where imaging with the probe mark 54 is performed before imaging with the probe mark 52 and the probe mark 53. As shown in FIG. 9, the second display control unit 16b changes the color of the probe mark 54 to red (shown by hatching in FIG. 9), for example. When the probe mark 54 is selected, the first display control unit 16a newly displays a probe mark 55 outside the body mark 50 for imaging at imaging positions other than those scheduled. In addition, the first display control unit 16a further displays (5) as order information in the vicinity of the probe mark 55.
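  Steps S113 to S115 reduce to a single decision when a store is received: the probe mark of the imaging position being imaged turns blue if that position belongs to the preset information and red otherwise. A minimal sketch of that decision, with illustrative identifiers and color names, is shown below.

```python
def update_probe_mark_on_store(probe_mark_id, preset_probe_mark_ids, mark_colors):
    """Change the color of the probe mark whose imaging position was just stored.

    probe_mark_id        : mark of the imaging position being imaged
    preset_probe_mark_ids: marks contained in the preset information (imaging plan)
    mark_colors          : dict mapping probe mark id -> display color
    """
    if probe_mark_id in preset_probe_mark_ids:
        # End information: imaging at a scheduled imaging position has completed.
        mark_colors[probe_mark_id] = "blue"
    else:
        # Unscheduled end information: imaging ended at a position different
        # from the imaging positions of the preset information.
        mark_colors[probe_mark_id] = "red"
    return mark_colors

if __name__ == "__main__":
    colors = {"51": "white", "52": "white", "53": "white", "54": "white"}
    preset_ids = {"51", "52", "53"}
    update_probe_mark_on_store("51", preset_ids, colors)   # scheduled -> blue
    update_probe_mark_on_store("54", preset_ids, colors)   # added     -> red
    print(colors)
```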

  Note that, in parallel with the processing in step S114 and step S115, the control unit 19 adds identifiers to the captured ultrasound image data and causes the image memory 18 to store it. FIG. 10 is a diagram (part 7) for explaining the first embodiment. FIG. 10 illustrates the data structure when ultrasound image data is stored in the image memory 18. As shown in FIG. 10, the ultrasound image data is stored in the image memory 18 in association with an "examination ID" identifying the examination and an "ultrasound image data ID" identifying the ultrasound image data.

  Here, the "ultrasound image data" is the ultrasound image displayed on the monitor 2 and includes, for example, the B-mode image 60 and the imaging position information 61 at the time of imaging. In addition to the "examination ID" identifying the examination and the "ultrasound image data ID" identifying the ultrasound image data, a "device ID" identifying the ultrasound diagnostic imaging apparatus 100 that performed the imaging may be further associated with the ultrasound image data and stored in the image memory 18. Further, as long as the ultrasound image data includes the B-mode image 60, the imaging position information 61 may be omitted.
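  A hedged sketch of the association described for FIG. 10, in which stored ultrasound image data is keyed by an examination ID and an ultrasound image data ID, with an optional device ID; the field names and the dictionary-based image memory are illustrative.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UltrasoundImageRecord:
    """One entry of the image memory as described for FIG. 10 (field names illustrative)."""
    examination_id: str                 # "examination ID" identifying the examination
    image_data_id: str                  # "ultrasound image data ID"
    image_data: bytes                   # e.g. B-mode image 60 plus imaging position information 61
    device_id: Optional[str] = None     # optional "device ID" of the apparatus that imaged

# The image memory 18 could then be modeled as a simple keyed store.
image_memory = {}

def store_image(record: UltrasoundImageRecord) -> None:
    """Store ultrasound image data in association with its identifiers."""
    image_memory[(record.examination_id, record.image_data_id)] = record

if __name__ == "__main__":
    store_image(UltrasoundImageRecord("xxx", "0001", b"...", device_id="US-100"))
    print(list(image_memory))
```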

  Returning to FIG. 4, subsequent to step S114 or step S115, the second display control unit 16b associates and stores the probe mark of the imaging position for which the store has been received and the ultrasound image data captured at this imaging position (step S116). In other words, the second display control unit 16b generates definition information in which the correspondence with the ultrasound image data is defined for each probe mark, and stores the definition information in the image memory 18. FIG. 11 is a diagram (part 8) for explaining the first embodiment.

  FIG. 11 illustrates an example of the data structure of the definition information generated by the second display control unit 16b. As shown in FIG. 11, the definition information is stored as information in which an "examination ID" identifying an examination, a "probe mark ID" identifying a probe mark, "probe mark data", and an "ultrasound image data ID" identifying ultrasound image data are associated with one another.

  Here, “probe mark data” is image data indicating a body mark and a probe mark superimposed on the body mark on the TCS screen after the store is received and the color of the probe mark is changed. That is, the probe mark data indicates the body mark and the probe mark displayed on the TCS 3a when the ultrasonic image data is stored. Further, probe mark data is stored, for example, in the order in which ultrasound image data is stored. Therefore, by comparing probe mark data before and after in the imaging order, it is possible to specify a probe mark corresponding to the latest imaging position.

  In the example shown in FIG. 11, when the "examination ID" is "xxx" and the "probe mark ID" is "000x-a", the probe mark specified by this probe mark ID is the probe mark 51a, and the identifier of the ultrasound image data captured at the imaging position indicated by the probe mark 51a is "0001". Similarly, when the "examination ID" is "xxx" and the "probe mark ID" is "000x-b", the probe mark specified by this probe mark ID is the probe mark 54, and the identifier of the ultrasound image data captured at the imaging position indicated by the probe mark 54 is "0002".
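  The definition information of FIG. 11 links each probe mark to the ultrasound image data captured at the corresponding imaging position, and because the probe mark data is stored in imaging order, the latest imaging position can be identified from the most recent entry. The sketch below models this under those assumptions; the example IDs follow the values quoted above, while the byte payloads are placeholders.

```python
from dataclasses import dataclass

@dataclass
class DefinitionEntry:
    """One row of the definition information described for FIG. 11."""
    examination_id: str     # "examination ID"
    probe_mark_id: str      # "probe mark ID"
    probe_mark_data: bytes  # body mark + probe marks on the TCS screen at store time
    image_data_id: str      # "ultrasound image data ID" captured at this position

# Entries are appended in the order in which the ultrasound image data is stored.
definition_info = [
    DefinitionEntry("xxx", "000x-a", b"tcs-after-0001", "0001"),
    DefinitionEntry("xxx", "000x-b", b"tcs-after-0002", "0002"),
]

def latest_probe_mark(entries):
    """Return the probe mark ID of the most recent imaging position.

    Because the probe mark data is stored in imaging order, the last entry
    corresponds to the latest imaging position.
    """
    return entries[-1].probe_mark_id if entries else None

if __name__ == "__main__":
    print(latest_probe_mark(definition_info))   # -> "000x-b"
```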

  Then, the second display control unit 16b determines whether the end of imaging has been received (step S117). FIG. 12 is a diagram (part 9) for explaining the first embodiment, and FIG. 13 is a diagram (part 10) for explaining the first embodiment.

  FIG. 12 shows an example of the TCS screen when imaging at the imaging positions to be imaged has not yet finished. As shown in FIG. 12, a probe mark 51, a probe mark 51a, and probe marks 52 to 54 are displayed on the body mark 50. Here, since the rectangular areas of the probe mark 51a, the probe mark 53, and the probe mark 54 are blue (shown by dots in FIG. 12) or red (shown by hatching in FIG. 12), imaging has ended at the imaging positions indicated by these probe marks. On the other hand, since the rectangular area of the probe mark 52 is white, imaging at the imaging position indicated by the probe mark 52 has not been completed. That is, among the probe marks displayed on the body mark 50 of the TCS 3a, there is a probe mark indicating that imaging has not been completed. In this way, because the color of the rectangular area of each probe mark indicates whether imaging at the scheduled imaging position has finished, the operator can visually recognize whether any unimaged imaging position remains. That is, in the case of the TCS screen shown in FIG. 12, since the rectangular area of the probe mark 52 is white, the operator can determine that the scheduled imaging should not yet be ended. Then, if the second display control unit 16b determines that the end of imaging has not been received (No in step S117), the process proceeds to step S103.

  On the other hand, FIG. 13 shows an example of the TCS screen when imaging at all of the imaging positions to be imaged has finished. As shown in FIG. 13, a probe mark 51, a probe mark 51a, and probe marks 52 to 54 are displayed on the body mark 50. Here, the rectangular areas of the probe mark 51a and the probe marks 52 to 54 are blue (shown by dots in FIG. 13) or red (shown by hatching in FIG. 13). That is, imaging has ended at the imaging positions indicated by these probe marks; among the probe marks displayed on the body mark 50 of the TCS 3a, there is no probe mark indicating that imaging has not been completed. In this way, because the color of the rectangular area of each probe mark indicates whether imaging at the scheduled imaging position has finished, the operator can visually recognize whether any unimaged imaging position remains. That is, in the case of the TCS screen shown in FIG. 13, since the rectangular areas of the probe mark 51a and the probe marks 52 to 54 are not white, the operator can determine that imaging can be ended.
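  The determination in step S117 amounts to checking whether any probe mark on the body mark is still white, i.e. whether any scheduled imaging position remains unimaged. A minimal sketch of that check, reusing the illustrative color states from the earlier sketch:

```python
def remaining_positions(mark_colors):
    """Return the probe marks whose imaging positions have not been imaged yet.

    A white rectangular area means imaging at that position has not finished;
    blue (scheduled) or red (added) means it has.
    """
    return [mark for mark, color in mark_colors.items() if color == "white"]

if __name__ == "__main__":
    # State corresponding to FIG. 12: probe mark 52 is still white.
    print(remaining_positions({"51a": "blue", "52": "white", "53": "blue", "54": "red"}))
    # State corresponding to FIG. 13: nothing left to image, so imaging can end.
    print(remaining_positions({"51a": "blue", "52": "blue", "53": "blue", "54": "red"}))
```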

  When the second display control unit 16b determines that the end of imaging has been received (Yes in step S117), it stores the TCS screen (step S118) and ends the processing. Here, as shown in FIG. 13, the second display control unit 16b stores the TCS screen, including the probe mark 51 and the probe mark 51a superimposed on the body mark 50 and the probe marks 52 to 54, in the image memory 18 as "imaging position image data (also referred to as imaging position information)". The second display control unit 16b adds an identifier to the imaging position information and stores it in the image memory 18. FIG. 14 is a diagram (part 11) for explaining the first embodiment. FIG. 14 illustrates the data structure when the imaging position information is stored in the image memory 18. As shown in FIG. 14, the imaging position information is stored in the image memory 18 in association with the "examination ID" identifying an examination and the "imaging position image data ID" identifying the imaging position image data.

  As described above, according to the first embodiment, for each imaging position determined in advance to be imaged in the examination, the probe mark corresponding to that imaging position is displayed on the TCS 3a at the corresponding position on the body mark. As a result, the operator can easily visually recognize the imaging positions to be imaged. For example, the operator can easily recognize the imaging positions to be imaged and the position and orientation of the ultrasonic probe 1 at each imaging position. In other words, the operator can visually recognize where the imaging positions to be imaged in the examination are and in what orientation the ultrasonic probe 1 is to be applied. The operator can thus omit the operation of setting a probe mark at the position on the body mark corresponding to the imaging position every time imaging is performed at each imaging position. As a result, the operability for the operator can be improved. The preset information may be set at the time of factory shipment or may be set by the operator.

  Further, according to the first embodiment, when the end of imaging at any of the imaging positions scheduled for imaging is received, end information indicating that imaging at that imaging position has been completed is displayed on the probe mark corresponding to that imaging position. For example, when the operator performs a store, the color of the probe mark is changed to blue or the like. As a result, the operator can easily distinguish the imaging positions still to be imaged and the position and orientation of the ultrasonic probe 1 at those positions from the imaging positions already imaged and the position and orientation of the ultrasonic probe 1 at those positions.

  Further, according to the first embodiment, the operator can determine from the color of the probe mark whether imaging at the imaging positions scheduled for imaging has ended. For example, when there is a probe mark whose rectangular area is white, the operator can visually determine that there is an imaging position that is to be imaged but has not yet been imaged. The operator can thereby avoid missing imaging at the imaging positions scheduled to be imaged in the examination. The color displayed by the second display control unit 16b as the end information is not limited to blue. Similarly, the color displayed by the second display control unit 16b as the unscheduled end information is not limited to red. That is, the end information and the unscheduled end information need only differ in at least one of display color and display mode.

  Further, according to the first embodiment, the position and orientation of a probe mark can be changed by touch-operating the probe mark of an imaging position to be imaged. Thus, even when the position and orientation of the ultrasonic probe 1 at the imaging position are changed, a probe mark corresponding to the changed position and orientation of the ultrasonic probe 1 can be recorded. As a result, even when the person who performs the imaging and the person who reads the images are different, it is possible to recognize that the position and orientation of the ultrasonic probe 1 were changed from those of the imaging position scheduled for imaging.

  Further, according to the first embodiment, a preliminary probe mark is also displayed outside the body mark. For example, when imaging at an imaging position not scheduled for imaging, the operator touches the probe mark outside the body mark and moves it to the corresponding imaging position on the body mark. Then, when the operator changes the position and orientation of the probe mark and performs a store, the color of the rectangular area of the probe mark is changed, for example, to red as a color indicating that the position is not an imaging position determined in advance to be imaged. In this way, by changing the color of the probe mark to a color indicating that the position is not an imaging position determined in advance to be imaged, even when the person who performs the imaging and the person who reads the images are different, it is possible to recognize that the captured ultrasound image data was captured at an imaging position that was not scheduled.

  In addition, the ultrasound diagnostic imaging apparatus 100 according to the first embodiment sets in advance the position and orientation of the ultrasonic probe 1 at each imaging position assumed at the time of imaging planning, and accepts the changes from the operator when an imaging position is added after the start of imaging or when the position and orientation of the ultrasonic probe 1 are changed. In this way, the ultrasound diagnostic imaging apparatus 100 according to the first embodiment presents the position and orientation of the ultrasonic probe 1 at each imaging position to the operator without using a position sensor. As a result, the ultrasound diagnostic imaging apparatus 100 according to the first embodiment does not need to incorporate a position sensor, so the production cost can be reduced.

  Further, in the above-described embodiment, the order information indicating the imaging order is displayed; however, the embodiment is not limited to this. For example, the first display control unit 16a may omit the display of the order information indicating the imaging order. In addition, the first display control unit 16a may, for example, blink the probe mark selected on the TCS 3a as the probe mark indicating the imaging position being imaged until imaging at that imaging position is completed.

  In the embodiment described above, the case where the probe mark used for imaging at a position other than the scheduled imaging positions is displayed outside the body mark 50 has been described; however, the embodiment is not limited to this. For example, the first display control unit 16a may refrain from displaying in advance a probe mark for imaging positions other than those scheduled, and may display a probe mark for an unscheduled imaging position when it receives from the operator an operation indicating that imaging other than the scheduled imaging is to be performed.

  In the embodiment described above, the second display control unit 16b displays an ultrasound image including the B-mode image and the imaging position information on the monitor 2; however, the embodiment is not limited to this. That is, the second display control unit 16b may display an ultrasound image including an image captured in an imaging mode other than the B mode together with the imaging position information. For example, the second display control unit 16b may display an ultrasound image including a color Doppler image and the imaging position information on the monitor 2. In addition, the second display control unit 16b may display an ultrasound image that does not include the imaging position information on the monitor 2.

  In the embodiment described above, when the second display control unit 16b receives an operation to change at least one of the position and orientation of a probe mark, the display form of the probe mark before the change is not changed; however, the embodiment is not limited to this. For example, the second display control unit 16b may indicate the rectangular area of the probe mark before the change with a broken line. Specifically, after displaying the probe mark 51a shown in FIG. 7, the second display control unit 16b may indicate the rectangular area of the probe mark 51 with a broken line. As a result, the operator can easily distinguish the probe mark 51 before the change from the unimaged probe marks 52 and 53. Alternatively, when the position and orientation of a probe mark are changed, the second display control unit 16b may display the probe mark before the change in a color distinguishable from the colors of the other probe marks. The second display control unit 16b may then display the area where the probe mark before the change and the probe mark after the change overlap in a color obtained by merging the colors before and after the change. In addition, the second display control unit 16b may delete the probe mark before the change when the operation to change at least one of the position and orientation of the probe mark is received.

  In the embodiment described above, when at least one of the position and orientation of a probe mark is changed, the change is performed before the ultrasound image data is stored; however, the embodiment is not limited to this. For example, at least one of the position and orientation of the probe mark may be changed after the ultrasound image data is stored.

  In the above-described embodiment, the first display control unit 16a displays the preset information selected in the imaging plan on the TCS 3a; however, the embodiment is not limited to this. For example, the first display control unit 16a may cause the monitor 2 to display the preset information selected in the imaging plan. In such a case, the display area of the monitor 2 is divided into, for example, a plurality of display areas, the ultrasound image data is displayed in one display area, and the preset information selected in the imaging plan is displayed in another display area. Further, in such a case, the second display control unit 16b changes at least one of the position and orientation of the probe mark displayed on the monitor 2, or changes the color of the probe mark to blue as the end information. When the preset information is displayed on the monitor 2, the ultrasound image data and the preset information may also be displayed by switching between them alternately, without dividing the display area.

  Moreover, the display form of the probe mark in the embodiment described above can be arbitrarily changed. For example, the color, line type and design of the probe mark may be changed as appropriate by the operator.

  In addition, when an operation scheduled to be performed at each imaging position is further associated with the probe mark, the first display control unit 16a may further display support information for supporting the scheduled operation when displaying the probe mark. FIG. 15 is a diagram (part 1) for describing a modification of the first embodiment. In FIG. 15, the case of performing distance measurement as the scheduled operation will be described.

  As illustrated in FIG. 15, the first display control unit 16a causes the probe mark 51 to the probe mark 53 to be displayed superimposed on the body mark 50 on the TCS 3a, and causes the probe mark 54 to be displayed outside the body mark 50 without being superimposed on it. In addition, the first display control unit 16a causes the numbers (1) to (4), which are order information, to be displayed near the probe mark 51 to the probe mark 54. Furthermore, the first display control unit 16a displays the support information 56 presenting "distance measurement" near the probe mark 51, and displays the support information 57 presenting "distance measurement" near the probe mark 52. As a result, the operator can visually recognize that distance measurement is to be performed at the imaging positions indicated by the probe mark 51 and the probe mark 52. Note that the support information is not limited to distance measurement; it may, for example, present that imaging is to be performed with a contrast agent administered at the imaging position, that imaging is to be performed in the Doppler mode, or that blood flow velocity is to be measured.
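
  A minimal sketch of how planned operations could be attached to probe marks and turned into support information labels is given below; the entry format and the operation names are assumptions for illustration, not the apparatus's actual data model.

    # Each planned imaging position may carry an operation to be performed there.
    imaging_plan = [
        {"probe_mark": 51, "order": 1, "operation": "distance measurement"},
        {"probe_mark": 52, "order": 2, "operation": "distance measurement"},
        {"probe_mark": 53, "order": 3, "operation": None},
        {"probe_mark": 54, "order": 4, "operation": None},   # mark outside the body mark
    ]

    def support_labels(plan):
        # Return (probe mark, label) pairs to draw near the corresponding marks.
        return [(e["probe_mark"], e["operation"]) for e in plan if e["operation"]]

    print(support_labels(imaging_plan))
    # [(51, 'distance measurement'), (52, 'distance measurement')]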

  Moreover, in an ultrasound examination, the approximate position of an observation target in the ultrasound image data can sometimes be specified according to the imaging position. For example, at the imaging position indicated by the probe mark 51 shown in FIG. 15, the organ A is to be observed, and in the ultrasound image data at this imaging position the position of the organ A can be identified on the upper left side. At the imaging position indicated by the probe mark 52 shown in FIG. 15, the organ B is to be observed, and in the ultrasound image data at this imaging position the position of the organ B can be identified near the center. For this reason, when distance measurement is associated, as an operation to be performed, with the probe mark selected on the TCS 3a, the second display control unit 16b may display the starting point for distance measurement on the ultrasound image captured at the imaging position corresponding to that probe mark, at a position determined according to the position of the probe mark on the body mark. FIG. 16 is a diagram (part 2) for describing a modification of the first embodiment, and FIG. 17 is a diagram (part 3) for describing a modification of the first embodiment.

  FIG. 16 shows the ultrasound image displayed on the monitor 2 when the probe mark 51 shown in FIG. 15 is selected on the TCS 3a. As shown in FIG. 16, the second display control unit 16b displays an ultrasound image including the B-mode image 60 and the imaging position information 61 in the second display area 2b of the monitor 2. Here, the second display control unit 16b determines that the observation target corresponding to the position of the probe mark 51 on the body mark 50 is the organ A. The second display control unit 16b then displays the distance measurement starting point 66 at the position where the organ A is predicted to be identifiable in the B-mode image 60. The position of the starting point 66 can be moved to another position by, for example, a touch operation by the operator.

  FIG. 17 shows the ultrasound image displayed on the monitor 2 when the probe mark 52 shown in FIG. 15 is selected on the TCS 3a. As shown in FIG. 17, the second display control unit 16b displays an ultrasound image including the B-mode image 60 and the imaging position information 61 in the second display area 2b of the monitor 2. Here, the second display control unit 16b determines that the observation target corresponding to the position of the probe mark 52 on the body mark 50 is the organ B. The second display control unit 16b then displays the distance measurement starting point 67 at the position where the organ B is predicted to be identifiable in the B-mode image 60. The position of the starting point 67 can be moved to another position by, for example, a touch operation by the operator.

  In the example illustrated in FIGS. 16 and 17, the second display control unit 16b determines the observation target and displays the starting point at the position where the observation target is predicted to be identifiable, but the embodiment is not limited to this. For example, the second display control unit 16b may display the starting point at a position on the ultrasound image data that is associated in advance with the position, on the body mark, of the probe mark corresponding to the imaging position.
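
  The mapping from a probe mark position to an initial starting point can be sketched as follows; this is a simplified illustration under assumed lookup tables and coordinate values, not the apparatus's actual method.

    # Which observation target corresponds to each probe-mark position, and where
    # that target is expected to appear in the B-mode image (normalized
    # coordinates). All values are illustrative.
    OBSERVATION_TARGET = {51: "organ A", 52: "organ B"}
    EXPECTED_POSITION = {"organ A": (0.25, 0.25),   # upper-left part of the image
                         "organ B": (0.50, 0.50)}   # near the center of the image

    def initial_start_point(probe_mark_id, image_width, image_height):
        # Place the distance-measurement starting point where the observation
        # target for this imaging position is predicted to be identifiable.
        target = OBSERVATION_TARGET.get(probe_mark_id)
        if target is None:
            return image_width // 2, image_height // 2   # fall back to the center
        nx, ny = EXPECTED_POSITION[target]
        return int(nx * image_width), int(ny * image_height)

    # Probe mark 51 -> organ A -> starting point in the upper-left part of the image.
    print(initial_start_point(51, 640, 480))   # (160, 120)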

  In the embodiment described above, the case where there are a plurality of imaging positions scheduled to be imaged has been described, but the embodiment is not limited to this. For example, the number of imaging positions scheduled for imaging may be one. FIG. 18 is a diagram (No. 4) for describing a modification of the first embodiment. In FIG. 18, a case will be described where preset information having one imaging position to be imaged in an examination is selected as the imaging plan. As shown in the upper part of FIG. 18, the first display control unit 16a displays the body mark 50 and one probe mark 51 superimposed on the body mark 50 on the TCS 3a. The first display control unit 16a also displays a preliminary probe mark 54 outside the body mark 50, as shown in the upper part of FIG. 18.

  The lower left part of FIG. 18 shows the case where the imaging at the imaging position indicated by the probe mark 51 has been completed. In such a case, the second display control unit 16b changes the color of the probe mark to blue (dot display in FIG. 18) as end information indicating that imaging at the imaging position of the preset information has ended. On the other hand, the lower right part of FIG. 18 shows the case where the imaging at the imaging position indicated by the probe mark 54 is completed before the imaging at the imaging position indicated by the probe mark 51. In such a case, the second display control unit 16b changes the color of the probe mark to red (hatched display in FIG. 18) as end information indicating that imaging at an imaging position other than the imaging positions of the preset information has ended.
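
  The color rule for this end information can be summarized in a short sketch (the function name and the set representation are assumptions for illustration):

    def end_color(completed_mark_id, planned_mark_ids):
        # Blue: a position contained in the preset information has been imaged.
        # Red: a position outside the preset information has been imaged.
        return "blue" if completed_mark_id in planned_mark_ids else "red"

    print(end_color(51, {51}))   # 'blue'  (the planned position, probe mark 51)
    print(end_color(54, {51}))   # 'red'   (probe mark 54, outside the preset positions)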

Second Embodiment
In the first embodiment, the case has been described where, at the time of imaging by the ultrasound diagnostic imaging apparatus 100, probe marks indicating the imaging positions scheduled to be imaged and the position and orientation of the ultrasonic probe 1 at those imaging positions are displayed. Further, in such imaging, imaging position information and definition information are generated in addition to the ultrasound image data and are stored in the image memory 18. For this reason, in the second embodiment, a case will be described where ultrasound image data and imaging position information are browsed with reference to the definition information.

  FIG. 19 is a diagram for explaining the overall configuration of the image processing apparatus 300 according to the second embodiment. As shown in FIG. 19, the image processing apparatus 300 according to the second embodiment is communicably connected to the ultrasound diagnostic imaging apparatus 100 and the medical image storage apparatus 200 via the network 400. The ultrasound diagnostic imaging apparatus 100 captures ultrasound image data in the same manner as the ultrasound diagnostic imaging apparatus 100 according to the first embodiment. The medical image storage apparatus 200 acquires image data captured by the ultrasound diagnostic imaging apparatus 100 from the ultrasound diagnostic imaging apparatus 100 and stores the image data.

  An image processing apparatus 300 according to the second embodiment includes a monitor 301, an operation unit 302, a storage unit 303, a control unit 304, and a display control unit 305. The monitor 301 displays a GUI for inputting various setting requests, and displays an ultrasonic image or the like requested to be browsed. The operation unit 302 includes a mouse, a keyboard, a trackball, and the like, and receives various setting requests from the operator of the image processing apparatus 300. The control unit 304 controls the entire processing of the image processing apparatus 300.

  The storage unit 303 stores ultrasound image data and ultrasound images received from the ultrasound diagnostic imaging apparatus 100, the medical image storage apparatus 200, a database of a PACS (Picture Archiving and Communication Systems), or a database of an electronic medical record system.

  For example, the storage unit 303 stores an ultrasound image data DB (Data Base) that stores ultrasound image data captured at a plurality of imaging positions in an examination, an imaging position information DB that stores imaging position information in which a plurality of probe marks are arranged in association with a plurality of imaging positions on a body mark, and a definition information DB that stores definition information in which the correspondence with ultrasound image data is defined for each probe mark. The data structure of the ultrasound image data stored in the ultrasound image data DB is similar to the data structure shown in FIG. 10, the data structure of the imaging position information stored in the imaging position information DB is similar to the data structure shown in FIG. 14, and the data structure of the definition information stored in the definition information DB is similar to the data structure shown in FIG. 11.
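
  As a minimal sketch (not the actual data structures shown in FIGS. 10, 11, and 14), the three databases can be pictured as simple mappings; the field names are assumptions, and the IDs mirror the examples used in the steps below.

    # In-memory stand-ins for the databases held by the storage unit 303.
    ultrasound_image_db = {
        "0001": {"examination_id": "xxx", "pixels": None},
        "0002": {"examination_id": "xxx", "pixels": None},
        "0003": {"examination_id": "xxx", "pixels": None},
        "0004": {"examination_id": "xxx", "pixels": None},
    }

    imaging_position_db = {
        "000x": {
            "examination_id": "xxx",
            "body_mark": "abdomen",
            "probe_marks": [
                {"id": 51, "x": 0.30, "y": 0.20, "angle": 90.0},
                {"id": 52, "x": 0.55, "y": 0.25, "angle": 60.0},
            ],
        },
    }

    # Definition information: the ultrasound image data associated with each probe mark.
    definition_db = {51: "0001", 52: "0002"}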

  The display control unit 305 receives a display request for ultrasound image data and displays the ultrasound image data and the imaging position information. For example, the display control unit 305 displays a list of the plurality of ultrasound image data captured in an examination together with the imaging position information. Alternatively, the display control unit 305 displays the imaging position information, and when it accepts selection of one of the probe marks in the imaging position information, it refers to the definition information to identify the ultrasound image data associated with the selected probe mark and further displays the identified ultrasound image data.

  FIG. 20 is a flowchart showing the processing procedure of the image processing apparatus 300 according to the second embodiment. As shown in FIG. 20, the operation unit 302 receives a request to display image data from the operator (step S201). Here, the operation unit 302 accepts the display request with, for example, an examination ID or an image ID designated. The operation unit 302 passes the received examination ID or image ID to the display control unit 305. The image data referred to here includes the ultrasound image data and the imaging position image data (imaging position information).

  Subsequently, the display control unit 305 searches the storage unit 303 to identify the designated image data (step S202). For example, the display control unit 305 refers to the ultrasound image data DB and the imaging position information DB to identify the ultrasound image data IDs and the imaging position image data ID associated with the designated examination ID. As an example, when "xxx" is designated as the examination ID, the display control unit 305 identifies the ultrasound image data IDs "0001", "0002", "0003", and "0004" shown in FIG. 10 as the ultrasound image data, and identifies the imaging position image data ID "000x" as the imaging position information shown in FIG. 14.

  Alternatively, the display control unit 305 refers to the ultrasound image data DB and the imaging position information DB to identify the image data (for example, ultrasound image data or imaging position image data) that matches the designated image ID. Then, the display control unit 305 refers to the ultrasound image data DB and the imaging position information DB to identify the image data associated with the examination ID that is associated with the identified image data. As an example, when "000x" is designated as the image ID, the display control unit 305 identifies the imaging position image data ID "000x" as the imaging position information shown in FIG. 14. In addition, the display control unit 305 identifies "xxx" as the examination ID associated with that imaging position image data ID. Then, the display control unit 305 identifies the ultrasound image data IDs "0001", "0002", "0003", and "0004" shown in FIG. 10 as the ultrasound image data associated with that examination ID.
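
  A minimal sketch of this search, under the same example IDs (the mapping variables and function names are assumptions for illustration):

    # Step S202 and its image-ID variant: given an examination ID or an image ID,
    # collect all image data belonging to the same examination.
    ULTRASOUND_EXAMS = {"0001": "xxx", "0002": "xxx", "0003": "xxx", "0004": "xxx"}
    POSITION_EXAMS = {"000x": "xxx"}    # imaging position image data ID -> exam ID

    def images_for_examination(exam_id):
        us_ids = [i for i, e in ULTRASOUND_EXAMS.items() if e == exam_id]
        pos_ids = [i for i, e in POSITION_EXAMS.items() if e == exam_id]
        return us_ids, pos_ids

    def images_for_image_id(image_id):
        # Resolve the examination the designated image belongs to, then gather
        # every image of that examination.
        exam_id = ULTRASOUND_EXAMS.get(image_id) or POSITION_EXAMS.get(image_id)
        return images_for_examination(exam_id) if exam_id else ([], [])

    print(images_for_examination("xxx"))   # (['0001', '0002', '0003', '0004'], ['000x'])
    print(images_for_image_id("000x"))     # the same result, reached via the image ID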

  Then, the display control unit 305 determines whether a list display has been received (step S203). Here, when the display control unit 305 determines that the list display has been received (Yes in step S203), the display control unit 305 displays the list of image data on the monitor 301 (step S204). FIG. 21 is a diagram (part 1) for describing the second embodiment.

  FIG. 21 shows the ultrasound images displayed on the monitor 301 when the display control unit 305 receives a selection to display the image data as a list. FIG. 21 shows the case where the display area of the monitor 301 is divided into a first display area 301a, a second display area 301b, a third display area 301c, and a fourth display area 301d. As shown in FIG. 21, when the display control unit 305 determines that the list display has been received, it displays the plurality of ultrasound image data 71 to 74 captured in the examination and the imaging position information 70 as a list in the second display area 301b of the monitor 301.

  The display control unit 305 determines whether the end of the browsing operation has been received (step S205). Here, if the display control unit 305 does not determine that the end of the browsing operation has been received (No in step S205), it continues to determine whether the end of the browsing operation has been received. On the other hand, if the display control unit 305 determines that the end of the browsing operation has been received (Yes in step S205), the process ends.

  When the display control unit 305 does not determine in step S203 that the list display has been received (No in step S203), the display control unit 305 displays the imaging position information on the monitor 301 (step S206). FIG. 22 is a diagram (part 2) for describing the second embodiment.

  FIG. 22 shows the screen displayed on the monitor 301 when the display control unit 305 receives a selection to display the imaging position information. FIG. 22 shows the case where the display area of the monitor 301 is divided into a first display area 301a, a second display area 301b1, a third display area 301c, a fourth display area 301d, and a fifth display area 301b2. As shown in FIG. 22, when the display control unit 305 does not determine that the list display has been received, it displays the imaging position information 70 in the fifth display area 301b2 of the monitor 301.

  Subsequently, the display control unit 305 determines whether the selection of a probe mark has been received (step S207). Here, when the display control unit 305 does not determine that the selection of a probe mark has been received (No in step S207), it continues to determine whether the selection of a probe mark has been received. On the other hand, when the display control unit 305 determines that the selection of a probe mark has been received (Yes in step S207), it displays the ultrasound image data corresponding to the selected probe mark (step S208). The example shown in FIG. 22 illustrates the case where the probe mark shown on the upper left side of the imaging position information 70 is selected. In such a case, the display control unit 305 refers to the definition information DB to identify the ultrasound image data 71 corresponding to the selected probe mark, and displays the identified ultrasound image data 71 in the second display area 301b1 of the monitor 301.
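
  A minimal sketch of steps S206 to S208 follows (the print stubs and the mark-to-ID mapping are assumptions for illustration, not the actual implementation):

    # Display the imaging position information; when a probe mark is selected,
    # look up the associated ultrasound image data in the definition information.
    DEFINITION = {51: "0001", 52: "0002", 53: "0003", 54: "0004"}

    def show_in_area(area, content):
        print(f"[{area}] {content}")        # stand-in for drawing on the monitor 301

    def on_probe_mark_selected(mark_id):
        image_id = DEFINITION.get(mark_id)
        if image_id is None:
            return                          # no ultrasound image stored for this mark
        show_in_area("second display area 301b1", f"ultrasound image data {image_id}")

    show_in_area("fifth display area 301b2", "imaging position information 70")
    on_probe_mark_selected(51)              # upper-left probe mark -> its ultrasound image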

  The display control unit 305 determines whether the end of the browsing operation has been received (step S209). Here, when the display control unit 305 does not determine that the end of the browsing operation has been received (No in step S209), the display control unit 305 proceeds to step S207 and determines whether the selection of the probe mark is received. On the other hand, if the display control unit 305 determines that the end of the browsing operation has been received (Yes at Step S209), the process ends.

  As described above, according to the second embodiment, it is possible to easily determine the correspondence between ultrasound image data captured in an examination and the imaging position of the ultrasound image data.

  Note that the display control unit 305 according to the second embodiment may further display identification information that can identify whether or not browsing of ultrasound image data has ended. FIG. 23 is a diagram (part 1) for describing a modification of the second embodiment, and FIG. 24 is a diagram (part 2) for describing a modification of the second embodiment.

  FIG. 23 shows a case where the display control unit 305 further displays identification information on the monitor 301 when it receives a selection to display the image data as a list. In FIG. 23, the same reference numerals are given to the same contents as in FIG. 21, and the detailed description thereof will be omitted. As shown in FIG. 23, the display control unit 305 further displays identification information 80a to 80d in the vicinity of each of the plurality of ultrasound image data 71 to 74 in the second display area 301b. In the example shown in FIG. 23, the display control unit 305 displays the identification information 80a near the ultrasound image data 71, the identification information 80b near the ultrasound image data 72, the identification information 80c near the ultrasound image data 73, and the identification information 80d near the ultrasound image data 74.

  When the display control unit 305 receives, from the operator, an operation indicating that browsing of certain ultrasound image data has ended, it displays, on the identification information displayed in the vicinity of that ultrasound image data, information indicating that browsing has ended. For example, as shown in FIG. 23, when the display control unit 305 receives from the operator an operation indicating that browsing of the ultrasound image data 71 has ended, it displays a check mark indicating that browsing has ended on the identification information 80a displayed in the vicinity of the ultrasound image data 71. The display control unit 305 may display the check mark on the identification information 80a, for example, when it receives an operation of selecting the identification information 80a.

  FIG. 24 shows the case where the display control unit 305 further displays identification information on the monitor 301 when it receives a selection to display the imaging position information. In FIG. 24, the same reference numerals are given to the same contents as in FIG. 22, and the detailed description thereof will be omitted. As shown in FIG. 24, the display control unit 305 further displays identification information 90a to 90d in the vicinity of each of the plurality of probe marks in the imaging position information 70 displayed in the fifth display area 301b2.

  When the display control unit 305 receives, from the operator, an operation indicating that browsing of certain ultrasound image data has ended, it displays, on the identification information displayed in the vicinity of the probe mark corresponding to that ultrasound image data, information indicating that browsing has ended. For example, as shown in FIG. 24, when the display control unit 305 receives from the operator an operation indicating that browsing of the ultrasound image data 71 has ended, it displays a check mark indicating that browsing has ended on the identification information 90a displayed in the vicinity of the probe mark corresponding to the ultrasound image data 71. The display control unit 305 may display the check mark on the identification information 90a, for example, when it receives an operation of selecting the identification information 90a. In addition, when the display control unit 305 determines that the selection of a probe mark has been received, it may display the check mark indicating that browsing has ended on the corresponding identification information. In other words, when the selected ultrasound image data is opened, the display control unit 305 may display the check mark indicating that browsing has ended. Specifically, in the example shown in FIG. 24, when displaying the ultrasound image data 71 corresponding to the selected probe mark, the display control unit 305 displays the check mark on the identification information 90a displayed in the vicinity of the selected probe mark.
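
  The check-mark behavior can be sketched as a per-image "browsed" flag that is set either by an explicit operator operation or automatically when the image is opened via its probe mark; the names and the automatic setting are assumptions for illustration.

    browsed = {"0001": False, "0002": False, "0003": False, "0004": False}

    def mark_browsed(image_id):
        browsed[image_id] = True            # draw a check mark on the identification info

    def open_image(image_id, auto_check=True):
        if auto_check:
            mark_browsed(image_id)          # opening the image also counts as browsing
        return f"displaying ultrasound image data {image_id}"

    open_image("0001")
    print(browsed)   # {'0001': True, '0002': False, '0003': False, '0004': False}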

  Further, the data structure of the ultrasound image data stored in the image memory 18 and the ultrasound image data DB is not limited to the data structure shown in FIG. 10, and may be any data structure as long as the ultrasound image data can be identified. Likewise, the data structure of the imaging position information stored in the image memory 18 and the imaging position information DB is not limited to the data structure shown in FIG. 14, and may be any data structure as long as the imaging position information can be identified. Further, the data structure of the definition information stored in the image memory 18 and the definition information DB is not limited to the data structure shown in FIG. 11, and may be any data structure as long as the definition information can be identified. Furthermore, at least two of the ultrasound image data DB, the imaging position information DB, and the definition information DB may be integrated.

  According to at least one embodiment described above, it is possible to easily determine the correspondence between ultrasound image data captured in an examination and the imaging position of the ultrasound image data.

  While certain embodiments of the present invention have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the invention. These embodiments can be implemented in other various forms, and various omissions, replacements, and modifications can be made without departing from the scope of the invention. These embodiments and modifications thereof are included in the invention described in the claims and the equivalents thereof as well as included in the scope and the gist of the invention.

DESCRIPTION OF SYMBOLS 10 Apparatus main body, 16 Display control unit, 16a First display control unit, 16b Second display control unit

Claims (4)

  1. An image processing apparatus comprising:
    a storage unit that stores ultrasound image data and imaging position information in which a probe mark is arranged on a body mark in association with the imaging position at which the ultrasound image data was captured; and
    a display control unit that receives a display request for the ultrasound image data and displays the ultrasound image data and the imaging position information on a display unit,
    wherein, when the display control unit receives from the operator an operation indicating that browsing of the ultrasound image data has ended, the display control unit displays information indicating that browsing of the ultrasound image data has ended together with the ultrasound image data and the imaging position information.
  2.   The image processing apparatus according to claim 1, wherein the display control unit displays ultrasound image data and the imaging position information in a list.
  3.   The image processing apparatus according to claim 1, wherein the display control unit displays the imaging position information, and when the selection of a probe mark is accepted in the imaging position information, identifies the ultrasound image data associated with the selected probe mark and further displays the identified ultrasound image data.
  4. A program that causes a computer to execute:
    a storage procedure for storing ultrasound image data and imaging position information in which a probe mark is arranged on a body mark in association with the imaging position at which the ultrasound image data was captured; and
    a display control procedure for receiving a display request for the ultrasound image data and displaying the ultrasound image data and the imaging position information on a display unit,
    wherein, in the display control procedure, when an operation indicating that browsing of the ultrasound image data has ended is received from the operator, information indicating that browsing of the ultrasound image data has ended is displayed together with the ultrasound image data and the imaging position information.
JP2017168796A 2017-09-01 2017-09-01 Image processing apparatus and program Active JP6538130B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017168796A JP6538130B2 (en) 2017-09-01 2017-09-01 Image processing apparatus and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2017168796A JP6538130B2 (en) 2017-09-01 2017-09-01 Image processing apparatus and program

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
JP2014000315 Division 2014-01-06

Publications (2)

Publication Number Publication Date
JP2017225850A JP2017225850A (en) 2017-12-28
JP6538130B2 true JP6538130B2 (en) 2019-07-03

Family

ID=60890647

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2017168796A Active JP6538130B2 (en) 2017-09-01 2017-09-01 Image processing apparatus and program

Country Status (1)

Country Link
JP (1) JP6538130B2 (en)

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08280684A (en) * 1995-04-18 1996-10-29 Fujitsu Ltd Ultrasonic diagnostic apparatus
JP4469444B2 (en) * 1999-10-15 2010-05-26 東芝医用システムエンジニアリング株式会社 Ultrasound diagnostic imaging equipment
US6675038B2 (en) * 2001-05-14 2004-01-06 U-Systems, Inc. Method and system for recording probe position during breast ultrasound scan
JP2003000589A (en) * 2001-06-19 2003-01-07 Hitachi Medical Corp Ultrasonic diagnostic device
JP2004105638A (en) * 2002-09-20 2004-04-08 Shimadzu Corp Ultrasonic diagnostic apparatus
JP4167162B2 (en) * 2003-10-14 2008-10-15 アロカ株式会社 Ultrasonic diagnostic equipment
JP5738507B2 (en) * 2006-01-19 2015-06-24 東芝メディカルシステムズ株式会社 Ultrasonic probe trajectory expression device and ultrasonic diagnostic device
US9084556B2 (en) * 2006-01-19 2015-07-21 Toshiba Medical Systems Corporation Apparatus for indicating locus of an ultrasonic probe, ultrasonic diagnostic apparatus
JP2007282792A (en) * 2006-04-14 2007-11-01 Matsushita Electric Ind Co Ltd Ultrasonic diagnostic system
JP2008104551A (en) * 2006-10-24 2008-05-08 Toshiba Corp Ultrasonic diagnostic equipment
JP5468343B2 (en) * 2009-09-30 2014-04-09 株式会社東芝 Ultrasonic diagnostic equipment

Also Published As

Publication number Publication date
JP2017225850A (en) 2017-12-28

Similar Documents

Publication Publication Date Title
US20170112468A1 (en) Image diagnosis apparatus and image diagnosis method
US9483177B2 (en) Diagnostic imaging apparatus, diagnostic ultrasonic apparatus, and medical image displaying apparatus
JP4088104B2 (en) Ultrasonic diagnostic equipment
US9119558B2 (en) Ultrasonic diagnostic apparatus and ultrasonic diagnostic method
US8696575B2 (en) Ultrasonic diagnostic apparatus and method of controlling the same
JP6367425B2 (en) Ultrasonic diagnostic equipment
JP5435751B2 (en) Ultrasonic diagnostic apparatus, ultrasonic transmission / reception method, and ultrasonic transmission / reception program
US8105240B2 (en) Ultrasonic imaging apparatus and low attenuation medium with a prescribed pattern for apparatus localization
KR100718403B1 (en) Ultrasonic imaging apparatus
WO2013129590A1 (en) Ultrasound diagnostic equipment, medical diagnostic imaging equipment, and ultrasound diagnostic equipment control program
US10278670B2 (en) Ultrasound diagnostic apparatus and method of controlling ultrasound diagnostic apparatus
US9520154B2 (en) Apparatus and method for displaying images
US9568598B2 (en) Ultrasonic diagnostic apparatus and program
US20060058651A1 (en) Method and apparatus for extending an ultrasound image field of view
JP2009297072A (en) Ultrasonic diagnostic apparatus and medical image processing apparatus
JP2005296436A (en) Ultrasonic diagnostic apparatus
JP5566773B2 (en) Ultrasonic diagnostic apparatus and sound speed setting method
US8275447B2 (en) Medical image diagnostic system, medical imaging apparatus, medical image storage apparatus, and medical image display apparatus
CN101292879B (en) Ultrasonic diagnostic apparatus and control method thereof
US20120108960A1 (en) Method and system for organizing stored ultrasound data
KR100948047B1 (en) Ultrasound system and method for forming ultrasound image
US8834371B2 (en) Ultrasound diagnostic apparatus and ultrasound image processing program
US10342514B2 (en) Ultrasonic diagnostic apparatus and method of ultrasonic imaging
JP2008514264A (en) Method and apparatus for performing ultrasonic diagnostic imaging of breast with high accuracy
US20150080726A1 (en) Ultrasonic diagnostic apparatus, positional information acquiring method, and computer program product

Legal Events

Date Code Title Description
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20180525

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20180605

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20180727

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20181211

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20190311

A911 Transfer of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A911

Effective date: 20190319

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20190507

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20190605

R150 Certificate of patent or registration of utility model

Ref document number: 6538130

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150