WO2021039101A1 - Ultrasonic endoscope system and method of operating ultrasonic endoscope system - Google Patents


Info

Publication number
WO2021039101A1
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasonic
label number
tip
endoscope
image
Prior art date
Application number
PCT/JP2020/025725
Other languages
English (en)
Japanese (ja)
Inventor
匡信 内原
Original Assignee
富士フイルム株式会社
Priority date
Filing date
Publication date
Application filed by 富士フイルム株式会社
Priority to JP2021542583A (JP7158596B2)
Priority to CN202080060272.4A (CN114302679A)
Publication of WO2021039101A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/12: Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/045: Control thereof

Definitions

  • The present invention relates to an ultrasonic endoscope system that uses ultrasonic waves to observe the state of an observation target site in the body of a subject, and to a method of operating the ultrasonic endoscope system.
  • An ultrasonic endoscope system is mainly intended for trans-gastrointestinal observation of the pancreas or the gallbladder: an ultrasonic endoscope having an ultrasonic observation part at its tip is inserted into the digestive tract of the subject, and an endoscopic image of the inside of the digestive tract and an ultrasonic image of the site outside the digestive tract wall are captured.
  • The illumination unit at the tip of the ultrasonic endoscope irradiates the region adjacent to the observation target in the digestive tract with illumination light, the reflected light is received by the imaging unit at the tip of the ultrasonic endoscope, and an endoscopic image is generated from the received signal of the reflected light.
  • In addition, a plurality of ultrasonic transducers at the tip of the ultrasonic endoscope transmit and receive ultrasonic waves to and from the observation target site, such as an organ outside the digestive tract wall, and an ultrasonic image is generated from the received ultrasonic signal.
  • Patent Documents 1 to 4 are prior art documents related to the present invention.
  • Patent Document 1 describes that a target site in an image of a diagnostic region of the subject is roughly extracted, global information for recognizing the target site is predicted using a neural network, the contour of the target site is recognized using the global information, and the recognition result is displayed together with the original image.
  • Patent Document 2 describes that position and orientation data of the tip of the ultrasonic endoscope is generated based on electric signals from a coil, insertion shape data indicating the insertion shape of the ultrasonic endoscope is generated from the position and orientation data and combined with three-dimensional biological tissue model data of the tissue structure (such as the organ group) of the subject to generate a guide image, and a video signal of a composite image obtained by combining the ultrasonic image and the guide image is generated and displayed on a monitor.
  • Patent Document 2 also describes that a stereoscopic guide image and a cross-sectional guide image are arranged in the left area of the screen while the ultrasonic image is arranged in the right area of the screen.
  • Patent Document 2 further describes a button for enlarging or reducing the display range of the ultrasonic image.
  • Patent Document 3 describes that an ultrasonic tomographic image of a subject and an optical image thereof are displayed adjacent to each other on the screen of a display device so that both images can be observed at the same time.
  • Patent Document 4 describes that the ultrasonic image and a schematic diagram are displayed on the same screen, the schematic diagram being a schema diagram of the human body or an actual optical image, and that the scanning plane and the insertion shape of the ultrasonic endoscope are also displayed.
  • Patent Document 4 also describes that the region of the scanning position of the ultrasonic endoscope is detected from position and orientation signals of the ultrasonic endoscope detected using a coil, ultrasonic scanning region data is output, part name data corresponding to the ultrasonic scanning region data is read out from a part name storage unit, and the part name is superimposed on the ultrasonic image.
  • Japanese Unexamined Patent Publication No. 06-233761, Japanese Unexamined Patent Publication No. 2010-609018, Japanese Unexamined Patent Publication No. 02-045045, and Japanese Unexamined Patent Publication No. 2004-113629
  • The first object of the present invention is to solve the above-mentioned problems of the prior art and to provide an ultrasonic endoscope system, and a method of operating the ultrasonic endoscope system, that allow even an operator who is unfamiliar with ultrasonic images to reliably grasp the position and orientation of the tip portion of the ultrasonic endoscope.
  • The second object of the present invention is, in addition to the first object, to provide an ultrasonic endoscope system, and a method of operating the ultrasonic endoscope system, that allow even an operator who is unfamiliar with ultrasonic images to correctly move the tip of the ultrasonic endoscope from the current observation target site to the next observation target site without hesitation inside the body of the subject.
  • In order to achieve the above objects, the present invention provides an ultrasonic endoscope system including an ultrasonic endoscope having an ultrasonic transducer at its tip.
  • The system includes an ultrasonic observation device that transmits and receives ultrasonic waves with the ultrasonic transducer and generates a diagnostic ultrasonic image from the received ultrasonic signal.
  • The system also includes an ultrasonic image recognition unit that has learned in advance, for a plurality of learning ultrasonic images, the relationship between the learning ultrasonic image and the label number corresponding to the position of the tip of the ultrasonic endoscope, and that, based on the learning results, recognizes from the diagnostic ultrasonic image the label number corresponding to the position of the tip of the ultrasonic endoscope at the time the diagnostic ultrasonic image is captured.
  • The system further includes a display control unit that displays, on a monitor, the position of the tip of the ultrasonic endoscope corresponding to the label number recognized by the ultrasonic image recognition unit.
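Purely as a hedged illustration of the kind of learned mapping described above, the following sketch assumes a small convolutional classifier that maps a diagnostic ultrasonic image to a label number; the label count, image size, and architecture are hypothetical assumptions, not values stated in this document.

```python
# Hedged sketch only: a tiny CNN classifier standing in for the ultrasonic
# image recognition unit. Label count, image size, and architecture are
# hypothetical assumptions, not taken from the patent.
import torch
import torch.nn as nn

NUM_LABELS = 11  # hypothetical: one label number per observation target site


class LabelNumberClassifier(nn.Module):
    def __init__(self, num_labels: int = NUM_LABELS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_labels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, H, W) grayscale B-mode frame
        return self.classifier(self.features(x).flatten(1))


# Inference: recognize the label number for one diagnostic ultrasonic image.
model = LabelNumberClassifier()
frame = torch.rand(1, 1, 128, 128)                    # stand-in for a B-mode frame
label_number = model(frame).argmax(dim=1).item() + 1  # label numbers start at 1
```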
  • Preferably, the display control unit displays, as character information on the monitor, the name of the observation target site corresponding to the recognized label number.
  • Preferably, the display control unit displays, as image information on the monitor, an anatomical schema diagram on which the position of the tip of the ultrasonic endoscope corresponding to the recognized label number is superimposed.
  • Preferably, the ultrasonic image recognition unit associates the position and orientation of the tip of the ultrasonic endoscope with the label number, learns in advance, for a plurality of learning ultrasonic images, the relationship between the learning ultrasonic image and the label number corresponding to the position and orientation of the tip of the ultrasonic endoscope at the time the learning ultrasonic image was captured, and, based on the learning results, recognizes from the diagnostic ultrasonic image the label number corresponding to the position and orientation of the tip of the ultrasonic endoscope at the time the diagnostic ultrasonic image is captured; in this case, it is preferable that the display control unit displays, on the monitor, the position and orientation of the tip of the ultrasonic endoscope corresponding to the label number recognized by the ultrasonic image recognition unit.
  • Preferably, the display control unit displays, as character information on the monitor, the name of the observation target site corresponding to the recognized label number.
  • Preferably, the display control unit displays, as image information on the monitor, an anatomical schema diagram on which the position and orientation of the tip of the ultrasonic endoscope corresponding to the recognized label number are superimposed.
  • Preferably, the system includes an operation procedure storage unit that stores, for each label number, the operation procedure for moving the tip of the ultrasonic endoscope from the observation target site corresponding to that label number to the observation target site corresponding to the next label number in the observation order.
  • Preferably, the display control unit acquires, from the operation procedure storage unit, the operation procedure for moving the tip of the ultrasonic endoscope from the observation target site corresponding to the current label number to the observation target site corresponding to the label number next in the observation order, and displays the acquired operation procedure on the monitor.
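As a hedged illustration only, the operation procedure storage unit can be pictured as a lookup keyed by label number; the site names and procedure text below are hypothetical placeholders, not taken from this document.

```python
# Hedged sketch only: the operation procedure storage unit viewed as a lookup
# keyed by label number. Site names and procedure text are hypothetical
# placeholders, not taken from the patent.
from typing import Optional

OBSERVATION_ORDER = {1: "aorta", 2: "pancreatic body", 3: "pancreatic tail"}

OPERATION_PROCEDURES = {
    1: "Rotate the tip clockwise and advance slightly to depict the pancreatic body.",
    2: "Follow the splenic vein to the left to depict the pancreatic tail.",
}


def procedure_for_next_site(current_label: int) -> Optional[str]:
    """Return the stored procedure for moving from the site with the current
    label number to the site with the next label number, if one is stored."""
    return OPERATION_PROCEDURES.get(current_label)


# The display control unit would show this text on the monitor:
print(procedure_for_next_site(1))
```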
  • Preferably, the display control unit further displays, as character information on the monitor, the name of the observation target site corresponding to the label number next to the current label number in the observation order.
  • Preferably, the display control unit displays the operation procedure on the monitor as character information.
  • Preferably, the operation procedure includes the names of one or more organs that are depicted while the tip of the ultrasonic endoscope is being moved.
  • Preferably, the display control unit displays, as image information on the monitor, an anatomical schema diagram on which the operation procedure is superimposed.
  • Preferably, the display control unit colors, on the anatomical schema diagram, the regions of the one or more organs that are depicted while the tip of the ultrasonic endoscope is being moved, and displays the anatomical schema diagram with the colored regions on the monitor.
  • Preferably, the display control unit colors, in mutually different colors on the anatomical schema diagram, the region of the observation target site corresponding to the label number next to the current label number in the observation order and the regions of the one or more organs depicted while the tip of the ultrasonic endoscope is being moved, and displays the anatomical schema diagram with the differently colored regions on the monitor.
  • Preferably, the system further includes a warning generation unit that issues a warning when the tip of the ultrasonic endoscope is moved from the observation target site corresponding to the current label number to an observation target site other than the observation target site corresponding to the label number next in the observation order.
  • Preferably, the warning generation unit issues the warning as voice information, or issues both character information and voice information simultaneously as the warning.
  • Preferably, each time the tip of the ultrasonic endoscope reaches the observation target site corresponding to a label number, the display control unit adds a check mark to that label number and displays the check-marked label number on the monitor as character information.
  • Preferably, each time the tip of the ultrasonic endoscope reaches the observation target site corresponding to a label number, the display control unit colors the region of the reached observation target site on the anatomical schema diagram and displays the anatomical schema diagram with the colored region on the monitor.
  • Preferably, the display control unit emphasizes, on the anatomical schema diagram, the region of the observation target site corresponding to the label number next to the current label number in the observation order, and displays the anatomical schema diagram with the emphasized region on the monitor.
  • Preferably, the display control unit colors, on the anatomical schema diagram, the region of the observation target site corresponding to the label number next to the current label number in the observation order and the regions of the other observation target sites in mutually different colors, and displays the anatomical schema diagram with the colored regions on the monitor.
  • Preferably, the display control unit arranges, as image information on the anatomical schema diagram, both the movement route obtained when the tip of the ultrasonic endoscope is moved ideally based on the observation order of the observation target sites and the movement route along which the tip of the ultrasonic endoscope is actually moved, and displays them on the monitor.
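As a hedged, purely illustrative sketch of the check-marked character information described above (site names and ordering are hypothetical placeholders):

```python
# Hedged sketch only: formatting the check-marked label list described above
# as character information. Site names and ordering are hypothetical.
from typing import Set

SITES = {1: "aorta", 2: "pancreatic body", 3: "pancreatic tail", 4: "spleen"}


def label_list_text(reached: Set[int], current_label: int) -> str:
    lines = []
    for number, name in sorted(SITES.items()):
        mark = "[x]" if number in reached else "[ ]"  # check mark for reached sites
        suffix = "  <- next" if number == current_label + 1 else ""
        lines.append(f"{mark} {number}. {name}{suffix}")
    return "\n".join(lines)


print(label_list_text(reached={1, 2}, current_label=2))
```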
  • Preferably, the system includes a movement route registration unit for registering in advance the movement route to be used when the tip of the ultrasonic endoscope is moved ideally based on the observation order of the observation target sites.
  • Preferably, the ultrasonic image recognition unit is built into the ultrasonic observation device.
  • Alternatively, it is preferable that the ultrasonic endoscope further has an illumination unit and an imaging unit at its tip, that the system further includes an endoscope processor that receives, with the imaging unit, the reflected light of the illumination light emitted from the illumination unit and generates a diagnostic endoscope image from the imaging signal of the reflected light, and that the ultrasonic image recognition unit is built into the endoscope processor.
  • Alternatively, it is preferable that the ultrasonic endoscope further has an illumination unit and an imaging unit at its tip, that the system further includes an endoscope processor that receives, with the imaging unit, the reflected light of the illumination light emitted from the illumination unit and generates a diagnostic endoscope image from the imaging signal of the reflected light, and that the ultrasonic image recognition unit is provided outside the ultrasonic observation device and the endoscope processor.
  • The present invention also provides a method of operating an ultrasonic endoscope system, including a step in which the ultrasonic image recognition unit associates the position of the tip of the ultrasonic endoscope in the body cavity of the subject with a label number based on the observation order of the observation target sites, and learns in advance, for a plurality of learning ultrasonic images, the relationship between the learning ultrasonic image and the label number corresponding to the position of the tip of the ultrasonic endoscope at the time the learning ultrasonic image was captured.
  • The method includes a step in which the ultrasonic observation device transmits and receives ultrasonic waves with the ultrasonic transducer at the tip of the ultrasonic endoscope and generates a diagnostic ultrasonic image from the received ultrasonic signal.
  • The method includes a step in which the ultrasonic image recognition unit recognizes, from the diagnostic ultrasonic image, the label number corresponding to the position of the tip of the ultrasonic endoscope at the time the diagnostic ultrasonic image is captured.
  • The method further includes a step in which the display control unit displays, on the monitor, the position of the tip of the ultrasonic endoscope corresponding to the label number recognized by the ultrasonic image recognition unit.
  • Preferably, the ultrasonic image recognition unit associates the position and orientation of the tip of the ultrasonic endoscope with the label number, learns in advance, for a plurality of learning ultrasonic images, the relationship between the learning ultrasonic image and the label number corresponding to the position and orientation of the tip of the ultrasonic endoscope at the time the learning ultrasonic image was captured, and then, based on the learning results, recognizes from the diagnostic ultrasonic image the label number corresponding to the position and orientation of the tip of the ultrasonic endoscope at the time the diagnostic ultrasonic image is captured; the display control unit then preferably displays, on the monitor, the position and orientation of the tip of the ultrasonic endoscope corresponding to the recognized label number.
  • Preferably, the method includes a step in which the operation procedure storage unit stores, for each label number, the operation procedure for moving the tip of the ultrasonic endoscope from the observation target site corresponding to that label number to the observation target site corresponding to the next label number in the observation order; using the recognized label number as the current label number, the display control unit acquires, from the operation procedure storage unit, the operation procedure for moving the tip of the ultrasonic endoscope from the observation target site corresponding to the current label number to the observation target site corresponding to the label number next in the observation order, and displays the acquired operation procedure on the monitor.
  • The ultrasonic image recognition unit, the display control unit, and the warning generation unit are preferably hardware or a processor that executes a program, and the operation procedure storage unit and the movement route registration unit are preferably hardware or memory.
  • According to the present invention, the position and orientation of the tip of the ultrasonic endoscope are displayed on the monitor, so that even an operator who is unfamiliar with ultrasonic images can reliably grasp at which position the tip of the ultrasonic endoscope is located, in which direction it is oriented, and which site it is observing.
  • In addition, according to the present invention, the operation procedure for moving the tip of the ultrasonic endoscope can be displayed on the monitor, so that even an operator who is unfamiliar with ultrasonic images can correctly move the tip of the ultrasonic endoscope from the current observation target site to the next observation target site without hesitation inside the body of the subject.
  • FIG. 1 is a diagram showing the schematic configuration of an ultrasonic endoscope system according to one embodiment of the present invention. FIG. 2 is a plan view showing the tip portion of the insertion portion of the ultrasonic endoscope and its periphery. FIG. 3 is a cross-sectional view of the tip portion of the insertion portion of the ultrasonic endoscope taken along the I-I section shown in FIG. 2. FIG. 4 is a block diagram of one embodiment showing the configuration of the endoscopic image recognition unit. FIG. 5 is a block diagram showing the configuration of the ultrasonic observation device. FIG. 6 is a block diagram of one embodiment showing the configuration of the ultrasonic image recognition unit. FIG. 7 is a diagram showing the flow of diagnostic processing using the ultrasonic endoscope system.
  • An ultrasonic endoscope system according to one embodiment of the present invention (the present embodiment) will be described in detail below with reference to the preferred embodiment shown in the accompanying drawings.
  • Although the present embodiment is a typical embodiment of the present invention, it is merely an example and does not limit the present invention.
  • FIG. 1 is a diagram showing the schematic configuration of the ultrasonic endoscope system 10.
  • The ultrasonic endoscope system 10 is used to observe, using ultrasonic waves, the state of an observation target site in the body of a patient who is the subject (such observation is hereinafter also referred to as ultrasonic diagnosis).
  • Here, the observation target site is a site that is difficult to examine from the body surface side of the patient, such as the pancreas or the gallbladder.
  • With the ultrasonic endoscope system 10, the state of the observation target site and the presence or absence of abnormalities can be ultrasonically diagnosed via the gastrointestinal tract, such as the esophagus, stomach, duodenum, small intestine, and large intestine, which are body cavities of the patient.
  • The ultrasonic endoscope system 10 acquires an ultrasonic image and an endoscopic image and, as shown in FIG. 1, has an ultrasonic endoscope 12, an ultrasonic observation device 14, an endoscope processor 16, a light source device 18, a monitor 20, a water supply tank 21a, a suction pump 21b, and a console 100.
  • The ultrasonic endoscope 12 includes an insertion portion 22 to be inserted into the body cavity of the patient, an operation portion 24 operated by an operator (user) such as a doctor or a technician, and an ultrasonic transducer unit 46 (see FIGS. 2 and 3) attached to the tip portion 40 of the insertion portion 22.
  • The ultrasonic endoscope 12 has, as an ultrasonic observation unit 36 at its tip, a plurality of ultrasonic transducers 48 included in the ultrasonic transducer unit 46 (see FIGS. 2 and 3).
  • The ultrasonic endoscope 12 also has, as an endoscope observation unit 38 at its tip, an illumination unit including an illumination window 88 and the like, and an imaging unit including an observation window 82, an objective lens 84, a solid-state image sensor 86, and the like (see FIGS. 2 and 3). The operator acquires an endoscopic image and an ultrasonic image through the functions of the ultrasonic endoscope 12.
  • the "endoscopic image” is an image obtained by photographing the inner wall of the body cavity of the patient by an optical method.
  • The “ultrasonic image” is an image obtained by transmitting ultrasonic waves from the inside of the patient's body cavity toward the observation target site, receiving the reflected waves (echoes), and imaging the received signal.
  • the ultrasonic endoscope 12 will be described in detail in a later section.
  • the ultrasonic observation device 14 is connected to the ultrasonic endoscope 12 via the universal cord 26 and the ultrasonic connector 32a provided at the end thereof.
  • The ultrasonic observation device 14 controls the ultrasonic transducer unit 46 of the ultrasonic endoscope 12 to transmit ultrasonic waves, and generates an ultrasonic image by imaging the received signal obtained when the ultrasonic transducer unit 46 receives the reflected waves (echoes) of the transmitted ultrasonic waves.
  • More specifically, the ultrasonic observation device 14 transmits and receives ultrasonic waves with the plurality of ultrasonic transducers 48 included in the ultrasonic transducer unit 46, and generates a diagnostic ultrasonic image (hereinafter also simply referred to as an ultrasonic image) from the received ultrasonic signal.
  • the ultrasonic observation device 14 will be described in detail in a later section.
  • the endoscope processor 16 is connected to the ultrasonic endoscope 12 via the universal cord 26 and the endoscope connector 32b provided at the end thereof.
  • The endoscope processor 16 acquires image data of the observation target adjacent site imaged by the ultrasonic endoscope 12 (specifically, by a solid-state image sensor 86 described later), and performs predetermined image processing on the acquired image data.
  • In other words, the endoscope processor 16 receives, with the imaging unit at the tip of the ultrasonic endoscope 12, the reflected light of the illumination light emitted from the illumination unit at the tip of the ultrasonic endoscope 12, and generates a diagnostic endoscopic image (hereinafter also simply referred to as an endoscopic image) from the imaging signal of the reflected light.
  • the "observation target adjacent site” is a portion of the inner wall of the patient's body cavity that is adjacent to the observation target site.
  • In the present embodiment, the ultrasonic observation device 14 and the endoscope processor 16 are configured as two separately provided devices (computers).
  • the present invention is not limited to this, and both the ultrasonic observation device 14 and the endoscope processor 16 may be configured by one device.
  • the light source device 18 is connected to the ultrasonic endoscope 12 via the universal cord 26 and the light source connector 32c provided at the end thereof.
  • When the observation target adjacent site is imaged with the ultrasonic endoscope 12, the light source device 18 emits white light composed of the three primary colors of red light, green light, and blue light, or light of a specific wavelength.
  • The light emitted by the light source device 18 propagates through a light guide (not shown) included in the universal cord 26 and the ultrasonic endoscope 12, and is emitted from the ultrasonic endoscope 12 (more specifically, from an illumination window 88 described later).
  • As a result, the observation target adjacent site is illuminated with the light from the light source device 18.
  • The monitor 20 is connected to the ultrasonic observation device 14 and the endoscope processor 16, and displays the ultrasonic image generated by the ultrasonic observation device 14, the endoscopic image generated by the endoscope processor 16, anatomical schema diagrams, and the like.
  • As for the ultrasonic image and the endoscopic image, the display on the monitor 20 may be switched between one image and the other, or two or more images may be displayed side by side at the same time.
  • In the present embodiment, the ultrasonic image and the endoscopic image are displayed on one monitor 20, but a monitor for displaying the ultrasonic image, a monitor for displaying the endoscopic image, and a monitor for displaying the anatomical schema diagram may be provided separately.
  • The ultrasonic image and the endoscopic image may also be displayed in a form other than on the monitor 20, for example on the display of a terminal carried by the operator.
  • The console 100 is an example of an instruction acquisition unit that acquires instructions input by the operator (user), and is a device provided for the operator to input information necessary for ultrasonic diagnosis and to instruct the ultrasonic observation device 14 to start ultrasonic diagnosis.
  • the console 100 is composed of, for example, a keyboard, a mouse, a trackball, a touch pad, a touch panel, and the like.
  • When the operator performs an input operation on the console 100, the CPU (control circuit) 152 (see FIG. 5) of the ultrasonic observation device 14 controls each part of the device (for example, a receiving circuit 142 and a transmitting circuit 144 described later) according to the operation content.
  • Specifically, the operator inputs examination information (for example, examination order information including the date and the order number, and patient information including the patient ID and the patient name) on the console 100 before starting the ultrasonic diagnosis. After the input of the examination information is completed, when the operator instructs the start of the ultrasonic diagnosis through the console 100, the CPU 152 of the ultrasonic observation device 14 controls each part of the ultrasonic observation device 14 so that the ultrasonic diagnosis is performed based on the input examination information.
  • the operator can set various control parameters on the console 100 when performing the ultrasonic diagnosis.
  • control parameters include the selection result of the live mode and the freeze mode, the set value of the display depth (depth), the selection result of the ultrasonic image generation mode, and the like.
  • the "live mode” is a mode in which ultrasonic images (moving images) obtained at a predetermined frame rate are sequentially displayed (real-time display).
  • the "freeze mode” is a mode in which a one-frame image (still image) of an ultrasonic image (moving image) generated in the past is read out from a cine memory 150 described later and displayed.
  • the B mode is a mode in which the amplitude of the ultrasonic echo is converted into brightness and a tomographic image is displayed.
  • the CF mode is a mode in which the average blood flow velocity, the flow fluctuation, the strength of the flow signal, the flow power, etc. are mapped to various colors and displayed on the B mode image.
  • the PW mode is a mode for displaying the velocity of the ultrasonic echo source (for example, the velocity of blood flow) detected based on the transmission / reception of a pulse wave.
  • The above-mentioned ultrasonic image generation modes are merely examples, and modes other than the above three types, for example an A (Amplitude) mode, an M (Motion) mode, and a contrast mode, may be further included.
  • FIG. 2 is an enlarged plan view of the tip of the insertion portion 22 of the ultrasonic endoscope 12 and its periphery.
  • FIG. 3 is a cross-sectional view showing the cross section of the tip portion 40 of the insertion portion 22 of the ultrasonic endoscope 12 taken along the I-I section shown in FIG. 2.
  • FIG. 5 is a block diagram showing the configuration of the ultrasonic observation device 14.
  • the ultrasonic endoscope 12 has an insertion unit 22 and an operation unit 24 as described above.
  • the insertion portion 22 includes a tip portion 40, a curved portion 42, and a soft portion 43 in this order from the tip end side (free end side).
  • the tip portion 40 is provided with an ultrasonic observation unit 36 and an endoscopic observation unit 38.
  • an ultrasonic oscillator unit 46 including a plurality of ultrasonic oscillators 48 is arranged in the ultrasonic observation unit 36.
  • the tip portion 40 is provided with a treatment tool outlet 44.
  • the treatment tool outlet 44 serves as an outlet for a treatment tool (not shown) such as a forceps, a puncture needle, or a high-frequency scalpel.
  • The treatment tool outlet 44 also serves as a suction port for sucking out suction matter such as blood and bodily waste.
  • The curved portion 42 is a portion connected to the proximal end side of the tip portion 40 (the side opposite to the side on which the ultrasonic transducer unit 46 is provided), and is freely bendable.
  • The flexible portion 43 is an elongated, flexible portion that connects the curved portion 42 and the operation portion 24.
  • A plurality of air/water supply pipelines and a plurality of suction pipelines are formed inside each of the insertion portion 22 and the operation portion 24. In addition, a treatment tool channel 45, one end of which leads to the treatment tool outlet 44, is formed inside each of the insertion portion 22 and the operation portion 24.
  • Hereinafter, the ultrasonic observation unit 36, the endoscope observation unit 38, the water supply tank 21a, the suction pump 21b, and the operation unit 24 will be described in detail.
  • The ultrasonic observation unit 36 is a portion provided for acquiring an ultrasonic image, and is arranged on the tip side of the tip portion 40 of the insertion portion 22. As shown in FIG. 3, the ultrasonic observation unit 36 includes the ultrasonic transducer unit 46, a plurality of coaxial cables 56, and an FPC (Flexible Printed Circuit) 60.
  • The ultrasonic transducer unit 46 corresponds to an ultrasonic probe: using an ultrasonic transducer array 50 in which a plurality of ultrasonic transducers 48 described later are arranged, it transmits ultrasonic waves in the body cavity of the patient, receives the reflected waves (echoes) of the ultrasonic waves reflected at the observation target site, and outputs the received signals.
  • The ultrasonic transducer unit 46 according to the present embodiment is a convex type and transmits ultrasonic waves radially (in an arc shape).
  • However, the type (model) of the ultrasonic transducer unit 46 is not particularly limited, and other types, for example a radial type or a linear type, may be used as long as they can transmit and receive ultrasonic waves.
  • the ultrasonic oscillator unit 46 is composed of a backing material layer 54, an ultrasonic oscillator array 50, an acoustic matching layer 74, and an acoustic lens 76.
  • the ultrasonic oscillator array 50 may be configured by arranging a plurality of ultrasonic oscillators 48 in a two-dimensional array.
  • Each of the N ultrasonic transducers 48 is configured by arranging electrodes on both sides of a piezoelectric element (piezoelectric body).
  • For the piezoelectric element, barium titanate (BaTiO3), lead zirconate titanate (PZT), potassium niobate (KNbO3), or the like is used.
  • The electrodes include individual electrodes (not shown) provided individually for each of the plurality of ultrasonic transducers 48 and a transducer ground (not shown) common to the plurality of ultrasonic transducers 48; the electrodes are electrically connected to the ultrasonic observation device 14 via the coaxial cables 56 and the FPC 60.
  • A pulsed drive voltage is supplied as an input signal (transmission signal) from the ultrasonic observation device 14 to each ultrasonic transducer 48 through the coaxial cable 56.
  • When this drive voltage is applied to the electrodes of the ultrasonic transducer 48, the piezoelectric element expands and contracts, and the ultrasonic transducer 48 is driven (vibrates).
  • As a result, pulsed ultrasonic waves are output from the ultrasonic transducer 48.
  • At this time, the amplitude of the ultrasonic waves output from the ultrasonic transducer 48 has a magnitude corresponding to the intensity (output intensity) at which the ultrasonic transducer 48 outputs the ultrasonic waves.
  • Here, the output intensity is defined as the magnitude of the sound pressure of the ultrasonic waves output from the ultrasonic transducer 48.
  • When each ultrasonic transducer 48 receives a reflected wave (echo) of the ultrasonic waves, it vibrates (is driven) in accordance with the reflected wave, and the piezoelectric element of each ultrasonic transducer 48 generates an electric signal.
  • This electric signal is output from each ultrasonic transducer 48 toward the ultrasonic observation device 14 as an ultrasonic reception signal.
  • The magnitude (voltage value) of the electric signal output from the ultrasonic transducer 48 corresponds to the reception sensitivity of the ultrasonic transducer 48 when it receives the ultrasonic waves.
  • Here, the reception sensitivity is defined as the ratio of the amplitude of the electric signal output by the ultrasonic transducer 48 upon reception to the amplitude of the ultrasonic waves transmitted by the ultrasonic transducer 48.
  • When ultrasonic waves are transmitted and received by driving the plurality of ultrasonic transducers 48, the ultrasonic waves are scanned over a scanning range along the curved surface on which the ultrasonic transducer array 50 is arranged, for example within a range of several tens of mm from the center of curvature of that curved surface.
  • More specifically, when a B-mode image (tomographic image) is acquired as the ultrasonic image, a drive voltage is supplied, by selecting the aperture channel with the multiplexer 140, to m consecutively arranged ultrasonic transducers 48 out of the N ultrasonic transducers 48 (hereinafter referred to as the drive target transducers).
  • As a result, the m drive target transducers are driven, and ultrasonic waves are output from each drive target transducer of the aperture channel.
  • The ultrasonic waves output from the m drive target transducers are immediately combined, and the combined wave (ultrasonic beam) is transmitted toward the observation target site.
  • Thereafter, each of the m drive target transducers receives the ultrasonic waves (echoes) reflected at the observation target site and outputs an electric signal (received signal) corresponding to its reception sensitivity at that time.
  • The above series of steps (supply of the drive voltage, transmission and reception of the ultrasonic waves, and output of the received signals) is repeated while the positions of the drive target transducers among the N ultrasonic transducers 48 are shifted one transducer at a time.
  • Specifically, the series of steps is started from the m drive target transducers centered on the ultrasonic transducer 48 located at one end of the N ultrasonic transducers 48.
  • The series of steps is then repeated every time the positions of the drive target transducers are shifted by switching of the aperture channel by the multiplexer 140.
  • Finally, the series of steps is repeated a total of N times, up to the m drive target transducers centered on the ultrasonic transducer 48 located at the other end of the N ultrasonic transducers 48.
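As a hedged, purely illustrative sketch of the aperture scan described above (the element counts and the transmit/receive stand-in are hypothetical values, not taken from this document):

```python
# Hedged sketch only: the aperture scan described above. N, m, and the
# transmit/receive stand-in are hypothetical values, not taken from the patent.
import numpy as np

N = 128  # number of ultrasonic transducers in the array
m = 8    # number of drive target transducers per aperture channel


def transmit_and_receive(aperture: np.ndarray) -> np.ndarray:
    """Stand-in for one transmit/receive event: one received sample per
    drive target transducer (random numbers here)."""
    return np.random.randn(aperture.size)


sound_lines = []
for center in range(N):                          # repeated N times in total
    start = max(0, min(center - m // 2, N - m))
    aperture = np.arange(start, start + m)       # indices of the m drive targets
    received = transmit_and_receive(aperture)
    sound_lines.append(received.sum())           # crude stand-in for one sound line

print(len(sound_lines))  # N sound lines, one per aperture position
```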
  • The backing material layer 54 supports each ultrasonic transducer 48 of the ultrasonic transducer array 50 from the back side. The backing material layer 54 also has the function of attenuating the ultrasonic waves that propagate to the backing material layer 54 side, among the ultrasonic waves emitted from the ultrasonic transducers 48 and the ultrasonic waves (echoes) reflected at the observation target site.
  • The backing material is made of a rigid material such as hard rubber, to which an ultrasonic attenuating material (ferrite, ceramics, or the like) is added as needed.
  • the acoustic matching layer 74 is superposed on the ultrasonic vibrator array 50, and is provided to match the acoustic impedance between the human body of the patient and the ultrasonic vibrator 48. By providing the acoustic matching layer 74, it is possible to increase the transmittance of ultrasonic waves.
  • As the material of the acoustic matching layer 74, various organic materials whose acoustic impedance is closer to that of the patient's body than that of the piezoelectric elements of the ultrasonic transducers 48 can be used.
  • Specific examples of the material of the acoustic matching layer 74 include epoxy resin, silicone rubber, polyimide, polyethylene and the like.
  • the acoustic lens 76 superposed on the acoustic matching layer 74 is for converging the ultrasonic waves emitted from the ultrasonic oscillator array 50 toward the observation target portion.
  • The acoustic lens 76 is made of, for example, a silicone-based resin (millable silicone rubber (HTV rubber), liquid silicone rubber (RTV rubber), or the like), a butadiene-based resin, or a polyurethane-based resin, to which powder of titanium oxide, alumina, silica, or the like is added as needed.
  • the FPC 60 is electrically connected to the electrodes included in each ultrasonic oscillator 48.
  • Each of the plurality of coaxial cables 56 is wired to the FPC 60 at one end. When the ultrasonic endoscope 12 is connected to the ultrasonic observation device 14 via the ultrasonic connector 32a, each of the plurality of coaxial cables 56 is electrically connected to the ultrasonic observation device 14 at its other end (the end opposite to the FPC 60 side).
  • the endoscopic observation unit 38 is a portion provided for acquiring an endoscopic image, and is arranged at the tip portion 40 of the insertion portion 22 on the proximal end side of the ultrasonic observation unit 36. As shown in FIGS. 2 and 3, the endoscope observation unit 38 includes an observation window 82, an objective lens 84, a solid-state image sensor 86, an illumination window 88, a cleaning nozzle 90, a wiring cable 92, and the like.
  • the observation window 82 is attached at the tip 40 of the insertion portion 22 in a state of being inclined obliquely with respect to the axial direction (longitudinal axial direction of the insertion portion 22).
  • the light reflected from the portion adjacent to the observation target and incident from the observation window 82 is imaged on the image pickup surface of the solid-state image sensor 86 by the objective lens 84.
  • The solid-state image sensor 86 photoelectrically converts the reflected light from the observation target adjacent site that has passed through the observation window 82 and the objective lens 84 and formed an image on its imaging surface, and outputs an imaging signal.
  • As the solid-state image sensor 86, a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor can be used.
  • The imaging signal output by the solid-state image sensor 86 is transmitted to the endoscope processor 16 through the wiring cable 92 extending from the insertion portion 22 to the operation portion 24 and then through the universal cord 26.
  • the illumination window 88 is provided at both side positions of the observation window 82.
  • An exit end of a light guide (not shown) is connected to the illumination window 88.
  • the light guide extends from the insertion unit 22 to the operation unit 24, and its incident end is connected to the light source device 18 connected via the universal cord 26.
  • the illumination light emitted by the light source device 18 is transmitted through the light guide and is emitted from the illumination window 88 toward the portion adjacent to the observation target.
  • The cleaning nozzle 90 is an ejection hole formed in the tip portion 40 of the insertion portion 22 for cleaning the surfaces of the observation window 82 and the illumination windows 88, and air or a cleaning liquid is ejected from the cleaning nozzle 90 toward the observation window 82 and the illumination windows 88.
  • the cleaning liquid ejected from the cleaning nozzle 90 is water, particularly degassed water.
  • the cleaning liquid is not particularly limited, and other liquids, for example, ordinary water (water that has not been degassed) may be used.
  • the water supply tank 21a is a tank for storing degassed water, and is connected to the light source connector 32c by the air supply water supply tube 34a.
  • the degassed water is used as a cleaning liquid ejected from the cleaning nozzle 90.
  • the suction pump 21b sucks the suction material (including the degassed water supplied for cleaning) in the body cavity through the treatment tool outlet 44.
  • the suction pump 21b is connected to the light source connector 32c by a suction tube 34b.
  • the ultrasonic endoscopy system 10 may be provided with an air supply pump or the like that supplies air to a predetermined air supply destination.
  • a treatment tool channel 45 and an air supply / water pipe (not shown) are provided in the insertion unit 22 and the operation unit 24.
  • the treatment tool channel 45 communicates between the treatment tool insertion port 30 provided in the operation unit 24 and the treatment tool outlet 44. Further, the treatment tool channel 45 is connected to a suction button 28b provided on the operation unit 24. The suction button 28b is connected to the suction pump 21b in addition to the treatment tool channel 45.
  • the air supply / water pipe is connected to the cleaning nozzle 90 on one end side thereof, and is connected to the air supply water supply button 28a provided on the operation unit 24 on the other end side.
  • the air supply / water supply button 28a is connected to the water supply tank 21a in addition to the air supply / water supply pipeline.
  • The operation unit 24 is a part operated by the operator at the start of diagnosis, during diagnosis, at the end of diagnosis, and so on, and one end of the universal cord 26 is connected to one end of the operation unit 24. As shown in FIG. 1, the operation unit 24 has an air supply/water supply button 28a, a suction button 28b, a pair of angle knobs 29, and a treatment tool insertion port (forceps port) 30.
  • The treatment tool insertion port 30 is a hole formed for inserting a treatment tool (not shown) such as forceps, and communicates with the treatment tool outlet 44 via the treatment tool channel 45.
  • the treatment tool inserted into the treatment tool insertion port 30 is introduced into the body cavity from the treatment tool outlet 44 after passing through the treatment tool channel 45.
  • the air supply / water supply button 28a and the suction button 28b are two-stage switching type push buttons, and are operated to switch the opening / closing of the pipelines provided inside each of the insertion unit 22 and the operation unit 24.
  • The endoscopic image recognition unit 170 has learned in advance, for a plurality of learning endoscopic images, the relationship between the learning endoscopic image and the lesion region displayed in the learning endoscopic image, and, based on the learning results, recognizes the lesion region displayed in the diagnostic endoscopic image from the diagnostic endoscopic image generated by the endoscope processor 16.
  • A learning endoscopic image is an existing endoscopic image used for the endoscopic image recognition unit 170 to learn the relationship between an endoscopic image and the lesion region displayed in the endoscopic image; for example, various endoscopic images captured in the past can be used.
  • the endoscopic image recognition unit 170 includes a lesion area detection unit 102, a position information acquisition unit 104, a selection unit 106, and a lesion area detection control unit 108.
  • the lesion area detection unit 102 detects the lesion area from the diagnostic endoscopic image based on the learning result.
  • the lesion area detection unit 102 includes a plurality of detection units corresponding to a plurality of positions in the body cavity.
  • the first to eleventh detection units 102A to 102K are provided.
  • The first detection unit 102A corresponds to the rectum, the second detection unit 102B to the sigmoid colon, the third detection unit 102C to the descending colon, the fourth detection unit 102D to the transverse colon, the fifth detection unit 102E to the ascending colon, the sixth detection unit 102F to the cecum, the seventh detection unit 102G to the ileum, the eighth detection unit 102H to the jejunum, the ninth detection unit 102I to the duodenum, the tenth detection unit 102J to the stomach, and the eleventh detection unit 102K to the esophagus.
  • The first to eleventh detection units 102A to 102K are each a trained model.
  • These trained models are models trained using data sets each consisting of different learning endoscopic images. Specifically, they are models in which the relationship between the learning endoscopic image and the lesion region displayed in the learning endoscopic image has been learned in advance using data sets each consisting of learning endoscopic images captured at different positions in the body cavity.
  • Specifically, the first detection unit 102A is a model trained using a data set consisting of learning endoscopic images of the rectum, the second detection unit 102B of the sigmoid colon, the third detection unit 102C of the descending colon, the fourth detection unit 102D of the transverse colon, the fifth detection unit 102E of the ascending colon, the sixth detection unit 102F of the cecum, the seventh detection unit 102G of the ileum, the eighth detection unit 102H of the jejunum, the ninth detection unit 102I of the duodenum, the tenth detection unit 102J of the stomach, and the eleventh detection unit 102K of the esophagus.
  • the learning method is not particularly limited as long as it can learn the relationship between the endoscopic image and the lesion region from a plurality of learning endoscopic images and generate a learned model.
  • As the learning method, for example, deep learning using a hierarchical neural network, which is an example of machine learning, one of the technologies of artificial intelligence (AI), can be used.
  • Machine learning other than deep learning may be used, artificial intelligence technology other than machine learning may be used, or a learning method that does not rely on artificial intelligence technology may be used.
  • the trained model may be generated using only the training endoscopic image. In this case, the trained model is not updated and the same trained model can always be used.
  • the diagnostic endoscopic image may be used to generate a trained model. In this case, the learned model is updated as needed by learning the relationship between the diagnostic endoscopic image and the lesion area displayed on the diagnostic endoscopic image.
  • the position information acquisition unit 104 acquires information on the position in the body cavity of the endoscopic image.
  • an operator such as a doctor inputs position information using the console 100.
  • the position information acquisition unit 104 acquires the position information input from the operation console 100.
  • The selection unit 106 selects, from the lesion region detection unit 102, the detection unit corresponding to the position information acquired by the position information acquisition unit 104. That is, the selection unit 106 selects the first detection unit 102A when the position information indicates the rectum, the second detection unit 102B for the sigmoid colon, the third detection unit 102C for the descending colon, the fourth detection unit 102D for the transverse colon, the fifth detection unit 102E for the ascending colon, the sixth detection unit 102F for the cecum, the seventh detection unit 102G for the ileum, the eighth detection unit 102H for the jejunum, the ninth detection unit 102I for the duodenum, the tenth detection unit 102J for the stomach, and the eleventh detection unit 102K for the esophagus.
  • The lesion region detection control unit 108 causes the detection unit selected by the selection unit 106 to detect the lesion region from the endoscopic image.
  • The lesion region here is not limited to a region caused by disease, and includes any region whose appearance differs from the normal state.
  • Examples of the lesion region include polyps, cancer, colonic diverticula, inflammation, treatment scars such as EMR (Endoscopic Mucosal Resection) scars or ESD (Endoscopic Submucosal Dissection) scars, clip sites, bleeding points, perforations, and vascular atypia.
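As a hedged illustration only, the selection mechanism described above can be pictured as a mapping from position information to a per-site trained model; the detector objects and position keys below are hypothetical placeholders, not this document's implementation.

```python
# Hedged sketch only: selecting a per-position trained lesion detector from the
# position information. Detector objects and position keys are hypothetical
# placeholders, not the patent's implementation.
from typing import Callable, Dict, List

Detector = Callable[[object], List[dict]]  # image -> list of lesion regions


def _dummy_detector(image) -> List[dict]:
    return []  # placeholder for a trained model's inference call


DETECTORS: Dict[str, Detector] = {
    "rectum": _dummy_detector,          # first detection unit 102A
    "sigmoid colon": _dummy_detector,   # second detection unit 102B
    # ... one trained model per position, through "esophagus" (eleventh unit 102K)
}


def detect_lesions(image, position: str) -> List[dict]:
    """Select the detection unit matching the position information entered on
    the console and run it on the diagnostic endoscopic image."""
    return DETECTORS[position](image)
```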
  • The ultrasonic observation device 14 causes the ultrasonic transducer unit 46 to transmit and receive ultrasonic waves, and generates an ultrasonic image by imaging the received signals output by the ultrasonic transducers 48 (specifically, the drive target transducers) at the time of ultrasonic reception. In addition to the generated ultrasonic image, the ultrasonic observation device 14 displays the endoscopic image transferred from the endoscope processor 16 and the anatomical schema diagram on the monitor 20.
  • As shown in FIG. 5, the ultrasonic observation device 14 includes a multiplexer 140, a receiving circuit 142, a transmitting circuit 144, an A/D converter 146, an ASIC (Application Specific Integrated Circuit) 148, a cine memory 150, a CPU (Central Processing Unit) 152, a DSC (Digital Scan Controller) 154, an ultrasonic image recognition unit 168, an operation procedure storage unit 174, a warning generation unit 176, a movement route registration unit 178, and a display control unit 172.
  • the receiving circuit 142 and the transmitting circuit 144 are electrically connected to the ultrasonic oscillator array 50 of the ultrasonic endoscope 12.
  • the multiplexer 140 selects a maximum of m drive target oscillators from the N ultrasonic oscillators 48 and opens the channels thereof.
  • The transmitting circuit 144 includes an FPGA (Field Programmable Gate Array), a pulser (pulse generation circuit 158), an SW (switch), and the like, and is connected to the MUX (multiplexer 140).
  • An ASIC (Application Specific Integrated Circuit) may be used instead of the FPGA.
  • In order to transmit ultrasonic waves from the ultrasonic transducer unit 46, the transmitting circuit 144 supplies a drive voltage for ultrasonic transmission to the drive target transducers selected by the multiplexer 140 according to a control signal sent from the CPU 152.
  • The drive voltage is a pulsed voltage signal (transmission signal) and is applied to the electrodes of the drive target transducers via the universal cord 26 and the coaxial cables 56.
  • The transmitting circuit 144 has a pulse generation circuit 158 that generates a transmission signal based on the control signal; under the control of the CPU 152, the transmitting circuit 144 uses the pulse generation circuit 158 to generate a transmission signal for causing the plurality of ultrasonic transducers 48 to generate ultrasonic waves, and supplies it to the plurality of ultrasonic transducers 48. More specifically, under the control of the CPU 152, when ultrasonic diagnosis is performed, the transmitting circuit 144 uses the pulse generation circuit 158 to generate a transmission signal having a drive voltage for performing the ultrasonic diagnosis.
  • the receiving circuit 142 is a circuit that receives an electric signal output from a drive target oscillator that has received ultrasonic waves (echo), that is, a received signal. Further, the receiving circuit 142 amplifies the received signal received from the ultrasonic vibrator 48 according to the control signal sent from the CPU 152, and delivers the amplified signal to the A / D converter 146.
  • the A / D converter 146 is connected to the receiving circuit 142, converts the received signal received from the receiving circuit 142 from an analog signal to a digital signal, and outputs the converted digital signal to the ASIC 148.
  • The ASIC 148 is connected to the A/D converter 146 and, as shown in FIG. 5, includes a phase matching unit 160, a B-mode image generation unit 162, a PW-mode image generation unit 164, a CF-mode image generation unit 166, and a memory controller 151. In the present embodiment, these functions (specifically, the phase matching unit 160, the B-mode image generation unit 162, the PW-mode image generation unit 164, the CF-mode image generation unit 166, and the memory controller 151) are realized by a hardware circuit such as the ASIC 148, but the present invention is not limited to this. These functions may instead be realized by combining a central processing unit (CPU) with software (a computer program) for executing various data processes.
  • the phase matching unit 160 executes a process of giving a delay time to the received signal (received data) digitized by the A / D converter 146 and performing phase adjustment addition (adding after matching the phase of the received data). To do.
  • the phasing addition process generates a sound line signal in which the focus of the ultrasonic echo is narrowed down.
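  • As a concrete illustration of the phasing addition described above, the following is a minimal NumPy sketch of delay-and-sum focusing along one sound line. The element geometry, sampling rate, and speed of sound are illustrative assumptions and are not values taken from this disclosure.

```python
import numpy as np

def phasing_addition(rf_data, element_x, line_x, depths, fs=50e6, c=1540.0):
    """Delay-and-sum (phasing addition) along one sound line.

    rf_data   : (n_elements, n_samples) received data digitized by the A/D converter
    element_x : (n_elements,) lateral positions of the drive target transducers [m]
    line_x    : lateral position of the sound line [m]
    depths    : (n_points,) depths along the sound line [m]
    fs, c     : sampling frequency and speed of sound (illustrative values)
    """
    n_elements, n_samples = rf_data.shape
    line = np.zeros(len(depths))
    for i, z in enumerate(depths):
        # Receive path length from the focal point (line_x, z) to each element
        dist = np.sqrt((element_x - line_x) ** 2 + z ** 2)
        # Arrival time: transmit to depth z, echo back to each element
        t_arrival = (z + dist) / c
        idx = np.clip(np.round(t_arrival * fs).astype(int), 0, n_samples - 1)
        # Phasing addition: add the received data after matching their phases
        line[i] = rf_data[np.arange(n_elements), idx].sum()
    return line
```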
  • When the ultrasonic transducer unit 46 receives ultrasonic waves, the B-mode image generation unit 162, the PW-mode image generation unit 164, and the CF-mode image generation unit 166 generate an ultrasonic image based on the electric signals output by the drive target transducers among the plurality of ultrasonic transducers 48 (strictly speaking, based on the sound line signal generated by phasing and adding the received data).
  • the B-mode image generation unit 162 is an image generation unit that generates a B-mode image that is a tomographic image of the inside (inside the body cavity) of the patient.
  • The B-mode image generation unit 162 corrects, by STC (Sensitivity Time Control), the attenuation due to the propagation distance of the sequentially generated sound line signals according to the depth of the reflection position of the ultrasonic waves. Further, the B-mode image generation unit 162 generates a B-mode image (image signal) by performing an envelope detection process and a Log (logarithmic) compression process on the corrected sound line signal.
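  • The B-mode chain just described (depth-dependent gain, envelope detection, logarithmic compression) can be sketched as follows. The attenuation coefficient, the dynamic range, and the use of SciPy's Hilbert transform for envelope detection are assumptions made only for illustration.

```python
import numpy as np
from scipy.signal import hilbert

def bmode_line(sound_line, depths, attenuation_db_per_m=50.0, dynamic_range_db=60.0):
    """Turn one sound line into B-mode pixel values (illustrative parameter values)."""
    # 1. STC: depth-dependent gain compensating attenuation (simplified linear-in-dB model)
    stc_gain = 10 ** (attenuation_db_per_m * depths / 20.0)
    compensated = sound_line * stc_gain
    # 2. Envelope detection via the analytic signal
    envelope = np.abs(hilbert(compensated))
    # 3. Log compression and mapping into an 8-bit display range
    env_db = 20.0 * np.log10(envelope / (envelope.max() + 1e-12) + 1e-12)
    pixels = np.clip((env_db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)
    return (pixels * 255).astype(np.uint8)
```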
  • the PW mode image generation unit 164 is an image generation unit that generates an image that displays the velocity of blood flow in a predetermined direction.
  • The PW-mode image generation unit 164 extracts frequency components by performing a fast Fourier transform on a plurality of sound line signals in the same direction among the sound line signals sequentially generated by the phase matching unit 160. After that, the PW-mode image generation unit 164 calculates the blood flow velocity from the extracted frequency components and generates a PW-mode image (image signal) displaying the calculated blood flow velocity.
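  • As a hedged sketch of this PW-mode processing, a fast Fourier transform over the slow-time samples of one direction yields the Doppler spectrum, from which velocities follow via the Doppler equation. The transmit center frequency, the windowing, and the use of complex I/Q data are illustrative assumptions.

```python
import numpy as np

def pw_velocity_spectrum(iq_slow_time, prf, f0=5e6, c=1540.0):
    """Velocity spectrum at one sample gate from successive pulses in the same direction.

    iq_slow_time : complex I/Q samples over successive pulses (slow time)
    prf          : pulse repetition frequency [Hz]
    f0           : transmit center frequency [Hz] (assumed value)
    """
    n = len(iq_slow_time)
    # Fast Fourier transform over slow time extracts the Doppler frequency components
    spectrum = np.fft.fftshift(np.fft.fft(iq_slow_time * np.hanning(n)))
    doppler_freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1.0 / prf))
    # Doppler equation: v = fd * c / (2 * f0)
    velocities = doppler_freqs * c / (2.0 * f0)
    return velocities, np.abs(spectrum) ** 2
```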
  • the CF mode image generation unit 166 is an image generation unit that generates an image that displays blood flow information in a predetermined direction.
  • The CF-mode image generation unit 166 generates an image signal indicating blood flow information by obtaining the autocorrelation of a plurality of sound line signals in the same direction among the sound line signals sequentially generated by the phase matching unit 160. After that, based on this image signal, the CF-mode image generation unit 166 superimposes the blood flow information on the B-mode image signal generated by the B-mode image generation unit 162 and generates a CF-mode image (image signal) as a color image.
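  • The autocorrelation step can be illustrated with a lag-one (Kasai-type) estimator as below. The estimator choice and the parameter values are assumptions for illustration; the disclosure only states that an autocorrelation of sound line signals in the same direction is obtained.

```python
import numpy as np

def cf_mode_velocity(iq_ensemble, prf, f0=5e6, c=1540.0):
    """Mean flow velocity per depth from an ensemble of I/Q sound lines.

    iq_ensemble : (n_pulses, n_depth) complex I/Q data for the same direction
    Returns a per-depth mean velocity [m/s]; its sign maps to the two flow colors.
    """
    # Lag-1 autocorrelation along slow time (Kasai-type estimator)
    r1 = np.sum(iq_ensemble[1:] * np.conj(iq_ensemble[:-1]), axis=0)
    mean_doppler = np.angle(r1) * prf / (2.0 * np.pi)
    return mean_doppler * c / (2.0 * f0)
```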
  • the memory controller 151 stores the image signal generated by the B-mode image generation unit 162, the PW-mode image generation unit 164, or the CF-mode image generation unit 166 in the cine memory 150.
  • The DSC 154 is connected to the ASIC 148, converts (raster-converts) the image signal generated by the B-mode image generation unit 162, the PW-mode image generation unit 164, or the CF-mode image generation unit 166 into an image signal conforming to a normal television signal scanning method, performs various necessary image processing such as gradation processing on the image signal, and then outputs it to the ultrasonic image recognition unit 168.
  • The ultrasonic image recognition unit 168 associates the position and orientation of the tip of the ultrasonic endoscope 12 in the body cavity of the subject with a label number based on the observation order of observation target sites such as organs, and learns in advance, for a plurality of learning ultrasonic images, the relationship between a learning ultrasonic image and the label number corresponding to the position and orientation of the tip 40 of the ultrasonic endoscope 12 at the time that learning ultrasonic image was captured.
  • Based on the learning result, the ultrasonic image recognition unit 168 recognizes, from the ultrasonic image raster-converted by the DSC 154, that is, the diagnostic ultrasonic image generated by the ultrasonic observation device 14, the label number corresponding to the position and orientation of the tip portion 40 of the ultrasonic endoscope 12 at the time the diagnostic ultrasonic image was captured.
  • the label number recognized by the ultrasonic image recognition unit 168 is output to the display control unit 172 and the warning generation unit 176 (see FIG. 5), which will be described later.
  • the observation order of the observation target sites is the order of the observation target sites in which ultrasonic images are imaged (observed) in the body cavity of the subject.
  • the observation order of the observation target sites will be described later with an example.
  • the label number is given based on the observation order of the observation target site. For example, assuming that the observation target site having the first observation order is the left lobe of the liver, the left lobe of the liver is given the first label number.
  • the label number does not have to be a "number" as long as the order is known, and may be any label.
  • The learning ultrasonic images, for which the ultrasonic image recognition unit 168 learns the relationship between the ultrasonic image and the label number corresponding to the position and orientation of the tip portion 40 of the ultrasonic endoscope 12 at the time of capturing the ultrasonic image, are existing ultrasonic images; for example, various ultrasonic images captured in the past can be used.
  • the ultrasonic image recognition unit 168 includes a label number detection unit 112, an organ name detection unit 120, and a position and orientation detection unit 122.
  • The label number detection unit 112 detects, from the diagnostic ultrasonic image, the label number corresponding to the position and orientation of the tip portion 40 of the ultrasonic endoscope 12 at the time of capturing the diagnostic ultrasonic image.
  • the label number detection unit 112 is a trained model.
  • This trained model is a model in which, using a data set consisting of learning ultrasonic images capturing different positions in the body of the subject to be observed, each learning ultrasonic image is assigned the label number corresponding to the position and orientation of the tip 40 of the ultrasonic endoscope 12 at the time of capturing that image, and the relationship between the learning ultrasonic image and the label number corresponding to the position and orientation of the tip 40 at the time of capture is learned in advance.
  • The learning method is not particularly limited, as long as it can generate a trained model by learning, from a plurality of learning ultrasonic images, the relationship between the ultrasonic image and the label number corresponding to the position and orientation of the tip 40 of the ultrasonic endoscope 12 at the time of imaging.
  • As the learning method, for example, deep learning using a hierarchical neural network, which is an example of machine learning and one of the technologies of artificial intelligence (AI), can be used.
  • machine learning other than deep learning may be used, artificial intelligence technology other than machine learning may be used, or learning methods other than artificial intelligence technology may be used.
  • the trained model may be generated using only the ultrasonic image for training. In this case, the trained model is not updated and the same trained model can always be used.
  • Alternatively, the diagnostic ultrasonic images may also be used to generate the trained model. In this case, the trained model is updated from time to time by learning the relationship between a diagnostic ultrasonic image and the label number corresponding to the position and orientation of the tip 40 of the ultrasonic endoscope 12 at the time that diagnostic ultrasonic image was captured.
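  • This disclosure does not specify a particular network architecture or training framework for the trained model. As one possible sketch, a small convolutional classifier that maps an ultrasonic image to one of the twelve label numbers could look like the following (PyTorch, with illustrative layer sizes, a 0-based label convention, and a simplified training step, all of which are assumptions).

```python
import torch
import torch.nn as nn

NUM_LABELS = 12  # label numbers (1)-(12) in the observation order

class LabelNumberNet(nn.Module):
    """Small CNN mapping an ultrasonic image to a label number (illustrative architecture)."""
    def __init__(self, num_labels=NUM_LABELS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(8),
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_labels)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

def train_step(model, optimizer, images, label_numbers):
    """images: (B, 1, H, W) learning ultrasonic images; label_numbers: (B,) 0-based labels."""
    criterion = nn.CrossEntropyLoss()
    optimizer.zero_grad()
    loss = criterion(model(images), label_numbers)
    loss.backward()
    optimizer.step()
    return loss.item()
```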
  • Typical observation points in the body include, for example, the following (1) to (12).
  • (1) The left lobe of the liver, (2) the confluence of the aorta, celiac artery, and superior mesenteric artery, (3) the pancreatic body, (4) the tail of the pancreas, (5) the confluence of the splenic vein, superior mesenteric vein, and portal vein, (6) the pancreatic head, and (7) the gallbladder are typical observation points from within the stomach; (8) the portal vein, (9) the common bile duct, and (10) the gallbladder are typical observation points of the duodenal bulb; and (11) the pancreatic gall bladder and (12) the papilla are typical observation points from the descending duodenum.
  • The above observation order of the observation target sites is merely an example, and the observation order may differ slightly depending on the operator. For this reason, a plurality of lists with different observation orders of the observation target sites may be prepared according to the operator, and for each list the relationship between a learning ultrasonic image and the label number corresponding to the position and orientation of the tip 40 of the ultrasonic endoscope 12 at the time of capturing that learning ultrasonic image may be learned for a plurality of learning ultrasonic images, so that the list to be used, that is, the observation order of the observation target sites, can be switched by the operator. Alternatively, the surgeon may be able to register a desired list.
  • The number of observation target sites in a list may be larger than, or conversely smaller than, the number in the above observation order. That is, one or more other observation target sites may be added between a given observation target site and the observation target site that follows it in the observation order, or conversely, one or more observation target sites may be deleted from a plurality of observation target sites whose observation order is consecutive.
  • the organ name detection unit 120 detects the names of the organs corresponding to the label numbers (1) to (12) detected by the label number detection unit 112. Since the label number is associated with the observation order of the observation target site, the name of the observation target site (organ) corresponding to this label number can be obtained from the label number.
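  • Because the label number is tied to the observation order, this organ name detection can be as simple as a lookup table. The dictionary and function names below are hypothetical, and the site names follow the observation order given above (the wording of entry (11) follows this document's rendering).

```python
# Hypothetical lookup: label number -> name of the observation target site (organ).
OBSERVATION_SITES = {
    1: "left lobe of the liver",
    2: "confluence of the aorta, celiac artery, and superior mesenteric artery",
    3: "pancreatic body",
    4: "tail of the pancreas",
    5: "confluence of the splenic vein, superior mesenteric vein, and portal vein",
    6: "pancreatic head",
    7: "gallbladder (from the stomach)",
    8: "portal vein",
    9: "common bile duct",
    10: "gallbladder (duodenal bulb)",
    11: "pancreatic gall bladder (descending duodenum)",
    12: "papilla",
}

def organ_name_for(label_number):
    """Return the site name associated with a detected label number."""
    return OBSERVATION_SITES[label_number]
```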
  • The position and orientation detection unit 122 detects the position and orientation of the tip portion 40 of the ultrasonic endoscope 12 at the time of capturing the diagnostic ultrasonic image, based on the label number ((1) to (12)) detected by the label number detection unit 112. Since the label number is associated with the position and orientation of the tip 40 of the ultrasonic endoscope 12 at the time of capturing the ultrasonic image, the position and orientation of the tip 40 of the ultrasonic endoscope 12 corresponding to this label number can be obtained from the label number.
  • As the position and orientation of the tip 40 of the ultrasonic endoscope 12, the position and orientation of the tip 40 at the time of capturing an ultrasonic image at the observation point corresponding to each of the above label numbers (1) to (12) are detected, that is, at (1) the left lobe of the liver, (2) the confluence of the aorta, celiac artery, and superior mesenteric artery, (3) the pancreatic body, (4) the tail of the pancreas, (5) the confluence of the splenic vein, superior mesenteric vein, and portal vein, (6) the pancreatic head, and (7) the gallbladder (typical observation points from the stomach); (8) the portal vein, (9) the common bile duct, and (10) the gallbladder (typical observation points of the duodenal bulb); and (11) the pancreatic gall bladder and (12) the papilla (typical observation points from the descending duodenum).
  • When the ultrasonic transducer unit 46 is a convex type as in the present embodiment, it is desirable to associate both the position and the orientation of the tip portion 40 of the ultrasonic endoscope 12 with the label number and to detect the position and orientation corresponding to the label number. On the other hand, when the ultrasonic transducer unit 46 is a radial type, it is not necessary to detect the orientation of the tip 40 of the ultrasonic endoscope 12, so it is desirable to associate only the position of the tip 40 of the ultrasonic endoscope 12 with the label number and to detect only the position corresponding to the label number.
  • The operation procedure storage unit 174 stores the operation procedure for moving the tip portion 40 of the ultrasonic endoscope 12 from the observation target site corresponding to one label number to the observation target site corresponding to the label number that follows that label number in the observation order. The operation procedure is output to the display control unit 172.
  • Here, the one label number is a label number corresponding to any one of the observation target sites to be observed according to the observation order, and the label number that follows it in the observation order is the label number corresponding to the observation target site that comes after that observation target site.
  • For example, suppose that the first observation target site, corresponding to the first label number, is the left lobe of the liver, and that the second observation target site, corresponding to the second label number, is the confluence of the aorta, celiac artery, and superior mesenteric artery.
  • In this case, the operation procedure storage unit 174 stores the operation procedure for moving the tip portion 40 of the ultrasonic endoscope 12 from the left lobe of the liver, corresponding to the first label number, to the confluence of the aorta, celiac artery, and superior mesenteric artery, corresponding to the second label number.
  • the operation procedure storage unit 174 similarly stores the operation procedure for the observation target parts corresponding to the second and subsequent label numbers.
  • the operation procedure for the observation target site whose observation order corresponds to the last label number is not stored.
  • the operating procedure includes various instructions for moving the tip 40 of the ultrasonic endoscope 12.
  • instructions such as moving the ultrasonic endoscope 12 forward, turning the ultrasonic endoscope 12 clockwise or counterclockwise, and bending the tip 40 of the ultrasonic endoscope 12 are included.
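  • A minimal sketch of such an operation procedure store is shown below. Only the entry for the third label number echoes the example text shown in FIG. 9; the remaining entries, as well as the names used, are placeholders and not part of this disclosure.

```python
# Hypothetical operation-procedure store: current label number -> instruction for reaching
# the observation target site with the next label number in the observation order.
OPERATION_PROCEDURES = {
    3: "Next observation target site: (4) tail of pancreas. Please turn clockwise along the SV.",
    # 1: "...", 2: "...", ...  (no entry is stored for the last label number)
}

def procedure_for(current_label):
    """Return the stored operation procedure, or None if none is stored (last site)."""
    return OPERATION_PROCEDURES.get(current_label)
```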
  • The warning generation unit 176 issues a warning when the tip portion 40 of the ultrasonic endoscope 12 is moved from the observation target site corresponding to the current label number to an observation target site other than the observation target site whose observation order corresponds to the label number next to the current label number.
  • The current label number is the label number corresponding to the observation target site currently being observed among the observation target sites to be observed according to the observation order, in other words, the label number currently recognized by the ultrasonic image recognition unit 168.
  • The label number next to the current label number in the observation order is the label number corresponding to the observation target site to be observed after the observation target site currently being observed.
  • For example, suppose that the first observation target site, corresponding to the first label number, is the left lobe of the liver, and that the second observation target site, corresponding to the second label number, is the confluence of the aorta, celiac artery, and superior mesenteric artery.
  • While the left lobe of the liver is being observed, the first label number corresponding to the left lobe of the liver is the current label number, and the second label number, corresponding to the confluence of the aorta, celiac artery, and superior mesenteric artery, is the label number next to the current label number in the observation order.
  • If the tip 40 of the ultrasonic endoscope 12 moves from the observation target site corresponding to the current label number to an observation target site other than the observation target site whose observation order corresponds to the label number next to the current label number, the warning generation unit 176 determines that the operator is moving the tip 40 of the ultrasonic endoscope 12 in the wrong direction and issues a warning. When the warning generation unit 176 issues the warning, the operator can notice that the tip 40 of the ultrasonic endoscope 12 is being moved in the wrong direction and can move it in the correct direction instead.
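  • The decision itself reduces to comparing the recognized label number with the current one and the next one in the observation order. The following sketch assumes that the label numbers are consecutive integers following the observation order, which is an assumption made only for illustration.

```python
def check_movement(current_label, recognized_label, last_label=12):
    """Return a warning string when the tip has moved to a site other than the next one.

    current_label    : label number of the site that was being observed
    recognized_label : label number recognized from the newest diagnostic ultrasonic image
    """
    expected_next = current_label + 1 if current_label < last_label else None
    if recognized_label in (current_label, expected_next):
        return None  # still on the current site, or correctly moved to the next site
    return "The moving direction is wrong!"
```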
  • The warning issued by the warning generation unit 176 is output to the display control unit 172 and, under the control of the display control unit 172, a warning such as "The moving direction is wrong!" is displayed on the monitor 20 as character information.
  • the means for issuing the warning is not particularly limited, and for example, the warning may be issued from the speaker as voice information, or both text information and voice information may be issued at the same time as the warning.
  • the movement route registration unit 178 registers in advance a movement route when the tip 40 of the ultrasonic endoscope 12 is ideally moved based on the observation order of the observation target portion. This ideal movement route is output to the display control unit 172.
  • the ideal movement route is a movement route when the tip 40 of the ultrasonic endoscope 12 is correctly operated and moved according to the observation order of the observation target part.
  • the display control unit 172 causes the monitor 20 to display the position and orientation of the tip portion 40 of the ultrasonic endoscope 12 corresponding to the label number recognized by the ultrasonic image recognition unit 168.
  • In addition, according to an instruction from the operator, the display control unit 172 may superimpose and display the lesion area on the endoscopic image, the name of the organ on the ultrasonic image, and the position and orientation of the tip 40 of the ultrasonic endoscope 12 on the anatomical schema diagram.
  • According to an instruction from the operator, the display control unit 172 displays either an endoscopic image in which the lesion area is not displayed or an endoscopic image on which the lesion area is superimposed, and either an ultrasonic image in which the organ name is not displayed or an ultrasonic image on which the organ name is superimposed. The name of the organ is displayed in the vicinity of the organ, for example superimposed on the organ in the ultrasonic image, and the position and orientation of the tip 40 of the ultrasonic endoscope 12 are displayed, for example, superimposed on the anatomical schema diagram.
  • the lesion area is displayed, for example, overlaid on the endoscopic image and surrounded by a frame.
  • the cine memory 150 has a capacity for accumulating image signals for one frame or several frames.
  • the image signal generated by the ASIC 148 is output to the DSC 154, and is also stored in the cine memory 150 by the memory controller 151.
  • the memory controller 151 reads the image signal stored in the cine memory 150 and outputs it to the DSC 154.
  • the monitor 20 displays an ultrasonic image (still image) based on the image signal read from the cine memory 150.
  • the CPU 152 functions as a control unit that controls each part of the ultrasonic observation device 14, and is connected to a reception circuit 142, a transmission circuit 144, an A / D converter 146, an ASIC 148, and the like to control these devices. Specifically, the CPU 152 is connected to the console 100 and controls each part of the ultrasonic observation device 14 according to the inspection information, control parameters, and the like input by the console 100.
  • The CPU 152 automatically recognizes the ultrasonic endoscope 12 by a method such as PnP (Plug and Play).
  • FIG. 7 is a diagram showing a flow of diagnostic processing using the ultrasonic endoscopy system 10.
  • FIG. 8 is a diagram showing the procedure of the diagnostic step during the diagnostic process.
  • the CPU 152 controls each part of the ultrasonic observation device 14 to perform the diagnosis step (S004).
  • The diagnostic step proceeds according to the flow shown in FIG. 8. When the designated image generation mode is the B mode (Yes in S031), each part of the ultrasonic observation device 14 is controlled so as to generate a B-mode image (S032). When the designated image generation mode is not the B mode (No in S031) but is the CF mode (Yes in S033), each part of the ultrasonic observation device 14 is controlled so as to generate a CF-mode image (S034).
  • When the designated image generation mode is not the CF mode (No in S033) but is the PW mode (Yes in S035), each part of the ultrasonic observation device 14 is controlled so as to generate a PW-mode image (S036). If the designated image generation mode is not the PW mode (No in S035), the process proceeds to step S037.
  • the CPU 152 determines whether or not the ultrasonic diagnosis has been completed (S037).
  • If the ultrasonic diagnosis has not been completed (No in S037), the process returns to the diagnosis step S031, and the generation of an ultrasonic image in each image generation mode is repeated until the diagnosis end condition is satisfied.
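  • The S031 to S037 loop can be summarized by the following sketch, in which the callables standing in for device control and for the end condition are placeholders rather than part of this disclosure.

```python
def diagnosis_step(get_mode, diagnosis_ended, generate_b, generate_cf, generate_pw):
    """Pick the image generation mode on each pass until the diagnosis end condition holds."""
    while True:
        mode = get_mode()                 # designated image generation mode
        if mode == "B":                   # S031 -> S032
            generate_b()
        elif mode == "CF":                # S033 -> S034
            generate_cf()
        elif mode == "PW":                # S035 -> S036
            generate_pw()
        if diagnosis_ended():             # S037: e.g., end instructed via the console
            break
```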
  • Examples of the diagnosis end condition include the operator instructing the end of the diagnosis through the operation console 100.
  • When the diagnosis end condition is satisfied, the diagnosis step is completed. Then, returning to FIG. 7, when the power of each part of the ultrasonic endoscope system 10 is turned off (Yes in S006), the diagnostic process is completed. On the other hand, when the power of each part of the ultrasonic endoscope system 10 is maintained in the on state (No in S005), the process returns to the input step S001, and each step of the above-described diagnostic process is repeated.
  • the surgeon can display at least one of the endoscopic image, the ultrasonic image, and the anatomical schema diagram on the screen of the monitor 20.
  • Specifically, according to the instruction from the operator, the display control unit 172 displays, side by side on the screen of the monitor 20, one image or two or more images selected from the endoscopic image (with or without display of the lesion area), the ultrasonic image (with or without display of the organ name), and the anatomical schema diagram (with or without display of the position and orientation of the tip 40 of the ultrasonic endoscope 12).
  • In addition, the display control unit 172 can display one of the two or more images displayed on the monitor 20 as an image of interest, larger than the other images.
  • For example, when the ultrasonic image is displayed on the screen of the monitor 20, the ultrasonic image recognition unit 168 operates, and when the endoscopic image is displayed on the screen of the monitor 20, the endoscopic image recognition unit 170 operates.
  • As a result, an endoscopic image on which the lesion area is superimposed can be displayed on the monitor 20, an ultrasonic image on which the name of the organ is superimposed can be displayed on the monitor 20, or an anatomical schema diagram on which the position and orientation of the tip 40 of the ultrasonic endoscope 12 are superimposed can be displayed on the monitor 20.
  • For example, the surgeon can display an endoscopic image, an ultrasonic image, and an anatomical schema diagram on the screen of the monitor 20, as shown in FIG. 9.
  • An ultrasonic image is displayed from the left to the center of the screen of the monitor 20 shown in FIG. 9, and the names of the organs Panc, PD, SV, and SA are superimposed on the ultrasonic image.
  • Panc stands for pancreas
  • PD stands for pancreatic duct
  • SV stands for splenic vein
  • SA stands for splenic artery.
  • An anatomy schema diagram is displayed in the upper right portion of the screen of the monitor 20, and the position and orientation of the tip portion 40 of the ultrasonic endoscope 12 are superimposed on the anatomy schema diagram.
  • An endoscopic image in which the lesion area is not displayed is displayed in the lower right portion of the screen of the monitor 20.
  • The ultrasonic image is displayed as an image of interest, larger than the anatomical schema diagram and the endoscopic image. Further, in the right center portion of the screen of the monitor 20, between the anatomical schema diagram and the endoscopic image, the operation procedure for moving the tip portion 40 of the ultrasonic endoscope 12 is displayed as character information, such as "Next observation target site: (4) tail of pancreas. Please turn clockwise along the SV."
  • the operator can arbitrarily combine one image or two or more images and display them side by side on the screen of the monitor 20.
  • the surgeon can arbitrarily set the position where the endoscopic image, the ultrasonic image, the anatomical schema diagram, and the operation procedure are arranged.
  • the surgeon can switch and display the image of interest from the images displayed on the monitor 20.
  • <<Display control method by the display control unit>> Next, various display control methods by the display control unit 172 will be described. First, a method of displaying the position and orientation of the tip 40 of the ultrasonic endoscope 12 will be described.
  • the display control unit 172 can display the position and orientation of the tip portion 40 of the ultrasonic endoscope 12 corresponding to the label number recognized by the ultrasonic image recognition unit 168 on the monitor 20 as character information.
  • For example, the display control unit 172 displays on the monitor 20 character information explaining the position and orientation of the tip portion 40 of the ultrasonic endoscope 12, such as "Currently, the tip of the ultrasonic endoscope depicts the direction of the left lobe of the liver from the stomach." Further, the display control unit 172 can display, as character information on the monitor 20, the name of the observation target site corresponding to the label number recognized by the ultrasonic image recognition unit 168.
  • In addition, the display control unit 172 can cause the monitor 20 to display, as image information, an anatomical schema diagram on which the position and orientation of the tip portion 40 of the ultrasonic endoscope 12 corresponding to the label number recognized by the ultrasonic image recognition unit 168 are superimposed.
  • For example, as shown in the upper right part of FIG. 9, the display control unit 172 displays an anatomical schema diagram on the monitor 20, and the position and orientation of the tip 40 of the ultrasonic endoscope 12 are overlaid on the anatomical schema diagram as image information.
  • the display control unit 172 may display the position and orientation of the tip portion 40 of the ultrasonic endoscope 12 on the monitor 20 as character information and display it on the monitor 20 as image information. That is, both the character information and the image information may be displayed at the same time.
  • the display control unit 172 can display the operation procedure for moving the tip portion 40 of the ultrasonic endoscope 12 on the monitor 20.
  • In this case, the display control unit 172 acquires, from the operation procedure storage unit 174, the operation procedure for moving the tip portion 40 of the ultrasonic endoscope 12 from the observation target site corresponding to the current label number to the observation target site whose observation order corresponds to the label number next to the current label number, and displays the acquired operation procedure on the monitor 20.
  • As a result, the operator can correctly move the tip 40 of the ultrasonic endoscope 12 in the body of the subject from the current observation target site to the next observation target site without hesitation.
  • the display control unit 172 can display the operation procedure on the monitor 20 as character information explaining the operation procedure.
  • The operation procedure may include the names of one or more organs that serve as landmarks for the operation, as shown in the right center of FIG. 9.
  • "SV" in “Turn clockwise along the SV” is the name of the organ that serves as a mark of operation.
  • An organ that serves as a landmark for the operation is an organ that acts as a guide when the tip 40 of the ultrasonic endoscope 12 is moved. Examples include organs depicted in the ultrasonic image while the tip 40 of the ultrasonic endoscope 12 is moved, organs displayed within a certain range from the position of the tip 40 of the ultrasonic endoscope 12 in the current ultrasonic image, and one or more organs selected from a plurality of predetermined organs.
  • the display control unit 172 can display the anatomical schema diagram in which the operation procedure is superimposed as image information representing the movement route of the tip portion 40 of the ultrasonic endoscope 12 on the monitor 20.
  • In this case, the display control unit 172 may color, on the anatomical schema diagram, the region of the one or more organs serving as landmarks for the operation, and display the colored anatomical schema diagram on the monitor 20. Further, the display control unit 172 may color, on the anatomical schema diagram, the region of the observation target site whose observation order corresponds to the label number next to the current label number and the region of the one or more organs serving as landmarks for the operation in mutually different colors, and display the anatomical schema diagram colored in this way on the monitor 20.
  • the display control unit 172 may display both the position and orientation of the tip portion 40 of the ultrasonic endoscope 12 and the operation procedure on the monitor 20 at the same time.
  • In this case, the display control unit 172 may further cause the monitor 20 to display, as character information, the name of the observation target site whose observation order corresponds to the label number next to the current label number.
  • "next observation target site: fourth pancreatic tail” is the name of the observation target site corresponding to the next label number. Since the label number is associated with the observation order of the observation target part, the name of the observation target part whose observation order corresponds to the label number next to this label number can be obtained from the label number. As a result, the operator can easily grasp where the next observation target site is.
  • the display control unit 172 adds a check mark to the label number corresponding to the reached observation target part each time the tip 40 of the ultrasonic endoscope 12 reaches the observation target part corresponding to each label number. Then, as shown in FIG. 10, the label number to which the check mark is added may be displayed on the monitor 20 as character information. That is, a check mark is added to the label number corresponding to the observation target portion reached by the tip portion 40 of the ultrasonic endoscope 12. In the case of the example of FIG. 10, a check mark is attached to the label numbers (1) left lobe of the liver, (2) aorta, celiac artery, confluence of superior mesenteric artery, and (3) left side of pancreatic body.
  • From this, it can be seen that the tip 40 of the ultrasonic endoscope 12 has reached the observation target site corresponding to label number (3), the pancreatic body.
  • Alternatively, each time the tip portion 40 of the ultrasonic endoscope 12 reaches the observation target site corresponding to each label number, the display control unit 172 may color, on the anatomical schema diagram, the region of the observation target site that has been reached, and display the anatomical schema diagram with the colored region on the monitor 20. That is, the region of the observation target site reached by the tip portion 40 of the ultrasonic endoscope 12 is colored.
  • As a result, the operator can easily grasp the observation target site of which label number in the observation order the tip 40 of the ultrasonic endoscope 12 has currently reached. That is, the surgeon can confirm that the tip 40 of the ultrasonic endoscope 12 has reached the observation target site corresponding to the label number with the check mark, or the colored observation target site.
  • the operator can easily grasp which observation target site is the observation target site whose observation order corresponds to the label number next to the current label number.
  • The display control unit 172 may emphasize, on the anatomical schema diagram, the region of the observation target site whose observation order corresponds to the label number next to the current label number, and display the anatomical schema diagram with this emphasized region on the monitor 20.
  • For example, the display control unit 172 may color, on the anatomical schema diagram, the region of the observation target site whose observation order corresponds to the label number next to the current label number in a color different from the regions of the other observation target sites; for instance, the regions of the observation target sites other than the observation target site whose observation order corresponds to the label number next to the current label number may be colored darker or lighter than the region of that observation target site, and the anatomical schema diagram colored in this way may be displayed on the monitor 20.
  • As a result, the region of the observation target site whose observation order corresponds to the label number next to the current label number is displayed with more emphasis than the regions of the other observation target sites. Therefore, the surgeon can easily move the tip 40 of the ultrasonic endoscope 12 from the observation target site corresponding to the current label number to the observation target site whose observation order corresponds to the label number next to the current label number.
  • The method of emphasizing the region of the observation target site is not limited to the above. For example, the region of the observation target site to be emphasized may be surrounded by a thick frame, only that region may be colored, or an arrow indicating the observation target site to be emphasized may be attached.
  • The display control unit 172 may acquire, from the movement route registration unit 178, the movement route along which the tip portion 40 of the ultrasonic endoscope 12 is ideally moved based on the observation order of the observation target sites, and may display on the monitor 20, side by side on the anatomical schema diagram, image information representing this ideal movement route and the movement route along which the tip 40 has actually been moved.
  • As a result, the surgeon can move the tip 40 of the ultrasonic endoscope 12 while comparing the ideal movement route with the movement route produced by his or her own operation. Therefore, the surgeon can move the tip 40 of the ultrasonic endoscope 12 so that the actual route matches the ideal route, and as a result, the actual movement route can be brought closer to the ideal movement route.
  • In the present embodiment, the ultrasonic image recognition unit 168 is built into the ultrasonic observation device 14, but the invention is not limited to this; the ultrasonic image recognition unit 168 may, for example, be built into the endoscope processor 16, or may be provided outside both the ultrasonic observation device 14 and the endoscope processor 16.
  • When the ultrasonic image recognition unit 168 is built into the ultrasonic observation device 14 as in the present embodiment, the endoscopic image is transferred from the endoscope processor 16 to the ultrasonic observation device 14, as shown in FIG. 11.
  • On the other hand, when the ultrasonic image recognition unit 168 is built into the endoscope processor 16, the ultrasonic image is transferred from the ultrasonic observation device 14 to the endoscope processor 16, as shown in FIG.
  • When the ultrasonic image recognition unit 168 is provided outside the ultrasonic observation device 14 and the endoscope processor 16, the endoscopic image may be transferred from the endoscope processor 16 to the ultrasonic observation device 14 and then, together with the ultrasonic image, from the ultrasonic observation device 14 to the ultrasonic image recognition unit 168. Alternatively, in this case, the ultrasonic image may be transferred from the ultrasonic observation device 14 to the endoscope processor 16, and the endoscopic image and the ultrasonic image may then be transferred from the endoscope processor 16 to the ultrasonic image recognition unit 168.
  • Alternatively, instead of being transferred from the endoscope processor 16 to the ultrasonic observation device 14 and then from the ultrasonic observation device 14 to the ultrasonic image recognition unit 168, the endoscopic image may be transferred directly from the endoscope processor 16 to the ultrasonic image recognition unit 168.
  • The display control unit 172 is arranged at the stage where the final image signal is output to the monitor 20, that is, between the component that outputs the final image signal and the monitor 20.
  • For example, the display control unit 172 can be built into the ultrasonic observation device 14 or provided between the ultrasonic observation device 14 and the monitor 20. Further, when the ultrasonic image recognition unit 168 is built into the endoscope processor 16, the display control unit 172 can, for example, be built into the endoscope processor 16 or provided between the endoscope processor 16 and the monitor 20. Further, when the ultrasonic image recognition unit 168 is provided outside the ultrasonic observation device 14 and the endoscope processor 16, the display control unit 172 can be provided, for example, outside the ultrasonic observation device 14 and the endoscope processor 16.
  • The display control unit 172, in response to instructions from the operator, displays side by side on the screen of the monitor 20 one image or two or more images selected from the endoscopic image (with or without display of the lesion area), the ultrasonic image (with or without display of the organ name), and the anatomical schema diagram (with or without display of the position and orientation of the tip portion 40 of the ultrasonic endoscope 12).
  • The location of the endoscopic image recognition unit 170 can be determined in the same manner as that of the ultrasonic image recognition unit 168. That is, in the present embodiment the endoscopic image recognition unit 170 is built into the endoscope processor 16, but the invention is not limited to this; the unit may, for example, be built into the ultrasonic observation device 14, or may be provided outside both the ultrasonic observation device 14 and the endoscope processor 16.
  • In other words, the positions of the ultrasonic image recognition unit 168 and the endoscopic image recognition unit 170 are not fixed, and they can be provided at any arrangement location.
  • The hardware configuration of the processing units that execute various processes, such as the endoscopic image recognition unit 170 (the lesion area detection unit 102, the position information acquisition unit 104, the selection unit 106, and the lesion area detection control unit 108) and the ultrasonic image recognition unit 168 (the label number detection unit 112, the organ name detection unit 120, and the position and orientation detection unit 122), may be dedicated hardware, or may be various processors or computers that execute programs. Further, the hardware configuration of the operation procedure storage unit 174 and the movement route registration unit 178 may be dedicated hardware or a memory such as a semiconductor memory.
  • The various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) and functions as various processing units; a programmable logic device (PLD), such as an FPGA (Field Programmable Gate Array), whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed exclusively for performing specific processing.
  • One processing unit may be composed of one of these various processors, or of a combination of two or more processors of the same type or different types, for example a combination of a plurality of FPGAs or a combination of an FPGA and a CPU. Further, a plurality of processing units may be configured by one of the various processors, or two or more of the plurality of processing units may be collectively configured by using one processor.
  • For example, as typified by a System on Chip (SoC), a plurality of processing units may be realized by a single processor. More specifically, the hardware configuration of these various processors is circuitry in which circuit elements such as semiconductor elements are combined.
  • the method of the present invention can be carried out, for example, by a program for causing a computer to execute each step. It is also possible to provide a computer-readable recording medium on which this program is recorded.
  • 10 Ultrasonic endoscope system
  • 12 Ultrasonic endoscope
  • 14 Ultrasonic observation device
  • 16 Endoscope processor
  • 18 Light source device
  • 20 Monitor
  • 21a Water supply tank
  • 21b Suction pump
  • 22 Insertion unit
  • 24 Operation unit
  • 26 Universal cord
  • 28a Air supply and water supply button
  • 28b Angle knob
  • 30 Treatment tool insertion port
  • 32a Ultrasonic connector
  • 32b Endoscope connector
  • 32c Light source connector
  • 34a Air supply and water supply tube
  • 34b Suction tube
  • 36 Ultrasonic observation part
  • 40 Tip part
  • 42 Curved part
  • 43 Flexible part
  • 44 Treatment tool outlet
  • 45 Treatment tool channel
  • 46 Ultrasonic transducer unit
  • 48 Ultrasonic transducer
  • 50 Ultrasonic transducer array
  • 54 Backing material layer
  • 56 Coaxial cable
  • 60 FPC
  • 74 Acoustic matching layer
  • 76 Acoustic lens
  • 82 Observation window
  • 84 Objective lens
  • 86 Solid-state image sensor
  • 88 Illumination window

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultra Sonic Daignosis Equipment (AREA)

Abstract

The invention relates to an ultrasound endoscope system and an operating method for an ultrasound endoscope system comprising: an ultrasound image recognition unit that associates the position of the distal end of an ultrasound endoscope in a body cavity of a subject with a label number based on the observation order of the sites to be observed, learns in advance, for a plurality of learning ultrasound images, the relationship between a learning ultrasound image and the label number corresponding to the position of the distal end of the ultrasound endoscope at the time the learning ultrasound image was captured, and recognizes, from a diagnostic ultrasound image and on the basis of the learning result, the label number corresponding to the position of the distal end of the ultrasound endoscope at the time the diagnostic ultrasound image was captured; and a display control unit that displays, on a monitor, the position of the distal end of the ultrasound endoscope corresponding to the label number recognized by the ultrasound image recognition unit.
PCT/JP2020/025725 2019-08-27 2020-06-30 Système endoscopique à ultrasons et procédé de fonctionnement de système endoscopique à ultrasons WO2021039101A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021542583A JP7158596B2 (ja) 2019-08-27 2020-06-30 超音波内視鏡システムおよび超音波内視鏡システムの作動方法
CN202080060272.4A CN114302679A (zh) 2019-08-27 2020-06-30 超声波内窥镜系统及超声波内窥镜系统的工作方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019154475 2019-08-27
JP2019-154475 2019-08-27

Publications (1)

Publication Number Publication Date
WO2021039101A1 true WO2021039101A1 (fr) 2021-03-04

Family

ID=74683660

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/025725 WO2021039101A1 (fr) 2019-08-27 2020-06-30 Système endoscopique à ultrasons et procédé de fonctionnement de système endoscopique à ultrasons

Country Status (3)

Country Link
JP (1) JP7158596B2 (fr)
CN (1) CN114302679A (fr)
WO (1) WO2021039101A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023054467A1 (fr) * 2021-09-30 2023-04-06 テルモ株式会社 Procédé de génération de modèle, modèle d'apprentissage, programme informatique, procédé de traitement d'informations et dispositif de traitement d'informations

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130035596A1 (en) * 2011-07-14 2013-02-07 Siemens Corporation Model-based positioning for intracardiac echocardiography volume stitching
WO2017195540A1 (fr) * 2016-05-12 2017-11-16 株式会社日立製作所 Dispositif d'imagerie ultrasonore, dispositif de traitement d'image et procédé associé

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102639049B (zh) * 2010-09-29 2014-11-26 奥林巴斯医疗株式会社 信息处理装置以及胶囊型内窥镜系统
JP6625746B2 (ja) * 2016-06-30 2019-12-25 富士フイルム株式会社 超音波内視鏡、及びその製造方法
CN107886503A (zh) * 2017-10-27 2018-04-06 重庆金山医疗器械有限公司 一种消化道解剖位置识别方法及装置

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130035596A1 (en) * 2011-07-14 2013-02-07 Siemens Corporation Model-based positioning for intracardiac echocardiography volume stitching
WO2017195540A1 (fr) * 2016-05-12 2017-11-16 株式会社日立製作所 Dispositif d'imagerie ultrasonore, dispositif de traitement d'image et procédé associé

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023054467A1 (fr) * 2021-09-30 2023-04-06 テルモ株式会社 Procédé de génération de modèle, modèle d'apprentissage, programme informatique, procédé de traitement d'informations et dispositif de traitement d'informations

Also Published As

Publication number Publication date
CN114302679A (zh) 2022-04-08
JP7158596B2 (ja) 2022-10-21
JPWO2021039101A1 (fr) 2021-03-04

Similar Documents

Publication Publication Date Title
JP6899804B2 (ja) 超音波診断装置および超音波診断装置の作動方法
US20210369238A1 (en) Ultrasound endoscope system and method of operating ultrasound endoscope system
JP7265593B2 (ja) 超音波システム、及び、超音波画像生成方法
JP2022040175A (ja) 超音波診断装置および超音波診断装置の作動方法
WO2021039101A1 (fr) Système endoscopique à ultrasons et procédé de fonctionnement de système endoscopique à ultrasons
JP2021035442A (ja) 超音波診断システムおよび超音波診断システムの作動方法
JP7157710B2 (ja) 計測装置、超音波診断装置、計測方法、計測プログラム
US20200245978A1 (en) Failure diagnosis system of ultrasonic endoscope apparatus, failure diagnosis method of ultrasonic endoscope apparatus, and failure diagnosis program of ultrasonic endoscope apparatus
JP7094237B2 (ja) 超音波診断システムおよび超音波診断システムの作動方法
US20200305834A1 (en) Ultrasound observation apparatus and ultrasonic endoscope system
JP6987029B2 (ja) 超音波診断装置、及び、超音波診断装置の作動方法
JP7041014B2 (ja) 超音波診断装置、及び、超音波診断装置の作動方法
JP7292184B2 (ja) 学習装置、学習方法および学習済みモデル
JP7301114B2 (ja) 超音波診断装置、及び、超音波診断装置の作動方法
JP7300029B2 (ja) 超音波診断装置、及び、超音波診断装置の作動方法
WO2021019851A1 (fr) Dispositif de mesure, dispositif de diagnostic à ultrasons, procédé de mesure et programme de mesure
WO2023053662A1 (fr) Système d'endoscope à ultrasons et procédé de fonctionnement de système d'endoscope à ultrasons
JP2022132940A (ja) 内視鏡及び内視鏡システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20857666

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021542583

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20857666

Country of ref document: EP

Kind code of ref document: A1