WO2020116992A1 - Ultrasound imaging device having a needle therapy guidance function using a marker - Google Patents


Info

Publication number
WO2020116992A1
Authority
WO
WIPO (PCT)
Prior art keywords
needle
anatomical region
ultrasound
information
unit
Prior art date
Application number
PCT/KR2019/017197
Other languages
English (en)
Korean (ko)
Inventor
이상훈
전영주
전민호
김대혁
김소영
Original Assignee
한국 한의학 연구원 (Korea Institute of Oriental Medicine)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국 한의학 연구원 (Korea Institute of Oriental Medicine)
Publication of WO2020116992A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08: Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833: Detecting organic movements or changes involving detecting or locating foreign bodies or organic structures
    • A61B 8/0841: Detecting organic movements or changes involving detecting or locating foreign bodies or organic structures, for locating instruments
    • A61B 8/46: Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461: Displaying means of special interest
    • A61B 8/467: Devices characterised by special input means
    • A61B 8/469: Devices characterised by special input means for selection of a region of interest
    • A61B 8/54: Control of the diagnostic device

Definitions

  • U.S. Patent No. 9,292,654 discloses an ultrasound imaging device that provides tutorial information. Instruction information including text, images, and videos is provided for each operation step of the ultrasound imaging apparatus. In addition, reference parameters of a setting status corresponding to each operation step are prepared, and they are set according to the operation step.
  • US Patent Publication No. 2014/0039304A9 discloses a technique that detects blood flow by driving an ultrasound imaging device in a Doppler mode, generates location information of the blood flow, and warns when the needle approaches the blood vessel.
  • An ultrasonic imaging device suitable for needle surgery is proposed.
  • Guidance information suitable for users who do not have much experience in the operation of the ultrasound imaging apparatus is provided for each region to be treated.
  • when the needle approaches a dangerous region, an ultrasound imaging device recognizes this and provides a warning.
  • a method is proposed in which the ultrasound imaging device can detect the needle entering a specific anatomical region more quickly and reliably.
  • a method for reducing the time required for image recognition is proposed by storing and recycling specific anatomical area information recognized for a patient for each acupuncture point.
  • a needle surgical technique is proposed that can further reduce the time required for image recognition by recognizing the location of the marker fixed on the surface of the patient's skin and reusing the specific anatomical region information stored based on the location of the marker.
  • specific anatomical region information is prepared and stored in advance for each needle treatment site.
  • a specific anatomical region refers to a region of biological tissue to which attention must be paid, such as a specific organ or nerve.
  • a specific anatomical region may refer to an organ in which the needle is to be treated, an area of danger to which the needle should be avoided, or an area of an organ prone to damage.
  • One or more markers are fixed near the patient's desired site. The position of the needle is tracked by the positioning system.
  • specific anatomical region information corresponding to the selected treatment region is extracted from the stored specific anatomical region information, and the ultrasound image is recognized with the help of this information to determine the anatomical region in the actual image.
  • the determined specific anatomical region information is stored together with the marker location information. A warning is issued if the needle position is close to a specific anatomical region.
  • the ultrasound imaging system extracts and utilizes specific anatomical region information of the patient's corresponding treatment site. This particular anatomical region information can be easily matched to the actual currently acquired ultrasound image using the detected location information of the marker.
  • a warning may be output when the position of the tip of the needle is close to a specific anatomical region even if it is outside the range of the ultrasound image displayed on the screen.
  • according to the proposed invention, it is possible to reduce the risk of the needle procedure. It is also advantageous for real-time processing because entry into the anatomical region can be detected quickly and accurately. Furthermore, when the same procedure is performed again on the same patient, the specific anatomical region information previously recognized and stored for that procedure region can be re-matched to the ultrasound image using markers fixed on the skin surface, reducing the time required to prepare for the procedure.
  • FIG. 1 shows the configuration of a needle procedure system according to an embodiment.
  • FIG. 2 is a block diagram showing the configuration of a probe according to an embodiment.
  • FIG. 3 is a block diagram showing the configuration of an ultrasound imaging apparatus for needle surgery according to an embodiment.
  • FIG. 4 is a block diagram showing the configuration of an anatomical region determining unit according to an embodiment.
  • FIG. 5 is a block diagram showing the configuration of a needle position detection unit according to another embodiment.
  • FIG. 6 is a flowchart illustrating a configuration of an ultrasound image display control method according to an embodiment.
  • FIG. 7 is a flow chart showing the configuration of one embodiment of the sensor-based position calculation step.
  • each block should be understood as representing various embodiments formed by adding combinations of one, two, or more non-essential blocks to the essential blocks.
  • the needle treatment system includes an ultrasound imaging apparatus 50, a probe 10, and a needle 30.
  • the operator selects the treatment site, and in response to the user's manipulation through the operation unit, the main body of the ultrasound imaging apparatus 50 displays on the screen a document explaining the needle treatment method for the treatment site, a 2D or 3D image showing the anatomical structure of the treatment site, and a graphic animation or video showing the needle procedure for the corresponding acupuncture points.
  • a positioning system for positioning the needle 30 is mounted. On the screen of the ultrasound imaging apparatus 50, the position of the needle 30 detected by the positioning system is displayed on the image scanned by the probe 10.
  • the marker 70 is fixed to the skin surface at the acupuncture point of the patient to be treated.
  • the marker 70 can be anything that is clearly recognizable in an ultrasound image and remains fixed to a certain extent.
  • the marker 70 is not limited to something attached to the skin surface; it may be, for example, a pattern drawn with a paint or pen, made of a material that absorbs or reflects ultrasound. As another example, it may be a shape embedded in the skin.
  • a probe according to an embodiment includes an ultrasonic element array 250 and an ultrasonic element driver 225 driving the ultrasonic element array 250. Additionally, the probe according to an embodiment may further include an acceleration sensor 230 and an acceleration sensor driver 223. The acceleration sensor 230 may be, for example, a gyro sensor. The absolute position of the probe can be calculated using an acceleration sensor. Additionally, the probe according to an embodiment may further include magnetic position sensors 210-1, 210-2, ..., 210-n, and a magnetic sensor driver 221 for driving them. In one embodiment, the needle positioning system is a magnetic force based positioning system.
  • the needle is magnetized, and a plurality of magnetic position sensors provided in the probe measure the intensity of the magnetic field to calculate the position of the magnetized needle.
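The magnetic localization step above can be sketched in code. This is a hypothetical illustration only: the patent does not disclose the sensor layout, field model, or solver, so the dipole-like 1/r³ falloff, the constant `FIELD_CONSTANT`, the sensor coordinates, and the linearized trilateration are all assumptions.

```python
import numpy as np

# Assumed field model: the magnetized needle tip produces a field whose
# magnitude falls off as |B| = k / r**3 (a crude dipole approximation).
FIELD_CONSTANT = 1.0

def distance_from_field(b_magnitude: float, k: float = FIELD_CONSTANT) -> float:
    """Invert the assumed |B| = k / r**3 falloff to get a distance."""
    return (k / b_magnitude) ** (1.0 / 3.0)

def trilaterate(sensors: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """Linearized least-squares trilateration from four or more sensors."""
    p0, r0 = sensors[0], distances[0]
    # Subtracting the first sphere equation from the others yields a
    # linear system A @ x = b in the unknown needle position x.
    A = 2.0 * (sensors[1:] - p0)
    b = (r0**2 - distances[1:]**2
         + np.sum(sensors[1:]**2, axis=1) - np.sum(p0**2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Four hypothetical sensors in the probe face, needle tip at a known point.
sensors = np.array([[0.0, 0, 0], [4, 0, 0], [0, 4, 0], [4, 4, 1]])
true_tip = np.array([1.0, 2.0, 3.0])
dists = np.linalg.norm(sensors - true_tip, axis=1)
fields = FIELD_CONSTANT / dists**3                  # simulated |B| readings
est = trilaterate(sensors, np.array([distance_from_field(f) for f in fields]))
```

With noise-free simulated readings the estimate recovers the true tip position; a real system would fit many noisy readings and a full dipole model.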
  • the needle positioning system is an RF based positioning system.
  • the RF transmitter is fixed to the needle, and a plurality of receivers provided in the probe measure the intensity of the RF field to calculate the position of the needle.
  • the proposed invention is not limited to a specific positioning system, and may be one of known positioning systems not based on ultrasound images.
  • the ultrasound imaging apparatus for needle treatment provides an image of a biological tissue at the treatment site during needle treatment.
  • the ultrasound imaging apparatus for needle treatment includes an ultrasound signal processing unit 370, a needle position detection unit 330, and a procedure risk warning unit 310.
  • the ultrasound signal processing unit 370, the needle position detection unit 330, and the procedure risk warning unit 310 may each be implemented, in whole or in part, by program instructions executed on one or more computational elements such as a microprocessor or a digital signal processor.
  • the computing elements process data by reading and executing program instructions stored in the storage unit 320.
  • the storage 320 may be, for example, one of, or a combination of, mass storage such as a hard disk, SSD, or network storage, and memory such as nonvolatile memory or volatile RAM.
  • the ultrasound signal processing unit 370 generates an image by processing an ultrasound signal obtained by driving an array of ultrasound elements.
  • the imaging mode of the ultrasound imaging apparatus includes a B mode (brightness mode), an M mode (motion mode), a D mode (Doppler mode) using a Doppler effect, and a C mode (color Doppler mode).
  • the ultrasonic signal processing unit 370 drives the ultrasonic element array 250 in one of these modes to generate ultrasonic image information from the output signal.
  • the needle position detection unit 330 calculates the position of the needle from at least two sensor inputs.
  • U.S. Patent No. 9,597,008, issued to EZONO AG on March 21, 2017, discloses a technique that measures the strength of the magnetic field of a magnetized needle using a plurality of magnetometric detectors and uses it to locate the needle. Needle positioning systems from EZONO AG are commercially available, and the applicants are using this technology in developing the proposed invention in cooperation with EZONO AG.
  • the magnetic positioning system of EZONO AG uses a plurality of magnetic position detection elements built into the probe, and can calculate the position of the needle and output it superimposed on the ultrasound image.
  • the proposed invention is not limited to this, and various known positioning techniques, such as the above-described RF method, an anchor method, and a positioning system using radio wave fingerprint, can be applied.
  • the probe 10 includes an acceleration sensor 230, for example a gyro sensor.
  • the probe position detector 350 measures the position and posture of the probe, that is, the tilted direction, from the output of the acceleration sensor 230.
  • the needle position detection unit 330 first calculates the position of the needle 30 in the reference coordinate system of the probe 10 from the output of the magnetic position detection elements 210. Thereafter, the needle position detection unit 330 converts the coordinate system using the position and posture information of the probe detected by the probe position detection unit 350 to calculate the position of the needle 30 in the reference coordinate system.
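The coordinate conversion described above is the standard rigid-body transform. A minimal sketch follows; the yaw-only rotation is an illustrative stand-in for the full probe attitude that would come from the acceleration (gyro) sensor, and all numbers are invented.

```python
import numpy as np

def rotation_z(yaw: float) -> np.ndarray:
    """Rotation about the z axis; stand-in for the full probe attitude."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def to_reference_frame(needle_in_probe, probe_position, probe_rotation):
    """x_ref = R @ x_probe + t: the usual rigid-body coordinate change."""
    return probe_rotation @ np.asarray(needle_in_probe) + np.asarray(probe_position)

# Probe tilted 90 degrees about z and shifted by (10, 0, 0) in the
# reference frame; needle tip measured at (1, 0, 2) in the probe frame.
R = rotation_z(np.pi / 2)
t = np.array([10.0, 0.0, 0.0])
tip_ref = to_reference_frame([1.0, 0.0, 2.0], t, R)
```

In practice R and t would be updated continuously from the probe position detector 350 as the operator moves the probe.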
  • the acceleration sensor 230 of the probe 10 may be unnecessary to detect the needle position.
  • the operation risk warning unit 310 determines a specific anatomical region in 3D from the ultrasound image generated by the ultrasound signal processing unit 370, and outputs a warning signal when the position of the needle tip, according to the needle position calculated by the needle position detection unit 330, approaches the specific anatomical region.
  • the procedure risk warning unit 310 includes a marker position detection unit 312, an anatomical region determination unit 313, and a risk determination unit 311.
  • the marker position detection unit 312 recognizes a marker from an image output from the ultrasound signal processing unit 370.
  • One, two, or three markers can be fixed. If only one marker is fixed, the location of a specific skin surface near the treatment site can be determined. When two or more markers are fixed, both the position and orientation of a specific skin surface near the treatment site can be determined. When the marker is on the skin surface, the ultrasound signal processing unit 370 can more easily calculate the position of the marker on a plane.
  • the marker position detection unit 312 identifies the marker from the ultrasound volume image.
  • the position of the marker can be a reference for defining the 3D space.
  • the ultrasound imaging apparatus may start positioning the corresponding treatment site of the patient based on the marker.
  • the scanned volume images can be identified based on the marker location.
  • the position of the needle detected by the needle position detector 330 may be calculated based on the marker position. Detecting a clearly visible marker in an ultrasound image and setting a position in a space identified by an acceleration sensor can be processed relatively accurately and quickly in image processing.
  • the calculated position information of the marker may be stored in the database as a part of a record for a corresponding procedure of the storage unit 320.
  • the marker may include identification information. Numerals or barcodes recorded with ultrasound sensitive inks that can be identified on ultrasound images can help to quickly identify the patient and the site of the procedure.
  • the identification information of the recognized marker may be stored in the database as a part of a record for a corresponding procedure of the storage unit 320.
  • the anatomical region determination unit 313 determines a specific anatomical region in 3D from the ultrasound image generated by the ultrasound signal processing unit 370.
  • the anatomical region determining unit 313 generates a volume image from an ultrasound image generated by scanning a periphery of a treatment subject with a probe.
  • the surgical risk warning unit 310 processes a 3D volume image to recognize the boundary of a biological tissue such as a bone or an organ, and recognizes a specific anatomical region such as a blood vessel or an organ during a needle procedure.
  • when the anatomical region determination unit 313 calculates the specific anatomical region information of the operator's selected surgical site, it stores it in the database as part of the record of the corresponding subject's procedure in the storage unit 320.
  • the stored specific anatomical region information may be extracted when the same subject is subsequently operated on the same region and used for re-recognition of the corresponding anatomical region. In this case, the anatomical region recognition can be processed much faster and simpler by matching processing.
  • the anatomical region determining unit 313 uses the specific anatomical region information previously detected for the same marker and the location information of the marker detected by the marker location detector 312 to generate information for the corresponding anatomical region of the patient.
  • when the position of the marker is calculated and the specific anatomical region is recognized in the ultrasound image obtained at the procedure site, the ultrasound imaging apparatus stores them in the database of the storage unit 320 as part of the record of the corresponding procedure.
  • the position of the marker and the specific anatomical region information may be independently stored based on the probe coordinate system.
  • a position relative to a corresponding anatomical region of a marker may be stored.
  • a position relative to a corresponding marker in a specific anatomical region may be stored.
  • the ultrasound imaging apparatus may further include a relative position calculator 314.
  • the relative position calculating unit 314 calculates a position relative to the anatomical region of the marker or a position relative to the marker of the anatomical region and stores the relative location of the marker in the storage unit 320.
  • anatomical region information of the patient is generated using the anatomical region information previously detected and stored for the same marker and the location information of the marker detected by the marker location detector 312. If the position of the marker has not changed and the anatomy of the subject has not changed significantly, the position and direction of the corresponding anatomical region can be easily determined from the currently detected position of the marker using the stored information.
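The re-use of stored information can be sketched as a change of marker-relative coordinates. This is an illustrative 2D simplification under assumptions not stated in the patent: two markers are visible, the skin surface is treated as a plane, and a region centre is stored relative to a frame built from the markers so it can be recovered when the markers are found again.

```python
import numpy as np

def marker_frame_2d(m1, m2):
    """Frame with origin at marker 1 and x axis toward marker 2."""
    m1, m2 = np.asarray(m1, float), np.asarray(m2, float)
    x_axis = (m2 - m1) / np.linalg.norm(m2 - m1)
    y_axis = np.array([-x_axis[1], x_axis[0]])   # in-plane perpendicular
    return m1, np.stack([x_axis, y_axis])        # origin, rotation rows

def to_marker_coords(point, origin, rot):
    """Express a point in the marker-relative frame (for storage)."""
    return rot @ (np.asarray(point, float) - origin)

def from_marker_coords(local, origin, rot):
    """Recover the point in the current session's coordinates."""
    return rot.T @ np.asarray(local, float) + origin

# First session: store a region centre relative to the markers.
o1, r1 = marker_frame_2d([0, 0], [2, 0])
stored = to_marker_coords([1.0, 1.0], o1, r1)
# Later session: the same markers are found shifted and rotated, and the
# stored relative coordinates reproduce the region's current position.
o2, r2 = marker_frame_2d([5, 5], [5, 7])
recovered = from_marker_coords(stored, o2, r2)
```

The same idea extends to 3D with three markers, matching the relative position calculator 314 described above.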
  • the anatomical region can be determined by confirming the estimated region through matching against an actually acquired ultrasound volume image or one to several ultrasound images. This can be applied more easily when the procedure is repeated over a certain period or when a small area is treated. For example, when applied to acupuncture in the field of oriental medicine, this aspect can reduce processing time and increase reliability by reducing the burden of image processing.
  • the risk determination unit 311 outputs a warning signal when the position of the needle tip, according to the needle position calculated by the needle position detection unit 330, approaches a specific anatomical region. If the perpendicular distance from the tip of the needle to the boundary of a specific anatomical region is below a certain threshold, or if the tip of the needle is moving toward the corresponding anatomical region at more than a certain speed, a warning message is generated and output.
  • the warning message can be output on the screen and/or audio.
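The two warning conditions above can be sketched as follows. This is a hedged illustration: the anatomical region is approximated by a sphere (centre plus radius), whereas the patent uses general 3D modeling information, and both thresholds are invented values.

```python
import numpy as np

DIST_THRESHOLD = 2.0   # mm; proximity threshold (assumed)
SPEED_THRESHOLD = 5.0  # mm/s; fast-approach threshold (assumed)

def should_warn(tip, velocity, region_centre, region_radius):
    """Warn if the tip is near the region boundary, or moving fast toward it."""
    tip = np.asarray(tip, float)
    to_centre = np.asarray(region_centre, float) - tip
    boundary_dist = np.linalg.norm(to_centre) - region_radius
    if boundary_dist <= DIST_THRESHOLD:        # condition 1: proximity
        return True
    speed = np.linalg.norm(velocity)
    if speed >= SPEED_THRESHOLD:               # condition 2: fast approach
        if np.dot(velocity, to_centre) > 0:    # heading toward the region
            return True
    return False

# Tip 10 mm from a 3 mm-radius vessel, moving slowly away: no warning.
print(should_warn([0, 0, 0], [0, 0, -1], [0, 0, 10], 3.0))
# Same geometry but moving quickly toward the vessel: warning.
print(should_warn([0, 0, 0], [0, 0, 8], [0, 0, 10], 3.0))
```

Because the check runs against stored 3D region information rather than the displayed image, it naturally covers the off-screen case described below.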
  • the needle position is measured by an independent positioning system that does not rely on ultrasound imaging.
  • 3D modeling information of the anatomical region is calculated and stored from an ultrasound volume image. Since the operation risk warning unit 310 compares the position of the needle tip with the stored 3D modeling information of the specific anatomical region, a warning may be output when the needle approaches the corresponding anatomical region even if the needle tip is outside the display range of the ultrasound image generated by the ultrasound signal processing unit 370 and shown on the screen.
  • the ultrasound imaging apparatus may further include a storage unit 320 and a user interface unit 390.
  • operation guide information including operation guide content for each operation part is stored.
  • the user interface unit 390 extracts and provides a corresponding operation guide content from the storage unit when a treatment site is selected.
  • the treatment site is determined according to the acupuncture points.
  • a corresponding treatment site is selected.
  • the area of acupuncture for needle treatment is defined in the field of Korean medicine. Accurate treatment of acupuncture points requires a lot of experience and anatomical knowledge.
  • the proposed invention can be applied particularly well in this field.
  • the manipulation guide content is prepared as a database having a tree structure for each major and sub-classification of the treatment site.
  • the operation guide content includes document information describing the method of needle treatment at the corresponding treatment site, 2D or 3D image information showing the anatomical structure of the treatment region, and graphic animation or video showing the needle treatment process at the treatment region.
  • the document information may include a brief description of the corresponding needle procedure, information about the patient posture and the location where the probe is to be located, and a specific method of accurately locating the treatment region using ultrasound.
  • the user interface unit 390 receives a user's operation command through a control unit such as a mouse, touch pad, or keyboard, and outputs ultrasound images or system-generated information as visual information through a screen or as audio information through a speaker.
  • the user interface unit 390 includes an instruction information providing unit 391.
  • the instruction information providing unit 391 receives the selection of a treatment site from the operator and extracts and provides the corresponding operation guide content from the storage unit 320.
  • the user selects one of the major categories through, for example, a touch input, and then selects one of the sub-categories to select a treatment site.
  • the user interface unit 390 extracts the operation guide content of the selected treatment site from the database. Document information describing the method of needle treatment at the selected treatment site, 2D or 3D image information showing the anatomical structure of the treatment site, and graphic animation or video showing a needle treatment process at the treatment site are provided according to a user's selection.
  • the image display control unit 360 combines and displays the ultrasound image generated by the ultrasound signal processing unit 370, the instruction information provided by the user interface unit 390, and the position of the needle detected by the needle position detection unit 330.
  • the image display control unit 360 overlays and displays a needle image generated as a graphic reflecting the position of the needle detected by the needle position detection unit 330 on the ultrasound image generated by the ultrasound signal processing unit 370.
  • the image display control unit 360 may simultaneously display the instruction information provided by the user interface unit 390 on the ultrasound image generated by the ultrasound signal processing unit 370 by dividing the screen.
  • the anatomical region determining unit 313 may include an anatomical region recognition unit 315 and an anatomical region re-recognition unit 316.
  • the anatomical region recognition unit 315 determines an anatomical region from standard specific anatomical region information corresponding to the selected treatment site and real-time ultrasound images generated by the ultrasound signal processing unit.
  • the procedure guide information stored in the storage unit 320 may further include specific anatomical region information for each treatment region.
  • the specific anatomical region information may be, for example, image patterns or feature information capable of distinguishing specific biological tissues, for example, blood vessels, from an ultrasound image.
  • three-dimensional modeling of an organ existing near a specific surgical site, and image patterns that appear when each modeling is viewed on an ultrasound image may be specific anatomical region information.
  • a function value is compared with a reference value to determine a blood vessel region pattern value. This may be specific anatomical area information.
  • an ultrasound image has a limited resolution, and thus it is not easy to recognize a specific biological tissue from the image. If the treatment location is limited to a very narrow location, the treatment site can be more easily specified.
  • information such as the image pattern or features of the corresponding biological tissue commonly seen in ultrasound images, for example characteristics of the luminance distribution or the shape of a boundary line, can increase the accuracy or likelihood of success in recognizing the anatomical region in the ultrasound image.
  • specific anatomical region information may be three-dimensional region information of biological tissue.
  • the standard 3D region information of the biological tissue indicates the shape of the corresponding tissue, and the overall 3D region information may serve as a map for finding the tissue in an ultrasound image.
  • the anatomical region recognition unit 315 first stores a plurality of images obtained by scanning the probe 10 around the selected treatment area, and acquires three-dimensional information surrounding the treatment area from the plurality of images. This 3D information is modeled as a 3D volume image, and a specific anatomical region is recognized in the ultrasound volume image using the stored specific anatomical region information. In this embodiment, the acquired 3D image is zoomed and shifted, through feature point recognition, to the same scale and position as the stored 3D modeling information of the dangerous anatomical region. Subsequently, the transformed 3D ultrasound image is partitioned into blocks.
  • a correlation function is calculated for a pattern around a position in an image corresponding to specific anatomical region information and a block of the ultrasound image to find a 3D model in the ultrasound image.
  • 3D anatomical region modeling information may be generated from the ultrasound volume image obtained by repeating the above process.
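The block-correlation search described above can be sketched in two dimensions. This is a simplification under assumptions: normalized cross-correlation is used as the correlation function (the patent does not specify one), the search is an exhaustive 2D scan rather than a 3D block partition, and all data is synthetic.

```python
import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a**2).sum() * (b**2).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def find_pattern(image: np.ndarray, template: np.ndarray):
    """Scan all template-sized blocks and return the best-matching offset."""
    th, tw = template.shape
    best, best_pos = -2.0, (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            score = ncc(image[y:y + th, x:x + tw], template)
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best

rng = np.random.default_rng(0)
image = rng.random((32, 32))                 # stand-in ultrasound slice
template = image[10:18, 5:13].copy()         # stored pattern, known offset
pos, score = find_pattern(image, template)
```

A production implementation would use FFT-based correlation and coarse-to-fine search rather than this exhaustive scan.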
  • the proposed invention is not limited to this embodiment, and includes various techniques for detecting an anatomical region in an ultrasound image of a corresponding treatment site using information stored for each treatment site.
  • the treatment site can be specified.
  • since biological tissues differ slightly from person to person, information such as the image pattern or features of the corresponding tissue commonly seen in ultrasound images, for example characteristics of the luminance distribution or the shape of a boundary line, can increase processing speed or accuracy when recognizing the anatomical region in the ultrasound image.
  • the user interface unit 390 may further include a display setting unit 393 for each site.
  • the procedure guide information stored in the storage unit 320 may further include setting parameters for each treatment site.
  • the display setting unit for each part 393 extracts the setting parameter for each treatment part from the storage unit 320 and sets the display control parameter of the ultrasound image accordingly.
  • the anatomical region re-recognition unit 316 generates specific anatomical region information from the currently input ultrasound image using the specific anatomical region information previously detected for the same marker and the location information of the marker detected by the marker location detector. This has been described above for the anatomical region determining unit 313, and thus a detailed description is omitted.
  • the needle position detection unit 330 may include a sensor-based position calculation unit 331, an image-based position calculation unit 333, and a location information output unit 335.
  • the sensor-based position calculator 331 calculates the position of the needle from at least two sensor inputs. This is as described above.
  • the image-based position calculator 333 estimates the position of the needle by tracking the movement of tissue in the ultrasound image. It is advantageous to use other methods in parallel, since entry into a specific anatomical zone may occur at a moment when the magnetic positioning system is inoperative. In addition, the needle is so thin that it may not be visible in the ultrasound image.
  • the image-based position calculating unit 333 continuously compares successively generated ultrasound images to identify changing and unchanging portions, and determines a portion changing in a specific direction as the needle position. For example, by calculating the amount of change between image frames, a motion vector can be obtained, and from it the motion of biological tissue can be identified. When the needle enters living tissue, movement occurs because the surrounding tissue is pushed aside. This amount of change can reflect motion relatively well even when the image quality is poor.
  • the position of the tip of the needle can be estimated by analyzing the direction of movement of the biological tissue.
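A toy sketch of the frame-difference idea follows: tissue pushed by the advancing needle changes between frames, and the deepest changing pixel along the assumed insertion direction is taken as the tip estimate. The frames are synthetic, and the threshold and top-to-bottom insertion geometry are assumptions, not details from the patent.

```python
import numpy as np

def estimate_tip(prev_frame, cur_frame, threshold=0.1):
    """Return (row, col) of the deepest changing pixel, or None if static."""
    diff = np.abs(cur_frame.astype(float) - prev_frame.astype(float))
    ys, xs = np.nonzero(diff > threshold)
    if ys.size == 0:
        return None
    deepest = np.argmax(ys)            # insertion assumed top-to-bottom
    return int(ys[deepest]), int(xs[deepest])

prev = np.zeros((20, 20))
cur = prev.copy()
cur[3:12, 8] = 1.0                     # tissue displaced along the track
tip = estimate_tip(prev, cur)
```

A real implementation would estimate a dense motion field and fit the needle track through it rather than thresholding a single difference image.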
  • the ultrasonic signal processing unit 370 drives the array of ultrasonic elements in a Doppler mode, detects the speed of biological tissue pushed out when the needle enters, identifies biological tissue, and estimates the position of the needle therefrom You can.
  • the Doppler mode can sensitively identify moving tissue. While operating in the Doppler mode, the movement of biological tissues can be obtained through the difference between frames of the ultrasound image, and the position of the needle can be estimated by finding the portion of the movement or the central axis of the movement from the distribution of the movement.
  • the position information output unit 335 synthesizes the outputs of the sensor-based position calculation unit 331 and the image-based position calculation unit 333 to calculate the position of the needle.
  • the location information output unit 335 preferentially selects the output of the sensor-based location calculating unit 331, but when it determines that the output is absent or unreliable, it selects instead the output of the image-based position calculation unit 333, which runs in parallel.
  • the reliability of the calculated needle position can be judged to be low when, for example, the continuity of the movement of the needle tip is largely broken.
  • in another embodiment, the location information output unit 335 preferentially selects the output of the sensor-based location calculation unit 331, but when it determines that the output is absent or unreliable, it generates the position by combining it with the output of the image-based position calculation unit 333 executed in parallel.
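This preferential-selection logic can be sketched as below: the sensor-based estimate is preferred, and the image-based one (computed in parallel) is used when the sensor output is missing or breaks the continuity of the tip's motion. The function name, the jump threshold, and the 2D coordinates are assumptions for illustration; the document does not specify them:

```python
import math

def fuse_needle_position(sensor_pos, image_pos, prev_pos, max_jump=5.0):
    """Select between sensor-based and image-based position estimates.

    The sensor output wins when it exists and is continuous with the
    previous tip position (no large jump); otherwise the image-based
    estimate is used. Threshold and names are illustrative assumptions.
    """
    def continuous(pos):
        if pos is None:
            return False
        if prev_pos is None:
            return True
        return math.dist(pos, prev_pos) <= max_jump

    if continuous(sensor_pos):
        return sensor_pos        # reliable sensor reading wins
    return image_pos             # fall back to image-based tracking

print(fuse_needle_position((10.0, 10.0), (11.0, 9.0), (9.0, 10.0)))  # sensor used
print(fuse_needle_position(None, (11.0, 9.0), (9.0, 10.0)))          # image fallback
```

The alternative embodiment, which blends rather than selects, could replace the fallback branch with a weighted average of the two estimates.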
  • the ultrasound image display control method includes a surgical site selection step 610, a marker position detection step 643, a specific anatomical region information checking step 644, an anatomical region re-recognition step 645, a needle position detection step 660, and a risk determination step 670.
  • the ultrasound imaging apparatus receives a treatment site from the operator through the manipulation unit.
  • the processing configuration of the ultrasound imaging apparatus in the operation site selection step 610 is similar to the processing of the user interface unit 390 described with reference to FIG. 3, so a detailed description thereof will be omitted.
  • the ultrasound imaging apparatus detects the position of at least one marker fixed to the surface of the treatment site in the generated ultrasound image.
  • the marker position detection step 620 includes a surgical site scan step 641 and a marker check step 643.
  • the ultrasound imaging apparatus acquires an ultrasound image when the operator scans the area around the selected treatment area with a probe.
  • the ultrasound imaging apparatus checks whether a marker is present in the acquired ultrasound image. If a marker is present, the position of the marker is detected.
  • the configuration for detecting the position of the marker is similar to the processing of the marker position detection unit 312 described with reference to FIG. 3, so a detailed description is omitted.
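One simple way such a marker could be located, sketched here as an assumption (the document does not specify the matching algorithm), is template matching against the marker's characteristic echo pattern:

```python
import numpy as np

def find_marker(frame, template):
    """Locate a marker in an ultrasound frame by template matching.

    Slides the template over the frame and returns the (x, y) of the
    best mean-centered correlation. A real system would use a robust,
    optimized matcher; this brute-force version is illustrative only.
    """
    th, tw = template.shape
    best, best_pos = -np.inf, None
    t = template - template.mean()
    for y in range(frame.shape[0] - th + 1):
        for x in range(frame.shape[1] - tw + 1):
            patch = frame[y:y + th, x:x + tw]
            score = float(np.sum((patch - patch.mean()) * t))
            if score > best:
                best, best_pos = score, (x, y)
    return best_pos

frame = np.zeros((20, 20))
template = np.array([[0., 1., 0.], [1., 1., 1.], [0., 1., 0.]])
frame[5:8, 12:15] = template                 # simulated marker echo
print(find_marker(frame, template))          # → (12, 5)
```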
  • in the marker check step 643, if a marker is present, it is checked whether pre-stored anatomical region information exists for the corresponding treatment site, that is, specific anatomical region information previously recognized and stored for the operator (step 644). If the information exists, an anatomical region re-recognition step 645 is executed. In the anatomical region re-recognition step 645, the ultrasound imaging apparatus extracts the stored specific anatomical region information and generates specific anatomical region information using that information together with the position information of the marker detected in the marker position detection step 643.
  • the processing configuration of the ultrasound imaging apparatus is similar to the processing of the anatomical region re-recognition unit 316 described with reference to FIG. 4, so a detailed description thereof will be omitted.
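The re-recognition of a stored region from a newly detected marker position can be illustrated with a translation-only sketch. This is an assumption made for the example (a real system would likely also recover rotation, e.g. from several markers); the names and coordinates are hypothetical:

```python
import numpy as np

def rerecognize_region(stored_region_pts, stored_marker_pos, detected_marker_pos):
    """Re-anchor a stored anatomical region at the current marker position.

    The stored 3D region points are expressed relative to the marker
    position recorded at storage time, then shifted to the marker
    position detected in the current scan. Translation-only sketch;
    illustrative assumption, not the patented algorithm.
    """
    pts = np.asarray(stored_region_pts, dtype=float)
    offset = np.asarray(detected_marker_pos, float) - np.asarray(stored_marker_pos, float)
    return pts + offset

region = [[0.0, 0.0, 10.0], [5.0, 0.0, 12.0]]
print(rerecognize_region(region, (0.0, 0.0, 0.0), (2.0, 1.0, 0.0)))
```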
  • the ultrasound imaging apparatus calculates the position of the needle from at least two sensor inputs.
  • the processing configuration of the ultrasound imaging apparatus is similar to the processing of the needle position detection unit 330 described with reference to FIG. 3, so a detailed description thereof will be omitted.
  • the ultrasound imaging apparatus outputs a warning signal when the position of the needle tip according to the position of the needle calculated in the needle position detection step 660 approaches the stored anatomical region.
  • the ultrasound imaging apparatus checks whether the position of the end of the needle calculated in the needle position detection step 660 is close to the calculated specific anatomical region.
  • in the danger warning step 680, the ultrasound imaging apparatus outputs a danger warning when the needle tip approaches the specific anatomical region.
  • the processing configuration of the ultrasound imaging apparatus is similar to that of the risk determination unit 311 described with reference to FIG. 3, and thus detailed description thereof will be omitted.
  • since the needle position is measured by an independent positioning system that does not depend on the ultrasound image, and the 3D modeling information of the anatomical region is calculated from the ultrasound volume image and stored, the procedure risk warning unit 310 can compare the position of the needle tip with the stored 3D modeling information of the anatomical region. A warning can therefore be issued when the needle approaches the anatomical region even if the needle tip deviates from the ultrasound image generated by the ultrasound signal processing unit 370 and displayed on the screen.
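The risk check can be illustrated as a nearest-point distance test against the stored 3D region model. Representing the region as a point cloud and the specific threshold value are illustrative assumptions, not details from this document:

```python
import numpy as np

def check_risk(tip_pos, region_pts, warn_dist=3.0):
    """Warn when the needle tip comes near a stored anatomical region.

    The region is represented as a 3D point cloud (e.g. sampled from
    its 3D model); the warning fires when the tip's distance to the
    nearest region point falls below the threshold. Representation and
    threshold are illustrative assumptions.
    """
    pts = np.asarray(region_pts, dtype=float)
    d = float(np.min(np.linalg.norm(pts - np.asarray(tip_pos, float), axis=1)))
    return bool(d < warn_dist), d

region = [[0.0, 0.0, 20.0], [1.0, 0.0, 21.0]]
print(check_risk((0.0, 0.0, 18.0), region))   # close to the region → warning
print(check_risk((0.0, 0.0, 5.0), region))    # far from it → no warning
```

Because this test runs in the positioning system's 3D coordinates, it works even when the tip leaves the displayed image plane, matching the behavior described above.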
  • the ultrasound image display control method may further include a specific anatomical region information extraction step 620 and an anatomical region recognition step 650. If no marker is found in the marker position detection step 643, or if no corresponding stored anatomical region information is found in the anatomical region information check step 644, the ultrasound imaging apparatus executes the specific anatomical region information extraction step 620.
  • the ultrasound imaging apparatus reads specific anatomical region information corresponding to the selected surgical site from the database.
  • the processing configuration of the ultrasound imaging apparatus in the specific anatomical region information extraction step 620 is similar to the processing of the anatomical region determining unit 313 described with reference to FIG. 3, so a detailed description thereof will be omitted.
  • in the anatomical region recognition step 650, the ultrasound imaging apparatus generates specific anatomical region information in three dimensions from the ultrasound volume image generated from the ultrasound images input in the operation region scan step 640; the anatomical region information is extracted and generated with reference to the specific anatomical region information of the corresponding treatment site extracted in step 620.
  • the processing configuration of the ultrasound imaging apparatus is similar to the processing of the anatomical region determining unit 313 described with reference to FIG. 3, so a detailed description thereof will be omitted.
  • the ultrasound image display control method may further include a display setting step 630 for each region.
  • the ultrasound imaging apparatus sets the display control parameter of the ultrasound image according to the setting parameter for each treatment part included in the procedure guide information.
  • the processing configuration of the ultrasound imaging device in the display setting step 630 for each part is similar to the processing of the display setting part 393 for each part described with reference to FIG. 3, so a detailed description thereof will be omitted.
  • the needle position detection step 660 may include a sensor-based position calculation step 661, an image-based position calculation step 663, and a location information output step 665.
  • FIG. 7 is a flow chart showing the configuration of one embodiment of the sensor-based position calculation step 661.
  • the ultrasound imaging apparatus calculates the position of the needle from at least two sensor inputs.
  • the ultrasound imaging apparatus estimates the position of the needle by tracking tissue movement in the ultrasound image.
  • the ultrasound imaging apparatus sequentially compares the generated ultrasound images to distinguish changing portions from unchanging portions, and can calculate a portion changing in a specific direction as the needle position.
  • the ultrasound imaging device synthesizes the outputs of the sensor-based position calculation step 661 and the image-based position calculation step 663 to calculate the position of the needle.
  • the processing configuration of the ultrasound imaging apparatus is similar to the processing of the needle position detection unit 330 described with reference to FIG. 4, so a detailed description thereof will be omitted.
  • the proposed invention can be applied to an ultrasound imaging device that can be used during acupuncture.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The present invention relates to an image display control technology for an ultrasound imaging device. Specific anatomical region information is prepared in advance and stored according to a needle operation site. One or more markers are fixed adjacent to the operation site where a procedure is to be performed on a patient. The position of a needle is tracked by a positioning system. When a needle operation site is selected, specific anatomical region information corresponding to the selected site is extracted from the stored items of specific anatomical region information, and the ultrasound image is recognized with assistance from this information so that the anatomical region is determined in the actual image. The determined specific anatomical region information is stored together with position information of the markers.
PCT/KR2019/017197 2018-12-07 2019-12-06 Ultrasound imaging device having a needle therapy guidance function using a marker WO2020116992A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020180157373A KR102182134B1 (ko) 2018-12-07 2018-12-07 Ultrasound imaging device having a needle procedure guide function using a marker
KR10-2018-0157373 2018-12-07

Publications (1)

Publication Number Publication Date
WO2020116992A1 true WO2020116992A1 (fr) 2020-06-11

Family

ID=70974669

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/017197 WO2020116992A1 (fr) 2018-12-07 2019-12-06 Ultrasound imaging device having a needle therapy guidance function using a marker

Country Status (2)

Country Link
KR (1) KR102182134B1 (fr)
WO (1) WO2020116992A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11074909B2 (en) 2019-06-28 2021-07-27 Samsung Electronics Co., Ltd. Device for recognizing speech input from user and operating method thereof
KR20230168230 (ko) Method and apparatus for determining the position of a biopsy needle in medical diagnostic images

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5095186B2 (ja) * 2006-11-28 2012-12-12 株式会社東芝 Ultrasonic diagnostic apparatus and past image data reference method
JP2014525328A (ja) * 2011-08-31 2014-09-29 ゼネラル・エレクトリック・カンパニイ Method for detecting and tracking a needle
JP2016022124A (ja) * 2014-07-18 2016-02-08 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Ultrasonic diagnostic apparatus and control program therefor
JP2016059481A (ja) * 2014-09-16 2016-04-25 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Ultrasonic diagnostic apparatus and control program therefor
US20170116729A1 * 2015-04-17 2017-04-27 Clear Guide Medical, Inc. System and method for fused image based navigation with late marker placement

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10507006B2 (en) * 2013-12-27 2019-12-17 General Electric Company System and method for tracking an invasive device using ultrasound position signals

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116012286A * 2022-10-27 2023-04-25 数坤(上海)医疗科技有限公司 Surgical risk region determination method and apparatus, electronic device, and readable storage medium
CN116012286B * 2022-10-27 2024-04-09 数坤(上海)医疗科技有限公司 Surgical risk region determination method and apparatus, electronic device, and readable storage medium

Also Published As

Publication number Publication date
KR20200069846A (ko) 2020-06-17
KR102182134B1 (ko) 2020-11-23

Similar Documents

Publication Publication Date Title
WO2020116991A1 Ultrasound imaging device having an acupuncture guide function
CN112215843B Ultrasound intelligent imaging navigation method and apparatus, ultrasound device, and storage medium
KR102014359B1 Method and apparatus for providing a camera position based on surgical images
WO2020116992A1 Ultrasound imaging device having a needle therapy guidance function using a marker
EP1323380B1 Ultrasound imaging apparatus for a biopsy needle
US9773305B2 Lesion diagnosis apparatus and method
CN203970548U Surgical operation control system
KR20140091177A Lesion diagnosis apparatus and method
CN109998678A Navigation assistance using augmented reality during medical procedures
CN108231180B Medical image display device and method thereof
WO2015119338A1 Method for guiding the scan position of a three-dimensional ultrasound probe, and ultrasound diagnostic system using the guidance method
KR20130026041A Method and apparatus for generating organ images using partial information of medical images
CN107111875A Feedback for multi-modality automatic registration
CN111292277A Ultrasound fusion imaging method and ultrasound fusion imaging navigation system
KR20160037023A Computer-aided diagnosis support apparatus and method
KR20160046670A Imaging diagnosis assistance apparatus and method
CN111657997A Ultrasound-assisted guidance method, apparatus, and storage medium
CA3102807A1 Orientation detection in fluoroscopic images
JP2023022123A System for determining a guidance signal and providing guidance for a handheld ultrasound transducer
JP2021029676A Information processing apparatus, examination system, and information processing method
JP6921589B2 Information processing apparatus, examination system, and information processing method
CN112533540A Ultrasound imaging method, ultrasound imaging device, and puncture navigation system
US11532101B2 Marker element and application method with ECG
US20220132026A1 Systems and methods for capturing, displaying, and manipulating medical images and videos
CN116269749B An improved nerve-sparing laparoscopic bladder cancer surgery system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19893598

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19893598

Country of ref document: EP

Kind code of ref document: A1