WO2020000963A1 - Ultrasound-guided assistance device and system for needle - Google Patents

Ultrasound-guided assistance device and system for needle

Info

Publication number
WO2020000963A1
WO2020000963A1 (PCT/CN2018/124034)
Authority
WO
WIPO (PCT)
Prior art keywords
needle
tissue
robot arm
probe
ultrasound
Prior art date
Application number
PCT/CN2018/124034
Other languages
English (en)
French (fr)
Inventor
莫若理
黄明进
赵明昌
Original Assignee
无锡祥生医疗科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 无锡祥生医疗科技股份有限公司
Priority to US17/254,996 priority Critical patent/US20210322106A1/en
Priority to EP18924220.9A priority patent/EP3815636A4/en
Publication of WO2020000963A1 publication Critical patent/WO2020000963A1/zh

Links

Images

Classifications

    • A61B17/3403 Needle locating or guiding means (A61B17/34 Trocars; Puncturing needles)
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B34/30 Surgical robots
    • A61B34/32 Surgical robots operating autonomously
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B5/0035 Features or image-related aspects of imaging apparatus adapted for acquisition of images from more than one imaging mode, e.g. combining MRI and optical tomography
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/489 Locating particular structures in or on the body; Blood vessels
    • A61B5/7425 Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image
    • A61B8/0841 Detecting organic movements or changes, e.g. tumours, cysts, swellings, involving detecting or locating foreign bodies or organic structures, for locating instruments
    • A61B8/4218 Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames, characterised by articulated arms
    • A61B8/4416 Constructional features of the diagnostic device related to combined acquisition of different diagnostic modalities, e.g. combination of ultrasound and X-ray acquisitions
    • A61B8/5261 Processing of medical diagnostic data for combining image data of a patient, combining images from different diagnostic modalities, e.g. ultrasound and X-ray
    • A61B8/54 Control of the diagnostic device
    • B25J9/1669 Programme controls characterised by programming, planning systems for manipulators, characterised by special application, e.g. multi-arm co-operation, assembly, grasping
    • A61B2017/3413 Needle locating or guiding means guided by ultrasound
    • A61B2034/2063 Acoustic tracking systems, e.g. using ultrasound

Definitions

  • the invention relates to a medical guidance auxiliary device and system, in particular to a medical ultrasound guidance auxiliary device and system.
  • ultrasound is already widely used for puncture guidance, but the workflow remains manual: an operator scans the surface of the body to be examined with the ultrasound probe, judges the depth of blood vessels and other tissues from the ultrasound image, and then performs the puncture by hand.
  • intravenous injection differs from the puncture procedures currently performed in specialist departments.
  • in those procedures, ultrasound or infrared guidance is used during the puncture.
  • the puncture needle is usually a separate needle tube.
  • conventional intravenous injections (such as drips) do not currently use ultrasound for guidance.
  • intravenous injection is the most routine task in hospitals and clinics, and doctors or nurses must perform large numbers of intravenous injections every day. Because the procedure relies on human experience, failures caused by the negligence or misjudgment of the doctor or nurse inevitably occur: the intravenous needle may pierce through the blood vessel, a very thin blood vessel may be selected, or an incorrect injection angle may compromise the stable delivery of the injected fluid into the blood vessel or tissue.
  • the purpose of the present invention is to remedy the shortcomings of the prior art and to provide an ultrasound guidance assistance device and system for needles, used to guide accurate puncture of the needle.
  • the technical scheme adopted by the present invention is:
  • an ultrasound guidance assistance device for a needle includes:
  • a probe for transmitting ultrasonic signals to, and receiving them from, the tissue to be punctured and injected;
  • a first robot arm for moving the probe to the tissue to be punctured and injected;
  • a second robot arm that fixes the needle used for puncture, moves the needle close to the tissue to be punctured and injected, and then pierces it into the tissue to be punctured and injected;
  • a host that controls the first robot arm to move the probe to the tissue to be punctured and injected, with the probe transmitting and receiving ultrasound signals;
  • the host synthesizes an ultrasound image from the ultrasound signals transmitted by the probe; the host calculates from the ultrasound image at least the puncture distance and puncture angle between the needle and the tissue to be punctured and injected;
  • the host, according to the puncture distance and puncture angle, controls the second robot arm to move the needle to the corresponding puncture distance and puncture angle and then perform the puncture.
  • the host acquires the position of the needle in the injection tissue to be punctured according to the ultrasound signal transmitted by the probe, and controls the second robot arm to keep the tip of the needle within the target area of the injection tissue to be punctured.
  • the host controls the first robot arm to move in coordination with the second robot arm so that the needle tip is always kept at a position that can be detected by the probe, and controls the second robot arm to keep the tip of the needle within the target area of the tissue to be punctured and injected.
  • the injection tissue to be punctured includes one or more of arteries, veins, nerves, and organs.
  • the host obtains the minimum cross-sectional area of the artery or vein according to the ultrasound image of the artery or vein; the host controls the probe to be orthogonal to the artery or vein through the first robot arm.
  • the host calculates the three-dimensional coordinates of the center point in the injection tissue to be punctured according to the ultrasound image, and the host calculates the puncture distance and the puncture angle of the needle and the injection tissue to be punctured according to the three-dimensional coordinates of the center point.
  • the first mechanical arm includes a rotary articulated arm, and the rotary articulated arm controls the rotation of the probe, so that the host can obtain an ultrasound image of the tissue to be punctured and injected.
  • the ultrasound guidance device for the needle includes an infrared irradiation light source, which is used to irradiate the tissue to be punctured and injected with infrared light; the host further includes an infrared image acquisition unit for controlling a camera to acquire an infrared image of the tissue to be punctured and injected.
  • the second robot arm and the first robot arm are installed separately on their respective bases or on the same base; or,
  • the second robot arm is fixedly mounted on the first robot arm.
  • after the host controls the second robot arm to pierce the needle into the human tissue, the host controls the second robot arm to continue advancing the needle into the human tissue.
  • the ultrasound guidance assistance device includes a display for displaying the ultrasound image obtained after the probe scan and the dynamic puncture ultrasound image after the needle penetrates the human tissue.
  • the present invention provides an ultrasound guided assistance system for a needle, including:
  • a probe for transmitting ultrasonic signals to, and receiving them from, the tissue to be punctured and injected;
  • a first robot arm for moving the probe to the tissue to be punctured and injected;
  • a second robot arm that fixes the needle used for puncture, moves the needle to the tissue to be injected, and then pierces it into the tissue to be punctured and injected;
  • a host including a first robot arm control unit, a second robot arm control unit, a probe control unit, an image processing unit, and a displacement calculation unit; the first robot arm control unit controls the first robot arm to move the probe to the tissue to be punctured and injected, and the probe control unit controls the probe to transmit and receive ultrasonic signals;
  • the image processing unit synthesizes an ultrasound image from the ultrasound signals transmitted by the probe;
  • the displacement calculation unit calculates from the ultrasound image at least the puncture distance and puncture angle between the needle and the tissue to be punctured and injected;
  • the second robot arm control unit, according to the puncture distance and puncture angle, controls the second robot arm to move the needle to the corresponding puncture distance and puncture angle and then perform the puncture.
  • the displacement calculation unit obtains the position of the needle in the tissue to be punctured and injected from the ultrasound image; the second robot arm control unit controls the second robot arm to keep the tip of the needle within the target area of the tissue to be punctured and injected.
  • the displacement calculation unit obtains the position of the needle within the tissue to be punctured and injected from the ultrasound image; the first robot arm control unit controls the first robot arm to move in coordination with the second robot arm controlled by the second robot arm control unit, keeping the needle at a position that can be detected by the probe and keeping the tip of the needle within the target area of the tissue to be punctured and injected.
  • the injection tissue to be punctured includes one or more of arteries, veins, nerves, and organs.
  • the displacement calculation unit obtains the minimum cross-sectional area of the artery or vein from the ultrasound image of the artery or vein; the displacement calculation unit controls the probe, through the first robot arm, to be orthogonal to the artery or vein.
  • the displacement calculation unit calculates the three-dimensional coordinates of the center point in the injection tissue to be punctured according to the ultrasound image, and the displacement calculation unit calculates the puncture distance and the puncture angle of the needle and the injection tissue to be punctured according to the three-dimensional coordinates of the center point.
  • the ultrasound guidance assistance system for the needle further includes an infrared irradiation light source;
  • the host further includes an infrared light source processing unit for controlling the infrared irradiation light source to emit infrared light;
  • the ultrasound guidance assistance system further includes a camera;
  • the host further includes an infrared image acquisition unit for controlling the camera to acquire an infrared image of the tissue to be punctured and injected;
  • the image processing unit includes: an image fusion processing unit, an ultrasonic image processing unit, and an infrared image processing unit; the image fusion processing unit fuses the ultrasonic image obtained by the ultrasonic image processing unit and the infrared image obtained by the infrared image processing unit to generate a new image.
  • the ultrasound guidance assistance system for the needle further includes a display for displaying an ultrasound image obtained after the probe scans, a dynamic puncture ultrasound image after the needle penetrates into a human tissue, and displaying an infrared image.
  • the advantage of the present invention is that an ultrasound probe attached to the first robot arm, which is linked to the host, performs automatic scanning.
  • the host processes the signals obtained by the probe into an ultrasound image, and the host automatically judges the blood vessel from the processed ultrasound image, determining parameters such as its depth and diameter.
  • the host controls the needle attached to the second robot arm to be inserted automatically into the target blood vessel or tissue.
  • as the needle is advanced, the host controls the first robot arm so that the probe moves with the movement of the needle on the second robot arm, ensuring that the needle on the second robot arm is always inserted near the center of the blood vessel.
  • the present invention thereby solves the problem of how to select a blood vessel of suitable thickness and shape, ensuring comfort in the later stage of the intravenous injection.
  • FIG. 1 is a schematic structural diagram of an exemplary ultrasonic guidance assisting device according to the present invention
  • FIG. 2 is a schematic structural diagram of an ultrasonic guiding auxiliary device including an infrared light source according to the present invention
  • FIG. 3 is a schematic structural diagram of working states of another exemplary first robot arm and a second robot arm according to the present invention.
  • FIG. 4 is a cross-sectional view showing a working state of another exemplary first robot arm and a second robot arm according to the present invention
  • FIG. 5 is a schematic diagram of a needle punctured into a blood vessel according to the present invention.
  • FIG. 6 is a block diagram of an exemplary ultrasound guidance assistance system according to the present invention.
  • FIG. 7 is a block diagram of an exemplary ultrasound guidance assistance system including an infrared light source according to the present invention.
  • FIG. 8 is a block diagram of an exemplary image processing unit according to the present invention.
  • the needle ultrasound guidance assisting device includes a probe 100, a first robot arm 200, a second robot arm 400, and a host 500.
  • the first robot arm 200 includes a first arm 210, a second arm 220, and a third arm 230.
  • the first robot arm 200 uses the first arm 210, the second arm 220, and the third arm 230 to rotate the probe 100 in the three spatial dimensions of the X axis, Y axis, and Z axis, so that the probe can be moved quickly to the tissue to be punctured and injected.
  • when the probe 100 reaches the tissue to be punctured and injected, the host controls the probe 100 to transmit ultrasound to, and receive ultrasound from, the tissue to be punctured and injected 600.
  • the first robot arm 200 and the second robot arm 400 are mounted on respective bases.
  • the first robot arm 200 is mounted on a base 270 and the second robot arm 400 is mounted on another base.
  • the probe 100 is used to transmit ultrasonic signals to, and receive them from, the human tissue to be punctured and injected; the first robot arm 200 is used to move the probe 100 to the human tissue to be punctured and injected 600; the needle 300 is used to puncture the human tissue to be punctured and injected 600; the second robot arm 400 is used to fix the needle 300, move the needle 300 to the human tissue 600, and then pierce it into the human tissue to be punctured and injected 600; and the host 500 controls the first robot arm 200 to move the probe 100 to the human tissue to be punctured and injected 600, with the probe 100 transmitting and receiving ultrasound signals. The host 500 synthesizes an ultrasound image from the ultrasound signals transmitted by the probe 100; the host 500 calculates from the ultrasound image at least the puncture distance and puncture angle between the needle 300 and the tissue to be punctured 600; and the host 500, according to the puncture distance and puncture angle, controls the second robot arm 400 to move the needle 300 to the corresponding puncture distance and puncture angle and then perform the puncture.
  • the displacement calculation unit of the host computer 500 calculates the three-dimensional coordinates of the center point in the injection tissue to be punctured based on the ultrasound image, and the displacement calculation unit calculates the puncture distance and the puncture angle between the needle and the injection tissue to be punctured according to the three-dimensional coordinates of the center point.
  • the host 500 obtains the position of the needle 300 in the tissue to be punctured and injected, such as its position in the blood vessel, from the ultrasonic signals transmitted by the probe 100, and controls the first robot arm 200 and the second robot arm 400 to move in coordination so as to keep the needle 300 at a position that can be detected by the probe, with the tip of the needle 300 remaining within the tissue to be punctured and injected, for example at the target area of the blood vessel.
  • for clarity, the term "tip of the needle" refers to the segment of the needle that penetrates into the human tissue.
  • the second robot arm 400 includes a fixing device 320 for the needle 300, and the fixing device can be adjusted according to different shapes of the needle.
  • a fixing device 320 for a needle 300 is provided on the second robot arm, and a catheter 310 for the needle is connected to the needle 300.
  • the fixing device 320 can be replaced according to the shape of the needle 300.
  • the ultrasound guidance assistance device of the present invention further includes a display, which may be one of, or a combination of, existing displays such as a cathode ray tube (CRT), a liquid crystal display (LCD), a plasma panel projector, an OLED display, or an LED display, or any other display now known or later developed for displaying images.
  • an ultrasound guidance assisting system for a needle includes a host, a first robot arm, a second robot arm, and a probe.
  • the host 500 includes an image processing unit, a probe control unit, a displacement calculation unit, a first robot arm control unit, and a second robot arm control unit.
  • the first robot arm control unit controls the first robot arm to move the probe to the human tissue to be punctured and injected, and the probe control unit controls the probe to transmit and receive ultrasound signals; the image processing unit synthesizes an ultrasound image from the ultrasound signals transmitted by the probe; the displacement calculation unit calculates from the ultrasound image at least the puncture distance and puncture angle between the needle and the tissue to be punctured and injected; the second robot arm control unit, according to the puncture distance and puncture angle, controls the second robot arm to move the needle to the corresponding puncture distance and puncture angle and then perform the puncture.
  • the displacement calculation unit obtains the position of the needle in the blood vessel from the ultrasound image; the first robot arm control unit controls the first robot arm to move in coordination with the second robot arm controlled by the second robot arm control unit, so that the needle is always kept at a position the probe can detect and the needle tip is kept inside the blood vessel.
  • the position parameters of the probe 100 are pre-set in the host 500, and the host 500 can perform quick position determination automatically or according to a user's selection.
  • the user asks the person to be tested to place the tissue to be punctured and injected into a specific area, and the user selects on the input device on the host or the input device connected to the host.
  • the host computer 500 determines the approximate position of the probe 100 according to the blood vessel or tissue selected by the user, and quickly moves the probe 100 to the surface of the tissue 600 to be punctured through the first robot arm 200.
  • the host 500 controls the probe 100 to transmit and receive ultrasonic waves, and the image processing unit in the host (shown in FIG. 6) processes the obtained ultrasonic signals to obtain an ultrasound image.
  • the displacement calculation unit in the host 500 (shown in FIG. 6) judges parameters such as the thickness, depth, diameter, and shape of the blood vessel or tissue from the ultrasound image. For example, the probe 100 is moved by a certain displacement along the horizontal X axis of the tissue to be punctured and injected 600 (for example, parallel to the direction of the finger), and the displacement calculation unit selects a suitable position of the blood vessel or tissue from the ultrasound images acquired over this distance, for example the position where the blood vessel is thickest and relatively smooth. At this point, the host 500 controls the first robot arm 200 to move the probe 100 to a certain position; for convenience of understanding, this position is described as the A position.
  • the first robot arm 200 also includes a rotary articulated arm 240, which can rotate the probe 100 to a suitable angle, so that the desired probe angle can be reached without complex displacements of the first arm 210, the second arm 220, and the third arm 230 of the first robot arm 200.
  • the rotary joint arm 240 is connected to the third joint arm 230 through a rotary joint 260.
  • the rotary articulated arm 240 rotates by a certain angle, such as 30°, 60°, 90°, 120°, 150°, or 180°.
  • the rotation angle and the number of rotations can also be set in advance.
  • the displacement calculation unit obtains an appropriate rotation angle position of the probe 100 according to the ultrasound image, and the rotary articulated arm 240 rotates the probe 100 to an appropriate rotation angle position determined by the displacement calculation unit.
  • the position of the probe 100 is thus determined; for ease of understanding, this position is described as the B position.
  • the displacement calculation unit obtains the parameters such as the depth, diameter, length, and shape of the blood vessel or tissue based on the ultrasound image.
  • the displacement calculation unit combines the patient's information, such as height, weight, age, gender, and examination site, with the parameters of the blood vessel or tissue such as depth, diameter, length, and shape; from this parameter information it calculates the displacement and puncture angle required for the needle relative to the marked position on the probe 100, and the second robot arm 400 performs the corresponding displacement and angular rotation so that the needle 300 reaches the calculated specified position.
  • for ease of understanding, this position is described as the C position.
  • the second robot arm 400 penetrates the needle 300 into the injection tissue 600 to be punctured according to the puncture distance and the puncture angle calculated by the displacement calculation unit.
  • the displacement calculation unit obtains multiple frames of ultrasound images and calculates the minimum cross-sectional area of the blood vessel across these frames, so as to determine whether the current probe position is orthogonal to the target vessel.
  • the first robot arm 200 controls the probe 100 to move to an orthogonal position with the blood vessel.
  • the displacement calculation unit calculates the three-dimensional coordinates of the blood vessel center point at this time.
  • the displacement calculation unit converts the three-dimensional coordinates of the blood vessel center point into the puncture distance and puncture angle with which the second robot arm 400 is to drive the needle.
  • as the second robot arm 400 pushes the needle 300 deeper into the blood vessel, the displacement calculation unit calculates the distance the first robot arm 200 should move in the direction opposite to the insertion direction of the needle 300, and controls the first robot arm 200 to move the probe 100 accordingly, so as to keep the tip of the needle 300 at a preset position in the blood vessel, for example approximately on the centerline 630 of the blood vessel.
  • as shown in FIG. 2, FIG. 7, and FIG. 8, the device includes the probe 100, the first robot arm 200, the needle 300, the second robot arm 400, the host 500, the infrared irradiation light source 700, and a camera.
  • the infrared light source 700 is mounted on the third arm 230 of the first robot arm.
  • the host 500 controls the infrared irradiation light source 700 to irradiate infrared rays, and the camera collects infrared images of blood vessels obtained after the infrared light source is irradiated on the surface of the injection tissue 600 to be punctured.
  • the displacement calculation unit calculates, from the infrared image of the blood vessels obtained by the image processing unit, the A position that the probe 100 needs to reach, and the first robot arm control unit controls the first robot arm 200 to move through the required displacement to the A position.
  • the host 500 includes an image processing unit, a probe control unit, a displacement calculation unit, a first robot arm control unit, a second robot arm control unit, an infrared light source processing unit, and an infrared image acquisition unit.
  • the infrared light source processing unit controls the infrared irradiation light source 700 to emit infrared light.
  • the infrared image acquisition unit controls the camera to collect infrared images of blood vessels obtained by irradiating the infrared light source on the surface of the tissue to be injected 600.
  • the first robot arm control unit controls the first robot arm to move the probe to the human tissue to be punctured and injected, and the probe control unit controls the probe to transmit and receive ultrasound signals; the image processing unit synthesizes an ultrasound image from the ultrasound signals transmitted by the probe; the displacement calculation unit calculates from the ultrasound image at least the puncture distance and puncture angle between the needle and the tissue to be punctured and injected; the second robot arm control unit, according to the puncture distance and puncture angle, controls the second robot arm to move the needle to the corresponding puncture distance and puncture angle and then perform the puncture.
  • the displacement calculation unit obtains the position of the needle in the blood vessel from the ultrasound image; the first robot arm control unit controls the first robot arm to move in coordination with the second robot arm controlled by the second robot arm control unit, so that the needle is always kept at a position the probe can detect and the needle tip is kept inside the blood vessel.
  • the image processing unit includes an image fusion processing unit, an ultrasound image processing unit, and an infrared image processing unit; the image fusion processing unit fuses the ultrasound image obtained by the ultrasound image processing unit with the infrared image obtained by the infrared image processing unit to produce a new two-dimensional, three-dimensional, or four-dimensional image, so that a more vivid simulated image of the needle 300 inserted into the tissue to be punctured and injected is displayed on the display.
  • the image processing unit also includes an image judging unit (not shown in the figure).
  • the image judging unit judges ultrasound image quality by analyzing various parameters of the ultrasound image, such as contrast, TGC, and elastic modulus, or by comparing the image against an expert picture library in the machine.
  • for example, if the ultrasound image quality does not reach the preset image standard and the contact pressure between the probe 100 and the tissue to be punctured needs to be increased, the image processing unit transmits this information to the displacement calculation unit, and the probe 100 is controlled through the first robot arm 200 to move further toward the tissue to be punctured and injected.
  • the second robot arm 400 is mounted on the first robot arm 200, or the second robot arm 400 and the first robot arm 200 are both connected to a common base 270.
  • the second robot arm 400 is mounted on the first robot arm 200 and has less interference with ultrasound images, which is favorable for imaging.
  • the second robot arm 400 is connected to the rotary joint arm 240 of the first robot arm through an angle adjustment lever 410.
  • the various processing units of the host are processors (e.g., CPU, GPU).
  • the processors may be general-purpose processors, application-specific integrated circuits, digital signal processors, controllers, field-programmable gate arrays, digital devices, analog devices, transistors, combinations thereof, or other devices now known or later developed for receiving analog or digital data and outputting changed or calculated data.
  • a third robot arm is provided for controlling one or more probes, so that, together with the probe controlled by the first robot arm, it is used for ultrasound imaging.
  • the third robotic arm may also be provided with an infrared light source or a camera for emitting infrared rays or collecting infrared images.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Robotics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mechanical Engineering (AREA)
  • Vascular Medicine (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

An ultrasound-guided assistance device for a needle (300), comprising: a probe (100) for transmitting ultrasonic signals to, and receiving them from, tissue to be punctured and injected (600); a first robot arm (200) for moving the probe (100) to the tissue to be punctured and injected (600); a second robot arm (400) that fixes the needle (300) used for puncture, moves the needle (300) close to the tissue to be punctured and injected (600), and then pierces it into the tissue to be punctured and injected (600); and a host (500) that controls the first robot arm (200) to move the probe (100) to the tissue to be punctured and injected (600), with the probe (100) transmitting and receiving ultrasound signals. The host (500) synthesizes an ultrasound image from the ultrasound signals transmitted by the probe (100); the host (500) calculates from the ultrasound image at least the puncture distance and puncture angle between the needle (300) and the tissue to be punctured and injected (600); and the host (500), according to the puncture distance and puncture angle, controls the second robot arm (400) to move the needle (300) to the corresponding puncture distance and puncture angle and then perform the puncture. The ultrasound-guided assistance device for the needle (300) is used to guide accurate puncture of the needle (300).

Description

Ultrasound-guided assistance device and system for needle
Technical Field
The present invention relates to a medical guidance assistance device and system, and in particular to a medical ultrasound guidance assistance device and system.
Background Art
At present, ultrasound equipment is very widely used in clinical diagnosis and treatment, and it has contributed greatly to helping doctors accurately understand a patient's condition, formulate treatment plans, and assist therapy.
Ultrasound is already widely used for puncture guidance, but the workflow is still manual: an operator scans the surface of the tissue to be examined with the ultrasound probe, judges the depth of blood vessels and other tissues from the ultrasound image, and then performs the puncture by hand. This requires three manual judgment operations, namely how to quickly scan the effective region with the ultrasound probe, how to judge the depth of blood vessels and other tissues in the ultrasound image, and how to select a suitable puncture position and puncture angle for the puncture.
Intravenous injection differs from the puncture procedures currently performed routinely in specialist departments, where ultrasound or infrared guidance is used only at the time of puncture and the puncture needle is usually a separate needle tube. For conventional intravenous injections (such as drips), no ultrasound-guided solution is yet available on the market. Intravenous injection is the most routine task in hospitals and clinics, and doctors or nurses must perform large numbers of intravenous injections every day. Because the procedure relies on human experience, failures caused by the negligence or misjudgment of the doctor or nurse inevitably and frequently occur during intravenous injection, for example the intravenous needle piercing through the blood vessel, a very thin blood vessel being selected, or an incorrect injection angle compromising the stable delivery of the injected fluid into the blood vessel or tissue.
Summary of the Invention
The purpose of the present invention is to remedy the shortcomings of the prior art and to provide an ultrasound-guided assistance device and system for a needle, used to guide accurate puncture of the needle. The technical solution adopted by the present invention is as follows:
An ultrasound-guided assistance device for a needle includes:
a probe for transmitting ultrasonic signals to, and receiving them from, the tissue to be punctured and injected;
a first robot arm for moving the probe to the tissue to be punctured and injected;
a second robot arm that fixes the needle used for puncture, moves the needle close to the tissue to be punctured and injected, and then pierces it into the tissue to be punctured and injected;
a host that controls the first robot arm to move the probe to the tissue to be punctured and injected, with the probe transmitting and receiving ultrasound signals;
the host synthesizes an ultrasound image from the ultrasound signals transmitted by the probe; the host calculates from the ultrasound image at least the puncture distance and puncture angle between the needle and the tissue to be punctured and injected;
the host, according to the puncture distance and puncture angle, controls the second robot arm to move the needle to the corresponding puncture distance and puncture angle and then perform the puncture.
Further, the host acquires the position of the needle within the tissue to be punctured and injected from the ultrasound signals transmitted by the probe, and controls the second robot arm to keep the tip of the needle within the target area of the tissue to be punctured and injected.
Still further, the host controls the first robot arm to move in coordination with the second robot arm so that the needle tip is always kept at a position that can be detected by the probe, and controls the second robot arm to keep the tip of the needle within the target area of the tissue to be punctured and injected.
Further, the tissue to be punctured and injected includes one or more of arteries, veins, nerves, and organs.
Still further, the host obtains the minimum cross-sectional area of the artery or vein from the ultrasound image of the artery or vein, and the host controls the probe, through the first robot arm, to be orthogonal to the artery or vein.
Further, the host calculates from the ultrasound image the three-dimensional coordinates of the center point within the tissue to be punctured and injected, and calculates from these three-dimensional coordinates the puncture distance and puncture angle between the needle and the tissue to be punctured and injected.
Further, the first robot arm includes a rotary articulated arm, and the rotary articulated arm controls the rotation of the probe so that the host can obtain an ultrasound image of the tissue to be punctured and injected.
Further, the ultrasound guidance device for the needle includes an infrared irradiation light source used to irradiate the tissue to be punctured and injected with infrared light; the host further includes an infrared image acquisition unit for controlling a camera to acquire an infrared image of the tissue to be punctured and injected.
Further, the second robot arm and the first robot arm are installed separately on their respective bases or on the same base; or,
the second robot arm is fixedly mounted on the first robot arm.
After the host controls the second robot arm to pierce the needle into the human tissue, the host controls the second robot arm to continue advancing the needle into the human tissue.
The ultrasound-guided assistance device includes a display for displaying the ultrasound image obtained after the probe scan and the dynamic puncture ultrasound image after the needle pierces into the human tissue.
The present invention provides an ultrasound-guided assistance system for a needle, including:
a probe for transmitting ultrasonic signals to, and receiving them from, the tissue to be punctured and injected;
a first robot arm for moving the probe to the tissue to be punctured and injected;
a second robot arm that fixes the needle used for puncture, moves the needle to the tissue to be injected, and then pierces it into the tissue to be punctured and injected;
a host including a first robot arm control unit, a second robot arm control unit, a probe control unit, an image processing unit, and a displacement calculation unit; the first robot arm control unit controls the first robot arm to move the probe to the tissue to be punctured and injected, and the probe control unit controls the probe to transmit and receive ultrasound signals;
the image processing unit synthesizes an ultrasound image from the ultrasound signals transmitted by the probe;
the displacement calculation unit calculates from the ultrasound image at least the puncture distance and puncture angle between the needle and the tissue to be punctured and injected;
the second robot arm control unit, according to the puncture distance and puncture angle, controls the second robot arm to move the needle to the corresponding puncture distance and puncture angle and then perform the puncture.
Further, the displacement calculation unit obtains the position of the needle in the tissue to be punctured and injected from the ultrasound image; the second robot arm control unit controls the second robot arm to keep the tip of the needle within the target area of the tissue to be punctured and injected.
Still further, the displacement calculation unit obtains the position of the needle within the tissue to be punctured and injected from the ultrasound image; the first robot arm control unit controls the first robot arm to move in coordination with the second robot arm controlled by the second robot arm control unit, so that the needle is always kept at a position the probe can detect and the tip of the needle is kept within the target area of the tissue to be punctured and injected.
The tissue to be punctured and injected includes one or more of arteries, veins, nerves, and organs.
The displacement calculation unit obtains the minimum cross-sectional area of the artery or vein from the ultrasound image of the artery or vein; the displacement calculation unit controls the probe, through the first robot arm, to be orthogonal to the artery or vein.
The displacement calculation unit calculates from the ultrasound image the three-dimensional coordinates of the center point within the tissue to be punctured and injected, and calculates from these three-dimensional coordinates the puncture distance and puncture angle between the needle and the tissue to be punctured and injected.
Further, the ultrasound-guided assistance system for the needle also includes an infrared irradiation light source, and the host also includes an infrared light source processing unit for controlling the infrared irradiation light source to emit infrared light; the ultrasound-guided assistance system also includes a camera, and the host also includes an infrared image acquisition unit for controlling the camera to acquire an infrared image of the tissue to be punctured and injected;
the image processing unit includes an image fusion processing unit, an ultrasound image processing unit, and an infrared image processing unit; the image fusion processing unit fuses the ultrasound image obtained by the ultrasound image processing unit with the infrared image obtained by the infrared image processing unit to generate a new image.
The ultrasound-guided assistance system for the needle also includes a display for displaying the ultrasound image obtained after the probe scan, the dynamic puncture ultrasound image after the needle pierces into the human tissue, and the infrared image.
The advantage of the present invention is as follows: the present invention performs automatic scanning with an ultrasound probe attached to a first robot arm linked to the host; the host processes the signals obtained by the probe into an ultrasound image and automatically makes a judgment from the processed image, determining parameters such as the depth and diameter of the blood vessel; the host controls the needle attached to the second robot arm to be inserted automatically into the target blood vessel or tissue; and as the second robot arm drives the intravenous needle ever deeper after it has entered the blood vessel or tissue, the host controls the probe on the first robot arm to move in coordination with the trajectory of the needle on the second robot arm, so as to ensure that the needle on the second robot arm is always inserted near the center of the blood vessel.
Through the above technical solution, the present invention solves the problem of how to select a blood vessel of suitable thickness and shape, ensuring comfort in the later stage of the intravenous injection.
Brief Description of the Drawings
FIG. 1 is a schematic structural diagram of an exemplary ultrasound guidance assistance device of the present invention;
FIG. 2 is a schematic structural diagram of an ultrasound guidance assistance device of the present invention that includes an infrared irradiation light source;
FIG. 3 is a schematic structural diagram of the working state of another exemplary first robot arm and second robot arm of the present invention;
FIG. 4 is a cross-sectional structural view of the working state of another exemplary first robot arm and second robot arm of the present invention;
FIG. 5 is a schematic diagram of the needle of the present invention after it has been pierced into a blood vessel;
FIG. 6 is a block diagram of an exemplary ultrasound guidance assistance system of the present invention;
FIG. 7 is a block diagram of an exemplary ultrasound guidance assistance system of the present invention that includes an infrared light source;
FIG. 8 is a block diagram of an exemplary image processing unit of the present invention.
Detailed Description of the Embodiments
The present invention is further described below with reference to the accompanying drawings and embodiments.
As shown in FIG. 1 and FIG. 6, the ultrasound-guided assistance device for a needle includes a probe 100, a first robot arm 200, a second robot arm 400, and a host 500.
The first robot arm 200 includes a first arm 210, a second arm 220, and a third arm 230. Through the first arm 210, the second arm 220, and the third arm 230, the first robot arm 200 rotates the probe 100 in the three spatial dimensions of the X axis, Y axis, and Z axis, so that the probe can be moved quickly to the tissue to be punctured and injected. When the probe 100 reaches the tissue to be punctured and injected, the host controls the probe 100 to transmit ultrasound to, and receive ultrasound from, the tissue to be punctured and injected 600.
The first robot arm 200 and the second robot arm 400 are mounted on their respective bases. In FIG. 2, the first robot arm 200 is mounted on a base 270 and the second robot arm 400 is mounted on another base.
The probe 100 is used to transmit ultrasonic signals to, and receive them from, the human tissue to be punctured and injected; the first robot arm 200 is used to move the probe 100 to the human tissue to be punctured and injected 600; the needle 300 is used to puncture the human tissue to be punctured and injected 600; the second robot arm 400 is used to fix the needle 300, move the needle 300 to the human tissue to be injected 600, and then pierce it into the human tissue to be punctured and injected 600; and the host 500 controls the first robot arm 200 to move the probe 100 to the human tissue to be punctured and injected 600, with the probe 100 transmitting and receiving ultrasound signals. The host 500 synthesizes an ultrasound image from the ultrasound signals transmitted by the probe 100; the host 500 calculates from the ultrasound image at least the puncture distance and puncture angle between the needle 300 and the tissue to be punctured and injected 600; and the host 500, according to the puncture distance and puncture angle, controls the second robot arm 400 to move the needle 300 to the corresponding puncture distance and puncture angle and then perform the puncture.
The displacement calculation unit of the host 500 calculates from the ultrasound image the three-dimensional coordinates of the center point within the tissue to be punctured and injected, and calculates from these three-dimensional coordinates the puncture distance and puncture angle between the needle and the tissue to be punctured and injected.
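To make the geometry concrete, the following minimal Python sketch converts a vessel center point, expressed in the same coordinate frame as the needle entry point, into a puncture distance and an insertion angle relative to the skin surface. The function name, frame conventions, and example numbers are illustrative assumptions, not part of the original disclosure.

    import numpy as np

    def puncture_distance_and_angle(vessel_center, needle_entry, skin_normal):
        """Puncture distance and insertion angle (measured from the skin
        surface) from the needle entry point to the vessel center point,
        with all points expressed in the same coordinate frame."""
        center = np.asarray(vessel_center, dtype=float)
        entry = np.asarray(needle_entry, dtype=float)
        normal = np.asarray(skin_normal, dtype=float)
        normal = normal / np.linalg.norm(normal)

        path = center - entry                          # insertion vector
        distance = np.linalg.norm(path)                # puncture distance
        depth = abs(np.dot(path, normal))              # component along the skin normal
        lateral = np.sqrt(max(distance ** 2 - depth ** 2, 0.0))
        angle_deg = np.degrees(np.arctan2(depth, lateral))
        return distance, angle_deg

    # Example: a vessel center 4 mm deep and 10 mm ahead of the entry point.
    d, a = puncture_distance_and_angle([0.010, 0.0, -0.004], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0])
    print(round(d * 1000, 1), "mm at", round(a, 1), "degrees")   # 10.8 mm at 21.8 degrees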
From the ultrasound signals transmitted by the probe 100, the host 500 acquires the position of the needle 300 within the tissue to be punctured and injected, for example its position within the blood vessel, and controls the first robot arm 200 and the second robot arm 400 to move in coordination, keeping the needle 300 at a position the probe can detect and keeping the tip of the needle 300 within the tissue to be punctured and injected, for example at the target area of the blood vessel.
For clarity, it will be understood that the term "tip of the needle" refers to the segment of the needle that pierces into the human tissue.
The second robot arm 400 includes a fixing device 320 for the needle 300, and the fixing device can be adjusted for needles of different shapes.
As shown in FIG. 1 and FIG. 2, a fixing device 320 for the needle 300 is provided on the second robot arm, and a catheter 310 of the needle is connected to the needle 300. The fixing device 320 can be replaced according to the shape of the needle 300.
The ultrasound guidance assistance device of the present invention further includes a display, which may be one of, or a combination of, existing displays such as a cathode ray tube (CRT), a liquid crystal display (LCD), a plasma panel projector, an OLED display, or an LED display, or any other display now known or later developed for displaying images.
As shown in FIG. 6, in an exemplary embodiment of the present invention, the ultrasound-guided assistance system for a needle includes a host, a first robot arm, a second robot arm, and a probe. The host 500 includes an image processing unit, a probe control unit, a displacement calculation unit, a first robot arm control unit, and a second robot arm control unit. The first robot arm control unit controls the first robot arm to move the probe to the human tissue to be punctured and injected, and the probe control unit controls the probe to transmit and receive ultrasound signals; the image processing unit synthesizes an ultrasound image from the ultrasound signals transmitted by the probe; the displacement calculation unit calculates from the ultrasound image at least the puncture distance and puncture angle between the needle and the tissue to be punctured and injected; the second robot arm control unit, according to the puncture distance and puncture angle, controls the second robot arm to move the needle to the corresponding puncture distance and puncture angle and then perform the puncture. The displacement calculation unit obtains the position of the needle in the blood vessel from the ultrasound image; the first robot arm control unit controls the first robot arm to move in coordination with the second robot arm controlled by the second robot arm control unit, so that the needle is always kept at a position the probe can detect and the tip of the needle is kept inside the blood vessel.
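The division of labour among these units can be pictured with the small Python skeleton below; the class and method names are placeholders chosen for illustration, and the collaborator objects stand in for the real probe, image-processing, displacement-calculation, and arm-control drivers.

    from dataclasses import dataclass

    @dataclass
    class PuncturePlan:
        distance_mm: float
        angle_deg: float

    class Host:
        """Skeleton of the host's division of labour between the units of FIG. 6."""

        def __init__(self, probe_ctrl, image_proc, displacement_calc, arm1_ctrl, arm2_ctrl):
            self.probe_ctrl = probe_ctrl
            self.image_proc = image_proc
            self.displacement_calc = displacement_calc
            self.arm1_ctrl = arm1_ctrl
            self.arm2_ctrl = arm2_ctrl

        def guide_puncture(self, target_tissue):
            # 1. The first arm brings the probe to the tissue and imaging starts.
            self.arm1_ctrl.move_probe_to(target_tissue)
            raw_signals = self.probe_ctrl.acquire()
            # 2. The image processing unit synthesizes the ultrasound image.
            image = self.image_proc.synthesize(raw_signals)
            # 3. The displacement calculation unit derives distance and angle.
            distance_mm, angle_deg = self.displacement_calc.plan(image)
            plan = PuncturePlan(distance_mm, angle_deg)
            # 4. The second arm moves the needle to that pose and punctures.
            self.arm2_ctrl.execute(plan)
            return plan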
In this embodiment, position parameters of the probe 100 are preset in the host 500, and the host 500 can determine the position quickly, either automatically or according to the user's selection. For example, the user asks the person to be examined to place the tissue to be punctured and injected in a specific area, and the user makes a selection on an input device on the host or an input device connected to the host. The host 500 determines the approximate position of the probe 100 according to the blood vessel or tissue selected by the user, and quickly moves the probe 100 to the surface of the tissue to be punctured and injected 600 through the first robot arm 200. At the same time, the host 500 controls the probe 100 to transmit and receive ultrasound, and the image processing unit in the host (shown in FIG. 6) processes the obtained ultrasound signals to obtain an ultrasound image.
The displacement calculation unit in the host 500 (shown in FIG. 6) judges parameters such as the thickness, depth, diameter, and shape of the blood vessel or tissue from the ultrasound image. For example, the probe 100 is moved by a certain displacement along the horizontal X axis of the tissue to be punctured and injected 600 (for example, parallel to the direction of the finger), and the displacement calculation unit selects a suitable position of the blood vessel or tissue from the ultrasound images acquired over this distance, for example the position where the blood vessel is thickest and relatively smooth. At this point, the host 500 controls the first robot arm 200 to move the probe 100 to a certain position; for ease of understanding, this position is described as the A position.
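A minimal sketch of this selection step is shown below, assuming a per-frame vessel diameter has already been measured during the sweep; the scoring rule and weights are illustrative assumptions rather than parameters taken from the patent.

    import numpy as np

    def select_scan_position(diameters_mm, smooth_weight=0.5):
        """Score each frame of a sweep along the X axis by vessel diameter
        (thicker is better) and local smoothness (small diameter variation is
        better), and return the index of the best frame."""
        d = np.asarray(diameters_mm, dtype=float)
        variation = np.abs(np.gradient(d))                      # local variation
        d_norm = (d - d.min()) / (np.ptp(d) + 1e-9)             # normalise to [0, 1]
        v_norm = (variation - variation.min()) / (np.ptp(variation) + 1e-9)
        score = d_norm - smooth_weight * v_norm
        return int(np.argmax(score))

    # Example sweep: the vessel is widest and fairly even around frame 6.
    profile = [2.1, 2.3, 2.2, 2.8, 3.4, 3.5, 3.6, 3.5, 2.9, 2.4]
    print("best frame:", select_scan_position(profile))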
The first robot arm 200 also includes a rotary articulated arm 240, which can rotate the probe 100 to a suitable angle, so that the desired probe angle can be reached without complex displacements of the first arm 210, the second arm 220, and the third arm 230 of the first robot arm 200. The rotary articulated arm 240 is connected to the third arm 230 through a rotary joint 260.
When the first robot arm 200 has moved the probe to the A position, the rotary articulated arm 240 rotates by a certain angle, for example 30°, 60°, 90°, 120°, 150°, or 180°; the rotation angle and the number of rotations can also be set in advance. As the probe 100 rotates, the displacement calculation unit derives a suitable rotation angle position for the probe 100 from the ultrasound images, and the rotary articulated arm 240 rotates the probe 100 to the suitable rotation angle position judged by the displacement calculation unit. The position of the probe 100 is now determined; for ease of understanding, this position is described as the B position. The displacement calculation unit then calculates from the ultrasound image parameters such as the depth, diameter, length, and shape of the blood vessel or tissue. Combining the patient's information, such as height, weight, age, gender, and examination site, with these blood vessel or tissue parameters, the displacement calculation unit calculates the displacement and puncture angle required for the needle relative to a marked position on the probe 100, and the second robot arm 400 performs the corresponding displacement and angular rotation so that the needle 300 reaches the calculated specified position; for ease of understanding, this position is described as the C position. The second robot arm 400 pierces the needle 300 into the tissue to be punctured and injected 600 according to the puncture distance and puncture angle calculated by the displacement calculation unit.
After the first robot arm 200 controls the probe to rotate, multiple frames of ultrasound images of the blood vessel are obtained. From these frames the displacement calculation unit calculates the minimum cross-sectional area of the blood vessel in the ultrasound images, and in this way judges whether the probe position is orthogonal to the target vessel. The first robot arm 200 controls the probe 100 to move to a position orthogonal to the blood vessel, the displacement calculation unit calculates the three-dimensional coordinates of the blood vessel center point at this time, and the displacement calculation unit converts the three-dimensional coordinates of the blood vessel center point into the puncture distance and puncture angle with which the second robot arm 400 is to drive the needle.
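One way to sketch the orthogonality check and center-point extraction in Python, assuming a segmented vessel mask is available for each rotation frame; the helper names and the simple pixel-to-millimetre mapping are assumptions made for illustration only.

    import numpy as np

    def most_orthogonal_frame(vessel_masks, pixel_area_mm2):
        """Across frames taken at different probe rotation angles, pick the one
        whose segmented vessel cross-section has the smallest area: that frame
        is treated as the most nearly orthogonal to the vessel."""
        areas = [float(mask.sum()) * pixel_area_mm2 for mask in vessel_masks]
        best = int(np.argmin(areas))
        return best, areas[best]

    def vessel_center_3d(mask, pixel_size_mm, probe_position_mm):
        """Centroid of the segmented lumen in the chosen frame, converted to
        3-D coordinates under the simplifying assumption that the image lateral
        and depth axes are aligned with the probe pose reported by the arm."""
        rows, cols = np.nonzero(mask)
        lateral = cols.mean() * pixel_size_mm      # along the probe face
        depth = rows.mean() * pixel_size_mm        # into the tissue
        px, py, pz = probe_position_mm
        return np.array([px + lateral, py, pz - depth])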
When the tip of the needle 300 reaches a certain position, it is detected by the probe 100, and the host 500 obtains ultrasound images of the positional relationship between the needle 300 and the blood vessel. As the second robot arm 400 continues to push the needle 300 into the blood vessel, the displacement calculation unit calculates the distance that the first robot arm 200 should move in the direction opposite to the insertion direction of the needle 300, and controls the first robot arm 200 to move the probe 100 in the direction opposite to the needle insertion, so as to keep the tip of the needle 300 at a preset position in the blood vessel, for example with the tip of the needle 300 approximately on the centerline 630 of the blood vessel.
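A toy version of one cycle of this coordinated motion might look as follows; the proportional gain and step limit are invented illustrative values rather than figures taken from the disclosure.

    def probe_tracking_step(tip_offset_mm, advance_mm, gain=0.5, max_step_mm=2.0):
        """One cycle of the coordinated motion: as the second arm advances the
        needle by advance_mm, the first arm moves the probe by the same amount
        in the opposite direction so the tip stays visible, and the needle
        heading is nudged back toward the centerline 630 according to the
        measured tip offset (positive = tip above the centerline)."""
        probe_step_mm = -advance_mm                    # move probe opposite to insertion
        correction_mm = -gain * tip_offset_mm          # proportional centring correction
        correction_mm = max(-max_step_mm, min(max_step_mm, correction_mm))
        return probe_step_mm, correction_mm

    # Example: the needle advanced 1.5 mm and the tip drifted 0.6 mm off-centre.
    print(probe_tracking_step(tip_offset_mm=0.6, advance_mm=1.5))   # (-1.5, -0.3)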
As shown in FIG. 2, FIG. 7, and FIG. 8, this embodiment includes the probe 100, the first robot arm 200, the needle 300, the second robot arm 400, the host 500, an infrared irradiation light source 700, and a camera. The infrared irradiation light source 700 is mounted on the third arm 230 of the first robot arm.
When preparing to examine the tissue to be punctured and injected 600, the host 500 controls the infrared irradiation light source 700 to emit infrared light, the camera collects the infrared image of the blood vessels produced when the infrared light source illuminates the surface of the tissue to be punctured and injected 600, the displacement calculation unit calculates from the infrared blood-vessel image obtained by the image processing unit the A position that the probe 100 needs to reach, and the first robot arm control unit controls the first robot arm 200 to move through the required displacement to the A position.
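As a rough illustration of how an A position could be derived from the infrared image (veins absorb near-infrared light and appear dark), the sketch below takes the centroid of the dark pixels; the threshold and pixel scale are assumed values, not figures from the patent.

    import numpy as np

    def probe_target_from_ir(ir_image, vessel_threshold=0.6, mm_per_pixel=0.3):
        """Estimate the A position from an infrared frame: threshold the
        normalised image, take the centroid of the dark (vessel) pixels, and
        convert it to millimetre offsets in the camera frame. Returns None if
        no vessel pixels are found."""
        img = np.asarray(ir_image, dtype=float)
        img = (img - img.min()) / (np.ptp(img) + 1e-9)
        vessel = img < vessel_threshold            # dark pixels = candidate vessels
        if not vessel.any():
            return None
        rows, cols = np.nonzero(vessel)
        return cols.mean() * mm_per_pixel, rows.mean() * mm_per_pixel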
As shown in FIG. 7, in an exemplary embodiment of the present invention the host 500 includes an image processing unit, a probe control unit, a displacement calculation unit, a first robot arm control unit, a second robot arm control unit, an infrared light source processing unit, and an infrared image acquisition unit.
The infrared light source processing unit controls the infrared irradiation light source 700 to emit infrared light, and the infrared image acquisition unit controls the camera to collect the infrared image of the blood vessels produced when the infrared light source illuminates the surface of the tissue to be punctured and injected 600. The first robot arm control unit controls the first robot arm to move the probe to the human tissue to be punctured and injected, and the probe control unit controls the probe to transmit and receive ultrasound signals; the image processing unit synthesizes an ultrasound image from the ultrasound signals transmitted by the probe; the displacement calculation unit calculates from the ultrasound image at least the puncture distance and puncture angle between the needle and the tissue to be punctured and injected; the second robot arm control unit, according to the puncture distance and puncture angle, controls the second robot arm to move the needle to the corresponding puncture distance and puncture angle and then perform the puncture. The displacement calculation unit obtains the position of the needle in the blood vessel from the ultrasound image; the first robot arm control unit controls the first robot arm to move in coordination with the second robot arm controlled by the second robot arm control unit, so that the needle is always kept at a position the probe can detect and the tip of the needle is kept inside the blood vessel.
As shown in FIG. 8, the image processing unit includes an image fusion processing unit, an ultrasound image processing unit, and an infrared image processing unit. The image fusion processing unit fuses the ultrasound image obtained by the ultrasound image processing unit with the infrared image obtained by the infrared image processing unit to produce a new two-dimensional, three-dimensional, or four-dimensional image, so that a more vivid simulated image of the needle 300 being pierced into the tissue to be punctured and injected is displayed on the display.
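A minimal stand-in for the fusion step, assuming the ultrasound and infrared images have already been registered onto a common grid; the simple alpha blend is only one possible fusion rule and is not taken from the patent.

    import numpy as np

    def fuse_ultrasound_infrared(us_img, ir_img, alpha=0.6):
        """Blend a grayscale ultrasound frame with an infrared vessel image that
        has already been resampled onto the same grid (registration is assumed
        to have been done upstream). Returns a float image in [0, 1]."""
        us = np.asarray(us_img, dtype=float)
        ir = np.asarray(ir_img, dtype=float)
        if us.shape != ir.shape:
            raise ValueError("images must already be registered to the same grid")
        us = (us - us.min()) / (np.ptp(us) + 1e-9)     # normalise each modality
        ir = (ir - ir.min()) / (np.ptp(ir) + 1e-9)
        return alpha * us + (1.0 - alpha) * ir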
The image processing unit also includes an image judging unit (not shown in the figures). The image judging unit judges the quality of the ultrasound image by analyzing various ultrasound parameters of the image, such as contrast, TGC, and elastic modulus, or by comparing the image against an expert picture library inside the machine. For example, if the ultrasound image quality does not reach the preset image standard and the contact pressure between the probe 100 and the tissue to be punctured needs to be increased, the image processing unit transmits this information to the displacement calculation unit, and the probe 100 is controlled through the first robot arm 200 to move further toward the tissue to be punctured and injected.
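A toy quality check in the spirit of this paragraph might use a simple contrast measure; the thresholds are illustrative, and the real unit would also use TGC, elasticity, and the expert picture library mentioned above.

    import numpy as np

    def ultrasound_quality_ok(frame, min_contrast=0.15, min_mean=0.05):
        """Report whether a frame (pixel values assumed to lie in [0, 1]) meets
        a preset standard, using RMS contrast and mean brightness as crude
        stand-ins for the contrast / TGC / elasticity checks described above.
        A False result would prompt pressing the probe closer to the tissue."""
        img = np.asarray(frame, dtype=float)
        return bool(img.std() >= min_contrast and img.mean() >= min_mean)

    # Example: a nearly uniform, poorly coupled frame fails the check.
    poor = np.full((64, 64), 0.2) + 0.01 * np.random.rand(64, 64)
    print(ultrasound_quality_ok(poor))   # False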
As shown in FIG. 3 and FIG. 4, in another exemplary embodiment of the present invention the second robot arm 400 is mounted on the first robot arm 200, or the second robot arm 400 and the first robot arm 200 are both connected to a common base 270. When the second robot arm 400 is mounted on the first robot arm 200, it causes little interference with the ultrasound image, which is favorable for imaging.
In FIG. 4, the second robot arm 400 is connected to the rotary articulated arm 240 of the first robot arm through an angle adjustment lever 410.
The various processing units of the host 500 of the present invention are processors (for example, CPU or GPU); the processors may be general-purpose processors, application-specific integrated circuits, digital signal processors, controllers, field-programmable gate arrays, digital devices, analog devices, transistors, combinations thereof, or other devices now known or later developed for receiving analog or digital data and outputting changed or calculated data.
In another embodiment of the present invention, a third robot arm is provided for controlling one or more probes, which are used for ultrasound imaging together with the probe controlled by the first robot arm. The third robot arm may also be provided with an infrared irradiation light source, a camera, or the like, for emitting infrared light or collecting infrared images.
In the claims and the specification, the terms "first," "second," and "third" are used purely as labels and are not intended to impose numerical requirements on the items they modify.
It should be noted that when an element or component is described as being "fixed to" another element or component, it may be directly on the other element or intervening elements may also be present. When an element is described as being "connected to" another element, it may be directly connected to the other element or intervening elements may also be present. The terms "center," "horizontal," "parallel," and similar expressions used herein are for illustrative purposes only and do not represent the only possible implementation.
The above is only a description of specific embodiments of the present invention. It should be understood that those skilled in the art may make various simple changes and equivalent substitutions without departing from the true spirit and scope of the invention to achieve the stated purpose of the invention, and all such modifications fall within the scope of the appended claims.

Claims (13)

  1. An ultrasound guidance assistance device for a needle, characterized by comprising:
    a probe for transmitting ultrasonic signals to, and receiving them from, the tissue to be punctured and injected;
    a first robot arm for moving the probe to the tissue to be punctured and injected;
    a second robot arm that fixes the needle used for puncture, moves the needle close to the tissue to be punctured and injected, and then pierces it into the tissue to be punctured and injected;
    a host that controls the first robot arm to move the probe to the tissue to be punctured and injected, with the probe transmitting and receiving ultrasound signals;
    wherein the host synthesizes an ultrasound image from the ultrasound signals transmitted by the probe; the host calculates from the ultrasound image at least the puncture distance and puncture angle between the needle and the tissue to be punctured and injected;
    and the host, according to the puncture distance and puncture angle, controls the second robot arm to move the needle to the corresponding puncture distance and puncture angle and then perform the puncture.
  2. The ultrasound guidance assistance device for a needle according to claim 1, characterized in that: the host acquires the position of the needle within the tissue to be punctured and injected from the ultrasound signals transmitted by the probe, and controls the second robot arm to keep the tip of the needle within the target area of the tissue to be punctured and injected.
  3. The ultrasound guidance assistance device for a needle according to claim 2, characterized in that: the host controls the first robot arm to move in coordination with the second robot arm so that the needle tip is always kept at a position that can be detected by the probe, and controls the second robot arm to keep the tip of the needle within the target area of the tissue to be punctured and injected.
  4. The ultrasound guidance assistance device for a needle according to claim 1, characterized in that: the tissue to be punctured and injected includes one or more of arteries, veins, nerves, and organs.
  5. The ultrasound guidance assistance device for a needle according to claim 4, characterized in that: the host obtains the minimum cross-sectional area of the artery or vein from the ultrasound image of the artery or vein; the host controls the probe, through the first robot arm, to be orthogonal to the artery or vein.
  6. The ultrasound guidance assistance device for a needle according to any one of claims 1 to 5, characterized in that: the host calculates from the ultrasound image the three-dimensional coordinates of the center point within the tissue to be punctured and injected, and the host calculates from these three-dimensional coordinates the puncture distance and puncture angle between the needle and the tissue to be punctured and injected.
  7. The ultrasound guidance assistance device for a needle according to any one of claims 1 to 5, characterized in that: the first robot arm includes a rotary articulated arm, and the rotary articulated arm controls the rotation of the probe so that the host can obtain an ultrasound image of the tissue to be punctured and injected.
  8. The ultrasound guidance assistance device for a needle according to any one of claims 1 to 5, characterized in that: the ultrasound guidance device for the needle includes an infrared irradiation light source used to irradiate the tissue to be punctured and injected with infrared light; the host further includes an infrared image acquisition unit for controlling a camera to acquire an infrared image of the tissue to be punctured and injected.
  9. The ultrasound guidance assistance device for a needle according to any one of claims 1 to 5, characterized in that:
    the second robot arm and the first robot arm are installed separately on their respective bases or on the same base; or,
    the second robot arm is fixedly mounted on the first robot arm.
  10. An ultrasound guidance assistance system for a needle, characterized by comprising:
    a probe for transmitting ultrasonic signals to, and receiving them from, the tissue to be punctured and injected;
    a first robot arm for moving the probe to the tissue to be punctured and injected;
    a second robot arm that fixes the needle used for puncture, moves the needle to the tissue to be injected, and then pierces it into the tissue to be punctured and injected;
    a host including a first robot arm control unit, a second robot arm control unit, a probe control unit, an image processing unit, and a displacement calculation unit; the first robot arm control unit controls the first robot arm to move the probe to the tissue to be punctured and injected, and the probe control unit controls the probe to transmit and receive ultrasound signals;
    the image processing unit synthesizes an ultrasound image from the ultrasound signals transmitted by the probe;
    the displacement calculation unit calculates from the ultrasound image at least the puncture distance and puncture angle between the needle and the tissue to be punctured and injected;
    the second robot arm control unit, according to the puncture distance and puncture angle, controls the second robot arm to move the needle to the corresponding puncture distance and puncture angle and then perform the puncture.
  11. The ultrasound guidance assistance system for a needle according to claim 10, characterized in that: the displacement calculation unit obtains the position of the needle in the tissue to be punctured and injected from the ultrasound image; the second robot arm control unit controls the second robot arm to keep the tip of the needle within the target area of the tissue to be punctured and injected.
  12. The ultrasound guidance assistance system for a needle according to claim 10, characterized in that:
    the displacement calculation unit obtains the position of the needle within the tissue to be punctured and injected from the ultrasound image; the first robot arm control unit controls the first robot arm to move in coordination with the second robot arm controlled by the second robot arm control unit, so that the needle is always kept at a position the probe can detect and the tip of the needle is kept within the target area of the tissue to be punctured and injected.
  13. The ultrasound guidance assistance system for a needle according to claim 10, characterized in that:
    the ultrasound guidance assistance system further includes an infrared irradiation light source, and the host further includes an infrared light source processing unit for controlling the infrared irradiation light source to emit infrared light; the ultrasound guidance assistance system further includes a camera, and the host further includes an infrared image acquisition unit for controlling the camera to acquire an infrared image of the tissue to be punctured and injected;
    the image processing unit includes an image fusion processing unit, an ultrasound image processing unit, and an infrared image processing unit; the image fusion processing unit fuses the ultrasound image obtained by the ultrasound image processing unit with the infrared image obtained by the infrared image processing unit to generate a new image.
PCT/CN2018/124034 2018-06-27 2018-12-26 Ultrasound-guided assistance device and system for needle WO2020000963A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/254,996 US20210322106A1 (en) 2018-06-27 2018-12-26 Ultrasound-guided assistance device and system for needle
EP18924220.9A EP3815636A4 (en) 2018-06-27 2018-12-26 ULTRASOUND-GUIDED ASSISTANCE DEVICE AND SYSTEM FOR NEEDLE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810680946.5A CN108814691B (zh) 2018-06-27 2018-06-27 Ultrasound-guided assistance device and system for needle
CN201810680946.5 2018-06-27

Publications (1)

Publication Number Publication Date
WO2020000963A1 true WO2020000963A1 (zh) 2020-01-02

Family

ID=64139105

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/124034 WO2020000963A1 (zh) 2018-06-27 2018-12-26 Ultrasound-guided assistance device and system for needle

Country Status (4)

Country Link
US (1) US20210322106A1 (zh)
EP (1) EP3815636A4 (zh)
CN (1) CN108814691B (zh)
WO (1) WO2020000963A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113349897A (zh) * 2021-07-13 2021-09-07 安徽科大讯飞医疗信息技术有限公司 Ultrasound puncture guidance method, apparatus, and device
CN113616299A (zh) * 2021-09-14 2021-11-09 常德职业技术学院 Intelligently controlled infusion puncture system
CN117084791A (zh) * 2023-10-19 2023-11-21 苏州恒瑞宏远医疗科技有限公司 Puncture orientation calculation method and puncture operation execution system

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108814691B (zh) * 2018-06-27 2020-06-02 无锡祥生医疗科技股份有限公司 针的超声引导辅助装置及系统
CN109452960B (zh) * 2018-12-11 2024-01-23 北京润美玉之光医疗美容门诊部 一种用于线雕整形手术的控制系统
CN111347410B (zh) * 2018-12-20 2022-07-26 沈阳新松机器人自动化股份有限公司 一种多视觉融合目标引导机器人及方法
CN111358534A (zh) * 2018-12-25 2020-07-03 无锡祥生医疗科技股份有限公司 超声引导穿刺装置及系统
CN109480908A (zh) * 2018-12-29 2019-03-19 无锡祥生医疗科技股份有限公司 换能器导航方法及成像设备
TWI720398B (zh) * 2019-01-03 2021-03-01 國立陽明大學 用於肋膜訊號分析辨識、追蹤測距及顯示的合併方法及其內針超音波系統
CN110547858B (zh) * 2019-09-19 2021-06-01 西安交通大学医学院第一附属医院 一种心血管内科临床穿刺装置
US12036069B2 (en) * 2019-12-05 2024-07-16 Fuji Corporation Ultrasonic diagnosis system
CN110812217B (zh) * 2019-12-13 2022-01-07 陕西中医药大学 一种智能针灸装置
CN111839538A (zh) * 2020-06-05 2020-10-30 哈工大机器人(中山)无人装备与人工智能研究院 一种智能采血机器人
CN111820920B (zh) * 2020-06-05 2021-10-29 哈工大机器人(中山)无人装备与人工智能研究院 一种静脉采血数据处理方法、装置及智能采血机器人
CN111839540A (zh) * 2020-06-05 2020-10-30 哈工大机器人(中山)无人装备与人工智能研究院 智能采血机器人
CN111803214B (zh) * 2020-07-21 2024-06-14 京东方科技集团股份有限公司 一种手术机器人装置
CN112022296B (zh) * 2020-08-31 2021-09-03 同济大学 一种静脉穿刺装置及方法
CN112890957A (zh) * 2021-01-14 2021-06-04 北京美迪云机器人科技有限公司 一种磁力感应远程定位系统及方法
CN113133832B (zh) * 2021-03-26 2022-09-20 中国科学院深圳先进技术研究院 一种双臂机器人穿刺系统标定方法及系统
CN113303826B (zh) * 2021-06-02 2023-07-07 无锡医百加科技有限公司 智能型超声系统
CN113577458A (zh) * 2021-07-14 2021-11-02 深圳市罗湖医院集团 自动注射方法、装置、电子设备和存储介质
CN114376685B (zh) * 2021-07-20 2023-08-22 牡丹江医学院 一种椎管内穿刺超声探头
CN113413216B (zh) * 2021-07-30 2022-06-07 武汉大学 一种基于超声影像导航的双臂穿刺机器人
CN113749694B (zh) * 2021-10-11 2023-08-15 上海交通大学医学院附属第九人民医院 穿刺取活检及消融系统
IT202100027677A1 (it) * 2021-10-28 2023-04-28 Samuele Innocenti Macchinario per realizzare iniezioni
EP4179980A1 (en) * 2021-11-10 2023-05-17 Koninklijke Philips N.V. Controller for imaging and controlling robotic device using imaging interface
EP4429560A1 (en) * 2021-11-10 2024-09-18 Koninklijke Philips N.V. Controller for imaging and controlling robotic device using imaging interface
CN114287997B (zh) * 2021-12-17 2023-10-03 上海卓昕医疗科技有限公司 医疗穿刺机器人
US20230240754A1 (en) * 2022-02-02 2023-08-03 Mazor Robotics Ltd. Tissue pathway creation using ultrasonic sensors
CN114521943B (zh) * 2022-03-11 2022-11-25 安徽省立医院(中国科学技术大学附属第一医院) 一种骶孔探测智能定位穿刺装置
WO2024070932A1 (ja) * 2022-09-29 2024-04-04 テルモ株式会社 血管穿刺システムおよびその制御方法
CN116983057B (zh) * 2023-09-25 2024-01-23 中南大学 一种可实时多重配准的数字孪生图像穿刺引导系统
CN117444987B (zh) * 2023-12-22 2024-03-15 北京衔微医疗科技有限公司 一种应用于辅助机器人的器械控制方法、系统、终端及存储介质


Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101612062B (zh) * 2008-06-26 2014-03-26 北京石油化工学院 实现超声影像导航定位方法的传感式的六关节机械臂
US8864652B2 (en) * 2008-06-27 2014-10-21 Intuitive Surgical Operations, Inc. Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip
WO2012088471A1 (en) * 2010-12-22 2012-06-28 Veebot, Llc Systems and methods for autonomous intravenous needle insertion
DE102011005917A1 (de) * 2011-03-22 2012-09-27 Kuka Laboratories Gmbh Medizinischer Arbeitsplatz
CN103860194B (zh) * 2012-12-17 2016-08-10 上海精诚医疗器械有限公司 体外冲击波治疗设备的b超定位机构
WO2015179505A1 (en) * 2014-05-20 2015-11-26 Children's Hospital Medical Center Image guided autonomous needle insertion device for vascular access
DE102014226240A1 (de) * 2014-12-17 2016-06-23 Kuka Roboter Gmbh System zur roboterunterstützten medizinischen Behandlung
US10154886B2 (en) * 2016-01-06 2018-12-18 Ethicon Llc Methods, systems, and devices for controlling movement of a robotic surgical system
CN206473396U (zh) * 2016-10-27 2017-09-08 河南曙光健士医疗器械集团股份有限公司 一种机械臂
CN106419960A (zh) * 2016-12-04 2017-02-22 无锡圣诺亚科技有限公司 具备超声穿刺导航的超声仪
CN206867233U (zh) * 2017-02-13 2018-01-12 李淑珍 分离式机械臂彩超检查仪
CN107440745A (zh) * 2017-09-06 2017-12-08 深圳铭锐医疗自动化有限公司 一种超声波机器人检查系统
CN107789001B (zh) * 2017-10-31 2021-08-31 上海联影医疗科技股份有限公司 一种用于成像扫描的摆位方法和系统
CN107970060A (zh) * 2018-01-11 2018-05-01 上海联影医疗科技有限公司 手术机器人系统及其控制方法

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5647373A (en) * 1993-11-07 1997-07-15 Ultra-Guide Ltd. Articulated needle guide for ultrasound imaging and method of using same
US20130197355A1 (en) * 2012-02-01 2013-08-01 Samsung Medison Co., Ltd. Method of controlling needle guide apparatus, and ultrasound diagnostic apparatus using the same
US20170055940A1 (en) * 2014-04-28 2017-03-02 Mazor Robotics Ltd. Ultrasound guided hand held robot
CN205031338U (zh) * 2015-07-16 2016-02-17 执鼎医疗科技江苏有限公司 Venipuncture system with infrared guidance and ultrasonic positioning
US20170080166A1 (en) * 2015-09-18 2017-03-23 Actuated Medical, Inc. Device and System for Insertion of Penetrating Member
CN107811683A (zh) * 2017-10-20 2018-03-20 张还添 Ultrasonic minimally invasive treatment instrument for bone joints
CN108814691A (zh) * 2018-06-27 2018-11-16 无锡祥生医疗科技股份有限公司 Ultrasound-guided assistance device and system for needle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3815636A4 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113349897A (zh) * 2021-07-13 2021-09-07 安徽科大讯飞医疗信息技术有限公司 Ultrasound puncture guidance method, apparatus, and device
CN113349897B (zh) * 2021-07-13 2023-03-24 安徽讯飞医疗股份有限公司 Ultrasound puncture guidance method, apparatus, and device
CN113616299A (zh) * 2021-09-14 2021-11-09 常德职业技术学院 Intelligently controlled infusion puncture system
CN117084791A (zh) * 2023-10-19 2023-11-21 苏州恒瑞宏远医疗科技有限公司 Puncture orientation calculation method and puncture operation execution system
CN117084791B (zh) * 2023-10-19 2023-12-22 苏州恒瑞宏远医疗科技有限公司 Puncture orientation calculation method and puncture operation execution system

Also Published As

Publication number Publication date
CN108814691B (zh) 2020-06-02
EP3815636A4 (en) 2022-03-23
EP3815636A1 (en) 2021-05-05
CN108814691A (zh) 2018-11-16
US20210322106A1 (en) 2021-10-21

Similar Documents

Publication Publication Date Title
WO2020000963A1 (zh) Ultrasound-guided assistance device and system for needle
US10258413B2 (en) Human organ movement monitoring method, surgical navigation system and computer readable medium
JP7422773B2 (ja) Intravenous therapy system for blood vessel detection and vascular access device placement
US20170252002A1 (en) Ultrasonic diagnostic apparatus and ultrasonic diagnosis support apparatus
CN109758233B (zh) Integrated diagnosis and treatment surgical robot system and navigation and positioning method thereof
US20160374644A1 (en) Ultrasonic Guidance of a Probe with Respect to Anatomical Features
CN107970060A (zh) Surgical robot system and control method thereof
WO2008081438A1 (en) Vascular access system and method
CN106108951B (zh) Medical real-time three-dimensional positioning and tracking system and method
CN105107067A (zh) Venipuncture system with infrared guidance and ultrasonic positioning
JP2010269067A (ja) Treatment support device
RU2594100C1 (ru) Method for performing minimally invasive surgical intervention and "RKh-1" apparatus for implementing it
JP2022544625A (ja) System and method for portable ultrasound-guided cannulation
WO2018161620A1 (zh) Venipuncture device and system, and venipuncture control method
WO2017050201A1 (zh) Minimally invasive medical robot system
JP2012045198A (ja) Treatment support device and treatment support system
CN208573801U (zh) Surgical robot system
CN113558735A (zh) Robotic puncture positioning method and device for biliary tract puncture
CN109745074B (zh) Three-dimensional ultrasound imaging system and method
WO2012147733A1 (ja) Treatment support system and medical image processing device
CN116983057B (zh) Digital twin image puncture guidance system capable of real-time multiple registration
CN215874870U (zh) Robotic puncture positioning device for biliary tract puncture
WO2023019479A1 (zh) Robotic puncture positioning method and device for biliary tract puncture
CN209847368U (zh) Integrated diagnosis and treatment surgical robot system
CN112617879A (zh) Intelligent tumor examination device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18924220

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018924220

Country of ref document: EP

Effective date: 20210127