CN115089212A - Three-dimensional vision-guided automatic neck ultrasonic scanning method and system for mechanical arm - Google Patents

Three-dimensional vision-guided automatic neck ultrasonic scanning method and system for mechanical arm

Info

Publication number
CN115089212A
CN115089212A (application CN202210496051.2A)
Authority
CN
China
Prior art keywords
neck
dimensional
point
key point
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210496051.2A
Other languages
Chinese (zh)
Inventor
何昊
李浩
孟玲
肖振
李文科
毛昕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Feique Medical Technology Co ltd
Second Xiangya Hospital of Central South University
Original Assignee
Hunan Feique Medical Technology Co ltd
Second Xiangya Hospital of Central South University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Feique Medical Technology Co ltd, Second Xiangya Hospital of Central South University filed Critical Hunan Feique Medical Technology Co ltd
Priority to CN202210496051.2A
Publication of CN115089212A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08: Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/42: Details of probe positioning or probe attachment to the patient
    • A61B 8/4209: Probe positioning by using holders, e.g. positioning frames
    • A61B 8/4218: Holders characterised by articulated arms
    • A61B 8/4245: Determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B 8/4263: Determining the position of the probe using sensors not mounted on the probe, e.g. mounted on an external reference frame
    • A61B 8/44: Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/46: Ultrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461: Displaying means of special interest
    • A61B 8/466: Displaying means adapted to display 3D data
    • A61B 8/48: Diagnostic techniques
    • A61B 8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B 8/52: Devices using data or image processing specially adapted for ultrasonic diagnosis
    • A61B 8/5215: Processing of medical diagnostic data
    • A61B 8/5223: Extracting a diagnostic or physiological parameter from medical diagnostic data
    • A61B 8/5238: Combining image data of a patient, e.g. merging several images from different acquisition modes into one image
    • A61B 8/5292: Using additional data, e.g. patient information, image labeling, acquisition parameters
    • A61B 8/54: Control of the diagnostic device
    • A61B 34/00: Computer-aided surgery; manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30: Surgical robots
    • A61B 34/32: Surgical robots operating autonomously
    • A61B 34/70: Manipulators specially adapted for use in surgery
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2065: Tracking using image or pattern recognition
    • A61B 2034/2072: Reference field transducer attached to an instrument or patient
    • A61B 2034/305: Details of wrist mechanisms at distal ends of robotic arms
    • A61B 2034/306: Wrists with multiple vertebrae

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Robotics (AREA)
  • Physiology (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention discloses a three-dimensional vision-guided automatic neck ultrasonic scanning method and system for a mechanical arm. The method comprises the following steps: S1, performing three-dimensional reconstruction of the human neck; S2, detecting neck key points on the three-dimensional reconstructed image with a preset key point detection network to obtain neck key point information; S3, planning a path according to the neck key point information and a set matching rule; and S4, guiding the mechanical arm to perform neck ultrasonic scanning according to the planned path. The invention extracts feature points from the three-dimensional visual information of the patient captured by a camera, automatically plans the motion path of the mechanical arm, and guides the mechanical arm to perform neck ultrasonic scanning along this path, thereby facilitating thyroid diagnosis.

Description

Three-dimensional vision-guided automatic neck ultrasonic scanning method and system for mechanical arm
Technical Field
The invention relates to the technical field of ultrasound, and in particular to a three-dimensional vision-guided automatic neck ultrasonic scanning method and system for a mechanical arm.
Background
Ultrasonic equipment can be used to examine the thyroid region and thereby assess the health of a patient's thyroid and adjacent tissue; at present, thyroid lesions are mainly detected by color Doppler ultrasonography. In recent years, technology that uses mechanical arms in place of manual labor has developed and spread rapidly; particularly in the industrial and medical-care fields, mechanical arms have achieved a degree of popularization and have greatly freed human resources. In the medical field, however, existing mechanical arms are mainly applied in the nursing industry, such as massage and health care, and there are few directly related technical applications for diagnosis and treatment assistance or for drug research and development.
Disclosure of Invention
The invention aims to provide a three-dimensional vision-guided automatic neck ultrasonic scanning method and system for a mechanical arm that overcome the defects in the prior art.
To achieve this purpose, the invention adopts the following technical scheme:
A three-dimensional vision-guided automatic neck ultrasonic scanning method for a mechanical arm comprises the following steps:
S1, performing three-dimensional reconstruction of the human neck;
s2, performing neck key point detection on the three-dimensional reconstructed image according to a preset key point detection network to obtain neck key point information;
s3, planning a path according to the neck key point information and the set matching rule;
and S4, guiding the mechanical arm to perform neck ultrasonic scanning according to the planned path.
Further, the step S1 specifically includes:
s11, shooting the neck of the human body by using a binocular camera to obtain a color image of left and right visual fields;
s12, calculating three-dimensional information of each pixel point in the left view image based on a three-dimensional measurement principle;
s13, segmenting the color image according to a preset semantic segmentation network to obtain pixel point coordinates belonging to a human body part;
and S14, constructing a mask from the pixel coordinates obtained in step S13, using the mask to index the three-dimensional pixel information matrix obtained in step S12 to obtain the spatial coordinates and normal vectors of each point of the human body, and constructing a point cloud.
Further, the preset key point detection network is a CenterNet key point detection network, which adopts ResNet50 or Hourglass as its backbone feature extraction network to obtain feature information from the image and then performs heatmap prediction with this feature information to detect the key points.
Further, the specific steps of detecting the neck key points in step S2 include:
s21, inputting the obtained color image into a CenterNet key point detection network, and obtaining a pixel coordinate of a neck key point;
s22, indexing the three-dimensional information matrix by using the pixel coordinates and acquiring the three-dimensional coordinates of each key point;
and S23, acquiring position indexes of all key points in the point cloud according to the three-dimensional coordinates to obtain the spatial position and normal vector information of each key point.
Further, the neck keypoints comprise a bottom center point, a bottom left point, a bottom right point, a top left point, and a top right point.
Further, the step S3 specifically includes:
s31, using the obtained spatial position of the key point and the normal vector information as nodes on the motion path of the tail end of the mechanical arm, and interpolating between the nodes to obtain a smooth and complete path;
s32, calculating the Euclidean distance between each point and the interpolation point in the point cloud, setting the minimum Euclidean distance as the matching point of the interpolation point, and acquiring the spatial position and the normal vector information of the matching point from the point cloud.
Further, the step S4 specifically includes:
s41, constructing a Cartesian three-dimensional coordinate system Oxyz by taking the normal vector as the positive direction of the z axis, and enabling the coordinate system to pass through a rotation matrix T m Conversion to the camera coordinate System Oxyz camera Rotation matrix T using C + + Eigen library m Converting into quaternion data;
s42, converting the quaternion data in the camera coordinate system into the data in the base coordinate system according to the following formula,
T object-in-base =T tool-in-base *T cam-in-too *T object-in-can
wherein, T tool-in-base As the end joint coordinate system, T cam-in-too Is a tool end coordinate system, T object-in-can As camera coordinate system, T object-in- Is a base coordinate system;
and S43, calculating the angle of each joint according to the known space position and attitude of the tail end of the mechanical arm.
Further, step S43 specifically comprises: deriving the transformation matrix from the tool end coordinate system to the base coordinate system by the D-H parameter calibration method, inverse-transforming the transformation matrix of each joint, and performing equation matching and variable separation to obtain the angle of each joint.
Further, during the neck ultrasonic scanning by the mechanical arm in step S4, a direct force-touch feedback control algorithm feeds the force measured by the end force sensor directly back to the initial position through a proportional coefficient, and this term serves as a position correction to correct the initial position.
The invention also provides a system for the above three-dimensional vision-guided automatic neck ultrasonic scanning method of a mechanical arm, comprising:
a three-dimensional reconstruction module for performing three-dimensional reconstruction of the human neck;
the key point detection module is used for detecting key points of the neck of the three-dimensional reconstructed image according to a preset key point detection network to obtain key point information of the neck;
the path planning module is used for planning paths according to the neck key point information and the set matching rule;
and the guiding module is used for guiding the mechanical arm to carry out neck ultrasonic scanning according to the path planning.
Compared with the prior art, the invention has the advantage that the method and system extract feature points from the three-dimensional visual information of the patient captured by a camera, automatically plan the motion path of the mechanical arm, and guide the mechanical arm to perform neck ultrasonic scanning along this path, thereby facilitating thyroid diagnosis.
Drawings
In order to more clearly illustrate the embodiments or technical solutions of the present invention, the drawings used in their description are briefly introduced below. Clearly, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flow chart of the three-dimensional vision-guided robotic arm automated neck ultrasound scanning method of the present invention.
Fig. 2 is a diagram showing the result of human body segmentation in the present invention.
FIG. 3 is a graph showing the detection results of key points in the present invention.
FIG. 4 is a rendering effect diagram of each path point in the human body point cloud.
Fig. 5 is a schematic diagram of the hand-eye calibration of the present invention.
FIG. 6 is a schematic view of the joint angle of the robot arm of the present invention.
FIG. 7 is a direct force-touch feedback control algorithm diagram in accordance with the present invention.
FIG. 8 is a block diagram of the three-dimensional vision guided robotic neck ultrasound scanning system of the present invention.
Detailed Description
The preferred embodiments of the present invention are described in detail below with reference to the accompanying drawings, so that the advantages and features of the invention can be more readily understood by those skilled in the art and the scope of protection of the invention is more clearly defined.
Referring to fig. 1, this embodiment discloses a three-dimensional vision-guided automatic neck ultrasonic scanning method for a mechanical arm, comprising the following steps:
step S1, carrying out three-dimensional reconstruction on the neck of the human body, which specifically comprises the following steps:
and step S11, shooting the neck of the human body by using a binocular camera to obtain a color image of left and right visual fields.
And step S12, calculating the three-dimensional information of each pixel point in the left view image based on the three-dimensional measurement principle.
Step S13, segmenting the color image with a preset DeepLabV3+ semantic segmentation network to obtain the pixel coordinates belonging to the human body. The DeepLabV3+ network mainly comprises an encoding part and a decoding part and achieves pixel-level segmentation of a target object; its input is an RGB color image and its output is the segmentation result of the target object in the image. Feeding the color image into the DeepLabV3+ network yields an accurate segmentation of the human body, as shown in fig. 2.
Step S14, constructing a mask from the pixel coordinates obtained in step S13, using the mask to index the three-dimensional pixel information matrix obtained in step S12 to obtain the spatial coordinates and normal vectors of each point of the human body, and constructing a point cloud, thereby realizing three-dimensional reconstruction of the human body.
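For concreteness, the following is a minimal C++ sketch of this masking step; it assumes a row-major per-pixel information matrix and a binary mask, and the structure and function names are illustrative rather than taken from the patent.

```cpp
#include <cstdint>
#include <vector>

// One reconstructed surface point: spatial coordinates plus surface normal.
struct SurfacePoint {
    float x, y, z;    // spatial coordinates
    float nx, ny, nz; // unit normal vector
};

// Collect the human-body points selected by the segmentation mask into a
// point cloud. 'info3d' is the per-pixel three-dimensional information
// matrix of step S12 (row-major, width*height entries); 'mask' is the
// binary segmentation result of step S13.
std::vector<SurfacePoint> buildPointCloud(const std::vector<SurfacePoint>& info3d,
                                          const std::vector<std::uint8_t>& mask,
                                          int width, int height) {
    std::vector<SurfacePoint> cloud;
    for (int v = 0; v < height; ++v) {
        for (int u = 0; u < width; ++u) {
            const int idx = v * width + u;
            if (mask[idx] != 0) { // pixel classified as human body
                cloud.push_back(info3d[idx]);
            }
        }
    }
    return cloud;
}
```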
And step S2, performing neck key point detection on the three-dimensional reconstructed image according to a preset key point detection network to obtain neck key point information.
Specifically, the motion path of the mechanical arm may differ because different users have different neck sizes. However, observation of the scanning technique and motion trajectory of professional physicians shows that these paths always pass through several key points of the neck. Detecting the key points of the human neck is therefore crucial to the subsequent path planning of the mechanical arm. Five neck key points are preset in this embodiment: the bottom center point (down), bottom left point (left_down), bottom right point (right_down), top left point (left_top) and top right point (right_top), as shown in fig. 3.
In this embodiment, the preset key point detection network is a CenterNet key point detection network, which adopts ResNet50 or Hourglass as its backbone feature extraction network to obtain feature information from the image and then performs heatmap prediction with this feature information to detect the key points.
The input of the CenterNet key point detection network is an RGB color image, and the output is the pixel coordinates of all key points in the image.
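As an illustration of this decoding step, the sketch below extracts one key point as the arg-max of a single heatmap channel; real CenterNet decoding additionally applies local-maximum filtering and a learned sub-pixel offset head, both omitted here, and the heatmap layout is an assumption.

```cpp
#include <utility>
#include <vector>

// Decode one heatmap channel by taking the pixel with the maximum score as
// the key point location (row-major layout with stride 'width' assumed).
std::pair<int, int> decodeKeypoint(const std::vector<float>& heatmap,
                                   int width, int height) {
    int bestU = 0, bestV = 0;
    float bestScore = heatmap[0];
    for (int v = 0; v < height; ++v) {
        for (int u = 0; u < width; ++u) {
            const float score = heatmap[v * width + u];
            if (score > bestScore) {
                bestScore = score;
                bestU = u;
                bestV = v;
            }
        }
    }
    return {bestU, bestV}; // pixel coordinates of the key point
}
```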
In this embodiment, the specific steps of detecting the neck key points in step S2 include:
step S21, inputting the obtained color image into a CenterNet key point detection network, and obtaining the pixel coordinates of the neck key point;
step S22, indexing a three-dimensional information matrix by using the pixel coordinates and acquiring the three-dimensional coordinates of each key point;
and step S23, acquiring position indexes of all key points in the point cloud according to the three-dimensional coordinates to obtain the spatial position and normal vector information of each key point.
And step S3, planning a path according to the neck key point information and the set matching rule. The method specifically comprises the following steps:
and step S31, utilizing the obtained spatial position of the key point and the normal vector information as nodes on the motion path of the tail end of the mechanical arm, and interpolating between the nodes to obtain a smooth and complete path so as to realize the planning of the autonomous path.
Let the spatial coordinates of key point A be (x_A, y_A, z_A) and those of key point B be (x_B, y_B, z_B). If N points are inserted between the two key points, the spatial coordinates (x_i, y_i, z_i) of the i-th inserted point (i = 1, ..., N) satisfy:
x_i = x_A + i·(x_B - x_A)/(N+1),  y_i = y_A + i·(y_B - y_A)/(N+1),  z_i = z_A + i·(z_B - z_A)/(N+1)
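A minimal sketch of this interpolation (function and type names chosen for illustration):

```cpp
#include <array>
#include <vector>

using Point3 = std::array<double, 3>;

// Insert N points between key points A and B so that the i-th inserted
// point is p_i = A + i/(N+1) * (B - A), i = 1..N.
std::vector<Point3> interpolate(const Point3& A, const Point3& B, int N) {
    std::vector<Point3> points;
    for (int i = 1; i <= N; ++i) {
        const double t = static_cast<double>(i) / (N + 1);
        points.push_back({A[0] + t * (B[0] - A[0]),
                          A[1] + t * (B[1] - A[1]),
                          A[2] + t * (B[2] - A[2])});
    }
    return points;
}
```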
step S32, matching each path point from the point cloud, specifically: and (3) calculating Euclidean distances between each point and the interpolation point in the point cloud, setting the minimum Euclidean distance as a matching point of the interpolation point, and acquiring the spatial position and normal vector information of the matching point from the point cloud, wherein the rendering effect of each path point in the human body point cloud is shown in figure 4.
Step S4, guiding the mechanical arm to perform neck ultrasonic scanning according to the planned path. This step is divided into normal-vector-to-quaternion conversion, hand-eye calibration, and inverse kinematics of the mechanical arm, as follows:
step S41, constructing a Cartesian three-dimensional coordinate system Oxyz by taking the normal vector as the positive direction of the z axis, and passing the coordinate system through a rotation matrix T m Conversion to the camera coordinate System Oxyz camera Rotation matrix T using C + + Eigen library m Converted into quaternion data. The algorithm pseudo code is as follows (need: unit normal vector Tmpvec of measured point):
tmpvec and camera coordinate system Oxyz camera The unit direction vector of the positive direction of the z axis is cross-multiplied to obtain a unit direction vector x; tmpvec is cross-multiplied with x to obtain a unit direction vector y; constructing a Cartesian coordinate system Oxyz coordinate system by taking Tmpvec, x and y as the positive direction of a z axis, the positive direction of an x axis and the positive direction of a y axis respectively; constructing a rotation matrix T m X, y, Tmpvec }; recall Eigen library m ═>Quaterninond obtains quaternion attitude data.
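This pseudocode maps directly onto the Eigen API. The sketch below assumes Tmpvec is already a unit vector and omits a guard for the degenerate case where it is parallel to the camera z-axis.

```cpp
#include <Eigen/Dense>

// Build an orthonormal frame whose z-axis is the measured unit normal and
// convert the resulting rotation matrix T_m into a quaternion.
Eigen::Quaterniond normalToQuaternion(const Eigen::Vector3d& tmpvec) {
    const Eigen::Vector3d camZ(0.0, 0.0, 1.0);                 // camera z-axis
    const Eigen::Vector3d x = tmpvec.cross(camZ).normalized(); // unit x-axis
    const Eigen::Vector3d y = tmpvec.cross(x).normalized();    // unit y-axis
    Eigen::Matrix3d Tm;
    Tm.col(0) = x;       // T_m = {x, y, Tmpvec}, columns are the frame axes
    Tm.col(1) = y;
    Tm.col(2) = tmpvec;
    return Eigen::Quaterniond(Tm); // Matrix3d -> Quaterniond conversion
}
```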
The pose information of the measured object obtained by this algorithm is raw data in the camera coordinate system, while the pose information used to control the motion of the mechanical arm must be expressed in the base coordinate system of the mechanical arm, as shown in fig. 5; hand-eye calibration is therefore required. The system involves four coordinate systems: (1) the base coordinate system; (2) the end joint coordinate system (T_tool-in-base); (3) the tool end coordinate system (T_cam-in-tool); (4) the camera coordinate system (T_object-in-cam). They satisfy the following relation:
step S42, converting the quaternion data in the camera coordinate system into data in the base coordinate system according to the following formula:
T object-in- =T tool-in-base *T cam-in-too *T object-in-can
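Using Eigen rigid transforms, the chain can be composed as below; the function name is illustrative.

```cpp
#include <Eigen/Geometry>

// Compose the hand-eye chain: the pose of the object in the base frame is
// obtained from the arm's forward kinematics (toolInBase), the hand-eye
// calibration result (camInTool) and the vision measurement (objectInCam).
Eigen::Isometry3d objectInBase(const Eigen::Isometry3d& toolInBase,
                               const Eigen::Isometry3d& camInTool,
                               const Eigen::Isometry3d& objectInCam) {
    return toolInBase * camInTool * objectInCam;
}
```

An Isometry3d can be assembled from the quaternion q and translation t computed in step S41, e.g. Eigen::Isometry3d T = Eigen::Isometry3d(Eigen::Translation3d(t) * q);.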
step S43, calculating the angle of each joint by knowing the spatial position and attitude of the end of the robot arm.
Specifically, as shown in fig. 6, which is a schematic diagram of the joint angles of the robot arm, the inverse solution of the kinematics of the robot arm is to calculate the possible angles of each joint by knowing the spatial position and attitude of the end of the robot arm. And deducing a transformation matrix from the tool end to the base coordinate system by a D-H parameter calibration method. And (3) performing inverse transformation on the transformation matrix of each joint, performing equation matching and variable separation to obtain the angle of each joint, and selecting a group of solutions which are most consistent under the practical conditions of considering the motion attribute of the mechanical arm, safety problems and the like. The robot arm Aubo-C5 is called circularly to move the command robottservicejoinmove (waypoingvector. at (i). joointpos, true), each waypoint is added into a waypoint container (waypoingvector) created in advance, and then the controller drives the end effector to move to a desired position according to a target position in the waypoint container and other robot arm settings, such as a maximum movement speed, a maximum acceleration and a current position of the end effector.
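The waypoint execution loop might look as follows; robotServiceJointMove, wayPointVector and jointpos are quoted from the description above, while the surrounding types are kept generic rather than asserting the exact Aubo SDK signatures.

```cpp
#include <cstddef>
#include <vector>

// Execute the planned path waypoint by waypoint. RobotService and WayPoint
// are template stand-ins for the actual Aubo SDK types, whose exact names
// this sketch does not assert.
template <typename RobotService, typename WayPoint>
void executePath(RobotService& robotService,
                 const std::vector<WayPoint>& wayPointVector) {
    for (std::size_t i = 0; i < wayPointVector.size(); ++i) {
        // blocking joint-space move to the i-th planned waypoint
        robotService.robotServiceJointMove(wayPointVector.at(i).jointpos, true);
    }
}
```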
In this embodiment, during the ultrasonic examination the probe of the diagnostic apparatus must scan closely against the skin of the human body, so an appropriate contact force of the probe on the skin must be ensured. C++ multithreading is adopted to judge and process the real-time end force-touch information, preventing safety accidents and ensuring user experience.
In the actual thyroid diagnosis process, the end contact force signal is fed back by a pressure sensor, while the control mode of the mechanical arm is position control. To coordinate force and position control, a compliance control algorithm can convert the feedback force into a position signal (or the feedback position into a force signal). This embodiment therefore designs a force-touch-feedback-based control system for compliant end operation: PD position control serves as the inner control loop, the compliance algorithm serves as the outer control loop, and the signals collected by the pressure sensor are processed and used as input to the inner loop while the mechanical arm moves. To realize compliance control, a direct force-touch feedback control algorithm first feeds the force measured by the end force sensor directly back to the initial position through a proportional coefficient, as a position correction. The corresponding direct force-touch feedback control law is:
τ = K_pd · Δq_d
where Δq_d is the position correction used to correct the initial position q_d, and K_pd denotes the direct force feedback parameter; the corresponding control quantity input to the position controller is:
q_r = q_d - τ/K_pd
The corresponding direct force-touch feedback control algorithm is shown in fig. 7. With this algorithm, compliance control can be realized when the actual position deviates from the preset position; the force F generated by interaction with the environment is converted, through the Jacobian matrix, into the equivalent joint-level torque.
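A minimal sketch of this correction law at the joint level, assuming a common gain K_pd > 0 and a 6-DOF arm:

```cpp
#include <array>
#include <cstddef>

using JointVec = std::array<double, 6>; // 6-DOF arm, cf. fig. 6

// Direct force-touch feedback correction: the joint-level force feedback
// term tau (contact force mapped through the Jacobian), divided by the
// gain K_pd, gives the position correction, so the command passed to the
// inner PD position loop is q_r = q_d - tau / K_pd.
JointVec correctedCommand(const JointVec& qd, const JointVec& tau, double Kpd) {
    JointVec qr{};
    for (std::size_t i = 0; i < qd.size(); ++i) {
        qr[i] = qd[i] - tau[i] / Kpd; // larger contact force -> larger retreat
    }
    return qr;
}
```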
Referring to fig. 8, the invention further provides a system for the above three-dimensional vision-guided automatic neck ultrasonic scanning method of a mechanical arm, comprising: a three-dimensional reconstruction module 1 for performing three-dimensional reconstruction of the human neck; a key point detection module 2 for detecting neck key points on the three-dimensional reconstructed image with a preset key point detection network to obtain neck key point information; a path planning module 3 for planning a path according to the neck key point information and the set matching rule; and a guiding module 4 for guiding the mechanical arm to perform neck ultrasonic scanning according to the planned path.
The invention extracts feature points from the three-dimensional visual information of the patient captured by a camera, automatically plans the motion path of the mechanical arm, and guides the mechanical arm to perform neck ultrasonic scanning along this path, thereby facilitating thyroid diagnosis.
Although the embodiments of the present invention have been described with reference to the accompanying drawings, the patentee may make various changes or modifications within the scope of the appended claims; these also fall within the scope of the invention provided they do not exceed the scope described in the claims.

Claims (10)

1. A three-dimensional vision-guided automatic neck ultrasonic scanning method for a mechanical arm, characterized by comprising the following steps:
S1, performing three-dimensional reconstruction of the human neck;
s2, performing neck key point detection on the three-dimensional reconstructed image according to a preset key point detection network to obtain neck key point information;
s3, planning a path according to the neck key point information and a set matching rule;
and S4, guiding the mechanical arm to perform neck ultrasonic scanning according to the planned path.
2. The three-dimensional visually guided robotic automatic neck ultrasound scanning method of claim 1, wherein said step S1 specifically comprises:
s11, shooting the neck of the human body by using a binocular camera to obtain a color image of left and right visual fields;
s12, calculating three-dimensional information of each pixel point in the left view image based on a three-dimensional measurement principle;
s13, segmenting the color image according to a preset semantic segmentation network to obtain pixel point coordinates belonging to a human body part;
and S14, constructing a mask from the pixel coordinates obtained in step S13, using the mask to index the three-dimensional pixel information matrix obtained in step S12 to obtain the spatial coordinates and normal vectors of each point of the human body, and constructing a point cloud.
3. The three-dimensional vision-guided automatic neck ultrasonic scanning method for a mechanical arm according to claim 1, wherein the preset key point detection network is a CenterNet key point detection network, which adopts ResNet50 or Hourglass as its backbone feature extraction network to obtain feature information from the image and then performs heatmap prediction with this feature information to detect the key points.
4. The three-dimensional vision-guided robotic automatic neck ultrasonic scanning method of claim 3, wherein the specific step of neck key point detection in step S2 comprises:
s21, inputting the obtained color image into a CenterNet key point detection network, and obtaining pixel coordinates of neck key points;
s22, indexing the three-dimensional information matrix by using the pixel coordinates and acquiring the three-dimensional coordinates of each key point;
and S23, acquiring position indexes of all key points in the point cloud according to the three-dimensional coordinates to obtain the spatial position and normal vector information of each key point.
5. The method of claim 3 or 4, wherein the neck keypoints comprise a bottom center point, a bottom left point, a bottom right point, a top left point, and a top right point.
6. The three-dimensional visually guided robotic automatic neck ultrasound scanning method of claim 1, wherein said step S3 specifically comprises:
s31, using the obtained spatial position of the key point and the normal vector information as nodes on the motion path of the tail end of the mechanical arm, and interpolating between the nodes to obtain a smooth and complete path;
s32, calculating the Euclidean distance between each point and the interpolation point in the point cloud, setting the minimum Euclidean distance as the matching point of the interpolation point, and acquiring the spatial position and the normal vector information of the matching point from the point cloud.
7. The three-dimensional visually guided robotic automatic neck ultrasound scanning method of claim 6, wherein said step S4 specifically comprises:
s41, constructing a Cartesian three-dimensional coordinate system Oxyz by taking the normal vector as the positive direction of the z axis, and enabling the coordinate system to pass through a rotation matrix T m Conversion to the Camera coordinate System Oxyz camera Rotation matrix T using C + + Eigen library m Converting into quaternion data;
s42, converting the quaternion data in the camera coordinate system into the data in the base coordinate system according to the following formula,
T object-in-b =T tool-in-base *T cam-in-tool *T object-in-ca
wherein, T tool-in-bas As the end joint coordinate system, T cam-in-tool Is a tool end coordinate system, T object-in-ca As camera coordinate system, T object-in-b Is a base coordinate system;
and S43, calculating the angle of each joint according to the known space position and attitude of the tail end of the mechanical arm.
8. The three-dimensional visually guided robotic neck ultrasound scanning method of claim 7, wherein said step S43 is embodied by: and deducing a transformation matrix from the tool terminal coordinate system to the base coordinate system by a D-H parameter calibration method, inversely transforming the transformation matrix of each joint, and performing equation matching and variable separation to obtain the angle of each joint.
9. The method according to claim 7, wherein during the neck ultrasonic scanning by the mechanical arm in step S4, a direct force-touch feedback control algorithm feeds the force measured by the end force sensor directly back to the initial position through a proportional coefficient, which serves as a position correction to correct the initial position.
10. A system for the three-dimensional visually guided robotic automated neck ultrasound scanning method according to any of claims 1-9, comprising:
a three-dimensional reconstruction module for three-dimensionally reconstructing the neck of a human body,
the key point detection module is used for detecting the key points of the neck of the image after the three-dimensional reconstruction according to a preset key point detection network to obtain the key point information of the neck;
the path planning module is used for planning paths according to the neck key point information and the set matching rule;
and the guiding module is used for guiding the mechanical arm to carry out neck ultrasonic scanning according to the path plan.
CN202210496051.2A 2022-05-08 2022-05-08 Three-dimensional vision-guided automatic neck ultrasonic scanning method and system for mechanical arm Pending CN115089212A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210496051.2A CN115089212A (en) 2022-05-08 2022-05-08 Three-dimensional vision-guided automatic neck ultrasonic scanning method and system for mechanical arm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210496051.2A CN115089212A (en) 2022-05-08 2022-05-08 Three-dimensional vision-guided automatic neck ultrasonic scanning method and system for mechanical arm

Publications (1)

Publication Number Publication Date
CN115089212A true CN115089212A (en) 2022-09-23

Family

ID=83287054

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210496051.2A Pending CN115089212A (en) 2022-05-08 2022-05-08 Three-dimensional vision-guided automatic neck ultrasonic scanning method and system for mechanical arm

Country Status (1)

Country Link
CN (1) CN115089212A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117017355A (en) * 2023-10-08 2023-11-10 合肥合滨智能机器人有限公司 Thyroid autonomous scanning system based on multi-modal generation type dialogue
CN117017355B (en) * 2023-10-08 2024-01-12 合肥合滨智能机器人有限公司 Thyroid autonomous scanning system based on multi-modal generation type dialogue
CN117731324A (en) * 2024-02-21 2024-03-22 北京智源人工智能研究院 Method and device for real-time force interaction control of an ultrasound probe on a contact surface

Similar Documents

Publication Publication Date Title
CN112215843B (en) Ultrasonic intelligent imaging navigation method and device, ultrasonic equipment and storage medium
CN115089212A (en) Three-dimensional vision-guided automatic neck ultrasonic scanning method and system for mechanical arm
CN112634318B (en) Teleoperation system and method for underwater maintenance robot
US8147503B2 (en) Methods of locating and tracking robotic instruments in robotic surgical systems
US8073528B2 (en) Tool tracking systems, methods and computer products for image guided surgery
Schröder et al. Real-time hand tracking using synergistic inverse kinematics
Kondori et al. Head operated electric wheelchair
CN111152218A (en) Action mapping method and system of heterogeneous humanoid mechanical arm
WO2012116198A2 (en) System and method for detecting and tracking a curvilinear object in a three-dimensional space
Reiter et al. A learning algorithm for visual pose estimation of continuum robots
Deng et al. Learning robotic ultrasound scanning skills via human demonstrations and guided explorations
Zhang et al. A handheld master controller for robot-assisted microsurgery
CN115670515A (en) Ultrasonic robot thyroid detection system based on deep learning
Pachtrachai et al. Learning to calibrate-estimating the hand-eye transformation without calibration objects
CN112132805B (en) Ultrasonic robot state normalization method and system based on human body characteristics
Lu et al. Autonomous intelligent navigation for flexible endoscopy using monocular depth guidance and 3-D shape planning
CN111477318B (en) Virtual ultrasonic probe tracking method for remote control
Huang et al. Automatic ultrasound scanning system based on robotic arm.
Jin et al. Human-robot interaction for assisted object grasping by a wearable robotic object manipulation aid for the blind
Cabras et al. Comparison of methods for estimating the position of actuated instruments in flexible endoscopic surgery
Zhang et al. Implicit neural field guidance for teleoperated robot-assisted surgery
Ortiz et al. Learning representations of spatial displacement through sensorimotor prediction
Zhang et al. Visual Perception and Convolutional Neural Network Based Robotic Autonomous Lung Ultrasound Scanning Localization System
Zhang et al. Research on Path Planning of Breast Ultrasound Examination Robot
Gong et al. Real-Time Camera Localization during Robot-Assisted Telecystoscopy for Bladder Cancer Surveillance

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination