AU2022333990A1 - Method for intelligently identifying thoracic organ, autonomously locating and scanning thoracic organ - Google Patents


Info

Publication number
AU2022333990A1
Authority
AU
Australia
Prior art keywords
images
rib
probe
scanning
organ
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
AU2022333990A
Inventor
Sicong TAN
Hao Yu
Mengfei YU
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Publication of AU2022333990A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/467 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
    • A61B8/469 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means for selection of a region of interest
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461 Displaying means of special interest
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/54 Control of the diagnostic device

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

A method for intelligently identifying a thoracic organ and autonomously locating and scanning it uses artificial-intelligence robot technology to identify characteristic positions of human organs, autonomously locate thoracic organs, collect ultrasound images, and remotely and adaptively collect images or video of the chest, nipple, breast, or heart. Autonomous and remote control of collection and sharing enables medical images and videos to be shared, alleviating problems such as heavy workload and frequent night shifts for the doctors and nurses who perform collection. It allows experts and doctors to remotely and autonomously locate and scan thoracic organs, collect and share medical images, conduct remote consultations and rounds, provide expert opinions, and resolve clinical cases efficiently and professionally, with wide applicability in outpatient clinics, inpatient wards, and overseas medical institutions.

Description

METHOD FOR INTELLIGENTLY IDENTIFYING THORACIC ORGAN, AUTONOMOUSLY LOCATING AND SCANNING THORACIC ORGAN
TECHNICAL FIELD The present invention belongs to the technical field of medical artificial intelligence, and relates to data analysis, motion planning, intelligent image recognition, and intelligent analysis and recognition of medical data.
BACKGROUND At present, in medical examination, ultrasound images and videos of the chest, breast, and heart are affected by various human factors: collection quality is low, heart and breast structures are complex, the standardization of collected images and videos is poor, and diagnostic accuracy suffers. Doctors also differ in professional skill, so diagnoses based on ultrasound images are sometimes incorrect. A robot arm, a vision device, a depth vision device, and various neural network methods and their improvements carried by a robot can intelligently recognize the human face, organs, and bones and assist in collecting ultrasound medical images and videos. Manual collection is inefficient and inaccurate, and in-person collection can propagate epidemics. Remote-controlled and autonomous collection of medical images and videos realizes remote collection, autonomous collection, intelligent recognition and analysis of ultrasound data, images, and videos, autonomous localization and scanning of the heart, and effective prevention of the spread of major diseases such as infectious diseases and plague. An intelligent recognition and autonomous locating-and-scanning method that fuses visual images with medical images allows a self-adaptive robot arm, scanning probe, and scanning device to intelligently recognize the human body, organs, and bones, move remotely and autonomously, locate the scanning part and organs, and autonomously scan and collect medical images and videos according to the intelligent scanning method, greatly improving recognition quality and the efficiency of collecting medical data, images, and videos.
Technical Problems The objective of the present invention is to overcome the above shortcomings by providing a method in which a remotely controlled, autonomous robot recognizes the chest, heart, and breast, autonomously locates them, moves, and collects their medical images. Fusing multiple medical image modalities to intelligently recognize the five sense organs, bones, shoulder joints, joints, ribs, breasts, nipples, and the human body, and enabling the robot and robot arm to autonomously locate organs and their positions and to scan and collect heart and breast images, are the key technical problems of autonomous recognition, locating, and scanning of heart, breast, and chest organs. On the basis of intelligent recognition of these organs, effective and complete scanning and collection of the heart, breast, and chest is another important technical problem for ultrasonic scanning. The present invention provides a method for remotely controlled, self-adaptive scanning of the heart, breast, and chest to collect images and videos.
Technical Solutions of the Present Invention A method for intelligently recognizing a chest organ and autonomously locating and scanning it integrates methods for recognizing human organs, bones, joints, blood vessels, and human body features and positions by fusing planar visual images, depth images, ultrasonic images, videos, and endoscope images, and comprises the following steps: S1, establish a general visual image model, human organ model, blood vessel model, and feature model, where the general image models take the human face, nipple, breast, navel, and other feature parts and positions as feature items, and intelligently recognize human body features and positions by a neural network algorithm and improvements thereof; S2, establish a depth vision device, a depth visual image model, and a chest bone model covering the sternum, the shoulder joints, the second to fifth ribs, the rib-bottom bone, the xiphoid process, and the spine, and combine the location information of these bones with the general image model of S1 as the feature items, i.e. input items, of the bone recognition model; S3, apply the neural network algorithm and its improvements to intelligently recognize each bone and its position: the second, third, fourth, and fifth ribs, the rib-bottom bone, the xiphoid, the spine and its position, the lower bone boundary, the shoulder joints, and the position of each joint; S4, use information about the heart, a neighbouring feature organ of the breast, as external scanning feature information, scan the skeleton outside the skeleton recognized in S2, and take the ultrasonic images and videos, the color and location information of blood vessels, and the contour, shape, structural, color, and position features of the organs in the ultrasonic images and videos as input items; S5, input these feature items into the neural network, its improved method, and a weight optimizer, and obtain output values by image training; S6, recognize the locations of organs, blood vessels, bones, and the human body, and autonomously scan the corresponding positions of the human body. A method for intelligently recognizing a chest organ and autonomously locating and scanning it, in which a robot arm, ultrasonic device, and ultrasonic scanning device autonomously recognize the heart, breast, and other chest organs and locate and scan them, comprises the following steps: S1, obtain the collection images of the human organs corresponding to the collection tasks and the coordinates of their external location areas; set the organs as targets, including target name, target parameters, and location information, and set the communication target; S2, the robot arm, ultrasonic device, ultrasonic scanning device, and robot subscribe to the location information of the external scanning area of the organ, the subscription target, the parameters, the target pose, and the pose marker; set the target parameters, including head ID, target pose, and direction value, and set the timestamp; S3, the ultrasonic scanning device carried by the remote control system subscribes to the location information of the collection area and, according to the subscribed collection area, the robot arm autonomously plans and performs the motions of moving over and scanning the human body.
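The target record described in S2 (head ID, target pose, direction value, timestamp) resembles a stamped-pose message in a publish/subscribe robot middleware. A minimal sketch of such a record follows; all field names and numeric values are assumptions made for illustration, not the patent's actual data format.

```python
import time
from dataclasses import dataclass, field

@dataclass
class ScanTarget:
    """Hypothetical scan-target record: organ name, probe goal pose in
    the robot base frame, and a message header (head ID + timestamp)."""
    name: str          # e.g. "heart" or "breast"
    position: tuple    # (x, y, z) of the external scanning area, metres
    direction: tuple   # probe orientation as a unit direction vector
    head_id: int = 0   # sequence number in the message header
    stamp: float = field(default_factory=time.time)

# A heart target at an assumed parasternal-window location:
target = ScanTarget(name="heart", position=(0.42, -0.10, 0.95),
                    direction=(0.0, 0.0, -1.0), head_id=1)
```

A subscriber (the robot arm controller in S3) would read such records off the communication target and hand the pose to the motion planner.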
The ultrasonic scanning device and ultrasonic device publish the image information, the color and location information of blood vessels, the ultrasonic images, and the contour, shape, structural, and color features of the organs in the videos; S4, the robot main system determines, from the returned collection-completion information of each target organ, whether all collection tasks are finished; S5, according to S2 and to the coordinates of the external location area of the human body found by the robot's autonomous locating, the robot arm, ultrasonic device, ultrasonic scanning device, and robot main system subscribe to the location information of the breast and heart in the external scanning area, the probe angle, the subscription target, the target pose parameters, and the pose marker; set the target parameters, including head ID, target pose, and direction value, and set the timestamp; S6, according to S1, using the general vision device, the depth vision collection device, and the fusion of multiple medical image recognition methods, intelligently recognize the nipple, breast, and heart and their positions, and intelligently recognize the sternum and the second to fifth ribs and the scanning position corresponding to each structural part of the heart; S7, the robot arm and ultrasonic scanning device remotely and self-adaptively adjust the probe, its rotation and inclination angles, and the scanning mode of the ultrasonic scanning device, together with the imaging parameters for each organ and tissue (gain, color gain, sensitivity time control, time gain control, focus, depth, frame size, blood flow velocity scale, and video parameters), to implement effective and complete collection; S8, remotely and self-adaptively move the robot arm according to S2; the chest ribs are intelligently recognized from the depth images collected by the depth vision device; according to S1, the plane vision device intelligently recognizes the breast and its position; the robot arm scans the heart along the left edge of the sternum, across the second, third, and fourth ribs and the intercostal spaces. A method for intelligently recognizing the chest organ and autonomously locating and scanning it, in which the remotely controlled, self-adaptive robot arm autonomously scans the heart through the intercostal spaces, comprises the following steps: S1, according to claims 1 and 2, scan the organs with the ultrasonic probe, intelligently recognize the dynamically pulsating organ, and move the robot arm to scan the left ventricular long axis towards the right shoulder.
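The self-adaptive parameter adjustment of S7 amounts to applying bounded corrections to the device settings. A sketch of that idea follows; the parameter names, units, and limit values are illustrative assumptions, not the patent's actual device interface.

```python
# Hypothetical device limits for the probe settings listed in S7.
PARAM_LIMITS = {
    "gain_db":        (0.0, 100.0),
    "color_gain_db":  (0.0, 100.0),
    "tgc":            (0.0, 1.0),    # time gain compensation, normalised
    "focus_depth_cm": (1.0, 24.0),
    "depth_cm":       (2.0, 30.0),
}

def adjust(params: dict, deltas: dict) -> dict:
    """Apply requested self-adaptive adjustments, clamped to limits."""
    out = dict(params)
    for key, delta in deltas.items():
        lo, hi = PARAM_LIMITS[key]
        out[key] = min(hi, max(lo, out[key] + delta))
    return out

settings = {"gain_db": 50.0, "color_gain_db": 40.0, "tgc": 0.5,
            "focus_depth_cm": 8.0, "depth_cm": 16.0}
settings = adjust(settings, {"gain_db": +60.0, "depth_cm": -2.0})
# gain_db is clamped to the 100.0 ceiling; depth_cm becomes 14.0
```

In a closed loop, the deltas would come from an image-quality measure computed on the collected frames.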
The probe tilts towards the inner and outer sides to display the anterior leaflet of the mitral valve and the left atrium and ventricle, then rotates to scan along the direction of the ascending aorta; S2, according to claims 1 and 2, the nipple and its position are intelligently recognized from images collected by the visual device; the probe rotates counterclockwise, and the robot arm scans the anterior-leaflet long axis along the direction of the anterior papillary muscle, the chordae tendineae, and the anterior leaflet of the mitral valve; S3, according to claims 1 and 2, the nipple and its position are intelligently recognized from images collected by the visual device; the probe rotates counterclockwise, and the robot arm scans the posterior-leaflet long axis along the direction of the posterior papillary muscle, the chordae tendineae, and the posterior leaflet of the mitral valve; S4, the ultrasonic probe descends one intercostal space and scans; the probe inclines towards the inner or outer side, and the papillary muscle, the chordae tendineae, the posterior-mitral-leaflet median long axis, the organs of the next rib, the anterior-leaflet long axis, and the posterior-leaflet long axis are scanned; S5, according to claims 1 and 2, the chest ribs are intelligently recognized from images collected by the depth vision device, and the breast and its position from images collected by the visual device; the robot arm scans the long axis in the upper intercostal spaces along the left edge of the sternum, between the second and third ribs; S6, according to claims 1 and 2, the position and structural features of each part of the ultrasound images are intelligently recognized: the circular shape, the inverted-triangle shape, the Y shape, and the part, position, and structure of the images in diastole and systole; the probe moves slightly outwards, the robot arm moves along the left edge of the sternum and rotates 90 degrees clockwise, and the four heart chambers, the mitral valve, the tricuspid valve, the left-chest-wall four-chamber heart, the pulmonary artery, the aortic valve, and the pulmonary valve are scanned; S7, according to claim 1 and the position and structural features of each part of the ultrasonic images and videos, the robot arm moves and scans along the left edge of the sternum, starting from the left long axis; the probe rotates counterclockwise to the long axes of the right ventricular inflow and outflow tracts, then scans the right ventricular infundibulum, the pulmonary valve, and the arterial trunk counterclockwise; S8, according to claims 1 and 2, organs and images are collected by the probe, and the dynamically pulsating organ is intelligently recognized from the position and structural features of each part of the ultrasonic images and videos; the probe is placed at the apical pulsation position, inclined towards the back along the chest wall, and the four heart chambers, the mitral valve, the tricuspid valve, and the apical four-chamber heart are scanned by rotary movement; S10, according to claims 1 and 2, the visual device intelligently recognizes the nipple and its position, the rib-bottom skeleton and the xiphoid position are intelligently recognized from images collected by the depth vision device, and the position of the apex is recognized; from the position of the xiphoid, the robot arm uses the pressure device to apply pressure, tilts towards the head side, rotates the probe, self-adaptively adjusts the inclination angle, and scans the four heart chambers, the mitral valve, the tricuspid valve, and the subxiphoid four-chamber image; S11, according to claims 1 and 2, the probe rotates clockwise, and the three-chamber section (left ventricle, left atrium, aorta) and the two-chamber section (left ventricle and left atrium) are recorded in sequence from the four-chamber image; S12, according to claim 1, the circular shape in the image collected by the probe is intelligently recognized and marked as the left ventricle; the scan is self-adaptively adjusted to centre the image, the subxiphoid left-ventricular short axis is scanned, and with the inclination angle self-adaptively adjusted towards the right, the inferior vena cava long axis and short axis and the abdominal aorta are scanned; S13, according to claims 1 and 2, the suprasternal notch and the first rib at the left edge of the sternum are intelligently recognized; the probe inclines towards the head side and rotates, its rotation and inclination angles are self-adaptively adjusted, and the ascending aortic arch, the distal end, the descending aorta, and the suprasternal-fossa aortic arch are scanned; S14, according to claims 1 and 2, the third and fourth ribs at the right edge of the sternum are intelligently recognized from images collected by the depth vision device; the probe moves to the third and fourth ribs, its rotation and inclination angles are self-adaptively adjusted, and the left atrium, the right atrium, the atrial septum, and the horizontal section at the centre of the right sternal edge are scanned; the probe is then rotated 90 degrees counterclockwise to scan the sagittal section of the right sternal edge; S15, according to claim 1, the second and third ribs at the right edge of the sternum are intelligently recognized from images collected by the depth vision device; the probe moves to the second or third rib, its rotation and inclination angles are self-adaptively adjusted, and the long-axis images of the right sternal edge are scanned. A method for intelligently recognizing breast organs and autonomously locating and scanning them, in which the remotely controlled, self-adaptive robot arm scans the breasts through the intercostal spaces, comprises the following steps: S1, according to claim 1 and the images collected by the visual device, intelligently recognize the nipple and breast and their position, range, and contour; S2, according to claims 1 and 2, establish models of the depth vision device and depth visual images, and models of the chest skeleton and the axillary sides of both shoulder joints; input the depth information and the location information of the human body where the skeleton is located into the model by the improved neural network method, and intelligently recognize the sternum, both shoulder joints, the axillary sides of both shoulder joints, and their locations; S3, according to claim 1, the probe moves from the outer side of the mammary gland to the inner side, then from the inner side to the outer side, and is placed on the axillary central line; the probe scans from the nipple to the lower edge of the mammary gland until the gland disappears, completing the longitudinal-section scan; S4, according to claims 1 and 2, the locations of the sternum, both shoulder joints, and both axillary sides are intelligently recognized from images collected by the depth vision device; the probe is placed at the clavicle and scans from the clavicle along the axillary central line in the transverse-section direction, then moves from the head side of the upper breast towards the foot side, scanning from head to foot over the breast until the gland disappears, completing the longitudinal-section scan; S5, according to claims 1 and 2, the nipple and its position are intelligently recognized from images collected by the visual device; the probe is placed at the nipple and scans rotationally with the nipple as the centre, from the nipple towards the outer edge of the mammary gland until the ribs are reached and the gland disappears, ending the rotational scan.
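The nipple-centred rotational scan of the breast can be pictured as the probe visiting waypoints along radial rays from the nipple out to the gland edge. A geometric sketch follows; the ray count, step count, and radius are illustrative assumptions, not values from the patent.

```python
import math

def radial_scan_waypoints(center, radius, n_rays=12, steps=5):
    """Waypoints for a nipple-centred rotational scan: on each ray the
    probe moves from near the nipple outwards to the gland edge."""
    cx, cy = center
    waypoints = []
    for k in range(n_rays):
        theta = 2 * math.pi * k / n_rays   # ray direction
        for s in range(1, steps + 1):
            r = radius * s / steps         # step outwards along the ray
            waypoints.append((cx + r * math.cos(theta),
                              cy + r * math.sin(theta)))
    return waypoints

# 12 rays x 5 steps = 60 probe positions around an 8 cm gland radius
wps = radial_scan_waypoints(center=(0.0, 0.0), radius=0.08)
```

The robot arm would visit these points in order, stopping a ray early once the recognition model reports that the gland has disappeared from the image.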
In summary, the present invention has the following beneficial effects. The medical robot device, robot arm, and medical device perform remote, isolated, and autonomous collection: they autonomously locate and recognize the features of human organs and collect ultrasonic images and videos, autonomously completing various medical tasks in the ward and relieving the problems of numerous collection tasks, high working pressure, and frequent night shifts. The invention also provides a remotely controlled, self-adaptive method for collecting images and videos of the heart and breast. Remote, autonomous, effective, and complete collection of medical images and videos, sharing of medical pictures and videos, and multi-expert remote joint consultation are realized; the data, images, and videos collected by the robot are obtained in real time, and working efficiency is greatly improved.
BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 is a flowchart of the method for intelligently recognizing human organs, bones, joints, blood vessels, and human body feature points and positions, and for autonomous locating, in the description of the present application; FIG. 2 is a flowchart of the method by which the robot, medical device, robot arm, ultrasonic device, and ultrasonic scanning device autonomously recognize the heart and breast and locate and scan the organs; FIG. 3 is a flowchart of the method by which the robot, medical device, robot arm, ultrasonic device, and ultrasonic scanning device autonomously recognize, locate, and scan the heart, in the description of the present application.
DETAILED DESCRIPTION OF THE EMBODIMENTS The objective of the present invention is to design a remote-control robot to replace human work, to realize collection by a remotely controlled robot arm, and to effectively solve the problem of autonomous collection of images and videos.
In the fields of automation, artificial intelligence, robotics, and motion planning, the human face, the five sense organs, the arms, the external features of the human body, and the bones and joints seen by the depth camera are autonomously recognized. Visual images and medical images are fused to intelligently recognize the five sense organs, organs, bones, joints, blood vessels, and the human body, together with their features and positions. Robots, robot arms, and medical devices autonomously locate and scan the external location areas of the human body and collect medical images. The present invention provides a remote, self-adaptive intelligent recognition and heart-and-breast scanning method, a remote-control robot, and a method for autonomously collecting medical images and videos. The remotely controlled collection device and the shared images solve the problems of collection errors and of diagnosis-and-treatment errors, and improve the accuracy of intelligent collection and of intelligent recognition of diseases and medical data. To aid understanding of the above technical solutions, the present invention is described in further detail below with reference to the embodiments and drawings, but the embodiments of the present invention are not limited thereto. The technical solution that implements the present application to solve the above technical problems is as follows. The present invention provides an intelligent recognition method fusing medical images and visual images to recognize the five sense organs, organs, bones, joints, blood vessels, and human body features and positions. A general visual image model, human organ model, blood vessel model, and feature model are established; the general image models take the human face, five sense organs, breast, nipple, navel, characteristic genital organs, and other feature parts and positions as feature items, and human body features and positions are intelligently recognized by a neural network algorithm and improvements thereof. A depth vision device, a depth visual image model, and a bone model are also established; the depth information and the location information of the human body where the bones are located are used to recognize features and human organs, to collect ultrasonic images, and to remotely and self-adaptively collect images and videos of the heart, breast, and chest.
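The fusion described above boils down to concatenating the visual, depth-bone, and ultrasonic feature groups into one input item for the recognition network. A minimal sketch, with all feature values and dimensions invented for illustration:

```python
# Illustrative feature groups from the three sources described above.
visual_feats     = [0.9, 0.2, 0.7]   # e.g. face / nipple / navel scores
depth_bone_feats = [0.8, 0.6]        # e.g. sternum and rib confidences
ultrasound_feats = [0.4, 0.5, 0.3]   # e.g. contour / shape / color features

def fuse(*groups):
    """Concatenate feature groups into a single model input item."""
    vec = []
    for g in groups:
        vec.extend(g)
    return vec

x = fuse(visual_feats, depth_bone_feats, ultrasound_feats)
# x is the 8-element input item fed to the neural network
```

In the patent's scheme this vector would then pass through the improved deep neural network and weight optimizer trained on labelled images.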
Example 1 The present disclosure is further described in detail below with reference to the embodiments and FIG. 1, FIG. 2, and FIG. 3; the embodiments of the present disclosure are not limited thereto. Input the general visual images and the human organ, blood vessel, and feature items, including: the human face, five sense organs, ears, lips, mandible, neck, navel, nipples, characteristic genital organs, and other feature parts and positions. Input the depth visual images and bone information: the mandible, rib-bottom bone, xiphoid, spine and spine position, lower bone boundary position, shoulder joints, knee joints, feet, foot joints, waist joints, and other joints and their positions. The color and location information of blood vessels and the contour, shape, structural, and color features of the organs in the ultrasonic images are combined into combined information items as the feature items of the ultrasonic image model and input as entries; the external scanning feature information and the external scanning skeleton information are used as external scanning information. An improved deep neural network method and a weight optimizer are applied through image training, the output values and the locations of the organs, blood vessels, bones, and human body are obtained, and the result of autonomously locating the positions of the organs, blood vessels, bones, and human body is output.
The administrator or doctor publishes the collection tasks and medical-advice messages. The information of the collection images corresponding to the collection tasks and the coordinates of the external location areas are obtained and set as targets, with target name, target parameters, location information, and communication target. The robot vision collection device and the visual recognition module publish the external features corresponding to each organ; the external scanning feature information of the neighbouring feature organs recognized by the general image model gives the coordinates of the external position area of the human body to scan, and the depth information published by the depth camera together with the neighbouring skeleton information gives the bone information to scan. The robot arm, ultrasonic device, ultrasonic probe, and robot subscribe to the location information of the corresponding organ in the external scanning area, the subscription target, the parameters, the target pose, and the pose marker; the target parameters, including head ID, target pose, and direction value, are set, and the timestamp is set. The remote main control system and the ultrasonic probe mounted on the autonomous robot arm move over and scan the human body collection area according to the subscribed collection-area location.
According to the motions of the image collection motion planning module, the ultrasonic probe and ultrasonic device publish the image information: the color and location information of blood vessels and the contour, shape, structural, and color features of the ultrasonic images. The visual recognition module of the robot and medical device subscribes to the image information, extracts these features, inputs the structural and color features into the calculation model, and intelligently recognizes whether the images of the target organ and tissue have been collected. The parameters of the collection target are set, including the pose marker, the timestamp, the head ID of the target, and the target pose and direction value; the allowable error of position and attitude is set; if motion planning fails, re-planning is allowed; the reference coordinate system of the target position is set; and the time limit of each motion planning attempt is set. The robot, medical device, and remotely controlled robot arm self-adaptively adjust and collect the images, checking whether the parameters of the medical devices and of the videos meet the standard and whether the images are valid. The robot arm self-adaptively adjusts the ultrasound probe to scan, adjusting the probe angle and the collection-position parameters, and checks whether all images and videos of the target organ and tissue have been completely collected; the completion information of the target organ is returned, the robot and robot arm subscribe to the task information, and the robot arm remotely and self-adaptively moves the ultrasound probe to the external scanning position area of the next target organ.
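The planning policy described here (an allowable pose error, re-planning on failure, and a per-attempt time limit) can be sketched as a simple retry loop. The planner below is a stub standing in for a real motion planner; the tolerance values and doubling rule are assumptions for illustration only.

```python
def plan(goal, tolerance):
    """Stub planner: pretend planning succeeds only when the allowable
    pose error is loose enough (>= 5 mm in this illustration)."""
    return tolerance >= 0.005

def plan_with_retry(goal, tolerance=0.002, max_attempts=3):
    """Re-plan on failure, relaxing the allowable error each attempt."""
    for attempt in range(max_attempts):
        if plan(goal, tolerance):
            return True, attempt + 1
        tolerance *= 2   # relax the allowable position/attitude error
    return False, max_attempts

ok, attempts = plan_with_retry(goal=(0.4, -0.1, 0.9))
# relaxes twice (0.002 -> 0.004 -> 0.008) and succeeds on attempt 3
```

A real system would also enforce the per-attempt time limit and the reference coordinate frame mentioned in the text, which the stub omits.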
The robot and the medical device determine that all target-organ collection tasks are completed according to the returned collection-completion information. The collection method is as follows: the ultrasonic probe scans the images and intelligently recognizes the dynamically pulsating organ, and the robotic arm moves to scan the left long axis in the direction of the right shoulder. The probe tilts toward the inner and outer sides to display the anterior leaflet of the mitral valve and the left chamber, and the probe is rotated to scan in the direction of the ascending aorta. The position of the nipple is intelligently recognized by the visual device and the probe rotates anticlockwise for scanning; the robot arm scans the anterior-leaflet long axis and collects images along the anterior papillary muscle, the chordae tendineae and the anterior leaflet of the mitral valve. The images are intelligently recognized according to the images collected by the visual device and the probe rotates anticlockwise for scanning; the robot arm scans the posterior papillary muscle, the chordae tendineae and the posterior leaflet of the mitral valve to collect the long-axis image of the posterior leaflet. The ultrasonic probe descends by one rib for scanning, and the probe inclines toward the inner or outer side to scan the papillary muscle, the chordae tendineae, the posterior leaflet of the mitral valve, the median long axis, the anterior-leaflet long axis and the posterior-leaflet long axis. The chest ribs are intelligently recognized according to the images collected by the depth vision device, and the breast and its position are intelligently recognized according to the images collected by the vision device; the long axes between the upper ribs are scanned along the left edge of the sternum, between the second rib and the third rib. The position features and the structural features of each part of the ultrasound images are intelligently recognized.
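The scanning procedure above can be expressed as an ordered sequence of (view, probe action) steps that a controller walks through. The view names follow the text; the executor callback is a stub standing in for the robot arm, so the structure is an illustrative assumption.

```python
# Hedged sketch: the cardiac scan as an ordered list of steps. A real
# controller would replace the callback with arm and probe commands.
CARDIAC_SEQUENCE = [
    ("left long axis",                 "move toward right shoulder"),
    ("mitral valve anterior leaflet",  "rotate probe anticlockwise"),
    ("mitral valve posterior leaflet", "rotate probe anticlockwise"),
    ("median long axis",               "descend one rib, tilt inward/outward"),
]

def run_sequence(sequence, execute):
    """Execute each (view, action) step; return the views actually scanned."""
    return [view for view, action in sequence if execute(view, action)]

# Stub executor that "succeeds" at every step.
scanned = run_sequence(CARDIAC_SEQUENCE, lambda view, action: True)
```

Keeping the sequence as data makes it easy to resume after a failed step or to reorder views per protocol.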
The features of the circular shape, the inverted-triangle shape and the Y shape, and the part, position and structure of the image in the diastolic and systolic periods, are intelligently recognized; the probe moves slightly outwards, the robot arm moves along the left edge of the sternum and rotates 90 degrees clockwise, and the four heart chambers, the mitral valve, the tricuspid valve, the left-chest-wall four-chamber heart, the pulmonary artery, the aortic valve and the pulmonary valve are scanned and their images collected. According to the ultrasonic images and the position and structural features of each part,
the robot arm moves and scans along the left edge of the sternum, scanning the left and right long-axis images; the probe rotates anticlockwise to scan the long axis of the right-chamber inflow tract and the long axis of the right-chamber outflow tract, and the probe scans the right-chamber infundibulum, the pulmonary valve and the pulmonary artery counterclockwise. The organs are scanned by the probe and the images collected; the dynamically pulsating organ and the position and structural features of each part of the ultrasonic images and videos are intelligently recognized. The probe is placed at the apex-beat position, inclined toward the back of the chest wall, and the four heart chambers, the mitral valve, the tricuspid valve and the apical four-chamber heart are scanned and their images collected by rotating and moving the probe. The vision device intelligently recognizes the organ and the position of the nipple; according to the images collected by the depth vision device, the rib-bottom skeleton and the xiphoid position are intelligently recognized and the position of the heart apex is located. The robot arm inclines the probe from the xiphoid position, the pressure device applies pressure, the probe inclines toward the head side and rotates, the inclination angle is self-adaptively adjusted, and the four heart chambers, the mitral valve, the tricuspid valve and the subxiphoid four-chamber image are scanned. The probe rotates clockwise, and the three-chamber section (left ventricle, left atrium, aorta) and the two-chamber section (left ventricle and left atrium) are sequentially recorded from the four-chamber image. From the images collected by the probe, the circular shape is intelligently recognized and marked as the left chamber; with self-adaptive adjustment the image is centered, and the subxiphoid left-chamber short-axis image, the inferior vena cava long-axis and short-axis images and the abdominal aorta image are scanned; the inclination angle is self-adaptively adjusted in the right
direction, and the inferior vena cava long-axis image, the short-axis image and the abdominal aorta image are scanned. The images collected by the depth vision device are used to intelligently recognize the suprasternal notch and the first rib at the left edge of the sternum; the probe inclines toward the head side and rotates, its rotation and inclination angles are self-adaptively adjusted, and the ascending aortic arch, the RPL, the descending aorta and the suprasternal-fossa aortic arch are scanned. The images collected by the depth vision device are used to intelligently recognize the third rib and the fourth rib at the right edge of the sternum; the probe is moved to the third and fourth ribs, its rotation and inclination angles are self-adaptively adjusted, and the left atrium, the right atrium, the interatrial septum and the horizontal image at the center of the right sternal edge are scanned. The probe is adjusted to rotate 90 degrees anticlockwise to scan the sagittal view of the right edge of the sternum. The second rib and the third rib at the right edge of the sternum are intelligently recognized according to the images collected by the depth vision device; the probe is moved to the second or third rib, its rotation and inclination angles are self-adaptively adjusted, and the long axis of the right edge of the sternum is scanned.
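The repeated "rotate the probe, self-adaptively adjust the inclination angle" steps above amount to bounded angle updates. A minimal sketch, with the angle conventions and the tilt safety limit chosen as assumptions:

```python
# Hedged sketch: one self-adaptive probe orientation update. Rotation wraps
# around 360 degrees; inclination (tilt) is clamped to an assumed safe range.
def adjust_probe(rotation_deg, inclination_deg, d_rot, d_inc, inc_limit=45.0):
    """Apply a rotation/inclination step and clamp the tilt."""
    rotation = (rotation_deg + d_rot) % 360.0
    inclination = max(-inc_limit, min(inc_limit, inclination_deg + d_inc))
    return rotation, inclination

# Rotate 90 degrees clockwise along the left sternal edge, tilt 10 degrees out.
rot, inc = adjust_probe(0.0, 0.0, 90.0, 10.0)
```

Clamping the inclination mirrors the text's self-adaptive adjustment: the controller can request any step, but the commanded tilt never leaves the assumed safe window against the chest wall.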
Example 2
The present invention will be further described in detail below with reference to the embodiments and FIG. 1, FIG. 2 and FIG. 3, but the embodiments of the present invention are not limited thereto. General visual images are input to recognize the human face, the nipple, the features of the genital organs, the feature parts, the human organs, the blood vessels and their features; depth visual images are input to recognize the bone information, the rib-bottom bone, the shoulder joint, the joints, the lower bone boundaries and their positions. The color information of the blood vessels, the location information of the blood vessels, the contour, shape, structural and color features of the ultrasonic image, and the organ information under the ultrasonic image are used as input items; an improved deep neural network method and a weight optimizer are applied through image training, and the output value and the locations of the organ, the blood vessel, the bone and the human body are output. The output value and the scanning-organ and location information of the organ, the blood vessel and the bone are thus obtained. According to the collection tasks and medical-advice messages published by the administrator or doctor, the collection images of the human organs corresponding to the collection tasks and the coordinates of the external location areas are obtained and set as targets, with the target name, the target parameters and the location information set, and the communication target set.
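The recognition input described above fuses several feature groups of the ultrasonic image into one vector before the (improved) neural network scores it. A hedged sketch, in which a tiny linear scorer and its threshold stand in for the trained network; all weights are illustrative assumptions:

```python
# Hedged sketch of feature fusion for target-organ recognition: contour,
# shape, structural and color features concatenated into one input vector.
def fuse_features(contour, shape, structure, color):
    """Concatenate per-group feature lists into one input vector."""
    return list(contour) + list(shape) + list(structure) + list(color)

def score(vector, weights, bias=0.0):
    """Linear stand-in for the trained recognizer; returns a confidence."""
    return sum(v * w for v, w in zip(vector, weights)) + bias

x = fuse_features([0.2, 0.5], [0.9], [0.1, 0.3], [0.7])
confidence = score(x, [1, 1, 1, 1, 1, 1])
is_target_organ = confidence > 2.0      # assumed decision threshold
```

The real system would replace `score` with the deep network and weight optimizer described in the text; the fusion step (concatenation before inference) is the part the sketch illustrates.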
The robot vision collection device and the visual recognition module publish the external features corresponding to each organ; the external scanning feature information of the neighboring feature organs recognized by the general image model locates the coordinates of the external location area of the human body, and the depth information published by the depth camera and the skeleton information of the neighboring organs supply the bone information. The robot arm, the ultrasonic device and the ultrasonic probe subscribe to the location information of the corresponding organ in the external scanning area, the subscription target, the parameters, the target pose and the pose marker, with the target parameters set to the head ID, the target pose, the direction value and the timestamp. The remote main control system and the autonomous robotic-arm-mounted ultrasonic probe move to and scan the collection area of the human body according to the location of the subscribed collection area, following the image-collection motions planned for the robotic arm by the motion planning module.
The ultrasonic probe and the ultrasonic device publish the image information, the color information of the blood vessels, the location information of the blood vessels, and the contour, shape, structural and color features of the ultrasonic image. The robot, the medical device and the visual recognition module subscribe to the image information and extract the color information and location information of the blood vessels and the contour, shape, structural and color features of the ultrasonic images. The structural and color features are input into the calculation model to intelligently recognize whether the images are images of the target organ and tissue, that is, whether the images were obtained by scanning the target organ. The parameters of the collection target are set; the allowable error of position and posture is set; re-planning is allowed if motion planning fails; the reference coordinate system of the target position is set; and a time limit is set for each motion-planning attempt.
The robot and the medical device remotely control the robotic arm and self-adaptively adjust the image-collection parameters, the parameters of the medical device, the parameters of the videos and the image-collection method, and judge whether the images are standard videos and whether the images are valid. The robot, the medical device and the distal end of the robotic arm self-adaptively adjust the scanning mode of the ultrasound probe, the probe angle, the parameters, the collection position of the inspection target organ and the collection organ, and judge whether all the images and videos constitute a complete collection of the breast; collection-completion information for the target organ is then returned. The robot and the robotic arm subscribe to the task information; the robot, the medical device and the remotely controlled robotic arm move the self-adaptive ultrasound probe to scan the external location area of the next target organ. The robot and the medical device determine that the nipple and breast collection tasks are completed according to the returned collection-completion information of the nipple and breast organ.
The nipple, the breast, and the position, range and contour of the nipple are intelligently recognized according to the images collected by the visual device. Models of the depth vision device and the depth visual images, a chest-bone model and a double-shoulder axillary-side model are established; the depth information and the positions of the bones in the human body are input into the neural network method and its improved version to intelligently recognize the sternum, the two shoulder joints, the axillary sides of the two shoulder joints and their location information. The probe is moved from the outer side to the inner side of the mammary gland, then from the inner side to the outer side, and the probe is placed on the axillary midline; the probe scans from the nipple to the lower edge of the mammary gland until the gland disappears, and the longitudinal direction is then scanned. According to the sternum, the shoulder joints, the axillary sides and their location information intelligently recognized from the images collected by the depth vision device, the probe is placed at the clavicle and scanned from the head side of the upper mammary gland toward the foot side, and laterally from the head side toward the foot side across the upper mammary gland until the gland disappears, after which the longitudinal direction is scanned. The nipple and its position are intelligently recognized according to the images collected by the depth vision device; the probe is placed at the nipple and scanned in the rotation direction with the nipple as the center, scanning from the nipple toward the outer edge of the mammary gland and back until the rib is reached and the gland disappears, and the rotational scan is then ended.
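The rotational breast scan above (scan lines from the nipple outward at successive angles until the gland edge) can be sketched as waypoint generation. The geometry, ray count and gland radius are illustrative assumptions:

```python
# Hedged sketch: endpoints of radial scan lines centered on the nipple, one
# per evenly spaced angle, reaching the assumed outer edge of the gland.
import math

def radial_scan_waypoints(nipple_xy, gland_radius, n_rays=12):
    """Return the outer endpoint of each radial scan line."""
    cx, cy = nipple_xy
    points = []
    for k in range(n_rays):
        theta = 2.0 * math.pi * k / n_rays
        points.append((cx + gland_radius * math.cos(theta),
                       cy + gland_radius * math.sin(theta)))
    return points

rays = radial_scan_waypoints((0.0, 0.0), 0.06, n_rays=4)
```

A controller would sweep the probe from the nipple to each endpoint in turn, stopping a ray early when the gland disappears from the image, as the text describes.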

Claims (1)

  1. Method for intelligently identifying a thoracic organ and autonomously locating and scanning the thoracic organ, characterized in that a method for integrally and intelligently recognizing human organs, bones, joints, blood vessels, human-body feature points and their positions by fusing planar visual images, depth images, ultrasonic images, videos and endoscope images is integrated, comprising the following steps: S1, establish a general visual image model, a human organ model, a blood vessel model and a feature model, wherein the general image models take the human face, the nipple, the breast, the navel, the feature parts and their positions as feature items, and intelligently recognize the human-body features and positions by a neural network algorithm and its improved method; S2, establish the depth vision device and depth visual image model, the chest-bone model and the sternum model, and combine the shoulder joint, the second rib, the third rib, the fourth rib, the fifth-rib skeleton, the xiphoid process and the spine location information of the human body where the bones are located with the general image model described in S1 as the feature items and input items of the bone intelligent-recognition model; S3, apply the neural network algorithm and its improved method to intelligently recognize each bone and its position: the second rib, the third rib, the fourth rib, the fifth rib, the rib-bottom bone, the xiphoid, the spine and its position, the lower bone boundary, the shoulder joint, and the position of each bone and each joint; S4, use the information of the heart, the neighboring feature organ of the breast, as the external scanning feature information, scan outside the skeleton information recognized in S2, and use the ultrasonic images, the videos, the color information of the blood vessels, the location information of the blood vessels, and the contour, shape, structural, color and position features of the organs in the ultrasonic images and videos as input items; S5, input the feature items into the neural network, the improved method and the weight optimizer, and obtain the output value by image training; S6, recognize the location information of the organ, the blood vessel, the bone and the human body, and autonomously scan the position of the human body where the organ, the blood vessel and the bone are located;
    Claim 2. Method for intelligently identifying a thoracic organ and autonomously locating and scanning the thoracic organ, characterized in that the robot arm, the ultrasonic device and the ultrasonic scanning device autonomously recognize the heart, the breast and the chest organs and locate and scan them, with the following steps: S1, obtain the collection images of the human organs corresponding to the collection tasks and the coordinates of the external location areas, set the organs as targets including the target name, the target parameters and the location information, and set the communication target; S2, the robot arm, the ultrasonic device and the ultrasonic scanning device subscribe to the location information of the external scanning area of the corresponding organ, the subscription target, the parameters, the target pose and the pose marker, and set the target parameters including the head ID, the target pose, the direction value and the timestamp; S3, the ultrasonic scanning device carried by the remote control system subscribes to the location information of the collection area, and according to the subscribed collection area the robotic arm autonomously performs the planned image-collection motions of moving over and scanning the human body. The ultrasonic scanning device and the ultrasonic device publish the image information, the color information of the blood vessels, the location information of the blood vessels, the ultrasonic images, and the contour, shape, structural and color features of the organs in the videos; S4, the robot main system determines that the collection tasks of all target organs are completed according to the returned collection-completion information; S5, according to S2 and the coordinates of the external location area of the human body located autonomously by the robot, the robot arm, the ultrasonic device, the ultrasonic scanning device and the robot main system subscribe to the location information of the breast and the heart in the external scanning area, the probe angle, the subscription target, the parameters of the target pose and the pose marker, and set the target parameters including the head ID, the target pose, the direction value and the timestamp, as the scanning method for scanning and collecting the medical images; S6, according to S1 and the fusion intelligent-recognition methods of the general vision device and the various medical images, intelligently recognize the human nipple, the breast, the heart and their positions and locations, and intelligently recognize the bones, the sternum, the second rib, the third rib, the fourth rib and the fifth rib and the scanning position corresponding to each structural part of the heart, by the depth visual collection device and a plurality of medical-image fusion intelligent-recognition methods; S7, the robot arm and the ultrasonic scanning device remotely control and self-adaptively adjust the probes, the rotation angles, the inclination angles, the scanning mode of the ultrasonic scanning device, the probe angle, and the parameters of the target organs and tissues (the image parameters, the gain parameters, the color-gain parameters, the sensitivity-time-control adjustment parameters, the time-gain-control parameters, the focusing parameters, the depth parameters, the frame size, the blood-flow-velocity scale parameters, the video parameters and the image-collection method) to implement effective and complete collection; S8, remotely control and self-adaptively move the robot arm according to S2; the chest ribs are intelligently recognized from the depth images collected by the depth vision device; according to S1, the plane vision device intelligently recognizes the breast and its position and location information, and the robot arm scans the heart along the left edge of the sternum, the second rib, the third rib, the fourth rib and the intercostal spaces;
    Claim 3. Method for intelligently identifying a thoracic organ and autonomously locating and scanning the thoracic organ, characterized in that the heart is scanned between the ribs by remote control and by autonomous scanning with the self-adaptive robot arm, comprising the following steps: S1, according to claims 1 and 2, scan the organs with the ultrasonic probe, intelligently recognize the dynamically pulsating organ, and move the robotic arm to scan the left long axis in the direction of the right shoulder; the probe tilts toward the inner and outer sides to display the anterior leaflet of the mitral valve and the left atrium and ventricle, and the probe is rotated to scan along the direction of the ascending aorta; S2, according to claims 1 and 2, the nipple and its position are intelligently recognized from the images collected by the visual device; the probe rotates counterclockwise, and the robot arm scans the anterior-leaflet long axis along the direction of the anterior papillary muscle, the chordae tendineae and the anterior leaflet of the mitral valve; S3, according to claims 1 and 2, the nipple and its position are intelligently recognized from the images collected by the visual device; the probe rotates counterclockwise to scan, and the robot arm scans the posterior-leaflet long axis along the direction of the posterior papillary muscle, the chordae tendineae and the posterior leaflet of the mitral valve; S4, the ultrasonic probe descends one intercostal space and scans; the probe inclines toward the inner or outer side, and the papillary muscle, the chordae tendineae, the posterior leaflet of the mitral valve, the median long axis, the organs of the next rib, the anterior-leaflet long axis and the posterior-leaflet long axis are scanned; S5, according to claims 1 and 2, the chest ribs are intelligently recognized from the images collected by the depth vision device, the breast and its position are intelligently recognized from the images collected by the visual device, and the robot arm scans the long axis between the upper ribs along the left edge of the sternum, between the second rib and the third rib; S6, according to claims 1 and 2, the position and structural features of each part of the images collected by the ultrasound devices are intelligently recognized: the probe scans the images to intelligently recognize the circular shape, the inverted-triangle shape and the Y shape, and the part, position and structure of the images in the diastolic and systolic periods; the probe moves slightly outwards, the robot arm moves along the left edge of the sternum and rotates 90 degrees clockwise, and the four heart chambers, the mitral valve, the tricuspid valve, the left-chest-wall four-chamber heart, the pulmonary artery, the aortic valve and the pulmonary valve are scanned; S7, according to claim 1 and the position and structural features of each part of the ultrasonic images and videos, the robot arm moves and scans along the left edge of the sternum, scanning from the left long axis; the probe rotates anticlockwise to scan the long axis of the right-chamber inflow tract and the long axis of the right-chamber outflow tract, and the probe scans the right-chamber infundibulum, the pulmonary valve and the pulmonary trunk counterclockwise; S8, according to claims 1 and 2, the organs are scanned and the images collected by the probe, the dynamically pulsating organ is intelligently recognized from the position and structural features of each part of the ultrasonic images and videos, the probe is placed at the apex-beat position and inclined toward the back of the chest wall, and the four heart chambers, the mitral valve, the tricuspid valve and the apical four-chamber heart are scanned by rotary movement; S10, according to claims 1 and 2, the visual device intelligently recognizes the nipple and its position, the rib-bottom skeleton and the xiphoid position are intelligently recognized from the images collected by the depth vision device, and the position of the apex is recognized; from the location of the xiphoid, the robot arm tilts the probe, the pressure device applies pressure, the probe tilts toward the head side and rotates, the inclination angle is self-adaptively adjusted, and the four heart chambers, the mitral valve, the tricuspid valve and the subxiphoid four-chamber image are scanned; S11, according to claims 1 and 2, the probe rotates clockwise, and the three-chamber section (left ventricle, left atrium, aorta) and the two-chamber section (left ventricle and left atrium) are sequentially recorded from the four-chamber image; S12, according to claim 1, from the images collected by the probe the circular shape is intelligently recognized and marked as the left chamber; with self-adaptive adjustment the image is centered, and the subxiphoid left-chamber short axis, the inferior vena cava long axis and short axis and the abdominal aorta are scanned; the inclination angle is self-adaptively adjusted in the right direction, and the inferior vena cava long axis, the short axis and the abdominal aorta are scanned; S13, according to claims 1 and 2, intelligently recognize the suprasternal notch and the first rib at the left edge of the sternum; the probe tilts toward the head side and rotates, its rotation and inclination angles are self-adaptively adjusted, and the ascending aortic arch, the distal end, the RPL, the descending aorta and the suprasternal-fossa aortic arch are scanned; S14, according to claims 1 and 2, intelligently recognize the third rib and the fourth rib at the right edge of the sternum from the images collected by the depth vision device; the probe is moved to the third and fourth ribs, its rotation and inclination angles are self-adaptively adjusted, and the left atrium, the right atrium, the interatrial septum and the horizontal view at the center of the right sternal edge are scanned; the probe is adjusted to rotate 90 degrees anticlockwise, and the sagittal view of the right edge of the sternum is scanned; S15, intelligently recognize the second rib and the third rib at the right edge of the sternum from the images collected by the depth vision device; according to claim 1, the probe is moved to the second or third rib, its rotation and inclination angles are self-adaptively adjusted, and the long-axis images of the right edge of the sternum are scanned;
    Claim 4. Method for intelligently identifying a thoracic organ and autonomously locating and scanning the thoracic organ, characterized in that the breasts are scanned between the ribs by remotely controlled and autonomous self-adaptive movement of the robot arm, comprising the following steps: S1, intelligently recognize the nipple, the breast, and the position, range and contour of the nipple according to claim 1 and the images collected by the visual device; S2, according to claims 1 and 2, establish models of the depth vision device and the depth visual images, and models of the chest skeleton and the axillary sides of the two shoulder joints; input the depth information and the location information of the human body where the skeleton is located as input items into the model by the improved neural network method, and intelligently recognize the sternum, the two shoulder joints, the axillary sides of the two shoulder joints and their location information; S3, according to claim 1, move the probe from the outer side to the inner side of the mammary gland, then from the inner side to the outer side, and place the probe on the axillary midline; the probe scans from the nipple to the lower edge of the mammary gland until the gland disappears, and then scans along the longitudinal direction; S4, according to claims 1 and 2, the location information of the sternum, the shoulder joints and the axillary sides is intelligently recognized from the images collected by the depth vision devices; place the probe at the clavicle and scan from the clavicle along the axillary midline in the transverse direction, move the probe from the head side of the upper breast toward the foot side, scan from the head side to the foot side across the upper breast until the gland disappears, and then scan in the longitudinal direction; S5, according to claims 1 and 2, intelligently recognize the nipple and its position from the images collected by the visual device, place the probe at the nipple, scan in the rotation direction with the nipple as the center, scan from the nipple toward the outer edge of the mammary gland and back until the rib is reached and the gland disappears, and end the rotational scan;
    Figure 1 (flowchart): establish the general visual image model and the depth vision model; extract bone features; extract the feature items by the neural network algorithm and the improved method; recognize the face, nipple, breast, navel, feature parts and positions as feature items; recognize the chest skeleton and bone information from the depth information and the sternum; recognize each bone and its position; recognize the organs, blood vessels, bones and the human body and their location information; iterate the improved neural network algorithm over the training count; output the value and the recognition result of the human organs.
    Figure 2 (flowchart): set the collection tasks and the coordinates of the external location areas, and obtain the general vision images and the depth vision images; set the bone information of the external location area of the neighboring organs as the target; the robot subscribes to the location information of the external scanning area of the organs; move, scan and collect the medical images and videos by the motion planning module; the ultrasonic device publishes the image information, the color information of the blood vessels, the location information of the blood vessels, the ultrasonic images, and the contour, shape, structural and color features; if the target is recognized, self-adaptively adjust the probes, the rotation angles, the inclination angles and the parameters of the medical images, videos and image-collection method; if the collection is effective and complete, self-adaptively adjust the probes, rotation angles, inclination angles and image-collection method and collect all images of the target organ; when the images are effective and complete, move the probe and scan the heart organ; return when all scanning tasks are completed.
Set the collection tasks and the coordinates of the external location areas; obtain information from general vision images and depth vision images; set the target organ.
The robot extracts the location information and the external location area of the target organ.
The robot arm scans the anterior leaflet long axis along the direction of the anterior papillary muscle, the chordae tendineae and the anterior mitral valve; scans and collects long-axis images of the channel; collects images of the outflow channel of the right chamber.
Scan and collect images along the left edge of the sternum; moving for scanning from the left long-axis image, the probe rotates anticlockwise; the right chamber flows into the long-axis image of the channel.
Scan the four heart cavities, the mitral valve, the tricuspid valve, the left chest-wall four-cavity heart, the pulmonary artery, the aortic valve and the pulmonary valve, and the images are scanned;
Move and scan from the left long axis and collect images.
The probe rotates clockwise; scan and collect images of the three-lumen section, the two-cavity sections and the four-cavity center.
Scan the two-cavity cardiac view and the long axis of the apex of the heart, and the left chamber and the right chamber, and collect images;
Scan and collect images of the inferior vena cava long axis, the short axis and the abdominal aorta;
Scan and collect images of the ascending aortic arch, the distal end, the RPL, the descending aorta, and the suprasternal fossa aortic arch;
Scan and collect images of the left ventricle, the right ventricle and the ventricular septum at the right edge center of the sternum, horizontal and sagittal.
Figure 3
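The Figure 3 text enumerates a fixed sequence of cardiac scanning tasks (parasternal, apical, subcostal, suprasternal and right-sternal views) that the robot arm executes in order after locating the target organ. A minimal sketch of such an ordered scan plan follows; the view labels paraphrase the figure text, and the task-runner function and its names are assumptions for illustration only.

```python
# Ordered scan plan paraphrasing the Figure 3 task boxes (assumed grouping).
CARDIAC_VIEWS = [
    "parasternal long axis along the left sternal edge",
    "right-chamber inflow/outflow channel long axis",
    "four cavities, mitral valve, tricuspid valve, pulmonary artery, aortic and pulmonary valves",
    "apical three-lumen, two-cavity and four-cavity sections",
    "inferior vena cava long/short axis and abdominal aorta",
    "suprasternal ascending/descending aorta and aortic arch",
    "right sternal edge, horizontal and sagittal",
]

def run_scan_plan(views, scan_fn):
    """Execute each scan task in order, collecting the images per view.

    scan_fn stands in for the move-probe / scan / collect step of one view.
    """
    return [scan_fn(view) for view in views]
```

A usage example would pass the robot's scan-and-collect routine as `scan_fn`; here any callable taking a view label works.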
AU2022333990A 2021-08-27 2022-08-18 Method for intelligently identifying thoracic organ, autonomously locating and scanning thoracic organ Pending AU2022333990A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN202111008005.5 2021-08-27
CN202111008005.5A CN113855068A (en) 2021-08-27 2021-08-27 Method for intelligently identifying chest organs and autonomously positioning and scanning chest organs
PCT/CN2022/000119 WO2023024398A1 (en) 2021-08-27 2022-08-18 Method for intelligently identifying thoracic organ, autonomously locating and scanning thoracic organ

Publications (1)

Publication Number Publication Date
AU2022333990A1 true AU2022333990A1 (en) 2024-04-11

Family

ID=78988754

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2022333990A Pending AU2022333990A1 (en) 2021-08-27 2022-08-18 Method for intelligently identifying thoracic organ, autonomously locating and scanning thoracic organ

Country Status (3)

Country Link
CN (1) CN113855068A (en)
AU (1) AU2022333990A1 (en)
WO (1) WO2023024398A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113855068A (en) * 2021-08-27 2021-12-31 谈斯聪 Method for intelligently identifying chest organs and autonomously positioning and scanning chest organs
WO2023167830A1 (en) * 2022-03-01 2023-09-07 The Johns Hopkins University Autonomous robotic point of care ultrasound imaging

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2597615A1 (en) * 2008-11-25 2013-05-29 Algotec Systems Ltd. Method and system for segmenting medical imaging data according to a skeletal atlas
CN103679175B (en) * 2013-12-13 2017-02-15 电子科技大学 Fast 3D skeleton model detecting method based on depth camera
CN104856720B (en) * 2015-05-07 2017-08-08 东北电力大学 A kind of robot assisted ultrasonic scanning system based on RGB D sensors
US10521927B2 (en) * 2017-08-15 2019-12-31 Siemens Healthcare Gmbh Internal body marker prediction from surface data in medical imaging
CN109758233B (en) * 2019-01-21 2024-02-02 上海益超医疗器械有限公司 Diagnosis and treatment integrated operation robot system and navigation positioning method thereof
CN110477956A (en) * 2019-09-27 2019-11-22 哈尔滨工业大学 A kind of intelligent checking method of the robotic diagnostic system based on ultrasound image guidance
CN110827960A (en) * 2019-11-04 2020-02-21 杭州依图医疗技术有限公司 Medical image display method and display equipment
CN111916195A (en) * 2020-08-05 2020-11-10 谈斯聪 Medical robot device, system and method
CN111973228A (en) * 2020-06-17 2020-11-24 谈斯聪 B-ultrasonic data acquisition, analysis and diagnosis integrated robot and platform
CN111973152A (en) * 2020-06-17 2020-11-24 谈斯聪 Five sense organs and surgical medical data acquisition analysis diagnosis robot and platform
CN111658003B (en) * 2020-06-19 2021-08-20 浙江大学 But pressure regulating medical science supersound is swept and is looked into device based on arm
CN112001925B (en) * 2020-06-24 2023-02-28 上海联影医疗科技股份有限公司 Image segmentation method, radiation therapy system, computer device and storage medium
WO2022032455A1 (en) * 2020-08-10 2022-02-17 Shanghai United Imaging Healthcare Co., Ltd. Imaging systems and methods
CN112509694A (en) * 2020-12-17 2021-03-16 谈斯聪 Method for comprehensively identifying multiple suspected diseases through characterization, blood data and medical image fusion
CN113855068A (en) * 2021-08-27 2021-12-31 谈斯聪 Method for intelligently identifying chest organs and autonomously positioning and scanning chest organs

Also Published As

Publication number Publication date
WO2023024398A1 (en) 2023-03-02
CN113855068A (en) 2021-12-31

Similar Documents

Publication Publication Date Title
AU2022333990A1 (en) Method for intelligently identifying thoracic organ, autonomously locating and scanning thoracic organ
Li et al. An overview of systems and techniques for autonomous robotic ultrasound acquisitions
US20080085043A1 (en) Cardiac Valve Data Measuring Method And Device
CN106605257A (en) Landmark detection with spatial and temporal constraints in medical imaging
AU2022335276A1 (en) Recognition, autonomous positioning and scanning method for visual image and medical image fusion
CN112151169B (en) Autonomous scanning method and system of humanoid-operation ultrasonic robot
CN112270993B (en) Ultrasonic robot online decision-making method and system taking diagnosis result as feedback
US20210236773A1 (en) Autonomous Robotic Catheter for Minimally Invasive Interventions
JP2022524583A (en) Smart monitoring system for pelvic fracture reduction
CN112998749A (en) Automatic ultrasonic inspection system based on visual servoing
Li et al. RL-TEE: Autonomous probe guidance for transesophageal echocardiography based on attention-augmented deep reinforcement learning
Tan et al. Automatic generation of autonomous ultrasound scanning trajectory based on 3-d point cloud
CN112132805B (en) Ultrasonic robot state normalization method and system based on human body characteristics
Zhang et al. Robotic actuation and control of a catheter for structural intervention cardiology
CN117323004B (en) Navigation positioning system of spinal surgery robot
WO2023024397A1 (en) Medical robot apparatus, system and method
Chen et al. Fully Robotized 3D Ultrasound Image Acquisition for Artery
US20230009891A1 (en) Augmented Imaging For Valve Repair
Patlan-Rosales et al. Strain estimation of moving tissue based on automatic motion compensation by ultrasound visual servoing
Huang et al. Robot-Assisted Deep Venous Thrombosis Ultrasound Examination Using Virtual Fixture
WO2021254427A1 (en) Integrated robot and platform for ultrasound image data acquisition, analysis, and recognition
CN113893033A (en) Lung percutaneous puncture navigation method and system
Vrooijink Control of continuum robots for minimally invasive interventions
Bal et al. A Curvature and Trajectory Optimization-based 3D Surface Reconstruction Pipeline for Ultrasound Trajectory Generation
Popovic et al. An approach to robotic guidance of an uncalibrated endoscope in beating heart surgery