WO2021254427A1 - Integrated robot and platform for ultrasound image data acquisition, analysis and recognition - Google Patents
Integrated robot and platform for ultrasound image data acquisition, analysis and recognition
- Publication number
- WO2021254427A1 (PCT/CN2021/100562)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- image
- ultrasound
- module
- data
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
Definitions
- the invention belongs to the technical field of artificial intelligence robot health examination equipment, and relates to medical data analysis and medical image intelligent recognition systems.
- the purpose of the present invention is to provide a health examination system based on an artificial intelligence robot.
- An artificial intelligence robot system is combined with various data acquisition devices and other nodes to build a physical-examination medical data acquisition and analysis robot platform.
- In the artificial intelligence robot medical data collection and analysis system for health examination, the robot device includes:
- the robot main system: the robot main system module realizes the main control of the robot, handles the communication from the camera and medical ultrasound equipment acquisition modules to the medical data analysis module, and manages the interaction among the robot arm motion planning control module, the voice module and the user.
- camera and sensor data acquisition module: collects ultrasound medical images, camera images and other measured medical data.
- voice module: provides interaction and voice guidance between the main control system and the user.
- medical data analysis: the data analysis module compares medical data against standard values and finds abnormal medical data.
- image classification module: classifies ultrasound medical images and internal-organ ultrasound images.
- ultrasound image module: the data acquisition module for the medical ultrasound equipment, which collects the ultrasound detection equipment's medical data and the ultrasound equipment's medical images.
- robot arm motion planning and acquisition module: performs motion planning and handles the interaction between the robot arm's motion and the user.
- Through the robot main control system, the camera and sensor data acquisition module and the ultrasound module collect medical data from cardiac and other detection equipment, together with ultrasound images of the internal organs, guided by the robotic arm motion planning and acquisition module and the voice module.
- Voice commands and remote control strengthen the interaction between the robot and the user and realize intelligent collection.
- Medical data analysis compares the medical data against standard values and intelligently finds abnormal data.
- The image classification module accurately classifies ultrasound images, intelligently locates scanning positions and classifies internal-organ ultrasound images. This improves the accuracy of intelligent collection and of abnormal-data recognition, and increases the flexibility and feasibility of remote collection, classification, analysis and recognition of medical images.
- The robot main system realizes the main control of the robot: data collection, image classification, voice interaction, action interaction, intelligent collection, intelligent analysis of abnormal data, intelligent identification, and remote identification.
- A camera recognizes human faces, color markings and organ collection areas outside the body, while the medical detection equipment and ultrasound equipment collect medical data and medical images of the internal organs.
- The voice module includes remote collection of voice commands and voice recognition, for interaction and voice guidance between the main control system and the user.
- The action module includes an action planning module and an action acquisition module, used for the action interaction between the main control system and the user and for collecting images of the robotic arm's actions.
- The action module's action planning module also covers the ultrasound unit's collection action plan and the cardiac medical data collection plan, used for action interaction between the main control system and the user and for robotic arm action image collection.
- STEP2 Set the target parameters (target name, left and right arm joints)
- STEP4 Publish target and parameters (target pose, pose tag)
- STEP6 Set the target for head id, target pose and direction value
- STEP8 Set the pose mark as the coordinate origin and direction value
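The STEP2-STEP8 sequence above can be sketched as a plain data structure; this is a hedged illustration with no ROS dependency, whose field names mirror a geometry_msgs/PoseStamped message. The frame name "head_link" is an assumption for the example, not from the patent.

```python
def make_target(frame_id, position=(0.0, 0.0, 0.0), orientation=(0.0, 0.0, 0.0, 1.0)):
    """Build a pose target: STEP6 sets the frame (the head id), STEP8 places the
    pose mark at the coordinate origin with an identity direction value."""
    x, y, z = position
    qx, qy, qz, qw = orientation
    return {
        "header": {"frame_id": frame_id},
        "pose": {
            "position": {"x": x, "y": y, "z": z},
            "orientation": {"x": qx, "y": qy, "z": qz, "w": qw},
        },
    }

# STEP4 would then publish this record as the target and its parameters.
target = make_target("head_link")   # "head_link" is an assumed frame name
```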
- the vision camera communicates with the ultrasound collector:
- Step2 Set the parameters of the publisher node of the gripper (target name, pose mark)
- Step3 Set the camera subscriber node parameters (point cloud, recent point cloud list)
- Step4 Define and obtain the nearest point cloud list
- Step5 Define the nearest point and convert it into an array of points
- Step7 Confirm the parameters and return the point cloud information
- Step8 Set the pose direction value as a point object
- Step9 Publish COG as the target pose
- Step10 Set target parameters (posture mark, timestamp, target to head id, COG target pose, direction value)
- Step11 Publish the target node of the gripper
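Steps 3-9 of the camera/ultrasound hand-off (take the most recent point cloud, keep the nearest points, use their centre of gravity as the gripper target) can be sketched as follows; the sample coordinates and the choice of k are illustrative assumptions:

```python
import numpy as np

def cog_of_nearest(points, k=5):
    """points: (N, 3) array from the camera. Returns the centroid (COG) of the
    k points closest to the sensor origin (Steps 4-5, then Step 9)."""
    pts = np.asarray(points, dtype=float)
    dist = np.linalg.norm(pts, axis=1)      # distance of each point to the sensor
    nearest = pts[np.argsort(dist)[:k]]     # Step5: nearest points as an array
    return nearest.mean(axis=0)             # Step9: COG published as the target pose

# Toy cloud: two near points and one far point (values invented for the example).
cloud = [[0.1, 0.0, 0.5], [0.2, 0.1, 0.6], [1.5, 1.0, 2.0]]
target_position = cog_of_nearest(cloud, k=2)
```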
- Step1 Set the allowable error of position and attitude
- Step2 When the motion planning fails, re-planning is allowed
- Step3 Set the reference coordinate system of the target position
- Step4 Set the time limit for each exercise plan
- Step5 Set the position of the medical bed, arms and legs, set the height of the medical bed, the position of the arm and the position of the leg
- Step6 Set up the medical bed, arm and leg positions in the physical examination and diagnosis DEMO (including: medical bed ID, medical bed position, left arm ID, left arm position, right arm ID, right arm position, left leg ID, left leg position, right leg ID, right leg position) and add the above parameters to the medical examination and treatment DEMO
- Step7 Set the color, AR label and other special marks at the positions of the medical bed, arms and legs
- Step8 Set the position target, i.e. the moving position (among the human body position marks: the color label for lying flat, the color label for lying on the left side, and the color label for lying on the right side)
- Step9 Set the scene color
- Step10 Set the lying-flat color label, the left-lying color label, the right-lying color label and other special marks
- Step11 Add the colors to the DEMO, including: initialize the planning scene object, monitor and apply scene differences, set the colors, and publish the color labels (the lying-flat scene color, the left-lying scene color, the right-lying scene color, and other special marks)
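Steps 1-4 amount to configuring the motion planner's tolerances, replanning behaviour, reference frame and time budget. A minimal sketch collecting them into one settings record follows; the key names and default values are assumptions, though in a MoveIt-based stack they would correspond to calls such as `set_goal_position_tolerance`, `allow_replanning`, `set_pose_reference_frame` and `set_planning_time`:

```python
def planner_settings(position_tolerance=0.01, orientation_tolerance=0.05,
                     replan=True, reference_frame="base_link", time_limit=5.0):
    """Gather the Step1-Step4 planner parameters; all defaults are illustrative."""
    return {
        "goal_position_tolerance": position_tolerance,      # Step1: allowable position error
        "goal_orientation_tolerance": orientation_tolerance, # Step1: allowable attitude error
        "allow_replanning": replan,                          # Step2: re-plan on failure
        "pose_reference_frame": reference_frame,             # Step3: target reference frame
        "planning_time": time_limit,                         # Step4: per-plan time limit (s)
    }

settings = planner_settings()
```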
- a method for patient face recognition, external position recognition of human organs, and color mark recognition includes the following steps:
- From the position images collected outside the human body and the external position information of the organ collection area, the improved deep neural network algorithm intelligently recognizes face images, joint images and color-marked images, accurately locates the external collection location of the organ, and collects data intelligently.
- an improved method for machine learning classification algorithm to classify organ images includes the following steps:
- a disease recognition method under a deep neural network algorithm organ model includes the following steps:
- Through the camera and ultrasound probe carried by the robot, the present invention solves the prior-art problems of low physical examination efficiency, difficult data collection, and inaccurate data collection.
- The intelligent physical examination research and development platform realizes health management; it effectively detects, analyzes and identifies abnormalities in the heart, breast and abdominal organs, and realizes intelligent and remote recognition of problems within the scanned cavity, of abnormal organ diseases, and of other health problems.
- The ultrasound probe collects data from the heart and other internal organs of the body, enabling accurate analysis, classification of abnormal data of the various organs, and accurate identification of common conditions such as internal-organ abnormalities and heart disease.
- FIG. 1 is a schematic structural diagram of a physical examination and medical data collection and analysis robot in Embodiment 1 of the present application.
- Fig. 2 is a schematic diagram of a camera and an ultrasound image acquisition module in the first embodiment of the present application.
- Figure 3 is a positioning diagram of the ultrasound collection position of the human body.
- Figure 1 is labeled: 100-robot main system; 101-voice module; 102-medical image acquisition module; 103-robotic arm motion planning module; 104-camera image acquisition module.
- Figure 2 labels: 10-robot main control system simulation device; 20-camera simulation device; 30-voice module; 40-radar mobile base; 50-image acquisition device module; 60-robotic arm module; 100-human face; 300-external body position corresponding to a human organ (internal organ collection area).
- Figure 3 Labels: 200-color marking, 400-shoulder joint, 601-atrium, 602-breast, 603-liver, 604-spleen and stomach, 605-kidney, 606-uterus, bladder, female ovary, 607-male prostate.
- The embodiments of the application provide a medical examination robot system, an ultrasonic device for medical data collection and analysis, and an organ classification and disease identification method. They solve the prior-art problems of low physical examination efficiency, difficult remote and autonomous data collection, and inaccurate data collection, and realize effective detection, data analysis, identification of body abnormalities, intelligent identification, and recognition of problems such as diseases within the scanned cavity, abnormal organ diseases, and other health problems.
- The robot device includes: the robot main system, whose module realizes the main control of the robot and the communication from the camera acquisition module, the ultrasound module and the equipment data acquisition module to the medical data analysis module, and which handles the interaction among the robot arm motion planning control module, the voice module and the user.
- The data collection module collects ultrasound medical images, cardiac data and other measured medical data.
- The voice module provides interaction and voice guidance between the main control system and the user.
- The image classification module classifies the images; within the ultrasound image module, the ultrasound inspection equipment data acquisition module collects the ultrasonic testing equipment's medical data and the ultrasonic equipment's medical images.
- The robotic arm motion planning and acquisition module performs motion planning and handles the interaction between the robot's actions and the user.
- In an artificial intelligence robot medical data collection and analysis system for health examination, the robot device includes:
- the robot main control system 10: realizes the main control of the robot, communicates with the camera module and the ultrasound image acquisition module, is equipped with the robotic arm, communicates with the ultrasound inspection equipment data acquisition module, drives the robotic arm motion planning and collection, and communicates with the voice module for voice interaction between the robot and the user.
- The camera 20, the voice module 30, the ultrasound image acquisition module 50 and the medical ultrasound equipment collect medical images of the internal organs; the robot arm motion planning and collection module 103 and the voice module 101 guide the user, strengthening the interaction between the robot and the user and realizing intelligent collection.
- Medical data analysis compares medical data against standard values and intelligently finds abnormal data; the image classification module accurately classifies ultrasound medical images, intelligently locates scanning positions and classifies the internal-organ ultrasound images.
- The robot main control system 10 realizes the main control of the robot and communicates with each module: the camera 20, the voice module 30 and the ultrasound image acquisition module 50.
- The robotic arm is equipped with the ultrasound module 50, used for robot arm motion planning and collection; the system communicates with the voice module 30 for voice interaction between the robot and the user.
- The robot main control system device 10 is communicatively connected to the robotic arm simulation device 60 and the depth camera simulation unit 20; to the voice module 30; to the ultrasound image acquisition module 102 under test; and to the robotic arm with the ultrasound inspection equipment data acquisition module 50.
- The robot main control system is connected with a depth camera for face and ultrasound image collection, and supports voice interaction.
- The camera simulation unit 20 collects human faces and, according to the instructions of the robot main control system simulation device 10, publishes the image data, communicates with the image recognition nodes, and recognizes human faces, color markings and joints.
- The robot main control system returns the color marking information, the joint information and the external position information of the body organs, and the robot arm 60 moves to the collection position on the outside of the human body, thereby accurately locating the face, the joints and the ultrasound collection area.
- Use the robot main system to plan the action interaction and realize data collection. Design robot actions, and aim at the camera and other collection positions to realize human-robot friendly interaction and collect data efficiently.
- the voice module 30 is used for voice commands, voice recognition, and voice consultation.
- The platform's robot main control system 10 communicates with the voice module 30 to realize voice control of the main system.
- the main system 10 sends an action instruction to the robot arm action plan collection module 60.
- the voice module is used for voice recognition, voice synthesis, robotic voice autonomous consultation, and disease knowledge answering. Voice interviews with family doctors and specialists at the remote end.
- The ultrasound acquisition module 50 collects medical images of the internal organs; according to the instructions of the robot main control system simulation device 10, it publishes the medical image data, and the TF package under the robot main control system 10 returns the body position information; the robot arm 60 then moves to the collection position for the internal organs, accurately locating them and returning each organ's name, image and data values.
- The robot arm action planning and acquisition module 60 moves and collects ultrasound medical images; it calculates positions and timing according to the action plan and the action instructions of the robot main control system simulation device 10, communicates with the organ recognition program node via the camera module 20 to identify color marks and joint marks, determines the location of the target organ for ultrasound collection, and moves to the corresponding external organ location.
- The robot arm package realizes the robot arm's motion planning and data collection.
- The robot arm engineering package under the robot system plans the robot arm's movements; using the robotic arm mounted with a camera and other devices, ultrasound data of the heart, breast and abdominal organs are collected effectively through motion planning and action interaction, achieving accurate data collection.
- the patient face recognition, external position recognition of human organs, and color mark recognition methods include:
- Establish a mathematical model of the face 100 for individual face image recognition; extract the facial features, the color labels 200 and the corresponding external positions 300 of the human organs, including features such as colors, faces and joints 400, and extract the positions of the external organ locations on the human body.
- Extract the feature values (mark color value; shoulder, waist and lower-limb joint positions; face), and enter the feature values of the detection items.
- Improve the weight optimizer and obtain the output value through image training; from the output result, derive the externally collected position image and the external position information of the organ collection area.
- The improved deep neural network algorithm intelligently recognizes the face image 100, the color-marked image 200 and the joint 400, accurately locates the external collection location 300 of the organ, and collects data intelligently.
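The color-mark localization step can be illustrated with a simple mask-and-centroid routine. This is a hedged stand-in for the improved deep neural network the text describes, and the RGB threshold values are invented for the example:

```python
import numpy as np

def locate_color_mark(image, lower, upper):
    """image: (H, W, 3) RGB array. Returns the (row, col) centroid of pixels whose
    channels all fall inside [lower, upper], or None if no pixel matches."""
    img = np.asarray(image)
    mask = np.all((img >= np.asarray(lower)) & (img <= np.asarray(upper)), axis=-1)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# Tiny synthetic frame with a red mark (illustrative threshold) at rows 2-3, col 4.
frame = np.zeros((6, 6, 3), dtype=np.uint8)
frame[2:4, 4] = [220, 30, 30]
center = locate_color_mark(frame, lower=[200, 0, 0], upper=[255, 80, 80])
```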
- the method includes:
- an internal ultrasound collection area 500 is established.
- Establish the mathematical model of the internal organ 600: extract the internal contour features of the organ, including color, shape and contour; extract the feature values (color, shape, contour) of the image; enter the item feature values and calculate the output value.
- the organ images are classified.
- the accurately classified ultrasound images include atrium 601, breast 602, liver 603, spleen and stomach 604, kidney 605, uterus, bladder, female ovary 606, male prostate 607 and other images.
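The contour/color feature classification can be sketched as a minimal nearest-template classifier, a deliberately simple stand-in for the improved machine-learning method the text describes; the feature vectors and the three organ labels are invented for illustration:

```python
import numpy as np

# Toy (color, shape, contour) feature templates per organ class; values invented.
TEMPLATES = {
    "liver":  np.array([0.8, 0.3, 0.5]),
    "kidney": np.array([0.4, 0.7, 0.6]),
    "breast": np.array([0.6, 0.5, 0.2]),
}

def classify_organ(features):
    """Assign the extracted feature vector to the closest organ template."""
    f = np.asarray(features, dtype=float)
    return min(TEMPLATES, key=lambda organ: np.linalg.norm(f - TEMPLATES[organ]))

label = classify_organ([0.78, 0.32, 0.48])
```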
- The disease identification method under the deep neural network algorithm for organs 601-607 includes:
- The ultrasound organ image is input into the mathematical model of the corresponding organ 601-607, and features of the input image are extracted, including the color, contour and texture of the organ image, the image features of diseases common to the corresponding organ, and blood vessel color values; these are transformed into the input data.
- The algorithm's weights and optimizer then compute the output value; the organ's disease type is classified according to the output result, and the disease is accurately identified.
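A minimal sketch of the forward pass this method describes, assuming a tiny two-layer network with random (untrained) weights and an assumed {normal, abnormal} output; a real model would be trained on labelled organ images:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Tiny fixed-seed weights, for illustration only (no training shown here).
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)

def predict_disease(features):
    """features: color / contour / texture values extracted from the image.
    Returns a probability distribution over the assumed {normal, abnormal} labels."""
    h = relu(W1 @ np.asarray(features, dtype=float) + b1)
    return softmax(W2 @ h + b2)

probs = predict_disease([0.5, 0.2, 0.9])
```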
Landscapes
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Veterinary Medicine (AREA)
- Pathology (AREA)
- Public Health (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Physics & Mathematics (AREA)
- Biophysics (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
- Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
- Investigating Or Analyzing Materials By The Use Of Ultrasonic Waves (AREA)
Abstract
Description
Claims (8)
- An integrated robot and platform for ultrasound image data acquisition, analysis and recognition, characterized in that: an artificial intelligence robot main system is adopted; a camera, an ultrasound device and other devices mounted on a robotic arm intelligently collect medical data and medical images; the data are analyzed and the ultrasound and other images are classified, improving the efficiency of physical examination and data collection. In the artificial intelligence robot medical data collection and analysis health examination system, the robot device comprises: a robot main system, the robot main system module being used to realize the main control of the robot and the communication from the camera and medical ultrasound equipment acquisition modules to the medical data analysis module, and for the robot arm motion planning control module and the interaction between the voice module and the user; a camera and sensor data acquisition module, used to collect images and measured medical data; a voice module, used for interaction and voice guidance between the main control system and the user; medical data analysis, the data analysis module being used to compare medical data against standard values and find abnormal medical data; an image classification module, used to classify medical images; a medical image module and medical equipment data acquisition module, used to collect medical data and medical images; and a robotic arm motion planning and acquisition module, used for motion planning and the interaction between the robotic arm's motion and the user.
- The integrated robot and platform for ultrasound image data acquisition, analysis and recognition, characterized in that an improved neural network method classifies and recognizes human ultrasound images and intelligently locates body tissues and organs, thereby accurately identifying internal tissues and organs and collecting data at their locations.
- The robot device according to claim 1, characterized in that the robot system uses color marks, joint marks and other special marks; a coordinate transformation package returns the color marks and the position information of each ultrasound location on the body; the robot system drives the connected robotic arm to each body part's data collection position, thereby accurately locating internal tissues and organs and accurately collecting their images.
- The robot device according to claim 1, characterized in that a robotic arm and a camera are used: the robotic arm is connected to an ultrasound probe to collect images; through the camera and sensor data acquisition module, the camera mounted on the robotic arm collects image data of the face, external body parts and joints; and a neural network algorithm recognizes faces, external body positions and joint images and computes the return values, greatly improving the efficiency of intelligent disease recognition and of abnormal examination-data recognition.
- The robot device according to claim 1, characterized in that the robot is connected to the camera, sensors and probe carried by the robotic arm; the probe mounted on the robotic arm collects images and an ultrasound acquisition device collects organ data; based on an improved machine learning method, the ultrasound image module and the image classification module build feature models of the scanned-region contour and the internal tissues and organs, and use the improved machine learning method to intelligently classify organ locations, thereby indicating the direction and position of the robotic arm's movement and realizing classification, accurate recognition and intelligent localization of the scanned-region contour and internal organs, and a method for recognizing diseases in ultrasound images.
- The integrated robot and platform for ultrasound image data acquisition, analysis and recognition, characterized in that, based on the SVM method and on improved machine learning methods not limited to SVM, feature models of the scanned-region contour and the internal tissues and organs are built, and the SVM method and machine learning methods not limited to SVM improve the intelligent classification of organ positions in ultrasound images, thereby indicating the direction and position of the robotic arm's movement and realizing efficient classification of ultrasound organ image contours and internal tissues and organs.
- The integrated robot and platform for ultrasound image data acquisition, analysis and recognition, characterized in that an improved neural network algorithm builds a mathematical model for image recognition and for the apparent features of diseases, comprising: extracting the graphical features of the scanned cavity in the image; guiding a deformable model contour to evolve toward the target features; identifying organ diseases through the organ contour, the positions and shapes of blood vessels, the image color, grayscale contrast and symptom features; extracting the image feature values (color, shape, contour); entering the detection-item feature values; adjusting the weight parameters with the improved deep neural network method to obtain output values; and identifying the corresponding organ's normal signs or diseases according to the range of the output values.
- The integrated robot and platform for ultrasound image data acquisition, analysis and recognition, characterized in that the robotic arm and its motion planning and design method realize robotic arm movement, grasping and effective action guidance, thereby achieving autonomous remote data collection.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202180008741.2A CN116507286A (zh) | 2020-06-17 | 2021-06-17 | Integrated robot and platform for ultrasound image data acquisition, analysis and recognition |
AU2021292112A AU2021292112A1 (en) | 2020-06-17 | 2021-06-17 | Integrated robot and platform for ultrasound image data acquisition, analysis, and recognition |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010556720.1A CN111973228A (zh) | 2020-06-17 | 2020-06-17 | Integrated B-mode ultrasound data collection, analysis and diagnosis robot and platform |
CN202010556720.1 | 2020-06-17 | ||
CN202010780479.0A CN111916195A (zh) | 2020-08-05 | 2020-08-05 | Medical robot device, system and method |
CN202010780479.0 | 2020-08-05 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021254427A1 (zh) | 2021-12-23 |
Family
ID=79268472
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/100562 WO2021254427A1 (zh) | Integrated robot and platform for ultrasound image data acquisition, analysis and recognition | 2020-06-17 | 2021-06-17 |
Country Status (2)
Country | Link |
---|---|
AU (1) | AU2021292112A1 (zh) |
WO (1) | WO2021254427A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114536323A (zh) * | 2021-12-31 | 2022-05-27 | National University of Defense Technology of the Chinese People's Liberation Army | A classification robot based on image processing |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007037848A2 (en) * | 2005-09-28 | 2007-04-05 | Siemens Medical Solutions Usa, Inc. | Systems and methods for computer aided diagnosis and decision support in whole-body imaging |
US20190262084A1 (en) * | 2018-02-27 | 2019-08-29 | NavLab, Inc. | Artificial intelligence guidance system for robotic surgery |
CN110288574A (zh) * | 2019-06-13 | 2019-09-27 | Nantong Infectious Disease Prevention and Treatment Hospital (Nantong Third People's Hospital) | Ultrasound-assisted liver mass diagnosis system and method |
CN110477956A (zh) * | 2019-09-27 | 2019-11-22 | Harbin Institute of Technology | Intelligent scanning method for a robotic diagnosis system guided by ultrasound images |
US20190358822A1 (en) * | 2018-05-23 | 2019-11-28 | Aeolus Robotics, Inc. | Robotic interactions for observable signs of core health |
CN111916195A (zh) * | 2020-08-05 | 2020-11-10 | 谈斯聪 | Medical robot device, system and method |
CN111973152A (zh) * | 2020-06-17 | 2020-11-24 | 谈斯聪 | Facial features and surgical medical data collection, analysis and diagnosis robot and platform |
CN111973228A (zh) * | 2020-06-17 | 2020-11-24 | 谈斯聪 | Integrated B-mode ultrasound data collection, analysis and diagnosis robot and platform |
- 2021
  - 2021-06-17 WO PCT/CN2021/100562 patent/WO2021254427A1/zh (active, Application Filing)
  - 2021-06-17 AU AU2021292112 patent/AU2021292112A1/en (active, Pending)
Also Published As
Publication number | Publication date |
---|---|
AU2021292112A1 (en) | 2023-03-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112155729B (zh) | Intelligent automated planning method and system for a surgical puncture path, and medical system | |
CN116507286A (zh) | Integrated robot and platform for ultrasound image data acquisition, analysis and recognition | |
CN109567942B (zh) | Craniomaxillofacial surgical robot assistance system using artificial intelligence technology | |
WO2021254444A1 (zh) | Facial features and surgical medical data collection, analysis and diagnosis robot and platform | |
Li et al. | An overview of systems and techniques for autonomous robotic ultrasound acquisitions | |
WO2022027921A1 (zh) | Medical robot device, system and method | |
Li et al. | Autonomous multiple instruments tracking for robot-assisted laparoscopic surgery with visual tracking space vector method | |
JP2016080671A5 (zh) | ||
Suligoj et al. | RobUSt–an autonomous robotic ultrasound system for medical imaging | |
CN112998749B (zh) | Automatic ultrasound examination system based on visual servoing | |
CN112270993A (zh) | Online decision-making method and system for an ultrasound robot using diagnostic results as feedback | |
CN108030496A (zh) | Method for measuring the coupling between the glenohumeral joint rotation centre of the human shoulder and the upper-arm elevation angle | |
Peng et al. | Autonomous recognition of multiple surgical instruments tips based on arrow OBB-YOLO network | |
WO2023024396A1 (zh) | Visual image and medical image fusion recognition and autonomous positioning scanning method | |
CN112132805A (zh) | Ultrasound robot state normalization method and system based on human body features | |
WO2023024398A1 (zh) | Method for intelligently recognizing chest organs and autonomously locating and scanning them | |
WO2021254427A1 (zh) | Integrated robot and platform for ultrasound image data acquisition, analysis and recognition | |
Mathur et al. | A semi-autonomous robotic system for remote trauma assessment | |
WO2021253809A1 (zh) | Integrated device, system and method for blood collection and analysis and intelligent image recognition diagnosis | |
CN114310957A (zh) | Robot system for medical testing and testing method | |
CN109993116A (zh) | Pedestrian re-identification method based on mutual learning of human skeletons | |
CN102370478B (zh) | Electrocardiograph electrode placement and positioning device | |
JP2016035651A (ja) | Home rehabilitation system | |
Vitali et al. | A new approach for medical assessment of patient's injured shoulder | |
WO2023024397A1 (zh) | Medical robot device, system and method |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21826621; Country of ref document: EP; Kind code of ref document: A1 |
DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | |
WWE | Wipo information: entry into national phase | Ref document number: 202180008741.2; Country of ref document: CN |
ENP | Entry into the national phase | Ref document number: 2022581727; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |
ENP | Entry into the national phase | Ref document number: 2021826621; Country of ref document: EP; Effective date: 20230117 |
ENP | Entry into the national phase | Ref document number: 2021292112; Country of ref document: AU; Date of ref document: 20210617; Kind code of ref document: A |
122 | Ep: pct application non-entry in european phase | Ref document number: 21826621; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: JP |
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 22/09/2023) |