WO2022027921A1 - Medical robot device, system and method - Google Patents
Medical robot device, system and method
- Publication number
- WO2022027921A1 (PCT/CN2021/000162)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- module
- robot
- medical
- recognition
- equipment
- Prior art date
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B10/00—Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
- A61B10/0045—Devices for taking samples of body liquids
- A61B10/0051—Devices for taking samples of body liquids for taking saliva or sputum samples
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B10/00—Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
- A61B10/0045—Devices for taking samples of body liquids
- A61B10/007—Devices for taking samples of body liquids for taking urine samples
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0082—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
- A61B5/0088—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for oral or dental tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/15—Devices for taking samples of blood
- A61B5/151—Devices specially adapted for taking samples of capillary blood, e.g. by lancets, needles or blades
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/15—Devices for taking samples of blood
- A61B5/151—Devices specially adapted for taking samples of capillary blood, e.g. by lancets, needles or blades
- A61B5/15101—Details
- A61B5/15103—Piercing procedure
- A61B5/15107—Piercing being assisted by a triggering mechanism
- A61B5/15109—Fully automatically triggered, i.e. the triggering does not require a deliberate action by the user, e.g. by contact with the patient's skin
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B7/00—Instruments for auscultation
- A61B7/02—Stethoscopes
- A61B7/04—Electric stethoscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/90—Identification means for patients or instruments, e.g. tags
- A61B90/94—Identification means for patients or instruments, e.g. tags coded with symbols, e.g. text
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M5/00—Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests
- A61M5/42—Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests having means for desensitising skin, for protruding skin to facilitate piercing, or for locating point where body is to be pierced
- A61M5/427—Locating point where body is to be pierced, e.g. vein location means using ultrasonic waves, injection site templates
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H80/00—ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/03—Recognition of patterns in medical or anatomical images
Definitions
- The present invention belongs to the technical field of artificial-intelligence health and medical robotics, and relates to the field of robotics, an intelligent image recognition method, an intelligent device, and a system.
- Background art: in the current medical field, the accuracy of disease identification is poor, individual specialists and medical professions are confined to their own fields, and comprehensive identification of illness is difficult to achieve.
- The invention involves remote control by administrators, remote joint consultation, joint rounds by ward specialists, robotic devices for combined treatment, and robotic platforms involving robotic theory and practical techniques.
- Robotic arms are used to autonomously collect oral test samples, blood test samples, and urine and stool test samples, to perform autonomous injection, and to manage and prepare drugs and medical supplies.
- Machine vision and various intelligent recognition methods assist in identifying disease symptoms associated with disease identification, realize remote detection, autonomous detection, infection detection, and intelligent data analysis, and effectively prevent the spread of infectious diseases, plagues, and other major diseases.
- The purpose of the present invention is to overcome the above-mentioned shortcomings and deficiencies of the prior art, and to provide a medical robotic device that enables remote consultation, multi-department joint consultation, and remote doctor's orders, addressing poor doctor-patient communication and limited understanding of disease.
- The ultrasound image acquisition device, intraoral acquisition device, and blood acquisition device carried by the robot, together with remote-controlled acquisition and sharing of CT and DR radiology images, realize image sharing, which reduces errors in manual diagnosis and treatment as well as the limitations of a single clinic and the monotony of diagnostic protocols.
- The present invention also provides an optimized management system for multi-task allocation in outpatient clinics and wards; a multi-user-robot voice-interaction method for real-time collection and sharing of medical pictures; a dispensing management method; and a remote-control and autonomous sample collection and injection management method that matches medical staff, patient, and robot.
- The technical solution adopted in the present invention is a medical robot device comprising: a robot main system module, used to realize main control of the robot; a voice module for interaction between the main control system and the user; a visual recognition module; a heart sound and lung rale recognition module; a medical scene recognition and radar autonomous-movement real-time mapping module; a blood collection and injection action planning module; and a robotic-arm pick, place, code-scanning, and management action planning control module.
- A voice module, used to collect the voices of doctors and patients and the scene voices of outpatient clinics and wards.
- The voice module also provides interaction, voice guidance, and voice commands between the main control system and the user.
- the visual recognition module is connected to an image acquisition device, and collects and recognizes images.
- the image acquisition device includes one or more of a general camera, a depth camera, and a binocular camera, but is not limited to the above image acquisition devices.
- the visual recognition module includes: face recognition, human facial features recognition, human body feature position recognition, medical scene recognition, medical supplies recognition, and drug recognition.
- Face recognition is the recognition of the faces of patient users and medical administrators.
- Human facial-features recognition is the recognition of the facial features and their positions, including the angular position of the oral cavity, and is used for nucleic acid testing, biological sample testing, and other oral testing.
- Human body feature position recognition refers to joint position recognition, including: shoulder, wrist, elbow, and finger joints and their positions, and is used to identify fingers, fingertips, wrists, elbows, and shoulder-arm joints. Under the vascular amplifier, the positions of the wrist veins, the cubital veins, and the intramuscular injection site near the shoulder are identified for the positioning of blood vessels and other key positions.
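The patent does not specify an algorithm for the vessel-amplifier step. As a minimal sketch of one conventional approach, local contrast enhancement and thresholding of a near-infrared frame in which veins appear dark, the Python snippet below returns candidate vein points; the file name and all threshold parameters are illustrative assumptions, not values from the patent.

```python
# Minimal sketch: enhance subcutaneous veins in a near-infrared (NIR) image
# captured under a vascular amplifier, then return candidate puncture points.
# Assumptions: "wrist_nir.png" is a grayscale NIR frame; thresholds are illustrative.
import cv2
import numpy as np

def locate_vein_candidates(nir_image: np.ndarray) -> list:
    # Veins absorb NIR light, so they appear darker than surrounding tissue.
    clahe = cv2.createCLAHE(clipLimit=3.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(nir_image)                     # boost local contrast
    blurred = cv2.medianBlur(enhanced, 5)                 # suppress speckle noise
    # Adaptive threshold keeps the dark, ridge-like vein structures.
    veins = cv2.adaptiveThreshold(blurred, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                  cv2.THRESH_BINARY_INV, 21, 7)
    veins = cv2.morphologyEx(veins, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(veins, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    points = []
    for c in contours:
        if cv2.contourArea(c) > 200:                      # ignore small blobs
            m = cv2.moments(c)
            points.append((int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])))
    return points                                         # pixel coords of vein centroids

if __name__ == "__main__":
    frame = cv2.imread("wrist_nir.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input
    if frame is None:
        raise SystemExit("no input image")
    print(locate_vein_candidates(frame))
```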
- An improved neural network method is applied to identify the medical scene, recognizing comprehensive scenes including outpatient clinics, wards, patients, doctors, and the alphanumeric characters of room numbers.
- Identification of medical supplies includes: the blood pressure monitor, blood glucose meter, thermometer, stethoscope, heart-rate acquisition device, respiratory device, negative-pressure device, and 24-hour monitoring device in the basic medical equipment area carried by the robot, as well as the medical devices of the various specialties. An improved neural network method identifies and manages medical supplies and equipment from shape, color, digital-code, and QR-code features. According to the doctor's orders and task assignments, the identified medical supplies are matched against the identified patient's face and wristband QR code and managed accordingly.
- Drug identification includes: identifying the name and quantity of a drug from the digital code, QR code, characters, color, and shape features on its outer label, and matching them against the recognized patient face and wristband QR code for management.
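As an illustration of this three-way check, the sketch below encodes the matching rule in plain Python; all identifiers and field names (patient_id, drug_code, and so on) are assumptions for the example, not terms from the patent.

```python
# Minimal sketch of the three-way check described above: the drug identified on
# the label must match the doctor's order, and the order must belong to the
# patient identified by face recognition and wristband QR code.
from dataclasses import dataclass

@dataclass
class DoctorOrder:
    patient_id: str   # patient the order was written for
    drug_code: str    # digital code of the prescribed drug
    quantity: int

def dispensing_allowed(face_id: str, wristband_qr: str,
                       label_code: str, label_qty: int,
                       order: DoctorOrder) -> bool:
    # Face recognition and wristband QR must agree on who the patient is.
    if face_id != wristband_qr:
        return False
    # The identified patient must be the one named in the doctor's order.
    if wristband_qr != order.patient_id:
        return False
    # The scanned drug label must match the prescribed drug and quantity.
    return label_code == order.drug_code and label_qty == order.quantity

order = DoctorOrder(patient_id="P-0042", drug_code="DRUG-881", quantity=2)
print(dispensing_allowed("P-0042", "P-0042", "DRUG-881", 2, order))  # True
print(dispensing_allowed("P-0042", "P-0042", "DRUG-999", 2, order))  # False: report back
```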
- Heart sound and lung sound recognition module: used for voiceprint feature extraction of heart sounds and lung rales, applying an improved sound recognition algorithm to intelligently identify abnormal heart sounds and rales.
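The patent names only "an improved sound recognition algorithm". A common baseline for auscultation voiceprints is MFCC features plus a simple classifier; the sketch below assumes librosa is available, that "heart.wav" is a recorded clip, and that the class centroids were learned offline, all of which are illustrative assumptions rather than the patent's method.

```python
# Minimal baseline sketch (not the patent's "improved algorithm"): extract MFCC
# voiceprint features from an auscultation recording and score it with a simple
# nearest-centroid rule. "heart.wav" and the centroid values are assumptions.
import numpy as np
import librosa

def mfcc_voiceprint(path: str) -> np.ndarray:
    y, sr = librosa.load(path, sr=4000)              # heart/lung sounds are low-frequency
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)                         # 13-dim summary voiceprint

# Hypothetical centroids learned offline from labelled normal/abnormal recordings.
NORMAL_CENTROID = np.zeros(13)
ABNORMAL_CENTROID = np.ones(13)

def classify(path: str) -> str:
    v = mfcc_voiceprint(path)
    d_normal = np.linalg.norm(v - NORMAL_CENTROID)
    d_abnormal = np.linalg.norm(v - ABNORMAL_CENTROID)
    return "abnormal heart sound / rale" if d_abnormal < d_normal else "normal"

if __name__ == "__main__":
    print(classify("heart.wav"))
```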
- Radar autonomous movement, medical scene recognition, and mapping module: the radar autonomous positioning, navigation, and real-time mapping module fuses the medical scene recognition of the visual recognition module (the department, the alphanumeric ward room number, and the bed number) with the radar's real-time map, and autonomously positions, navigates, and moves to the corresponding department, ward, and bed.
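The fusion described here can be reduced to a simple idea: text recognized by the camera resolves to a landmark in the radar-built map, and that landmark becomes the navigation goal. The sketch below illustrates the lookup; the landmark table and coordinates are invented for the example.

```python
# Minimal sketch of the vision/radar fusion idea: the room number recognized by
# the camera is looked up in the map built by the radar, and the result becomes
# the navigation goal. Landmark coordinates are illustrative assumptions.
from typing import Optional, Tuple

# Hypothetical landmark table produced during radar mapping: room label -> (x, y) in metres.
ROOM_LANDMARKS = {"WARD-301": (12.4, 3.1), "WARD-302": (12.4, 6.8), "PHARMACY": (2.0, 1.5)}

def navigation_goal(recognized_text: str) -> Optional[Tuple[float, float]]:
    # OCR of the door plate gives e.g. "WARD-301"; the map resolves it to coordinates.
    return ROOM_LANDMARKS.get(recognized_text.strip().upper())

goal = navigation_goal("ward-301")
if goal is None:
    print("room not on map; keep exploring")   # fall back to radar-only navigation
else:
    print(f"navigate to {goal}")               # hand off to the path planner
```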
- Action planning sets its parameters through administrator adjustment, and the robot is trained to learn and plan actions, adaptively adjusting the action-planning parameters through an improved neural network method.
- The collection and injection module includes: a blood collection and injection module, an oral test sample collection module, a urine and stool sample storage and management module, and a medical image collection and sharing module.
- The blood collection and injection module, with its fingertip blood collection module and injection needle module, on the basis of identifying the positions of the fingers, fingertips, and arm joints, applies a blood vessel amplifier and an arm fixing device to locate the fingertip position and the wrist and elbow positions of the arm.
- The oral test sample collection module uses the facial-features recognition of the visual recognition module to identify and locate the oral cavity position, tooth positions, and oral wall position; using the oral collector mounted on the robotic arm, with collection swab and oral mirror, it plans motions that slide along the wall in the left-right and front-back directions and performs collection actions, accurately collecting saliva, oral biological features, and intraoral images.
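One way to realize the wall-sliding motion is to generate end-effector waypoints over the located cavity wall. The sketch below produces such a sweep pattern; the geometry values (radius, depth, number of passes) are illustrative assumptions, and the resulting points would be handed to the arm controller.

```python
# Minimal sketch of the "slide along the wall, left-right and front-back" swab
# motion: generate end-effector waypoints on the located oral-cavity wall.
# The cavity centre, radius and depth values are illustrative assumptions.
import math

def swab_waypoints(cx: float, cy: float, cz: float,
                   radius_mm: float = 15.0, depth_mm: float = 25.0,
                   sweeps: int = 4, steps: int = 12):
    """Yield (x, y, z) points that sweep left-right while advancing front-back."""
    for s in range(sweeps):
        z = cz + depth_mm * s / (sweeps - 1)             # advance toward the throat
        for i in range(steps):
            # Sweep an arc along the cavity wall; reverse direction each pass.
            a = math.pi * (i / (steps - 1))
            angle = a if s % 2 == 0 else math.pi - a
            yield (cx + radius_mm * math.cos(angle),
                   cy + radius_mm * math.sin(angle),
                   z)

# The pose (cx, cy, cz) would come from the visual recognition module's
# oral-cavity localization; here it is a placeholder at the origin.
points = list(swab_waypoints(0.0, 0.0, 0.0))
print(len(points), "waypoints generated")    # each would be sent to the arm controller
```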
- The urine and stool sample storage and management module is used for the robot's ward tours: wards, beds, and patients are matched against their corresponding QR codes and digital codes, and the robotic arm automatically identifies, grasps, moves, and places urine and stool samples in the sample collection area.
- The medical image acquisition and sharing module is used for acquiring ultrasound images and CT images, image sharing, remote-controlled acquisition and sharing of DR radiology and MRI images, remote consultation, and multi-department joint consultation.
- An action planning module is used for fitting medical equipment to the patient. The medical equipment refers to the equipment carried by the robot and the respiratory devices, negative-pressure devices, 24-hour monitoring devices, and other medical equipment in the medical area, which are controlled by the robot main system using the facial-features recognition of the visual recognition module.
- The medical supplies and medicine pick-and-place configuration management module picks up and places medicines, treatment equipment, rehabilitation equipment, and other medical supplies, scans their digital codes and QR codes, and effectively manages and distributes equipment.
- The visual recognition module identifies the patient's face and scans the wristband QR code to compare the bed and wrist-card information; the digital codes and QR codes of medical devices and drugs are matched against the doctor's order information.
- An optimized task management system includes a medical robot device, the medical-care task subsystems of multiple departments, and a call subsystem; the medical robot device is the medical robot device of any of the above solutions, and the multi-department medical task subsystems and the call subsystem are connected with the robot main control system and built on the optimized task management platform.
- The medical administrator can arrange time slots for patients in multiple departments and wards and the tasks corresponding to each time period, and can add, modify, delete, query, and dynamically schedule the robots' tasks in real time.
- The system connects with the call system of the medical area to conduct remote consultations, jointly diagnose and treat patients in the jurisdiction, send doctor's order information, accept patient messages, and reply to patient messages.
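The add/modify/delete/query operations above amount to simple task bookkeeping keyed by department, ward, and time slot. The sketch below shows one way to structure it; the Task fields and example values are assumptions for illustration.

```python
# Minimal sketch of the optimized task management platform's bookkeeping:
# add / modify / delete / query tasks per department, ward and time slot.
from dataclasses import dataclass, replace
from typing import Dict, List

@dataclass(frozen=True)
class Task:
    task_id: int
    department: str
    ward: str
    time_slot: str      # e.g. "2021-07-29 09:00"
    action: str         # e.g. "blood collection", "drug delivery"

class TaskManager:
    def __init__(self) -> None:
        self._tasks: Dict[int, Task] = {}
        self._next_id = 1

    def add(self, department: str, ward: str, time_slot: str, action: str) -> Task:
        task = Task(self._next_id, department, ward, time_slot, action)
        self._tasks[task.task_id] = task
        self._next_id += 1
        return task

    def modify(self, task_id: int, **changes) -> Task:
        self._tasks[task_id] = replace(self._tasks[task_id], **changes)
        return self._tasks[task_id]

    def delete(self, task_id: int) -> None:
        self._tasks.pop(task_id, None)

    def query(self, ward: str) -> List[Task]:
        # Robots pull their schedule sorted by time slot for each tour.
        return sorted((t for t in self._tasks.values() if t.ward == ward),
                      key=lambda t: t.time_slot)

mgr = TaskManager()
mgr.add("internal medicine", "WARD-301", "2021-07-29 09:00", "drug delivery")
mgr.add("internal medicine", "WARD-301", "2021-07-29 08:30", "blood collection")
print([t.action for t in mgr.query("WARD-301")])  # earliest task first
```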
- A multi-user-robot voice interaction joint consultation method for collecting and sharing medical pictures in real time comprises the following steps:
- the robot uses speech recognition and speech synthesis technology to explain the patient's condition.
- The administrator uses the message service carried by the robot platform, subscribes to the picture data service, and publishes images; multiple users and robots share medical information such as pictures and voice.
- The administrator uses the real-time voice interaction and voice recognition module carried by the robot platform for real-time multi-user voice conversation, voice-to-text with attached picture information, recording of multi-user voice interactions, and voice conferencing.
- A management method for autonomous picking and distribution of medicines and medical devices, matching medical staff, patient, and robot, includes the following steps:
- the administrator communication module publishes doctor's order messages and services, the robot voice module subscribes to receive doctor's order messages, and patient users subscribe to receive doctor's order messages and services.
- the robot uses speech recognition, speech synthesis technology, speech recording, and speech-to-text to recognize doctor's orders.
- the robot uses the visual recognition module to identify equipment, medicines and their corresponding location information.
- The communication module publishes the equipment and drug location information service; the robot's radar positioning and navigation module subscribes to the location information service and autonomously moves to the equipment and medicine placement area.
- the robot uses the action planning module to pick up equipment, medicines, and scan digital codes and two-dimensional codes.
- the robot uses the communication module to publish patient location information including: ward, department, and bed location information.
- the radar positioning and navigation module subscribes to the patient's position information and moves to the hospital bed autonomously.
- The robot uses the visual recognition module to recognize the department's medical scene, the alphanumeric ward room number, and the bed number, and uses the vision module to recognize the patient's face and check the match; if they are consistent, it performs step 8; if they are inconsistent, it repositions and navigates again.
- The robot uses the motion planning module to scan the digital code and QR code of the patient's wristband and checks them against the QR codes and digital codes of the doctor's order information on the equipment and medicine. If the scan matches, the equipment and medicine are distributed; otherwise a message is returned to the administrator.
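The publish/subscribe wording running through these steps suggests a ROS-style topic bus. Below is a minimal, framework-free sketch of that pattern; the topic names and message payloads are invented for the example.

```python
# Minimal, framework-free sketch of the publish/subscribe flow used throughout
# this method (the wording suggests a ROS-style topic bus).
from collections import defaultdict
from typing import Any, Callable, Dict, List

class MessageBus:
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, message: Any) -> None:
        for callback in self._subscribers[topic]:
            callback(message)

bus = MessageBus()

# The robot's voice module and the patient terminal both receive doctor's orders.
bus.subscribe("doctor_order", lambda m: print("robot voice module:", m))
bus.subscribe("doctor_order", lambda m: print("patient terminal:", m))
# The radar module receives bed locations and triggers autonomous navigation.
bus.subscribe("patient_location", lambda m: print("navigate to", m))

# Administrator publishes an order; the robot publishes the patient's location.
bus.publish("doctor_order", {"patient_id": "P-0042", "drug_code": "DRUG-881"})
bus.publish("patient_location", {"ward": "WARD-301", "bed": 12})
```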
- A remote-control and autonomous sample collection and injection management method matching medical staff, patient, and robot includes the following steps:
- the administrator communication module publishes doctor's order messages and services, the robot voice module subscribes to receive doctor's order messages, and patient users subscribe to receive doctor's order messages and services.
- the robot uses speech recognition, speech synthesis technology, speech recording, and speech-to-text to recognize doctor's orders.
- the robot uses the communication module to publish patient location information including: ward, department, and bed location information.
- The radar positioning and navigation module subscribes to the patient's location information and autonomously moves to the hospital bed.
- The robot's communication module publishes the equipment and medicine location information service; the radar positioning and navigation module subscribes to the location information service and autonomously moves to the equipment and medicine placement areas.
- The robot uses the visual recognition module to recognize faces, facial features, body features, and their positions; it identifies fingers, fingertips, and arm joints and their positions, applies the vascular amplifier and arm fixing device, and locates the fingertip position, the arm's wrist and elbow vein positions, and the upper-arm intramuscular injection position as position information.
- The robot uses the communication module to publish the fixing-device, collection, and injection position information; the robotic arm and the motion planning module subscribe to the position information.
- The robot performs oral collection, image collection, blood collection, and injection actions according to the position information of step S6 and the action planning module.
- The collection module includes: a blood collection and injection action planning module, an oral collection action planning module, and a urine and stool sample storage action planning module.
- The blood collection and injection action planning module, the fingertip blood collection module, and the injection needle module, on the basis of identifying the positions of the fingers, fingertips, and arm joints, apply a blood vessel amplifier and an arm fixing device to locate the fingertip position, the arm's wrist and elbow vein positions, and the upper-arm intramuscular injection position, and apply the collection needle and injection needle to collect blood and perform intravenous and intramuscular injection.
- In step S7, the oral collection action planning module uses the facial-features recognition of the visual recognition module to locate the oral cavity position, tooth positions, and oral wall position; using the oral collector carried by the robotic arm, with collection swab and oral mirror, it plans motions that slide along the wall in the left-right and front-back directions and performs collection actions, accurately collecting saliva, oral features, and intraoral images.
- In step S7, the urine and stool sample collection module and the urine and stool sample storage action planning module are used for the robot to tour the corresponding ward, hospital bed, and patient and to match their corresponding QR codes and digital codes; the robotic arm automatically recognizes, grasps, moves, and places urine and stool samples in the sample collection area.
- The robot uses the communication module to publish the location information of the recovery areas; the radar positioning and navigation module subscribes to the recovery-area location service and moves autonomously to the saliva sample recovery area, biological sample recovery area, blood sample recovery area, urine sample recovery area, and stool sample recovery area, where the robotic-arm action module places and recovers samples.
- The robot's visual recognition module publishes the coordinates of the external body position areas corresponding to the external features of each organ, and the main system subscribes to the locations and coordinates of the external acquisition areas.
- The ultrasonic probe carried by the robotic arm, under the remote main control system or autonomously, moves and scans the body acquisition area according to the subscribed acquisition-area locations and the robotic-arm image-acquisition action planning module.
- the ultrasonic probe and ultrasonic device publish the collected image information, and the robot main system and the visual recognition module subscribe to the image information.
- The robot main system and the visual recognition module take the internal contours of the image and the characteristic values of each organ as input, and use a deep neural network with a weight optimizer to obtain the output values and the classification and recognition results for the internal organs.
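As a concrete reading of this step, the sketch below feeds per-organ feature values to a small fully connected network trained with a weight optimizer (Adam here); the feature dimension, layer sizes, and organ labels are assumptions, and the random tensors stand in for real extracted features.

```python
# Minimal sketch of the classification step: organ feature values extracted from
# an ultrasound/CT image are fed to a small fully connected network trained with
# a weight optimizer. Layer sizes and class names are illustrative assumptions.
import torch
import torch.nn as nn

N_FEATURES = 32                      # contour + per-organ characteristic values
ORGANS = ["heart", "liver", "kidney", "lung", "spleen"]

model = nn.Sequential(
    nn.Linear(N_FEATURES, 64),
    nn.ReLU(),
    nn.Linear(64, len(ORGANS)),      # one logit per organ class
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(features: torch.Tensor, labels: torch.Tensor) -> float:
    optimizer.zero_grad()
    loss = loss_fn(model(features), labels)
    loss.backward()
    optimizer.step()                 # the "weight optimizer" updates the network
    return loss.item()

# One illustrative step on random stand-in data (real inputs would come from
# the visual recognition module's feature extraction).
x = torch.randn(8, N_FEATURES)
y = torch.randint(0, len(ORGANS), (8,))
print("loss:", train_step(x, y))
print("predicted organ:", ORGANS[model(x[:1]).argmax(dim=1).item()])
```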
- Through the medical robot device, the present invention realizes remote-controlled isolated collection, autonomous injection, and autonomous positioning, movement, and navigation; it achieves unmanned, isolated collection and independently completes the various medical and nursing tasks of outpatient clinics and wards, relieving doctors' and nurses' work pressure and excessive night shifts.
- Fig. 1 is a schematic diagram of the medical robot device modules in the specification of this application.
- 101 - robot main system; 102 - collection and injection action planning module; 103 - camera vision module; 104 - ultrasound, CT, DR image acquisition module; 105 - voice module; 106 - heart sound and lung sound acquisition module; 107 - medical data acquisition module; 108 - radar mapping, positioning, and navigation module.
- The purpose of the present invention is to design a remote-controlled robot that can replace human work, realizing remote-controlled robotic-arm collection while also effectively solving autonomous collection: collecting oral test samples for nucleic acid testing and biological testing, collecting blood samples, and collecting urine and stool samples.
- the present invention is further described in detail below with reference to the embodiments and the accompanying drawings, but the embodiments of the present invention are not limited thereto.
- To solve the above technical problems, the general idea of the technical solution in the implementation of the present application is as follows: through the robot's main control system, the ultrasound image acquisition device, intraoral acquisition device, and blood acquisition device carried by the robot, and remote-controlled acquisition and sharing of CT and DR radiology images, image acquisition and sharing are realized; through the vascular amplifier, intravenous injector, and other injection devices carried by the robot, remote control of the robot, autonomous injection, and autonomous drug preparation are realized; and through the radar and vision cameras, ward rounds and the picking and placing of medical equipment are realized.
- The invention also provides an optimized management system for multi-task allocation in outpatient clinics and wards; a multi-user-robot voice-interaction joint consultation method for real-time collection and sharing of medical pictures; a management method for autonomous picking and distribution of medicines and devices; a remote-control and autonomous sample collection and injection management method matching medical staff, patient, and robot; and a method for autonomously locating and identifying human organ feature positions and for acquiring and classifying ultrasound and CT images of internal organs.
- Embodiment 1: as shown in FIG. 1.
- A medical robot device includes: a robot main system 101, used to realize the main control of the robot; a voice module 105, connected to the robot main system 101 for user interaction; a visual recognition module 103, used for face, human body, and medical scene recognition; and a heart sound and lung sound recognition module 106, used for collecting heart sounds and lung sounds.
- The radar 108 is used for autonomous movement and real-time mapping; the robotic arm carries the blood collection and injection action planning module 102, which is used for collecting image samples, oral test samples, blood samples, and urine and stool samples, and for intravenous and intramuscular injection.
- The robotic-arm pick, place, code-scanning, and management action planning control module 109 is used for picking, placing, code scanning, and management of medical equipment and medicines.
- The voice module 105 is used to collect the voices of doctors and patients and the scene voices of outpatient clinics and wards.
- Through it, the robot main control system 101 interacts with the user and provides voice guidance, voice commands, and voice interaction.
- In the visual recognition module 103, face recognition recognizes the faces of patient users and medical administrators, and is used for matching patients with their corresponding samples, medical equipment, and drug management.
- The visual recognition module 103 recognizes human facial features and their positions and the position of the oral cavity, and is used to collect oral samples for testing.
- The visual recognition module 103 recognizes human body feature positions: it recognizes the wrist, elbow, and finger joints and their positions and, under the blood vessel amplifier, the positions of the wrist veins and elbow veins, which are used for blood vessel positioning, blood collection, and intravenous injection. It identifies shoulder and waist joints for identifying and locating the proximal-shoulder intramuscular injection site and for remote and autonomous injections.
- The medical scene recognition in the visual recognition module 103 identifies clinics, wards, patients, doctors, the alphanumeric characters of room numbers, and so on, and the voice module 105 collects medical scene voices to comprehensively recognize the medical scene.
- The medical-supplies recognition in the visual recognition module 103 identifies respiratory devices, negative-pressure devices, 24-hour monitoring devices, and other medical equipment used in the various specialties.
- The heart sound and lung rale recognition module 106 is used for feature extraction of heart sounds and lung rales, and an improved sound recognition method intelligently identifies abnormal heart sounds and rales.
- For the blood collection and injection action planning module 102, the visual recognition module 103 identifies the positions of the fingers, fingertips, and arm joints; the blood vessel amplifier and arm fixing device are used to locate the fingertip position, the arm's wrist and elbow vein positions, and the upper-arm and waist-joint intramuscular injection positions; and the collection needle and injection needle are applied to collect blood and perform intravenous and intramuscular injection.
- the robotic arm autonomously collects, moves, and places the blood sample to the sample placement area.
- Oral collection uses the facial-features recognition in the visual recognition module 103 to locate the oral cavity position, tooth positions, and oral wall position; using the oral collector carried by the robotic arm, with collection swab and oral mirror, it plans motions that slide along the wall in the left-right and front-back directions and performs collection actions, accurately collecting saliva, biological samples in the oral cavity, and intraoral images.
- The urine and stool sample storage action planning module, together with the visual recognition module 103, is used for the robot's ward tours: wards, beds, and patients are matched against their corresponding QR codes and digital codes, and the robotic arm automatically identifies, grasps, moves, and places urine and stool samples in the sample collection area.
- the medical image acquisition and sharing module 104 is connected to the robot main system 101 and is used for acquiring ultrasound images, CT images, image sharing, remote control acquisition and sharing of DR radiology images, remote consultation, and multi-department joint consultation.
- The action planning module for respiratory devices, negative-pressure devices, and 24-hour monitoring devices applies the facial-features recognition and body-feature recognition of the visual recognition module to identify and locate the characteristic positions of the mouth, nose, ears, eyes, and body, and plans the robotic arm to pick up, move, place, fit, and retrieve the devices and to monitor their normal operation.
- The medical supplies and medicine pick, place, configuration, and management module 109 is used for picking up and placing medicines, treatment equipment, and rehabilitation equipment, scanning digital codes and QR codes, and effectively managing and distributing equipment.
- The visual recognition module 103 is used to identify the patient's face; the wristband QR code is scanned to compare the bed and wrist-card information, and the digital codes and QR codes of medical devices and drugs are matched against the doctor's order information for autonomous collection, scanning, and management of medical devices.
- An optimized task management system and a method of using the medical robot device are as follows: using the optimized task management system, the medical administrator arranges time slots for patients in multiple departments and wards and the tasks corresponding to each time period, and adds all the tasks to the optimized task management system; the medical robot device receives the tasks assigned by the administrator of the system according to date, time, and the corresponding department and ward.
- Administrator users and expert users can log in to the optimized task management system, remotely control robots, manage the robots within their respective department and ward jurisdictions, add, modify, delete, query, and dynamically schedule the robots' tasks in real time, connect with the call system of the medical area, conduct remote consultations, jointly diagnose and treat patients in the jurisdiction, send doctor's order information, accept patient messages, and reply to patient messages.
- For each time period's task, the radar module 108 and the vision module 103 route the robot on its rounds.
- The medical supplies and medicine pick, place, configuration, and management module 109, the blood collection and injection action planning module 102, the voice module 105, and the ultrasound, CT, DR image acquisition module 104 respectively handle their different tasks.
- Management and configuration tasks use the robot motion planning of the medical supplies and medicine pick, place, configuration, and management module 109; the steps are as follows:
- the administrator issues medical orders and assigns tasks.
- The robot uses the voice device 215 and the voice recognition module 105, with speech synthesis, voice recording, and speech-to-text, to recognize the doctor's order.
- the robot uses the visual recognition module 103 to identify the equipment, medicine and their corresponding positions.
- The robot uses the radar 207 and the radar autonomous-movement, medical scene recognition, and mapping module 108 to locate, navigate, and autonomously move to the equipment and medicine placement area.
- The robot uses the medical supplies and medicine pick, place, configuration, and management module 109 to pick up equipment and medicines and scan their information codes.
- The patient location information used by the robot includes: ward, department, and bed location information; radar positioning and navigation move the robot autonomously to the hospital bed.
- The robot uses the medical scene recognition of the visual recognition module 103 to identify the department, the ward room number with its alphanumeric characters, and the bed number; it uses the vision module to recognize the patient's face and check the match; if consistent, it performs step 8; if inconsistent, it repositions and navigates again.
- The robot uses the information scanning device 212 to scan the digital code and QR code of the patient's wristband and checks them against the QR codes and digital codes of the doctor's order information on the equipment and medicine. If the scan matches, the equipment and medicine are distributed; otherwise the message is returned to the administrator.
- S9. The upper left arm 208 and upper right arm 205 of the robotic arm place the dispensed equipment and medicines in the medicine box and equipment placement area.
- For collection and injection, the robot uses the blood collection and injection action planning module 102 for motion planning.
- the collection and injection steps are as follows:
- the administrator issues medical orders and assigns tasks.
- The robot uses the voice device 215 and the voice module 105, with speech synthesis, voice recording, and speech-to-text, to recognize the doctor's order.
- The robot uses the patient's location information: the patient's ward, department, and bed location information.
- the radar 207 navigates autonomously to the hospital bed.
- The robot uses the camera 201 and the vision module 103 to recognize faces, facial features, body features, and their positions; it identifies fingers, fingertips, and arm joints and their positions, applies the vascular amplifier 209 and the arm fixing device 213, and locates the fingertip position, the arm's wrist and elbow vein positions, and the upper-arm intramuscular injection position as position information.
- the robot collects oral cavity, image, blood, and injection actions according to the position information in step S4 and the action planning module.
- In step S5, the blood collection and injection action planning module and the fingertip peripheral blood collection module, with the collector 210 and syringe needle 211, on the basis of identifying the positions of the fingers, fingertips, and arm joints, apply the blood vessel amplifier 209 and the arm fixing device 213 to locate the fingertip position, the arm's wrist and elbow vein positions, and the upper-arm intramuscular injection position; the collector 210 collects blood, and the syringe 211 performs intravenous and intramuscular injection.
- In step S5, the oral collection action planning module uses the facial-features recognition of the visual recognition module 103 to identify and locate the oral cavity position, tooth positions, and oral wall position; using the oral collector 210 carried by the robotic arm, with its collection swab and oral mirror, it plans motions that slide along the wall in the left-right and front-back directions and performs collection actions, accurately collecting saliva, oral features, and intraoral images.
- The urine and stool sample collection module is used for the robot to tour the corresponding ward and hospital bed; the patient's corresponding QR code is scanned with the information scanning device 212 and the digital code is matched, and the robotic arm's upper right arm 205 and upper left arm 208 automatically recognize, grasp, move, and place urine and stool samples in the sample placement area 214.
- the robot uses the radar 207 to locate and navigate autonomously to move to the sample recovery area.
- The multi-user-robot voice interaction joint consultation method includes the following steps:
- the robot uses speech recognition and speech synthesis technology to explain the patient's condition.
- The administrator has the robot use the ultrasound, CT, DR image acquisition module 104 to acquire ultrasound images and CT images as real-time pictures.
- the collection step is as in S6.
- The administrator uses the robot platform to share voice, previously collected and real-time medical pictures, and text, so that multiple users and robots share the medical information.
- S5. The blood pressure monitor, blood glucose meter, thermometer, stethoscope, and heart-rate equipment in the basic medical equipment area carried by the robot collect basic medical information, which is shared with multiple users.
- The administrator uses the real-time voice interaction and voice recognition module 105 carried by the robot platform for real-time multi-user voice conversation, voice-to-text with attached picture information, recording of multi-user voice interactions, and voice conferencing.
- Steps for the administrator to autonomously locate human organ feature positions, collect ultrasound and CT images, and classify and identify internal organs from the ultrasound and CT images:
- Step 1: The robot visual recognition module 103 recognizes the external features of the organs, including: shoulder joints, breasts and nipples, the navel, genital features, and waist joints, and their corresponding coordinates in the external body position areas.
- Step 2: According to the coordinates of the external body position areas corresponding to each organ's external features, the ultrasonic probe 203 and the ultrasound device 204 carried by the robotic arm scan the external acquisition areas.
- Step 3: The ultrasonic probe 203, mounted on the robotic arm and driven by the remote main control system 202 or autonomously, moves and scans the body acquisition area according to the robotic-arm image-acquisition action planning module; the ultrasound probe 203 and the ultrasound device 204 publish the collected image information.
- Step 4: The robot main system 202 and the visual recognition module 103 take the internal contours of the ultrasound or CT image and the characteristic values of each organ as input, and use a deep neural network with a weight optimizer to obtain the output values and the internal-organ classification and recognition results.
- Step 5: According to the output results, the ultrasound and CT images of human organs are accurately classified and identified, and the identification results are associated with the intelligent disease-identification system of each organ; the identification results and their corresponding disease symptoms and disease information are published to the administrators and users of the robot's main system.
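A minimal sketch of this final association-and-publish step is given below; the disease table entries and the subscriber mechanism are invented placeholders for illustration.

```python
# Minimal sketch of step 5: map the organ recognition result to that organ's
# disease-identification entry and publish it to administrators and users.
# The table contents and subscriber list are illustrative assumptions.
DISEASE_INFO = {
    "liver": {"symptoms": "abnormal echo texture", "note": "refer to hepatology"},
    "kidney": {"symptoms": "dilated renal pelvis", "note": "refer to nephrology"},
}

def publish_result(organ: str, subscribers) -> None:
    info = DISEASE_INFO.get(organ, {"symptoms": "none on file", "note": "no action"})
    report = {"organ": organ, **info}
    for deliver in subscribers:      # e.g. administrator console, patient app
        deliver(report)

# Here the only "subscriber" is the console; a real system would deliver the
# report through the robot main system's communication module.
publish_result("liver", [print])
```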
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Physics & Mathematics (AREA)
- Pathology (AREA)
- Hematology (AREA)
- Biophysics (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Epidemiology (AREA)
- Evolutionary Computation (AREA)
- Vascular Medicine (AREA)
- General Physics & Mathematics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Primary Health Care (AREA)
- Acoustics & Sound (AREA)
- General Engineering & Computer Science (AREA)
- Artificial Intelligence (AREA)
- Mathematical Physics (AREA)
- Business, Economics & Management (AREA)
- Computational Linguistics (AREA)
- Computing Systems (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- General Business, Economics & Management (AREA)
- Pulmonology (AREA)
- Software Systems (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Dentistry (AREA)
- Dermatology (AREA)
- Anesthesiology (AREA)
- Bioinformatics & Cheminformatics (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2021321650A AU2021321650A1 (en) | 2020-08-05 | 2021-07-29 | Medical robotic device, system, and method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010780479.0 | 2020-08-05 | ||
CN202010780479.0A CN111916195A (zh) | 2020-08-05 | 2020-08-05 | Medical robot device, system and method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022027921A1 true WO2022027921A1 (zh) | 2022-02-10 |
Family
ID=73287855
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/000162 WO2022027921A1 (zh) | Medical robot device, system and method | 2020-08-05 | 2021-07-29 |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN111916195A (zh) |
AU (1) | AU2021321650A1 (zh) |
WO (1) | WO2022027921A1 (zh) |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021254427A1 (zh) * | 2020-06-17 | 2021-12-23 | 谈斯聪 | Ultrasound image data acquisition, analysis and recognition integrated robot and platform |
CN111916195A (zh) * | 2020-08-05 | 2020-11-10 | 谈斯聪 | Medical robot device, system and method |
WO2021253809A1 (zh) * | 2020-06-19 | 2021-12-23 | 谈斯聪 | Integrated device, system and method for blood collection and analysis and intelligent image recognition and diagnosis |
CN114800538A (zh) * | 2021-01-21 | 2022-07-29 | 谈斯聪 | Companion and care robot device, adaptive learning system and method |
CN112951230A (zh) * | 2021-02-08 | 2021-06-11 | 谈斯聪 | Remote and autonomous experiment robot device, management system and method |
CN113110325A (zh) * | 2021-04-12 | 2021-07-13 | 谈斯聪 | Multi-arm sorting and mobile delivery device, and optimized management system and method |
CN115192051A (zh) * | 2021-04-13 | 2022-10-18 | 佳能医疗系统株式会社 | Medical imaging device, medical imaging system, and auxiliary examination method in a medical imaging device |
CN112990101B (zh) * | 2021-04-14 | 2021-12-28 | 深圳市罗湖医院集团 | Machine-vision-based facial organ positioning method and related device |
CN113425332A (zh) * | 2021-06-29 | 2021-09-24 | 尹丰 | Integrated nucleic acid collection and vaccination device and method |
CN113478457A (zh) * | 2021-08-03 | 2021-10-08 | 爱在工匠智能科技(苏州)有限公司 | Medical service robot |
CN113858219A (zh) * | 2021-08-23 | 2021-12-31 | 谈斯聪 | Medical robot device, system and method |
CN113855067A (zh) * | 2021-08-23 | 2021-12-31 | 谈斯聪 | Fusion recognition of visual and medical images, and autonomous positioning and scanning method |
CN113855068A (zh) * | 2021-08-27 | 2021-12-31 | 谈斯聪 | Method for intelligently recognizing chest organs and autonomously positioning and scanning them |
CN113855250A (zh) * | 2021-08-27 | 2021-12-31 | 谈斯聪 | Medical robot device, system and method |
CN114310957A (zh) * | 2022-01-04 | 2022-04-12 | 中国科学技术大学 | Robot system for medical testing and testing method |
WO2023167830A1 (en) * | 2022-03-01 | 2023-09-07 | The Johns Hopkins University | Autonomous robotic point of care ultrasound imaging |
CN117245635B (zh) * | 2022-12-12 | 2024-10-15 | 北京小米机器人技术有限公司 | Robot, control method and apparatus therefor, and storage medium |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120130739A1 (en) * | 2010-11-21 | 2012-05-24 | David Crane | Unsupervised Telemedical Office for Remote &/or Autonomous & Automated Medical Care of Patients |
CN206780416U (zh) * | 2017-05-23 | 2017-12-22 | 周葛 | Intelligent medical assistant robot |
CN111358439A (zh) * | 2020-03-14 | 2020-07-03 | 厦门波耐模型设计有限责任公司 | General practitioner robot system |
2020
- 2020-08-05 CN CN202010780479.0A patent/CN111916195A/zh active Pending
2021
- 2021-07-29 WO PCT/CN2021/000162 patent/WO2022027921A1/zh active Application Filing
- 2021-07-29 AU AU2021321650A patent/AU2021321650A1/en active Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150273697A1 (en) * | 2014-03-27 | 2015-10-01 | Fatemah A.J.A. Abdullah | Robot for medical assistance |
CN107030714A (zh) * | 2017-05-26 | 2017-08-11 | 深圳市天益智网科技有限公司 | Medical nursing robot |
CN107322602A (zh) * | 2017-06-15 | 2017-11-07 | 重庆柚瓣家科技有限公司 | Home service robot for telemedicine |
CN107788958A (zh) * | 2017-10-20 | 2018-03-13 | 深圳市前海安测信息技术有限公司 | Medical monitoring robot and medical monitoring method |
WO2019175675A2 (en) * | 2019-07-01 | 2019-09-19 | Wasfi Alshdaifat | Dr robot medical artificial intelligence robotic arrangement |
CN110477956A (zh) * | 2019-09-27 | 2019-11-22 | 哈尔滨工业大学 | Intelligent scanning method for an ultrasound-image-guided robotic diagnosis system |
CN111916195A (zh) * | 2020-08-05 | 2020-11-10 | 谈斯聪 | Medical robot device, system and method |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114886476A (zh) * | 2022-07-14 | 2022-08-12 | 清华大学 | Automatic throat swab collection robot |
CN114886476B (zh) * | 2022-07-14 | 2022-09-20 | 清华大学 | Automatic throat swab collection robot |
CN115781686A (zh) * | 2022-12-26 | 2023-03-14 | 北京悬丝医疗科技有限公司 | Robotic arm for remote pulse diagnosis and control method therefor |
CN116129112A (zh) * | 2022-12-28 | 2023-05-16 | 深圳市人工智能与机器人研究院 | Oral three-dimensional point cloud segmentation method for a nucleic acid testing robot, and robot |
Also Published As
Publication number | Publication date |
---|---|
CN111916195A (zh) | 2020-11-10 |
AU2021321650A1 (en) | 2023-04-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022027921A1 (zh) | Medical robot device, system and method | |
US20210030275A1 (en) | System and method for remotely adjusting sound acquisition sensor parameters | |
AU2012219076B2 (en) | System and method for performing an automatic and self-guided medical examination | |
CN107752984A (zh) | Big-data-based highly intelligent general-practice medical robot | |
WO2021254444A1 (zh) | Robot and platform for collection, analysis and diagnosis of facial-features and surgical medical data | |
US20210166812A1 (en) | Apparatus and methods for the management of patients in a medical setting | |
CN109044656A (zh) | Medical nursing equipment | |
WO2023024399A1 (zh) | Medical robot device, system and method | |
CN111844078A (zh) | Intelligent nursing robot assisting nurses in clinical work | |
CN108942952A (zh) | Medical robot | |
US20200027568A1 (en) | Physician House Call Portal | |
WO2019100585A1 (zh) | Fundus-camera-based traditional Chinese medicine preventive-health monitoring system and method | |
WO2012111013A1 (en) | System and method for performing an automatic and remote trained personnel guided medical examination | |
WO2023024397A1 (zh) | Medical robot device, system and method | |
Gritsenko et al. | Current state and prospects for the development of digital medicine | |
CN115844346A (zh) | Wireless vital-sign parameter monitoring device applied to disease examination, observation and treatment | |
CN110660487A (zh) | Closed-loop neonatal pain management system and method | |
CN108577884A (zh) | Remote auscultation system and method | |
JP2022000763A (ja) | 自動の及び遠隔の訓練された人によりガイドされる医学検査を行うためのシステム及び方法 | |
EP4371493A1 (en) | Method for ecg reading service providing | |
TW202044268A | Medical robot and medical record information integration system | |
CN115644807A (zh) | Traditional Chinese medicine analysis system based on facial and tongue image collection, and method thereof | |
CN115813358A (zh) | Wireless vital-sign parameter detector applied to health management | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21852262; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2021852262; Country of ref document: EP; Effective date: 20230306 |
| | ENP | Entry into the national phase | Ref document number: 2021321650; Country of ref document: AU; Date of ref document: 20210729; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: JP |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 21852262; Country of ref document: EP; Kind code of ref document: A1 |
| | 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 28.09.2023) |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 21852262; Country of ref document: EP; Kind code of ref document: A1 |