CN114072258A - Medical artificial intelligent robot arrangement for robot doctor - Google Patents
- Publication number
- CN114072258A (application number CN201980098296.6A)
- Authority
- CN
- China
- Prior art keywords
- drobot
- robotic
- unit
- patient
- server
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/54—Control of apparatus or devices for radiation diagnosis
- A61B6/548—Remote control of the apparatus or devices
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4209—Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames
- A61B8/4218—Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames characterised by articulated arms
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/005—Manipulators for mechanical processing tasks
- B25J11/0055—Cutting
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
- B25J11/009—Nursing, e.g. carrying sick persons, pushing wheelchairs, distributing drugs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/004—Artificial life, i.e. computing arrangements simulating life
- G06N3/008—Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B2034/305—Details of wrist mechanisms at distal ends of robotic arms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/37—Master-slave robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/01—Measuring temperature of body parts ; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/021—Measuring pressure in heart or blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/44—Constructional features of apparatus for radiation diagnosis
- A61B6/4405—Constructional features of apparatus for radiation diagnosis the apparatus being movable or portable, e.g. handheld or mounted on a trolley
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/54—Control of apparatus or devices for radiation diagnosis
- A61B6/545—Control of apparatus or devices for radiation diagnosis involving automatic set-up of acquisition parameters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/54—Control of apparatus or devices for radiation diagnosis
- A61B6/547—Control of apparatus or devices for radiation diagnosis involving tracking of position of the device or parts of the device
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/56—Details of data transmission or power supply, e.g. use of slip rings
- A61B6/566—Details of data transmission or power supply, e.g. use of slip rings involving communication between diagnostic systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4245—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4427—Device being portable or laptop-like
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4483—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4483—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer
- A61B8/4494—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer characterised by the arrangement of the transducer elements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/56—Details of data transmission or power supply
- A61B8/565—Details of data transmission or power supply involving data transmission via a network
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/58—Testing, adjusting or calibrating the diagnostic device
- A61B8/585—Automatic set-up of the device
Abstract
A robotic Artificial Intelligence (AI) clinical unit (60) customizes the main medical diagnostic devices into a robotic physician (20) that works as an autonomous laboratory. The fingers of the right hand (21) of the robot (20) are replaced by retractable diagnostic and therapeutic tools (injectors, syringes, gel tubes, ultrasound transducers, cameras and light poles), with an X-ray camera (32) in the middle of the palm; the left hand and two additional optional lower hands perform other supportive treatment, measurement, sensing or semi-robotic tasks. The eyes are replaced by a computer-vision camera and an infrared camera, the ears are used for listening and the mouth for speaking. The compartments are covered with a smart screen (26). The AI robot physician unit (60) is a clinic built locally inside a building, company, train, ship or flying-ambulance cabin (51), staffed by a twin robot (20).
Description
Technical Field
The present invention relates to embodiments of robots, artificial intelligence, manipulators, and UAVs in a compact clinical arrangement for medical services.
Background
Medical diagnosis identifies a disease or condition by explaining a person's symptoms using information collected from the person or from his medical history, together with examinations ranging from simple physical checks to highly complex diagnostic tests, laboratory diagnostics and radiological imaging, performed during one or more visits and sometimes across many clinics, medical centers and hospitals in one or more countries. In many cases, where symptoms are misleading or consistent with many conditions, the various pieces of information collected must be correlated via differential diagnosis to explain and understand the medical case.
The capabilities of any healthcare professional, healthcare scientist, physician, nurse practitioner or technician, and the quality of tools, devices and medical services, vary from country to country, between cities in the same country, between hospitals, and even within the same medical unit. Meanwhile, people from the same residential area, building, company, family, tribe or neighborhood visit different clinics or hospitals.
Thus there are many complex variables, factors and criteria that, at least from the standpoint of Artificial Intelligence (AI), make the overall process decentralized and poorly correlated. As an example, differential diagnosis finds the many candidate diseases or conditions that could cause the observed signs or symptoms and then eliminates unlikely entries until one candidate disease or condition remains; this depends heavily on human competence and manual handling, and is often a tedious and costly process. As a result, a patient able to search for the best service may be trapped among differing data, diagnoses, histories, documentation, judgments, tests, medications, recommendations and missing family histories, while an impoverished or unaware person may be limited to a few options with no opportunity to upgrade the level of service he receives.
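The elimination process described above can be sketched as a simple candidate-filtering loop. The knowledge base and findings below are an illustrative toy example, not data from the patent or a real medical system.

```python
# Sketch of differential diagnosis by elimination: start with all candidate
# conditions, then rule out any candidate whose known profile does not
# contain each observed finding. Toy knowledge base, not medical data.
KNOWLEDGE_BASE = {
    "myocardial infarction": {"chest pain", "sweating", "high troponin"},
    "food poisoning": {"nausea", "abdominal pain", "fever"},
    "infection": {"fever", "fatigue", "high white cell count"},
}

def differential_diagnosis(findings):
    """Return candidate conditions whose profile contains every finding."""
    candidates = set(KNOWLEDGE_BASE)
    for finding in findings:
        # eliminate candidates inconsistent with this finding
        candidates = {c for c in candidates if finding in KNOWLEDGE_BASE[c]}
    return sorted(candidates)

print(differential_diagnosis({"fever"}))                    # several candidates remain
print(differential_diagnosis({"fever", "abdominal pain"}))  # narrowed to one
```

Each new finding only shrinks the candidate set, which mirrors the "eliminate unlikely entries until a candidate remains" process the passage describes.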
Diagnostic robots, in the form of physical robots or software expert systems, were developed in the 1970s, and IBM announced the Watson software system in 2013. Patent KR100759079B1 discloses a medical automatic diagnostic apparatus for examining samples that comprises a robot arm, and DE102013204677A1 discloses a medical endoscopic instrument for connection to a robot. AI systems, on the other hand, have beaten experts in competition; for example, an AI neural network can see patterns in tumors that humans cannot. Efforts have been made to feed AI systems large quantities of data covering the most accurate details of diseases and patient pathology, and U.S. patent application US20110124975A1 discloses a method for medical diagnosis using a PDA software robot.
Medical manipulators have also been introduced in the art, in which a surgeon operating a remote robot performs specific movements that the robot reproduces during a surgical procedure. In computer-controlled systems, the surgeon uses a computer to control the robot arm and its end effector; although such systems may still use telemanipulators for input, they allow the surgeon to perform surgery from anywhere in the world while controlling a particular robot in the operating room. A medical robotic system for surgery is disclosed in PCT application WO/2016/142050A8, and patent application CN201711497223A discloses a western-medicine robotic doctor that obtains direct images of the patient's face and body and listens to his voice.
None of the prior art discloses a robot that uses its fingers and hands as a means for medical diagnosis, or one that relies on a UAV or AI arrangement, which is the object of the present invention. As machines have done in the past, robotics supported by AI and Internet-of-Things development may in the near future replace even highly skilled talent, with medical research, work and services driven by robotic AI medical engineering techniques.
Disclosure of Invention
The present invention customizes the main medical diagnostic devices into a robot that works as part of a compact medical arrangement supported by local and international data servers, AI and UAVs.
The fingers of the robot's right hand are replaced by retractable tools: injectors, syringes, gel tubes, ultrasound transducers, and poles carrying cameras and lamps, while an X-ray camera is mounted in the middle of the palm. The left hand serves as a holding hand, grasping, carrying, pushing, pressing and sensing like a human hand. Two additional hands may be mounted below the robot's waist for further optional manipulator tasks.
The eyes of the robot are replaced by cameras, and the face resembles a human face, listening with the ears and speaking with the mouth. The chest and abdomen are replaced by four compartments for storing blood samples and ready-to-use test tubes, emergency injections, lozenges and liquid medications. The compartment doors open autonomously, and the four doors are covered with one smart tablet screen used for presentation.
Under the chest, a recess holds a screen that is placed on the opposite side of a person's body when an X-ray image of one of his organs is taken.
This robot doctor can be located in any facility (medical center, building, factory, airplane, etc.), and his clinic is called an AI robot doctor unit. The Drobot can make a visit at any time, but for safety he rides in a sterile, closed, motorized cabin.
For emergencies, in another embodiment, the Drobot is a twin robot fixed on rails inside a flying VTOL ambulance cabin, equipped with surgical hands and optionally computer controlled. On the sides of the flying ambulance, two external sub-cabins carry supporting robot staff, who leave their cabins to search for, pick up and rescue an injured person from an accident, lay him on a stretcher bed, and push him toward the ambulance cabin.
Drawings
FIG. 1: a 3D view showing the front side of the Drobot and its right hand details.
FIG. 2: a view of the robot right hand extensible finger device is shown.
FIG. 3: a 3D view of the Drobot storage compartment is shown.
FIG. 4: a 3D view of a Drobot right hand finger device configuration is shown.
FIG. 5: multiple 3D side views of a 4-handed Drobot configuration are shown.
FIG. 6: multiple 3D side views showing a Drobot taking X-ray images of a patient.
FIG. 7: showing the foot anatomy image displayed on the Drobot screen.
FIG. 8: a block diagram of a combination of a Drobot20 data input sensor to an intelligent system and output actuator is shown.
FIG. 9: a view of the adjustable handle, the foot/leg grip, and the foldable leg support.
FIG. 10: a view of the sterile motorized wheeled cabin ridden by a Drobot.
FIG. 11: a flow diagram of data between a Drobot, a sub-server, a local server, a city server, a province server, a country server, a regional server, and a world server is shown.
FIG. 12: two types of flying ambulance cabins, shown in a typical flight configuration.
FIG. 13: two types of flying ambulance cabins, shown in a typical landing or takeoff configuration.
FIG. 14: a view of an assistant technician robot is shown.
FIG. 15: a view showing a method of evacuating injured persons from a car in an accident.
FIG. 16: showing the injured person being evacuated to a flying ambulance.
FIG. 17: a Drobot diagnosis/treatment of an injured person is shown.
Detailed Description
To facilitate implementation of the invention, a detailed description of its components is provided here, supported by the drawings. The main components are numbered sequentially according to their importance, so that the text can easily be read against the numbers in the component descriptions and in the list of component numbers. Component numbering starts at 20, and each new component feature introduced in the text is assigned the next serial number; in FIG. 1, for example, the features are numbered sequentially 20, 21, 22, 23. As an exception, the artificial medical clinic (robotic unit) is assigned the number 60, which refers to both the landed and the flying clinic. "Drobot" is used throughout as an abbreviation for the robot doctor and for the devices and tools associated with it, and any combination of unmanned robot arrangements may likewise be referred to as a Drobot.
The main unit of the artificial medical clinic (robotic unit) 60 is the Drobot 20. Its main characteristic parts are the first right hand 21, the compartment group 22, the first left hand 23, the second lower right hand 24, the second lower left hand 25 and the smart display 26 (FIGS. 1, 5, 7).
The conventional fingers of the right hand 21 are replaced by any set of customized compact devices that perform specific diagnostic tasks as well as other emergency operations, such as giving an injection and collecting a blood sample (FIGS. 1, 2, 3). One proposed set of simulated finger devices includes: a gel supply tube 27, an injector 28, an ultrasonic transducer 29, a bar 30 with a lamp and a camera at its end, and a blood sample tube (syringe) 31, with an X-ray camera 32 mounted in the middle of the palm.
The right hand 21 of the Drobot 20 can be customized with more or fewer devices. When not in use, the devices/tools retract into the recess 33 (FIG. 2); in use, one or more devices/tools extend, or bend (fold) as in FIG. 4, so that the finger device or tool in use is in an extended configuration. The gel tube 27, injector 28 and syringe 31 can be replaced autonomously: on instruction from the medical CPU (central processing unit) 34, a command is sent to the DME (digital motor electronics) 35 to move the motorized left hand 23, which removes the used gel tube 27, injector 28 or syringe 31 from the right hand 21, places it in a designated compartment, picks up a new device or tool, and mounts it at the specified location on the right hand 21.
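The autonomous tool-swap sequence (CPU 34 commanding the DME 35 to drive the left hand 23) can be sketched as an ordered plan of motion steps. The step names and arguments below are illustrative assumptions; the patent does not specify a command protocol.

```python
from enum import Enum, auto

# Illustrative sketch (not from the patent) of the tool-swap sequence:
# the medical CPU 34 commands the DME 35, which drives the left hand 23
# through remove -> stow -> pick -> mount steps.
class SwapStep(Enum):
    REMOVE_USED = auto()
    STOW_IN_COMPARTMENT = auto()
    PICK_NEW = auto()
    MOUNT_ON_RIGHT_HAND = auto()

def plan_tool_swap(used_tool, new_tool, compartment):
    """Return the ordered list of (step, detail) motions for the DME."""
    return [
        (SwapStep.REMOVE_USED, used_tool),
        (SwapStep.STOW_IN_COMPARTMENT, compartment),
        (SwapStep.PICK_NEW, new_tool),
        (SwapStep.MOUNT_ON_RIGHT_HAND, new_tool),
    ]

plan = plan_tool_swap("syringe 31", "syringe 31 (fresh)", "compartment 37")
for step, detail in plan:
    print(step.name, "->", detail)
```

The fixed step order encodes the invariant in the text: the used tool is always stowed in its compartment before a new one is picked up and mounted.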
The compartment group 22 is divided and structured according to requirements and the number of compartments needed. The basic set consists of four compartments: a first compartment 36 for emergency injections 28, a second compartment 37 for blood samples 31, a third compartment 38 for emergency pastilles or vial samples, and a fourth compartment 39 for ready-to-use accessories such as gel tubes 27, ultrasound transducers 29, a thermometer 40, wound antiseptic, bandages and adhesive.
The palm, grip, and fingertips of the first left hand 23 carry skin-like sensors for tactile perception, sensing abdominal sounds, heart rate, and patient body temperature. A second lower right hand 24 can optionally be added for additional tasks (fig. 5), such as measuring blood pressure and holding or supporting the patient's body or an organ to assist the other hands in performing their tasks. This hand and the others are provided with all devices and instruments needed to locate a human organ and the coordinates of specific points on it: attitude instruments 41, magnetometers 42, and an accelerometer/gyroscope/GPS set 43. Angular velocity, tilt angle, yaw angle, and pitch angle are measured and processed via algorithms in the CPU 34 to specify a particular point on the body, which is then compared with images of the organ scanned by the Drobot's eye camera 44 to determine where to diagnose. The hands also perform semi-naked-eye visual tests on organs, urine, etc., as well as spreading gel, passing the transducer, drawing a blood sample, and injecting medication.
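As a minimal sketch of the coordinate computation described above, the CPU 34 could project a target point on the patient's body from a hand's position and its measured yaw and pitch angles. The function name, frame convention, and inputs are assumptions for illustration; the patent only states that angles are processed algorithmically and checked against camera imagery:

```python
import math

def target_point(hand_pos, yaw_deg, pitch_deg, distance):
    """Project a point at `distance` meters from `hand_pos` along the
    direction given by yaw/pitch (from attitude instruments 41-43).
    Returns (x, y, z) in the same frame as hand_pos."""
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    dx = distance * math.cos(pitch) * math.cos(yaw)
    dy = distance * math.cos(pitch) * math.sin(yaw)
    dz = distance * math.sin(pitch)
    x, y, z = hand_pos
    return (x + dx, y + dy, z + dz)

# Hand at 1.2 m height, pointing sideways (yaw 90°), level pitch, 0.5 m reach:
p = target_point((0.0, 0.0, 1.2), yaw_deg=90.0, pitch_deg=0.0, distance=0.5)
```

In the described system, the resulting point would then be cross-checked against the organ position detected by the eye camera 44 before any contact.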
The lower left hand 25 is primarily used to wrap adhesive around a bandage placed over a wound, aided by the first left hand 23 and the second right hand 24; many conventional mechanisms may be adopted and accommodated in the lower left hand 25.
One of the Drobot 20 eye cameras 44 is a computer vision camera that processes images of the patient repeatedly with each movement, building up a 3D image of his body according to an algorithm that receives data from the camera sensors.
A stereoscopic screen 45 is stored inside a recessed compartment 46 below the main compartment group 22. If the Drobot 20 finds that it needs to take an X-ray image of the patient's back, it verbally instructs the patient to stand between its hands with his back facing the robot's right hand 21. The Drobot 20 then scans the patient's position; once the patient is positioned correctly, the Drobot 20 pulls out the stereoscopic screen 45 with its first left hand 23, raises it vertically until it faces the patient's chest, and starts imaging the patient's back with the X-ray camera 32 in its right palm (fig. 6).
The display screen 26 is used to present the Drobot 20's instructions, explanations, and illustrations to the patient, whether for educational purposes or to show images the Drobot 20 has taken of the patient. Fig. 7 shows the anatomy of the right foot displayed by the Drobot 20 on its display screen 26.
Likewise, the Drobot uses one computer-vision eye 44 and another infrared eye 47, together with ears 48, nose 49, and mouth 50, to look, listen, smell, and speak continuously, assisting it in vision, speech, and communication. These tools work in concert with all the other instrumentation, sensors, control-unit devices, and organs of the Drobot 20. Some of these have already been addressed in the art via available conventional AI and mechanisms, while the rest are divided into two parts here: (a) a mathematical part, in which algorithms are established to locate the coordinates of each particular site or point on the patient's body and to compare and adjust in real time the position and movement of the involved part of the Drobot 20, making it easy to engage the organ site in the desired manner; and (b) a staged training method of considerable complexity carried out on the Drobot 20, where after each stage a new pattern of robot is introduced: first to assist nurses, second to replace practicing nurses, third to assist diagnosticians, fourth to replace some diagnosticians, fifth to be stationed everywhere inside a surrounding area or large residential building, sixth to visit people in their homes, and seventh to join the flight ambulance cabin 51.
The training of the Drobot 20 and its program updates via trainers are not the only sources of human knowledge data: the Drobot 20 also learns via machine learning algorithm technology (MLAT) through an Intelligent Learning System (ILS) 52, an Intelligent Perception System (IPS) 53, an Experience and Knowledge System (EKS) 54, a Health Assessment and Fail-Safe System (HEFS) 55, a Search Server Suggestion System (SSAS) 56, the activation of many self-education parameters, an Observability and Cognition System (OVCS) 57, an Intuition, Inspiration and Innovation System (IIIS) 58, and a Task Decision-Making System (MDMS) 59, while meters, cameras, etc. collect data on a case-by-case basis. Fig. 8 shows a block diagram of the Drobot 20's data-input sensors, intelligent systems, and output actuators.
With the various parts and tasks of the Drobot 20 set forth, the following arrangements can be established in the seventh version of the Drobot to enable the Drobot 20 to gather symptoms, gather signs, communicate, and make proper decisions, all independently and following diagnostic criteria, wherein one or more methods of performing the task can yield results in the following forms:
the a-Drobot 20 is distributed throughout a community of humans, such as buildings, towers, towns, villages, markets, companies, factories, trains, airplanes, ships.
b-Robotic doctor Unit (AI medical Unit or AI clinic) 60 is built in each of these facilities with twin Drobots, one Drobot sitting in the unit to take up unhealthy visitors, and another Drobot visiting unhealthy people who are not able to reach the unit or who will pay the visit fees.
c-robotic medical unit (AI medical unit) 60 is equipped with an autonomous blood testing laboratory 61 that receives blood samples from the hands of Drobot20, assigns them reference numbers and performs tests autonomously, shares blood test board results with the sub-server 62 of robotic medical unit 60 that studies and performs analysis on the results, discusses the results and, if necessary, takes recommendations of the city local server 63, and passes the results and decisions to Drobot20, which Drobot20 shares the results and decisions with the patient via an application, and assigns him another visit or appointment to discuss the results and decide what to do next according to the patient's schedule.
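The blood-sample workflow in item c (laboratory 61 assigns a reference number and runs tests; sub-server 62 analyzes and may escalate to the city local server 63) can be sketched as below. The function names, the abnormality score, and the escalation threshold are all hypothetical; the patent does not specify how the escalation decision is made:

```python
import itertools

_ref = itertools.count(1)  # laboratory 61 assigns sequential reference numbers

def run_panel(sample):
    """Stand-in for the autonomous analyzers: returns a summary score."""
    return {"abnormality": sample.get("abnormality", 0.0)}

def process_sample(sample, escalate_threshold=0.8):
    """Lab 61 tests the sample; sub-server 62 decides whether to consult
    the city local server 63 before returning a decision to the Drobot 20."""
    ref = next(_ref)
    result = run_panel(sample)
    decision = {"ref": ref, "result": result, "escalated": False}
    if result["abnormality"] >= escalate_threshold:
        decision["escalated"] = True  # sub-server 62 takes city server 63 advice
    return decision

d1 = process_sample({"abnormality": 0.2})  # routine result, handled locally
d2 = process_sample({"abnormality": 0.9})  # abnormal result, escalated
```

The reference number decouples the sample's identity from the patient record during testing, which matches the anonymous-numbering step described in the text.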
d- The robotic doctor unit 60 provides charging, accessories, and all autonomous services to the Drobot 20.
e- The sub-server 62 of the robotic medical unit 60 is the communication hub between the twin Drobots 20 and the local data server 63 of a town, city, etc.
f- If the Drobot 20 encounters a problem with a patient's case or a failure of the Drobot 20 itself, the robotic medical unit 60 searches for advice from a nearby data server, or seeks help from the second Drobot 20 located in the unit 60 or from a nearby AI logistics center.
g- Computer-aided detection (CADe) is performed, so that the unit shares, directly or indirectly, the radiological and pathological processing and interpretation of medical images: X-ray or ultrasound digital images produced by the Drobot 20 are studied and analyzed in the same manner by the AI medical unit sub-server 62, which highlights prominent regions (e.g., possible diseases) and shares them with the Drobot 20 to make decisions and share them with the patient.
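As a toy stand-in for the CADe "prominent region" highlighting in item g, the sketch below flags image coordinates whose intensity exceeds a fixed threshold. Real CADe systems use trained models rather than a fixed cutoff; this merely illustrates the flow of an image in and a list of highlighted regions out:

```python
def highlight_regions(image, threshold=200):
    """Return (row, col) coordinates whose intensity exceeds `threshold`.
    `image` is a 2D list of grayscale values (0-255). Illustrative only:
    a trained detector would replace this thresholding step."""
    return [(r, c)
            for r, row in enumerate(image)
            for c, v in enumerate(row)
            if v >= threshold]

# A 3x3 toy "scan" with two bright spots:
scan = [
    [10, 12, 11],
    [13, 220, 14],
    [12, 11, 230],
]
regions = highlight_regions(scan)  # → [(1, 1), (2, 2)]
```

In the described system, the sub-server 62 would compute such regions and return them to the Drobot 20 for the decision step.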
h- The robotic doctor unit 60 is provided with adjustable handles and foldable supports 64 for securely holding the patient's hands at a suitable adjustable level and height (fig. 9), and with the same tools for the patient's feet/legs (adjustable foot grips and foldable leg supports 65), on which the patient carefully places his hands, arms, feet, and legs. In the same manner, other compact collapsible tools may be arranged inside a compartment for carefully placing the patient's head, body, etc., so that he remains stationary while the Drobot 20 diagnoses, screens veins for blood draw, gives injections, or places gel on a specific body site to perform ultrasound imaging.
i- The visitor Drobot 20 can visit at any time, but for safety it rides in a sterile, enclosed motorized wheeled compartment 66 (fig. 10). For this purpose, the compartment 66 is provided with adjustable handles and a foldable leg support, as already explained.
j- The limited injections, troches, and medications carried by the Drobot 20 are those for serious conditions such as heart attacks and poisoning (e.g., food poisoning or snake bites), where time plays a critical role in saving a person's life.
k- After the visitor Drobot 20 issues a prescription to the patient, the patient may collect the medication from the robotic doctor unit's pharmacy or choose to receive it at home from the pharmacy via a motorized autonomous delivery unit, with the various transactions scheduled via the Drobot 20.
l- An automatic telephone exchange 67 is built into the Drobot 20, so the Drobot 20 can simultaneously hold many communications, anywhere and at any time, with its patients or with the AI clinic 60, even while busy receiving a patient.
m- Multiple options are offered to the patient orally or through text delivered by the Drobot 20: sending a medical report to an employer; sending certified examination results to an institution; sending a medical documentation history to a specific robotic doctor unit 60, national or international; scheduling the next visit; or obtaining approved statistical data about similar cases of his disease in the building, city, country, or world, thus assisting in autonomous screening for a specific disease and learning the best practices (possibly for a charge) shared by other patients from the same building who share their experience.
n- The robotic doctor unit 60 is provided with a caregiver Drobot 68 that assists the Drobot 20 with cleaning.
o- The Drobot 20 itself is able to think and decide; study and analyze; observe and learn; establish humane relationships with humans; prepare schedules; perform studies and surveys; search for valuable data via a server and share it among the robotic doctor community; and send articles and surveys about its local patients for review, publishing and sharing them among Drobots and professionals searching for best practices and innovations.
Thus, the Drobot 20 is expected to carry out all kinds of diagnostics intelligently, such as differential diagnosis, discriminant diagnosis, dual diagnosis, remote diagnosis, point-of-care diagnosis, and computer-aided diagnosis, and with the operating methods above these will bear fruit through the following effects and results:
a- People who are lazy, procrastinating, busy, hesitant, or unwilling to visit a physician (or who perhaps do not quite trust one) will find that the Drobot 20 is what they need, at any time and anywhere: in their apartments, building entrances, offices, companies, etc. (When collecting blood samples it will attend to the cleanliness of the site; otherwise the patient may turn to the robotic physician unit 60 laboratories available almost everywhere nearby.)
b- There is no longer the expense of building new traditional hospital infrastructure, since residential facilities house the robotic medical unit 60, freeing much space in current hospitals to be designated for complex diagnostic and surgical procedures that require physicians and medical staff working alongside robotic equipment. At the same time, all hospital patient data is delivered to the local server 63 and the sub-server 62 of the area and building where the patient lives.
c- People from the same building, village, neighborhood, region, or tribe will ask for help from the same Drobot 20 or a nearby local robotic physician unit 60, which makes it very easy to compile a unique medical-history database that yields a large number of recommendations regarding health, environment, and drug types.
d- The content of point b extends to all cities, provinces, and countries, and even worldwide, where a unique combined medical history of all humans is constructed, resembling a genetic map, until finally a combined World Medical History Map (WMHP) 69 becomes available. This will significantly extend human life expectancy.
e- Inside the AI unit 60 there are none of the different physician levels that exist today, and no separate specialties: the same Drobot 20 can visit a psychiatric patient and indicate and describe the medication; then visit a dermatology patient, take an image of the affected skin, compare it against its world database, and recommend therapy; then visit a person complaining of pain in his side, performing ultrasound and blood tests for him itself; observe the inside of another patient's larynx using a lamp and camera, take an image, compare it, and prescribe and recommend the appropriate medication; and take radiographic X-ray images of a fractured hand as a radiologist would. Many specialties are thus integrated.
f- Likewise, there is no need to travel to another city or country to seek a better diagnosis, since everything is readily available on demand and at the same diagnostic level. The internet of things achieves equitable diagnosis, since open-source diagnostic data from worldwide servers is provided, autonomously and online, to communicate and discuss with each local server 63, sub-server 62, and individual Drobot 20, anywhere and at any time. Of course, the referral of extremely complex medical problems to stationary hospitals will decrease greatly over time.
g- Drobots 20 will not flee during war, earthquake, chemical pollution, or flood; they will remain unaffected and will work wherever electricity and the internet are available. In such cases they can be an important tool for screening and testing microbial contamination spreading among people, providing statistical data about the focus of the contamination's spread.
h- Whenever you call them, anywhere, they answer, ask questions, and share observation data, photos, or recordings of your gait, skin, voice, etc. They receive you immediately, then process and reply directly. You will never find them busy, resting, at lunch, asleep, on vacation, absent, sick, on a business trip, anxious, angry, excited, sad, forgetful, or evasive. They stay awake to attend to calls from their patients, or to any symptoms of severe cases in patients connected to medical devices, anywhere and at any time.
i- If someone suffers a heart attack, stroke, etc., they can reach him within 1 to 5 minutes and provide the required injections or troches, which will greatly reduce the ethical issues that arise from delayed care.
j- The Drobot 20 can inform the patient of the cause, progress, prognosis, other consequences, and possible treatment options for her or his ailments, and provide recommendations for maintaining health.
k- Disputes between insurance companies over physician bills and differences in fee levels will be significantly minimized.
l- From an economic perspective, mass production, short incubation time, and large-scale exploitation of their advantages over many medical professionals will make Drobots cheaper than the money and time spent educating, training, and highly compensating a physician. The Drobot 20's compensation is battery charging and periodic maintenance, and Drobots can even perform robot-to-robot training. This does not mean the profession will disappear; rather, robotics and AI medical engineering professions will emerge to develop, manufacture, sell, and install robotic doctor units, or to provide extra memory and additional skills to an autonomous doctor.
m- A rented or purchased Drobot 20 may be provided to a camp, fleet, or private aircraft, either as a permanent resident or on demand, intermittently or periodically, for visits.
n- Each Drobot 20 may speak a number of languages, with variable pitch and tone, male or female voices, and even a boy's or child's voice, whichever is appropriate.
o- They remember all the data about your case; they need not read or look it up, and they already hold the full medical details of your family, neighbors, etc., retrieving the information from memory in a second. Of course, 5G adds value to their work.
Fig. 11 is a flowchart showing data flows among the Drobot 20, the sub-server 62, the local server 63, the city server 70, the province server 71, the country server 72, the regional server 73, and the world server 74.
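The escalation chain of Fig. 11 can be sketched as a simple walk up the server tiers: each tier forwards a query it cannot answer to its parent. The tier names follow the reference numerals; the routing logic itself is an illustrative assumption, since the patent only shows the data-flow hierarchy:

```python
# Server tiers in ascending order, per the reference numerals of Fig. 11.
HIERARCHY = ["sub_server_62", "local_server_63", "city_server_70",
             "province_server_71", "country_server_72",
             "regional_server_73", "world_server_74"]

def route_query(known_at, start="sub_server_62"):
    """Walk up from `start` until a tier holding the answer is reached.
    `known_at` is the set of tiers that can answer the query."""
    path = []
    for tier in HIERARCHY[HIERARCHY.index(start):]:
        path.append(tier)
        if tier in known_at:
            return path
    return path  # reached the world server without an answer

# A query answerable only at city level climbs two tiers:
path = route_query(known_at={"city_server_70"})
```

This matches the pattern described in items e and f above, where the sub-server 62 consults the local server 63 and higher tiers only as needed.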
Note: It is apparent to the inventor that the robotic physician unit 60, and even the cabin 66, may be equipped with other custom-made, replaceable, and easily removable/installable robotic hands, manipulators, or portable tools of any shape, depending on the available space, to be picked up and used by the Drobot 20 or any other manipulator for additional diagnosis and symptom assessment, using items typically including, but not limited to:
an eye-compact diagnostic device.
Pulse and vital signs monitor.
-a doppler instrument.
Ergometers, gauges, finger-pinch meters, sensory evaluations, patient scales, pulse oximeters, retinoscopes, blood pressure monitoring, convertible scopes, stethoscopes, thermistors and temperature detectors, Welch Allen, telemetry bags, skin surface microscopes, montemp, LED lights.
Symptom checker, compact 2-modular diagnostic station.
Anatomical maps and models.
The Drobot 20 is not limited to a residential, sub-local, or robotic medical unit 60; its work extends to the ambulance. Here, in an embodiment, twin Drobots 20 are positioned inside a flight ambulance 51, as shown in figs. 16 and 17, where two opposing Drobots 20 are mounted on slidable tracks 75 around the area where people evacuated from accidents are received on a bed 76 inside the sterile compartment of the cabin 77.
The flight cabin is a VTOL cabin 51 with four arms 78, each ending in a swingable jet propulsion engine 79 (vertical takeoff, horizontal cruise), or with two swingable front canard wings 80 and two rear wings 81, each ending in a jet propulsion engine 79 (fig. 12). When the remotely controlled flying cabin 51 (ambulance) lands near an accident area (fig. 13), its side pods 82 swing into a vertical configuration to open, allowing the two sheet-metal mechanic robots 83 to launch small UAVs 84 overhead to survey the accident area around the humans (fig. 15). The mechanic robots 83 pull out the rescue bed 76 and stretcher 85 and, using their preset data, programs, algorithms, and the images received from the two small drones 84, use their cutting tools 86 and shearing tools 87 to remove damaged pillars, doors, or roof panels around the trapped person (figs. 14, 15), then rescue him, lay him flat on the stretcher 85, place him on the rescue bed 76, push the bed 76 into the sterilization cabin 77, and return to their side pods 82.
The flight ambulance 51 cabin autonomously closes its doors 88 and takes off while the twin Drobots 20 collect the data picked up by the small assessment drones (quadcopters) 84 and from the injured person (if he can speak or point somewhere with his hand), in addition to performing a fast scan of his whole body. Then, via a computer-controlled system, a remote surgeon uses the computer to control the Drobot 20's robotic arm 89, its end effector 90, and any other remote manipulators 91. According to one scenario, the Drobot 20's hands may be autonomously replaced with any suitable familiar configuration of spare autonomous smart instruments 92 stored inside the flight cabin 51, to perform actions such as controlled rib stretching to mitigate or eliminate the tissue trauma traditionally associated with open surgery.
Note: It is apparent to the inventor that the side pods 82 may, in another embodiment, be used for launching aircraft parachutes, search-and-rescue drones (UAVs or helicopters), etc.
Finally, the disclosed method of operation of the Drobot 20 can form the basis for other professional extensions and implementations. Its loadable huge database, intelligent machine learning, and engageable intelligent autonomous replaceable instruments enable it to run meetings, give lectures, answer any kind of question within its professional scope, teach/educate/train, investigate, announce, sell, market, diagnose, inspect, maintain, service, advise, manage, and adjudicate and judge honestly and efficiently. It can achieve reconciliation between humans.
INDUSTRIAL APPLICABILITY
1-hardware: drobot and land or flying robot physician units are manufactured, constructed and customized based on reassembling, revising, recreating, recompressing autonomous and non-autonomous, robotic and non-robotic, computer controlled or non-computer controlled devices, instruments and tools into new configurations.
2-software: artificial intelligence techniques and algorithmic sciences are available that will be reconstructed to build software programs for the Drobot and robotic medical units.
Other supportive accessories and technologies are available, such as UAVs and quadcopters, which may be complementary tools to support future needs for intelligent Drobot.
4-efficiency and yield: similar to automotive mass production, each Drobot unit can be produced in minutes and put into operation quickly, which can be an all-in-one physician with multiple professional and technical diagnostic capabilities, compared to humans requiring years of education, teaching, training, experience.
5-cost: its price is its cost, its compensation is battery charging, its functional authority is 1 square meter space and periodic maintenance, its subsidy is a newly customized device, its reward is program update, and its retirement is recovery.
6-artificial soul induction: in the near future, it will be able to manually perform telepathic communication, sensation, sensing and response with identified humans in its database or its geographic circle.
Human survival (longevity) reduces infection rate. Using artificial intelligence physicians (Drobot) in various places, stem cells will restore the human's organs like young people, and using extracted plasma, the human regains young energy, which can continuously survive for hundreds of years with future technologies and retire after a hundred years.
Part number index:
20 Drobot. 47 infrared eye.
21 first right hand. 48 ears.
22 group of compartments. 49 nose.
23 first left hand. 50 mouth.
24 second lower right hand. 51 flight ambulance cabin.
25 second lower left hand. 52 Intelligent Learning System (ILS)
26 smart display screen. 53 Intelligent Perception System (IPS)
27 gel supply tube. 54 experience/knowledge system (EKS)
28 injection. 55 health assessment/fail-safe system (HEFS)
29 ultrasonic transducer. 56 search Server suggestion System (SSAS)
30 bar with light/camera at the end. 57 observability/vision/cognition system (OVCS)
31 blood sample tube (syringe). 58 intuition/inspiration/innovation system (IIIS)
32X-ray camera. 59 task/decision making System (MDMS)
33 hand recess. 60 robotic doctor unit (AI medical unit).
34 CPU (central processing unit). 61 autonomous blood testing laboratory.
35 DME (digital motor electronics). 62 AI clinic sub-server.
36 first compartment (injection). 63 city local server.
37 second compartment (blood samples). 64 adjustable handle.
38 third compartment (lozenges). 65 adjustable foot/leg grip.
39 fourth compartment (accessories). 66 sterilized motorized compartment.
40 thermometer. 67 automatic telephone exchange.
41 attitude instrument. 68 caregiver Drobot.
42 magnetometer. 69 world medical history map (WMHP).
43 accelerometer/gyroscope/GPS set. 70 city server.
44 eye camera (computer vision). 71 province server.
45 stereoscopic screen. 72 national server.
46 recessed compartment. 73 regional server.
74 world server.
75 slidable tracks.
76 bed.
77 sterilization chamber.
78 swingable arm.
79 jet propulsion engine.
80 front swingable canard wings.
81 two rear wings.
82 side capsule.
83 mechanic robot.
84 UAV (quadcopter).
85 stretcher.
86 cutting tool.
87 a shearing tool.
88 doors.
89 computer controlled robotic arm.
90 end effector.
91 remote-controlled manipulator.
92 spare autonomous smart appliances.
Patent citations:
Claims (32)
1. An artificial intelligence, robot (20), and UAV (51) based land or flight robotic physician clinical unit (60), comprising:
a customized robot doctor (Drobot) (20);
a customized first right hand (21);
a set of compartments (22);
a customized first left hand (23);
a customized second lower right hand (24);
a customized second lower left hand (25);
a smart display screen (26);
a custom simulated gel delivery finger (27);
a custom simulated injectate (28);
a customized simulated ultrasound transducer finger (29);
a customized simulated end with a light/camera bar (30);
a custom-made simulated blood sample tube (syringe) (31);
a customized X-ray camera (32);
a hand recess (33);
an accelerometer, gyroscope, GPS group (43);
a computer vision camera (44);
a stereoscopic screen (45);
an infrared camera (47);
a flight ambulance cabin (51);
an Intelligent Learning System (ILS) (52);
an Intelligent Perception System (IPS) (53);
an experience/knowledge system (EKS) (54);
a health assessment/fail-safe system (HEFS) (55);
a search server suggestion system (SSAS) (56);
an observability/visual/cognitive system (OVCS) (57);
intuitive/inspiration/innovation system (IIIS) (58);
a task/decision making system (MDMS) (59);
an AI clinic sub-server (62);
an autonomous blood testing laboratory (61);
an adjustable handle (64);
a land sterilized motorized compartment (66);
an automatic telephone exchange (67);
a slidable track (75);
a flight sterilization chamber (77);
a lateral pod (82);
a sheet metal technician robot (83);
a remote control manipulator (91).
2. The robotic physician unit (60) of claim 1, wherein the right hand (21) of the Drobot (20) is customizable with more or fewer medical devices, the devices/tools being replaceable, conventionally and autonomously retractable and storable inside a recess (33), and deployable in use according to medical central processing unit (CPU) (34) commands sent to the digital motor electronics (DME) (35) to move a motorized left hand (23) to remove a used gel tube (27), injection device (28), or syringe (31) from the right hand (21), place it in a designated compartment (22), pick up a new or different customized device, and install it in a specific location on the right hand (21), or, in another embodiment, replace the entire hand with a different hand having different devices or with a different robotic arm, to perform the desired diagnosis or treatment.
3. The robotic medical unit (60) according to claim 1, wherein the set of compartments (22) is divided and structured according to the requirements and number of desired compartments, such that a basic set is made of four compartments, a first compartment (36) for emergency injections (28), a second compartment (37) for blood samples (31), a third compartment (38) for emergency pastilles or vial samples, and a fourth compartment (39) for spare material accessories.
4. The robotic medical unit (60) of claim 1, wherein the palm, grip, and fingertips of the first left hand (23) carry skin-like sensors for tactile perception, sensing abdominal sounds, heart rate, and patient temperature according to conventional techniques.
5. The robotic medical unit (60) of claim 1, wherein the second lower right hand (24) of the Drobot (20) is optionally added for additional tasks, such as measuring blood pressure and fixing or supporting a patient's body or a human organ, to assist other hands in performing their tasks on the patient's body.
6. The robotic medical unit (60) according to claim 1, wherein the four hands of the Drobot (20) are provided with all necessary devices and instrumentation, such as an attitude instrument (41), a magnetometer (42), and an accelerometer/gyroscope/GPS set (43), for measuring angular velocity, tilt angle, yaw angle, and pitch angle to specify a particular point on the human body via algorithmic processing inside the CPU (34), the particular points then being compared with images of such organs scanned via the Drobot eye cameras (44, 47) to assist in locating the position of the human organ and the coordinates of particular points thereon, knowing where the diagnosis should be made, spreading the gel, passing the transducer (29), drawing a blood sample, and injecting a drug.
7. The robotic medical unit (60) of claim 1, wherein the lower robotic left hand (25) wraps adhesive around a bandage disposed over a wound with the assistance of the first left hand (23) and second right hand (24) using an adapted available conventional mechanism.
8. The robotic medical unit (60) of claim 1, wherein the first eye camera (44) of the Drobot (20) is a computer vision camera and the second, infrared camera (eye) (47) repeatedly processes 3D images of the patient with each movement to create a 3D image of the patient's body, while the ears (48), nose (49), and mouth (50) are used to continuously look, listen, smell, and speak based on conventional mechanisms, all working in concert to assist the Drobot in vision, speech, and communication.
9. The robotic physician unit (60) of claim 1, wherein the stereoscopic screen (45) is stored inside a recessed compartment (46) below the main compartment group (22) of the Drobot (20) such that if the Drobot (20) finds that it needs to take an X-ray image, it pulls out the stereoscopic screen (45) with its first left hand (23), raises the stereoscopic screen vertically up to face the patient's chest, and starts imaging of the patient's back with his right hand palm X-ray camera (32).
10. The robotic medical unit (60) of claim 1, wherein the smart display screen (26) is mounted over the set of compartments (22) and is used to demonstrate and explain instructions, interpretations and descriptions of a Drobot (20) for the patient.
11. The robotic physician unit (60) of claim 1, wherein the self-education parameters of the Drobot (20), including a machine learning algorithm technology (MLAT) dependent Intelligent Learning System (ILS) (52), an Intelligent Perception System (IPS) (53), an Experience and Knowledge System (EKS) (54), a health assessment and fail-safe system (HEFS) (55), a search server suggestion system (SSAS) (56), an observability/vision/cognition system (OVCS) (57), an intuition, inspiration, and innovation system (IIIS) (58), and a task decision-making system (MDMS) (59), are installed and set so that the Drobot thinks and acts intelligently and independently, beyond human training and updated program instructions.
12. The robotic physician unit (60) of claims 1 to 11, wherein the Drobot (20) components and systems are employed by the Drobot (20) to independently perform multiple types of diagnosis targeting a primary diagnosis by collecting symptoms, gathering signs, communicating, and making proper decisions.
13. The robotic medical unit (60) of claim 1, wherein the autonomous blood testing laboratory (61) receives the blood sample from the Drobot (20) hand, assigns a reference number to the blood sample and performs a test autonomously, shares a blood test board result with the sub-server (62) of the robotic medical unit (60), studies and performs an analysis on the result, discusses the result and, if necessary, adopts a recommendation of a city local server (63), and communicates the result and decision to the Drobot (20), which shares the result and decision with the patient via an application, and assigns another visit or appointment to the patient according to the patient's schedule to discuss the result and decide what to do next.
14. The robotic medical unit (60) of claim 1, wherein the Drobot (20) receives charging, accessories, and all autonomous services from the terrestrial robotic medical unit (60).
15. The robotic medical unit (60) of claim 1, wherein the Drobot (20) is connected to a sub-server (62) in the robotic medical unit (60), the sub-server being the communication hub between the Drobot (20) and the local data server (63), and the local data server being connected to a city server (70), a province server (71), a country server (72), a regional server (73), world server open-source data (74), and a world medical history map (WMHP) (69), such that when the Drobot (20) encounters a problem associated with a patient's case or a failure of the Drobot (20) itself, the robotic medical unit (60) searches for suggestions from the associated data servers, or seeks assistance from a second Drobot (20) located in the unit (60) or from a nearby AI logistics center.
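The server hierarchy of claim 15 amounts to an escalation chain: a query is tried at each level until some server can answer, with a second Drobot or an AI logistics center as the final fallback. The sketch below assumes a simple dictionary lookup per server; the names mirror the claim, but the lookup logic is purely illustrative.

```python
# Claim-15 sketch: escalate a query up the server hierarchy until a
# suggestion is found. Names and logic are illustrative assumptions.
SERVER_CHAIN = ["sub_server", "local_data_server", "city_server",
                "province_server", "country_server", "regional_server",
                "world_open_source_data"]

def find_suggestion(case_id, knowledge_by_server):
    """Walk the hierarchy upward; return the first server with an answer."""
    for server in SERVER_CHAIN:
        suggestion = knowledge_by_server.get(server, {}).get(case_id)
        if suggestion is not None:
            return server, suggestion
    # Nothing found at any level: fall back to a second Drobot
    # or a nearby AI logistics center.
    return None, None

knowledge = {"country_server": {"rare_rash_017": "refer to dermatology"}}
server, hint = find_suggestion("rare_rash_017", knowledge)
```

Here a rare case unknown to the local servers is resolved only at the country level, matching the claim's idea that harder problems travel further up the chain.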
16. The robotic physician unit (60) of claim 1, wherein the Drobot (20) performs computer-aided detection (CADe), using its built-in systems, microprocessors, and telecommunication means to directly or indirectly share radiological and pathological image processing and the interpretation of medical images, i.e. X-ray or digital ultrasound images produced by the Drobot (20) itself, which are likewise studied and analyzed by the AI medical unit sub-server (62), highlighting salient regions such as possible diseases, and shared with the Drobot (20) to take decisions and share them with the patient, who is transferred to a hospital via an autonomous appointment system when needed.
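The "highlighting salient regions" step of claim 16 can be pictured as thresholding and ranking per-region disease scores. The patent specifies no model, so the scores below are hard-coded stand-ins; the region names and threshold are assumptions for illustration only.

```python
# Claim-16 sketch of the CADe salient-region step: per-region disease scores
# (in practice produced by an image-analysis model on the sub-server (62))
# are thresholded and returned highest-scoring first.
def highlight_salient(region_scores, threshold=0.8):
    """Return regions scoring at or above the threshold, highest first."""
    return sorted((name for name, score in region_scores.items()
                   if score >= threshold),
                  key=lambda name: -region_scores[name])

salient = highlight_salient({"upper_left_lobe": 0.93,
                             "heart_shadow": 0.15,
                             "lower_right_lobe": 0.81})
```

The returned list corresponds to the regions the sub-server would flag back to the Drobot (20) as possible disease sites.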
17. The robotic physician unit (60) of claim 1, wherein the Drobot (20) is provided with an adjustable handle and foldable support (64) for securely holding the patient's hand at a suitable adjustable level and height, and with similar adjustable foot/leg grips and foldable leg supports (65), to assist in diagnosis, locating veins for blood draws, administering injections, and placing gel on a specific body part to perform ultrasound imaging.
18. The robotic medical unit (60) of claim 1, wherein the sterile, sealed, motorized wheel well (66) of the visiting Drobot (20) serves as an indoor transport member equipped with accessories.
19. The robotic physician unit (60) of claim 1, wherein an automatic telephone switch (67) is built inside the Drobot (20) to handle many instantaneous communications simultaneously, from anywhere at any time, whether or not the Drobot (20) is caring for a patient.
20. The robotic medical unit (60) of claim 1, wherein the Drobot (20) issues and sends electronic medical reports to employers, electronic certified examination results to institutions, or medical documentation histories to specific robotic medical units (60) nationally or internationally, additionally scheduling next visits and assisting in the autonomous screening of, and statistics on, specific diseases.
21. The robotic physician unit (60) of claims 1, 15, and 16, wherein the Drobot (20) thinks and decides, studies and analyzes, observes and learns through its built-in machine learning and other intelligent systems; establishes humane relationships with humans; prepares schedules; performs studies and surveys; searches valuable data via a server; and shares within a robotic-physician community, where Drobots send articles and surveys about their local patients for review, and publish and share them among themselves and with other institutions and medical units whose professionals are searching for best practices and innovations.
22. The robotic medical unit (60) of claim 1, wherein the Drobot (20) performs a plurality of physician specialties and procedures in sequence: visiting a psychiatric patient and instructing and describing medications; then visiting a dermatological patient, imaging the patient's affected skin, comparing the image against its worldwide database, and recommending therapy; then visiting a person who complains of pain in the abdominal side, performing ultrasound and blood tests on its own; viewing the inside of another patient's larynx using lights and cameras, imaging, comparing, and prescribing and recommending appropriate medications; and taking radiographic X-ray images on its own, acting as a radiologist's hands during breaks; and so on.
23. The robotic medical unit (60) of claim 1, wherein the Drobot (20) remains unaffected during war, earthquake, chemical contamination, or flood, initiating the screening and testing of microbial contamination spreading among people, provided electricity and Internet are available, in order to provide statistical data about the focus of the microbial spread.
24. The robotic medical unit (60) of claim 1, wherein the Drobot (20) receives and answers phone calls anywhere at any time, 24 hours a day, 7 days a week, instantly sharing observation data, photographs, or recordings of the patient's gait, skin, or voice.
25. The robotic physician unit (60) of claim 1, wherein the Drobot (20) is programmed in multiple languages and listens and speaks in variable tones and pitches, with a male, female, or even a child's voice, whichever is appropriate, qualifying it for rental or sale to any kind of facility or dedicated entity.
26. The robotic medical unit (60) of claim 1, wherein the Drobot (20) or the robotic medical unit (60) can be provided with other custom-made, replaceable, and easily removable/installable robotic hands, manipulators, or portable tools of any shape, depending on available space, to perform additional diagnosis and symptom assessment using items typically including, but not limited to: compact eye diagnostic devices, pulse and vital-sign monitors, Doppler instruments, force meters, gauges, finger grips, sensory assessments, patient scales, pulse oximeters, retinoscopes, blood pressure monitors, convertible viewing instruments, stethoscopes, thermistors and temperature detectors, Welch Allyn instruments, telemetry bladders, skin surface microscopes, LED lights, symptom detectors, compact two-module diagnostic sites, and anatomical maps and models, enabling the main robotic biomedical unit (60) to serve as a compact medical diagnostic site.
27. The flying robotic physician unit (60) of claim 1, wherein twin Drobots (20) positioned inside a flying ambulance (51) are mounted opposite one another on slidable tracks (75), around the area of a bed (76) that receives evacuated persons inside the sterile compartment of the cabin (77).
28. The flying robotic physician unit (60) of claim 1, wherein the flying ambulance (51) is a VTOL craft, in one embodiment with four arms (78) carrying swingable jet propulsion engines (79) at their ends (vertical takeoff, horizontal cruise), or with two swingable front canard wings (80) and two rear wings (81), each carrying a jet propulsion engine (79) at its end (vertical takeoff, horizontal cruise).
29. The flying robotic physician unit (60) of claim 1, wherein the flying ambulance (51) lands near an accident area and swings its side compartments (82) open into a vertical configuration, releasing two metal-cutting technician robots (83) provided with small UAVs (84) to assess the accident area, particularly around trapped humans.
30. The flying robotic physician unit (60) of claims 1 and 29, wherein the technician robots (83) pull out a rescue bed (76) and a stretcher (85) and, using their preset data, programs, algorithms, and the images received from the two small drones (84), use their cutting tools (86, 87) to remove damaged pillars, doors, or ceilings from around a trapped person, then rescue the person, lay him on the stretcher (85), place him on the ambulance bed (76), push the bed (76) inside the sterile chamber (77), and return to their side compartments (82).
31. The flying robotic physician unit (60) of claim 1, wherein the sterile chamber (77) autonomously closes its hatch door (88) and the ambulance takes off while the twin Drobots (20) collect the data picked up by the small assessment drones (84) and from the injured person (if he can speak or point somewhere with his hand); in addition, a full-body fast scan of the injured person is made via their cameras (44, 47); then, via a computer-controlled system, a remote surgeon controls the slidable Drobot (20) robotic arm (89), its end effector (90), and any other telemanipulator (91) to perform actions such as controlled stretching of the ribs to reduce or eliminate the tissue trauma traditionally associated with open surgery.
32. A method of analogous operation, wherein the disclosed Drobot (20) method of operation can be extended and implemented for other professions, similarly loaded with a very large database built from professional knowledge, experience, training, and decision making, with the Machine Learning Algorithm Technology (MLAT)-dependent Intelligent Learning System (ILS) (52), Intelligent Perception System (IPS) (53), Experience and Knowledge System (EKS) (54), health assessment and fail-safe system (HEFS) (55), search server suggestion system (SSAS) (56), observability/vision/cognition system (OVCS) (57), intuition, inspiration and innovation system (IIIS) (58), and task decision making system (MDMS) (59), and using its engageable, similarly intelligent, replaceable instrumentation for teaching/education/training, searching, self-contained learning, decision making, surveying, announcement, sale, marketing, diagnosis, examination, maintenance, service, advice, management, and truthful and efficient adjudication and judgment, among others.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2019/000545 WO2019175675A2 (en) | 2019-07-01 | 2019-07-01 | Dr robot medical artificial intelligence robotic arrangement |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114072258A true CN114072258A (en) | 2022-02-18 |
Family
ID=67908716
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201980098296.6A Pending CN114072258A (en) | 2019-07-01 | 2019-07-01 | Medical artificial intelligent robot arrangement for robot doctor |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN114072258A (en) |
WO (1) | WO2019175675A2 (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019175675A2 (en) * | 2019-07-01 | 2019-09-19 | Wasfi Alshdaifat | Dr robot medical artificial intelligence robotic arrangement |
CN111916195A (en) * | 2020-08-05 | 2020-11-10 | 谈斯聪 | Medical robot device, system and method |
CN111923056A (en) * | 2020-06-17 | 2020-11-13 | 厦门波耐模型设计有限责任公司 | Architecture, method and system of unmanned intelligent hospital |
DE102020207778A1 (en) | 2020-06-23 | 2021-12-23 | Volkswagen Aktiengesellschaft | Autonomous vehicle with examination device and medical device |
DE102021209655A1 (en) * | 2020-09-03 | 2022-03-03 | Viralint Pte Ltd | Adaptable multifunctional robotic hands |
DE102020213038A1 (en) * | 2020-10-15 | 2022-04-21 | Kuka Deutschland Gmbh | Method of conducting health testing and mobile health testing facility |
CN113855250A (en) * | 2021-08-27 | 2021-12-31 | 谈斯聪 | Medical robot device, system and method |
CN117825731A (en) * | 2024-03-06 | 2024-04-05 | 内蒙古唯真科技有限公司 | Blood analysis device |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5673367A (en) * | 1992-10-01 | 1997-09-30 | Buckley; Theresa M. | Method for neural network control of motion using real-time environmental feedback |
KR101101274B1 (en) * | 2011-07-01 | 2012-01-04 | 전남대학교산학협력단 | Small-sized manipulator for single port surgery |
CN102665589A (en) * | 2009-11-13 | 2012-09-12 | 直观外科手术操作公司 | Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument |
US20130345718A1 (en) * | 2007-02-16 | 2013-12-26 | Excelsius Surgical, L.L.C. | Surgical robot platform |
CN105056351A (en) * | 2015-07-31 | 2015-11-18 | 京东方科技集团股份有限公司 | Automatic needle inserting device |
CN107752984A (en) * | 2017-11-15 | 2018-03-06 | 李玉东 | A kind of high intelligent general medical practice operation robot based on big data |
CN207415376U (en) * | 2017-10-20 | 2018-05-29 | 深圳市前海安测信息技术有限公司 | Multi-functional health care robot |
CN108175510A (en) * | 2018-01-19 | 2018-06-19 | 上海联影医疗科技有限公司 | Medical robot and medical system |
CN108942952A (en) * | 2018-04-23 | 2018-12-07 | 杨水祥 | A kind of medical robot |
CN109466402A (en) * | 2018-10-13 | 2019-03-15 | 广东嗨学云教育科技有限公司 | A kind of earthquake rescue robot with safeguard function |
CN208822821U (en) * | 2018-04-12 | 2019-05-07 | 叶舟 | Long-distance ultrasonic diagnosis system |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10258425B2 (en) * | 2008-06-27 | 2019-04-16 | Intuitive Surgical Operations, Inc. | Medical robotic system providing an auxiliary view of articulatable instruments extending out of a distal end of an entry guide |
KR101550841B1 (en) * | 2008-12-22 | 2015-09-09 | 삼성전자 주식회사 | Robot hand and humanoid robot having the same |
US8380652B1 (en) * | 2011-05-06 | 2013-02-19 | Google Inc. | Methods and systems for autonomous robotic decision making |
KR102306959B1 (en) * | 2013-09-04 | 2021-10-01 | 삼성전자주식회사 | Surgical robot and control method thereof |
US10368850B2 (en) * | 2014-06-18 | 2019-08-06 | Siemens Medical Solutions Usa, Inc. | System and method for real-time ultrasound guided prostate needle biopsies using a compliant robotic arm |
WO2019175675A2 (en) * | 2019-07-01 | 2019-09-19 | Wasfi Alshdaifat | Dr robot medical artificial intelligence robotic arrangement |
2019
- 2019-07-01 WO PCT/IB2019/000545 patent/WO2019175675A2/en active Application Filing
- 2019-07-01 CN CN201980098296.6A patent/CN114072258A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2019175675A3 (en) | 2020-07-09 |
WO2019175675A2 (en) | 2019-09-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114072258A (en) | Medical artificial intelligent robot arrangement for robot doctor | |
Tavakoli et al. | Robotics, smart wearable technologies, and autonomous intelligent systems for healthcare during the COVID‐19 pandemic: An analysis of the state of the art and future vision | |
US20230138192A1 (en) | Medical service robot device, and method and system thereof | |
US20070122783A1 (en) | On-line healthcare consultation services system and method of using same | |
US20080082363A1 (en) | On-line healthcare consultation services system and method of using same | |
Deegan et al. | Mobile manipulators for assisted living in residential settings | |
Karabegović et al. | The role of service robots and robotic systems in the treatment of patients in medical institutions | |
WO2020023865A1 (en) | Apparatus and method for providing improved health care | |
Bravi et al. | An Inertial Measurement Unit-Based Wireless System for Shoulder Motion Assessment in Patients with Cervical Spinal Cord Injury: A Validation Pilot Study in a Clinical Setting | |
Thinh et al. | Telemedicine mobile robot-robots to assist in remote medical | |
CN111923056A (en) | Architecture, method and system of unmanned intelligent hospital | |
Vishnevskaya et al. | Study the possibility of creating self-diagnosis and first aid system | |
Kumar et al. | AI-based robotics in E-healthcare applications | |
Copeland et al. | Changes in sensorimotor cortical activation in children using prostheses and prosthetic simulators | |
RU62265U1 (en) | TELEMEDICAL SYSTEM | |
Karagözoglu | Biomedical engineering: Education, research and challenges | |
Bhat et al. | Case Studies on Implementation of Smart Health Care across Global Smart Cities | |
Potvin et al. | Report of an IEEE Task Force-An IEEE Opinion on Research Needs for Biomedical Engineerng Systems | |
Nawrat | A diagnostic robot–what is it? | |
RU121138U1 (en) | STUDENT ORGANISM SURVEY SYSTEM | |
Monthe et al. | MV-SYDIME: A virtual patient for medical diagnosis apprenticeship | |
Preston et al. | Computing in medicine | |
Guo et al. | Hospital Automation Robotics | |
DE102016219157A1 (en) | Assistance device for patients and method for operating an assistance device | |
Marcos Pablos et al. | Impact of state-of-the-art technologies on medical training processes and clinical practice |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||