AU2021321650A1 - Medical robotic device, system, and method - Google Patents

Medical robotic device, system, and method

Info

Publication number
AU2021321650A1
Authority
AU
Australia
Prior art keywords
module
medical
robot
collection
speech
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
AU2021321650A
Inventor
Sicong TAN
Hao Yu
Mengfei YU
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of AU2021321650A1

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B10/00 Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • A61B10/0045 Devices for taking samples of body liquids
    • A61B10/0051 Devices for taking samples of body liquids for taking saliva or sputum samples
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B10/00 Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • A61B10/0045 Devices for taking samples of body liquids
    • A61B10/007 Devices for taking samples of body liquids for taking urine samples
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0082 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
    • A61B5/0088 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for oral or dental tissue
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/15 Devices for taking samples of blood
    • A61B5/151 Devices specially adapted for taking samples of capillary blood, e.g. by lancets, needles or blades
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/15 Devices for taking samples of blood
    • A61B5/151 Devices specially adapted for taking samples of capillary blood, e.g. by lancets, needles or blades
    • A61B5/15101 Details
    • A61B5/15103 Piercing procedure
    • A61B5/15107 Piercing being assisted by a triggering mechanism
    • A61B5/15109 Fully automatically triggered, i.e. the triggering does not require a deliberate action by the user, e.g. by contact with the patient's skin
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B7/00 Instruments for auscultation
    • A61B7/02 Stethoscopes
    • A61B7/04 Electric stethoscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90 Identification means for patients or instruments, e.g. tags
    • A61B90/94 Identification means for patients or instruments, e.g. tags coded with symbols, e.g. text
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M5/00 Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests
    • A61M5/42 Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests having means for desensitising skin, for protruding skin to facilitate piercing, or for locating point where body is to be pierced
    • A61M5/427 Locating point where body is to be pierced, e.g. vein location means using ultrasonic waves, injection site templates
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H80/00 ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03 Recognition of patterns in medical or anatomical images

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • Hematology (AREA)
  • Theoretical Computer Science (AREA)
  • Biophysics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Acoustics & Sound (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Vascular Medicine (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computing Systems (AREA)
  • Dentistry (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Pulmonology (AREA)
  • Computational Linguistics (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Evolutionary Biology (AREA)

Abstract

Provided are a medical robotic device, a system, and a method. The medical robotic device for remote combined diagnosis and adjunct therapy uses artificial intelligence and robotics to enable remote consultation, multi-department combined consultation, and remote provision of medical advice, addressing problems such as unclear understanding of a patient's condition and inappropriate treatment methods. An ultrasonic image acquisition device (104), an intraoral acquisition device (210), and a blood acquisition device carried by the robot enable autonomous and remotely controlled acquisition and sharing of medical images, overcoming the limitations of single-department diagnosis and of a uniform diagnosis plan. By means of a blood vessel amplifier (209) and an intravenous injector (211) carried by the robot, remote control, autonomous injection, and autonomous medicine preparation are achieved; the robot also performs ward rounds and the pickup and return of medical instruments, relieving the heavy workload and night-shift burden of doctors and nurses, improving the efficiency of experts and doctors in remote inquiries, ward rounds, and multi-department combined consultations, and helping to resolve clinical cases according to the combined opinions of experts. The present application is applicable to outpatient clinics, wards, and overseas medical institutions.

Description

MEDICAL ROBOTIC DEVICE, SYSTEM, AND METHOD
TECHNICAL FIELD
The present invention belongs to the technical field of artificial-intelligence robotic healthcare devices, and relates to robotics, an intelligent image recognition method, an intelligent device, and a system.
BACKGROUND
At present, diagnostic accuracy in medical examination suffers from various human factors; individual specialist doctors have limited expertise, and combined consultation among multiple departments and multiple experts is difficult to achieve across outpatient clinics, wards, and patients. A robotic device is therefore needed for administrator remote control, remote joint consultation, combined expert ward rounds, and combined treatment; such a robot platform draws on robot theory and practical technology. Epidemic situations aggravate serious problems such as high infection risk, low efficiency, inaccurate manual sample collection, and disease propagation. The robot arm is instead used to autonomously collect oral detection samples, blood detection samples, and urine and excrement detection samples, and to autonomously perform injection, management, medicine preparation, and handling of medical supplies.
The robot arm, camera, machine vision, and various intelligent recognition methods carried by the robot assist in recognizing diseases associated with observed symptoms, thereby realizing remote detection, autonomous detection, infectious-disease detection, and intelligent data analysis, and effectively preventing the spread of major diseases such as infectious diseases and plague.
TECHNICAL PROBLEM
The objective of the present invention is to overcome the shortcomings of the prior art and provide a medical robotic device that enables remote consultation, multi-department joint consultation, and remote medical advice, addressing problems such as unsmooth patient-doctor communication, unclear understanding of the illness state, and unclear treatment methods. The ultrasonic image collection device, the oral cavity collection device, the blood collection device, and remote-controlled collection and sharing of CT and DR radiology images realize image sharing, so that problems such as human diagnosis and treatment errors, the limitations of a single department, and the uniformity of the diagnosis scheme are solved. Remote robot control, autonomous injection, autonomous medicine preparation, and itinerant pick-and-place of medical equipment are realized through the blood vessel amplifier, the intravenous injector, and other injection devices carried by the robot, relieving problems such as high work pressure and frequent night shifts for medical staff. The flexibility of remote inquiry, ward rounds, and multi-department joint consultation by experts and doctors is improved, and clinical cases are solved efficiently according to the common opinions of multiple experts on multiple treatment schemes.
The invention further provides an outpatient and ward multi-task allocation optimization management system, and a combined inquiry method with real-time collection and sharing of medical images and multi-user-robot speech interaction. A three-party matching method (medical staff, patient, robot) for remote control, autonomous sample collection, and injection management is provided, together with a method for autonomously locating and recognizing human organ feature positions and for collecting and classifying ultrasound and CT images of internal organs.
TECHNICAL SOLUTIONS
A medical robotic device includes a robot main system module configured to implement robot main control; a speech interaction module between users and the robot; a visual recognition module; a heart sound recognition module; a lung rale recognition module; a medical scene recognition module; radar real-time mapping with autonomous navigation and movement; a blood collection motion planning module; an injection motion planning module; and a robot arm motion planning module for picking, placing, scanning, and arranging.
The speech module is configured to collect patient speech and the ambient sound of outpatient and ward scenes, and is used for speech interaction, speech guidance, and speech commands between the robot main system and its users.
The visual recognition module is connected to the medical image collection devices and recognizes medical images. The image collection devices include, but are not limited to, a general camera, a depth camera, and binocular cameras. The visual recognition module includes face recognition, facial feature recognition, human body feature position recognition, medical scene recognition, medical product recognition, and medicine recognition. Face recognition identifies patients, users, and administrators. Facial feature recognition locates the facial features and the angular position of the oral cavity, and is used for nucleic acid detection, biological feature detection, and other oral detection items.
Human body feature position recognition refers to joint position recognition, including the shoulder, wrist, elbow, and finger joints and their positions: the fingertips, toe tips, wrists, elbows, and shoulders are recognized in order to locate the wrist-vein and elbow-vein blood vessel positions and the shoulder muscle injection position, with blood vessel locating and other key-position locating performed under the blood vessel amplifier. The medical scene recognition module recognizes medical scenes by an improved neural network method; comprehensive scene recognition covers outpatient scenes, ward scenes, patients, doctors, and doorplate alphanumeric characters. The medical product recognition module covers the blood pressure meter, glucose meter, thermometer, stethoscope, heart rate device, breathing device, negative pressure device, 24-hour monitoring device, and other specialist medical instruments; these basic medical devices are connected to the robot, used in its basic medical device area, and used to collect medical information. The improved neural network method extracts shape features, color features, digital codes, and two-dimensional codes to recognize and manage the medical products. According to the doctor's advice and the task arrangement, three-party information matching is performed among the recognized medical product information, the recognized patient face, and the two-dimensional code of the patient's bracelet, and the medical products are managed accordingly. The medicine recognition module recognizes features of the outer medicine label, including its digital code, two-dimensional code, characters, color, and shape, and matches the three-party information comprising the medicine name and quantity, the recognized patient face, and the recognized bracelet two-dimensional code, to perform matching management.
The heart sound and lung rale recognition module performs voiceprint feature extraction on heart sounds and lung rales, and an improved sound recognition method intelligently recognizes the heart sounds and abnormal sounds. The radar self-controlled movement, medical scene recognition, and mapping module performs autonomous location, navigation, and real-time mapping: the medical scene recognition of the visual recognition module (ward doorplate alphanumeric characters and bed numbers) is fused with the radar real-time map for autonomous location, navigation, and movement to the corresponding department, ward, and bed. The motion planning module allows the administrator to set and adjust parameters, and trains the robot to learn planned motions and to adaptively set motion planning parameters by an improved neural network method. It comprises a collection module, an injection module, a motion planning module for wearing and using medical equipment, and a picking, placing, and arrangement management module for medical products and medicine. The collection and injection module includes a blood collection module, an injection module, an oral detection sample collection module, a urine sample collection module, an excrement sample storage management module, and a medical image collection and sharing module. The blood collection and injection module includes a fingertip blood collection module, a syringe needle module, and an arm fixing device, which locate the toe-tip position, the wrist and elbow vein blood vessel positions, and the upper-arm muscle injection position, i.e. the positions of the fingers, the limb tips, and each arm joint, in order to perform blood collection with the collection needle, intravenous injection, and muscle injection.
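The voiceprint feature extraction mentioned above can be illustrated with a minimal sketch. This is not the patent's "improved sound recognition method"; it only shows the kind of frame-level features (short-time energy, zero-crossing rate) that such a module might extract from a heart or lung sound recording, and all function names and parameters are illustrative assumptions:

```python
import math

def frame_signal(signal, frame_len=256, hop=128):
    """Split a 1-D audio signal into overlapping frames."""
    return [signal[i:i + frame_len]
            for i in range(0, len(signal) - frame_len + 1, hop)]

def frame_features(frame):
    """Per-frame features: short-time energy and zero-crossing rate."""
    energy = sum(x * x for x in frame) / len(frame)
    zcr = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0) / (len(frame) - 1)
    return (energy, zcr)

def extract_voiceprint(signal):
    """Feature sequence standing in for a heart/lung-sound voiceprint."""
    return [frame_features(f) for f in frame_signal(signal)]

# Toy example: a 440 Hz tone sampled at 8 kHz stands in for a recording.
sig = [math.sin(2 * math.pi * 440 * n / 8000) for n in range(2048)]
feats = extract_voiceprint(sig)
print(len(feats))
```

A real module would use richer spectral features (e.g. mel-frequency cepstral coefficients) and feed the feature sequence into the recognition network.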
Further, the oral detection sample collection module and the oral collection motion planning module use the face recognition, facial feature recognition, oral cavity position location, tooth position location, and oral wall position location of the image recognition module. The robot arm, connected to an oral collector, collector cotton, and an oral mirror, plans a collection motion in which the collector slides left-right and front-back along the oral cavity wall at the collection location, so that saliva, oral feature samples, and oral cavity images are collected more accurately. Further, the urine and feces sample collection, storage, and management module matches three-party information comprising the two-dimensional code and digital code corresponding to the ward, hospital bed, patient, and robot; the robot arm and the motion planning module recognize, grasp, move, and place the urine and feces samples in the collection area.
Further, the medical image collection and sharing module is configured to collect ultrasonic images, CT images, DR radiology images, and MRI images, and to support remote-controlled collection and sharing of medical images for consultation and multi-department joint consultation. As a further improvement of the present invention, a motion planning module for wearing and using medical devices is provided; the medical devices include the breathing device, the negative pressure device, the 24-hour monitoring device, and other medical devices. The robot, controlled by the robot main system, applies the visual recognition module to recognize the facial and body features (mouth, nose, ears, eyes, and body feature positions), locate them, and plan motions for picking, moving, placing, wearing, and using the devices while monitoring their normal operation; an improved neural network method adaptively trains the robot arms and sets their parameters. As a further improvement, a motion planning module for taking, placing, and managing medical products and medicine is used to pick and place items, scan digital codes and two-dimensional codes, and manage and distribute medicines, treatment devices, rehabilitation devices, and other medical products. The vision device and the image recognition module recognize the patient's face, scan the bracelet two-dimensional code, and match the bed's digital code and wristband information against the medical advice information, so that medical instruments are autonomously picked, scanned, returned, and managed.
An optimization task management system includes the medical robotic devices, a plurality of medical care tasks of multiple departments, and a call center subsystem.
The medical robotic device is the above-mentioned device; the medical care task subsystems and call center subsystems of all departments are connected to and communicate with the robot main control system and the optimization task management system. Through the task management optimization system, the medical administrator can add, modify, delete, query, and dynamically schedule various tasks in real time; connect to the medical area call center system; perform remote inquiry; jointly consult on and treat the patients under the administrator's jurisdiction; send medical advice information; and receive and answer patient messages. The administrator remotely controls the robots, manages the robots of the respective departments and ward jurisdictions, and manages the tasks allocated to each robot in the corresponding time periods.
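The task operations described for the optimization management system (add, query, and dispatch tasks per time period) can be sketched as a simple priority queue. This is a hypothetical illustration rather than the patented system; all class and field names are assumptions:

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class CareTask:
    priority: int                       # lower number = more urgent
    period: str = field(compare=False)  # e.g. "08:00-10:00"
    description: str = field(compare=False)

class TaskManager:
    """Minimal add/query/dispatch queue for per-robot care tasks."""
    def __init__(self):
        self._queue = []

    def add(self, task):
        heapq.heappush(self._queue, task)

    def query(self, period):
        """All pending tasks scheduled in a given time period."""
        return [t for t in self._queue if t.period == period]

    def dispatch_next(self):
        """Pop the most urgent pending task, or None if idle."""
        return heapq.heappop(self._queue) if self._queue else None

mgr = TaskManager()
mgr.add(CareTask(2, "08:00-10:00", "ward round, ward 3"))
mgr.add(CareTask(1, "08:00-10:00", "blood collection, bed 12"))
nxt = mgr.dispatch_next()
print(nxt.description)
```

A real scheduler would also support modify/delete and re-queue unfinished tasks into the next time period, as the methods below describe.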
A combined inquiry method with real-time collection and sharing of medical images and multi-user-robot speech interaction comprises the following steps:
S1, speech devices are connected to the robot platform by the administrator and, through the speech module, communicate with the other users;
S2, speech recognition, speech synthesis, speech recording, and speech-to-text recognition of medical advice are performed by the robot;
S3, the administrator subscribes to and publishes messages, services, and images on the robot platform, and the multiple users and the robot share medical information such as pictures and speech;
S4, real-time speech interaction is performed by the administrator through the robot platform and the speech recognition module: a real-time multi-user speech session with speech-to-text and attached picture information, recording the multi-user speech interaction and speech conference.
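Steps S1-S4 describe a publish/subscribe pattern for sharing speech and images among the users and the robot. A minimal in-memory sketch of that pattern follows; the topic name and class names are hypothetical, and a real robot platform would use distributed messaging middleware rather than this single-process stand-in:

```python
from collections import defaultdict

class MessageBus:
    """In-memory stand-in for the robot platform's publish/subscribe layer."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        for cb in self._subs[topic]:
            cb(message)

bus = MessageBus()
transcript = []

# A user subscribes to the shared speech-to-text stream (step S3).
bus.subscribe("consultation/speech_text", transcript.append)

# Recognized utterances are published to all subscribers (steps S2 and S4).
bus.publish("consultation/speech_text", "Dr. A: please share the CT image")
bus.publish("consultation/speech_text", "Robot: CT image uploaded")

print(len(transcript))
```

The same pattern recurs in the later methods, where location and medical-advice messages are published and the navigation and motion planning modules subscribe to them.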
A three-party matching (medical staff, patient, robot) autonomous pick-and-place management method for medicines and medical instruments comprises the following steps:
S1, the communication module publishes medical advice messages and services by the administrator; the speech module subscribes to them to receive the medical advice; the patient user subscribes to the medical advice message and service;
S2, speech recognition, speech synthesis, speech recording, and speech-to-text recognition of medical advice are performed by the robot;
S3, the vision device and the image recognition module recognize the device and medicine and their corresponding location information;
S4, the vision device, image recognition module, and communication module publish the location information of the equipment and medicine; the radar location and navigation module subscribes to these messages and the robot autonomously moves to the placement area of the equipment and medicine;
S5, the motion planning module plans the motions of picking up the device and medicine and scanning their digital code and two-dimensional code;
S6, the communication module publishes a location message comprising the patient, ward, department, and bed location; the radar location and navigation module subscribes to it and the robot autonomously moves to the bed;
S7, the vision device and the image recognition module recognize the numeric features of the ward, the doorplate letters, the bed number, and the medical scene, recognize the face, and check and match the information; if the information does not match, the robot relocates and re-navigates, otherwise it continues with step S8;
S8, using the motion planning module the robot scans the digital code and two-dimensional code on the equipment and medicine and the medical advice digital code, and matches them; if the scanned codes are correct, the equipment and medicine are dispensed, otherwise the medical advice information is returned to the manager;
S9, the robot arms and the motion planning module place and dispense the equipment and medicine into the medicine box and instrument placement area;
S10, the task of this time period is finished.
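The check-and-dispense logic of steps S7-S8 amounts to a three-way code match among the medical-advice record, the recognized patient, and the scanned item. A minimal sketch, with all identifiers and code formats hypothetical:

```python
def three_party_match(advice, face_id, bracelet_qr, item_code):
    """Dispense only when the medical-advice record, the recognized patient
    face, the bracelet two-dimensional code, and the scanned item code all
    agree (steps S7-S8); otherwise the advice goes back to the manager."""
    return (advice["patient_id"] == face_id == bracelet_qr
            and advice["item_code"] == item_code)

# Hypothetical medical-advice record for one dispensing task.
advice = {"patient_id": "P-1024", "item_code": "MED-77"}

# Matching scan: the equipment and medicine may be dispensed.
ok = three_party_match(advice, "P-1024", "P-1024", "MED-77")
# Wrong bracelet: return the medical advice information to the manager.
bad = three_party_match(advice, "P-1024", "P-9999", "MED-77")
print(ok, bad)
```

The same check applies to the medicine recognition module earlier, where the medicine name and quantity are matched against the patient face and bracelet code.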
A three-party matching (medical staff, patient, robot) remote control, autonomous sample collection, and injection management method comprises the following steps:
S1, the administrator's communication module publishes medical advice messages and services; the speech module subscribes to the medical advice message; the patient user subscribes to receive the medical advice message and service;
S2, speech recognition, speech synthesis, speech recording, and speech-to-text recognition of medical advice are performed by the robot;
S3, the vision device module and the communication module publish location messages and services comprising the patient, ward, department, and bed location; the radar location and navigation module subscribes to them and the robot autonomously moves to the bed;
S4, the vision device module and the communication module publish the location information; the radar location and navigation module subscribes and the robot autonomously moves to the placement area of the equipment and medicine;
S5, the robot recognizes faces, facial features, and feature positions with the vision device and image recognition module; the finger and toe tips, the arm joints, and their positions are recognized, and the blood vessel amplifier and the arm fixing device provide the toe-tip position, the wrist and elbow vein blood vessel positions, and the upper-arm muscle injection position;
S6, the communication module publishes the collection location messages; the robot arm subscribes to the fixing device, collection location, and arm muscle injection location messages, and the motion planning module subscribes to the location messages;
S7, according to the location information of step S6, the robot collects data and images of the oral cavity and blood and performs injection according to the motion planning module, which includes the blood collection module, the injection motion planning module, the oral collection motion planning module, and the urine and excrement sample storage motion planning module. In step S7, the blood collection and injection motion planning module, the fingertip blood collection module, the injection needle module, and the arm fixing device locate the toe-tip position, the wrist and elbow vein blood vessel positions, and the muscle injection position; the collection needle and injection needle perform blood collection, intravenous injection, and muscle injection on the basis of the recognized positions of the fingers, the toe tips, and each arm joint. The blood vessel amplifier and the arm fixing device are used when locating the fingertip, toe-tip, wrist, and elbow-vein blood vessel positions. In step S7, the oral collection motion planning module and the facial feature recognition module are applied through the vision recognition module; the oral collector, collector cotton, and oral mirror are moved left-right and front-back, sliding along the oral cavity wall, so that saliva, oral features, and oral cavity images are collected more accurately. In step S7, the urine and feces sample collection module and its storage motion planning module make the robot cruise back to the ward, bed, and patient corresponding to the two-dimensional and digital codes; the robot arm automatically recognizes, grasps, moves, and places the urine and feces samples in the sample collection area.
S8, the vision device module and the communication module publish the location information; the radar location and navigation module subscribes and the robot autonomously moves to the placement areas for saliva samples, biological information samples, blood samples, urine samples, and feces samples;
S9, at the saliva, biological information, blood, urine, and feces sample placement areas, the robot arm and the motion planning module place and pick the samples autonomously;
S10, task completion information is returned to the manager; if the task is not completed, it is moved into the next time period.
A method for a robot to autonomously locate and recognize human organ features and to classify organ images comprises the following steps:
S1, design a human organ feature model comprising the shoulder joints, breast, nipples, navel, genital features, and waist joints;
S2, extract the internal contour of the organ image, the feature value of each organ, and the external body position area corresponding to each external feature;
S3, input the feature values of the internal organ images together with the corresponding external feature values of each organ into an improved deep neural network with a weight optimizer; according to the output values, internal organ classification and organ recognition are obtained by image training;
S4, output the result, accurately classifying and recognizing the human organ images.
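Step S3 trains a classifier on per-organ feature values. The patent gives no details of its "improved deep neural network", so as a hedged stand-in, a toy nearest-centroid classifier over hypothetical 2-D feature vectors illustrates the classify-by-feature-value idea (all data and names are invented for illustration):

```python
def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def distance_sq(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def train(labeled):
    """labeled: {organ_name: [feature vectors]} -> per-organ centroid model."""
    return {organ: centroid(vs) for organ, vs in labeled.items()}

def classify(model, features):
    """Return the organ whose centroid is nearest to the input features."""
    return min(model, key=lambda organ: distance_sq(model[organ], features))

# Hypothetical 2-D feature vectors (e.g. contour area, mean intensity).
training = {
    "liver":  [[0.80, 0.30], [0.75, 0.35]],
    "kidney": [[0.20, 0.70], [0.25, 0.65]],
}
model = train(training)
print(classify(model, [0.78, 0.32]))
```

In the actual method, the feature extraction of step S2 and a trained deep network with a weight optimizer would replace the centroids, but the pipeline shape (extract features, train, classify, output result) is the same.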
A method for self-locating the positions of visceral organs and for classifying and recognizing visceral organs in the medical image collection model proceeds in the following order:
S1, the machine vision function module publishes the positions of the external (in vitro) feature locations;
S2, the robot main system, the robot arm, and the ultrasonic examination device subscribe to the positions published by the machine vision function module;
S3, the robot arm moves to the positions and scans the collection zone according to the motion planning model; the ultrasonic scanner and examination device publish the image information, and the robot main system and the machine vision function module subscribe to it;
S4, the robot main system and robot arm input the contour features and feature values of the internal organs, improve the weight optimizer, and train on the images and output values by the improved deep neural network method;
S5, according to the output values, the visceral organs are classified and the recognition results and disease result are returned to the robot main system and the administrator.
[Claims 1] 1. A medical robot device, system and method, wherein the medical robotic device includes: a robot main system module configured to implement robot main control; a speech interaction module between users and the robot; a visual recognition module; a heart sound recognition module; a lung rale sound recognition module; a medical scene recognition module; a radar real-time mapping, autonomous navigation and movement module; a blood collection motion planning module; an injection motion planning module; and a robot arm motion planning module for picking, placing, scanning, and arranging. The radar self-controlled movement, medical scenario recognition, and mapping module: the robot main system module is connected to the radar, the vision device, and the movement base to implement autonomous navigation, movement, and mapping. The speech module is configured to collect patient sound and outpatient ward scene sound, and is used for interaction and speech guidance between the main control system and the user, including speech commands and speech interaction. The visual recognition module is connected to the image collection device and the robot main system and acquires and recognizes images; the image collection device comprises one or more of a general camera, a depth camera, and a binocular camera, but is not limited to the above-mentioned devices. The visual recognition module includes face recognition, facial feature recognition of the human body, human body feature position recognition, medical scene recognition, medical product recognition, and drug recognition. The face recognition covers the faces of patient users and medical administrators. The facial feature recognition covers the five sense organs of the human face, their positions, and the angular position of the oral cavity, and is used for nucleic acid detection, biological feature detection, and other oral cavity detection.
Human body feature position recognition refers to joint position recognition, including the shoulder, wrist, arm elbow, and finger joints and their positions; it recognizes the positions of the fingers, fingertips, wrist, elbow, and shoulder, the wrist vein and elbow vein blood vessel positions, and the near-shoulder muscle injection position, and performs blood vessel locating and other key position locating under a blood vessel amplifier. Medical scene recognition applies the improved neural network method to recognize medical scenes, including comprehensive scenes such as the outpatient clinic, the ward, patients, doctors, and doorplate alphanumeric characters. The motion planning module allows the administrator to set and adjust parameters, and trains the robot to learn motion planning and adaptively set motion planning parameters by the improved neural network method. It is used for the motion planning modules, which include the collection module, the injection module, the medical equipment wearing and using motion planning module, and the medical products and medicine picking, placing, and arrangement management module.

Claims (1)

  [Claims 2] 2. A medical robot device, system and method, wherein the speech device, the speech module, and the multi-user robot speech interaction method are characterized in that the robot main control system is connected with the speech device; the speech module comprises a speech acquisition device, a microphone device, a loudspeaker, and a pickup device; and speech commands, speech-text mutual conversion, speech synthesis, and voiceprint identification are used for acquiring and recognizing doctor-patient sound and outpatient ward scene sound, and for interaction between the robot main control system and multiple users and administrators. Further, a medical image real-time collection and sharing multi-user-robot speech interaction combined inquiry method comprises the following steps: S1, the speech devices are connected to the robot platform, and the administrator, connected through the speech module, communicates with the other users. S2, speech recognition, speech synthesis, speech recording, and speech-to-text recognition of medical advice are performed by the robot. S3, the administrator subscribes to and publishes messages, services, and images through the robot platform, and the multiple users and robot share medical information such as pictures and speech information. S4, real-time speech interaction is performed by the administrator and the robot platform through the speech recognition module, with real-time multi-user speech sessions and speech-to-text with additional picture information, and the multi-user speech interactions and speech conferences are recorded.
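The voiceprint identification mentioned in this claim can be sketched as nearest-neighbour matching of speaker-embedding vectors by cosine similarity. The embeddings and threshold below are made up for illustration; a real system would derive them from a trained speaker-encoder model.

```python
# Hedged sketch of voiceprint identification: match a sampled voiceprint
# against enrolled users by cosine similarity. Vectors and threshold are
# hypothetical placeholders for learned speaker embeddings.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def identify_speaker(sample, enrolled, threshold=0.8):
    """Return the enrolled user whose voiceprint best matches, or None."""
    best = max(enrolled, key=lambda name: cosine(sample, enrolled[name]))
    return best if cosine(sample, enrolled[best]) >= threshold else None

enrolled = {"doctor": [0.9, 0.1, 0.2], "patient": [0.1, 0.8, 0.3]}
who = identify_speaker([0.85, 0.15, 0.25], enrolled)
```

The threshold separates "recognized enrolled user" from "unknown speaker", which is what lets the robot attribute each utterance in a multi-user session.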
    [Claims 3] 3. The medical robot device according to claim 1, wherein the radar self-controlled movement, medical scenario recognition, and mapping module is connected to the radar, the movement base, and the main system, and performs autonomous location, navigation, and real-time mapping. The medical scene recognized by the visual recognition module, comprising the department, the ward doorplate alphanumeric characters, and the bed number, is fused with the radar real-time mapping, and the robot autonomously locates, navigates, and moves to the corresponding department, ward, and bed location.
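One simple way to picture the fusion of recognized doorplate text with the radar map is a semantic lookup: OCR output (ward or bed strings) is resolved against locations recorded during real-time mapping to produce a navigation goal. All names and coordinates below are hypothetical.

```python
# Illustrative sketch: resolve recognized doorplate text to a navigation
# goal on the radar map. The map entries and coordinates are invented.
semantic_map = {
    # doorplate text recognized earlier -> (x, y) pose on the radar map
    "Ward 302": (12.4, 3.1),
    "Ward 305": (15.0, 3.1),
}

def goal_from_doorplate(ocr_text):
    """Map a recognized doorplate string to a known navigation goal."""
    return semantic_map.get(ocr_text.strip())

goal = goal_from_doorplate(" Ward 302 ")
```

The returned pose would then be handed to the navigation stack; an unrecognized doorplate yields no goal, prompting relocation.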
    [Claims 4] 4. The medical robot device according to claim 1, wherein the collection and injection module and the robot main control system are connected to the robot arm, and the collection and injection module allows an administrator to adjust setting parameters; the motion planning module allows the administrator to set and adjust parameters and trains the robot to learn motion planning and adaptively set motion planning parameters by the improved neural network method. It is used for the motion planning modules, which include the collection module, the injection module, the medical equipment wearing and using motion planning module, and the medical products and medicine picking, placing, and arrangement management module. The collection and injection module includes a blood collection module, an injection module, an oral detection sample collection module, a urine sample collection module, an excrement sample storage management module, and a medical image collection and sharing module. In the blood sample collection and injection modules, a fingertip blood collection module, a collection syringe needle module, and an arm fixing device locate the fingertip position, the arm wrist and elbow vein blood vessel positions, and the upper arm muscle injection position; the collection needle and injection needle perform blood collection, intravenous injection, and intramuscular injection on the basis of recognizing the positions of the fingers, the fingertips, and each joint of the arm.
In the oral cavity detection sample collection module and the oral cavity collection action planning module, the facial five-sense-organ recognition of the visual recognition module locates the oral cavity position, tooth positions, and oral wall positions; the oral collector carried by the robotic arm, with its collection cotton and oral cavity mirror, plans its motion to slide along the oral wall in the left-right and front-back directions, performs the collection action, accurately collects saliva, and accurately collects the oral cavity features and oral cavity images in the mouth and nose. In the urine and feces sample storage management module, the urine and feces sample storage action planning module is used for the robot to cruise back to the corresponding ward, hospital bed, and patient, matching the corresponding two-dimensional code and digital code; the robotic arm automatically identifies, grasps, moves, and places the urine and feces samples in the sample collection area. The medical image collection and sharing module is configured to acquire ultrasonic images and CT images and, through the image sharing module, to remotely collect and share DR radiology and MRI nuclear magnetic images for remote consultation and multi-department joint consultation.
    [Claims 5] 5. The medical robot device according to claim 1, wherein the motion planning module allows the administrator to adjust setting parameters and trains the robot to learn motion planning and adaptively set motion planning parameters by the improved neural network method; it is used for motion planning and comprises the collection and injection module, the medical equipment wearing and using motion planning module, and the medical article and medicine taking, placing, and configuration management module, all connected to the robot main system and the following modules and devices. The medical product recognition module covers a blood pressure meter, a glucose meter, a thermometer, a stethoscope, a heart rate device, a breathing device, a negative pressure device, a 24-hour monitoring device, and other medical instruments for each specialty; these basic medical devices are connected to the robot and used in its basic medical device area to collect medical information. The improved neural network methods extract shape features, color features, digital codes, and two-dimensional codes to recognize and manage medical products. According to the doctor's advice and the task arrangement, three-party information matching is performed among the recognized medical product information, the recognized patient face, and the two-dimensional code of the patient's bracelet, and matching management is performed on the medical products. The medicine recognition module recognizes the features of the medicine's outer label, including its digital code, two-dimensional code, characters, color, and shape, and matches the three-party information, which includes the medicine name and quantity, the recognized patient face, and the recognized two-dimensional code of the bracelet, to perform matching management.
The heart sound and lung rale sound recognition module is used for voiceprint feature extraction of heart sounds and lung rales, and the improved sound recognition algorithm intelligently recognizes heart sounds and abnormal sounds.
    [Claims 6] 6. The medical robot device according to claim 1, wherein the task management optimization system includes the medical robotic devices, the medical care task management sub-systems of multiple departments, and a call center sub-system. The medical robotic device is the above-mentioned device, and the medical care task management subsystems and call center subsystems of all departments are connected to and communicate with the robot main control system and the optimization task management system.
    [Claims 7] 7. The medical robot device according to claim 1, wherein a three-party (medical staff, patient, robot) matching pick-and-place management method for drugs and medical instruments comprises the following steps: S1, the communication module publishes the medical advice messages and services on behalf of the administrator; the speech module subscribes to the messages and services to receive the medical advice, and the patient user subscribes to the medical advice messages and services. S2, speech recognition, speech synthesis, speech recording, and speech-to-text recognition of medical advice are performed by the robot. S3, the vision device and image recognition module recognize the device and drug and their corresponding location information. S4, the vision device, image recognition module, and communication module publish the location information messages and services of the equipment and medicine; the radar location and navigation module subscribes to them and autonomously moves to the placement area of the equipment and medicine. S5, the motion planning module plans the motions of picking up the device and medicine and scanning the digital code and two-dimensional code. S6, the communication module publishes the location information messages, which include the patient, ward, department, and bed locations; the radar location and navigation module subscribes to the patient location information and autonomously moves to the bed location. S7, the vision device and image recognition module recognize the numeric features of the ward, the doorplate letters, the hospital bed number, and the medical scene, recognize the face, and check and match the information; if the information matches, continue to step S8, otherwise relocate and re-navigate.
    S8, the robot scans the digital code and two-dimensional code on the equipment and medicine and the medical advice information digital code through the motion planning module, and matches the two-dimensional code, the digital code, and the medical advice information digital code. If the scanned code is correct, the equipment and medicine are dispensed; otherwise, the medical advice information is returned to the manager. S9, the robot arms and the motion planning module place and dispense the equipment and medicine to the medicine box and instrument placement area. S10, the task for this time period is finished.
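The decision in steps S7–S8 amounts to a three-way consistency check: the scanned medicine code must agree with the medical advice, and the recognized patient face must agree with the bracelet's two-dimensional code. The record shapes below are illustrative, not the patent's actual data formats.

```python
# Sketch of the three-party match of S7-S8. Field names and identifiers
# are invented for illustration.
def three_party_match(advice, scanned_medicine_code, face_id, bracelet_id):
    """Dispense only if medicine code and patient identity both match."""
    code_ok = scanned_medicine_code == advice["medicine_code"]
    patient_ok = face_id == bracelet_id == advice["patient_id"]
    return "dispense" if (code_ok and patient_ok) else "return_to_manager"

advice = {"patient_id": "P001", "medicine_code": "MED-42"}
result_ok = three_party_match(advice, "MED-42", "P001", "P001")
result_bad = three_party_match(advice, "MED-99", "P001", "P001")
```

Any single mismatch, whether a wrong medicine code or a face that does not match the bracelet, routes the case back to the manager rather than dispensing.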
    [Claims 8] 8. A medical robot device, system and method, characterized in that a three-party (medical staff, patient, robot) matching autonomous pickup and distribution management method for medicine and medical devices comprises the following steps: S1, the administrator's communication module publishes the medical advice messages and services; the speech module subscribes to the medical advice messages, and the patient user subscribes to receive the medical advice messages and services. S2, speech recognition, speech synthesis, speech recording, and speech-to-text recognition of medical advice are performed by the robot. S3, the vision device module and communication module publish the location information messages and services, which include the patient, ward, department, and bed locations; the radar location and navigation module subscribes to the messages and services and autonomously moves to the beds. S4, the vision device module and communication module publish the location information; the radar location and navigation module subscribes to the messages and services and autonomously moves to the placement area of the equipment and medicine. S5, the robot recognizes faces, the five sense organs, and their features and positions through the vision device and image recognition module; the finger, fingertip, and arm joints and their positions are recognized, and the blood vessel amplifier and arm fixing device locate the fingertip position, the arm wrist and elbow vein blood vessel positions, and the upper arm muscle injection position. S6, the communication module publishes the collection location messages; the robot arm subscribes to the fixing device, collection location, and arm muscle injection location messages, and the motion planning module subscribes to the location messages.
S7, according to the location information of step S6, the robot collects data and images of the oral cavity and blood and performs injections according to the motion planning module, which includes the blood collection module, the injection motion planning module, the oral cavity collection motion planning module, and the urine and excrement sample storage motion planning module. In step S7, for blood collection and injection, the fingertip blood collection module, the injection needle collection module, the arm fixing device, and the blood vessel amplifier locate the fingertip position, the arm wrist and elbow vein blood vessel positions, and the muscle injection position; the collection needle and injection needle perform blood collection, intravenous injection, and intramuscular injection on the basis of recognizing the positions of the fingers, the fingertips, and each joint of the arm.
    In step S7, the oral collection motion planning module applies the facial five-sense-organ recognition of the vision recognition module; the oral cavity collector, collection cotton, and oral cavity mirror plan their motion in the left-right and front-back directions, slide along the oral wall, and collect saliva, oral cavity features, and oral cavity images more accurately. In step S7, the urine and feces sample collection and storage motion planning module is used for the robot to cruise back to the corresponding ward, hospital bed, and patient, matching the corresponding two-dimensional code and digital code; the robotic arm automatically recognizes, grasps, moves, and places the urine and feces samples in the sample collection area. S8, the vision device module and communication module publish the location information; the radar location and navigation module subscribes to the messages and services and autonomously moves to the placement areas of the equipment and medicine, and to the saliva sample, biological information sample, blood sample, urine sample, and feces sample placement areas, where the robot arm and motion planning module place and pick the samples autonomously. S9, the robot moves to the saliva sample, biological information sample, blood sample, urine sample, and feces sample placement areas and places and picks the samples autonomously. S10, the task completion information is returned to the manager; if the task is not complete, it is moved into the next time period.
    [Claims 9] 9. A medical robot device, system and method, wherein a medical staff, patient, and robot three-party matching remote control and autonomous sample collection and injection management method comprises the following steps. A method for autonomously locating and recognizing human organ features and for classifying and recognizing medical images of internal organs comprises: S1, design a feature organ model comprising the shoulder joints, breast, nipple, navel, genital feature, and waist joints. S2, extract the internal contour of the organ image, the feature value of each organ, and the position of the external collection area of the human body corresponding to each external feature. S3, input the feature value of each internal organ image together with the corresponding external feature value of each organ, and apply the improved deep neural network method and weight optimizer; according to the output values, internal organ classification and organ recognition are obtained by image training. S4, output the result, so that images of human organs are classified and recognized accurately.
    A novel method for self-locating the positions of visceral organs and for classifying and recognizing images of visceral organs in the medical image collection model comprises the following steps. S1, the vision device module publishes the positions of the in-vitro feature locations. S2, the robot main system, robot arm, and ultrasonic examination device subscribe to the positions published by the vision device module. S3, having subscribed to the positions, the robot arm moves to the positions and scans the collection zone according to the motion planning model; the ultrasonic scanner and ultrasonic examination device publish the image information, and the robot main system and vision device module subscribe to the image information. S4, the robot main system and robot arm input the features of the outlines of the inner organs and the feature values of the organs, improve the weight optimizer, and train the images and output values by the improved deep neural network method. S5, according to the output values, the visceral organ images are classified and recognized, and the recognition results, output values, and disease symbol result are returned to the robot main system and administrator.
AU2021321650A 2020-08-05 2021-07-29 Medical robotic device, system, and method Pending AU2021321650A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN202010780479.0A CN111916195A (en) 2020-08-05 2020-08-05 Medical robot device, system and method
CN202010780479.0 2020-08-05
PCT/CN2021/000162 WO2022027921A1 (en) 2020-08-05 2021-07-29 Medical robotic device, system, and method

Publications (1)

Publication Number Publication Date
AU2021321650A1 true AU2021321650A1 (en) 2023-04-13

Family

ID=73287855

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2021321650A Pending AU2021321650A1 (en) 2020-08-05 2021-07-29 Medical robotic device, system, and method

Country Status (3)

Country Link
CN (1) CN111916195A (en)
AU (1) AU2021321650A1 (en)
WO (1) WO2022027921A1 (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111916195A (en) * 2020-08-05 2020-11-10 谈斯聪 Medical robot device, system and method
WO2021254427A1 (en) * 2020-06-17 2021-12-23 谈斯聪 Integrated robot and platform for ultrasound image data acquisition, analysis, and recognition
WO2021253809A1 (en) * 2020-06-19 2021-12-23 谈斯聪 Integrated device, system and method for blood collection and analysis as well as intelligent image identification and diagnosis
CN114800538A (en) * 2021-01-21 2022-07-29 谈斯聪 Accompanying robot device, self-adaptive learning system and method
CN112951230A (en) * 2021-02-08 2021-06-11 谈斯聪 Remote and autonomous experimental robot device, management system and method
CN113110325A (en) * 2021-04-12 2021-07-13 谈斯聪 Multi-arm sorting operation mobile delivery device, and optimized management system and method
CN115192051A (en) * 2021-04-13 2022-10-18 佳能医疗系统株式会社 Medical imaging apparatus, medical imaging system, and auxiliary examination method in medical imaging apparatus
CN112990101B (en) * 2021-04-14 2021-12-28 深圳市罗湖医院集团 Facial organ positioning method based on machine vision and related equipment
CN113425332A (en) * 2021-06-29 2021-09-24 尹丰 Integrated device and method for nucleic acid collection and vaccination
CN113478457A (en) * 2021-08-03 2021-10-08 爱在工匠智能科技(苏州)有限公司 Medical service robot
CN113858219A (en) * 2021-08-23 2021-12-31 谈斯聪 Medical robot device, system and method
CN113855067A (en) * 2021-08-23 2021-12-31 谈斯聪 Visual image and medical image fusion recognition and autonomous positioning scanning method
CN113855250A (en) * 2021-08-27 2021-12-31 谈斯聪 Medical robot device, system and method
CN113855068A (en) * 2021-08-27 2021-12-31 谈斯聪 Method for intelligently identifying chest organs and autonomously positioning and scanning chest organs
CN114310957A (en) * 2022-01-04 2022-04-12 中国科学技术大学 Robot system for medical detection and detection method
WO2023167830A1 (en) * 2022-03-01 2023-09-07 The Johns Hopkins University Autonomous robotic point of care ultrasound imaging
CN114886476B (en) * 2022-07-14 2022-09-20 清华大学 Automatic collection robot for throat swabs
CN117245635A (en) * 2022-12-12 2023-12-19 北京小米机器人技术有限公司 Robot, control method and device thereof, and storage medium
CN116079720A (en) * 2022-12-23 2023-05-09 深圳优地科技有限公司 Robot control method, robot, and storage medium
CN115781686A (en) * 2022-12-26 2023-03-14 北京悬丝医疗科技有限公司 Mechanical arm for remotely diagnosing pulse and control method
CN116129112A (en) * 2022-12-28 2023-05-16 深圳市人工智能与机器人研究院 Oral cavity three-dimensional point cloud segmentation method of nucleic acid detection robot and robot

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120130739A1 (en) * 2010-11-21 2012-05-24 David Crane Unsupervised Telemedical Office for Remote &/or Autonomous & Automated Medical Care of Patients
US20150273697A1 (en) * 2014-03-27 2015-10-01 Fatemah A.J.A. Abdullah Robot for medical assistance
CN206780416U (en) * 2017-05-23 2017-12-22 周葛 A kind of intelligent medical assistant robot
CN107030714A (en) * 2017-05-26 2017-08-11 深圳市天益智网科技有限公司 A kind of medical nurse robot
CN107322602B (en) * 2017-06-15 2020-02-14 重庆柚瓣家科技有限公司 Home service robot for telemedicine
CN107788958A (en) * 2017-10-20 2018-03-13 深圳市前海安测信息技术有限公司 medical monitoring robot and medical monitoring method
CN114072258A (en) * 2019-07-01 2022-02-18 瓦斯菲·阿希达法特 Medical artificial intelligent robot arrangement for robot doctor
CN110477956A (en) * 2019-09-27 2019-11-22 哈尔滨工业大学 A kind of intelligent checking method of the robotic diagnostic system based on ultrasound image guidance
CN111358439A (en) * 2020-03-14 2020-07-03 厦门波耐模型设计有限责任公司 General practitioner robot system
CN111916195A (en) * 2020-08-05 2020-11-10 谈斯聪 Medical robot device, system and method

Also Published As

Publication number Publication date
CN111916195A (en) 2020-11-10
WO2022027921A1 (en) 2022-02-10

Similar Documents

Publication Publication Date Title
AU2021321650A1 (en) Medical robotic device, system, and method
US20220331028A1 (en) System for Capturing Movement Patterns and/or Vital Signs of a Person
US20210030275A1 (en) System and method for remotely adjusting sound acquisition sensor parameters
CA2827523C (en) System and method for performing an automatic and self-guided medical examination
US6997873B2 (en) System and method for processing normalized voice feedback for use in automated patient care
CN107752984A (en) A kind of high intelligent general medical practice operation robot based on big data
EP1072994B1 (en) System and method for providing normalized voice feedback from an individual patient in an automated collection and analysis patient care system
CN111844078A (en) Intelligent nursing robot assisting nurse in clinical work
WO2023024399A1 (en) Medical robot apparatus, system and method
US20120130739A1 (en) Unsupervised Telemedical Office for Remote &/or Autonomous & Automated Medical Care of Patients
WO2018220565A1 (en) Apparatus and methods for the management of patients in a medical setting
WO2023024396A1 (en) Recognition, autonomous positioning and scanning method for visual image and medical image fusion
CN113035353A (en) Digital twin health management system
CN112151137A (en) Method, device, system and storage medium for robot-machine cooperation accompanying medical ward round
CN109044656B (en) Medical nursing equipment
CN110673721A (en) Robot nursing system based on vision and idea signal cooperative control
CN111012309A (en) Morning check self-service system
WO2012111013A1 (en) System and method for performing an automatic and remote trained personnel guided medical examination
Gritsenko et al. Current state and prospects for the development of digital medicine
US20100249529A1 (en) Pain Monitoring Apparatus and Methods Thereof
Meindl Microelectronics and computers in medicine
CN110811590A (en) Remote real-time online monitoring and management system for human health information
CN205866740U (en) Acquire collection system of multidimension heartbeat information in step
Ricci The Italian national telemedicine programme
Kurzynski et al. TelFam: a telemedicine system for the family doctor practices