CN112925325A - Multifunctional intelligent robot medicine distribution system based on data fusion - Google Patents
- Publication number: CN112925325A (application CN202110128624.1A)
- Authority
- CN
- China
- Prior art keywords
- data
- medicine
- control unit
- angle
- laser radar
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0214—with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0221—with means for defining a desired trajectory involving a learning process
- G05D1/0223—with means for defining a desired trajectory involving speed control of the vehicle
- G05D1/0225—with means for defining a desired trajectory involving docking at a fixed facility, e.g. base station or loading bay
- G05D1/0236—using optical position detecting means: optical markers or beacons in combination with a laser
- G05D1/024—using optical position detecting means: obstacle or wall sensors in combination with a laser
- G05D1/0246—using optical position detecting means: a video camera in combination with image processing means
- G05D1/0276—using signals provided by a source external to the vehicle
Abstract
The invention discloses a multifunctional intelligent robot medicine distribution system based on data fusion, comprising: a bottom-layer data processing control unit, an upper-layer data fusion and sensor data network integration control unit, an automatic driving unit, and a medicine distribution information management unit. The bottom-layer data processing control unit's hardware comprises encoder motors, an attitude sensor, Mecanum wheels, and an STM32 main control board; its software implements closed-loop control of the Mecanum wheels and encoder motors. The upper-layer data fusion and sensor data network integration control unit's hardware comprises an NVIDIA TX2 control board, an RPLIDAR S1 lidar, a ZED binocular camera, and two ordinary cameras. The automatic driving unit and the medicine distribution information management unit comprise information entry and automatic driving algorithms. The invention can complete highly repetitive, single-content tasks such as medicine distribution, raises the degree of intelligence and unmanned operation in hospitals, helps perfect the hospital medical system, and promotes hospital informatization.
Description
Technical Field
The invention relates to an automatic medicine distribution system, in particular to a multifunctional intelligent robot medicine distribution system based on data fusion.
Background
As an emerging technology that has risen over recent decades, intelligent robots and their automation technology have continuously developed and entered people's lives. To make robots serve people better, they have gradually evolved from assembly robots on workshop production lines to indoor floor-sweeping robots, restaurant ordering robots, and the like. Robotics is becoming an indispensable part of human life.
Meanwhile, with the continuous development of science and technology and the continuous improvement of service levels, hospitals, as the front line of people's healthy lives, keep improving their service quality and efficiency. In hospitals, doctors are responsible for diagnosing patients and nurses for caring for them, but as hospitals' overall strength grows, so does their size. At present, hospitals mainly face the following problems in ward care:
(1) Medicine distribution. Nurses must regularly dispense medicine to patients every day; because there are many patients and each patient's required medicine differs, the delivery process demands sufficient patience and allows no carelessness.
(2) Personnel allocation. Because the hospital's medicine delivery is done manually, complete distribution requires multiple trips, causing a huge waste of time and personnel; the hospital must additionally bear the nurses' extra workload and pay.
(3) Cross infection. Doctors and nurses work on the front line of epidemic battles, and there have been many cases of infection during contact with patients; the safety of nurses cannot be guaranteed.
Disclosure of Invention
To solve the above technical problems, this technical scheme provides a multifunctional intelligent robot medicine distribution system based on data fusion, which realizes automatic driving in a hospital environment by introducing automatic driving and image processing technology. The automatic driving hardware is a four-wheeled trolley with Mecanum wheels; the data acquisition part comprises an attitude sensor, motor encoders, a single-line laser radar, and a binocular depth camera. A 2D plane model of the ward area is established with the laser radar; the binocular depth camera simultaneously measures the depth information of the image; image processing on the camera data analyzes and checks the robot's current position, which is marked in the established map. The above problems can thereby be effectively solved.
Technical scheme
A multifunctional intelligent robot medicine distribution system based on data fusion comprises: a bottom-layer data processing control unit, an upper-layer data fusion and sensor data network integration control unit, an automatic driving unit, and a medicine distribution information management unit. The hardware of the bottom-layer data processing control unit comprises an STM32 main control board for collecting data and sending control signals; an attitude sensor connected with the STM32 main control board for observing and controlling the state of the intelligent robot; and encoder motors, connected to Mecanum wheels, for driving the robot. A Mecanum-wheel control system and a closed-loop control system for the encoder motors are implanted in the STM32 main control board. The upper-layer data fusion and sensor data network integration control unit comprises an NVIDIA TX2 control board in signal connection with the STM32 main control board, a laser radar mounted on the robot for monitoring the surrounding environment, and hardware devices capable of obtaining depth images and point cloud pictures; the laser radar and these devices are connected to the NVIDIA TX2 control board for signal transmission. The automatic driving unit and the medicine distribution information management unit comprise an information entry system and an automatic driving control system; the information entry system interfaces with the doctors' diagnosis and prescription system, so a doctor can enter patient and prescription information directly when prescribing, and the driving track to each sickbed on a given floor can be entered into the automatic driving control system in advance.
Furthermore, the intelligent robot can take the form of a delivery vehicle whose wheels use a Mecanum wheel structure; it can also adopt platforms with other mechanical structures, such as a biped robot or an aircraft.
Furthermore, the bottom layer data processing control unit has two control modes of manual control and automatic control.
Furthermore, the speed control mode of the bottom-layer data processing control unit is closed-loop control, with the wheel rotating speed sensed through a speed feedback device; the main algorithm proceeds as follows:
(1) the lower control unit (STM32) receives the speed data V_S sent by the upper control unit;
(2) in the control board, the vehicle's speed information is converted into target PWM duty ratios for the four Mecanum wheels;
(3) the target value is input into a PWM setting function to drive the wheels, while the actual wheel rotating speed V_A is read through the built-in encoder;
(4) the target and actual rotating speeds are input into a PID algorithm to realize the closed-loop speed regulation function;
meanwhile, the real-time data value of the encoder is stored in the data frame, and after all data are collected, the data frame is sent to the upper control unit, so that the purpose of self state perception and the basis of later attitude control are achieved.
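The closed-loop regulation in steps (1) to (4) can be sketched in a few lines. This is an illustrative simulation only, not the patent's STM32 firmware: the `WheelPID` class, its gains, the simulated encoder readings, and the single-wheel scope are all assumptions.

```python
# Minimal sketch of the closed-loop speed regulation in steps (1)-(4) above.
# All names, gains, and readings are hypothetical, for illustration only.

class WheelPID:
    """PID regulator for one Mecanum wheel's speed."""
    def __init__(self, kp=0.8, ki=0.2, kd=0.05):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, v_target, v_actual):
        err = v_target - v_actual        # target speed V_S vs encoder reading V_A
        self.integral += err
        deriv = err - self.prev_err
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

pid = WheelPID()
duty = 0.0
for v_meas in [0.0, 0.3, 0.6, 0.8, 0.95]:    # simulated encoder readings, target 1.0
    duty += pid.update(1.0, v_meas)           # adjust PWM duty toward the target speed
```

Each iteration plays the role of one control cycle: the error shrinks as the measured speed approaches the target, and the accumulated duty settles the wheel at the commanded speed.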
Furthermore, when controlling the robot's direction, the bottom-layer data processing control unit fuses the attitude sensor and motor rotating-speed data, sensing its own direction through the motor speeds and, owing to limited sensor precision, correcting the zero offset of the attitude sensor in the yaw direction;
the rotating speeds of motors on the left side of the robot are respectively set as V1L、V2LThe rotating speed of the right motor is V3R、 V4RThe angle of yaw direction rotation measured by the attitude sensor is AYTherefore, the left wheel average speed is:
VL=(|V1L|+|V2Li)/2 (formula 1)
The same can be obtained
VR=(|V3R|+|V4RI)/2 (formula 2)
When the Mecanum wheel trolley is steered in situ, the rotating speeds of the four wheels are consistent according to the mechanical principle, and only the rotating direction of the motor needs to be changed, at the moment, the distribution trolley rotates clockwise or anticlockwise according to the central point of the trolley body;
it can be obtained that the steering linear velocity of the delivery vehicle is VT:
VT=VL-VR(formula 3)
When the vehicle body length is L, the angular speed of the rotation of the distribution vehicle is omegav:
By integration, the angle θ at which the dispensing vehicle is rotated at this time is obtained.
The angle theta is correctedσ:
θσ=(θ+AY) /2 (formula 5)
And the angle is used as final data and stored in a corresponding unit of a protocol, and after all data are acquired, the data frame is sent to an upper control unit so as to achieve the purposes of sensing the self state and improving the accuracy of the sensor.
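The heading-fusion computation above can be worked through numerically as below. This is a hedged sketch: the form ω_v = V_T / L for formula 4 and the single-step integration of ω_v are simplifying assumptions standing in for the patent's continuous integration, and all numeric values are invented for illustration.

```python
import math  # kept for readers extending the sketch with angle wrapping

def fused_heading(v1l, v2l, v3r, v4r, a_y, body_length, dt):
    """Fuse encoder-derived rotation with the attitude sensor's yaw reading.
    Speeds in m/s, angles in radians; one control step of duration dt."""
    v_l = (abs(v1l) + abs(v2l)) / 2.0    # formula 1: left-wheel average speed
    v_r = (abs(v3r) + abs(v4r)) / 2.0    # formula 2: right-wheel average speed
    v_t = v_l - v_r                      # formula 3: steering linear velocity
    omega = v_t / body_length            # formula 4 (assumed form)
    theta = omega * dt                   # one-step integration of omega
    return (theta + a_y) / 2.0           # formula 5: corrected angle theta_sigma

# Example: left wheels at 0.6 m/s, right at 0.2 m/s, IMU yaw 0.09 rad,
# body length 0.4 m, 40 ms step (all values hypothetical)
theta_sigma = fused_heading(0.6, 0.6, 0.2, 0.2, 0.09, 0.4, 0.04)
```

Averaging the encoder-derived θ with the attitude sensor's A_Y splits the difference between the two noisy measurements, which is the zero-offset correction the text describes.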
Furthermore, the laser radar can be a single-line or a multi-line lidar; the RPLIDAR S1 single-line lidar is the most preferred option.
Furthermore, the hardware equipment capable of obtaining the depth image and the point cloud picture comprises a binocular camera and two ordinary cameras, wherein the binocular camera adopts a ZED binocular camera.
Furthermore, the NVIDIA TX2 control board in the upper-layer data fusion and sensor data network integration control unit is a Jetson TX2 development board running Ubuntu 16.04 and the ROS Kinetic operating system; it is connected with a binocular camera and two ordinary cameras installed at the front of the intelligent robot, the binocular camera serving as the depth-testing camera;
the depth image of a scene is obtained by a depth testing camera through a binocular algorithm, the depth image is transmitted to an NVIDIA TX2 control panel for processing, the detection wide angle of the ZED camera is 110 degrees according to the visual angle parameters of the depth camera, 1080P resolution is used for the image, namely the image size is 3840 x 1080 (35 pixels are corresponding to one degree), and the condition of a front road barrier is obtained by perceiving the depth image in the scene; meanwhile, the laser radar continuously scans the surrounding environment (0 degree of the laser radar always points to the opposite direction of the forward direction of the vehicle body), and the position D of the intelligent robot relative to the left wall body is obtained through the values of 90 degrees and 270 degrees of the laser radarLAnd wall position D relative to the right sideRAnd the length of the vehicle body is h and the width is l, when the intelligent robot requires the intelligent robot to make judgment and reaction on an object in the front within 1.5 meters in the advancing process, the detection field angle alpha is obtained according to the width of the intelligent robot per se:
thus obtaining the required depth data obstacle avoidance range alpha Acquiring the depth data in the area to obtain the data of the depth camera, classifying the data according to the distance, and determining the distance d in the image1±dσCounting the points in the range, and storing the coordinates in an array Ad1Performing the following steps; the distance in the image is d2±dσCounting the points in the range, and storing the coordinates in an array Ad2…, and so on, a series of arrays can be obtained, and when the obstacle avoidance judgment is carried out, only the obstacle avoidance operation is carried out on the obstacle with the distance within the range of 1.5 m;
after the depth array is obtained, obstacle avoidance operation is carried out by combining laser radar data, and firstly, a depth array set { A ] is obtainedd1,Ad2,…,AdnAnd the corresponding angle (125 deg. for lidar for 0 deg. for depth camera view) and distance d1,d2,…,dnExplaining an obstacle avoidance process as follows:
when the obstacle appears in alpha, the set of points is A, and the mean value x of the abscissa in A is takenAIf xA>1980, judging the obstacle is on the right side of the relative medicine delivery vehicle, otherwise, judging the obstacle is on the left side. Let x beA<1980, according to the corresponding relation between the visual angle and the pixel point, the deviation angle alpha of the barrier relative to the positive advancing direction of the AGV can be obtainedA:
αA=(1980-xA) 35 (equation 2)
The service vehicle is regulated to carry out obstacle avoidance operation right first, and the deflection angle and the distance d at the moment are combinedAAs can be seen, the distance l that the AGV needs to travel to the rightA:
lA=2tanαA+dsafe(formula 3)
dsafeIs a set safe distance;
At this moment the laser radar data are collected and distance perception is carried out within the radar's detection angle range. According to the radar's angle data, it is judged whether an obstacle appears within the range l_A to the right of the AGV, with the following criterion: if every radar reading within the angle range is larger than the threshold l_laser corresponding to that angle, it can be determined that there is no obstacle on the right side of the AGV, and right-side avoidance is performed. When avoidance cannot be carried out on the right side, the radar senses the left side of the AGV in the same way (by bilateral symmetry, the derivation is not repeated), ensuring there is no obstacle on the left; the obstacle is thus moved out of the travel route by adjusting the intelligent robot's position.
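The side judgment and lateral-shift computation can be sketched as follows. The constants (35 pixels per degree, center abscissa 1980) come from the text, but the form l_A = d_A·tan α_A + d_safe is a reconstruction of the garbled formula and should be read as an assumption, as are the sample pixel coordinates and the default safety distance.

```python
import math

PX_PER_DEG = 35     # ~3840 px over 110 deg horizontal field of view
CENTER_X = 1980     # image abscissa the text treats as "straight ahead"

def avoidance_shift(xs, d_a, d_safe=0.3):
    """Given the pixel abscissas xs of an obstacle's points and its distance
    d_a (meters), return the side the obstacle is on, its deviation angle
    alpha_A (degrees), and the lateral distance l_A (meters) to move away.
    l_A = d_A * tan(alpha_A) + d_safe is an assumed reconstruction."""
    x_mean = sum(xs) / len(xs)
    side = "right" if x_mean > CENTER_X else "left"   # obstacle side vs vehicle
    alpha_a = abs(CENTER_X - x_mean) / PX_PER_DEG     # deviation angle, degrees
    l_a = d_a * math.tan(math.radians(alpha_a)) + d_safe
    return side, alpha_a, l_a

# Hypothetical obstacle slightly left of center, 1.2 m away
side, alpha_a, l_a = avoidance_shift([1800, 1810, 1820], d_a=1.2)
```

Before actually shifting by l_A, the lidar check described above would confirm the chosen side is clear.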
Further, when a plurality of obstacles appear within the deviation-angle range α_A, the nearer obstacle is avoided first and then the farther one.
Furthermore, a full-duplex data transmission mode is adopted between the bottom-layer data processing control unit and the upper-layer data fusion and sensor data network integration control unit, with data transmitted in both directions on the link; the STM32 main control board and the NVIDIA TX2 control board are connected via serial port and/or wirelessly, where wireless connection modes include WIFI, Bluetooth, and the like.
Further, the specific control steps of the automatic driving unit and the drug delivery information management unit are as follows:
Step one: initialize the bottom-layer sensors and establish a data channel with the upper layer;
Step two: enter the main information, load the medicine, and plan the delivery path;
Step three: the delivery vehicle navigates autonomously; while it travels, the system processes obstacle avoidance and doorplate scanning in parallel;
Step four: on arriving at a delivery point, broadcast a message informing the patient to take the medicine; after the patient enters his or her information and it is checked to be correct, the cabinet door opens; if no one takes the medicine, mark the point and, after waiting a set time, continue to the next delivery point;
Step five: after the first delivery round, if all medicine has been delivered, return to the starting point; otherwise perform a second path planning and continue delivering, with path planning performed no more than 3 times;
Step six: after the third delivery round, return to the starting point, record the situation in the system, and notify the medical staff of the delivery status for confirmation.
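The steps above amount to a bounded retry loop over delivery points. The sketch below illustrates only that control flow; the stop names, the `verify` callback standing in for the patient identity check, and the logging format are hypothetical.

```python
# Sketch of the delivery workflow in the steps above. The data model and the
# identity check are stand-ins; a real system would integrate the hospital
# information system.

def deliver_round(stops, verify, max_rounds=3):
    """Visit each stop; a stop succeeds when the patient's identity verifies.
    Undelivered stops are marked and retried, at most max_rounds in total."""
    pending = list(stops)
    log = []
    for round_no in range(1, max_rounds + 1):
        still_pending = []
        for stop in pending:                      # self-navigation between beds
            if verify(stop):                      # patient enters info; checked
                log.append((round_no, stop, "delivered"))
            else:                                 # nobody took the medicine:
                log.append((round_no, stop, "marked"))
                still_pending.append(stop)        # mark and retry next round
        pending = still_pending
        if not pending:
            break                                 # all delivered: return to start
    return log, pending                           # leftovers reported to staff

log, undelivered = deliver_round(["bed-301", "bed-302"],
                                 verify=lambda s: s == "bed-301")
```

After the loop, anything still pending is what step six reports to the medical staff for confirmation.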
(III) Advantageous effects
Compared with the prior art, the multifunctional intelligent robot medicine distribution system based on data fusion has the following beneficial effects:
(1) In this technical scheme, each data acquisition device and sensor connected to the bottom-layer data processing control unit and to the upper-layer data fusion and sensor data network integration control unit forms a node. By adopting a node-network design and establishing distributed nodes, the complexity of the system is reduced, and the independence between subsystems and the expansibility of the network are improved.
(2) In this technical scheme, advanced automatic driving and image processing technology is introduced through the automatic driving unit and the medicine distribution information management unit, completing highly repetitive, single-content tasks such as medicine distribution; on the basis of saving the hospital's manpower and material resources, it raises the hospital's degree of intelligence and unmanned operation, helps perfect the hospital medical system, and promotes hospital informatization. For doctors and nurses, dispensing through this device lightens the workload of medicine-delivery personnel, reduces the number of contacts between nurses and patients, lowers the probability of medical staff contracting infectious diseases, and safeguards their health and personal safety.
(3) The automatic driving technology introduced in this scheme is a new technology developed in recent years. Although automatic driving is difficult to realize outdoors, automatic driving in the fixed, simple scene of a hospital can achieve high accuracy through multi-sensor fusion, improving the hospital's working efficiency, saving a large amount of manpower and material resources, and raising the intelligence and efficiency of the hospital's management and service system.
(4) Without changing the existing environment, this technical scheme completes the safe delivery of medicine, greatly improving the hospital's working efficiency while realizing intelligent operation. It helps the hospital better track patients' medication information; through scientific, intelligent management, inpatients' experience is improved and hospital expenses are significantly reduced. By connecting the medicine delivery system to the hospital network, integrated and efficient hospital construction is facilitated; while guaranteeing treatment, the number of contacts between doctors and patients is reduced, transmission routes of infectious diseases are effectively cut off, and the life safety of medical staff is safeguarded. This is of great significance to the development of the medical profession.
Drawings
FIG. 1 is a schematic block diagram of the overall architecture of the system of the present invention.
FIG. 2 is a schematic block diagram of closed loop control regulation in the present invention.
Fig. 3 is a schematic diagram of the work flow of the underlying data processing control unit in the present invention.
Fig. 4 is a schematic diagram showing a specific work flow of the automatic driving unit and the drug delivery information managing unit in the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. The described embodiments are only some embodiments of the invention, not all embodiments. Various modifications and improvements of the technical solutions of the present invention may be made by those skilled in the art without departing from the design concept of the present invention, and all of them should fall into the protection scope of the present invention.
Example 1:
as shown in fig. 1 to 4, a multifunctional intelligent robot medicine distribution system based on data fusion includes: the system comprises a bottom layer data processing control unit, an upper layer data fusion and sensor data network integration control unit, an automatic driving unit and a medicine distribution information management unit.
1. Bottom layer data processing control unit:
the bottom layer data processing unit comprises a coding motor, an attitude sensor, a Mecanum wheel, an OLED display screen and an STM32 main control board on hardware, wherein the coding motor comprises a rotating part and an encoder part, and the speed of the vehicle is calculated through the encoder; the attitude sensor is used for sensing angle information of the trolley; the OLED display screen displays necessary data, so that debugging and error diagnosis are facilitated; the STM32 is the underlying control system that performs tasks including information collection, data processing, data encapsulation, data transmission, and receiving control information.
Firstly, the STM32 main control board in the bottom layer data processing unit completes initialization of devices such as the attitude sensor and the encoder interface. For convenient debugging, the system offers two modes: automatic driving and manual driving;
the speed control mode of the bottom data processing control unit is closed-loop control, and the rotating speed of the wheel is sensed through a speed feedback device; the main algorithm implementation process is as follows:
(1) The lower control unit STM32 receives the speed data V_s sent by the upper control unit;
(2) the control board converts the speed information of the vehicle into target PWM duty ratios of the four Mecanum wheels;
(3) the target value is input into a PWM setting function to drive the wheels, while the wheel rotating speed V_A is read through the built-in encoder;
(4) the target rotating speed and the actual rotating speed are input into a PID algorithm to realize closed-loop speed regulation;
meanwhile, the real-time data value of the encoder is stored in the data frame, and after all data are collected, the data frame is sent to the upper control unit, so that the purpose of self state perception and the basis of later attitude control are achieved.
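The closed-loop regulation in steps (1)-(4) can be sketched as follows. This is a minimal Python sketch under stated assumptions: the gains, the normalized speed range, and the names `SpeedPID` and `speed_to_duty` are illustrative and not specified in the source.

```python
class SpeedPID:
    """Incremental PID regulator for one wheel (step 4): the target and
    measured rotating speeds are compared and a correction is produced."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, target_rpm, measured_rpm):
        err = target_rpm - measured_rpm
        self.integral += err                 # accumulated error (I term)
        derivative = err - self.prev_err     # error change (D term)
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * derivative


def speed_to_duty(v_target, v_max=1.0, duty_max=100.0):
    """Map a commanded wheel speed onto a target PWM duty ratio (step 2),
    clamped to the drivable range; scaling factors are assumptions."""
    return max(-duty_max, min(duty_max, v_target / v_max * duty_max))
```

Each control tick, the duty ratio from `speed_to_duty` would be adjusted by `SpeedPID.step` before being written to the PWM setting function.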
When controlling the direction of the robot, the bottom layer data processing control unit fuses the attitude sensor data with the motor rotating speed data: because of limited sensor precision, the robot senses its own direction through the motor rotating speeds and corrects the zero offset of the attitude sensor in the yaw direction;
Let the rotating speeds of the motors on the left side of the robot be V_1L and V_2L, the rotating speeds of the right motors be V_3R and V_4R, and the yaw-direction rotation angle measured by the attitude sensor be A_Y. The left wheel average speed is therefore:
V_L = (|V_1L| + |V_2L|) / 2 (formula 1)
Similarly,
V_R = (|V_3R| + |V_4R|) / 2 (formula 2)
When the Mecanum wheel trolley steers in situ, by the mechanical principle the rotating speeds of the four wheels are consistent and only the rotating direction of the motors needs to be changed; the delivery trolley then rotates clockwise or anticlockwise about the center point of the trolley body.
It follows that the steering linear velocity of the delivery vehicle is V_T:
V_T = V_L - V_R (formula 3)
When the vehicle body length is L, the angular velocity of rotation of the delivery vehicle is ω_v:
ω_v = V_T / L (formula 4)
By integration of ω_v over time, the angle θ through which the delivery vehicle has rotated is obtained.
The angle θ is corrected to θ_σ:
θ_σ = (θ + A_Y) / 2 (formula 5)
This angle is stored as final data in the corresponding unit of the protocol, and after all data are acquired, the data frame is sent to the upper control unit, achieving both self-state perception and improved sensor accuracy.
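The encoder/attitude-sensor fusion of formulas 1-5 can be collected into one routine. This Python sketch assumes integration over a single time step `dt` (the source says only that the angle is obtained "by integration") and takes the angular velocity as ω_v = V_T / L from the steering linear velocity and the body length L.

```python
def fused_yaw(v1l, v2l, v3r, v4r, a_y, body_len, dt):
    """Fuse the encoder-derived rotation angle with the attitude-sensor
    yaw A_Y, following formulas 1-5 of the text."""
    v_l = (abs(v1l) + abs(v2l)) / 2   # formula 1: left wheel average speed
    v_r = (abs(v3r) + abs(v4r)) / 2   # formula 2: right wheel average speed
    v_t = v_l - v_r                   # formula 3: steering linear velocity
    omega = v_t / body_len            # angular velocity (assumed V_T / L)
    theta = omega * dt                # one-step integration for the angle
    return (theta + a_y) / 2          # formula 5: corrected angle theta_sigma
```

In a running system this would be called once per control cycle, accumulating θ across cycles rather than over a single `dt`.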
In the manual driving part, the bottom layer receives data over Bluetooth: commands from a handle or a terminal arrive at the Bluetooth serial port and are parsed according to the set protocol format.
Then, after sensor initialization is completed, the raw data obtained by the attitude sensor are processed and converted into Euler angles through quaternions; meanwhile, the encoder begins to read the wheel rotating speed, which is split into magnitude and direction and transmitted separately.
The data are then packaged into a frame in the order: frame header, the speeds and directions read by the four motor encoders, the Euler angles, and the quaternion, and sent back to the upper control board through the serial port.
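The frame encapsulation order just described can be sketched with Python's `struct` module. The header byte value, the field widths (floats for speeds and angles, signed bytes for directions) and the little-endian layout are illustrative assumptions; the source fixes only the field order.

```python
import struct

FRAME_HEADER = 0xAA  # assumed header byte; not specified by the source


def pack_frame(speeds, directions, euler, quat):
    """Pack one telemetry frame in the stated order:
    header, four encoder speeds, four directions, Euler angles, quaternion."""
    assert len(speeds) == 4 and len(directions) == 4
    assert len(euler) == 3 and len(quat) == 4
    # '<' = little-endian; B = header byte, 4f = speeds,
    # 4b = direction signs, 3f = Euler angles, 4f = quaternion
    return struct.pack('<B4f4b3f4f',
                       FRAME_HEADER, *speeds, *directions, *euler, *quat)
```

The upper control board would unpack with the mirror format string `'<B4f4b3f4f'` after validating the header byte.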
And finally, the NVIDIA TX2 control board in the upper layer data fusion and sensor data network integration control unit processes the data, and then the driving data is fed back to the STM32 main control board in the bottom layer data processing control unit, so that the automatic driving of the delivery vehicle is completed.
During driving, the STM32 main control board in the bottom data processing control unit receives, in real time through the serial port, the speed control data sent by the NVIDIA TX2 control board in the upper data fusion and sensor data network integration control unit. The data are converted into four motor rotating speeds, and the speed is strictly controlled through the closed-loop system. While the speed is controlled, the attitude sensor also controls the direction of the delivery vehicle through its own data; the vehicle attitude is thus governed by the combined action of the encoders and the attitude sensor, and self errors are corrected. Meanwhile, to facilitate debugging and data observation, parameters are displayed in real time on the OLED screen throughout the process.
2. The upper layer data fusion and sensor data network integration control unit:
The upper layer data fusion and sensor data network integration control unit comprises, in hardware, an NVIDIA TX2 control board, an RPLIDAR S1 laser radar, a ZED binocular camera and two ordinary cameras. In this system, a distributed network is established by creating sensor nodes; data relationships are established among the nodes through corresponding topics, and nodes can be freely added to and deleted from the network, which simplifies the complexity of the system and improves its expandability and tailorability.
During driving, the NVIDIA TX2, serving as the upper main control board, establishes the necessary nodes for the sensors. It first starts the binocular camera to acquire and process data, creates corresponding nodes and topics for data such as the raw left and right camera images, the synthesized depth-of-field image and the point cloud image, and adds them to the system network. The NVIDIA TX2 control board in the upper data fusion and sensor data network integration control unit is a Jetson TX2 development board running Ubuntu 16.04 and ROS Kinetic; it is connected with the binocular camera and the two ordinary cameras mounted at the front of the intelligent robot, the binocular camera being the depth-testing camera;
The depth image of a scene is obtained by the depth-testing camera through a binocular algorithm and transmitted to the NVIDIA TX2 control board for processing. According to the visual angle parameters of the depth camera, the detection wide angle of the ZED camera is 110 degrees, and the image uses 1080P resolution, i.e. an image size of 3840 x 1080 (35 pixels correspond to one degree); the condition of obstacles on the road ahead is obtained by perceiving the depth image of the scene. At the same time, the laser radar continuously scans the surrounding environment (0 degrees of the laser radar always points opposite to the forward direction of the vehicle body), and the values at 90 degrees and 270 degrees of the laser radar give the position D_L of the intelligent robot relative to the left wall and the position D_R relative to the right wall. With the vehicle body length h and width l, and the requirement that the intelligent robot judge and react to objects within 1.5 meters ahead while advancing, the detection field angle α is obtained from the robot's own width.
This yields the required depth-data obstacle avoidance range α. The depth data within this area are acquired from the depth camera and classified by distance: the points whose distance in the image lies within d_1 ± d_σ are counted and their coordinates stored in an array A_d1; the points whose distance lies within d_2 ± d_σ are counted and stored in an array A_d2; and so on, yielding a series of arrays. In the obstacle avoidance judgment, avoidance is performed only for obstacles within the 1.5 m range;
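The classification of depth points into the arrays A_d1, A_d2, … can be sketched as a binning routine. This Python sketch uses hypothetical data structures (a list of `(x, y, depth)` tuples and a list of bucket center distances) that the source does not specify.

```python
def bin_depth_points(points, bin_centers, d_sigma, max_range=1.5):
    """Group depth-image points into distance buckets d_i ± d_sigma,
    keeping only buckets inside the 1.5 m reaction range.

    points: iterable of (x, y, depth) tuples from the depth image.
    Returns a dict mapping each bucket center d_i to its list of (x, y)
    coordinates, i.e. the arrays A_d1, A_d2, ... of the text."""
    arrays = {d: [] for d in bin_centers if d <= max_range}
    for x, y, depth in points:
        for d in arrays:
            if abs(depth - d) <= d_sigma:
                arrays[d].append((x, y))
                break   # each point is counted in at most one array
    return arrays
```

The avoidance logic would then inspect only the non-empty arrays whose center distance lies within 1.5 m.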
After the depth arrays are obtained, the obstacle avoidance operation is carried out in combination with the laser radar data. Given the depth array set {A_d1, A_d2, …, A_dn}, the corresponding angles (125 degrees of the laser radar corresponds to 0 degrees of the depth camera view) and distances d_1, d_2, …, d_n, the obstacle avoidance process is as follows:
When an obstacle appears within α, let the set of its points be A and the mean of the abscissas in A be x_A. If x_A > 1980, the obstacle is judged to be on the right side relative to the medicine delivery vehicle; otherwise it appears on the left. Let x_A < 1980; from the correspondence between the visual angle and the pixels, the deviation angle α_A of the obstacle relative to the forward direction of the AGV is obtained:
α_A = (1980 - x_A) / 35 (formula 2)
It is specified that the delivery vehicle first avoids to the right. Combining the deviation angle with the distance d_A at this moment, the distance l_A that the AGV needs to travel to the right is:
l_A = 2 tan α_A + d_safe (formula 3)
where d_safe is a set safe distance.
At this moment, the laser radar data are collected, distance perception is carried out within the corresponding angle range of the laser radar, and it is judged from the laser radar angle data whether an obstacle appears within l_A to the right of the AGV, according to the following criterion:
If every laser radar reading within the angle range is larger than the l_laser corresponding to that angle, it is determined that there is no obstacle on the right side of the AGV, and right-side avoidance is performed. When avoidance cannot be carried out on the right (by bilateral symmetry, the derivation is not repeated), the laser radar senses the left side of the AGV to ensure there is no obstacle there, and the obstacle is moved out of the traveling route by adjusting the position of the intelligent robot. When several obstacles appear within the deviation angle range α_A, the nearer obstacle is avoided first and then the farther one.
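The side decision and lateral offset can be sketched as below, reading formula 2 as α_A = (1980 − x_A)/35 (consistent with 35 pixels per degree) and transcribing formula 3 literally as l_A = 2·tan(α_A) + d_safe. The constant names `PIXELS_PER_DEGREE` and `IMAGE_CENTER_X` are illustrative assumptions.

```python
import math

PIXELS_PER_DEGREE = 35   # 3840-pixel width over a 110-degree field of view
IMAGE_CENTER_X = 1980    # pixel threshold used in the text


def avoidance_plan(x_mean, d_safe):
    """From the mean obstacle abscissa x_A, return the obstacle side,
    the deviation angle alpha_A in degrees (formula 2), and the lateral
    travel l_A (formula 3, transcribed as written in the source)."""
    side = 'right' if x_mean > IMAGE_CENTER_X else 'left'
    alpha_deg = abs(IMAGE_CENTER_X - x_mean) / PIXELS_PER_DEGREE
    l_a = 2 * math.tan(math.radians(alpha_deg)) + d_safe
    return side, alpha_deg, l_a
```

Before committing to the offset, the planner would still check the lidar readings within the corresponding angle range against l_laser, as the text describes.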
The upper layer data fusion and sensor data network integration control unit can simultaneously subscribe to the related camera and laser radar nodes, and once data acquisition is complete, obstacles along the trolley's route are avoided through the data processing algorithm.
3. An automatic driving unit and a drug delivery information management unit:
The medicine distribution management system mainly stores the necessary medicine information and the medical orders of the medical staff, and uniformly manages and stores information such as the quantity and dosage of the entered medicines. Meanwhile, in the background, the attending doctor can monitor and follow the patient's condition in real time through the medication information, which helps the doctor master the patient's condition.
The automatic driving system is the core of the system. Before first use, the medical staff manually remote-control the delivery vehicle while the laser radar performs accurate two-dimensional modeling of the delivery vehicle's working environment and stores the model in the system; the established map is used for position initialization and traveling-position estimation before each delivery.
Before delivering medicine, the medical staff must first classify the medicines and enter their information. Following the normal operation flow, the staff enter the patient number, medication instructions, dosage and other necessary medical orders into the delivery system; the delivery information management system then saves the information and opens the cabinet door, so the staff only need to put the medicine into the designated cabinet to complete one operation. After all operations are completed, the staff start the delivery vehicle and the medicine delivery begins. The delivery flow of the system is as follows:
First, after the system is started, the patients' wards are determined from the patient numbers entered into the distribution management system, and the order in which wards receive medicine is determined by the sequence of room numbers.
Then the delivery vehicle starts to work and navigates automatically along the specified route. During automatic driving, the STM32 main control board in the bottom layer data processing control unit reads the values of all sensors and feeds them back in real time to the NVIDIA TX2 control board in the upper layer data fusion and sensor data network integration control unit, which combines the laser radar and depth camera data. The speed and direction of the vehicle are decided by the algorithm so that the delivery vehicle always travels in the middle of the corridor, while the depth camera continuously monitors the road ahead and reacts to obstacles at any time.
Then the cameras on the left and right sides identify the door numbers to determine whether they match the delivery information. If not, the delivery vehicle continues forward; if so, it stops beside the room and begins broadcasting the patient number to notify the patient or family members to take the medicine (in large hospitals, the system can be connected to the ward's broadcasting system for this notification). After being notified, the patient enters his or her information; once the system verifies the information is correct, the cabinet door opens, and the medical advice is shown on the display screen for the patient to check. If the patient is not in the ward, the delivery vehicle waits for a set time and then continues with the next delivery task. After all tasks are completed, any previously undelivered medicine is delivered again; if a medicine remains undelivered after 3 attempts, the situation is recorded for the reference of the medical staff.
Finally, after all medicines are delivered, the delivery vehicle returns to the starting point, collates the delivery information and uploads it to the hospital system, and communicates the delivery status to the medical staff, who can query it as needed through the hospital's existing nurse station or the voice device at the head of the sickbed. This completes one medicine dispensing process.
A multifunctional intelligent robot medicine distribution system based on data fusion, built on automatic driving and sensor fusion technology, has the advantages of high efficiency and simple operation, and is of real significance to modern medical care. The system improves the medical information network so that doctors can better understand patients' physical conditions, further raises treatment efficiency through the medication information, and saves treatment time. For hospitals, the system not only saves a large amount of medical resources but also improves the hospital's treatment system and raises its degree of intelligence and informatization. For doctors and nurses, it reduces face-to-face contact with patients without reducing the medical effect, safeguarding their personal safety; for patients with infectious diseases, it also effectively reduces the probability of infection among medical staff and protects their health. Through later upgrades, the system can be applied not only to medicine distribution but also to tasks such as guiding patients through their visits, giving it a wide field of application.
Claims (9)
1. The utility model provides a multi-functional intelligent robot medicine delivery system based on data fusion which characterized in that: the method comprises the following steps: the system comprises a bottom data processing control unit, an upper data fusion and sensor data network integration control unit, an automatic driving unit and a medicine distribution information management unit;
the hardware of the bottom layer data processing control unit comprises an STM32 main control board used for collecting data and sending control signals, an attitude sensor connected with the STM32 main control board and used for observing and controlling the state of the intelligent robot, and a coding motor used for controlling the intelligent robot to run, wherein the coding motor is connected with a Mecanum wheel; a Mecanum wheel control system for controlling a Mecanum wheel and an encoder motor closed-loop control system for controlling an encoding motor to operate are implanted in the STM32 main control board;
the upper layer data fusion and sensor data network integration control unit comprises an NVIDIA TX2 control panel in signal connection with an STM32 main control panel, a laser radar used for monitoring the surrounding environment and mounted on the intelligent robot, and hardware equipment capable of obtaining a depth image and a cloud picture, wherein the laser radar and the hardware equipment capable of obtaining the depth image and the cloud picture are connected with the NVIDIA TX2 control panel for signal transmission;
the automatic driving unit and the medicine distribution information management unit comprise an information entry system and an automatic driving control system; the information entry system interfaces with the doctor's diagnosis and prescription system, so that a doctor can directly enter patient information and prescription information when prescribing medicine, and the driving track to each sickbed on a given floor can be entered into the automatic driving control system in advance.
2. The multifunctional intelligent robotic drug delivery system based on data fusion as claimed in claim 1, wherein: the speed control mode of the bottom data processing control unit is closed-loop control, and the rotating speed of the wheels is sensed through a speed feedback device; the main implementation mode is as follows:
(1) The lower control unit STM32 receives the speed data V_S sent by the upper control unit;
(2) the control board converts the speed information of the vehicle into target PWM duty ratios of the four Mecanum wheels;
(3) the target value is input into a PWM setting function to drive the wheels, while the wheel rotating speed V_A is read through the built-in encoder;
(4) the target rotating speed and the actual rotating speed are input into a PID algorithm to realize closed-loop speed regulation; meanwhile, the real-time encoder data are stored in the data frame, and after all data are collected, the data frame is sent to the upper control unit, providing self-state perception and the basis for later attitude control.
3. The multifunctional intelligent robotic drug delivery system based on data fusion as claimed in claim 1, wherein: when controlling the direction of the robot, the bottom layer data processing control unit fuses the attitude sensor data with the motor rotating speed data: because of limited sensor precision, the robot senses its own direction through the motor rotating speeds and corrects the zero offset of the attitude sensor in the yaw direction;
let the rotating speeds of the motors on the left side of the robot be V_1L and V_2L, the rotating speeds of the right motors be V_3R and V_4R, and the yaw-direction rotation angle measured by the attitude sensor be A_Y; the left wheel average speed is therefore:
V_L = (|V_1L| + |V_2L|) / 2 (formula 1)
similarly,
V_R = (|V_3R| + |V_4R|) / 2 (formula 2)
when the Mecanum wheel trolley steers in situ, by the mechanical principle the rotating speeds of the four wheels are consistent and only the rotating direction of the motors needs to be changed; the delivery trolley then rotates clockwise or anticlockwise about the center point of the trolley body;
it follows that the steering linear velocity of the delivery vehicle is V_T:
V_T = V_L - V_R (formula 3)
when the vehicle body length is L, the angular velocity of rotation of the delivery vehicle is ω_v:
ω_v = V_T / L (formula 4)
by integration of ω_v over time, the angle θ through which the delivery vehicle has rotated is obtained;
the angle θ is corrected to θ_σ:
θ_σ = (θ + A_Y) / 2 (formula 5)
This angle is stored as final data in the corresponding unit of the protocol, and after all data are acquired, the data frame is sent to the upper control unit, achieving both self-state perception and improved sensor accuracy.
4. The multifunctional intelligent robotic drug delivery system based on data fusion as claimed in claim 1, wherein: the laser radar can be a single-line laser radar or a multi-line laser radar, an RPLIDAR S1 single-line laser radar being most preferred.
5. The multifunctional intelligent robotic drug delivery system based on data fusion as claimed in claim 1, wherein: the hardware equipment capable of obtaining the depth image and the point cloud picture comprises a binocular camera and two ordinary cameras, wherein the binocular camera adopts a ZED binocular camera.
6. The multifunctional intelligent robotic drug delivery system based on data fusion as claimed in claim 5, wherein: the NVIDIA TX2 control board in the upper data fusion and sensor data network integration control unit is a Jetson TX2 development board running Ubuntu 16.04 and ROS Kinetic; it is connected with the binocular camera and the two ordinary cameras mounted at the front of the intelligent robot, the binocular camera being the depth-testing camera;
the depth image of a scene is obtained by the depth-testing camera through a binocular algorithm and transmitted to the NVIDIA TX2 control board for processing; according to the visual angle parameters of the depth camera, the detection wide angle of the ZED camera is 110 degrees, and the image uses 1080P resolution, i.e. an image size of 3840 x 1080 (35 pixels correspond to one degree); the condition of obstacles on the road ahead is obtained by perceiving the depth image of the scene; meanwhile, the laser radar continuously scans the surrounding environment (0 degrees of the laser radar always points opposite to the forward direction of the vehicle body), and the values at 90 degrees and 270 degrees of the laser radar give the position D_L of the intelligent robot relative to the left wall and the position D_R relative to the right wall; with the vehicle body length h and width l, and the requirement that the intelligent robot judge and react to objects within 1.5 meters ahead while advancing, the detection field angle α is obtained from the robot's own width;
this yields the required depth-data obstacle avoidance range α; the depth data within this area are acquired from the depth camera and classified by distance: the points whose distance in the image lies within d_1 ± d_σ are counted and their coordinates stored in an array A_d1; the points whose distance lies within d_2 ± d_σ are counted and stored in an array A_d2; and so on, yielding a series of arrays; in the obstacle avoidance judgment, avoidance is performed only for obstacles within the 1.5 m range;
after the depth arrays are obtained, the obstacle avoidance operation is carried out in combination with the laser radar data; given the depth array set {A_d1, A_d2, ..., A_dn}, the corresponding angles (125 degrees of the laser radar corresponds to 0 degrees of the depth camera view) and distances d_1, d_2, ..., d_n, the obstacle avoidance process is as follows:
when an obstacle appears within α, let the set of its points be A and the mean of the abscissas in A be x_A; if x_A > 1980, the obstacle is judged to be on the right side relative to the medicine delivery vehicle, otherwise it appears on the left; let x_A < 1980; from the correspondence between the visual angle and the pixels, the deviation angle α_A of the obstacle relative to the forward direction of the AGV is obtained:
α_A = (1980 - x_A) / 35 (formula 2)
it is specified that the delivery vehicle first avoids to the right; combining the deviation angle with the distance d_A at this moment, the distance l_A that the AGV needs to travel to the right is:
l_A = 2 tan α_A + d_safe (formula 3)
where d_safe is a set safe distance;
at this moment, the laser radar data are collected, distance perception is carried out within the corresponding angle range of the laser radar, and it is judged from the laser radar angle data whether an obstacle appears within l_A to the right of the AGV, according to the following criterion:
if every laser radar reading within the angle range is larger than the l_laser corresponding to that angle, it is determined that there is no obstacle on the right side of the AGV, and right-side avoidance is performed; when avoidance cannot be carried out on the right (by bilateral symmetry, the derivation is not repeated), the laser radar senses the left side of the AGV to ensure there is no obstacle there, and the obstacle is moved out of the traveling route by adjusting the position of the intelligent robot.
7. The multifunctional intelligent robotic drug delivery system based on data fusion as claimed in claim 1, wherein: when in the declination angle range alphaAWhen a plurality of obstacles appear in the device, the obstacle avoidance is carried out on the nearer obstacle and then on the farther obstacle.
8. The multifunctional intelligent robotic drug delivery system based on data fusion as claimed in claim 1, wherein: the bottom data processing control unit and the upper data fusion and sensor data network integration control unit adopt a full-duplex data transmission mode, with data transmitted in both directions over the link; the STM32 main control board and the NVIDIA TX2 control board are connected through a serial port and/or wirelessly, the wireless connection modes including WiFi, Bluetooth and other wireless modes.
9. The multifunctional intelligent robotic drug delivery system based on data fusion as claimed in claim 1, wherein: the specific control steps of the automatic driving unit and the medicine distribution information management unit are as follows:
the method comprises the following steps: initializing a bottom sensor and establishing a data channel with an upper layer;
step two: the main information is entered, the medicine is put into the cabinet, and the distribution path is planned;
step three: the distribution vehicle is self-navigated, and barrier avoidance actions and doorplate scanning are processed in parallel by the system while the distribution vehicle travels;
step four: after arriving at a medicine delivery point, the information is broadcast and the patient is notified to take the medicine; the cabinet door opens after the patient enters his or her information and it is verified to be correct; if no one takes the medicine, the point is marked, and after waiting for a set time the vehicle continues to the next delivery point;
step five: after the first delivery round is finished, the vehicle returns to the starting point if all medicine has been delivered; otherwise a second path planning is performed and delivery continues, the number of path plannings not exceeding 3;
step six: if medicine remains undelivered after three rounds, the vehicle returns to the starting point, records the situation in the system, and notifies the medical staff of the delivery status, which the medical staff then confirm.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110128624.1A CN112925325B (en) | 2021-01-29 | 2021-01-29 | Multifunctional intelligent robot medicine distribution system based on data fusion |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112925325A true CN112925325A (en) | 2021-06-08 |
CN112925325B CN112925325B (en) | 2022-01-11 |
Family
ID=76168760
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112925325B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060026205A1 (en) * | 2004-07-30 | 2006-02-02 | Butterfield Robert D | System and method for managing medical databases for patient care devices |
CN106708058A (en) * | 2017-02-16 | 2017-05-24 | 上海大学 | Robot object conveying method and control system based on ROS (Robot Operating System) |
CN107608525A (en) * | 2017-10-25 | 2018-01-19 | 河北工业大学 | VR interacts mobile platform system |
CN111413963A (en) * | 2020-02-20 | 2020-07-14 | 上海交通大学 | Multifunctional robot autonomous distribution method and system |
CN111590582A (en) * | 2020-05-27 | 2020-08-28 | 华南理工大学 | Intelligent medical service robot capable of realizing biofeedback and remote prospect |
CN112207838A (en) * | 2020-09-04 | 2021-01-12 | 青岛通产智能科技股份有限公司 | Mobile intelligent inspection medicine delivery robot |
Similar Documents
Publication | Title |
---|---|
CN109966064B (en) | Wheelchair with detection device and integrated with brain control and automatic driving and control method | |
EP3538967B1 (en) | Method and system for operating an automatically moving robot | |
EP2214111A2 (en) | Medical tele-robotic system | |
CN101556629B (en) | Chronic disease health management system for household | |
CN110658813A (en) | Intelligent medical material supply robot based on Internet of things and SLAM technology | |
Tasaki et al. | Prototype design of medical round supporting robot “Terapio” | |
CN207457833U (en) | A kind of obstruction-avoiding control system of robot | |
CN108245122B (en) | Magnetic guiding type capsule endoscope system and track planning method | |
CN207326997U (en) | Robot and robot service management system | |
CN108801269A (en) | A kind of interior cloud Algorithms of Robots Navigation System and method | |
CN108873914A (en) | A kind of robot autonomous navigation system and method based on depth image data | |
CN215082705U (en) | Novel disinfection epidemic prevention robot | |
CN101251756A (en) | All fours type bionic robot control device | |
CN111590582A (en) | Intelligent medical service robot capable of realizing biofeedback and remote prospect | |
CN112925325B (en) | Multifunctional intelligent robot medicine distribution system based on data fusion | |
Miao et al. | A construction method of lower limb rehabilitation robot with remote control system | |
CN113001553B (en) | Intelligent inspection robot | |
Juneja et al. | A comparative study of slam algorithms for indoor navigation of autonomous wheelchairs | |
Guan et al. | Study of a 6DOF robot assisted ultrasound scanning system and its simulated control handle | |
Thinh et al. | Telemedicine mobile robot-robots to assist in remote medical | |
CN205729299U (en) | The Pose Control system of capsule endoscope and capsule endoscope | |
KR100991853B1 (en) | Robot System For Remote Diagnosing And Treating And the Method Controlling Robot Part | |
CN115237113B (en) | Robot navigation method, robot system and storage medium | |
CN116100565A (en) | Immersive real-time remote operation platform based on exoskeleton robot | |
CN116270047A (en) | Electric wheelchair for realizing intelligent movement and remote health monitoring and control system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||