CN107307851A - Intelligent robot system - Google Patents
- Publication number
- CN107307851A CN107307851A CN201610270435.7A CN201610270435A CN107307851A CN 107307851 A CN107307851 A CN 107307851A CN 201610270435 A CN201610270435 A CN 201610270435A CN 107307851 A CN107307851 A CN 107307851A
- Authority
- CN
- China
- Prior art keywords
- unit
- intelligent robot
- control processor
- physiological data
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/01—Measuring temperature of body parts ; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
- A61B5/02055—Simultaneously evaluating both cardiovascular condition and temperature
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/021—Measuring pressure in heart or blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N5/00—Radiation therapy
- A61N5/06—Radiation therapy using light
- A61N5/067—Radiation therapy using light using laser light
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Cardiology (AREA)
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Pathology (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Biophysics (AREA)
- Surgery (AREA)
- Molecular Biology (AREA)
- Medical Informatics (AREA)
- Heart & Thoracic Surgery (AREA)
- Physiology (AREA)
- Pulmonology (AREA)
- Optics & Photonics (AREA)
- Vascular Medicine (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Manipulator (AREA)
Abstract
The present invention provides an intelligent robot system comprising an intelligent robot, a wearable physiological data acquisition device, a laser therapy device and a cloud server. The intelligent robot includes a control processor, an image acquisition unit and a display unit; the cloud server includes a storage unit, a comparison unit, a receiving unit and a sending unit. The wearable physiological data acquisition device collects human physiological data and sends it to the comparison unit, and the comparison unit sends the compared physiological data to the display unit through the sending unit. The image acquisition unit captures a face image and sends it to the comparison unit, which associates the compared physiological data with the owner according to the comparison result. The image acquisition unit captures a limb-action image and sends it to the comparison unit, and the comparison unit controls the intelligent robot to perform the turn-on or turn-off action corresponding to the recognized action instruction. The image acquisition unit captures an image of the target human body and sends it to the control processor, which processes the image and drives the intelligent robot to approach the target human body.
Description
Technical Field
The invention relates to the field of robots, in particular to an intelligent robot system.
Background
China's population is aging rapidly, the number of empty-nest and solitary elderly people grows day by day, and caring for the elderly has become a social problem. In addition, elderly people with hypertension, hyperlipidemia or diabetes often suffer from cardiovascular and cerebrovascular diseases such as heart disease, cerebral thrombosis, myocardial infarction, cerebral infarction and stroke; if an empty-nest elderly person cannot be rescued in time when such a disease strikes, the consequences can be severe. How to design a robot that uses cloud technology for identity authentication and physiological data interaction, automatically follows the user to monitor physiological data in real time, and provides initial treatment is therefore a problem to be solved.
Disclosure of Invention
To address the defects of the prior art, the invention provides an intelligent robot system that uses cloud technology for identity authentication and physiological data interaction, and that can automatically follow the user and monitor the user's physiological data in real time so that the user can know his or her physical condition at any time; in addition, the laser therapy device can be used for initial treatment.
The technical problem to be solved by the invention is realized by the following technical scheme:
the invention provides an intelligent robot system comprising an intelligent robot, a wearable physiological data acquisition device, a laser treatment device and a cloud server. The wearable physiological data acquisition device is connected to the intelligent robot through wireless communication, the intelligent robot is connected to the cloud server through wireless communication, and the laser treatment device is arranged on the intelligent robot. The intelligent robot includes a control processor, an image acquisition unit and a display unit; the image acquisition unit and the display unit are each electrically connected to the control processor. The cloud server includes a storage unit, a comparison unit, a receiving unit and a sending unit. After acquiring human physiological data, the wearable physiological data acquisition device transmits the data to the receiving unit through wireless communication; the receiving unit sends the acquired data to the comparison unit; the comparison unit reads the standard human physiological data preset in the storage unit and compares them with the acquired data; the compared physiological data are sent to the control processor through the sending unit; and the control processor sends the compared physiological data to the display unit.
The image acquisition unit captures a face image and sends it to the receiving unit through wireless communication; the receiving unit sends the image to the comparison unit; the comparison unit compares it with the face images prestored in the storage unit and, according to the comparison result, associates the compared physiological data with the owner. The image acquisition unit captures a limb-action image and sends it to the receiving unit through wireless communication; the receiving unit forwards it to the comparison unit; the comparison unit compares the extracted action instruction with the action-instruction templates stored in the storage unit and sends the instruction with the highest similarity to the control processor; and the control processor controls the intelligent robot to perform the turn-on or turn-off action corresponding to that instruction. The image acquisition unit captures a depth image and a color image of the target human body and sends them to the control processor; the control processor processes the images to obtain the spatial position of the target human body and drives the intelligent robot to approach the target according to that position data.
Preferably, the intelligent robot further comprises a driving unit and a walking unit; the walking unit includes two driving wheels and a driven wheel; the control processor drives the driving unit to control the driving wheels to walk, and the driven wheel is located on the axis of symmetry between the two driving wheels; the driven wheel is a universal wheel.
Preferably, the intelligent robot further comprises an obstacle avoidance device electrically connected with the control processor; according to the sensing signal sent back by the obstacle avoidance device, the control processor controls the walking unit through the driving unit so that the intelligent robot changes its travelling route; the obstacle avoidance device comprises an ultrasonic sensor or an infrared sensor.
Preferably, the wearable physiological data acquisition device is one or a combination of a body temperature acquisition device, a blood pressure acquisition device, a heart rate acquisition device and a blood oxygen acquisition device.
Preferably, the intelligent robot further comprises an alarm unit, and the alarm unit is connected with the control processor.
The present invention also provides an intelligent robot system comprising an intelligent robot, a wearable physiological data acquisition device, a laser treatment device and a cloud server. The wearable physiological data acquisition device is connected to the intelligent robot through wireless communication, the intelligent robot is connected to the cloud server through wireless communication, and the laser treatment device is arranged on the intelligent robot. The intelligent robot includes a control processor, an image acquisition unit, a display unit and a microphone; the image acquisition unit and the display unit are each electrically connected to the control processor. The cloud server includes a storage unit, a comparison unit, a receiving unit and a sending unit. After acquiring human physiological data, the wearable physiological data acquisition device transmits the data to the receiving unit through wireless communication; the receiving unit sends the acquired data to the comparison unit; the comparison unit reads the standard human physiological data preset in the storage unit and compares them with the acquired data; the compared physiological data are sent to the control processor through the sending unit; and the control processor sends the compared physiological data to the display unit. The image acquisition unit captures a face image and sends it to the receiving unit through wireless communication; the receiving unit sends the image to the comparison unit; the comparison unit compares it with the face images prestored in the storage unit and, according to the comparison result, associates the compared physiological data with the owner. The microphone captures a voice instruction and sends it to the receiving unit through wireless communication; the receiving unit forwards it to the comparison unit; the comparison unit compares the acquired voice instruction with the voice-instruction templates stored in the storage unit and sends the instruction with the highest similarity to the control processor; and the control processor controls the intelligent robot to perform the action corresponding to that voice instruction. The image acquisition unit captures a depth image and a color image of the target human body and sends them to the control processor; the control processor processes the images to obtain the spatial position of the target human body and drives the intelligent robot to approach the target according to that position data.
Preferably, the intelligent robot further comprises a driving unit and a walking unit; the walking unit includes two driving wheels and a driven wheel; the control processor drives the driving unit to control the driving wheels to walk, and the driven wheel is located on the axis of symmetry between the two driving wheels; the driven wheel is a universal wheel.
Preferably, the intelligent robot further comprises an obstacle avoidance device electrically connected with the control processor; according to the sensing signal sent back by the obstacle avoidance device, the control processor controls the walking unit through the driving unit so that the intelligent robot changes its travelling route; the obstacle avoidance device comprises an ultrasonic sensor or an infrared sensor.
Preferably, the wearable physiological data acquisition device is one or a combination of a body temperature acquisition device, a blood pressure acquisition device, a heart rate acquisition device and a blood oxygen acquisition device.
Preferably, the intelligent robot further comprises an alarm unit, and the alarm unit is connected with the control processor.
The intelligent robot system uses cloud technology for identity authentication and physiological data interaction, and can automatically follow the user and monitor the user's physiological data in real time so that the user can know his or her physical condition at any time; in addition, the laser therapy device can be used for initial treatment.
The technical solution of the present invention will be described in detail with reference to the accompanying drawings and specific embodiments.
Drawings
FIG. 1 is a block diagram of the logical structure of an intelligent robot system according to a first embodiment of the present invention;
FIG. 2 is a block diagram of the logical structure of an intelligent robot system according to a second embodiment of the present invention.
Detailed Description
Example one
Fig. 1 is a block diagram of the logical structure of an intelligent robot system according to the first embodiment of the present invention. As shown in Fig. 1, the system comprises an intelligent robot 1, a wearable physiological data acquisition device 3, a laser treatment device (not shown in the figure) and a cloud server 2. The wearable physiological data acquisition device 3 is connected to the intelligent robot 1 through wireless communication, the intelligent robot 1 is connected to the cloud server 2 through wireless communication, and the laser treatment device is arranged on the intelligent robot. The wireless communication may use infrared, Bluetooth, WLAN, 3G or 4G.
The intelligent robot 1 includes: a control processor 11, an image acquisition unit 12 and a display unit 13; the image acquisition unit 12 and the display unit 13 are respectively electrically connected with the control processor 11;
The cloud server includes a storage unit 21, a comparison unit 22, a receiving unit 23 and a sending unit 24. After the wearable physiological data acquisition device 3 acquires human physiological data, it transmits the data to the receiving unit 23 through wireless communication; the receiving unit 23 sends the acquired data to the comparison unit 22; the comparison unit 22 reads the standard human physiological data preset in the storage unit 21 and compares them with the acquired data; the compared data are sent to the control processor 11 through the sending unit 24; and the control processor 11 sends the compared data to the display unit 13. Alternatively, the control processor 11 performs the comparison itself, and if the acquired data exceed the standard values, an alarm is raised through the alarm unit 16 to notify the user. The compared physiological data let the user know his or her physical condition in real time, and can also be sent through the sending unit 24 to family members, guardians or hospital medical staff for monitoring and management. The wearable physiological data acquisition device 3 is one or a combination of a body temperature acquisition device, a blood pressure acquisition device, a heart rate acquisition device and a blood oxygen acquisition device. The intelligent robot 1 further includes a driving unit 14 and a walking unit 15; the walking unit 15 includes driving wheels (not shown) and a driven wheel (not shown); the driven wheel is a universal wheel; the control processor 11 drives the driving unit 14 to control the driving wheels, and the driven wheel is located on the axis of symmetry between the two driving wheels.
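The comparison step just described amounts to checking each collected value against a preset standard range and raising an alarm when a value falls outside it. The sketch below illustrates that logic in Python; the metric names, threshold values and function names are illustrative assumptions, not values taken from the patent.

```python
# Minimal sketch of the comparison step: collected vitals are checked against
# preset standard ranges, and any out-of-range value would trigger the alarm
# unit. All names and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class VitalRange:
    low: float
    high: float

# Hypothetical "standard human physiological data" preset in the storage unit.
STANDARD_RANGES = {
    "body_temperature_c": VitalRange(36.0, 37.3),
    "systolic_bp_mmhg":   VitalRange(90, 140),
    "heart_rate_bpm":     VitalRange(60, 100),
    "spo2_percent":       VitalRange(94, 100),
}

def compare_vitals(sample: dict) -> dict:
    """Return a per-metric status; any 'out_of_range' entry would raise the alarm."""
    report = {}
    for name, value in sample.items():
        rng = STANDARD_RANGES.get(name)
        if rng is None:
            report[name] = "unknown_metric"
        elif rng.low <= value <= rng.high:
            report[name] = "normal"
        else:
            report[name] = "out_of_range"
    return report

if __name__ == "__main__":
    reading = {"body_temperature_c": 38.1, "heart_rate_bpm": 72}
    print(compare_vitals(reading))  # body temperature flagged out of range
```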
So that the intelligent robot can follow the target human body more accurately and associate the compared physiological data with its owner while working, the image acquisition unit captures a face image and sends it through wireless communication to the receiving unit; the receiving unit sends the image to the comparison unit; the comparison unit compares it with the face images prestored in the storage unit and, according to the comparison result, associates the compared physiological data with the owner, so that several members of a household can use the robot. Specifically, during face recognition the comparison unit 22 first converts the acquired face image into black-and-white block data with unique features and, combined with skin colour, generates a unique data image of the followed person's face. The data image is then preprocessed to remove interference from the complex background. The face data are processed with PCA (Principal Component Analysis, a statistical method that uses an orthogonal transformation to convert a set of possibly correlated variables into a set of linearly uncorrelated variables called principal components), i.e. the three-dimensional image is projected into a two-dimensional data space and feature information is extracted. LDA (Linear Discriminant Analysis, a multivariate statistical method that, for known class labels, builds one or more discriminant functions from training data and uses the resulting discriminant scores to decide which class a sample belongs to) is then applied, and the facial features are finally determined by fuzzy processing. The comparison unit 22 then compares the face information collected in real time with the face samples stored in the storage unit 21, locks onto the closest sample, and retrieves the corresponding user's physiological data.
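The PCA-then-LDA matching pipeline outlined above can be sketched with off-the-shelf tools. The example below uses scikit-learn with random arrays standing in for enrolled face images; the image dimensions, owner IDs, and the omission of the skin-colour and fuzzy-processing steps are all simplifying assumptions rather than details from the patent.

```python
# A minimal sketch of the PCA -> LDA -> nearest-sample matching idea described
# above. Random arrays stand in for real face images.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Pretend we have 40 enrolled face images (flattened 32x32 grayscale) for 4 owners.
X_enrolled = rng.normal(size=(40, 32 * 32))
y_enrolled = np.repeat([0, 1, 2, 3], 10)          # owner IDs kept in the storage unit

matcher = make_pipeline(
    PCA(n_components=20),                         # project the image into a low-dimensional space
    LinearDiscriminantAnalysis(n_components=3),   # discriminate between enrolled owners
    KNeighborsClassifier(n_neighbors=1),          # lock onto the closest stored sample
)
matcher.fit(X_enrolled, y_enrolled)

live_face = rng.normal(size=(1, 32 * 32))         # face captured by the image acquisition unit
owner_id = matcher.predict(live_face)[0]
print(f"matched owner {owner_id}; load that owner's physiological data")
```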
Further, the image acquisition unit 12 captures a limb-action image (or a human skeleton-point image) and sends it through wireless communication to the receiving unit 23; the receiving unit 23 forwards it to the comparison unit 22; the comparison unit 22 compares the extracted action instruction with the action-instruction templates stored in the storage unit 21 and sends the instruction with the highest similarity to the control processor 11; and the control processor 11 controls the intelligent robot 1 to perform the turn-on or turn-off action corresponding to that instruction. While following, the image acquisition unit 12 captures a depth image and a color image of the target human body and sends them to the control processor 11; the control processor 11 processes the images to obtain the spatial position of the target and drives the intelligent robot 1 to approach the target according to that position. Specifically, the control processor 11 uses a human-target tracking algorithm that fuses depth, color and prediction information: the continuously adaptive mean-shift algorithm is extended into three-dimensional space using the color and depth information, the target's depth information is fused in that space, a Kalman filter predicts the target's position to suppress interference from the complex background, and the centroid position of the human body in the corresponding depth image is finally computed. To make the intelligent robot 1 follow the target accurately and quickly, the control processor 11 combines PID control with an M/T speed-measurement algorithm. PID control (proportional, integral, derivative) corrects the control signal to eliminate steady-state error: P is the proportional term, which provides amplification; I is the integral term, which removes steady-state error; D is the derivative term, which speeds up the system's response; together they let the system execute actions more accurately and quickly. The M/T speed-measurement algorithm achieves real-time measurement by presetting an approximate output interval for the motor speed, and guarantees measurement accuracy by dynamically and adaptively adjusting the pulse count of the photoelectric encoder.
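As a rough illustration of the PID correction mentioned above, the sketch below regulates the robot's forward speed so that the measured distance to the target (for example, taken from the depth image) converges to a desired following distance. The gains, the following distance and the control interface are assumptions for illustration and are not specified in the patent.

```python
# A minimal PID loop: drive the distance error to zero so the robot keeps a
# fixed following distance behind the target. Gains are illustrative.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt                     # I term removes steady-state error
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

FOLLOW_DISTANCE = 1.2          # metres to keep between robot and target (assumed)
pid = PID(kp=0.8, ki=0.05, kd=0.2)

def speed_command(measured_distance_m: float, dt: float = 0.05) -> float:
    """Positive output drives the robot forward, negative backs it off."""
    error = measured_distance_m - FOLLOW_DISTANCE
    return pid.update(error, dt)

# e.g. the target has drifted to 2.0 m away -> the robot speeds up to close the gap
print(speed_command(2.0))
```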
To reduce collisions while the intelligent robot 1 is moving, it further includes an obstacle avoidance device 17 comprising an ultrasonic sensor (not shown in the figure) or an infrared sensor (not shown in the figure). The sensor emits a sensing signal that is reflected back when an obstacle is encountered; the sensor forwards the reflected signal to the control processor 11, which controls the walking unit 15 through the driving unit 14 so that the robot changes its travelling route without deviating from the target-tracking direction. Once the reflected signal disappears, the control processor 11 controls the walking unit 15 through the driving unit 14 to return the robot to the optimal route for tracking the target.
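The detour behaviour can be summarised as a two-state loop: veer off while the sensor reports an obstacle, and resume the direct route once the reflected signal disappears. The sketch below captures that idea; the sensor flag, heading offset and control-cycle interface are illustrative assumptions.

```python
# A minimal sketch of the detour behaviour: sidestep while an obstacle is
# sensed, return to the direct heading once the echo is gone. Interfaces are
# assumed placeholders.
from enum import Enum, auto

class Mode(Enum):
    FOLLOW = auto()
    DETOUR = auto()

mode = Mode.FOLLOW

def step(obstacle_detected: bool, heading_to_target: float) -> float:
    """Return the commanded heading for this control cycle (radians)."""
    global mode
    if obstacle_detected:
        mode = Mode.DETOUR
        return heading_to_target + 0.5     # veer off while keeping the target roughly ahead
    if mode is Mode.DETOUR:                # reflected signal gone: back to the direct route
        mode = Mode.FOLLOW
    return heading_to_target

print(step(True, 0.0))    # obstacle ahead -> detour heading
print(step(False, 0.0))   # obstacle cleared -> direct heading again
```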
The intelligent robot can also avoid obstacles in another way: it further includes a laser scanner (not shown in the figure), which enables automatic environment-map construction and path planning. In short, as the intelligent robot walks, the laser scanner scans the environment (a fixed environment within a certain range, such as a home) and feeds the scan data into the control processor 11, which builds an environment map using a comprehensive calculation method (environment-mapping algorithms are known technology). Based on the map coordinates, the control processor 11 automatically records the routes the robot has travelled and memorises identification points in the map (a person first guides the robot to a specific target position and inputs the target information, for example driving the robot to the water dispenser and having it recognise and memorise that position as the water dispenser). On the basis of this learning, the robot can plan and optimise its path to a specific target (or to a target given in an instruction).
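As a stand-in for the mapping-and-planning idea described above (the patent does not specify the algorithm), the sketch below assumes the laser scans have already been fused into a small occupancy grid, stores a memorised landmark such as the water dispenser as a grid coordinate, and plans a route to it with a breadth-first search.

```python
# A minimal sketch: occupancy grid + memorised landmark + breadth-first search.
# The grid, landmark coordinates and function names are illustrative assumptions.
from collections import deque

GRID = [  # 0 = free, 1 = wall (assumed environment map)
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
LANDMARKS = {"water_dispenser": (4, 4)}   # memorised identification point

def plan_path(start, target_name):
    """Return a list of grid cells from start to the named landmark, or None."""
    goal = LANDMARKS[target_name]
    queue, came_from = deque([start]), {start: None}
    while queue:
        cur = queue.popleft()
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(GRID) and 0 <= nc < len(GRID[0]) \
                    and GRID[nr][nc] == 0 and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cur
                queue.append((nr, nc))
    return None

print(plan_path((0, 0), "water_dispenser"))
```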
Example two
Fig. 2 is a block diagram of the logical structure of an intelligent robot system according to the second embodiment of the present invention. As shown in Fig. 2, the structure of this system is substantially the same as that of the first embodiment, except that here the intelligent robot 1 further includes a microphone 18 connected to the control processor 11. The microphone 18 captures a voice instruction and sends it through wireless communication to the receiving unit 23; the receiving unit 23 forwards it to the comparison unit 22; the comparison unit 22 compares the acquired voice instruction with the voice-instruction templates stored in the storage unit 21 and sends the instruction with the highest similarity to the control processor 11; and the control processor 11 controls the intelligent robot 1 to perform the corresponding action, such as turning on, turning off, playing media, going to a specified place or fetching an object.
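The voice template-matching step can be illustrated as follows, assuming the spoken command has already been transcribed to text (speech recognition itself is outside this sketch). The command phrases, action names and similarity measure are illustrative assumptions rather than details from the patent.

```python
# A minimal sketch of template matching: score the transcript against each
# stored command template and forward the best match to the control processor.
from difflib import SequenceMatcher

COMMAND_TEMPLATES = {       # hypothetical voice-instruction templates in the storage unit
    "turn on":     "power_on",
    "turn off":    "power_off",
    "play music":  "play_media",
    "come here":   "goto_user",
    "fetch water": "fetch_item",
}

def match_command(transcript: str):
    """Return (action, similarity) for the closest stored template."""
    scored = [
        (action, SequenceMatcher(None, transcript.lower(), phrase).ratio())
        for phrase, action in COMMAND_TEMPLATES.items()
    ]
    return max(scored, key=lambda pair: pair[1])

action, score = match_command("please play some music")
print(action, round(score, 2))   # highest-similarity template -> control processor
```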
Claims (10)
1. An intelligent robot system, comprising: the wearable physiological data acquisition device is connected with the intelligent robot through wireless communication, the intelligent robot is connected with the cloud server through wireless communication, and the laser treatment device is arranged on the intelligent robot; wherein,
the intelligent robot includes: the system comprises a control processor, an image acquisition unit and a display unit; the image acquisition unit and the display unit are respectively electrically connected with the control processor;
the cloud server includes: the device comprises a storage unit, a comparison unit, a receiving unit and a sending unit;
the wearable physiological data acquisition device transmits the human physiological data to the receiving unit through wireless communication after acquiring the human physiological data, the receiving unit transmits the acquired human physiological data to the comparison unit, the comparison unit reads standard human physiological data preset in the storage unit and compares the standard human physiological data with the acquired human physiological data, the compared human physiological data is transmitted to the control processor through the transmitting unit, and the control processor transmits the compared human physiological data to the display unit;
the image acquisition unit acquires a face image and sends the face image to the receiving unit through wireless communication, the receiving unit sends the acquired face image to the comparison unit, the comparison unit compares the acquired face image with a face image prestored in the storage unit, and according to a comparison result, the compared human body physiological data is associated with the owner;
the image acquisition unit acquires a limb action image and sends the limb action image to the receiving unit through wireless communication, the receiving unit sends the limb action image to the comparison unit, the comparison unit compares the acquired action instruction with each action instruction template stored in the storage unit and sends the action instruction with the highest similarity to the control processor, and the control processor controls the intelligent robot to perform opening or closing action corresponding to the action instruction;
the image acquisition unit acquires a depth image and a color image of a target human body and sends the depth image and the color image to the control processor, the control processor processes the depth image and the color image to obtain spatial position data of the target human body, and the control processor drives the intelligent robot to approach the target human body according to the spatial position data.
2. The intelligent robot system of claim 1, wherein the intelligent robot further comprises a driving unit and a walking unit; the walking unit includes two driving wheels and a driven wheel; the control processor drives the driving unit to control the driving wheels to walk, and the driven wheel is located on the axis of symmetry between the two driving wheels; wherein the driven wheel is a universal wheel.
3. The intelligent robot system of claim 1, wherein the intelligent robot further comprises: the obstacle avoidance device is electrically connected with the control processor, and the control processor controls the walking unit through the driving unit to drive the intelligent robot to change the travelling route according to the sensing signal sent by the obstacle avoidance device; the obstacle avoidance device comprises an ultrasonic sensor or an infrared sensor.
4. The intelligent robotic system as claimed in claim 1, wherein the wearable physiological data acquisition device is one or more of a body temperature acquisition device, a blood pressure acquisition device, a heart rate acquisition device or a blood oxygen acquisition device.
5. The intelligent robot system according to any one of claims 1-4, wherein the intelligent robot further comprises an alarm unit, the alarm unit being connected to the control processor.
6. An intelligent robot system, comprising: the wearable physiological data acquisition device is connected with the intelligent robot through wireless communication, the intelligent robot is connected with the cloud server through wireless communication, and the laser treatment device is arranged on the intelligent robot; wherein,
the intelligent robot includes: the system comprises a control processor, an image acquisition unit, a display unit and a microphone; the image acquisition unit and the display unit are respectively electrically connected with the control processor;
the cloud server includes: the device comprises a storage unit, a comparison unit, a receiving unit and a sending unit;
the wearable physiological data acquisition device transmits the human physiological data to the receiving unit through wireless communication after acquiring the human physiological data, the receiving unit transmits the acquired human physiological data to the comparison unit, the comparison unit reads standard human physiological data preset in the storage unit and compares the standard human physiological data with the acquired human physiological data, the compared human physiological data is transmitted to the control processor through the transmitting unit, and the control processor transmits the compared human physiological data to the display unit;
the image acquisition unit acquires a face image and sends the face image to the receiving unit through wireless communication, the receiving unit sends the acquired face image to the comparison unit, the comparison unit compares the acquired face image with a face image prestored in the storage unit, and according to a comparison result, the compared human body physiological data is associated with the owner;
the microphone acquires a voice instruction and sends the voice instruction to the receiving unit through wireless communication, the receiving unit sends the voice instruction to the comparison unit, the comparison unit compares the acquired voice instruction with each voice instruction template stored in the storage unit and sends the voice instruction with the highest similarity to the control processor, and the control processor controls the intelligent robot to perform an action corresponding to the voice instruction;
the image acquisition unit acquires a depth image and a color image of a target human body and sends the depth image and the color image to the control processor, the control processor processes the depth image and the color image to obtain spatial position data of the target human body, and the control processor drives the intelligent robot to approach the target human body according to the spatial position data.
7. The intelligent robot system of claim 6, wherein the intelligent robot further comprises a driving unit and a walking unit; the walking unit includes two driving wheels and a driven wheel; the control processor drives the driving unit to control the driving wheels to walk, and the driven wheel is located on the axis of symmetry between the two driving wheels; wherein the driven wheel is a universal wheel.
8. The intelligent robot system of claim 6, wherein the intelligent robot further comprises: the obstacle avoidance device is electrically connected with the control processor, and the control processor controls the walking unit through the driving unit to drive the intelligent robot to change the travelling route according to the sensing signal sent by the obstacle avoidance device; the obstacle avoidance device comprises an ultrasonic sensor or an infrared sensor.
9. The intelligent robotic system as claimed in claim 6, wherein the wearable physiological data acquisition device is one or more of a body temperature acquisition device, a blood pressure acquisition device, a heart rate acquisition device or a blood oxygen acquisition device.
10. The intelligent robot system according to any one of claims 6-9, wherein the intelligent robot further comprises an alarm unit, the alarm unit being connected to the control processor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610270435.7A CN107307851A (en) | 2016-04-27 | 2016-04-27 | Intelligent robot system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610270435.7A CN107307851A (en) | 2016-04-27 | 2016-04-27 | Intelligent robot system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107307851A (en) | 2017-11-03 |
Family
ID=60185475
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610270435.7A Pending CN107307851A (en) | 2016-04-27 | 2016-04-27 | Intelligent robot system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107307851A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108877942A (en) * | 2018-06-11 | 2018-11-23 | 天津科技大学 | A kind of safe assistance system based on artificial intelligence |
CN109998496A (en) * | 2019-01-31 | 2019-07-12 | 中国人民解放军海军工程大学 | A kind of autonomous type body temperature automatic collection and respiratory monitoring system and method |
CN110228073A (en) * | 2019-06-26 | 2019-09-13 | 郑州中业科技股份有限公司 | Active response formula intelligent robot |
CN112652409A (en) * | 2020-12-22 | 2021-04-13 | 孙甲子 | Medical monitoring system based on intelligent robot and wearable equipment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102323817A (en) * | 2011-06-07 | 2012-01-18 | 上海大学 | Service robot control platform system and multimode intelligent interaction and intelligent behavior realizing method thereof |
CN103324197A (en) * | 2013-06-26 | 2013-09-25 | 西安电子科技大学 | Voice-control multi-functional intelligent service robot |
CN104573385A (en) * | 2015-01-24 | 2015-04-29 | 无锡桑尼安科技有限公司 | Robot system for acquiring data of sickrooms |
CN204485108U (en) * | 2015-03-10 | 2015-07-22 | 北京大学深圳医院 | Wearable laser therapy detection system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20171103 |