CN111360816A - Robot control method and system and robot - Google Patents
- Publication number
- CN111360816A CN111360816A CN201811602936.6A CN201811602936A CN111360816A CN 111360816 A CN111360816 A CN 111360816A CN 201811602936 A CN201811602936 A CN 201811602936A CN 111360816 A CN111360816 A CN 111360816A
- Authority
- CN
- China
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1689—Teleoperation
Abstract
The application relates to the technical field of intelligent control, and discloses a control method and a control system for a robot, and a robot. The control method includes: receiving a job to be executed sent by a client, where the job to be executed is created by the client according to a topological map of the robot's work; acquiring, according to the job to be executed, an execution path and execution times for the robot to execute the corresponding job; and instructing the robot to execute the corresponding job according to the execution path and the execution times. In this process, the client can preset the robot's jobs to be executed, thereby intervening in the robot's working process; the execution route can be planned when the robot executes tasks; and the robot can execute a created job multiple times, which improves the robot's practicability.
Description
Technical Field
The present disclosure relates to the field of intelligent control technologies, and in particular, to a robot control method, a robot control system, and a robot.
Background
With the continuous development of science and technology, security equipment has become increasingly intelligent. As a high-end, disruptive intelligent product in the security industry, the security robot has entered public view over the past two years and attracted strong attention. Many security robots on the market are already used in daily life, for example at airports. Although security robots have found many applications, they still have certain shortcomings: some robots do not respond to emergencies quickly enough, the navigation function of some robots cannot handle every application scenario, and in some cases a user cannot intervene in the robot's operation in real time according to the robot's current scene. As a result, security robots are not as practical as they could be.
Disclosure of Invention
In view of this, embodiments of the present application provide a method and a system for controlling a robot, and a robot, so as to solve the problems in the prior art that a security robot offers little interaction with the user and is not highly practical during large-area patrols.
A first aspect of an embodiment of the present application provides a control method for a robot, where the control method for a robot includes:
receiving a job to be executed sent by a client, where the job to be executed is created by the client according to a topological map of the robot's work;
acquiring, according to the job to be executed, an execution path and execution times for the robot to execute the corresponding job; and
instructing the robot to execute the corresponding job according to the execution path and the execution times.
Optionally, instructing the robot to execute the corresponding job according to the execution path and the execution times includes:
acquiring, in real time, position information of the robot while it executes the job and the environmental parameters at the corresponding position; and
sending the position information and the environmental parameters to the client.
Optionally, the control method of the robot further includes:
monitoring for and receiving a control instruction sent by the client when the robot is not executing a job; and
adjusting the running speed or/and the running direction of the robot according to the control instruction.
Optionally, the control method of the robot further includes:
receiving an adjustment instruction sent by the pan-tilt head while the robot executes the job; and
adjusting the shooting direction and focal length of the robot's camera according to the adjustment instruction.
Optionally, after instructing the robot to execute the corresponding job according to the execution path and the execution times, the method includes:
detecting whether an abnormality occurs while the robot executes the job;
if an abnormality is detected while the robot executes the job, recording the abnormality information; and
generating an exception log from the abnormality information and sending the exception log to the client, so that the client displays the exception log to the user.
A second aspect of the embodiments of the present application provides a control method for a robot, including:
acquiring information related to the job the robot is to execute, and creating a topological map of the robot's work according to the information;
creating a job to be executed by the robot according to the topological map; and
sending the job to be executed to the robot, so as to instruct the robot to execute a corresponding program according to the job to be executed.
Optionally, acquiring the information related to the robot's job and creating the topological map of the robot's work according to the information includes:
acquiring position information of the places the robot needs to pass through when executing the job; and
creating the topological map of the robot's work according to the position information.
Optionally, instructing the robot to execute a corresponding program according to the job to be executed includes:
receiving an information-entry request input by a user;
detecting whether the user has information-entry permission; and
if so, sending an information-entry instruction to the robot, so that the robot records the face information detected while executing the job.
A third aspect of embodiments of the present application provides a control system for a robot, including: a robot and a client, wherein:
the client comprises:
a first acquisition unit, configured to acquire information related to the job the robot is to execute, and to create a topological map of the robot's work according to the information;
the creating unit is used for creating the to-be-executed operation of the robot according to the topological map;
the sending unit is used for sending the to-be-executed operation to the robot so as to instruct the robot to execute a corresponding program according to the to-be-executed operation;
the robot includes:
a receiving unit, configured to receive a job to be executed sent by the client, where the job to be executed is created by the client according to the topological map of the robot's work;
a second acquisition unit, configured to acquire, according to the job to be executed, the execution path and execution times for the robot to execute the corresponding job; and
an execution unit, configured to instruct the robot to execute the corresponding job according to the execution path and the execution times.
A fourth aspect of embodiments of the present application provides a robot, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of any one of the robot control methods according to the first aspect.
According to the embodiments provided by the present application, before executing a job the robot can receive the job to be executed sent by the Android client (a terminal running the Android system), where the job is created by the Android client according to the topological map of the robot's work. After receiving the job from the client, the robot acquires the execution path and execution times of the job, and then carries it out the set number of times along the execution path. In this process, the client can preset the robot's jobs to be executed, thereby intervening in the robot's working process; the execution route can be planned when the robot executes tasks; and the robot can execute a created job multiple times, improving the robot's practicability.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below.
Fig. 1 is an overall structural diagram of a robot controlled by a client according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of an implementation of a control method of a robot according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of a control method for a robot according to another embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a control system of a robot according to an embodiment of the present disclosure;
fig. 5 is a schematic view of a robot according to an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the application and do not constitute a limitation on the application.
The robot control method provided by the present application is applied to an Android client that is wirelessly connected to a robot in order to control it. As shown in Fig. 1, the client's control of the robot mainly comprises motion control, vision management, and monitoring of the robot (i.e., machine monitoring in the figure). Motion control comprises manual control and job control: manual control covers the robot's gears (high, medium, and low) and directions, while job control covers management of the jobs executed by the robot and creation of the topological map, such as creation, download, upload, deletion, and modification of jobs (see Fig. 1). Vision management comprises the client's management of the robot's information entry and control of the robot through the cloud platform. Monitoring of the robot includes monitoring its position and monitoring whether an abnormal condition occurs while it executes a job (see Fig. 1).
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
The first embodiment is as follows:
fig. 2 shows a schematic implementation flow chart of a control method of a robot provided by an embodiment of the present application, including steps S21-S23, where:
and step S21, receiving the operation to be executed sent by the client, wherein the operation to be executed is created by the client according to the topological map of the robot work.
In the embodiments provided by the present application, the robot's operation is controlled by a control end. The client includes an Android client, for example a smart device such as a smartphone or a tablet computer running the Android system. When the robot runs, it receives the job to be executed from the client; the job is created by the client according to the topological map of the robot's work.
Specifically, a user creates a topological map on the client, based on a Baidu map, according to the area in which the robot is to execute tasks. On the basis of the topological map, the user then selects and edits the path points for the robot's task; the path points are saved to become a job to be executed that can be issued and managed, and the client sends the specific job to be executed to the robot. Further, when issuing the job to the robot, the client may send it through the Lightweight Communications and Marshalling (LCM) library, so that the robot executes the corresponding job.
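As an illustration of how a client might package and issue a job to be executed, here is a minimal Python sketch. The `Job` and `PathPoint` names, the JSON encoding, and the channel name are hypothetical stand-ins for the message types the patent leaves unspecified; `publish` only mirrors the shape of LCM's `publish(channel, data)` call.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class PathPoint:
    point_id: int
    lat: float
    lon: float
    task: str = ""          # content to execute at this point (hypothetical field)

@dataclass
class Job:
    job_id: str
    points: list            # ordered PathPoints: the execution path
    times: int = 1          # how many times the job should be executed

def encode_job(job: Job) -> bytes:
    """Serialize a job for publication on a messaging channel (JSON stands in for LCM marshalling)."""
    return json.dumps(asdict(job)).encode("utf-8")

def publish_job(publish, job: Job, channel: str = "ROBOT_JOBS"):
    """publish(channel, data) mirrors the shape of lcm.LCM().publish."""
    publish(channel, encode_job(job))
```

A client-side caller would build the `Job` from the path points edited on the topological map and hand `publish_job` the real transport's publish function.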
Step S22: acquiring, according to the job to be executed, the execution path and execution times for the robot to execute the corresponding job.
In this step, after the robot receives the job to be executed from the client, it parses out the execution path of the job and the number of times the job should be executed. The execution path is established according to the topological map. For example, if the robot needs to pass through 3 locations during one job, the client, when creating the job, edits and marks the position information of the 3 locations on the topological map in sequence (for example, setting what is to be done at the first location), sets the order in which the robot passes through them, and includes this information in the job to be executed. After the robot receives the job, its processor parses out the running path, the contents to be executed at each work point (i.e., the 3 locations), and the execution times of the job (for example, a patrol job can be set in the job to be executed to run twice).
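The parsing step can be sketched as follows; the field names (`points`, `task`, `times`) are assumptions, not the patent's actual message format.

```python
import json

def parse_job(raw: bytes):
    """Recover the ordered path, per-point task contents, and repeat count from a job message."""
    job = json.loads(raw.decode("utf-8"))
    path = [p["point_id"] for p in job["points"]]                 # order the robot visits points
    tasks = {p["point_id"]: p.get("task", "") for p in job["points"]}
    times = job.get("times", 1)                                   # default: execute once
    return path, tasks, times
```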
Step S23: instructing the robot to execute the corresponding job according to the execution path and the execution times.
In this step, the robot executes the corresponding job according to its processor's parsing of the job to be executed.
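The repeat-and-traverse behavior of this step can be sketched as a simple loop; `visit` is a hypothetical stand-in for whatever motion and task execution the robot performs at each work point.

```python
def execute_job(path, tasks, times, visit):
    """Traverse the path `times` times, calling visit(point, task) at each work point."""
    visited = []
    for run in range(times):
        for point in path:
            visit(point, tasks.get(point, ""))   # carry out the content set for this point
            visited.append((run, point))
    return visited
```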
Optionally, in another embodiment provided by the present application, instructing the robot to execute the corresponding job according to the execution path and the execution times includes:
acquiring, in real time, position information of the robot while it executes the job and the environmental parameters at the corresponding position; and
sending the position information and the environmental parameters to the client.
In this step, while executing the corresponding job, the robot acquires its position through its positioning system, and photographs the environment at its position with its camera or detects the corresponding environmental parameters with other modules; for example, a patrol robot in an airport can acquire information about the flow of people at its position. The robot sends the position information and environmental parameters back to the client, so that the user at the client obtains the robot's detection information in real time.
Furthermore, when sending the monitored environmental information and position information, the robot can also acquire its battery level and the temperature, humidity, and similar readings at its position, and send this information to the client, so that the client user keeps the robot under monitoring.
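One way to bundle the readings described above into a single telemetry frame is sketched below; the sensor callbacks and field names are hypothetical stand-ins for the robot's positioning system and environment sensors.

```python
import time

def sample_telemetry(get_position, get_environment):
    """Collect one telemetry frame for transmission to the client.

    get_position returns (lat, lon); get_environment returns a dict of
    readings such as temperature, humidity, battery level, or crowd flow.
    """
    lat, lon = get_position()
    frame = {"timestamp": time.time(), "lat": lat, "lon": lon}
    frame.update(get_environment())
    return frame
```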
Further, after instructing the robot to execute the corresponding job according to the execution path and the execution times, the method includes:
detecting whether an abnormality occurs while the robot executes the job;
if an abnormality is detected while the robot executes the job, recording the abnormality information; and
generating an exception log from the abnormality information and sending it to the client, so that the client displays the exception log to the user.
In this step, whether an abnormality occurs is detected while the robot executes the job. Abnormalities include abnormal detected environmental parameters (for example, a detected ambient temperature exceeding a preset value) and an abnormal running state of the robot (for example, a robot fault). If an abnormality is detected, the robot records the abnormality information, generates an exception log, and sends the log to the client that has a control relationship with the robot, so that the client user knows the robot's state and the state of its location in real time.
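A minimal sketch of the abnormality check and exception-log generation, assuming a single temperature threshold and a list of fault flags; the threshold value, flag names, and log format are all invented for illustration.

```python
import json
import time

TEMP_LIMIT_C = 40.0   # hypothetical preset temperature value

def check_anomalies(frame, fault_flags=()):
    """Return anomaly records found in one telemetry frame plus any robot fault flags."""
    anomalies = []
    if frame.get("temperature_c", 0.0) > TEMP_LIMIT_C:
        anomalies.append({"type": "environment",
                          "detail": "temperature exceeds preset value",
                          "value": frame["temperature_c"]})
    for flag in fault_flags:                       # e.g. "motor_stall" (hypothetical)
        anomalies.append({"type": "robot_fault", "detail": flag})
    return anomalies

def make_exception_log(anomalies):
    """Serialize anomaly records into a log payload to send to the client."""
    return json.dumps({"created": time.time(), "entries": anomalies})
```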
Optionally, in another embodiment provided by the present application, the control method of the robot further includes:
monitoring for and receiving a control instruction sent by the client when the robot is not executing a job; and
adjusting the running speed or/and the running direction of the robot according to the control instruction.
In this step, if the robot is not executing a job, it may monitor in real time whether the client sends a control instruction, where the control instruction is used to control the robot's motion. When a control instruction sent by the client is received, the robot's gear (i.e., its movement speed) and/or its movement direction are adjusted according to the instruction. As shown in Fig. 1, the client can set the robot to the high, medium, or low gear, so that the user can manually control the robot through the client.
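The gear-and-direction adjustment can be sketched as follows; the gear-to-speed mapping and the instruction format are purely illustrative.

```python
GEAR_SPEEDS = {"low": 0.5, "medium": 1.0, "high": 1.8}   # m/s, illustrative values
DIRECTIONS = {"forward", "backward", "left", "right", "stop"}

class MotionState:
    """Tracks the robot's manually controlled speed and direction."""
    def __init__(self):
        self.speed = 0.0
        self.direction = "stop"

    def apply(self, instruction):
        """Apply a client control instruction, e.g. {'gear': 'high', 'direction': 'forward'}."""
        if "gear" in instruction:
            self.speed = GEAR_SPEEDS[instruction["gear"]]
        if "direction" in instruction:
            if instruction["direction"] not in DIRECTIONS:
                raise ValueError("unknown direction")
            self.direction = instruction["direction"]
```

A control loop would call `apply` on each instruction received from the client while no job is running.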
Optionally, in another embodiment provided by the present application, the method for controlling the robot further includes:
receiving an adjustment instruction sent by the pan-tilt head while the robot executes the job; and
adjusting the shooting direction and focal length of the robot's camera according to the adjustment instruction.
Specifically, the robot acquires information about its surroundings through the camera while working. When the client needs to adjust the robot's camera, it sends an instruction to the pan-tilt head, and the shooting direction and/or focal length of the camera are controlled through the pan-tilt head.
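A sketch of the pan-tilt (cradle head) adjustment with range clamping; the angle and focal-length limits are invented for illustration and are not from the patent.

```python
def _clamp(value, lo, hi):
    return max(lo, min(hi, value))

class PanTilt:
    """Minimal pan-tilt model: pan/tilt in degrees, focal length in mm (limits assumed)."""
    PAN_RANGE = (-170.0, 170.0)
    TILT_RANGE = (-30.0, 90.0)
    FOCAL_RANGE = (4.0, 48.0)

    def __init__(self):
        self.pan, self.tilt, self.focal = 0.0, 0.0, self.FOCAL_RANGE[0]

    def adjust(self, d_pan=0.0, d_tilt=0.0, focal=None):
        """Apply a relative pan/tilt move and, optionally, an absolute focal length."""
        self.pan = _clamp(self.pan + d_pan, *self.PAN_RANGE)
        self.tilt = _clamp(self.tilt + d_tilt, *self.TILT_RANGE)
        if focal is not None:
            self.focal = _clamp(focal, *self.FOCAL_RANGE)
```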
According to the embodiments provided by the present application, before executing a job the robot can receive the job to be executed sent by the Android client, where the job is created by the Android client according to the topological map of the robot's work. After receiving the job from the client, the robot acquires its execution path and execution times, and then carries out the job the set number of times along the execution path. In this process, the client can preset the robot's jobs to be executed, thereby intervening in the robot's working process; the execution route can be planned when the robot executes tasks; and the robot can execute a created job multiple times, improving the robot's practicability.
Example two:
fig. 3 shows a schematic implementation flow chart of a control method of a robot provided in another embodiment of the present application, including steps S31-S33, where:
and step S31, acquiring relevant information of the robot to execute the operation, and creating a topological map of the robot work according to the relevant information.
Optionally, acquiring the information related to the robot's job and creating the topological map according to the information includes:
acquiring position information of the places the robot needs to pass through when executing the job; and
creating the topological map of the robot's work according to the position information.
In this step, the user inputs, through the client, the places the robot should pass through when executing the task, selects them on a Baidu map, and performs attribute setting and connection processing on them to form the robot's walking-path planning map, i.e., the topological map of the robot's work.
Further, before the topological map is created, the client verifies the identity information of the user creating the map; if the user's identity meets a preset condition (for example, the user is a senior user, where the definition of a senior user can be set by an administrator), the user is allowed to create the topological map. The specific creation process is as follows: a user who meets the preset condition logs in to the control system and manually picks points on a Baidu map (or another map) as mark points (i.e., points the robot can pass through). Whether an error exists between each acquired point and the actual point is then checked; if there is an error, the robot is manually and remotely driven to the ideal position, and the coordinates transmitted from its GPS (global positioning system) are used to set the adjacent point as a mark point. The topological map of the robot's work is formed according to the order in which the points were acquired and the order in which the robot passes through them. The attributes of the mark points fall into 4 types: conventional point, bifurcation point, endpoint, and charging point.
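The mark points and their connection processing can be modeled as a small graph; the type names mirror the four mark-point attributes above (conventional, bifurcation, endpoint, charging), while the data layout itself is an assumption.

```python
POINT_TYPES = {"conventional", "bifurcation", "endpoint", "charging"}

class TopologicalMap:
    """Mark points plus bidirectional adjacency, forming the walking-path map."""
    def __init__(self):
        self.points = {}   # point_id -> {"lat", "lon", "type"}
        self.edges = {}    # point_id -> set of adjacent point_ids

    def add_point(self, point_id, lat, lon, ptype="conventional"):
        if ptype not in POINT_TYPES:
            raise ValueError("unknown mark-point type")
        self.points[point_id] = {"lat": lat, "lon": lon, "type": ptype}
        self.edges.setdefault(point_id, set())

    def connect(self, a, b):
        """Connection processing: record that mark points a and b are adjacent."""
        self.edges[a].add(b)
        self.edges[b].add(a)
```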
Step S32: creating the job to be executed by the robot according to the topological map.
In this step, after receiving a click (or long-press, etc.) by the user on any mark point in the topological map, a mark-point configuration window is displayed. The window shows the ID and the longitude and latitude of the mark point, and the user can set the speed or/and the mode with which the robot passes the position, or/and the content of the task to execute there. Edited mark points are displayed differently to distinguish them from unedited ones; if two edited points are adjacent, the connecting line between them can also be displayed differently, for example in a different color, so that the user edits the job more intuitively and user experience is enhanced.
Step S33: sending the job to be executed to the robot, so as to instruct the robot to execute a corresponding program according to the job to be executed.
In this step, the client sends the created job to be executed to the robot, so that the robot executes the corresponding job. For details, refer to steps S21-S23 in the first embodiment, which are not repeated here.
Further, through the client the user can not only send the job to be executed to the robot but also modify, upload, or delete it, so that the user interacts better with the robot.
Instructing the robot to execute a corresponding program according to the job to be executed includes:
receiving an information-entry request input by a user;
detecting whether the user has information-entry permission; and
if so, sending an information-entry instruction to the robot, so that the robot records the face information detected while executing the job.
While the robot executes the corresponding job, a user (for example the above senior user) can send an information-entry request to the client. The client checks whether the user has permission to instruct the robot to perform information entry (for example, whether the user is a senior user); if so, an information-entry instruction is sent to the robot so that the robot performs the entry, which includes face-information entry. The face information refers to information about persons that the robot's vision equipment can monitor within its patrol range. The robot can report the entered information back to the client, ensuring that the client user knows the robot's behavior.
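The permission gate before face-information entry might look like the sketch below; the role names and the instruction format are assumptions, not the patent's actual protocol.

```python
PRIVILEGED_ROLES = {"senior"}    # which roles may trigger entry; set by an administrator

def handle_entry_request(user, send_to_robot):
    """Forward an information-entry instruction to the robot only for authorized users."""
    if user.get("role") in PRIVILEGED_ROLES:
        send_to_robot({"command": "enter_face_info"})
        return True
    return False
```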
It should be noted that the robot in the present application may be a security robot and the client may be an Android client; the two may communicate wirelessly or by other means. According to the embodiments provided by the present application, before executing a job the robot can receive the job to be executed sent by the Android client, where the job is created by the Android client according to the topological map of the robot's work. After receiving the job from the client, the robot acquires the job's execution path and execution times, and then carries it out the set number of times along the execution path. In this process, the client can preset the robot's jobs to be executed, thereby intervening in the robot's working process; the execution route can be planned when the robot executes tasks; and the robot can execute a created job multiple times, improving the robot's practicability.
Example three:
fig. 4 shows a schematic structural diagram of a robot control system provided in another embodiment of the present application, including a client 41 and a robot 42, where:
the client 41 includes:
a first obtaining unit 411, configured to obtain relevant information of a robot performing a job, and create a topological map of the robot work according to the relevant information;
a creating unit 412, configured to create a to-be-executed job of the robot according to the topological map;
a sending unit 413, configured to send the job to be executed to the robot, so as to instruct the robot to execute a corresponding program according to the job to be executed.
Optionally, the first obtaining unit 411 is specifically configured to:
acquiring position information of a place where the robot needs to pass when executing operation;
and creating a topological map of the robot work according to the position information.
Optionally, the sending unit 413 is specifically configured to: receiving an information input request input by a user; detecting whether the user has information input authority; and if so, sending an information input instruction to the robot so that the robot records the face information detected in the process of executing the operation.
The robot 42 includes:
the receiving unit 421 is configured to receive a job to be executed sent by a client, where the job to be executed is created by the client according to a topological map of the robot work;
a second obtaining unit 422, configured to obtain, according to the to-be-executed job, an execution path and an execution number when the robot executes a corresponding job;
and the execution unit 423 is used for instructing the robot to execute corresponding work according to the execution path and the execution times.
Optionally, the execution unit 423 is specifically configured to:
acquiring position information of the robot when the robot executes the operation and environmental parameters at the corresponding position in real time; and sending the position information and the environment parameters to the client.
Optionally, the robot 42 is further configured to monitor and receive a control instruction sent by the client when the robot does not execute a job; and adjusting the running speed or/and the running direction of the robot according to the control instruction.
Optionally, the robot 42 is further configured to receive an adjustment instruction sent by the pan/tilt head during the process of executing the job by the robot; and adjusting the shooting direction and/or the focal length of the robot camera according to the adjusting instruction.
Optionally, the robot 42 is further configured to detect whether an abnormality occurs while the robot performs the job;
if the robot is detected to be abnormal when the robot executes the operation, recording abnormal information;
and generating an abnormal log according to the abnormal information, and sending the abnormal log to the client so that the client displays the abnormal log to a user.
The above-described embodiments of the present invention should not be construed as limiting the scope of the present invention. Any other corresponding changes and modifications made according to the technical idea of the present invention should be included in the protection scope of the claims of the present invention.
Example four:
fig. 5 shows a schematic structural diagram of a robot provided in an embodiment of the present application. The robot 5 of this embodiment includes: a processor 50, a memory 51, and a computer program 52 stored in the memory 51 and executable on the processor 50, such as a program implementing the robot control method. When the processor 50 executes the computer program 52, the steps in the above embodiments of the control method are implemented, for example steps S21 to S23 shown in fig. 2.
The robot 5 may be a security robot. The robot 5 may include, but is not limited to, the processor 50 and the memory 51. Those skilled in the art will appreciate that fig. 5 is merely an example of the robot 5 and does not constitute a limitation of the robot 5, which may include more or fewer components than those shown, combine certain components, or have different components; for example, the robot 5 may further include input and output devices, network access devices, a bus, and the like.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, reference may be made to the corresponding process in the foregoing method embodiments for the specific working process described above, which is not repeated here.
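As a compact sketch of the overall client-robot flow described in the embodiments (client builds a topological map, creates a job to be executed, and the robot executes it along the execution path for the given number of executions), under the assumption of a simple chain-shaped map; all names are illustrative:

```python
class Client:
    """Hypothetical sketch of the client side: build a topological map
    from the places the robot must pass and create a job from it."""

    def create_topological_map(self, waypoints):
        # Nodes are the places the robot must pass; edges connect
        # consecutive places (a simple chain for illustration).
        return {"nodes": list(waypoints),
                "edges": list(zip(waypoints, waypoints[1:]))}

    def create_job(self, topo_map, times):
        # A job to be executed: an execution path plus an execution count.
        return {"path": topo_map["nodes"], "times": times}


class Robot:
    """Hypothetical sketch of the robot side: extract the execution
    path and count from the job and execute accordingly."""

    def __init__(self):
        self.visited = []

    def execute(self, job):
        path, times = job["path"], job["times"]
        for _ in range(times):
            for node in path:
                self.visited.append(node)  # stand-in for real motion


client = Client()
topo = client.create_topological_map(["gate", "lobby", "warehouse"])
job = client.create_job(topo, times=2)
robot = Robot()
robot.execute(job)
print(robot.visited)
```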
Claims (10)
1. A control method of a robot, characterized by comprising:
receiving a job to be executed sent by a client, wherein the job to be executed is created by the client according to a topological map of the robot's work;
acquiring, according to the job to be executed, an execution path and a number of executions for the robot to execute the corresponding job;
and instructing the robot to execute the corresponding job according to the execution path and the number of executions.
2. The control method of a robot according to claim 1, wherein the instructing the robot to execute the corresponding job according to the execution path and the number of executions comprises:
acquiring, in real time, position information of the robot while the robot executes the job and the environmental parameters at the corresponding position;
and sending the position information and the environmental parameters to the client.
3. The control method of a robot according to claim 1, further comprising:
monitoring for and receiving a control instruction sent by the client when the robot is not executing the job;
and adjusting the running speed and/or the running direction of the robot according to the control instruction.
4. The control method of a robot according to any one of claims 1 to 3, characterized in that the control method further comprises:
receiving an adjustment instruction sent by a pan-tilt head while the robot executes the job;
and adjusting the shooting direction and/or the focal length of the robot's camera according to the adjustment instruction.
5. The control method of a robot according to claim 1, comprising, after the instructing the robot to execute the corresponding job according to the execution path and the number of executions:
detecting whether an abnormality occurs while the robot executes the job;
if it is detected that an abnormality occurs while the robot executes the job, recording abnormality information;
and generating an abnormality log from the abnormality information and sending the abnormality log to the client, so that the client displays the abnormality log to a user.
6. A control method of a robot, characterized by comprising:
acquiring relevant information about the robot's execution of a job, and creating a topological map of the robot's work according to the relevant information;
creating a job to be executed by the robot according to the topological map;
and sending the job to be executed to the robot, so as to instruct the robot to execute a corresponding program according to the job to be executed.
7. The control method of a robot according to claim 6, wherein the acquiring of relevant information about the robot's execution of a job and the creating of a topological map of the robot's work according to the relevant information comprise:
acquiring position information of the places the robot needs to pass through when executing the job;
and creating a topological map of the robot's work according to the position information.
8. The control method of a robot according to claim 6, wherein the instructing the robot to execute a corresponding program according to the job to be executed includes:
receiving an information entry request input by a user;
detecting whether the user has information entry authority;
and if so, sending an information entry instruction to the robot, so that the robot records the face information detected in the process of executing the job.
9. A control system of a robot, characterized by comprising: a robot and a client, wherein:
the client comprises:
a first acquisition unit, configured to acquire relevant information about the robot's execution of a job and to create a topological map of the robot's work according to the relevant information;
a creating unit, configured to create a job to be executed by the robot according to the topological map;
and a sending unit, configured to send the job to be executed to the robot, so as to instruct the robot to execute a corresponding program according to the job to be executed;
the robot comprises:
a receiving unit, configured to receive the job to be executed sent by the client, wherein the job to be executed is created by the client according to the topological map of the robot's work;
a second acquisition unit, configured to acquire, according to the job to be executed, an execution path and a number of executions for the robot to execute the corresponding job;
and an execution unit, configured to instruct the robot to execute the corresponding job according to the execution path and the number of executions.
10. A robot comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the steps of the method according to any one of claims 1 to 5 are implemented when the computer program is executed by the processor.
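The authority check of claim 8 (receive an entry request, verify the user's authority, and only then instruct the robot to record face information) can be sketched as below; the function name, authorized-user set, and instruction string are all hypothetical:

```python
def handle_entry_request(user, authorized_users, send_instruction):
    """Hypothetical sketch of claim 8's flow: on an information entry
    request, detect whether the user has entry authority before
    instructing the robot to record detected face information."""
    if user in authorized_users:  # detect whether the user has authority
        send_instruction("record_face_information")
        return True
    return False  # unauthorized request: no instruction is sent


# Usage: capture instructions in a list instead of a real robot link.
instructions = []
ok = handle_entry_request("alice", {"alice", "bob"}, instructions.append)
denied = handle_entry_request("mallory", {"alice", "bob"}, instructions.append)
print(ok, denied, instructions)  # True False ['record_face_information']
```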
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811602936.6A CN111360816A (en) | 2018-12-26 | 2018-12-26 | Robot control method and system and robot |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111360816A true CN111360816A (en) | 2020-07-03 |
Family
ID=71200950
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811602936.6A Pending CN111360816A (en) | 2018-12-26 | 2018-12-26 | Robot control method and system and robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111360816A (en) |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110098874A1 (en) * | 2009-10-26 | 2011-04-28 | Electronics And Telecommunications Research Institute | Method and apparatus for navigating robot |
CN104243918A (en) * | 2014-09-03 | 2014-12-24 | 深圳奇沃智联科技有限公司 | Robot monitoring system automatically patrolling based on Bluetooth positioning |
CN204408502U (en) * | 2014-09-03 | 2015-06-17 | 深圳奇沃智联科技有限公司 | Application bluetooth locates the robot monitoring system automatically gone on patrol |
CN104965426A (en) * | 2015-06-24 | 2015-10-07 | 百度在线网络技术(北京)有限公司 | Intelligent robot control system, method and device based on artificial intelligence |
CN106021382A (en) * | 2016-05-10 | 2016-10-12 | 天津同丰信息技术有限公司 | Map base system applied to smart environmental sanitation system |
CN106506589A (en) * | 2016-09-28 | 2017-03-15 | 中国人民解放军国防科学技术大学 | A kind of robot cluster method and system |
CN108121330A (en) * | 2016-11-26 | 2018-06-05 | 沈阳新松机器人自动化股份有限公司 | A kind of dispatching method, scheduling system and map path planing method |
WO2018121448A1 (en) * | 2016-12-30 | 2018-07-05 | 深圳市杉川机器人有限公司 | Topology map creation method and navigation method for mobile robot, programmable device, and computer readable medium |
CN108422419A (en) * | 2018-02-09 | 2018-08-21 | 上海芯智能科技有限公司 | A kind of intelligent robot and its control method and system |
CN108602188A (en) * | 2015-12-11 | 2018-09-28 | 罗伯特有限责任公司 | Remote control movable independent robot |
CN108628314A (en) * | 2018-06-13 | 2018-10-09 | 西安交通大学 | A kind of Multi computer cooperation turf-mown machine people's system and method |
CN108818569A (en) * | 2018-07-30 | 2018-11-16 | 浙江工业大学 | Intelligent robot system towards public service scene |
JP6430079B1 (en) * | 2017-10-05 | 2018-11-28 | 三菱電機株式会社 | Monitoring system and monitoring method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108073277B (en) | System and method for virtual reality and augmented reality for industrial automation | |
US20210004025A1 (en) | Unmanned aerial vehicle and supervision method and monitoring system for flight state thereof | |
KR102296455B1 (en) | Methods and devices for controlling the flight of drones | |
US20190147655A1 (en) | Augmented reality safety automation zone system and method | |
JP2019117650A (en) | Method and apparatus for seamlessly transmitting state between user interface devices in mobile control room | |
EP3680648B1 (en) | Port machinery inspection device and inspection method | |
CN106502265A (en) | A kind of airline generation method and apparatus of unmanned vehicle | |
US20220410010A1 (en) | Game Scene Editing Method, Storage Medium, and Electronic Device | |
CN114779796A (en) | Flight control method and device, monitoring method and device and storage medium | |
CN109582034A (en) | A kind of multitask flight course planning method, apparatus and electronic equipment | |
EP3690561B1 (en) | Method and system for 3d visually monitoring a building, and memorizer | |
KR20170071443A (en) | Behavior-based distributed control system and method of multi-robot | |
CN113260939A (en) | Unmanned aerial vehicle control method, terminal device, unmanned aerial vehicle and storage medium | |
Wang | Cyber manufacturing: research and applications | |
CN109901830B (en) | Signal configuration method and system for scada system development | |
CN105320010A (en) | Unmanned plane flight control system supporting secondary exploitation | |
WO2020061855A1 (en) | Special robot control system, method, electronic device, medium and program | |
US11281456B2 (en) | Application development environment providing system, application development environment provision method, terminal device, and application display method | |
CN111360816A (en) | Robot control method and system and robot | |
CN108490981A (en) | A kind of holder servo intelligent control method and system | |
US20110172791A1 (en) | Automatically addressable configuration system for recognition of a motion tracking system and method of use | |
KR20190006634A (en) | System, method and apparatus for wide area drone operation | |
CN109799841A (en) | A kind of unmanned aerial vehicle ground control system, equipment and storage medium | |
CN116931596A (en) | Unmanned aerial vehicle flight system with flight program automatically arranged | |
US20230305931A1 (en) | Control device and non-transitory machine readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20200703 |