CN111203874B - Robot control method, device, electronic device and storage medium - Google Patents

Robot control method, device, electronic device and storage medium

Info

Publication number
CN111203874B
CN111203874B (application CN201911364602.4A)
Authority
CN
China
Prior art keywords
robot
target
robots
identification information
instruction
Prior art date
Legal status
Active
Application number
CN201911364602.4A
Other languages
Chinese (zh)
Other versions
CN111203874A (en)
Inventor
陈远亮
王轶丹
崔宁
熊友军
Current Assignee
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN201911364602.4A priority Critical patent/CN111203874B/en
Publication of CN111203874A publication Critical patent/CN111203874A/en
Application granted granted Critical
Publication of CN111203874B publication Critical patent/CN111203874B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The application relates to the technical field of robots and provides a robot control method comprising the following steps: receiving a first control instruction for one or more target robots, the first control instruction comprising identification information of the target robots; determining the type of each target robot according to the identification information; determining, among a plurality of preset communication modes, the target communication mode corresponding to each target robot according to the identification information; converting the first control instruction into a second control instruction corresponding to the type of each target robot; and sending the second control instruction to each target robot through the target communication mode corresponding to that robot. One terminal device can thereby control a plurality of robots through a plurality of communication modes, which improves operating efficiency and user experience.

Description

Robot control method, device, electronic device and storage medium
Technical Field
The present application relates to the field of robotics, and in particular, to a method and an apparatus for controlling a robot, an electronic device, and a storage medium.
Background
Current mobile robot control software can control only a single type of robot, or a series of robots that share similar communication protocols. In some application scenarios, however, different robots use different communication modes, so obtaining the state information of several robots at the same time, or controlling several robots at the same time, becomes a problem to be solved.
Disclosure of Invention
Embodiments of the present application provide a method and an apparatus for controlling a robot, an electronic device, and a storage medium, which may solve at least some of the above problems.
In a first aspect, an embodiment of the present application provides a control method for a robot, including:
in response to receiving a first control instruction for one or more target robots, the first control instruction comprising identification information of the target robots; determining the type of each target robot according to the identification information; determining a target communication mode corresponding to each target robot in a plurality of preset communication modes according to the identification information; converting the first control instruction into a second control instruction corresponding to the type of each target robot; and sending the second control instruction to each target robot through each target communication mode corresponding to each target robot.
It can be understood that the user instruction is converted, according to the type of the robot, into an instruction that the target robot can recognize, and that instruction is sent to the robot through one of a plurality of communication modes preset in the control software to control the robot. One terminal device can thereby control a plurality of robots through a plurality of communication modes, which improves operating efficiency and user experience.
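For illustration only (this sketch is not part of the patent, and all names, types, and encodings in it are hypothetical), the steps of the first aspect can be pictured as a single dispatch routine in Kotlin: look up each target robot's type and preset communication mode from its identification information, convert the user-level instruction into a type-specific second instruction, and send it over the matching channel.

```kotlin
// Minimal sketch of the first-aspect flow; names and data shapes are assumptions.
enum class RobotType { WHEELED, CLEANING, TRACKED }
enum class CommMode { BLUETOOTH, WIFI, MOBILE_NETWORK }

data class FirstControlInstruction(val action: String, val targetIds: List<String>)

// Preset mappings from identification information to type and communication mode.
val typeById = mapOf("R-201" to RobotType.WHEELED, "R-202" to RobotType.CLEANING, "R-203" to RobotType.TRACKED)
val modeById = mapOf("R-201" to CommMode.BLUETOOTH, "R-202" to CommMode.WIFI, "R-203" to CommMode.MOBILE_NETWORK)

// Convert the user-level command into a robot-recognizable second instruction (invented encodings).
fun convert(action: String, type: RobotType): String = when (type) {
    RobotType.WHEELED  -> "WHL:$action"      // e.g. a Bluetooth message payload
    RobotType.CLEANING -> "CLN/$action"      // e.g. a WiFi message field
    RobotType.TRACKED  -> "0x10:$action"     // e.g. a binary opcode prefix
}

fun send(id: String, mode: CommMode, payload: String) {
    // Placeholder for the operating system communication interface (Bluetooth/WiFi/mobile network).
    println("to $id over $mode: $payload")
}

fun dispatch(cmd: FirstControlInstruction) {
    for (id in cmd.targetIds) {
        val type = typeById[id] ?: continue          // S120: determine robot type
        val mode = modeById[id] ?: continue          // S130: determine communication mode
        val second = convert(cmd.action, type)       // S140: convert instruction
        send(id, mode, second)                       // S150: send via the target mode
    }
}

fun main() {
    dispatch(FirstControlInstruction("WAKE_UP", listOf("R-201", "R-202", "R-203")))
}
```

Running main() would print one converted payload per robot, each tagged with a different communication mode.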
In a second aspect, an embodiment of the present application provides a control apparatus for a robot, including:
an instruction response module for responding to a received first control instruction for one or more target robots, the first control instruction comprising identification information of the target robots;
the robot type determining module is used for determining the type of each target robot according to the identification information;
a communication mode determining module, configured to determine, according to the identification information, a target communication mode corresponding to each of the target robots among a plurality of preset communication modes;
the instruction conversion module is used for converting the first control instruction into a second control instruction corresponding to the type of each target robot;
and the instruction sending module is used for sending the second control instruction to each target robot through each target communication mode corresponding to each target robot.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the method steps of the first aspect described above.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, including: the computer readable storage medium stores a computer program which, when executed by a processor, performs the method steps of the first aspect described above.
In a fifth aspect, an embodiment of the present application provides a computer program product which, when run on an electronic device, causes the electronic device to perform the method steps of the first aspect described above.
It is understood that, for the beneficial effects of the second to fifth aspects, reference may be made to the related description of the first aspect, which is not repeated here.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic view of an application scenario of a robot control system according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a control method of a robot according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a software architecture applicable to a control method of a robot according to an embodiment of the present application;
fig. 4 is a schematic flowchart of a control method of a robot according to another embodiment of the present disclosure;
fig. 5 is a schematic flowchart of a control method of a robot according to another embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a control device of a robot according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
Current mobile robot control software can control only a single type of robot, or a series of robots that share similar communication protocols. In some application scenarios, however, different robots use different communication modes and apply different communication protocols, so obtaining the state information of several robots at the same time, or controlling several robots at the same time, becomes a problem to be solved.
Fig. 1 illustrates a robot control system provided in an embodiment of the present application, which includes a terminal device 10 and one or more robots. Fig. 1 shows a multi-robot control scenario of the robot control system, in which the terminal device 10 controls multiple robots, including but not limited to a wheeled self-propelled robot 201, an auxiliary cleaning robot 202, and a tracked self-propelled robot 203, through built-in robot control software 11. The application scenario shown in fig. 1 is only an example; those skilled in the art will understand, under its teaching, that the embodiments of the present application are applicable to other multi-robot control scenarios.
In the application scenario shown in fig. 1, the wheeled self-propelled robot 201 communicates with the terminal device 10 through the Bluetooth communication protocol; the auxiliary cleaning robot 202 communicates with the terminal device 10 through the WiFi communication protocol; and the tracked self-propelled robot 203 communicates with the terminal device 10 through a mobile communication network protocol.
In the application scenario shown in fig. 1, in some embodiments the terminal device 10 needs to send a wake-up instruction to the three robots at the same time so that they operate according to a predetermined program, to send a traveling instruction to the three robots at the same time so that they move to a specified place, or to obtain the state information of the three robots at the same time so that the user can know their operating states. Because the three robots communicate with the terminal through different communication protocols, it is inconvenient for the terminal device 10 to control each robot or to check its state.
To solve at least part of these problems, the embodiments of the present application provide a robot control method. A user instruction is converted, according to the type of each target robot, into an instruction that the robot can recognize, a plurality of communication modes for communicating with the robots are preset in the control software, and the recognizable instruction is sent to each robot through the corresponding communication mode to control the robot. One terminal device can thereby control a plurality of robots through a plurality of communication modes, which improves operating efficiency and user experience.
The terminal device provided in the embodiment of the present application includes, but is not limited to, a mobile phone, a tablet computer, a wearable device, an in-vehicle device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), and other terminal devices, and the specific type of the terminal device is not limited in the embodiment of the present application.
The robots provided in the embodiments of the present application include, but are not limited to, robots that communicate with a terminal device wirelessly, for example over a Bluetooth network, a WiFi network, a mobile communication network, or an infrared link, or through a wired connection.
Fig. 2 illustrates a control method of a robot according to an embodiment of the present application, which is applied to the terminal device 10 in the robot control system illustrated in fig. 1 and can be implemented by software and/or hardware of the terminal device. As shown in fig. 2, the method includes steps S110 to S150. The implementation principle of each step is as follows:
s110, responding to a received first control instruction aiming at one or more target robots, wherein the first control instruction comprises identification information of the target robots.
In a non-limiting example, a user selects one or more target robots through a user operation interface of the terminal device and sends a first control instruction to the target robots through the user operation interface; the terminal device receives and responds to the first control instruction for the one or more target robots. The first control instruction comprises identification information of each target robot.
In one non-limiting example, the terminal device responds to a first control instruction sent by the user to the terminal device through an instruction receiving device such as voice, gesture, motion or bioelectric signal.
Without limitation, the first control instructions include, but are not limited to, instructions for controlling the robot to perform a specified action and instructions for controlling the robot to communicate; the command for controlling the robot to execute the specified action comprises but is not limited to a light control command, a traveling control command, a sounding control command, a waking/standby control command, a communication control command and the like; the instructions for controlling the robot to communicate include, but are not limited to, instructions for controlling the robot to feed back status information, instructions for controlling the target robot to communicate with other devices or the robot.
Without limitation, the identification information of the robot is used to uniquely identify the robot to distinguish different robots; the identification information of the robot may be a serial number of the robot, a certificate of the robot, or address information of the robot, which is not limited specifically herein.
In a non-limiting example, fig. 3 shows a software system architecture diagram of the robot control software 11 running in the terminal device 10 in some embodiments of the present application. The software architecture includes an application layer component 111, a Software Development Kit (SDK) component 112, and an operating system communication interface component 113. The SDK component 112 includes an external interface layer component 1121, a search and connection layer component 1122, and a logic processing layer component 1123. The external interface layer component includes, but is not limited to, a robot control interface component 11211, a robot data query interface component 11212, and a robot search and connect interface component 11213. The search and connection layer component 1122 includes a plurality of robot connector components 11221 for implementing the different preset communication modes. The logic processing layer component 1123 includes one or more robot object components 11231, each of which converts the instructions or state data corresponding to the identification information of one robot and communicates with the robot connector corresponding to the communication mode of that robot's identification information. The application layer component 111 handles data interaction with the user; the external interface layer component 1121 provides the interface between the application layer component 111 and both the search and connection layer component 1122 and the logic processing layer component 1123; and the operating system communication interface component 113 is used by the terminal device 10 to communicate with a plurality of robot entities 20. Illustratively, the robot entities 20 include, but are not limited to, the wheeled self-propelled robot 201, the auxiliary cleaning robot 202, and the tracked self-propelled robot 203 shown in fig. 1.
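As a reading aid for the layering in fig. 3, the following Kotlin interfaces sketch one possible shape of the SDK components; they are assumptions for illustration, not an API defined by the patent.

```kotlin
// Hypothetical interfaces mirroring the SDK component 112 in fig. 3.
interface RobotConnector {                     // search and connection layer 1122
    val mode: String                           // e.g. "BLUETOOTH", "WIFI", "MOBILE"
    fun discover(): List<String>               // identification information of reachable robots
    fun transmit(robotId: String, payload: ByteArray)
}

interface RobotObject {                        // logic processing layer 1123
    val robotId: String
    fun toSecondInstruction(firstInstruction: String): ByteArray  // instruction conversion
    fun toSecondStateData(firstStateData: ByteArray): String      // state-data conversion
}

// External interface layer 1121: the three interfaces exposed to the application layer.
interface RobotControlInterface { fun control(robotIds: List<String>, firstInstruction: String) }
interface RobotDataQueryInterface { fun queryState(robotId: String): String }
interface RobotSearchConnectInterface { fun search(): List<String>; fun connect(robotIds: List<String>) }
```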
As shown in fig. 3, a user selects one or more robots through the user operation interface of the terminal device and issues a traveling instruction to the selected robots; the instruction includes the identification information, such as the robot number, of each target robot that is to execute it. The application layer component of the robot control software in the terminal device responds to the traveling instruction received through the user operation interface and sends it to the external interface layer component, and the external interface layer component forwards the first control instruction, namely the traveling instruction, to the logic processing layer component through its robot control interface.
S120, determining the type of each target robot according to the identification information.
In a non-limiting example, according to the identification information of the target robot, for example the robot number, the terminal device obtains the type of the robot corresponding to the identification information by querying the mapping relationship between identification information and robot type.
In a non-limiting example, as shown in fig. 3, after receiving the first control instruction through the robot control interface, the logic processing layer component of the robot control software in the terminal device queries the mapping relationship between identification information and robot type according to the identification information of each robot included in the first control instruction, determines the type of the target robot corresponding to each piece of identification information, and sends the first control instruction to the robot object component corresponding to each type. Without limitation, each type may correspond to several different robot object components, each robot object component corresponds to unique robot identification information, and each robot object component processes the control instructions of the target robot corresponding to that identification information.
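Assuming the mapping between identification information and robot type is a simple lookup table and that one robot object component exists per identification, S120 might look like the following sketch (all identifiers are hypothetical):

```kotlin
// Hypothetical S120 lookup: identification information -> robot type -> robot object component.
data class RobotObjectComponent(val robotId: String, val type: String)

val robotTypeById = mapOf("R-201" to "WHEELED", "R-202" to "CLEANING", "R-203" to "TRACKED")
val robotObjectById = robotTypeById.mapValues { (id, type) -> RobotObjectComponent(id, type) }

fun routeFirstInstruction(targetIds: List<String>, firstInstruction: String) {
    for (id in targetIds) {
        val obj = robotObjectById[id] ?: continue   // unknown identification information is skipped
        println("robot $id (type=${obj.type}) receives '$firstInstruction' for conversion")
    }
}
```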
S130, determining a target communication mode corresponding to each target robot among a plurality of preset communication modes according to the identification information.
In a non-limiting example, the terminal device determines a target communication mode corresponding to each target robot among a plurality of preset communication modes according to a mapping relationship between the identification information of the target robot and the plurality of preset communication modes.
In a non-limiting example, after receiving the first control instruction, the robot object component in the logic processing layer component of the robot control software in the terminal device requests the search and connection layer component to determine, according to the identification information, the target communication mode corresponding to each target robot among the plurality of preset communication modes. Without limitation, determining the target communication mode corresponding to each target robot may include selecting, from a plurality of preset robot connector components, the connector component corresponding to the identification information of the target robot to handle communication between that robot and the terminal device.
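S130 can be pictured as choosing one connector from a fixed registry keyed by the robot's identification information; the sketch below is an assumption-based illustration with made-up connector classes:

```kotlin
// Hypothetical connector registry for S130.
interface Connector { val mode: String; fun send(robotId: String, payload: String) }

class BluetoothConnector : Connector {
    override val mode = "BLUETOOTH"
    override fun send(robotId: String, payload: String) = println("BT -> $robotId: $payload")
}
class WifiConnector : Connector {
    override val mode = "WIFI"
    override fun send(robotId: String, payload: String) = println("WiFi -> $robotId: $payload")
}

// Mapping from robot identification information to its preset connector.
val connectorById: Map<String, Connector> =
    mapOf("R-201" to BluetoothConnector(), "R-202" to WifiConnector())

fun selectConnector(robotId: String): Connector? = connectorById[robotId]
```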
S140, converting the first control instruction into a second control instruction corresponding to the type of each target robot.
In one non-limiting example, the terminal device converts the first control instruction into a second control instruction corresponding to the type of each target robot.
In one non-limiting example, the first control instruction is a user instruction, i.e. an action that the user desires the robot to perform, and the second control instruction is a robot-executable instruction, including but not limited to a call to a robot instruction interface function, an instruction message, or an instruction encoding that the robot can recognize and execute. In some implementations, the first control instruction received by the terminal device is a blink-indicator-light instruction, but the instructions that the respective robots, for example the wheeled self-propelled robot, the tracked self-propelled robot, or the auxiliary cleaning robot, can directly execute differ. As shown in fig. 3, a robot object component in the logic processing layer component of the robot control software converts the first control instruction into a second control instruction corresponding to the type of each target robot, for example converting the blink-indicator-light instruction into an instruction encoding that the tracked self-propelled robot can directly execute.
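As a purely illustrative sketch of S140 (the instruction forms and encodings below are invented, not taken from any real robot), a blink-indicator-light instruction might be converted as follows:

```kotlin
// Hypothetical S140 conversion of a first control instruction into type-specific second instructions.
sealed class SecondInstruction
data class InterfaceCall(val function: String, val args: List<Int>) : SecondInstruction() // e.g. an SDK call
data class InstructionMessage(val json: String) : SecondInstruction()                     // e.g. a WiFi message
data class InstructionCode(val bytes: ByteArray) : SecondInstruction()                    // e.g. a binary opcode

fun convertBlink(robotType: String): SecondInstruction = when (robotType) {
    "WHEELED"  -> InterfaceCall("setIndicator", listOf(1, 500))        // blink on, 500 ms period
    "CLEANING" -> InstructionMessage("""{"cmd":"led","state":"blink"}""")
    "TRACKED"  -> InstructionCode(byteArrayOf(0x10, 0x01))             // made-up opcode + operand
    else       -> error("unknown robot type: $robotType")
}
```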
It can be understood that step S130 and step S140 are not limited to a particular order; their execution order may be chosen according to actual needs when the embodiments of the present application are implemented.
S150, sending the second control instruction to each target robot through each target communication mode corresponding to each target robot.
In one non-limiting example, the terminal device sends the second control instruction to each of the target robots through each of the target communication means corresponding to each of the target robots.
In one non-limiting example, the first control instruction is an instruction to travel to a target position. For the wheeled self-propelled robot, the second control instruction is a control message corresponding to that robot and carried over the Bluetooth communication protocol; for the auxiliary cleaning robot, the second control instruction is a control message corresponding to that robot and carried over the WiFi communication protocol. The terminal device sends the second control instructions to the wheeled self-propelled robot and the auxiliary cleaning robot through the Bluetooth and WiFi communication protocols, respectively.
In a non-limiting example, as shown in fig. 3, in the logic processing layer component of the robot control software in the terminal device, the robot object component corresponding to each piece of robot identification information obtains the communication mode of its robot through the robot connector component corresponding to that identification information, and sends the second control instruction of each target robot to that robot through the operating system communication interface component using the obtained communication mode.
It can be understood that the user instruction is converted, according to the type of the robot, into an instruction that the target robot can recognize, and that instruction is sent to the robot through one of a plurality of communication modes preset in the control software to control the robot. One terminal device can thereby control a plurality of robots through a plurality of communication modes, which improves operating efficiency and user experience.
On the basis of the above-mentioned embodiment of the control method of the robot shown in fig. 2, before responding to the received first control instruction for the one or more target robots, as shown in fig. 4, the method further includes:
and S101, responding to the received search instruction, and searching robots respectively matched with a plurality of preset communication modes in the current environment.
In a non-limiting example, a user issues a search instruction through the user operation interface of the terminal device, and in response to the search instruction received through the user operation interface, the terminal device searches the network environments of the various communication networks it can currently connect to for robots matching each of the plurality of preset communication modes. Without limitation, the terminal device may search, through the corresponding network connections, for robots that can be detected in a Bluetooth network environment, a WiFi network environment, and a mobile communication network environment.
In a non-limiting example, as shown in fig. 3, a user issues a search instruction through the user operation interface of the terminal device. The terminal device receives the search instruction through the application layer component and, in response, forwards it to the search and connection layer component through the robot search and connect interface of the external interface layer component; the preset robot connector components corresponding to the different communication modes in the search and connection layer component then search, through the operating system communication interface component, for the robots reachable over each communication mode.
S102, generating a second robot list according to the identification information of the one or more robots found by the search.
In a non-limiting example, the terminal device generates a second robot list according to the identification information of the one or more robots. The second robot list lists the robots with which the terminal device can establish a communication connection through each preset communication mode; without limitation, it includes the identification information of each robot and the communication mode corresponding to that identification information.
In one non-limiting example, as shown in fig. 3, each preset communication mode is realized by the robot connector component corresponding to that communication mode.
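Assuming each preset communication mode exposes a discovery operation through its connector component, S101 and S102 could be sketched as below; the connector names and discovery results are hypothetical:

```kotlin
// Hypothetical S101/S102: search every preset communication mode and build the second robot list.
data class RobotListEntry(val robotId: String, val commMode: String)

interface SearchableConnector {
    val mode: String
    fun discover(): List<String>   // identification information of robots reachable over this mode
}

fun buildSecondRobotList(connectors: List<SearchableConnector>): List<RobotListEntry> =
    connectors.flatMap { connector ->
        connector.discover().map { id -> RobotListEntry(id, connector.mode) }
    }

// Usage with stubbed discovery results: one entry per detected robot, tagged with the mode that found it.
fun main() {
    val connectors = listOf(
        object : SearchableConnector { override val mode = "BLUETOOTH"; override fun discover() = listOf("R-201") },
        object : SearchableConnector { override val mode = "WIFI"; override fun discover() = listOf("R-202") }
    )
    println(buildSecondRobotList(connectors))
}
```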
S103, in response to a selection instruction for the second robot list, establishing a connection with the robot corresponding to the robot identification information in the selection instruction, and generating a first robot list.
In one non-limiting example, the terminal device responds to a selection instruction for the second robot list, establishes a connection with the robot corresponding to the robot identification information in the selection instruction, and generates the first robot list. The first robot list is the list of robots that have established a communication connection with the terminal device.
In a non-limiting example, after the terminal device generates the second robot list, the second robot list is presented through the user operation interface, and the user selects the robots in the second robot list through the user operation interface to determine one or more robots in the second robot list that need to establish a communication connection.
In one non-limiting example, after establishing a connection with one or more robots, the logic processing layer establishes a mapping relationship between the robot identification information and the communication mode according to the identification information of the connected robots, and generates an instance of a robot object component corresponding to the robot identification information.
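One possible, assumption-based sketch of S103: when the user's selection arrives, record the identification-to-communication-mode mapping and create a robot object instance per connected robot.

```kotlin
// Hypothetical S103: connect to the selected robots and instantiate one robot object per identification.
data class RobotObjectInstance(val robotId: String, val commMode: String)

class LogicProcessingLayer {
    private val modeById = mutableMapOf<String, String>()          // identification -> communication mode
    private val objectsById = mutableMapOf<String, RobotObjectInstance>()

    fun connectSelected(selection: Map<String, String>): List<String> {
        for ((robotId, commMode) in selection) {
            modeById[robotId] = commMode                           // record the mapping for later sends
            objectsById[robotId] = RobotObjectInstance(robotId, commMode)
        }
        return objectsById.keys.toList()                           // the first robot list (connected robots)
    }
}
```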
In one non-limiting example, after the first robot list is generated, the method further comprises displaying the first robot list through a user operation interface so that the user can check the information of the connected robots.
In one non-limiting example, it is understood that the terminal device responding to a received first control instruction for one or more target robots comprises responding to a received first control instruction for one or more target robots in the first robot list.
It can be understood that, by searching for a robot capable of establishing a communication connection with a terminal device in the current network environment and establishing the communication connection, the controlled robot can be confirmed in advance, thereby providing convenience for a user to control a plurality of robots through the terminal device.
On the basis of the above embodiment of the control method of the robot shown in fig. 4, after establishing a connection with the robots corresponding to the identification information of one or more robots in the selection instruction in response to the selection instruction for the second robot list, as shown in fig. 5, the method further includes:
s210, receiving first state data sent by each connected robot, wherein the first state data comprises identification information of the robot.
In one non-limiting example, the terminal device communicates with each robot that has established a communication connection and receives the first state data sent by each connected robot, wherein the first state data includes the identification information of the robot. The first state data includes, but is not limited to, position information of the robot, software and hardware running state information of the robot, fault information of the robot, and feedback information on control instructions executed by the robot.
In a non-limiting example, it is understood that, after establishing a communication connection with the terminal device, a robot may send the first state data to the terminal device according to its own software and hardware settings and operating conditions, or in response to a second control instruction sent by the terminal device, where the second control instruction instructs the target robot to feed back the first state data. Accordingly, the terminal device receives the first state data sent by each connected robot after sending the second control instruction to each target robot through the target communication mode corresponding to that robot.
In one non-limiting example, the terminal device receives first status data sent by each robot that has established a communication connection through the operating system communication interface component.
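A sketch of S210 under the assumption that each item of first state data arrives as a payload tagged with the sending robot's identification information and is routed to the matching robot object (names are illustrative):

```kotlin
// Hypothetical S210: receive first state data and route it by robot identification information.
data class FirstStateData(val robotId: String, val payload: ByteArray)

class StateRouter(private val typeById: Map<String, String>) {
    fun onFirstStateData(data: FirstStateData) {
        val type = typeById[data.robotId]                 // look up the type recorded for this identification
        if (type == null) {
            println("state from unknown robot ${data.robotId} dropped")
            return
        }
        println("routing ${data.payload.size} bytes from ${data.robotId} (type=$type) to its robot object")
    }
}

fun main() {
    val router = StateRouter(mapOf("R-203" to "TRACKED"))
    router.onFirstStateData(FirstStateData("R-203", byteArrayOf(0x01, 0x00)))
}
```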
S220, determining the type of each connected robot according to the identification information.
In a non-limiting example, the terminal device determines the type of each connected robot through the mapping relationship between the identification information and the robot type according to the identification information in each first state data.
In a non-limiting example, the logic processing layer component of the robot control software in the terminal device determines the type of each robot by querying the mapping table between identification information and robot type according to the robot identification information in the received first state data, determines the instance of the robot object component corresponding to that identification information, and distributes each item of first state data to the robot object component corresponding to the robot identification information for processing.
S230, converting the first state data into second state data corresponding to the type of each of the connected robots.
In one non-limiting example, the terminal device converts the first state data into second state data corresponding to the type of each connected robot. The second state data includes, but is not limited to, data that the terminal device can display on the user operation interface, data that the terminal device can convert into speech played through a sound-generating device, and data that the terminal device can use to control the display state of an indicator lamp.
In a non-limiting example, the object component corresponding to the robot identification information receives the first state data, and converts the first state data into second state data corresponding to the robot type according to the robot type corresponding to the first state data and the identification information.
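For S230, the following sketch converts raw, type-specific state payloads into a uniform human-readable form for the user operation interface; the field layouts are invented for illustration only:

```kotlin
// Hypothetical S230: convert first state data into uniform, user-readable second state data.
fun toSecondStateData(robotType: String, payload: ByteArray): String = when (robotType) {
    // Assume the wheeled robot reports battery percent in byte 0.
    "WHEELED"  -> "Wheeled robot: battery ${payload[0].toInt()}%"
    // Assume the cleaning robot reports a UTF-8 status word.
    "CLEANING" -> "Cleaning robot: ${payload.toString(Charsets.UTF_8)}"
    // Assume the tracked robot reports a fault flag in byte 0.
    "TRACKED"  -> if (payload.isNotEmpty() && payload[0].toInt() != 0) "Tracked robot: FAULT" else "Tracked robot: OK"
    else       -> "Unknown robot type"
}

fun main() {
    println(toSecondStateData("WHEELED", byteArrayOf(87)))            // Wheeled robot: battery 87%
    println(toSecondStateData("CLEANING", "DOCKED".toByteArray()))    // Cleaning robot: DOCKED
}
```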
S240, displaying the second state data through a user operation interface.
In one non-limiting example, the robot object component of the robot control software in the terminal device passes the second state data to the application layer component through the robot data query interface of the external interface layer component, and the application layer component displays the second state data on the user operation interface.
In one non-limiting example, displaying the second state data through the user operation interface includes displaying the second state data through the user operation interface in response to a first query instruction. Without limitation, a user issues the first query instruction through the user operation interface, and the terminal device converts the first query instruction into a second query instruction that each robot can execute and sends it to each robot with which a communication connection has been established.
It can be understood that the first state data sent by the robots, which the user cannot directly interpret, is received and converted, according to the robot identification information it carries, into uniform second state data that the user can understand, so that the user can check the operating states of multiple robots of different types through the terminal device, which improves the user experience.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Corresponding to the method for controlling the robot shown in fig. 2, fig. 6 shows a device for controlling a robot according to an embodiment of the present application, including:
an instruction response module M110 for responding to a received first control instruction for one or more target robots, the first control instruction including identification information of the target robots;
a robot type determining module M120, configured to determine the type of each target robot according to the identification information;
a communication mode determining module M130, configured to determine, according to the identification information, a target communication mode corresponding to each target robot among a plurality of preset communication modes;
an instruction conversion module M140, configured to convert the first control instruction into a second control instruction corresponding to the type of each target robot;
and the instruction sending module M150 is configured to send the second control instruction to each target robot through each target communication mode corresponding to each target robot.
It is understood that various embodiments and combinations of the embodiments in the above embodiments and their advantages are also applicable to this embodiment, and are not described herein again.
Fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 7, the terminal device D7 of this embodiment includes: at least one processor D70 (only one of which is shown in fig. 7), a memory D71, and a computer program D72 stored in the memory D71 and executable on the at least one processor D70, the processor D70, when executing the computer program D72, implementing the steps in any of the various method embodiments described above. Alternatively, the processor D70 realizes the functions of the modules/units in the above-mentioned device embodiments when executing the computer program D72.
The terminal device D7 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The terminal device may include, but is not limited to, a processor D70, a memory D71. Those skilled in the art will appreciate that fig. 7 is merely an example of the terminal device D7 and does not constitute a limitation of the terminal device D7, and may include more or less components than those shown, or combine some components, or different components, such as input output devices, network access devices, etc.
The processor D70 may be a Central Processing Unit (CPU); it may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or any conventional processor.
The memory D71 may in some embodiments be an internal storage unit of the terminal device D7, such as a hard disk or memory of the terminal device D7. In other embodiments, the memory D71 may also be an external storage device of the terminal device D7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the terminal device D7. Further, the memory D71 may include both an internal storage unit and an external storage device of the terminal device D7. The memory D71 is used for storing an operating system, application programs, a boot loader (BootLoader), data, and other programs, such as the program code of the computer program, and may also be used to temporarily store data that has been output or is to be output.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
The embodiments of the present application provide a computer program product, which when running on a terminal device, enables the terminal device to implement the steps in the above method embodiments when executed.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the photographing apparatus/terminal apparatus, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, computer-readable media may not include electrical carrier signals or telecommunication signals.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (8)

1. A method for controlling a robot, comprising:
in response to receiving a first control instruction for one or more target robots, the first control instruction comprising identification information of the target robots;
determining the type of each target robot according to the identification information;
determining a target communication mode corresponding to each target robot in a plurality of preset communication modes according to the identification information;
converting the first control instruction into a second control instruction corresponding to the type of each target robot;
sending the second control instruction to each target robot through each target communication mode corresponding to each target robot;
wherein, before responding to the received first control instruction for the one or more target robots, the method further comprises: in response to a received search instruction, searching the current environment for robots respectively matching a plurality of preset communication modes; generating a second robot list according to the identification information of the one or more robots found by the search, wherein the second robot list comprises the identification information of each robot and the communication mode corresponding to the identification information; and in response to a selection instruction for the second robot list, establishing a connection with the robot corresponding to the robot identification information in the selection instruction, and generating a first robot list;
and wherein, after establishing a connection with the robots corresponding to the identification information of the one or more robots in the selection instruction in response to the selection instruction for the second robot list, the method further comprises: receiving first state data sent by each connected robot, wherein the first state data comprises the identification information of the robot and further comprises at least one of position information of the connected robot, software and hardware information, fault information, and feedback information on executed control instructions; determining the type of each connected robot according to the identification information; converting the first state data into second state data corresponding to the type of each connected robot; and displaying the second state data through a user operation interface.
2. The method of claim 1, wherein after generating the first list of robots, further comprising:
displaying the first robot list through a user operation interface.
3. The method of claim 1 or 2, wherein receiving the first control instruction for the one or more target robots comprises:
receiving a first control instruction for one or more target robots in the first robot list.
4. The method of claim 1, wherein displaying the second state data through the user operation interface comprises:
displaying the second state data through the user operation interface in response to a first query instruction.
5. The method of claim 1, wherein the second control instruction is used to instruct the target robot to feed back the first state data; and accordingly,
the first state data sent by each connected robot is received after the second control instruction is sent to each target robot through the target communication mode corresponding to each target robot.
6. A control device for a robot, comprising:
an instruction response module for responding to a received first control instruction for one or more target robots, the first control instruction comprising identification information of the target robots;
the robot type determining module is used for determining the type of each target robot according to the identification information;
a communication mode determining module, configured to determine, according to the identification information, a target communication mode corresponding to each of the target robots among a plurality of preset communication modes;
the instruction conversion module is used for converting the first control instruction into a second control instruction corresponding to the type of each target robot;
the instruction sending module is used for sending the second control instruction to each target robot through each target communication mode corresponding to each target robot;
wherein, before responding to the received first control instruction for the one or more target robots, the control device is further configured to: in response to a received search instruction, search the current environment for robots respectively matching a plurality of preset communication modes; generate a second robot list according to the identification information of the one or more robots found by the search, wherein the second robot list comprises the identification information of each robot and the communication mode corresponding to the identification information; and in response to a selection instruction for the second robot list, establish a connection with the robot corresponding to the robot identification information in the selection instruction and generate a first robot list;
and wherein, after establishing a connection with the robots corresponding to the identification information of the one or more robots in the selection instruction in response to the selection instruction for the second robot list, the control device is further configured to: receive first state data sent by each connected robot, wherein the first state data comprises the identification information of the robot and further comprises at least one of position information of the connected robot, software and hardware information, fault information, and feedback information on executed control instructions; determine the type of each connected robot according to the identification information; convert the first state data into second state data corresponding to the type of each connected robot; and display the second state data through a user operation interface.
7. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of any of claims 1 to 5 when executing the computer program.
8. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 5.
CN201911364602.4A 2019-12-26 2019-12-26 Robot control method, device, electronic device and storage medium Active CN111203874B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911364602.4A CN111203874B (en) 2019-12-26 2019-12-26 Robot control method, device, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911364602.4A CN111203874B (en) 2019-12-26 2019-12-26 Robot control method, device, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN111203874A CN111203874A (en) 2020-05-29
CN111203874B true CN111203874B (en) 2021-10-29

Family

ID=70780463

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911364602.4A Active CN111203874B (en) 2019-12-26 2019-12-26 Robot control method, device, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN111203874B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112207824B (en) * 2020-09-22 2022-07-01 慧灵科技(深圳)有限公司 Method, system, device and storage medium for controlling multiple single-axis modules
CN114343483B (en) * 2020-10-12 2023-08-18 百度在线网络技术(北京)有限公司 Control method, device, equipment and storage medium for movable object
CN112518747A (en) * 2020-11-26 2021-03-19 深圳市优必选科技股份有限公司 Robot control method, robot control device and wearable equipment
CN112506197A (en) * 2020-12-09 2021-03-16 深圳市优必选科技股份有限公司 Control method and device of navigation robot, electronic equipment and storage medium
CN113687656B (en) * 2021-08-25 2024-03-12 遨博(北京)智能科技股份有限公司 Robot control method and system
CN113977611B (en) * 2021-10-27 2022-11-29 深圳市注能科技有限公司 Interactive synchronization device and method for robot, robot and robot system
CN114217540A (en) * 2021-11-10 2022-03-22 深圳市鑫信腾科技股份有限公司 Control method, device, equipment and storage medium of non-standard equipment
CN114227689B (en) * 2021-12-30 2023-11-17 深圳市优必选科技股份有限公司 Robot motion control system and motion control method thereof
CN114227698B (en) * 2022-01-27 2024-04-26 上海擎朗智能科技有限公司 Control method, device, equipment and storage medium for robot
CN114393583B (en) * 2022-01-28 2024-02-20 北京云迹科技股份有限公司 Method and device for controlling equipment through robot
CN114800492A (en) * 2022-04-11 2022-07-29 深圳市大族机器人有限公司 Method, system, computer device and storage medium for controlling slave station terminal
CN114722050B (en) * 2022-06-10 2022-09-30 辰星(天津)自动化设备有限公司 Data synchronization method of robot system and robot system
CN117301074B (en) * 2023-11-17 2024-04-30 浙江孚宝智能科技有限公司 Control method and chip of intelligent robot


Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9590976B2 (en) * 2014-10-08 2017-03-07 Google Inc. Network-assisted fabric pairing
CN106603530A (en) * 2016-12-14 2017-04-26 杭州古北电子科技有限公司 Control system and method for electronic device compatible with different communication protocols
KR101942905B1 (en) * 2017-02-14 2019-01-28 엘에스산전 주식회사 A method for modeling communication data
CN109729072B (en) * 2018-11-29 2022-05-24 沈阳无距科技有限公司 Communication protocol information processing method and device and electronic equipment
CN109831362A (en) * 2019-03-29 2019-05-31 深圳创维-Rgb电子有限公司 A kind of electronic equipment control method and system
CN109901412B (en) * 2019-04-19 2022-01-28 北京远鼎科技有限公司 Intelligent home system
CN110347622A (en) * 2019-06-05 2019-10-18 芜湖职业技术学院 A kind of multiplex roles agreement bi-directional conversion device and implementation method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105807617A (en) * 2014-12-30 2016-07-27 西安Tcl软件开发有限公司 Control method and control device for intelligent household devices
CN105517193A (en) * 2016-01-29 2016-04-20 深圳市凯立德科技股份有限公司 Method for building connection and device
CN105916151A (en) * 2016-06-06 2016-08-31 惠州Tcl移动通信有限公司 Intelligent equipment wireless network access method and system thereof
CN109195150A (en) * 2018-10-15 2019-01-11 珠海格力电器股份有限公司 A kind of method and apparatus that equipment is shown

Also Published As

Publication number Publication date
CN111203874A (en) 2020-05-29


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant