CN106774345A - Method and apparatus for multi-robot cooperation - Google Patents

Method and apparatus for multi-robot cooperation

Info

Publication number
CN106774345A
Authority
CN
China
Prior art keywords
robot
cooperation
target object
instruction
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710067320.2A
Other languages
Chinese (zh)
Other versions
CN106774345B (en)
Inventor
戴萧何
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai xianruan Information Technology Co., Ltd
Original Assignee
Shanghai Zhixian Robot Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Zhixian Robot Technology Co Ltd
Priority to CN201710067320.2A
Publication of CN106774345A
Application granted
Publication of CN106774345B
Legal status: Active (granted)

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D 1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D 1/0223 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G05D 1/0259 Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means
    • G05D 1/0263 Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means using magnetic strips
    • G05D 1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D 1/0287 Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D 1/0289 Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling with means for avoiding collisions between vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1679 Programme controls characterised by the tasks executed
    • B25J 9/1682 Dual arm manipulator; Coordination of several manipulators

Abstract

The purpose of the application is to provide a method and apparatus for multi-robot cooperation. A cooperation instruction matched with a robot is obtained from a network device, and the corresponding multi-robot cooperation task is executed based on that cooperation instruction. Compared with the prior art, in this application multiple independent robots that carry out a cooperative task jointly execute the corresponding multi-robot cooperation task based on the cooperation instructions obtained from the corresponding network device. The application can therefore, according to the needs of a concrete scene, flexibly combine multiple independent robots through the cooperation instructions sent by the network device, so that the combined robots can cooperatively accomplish tasks with a large workload or a complex job classification, which contributes to the decomposition of complex work and the optimization of overall resources.

Description

Method and apparatus for multi-robot cooperation
Technical field
The application relates to the field of computers, and in particular to a technique for multi-robot cooperation.
Background
Existing robot applications mostly involve a single robot working independently, for example a single robot moving on its own or carrying goods on its own. Because a single robot is limited in equipment scale and in the functions it can carry, it can only complete relatively simple tasks; for tasks with a large workload or a high complexity, a single robot either cannot complete the task at all or completes it with an unsatisfactory result. In transport operations, for example, moving objects of large volume makes it highly desirable for multiple robots to carry and move them cooperatively. However, the prior art lacks a technique for efficiently combining multiple independent robots to jointly perform the same task or the same group of tasks.
Summary of the invention
The purpose of the application is to provide a method and apparatus for multi-robot cooperation.
According to one aspect of the application, a method for multi-robot cooperation at the robot end is provided, including:
obtaining, from a network device, a cooperation instruction matched with the robot;
executing the corresponding multi-robot cooperation task based on the cooperation instruction.
According to another aspect of the application, a method for multi-robot cooperation at the network-device end is also provided, including:
providing matched cooperation instructions to one or more robots, wherein the robots execute the corresponding multi-robot cooperation task based on the corresponding cooperation instructions.
According to a further aspect of the application, a robot for multi-robot cooperation is also provided, including:
a first device for obtaining, from a network device, a cooperation instruction matched with the robot;
a second device for executing the corresponding multi-robot cooperation task based on the cooperation instruction.
According to a further aspect of the application, a network device for multi-robot cooperation is also provided, including:
a fourth device for providing matched cooperation instructions to one or more robots, wherein the robots execute the corresponding multi-robot cooperation task based on the corresponding cooperation instructions.
According to yet another aspect of the application, a system for multi-robot cooperation is also provided, wherein the system includes a robot for multi-robot cooperation as provided according to one aspect of the application and a network device for multi-robot cooperation as provided according to another aspect of the application.
Compared with the prior art, in this application multiple independent robots that carry out a cooperative task jointly execute the corresponding multi-robot cooperation task based on the cooperation instructions obtained from the corresponding network device. The application can therefore, according to the needs of a concrete scene, flexibly combine multiple independent robots through the cooperation instructions sent by the network device, so that the combined robots can cooperatively accomplish tasks with a large workload or a complex job classification, which contributes to the decomposition of complex work and the optimization of overall resources.
Further, in one implementation of the application, the robots can be used to realize multi-robot formation movement; for example, the cooperating robots can move toward a target location, or follow a target object, based on the matched cooperation instruction, so as to realize the formation movement of multiple robots. On the basis of this implementation, all kinds of cooperative tasks that require multiple robots to move in formation, such as cooperative mobile carrying tasks, can be realized flexibly and effectively.
Further, in one implementation of the application, after obtaining the cooperation instruction the robot determines the target object it is to follow, and then recognizes that target object from the scene it captures in real time, thereby controlling, based on the cooperation instruction, the robot to move toward the target object along a corresponding moving path. Compared with existing robot-following techniques, the application can accurately lock onto the target object and track it effectively in a natural environment that changes in real time and contains many disturbing factors, which improves the accuracy of robot following and solves the technical problem that current robot following frequently follows the wrong target or loses the target. At the same time, controlling the robot to move toward the target object along the corresponding moving path based on the cooperation instruction realizes, on the whole, the coordinated formation movement of multiple robots.
Further, in one implementation, based on the cooperation instruction the robot is controlled to move toward the target object along the corresponding moving path, wherein the relative position between the robot and the target object matches the multi-robot formation state information in the cooperation instruction, and the relative distance between the second robot and the first robot falls within a preset relative-distance range threshold. Here, the cooperation instruction can control the queue shape of the multiple robots in the multi-robot cooperation task, down to the relative positions between individual robots, so that the cooperative work between the robots is better matched and the completion efficiency of the cooperative task is improved.
Brief description of the drawings
Other features, objects and advantages of the application will become more apparent by reading the following detailed description of non-limiting embodiments made with reference to the following drawings:
Fig. 1 shows a flow chart of a method for multi-robot cooperation at the robot end and at the network-device end according to one aspect of the application;
Fig. 2 shows a flow chart of a method for multi-robot cooperation at the robot end according to one aspect of the application;
Fig. 3 shows a system diagram for multi-robot cooperation according to one aspect of the application.
The same or similar reference signs in the drawings denote the same or similar parts.
Detailed description of embodiments
The application is described in further detail below with reference to the accompanying drawings.
In a typical configuration of the application, the terminals, the devices of the service network and the trusted parties each include one or more processors (CPUs), input/output interfaces, network interfaces and memory.
The memory may include volatile memory on a computer-readable medium, random access memory (RAM) and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and can store information by any method or technology. The information can be computer-readable instructions, data structures, program modules or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassette tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media such as modulated data signals and carrier waves.
Fig. 1 shows a flow chart of a method for multi-robot cooperation at the robot end and at the network-device end according to one aspect of the application.
The method includes step S11, step S12 and step S21.
The embodiment of the application provides a method for multi-robot cooperation, and the method can be implemented at the corresponding robot end and network-device end. The robot includes various mechanical devices capable of performing work automatically; it may be a mechanical device with a moving function, a load-carrying function or other functions, or a mechanical device with several of these functions at the same time, for example all kinds of artificial-intelligence devices that can move and carry loads. In this application, the functions of the multiple robots carrying out the same cooperative task can be identical or different. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers or a cloud server, where the cloud server may be a virtual supercomputer running in a distributed system and composed of a group of loosely coupled computers, used to provide simple, efficient, safe and reliable computing services whose processing capability can be scaled elastically. In this application, the robot may be referred to as robot 1, and the network device may be referred to as network device 2.
Specifically, in step S21, network device 2 can provide matched cooperation instructions to one or more robots 1, wherein the robots 1 execute the corresponding multi-robot cooperation task based on the corresponding cooperation instructions. Correspondingly, in step S11, the corresponding robot 1 obtains, from network device 2, the cooperation instruction matched with itself. Here, the multi-robot cooperation task can be any of various tasks executed by multiple robots 1: for example, multiple robots 1 keeping the same distance while moving synchronously; or multiple robots 1 jointly delivering the same object; or multiple robots 1 carrying out an assembly task for the parts of an object. In one implementation, network device 2 can match corresponding cooperation instructions to different robots 1 according to the type of the cooperative task or the specific cooperative operation.
In one implementation, the cooperation instruction can include at least any one of the following: multi-robot formation state information for the robot; a speed-control rule for the robot; coordinate information of the target object to be followed by the robot; other execution-related information for the robot.
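As an illustration of the contents listed above, a minimal sketch of one possible cooperation-instruction record is given below. The dataclass layout, the field names and the example values are assumptions made for illustration only; the application does not prescribe any particular encoding of the instruction.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class CooperationInstruction:
    """One possible representation of a cooperation instruction (illustrative only)."""
    # Multi-robot formation state, e.g. "single_column", "single_row" or "multi_column".
    formation_state: Optional[str] = None
    # Speed-control rule, here reduced to (min_speed, max_speed) in m/s.
    speed_rule: Optional[Tuple[float, float]] = None
    # Coordinate information of the target object to be followed, if already known.
    target_coordinates: Optional[Tuple[float, float]] = None
    # Other execution-related information, e.g. assembly-operation step descriptions.
    extra_steps: List[str] = field(default_factory=list)

# A network device could match different instructions to different robots:
lead_instruction = CooperationInstruction(formation_state="single_column",
                                          speed_rule=(0.2, 1.0),
                                          target_coordinates=(12.5, 3.0))
follower_instruction = CooperationInstruction(formation_state="single_column",
                                              speed_rule=(0.2, 1.2))
```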
Specifically, taking as an example a scene in which multiple robots 1 keep the same distance while moving synchronously, or in which multiple robots 1 jointly deliver the same object: in one implementation, network device 2 can use the cooperation instruction to give each robot 1 the formation state information it must maintain while moving, for example keeping a single-column, single-row or multi-column formation; in another implementation, network device 2 can also use a cooperation instruction containing a speed-control rule to control the running speed of each cooperating robot 1, thereby adjusting the distance between the robots 1 and controlling the movement of the whole queue; in yet another implementation, network device 2 can provide one or more robots 1 with the coordinate information of the target object each is to follow, which may be the coordinate information of a target object determined when the moving operation starts, or coordinate information of the target object provided in real time during the moving process.
Taking as an example a scene in which multiple robots 1 carry out an assembly task for the parts of an object, the cooperative task can involve the speed-control rules used to make each robot 1 move to its respective assembly position, the coordinate information of the target position of each robot, and the assembly-operation step information of each robot. In addition, the above content is adapted according to the specific needs of other cooperative tasks.
In one implementation, network device 2 can simultaneously send cooperation instructions to all the robots 1 involved in the cooperative task; in another implementation, network device 2 can also send cooperation instructions separately to any one or more robots 1 at any time. In one implementation, the cooperation instructions corresponding to the multiple robots 1 in the same cooperative task can be identical, or different, or partly identical and partly different; for example, in a scene where multiple robots 1 move synchronously in a single-column formation while keeping the same distance, the cooperation instruction of the first robot 1 in the queue can differ from the cooperation instructions of the other robots 1 in the queue.
Then, in step S12, robot 1 can execute the corresponding multi-robot cooperation task based on the cooperation instruction. In one implementation, the robots 1 do not need to communicate with each other directly in order to realize the corresponding cooperative task; instead, network device 2 can control the one or more cooperating robots 1 in real time through the cooperation instructions, and each robot 1 executes its cooperation instruction to complete the cooperative task. In one implementation, network device 2 can give each robot 1 only the instructions that are necessary for their mutual cooperation, while the operations that can be executed without cooperation are performed by each robot 1 independently. For example, in a scene where multiple robots 1 keep the same distance while moving synchronously, or jointly deliver the same object, the overall formation keeping and the control of the queue running speed can be controlled by network device 2 through the cooperation instructions, while the specific following operations of each robot 1, such as determining and recognizing the object to be followed, can be set and executed by each robot 1 itself.
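The division of labour described here, in which the network device pushes only the cooperation-critical parameters and each robot executes its own following logic locally, can be sketched as follows. The function and the stand-in callables are hypothetical; the application does not specify a transport or an API for delivering the instructions.

```python
def dispatch_cooperation_instructions(robots, instruction_for, send):
    """Network-device side: push to every cooperating robot only the parameters
    it needs in order to cooperate (formation, speed rule, ...). Operations a
    robot can carry out on its own (determining and recognising the object to
    follow) are left to the robot itself."""
    for robot_id in robots:
        send(robot_id, instruction_for(robot_id))

# Stand-in callables; no particular transport or API is implied.
sent = []
dispatch_cooperation_instructions(
    robots=["robot_A", "robot_B"],
    instruction_for=lambda rid: {"formation_state": "single_column",
                                 "speed_rule": (0.2, 1.0)},
    send=lambda rid, instruction: sent.append((rid, instruction)),
)
```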
In this application, the multiple independent robots 1 carrying out a cooperative task can jointly execute the corresponding multi-robot cooperation task based on the cooperation instructions obtained from the corresponding network device 2. Here, according to the needs of a concrete scene, the application can flexibly combine multiple independent robots through the cooperation instructions sent by network device 2, so that the combined robots can cooperatively accomplish tasks with a large workload or a complex job classification, which contributes to the decomposition of complex work and the optimization of overall resources.
In one implementation, in step S12, based on the cooperation instruction robot 1 can control itself to move toward a target location or a target object along a corresponding moving path. Here, the multi-robot cooperation task of the application can be a cooperative task that requires multiple robots to move in formation, for example multiple robots 1 keeping the same distance while moving synchronously, or multiple robots 1 jointly delivering the same object. Specifically, in one implementation, based on the cooperation instruction robot 1 can be controlled to move toward a target location along a corresponding moving path; for example, robot 1 may be one of the robots at the head of the queue, which may have no specific target object to follow but only a target location to reach. In another implementation, based on the cooperation instruction robot 1 can also be controlled to move toward a target object along a corresponding moving path; for example, one or more robots 1 at the head of the robot queue may have an object to track, such as a moving person or thing, while a robot 1 that is not at the head of the queue needs to follow a target object, i.e. a target robot, which can be the closest other robot in front of robot 1, or another robot that is preset or determined based on the cooperation instruction.
In this implementation, the robots 1 can be used to realize multi-robot formation movement; for example, the cooperating robots 1 can move toward a target location, or follow a target object, based on the matched cooperation instructions, so as to realize the formation movement of multiple robots 1. On the basis of this implementation, all kinds of cooperative tasks that require multiple robots to move in formation, such as cooperative mobile carrying tasks, can be realized flexibly and effectively.
Further, Fig. 2 shows a flow chart of a method for multi-robot cooperation at the robot end according to one aspect of the application. The method includes step S11 and step S12, and step S12 further includes step S121, step S122 and step S123.
Specifically, in step S121, robot 1 can determine the target object it is to follow. In one implementation, the target object includes a target robot, and robot 1 and the corresponding target robot carry the same transport object; in this case, the cooperative task can correspond to a cooperative mobile carrying task. Robot 1 needs to determine the target object it is to follow when the cooperative task starts.
In one implementation, in step S121, when robot 1 is set to the follow mode, robot 1 can recognize a corresponding matching object from the peripheral information it captures in real time, and then take that matching object as the target object to be followed. In one implementation, the follow mode of robot 1 can be started by a preset trigger operation. When the follow mode starts, robot 1 can capture peripheral information in real time; in one implementation, the raw data of the surrounding-environment information can be obtained through one or more sensing devices on robot 1, and the raw data can be images, pictures or point clouds. Robot 1 then detects, from the raw data, the type of object that needs to be followed, and one or more objects in the environment may belong to that object type. Using a machine-learning method, a classifier is trained in advance: the feature information of the scan data of a certain class of objects is extracted and fed into the classifier, and objects of that class are detected from the environment information by comparison. There are often several objects of the class, and the matching object is the one selected from these candidate objects as the target object.
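A minimal sketch of this classifier-based detection step is shown below, assuming the raw scan can be segmented into candidate regions and that a pre-trained classifier is available; the segmentation, feature-extraction and classifier callables are placeholders for whatever perception pipeline the robot actually uses.

```python
def detect_candidate_objects(raw_scan, segment, extract_features, classifier, wanted_class):
    """Run a pre-trained classifier over each segmented region of the raw sensor
    data (an image, picture or point cloud) and keep the regions whose predicted
    class is the class of object to be followed."""
    candidates = []
    for region in segment(raw_scan):           # placeholder segmentation step
        if classifier(extract_features(region)) == wanted_class:
            candidates.append(region)
    return candidates                           # several objects of the class may remain
```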
Further, in one implementation, the matching object can include, but is not limited to, at least any one of the following: the object around robot 1 closest to robot 1; the object in front of robot 1 closest to robot 1; an object that is both in front of robot 1 and closest to robot 1; an object around robot 1 whose object-feature information matches that of the object to be followed; the object around robot 1 whose object-feature information best matches that of the object to be followed; among multiple objects around robot 1 whose object-feature information matches that of the object to be followed, the one closest to robot 1. In one implementation, the object-feature information can include, but is not limited to, one or more of the position information, motion-state information and body-feature information of the object to be followed.
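The following sketch combines several of the listed selection rules (optionally restricting to objects in front of the robot and to objects matching the feature information, then taking the closest one). The candidate layout and parameter names are assumptions for illustration.

```python
import math

def choose_matching_object(candidates, robot_position, heading=0.0,
                           require_in_front=False, target_features=None,
                           feature_match=None):
    """Optionally keep only candidates in front of the robot and/or candidates
    whose feature information matches the object to be followed, then take the
    candidate closest to the robot. Each candidate is assumed to be a dict with
    at least a 'position' entry."""
    def in_front(pos):
        dx, dy = pos[0] - robot_position[0], pos[1] - robot_position[1]
        return math.cos(heading) * dx + math.sin(heading) * dy > 0.0

    pool = [c for c in candidates
            if (not require_in_front or in_front(c["position"]))
            and (target_features is None or feature_match is None
                 or feature_match(c, target_features))]
    if not pool:
        return None
    return min(pool, key=lambda c: math.dist(c["position"], robot_position))
```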
Further, in one implementation, in step S121 robot 1 can determine the coordinate information of the target object to be followed based on the cooperation instruction; robot 1 then obtains its surrounding-environment information in real time, wherein the distance between robot 1 and the coordinates is less than or equal to a predetermined distance threshold; robot 1 then recognizes the corresponding matching object from the surrounding-environment information, and takes the matching object as the target object to be followed. Here, the coordinate information can be either absolute coordinate information or relative coordinate information. Robot 1 obtains its surrounding-environment information by scanning; if the distance between robot 1 and the coordinates is less than or equal to the predetermined distance threshold, the matching object corresponding to the coordinate information can be identified from the environment information and set as the target object.
Further, in one implementation, if, when robot 1 obtains the cooperation instruction, the distance between its own position and the position of the object to be followed is greater than the predetermined distance threshold, the application further provides a solution for this case: when the distance between robot 1 and the coordinates is greater than the predetermined distance threshold, robot 1 is controlled to move toward the coordinates, thereby reducing the distance between robot 1 and the coordinates; then, during the movement, the surrounding-environment information of robot 1 is obtained in real time, and once the distance between robot 1 and the coordinates is less than or equal to the predetermined distance threshold, the corresponding matching object can be recognized from the surrounding-environment information and taken as the target object to be followed by robot 1.
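A compact sketch of this approach-then-identify behaviour is given below, assuming the robot exposes simple motion and perception callables; the callables and the loop structure are illustrative only.

```python
import math

def acquire_target_by_coordinates(get_position, get_environment, move_toward,
                                  recognize_matching_object, target_coord,
                                  distance_threshold):
    """Keep moving toward the instructed coordinates while the robot is farther
    away than the threshold; once inside the threshold, recognise the matching
    object from the surrounding environment and treat it as the target to follow.
    All callables stand in for the robot's own motion and perception routines."""
    while math.dist(get_position(), target_coord) > distance_threshold:
        move_toward(target_coord)   # reduces the robot-to-coordinate distance
    return recognize_matching_object(get_environment(), target_coord)
```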
Then, in step S122, robot 1 can recognize the target object from the scene it captures in real time. During the movement of robot 1, the objects in the environment are also in a constantly changing state, so robot 1 needs to repeat the target-object recognition operation again and again based on the changing environment. In one implementation, robot 1 can obtain real-time environment data information by periodically scanning the surrounding environment, then detect from the environment data information all objects belonging to the same class as the target object, and finally identify the matching target object according to the detection results of one cycle or several consecutive scanning cycles.
Specifically, in one implementation, in step S122 robot 1 can obtain its surrounding-environment information by real-time scanning; it can then detect, from the surrounding-environment information, one or more observed objects whose object-feature information matches that of the target object. Here, because the target object was determined by the previous recognition operation, its corresponding object-feature information has been stored, for example in the form of a history observation record of the determined target object; therefore, the object-feature information of the one or more observed objects determined by scanning the current environment information can be matched for similarity against the stored object-feature information of the target object. The object-feature information of an observed object or of the target object can include, but is not limited to, any of the following: the position information of the object, which refers to the position of the object at the corresponding scanning moment; the motion-state information of the object, including motion information such as the motion direction and the speed; and the body-feature information of the object, which refers to the appearance features of the object body, including shape, size and colour information. Robot 1 can then identify the target object from the one or more observed objects; for example, an observed object reaching a certain matching degree can be taken as the target object.
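One possible way to score the similarity between an observed object and the stored feature information of the target is sketched below; the weights, field names and threshold are assumptions, since the application only requires that position, motion-state and body-feature information be matched.

```python
import math

def feature_similarity(observation, stored_target, w_pos=0.4, w_motion=0.3, w_body=0.3):
    """Map closeness in position, in motion state and in body appearance to [0, 1]
    and combine them with assumed weights."""
    pos_score = 1.0 / (1.0 + math.dist(observation["position"], stored_target["position"]))
    motion_score = 1.0 / (1.0 + math.dist(observation["velocity"], stored_target["velocity"]))
    body_score = 1.0 if observation["body"] == stored_target["body"] else 0.0
    return w_pos * pos_score + w_motion * motion_score + w_body * body_score

def observations_matching_target(scanned_objects, stored_target, min_score=0.5):
    """Keep the observed objects whose similarity to the recorded target features
    reaches a (hypothetical) matching threshold."""
    return [o for o in scanned_objects if feature_similarity(o, stored_target) >= min_score]
```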
Further, in one implementation, identifying the target object from the one or more observed objects can include: determining the association information between each observed object among the one or more observed objects corresponding to robot 1 and a history observation record, wherein the one or more observed objects include the target object and the history observation record includes the object-related information of one or more historically observed objects; robot 1 then identifies the target object from the one or more observed objects according to the association information between the observed objects and the history observation record.
Specifically, when robot 1, repeating the target-object recognition operation on the changing environment, determines the target object, it can enter this target object and its corresponding object-feature information into the history observation record; at the same time, the other observed objects determined together with the target object, and their corresponding object-feature information, can also be matched and recorded in the history observation record. Further, when the current target-object recognition is carried out, each of the one or more currently obtained observed objects can be data-associated with the history observation record. In one implementation, this data association can mean matching each observed object among the current one or more observed objects against the observation record of each object in the stored history observation record, the result being the association information. For example, if there are N observed objects in the environment in a certain scanning cycle and the robot has previously stored observation records of M objects, where M and N may be equal or different, and the N objects and the M objects may have one or more objects in common, then performing data association means matching each of the N observed objects one by one against the observation records of the M objects in the history observation record, obtaining a matching degree for each comparison. The complete matching result is a matrix of N rows and M columns whose elements are the corresponding matching degrees, and this matrix is the association information. The observed objects include the target object. In one implementation, the matching can be a feature matching based on one or more items of object-feature information of the objects. The target object is then identified based on the obtained association information: after the association information, i.e. the matching-degree matrix, is obtained, a comprehensive analysis is performed and the association with the highest overall matching degree is chosen, thereby obtaining the target object.
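The N-by-M matching-degree matrix and the selection of an overall association can be sketched as follows; a greedy assignment is used here purely for illustration, whereas the application only requires that the association with the highest overall matching degree be chosen.

```python
def build_association_matrix(observations, history_records, matching_degree):
    """N x M matrix whose element [i][j] is the matching degree between the i-th
    currently observed object and the j-th object in the history observation record."""
    return [[matching_degree(obs, rec) for rec in history_records]
            for obs in observations]

def associate(matrix):
    """Choose pairings with a high overall matching degree. A greedy pass is used
    here for brevity; any method that maximises the overall matching degree fits
    the description above."""
    pairs, used_rows, used_cols = [], set(), set()
    scored = sorted(((matrix[i][j], i, j)
                     for i in range(len(matrix))
                     for j in range(len(matrix[i]))), reverse=True)
    for degree, i, j in scored:
        if i not in used_rows and j not in used_cols:
            pairs.append((i, j, degree))
            used_rows.add(i)
            used_cols.add(j)
    return pairs   # the pair involving the recorded target yields the new target
```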
In one implementation, the method also includes step S13 (not shown), in which robot 1 can update the history observation record according to the one or more observed objects, wherein the objects in the updated history observation record include the target object identified from the one or more observed objects. The observed objects corresponding to robot 1 keep changing as the environment changes; in one implementation, if a new observed object appears, a corresponding observation record is added; if an existing observed object disappears, the corresponding observation record of that object is deleted; if an existing observed object is still present, the related information in its corresponding observation record is updated.
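These three update rules can be sketched as follows, assuming the history observation record maps an object identifier to its stored feature information and that the associations come from the matching step above; the data layout is illustrative.

```python
def update_history(history, associations, observations):
    """history: dict mapping an object id to its stored feature information.
    associations: (observation_id, history_id, matching_degree) tuples from the
    data-association step. observations: dict of the currently observed objects."""
    matched_history_ids = {hist_id for _, hist_id, _ in associations}
    matched_observation_ids = {obs_id for obs_id, _, _ in associations}

    # an existing observed object disappeared: delete its record
    for hist_id in list(history):
        if hist_id not in matched_history_ids:
            del history[hist_id]
    # an existing observed object is still present: update its record
    for obs_id, hist_id, _ in associations:
        history[hist_id] = observations[obs_id]
    # a new observed object appeared: add a record for it
    for obs_id, observation in observations.items():
        if obs_id not in matched_observation_ids:
            history[obs_id] = observation
    return history
```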
Then, in step S123, robot 1 can, based on the cooperation instruction, control itself to move toward the target object along the corresponding moving path. Specifically, robot 1 can determine its moving path to the target object and then control itself to move along that moving path. The determination of the moving path and the movement-control behaviour can both be performed based on the cooperation instruction of network device 2, or only one of them may be performed based on the cooperation instruction.
In one implementation, based on the cooperation instruction, robot 1 can control itself to move toward the target object along the corresponding moving path, wherein the formation state between the robot and the target object matches the multi-robot formation state information in the cooperation instruction, and the relative distance between the second robot and the first robot falls within a preset relative-distance range threshold. Network device 2 can use the cooperation instruction to give each robot 1 the formation state information it must maintain while moving, for example keeping a single-column, single-row or multi-column formation; in one implementation, these formation states can be realized by setting parameters such as the moving path and motion state of robot 1; in another implementation, network device 2 can also use a cooperation instruction containing a speed-control rule to control the running speed of each cooperating robot 1, thereby adjusting the distance between the robots 1 and controlling the movement of the whole queue. Here, the cooperation instruction can control the queue shape of the multiple robots in the multi-robot cooperation task, down to the relative positions between the robots, so that the cooperative work between the robots 1 is better matched and the completion efficiency of the cooperative task is improved.
In one implementation, step S123 can include step S1231 (not shown) and step S1232 (not shown). Specifically, in step S1231, robot 1 can determine its moving path to the target object based on the cooperation instruction; in step S1232, robot 1 can control itself to move along that moving path based on the cooperation instruction.
Further, in step S1231, robot 1 can obtain obstacle information from its surrounding-environment information; then, based on the position information of the identified target object, determine the target coordinates of robot 1; and then, based on the cooperation instruction and combining the target coordinates with the obstacle information, determine the moving path of the robot to the target object, wherein the cooperation instruction includes multi-robot formation state information.
Specifically, robot 1 first determines the obstacle information between the robot body and the target object, where obstacles refer to all objects in the environment other than the target object; there are therefore static obstacles, such as walls and pillars when tracking indoors, as well as moving obstacles, for example observed objects that do not belong to the target object. Then the current position information of the target object, for example the position recorded in its corresponding history observation record, is set as the target coordinates of robot 1. Finally, based on the cooperation instruction and according to the obstacle distribution and the target coordinates of the robot, the moving path of the robot to the target object is determined. In practical applications, because the moving path from one position to another is not unique, the moving path determined for the robot is not unique either; the most suitable path is selected from several candidate paths. In a multi-robot cooperation task, the independent motion of each robot must also take the cooperation between the robots into account; here, the cooperation instruction that network device 2 supplies to each robot 1 includes multi-robot formation state information indicating the moving formation of the cooperating robots 1, for example keeping a single-column, single-row or multi-column formation. The moving path to the target object is then planned from this formation state information; for example, if the robots 1 travel in a row, the path width available along the moving path must be considered and candidate paths with limited width are excluded. In one implementation, the cooperation instruction containing the formation state information can be received by the corresponding robot 1 before the movement starts, or it can be supplied to robot 1 in real time during the movement as the scene changes.
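A small sketch of this formation-aware path selection is given below: candidate paths whose narrowest clearance cannot accommodate the required formation width are excluded, and the shortest remaining path is chosen. The width formula and the candidate-path fields are assumptions for illustration.

```python
def required_path_width(formation_state, robot_width, robots_abreast=1, spacing=0.5):
    """Width the moving formation needs: a single column needs one robot width,
    a row of k robots needs k widths plus spacing (numbers are assumed)."""
    if formation_state == "single_column":
        return robot_width
    return robots_abreast * robot_width + (robots_abreast - 1) * spacing

def select_path(candidate_paths, formation_state, robot_width):
    """Exclude candidate paths whose narrowest clearance cannot accommodate the
    formation, then take the shortest remaining path. Each candidate is assumed
    to carry its length and minimum clearance, as produced by whatever planner
    the robot uses."""
    needed = required_path_width(formation_state, robot_width)
    feasible = [p for p in candidate_paths if p["min_clearance"] >= needed]
    return min(feasible, key=lambda p: p["length"]) if feasible else None
```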
Further, in step S1232, robot 1 can determine its moving speed based on the cooperation instruction, wherein the cooperation instruction includes a speed-control rule; robot 1 is then controlled to move along the moving path based on that moving speed, wherein the relative distance between robot 1 and the target object, controlled through the moving speed, falls within the preset relative-distance range threshold. Specifically, when multiple robots cooperate and move in formation, besides the formation itself, the relative positions between specific robots 1 also have to be considered. For example, in a cooperative mobile carrying task in which the robots 1 move in a single column and the transported object is N metres long, in order for all robots to carry out the transport task at the same time, the relative position of two adjacent robots 1 is not arbitrary; it must be ensured that two adjacent robots 1 stay within a certain distance range. Here, the moving speed of robot 1 can be determined by the speed-control rule in the cooperation instruction, so that robot 1 can move along the moving path based on that moving speed while keeping the preset distance range to the target robot it follows (which can correspond to another robot 1).
Further, in one implementation, determining the moving speed of robot 1 based on the cooperation instruction, wherein the cooperation instruction includes a speed-control rule, includes: determining the moving speed of robot 1 based on the speed-control rule, wherein the moving speed includes a forward speed and/or a turning speed. Here, the motion of robot 1 is subject to the kinematic and dynamic constraints of the robot body, and the size of robot 1 also has to be considered when avoiding collisions. When robot 1 is controlled to move along the moving path, on the one hand the motion direction of robot 1 is controlled so that it does not leave the path region, and on the other hand the moving speed of robot 1 has to be controlled. Preferably, the moving speed of robot 1 is divided into two components, a forward speed and a turning speed; specifically, the forward speed refers to the speed component along the heading direction of robot 1, and the turning speed refers to the speed component used to change that heading direction.
On this basis, a further implementation is as follows: when the distance between robot 1 and the target object is greater than a distance threshold, the forward speed and the turning speed are planned and controlled at the same time; when the distance between robot 1 and the target object is less than the distance threshold, i.e. when the robot is already close to the target object, only the motion direction of the robot, i.e. the turning speed, needs to be fine-tuned.
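The split into a forward-speed and a turning-speed component, with both planned when far from the target and only the heading fine-tuned when close, can be sketched as follows; the gains and the cruise speed are illustrative values, not values given by the application.

```python
import math

def speed_command(robot_pose, target_position, distance_threshold,
                  cruise_speed=0.8, turn_gain=1.5):
    """robot_pose = (x, y, heading). Far from the target both the forward speed
    and the turning speed are planned; closer than the threshold only the
    heading (turning speed) is fine-tuned."""
    x, y, heading = robot_pose
    dx, dy = target_position[0] - x, target_position[1] - y
    distance = math.hypot(dx, dy)
    heading_error = math.atan2(dy, dx) - heading
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))

    turning_speed = turn_gain * heading_error
    forward_speed = cruise_speed if distance > distance_threshold else 0.0
    return forward_speed, turning_speed
```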
In this application, after obtaining the cooperation instruction robot 1 determines the target object it is to follow, and then recognizes that target object from the scene it captures in real time, thereby realizing, based on the cooperation instruction, control of robot 1 to move toward the target object along the corresponding moving path. Compared with existing robot-following techniques, the application can accurately lock onto the target object and track it effectively in a natural environment that changes in real time and contains many disturbing factors, which improves the accuracy of robot following and solves the technical problem that current robot following frequently follows the wrong target or loses the target. At the same time, controlling the robot to move toward the target object along the corresponding moving path based on the cooperation instruction realizes, on the whole, the coordinated formation movement of multiple robots.
In one implementation, in step S21, network device 2 can provide a first cooperation instruction to a first robot, wherein the first robot, based on the first cooperation instruction, controls itself to move toward a target object or target location along a corresponding moving path; it then provides a second cooperation instruction to a second robot, wherein the second robot, based on the second cooperation instruction, controls itself to follow the first robot along a corresponding moving path. Further, in one implementation, the relative formation state between the second robot and the first robot matches the multi-robot formation state information in the cooperation instruction, and the relative distance between the second robot and the first robot falls within a preset relative-distance range threshold. Here, the first robot and the second robot can correspond to different robots 1; in one implementation, the same multi-robot cooperation task can be jointly executed by one or more first robots and one or more second robots, and in one implementation the first cooperation instruction and the second cooperation instruction can be identical or different.
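For the second (following) robot, keeping the relative distance to the first robot inside the preset range threshold could, for example, be achieved by adjusting its speed as sketched below; the control law and gain are assumptions, since the application only fixes the distance constraint.

```python
def follower_speed(distance_to_leader, allowed_range, leader_speed, gain=0.5):
    """Speed of the second robot chosen so that the relative distance to the
    first robot stays inside the preset range threshold: speed up when the gap
    grows beyond the range, slow down when it shrinks below it."""
    d_min, d_max = allowed_range
    if distance_to_leader > d_max:
        return leader_speed + gain * (distance_to_leader - d_max)
    if distance_to_leader < d_min:
        return max(0.0, leader_speed - gain * (d_min - distance_to_leader))
    return leader_speed
```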
Fig. 3 shows a system diagram for multi-robot cooperation according to one aspect of the application. The system includes robot 1 and network device 2.
Robot 1 includes a first device 31 and a second device 32, and network device 2 includes a fourth device 41.
The embodiment of the application provides a system for multi-robot cooperation, and the system can include a robot and a network device. The robot includes various mechanical devices capable of performing work automatically; it may be a mechanical device with a moving function, a load-carrying function or other functions, or a mechanical device with several of these functions at the same time, for example all kinds of artificial-intelligence devices that can move and carry loads. In this application, the functions of the multiple robots carrying out the same cooperative task can be identical or different. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers or a cloud server, where the cloud server may be a virtual supercomputer running in a distributed system and composed of a group of loosely coupled computers, used to provide simple, efficient, safe and reliable computing services whose processing capability can be scaled elastically. In this application, the robot may be referred to as robot 1, and the network device may be referred to as network device 2.
Specifically, the fourth device 41 can provide matched cooperation instructions to one or more robots 1, wherein the robots 1 execute the corresponding multi-robot cooperation task based on the corresponding cooperation instructions. Correspondingly, the first device 31 obtains, from network device 2, the cooperation instruction matched with the robot itself. Here, the multi-robot cooperation task can be any of various tasks executed by multiple robots 1: for example, multiple robots 1 keeping the same distance while moving synchronously; or multiple robots 1 jointly delivering the same object; or multiple robots 1 carrying out an assembly task for the parts of an object. In one implementation, network device 2 can match corresponding cooperation instructions to different robots 1 according to the type of the cooperative task or the specific cooperative operation.
In one implementation, the cooperation instruction can include at least any one of the following: multi-robot formation state information for the robot; a speed-control rule for the robot; coordinate information of the target object to be followed by the robot; other execution-related information for the robot.
Specifically, taking as an example a scene in which multiple robots 1 keep the same distance while moving synchronously, or in which multiple robots 1 jointly deliver the same object: in one implementation, network device 2 can use the cooperation instruction to give each robot 1 the formation state information it must maintain while moving, for example keeping a single-column, single-row or multi-column formation; in another implementation, network device 2 can also use a cooperation instruction containing a speed-control rule to control the running speed of each cooperating robot 1, thereby adjusting the distance between the robots 1 and controlling the movement of the whole queue; in yet another implementation, network device 2 can provide one or more robots 1 with the coordinate information of the target object each is to follow, which may be the coordinate information of a target object determined when the moving operation starts, or coordinate information of the target object provided in real time during the moving process.
Taking as an example a scene in which multiple robots 1 carry out an assembly task for the parts of an object, the cooperative task can involve the speed-control rules used to make each robot 1 move to its respective assembly position, the coordinate information of the target position of each robot, and the assembly-operation step information of each robot. In addition, the above content is adapted according to the specific needs of other cooperative tasks.
In one implementation, the fourth device 41 can simultaneously send cooperation instructions to all the robots 1 involved in the cooperative task; in another implementation, the fourth device 41 can also send cooperation instructions separately to any one or more robots 1 at any time. In one implementation, the cooperation instructions corresponding to the multiple robots 1 in the same cooperative task can be identical, or different, or partly identical and partly different; for example, in a scene where multiple robots 1 move synchronously in a single-column formation while keeping the same distance, the cooperation instruction of the first robot 1 in the queue can differ from the cooperation instructions of the other robots 1 in the queue.
Then, the second device 32 can execute the corresponding multi-robot cooperation task based on the cooperation instruction. In one implementation, the robots 1 do not need to communicate with each other directly in order to realize the corresponding cooperative task; instead, network device 2 can control the one or more cooperating robots 1 in real time through the cooperation instructions, and each robot 1 executes its cooperation instruction to complete the cooperative task. In one implementation, network device 2 can give each robot 1 only the instructions that are necessary for their mutual cooperation, while the operations that can be executed without cooperation are performed by each robot 1 independently. For example, in a scene where multiple robots 1 keep the same distance while moving synchronously, or jointly deliver the same object, the overall formation keeping and the control of the queue running speed can be controlled by network device 2 through the cooperation instructions, while the specific following operations of each robot 1, such as determining and recognizing the object to be followed, can be set and executed by each robot 1 itself.
In this application, the multiple independent robots 1 carrying out a cooperative task can jointly execute the corresponding multi-robot cooperation task based on the cooperation instructions obtained from the corresponding network device 2. Here, according to the needs of a concrete scene, the application can flexibly combine multiple independent robots through the cooperation instructions sent by network device 2, so that the combined robots can cooperatively accomplish tasks with a large workload or a complex job classification, which contributes to the decomposition of complex work and the optimization of overall resources.
In one implementation, the second device 32 can, based on the cooperation instruction, control robot 1 to move toward a target location or a target object along a corresponding moving path. Here, the multi-robot cooperation task of the application can be a cooperative task that requires multiple robots to move in formation, for example multiple robots 1 keeping the same distance while moving synchronously, or multiple robots 1 jointly delivering the same object. Specifically, in one implementation, based on the cooperation instruction robot 1 can be controlled to move toward a target location along a corresponding moving path; for example, robot 1 may be one of the robots at the head of the queue, which may have no specific target object to follow but only a target location to reach. In another implementation, based on the cooperation instruction robot 1 can also be controlled to move toward a target object along a corresponding moving path; for example, one or more robots 1 at the head of the robot queue may have an object to track, such as a moving person or thing, while a robot 1 that is not at the head of the queue needs to follow a target object, i.e. a target robot, which can be the closest other robot in front of robot 1, or another robot that is preset or determined based on the cooperation instruction.
In this implementation, the robots 1 can be used to realize multi-robot formation movement; for example, the cooperating robots 1 can move toward a target location, or follow a target object, based on the matched cooperation instructions, so as to realize the formation movement of multiple robots 1. On the basis of this implementation, all kinds of cooperative tasks that require multiple robots to move in formation, such as cooperative mobile carrying tasks, can be realized flexibly and effectively.
Further, in one implementation, the second device 32 includes a first unit (not shown), a second unit (not shown) and a third unit (not shown).
Specifically, the first unit can determine the target object to be followed by robot 1. In one implementation, the target object includes a target robot, and robot 1 and the corresponding target robot carry the same transport object; in this case, the cooperative task can correspond to a cooperative mobile carrying task. Robot 1 needs to determine the target object it is to follow when the cooperative task starts.
In one implementation, when robot 1 is set to the follow mode, the first unit can recognize a corresponding matching object from the peripheral information that robot 1 captures in real time, and then take that matching object as the target object to be followed by robot 1. In one implementation, the follow mode of robot 1 can be started by a preset trigger operation. When the follow mode starts, robot 1 can capture peripheral information in real time; in one implementation, the raw data of the surrounding-environment information can be obtained through one or more sensing devices on robot 1, and the raw data can be images, pictures or point clouds. Robot 1 then detects, from the raw data, the type of object that needs to be followed, and one or more objects in the environment may belong to that object type. Using a machine-learning method, a classifier is trained in advance: the feature information of the scan data of a certain class of objects is extracted and fed into the classifier, and objects of that class are detected from the environment information by comparison. There are often several objects of the class, and the matching object is the one selected from these candidate objects as the target object.
Further, in one implementation, the matching object may include, but is not limited to, at least any one of the following: the object around the robot 1 that is closest to the robot 1; the object in front of the robot 1 that is closest to the robot 1; an object that is both in front of the robot 1 and closest to the robot 1; an object around the robot 1 that matches the object feature information of the object to be followed; the object around the robot 1 that best matches the object feature information of the object to be followed; or, among multiple objects around the robot 1 that match the object feature information of the object to be followed, the object closest to the robot. In one implementation, the object feature information may include, but is not limited to, one or more of the position information, motion state information and subject feature information of the object to be followed.
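A minimal sketch of one of the selection rules above, choosing the candidate that is in front of the robot and closest to it; the field-of-view angle and the pose and centroid representations are illustrative assumptions, not taken from the text:

    # Hypothetical sketch: pick the matching object as the candidate in front of
    # the robot that is closest to it (one of the rules listed above).
    import math

    def select_matching_object(candidates, robot_pose, fov_deg=90.0):
        """candidates: list of (x, y) centroids; robot_pose: (x, y, heading) in radians."""
        rx, ry, heading = robot_pose
        best, best_dist = None, float("inf")
        for cx, cy in candidates:
            bearing = math.atan2(cy - ry, cx - rx) - heading
            bearing = math.atan2(math.sin(bearing), math.cos(bearing))  # normalize to [-pi, pi]
            if abs(math.degrees(bearing)) > fov_deg / 2:                # keep only objects in front
                continue
            dist = math.hypot(cx - rx, cy - ry)
            if dist < best_dist:
                best, best_dist = (cx, cy), dist
        return best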
Further, in one implementation, the first unit may determine the coordinate information of the destination object to be followed based on the cooperation instruction; the robot 1 then acquires its surrounding environment information in real time, where the distance between the robot 1 and the coordinate information is less than or equal to a predetermined distance threshold; the robot 1 then recognizes a corresponding matching object from the surrounding environment information, and takes the matching object as the destination object it is to follow. Here, the coordinate information may be either absolute coordinates or relative coordinates. The robot 1 obtains its surrounding environment information by scanning; if the distance between the robot 1 and the coordinate information is already less than or equal to the predetermined distance threshold, the matching object corresponding to the coordinate information can be identified from the environment information and set as the destination object.
Further, in one implementation, if the distance between the position of the robot 1 when it receives the cooperation instruction and the position of the object to be followed is greater than the predetermined distance threshold, the present application further provides a solution for this case: when the distance between the robot 1 and the coordinate information is greater than the predetermined distance threshold, the robot 1 is controlled to move toward the coordinate information, thereby reducing the distance between them; then, during the movement, the surrounding environment information of the robot 1 is acquired in real time, and once the distance between the robot 1 and the coordinate information is less than or equal to the predetermined distance threshold, the corresponding matching object can be recognized from the surrounding environment information and taken as the destination object the robot 1 is to follow.
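A minimal sketch of this approach-then-identify behaviour, assuming hypothetical robot interfaces position(), move_towards() and scan(); it is an illustration only, not the claimed implementation:

    # Hypothetical sketch: approach the instructed coordinate until within the
    # distance threshold, then identify the matching object from the live scan.
    def acquire_target(robot, target_xy, dist_threshold, identify_matching_object):
        """robot exposes position(), move_towards(xy) and scan(); all are assumed interfaces."""
        while True:
            rx, ry = robot.position()
            dx, dy = target_xy[0] - rx, target_xy[1] - ry
            if (dx * dx + dy * dy) ** 0.5 <= dist_threshold:
                break                        # close enough to start recognition
            robot.move_towards(target_xy)    # step toward the instructed coordinate
        return identify_matching_object(robot.scan(), target_xy)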
Then, the second unit may recognize the destination object from the scene captured by the robot 1 in real time. While the robot 1 is moving, the objects in the environment are also constantly changing, so the robot 1 needs to repeat the destination-object recognition operation again and again on the basis of the changing environment. In one implementation, the robot 1 may periodically scan its surroundings to obtain real-time environment data, detect from that data all objects belonging to the same class as the destination object, and finally identify the matching destination object from the detection results of a single cycle or of multiple cycles of continuous scanning.
Specifically, in one implementation, the second unit may scan in real time to obtain the surrounding environment information of the robot 1, and then detect from that environment information one or more observed objects whose object feature information matches that of the destination object. Because the object feature information corresponding to the destination object determined in the most recent recognition operation has been stored, for example in the form of a history observation record, the object feature information of the one or more observed objects found in the current environment scan can be compared for similarity against the stored object feature information of the destination object. Here, the object feature information of an observed object or of the destination object may include, but is not limited to, any of the following: the position information of the object, i.e. the position of the object at the corresponding scanning moment; the motion state information of the object, including motion information such as direction of motion and speed; and the subject feature information of the object, i.e. the appearance of the object body, including shape, size and color. The robot 1 can then identify the destination object from the one or more observed objects; for example, an observed object whose matching degree reaches a certain level may be taken as the destination object.
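As an illustrative sketch only, a similarity score over the stored feature items (position, motion state, and a single size value standing in for the subject features) might be computed as follows; the weights and the feature encoding are assumptions, not taken from the text:

    # Hypothetical sketch: score how well an observed object matches the stored
    # target features (position, motion state, appearance); weights are illustrative.
    def match_score(observed, target, w_pos=0.5, w_motion=0.3, w_appearance=0.2):
        """observed/target are dicts with 'pos' (x, y), 'vel' (vx, vy) and 'size'."""
        pos_err = ((observed["pos"][0] - target["pos"][0]) ** 2 +
                   (observed["pos"][1] - target["pos"][1]) ** 2) ** 0.5
        vel_err = ((observed["vel"][0] - target["vel"][0]) ** 2 +
                   (observed["vel"][1] - target["vel"][1]) ** 2) ** 0.5
        size_err = abs(observed["size"] - target["size"])
        # convert each error into a similarity in (0, 1]; smaller error -> higher similarity
        return (w_pos / (1.0 + pos_err) +
                w_motion / (1.0 + vel_err) +
                w_appearance / (1.0 + size_err))

The observed object with the highest score, possibly above a minimum matching degree, would then be retained as the destination object.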
Further, in one implementation, identifying the destination object from the one or more observed objects may include: determining the association information between each of the one or more observed objects currently corresponding to the robot 1 and the history observation records, where the one or more observed objects include the destination object and the history observation records include the object-related information of one or more historically observed objects; the robot 1 then identifies the destination object from the one or more observed objects according to the association information between the observed objects and the history observation records.
Specifically, after the robot 1, repeatedly performing the destination-object recognition operation on the changing environment, has determined the destination object, the destination object and its corresponding object feature information can be entered into the history observation records; at the same time, the other observed objects determined together with the destination object, and their corresponding object feature information, can also be matched and recorded in the history observation records. When the current recognition operation is performed, each of the currently obtained observed objects can be associated with the history observation records. In one implementation, this data association may mean matching the observation of each current observed object against each object in the stored history observation records, the result of which is the association information. For example, suppose there are N observed objects in the environment in the current scanning cycle, and the robot has previously stored history observation records for M objects, where M may or may not equal N, and the N objects and the M objects may have one or more objects in common. Performing data association means matching each of the N observed objects against each of the M recorded objects, producing a matching degree for every pair; the complete result is an N-row, M-column matrix whose elements are the corresponding matching degrees, and this matrix is the association information. The observed objects include the destination object. In one implementation, the matching may be a feature matching based on one or more items of object feature information. The destination object is then identified based on the obtained association information: once the association information, i.e. the matching degree matrix, has been obtained, a comprehensive analysis is performed to choose the association with the highest overall matching degree, thereby obtaining the destination object.
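A minimal sketch of this data association step, assuming the pairwise scoring function sketched earlier; it builds the N x M matching-degree matrix and, as one possible realisation of the "highest overall matching degree" selection, uses SciPy's assignment solver, which is an assumption beyond the text:

    # Hypothetical sketch: build the N x M matching-degree matrix between current
    # observations and history records, then pick the globally best association.
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def associate(observations, history_records, match_score):
        """Returns {observation index: history record index} for the best joint matching."""
        n, m = len(observations), len(history_records)
        degree = np.zeros((n, m))
        for i, obs in enumerate(observations):
            for j, rec in enumerate(history_records):
                degree[i, j] = match_score(obs, rec)   # higher = more similar
        rows, cols = linear_sum_assignment(-degree)    # maximize the total matching degree
        return {int(i): int(j) for i, j in zip(rows, cols)}

The observation associated with the record of the destination object is then the target identified in the current cycle.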
In one implementation, the robot 1 further includes a third device (not shown), and the robot 1 may update the history observation records according to the one or more observed objects, where the objects in the updated history observation records include the destination object identified from the one or more observed objects. The observed objects corresponding to the robot 1 keep changing as the environment changes; in one implementation, if a new observed object appears, a corresponding observation record is added; if an existing observed object disappears, the observation record corresponding to it is deleted; and if an existing observed object is still present, the related information in its observation record is updated.
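A minimal sketch of the add, refresh and delete rules above, assuming the history observation records are kept as a dictionary keyed by object identifier; the data layout is an assumption:

    # Hypothetical sketch: maintain the history observation records as a dict,
    # applying the add / refresh / delete rules described above.
    def update_history(history, current_observations, associations):
        """history and current_observations are dicts keyed by object id;
        associations maps an observation id to an existing record id, or None if new."""
        updated = {}
        for obs_id, obs in current_observations.items():
            rec_id = associations.get(obs_id)
            if rec_id is not None and rec_id in history:
                updated[rec_id] = obs      # object still present: refresh its record
            else:
                updated[obs_id] = obs      # new object appeared: add a record
        # any record not carried over corresponds to an object that has disappeared
        return updated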
Then, based on the cooperation instruction, the third unit may control the robot to move along the corresponding movement path toward the destination object. Specifically, the robot 1 may determine its movement path to the destination object, and then be controlled to move along that path. The determination of the movement path and the movement control may both be performed based on the cooperation instruction of the network device 2, or only one of them may be performed based on the cooperation instruction.
In one implementation, based on the cooperation instruction, the third unit may control the robot to move along the corresponding movement path toward the destination object, where the formation state between the robot and the destination object matches the multi-robot formation state information in the cooperation instruction, and the relative distance between the robot and the destination object (for example, between the second robot and the first robot it follows) falls within a preset relative distance range threshold. Here, through the cooperation instruction, the network device 2 can provide each robot 1 with the formation state information it needs to maintain while moving, for example keeping a single column, a single row or multiple columns; in one implementation, these formation states can be realized by setting parameters such as the movement path and motion state of each robot 1. In another implementation, the network device 2 may also control the running speed of each cooperating robot 1 through a cooperation instruction containing speed control rules, so as to adjust the distance between the robots 1 and thereby control the movement of the whole queue. In this way, the cooperation instruction can control the queue shape of the multiple robots in the multi-robot cooperation task, or even the relative positions between specific robots, so that the robots 1 fit together better in their collaborative work and the cooperation task is completed more efficiently.
In one implementation, the third unit may include a first sub-unit (not shown) and a second sub-unit (not shown). Specifically, the first sub-unit may determine the movement path of the robot 1 to the destination object based on the cooperation instruction; the second sub-unit may control the robot 1 to move along the movement path based on the cooperation instruction.
Further, the first sub-unit may obtain obstacle information from the surrounding environment information of the robot; then, based on the position information of the identified destination object, determine the target coordinates of the robot 1; and then, based on the cooperation instruction and in combination with the target coordinates and the obstacle information, determine the movement path of the robot to the destination object, where the cooperation instruction includes multi-robot formation state information.
Specifically, the first sub-unit first determines the obstacle information between the robot body and the destination object, where obstacles are all objects in the environment other than the destination object; they therefore include existing static obstacles, such as walls and pillars when tracking indoors, as well as moving obstacles, for example observed objects that do not belong to the destination object. Then the current position information of the destination object, for example the position recorded in the corresponding history observation record, is set as the target coordinates of the robot 1. Finally, based on the cooperation instruction, the movement path of the robot to the destination object is determined according to the distribution of obstacles and the target coordinates of the robot. In practice, the path from one position to another is not unique, so the movement path determined for the robot is not unique either; the most suitable path is selected from multiple candidate paths. In a multi-robot cooperation task, each robot must also take the cooperation between robots into account while moving on its own. Here, the cooperation instruction provided by the network device 2 to each robot 1 includes multi-robot formation state information indicating the formation in which the cooperating robots 1 are to move, for example keeping a single column, a single row or multiple columns; the movement path from the robot to the destination object is then planned with this formation state information in mind. For example, if the robots 1 travel side by side in a row, the path width along the movement route must be considered, and candidate paths whose width is too limited are excluded. In one implementation, the cooperation instruction containing the formation state information may be received by the corresponding robot 1 before the movement starts, or it may be supplied to the robot 1 in real time during the movement as the scene changes.
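As an illustrative sketch of the path selection just described, assuming candidate paths have already been generated and each carries its narrowest corridor width; the data layout and cost function are assumptions, not part of the text:

    # Hypothetical sketch: choose a movement path that respects the formation's
    # required corridor width, discarding candidates that are too narrow.
    def plan_path(candidate_paths, required_width, path_cost):
        """candidate_paths: list of dicts with 'waypoints' and 'min_width' (metres);
        path_cost: function scoring a path (e.g. its length); returns the best feasible path."""
        feasible = [p for p in candidate_paths if p["min_width"] >= required_width]
        if not feasible:
            return None                      # no path wide enough for the formation
        return min(feasible, key=path_cost)  # pick the lowest-cost feasible path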
Further, the second sub-unit may determine the moving speed of the robot 1 based on the cooperation instruction, where the cooperation instruction includes speed control rules; the robot 1 is then controlled to move along the movement path at that moving speed, where the moving speed keeps the relative distance between the robot 1 and the destination object within the preset relative distance range threshold. Specifically, when multiple robots cooperate to move in formation, in addition to the formation shape, the relative positions between specific robots 1 also need to be considered. For example, in a cooperative transport task, if the robots 1 move in a single column and the transported object is N metres long, then in order to ensure that all robots carry the transport task at the same time, the relative position of two adjacent robots 1 is not arbitrary: the distance between two adjacent robots 1 needs to be kept within a certain range. Here, the moving speed of the robot 1 can be determined by the speed control rules in the cooperation instruction, so that the robot 1 moves along the movement path at that speed while keeping the preset distance range to the target robot it follows (which may correspond to another robot 1).
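A minimal sketch of a speed control rule of this kind, adjusting the follower's forward speed so that the distance to the followed target robot stays inside a preset range; the gain and speed limit are illustrative values, not taken from the text:

    # Hypothetical sketch: a simple speed control rule that keeps the follower's
    # distance to the target robot inside [d_min, d_max]; gains are illustrative.
    def forward_speed(distance_to_target, leader_speed, d_min, d_max, gain=0.5, v_max=1.5):
        d_ref = 0.5 * (d_min + d_max)                   # desired following distance
        v = leader_speed + gain * (distance_to_target - d_ref)
        return max(0.0, min(v_max, v))                  # clamp to the robot's speed limits

With this rule the robot speeds up when it falls behind the middle of the allowed range and slows down when it gets too close to the robot it follows.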
Further, in one implementation, determining the moving speed of the robot 1 based on the cooperation instruction, where the cooperation instruction includes speed control rules, includes: determining the moving speed of the robot 1 based on the speed control rules, where the moving speed includes a forward speed and/or a turning speed. Here, the motion of the robot 1 is subject to the kinematic and dynamic constraints of the robot body, and the size of the robot 1 also needs to be considered for collision avoidance. When the robot 1 is controlled to move along the movement path, on the one hand the direction of motion of the robot 1 is controlled so that it does not leave the path region, and on the other hand the moving speed of the robot 1 is controlled. Further, preferably, the moving speed of the robot 1 is split into two components, a forward speed and a turning speed: the forward speed is the speed component along the heading direction of the robot 1, and the turning speed is the component that changes the robot's heading, i.e. its rotational component.
On this basis, a further implementation is as follows: when the distance between the robot 1 and the destination object is greater than the distance threshold, the forward speed and the turning speed are planned and controlled simultaneously; when the distance between the robot 1 and the destination object is less than the distance threshold, i.e. when the robot is already close to the destination object, only the direction of motion of the robot, i.e. the turning speed, needs to be fine-tuned.
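A minimal sketch of this two-phase behaviour, assuming a planar pose (x, y, heading) and a point target; far from the target both speed components are planned, near the target the forward speed is left unchanged and only the heading is trimmed. The gains and cruise speed are illustrative assumptions:

    # Hypothetical sketch: two-phase follow control as described above.
    import math

    def follow_command(robot_pose, target_xy, dist_threshold, current_forward,
                       cruise_speed=1.0, k_turn=1.0):
        rx, ry, heading = robot_pose
        dx, dy = target_xy[0] - rx, target_xy[1] - ry
        distance = math.hypot(dx, dy)
        bearing_error = math.atan2(dy, dx) - heading
        bearing_error = math.atan2(math.sin(bearing_error), math.cos(bearing_error))
        turning_speed = k_turn * bearing_error           # always steer toward the target
        if distance > dist_threshold:
            forward_speed = cruise_speed                  # far away: re-plan both components
        else:
            forward_speed = current_forward               # close: keep forward speed, trim heading only
        return forward_speed, turning_speed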
In the present application, after obtaining the cooperation instruction, the robot 1 determines the destination object it is to follow, then recognizes the destination object from the scene captured by the robot in real time, and thereby, based on the cooperation instruction, controls the robot 1 to move along the corresponding movement path toward the destination object. Compared with existing robot-following techniques, the present application can accurately lock onto the destination object and track it effectively in natural environments that change in real time and contain many interfering factors, thereby improving the accuracy of robot following and solving the technical problems of following the wrong target or losing the target that frequently occur in current robot following. At the same time, controlling the robot to move along the corresponding movement path toward the destination object based on the cooperation instruction makes it possible to realize coordinated formation movement of multiple robots as a whole.
In one implementation, the fourth device 41 of the network device 2 may provide a first cooperation instruction to a first robot, where the first robot, based on the first cooperation instruction, controls itself to move along a corresponding movement path toward a destination object or to a destination position; the fourth device 41 then provides a second cooperation instruction to a second robot, where the second robot, based on the second cooperation instruction, controls itself to follow the first robot along a corresponding movement path. Further, in one implementation, the formation state between the second robot and the first robot matches the multi-robot formation state information in the cooperation instruction, and the relative distance between the second robot and the first robot falls within a preset relative distance range threshold. Here, the first robot and the second robot may correspond to different robots 1; in one implementation, the same multi-robot cooperation task may be jointly executed by one or more first robots and one or more second robots. In one implementation, the first cooperation instruction and the second cooperation instruction may be identical or different.
It is obvious to those skilled in the art that the present application is not limited to the details of the above exemplary embodiments, and that the present application can be implemented in other specific forms without departing from the spirit or essential characteristics of the application. Therefore, the embodiments should in all respects be regarded as exemplary and non-restrictive; the scope of the present application is defined by the appended claims rather than by the above description, and all changes falling within the meaning and range of equivalency of the claims are therefore intended to be embraced in the application. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is clear that the word "comprising" does not exclude other units or steps, and the singular does not exclude the plural. Multiple units or devices stated in a device claim may also be implemented by a single unit or device through software or hardware. Words such as "first" and "second" are used to denote names and do not indicate any particular order.

Claims (24)

1. A method for performing multi-robot cooperation at a robot end, wherein the method comprises:
obtaining, from a network device, a cooperation instruction matched with the robot;
performing a corresponding multi-robot cooperation task based on the cooperation instruction.
2. The method according to claim 1, wherein performing the corresponding multi-robot cooperation task based on the cooperation instruction comprises:
based on the cooperation instruction, controlling the robot to move along a corresponding movement path to a destination position or toward a destination object.
3. The method according to claim 2, wherein performing the corresponding multi-robot cooperation task based on the cooperation instruction comprises:
determining the destination object to be followed by the robot;
recognizing the destination object from the scene captured by the robot in real time;
based on the cooperation instruction, controlling the robot to move along the corresponding movement path toward the destination object.
4. The method according to claim 3, wherein the destination object comprises a target robot, and the robot and its corresponding target robot carry the same transport object.
5. The method according to claim 3 or 4, wherein, based on the cooperation instruction, controlling the robot to move along the corresponding movement path toward the destination object comprises:
based on the cooperation instruction, controlling the robot to move along the corresponding movement path toward the destination object, wherein the formation state between the robot and the destination object matches the multi-robot formation state information in the cooperation instruction, and the relative distance between the robot and the destination object falls within a preset relative distance range threshold.
6. The method according to any one of claims 3 to 5, wherein, based on the cooperation instruction, controlling the robot to move along the corresponding movement path toward the destination object comprises:
based on the cooperation instruction, determining the movement path of the robot to the destination object;
based on the cooperation instruction, controlling the robot to move along the movement path.
7. The method according to claim 6, wherein, based on the cooperation instruction, determining the movement path of the robot to the destination object comprises:
obtaining obstacle information from the surrounding environment information of the robot;
determining the target coordinates of the robot based on the position information of the identified destination object;
based on the cooperation instruction and in combination with the target coordinates and the obstacle information, determining the movement path of the robot to the destination object, wherein the cooperation instruction comprises multi-robot formation state information.
8. The method according to claim 6 or 7, wherein, based on the cooperation instruction, controlling the robot to move along the movement path comprises:
based on the cooperation instruction, determining the moving speed of the robot, wherein the cooperation instruction comprises a speed control rule;
controlling the robot to move along the movement path at the moving speed, wherein the moving speed keeps the relative distance between the robot and the destination object within a preset relative distance range threshold.
9. The method according to claim 8, wherein, based on the cooperation instruction, determining the moving speed of the robot, the cooperation instruction comprising a speed control rule, comprises:
determining the moving speed of the robot based on the speed control rule, wherein the moving speed comprises a forward speed and/or a turning speed.
10. The method according to claim 9, wherein the speed control rule comprises:
when the distance between the destination object and the robot is greater than or equal to a distance threshold, simultaneously controlling the forward speed and the turning speed of the robot;
when the distance between the destination object and the robot is less than the distance threshold, controlling the turning speed of the robot.
11. The method according to any one of claims 3 to 10, wherein determining the destination object to be followed by the robot comprises:
when the robot is set to follow mode, recognizing a corresponding matching object from the peripheral information captured by the robot in real time;
taking the matching object as the destination object to be followed by the robot.
12. The method according to any one of claims 3 to 10, wherein determining the destination object to be followed by the robot comprises:
determining the coordinate information of the destination object to be followed based on the cooperation instruction;
obtaining the surrounding environment information of the robot in real time, wherein the distance between the robot and the coordinate information is less than or equal to a predetermined distance threshold;
recognizing a corresponding matching object from the surrounding environment information, and taking the matching object as the destination object to be followed by the robot.
13. The method according to claim 12, wherein obtaining the surrounding environment information of the robot in real time, wherein the distance between the robot and the coordinate information is less than or equal to the predetermined distance threshold, comprises:
when the distance between the robot and the coordinate information is greater than the predetermined distance threshold, controlling the robot to move toward the coordinate information;
obtaining the surrounding environment information of the robot in real time, wherein the distance between the robot and the coordinate information is less than or equal to the predetermined distance threshold.
14. The method according to claim 3, wherein recognizing the destination object from the scene captured by the robot in real time comprises:
scanning in real time to obtain the surrounding environment information of the robot;
detecting, from the surrounding environment information, one or more observed objects matching the object feature information of the destination object;
identifying the destination object from the one or more observed objects.
15. The method according to claim 14, wherein identifying the destination object from the one or more observed objects comprises:
determining the association information between each of the one or more observed objects corresponding to the robot and history observation records, wherein the one or more observed objects include the destination object, and the history observation records include object-related information of one or more historically observed objects;
identifying the destination object from the one or more observed objects according to the association information between the observed objects and the history observation records.
16. The method according to claim 15, wherein the method further comprises:
updating the history observation records according to the one or more observed objects, wherein the objects in the updated history observation records include the destination object identified from the one or more observed objects.
17. A method for performing multi-robot cooperation at a network device end, wherein the method comprises:
providing matched cooperation instructions to one or more robots, wherein the robots perform corresponding multi-robot cooperation tasks based on the corresponding cooperation instructions.
18. The method according to claim 17, wherein providing matched cooperation instructions to one or more robots, the robots performing corresponding multi-robot cooperation tasks based on the corresponding cooperation instructions, comprises:
providing identical or different cooperation instructions to multiple robots respectively, wherein each robot performs the corresponding multi-robot cooperation task based on the instruction it executes.
19. The method according to claim 17 or 18, wherein the cooperation instruction comprises at least any one of the following:
multi-robot formation state information of the robot;
a speed control rule of the robot;
coordinate information of the destination object to be followed by the robot;
other execution-related information of the robot.
20. The method according to any one of claims 17 to 19, wherein providing matched cooperation instructions to one or more robots, the robots performing corresponding multi-robot cooperation tasks based on the corresponding cooperation instructions, comprises:
providing a first cooperation instruction to a first robot, wherein the first robot, based on the first cooperation instruction, controls itself to move along a corresponding movement path toward a destination object or to a destination position;
providing a second cooperation instruction to a second robot, wherein the second robot, based on the second cooperation instruction, controls itself to follow the first robot along a corresponding movement path.
21. The method according to claim 20, wherein the formation state between the second robot and the first robot matches the multi-robot formation state information in the cooperation instruction, and the relative distance between the second robot and the first robot falls within a preset relative distance range threshold.
22. A robot for performing multi-robot cooperation, wherein the robot comprises:
a first device, configured to obtain, from a network device, a cooperation instruction matched with the robot;
a second device, configured to perform a corresponding multi-robot cooperation task based on the cooperation instruction.
23. A network device for performing multi-robot cooperation, wherein the device comprises:
a fourth device, configured to provide matched cooperation instructions to one or more robots, wherein the robots perform corresponding multi-robot cooperation tasks based on the corresponding cooperation instructions.
24. A system for performing multi-robot cooperation, wherein the system comprises:
the robot according to claim 22 and the network device according to claim 23.
CN201710067320.2A 2017-02-07 2017-02-07 Method and equipment for multi-robot cooperation Active CN106774345B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710067320.2A CN106774345B (en) 2017-02-07 2017-02-07 Method and equipment for multi-robot cooperation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710067320.2A CN106774345B (en) 2017-02-07 2017-02-07 Method and equipment for multi-robot cooperation

Publications (2)

Publication Number Publication Date
CN106774345A true CN106774345A (en) 2017-05-31
CN106774345B CN106774345B (en) 2020-10-30

Family

ID=58956308

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710067320.2A Active CN106774345B (en) 2017-02-07 2017-02-07 Method and equipment for multi-robot cooperation

Country Status (1)

Country Link
CN (1) CN106774345B (en)

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108428059A (en) * 2018-03-27 2018-08-21 昆明理工大学 A kind of detecting robot of pipe queue forms and develops method
CN108527367A (en) * 2018-03-28 2018-09-14 华南理工大学 A kind of description method of multirobot work compound task
CN108873913A (en) * 2018-08-22 2018-11-23 深圳乐动机器人有限公司 From mobile device work compound control method, device, storage medium and system
CN109683556A (en) * 2017-10-18 2019-04-26 苏州宝时得电动工具有限公司 From mobile device work compound control method, device and storage medium
CN109676611A (en) * 2019-01-25 2019-04-26 北京猎户星空科技有限公司 Multirobot cooperating service method, device, control equipment and system
CN109740464A (en) * 2018-12-21 2019-05-10 北京智行者科技有限公司 The identification follower method of target
CN109765889A (en) * 2018-12-31 2019-05-17 深圳市越疆科技有限公司 A kind of monitoring method of robot, device and intelligent terminal
CN109947105A (en) * 2019-03-27 2019-06-28 科大智能机器人技术有限公司 A kind of speed regulating method and speed regulation device of automatic tractor
CN110153983A (en) * 2018-02-15 2019-08-23 欧姆龙株式会社 Control system, slave device control unit, control method and storage medium
CN110347159A (en) * 2019-07-12 2019-10-18 苏州融萃特种机器人有限公司 Mobile robot Multi computer cooperation method and system
CN110355751A (en) * 2018-03-26 2019-10-22 发那科株式会社 Control device and machine learning device
CN111065981A (en) * 2017-09-25 2020-04-24 日本电产新宝株式会社 Moving body and moving body system
CN111443642A (en) * 2020-04-24 2020-07-24 深圳国信泰富科技有限公司 Cooperative control system and method for robot
CN111612312A (en) * 2020-04-29 2020-09-01 深圳优地科技有限公司 Robot distribution method, robot, terminal device and storage medium
CN111766854A (en) * 2019-03-27 2020-10-13 杭州海康机器人技术有限公司 Control system and control method for AGV cooperative transportation
CN112396653A (en) * 2020-10-31 2021-02-23 清华大学 Target scene oriented robot operation strategy generation method
CN112540605A (en) * 2020-03-31 2021-03-23 深圳优地科技有限公司 Multi-robot cooperation clearance method, server, robot and storage medium
CN112775957A (en) * 2019-11-08 2021-05-11 珠海市一微半导体有限公司 Control method of working robot, working robot system and chip
CN112873206A (en) * 2021-01-22 2021-06-01 中国铁建重工集团股份有限公司 Multi-task automatic distribution mechanical arm control system and operation trolley
CN113771033A (en) * 2021-09-13 2021-12-10 中冶赛迪技术研究中心有限公司 Multi-robot site integrated control system, method, device and medium
CN114019912A (en) * 2021-10-15 2022-02-08 上海电机学院 Group robot motion planning control method and system
CN114296460A (en) * 2021-12-30 2022-04-08 杭州海康机器人技术有限公司 Cooperative transportation method and device, readable storage medium and electronic equipment
CN114536339A (en) * 2022-03-03 2022-05-27 深圳市大族机器人有限公司 Method and device for controlling cooperative robot, cooperative robot and storage medium
CN115097816A (en) * 2022-05-20 2022-09-23 深圳市大族机器人有限公司 Modularized multi-robot cooperation control method
CN115218904A (en) * 2022-06-13 2022-10-21 深圳市优必选科技股份有限公司 Following navigation method, device, computer readable storage medium and mobile device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102662377A (en) * 2012-05-17 2012-09-12 哈尔滨工业大学 Formation system and formation method of multi-mobile robot based on wireless sensor network
CN103608741A (en) * 2011-06-13 2014-02-26 微软公司 Tracking and following of moving objects by a mobile robot
CN103901889A (en) * 2014-03-27 2014-07-02 浙江大学 Multi-robot formation control path tracking method based on Bluetooth communications
CN104950887A (en) * 2015-06-19 2015-09-30 重庆大学 Transportation device based on robot vision system and independent tracking system
CN105425791A (en) * 2015-11-06 2016-03-23 武汉理工大学 Swarm robot control system and method based on visual positioning
CN105527960A (en) * 2015-12-18 2016-04-27 燕山大学 Mobile robot formation control method based on leader-follow
CN106094835A (en) * 2016-08-01 2016-11-09 西北工业大学 The dynamic formation control method of front-wheel drive vehicle type moving machine device people
CN106155065A (en) * 2016-09-28 2016-11-23 上海仙知机器人科技有限公司 A kind of robot follower method and the equipment followed for robot

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103608741A (en) * 2011-06-13 2014-02-26 微软公司 Tracking and following of moving objects by a mobile robot
CN102662377A (en) * 2012-05-17 2012-09-12 哈尔滨工业大学 Formation system and formation method of multi-mobile robot based on wireless sensor network
CN103901889A (en) * 2014-03-27 2014-07-02 浙江大学 Multi-robot formation control path tracking method based on Bluetooth communications
CN104950887A (en) * 2015-06-19 2015-09-30 重庆大学 Transportation device based on robot vision system and independent tracking system
CN105425791A (en) * 2015-11-06 2016-03-23 武汉理工大学 Swarm robot control system and method based on visual positioning
CN105527960A (en) * 2015-12-18 2016-04-27 燕山大学 Mobile robot formation control method based on leader-follow
CN106094835A (en) * 2016-08-01 2016-11-09 西北工业大学 The dynamic formation control method of front-wheel drive vehicle type moving machine device people
CN106155065A (en) * 2016-09-28 2016-11-23 上海仙知机器人科技有限公司 A kind of robot follower method and the equipment followed for robot

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Chinese Association for Artificial Intelligence (ed.): "Progress of Artificial Intelligence in China (中国人工智能进展)", 31 December 2009 *
Lu Huimin: "ROS and Middle-Size League Soccer Robots (ROS与中型组足球机器人)", 31 October 2016 *

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111065981A (en) * 2017-09-25 2020-04-24 日本电产新宝株式会社 Moving body and moving body system
CN109683556B (en) * 2017-10-18 2021-02-09 苏州宝时得电动工具有限公司 Cooperative work control method and device for self-moving equipment and storage medium
CN109683556A (en) * 2017-10-18 2019-04-26 苏州宝时得电动工具有限公司 From mobile device work compound control method, device and storage medium
CN110153983A (en) * 2018-02-15 2019-08-23 欧姆龙株式会社 Control system, slave device control unit, control method and storage medium
CN110355751A (en) * 2018-03-26 2019-10-22 发那科株式会社 Control device and machine learning device
CN110355751B (en) * 2018-03-26 2023-04-28 发那科株式会社 Control device and machine learning device
CN108428059A (en) * 2018-03-27 2018-08-21 昆明理工大学 A kind of detecting robot of pipe queue forms and develops method
CN108428059B (en) * 2018-03-27 2021-07-16 昆明理工大学 Pipeline detection robot queue forming and evolution method
CN108527367B (en) * 2018-03-28 2021-11-19 华南理工大学 Description method of multi-robot cooperative work task
CN108527367A (en) * 2018-03-28 2018-09-14 华南理工大学 A kind of description method of multirobot work compound task
CN108873913A (en) * 2018-08-22 2018-11-23 深圳乐动机器人有限公司 From mobile device work compound control method, device, storage medium and system
CN109740464A (en) * 2018-12-21 2019-05-10 北京智行者科技有限公司 The identification follower method of target
CN109765889A (en) * 2018-12-31 2019-05-17 深圳市越疆科技有限公司 A kind of monitoring method of robot, device and intelligent terminal
CN109676611A (en) * 2019-01-25 2019-04-26 北京猎户星空科技有限公司 Multirobot cooperating service method, device, control equipment and system
CN109947105A (en) * 2019-03-27 2019-06-28 科大智能机器人技术有限公司 A kind of speed regulating method and speed regulation device of automatic tractor
CN111766854A (en) * 2019-03-27 2020-10-13 杭州海康机器人技术有限公司 Control system and control method for AGV cooperative transportation
CN110347159B (en) * 2019-07-12 2022-03-08 苏州融萃特种机器人有限公司 Mobile robot multi-machine cooperation method and system
CN110347159A (en) * 2019-07-12 2019-10-18 苏州融萃特种机器人有限公司 Mobile robot Multi computer cooperation method and system
CN112775957A (en) * 2019-11-08 2021-05-11 珠海市一微半导体有限公司 Control method of working robot, working robot system and chip
CN112540605A (en) * 2020-03-31 2021-03-23 深圳优地科技有限公司 Multi-robot cooperation clearance method, server, robot and storage medium
CN111443642A (en) * 2020-04-24 2020-07-24 深圳国信泰富科技有限公司 Cooperative control system and method for robot
CN111612312A (en) * 2020-04-29 2020-09-01 深圳优地科技有限公司 Robot distribution method, robot, terminal device and storage medium
CN111612312B (en) * 2020-04-29 2023-12-22 深圳优地科技有限公司 Robot distribution method, robot, terminal device, and storage medium
CN112396653A (en) * 2020-10-31 2021-02-23 清华大学 Target scene oriented robot operation strategy generation method
CN112873206A (en) * 2021-01-22 2021-06-01 中国铁建重工集团股份有限公司 Multi-task automatic distribution mechanical arm control system and operation trolley
CN113771033A (en) * 2021-09-13 2021-12-10 中冶赛迪技术研究中心有限公司 Multi-robot site integrated control system, method, device and medium
CN114019912A (en) * 2021-10-15 2022-02-08 上海电机学院 Group robot motion planning control method and system
CN114019912B (en) * 2021-10-15 2024-02-27 上海电机学院 Group robot motion planning control method and system
CN114296460A (en) * 2021-12-30 2022-04-08 杭州海康机器人技术有限公司 Cooperative transportation method and device, readable storage medium and electronic equipment
CN114296460B (en) * 2021-12-30 2023-12-15 杭州海康机器人股份有限公司 Collaborative handling method and device, readable storage medium and electronic equipment
CN114536339A (en) * 2022-03-03 2022-05-27 深圳市大族机器人有限公司 Method and device for controlling cooperative robot, cooperative robot and storage medium
CN115097816A (en) * 2022-05-20 2022-09-23 深圳市大族机器人有限公司 Modularized multi-robot cooperation control method
CN115218904A (en) * 2022-06-13 2022-10-21 深圳市优必选科技股份有限公司 Following navigation method, device, computer readable storage medium and mobile device

Also Published As

Publication number Publication date
CN106774345B (en) 2020-10-30

Similar Documents

Publication Publication Date Title
CN106774345A (en) A kind of method and apparatus for carrying out multi-robot Cooperation
Schmuck et al. Multi-uav collaborative monocular slam
Reif et al. Social potential fields: A distributed behavioral control for autonomous robots
Martinez-Cantin et al. A Bayesian exploration-exploitation approach for optimal online sensing and planning with a visually guided mobile robot
Eich et al. Towards coordinated multirobot missions for lunar sample collection in an unknown environment
KR20100027683A (en) Path planning device and method for the autonomous mobile robot
CN108369422A (en) Motion planning based on rapid discovery randomization feedback
US10220510B2 (en) Unified collaborative environments
González-Banos et al. Motion planning with visibility constraints: Building autonomous observers
Eilers et al. Modeling an AGV based facility logistics system to measure and visualize performance availability in a VR environment
Wei et al. Vision-guided fine-operation of robot and its application in eight-puzzle game
Stipes et al. Cooperative localization and mapping
Gianni et al. ARE: Augmented reality environment for mobile robots
Mansour et al. Depth estimation with ego-motion assisted monocular camera
Hofmann et al. The Carologistics RoboCup Logistics Team 2018
Umari Multi-robot map exploration based on multiple rapidly-exploring randomized trees
Gazdag et al. Autonomous racing of micro air vehicles and their visual tracking within the MIcro aerial vehicle and MOtion capture (MIMO) arena
Asavasirikulkij et al. A Study of Digital Twin and Its Communication Protocol in Factory Automation Cell
Martínez et al. A data-driven path planner for small autonomous robots using deep regression models
Skoglar et al. Concurrent path and sensor planning for a UAV-towards an information based approach incorporating models of environment and sensor
Feng Camera Marker Networks for Pose Estimation and Scene Understanding in Construction Automation and Robotics.
Kang et al. Team Tidyboy at the WRS 2020: A modular software framework for home service robots
Vithalani et al. Autonomous navigation using monocular ORB SLAM2
Hutter et al. Robust and resource-efficient cooperative exploration and mapping using homogeneous autonomous robot teams
Denysyuk et al. A* Modification for Mobile Robotic Systems

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20200702

Address after: 200131 2nd floor, building 13, No. 27, Xinjinqiao Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai

Applicant after: Shanghai xianruan Information Technology Co., Ltd

Address before: 201203, Shanghai, Pudong New Area, China (Shanghai) free trade test area, No. 301, Xia Xia Road, room 22

Applicant before: SHANGHAI SEER ROBOTICS TECHNOLOGY Co.,Ltd.

GR01 Patent grant
GR01 Patent grant