CN106643701A - Robot inter-detection method and robot inter-detection device - Google Patents

Robot inter-detection method and robot inter-detection device

Info

Publication number
CN106643701A
CN106643701A (Application CN201710029024.3A)
Authority
CN
China
Prior art keywords
robot
coordinate
profile
under
map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710029024.3A
Other languages
Chinese (zh)
Other versions
CN106643701B (en)
Inventor
焦小亮
李超
曹立冬
刘文泽
夏舸
顾震江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Top Technology Co Ltd
Original Assignee
Shenzhen Top Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Top Technology Co Ltd
Priority to CN201710029024.3A
Publication of CN106643701A
Application granted
Publication of CN106643701B
Active legal status
Anticipated expiration legal status

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • G01C21/206Instruments for performing navigational calculations specially adapted for indoor navigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a robot inter-detection method and a robot inter-detection device. The method includes the following steps: a first robot acquires the central coordinates of a second robot in a global map; the profile information of the second robot is acquired; the profile coordinates of the second robot in the global map are calculated according to the profile information of the second robot and its central coordinates in the global map; the profile coordinates of the second robot in the global map are converted into profile coordinates in a first local map, where the first local map is the local map of the first robot; and the second robot is marked in the first local map according to its profile coordinates in the first local map. The solution of the invention enables multiple single-line lidar robots operating in the same building to quickly and accurately detect one another's positions, guaranteeing the normal operation of the robots.

Description

Robot mutual-detection method and device
Technical field
The present invention relates to the field of robotics, and in particular to a robot mutual-detection method and device.
Background technology
Indoor robots need to detect their surroundings in real time during navigation. Indoor mobile robots currently use two main classes of sensor: one is the RGB-depth (Red Green Blue Depth, RGB-D) camera or binocular camera, the other is lidar. Because an RGB-D camera performs obstacle detection on image or point-cloud raw data, its computational load is large; moreover, its results are easily affected by indoor lighting, so its robustness is poor. Compared with RGB-D cameras, lidar produces accurate data, is unaffected by ambient lighting, and yields a smaller data volume, so it is widely used in indoor robots as their "eyes". Because multi-line lidar is expensive, single-line lidar is generally used at present. The shell of a robot fitted with lidar must be hollowed out so that the lidar's rays can pass through it to detect obstacles in the environment. Consequently, when multiple robots run on the same indoor floor, one robot can only see the lidar of another robot and cannot see its overall profile. The physical size of the lidar is small relative to the robot's overall profile, so robots cannot accurately detect one another's positions, which affects their indoor operation.
The content of the invention
Embodiments of the present invention provide a robot mutual-detection method and device, intended to enable robots to accurately detect one another's positions when multiple single-line lidar robots run in the same building.
A first aspect of the embodiments of the present invention provides a robot mutual-detection method, which includes:
a first robot obtaining the centre coordinate of a second robot under a global map;
obtaining the profile information of the second robot;
calculating the profile coordinates of the second robot under the global map according to the profile information of the second robot and the centre coordinate of the second robot under the global map;
converting the profile coordinates of the second robot under the global map into profile coordinates under a first local map, where the first local map is the local map of the first robot; and
under the first local map, marking the second robot according to its profile coordinates under the first local map.
A second aspect of the embodiments of the present invention provides a robot mutual-detection device, which includes:
a centre-coordinate acquiring unit, configured to obtain the centre coordinate of a second robot under a global map;
a profile-information acquiring unit, configured to obtain the profile information of the second robot;
a profile-coordinate calculating unit, configured to calculate the profile coordinates of the second robot under the global map according to the profile information obtained by the profile-information acquiring unit and the centre coordinate obtained by the centre-coordinate acquiring unit;
a profile-coordinate conversion unit, configured to convert the profile coordinates of the second robot under the global map, as calculated by the profile-coordinate calculating unit, into profile coordinates under a first local map, where the first local map is the local map of the first robot; and
a second-robot marking unit, configured to mark the second robot under the first local map according to the profile coordinates under the first local map obtained by the profile-coordinate conversion unit.
Therefore, in the embodiments of the present invention, the first robot first obtains the centre coordinate of the second robot under the global map and then obtains the profile information of the second robot; it then calculates the profile coordinates of the second robot under the global map from that profile information and centre coordinate, converts those profile coordinates into profile coordinates under the first local map (the local map of the first robot), and finally, under the first local map, marks the second robot according to the converted profile coordinates. The embodiments of the present invention enable a single-line lidar robot to quickly learn the positions of other robots, preventing delayed position detection from affecting robot operation.
Description of the drawings
To illustrate the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Apparently, the drawings described below are only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of the robot mutual-detection method provided by an embodiment of the present invention;
Fig. 2 is a structural block diagram of the robot mutual-detection device provided by an embodiment of the present invention.
Specific embodiment
To make the objects, features, and advantages of the present invention more apparent and understandable, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings in the embodiments. Apparently, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
The implementation of the present invention is described in detail below with reference to specific embodiments:
Embodiment one
Fig. 1 shows the implementation flow of the robot mutual-detection method provided by Embodiment 1 of the present invention, detailed as follows:
In step S101, the first robot obtains the centre coordinate of the second robot under the global map.
In the embodiments of the present invention, to better explain the scheme, the first robot and the second robot are briefly described first. Among all the robots in the same building, any one may be selected as the first robot of the embodiment and use the scheme of the embodiment to detect the positions of the other robots. Since the present invention provides a mutual-detection method for robots, every robot other than the first robot can serve as a second robot. In fact, because any robot can use the scheme of the embodiment to detect other robots, the first and second robots can be chosen according to the actual situation, which is not limited here.
The first robot can obtain the centre coordinate of the second robot under the global map. For the navigation and safety of an indoor robot, an indoor environment map is usually built with simultaneous localization and mapping (SLAM) technology before the robot formally enters service, and serves as the robot's global map. Although a prior map built by SLAM may differ slightly from the environment during actual operation because of dynamic uncertainties such as human activity, it still reflects the robot's environment in general. The current mainstream representation of the global map is the grid map: a map that represents the environment cell by cell, whose coordinate-axis unit is the row or column. When an obstacle exists somewhere in the environment, the corresponding cell of the grid map is set to "obstacle"; when there is no obstacle somewhere, the corresponding cell is set to "free". A global map built by SLAM lets the robot recognize the general situation of its environment and preliminarily determine its trajectory. Further, the coordinate obtained at this point is the position of the second robot's centre, not of its outline. Since robots can have different external shapes, the centre coordinate of the second robot cannot truly reflect the space the robot occupies in the environment, so the obtained centre coordinate must be processed further.
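The grid-map representation described above can be sketched minimally as follows. This is an illustrative model only, not code from the patent; the cell values and the class name are assumptions.

```python
# A minimal occupancy-grid sketch of the global map described above.
# Cell values: 0 = free, 1 = obstacle (names are illustrative assumptions).
FREE, OCCUPIED = 0, 1

class GridMap:
    def __init__(self, rows, cols):
        # Every cell starts free; axis units are rows and columns.
        self.cells = [[FREE] * cols for _ in range(rows)]

    def set_obstacle(self, row, col):
        # When an obstacle exists somewhere, the corresponding cell is set.
        self.cells[row][col] = OCCUPIED

    def is_free(self, row, col):
        return self.cells[row][col] == FREE

grid = GridMap(10, 10)
grid.set_obstacle(3, 4)        # somewhere in the environment has an obstacle
print(grid.is_free(3, 4))      # False
print(grid.is_free(0, 0))      # True
```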
Specifically, the first robot can establish a Bluetooth connection with the second robot; the host computer of the first robot then obtains the centre coordinate of the second robot under the world coordinate system via Bluetooth and, according to the relation with the world coordinate system saved in the global map, converts that coordinate into the centre coordinate of the second robot under the global map. The world coordinate system is a physical coordinate system whose axis unit is the metre; its origin is the initial position from which the robot started moving in the operating area when the global map was built with SLAM after initial start-up. After the global map has been built with SLAM, the relation between the map and the world coordinate system is saved in the map, so coordinates under the global map and coordinates under the world coordinate system can be converted into each other. While running, each robot can know its own current coordinate under the world coordinate system in real time through its own positioning system; therefore, the centre coordinate of the second robot obtained by the first robot is actually the centre coordinate of the second robot under the world coordinate system. After obtaining it, the first robot uses the relation saved in the global map to obtain the centre coordinate of the second robot under the global map.
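A sketch of the world-frame-to-map conversion described above. The patent only says the global map saves its relation to the world coordinate system; the concrete form of that relation here (a map origin at the SLAM start pose plus a resolution in metres per cell) is an assumption for illustration.

```python
# Sketch: convert a world-frame centre coordinate (metres) into global-map
# grid indices, assuming the saved "relation" is an origin plus a resolution
# (metres per cell) -- both are illustrative assumptions.
def world_to_map(x_m, y_m, origin_x_m=0.0, origin_y_m=0.0, resolution=0.05):
    col = int(round((x_m - origin_x_m) / resolution))
    row = int(round((y_m - origin_y_m) / resolution))
    return row, col

# The second robot reports its centre at (1.0 m, 2.0 m) in the world frame.
print(world_to_map(1.0, 2.0))   # (40, 20) at 5 cm per cell
```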
In step S102, the profile information of the second robot is obtained.
In the embodiments of the present invention, the first robot then obtains the profile information of the second robot. Because different robots can have different external shapes, the profile information corresponding to the second robot must be obtained here. The profile information corresponding to the second robot can be associated with the centre point of the second robot.
Optionally, since each robot possesses a unique ID, the ID of the second robot can be obtained before step S102 in order to quickly obtain its profile information, and the model of the second robot is determined from that ID. Because each robot's ID is unique, the model of the second robot can be quickly identified from the obtained ID. After the model of the second robot has been obtained, its profile information can be obtained through different channels according to the model. For example, when the model of the second robot differs from that of the first robot, the first robot can look up the profile information of the second robot in a preset robot-profile-information table that it stores itself; when the models of the second robot and the first robot are identical, the first robot can directly read its own profile information and use it as the profile information of the second robot. The preset robot-profile-information table can store all the robot models existing in the operating area and their corresponding profile information, and the user can add or delete entries according to the robots actually running in the area. Because the first robot can quickly determine the model from the second robot's ID, looking up the preset robot-profile-information table confirms the second robot's profile information simply and conveniently. And when the second robot has the same model as the first robot, the appearance profiles of the two robots are completely identical; since the first robot necessarily knows its own profile information, it only needs to read it and use it as the second robot's profile information. Of course, the profile information of the second robot can also be obtained by other methods, which is not limited here. If all robot models running in the area are identical, the preset robot-profile-information table need not be stored.
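The lookup flow above can be sketched as follows. The ID scheme, model names, and table contents are illustrative assumptions; the patent only specifies the ID-to-model step and the same-model shortcut.

```python
# Sketch of the profile lookup: resolve the second robot's model from its
# unique ID, then fetch its profile from a preset table -- unless the model
# matches the first robot's, in which case its own profile is reused.
ID_TO_MODEL = {"robot-007": "ModelA", "robot-042": "ModelB"}

# Profiles stored as contour-point offsets (metres) relative to the centre.
PROFILE_TABLE = {
    "ModelA": [(0.3, 0.3), (-0.3, 0.3), (-0.3, -0.3), (0.3, -0.3)],
    "ModelB": [(0.25, 0.4), (-0.25, 0.4), (-0.25, -0.4), (0.25, -0.4)],
}

def get_profile(second_id, own_model, own_profile):
    model = ID_TO_MODEL[second_id]
    if model == own_model:
        # Same model: simply read and reuse the first robot's own profile.
        return own_profile
    return PROFILE_TABLE[model]

own = PROFILE_TABLE["ModelA"]
print(get_profile("robot-007", "ModelA", own) is own)   # True: reused own profile
print(get_profile("robot-042", "ModelA", own))          # looked up ModelB's profile
```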
In step S103, the profile coordinates of the second robot under the global map are calculated according to the profile information of the second robot and the centre coordinate of the second robot under the global map.
In the embodiments of the present invention, the profile coordinates of the second robot under the global map are calculated from the profile information obtained in step S102 and the centre coordinate under the global map obtained in step S101. Because the profile information obtained in step S102 is associated with the centre coordinate obtained in step S101, the profile coordinates of the second robot under the global map can be calculated through that association.
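If the association is taken to mean that the profile is stored as offsets from the centre point (an assumption consistent with, but not stated by, the text), step S103 reduces to a simple translation:

```python
# Sketch of step S103: combine the second robot's centre coordinate under the
# global map with its profile, assumed stored as offsets from the centre
# point, to obtain its contour coordinates under the global map.
def profile_to_global(center, profile_offsets):
    cx, cy = center
    return [(cx + dx, cy + dy) for dx, dy in profile_offsets]

center = (5.0, 3.0)                                   # centre under the global map
offsets = [(0.5, 0.5), (-0.5, 0.5), (-0.5, -0.5), (0.5, -0.5)]
print(profile_to_global(center, offsets))
# [(5.5, 3.5), (4.5, 3.5), (4.5, 2.5), (5.5, 2.5)]
```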
In step S104, the profile coordinates of the second robot under the global map are converted into profile coordinates under the first local map.
In the embodiments of the present invention, after the profile coordinates of the second robot under the global map have been obtained in step S103, the first robot can convert them into profile coordinates under the first local map, where the first local map is the local map of the first robot, a grid map whose axis unit is the row or column. The local map is the basis on which the robot performs obstacle avoidance during navigation; obstacles within a preset range centred on the robot can be marked in it, and it moves with the robot and is continuously refreshed. Specifically, step S104 can be implemented as follows: the first robot first converts the obtained profile coordinates of the second robot under the global map into profile coordinates under the first robot's coordinate frame, and then, according to the conversion relation with the first robot's coordinate frame saved by the first local map, converts those into profile coordinates under the first local map. A robot's coordinate frame is the coordinate system the robot builds with its own centroid as the origin; its axis unit is the metre.
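The two-stage conversion described above can be sketched as follows. The first robot's pose, the local-map resolution, and the map's centring convention are all illustrative assumptions, since the patent only names the two frames involved.

```python
# Sketch of step S104: global-map contour point -> first robot's coordinate
# frame (origin at its centroid, metres) -> first-local-map grid indices.
import math

def global_to_robot_frame(pt, robot_pose):
    """robot_pose = (x, y, heading) of the first robot in the global frame."""
    x, y = pt
    rx, ry, theta = robot_pose
    dx, dy = x - rx, y - ry
    # Rotate into the robot frame (inverse rotation by the heading).
    return (dx * math.cos(-theta) - dy * math.sin(-theta),
            dx * math.sin(-theta) + dy * math.cos(-theta))

def robot_frame_to_local_map(pt, resolution=0.05, half_size=100):
    """Local map centred on the robot: index (half_size, half_size) is the robot."""
    x, y = pt
    return (half_size + int(round(y / resolution)),
            half_size + int(round(x / resolution)))

pose = (4.0, 3.0, 0.0)                      # first robot at (4, 3), facing +x
contour_pt = (5.0, 3.5)                     # one contour point of the second robot
in_robot = global_to_robot_frame(contour_pt, pose)
print(in_robot)                             # (1.0, 0.5)
print(robot_frame_to_local_map(in_robot))   # (110, 120)
```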
Optionally, since step S104 only obtains the profile coordinates of the second robot under the first local map, while a robot is in fact more than a mere outline or shell, the method also includes, after step S104: obtaining the interior coordinates of the second robot under the first local map.
Profile coordinates under the first local map that share the same horizontal-axis coordinate can be placed in the same priority queue, and the profile coordinate points with the smallest and the largest vertical coordinate in that queue are selected. It can be simply assumed that, at that horizontal-axis coordinate, the vertical-axis range occupied by the second robot extends from the selected minimum-vertical-coordinate point to the maximum-vertical-coordinate point. By this method, all the interior coordinates of the second robot under the first local map can be obtained.
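The per-column min/max procedure above can be sketched as follows; a plain list per column is used here in place of a priority queue, which is an implementation assumption.

```python
# Sketch of the interior-coordinate step: group contour cells sharing the
# same horizontal index, keep the minimum and maximum vertical index in each
# group, and treat every cell between them as occupied by the second robot.
from collections import defaultdict

def interior_cells(contour_cells):
    columns = defaultdict(list)
    for x, y in contour_cells:
        columns[x].append(y)
    cells = []
    for x, ys in sorted(columns.items()):
        for y in range(min(ys), max(ys) + 1):   # span from min to max
            cells.append((x, y))
    return cells

contour = [(0, 0), (0, 2), (1, 0), (1, 2)]      # a tiny rectangle outline
print(interior_cells(contour))
# [(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2)]
```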
In step S105, under the first local map, the second robot is marked according to its profile coordinates under the first local map.
In the embodiments of the present invention, under the first local map, the second robot is marked as an obstacle according to the profile coordinates under the first local map obtained in step S104. The first local map contains a static layer, an obstacle layer, and an inflation layer. The static layer is the static layer obtained when the global map was built; the obstacle layer is the layer in which the obstacles detected by the sensors are marked; the inflation layer is the layer obtained after inflating the obstacles marked in the obstacle layer. To also represent the second robot in the first local map, a robot layer can be created in the first local map, where the robot layer is consistent with the first local map in size, origin, axis directions, and resolution. Then, based on the profile coordinates of the second robot under the first local map, the grid cells occupied by the second robot are selected in the robot layer according to the point-in-polygon (PNPoly) algorithm and marked as obstacles; in the inflation layer, the cells marked as obstacles in the obstacle layer and the robot layer are inflated; finally the robot layer, static layer, obstacle layer, and inflation layer are superimposed, completing the marking of the second robot under the first local map.
Since a robot's local map only shows obstacles within the robot's preset range, the profile coordinates and interior coordinates of the second robot may actually exceed the range the first local map can represent; this would add unnecessary computation in step S105 and increase the burden on the processing system. To avoid this, the PNPoly algorithm is applied in step S105 so that the grid cells occupied by the second robot are correctly selected within the extent of the first local map, weeding out invalid profile coordinates and/or invalid interior coordinates that lie outside the range of the first local map.
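The PNPoly test referenced above is the classic ray-crossing point-in-polygon algorithm; a standard implementation is sketched below, not code from the patent.

```python
# Standard PNPoly ray-crossing test: decide which local-map grid cells fall
# inside the second robot's contour polygon.
def pnpoly(vertices, px, py):
    inside = False
    j = len(vertices) - 1
    for i in range(len(vertices)):
        xi, yi = vertices[i]
        xj, yj = vertices[j]
        # Toggle on each edge crossed by the horizontal ray from (px, py).
        if (yi > py) != (yj > py) and \
           px < (xj - xi) * (py - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]   # the second robot's contour cells
print(pnpoly(square, 2, 2))    # True: cell inside the contour, so mark it
print(pnpoly(square, 5, 2))    # False: outside, weeded out
```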
Optionally, to save the robot's computing resources, after step S101 the robot mutual-detection method also includes:
detecting whether the second robot is within the preset range of the first robot;
where obtaining the profile information of the second robot specifically means: obtaining the profile information of the second robot when the second robot is within the preset range of the first robot.
After the centre coordinate of the second robot under the global map has been obtained, the Euclidean distance between the second robot and the first robot can be calculated with the Euclidean distance formula, and from that distance the first robot can make a preliminary judgment on whether the second robot is within the preset range of the first robot. Normally, the range represented by a robot's local map is a circular area of fixed radius, or a rectangular area of fixed side length, centred on the robot. Apparently, when the first robot and the second robot are tens of metres apart, position detection is unnecessary; only when the second robot enters the preset range of the first robot is it necessary to detect its position with the scheme of the embodiments of the present invention. The preset range of the first robot can be set by the user, can be set by the first robot itself to the range the first local map can represent, or can be set to the range within which a Bluetooth connection can be made, which is not limited here.
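The distance pre-check above can be sketched as follows; the concrete range value is an assumption, since the text leaves it configurable.

```python
# Sketch of the pre-check: compute the Euclidean distance between the two
# robots' centres and only proceed to fetch profile information when the
# second robot is inside the first robot's preset range.
import math

def within_preset_range(center_a, center_b, preset_range_m=5.0):
    dist = math.hypot(center_b[0] - center_a[0], center_b[1] - center_a[1])
    return dist <= preset_range_m

print(within_preset_range((0.0, 0.0), (3.0, 4.0)))    # True: 5.0 m away
print(within_preset_range((0.0, 0.0), (30.0, 40.0)))  # False: tens of metres apart
```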
Therefore, in the embodiments of the present invention, in an area where multiple single-line lidar robots operate, any robot can obtain the coordinates of the other single-line lidar robots and, combined with the corresponding profile information, mark them as obstacles in its own local map, so that a single-line lidar robot can quickly know the positions of the other robots and delayed position detection is prevented from affecting robot operation.
Those of ordinary skill in the art will appreciate that all or part of the steps in the method of the above embodiment can be completed by instructing the relevant hardware through a program; the corresponding program can be stored in a computer-readable storage medium, such as a ROM/RAM, magnetic disk, or optical disc.
Embodiment two
Fig. 2 shows the concrete structural block diagram of the robot mutual-detection device provided by Embodiment 2 of the present invention; for convenience of description, only the parts related to the embodiment of the present invention are shown. The robot mutual-detection device 2 includes: a centre-coordinate acquiring unit 21, a profile-information acquiring unit 22, a profile-coordinate calculating unit 23, a profile-coordinate conversion unit 24, and a second-robot marking unit 25.
The centre-coordinate acquiring unit 21 is configured to obtain the centre coordinate of the second robot under the global map;
the profile-information acquiring unit 22 is configured to obtain the profile information of the second robot;
the profile-coordinate calculating unit 23 is configured to calculate the profile coordinates of the second robot under the global map according to the profile information obtained by the profile-information acquiring unit 22 and the centre coordinate obtained by the centre-coordinate acquiring unit 21;
the profile-coordinate conversion unit 24 is configured to convert the profile coordinates of the second robot under the global map, as calculated by the profile-coordinate calculating unit 23, into profile coordinates under the first local map, where the first local map is the local map of the first robot;
the second-robot marking unit 25 is configured to mark the second robot under the first local map according to the profile coordinates under the first local map obtained by the profile-coordinate conversion unit 24.
Optionally, the robot mutual-detection device 2 also includes:
a distance detecting unit, configured to detect whether the second robot is within the preset range of the first robot;
the profile-information acquiring unit 22 being specifically configured to obtain the profile information of the second robot when the distance detecting unit detects that the second robot is within the preset range of the first robot.
Optionally, the robot mutual-detection device 2 also includes:
an ID acquiring unit, configured to obtain the ID of the second robot;
a model determining unit, configured to determine the model of the second robot according to the ID obtained by the ID acquiring unit;
the profile-information acquiring unit 22 being specifically configured to look up the profile information of the second robot in a preset robot-profile-information table if the model of the second robot differs from that of the first robot, and to read the profile information of the first robot itself and use it as the profile information of the second robot if the models of the second robot and the first robot are identical.
Optionally, the centre-coordinate acquiring unit 21 includes:
a connection establishing subunit, configured to establish a Bluetooth connection with the second robot;
a world-coordinate obtaining subunit, configured to obtain the centre coordinate of the second robot under the world coordinate system via Bluetooth;
a world-coordinate conversion subunit, configured to convert the centre coordinate under the world coordinate system obtained by the world-coordinate obtaining subunit into the centre coordinate of the second robot under the global map, according to the relation with the world coordinate system saved in the global map.
Optionally, the first local map includes a static layer, an obstacle layer, and an inflation layer; the second-robot marking unit 25 includes:
a robot-layer setting subunit, configured to create a robot layer in the first local map, where the robot layer is consistent with the first local map in size, origin, axis directions, and resolution;
a robot obstacle-marking subunit, configured to select, according to the PNPoly algorithm and based on the profile coordinates of the second robot under the first local map, the grid cells occupied by the second robot in the robot layer created by the robot-layer setting subunit, and to mark them as obstacles;
an obstacle inflation subunit, configured to inflate, in the inflation layer, the cells marked as obstacles in the obstacle layer and the robot layer;
a map superposition subunit, configured to superimpose the robot layer, static layer, obstacle layer, and inflation layer, completing the marking of the second robot under the first local map.
It should be noted that the robot mutual detection apparatus in the embodiments of the present invention may be integrated in a robot in software (for example, in the form of an app) and/or in hardware.
Therefore, in the embodiments of the present invention, in an area where multiple single-line lidar robots operate, any robot can use the robot mutual detection apparatus to obtain the coordinates of the other single-line lidar robots and, with reference to the corresponding profile information, mark the other single-line lidar robots as obstacles in its own local map, so that a single-line lidar robot can quickly learn the positions of the other robots, preventing late detection of a position from affecting the robots' operation.
It should be noted that, in the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division of the units is merely a division by logical function, and other division manners are possible in actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses or units, and may be electrical, mechanical or in other forms.
For ease of description, each of the foregoing method embodiments is expressed as a series of action combinations; however, those skilled in the art should understand that the present invention is not limited by the described order of actions, because according to the present invention some steps may be performed in other orders or simultaneously. Secondly, those skilled in the art should also understand that the embodiments described in the specification are preferred embodiments, and the actions and modules involved are not necessarily all required by the present invention.
In the above embodiments, the description of each embodiment has its own emphasis; for parts not described in detail in one embodiment, reference may be made to the related descriptions of the other embodiments.
The above describes preferred embodiments provided by the present invention. Those of ordinary skill in the art may, according to the idea of the embodiments of the present invention, make changes in the specific implementations and the application scope. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. A robot mutual detection method, characterized in that the robot mutual detection method comprises:
a first robot obtaining a centre coordinate of a second robot under a global map;
obtaining profile information of the second robot;
calculating, according to the profile information of the second robot and the centre coordinate of the second robot under the global map, profile coordinates of the second robot under the global map;
converting the profile coordinates of the second robot under the global map into profile coordinates under a first local map, wherein the first local map is the local map of the first robot;
under the first local map, marking the second robot according to the profile coordinates of the second robot under the first local map.
2. The robot mutual detection method according to claim 1, characterized in that, after the first robot obtains the centre coordinate of the second robot under the global map, the method further comprises:
detecting whether the second robot is within a preset range of the first robot;
wherein the obtaining of the profile information of the second robot is specifically: obtaining the profile information of the second robot when the second robot is within the preset range of the first robot.
3. The robot mutual detection method according to claim 1, characterized in that, before the obtaining of the profile information of the second robot, the method further comprises:
the first robot obtaining an ID of the second robot;
determining, according to the ID of the second robot, a model of the second robot;
wherein the obtaining of the profile information of the second robot is specifically:
if the model of the second robot differs from that of the first robot, looking up the profile information of the second robot in a preset robot profile information table;
if the second robot is of the same model as the first robot, reading the profile information of the first robot itself as the profile information of the second robot.
4. The robot mutual detection method according to any one of claims 1 to 3, characterized in that the first robot obtaining the centre coordinate of the second robot under the global map comprises:
the first robot establishing a Bluetooth connection with the second robot;
a host computer of the first robot obtaining, via Bluetooth, the centre coordinate of the second robot under a world coordinate system;
converting, according to the relation to the world coordinate system stored in the global map, the centre coordinate of the second robot under the world coordinate system into the centre coordinate of the second robot under the global map.
5. The robot mutual detection method according to any one of claims 1 to 3, characterized in that the first local map comprises a static layer, an obstacle layer and an inflation layer; and the marking, under the first local map, of the second robot according to the profile coordinates of the second robot under the first local map comprises:
setting up a robot layer in the first local map, wherein the robot layer is consistent with the first local map in size, coordinate origin, coordinate axis directions and resolution;
filtering out, based on the profile coordinates of the second robot under the first local map and according to the PNPoly algorithm, the grid cells occupied by the second robot in the robot layer, and marking them as obstacles;
applying inflation processing, in the inflation layer, to the grid cells marked as obstacles in the obstacle layer and the robot layer;
superposing the robot layer, the static layer, the obstacle layer and the inflation layer, completing the marking of the second robot under the first local map.
6. A robot mutual detection apparatus, characterized in that the robot mutual detection apparatus comprises:
a centre coordinate acquiring unit, configured to obtain a centre coordinate of a second robot under a global map;
a profile information acquiring unit, configured to obtain profile information of the second robot;
a profile coordinate calculating unit, configured to calculate profile coordinates of the second robot under the global map according to the profile information of the second robot obtained by the profile information acquiring unit and the centre coordinate of the second robot under the global map obtained by the centre coordinate acquiring unit;
a profile coordinate conversion unit, configured to convert the profile coordinates of the second robot under the global map calculated by the profile coordinate calculating unit into profile coordinates under a first local map, wherein the first local map is the local map of a first robot;
a second robot marking unit, configured to mark, under the first local map, the second robot according to the profile coordinates of the second robot under the first local map obtained by the profile coordinate conversion unit.
7. The robot mutual detection apparatus according to claim 6, characterized in that the robot mutual detection apparatus further comprises:
a distance detecting unit, configured to detect whether the second robot is within a preset range of the first robot;
wherein the profile information acquiring unit is specifically configured to obtain the profile information of the second robot when the distance detecting unit detects that the second robot is within the preset range of the first robot.
8. The robot mutual detection apparatus according to claim 6, characterized in that the robot mutual detection apparatus further comprises:
an ID acquiring unit, configured to obtain an ID of the second robot;
a model determining unit, configured to determine a model of the second robot according to the ID of the second robot obtained by the ID acquiring unit;
wherein the profile information acquiring unit is specifically configured to: if the model of the second robot differs from that of the first robot, look up the profile information of the second robot in a preset robot profile information table; and if the second robot is of the same model as the first robot, read the profile information of the first robot itself as the profile information of the second robot.
9. The robot mutual detection apparatus according to any one of claims 6 to 8, characterized in that the centre coordinate acquiring unit comprises:
a connection establishing subunit, configured to establish a Bluetooth connection with the second robot;
a world coordinate acquiring subunit, configured to obtain, via Bluetooth, the centre coordinate of the second robot under a world coordinate system;
a world coordinate conversion subunit, configured to convert, according to the relation to the world coordinate system stored in the global map, the centre coordinate of the second robot under the world coordinate system obtained by the world coordinate acquiring subunit into the centre coordinate of the second robot under the global map.
10. The robot mutual detection apparatus according to any one of claims 6 to 8, characterized in that the first local map comprises a static layer, an obstacle layer and an inflation layer; and the second robot marking unit comprises:
a robot layer setting subunit, configured to set up a robot layer in the first local map, wherein the robot layer is consistent with the first local map in size, coordinate origin, coordinate axis directions and resolution;
a robot obstacle marking subunit, configured to filter out, based on the profile coordinates of the second robot under the first local map and according to the PNPoly algorithm, the grid cells occupied by the second robot in the robot layer set up by the robot layer setting subunit, and mark them as obstacles;
an obstacle inflation subunit, configured to apply inflation processing, in the inflation layer, to the grid cells marked as obstacles in the obstacle layer and the robot layer;
a map superposition subunit, configured to superpose the robot layer, the static layer, the obstacle layer and the inflation layer, completing the marking of the second robot under the first local map.
CN201710029024.3A 2017-01-16 2017-01-16 A kind of mutual detection method and device of robot Active CN106643701B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710029024.3A CN106643701B (en) 2017-01-16 2017-01-16 A kind of mutual detection method and device of robot


Publications (2)

Publication Number Publication Date
CN106643701A true CN106643701A (en) 2017-05-10
CN106643701B CN106643701B (en) 2019-05-14

Family

ID=58844424

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710029024.3A Active CN106643701B (en) 2017-01-16 2017-01-16 A kind of mutual detection method and device of robot

Country Status (1)

Country Link
CN (1) CN106643701B (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107145152A (en) * 2017-06-28 2017-09-08 上海木爷机器人技术有限公司 Prevent method, device and the robot of multimachine sensor mutual interference
CN107480638A (en) * 2017-08-16 2017-12-15 北京京东尚科信息技术有限公司 Vehicle obstacle-avoidance method, controller, device and vehicle
CN108303101A (en) * 2018-03-05 2018-07-20 弗徕威智能机器人科技(上海)有限公司 A kind of construction method of navigation map
CN108334090A (en) * 2018-02-12 2018-07-27 弗徕威智能机器人科技(上海)有限公司 A kind of setting method of virtual obstacles
CN108733065A (en) * 2017-09-29 2018-11-02 北京猎户星空科技有限公司 A kind of barrier-avoiding method of robot, device and robot
CN109213155A (en) * 2018-08-21 2019-01-15 北京云迹科技有限公司 Dispatching method, device and the server mutually avoided for multirobot
CN110070712A (en) * 2019-04-12 2019-07-30 同济大学 A kind of low speed sweeper Global localization system and method
CN111338384A (en) * 2019-12-17 2020-06-26 北京化工大学 Self-adaptive path tracking method of snake-like robot
CN111383300A (en) * 2018-12-28 2020-07-07 深圳市优必选科技有限公司 Method, device and equipment for updating navigation cost map
CN111546348A (en) * 2020-06-10 2020-08-18 上海有个机器人有限公司 Robot position calibration method and position calibration system
CN111984013A (en) * 2020-08-21 2020-11-24 北京云迹科技有限公司 Distance calculation method and device based on robot footprint data
CN112601060A (en) * 2020-12-10 2021-04-02 西北工业大学 Active sharing projection surface sensing system and method for desktop cluster robot
CN112950705A (en) * 2021-03-15 2021-06-11 中原动力智能机器人有限公司 Image target filtering method and system based on positioning system
CN113063426A (en) * 2020-01-02 2021-07-02 北京初速度科技有限公司 Position information determining method and device
TWI739255B (en) * 2018-12-28 2021-09-11 南韓商Lg電子股份有限公司 Mobile robot
CN114061563A (en) * 2021-10-15 2022-02-18 深圳优地科技有限公司 Method and device for judging reasonability of target point, terminal equipment and storage medium
CN114199251A (en) * 2021-12-03 2022-03-18 江苏集萃智能制造技术研究所有限公司 Anti-collision positioning method for robot
CN114434496A (en) * 2022-01-19 2022-05-06 山东新一代信息产业技术研究院有限公司 Assistant robot for detecting performance of robot
CN114485662A (en) * 2021-12-28 2022-05-13 深圳优地科技有限公司 Robot repositioning method and device, robot and storage medium
CN116358531A (en) * 2023-06-01 2023-06-30 佛山云曼健康科技有限公司 Map construction method, device, robot and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101000507A (en) * 2006-09-29 2007-07-18 浙江大学 Method for moving robot simultanously positioning and map structuring at unknown environment
CN105115497A (en) * 2015-09-17 2015-12-02 南京大学 Reliable indoor mobile robot precise navigation positioning system and method
CN105806315A (en) * 2014-12-31 2016-07-27 上海新跃仪表厂 Active coded information based non-cooperative object relative measurement system and measurement method thereof
US20160371301A1 (en) * 2010-03-09 2016-12-22 Sony Corporation Information processing device, map update method, program, and information processing system
CN107390681A (en) * 2017-06-21 2017-11-24 华南理工大学 A kind of mobile robot real-time location method based on laser radar and map match
CN109100730A (en) * 2018-05-18 2018-12-28 北京师范大学-香港浸会大学联合国际学院 A kind of fast run-up drawing method of more vehicle collaborations
CN109241228A (en) * 2018-09-04 2019-01-18 山东理工大学 A kind of multiple mobile robot's cooperation synchronous superposition strategy


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XU Yue et al.: "On fuzzy control of target point search paths for robot navigation", Computer Simulation (《计算机仿真》) *

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107145152B (en) * 2017-06-28 2020-02-14 上海木木机器人技术有限公司 Method and device for preventing mutual interference of multiple sensors and robot
CN107145152A (en) * 2017-06-28 2017-09-08 上海木爷机器人技术有限公司 Prevent method, device and the robot of multimachine sensor mutual interference
CN107480638A (en) * 2017-08-16 2017-12-15 北京京东尚科信息技术有限公司 Vehicle obstacle-avoidance method, controller, device and vehicle
CN107480638B (en) * 2017-08-16 2020-06-30 北京京东尚科信息技术有限公司 Vehicle obstacle avoidance method, controller, device and vehicle
CN108733065A (en) * 2017-09-29 2018-11-02 北京猎户星空科技有限公司 A kind of barrier-avoiding method of robot, device and robot
CN108334090A (en) * 2018-02-12 2018-07-27 弗徕威智能机器人科技(上海)有限公司 A kind of setting method of virtual obstacles
CN108303101A (en) * 2018-03-05 2018-07-20 弗徕威智能机器人科技(上海)有限公司 A kind of construction method of navigation map
CN109213155A (en) * 2018-08-21 2019-01-15 北京云迹科技有限公司 Dispatching method, device and the server mutually avoided for multirobot
CN109213155B (en) * 2018-08-21 2021-09-14 北京云迹科技有限公司 Scheduling method and device for mutual avoidance of multiple robots and server
CN111383300A (en) * 2018-12-28 2020-07-07 深圳市优必选科技有限公司 Method, device and equipment for updating navigation cost map
CN111383300B (en) * 2018-12-28 2023-04-14 深圳市优必选科技有限公司 Method, device and equipment for updating navigation cost map
TWI739255B (en) * 2018-12-28 2021-09-11 南韓商Lg電子股份有限公司 Mobile robot
CN110070712A (en) * 2019-04-12 2019-07-30 同济大学 A kind of low speed sweeper Global localization system and method
CN111338384A (en) * 2019-12-17 2020-06-26 北京化工大学 Self-adaptive path tracking method of snake-like robot
CN111338384B (en) * 2019-12-17 2021-06-08 北京化工大学 Self-adaptive path tracking method of snake-like robot
CN113063426A (en) * 2020-01-02 2021-07-02 北京初速度科技有限公司 Position information determining method and device
CN113063426B (en) * 2020-01-02 2022-12-13 北京魔门塔科技有限公司 Position information determining method and device
CN111546348A (en) * 2020-06-10 2020-08-18 上海有个机器人有限公司 Robot position calibration method and position calibration system
CN111984013B (en) * 2020-08-21 2024-03-19 北京云迹科技股份有限公司 Distance calculation method and device based on robot footprint data
CN111984013A (en) * 2020-08-21 2020-11-24 北京云迹科技有限公司 Distance calculation method and device based on robot footprint data
CN112601060B (en) * 2020-12-10 2022-03-15 西北工业大学 Active sharing projection surface sensing system of desktop cluster robot
CN112601060A (en) * 2020-12-10 2021-04-02 西北工业大学 Active sharing projection surface sensing system and method for desktop cluster robot
CN112950705A (en) * 2021-03-15 2021-06-11 中原动力智能机器人有限公司 Image target filtering method and system based on positioning system
CN114061563A (en) * 2021-10-15 2022-02-18 深圳优地科技有限公司 Method and device for judging reasonability of target point, terminal equipment and storage medium
CN114061563B (en) * 2021-10-15 2024-04-05 深圳优地科技有限公司 Target point rationality judging method, device, terminal equipment and storage medium
CN114199251A (en) * 2021-12-03 2022-03-18 江苏集萃智能制造技术研究所有限公司 Anti-collision positioning method for robot
CN114199251B (en) * 2021-12-03 2023-09-15 江苏集萃智能制造技术研究所有限公司 Anti-collision positioning method for robot
CN114485662A (en) * 2021-12-28 2022-05-13 深圳优地科技有限公司 Robot repositioning method and device, robot and storage medium
CN114485662B (en) * 2021-12-28 2024-03-08 深圳优地科技有限公司 Robot repositioning method, device, robot and storage medium
CN114434496A (en) * 2022-01-19 2022-05-06 山东新一代信息产业技术研究院有限公司 Assistant robot for detecting performance of robot
CN116358531A (en) * 2023-06-01 2023-06-30 佛山云曼健康科技有限公司 Map construction method, device, robot and storage medium
CN116358531B (en) * 2023-06-01 2023-09-01 佛山市星曼信息科技有限公司 Map construction method, device, robot and storage medium

Also Published As

Publication number Publication date
CN106643701B (en) 2019-05-14

Similar Documents

Publication Publication Date Title
CN106643701A (en) Robot inter-detection method and robot inter-detection device
US11320834B2 (en) Methods and systems for mapping, localization, navigation and control and mobile robot
WO2020134082A1 (en) Path planning method and apparatus, and mobile device
CN104330090B (en) Robot distributed sign intelligent semantic map creating method
EP3672762B1 (en) Self-propelled robot path planning method, self-propelled robot and storage medium
CN108827249A (en) A kind of map constructing method and device
CN107917712A (en) A kind of synchronous superposition method and apparatus
US20220161430A1 (en) Recharging Control Method of Desktop Robot
CN106092104A (en) The method for relocating of a kind of Indoor Robot and device
CN105856243A (en) Movable intelligent robot
CN106959691A (en) Mobile electronic equipment and immediately positioning and map constructing method
CN111121753A (en) Robot joint graph building method and device and computer readable storage medium
CN107644273A (en) A kind of navigation path planning method and equipment
CN109084749B (en) Method and device for semantic positioning through objects in environment
CN109425348A (en) A kind of while positioning and the method and apparatus for building figure
WO2021036587A1 (en) Positioning method and system for electric power patrol scenario
CN111679661A (en) Semantic map construction method based on depth camera and sweeping robot
CN108022265A (en) Infrared camera pose determines method, equipment and system
WO2018076777A1 (en) Robot positioning method and device, and robot
JP2016149090A (en) Autonomous mobile device, autonomous mobile system, autonomous mobile method and program
CN112034830A (en) Map information processing method and device and mobile device
CN111521971B (en) Robot positioning method and system
CN111679664A (en) Three-dimensional map construction method based on depth camera and sweeping robot
CN113447014A (en) Indoor mobile robot, mapping method, positioning method, and mapping positioning device
JP7055395B2 (en) Information processing system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant