CN109725327A - A method and system for multi-robot map building - Google Patents
- Publication number: CN109725327A
- Application number: CN201910172775.XA
- Authority
- CN
- China
- Prior art keywords
- map
- robot
- QR code
- multi-robot
- occupancy grid
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The present disclosure proposes a method and system for multi-robot map building, comprising: each robot scans its environment with a lidar and uploads the data to a local server, which generates that robot's occupancy grid map; each robot also identifies QR codes in the environment with a visual sensor, obtains each code's unique ID and the robot's pose relative to the code, and uploads them to the local server; the local server processes the occupancy grid maps built by all robots in the group together with the QR-code information in each map, computes the constraint information of the QR codes contained in each robot's map, uses those constraints to find the overlapping regions of different maps and fuse them, and, once fusion is complete, uploads the final global map to a cloud server. The scheme achieves global map building in large scenes, supports adding more robots, and yields a flexible, highly robust system.
Description
Technical field
The present disclosure relates to the field of navigation technology, and in particular to a method and system for multi-robot map building.
Background technique
In the robotics field, autonomously exploring unknown regions and constructing an accurate environmental map is a fundamental task and the key prerequisite for a mobile robot to later perform functions such as autonomous navigation. Being able to quickly and accurately build a map of an unfamiliar environment reflects a robot's perception and algorithmic processing capabilities, and determines both its degree of intelligence and the robustness of its navigation. In most mapping pipelines, the environment is observed through the sensors the robot carries, and sensors with different measurement principles have entirely different observation models: laser rangefinders such as lidar are modeled via light-reflection characteristics, ultrasonic sensors via the propagation and reflection of sound waves, and cameras via projective geometry.
With decades of technological progress, in particular improvements in the accuracy and update rate of on-board environment-sensing sensors such as lidar and depth cameras, together with the continual refinement of mapping strategies and algorithms, single-robot mapping over small areas has become mature and stable.
When facing a large-scale environment, however, having a single robot build the global map in one pass is often limited by factors such as the insufficient processing power of a single robot's controller and accumulated error. Research on multi-robot cooperation has therefore been carried out widely: researchers hope that by modeling the environment through multi-robot cooperation, higher efficiency and better map accuracy and robustness can be achieved.
In research on cooperative multi-robot mapping, the first problem to solve is the network architecture and data communication scheme among the robots. System-wide robot network architectures fall broadly into centralized and distributed structures. In a traditional centralized structure, the sensor information of all robots is managed centrally: data are transferred over a communication medium such as a wireless network to a central processor for centralized processing and fusion. This structure places a heavy load on the network, and because multi-robot cooperation has strict real-time requirements, any data loss causes timing disorder that is difficult to correct immediately. In a distributed structure, each robot first processes its own local sensor data and then shares the processed information with the other robots over a network medium such as a wireless network, so that each robot maintains both its own environmental data and the global environmental data. The distributed structure puts less pressure on the communication network and offers more flexibility and stronger robustness.
Second, in the concrete implementation of multi-robot cooperation, research focuses on map fusion. Within the overall multi-robot mapping pipeline, single-robot mapping itself is very mature: open-source SLAM algorithms such as Gmapping and Hector can build an occupancy grid map of a two-dimensional environment. In large scenes, however, single-robot mapping suffers from low efficiency and insufficient controller processing power, so multi-robot cooperative mapping is needed: the large scene is partitioned and different robots build local submaps, while ensuring that every two submaps share an overlapping region so that they can be merged later. How to fuse two maps that overlap in part but otherwise cover different regions is the crucial problem.
Currently the more mainstream fusion approaches fall into two categories. One is purely mathematical optimization over the maps, such as map fusion based on the extended Kalman filter or on Bayesian probabilistic models. The other is based on visual feature matching: ORB features are extracted and matched to recover the transformation between submaps, which are then merged; but because visual feature extraction is affected by the environment, illumination, and so on, excessive mismatched points easily cause incorrect fusion. Moreover, existing methods for pruning mismatched points merely impose hand-tuned filtering criteria and are therefore highly contingent.
Summary of the invention
A first objective of the embodiments of this specification is to provide a system for multi-robot map building that, within a distributed structure, applies a map-fusion scheme during cooperative multi-robot mapping and extracts accurate map features. This is achieved by the following technical solution:
The embodiments of this specification provide a system for multi-robot map building, comprising: several robots divided into groups, each group of robots communicating with a corresponding local server, and the several local servers communicating with a global cloud server;
each robot scans environmental information with a lidar and uploads it to the local server, where that robot's occupancy grid map is generated;
each robot identifies QR codes in the environment with a visual sensor, obtains each code's unique ID and the robot's pose relative to the code, and uploads them to the local server;
the local server processes the occupancy grid maps built by all robots in the group together with the QR-code information in each map, computes the constraint information of the QR codes contained in each robot's map, uses those constraints to find the overlapping regions of different maps and fuses the maps, and, once fusion is complete, uploads the final global map to the cloud server.
A second objective of the embodiments of this specification is to provide a method for multi-robot map building, comprising:
scanning the environment around a robot with a lidar and generating that robot's occupancy grid map;
identifying QR codes in the robot's environment with a visual sensor and obtaining each code's unique ID and the robot's pose relative to the code;
processing the occupancy grid maps built by all robots together with the QR-code information in each map, computing the constraint information of the QR codes contained in each robot's map, using those constraints to find the overlapping regions of different maps and fuse them, and obtaining the global map once fusion is complete.
Compared with the prior art, the beneficial effects of the present disclosure are:
The disclosed technical scheme solves the problem that in multi-robot mapping the success rate of stitching two-dimensional laser SLAM maps is very low, or stitching fails altogether, and thereby improves the accuracy of repeated-scene detection in three-dimensional scene reconstruction.
By laying out QR codes in the environment and extracting their information with a visual sensor, the disclosed scheme effectively provides accurate feature information for every region of the environment. QR-code extraction is fast and precise, the code features are unaffected by the environment, and map matching based on them works all the better.
The disclosed scheme achieves global map building in large scenes, supports adding more robots, and yields a flexible, highly robust system.
As for large-scene mapping: large scenes are characterized mainly by a large physical area and relatively static environmental information, such as factory workshops or office-building floors. A single robot cannot obtain the desired result quickly and accurately because of limited computing capability, error accumulation, and similar factors. The disclosed scheme, through multi-robot cooperation, effectively divides the task, integrates the results, and completes the mapping function for large scenes.
Description of the drawings
The accompanying drawings, which form a part of this disclosure, are provided for further understanding of the disclosure; the illustrative embodiments of the disclosure and their descriptions explain the disclosure and do not constitute an improper limitation of it.
Fig. 1 is the network structure diagram of the multi-robot cooperative mapping system of an embodiment of the present disclosure;
Fig. 2(a)-Fig. 2(c) are schematic diagrams of map fusion in multi-robot mapping of an embodiment of the present disclosure (taking two submaps as an example);
Fig. 3 is the workflow diagram of the multi-robot cooperative mapping system of an embodiment of the present disclosure.
Detailed description of the embodiments
It should be noted that the following detailed description is illustrative and intended to provide further explanation of the disclosure. Unless otherwise indicated, all technical and scientific terms used herein have the same meanings as commonly understood by one of ordinary skill in the art to which this disclosure belongs.
It should be noted that the terminology used herein is merely for describing specific embodiments and is not intended to limit the illustrative embodiments of the disclosure. As used herein, unless the context clearly indicates otherwise, singular forms are intended to include plural forms as well; it should further be understood that when the terms "comprising" and/or "including" are used in this specification, they indicate the presence of features, steps, operations, devices, components, and/or combinations thereof.
Embodiment 1
Referring to Fig. 1, this embodiment discloses a system for multi-robot map building composed of a global cloud server deployed in the cloud, n local robot servers, and n robots. The global cloud server is responsible for service calls on the completed global map and for controlling the local robot servers, maintaining the operation of the whole system.
In this embodiment, each robot scans environmental information with its lidar, extracts QR-code information with its visual sensor, and uploads both to its local robot server after preliminary processing.
The local robot server establishes space-constraint vectors for the QR codes extracted in each submap and solves for the optimal rotation and translation transformation between different submaps.
Coordinates are converted between submaps according to the optimal solution, and map fusion is completed by least squares.
The fused map is passed to the global cloud for service calls.
In a specific embodiment, the local robot servers communicate with the global cloud server over the Internet and with the robots over wireless links. Because the computing power of the processor carried on the robot body is limited, fusing maps of a large scene on the robot itself would be very inefficient; the robots are therefore responsible only for building local maps, and the relevant information is sent to the local robot server for unified processing and fusion. This embodiment is an improved distributed structure: the robots explore in a distributed fashion, while the servers fuse the maps centrally.
In a specific embodiment, the system has multiple robots equipped with lidar and visual sensors. A Raspberry Pi serves as the low-cost processor of the robot body: the lidar acquires environmental information to generate the local occupancy grid submap, and the visual sensor extracts and records the QR codes posted in the environment. A WiFi module handles data transmission between the Raspberry Pi and the local robot server.
Each robot scans environmental information with its laser sensor and uploads it over the wireless network to the local robot server, where that robot's occupancy grid map is generated. Meanwhile, the robot identifies QR codes in the environment with its visual sensor and obtains each code's unique ID and the robot's pose relative to the code. The QR-code IDs and pose information within the same map are uploaded to the local robot server and stored classified by robot serial number. Storage by classification makes subsequent processing and retrieval of the data convenient.
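The classified storage described above can be sketched as a simple in-memory store keyed by robot serial number. The record layout and the names below (`TagObservation`, `upload`) are illustrative assumptions; the patent does not specify a data format.

```python
from dataclasses import dataclass

@dataclass
class TagObservation:
    """One QR-code observation, expressed in the observing robot's map frame."""
    tag_id: int          # unique ID encoded in the QR code
    x: float             # code pose in this robot's map frame
    y: float
    theta: float         # heading, radians

# The server groups ("classifies") observations by robot serial number,
# so later processing can pull out all codes seen in one robot's map at once.
store: dict[str, list[TagObservation]] = {}

def upload(robot_serial: str, obs: TagObservation) -> None:
    store.setdefault(robot_serial, []).append(obs)

upload("robot-1", TagObservation(16, 1.0, 2.0, 0.1))
upload("robot-1", TagObservation(32, 4.0, 6.0, 0.2))
upload("robot-2", TagObservation(16, -3.0, 0.5, 1.4))
```

With this grouping, the fusion stage can iterate over `store` one map at a time and look up which maps share a given code ID.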
The local robot server now holds the maps built by the multiple robots (map 1, map 2, ..., map n in Fig. 1) and the QR-code information in each map. The next step is to process the collected information: finding the overlapping regions of the different maps and unifying coordinates before map fusion. For two submaps, accurately finding what they have in common, i.e. their overlap, and computing the affine transformation between them is crucial to the later map fusion.
In a specific embodiment of the disclosure, for each independent occupancy grid map, the pairwise space-vector constraints between the QR codes the robot detected are computed. On this premise, the constraint information of the QR codes contained in each robot's map can be calculated. Since the position of a QR code in the real environment is fixed, once a code is detected and its ID is known, it is that one unique code no matter from which angle, or by which robot, it is observed. This accurate feature information makes matching two different maps more precise and convenient.
After map fusion is complete, the final global map is uploaded to the cloud server, where it can be called by a variety of robot applications and provide support for more intelligent robot technology.
The multi-robot cooperative map-building system proposed by the embodiments of the present disclosure works in large scenes and has a multi-level cloud-based network structure that balances real-time performance and scalability. It applies a map-fusion scheme during cooperative mapping and extracts accurate map features, helping to provide more reliable constraint relations when establishing the global map. In addition, the embodiments solve the problems of insufficient single-robot computing capability and the loose coupling of traditional multi-robot cooperation, and fully support cloud service scheduling for the robots.
The system of the disclosed embodiments has a large number of accurate environmental marker units, namely QR-code labels laid out in the environment in advance, providing an accurate feature representation of the environment. Each code has a unique ID, so each code represents one positional feature of the real environment. The visual sensor detects and reads the QR-code information, which after a preliminary coordinate conversion is transmitted to the local robot server for inter-code vector-constraint matching and conversion, providing accurate and stable reference features for submap fusion. The key to multi-robot mapping in this system is matching and aligning the overlapping regions of submaps expressed in different coordinate systems. Establishing constraint relations between multiple QR codes guarantees matching accuracy, and the whole map-fusion process is not a single action but continuous: as more of the same QR codes are detected, more corresponding constraint relations are established, and the code matching becomes more and more accurate.
In a specific implementation, the requirements for laying out the QR codes are low: they may be placed on the floor or walls of the environment to be mapped, with the floor being the best option. The positions and orientations of the codes relative to one another may be arbitrary; there is no fixed requirement.
A further major feature of the system is its scalability: as more robots enlarge the detection area, additional QR-code labels can be added accordingly, adapting to changes in the environment and maintaining a good representation of its features, thereby securing the subsequent map matching and fusion.
Embodiment 2
Referring to Fig. 3, this embodiment discloses a method for multi-robot map building. The method can be based on the system of Embodiment 1, but is not limited to that system; for example, all robots could communicate with a single server that processes the data centrally.
In a specific implementation of the method, each robot scans environmental information with its laser sensor, processes it, and generates its occupancy grid map; the robot identifies QR codes in the environment with its visual sensor and obtains each code's unique ID and the robot's pose relative to the code.
Since the robot knows its own pose in its map coordinate system as it moves, P_robot = (x_r, y_r, θ_r), it can finally obtain the pose of each single QR code in that map coordinate system, P_code(n) = (x_n, y_n, θ_n), where n is the code's ID. Because the maps built by different robots use different coordinate systems, even when two robots detect the same two QR codes, the pose information they compute differs. The QR-code IDs and pose information within the same map are therefore uploaded to the server and stored classified by robot serial number.
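Obtaining P_code(n) from the robot's map-frame pose and the observed code-relative pose is a standard SE(2) pose composition. The patent does not spell this step out; the sketch below illustrates it under that assumption, and the function name and example numbers are illustrative.

```python
import math

def compose_se2(robot_pose, rel_pose):
    """Compose the robot's map-frame pose (x_r, y_r, theta_r) with a QR-code
    pose observed relative to the robot, giving the code's map-frame pose."""
    xr, yr, tr = robot_pose
    xc, yc, tc = rel_pose
    c, s = math.cos(tr), math.sin(tr)
    return (xr + c * xc - s * yc,      # rotate the relative offset into the
            yr + s * xc + c * yc,      # map frame, then translate
            (tr + tc) % (2 * math.pi)) # headings simply add

# Robot at (2, 1) facing +90 degrees; code seen 3 m straight ahead of it:
tag = compose_se2((2.0, 1.0, math.pi / 2), (3.0, 0.0, 0.0))
```

Here `tag` lands near (2, 4) in the map frame: the forward offset of 3 m is rotated by the robot's 90-degree heading before being added to its position.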
In a specific embodiment, the server holds the maps built by the multiple robots (map 1, map 2, ..., map n in Fig. 1) and the QR-code information in each map; the next step is to process the collected information, finding the overlapping regions of the different maps and unifying coordinates before fusion.
For each independent occupancy grid map, the pairwise space-vector constraints between the QR codes the robot detected are computed. For example, given P_code(16) = (x_16, y_16, θ_16) and P_code(32) = (x_32, y_32, θ_32), the space-vector constraint between these two codes is quickly computed as V_(16,32) = P_code(32) - P_code(16) = (x_32 - x_16, y_32 - y_16, θ_32 - θ_16). Establishing a constraint for every pair of codes in this way, the n QR codes detected in the same map yield at most n(n-1)/2 constraints.
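The pairwise constraint construction can be sketched directly. Assuming the constraint vector is the componentwise pose difference (as reconstructed above), n codes in one map give n(n-1)/2 constraints; the function name below is an illustrative assumption.

```python
from itertools import combinations

def constraint_vectors(tags):
    """tags: {tag_id: (x, y, theta)} in one map frame.
    Returns the constraint vector V[(i, j)] = P_code(j) - P_code(i)
    for every unordered pair: n*(n-1)//2 constraints for n codes."""
    out = {}
    for i, j in combinations(sorted(tags), 2):
        xi, yi, ti = tags[i]
        xj, yj, tj = tags[j]
        out[(i, j)] = (xj - xi, yj - yi, tj - ti)
    return out

tags = {16: (1.0, 2.0, 0.1), 32: (4.0, 6.0, 0.2), 48: (0.0, 0.0, 0.0)}
V = constraint_vectors(tags)
```

For the three codes above, `V` holds 3 * (3 - 1) / 2 = 3 constraints, and `V[(16, 32)]` is the offset (3.0, 4.0) plus the heading difference.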
On this premise, the constraint information of the QR codes contained in each robot's map can be calculated. Since the position of a QR code in the real environment is fixed, once a code is detected and its ID is known, it is that one unique code no matter from which angle, or by which robot, it is observed. This accurate feature information makes matching two different maps more precise and convenient.
Referring to Fig. 2(a)-Fig. 2(c), take the fusion of two partially overlapping maps built independently by two robots as an example. The matching and transformation-solving process is as follows. In map 1, built by robot 1, two QR codes with IDs 16 and 32 are detected, and the constraint vector V¹_(16,32) is established; in map 2, built by robot 2, the same two codes 16 and 32 are detected, with constraint vector V²_(16,32). In the real environment, although the values of V¹_(16,32) and V²_(16,32) computed in the two submaps differ, the actual spatial relation of the two codes is unchanged. It follows that the two maps certainly overlap, map matching can be performed, and a transformation between the two maps can be established: P¹ = transform(P²), where transform is the matrix expressing the transformation between the two submaps, comprising a rotation matrix R, a translation vector T, and a scale variable s, i.e. transform(P) = s·R·P + T.
In theory, a single pair of corresponding vectors suffices to solve for R, s, and T in the formula above. In the detection process, however, while the camera detects and identifies QR-code IDs extremely accurately, the pose coordinates it returns deviate because of unavoidable error in the camera's precision. Therefore, to minimize the influence of error, the server keeps searching for and comparing more corresponding QR-code constraint vectors between the two submaps (for instance, if the code with ID 48 is also detected in maps 1 and 2, the constraints V_(16,48) and V_(32,48) can be established). The more corresponding constraint vectors are found, the more accurately the overlapping regions of the two maps dock: matching becomes polygon matching, with a lower mismatch rate than traditional point-to-point matching. On this basis, an overdetermined system of equations is set up to solve for the transformation transform between the two maps; stacking one equation per correspondence gives, for k correspondences, P¹_i = s·R·P²_i + T, i = 1, ..., k.
In this system, the number of equations exceeds the number of variables, so no exact solution exists, but an optimal solution minimizing the error can be found. The optimal solution is obtained by setting up the normal equations and solving them by least squares. Once the optimal solution is obtained, the maps can be coordinate-transformed into the same coordinate system and then fused. The embodiments of the disclosure are applicable to building global maps of large-area environments and to SLAM navigation.
The optimal solution is just the optimal R, s, T parameters. With these three parameters, the map coordinate information of one map is converted to be consistent with the other, which amounts to obtaining two overlapping submaps in the same coordinate system; the fusion and stitching of the maps then follows.
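The patent refers the least-squares solution to the normal equations without spelling it out. For a 2-D similarity transform (R, s, T) a closed-form least-squares solution over corresponding points exists; the sketch below is one Umeyama-style implementation and is an illustration under that assumption, not the patent's exact procedure.

```python
import math

def fit_similarity(src, dst):
    """Least-squares fit of dst ~= s * R(theta) * src + T from corresponding
    2-D points: an overdetermined system once there are >= 2 pairs.
    Returns (s, theta, (tx, ty)) via a closed-form Umeyama-style solution."""
    n = len(src)
    cx_s = sum(p[0] for p in src) / n
    cy_s = sum(p[1] for p in src) / n
    cx_d = sum(p[0] for p in dst) / n
    cy_d = sum(p[1] for p in dst) / n
    a = b = denom = 0.0
    for (xs, ys), (xd, yd) in zip(src, dst):
        xs -= cx_s; ys -= cy_s; xd -= cx_d; yd -= cy_d   # center both sets
        a += xs * xd + ys * yd        # dot products  -> cosine component
        b += xs * yd - ys * xd        # cross products -> sine component
        denom += xs * xs + ys * ys
    theta = math.atan2(b, a)
    s = math.hypot(a, b) / denom
    c, sn = math.cos(theta), math.sin(theta)
    tx = cx_d - s * (c * cx_s - sn * cy_s)   # T maps the src centroid
    ty = cy_d - s * (sn * cx_s + c * cy_s)   # onto the dst centroid
    return s, theta, (tx, ty)

# Synthetic check: dst is src rotated 90 degrees, scaled 2x, shifted (1, -1)
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 2.0)]
dst = [(2 * (-y) + 1, 2 * x - 1) for x, y in src]
s, theta, (tx, ty) = fit_similarity(src, dst)
```

With four exact correspondences the fit recovers s = 2, theta = π/2, T = (1, -1); with noisy correspondences the same formula returns the least-squares optimum.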
The overdetermined-equation method offers the best balance between complexity and accuracy of result. The emphasis of the disclosed embodiments is on establishing and using the constraints; the concrete transformation relation can be computed in several ways:
1. Direct mathematical computation: a system of equations is built from the three unknowns to be solved and any three constraints, and solved. This method is exact only for the chosen constraints; for other constraints incorporated later, the result is not optimal.
2. Selective iterative computation: an initial transformation matrix (containing R, s, T) is computed from one constraint. Using the initial transformation matrix, the points of the map to be translated and rotated are transformed, and the distance between each transformed point's coordinates and the corresponding point's coordinates on the other map is computed; the minimum of these distances is found. If the minimum is below a given threshold, the transformation matrix is optimized using the point pair achieving that minimum distance; if the minimum distance exceeds the threshold, the iteration terminates.
3. Computing a dissimilarity function: the optimal transformation matrix is solved for by minimizing a dissimilarity function, yielding the largest similar portion of the two maps. Concretely, this is solved with an improved differential evolution algorithm.
In this embodiment, as before, for each independent occupancy grid map the pairwise space-vector constraints between the detected QR codes are computed; on this premise, the constraint information of the codes contained in each robot's map can be calculated. Since a QR code's position in the real environment is fixed, once a code is detected and its ID is known, it is that one unique code regardless of viewing angle or observing robot, and this accurate feature information makes matching two different maps more precise and convenient.
Embodiment 3
This embodiment discloses a navigation system comprising a server that communicates with multiple robots. Each robot builds its own submap, and the server is configured to execute the above method for multi-robot map building, building the global map and completing the navigation function. For the details of the method, refer to Embodiment 2; they are not repeated here.
It is to be understood that in this specification, descriptions referring to the terms "an embodiment", "another embodiment", "other embodiments", or "the first through Nth embodiments" mean that a specific feature, structure, material, or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of the disclosure. In this specification, schematic uses of these terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The foregoing are merely preferred embodiments of the present disclosure and are not intended to limit it; for those skilled in the art, the disclosure may have various modifications and variations. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the disclosure shall be included within its scope of protection.
Claims (10)
1. A system for multi-robot map building, characterized by comprising: several robots divided into groups, each group of robots communicating with a corresponding local server, and the several local servers communicating with a global cloud server;
each robot scanning environmental information with a lidar and uploading it to the local server, where that robot's occupancy grid map is generated;
each robot identifying QR codes in the environment with a visual sensor, obtaining each code's unique ID and the robot's pose relative to the code, and uploading them to the local server;
the local server processing the occupancy grid maps built by all robots in the group together with the QR-code information in each map, computing the constraint information of the QR codes contained in each robot's map, using the constraint information of the QR codes to find the overlapping regions of different maps and perform map fusion, and, once fusion is complete, uploading the final global map to the cloud server.
2. The system for multi-robot map building of claim 1, characterized in that the local server establishes space-constraint vectors for the QR codes extracted from each occupancy grid map, solves for the optimal rotation and translation transformation between different occupancy grid maps, converts coordinates between the maps according to the optimal solution, and completes map fusion by least squares.
3. The system for multi-robot map building of claim 1, characterized in that the local server stores the QR-code IDs and pose information within the same map classified by robot serial number.
4. The system for multi-robot map building of claim 1, characterized in that the environmental marker units providing QR-code information in the environment are QR-code labels laid out in advance, providing an accurate feature representation of the environment; each label has a unique ID, and each code represents one positional feature of the real environment.
5. A method of multi-machine map building, characterized in that it comprises:
scanning the environment around the robot with a laser radar and generating that robot's grid map;
identifying the two-dimensional codes in the robot's environment with a visual sensor, and obtaining the unique ID of each two-dimensional code and the robot's pose relative to it;
processing the grid maps built by all robots together with the two-dimensional-code information in each map, calculating the constraint information of the two-dimensional codes contained in each robot's map, using the constraint information to find the overlapping regions of the different maps and fusing the maps, and obtaining the global map once the fusion is complete.
6. The method of multi-machine map building of claim 5, characterized in that the overlapping region of different maps is found by matching two grid maps; when the two grid maps are matched, a transformation relation between them is established, in which the homography matrix expressing the transformation between the two sub-maps comprises a rotation matrix, a translation vector, and a scale variable.
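The homography of claim 6, composed of a rotation matrix, a translation vector, and a scale variable, can be written out explicitly (an illustrative sketch):

```python
import numpy as np

def similarity_homography(theta, tx, ty, s):
    """3x3 homography between two grid sub-maps, built from a rotation
    angle theta, translation (tx, ty) and a scale variable s.
    Maps homogeneous points as p' = H @ p."""
    c, sn = np.cos(theta), np.sin(theta)
    return np.array([[s * c, -s * sn, tx],
                     [s * sn,  s * c, ty],
                     [0.0,     0.0,  1.0]])

H = similarity_homography(np.pi / 2, 1.0, 2.0, 1.0)
p = np.array([1.0, 0.0, 1.0])   # homogeneous point (1, 0)
# Rotating (1, 0) by 90 degrees gives (0, 1); translating by (1, 2) gives (1, 3).
```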
7. The method of multi-machine map building of claim 6, characterized in that the matching of two grid maps is performed in a polygon-matching manner, i.e. by continually searching and comparing multiple two-dimensional-code constraint vectors between the two grid maps.
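One way to realize the polygon matching of claim 7 is to compare rotation- and translation-invariant side lengths of tag triangles across the two maps (an illustrative sketch under that assumption; the patent does not fix the exact search procedure):

```python
import itertools
import numpy as np

def match_by_polygon(tags_a, tags_b, tol=0.05):
    """Tag triangles whose pairwise distances agree (a rotation- and
    translation-invariant signature) are candidate correspondences
    between two grid maps. tags_a/tags_b: {tag_id: (x, y)} per map."""
    def side_lengths(pts):
        a, b, c = pts
        return sorted([np.hypot(*np.subtract(a, b)),
                       np.hypot(*np.subtract(b, c)),
                       np.hypot(*np.subtract(c, a))])
    matches = []
    for ids_a in itertools.combinations(tags_a, 3):
        la = side_lengths([tags_a[i] for i in ids_a])
        for ids_b in itertools.combinations(tags_b, 3):
            lb = side_lengths([tags_b[i] for i in ids_b])
            if np.allclose(la, lb, atol=tol):
                matches.append((ids_a, ids_b))
    return matches
```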
8. The method of multi-machine map building of claim 6, characterized in that an overdetermined system of equations is established to solve for the homography matrix of the transformation between the two maps, and the optimal solution of the overdetermined system is obtained by the least-squares method, by forming and solving the normal equations.
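Forming and solving the normal equations for a 2D similarity transform, using the linear parametrization a = s·cos θ, b = s·sin θ, can be sketched as follows (illustrative Python; the exact parametrization is an assumption):

```python
import numpy as np

def fit_similarity_normal_equations(p_src, p_dst):
    """Solve the overdetermined system for a 2D similarity transform
    (scaled rotation a, b and translation tx, ty) via the normal
    equations A^T A x = A^T y. Linear model per matched point:
        x' = a*x - b*y + tx,   y' = b*x + a*y + ty
    """
    rows, rhs = [], []
    for (x, y), (xp, yp) in zip(p_src, p_dst):
        rows.append([x, -y, 1.0, 0.0]); rhs.append(xp)
        rows.append([y,  x, 0.0, 1.0]); rhs.append(yp)
    A, b = np.array(rows), np.array(rhs)
    a_, b_, tx, ty = np.linalg.solve(A.T @ A, A.T @ b)
    return a_, b_, tx, ty
```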
9. The method of multi-machine map building of claim 8, characterized in that, after the optimal solution is obtained, the grid maps are transformed into a common coordinate system and then fused.
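The final step of claim 9, transforming one grid map into the common frame and fusing it with another, might look like this (an illustrative sketch; the cell-merging rule, here a logical OR of occupancy, is an assumption):

```python
import numpy as np

def fuse_grids(grid_a, grid_b, R, t, resolution=1.0):
    """Transform the occupied cells of grid_b into grid_a's frame using
    the optimal (R, t), then fuse by marking the mapped cells occupied.
    grid_a, grid_b: 2D uint8 occupancy arrays (1 = occupied)."""
    fused = grid_a.copy()
    ys, xs = np.nonzero(grid_b)
    pts = np.stack([xs, ys], axis=1) * resolution   # cell index -> metric
    pts = (R @ pts.T).T + t                         # into frame A
    cells = np.round(pts / resolution).astype(int)
    for cx, cy in cells:
        if 0 <= cy < fused.shape[0] and 0 <= cx < fused.shape[1]:
            fused[cy, cx] = 1
    return fused
```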
10. A navigation system, characterized in that it comprises a server in communication with multiple robots, the server being configured to execute the method of multi-machine map building of any one of claims 5-9 so as to build the global map.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910172775.XA CN109725327B (en) | 2019-03-07 | 2019-03-07 | Method and system for building map by multiple machines |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109725327A true CN109725327A (en) | 2019-05-07 |
CN109725327B CN109725327B (en) | 2020-08-04 |
Family
ID=66302162
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910172775.XA Active CN109725327B (en) | 2019-03-07 | 2019-03-07 | Method and system for building map by multiple machines |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109725327B (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101504761A (en) * | 2009-01-21 | 2009-08-12 | 北京中星微电子有限公司 | Image splicing method and apparatus |
KR20100104629A (en) * | 2009-03-18 | 2010-09-29 | 경기대학교 산학협력단 | Method and system for recommending point |
CN103165025A (en) * | 2013-03-08 | 2013-06-19 | 郑奕斌 | Two-dimensional-code guidance method, device and carrier |
CN103278170A (en) * | 2013-05-16 | 2013-09-04 | 东南大学 | Cascaded map-building method for mobile robots based on salient scene-spot detection |
CN104143074A (en) * | 2013-05-07 | 2014-11-12 | 李东舸 | Method and equipment for generating motion feature codes on the basis of motion feature information |
CN106097443A (en) * | 2016-05-30 | 2016-11-09 | 南京林业大学 | Integrated indoor-outdoor three-dimensional urban scene construction and spatially adaptive navigation method |
CN106272423A (en) * | 2016-08-31 | 2017-01-04 | 哈尔滨工业大学深圳研究生院 | Method for collaborative mapping and localization of multiple robots in large-scale environments |
CN108227717A (en) * | 2018-01-30 | 2018-06-29 | 中国人民解放军陆军装甲兵学院 | Multi-mobile-robot map fusion method and fusion platform based on ORB features |
CN108897836A (en) * | 2018-06-25 | 2018-11-27 | 广州视源电子科技股份有限公司 | Method and device for robot mapping based on semantics |
CN109211251A (en) * | 2018-09-21 | 2019-01-15 | 北京理工大学 | Simultaneous localization and mapping method based on fusion of laser and two-dimensional codes |
Non-Patent Citations (5)
Title |
---|
HONGYE LI: "Multi-robot SLAM and Map Merging" * |
MONICA BALLESTA et al.: "Map Fusion in an Independent Multi-robot Approach", WSEAS Transactions on Systems * |
LI QIAN et al.: "Binarized SIFT feature descriptors and image-stitching optimization", Journal of Image and Graphics * |
PAN WEI et al.: "A method for multi-robot collaborative map building in complex environments", Journal of Sichuan University (Engineering Science Edition) * |
WANG DONGGE et al.: "Design and implementation of an indoor map service system based on positioning two-dimensional codes", Journal of Geomatics * |
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110376934A (en) * | 2019-06-12 | 2019-10-25 | 深圳飞科机器人有限公司 | Clean robot, clean robot control method and terminal control method |
CN110986969A (en) * | 2019-11-27 | 2020-04-10 | Oppo广东移动通信有限公司 | Map fusion method and device, equipment and storage medium |
CN111123926A (en) * | 2019-12-20 | 2020-05-08 | 上海点甜农业专业合作社 | Method for building automatic navigation scene based on two-dimensional code road sign |
CN111090285A (en) * | 2019-12-24 | 2020-05-01 | 山东华尚电气有限公司 | Navigation robot control system and navigation information management method |
CN111090285B (en) * | 2019-12-24 | 2023-04-25 | 山东华尚电气有限公司 | Navigation robot control system and navigation information management method |
WO2021135813A1 (en) * | 2019-12-30 | 2021-07-08 | 炬星科技(深圳)有限公司 | Robot joint mapping method and device, and computer-readable storage medium |
CN111089585A (en) * | 2019-12-30 | 2020-05-01 | 哈尔滨理工大学 | Mapping and positioning method based on sensor information fusion |
CN111121753A (en) * | 2019-12-30 | 2020-05-08 | 炬星科技(深圳)有限公司 | Robot joint graph building method and device and computer readable storage medium |
CN111369640A (en) * | 2020-02-28 | 2020-07-03 | 广州高新兴机器人有限公司 | Multi-robot graph establishing method and system, computer storage medium and electronic equipment |
CN111369640B (en) * | 2020-02-28 | 2024-03-26 | 广州高新兴机器人有限公司 | Multi-robot mapping method, system, computer storage medium and electronic equipment |
CN111352425A (en) * | 2020-03-16 | 2020-06-30 | 北京猎户星空科技有限公司 | Navigation system, method, device, electronic equipment and medium |
CN111352425B (en) * | 2020-03-16 | 2024-02-09 | 北京猎户星空科技有限公司 | Navigation system, method, device, electronic equipment and medium |
CN111443713B (en) * | 2020-04-14 | 2023-07-18 | 三一机器人科技有限公司 | Fusion positioning navigation system and method |
CN111443713A (en) * | 2020-04-14 | 2020-07-24 | 三一机器人科技有限公司 | Fusion positioning navigation system and method |
CN111813102B (en) * | 2020-06-06 | 2023-11-21 | 浙江中力机械股份有限公司 | Distributed autonomous robot environment map construction method |
CN111813102A (en) * | 2020-06-06 | 2020-10-23 | 浙江中力机械有限公司 | Distributed autonomous robot environment map construction method |
CN112527929A (en) * | 2020-10-20 | 2021-03-19 | 深圳市银星智能科技股份有限公司 | Grid map coding method and device and electronic equipment |
CN112527929B (en) * | 2020-10-20 | 2023-12-08 | 深圳银星智能集团股份有限公司 | Grid map coding method and device and electronic equipment |
CN112685527A (en) * | 2020-12-31 | 2021-04-20 | 北京迈格威科技有限公司 | Method, device and electronic system for establishing map |
CN112685527B (en) * | 2020-12-31 | 2024-09-17 | 北京迈格威科技有限公司 | Method, device and electronic system for establishing map |
CN113074737A (en) * | 2021-03-25 | 2021-07-06 | 大连理工大学 | Multi-robot distributed collaborative vision mapping method based on scene identification |
CN113074737B (en) * | 2021-03-25 | 2023-12-29 | 大连理工大学 | Multi-robot distributed collaborative vision mapping method based on scene identification |
CN113108798A (en) * | 2021-04-21 | 2021-07-13 | 浙江中烟工业有限责任公司 | Multi-storage robot indoor map positioning system based on laser radar |
CN115139300A (en) * | 2022-07-01 | 2022-10-04 | 北京盈迪曼德科技有限公司 | Cloud server, robot, multi-machine management system and multi-machine management method |
CN115965673A (en) * | 2022-11-23 | 2023-04-14 | 中国建筑一局(集团)有限公司 | Centralized multi-robot positioning method based on binocular vision |
CN115965673B (en) * | 2022-11-23 | 2023-09-12 | 中国建筑一局(集团)有限公司 | Centralized multi-robot positioning method based on binocular vision |
Also Published As
Publication number | Publication date |
---|---|
CN109725327B (en) | 2020-08-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109725327A (en) | A kind of method and system of multimachine building map | |
JP6896077B2 (en) | Vehicle automatic parking system and method | |
CN104330090B (en) | Robot distributed sign intelligent semantic map creating method | |
CN101413806B (en) | Mobile robot grating map creating method of real-time data fusion | |
Suveg et al. | Reconstruction of 3D building models from aerial images and maps | |
CN103268729B (en) | Based on mobile robot's tandem type map creating method of composite character | |
CN108230247B (en) | Generation method, device, equipment and the computer-readable storage medium of three-dimensional map based on cloud | |
CN110070139A (en) | Small sample towards automatic Pilot environment sensing is in ring learning system and method | |
Yu et al. | An autonomous restaurant service robot with high positioning accuracy | |
Borrmann et al. | The project thermalmapper–thermal 3d mapping of indoor environments for saving energy | |
CN108921947A (en) | Generate method, apparatus, equipment, storage medium and the acquisition entity of electronic map | |
CN109144072A (en) | A kind of intelligent robot barrier-avoiding method based on three-dimensional laser | |
US20230236280A1 (en) | Method and system for positioning indoor autonomous mobile robot | |
Aider et al. | A model-based method for indoor mobile robot localization using monocular vision and straight-line correspondences | |
CN103247040A (en) | Layered topological structure based map splicing method for multi-robot system | |
Raza et al. | Comparing and evaluating indoor positioning techniques | |
Tsardoulias et al. | Critical rays scan match SLAM | |
Wang et al. | Visual semantic navigation based on deep learning for indoor mobile robots | |
Skrzypczyński | Mobile robot localization: Where we are and what are the challenges? | |
CN109313822A (en) | Virtual wall construction method and device, map constructing method, mobile electronic equipment based on machine vision | |
CN117980761A (en) | Sensor array, system and method for constructing magnetic map, and system and method for localizing mobile device based on magnetic map | |
Jiafa et al. | Target distance measurement method using monocular vision | |
Feng et al. | Floorplannet: Learning topometric floorplan matching for robot localization | |
Xie et al. | Autonomous Multi-robot Navigation and Cooperative Mapping in Partially Unknown Environments | |
Zhao et al. | Scalable building height estimation from street scene images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||