CN109074638B - Fusion graph building method, related device and computer readable storage medium - Google Patents


Info

Publication number
CN109074638B
CN109074638B
Authority
CN
China
Prior art keywords
map
corrected
sensor data
direction sensor
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201880001190.5A
Other languages
Chinese (zh)
Other versions
CN109074638A (en)
Inventor
王超鹏
林义闽
廉士国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cloudminds Shanghai Robotics Co Ltd
Original Assignee
Cloudminds Shenzhen Robotics Systems Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cloudminds Shenzhen Robotics Systems Co Ltd filed Critical Cloudminds Shenzhen Robotics Systems Co Ltd
Publication of CN109074638A publication Critical patent/CN109074638A/en
Application granted granted Critical
Publication of CN109074638B publication Critical patent/CN109074638B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformation in the plane of the image
    • G06T 3/60 Rotation of a whole image or part thereof
    • G06T 3/608 Skewing or deskewing, e.g. by two-pass or three-pass rotation

Abstract

The present application relates to the technical field of computer vision and discloses a fusion mapping method, a related apparatus, and a computer-readable storage medium. The method is applied to a terminal or to the cloud and includes: acquiring image data under different conditions; building a map for each condition from the corresponding image data; determining a reference map and a correction map; correcting the correction map according to the reference map; and determining a final output map according to the reference map and the corrected correction map. Because maps built from image data acquired under different conditions are corrected against a common reference map, the final output map matches the actual scene more closely, enabling accurate positioning and navigation.

Description

Fusion graph building method, related device and computer readable storage medium
Technical Field
The present application relates to the field of computer vision technologies, and in particular, to a fusion mapping method, a related apparatus, and a computer-readable storage medium.
Background
In simultaneous localization and mapping (SLAM), an intelligent device such as a robot starts from an unknown position in an unknown environment, localizes itself during movement according to position estimates and the map, and simultaneously builds an incremental map on the basis of that self-localization, thereby achieving autonomous positioning and navigation. Collecting image data with a camera for mapping and positioning, that is, visual simultaneous localization and mapping (vSLAM), is now widely applied in fields such as autonomous mobile robots and intelligent driving.
The inventors discovered, while researching the prior art, that existing vSLAM is based on image processing and is therefore susceptible to changes in illumination and viewing angle. For example, if vSLAM constructs a map of a scene at some moment during the day, then for the same scene at night the acquired images cannot be matched against the daytime map because of the change in illumination, so positioning and navigation fail. In addition, vSLAM is influenced by the sensor's viewing angle during image collection: the viewing angle used to construct the map is directional, and if the viewing angle at positioning time differs greatly from the stored viewing angle, positioning and navigation cannot be completed.
Disclosure of Invention
An object of some embodiments of the present application is to provide a fusion mapping method, a related apparatus, and a computer-readable storage medium that solve the above technical problems.
One embodiment of the present application provides a fusion mapping method, including: collecting image data under different conditions; building maps corresponding to the different conditions from the image data collected under each condition; determining one of the maps corresponding to the different conditions as a reference map and the maps other than the reference map as correction maps; correcting the correction map according to the reference map; and determining a final output map according to the reference map and the corrected correction map.
An embodiment of the present application further provides a fusion mapping apparatus, including: an acquisition module configured to collect image data under different conditions; a map building module configured to build maps corresponding to the different conditions from the image data collected under each condition, determine one of those maps as a reference map, and determine the maps other than the reference map as correction maps; a correction module configured to correct the correction map according to the reference map; and a determining module configured to determine a final output map according to the reference map and the corrected correction map.
An embodiment of the present application further provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor. The memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the fusion mapping method of any method embodiment of the present application.
An embodiment of the present application further provides a computer-readable storage medium storing computer instructions for causing a computer to execute the fusion mapping method of any method embodiment of the present application.
Compared with the prior art, the embodiments of the present application collect image data under different conditions, build maps corresponding to the different conditions from that image data, determine a reference map and a correction map, and correct the correction map according to the reference map, so that the final output map determined from the reference map and the corrected correction map matches the actual scene better, enabling accurate positioning and navigation.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings, in which like reference numerals refer to similar elements; the figures are not to scale unless otherwise specified.
FIG. 1 is a flow chart of the fusion mapping method in the first embodiment of the present application;
FIG. 2 is a schematic diagram of data acquisition at different viewing angles in the first embodiment of the present application;
FIG. 3 is a schematic diagram of data acquisition at different times in the first embodiment of the present application;
FIG. 4 is a flow chart of the fusion mapping method in the second embodiment of the present application;
FIG. 5 is a block diagram of the fusion mapping apparatus in the third embodiment of the present application;
FIG. 6 is a block diagram of the fusion mapping apparatus in the fourth embodiment of the present application;
FIG. 7 is a diagram illustrating the structure of the electronic device in the fifth embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, some embodiments of the present application will be described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The first embodiment of the present application relates to a fusion mapping method applied to a terminal or to the cloud. The terminal may be a device such as a blind-guiding helmet, an unmanned vehicle, or an intelligent robot. The cloud is communicatively connected to the terminal and either provides the terminal with a map for positioning or directly provides the terminal with a positioning result. In this embodiment, the terminal is taken as the example to explain the execution of the fusion mapping method; the process by which the cloud executes the method can be understood by reference to the same description.
The specific flow of the fusion mapping method is shown in fig. 1 and includes the following steps:
in step 101, image data under different conditions is acquired.
Specifically, the image data includes each frame of image information and timestamp information corresponding to each frame of image information; the timestamp corresponding to a frame represents the time at which that frame was acquired.
It should be noted that, while acquiring image data under different conditions, the method further includes collecting direction sensor data under the same conditions. The direction sensor data includes direction information, which represents the current running direction of the device, and timestamp information corresponding to the direction information, which represents the time at which the direction information was acquired.
In the present embodiment, the image data and the direction sensor data are acquired along the same path under different conditions, where the different conditions are different viewing angles or different times. As shown in fig. 2, when data is acquired at different viewing angles, the viewing angles may be classified as front view, left view, and right view; the specific angle values used during acquisition are not limited in this embodiment. As shown in fig. 3, when data is acquired at different times, the day may be segmented into morning, noon, and evening; the specific acquisition time within each segment is likewise not limited.
The direction sensor of the present embodiment is an odometer or an inertial measurement unit (IMU). When the direction sensor is an odometer, the direction information includes the odometer output and Euler angles; when it is an IMU, the direction information includes angular velocity and acceleration.
In step 102, a reference map and a correction map are determined.
Specifically, maps corresponding to the different conditions are respectively built from the image data collected under each condition; one of these maps is determined as the reference map, and the maps other than the reference map are determined as correction maps.
For example, when data is acquired along the same route at different viewing angles, the map created from image data continuously acquired at the front view is used as the reference map, and the maps created from image data acquired with the viewing angle deviated to the left and to the right are used as correction maps. Of course, the map created at the left view may instead be used as the reference map, with the right-view and front-view maps as correction maps. Similarly, when data is acquired along the same path at different times, the specific time corresponding to the reference map is not limited.
In the present embodiment, mapping is performed in the vSLAM manner. When the robot starts moving from an unknown position in an unknown environment, the image data captured by the camera is acquired, features are extracted from each frame of image information, a map of the unknown environment is built from the extracted features and the timestamp information corresponding to each frame, and the robot is localized against the map being built. The reference map and the correction map are constructed in the same way; since vSLAM-based mapping is a relatively mature technique, it is not described in detail in this embodiment.
In step 103, the direction sensor data under the condition corresponding to the correction map is corrected based on the direction sensor data under the condition corresponding to the reference map.
Specifically, key points of the reference map are determined according to the direction sensor data under the condition corresponding to the reference map, and key points of the correction map are determined according to the direction sensor data under the condition corresponding to the correction map; the key points of the reference map are matched with the key points of the correction map to determine the path segments on which the two maps match; and within each matched path segment, the direction sensor data corresponding to the correction map is corrected according to the direction sensor data corresponding to the reference map.
In one specific implementation, the key points of the reference map are determined as follows. The key points of the reference map in this embodiment are the corners in the map route, and the map route is divided into different segments at the corners; a single turn may contain several consecutive inflection points. An inflection point is determined according to the following principle: set a movement distance threshold d_t and a direction threshold θ_t, and determine, from the direction sensor data under the condition corresponding to the reference map, the running direction before and after each movement of d_t; if the difference between the two running directions at a position is greater than θ_t, that position is considered an inflection point. Suppose n inflection points are detected at the m-th turn in the map path of the reference map, at positions k_(m+0), k_(m+1), ..., k_(m+(n-1)). The specific position of the m-th turn can then be calculated with formula (1):

t_m = (1/n) · Σ_(i=0..n-1) k_(m+i)    (1)

where m denotes the m-th turn in the map path of the reference map, n the number of inflection points contained in the m-th turn, i the index of an inflection point, t_m the position of the m-th turn, and k_(m+i) the position of inflection point i of the m-th turn. The key points of the correction map are determined in the same way, so the details are not repeated here.
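As an illustration of this key-point extraction, the sketch below detects inflection points from a sequence of running directions and applies formula (1) to each turn. It is a minimal sketch under stated assumptions, not the patent's implementation: the function name, the expression of d_t in whole samples, and the grouping of consecutive inflection points into one turn are illustrative.

```python
import math

def detect_turns(headings, d_t_steps, theta_t):
    """Detect turn positions from running directions (radians) sampled along
    the path. headings[i] is the direction after the i-th movement; d_t_steps
    is the movement-distance threshold expressed in samples, theta_t the
    direction threshold in radians."""
    # Mark inflection points: positions where the direction changes by more
    # than theta_t over one movement of d_t.
    inflections = []
    for i in range(d_t_steps, len(headings)):
        delta = headings[i] - headings[i - d_t_steps]
        diff = abs(math.atan2(math.sin(delta), math.cos(delta)))  # wrap to [-pi, pi]
        if diff > theta_t:
            inflections.append(i)

    # Group consecutive inflection points into turns and apply formula (1):
    # t_m is the mean of the n inflection positions k_(m+i) of the m-th turn.
    turns, group = [], []
    for k in inflections:
        if group and k != group[-1] + 1:
            turns.append(sum(group) / len(group))
            group = []
        group.append(k)
    if group:
        turns.append(sum(group) / len(group))
    return turns
```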
It should be noted that, because the reference map and the correction map follow the same map path, the key points determined in the reference map are matched with the key points determined in the correction map, and the matched path segments of the two maps are thereby determined.
In one specific implementation, when the map created at the front view is used as the reference map and the maps created with the viewing angle deviated to the left and to the right are used as correction maps, the direction sensor data of the left-view correction map is corrected using formulas (2) and (3):

θ_dl = θ_c - θ_l    (2)

θ_corrj = θ_lj + θ_dl    (3)

where θ_c denotes the angle corresponding to the direction sensor data of the reference map, θ_l the angle corresponding to the direction sensor data of the uncorrected left-view correction map, θ_dl the angular deviation between the reference map and the uncorrected left-view correction map, θ_lj the angle corresponding to the j-th initial direction sensor datum of the left-view correction map, θ_corrj the corrected angle, and j the serial number of the sensor data within each matched path segment of the correction map.
Similarly, when the front-view map is used as the reference map and the map created with the viewing angle deviated to the right is used as a correction map, the direction sensor data of the right-view correction map is corrected using formulas (4) and (5):

θ_dr = θ_c - θ_r    (4)

θ_corrj = θ_rj + θ_dr    (5)

where θ_c denotes the angle corresponding to the direction sensor data of the reference map, θ_r the angle corresponding to the direction sensor data of the uncorrected right-view correction map, θ_dr the angular deviation between the reference map and the uncorrected right-view correction map, θ_rj the angle corresponding to the j-th initial direction sensor datum of the right-view correction map, θ_corrj the corrected angle, and j the serial number of the sensor data within each matched path segment of the correction map.
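For concreteness, the per-segment angle correction of formulas (2) to (5) can be sketched as follows. The helper name is hypothetical, and it assumes that θ_c and θ_l (or θ_r) are the representative headings of one matched path segment:

```python
def correct_segment_angles(ref_angle, seg_angle, seg_angles):
    """Apply formulas (2)/(3) (or (4)/(5)) to one matched path segment.
    ref_angle:  angle of the reference map on this segment (theta_c)
    seg_angle:  angle of the uncorrected correction map (theta_l or theta_r)
    seg_angles: initial direction sensor angles theta_lj / theta_rj"""
    deviation = ref_angle - seg_angle            # formula (2) / (4)
    return [a + deviation for a in seg_angles]   # formula (3) / (5)
```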
It should be noted that, when data acquisition and mapping are performed at different times, the direction sensor data under the condition corresponding to the correction map is corrected according to the direction sensor data under the condition corresponding to the reference map on the same principle as in the different-viewing-angle case, so the details are not repeated here.
In step 104, it is determined whether the correction map has a deviation, if yes, step 105 is executed, otherwise, step 108 is executed.
Specifically, every frame of acquired image information must be processed during mapping, and accumulated error tends to grow as the camera moves. The most effective way to eliminate this error is to find a loop closure; a map built in the presence of a loop closure is accurate and unbiased. The vSLAM mapping process provides a graphical user interface (GUI), and whether a loop closure exists is determined by reading the GUI information, which in turn determines whether the built map has a deviation. In addition, if the feature points in the acquired frames are sparse and the information in the image data is insufficient for mapping, the mapping is interrupted and initialization is performed again. Re-initialization causes discontinuities in the direction of the built map and missing path information, so a map built across a mapping interruption is also biased. Moreover, a mapping interruption produces multiple map files, so whether a deviation exists can also be judged from the number of map files.
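As an illustration, the interruption-based part of this check reduces to counting map files. In the sketch below, the directory layout, the `.map` file extension, and the externally supplied loop-closure flag (read from the GUI) are assumptions:

```python
from pathlib import Path

def map_has_deviation(map_dir, loop_closure_found):
    """Heuristic deviation check described above: the map is considered
    biased if no loop closure was detected, or if mapping was interrupted,
    which leaves more than one map file behind."""
    map_files = list(Path(map_dir).glob("*.map"))  # assumed naming scheme
    return (not loop_closure_found) or len(map_files) > 1
```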
In step 105, the corrected direction sensor data corresponding to the keyframes of the correction map is determined.
Specifically, in the present embodiment, the corrected direction sensor data corresponding to a keyframe of the correction map may be determined by timestamp alignment, which comprises the following steps: matching the timestamp corresponding to the keyframe with the timestamps in the corrected direction sensor data; acquiring the serial number of the corrected direction sensor datum matched with the timestamp corresponding to the keyframe; and determining the corrected direction sensor data corresponding to that serial number.
It should be noted that the amount of corrected direction sensor data acquired per unit time by a device applying this fusion mapping method is much larger than the number of keyframes in the map. Since a timestamp represents the time at which a datum was acquired, matching the timestamp of a keyframe against the timestamps in the corrected direction sensor data yields the serial number of the corrected direction sensor datum acquired at the same moment as that keyframe. Because the direction sensor data are continuously collected and stored in a database as the device travels along a path segment, the corrected direction sensor data are also stored there, so the corrected direction sensor datum corresponding to that serial number can be obtained by querying the database.
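A minimal sketch of this timestamp alignment, assuming timestamps in seconds, an ascending list of sensor timestamps, and an illustrative matching tolerance:

```python
import bisect

def match_keyframe_to_sensor(keyframe_ts, sensor_timestamps, tolerance=0.05):
    """Return the serial number (index) of the corrected direction sensor
    datum whose timestamp is nearest the keyframe timestamp, or None if no
    datum lies within `tolerance` seconds. sensor_timestamps must be sorted
    ascending, since the data are collected sequentially."""
    pos = bisect.bisect_left(sensor_timestamps, keyframe_ts)
    candidates = [i for i in (pos - 1, pos) if 0 <= i < len(sensor_timestamps)]
    if not candidates:
        return None
    best = min(candidates, key=lambda i: abs(sensor_timestamps[i] - keyframe_ts))
    return best if abs(sensor_timestamps[best] - keyframe_ts) <= tolerance else None
```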
In step 106, the direction of the correction map is corrected based on the corrected direction sensor data corresponding to the keyframes.
In the case where the direction of the reference map itself is deviated, the direction of the reference map is corrected based on the direction sensor data corresponding to the keyframes of the reference map; the principle is the same as that for a correction map whose direction is deviated.
In step 107, a final output map is determined from the reference map and the corrected correction map.
Specifically, the coordinates of a reference map are determined; determining the coordinates of the corrected correction map according to the coordinates of the reference map; and determining a final output map according to the coordinates of the reference map and the coordinates of the corrected correction map.
It should be noted that, in the present embodiment, since the image data acquired under the different conditions has a temporal order, the reference map and the correction maps built from it also have a temporal order.
In one specific implementation, when the image data acquired under the condition corresponding to the correction map precedes the image data acquired under the condition corresponding to the reference map, and assuming the coordinates of the reference map are known and the angle corresponding to the correction map has been corrected, the coordinates of the corrected correction map are calculated using formulas (6), (7), and (8):

x' = x + d · cos(θ')    (6)

y' = y + d · sin(θ')    (7)

θ' = θ + 180°    (8)

where θ denotes the corrected angle corresponding to the correction map, θ' the reverse of the angle θ, (x, y) the coordinates of the reference map, (x', y') the coordinates of the corrected correction map, and d the distance between two adjacent keyframes in the correction map.
When the image data acquired under the condition corresponding to the correction map follows the image data acquired under the condition corresponding to the reference map, the coordinates of the corrected correction map are calculated directly from the corrected angle θ, without computing the reverse angle.
The coordinates of the reference map and the coordinates of the corrected correction map are then mapped into the same coordinate system, and the resulting map is determined as the final output map.
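A sketch of the coordinate propagation of formulas (6) to (8), assuming angles in degrees; the `reverse` flag selects whether the reverse angle of formula (8) applies, depending on the acquisition order discussed above:

```python
import math

def corrected_coordinates(x, y, theta_deg, d, reverse=True):
    """Propagate reference-map coordinates (x, y) into the corrected
    correction map. theta_deg is the corrected angle of the correction map,
    d the distance between two adjacent keyframes; reverse=True applies
    formula (8) when the correction-map data precede the reference-map data."""
    angle = math.radians(theta_deg + 180.0 if reverse else theta_deg)
    return (x + d * math.cos(angle),   # formula (6)
            y + d * math.sin(angle))   # formula (7)
```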
In step 108, a final output map is determined based on the reference map and the correction map.
Compared with the prior art, the fusion mapping method of this embodiment collects image data under different conditions, builds a map for each condition from the corresponding image data, determines a reference map and a correction map, and corrects the correction map according to the reference map, so that the final output map determined from the reference map and the corrected correction map matches the actual scene better, enabling accurate positioning and navigation.
The second embodiment of the present application relates to a fusion mapping method and is a further improvement on the first embodiment. The specific improvement is as follows: the implementation of correcting the direction of the correction map based on the corrected direction sensor data corresponding to the keyframes is described in detail. The flow of the fusion mapping method in this embodiment is shown in fig. 4.
Specifically, this embodiment comprises steps 201 to 210. Steps 201 to 205 are substantially the same as steps 101 to 105 of the first embodiment, and steps 209 to 210 are substantially the same as steps 107 to 108 of the first embodiment, so they are not repeated here; the differences are described below. For technical details not covered in this embodiment, refer to the fusion mapping method provided in the first embodiment.
After steps 201 to 205, step 206 is performed.
In step 206, the direction information of the corrected direction sensor data is determined according to the serial number of the corrected direction sensor data determined for the correction map.
After the serial number of the corrected direction sensor data has been determined, the corrected direction sensor data corresponding to that serial number can be looked up in the database, and the direction information can be read from it.
In step 207, the determined orientation information of the corrected orientation sensor data is substituted for the orientation information in the keyframe.
In step 208, the direction of the corrected map after correction is determined according to the direction information in the replaced key frame.
The keyframes of the correction map include direction information of the correction map, such as Euler angles. However, because the built correction map is biased, the direction information contained in its keyframes is inaccurate. The direction sensor, by contrast, acquires data in real time throughout the mapping process, and its data have been corrected according to the reference map, so the direction information in the corrected direction sensor data is accurate. After the direction information in the keyframes is replaced with this accurate direction information, the direction of the corrected correction map is determined from the replaced keyframes and is therefore more accurate.
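The replacement of steps 206 to 208 amounts to overwriting each keyframe's stored direction information with the corrected sensor value; a minimal sketch with a hypothetical keyframe record, reusing a timestamp matcher such as the one sketched earlier:

```python
def replace_keyframe_directions(keyframes, corrected_sensor_data, match_fn):
    """For each keyframe, look up the corrected direction sensor datum whose
    timestamp matches (via match_fn, e.g. a nearest-neighbor timestamp
    matcher) and overwrite the keyframe's direction information with it."""
    for kf in keyframes:
        idx = match_fn(kf["timestamp"])
        if idx is not None:
            kf["direction"] = corrected_sensor_data[idx]["direction"]
    return keyframes
```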
The third embodiment of the present application relates to a fusion map building device, and the specific structure is shown in fig. 5.
As shown in fig. 5, the fusion map building apparatus includes: the map correction system comprises an acquisition module 501, a map building module 502, a first correction module 503, a judgment module 504, a first determination module 505, a second correction module 506, a map determination module 507 and a second determination module 508.
The acquiring module 501 is configured to acquire image data under different conditions.
A map building module 502 for determining a reference map and a correction map.
The first correction module 503 is configured to correct the direction sensor data under the condition corresponding to the correction map according to the direction sensor data under the condition corresponding to the reference map.
The judgment module 504 is configured to judge whether the correction map has a deviation.
The first determination module 505 is configured to determine the corrected direction sensor data corresponding to the keyframes of the correction map.
The second correction module 506 is configured to correct the direction of the correction map according to the corrected direction sensor data corresponding to the keyframes.
The map determination module 507 is configured to determine a final output map according to the reference map and the corrected correction map.
The second determination module 508 is configured to determine a final output map according to the reference map and the correction map.
It should be understood that this embodiment is an example of the apparatus corresponding to the first embodiment, and may be implemented in cooperation with the first embodiment. The related technical details mentioned in the first embodiment are still valid in this embodiment, and are not described herein again in order to reduce repetition. Accordingly, the related-art details mentioned in the present embodiment can also be applied to the first embodiment.
A fourth embodiment of the present application relates to a fusion mapping apparatus. This embodiment is substantially the same as the third embodiment, with the specific structure shown in fig. 6; the main improvement is that the fourth embodiment describes the structure of the second correction module 506 of the third embodiment in detail.
Wherein the second correction module 506 includes a third determination module 5061, a replacement module 5062, and a fourth determination module 5063.
A third determination module 5061, configured to determine the direction information of the corrected direction sensor data according to the serial number of the corrected direction sensor data determined for the correction map.
A replacement module 5062, configured to replace the direction information in the keyframe with the determined direction information of the corrected direction sensor data.
A fourth determination module 5063, configured to determine the direction of the corrected correction map according to the direction information in the replaced keyframe.
It should be understood that this embodiment is an example of the apparatus corresponding to the second embodiment, and that this embodiment can be implemented in cooperation with the second embodiment. The related technical details mentioned in the second embodiment are still valid in this embodiment, and are not described herein again in order to reduce repetition. Accordingly, the related-art details mentioned in the present embodiment can also be applied to the second embodiment.
The above-described apparatus embodiments are merely illustrative and do not limit the scope of the present application; in practical applications, a person skilled in the art may select some or all of the modules according to actual needs to achieve the purpose of the embodiments, which is not limited here.
A fifth embodiment of the present application relates to an electronic device, the specific structure of which is shown in fig. 7. It includes at least one processor 701 and a memory 702 communicatively coupled to the at least one processor 701. The memory 702 stores instructions executable by the at least one processor 701, and the instructions are executed by the at least one processor 701 to enable the at least one processor 701 to perform the fusion mapping method.
In this embodiment, the processor 701 is exemplified by a central processing unit (CPU), and the memory 702 by a random access memory (RAM). The processor 701 and the memory 702 may be connected by a bus or by other means; fig. 7 illustrates a bus connection. The memory 702, as a non-volatile computer-readable storage medium, stores non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program implementing the fusion mapping method of the embodiments of the present application. The processor 701 executes the various functional applications and data processing of the device, that is, implements the above fusion mapping method, by running the non-volatile software programs, instructions, and modules stored in the memory 702.
The memory 702 may include a program storage area and a data storage area; the program storage area may store an operating system and an application required by at least one function, and the data storage area may store a list of options and the like. Furthermore, the memory may include high-speed random access memory, and may also include non-volatile memory such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 702 may optionally include memory located remotely from the processor 701, connected to the device via a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
One or more program modules are stored in the memory 702 and, when executed by the one or more processors 701, perform the fusion mapping method in any of the method embodiments described above.
The above product can execute the method provided by the embodiments of the present application and has the corresponding functional modules and beneficial effects of the method; for technical details not described in this embodiment, refer to the method provided by the embodiments of the present application.
A sixth embodiment of the present application relates to a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, is capable of implementing the fusion mapping method in any of the method embodiments of the present application.
It should be noted that, as those skilled in the art will understand, all or part of the steps of the above embodiments may be implemented by a program instructing related hardware; the program is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, etc.) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples for carrying out the present application, and that various changes in form and details may be made therein without departing from the spirit and scope of the present application in practice.

Claims (10)

1. A fusion mapping method comprises the following steps:
collecting image data under different conditions;
respectively establishing maps corresponding to the different conditions according to the image data under the different conditions, determining one of the maps corresponding to the different conditions as a reference map, and determining the maps except the reference map in the maps corresponding to the different conditions as a correction map;
correcting the correction map according to the reference map;
determining a final output map according to the reference map and the corrected correction map;
wherein, while collecting the image data under the different conditions, the method further comprises: collecting direction sensor data under the different conditions;
the correcting the correction map according to the reference map specifically comprises:
correcting the direction sensor data under the condition corresponding to the correction map according to the direction sensor data under the condition corresponding to the reference map;
judging whether the correction map has a deviation, and if so, determining corrected direction sensor data corresponding to a keyframe of the correction map;
and correcting the direction of the correction map according to the corrected direction sensor data corresponding to the keyframe.
2. The fusion mapping method of claim 1, wherein the different conditions include: different viewing angles or different times.
3. The fusion mapping method according to claim 1, wherein the correcting, according to the direction sensor data under the condition corresponding to the reference map, the direction sensor data under the condition corresponding to the correction map includes:
determining key points of the reference map according to the direction sensor data under the condition corresponding to the reference map, and determining key points of the correction map according to the direction sensor data under the condition corresponding to the correction map;
matching the key points of the reference map with the key points of the correction map, and determining each section of path matched by the reference map and the correction map;
and correcting the direction sensor data corresponding to the correction map according to the direction sensor data corresponding to the reference map in each section of the path.
4. The fusion mapping method according to claim 3, wherein the determining a final output map according to the reference map and the corrected correction map specifically comprises:
determining coordinates of the reference map;
determining the coordinates of the corrected correction map according to the coordinates of the reference map;
and determining the final output map according to the coordinates of the reference map and the corrected coordinates of the correction map.
5. The fusion mapping method of any of claims 1 to 4, wherein the image data comprises: each frame of image information and timestamp information corresponding to each frame of image information,
and the direction sensor data comprises: direction information and timestamp information corresponding to the direction information.
6. The fusion mapping method according to any one of claims 1 to 4, wherein the determining the corrected direction sensor data corresponding to the keyframe of the correction map specifically comprises:
matching a timestamp corresponding to the key frame with a timestamp in the corrected direction sensor data;
acquiring a serial number of the corrected direction sensor data matched with a timestamp corresponding to the key frame;
determining the corrected direction sensor data corresponding to a serial number of the corrected direction sensor data.
7. The fusion mapping method of claim 6, wherein the correcting the direction of the correction map according to the corrected direction sensor data corresponding to the keyframe specifically comprises:
determining direction information of the corrected direction sensor data according to the determined serial number of the corrected direction sensor data;
replacing the direction information in the keyframe with the determined direction information of the corrected direction sensor data;
and determining the direction of the corrected correction map according to the direction information in the replaced keyframe.
8. A fusion mapping apparatus, comprising:
the acquisition module is used for acquiring image data under different conditions;
the map establishing module is used for respectively establishing maps corresponding to the different conditions according to the image data under the different conditions, determining one of the maps corresponding to the different conditions as a reference map, and determining the maps except the reference map in the maps corresponding to the different conditions as a correction map;
the correction module is used for correcting the correction map according to the reference map;
the map determining module is used for determining a final output map according to the reference map and the corrected correction map;
wherein, while the image data under the different conditions is collected, direction sensor data under the different conditions is further collected;
the correcting the correction map according to the reference map specifically comprises:
correcting the direction sensor data under the condition corresponding to the correction map according to the direction sensor data under the condition corresponding to the reference map;
judging whether the correction map has a deviation, and if so, determining corrected direction sensor data corresponding to a keyframe of the correction map;
and correcting the direction of the correction map according to the corrected direction sensor data corresponding to the keyframe.
9. An electronic device, comprising:
at least one processor; and,
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the fusion mapping method of any one of claims 1 to 7.
10. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the fusion mapping method of any one of claims 1 to 7.
CN201880001190.5A 2018-07-23 2018-07-23 Fusion graph building method, related device and computer readable storage medium Active CN109074638B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/096653 WO2020019115A1 (en) 2018-07-23 2018-07-23 Fusion mapping method, related device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN109074638A CN109074638A (en) 2018-12-21
CN109074638B true CN109074638B (en) 2020-04-24

Family

ID=64789303

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880001190.5A Active CN109074638B (en) 2018-07-23 2018-07-23 Fusion graph building method, related device and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN109074638B (en)
WO (1) WO2020019115A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109528095B (en) * 2018-12-28 2020-11-17 深圳市愚公科技有限公司 Calibration method of sweeping record chart, sweeping robot and mobile terminal
CN109829849B (en) * 2019-01-29 2023-01-31 达闼机器人股份有限公司 Training data generation method and device and terminal
WO2020223974A1 (en) * 2019-05-09 2020-11-12 珊口(深圳)智能科技有限公司 Method for updating map and mobile robot
CN110825832B (en) * 2019-11-07 2022-08-19 深圳创维数字技术有限公司 SLAM map updating method, device and computer readable storage medium
CN112604276B (en) * 2020-12-25 2022-09-02 珠海金山数字网络科技有限公司 Terrain modification method and terrain modification device
CN112833912B (en) * 2020-12-31 2024-03-05 杭州海康机器人股份有限公司 V-SLAM map verification method, device and equipment
CN113899357B (en) * 2021-09-29 2023-10-31 北京易航远智科技有限公司 Incremental mapping method and device for visual SLAM, robot and readable storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1802678A (en) * 2003-06-12 2006-07-12 株式会社电装 Image server, image acquisition device, and image display terminal
CN101010710A (en) * 2005-07-07 2007-08-01 松下电器产业株式会社 Map information correction device, map information correction method, program, information providing device and information acquisition device using the map information correction device
CN102087530A (en) * 2010-12-07 2011-06-08 东南大学 Vision navigation method of mobile robot based on hand-drawing map and path
CN104457772A (en) * 2013-09-13 2015-03-25 伊莱比特汽车公司 Technique for correcting digitized map data
CN105096386A (en) * 2015-07-21 2015-11-25 中国民航大学 Method for automatically generating geographic maps for large-range complex urban environment
CN105143821A (en) * 2013-04-30 2015-12-09 高通股份有限公司 Wide area localization from SLAM maps
CN106092104A (en) * 2016-08-26 2016-11-09 深圳微服机器人科技有限公司 The method for relocating of a kind of Indoor Robot and device
CN107687860A (en) * 2016-08-04 2018-02-13 鸿富锦精密工业(深圳)有限公司 The autonomous mobile apparatus and method of automatic amendment environmental information
CN108051002A (en) * 2017-12-04 2018-05-18 上海文什数据科技有限公司 Transport vehicle space-location method and system based on inertia measurement auxiliary vision

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107065925B (en) * 2017-04-01 2020-04-07 成都通甲优博科技有限责任公司 Unmanned aerial vehicle return method and device
CN109073387B (en) * 2018-07-20 2021-03-23 达闼机器人有限公司 Method, device, terminal and storage medium for aligning multiple maps

Also Published As

Publication number Publication date
CN109074638A (en) 2018-12-21
WO2020019115A1 (en) 2020-01-30

Similar Documents

Publication Publication Date Title
CN109074638B (en) Fusion graph building method, related device and computer readable storage medium
Shan et al. Lio-sam: Tightly-coupled lidar inertial odometry via smoothing and mapping
CN108303103B (en) Method and device for determining target lane
CN110160542B (en) Method and device for positioning lane line, storage medium and electronic device
CN110084832B (en) Method, device, system, equipment and storage medium for correcting camera pose
CN109727288B (en) System and method for monocular simultaneous localization and mapping
CN112734852B (en) Robot mapping method and device and computing equipment
CN107909614B (en) Positioning method of inspection robot in GPS failure environment
CN107167826B (en) Vehicle longitudinal positioning system and method based on variable grid image feature detection in automatic driving
CN112197770A (en) Robot positioning method and positioning device thereof
EP3023740B1 (en) Method, apparatus and computer program product for route matching
WO2019203084A1 (en) Map information updating system and map information updating program
CN111256687A (en) Map data processing method and device, acquisition equipment and storage medium
CN111986261B (en) Vehicle positioning method and device, electronic equipment and storage medium
CN111912416A (en) Method, device and equipment for positioning equipment
Cui et al. Real-time global localization of intelligent road vehicles in lane-level via lane marking detection and shape registration
CN114252082B (en) Vehicle positioning method and device and electronic equipment
CN109073390B (en) Positioning method and device, electronic equipment and readable storage medium
KR20170068937A (en) Autonomous driving vehicle navigation system using the tunnel lighting
CN112556685A (en) Navigation route display method and device, storage medium and electronic equipment
CN111721305B (en) Positioning method and apparatus, autonomous vehicle, electronic device, and storage medium
CN110415174B (en) Map fusion method, electronic device and storage medium
CN113223064B (en) Visual inertial odometer scale estimation method and device
CN111145634B (en) Method and device for correcting map
WO2020019116A1 (en) Multi-source data mapping method, related apparatus, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210210

Address after: 200245 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai

Patentee after: Dalu Robot Co.,Ltd.

Address before: 518000 Room 201, building A, No. 1, Qian Wan Road, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong (Shenzhen Qianhai business secretary Co., Ltd.)

Patentee before: Shenzhen Qianhaida Yunyun Intelligent Technology Co.,Ltd.

TR01 Transfer of patent right
CP03 Change of name, title or address

Address after: 200245 Building 8, No. 207, Zhongqing Road, Minhang District, Shanghai

Patentee after: Dayu robot Co.,Ltd.

Address before: 200245 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai

Patentee before: Dalu Robot Co.,Ltd.

CP03 Change of name, title or address