WO2020019116A1 - Multi-source data mapping method, related device and computer-readable storage medium - Google Patents

Multi-source data mapping method, related device and computer-readable storage medium

Info

Publication number
WO2020019116A1
Authority
WO
WIPO (PCT)
Prior art keywords
map
sensor data
key frame
direction sensor
coordinates
Prior art date
Application number
PCT/CN2018/096658
Other languages
English (en)
French (fr)
Inventor
王超鹏
林义闽
廉士国
Original Assignee
深圳前海达闼云端智能科技有限公司
Priority date: 2018-07-23
Filing date: 2018-07-23
Publication date: 2020-01-30
Application filed by 深圳前海达闼云端智能科技有限公司
Priority to PCT/CN2018/096658 priority Critical patent/WO2020019116A1/zh
Priority to CN201880001179.9A priority patent/CN109074407A/zh
Publication of WO2020019116A1 publication Critical patent/WO2020019116A1/zh


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects

Definitions

  • the present application relates to the field of computer vision technology, and in particular, to a multi-source data mapping method, a related device, and a computer-readable storage medium.
  • SLAM: simultaneous localization and mapping (real-time positioning and map building).
  • vSLAM: Visual Simultaneous Localization And Mapping (visual localization and mapping).
  • The inventor discovered, in the course of studying the prior art, that maps built with vSLAM in the prior art are often inaccurate because of the influence of the surrounding environment or the algorithm itself. For example, when the feature points collected by the camera are sparse, map construction is interrupted, and when mapping restarts, the orientation is re-initialized, which affects the accuracy of the map. In addition, when no loop closure occurs during mapping, the map data obtained from the mapping accumulates a certain error, which also affects the accuracy of the mapping and makes the created map differ from the actual space.
  • a technical problem to be solved in some embodiments of the present application is to provide a multi-source data mapping method, a related device, and a computer-readable storage medium to solve the above technical problems.
  • An embodiment of the present application provides a multi-source data mapping method, including: collecting image data and direction sensor data; building a map from the image data; if the map is determined to have a deviation, determining the direction sensor data corresponding to the key frames of the map; correcting the direction of the map according to the direction sensor data corresponding to the key frames; and calculating the coordinates of the map after the direction correction.
  • An embodiment of the present application further provides a multi-source data mapping device, including: a collection module configured to collect image data and direction sensor data; a map establishment module configured to build a map from the image data; a determination module configured to determine, if the map is determined to have a deviation, the direction sensor data corresponding to the key frames of the map; a correction module configured to correct the direction of the map according to the direction sensor data corresponding to the key frames; and a calculation module configured to calculate the coordinates of the map after the direction correction.
  • An embodiment of the present application further provides an electronic device, including: at least one processor; and a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor so that the at least one processor can execute the multi-source data mapping method involved in any method embodiment of the present application.
  • the embodiment of the present application further provides a computer-readable storage medium storing computer instructions, and the computer instructions are used to cause a computer to execute the multi-source data mapping method involved in any method embodiment of the present application.
  • Compared with the prior art, the embodiments of the present application build a map from collected image data and direction sensor data; when the built map is determined to have a deviation, the direction sensor data corresponding to the key frames of the map is determined, the direction of the map is corrected with that direction sensor data, and the coordinates of the map are determined, so that the created map is more accurate and better matches the actual space.
  • FIG. 1 is a flowchart of the multi-source data mapping method in the first embodiment of the present application;
  • FIG. 2 is a schematic diagram of the relationship between direction sensor data and key frames in the first embodiment of the present application;
  • FIG. 3 is a schematic diagram of the relationship between direction sensor data and key frames in the interruption case in the first embodiment of the present application;
  • FIG. 4 is a flowchart of the multi-source data mapping method in the second embodiment of the present application;
  • FIG. 5 is a schematic block diagram of the multi-source data mapping device in the third embodiment of the present application;
  • FIG. 6 is a schematic block diagram of the multi-source data mapping device in the fourth embodiment of the present application;
  • FIG. 7 is a structural example diagram of the electronic device in the fifth embodiment of the present application.
  • the first embodiment of the present application relates to a multi-source data mapping method, which is applied to a terminal or a cloud.
  • Terminals can be devices such as blind guide helmets, intelligent robots, and unmanned vehicles.
  • the cloud communicates with the terminal, providing the terminal with a map for positioning or providing positioning results directly to the terminal.
  • This embodiment uses a terminal as an example to describe the execution process of the multi-source data mapping method.
  • For the process of the cloud executing the multi-source data mapping method, reference may be made to the content of this embodiment of the present application.
  • the specific process of this multi-source data mapping method is shown in Figure 1, and includes the following steps:
  • In step 101, image data and direction sensor data are collected.
  • Specifically, the image data includes each collected frame of image information and the time stamp information corresponding to each frame of image information, and the direction sensor data includes direction information and the time stamp information corresponding to the direction information.
  • The time stamp information corresponding to each frame of image represents the time at which that frame was acquired, and the time stamp information corresponding to the direction information represents the time at which that direction information was acquired.
  • It should be noted that the sensor in this embodiment includes an odometer or an inertial measurement unit (IMU). When the sensor is an odometer, the direction information is the physical output and the Euler angle; when the sensor is an IMU, the direction information is the angular velocity and acceleration.
  • In step 102, a map is established based on the image data.
  • Specifically, in this embodiment, mapping is performed based on the vSLAM method. When a robot starts to move from an unknown position in an unknown environment, it acquires the image data captured by its camera device, performs feature extraction on each frame of image information in the image data, constructs a map of the surrounding environment from the extracted features and the time stamp information corresponding to each frame of image information, and positions itself according to the constructed map. Since map construction based on the vSLAM method is a relatively mature technology, it is not described in detail in this embodiment.
  • The map created from the image data is a visual map. The created visual map includes multiple key frames, together with the time stamp information, physical output, and Euler angle information corresponding to the key frames. The physical output represents the x and y coordinate values of the created visual map in spatial coordinates, and the Euler angle represents the direction of the created visual map.
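  • As a minimal sketch of the data the method operates on (the class and field names below are illustrative assumptions, not terminology from the patent): each key frame of the visual map carries a time stamp, the physical output (x, y coordinates) and an Euler angle, while each direction sensor sample carries a serial number, a time stamp and direction information.
```python
from dataclasses import dataclass

@dataclass
class KeyFrame:
    """One key frame of the visual map (step 102)."""
    timestamp: float   # time at which the frame was acquired
    x: float           # physical output: x coordinate in the map's spatial frame
    y: float           # physical output: y coordinate in the map's spatial frame
    yaw: float         # Euler angle representing the map direction at this frame

@dataclass
class DirectionSample:
    """One direction sensor reading (odometer or IMU), stored in the database."""
    serial: int        # serial number under which the sample is stored
    timestamp: float   # time at which the direction information was acquired
    yaw: float         # direction information (Euler angle for an odometer)
```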
  • In step 103, it is determined whether there is a deviation in the map. If so, step 104 is performed; otherwise step 107 is performed.
  • Specifically, during mapping, cumulative error tends to build up with the motion of the camera as each frame of image is processed. The most effective way to eliminate this error is to find a loop closure, and when a loop closure exists the built map is accurate, so there is no deviation. The vSLAM-based mapping process provides a graphical user interface (GUI), and the smart device determines whether a loop closure exists by reading the GUI information.
  • In addition, if the feature points contained in each frame of the collected image data are sparse and the information is insufficient to build the map, mapping is interrupted. A new visual map is then built after re-initialization, and because of the re-initialization the direction of the visual map becomes discontinuous and path information is missing. A map built with a mapping interruption therefore also has a deviation. Moreover, when mapping is interrupted, multiple sets of map files appear, so the number of map files obtained can also be used to determine whether a deviation exists.
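  • As a rough illustration of the decision in step 103, the sketch below treats the map as deviating when no loop closure was found or when mapping produced more than one map file; how these two signals are read out in practice (GUI information, map-file count) is assumed here for illustration.
```python
def map_has_deviation(loop_closure_found: bool, num_map_files: int) -> bool:
    """Step 103: no loop closure means accumulated error, and more than one
    map file means mapping was interrupted; either case is a deviation."""
    return (not loop_closure_found) or num_map_files > 1
```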
  • In step 104, the direction sensor data corresponding to the key frames of the map is determined.
  • Specifically, in this embodiment, the direction sensor data corresponding to the key frames of the map may be determined by time stamp alignment. This includes: matching the time stamp corresponding to a key frame with the time stamps in the direction sensor data, obtaining the serial number of the direction sensor data whose time stamp matches that of the key frame, and determining the direction sensor data corresponding to that serial number.
  • It should be noted that the amount of direction sensor data obtained by the smart device per unit time is far greater than the number of key frames in the map. FIG. 2 shows the relationship between the direction sensor data and the key frames, where the long line segments represent key frames and the short line segments represent direction sensor data. Because a time stamp represents the time at which the data was acquired, matching the time stamp of a key frame against the time stamps in the direction sensor data yields the serial number of the direction sensor data acquired at the moment the key frame of the map was obtained. The direction sensor data corresponding to that serial number is collected continuously by the smart device while it travels along a route and is stored in a database, so it can be obtained by looking up the database with the serial number.
  • For example, if the time represented by the time stamp information corresponding to the first key frame is 3:15, the serial number of the direction sensor data acquired at 3:15 is looked up; if that serial number is determined to be 10, the direction sensor data with serial number 10 is retrieved from the database.
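  • Continuing the data-type sketch above, the time stamp alignment of step 104 can be illustrated as follows. This is only a minimal sketch, assuming the direction sensor samples are already sorted by time stamp; it pairs each key frame with the serial number of the sample closest in time.
```python
import bisect
from typing import Dict, List

def align_keyframes_to_sensor(keyframes: List["KeyFrame"],
                              samples: List["DirectionSample"]) -> Dict[float, int]:
    """Step 104: for each key frame, find the serial number of the direction
    sensor sample whose time stamp best matches the key frame's time stamp."""
    times = [s.timestamp for s in samples]           # assumed sorted by time
    matches: Dict[float, int] = {}
    for kf in keyframes:
        i = bisect.bisect_left(times, kf.timestamp)
        # consider the neighbouring samples and keep the one nearest in time
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        best = min(candidates, key=lambda j: abs(times[j] - kf.timestamp))
        matches[kf.timestamp] = samples[best].serial
    return matches
```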
  • It should be noted that, when mapping is interrupted, the start key frame and the end key frame of each of the multiple map segments may also be determined. FIG. 3 is a schematic diagram of the relationship between the direction sensor data and the key frames in the interruption case, where Map(n) represents the number of the n-th map segment. The way the direction sensor data corresponding to the key frames is determined in the interruption case is similar to the process described above and is not repeated here.
  • In step 105, the direction of the map is corrected according to the direction sensor data corresponding to the key frames.
  • Specifically, the direction information of the direction sensor data is determined from the determined serial number of the direction sensor data, the direction information in the key frame is replaced with the direction information of the determined direction sensor data, and the direction of the corrected map is determined from the direction information in the replaced key frame.
  • The key frames of the map include the direction information of the map, for example the Euler angle. However, because the built map has a deviation, the direction information of the map included in the key frames is not accurate. The direction sensor collects data in real time throughout the mapping process, and the direction information in the direction sensor data is accurate. Therefore, after the direction information of the direction sensor data is determined from its serial number, the accurate direction information of the direction sensor data replaces the direction information in the key frame, thereby realizing the direction correction of the map.
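  • A minimal sketch of the replacement described in step 105, again continuing the earlier data-type sketch; `lookup_sample` stands for a hypothetical database lookup by serial number and is an assumption introduced for illustration.
```python
from typing import Callable, Dict, List

def correct_map_direction(keyframes: List["KeyFrame"],
                          matches: Dict[float, int],
                          lookup_sample: Callable[[int], "DirectionSample"]) -> None:
    """Step 105: replace the (possibly drifted) Euler angle of each key frame
    with the accurate direction information of its matched sensor sample."""
    for kf in keyframes:
        serial = matches[kf.timestamp]          # serial number found in step 104
        kf.yaw = lookup_sample(serial).yaw      # direction of the corrected map
```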
  • In step 106, the coordinates of the map after the direction correction are calculated.
  • Specifically, after the direction of the map has been corrected, the coordinates of the corrected map can be determined from the coordinates of the key frames in the map and the coordinates of the sensor, so as to improve the accuracy and stability of the mapping.
  • In step 107, it is determined that the created map is the final map.
  • When it is determined that there is no deviation in the created map, the map built from the map data is accurate and a loop closure exists; in this case, the created map is directly determined to be the final map.
  • Compared with the prior art, the multi-source data mapping method of this embodiment builds a map from collected image data and direction sensor data; when the built map is determined to have a deviation, the direction sensor data corresponding to the key frames of the map is determined, the direction of the map is corrected with that direction sensor data, and the coordinates of the map are determined, so that the established map is more accurate and matches the actual space more closely.
  • the second embodiment of the present application relates to a multi-source data mapping method.
  • This embodiment is a further improvement on the first embodiment; the specific improvement is that the calculation of the coordinates of the map after the direction correction is described in detail.
  • The flow of the multi-source data mapping method in this embodiment is shown in FIG. 4.
  • Specifically, this embodiment includes steps 201 to 209, where steps 201 to 205 are substantially the same as steps 101 to 105 in the first embodiment, and step 209 is substantially the same as step 107 in the first embodiment; these are not repeated here.
  • The following mainly introduces the differences; technical details not described in detail in this embodiment can be found in the multi-source data mapping method provided in the first embodiment and are not repeated here.
  • After steps 201 to 205, step 206 is performed.
  • In step 206, the coordinates corresponding to the key frames of the map are determined.
  • It should be noted that, after the direction of the map has been corrected, the coordinates of each key frame in the map data are known, so the coordinates of each key frame in the map can be collected. Because the correction is performed according to the direction sensor data corresponding to the key frames, and the direction sensor data is a known quantity, the angle corresponding to the direction of the corrected map is also determined.
  • In step 207, the coordinates of the direction sensor corresponding to the serial number of the direction sensor data are calculated from the coordinates corresponding to the key frames.
  • Specifically, once the coordinates corresponding to the key frames in the map data and the map angle determined by the direction correction are known, the coordinates of the sensor corresponding to the serial number of the direction sensor data can be calculated.
  • Specifically, in the interruption case, the coordinates of the sensor are calculated using the following formula (1) and formula (2):
    x' = x + d*cos(θ')  (1)
    y' = y + d*sin(θ')  (2)
    where (x, y) represents the coordinates corresponding to the last key frame in the map, (x', y') represents the coordinates corresponding to the direction sensor data, d represents the distance between two consecutive frames of sensor data, and θ' represents the angle of the map after the direction correction.
  • In a specific implementation, suppose a path is 30 meters long and mapping is interrupted along the way, producing two map segments: the first map segment covers 10 meters, 10 meters are lost in the middle, and the second map segment covers 10 meters. The lost 10-meter stretch can therefore be localized and connected using the direction sensor data. Suppose the coordinates corresponding to the last key frame of the first map segment are (x, y); over the lost 10 meters the direction sensor moves through n distances, the distances between consecutive frames of sensor data are d'_0, d'_1 ... d'_n, and the angles of the corrected map at each distance are θ'_0, θ'_1 ... θ'_n. The coordinates corresponding to the direction sensor data at each distance can then be determined. The coordinates of the direction sensor at the first moving distance d'_0 are calculated using the following formula (3) and formula (4):
    x'_0 = x + d'_0*cos(θ'_0)  (3)
    y'_0 = y + d'_0*sin(θ'_0)  (4)
    where (x, y) represents the coordinates corresponding to the last key frame in the first map segment, (x'_0, y'_0) represents the coordinates corresponding to the direction sensor data at the first moving distance, d'_0 represents the first moving distance, and θ'_0 represents the angle of the corrected map at the first moving distance.
  • By analogy, the coordinates of the direction sensor at the n-th moving distance d'_n are calculated using the following formula (5) and formula (6):
    x'_n = x'_{n-1} + d'_n*cos(θ'_n)  (5)
    y'_n = y'_{n-1} + d'_n*sin(θ'_n)  (6)
    where (x'_{n-1}, y'_{n-1}) represents the coordinates corresponding to the direction sensor data at the (n-1)-th moving distance, (x'_n, y'_n) represents the coordinates corresponding to the direction sensor data at the n-th moving distance, d'_n represents the n-th moving distance, and θ'_n represents the angle of the corrected map at the n-th moving distance.
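  • The chained update of formulas (1) to (6) can be written as the short sketch below; it is only an illustration under the assumption that the per-step distances d'_0 ... d'_n and corrected angles θ'_0 ... θ'_n are already available, and it accumulates them from the last key frame of the first map segment.
```python
import math
from typing import List, Tuple

def chain_sensor_coordinates(x: float, y: float,
                             distances: List[float],
                             angles: List[float]) -> List[Tuple[float, float]]:
    """Formulas (1)-(6): starting from the last key frame (x, y) of the first
    map segment, accumulate each moving distance d'_k along the corrected
    angle theta'_k to obtain the sensor coordinates over the lost stretch."""
    coords = []
    for d, theta in zip(distances, angles):
        x = x + d * math.cos(theta)   # x'_k = x'_{k-1} + d'_k * cos(theta'_k)
        y = y + d * math.sin(theta)   # y'_k = y'_{k-1} + d'_k * sin(theta'_k)
        coords.append((x, y))
    return coords
```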
  • In step 208, the coordinates of the map are determined according to the coordinates of the sensor.
  • Since the coordinates of the sensor are the same as the coordinates of the map, once the coordinates of the sensor have been determined, the coordinates of the map can be determined from them, making the determined map information more accurate.
  • It should be noted that, when there is no loop closure, the map has a deviation in scale. In this case, the coordinates of the map can be determined using formulas (7), (8), and (9):
    x_w = x_n + ε*d*cos(θ')  (7)
    y_w = y_n + ε*d*sin(θ')  (8)
    ε = d_t / l  (9)
    where (x_n, y_n) represents the coordinates corresponding to the previous key frame in the map, (x_w, y_w) represents the coordinates of the corrected map, d represents the distance between two adjacent key frames in the map, ε represents the distance scale factor, d_t represents the moving distance measured by the sensor, and l represents the moving distance measured by vision.
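  • Formulas (7) to (9) translate directly into the following sketch; the function signature and the way the previous key-frame coordinates are passed in are illustrative assumptions rather than part of the patent.
```python
import math
from typing import Tuple

def scale_corrected_coordinates(x_prev: float, y_prev: float,
                                d: float, theta: float,
                                sensor_distance: float,
                                visual_distance: float) -> Tuple[float, float]:
    """Formulas (7)-(9): when no loop closure exists, rescale the step between
    two adjacent key frames by epsilon = d_t / l before accumulating it."""
    epsilon = sensor_distance / visual_distance      # formula (9)
    x_w = x_prev + epsilon * d * math.cos(theta)     # formula (7)
    y_w = y_prev + epsilon * d * math.sin(theta)     # formula (8)
    return x_w, y_w
```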
  • the third embodiment of the present application relates to a multi-source data mapping device, and a specific structure thereof is shown in FIG. 5.
  • As shown in FIG. 5, the multi-source data mapping device includes a collection module 501, a map establishment module 502, a judgment module 503, a determination module 504, a correction module 505, a calculation module 506, and a final map determining module 507.
  • the acquisition module 501 is configured to acquire image data and direction sensor data
  • a map building module 502 configured to build a map according to the image data
  • the judgment module 503 is configured to judge whether there is a deviation in the map.
  • a determining module 504 configured to determine direction sensor data corresponding to a key frame of a map
  • a correction module 505 configured to correct the direction of the map according to the direction sensor data corresponding to the key frame;
  • the calculation module 506 is configured to calculate coordinates of the map after the direction correction.
  • the final map determining module 507 is configured to determine that the established map is a final map.
  • this embodiment is a device example corresponding to the first embodiment, and this embodiment can be implemented in cooperation with the first embodiment. Relevant technical details mentioned in the first embodiment are still valid in this embodiment, and in order to reduce repetition, details are not repeated here. Accordingly, the related technical details mentioned in this embodiment can also be applied in the first embodiment.
  • The fourth embodiment of the present application relates to a multi-source data mapping device. This embodiment is substantially the same as the third embodiment, and its specific structure is shown in FIG. 6. The main improvement is that this fourth embodiment specifically describes the structure of the calculation module 506 in the third embodiment.
  • the calculation module 506 includes a first determination sub-module 5061, a calculation sub-module 5062, and a second determination sub-module 5063.
  • the first determining submodule 5061 is configured to determine coordinates corresponding to key frames of the map.
  • the calculation submodule 5062 is configured to calculate the coordinates of the sensor corresponding to the serial number of the direction sensor data according to the coordinates corresponding to the key frame.
  • the second determining sub-module 5063 is configured to determine the coordinates of the map according to the coordinates of the sensors.
  • this embodiment is a device example corresponding to the second embodiment, and this embodiment can be implemented in cooperation with the second embodiment. Relevant technical details mentioned in the second embodiment are still valid in this embodiment, and in order to reduce repetition, details are not repeated here. Correspondingly, the related technical details mentioned in this embodiment can also be applied in the second embodiment.
  • a fifth embodiment of the present application relates to an electronic device, and a specific structure thereof is shown in FIG. 7.
  • the memory 702 stores instructions executable by the at least one processor 701, and the instructions are executed by the at least one processor 701, so that the at least one processor 701 can execute a multi-source data mapping method.
  • In this embodiment, the processor 701 is exemplified by a central processing unit (CPU), and the memory 702 is exemplified by a random access memory (RAM).
  • the processor 701 and the memory 702 may be connected through a bus or in other manners. In FIG. 7, connection through a bus is taken as an example.
  • The memory 702, as a non-volatile computer-readable storage medium, can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules; for example, the program implementing the method of the embodiments of the present application is stored in the memory 702.
  • The processor 701 executes the various functional applications and data processing of the device by running the non-volatile software programs, instructions, and modules stored in the memory 702, thereby implementing the above multi-source data mapping method.
  • the memory 702 may include a storage program area and a storage data area, where the storage program area may store an operating system and an application program required for at least one function; the storage data area may store a list of options and the like.
  • the memory may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
  • the memory 702 may optionally include a memory remotely disposed with respect to the processor 701, and these remote memories may be connected to an external device through a network. Examples of the above network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • One or more program modules are stored in the memory 702, and when executed by one or more processors 701, the multi-source data mapping method in any of the foregoing method embodiments is executed.
  • An eighth embodiment of the present application relates to a computer-readable storage medium.
  • a computer program is stored in the computer-readable storage medium.
  • When the computer program is executed by a processor, it can implement the multi-source data mapping method involved in any method embodiment of the present application.
  • Those skilled in the art can understand that all or part of the steps in the methods of the above embodiments can be completed by a program instructing the relevant hardware; the program is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the method described in each embodiment of the present application.
  • The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Navigation (AREA)
  • Image Analysis (AREA)

Abstract

The present application relates to the field of computer vision technology and discloses a multi-source data mapping method, a related device, and a computer-readable storage medium. The method is applied to a terminal or a cloud and includes the following steps: collecting image data and direction sensor data; building a map from the image data; if the map is determined to have a deviation, determining the direction sensor data corresponding to the key frames of the map; correcting the direction of the map according to the direction sensor data corresponding to the key frames; and calculating the coordinates of the map after the direction correction. A map is built from the collected image data and direction sensor data, and when the built map is determined to have a deviation, the direction sensor data corresponding to the key frames of the map is determined; the direction of the map is corrected with that direction sensor data and the coordinates of the map are determined, so that the built map is more accurate and better matches the actual space.

Description

Multi-source data mapping method, related device and computer-readable storage medium

Technical Field
The present application relates to the field of computer vision technology, and in particular to a multi-source data mapping method, a related device, and a computer-readable storage medium.
Background Art
Simultaneous localization and mapping (SLAM) means that an intelligent device such as a robot starts moving from an unknown position in an unknown environment, localizes itself during the movement based on position estimates and the map, and at the same time builds an incremental map on the basis of its own localization, thereby realizing autonomous positioning and navigation of the robot. At present, collecting image data with a camera for mapping and localization, i.e., visual simultaneous localization and mapping (vSLAM), is widely used in fields such as autonomous mobile robots and intelligent driving.
Technical Problem
The inventor discovered, in the course of studying the prior art, that maps built with vSLAM in the prior art are often inaccurate because of the influence of the surrounding environment or the algorithm itself. For example, when the feature points in the environment collected by the camera are sparse, map construction is interrupted, and when mapping restarts, the orientation is re-initialized, which affects the accuracy of the mapping. In addition, when no loop closure occurs during mapping, the map data obtained from the mapping accumulates a certain error, which also affects the accuracy of the mapping and makes the built map differ from the actual space.
Technical Solution
A technical problem to be solved by some embodiments of the present application is to provide a multi-source data mapping method, a related device, and a computer-readable storage medium to solve the above technical problem.
An embodiment of the present application provides a multi-source data mapping method, including: collecting image data and direction sensor data; building a map from the image data; if the map is determined to have a deviation, determining the direction sensor data corresponding to the key frames of the map; correcting the direction of the map according to the direction sensor data corresponding to the key frames; and calculating the coordinates of the map after the direction correction.
An embodiment of the present application further provides a multi-source data mapping device, including: a collection module configured to collect image data and direction sensor data; a map establishment module configured to build a map from the image data; a determination module configured to determine, if the map is determined to have a deviation, the direction sensor data corresponding to the key frames of the map; a correction module configured to correct the direction of the map according to the direction sensor data corresponding to the key frames; and a calculation module configured to calculate the coordinates of the map after the direction correction.
An embodiment of the present application further provides an electronic device, including: at least one processor; and a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor so that the at least one processor can execute the multi-source data mapping method involved in any method embodiment of the present application.
An embodiment of the present application further provides a computer-readable storage medium storing computer instructions, where the computer instructions are used to cause a computer to execute the multi-source data mapping method involved in any method embodiment of the present application.
Beneficial Effects
Compared with the prior art, the embodiments of the present application build a map from collected image data and direction sensor data and, when the built map is determined to have a deviation, determine the direction sensor data corresponding to the key frames of the map, correct the direction of the map with that direction sensor data, and determine the coordinates of the map, so that the built map is more accurate and better matches the actual space.
Brief Description of the Drawings
One or more embodiments are illustrated by the figures in the corresponding drawings; these illustrations do not constitute a limitation on the embodiments. Elements with the same reference numerals in the drawings denote similar elements, and unless otherwise stated, the figures in the drawings are not drawn to scale.
FIG. 1 is a flowchart of the multi-source data mapping method in the first embodiment of the present application;
FIG. 2 is a schematic diagram of the relationship between direction sensor data and key frames in the first embodiment of the present application;
FIG. 3 is a schematic diagram of the relationship between direction sensor data and key frames in the interruption case in the first embodiment of the present application;
FIG. 4 is a flowchart of the multi-source data mapping method in the second embodiment of the present application;
FIG. 5 is a schematic block diagram of the multi-source data mapping device in the third embodiment of the present application;
FIG. 6 is a schematic block diagram of the multi-source data mapping device in the fourth embodiment of the present application;
FIG. 7 is a structural example diagram of the electronic device in the fifth embodiment of the present application.
Embodiments of the Invention
In order to make the objectives, technical solutions, and advantages of the present application clearer, some embodiments of the present application are described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present application and are not intended to limit it.
The first embodiment of the present application relates to a multi-source data mapping method, which is applied to a terminal or a cloud. The terminal may be a device such as a guide helmet for the blind, an intelligent robot, or an unmanned vehicle. The cloud communicates with the terminal and provides the terminal with a map for positioning, or directly provides the terminal with positioning results. This embodiment takes the terminal as an example to describe the execution of the multi-source data mapping method; for the process of the cloud executing the multi-source data mapping method, reference may be made to the content of this embodiment of the present application. The specific flow of the multi-source data mapping method is shown in FIG. 1 and includes the following steps:
In step 101, image data and direction sensor data are collected.
Specifically, the image data includes each collected frame of image information and the time stamp information corresponding to each frame of image information, and the direction sensor data includes direction information and the time stamp information corresponding to the direction information. The time stamp information corresponding to each frame of image represents the time at which that frame was acquired, and the time stamp information corresponding to the direction information represents the time at which that direction information was acquired.
It should be noted that the sensor in this embodiment includes an odometer or an inertial measurement unit (IMU). When the sensor is an odometer, the direction information is the physical output and the Euler angle; when the sensor is an IMU, the direction information is the angular velocity and acceleration.
In step 102, a map is established based on the image data.
Specifically, in this embodiment, mapping is performed based on the vSLAM method. When a robot starts to move from an unknown position in an unknown environment, it acquires the image data captured by the camera device, performs feature extraction on each frame of image information in the image data, constructs a map of the surrounding environment from the extracted features and the time stamp information corresponding to each frame of image information, and positions itself according to the constructed map. Since map construction based on the vSLAM method is a relatively mature technology, it is not described in detail in this embodiment.
The map built from the image data is a visual map, and the built visual map includes multiple key frames, as well as the time stamp information, physical output, and Euler angle information corresponding to the key frames. The physical output represents the x and y coordinate values of the built visual map in spatial coordinates, and the Euler angle represents the direction of the built visual map.
In step 103, it is determined whether there is a deviation in the map. If so, step 104 is performed; otherwise step 107 is performed.
Specifically, during mapping, cumulative error tends to build up with the motion of the camera as each frame of image is processed. The most effective way to eliminate this error is to find a loop closure, and when a loop closure exists the built map is accurate, so there is no deviation. The vSLAM-based mapping process provides a graphical user interface (GUI), and the smart device determines whether a loop closure exists by reading the GUI information. In addition, during mapping, if the feature points contained in each frame of image information in the collected image data are sparse and the information is insufficient to build the map, mapping is interrupted; re-initialization is then performed to build a new visual map, and because of the re-initialization the direction of the visual map becomes discontinuous and path information is missing. Therefore, a map built with a mapping interruption also has a deviation. Moreover, when mapping is interrupted, multiple sets of map files appear, so the number of map files obtained can also be used to determine whether a deviation exists.
In step 104, the direction sensor data corresponding to the key frames of the map is determined.
Specifically, in this embodiment, the direction sensor data corresponding to the key frames of the map may be determined by time stamp alignment. This specifically includes: matching the time stamp corresponding to a key frame with the time stamps in the direction sensor data, obtaining the serial number of the direction sensor data that matches the time stamp corresponding to the key frame, and determining the direction sensor data corresponding to that serial number.
It should be noted that the amount of direction sensor data obtained by the smart device per unit time is far greater than the number of key frames in the map. FIG. 2 is a schematic diagram of the relationship between the direction sensor data and the key frames, where the long line segments represent key frames and the short line segments represent direction sensor data. Because a time stamp represents the time at which the data was acquired, matching the time stamp corresponding to a key frame against the time stamps in the direction sensor data yields the serial number of the direction sensor data corresponding to the moment at which the key frame of the map was obtained. Since the direction sensor data corresponding to that serial number is collected continuously by the smart device while it travels along a route and stored in a database, the direction sensor data corresponding to the serial number can be obtained by looking up the database.
For example, if the time represented by the time stamp information corresponding to the first key frame is 3:15, the serial number of the direction sensor data corresponding to 3:15 is looked up in the direction sensor data; if that serial number is determined to be 10, the direction sensor data with serial number 10 is retrieved from the database.
It should be noted that, when mapping is interrupted, the start key frame and the end key frame of each of the multiple map segments may also be determined. FIG. 3 is a schematic diagram of the relationship between the direction sensor data and the key frames in the interruption case, where Map(n) represents the number of the n-th map segment. The way the direction sensor data corresponding to the key frames of the map is determined in the interruption case is similar to the above process and is therefore not repeated here.
In step 105, the direction of the map is corrected according to the direction sensor data corresponding to the key frames.
Specifically, the direction information of the direction sensor data is determined from the determined serial number of the direction sensor data, the direction information in the key frame is replaced with the direction information of the determined direction sensor data, and the direction of the corrected map is determined from the direction information in the replaced key frame.
The key frames of the map include the direction information of the map, for example the Euler angle. However, because the built map has a deviation, the direction information of the map included in the key frames is not accurate. The direction sensor collects data in real time throughout the mapping process, and the direction information in the direction sensor data is accurate. Therefore, after the direction information of the direction sensor data is determined from its serial number, the accurate direction information of the direction sensor data replaces the direction information in the key frame, thereby realizing the direction correction of the map.
In step 106, the coordinates of the map after the direction correction are calculated.
Specifically, after the direction of the map has been corrected, the coordinates of the corrected map can be determined from the coordinates of the key frames in the map and the coordinates of the sensor, so as to improve the accuracy and stability of the mapping.
In step 107, the built map is determined to be the final map.
When it is determined that there is no deviation in the built map, the map built from the map data is accurate and a loop closure exists; in this case, the built map is directly determined to be the final map.
Compared with the prior art, the multi-source data mapping method of this embodiment builds a map from collected image data and direction sensor data; when the built map is determined to have a deviation, the direction sensor data corresponding to the key frames of the map is determined, the direction of the map is corrected with that direction sensor data, and the coordinates of the map are determined, so that the built map is more accurate and better matches the actual space.
The second embodiment of the present application relates to a multi-source data mapping method. This embodiment is a further improvement on the first embodiment; the specific improvement is that the calculation of the coordinates of the map after the direction correction is described in detail. The flow of the multi-source data mapping method in this embodiment is shown in FIG. 4.
Specifically, this embodiment includes steps 201 to 209, where steps 201 to 205 are substantially the same as steps 101 to 105 in the first embodiment and step 209 is substantially the same as step 107 in the first embodiment; these are not repeated here. The following mainly introduces the differences; technical details not described in detail in this embodiment can be found in the multi-source data mapping method provided in the first embodiment and are not repeated here.
After steps 201 to 205, step 206 is performed.
In step 206, the coordinates corresponding to the key frames of the map are determined.
It should be noted that, after the direction of the map has been corrected, the coordinates of each key frame in the map data are known, so the coordinates of each key frame in the map can be collected. Because the correction is performed according to the direction sensor data corresponding to the key frames, and the direction sensor data is a known quantity, the angle corresponding to the direction of the corrected map is determined.
In step 207, the coordinates of the direction sensor corresponding to the serial number of the direction sensor data are calculated from the coordinates corresponding to the key frames.
Specifically, once the coordinates corresponding to the key frames in the map data and the map angle determined by the direction correction are known, the coordinates of the sensor corresponding to the serial number of the direction sensor data can be calculated.
Specifically, in the interruption case, the coordinates of the sensor are calculated using the following formula (1) and formula (2):
x' = x + d*cos(θ')  (1)
y' = y + d*sin(θ')  (2)
where (x, y) represents the coordinates corresponding to the last key frame in the map, (x', y') represents the coordinates corresponding to the direction sensor data, d represents the distance between two consecutive frames of sensor data, and θ' represents the angle of the map after the direction correction.
In a specific implementation, suppose a path is 30 meters long and mapping is interrupted along the way, producing two map segments: the first map segment covers 10 meters, 10 meters are lost in the middle, and the second map segment covers 10 meters. The lost 10-meter stretch can therefore be localized and connected using the direction sensor data. Suppose the coordinates corresponding to the last key frame of the first map segment are (x, y); over the lost 10 meters the direction sensor moves through n distances, the distances between consecutive frames of sensor data are d'_0, d'_1 ... d'_n, and the angles of the corrected map at each distance are θ'_0, θ'_1 ... θ'_n. The coordinates corresponding to the direction sensor data at each distance can then be determined. The coordinates of the direction sensor at the first moving distance d'_0 are calculated using the following formula (3) and formula (4):
x'_0 = x + d'_0*cos(θ'_0)  (3)
y'_0 = y + d'_0*sin(θ'_0)  (4)
where (x, y) represents the coordinates corresponding to the last key frame in the first map segment, (x'_0, y'_0) represents the coordinates corresponding to the direction sensor data at the first moving distance, d'_0 represents the first moving distance, and θ'_0 represents the angle of the corrected map at the first moving distance.
By analogy, the coordinates of the direction sensor at the n-th moving distance d'_n are calculated using the following formula (5) and formula (6):
x'_n = x'_{n-1} + d'_n*cos(θ'_n)  (5)
y'_n = y'_{n-1} + d'_n*sin(θ'_n)  (6)
where (x'_{n-1}, y'_{n-1}) represents the coordinates corresponding to the direction sensor data at the (n-1)-th moving distance, (x'_n, y'_n) represents the coordinates corresponding to the direction sensor data at the n-th moving distance, d'_n represents the n-th moving distance, and θ'_n represents the angle of the corrected map at the n-th moving distance.
In step 208, the coordinates of the map are determined according to the coordinates of the sensor.
Since the coordinates of the sensor are the same as the coordinates of the map, once the coordinates of the sensor have been determined, the coordinates of the map can be determined from them, making the determined map information more accurate.
It should be noted that, when there is no loop closure, the map has a deviation in scale. In this case, the coordinates of the map can be determined using formulas (7), (8), and (9):
x_w = x_n + ε*d*cos(θ')  (7)
y_w = y_n + ε*d*sin(θ')  (8)
ε = d_t / l  (9)
where (x_n, y_n) represents the coordinates corresponding to the previous key frame in the map, (x_w, y_w) represents the coordinates of the corrected map, d represents the distance between two adjacent key frames in the map, ε represents the distance scale factor, d_t represents the moving distance measured by the sensor, and l represents the moving distance measured by vision.
The third embodiment of the present application relates to a multi-source data mapping device, the specific structure of which is shown in FIG. 5.
As shown in FIG. 5, the multi-source data mapping device includes a collection module 501, a map establishment module 502, a judgment module 503, a determination module 504, a correction module 505, a calculation module 506, and a final map determination module 507.
The collection module 501 is configured to collect image data and direction sensor data;
the map establishment module 502 is configured to build a map from the image data;
the judgment module 503 is configured to judge whether there is a deviation in the map;
the determination module 504 is configured to determine the direction sensor data corresponding to the key frames of the map;
the correction module 505 is configured to correct the direction of the map according to the direction sensor data corresponding to the key frames;
the calculation module 506 is configured to calculate the coordinates of the map after the direction correction;
the final map determination module 507 is configured to determine that the built map is the final map.
It is not difficult to see that this embodiment is a device embodiment corresponding to the first embodiment and can be implemented in cooperation with the first embodiment. The relevant technical details mentioned in the first embodiment are still valid in this embodiment and, to reduce repetition, are not repeated here. Correspondingly, the relevant technical details mentioned in this embodiment can also be applied in the first embodiment.
The fourth embodiment of the present application relates to a multi-source data mapping device. This embodiment is substantially the same as the third embodiment, and its specific structure is shown in FIG. 6. The main improvement is that this fourth embodiment specifically describes the structure of the calculation module 506 in the third embodiment.
The calculation module 506 includes a first determination sub-module 5061, a calculation sub-module 5062, and a second determination sub-module 5063.
The first determination sub-module 5061 is configured to determine the coordinates corresponding to the key frames of the map.
The calculation sub-module 5062 is configured to calculate, from the coordinates corresponding to the key frames, the coordinates of the sensor corresponding to the serial number of the direction sensor data.
The second determination sub-module 5063 is configured to determine the coordinates of the map according to the coordinates of the sensor.
It is not difficult to see that this embodiment is a device embodiment corresponding to the second embodiment and can be implemented in cooperation with the second embodiment. The relevant technical details mentioned in the second embodiment are still valid in this embodiment and, to reduce repetition, are not repeated here. Correspondingly, the relevant technical details mentioned in this embodiment can also be applied in the second embodiment.
The fifth embodiment of the present application relates to an electronic device, the specific structure of which is shown in FIG. 7. It includes at least one processor 701 and a memory 702 communicatively connected to the at least one processor 701. The memory 702 stores instructions executable by the at least one processor 701, and the instructions are executed by the at least one processor 701 so that the at least one processor 701 can execute the multi-source data mapping method.
In this embodiment, the processor 701 is exemplified by a central processing unit (CPU), and the memory 702 by a random access memory (RAM). The processor 701 and the memory 702 may be connected through a bus or in other ways; in FIG. 7, connection through a bus is taken as an example. The memory 702, as a non-volatile computer-readable storage medium, can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules; for example, the program implementing the method of the embodiments of the present application is stored in the memory 702. The processor 701 executes the various functional applications and data processing of the device by running the non-volatile software programs, instructions, and modules stored in the memory 702, thereby implementing the above multi-source data mapping method.
The memory 702 may include a program storage area and a data storage area, where the program storage area may store an operating system and an application program required by at least one function, and the data storage area may store a list of options and the like. In addition, the memory may include a high-speed random access memory and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. In some embodiments, the memory 702 may optionally include memory disposed remotely from the processor 701, and such remote memory may be connected to the external device through a network. Examples of such networks include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
One or more program modules are stored in the memory 702 and, when executed by the one or more processors 701, perform the multi-source data mapping method in any of the above method embodiments.
The above product can execute the method provided by the embodiments of the present application and has the functional modules and beneficial effects corresponding to executing the method; technical details not described in detail in this embodiment can be found in the method provided by the embodiments of the present application.
The eighth embodiment of the present application relates to a computer-readable storage medium, which stores a computer program that, when executed by a processor, can implement the multi-source data mapping method involved in any method embodiment of the present application.
Those skilled in the art can understand that all or part of the steps in the methods of the above embodiments can be completed by a program instructing the relevant hardware; the program is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the method described in each embodiment of the present application. The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Those of ordinary skill in the art can understand that the above embodiments are specific embodiments for implementing the present application, and that in practical applications various changes may be made to them in form and detail without departing from the spirit and scope of the present application.

Claims (10)

  1. A multi-source data mapping method, comprising:
    collecting image data and direction sensor data;
    building a map from the image data;
    if the map is determined to have a deviation, determining the direction sensor data corresponding to the key frames of the map;
    correcting the direction of the map according to the direction sensor data corresponding to the key frames; and
    calculating the coordinates of the map after the direction correction.
  2. The multi-source data mapping method according to claim 1, wherein the image data comprises each frame of image information and the time stamp information corresponding to each frame of image information, and
    the direction sensor data comprises direction information and the time stamp information corresponding to the direction information.
  3. The multi-source data mapping method according to any one of claims 1 to 2, wherein the determining the direction sensor data corresponding to the key frames of the map specifically comprises:
    determining the direction sensor data corresponding to the key frames of the map by time stamp alignment.
  4. The multi-source data mapping method according to claim 3, wherein the determining, by time stamp alignment, the direction sensor data corresponding to the key frames of the map specifically comprises:
    matching the time stamp corresponding to the key frame with the time stamps in the direction sensor data;
    obtaining the serial number of the direction sensor data that matches the time stamp corresponding to the key frame; and
    determining the direction sensor data corresponding to the serial number of the direction sensor data.
  5. The multi-source data mapping method according to any one of claims 3 to 4, wherein the key frames comprise:
    a start key frame of the map and an end key frame of the map.
  6. The multi-source data mapping method according to claim 5, wherein the correcting the direction of the map according to the direction sensor data corresponding to the key frames specifically comprises:
    determining the direction information of the direction sensor data according to the determined serial number of the direction sensor data;
    replacing the direction information in the key frame with the determined direction information of the direction sensor data; and
    determining the direction of the corrected map according to the direction information in the replaced key frame.
  7. The multi-source data mapping method according to claim 6, wherein the calculating the coordinates of the map after the direction correction specifically comprises:
    determining the coordinates corresponding to the key frames of the map;
    calculating, according to the coordinates corresponding to the key frames, the coordinates of the sensor corresponding to the serial number of the direction sensor data; and
    determining the coordinates of the map according to the coordinates of the sensor.
  8. A multi-source data mapping device, comprising:
    a collection module configured to collect image data and direction sensor data;
    a map establishment module configured to build a map from the image data;
    a determination module configured to determine, if the map is determined to have a deviation, the direction sensor data corresponding to the key frames of the map;
    a correction module configured to correct the direction of the map according to the direction sensor data corresponding to the key frames; and
    a calculation module configured to calculate the coordinates of the map after the direction correction.
  9. An electronic device, comprising:
    at least one processor; and
    a memory communicatively connected to the at least one processor; wherein
    the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor so that the at least one processor can execute the multi-source data mapping method according to any one of claims 1 to 7.
  10. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the multi-source data mapping method according to any one of claims 1 to 7.
PCT/CN2018/096658 2018-07-23 2018-07-23 多源数据建图方法、相关装置及计算机可读存储介质 WO2020019116A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2018/096658 WO2020019116A1 (zh) 2018-07-23 2018-07-23 多源数据建图方法、相关装置及计算机可读存储介质
CN201880001179.9A CN109074407A (zh) 2018-07-23 2018-07-23 多源数据建图方法、相关装置及计算机可读存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/096658 WO2020019116A1 (zh) 2018-07-23 2018-07-23 多源数据建图方法、相关装置及计算机可读存储介质

Publications (1)

Publication Number Publication Date
WO2020019116A1 true WO2020019116A1 (zh) 2020-01-30

Family

ID=64789271

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/096658 WO2020019116A1 (zh) 2018-07-23 2018-07-23 多源数据建图方法、相关装置及计算机可读存储介质

Country Status (2)

Country Link
CN (1) CN109074407A (zh)
WO (1) WO2020019116A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110487264B (zh) * 2019-09-02 2020-08-07 上海图聚智能科技股份有限公司 修正地图的方法、装置、电子设备、及存储介质
CN111352425B (zh) * 2020-03-16 2024-02-09 北京猎户星空科技有限公司 一种导航系统、方法、装置、电子设备及介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9459104B2 (en) * 2014-07-28 2016-10-04 Google Inc. Systems and methods for performing a multi-step process for map generation or device localizing
CN106446815A (zh) * 2016-09-14 2017-02-22 浙江大学 一种同时定位与地图构建方法
CN107004028A (zh) * 2014-12-19 2017-08-01 高通股份有限公司 可缩放3d地图绘制系统
CN108051002A (zh) * 2017-12-04 2018-05-18 上海文什数据科技有限公司 基于惯性测量辅助视觉的运输车空间定位方法及系统

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140323148A1 (en) * 2013-04-30 2014-10-30 Qualcomm Incorporated Wide area localization from slam maps
US20150092048A1 (en) * 2013-09-27 2015-04-02 Qualcomm Incorporated Off-Target Tracking Using Feature Aiding in the Context of Inertial Navigation
CN107193279A (zh) * 2017-05-09 2017-09-22 复旦大学 基于单目视觉和imu信息的机器人定位与地图构建系统
CN107862720B (zh) * 2017-11-24 2020-05-22 北京华捷艾米科技有限公司 基于多地图融合的位姿优化方法及位姿优化系统
CN108280442B (zh) * 2018-02-10 2020-07-28 西安交通大学 一种基于轨迹匹配的多源目标融合方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9459104B2 (en) * 2014-07-28 2016-10-04 Google Inc. Systems and methods for performing a multi-step process for map generation or device localizing
CN107004028A (zh) * 2014-12-19 2017-08-01 高通股份有限公司 可缩放3d地图绘制系统
CN106446815A (zh) * 2016-09-14 2017-02-22 浙江大学 一种同时定位与地图构建方法
CN108051002A (zh) * 2017-12-04 2018-05-18 上海文什数据科技有限公司 基于惯性测量辅助视觉的运输车空间定位方法及系统

Also Published As

Publication number Publication date
CN109074407A (zh) 2018-12-21

Similar Documents

Publication Publication Date Title
WO2021232470A1 (zh) 基于多传感器融合的slam制图方法、系统
US10789771B2 (en) Method and apparatus for fusing point cloud data
CN110084832B (zh) 相机位姿的纠正方法、装置、系统、设备和存储介质
WO2020019115A1 (zh) 融合建图方法、相关装置及计算机可读存储介质
CN112734852B (zh) 一种机器人建图方法、装置及计算设备
WO2020038285A1 (zh) 车道线的定位方法和装置、存储介质、电子装置
US20210158567A1 (en) Visual positioning method and apparatus, electronic device, and system
CN107909614B (zh) 一种gps失效环境下巡检机器人定位方法
US20140316698A1 (en) Observability-constrained vision-aided inertial navigation
CN111750853B (zh) 一种地图建立方法、装置及存储介质
WO2020140431A1 (zh) 相机位姿确定方法、装置、电子设备及存储介质
WO2022193508A1 (zh) 位姿优化方法、装置、电子设备、计算机可读存储介质、计算机程序及程序产品
CN112556685B (zh) 导航路线的显示方法、装置和存储介质及电子设备
WO2018133727A1 (zh) 一种正射影像图的生成方法及装置
CN114323033B (zh) 基于车道线和特征点的定位方法、设备及自动驾驶车辆
CN112183171A (zh) 一种基于视觉信标建立信标地图方法、装置
WO2020063878A1 (zh) 一种处理数据的方法和装置
CN114612348B (zh) 激光点云运动畸变校正方法、装置、电子设备及存储介质
WO2023005457A1 (zh) 位姿计算方法和装置、电子设备、可读存储介质
WO2020019116A1 (zh) 多源数据建图方法、相关装置及计算机可读存储介质
JP2023021994A (ja) 自動運転車両に対するデータ処理方法及び装置、電子機器、記憶媒体、コンピュータプログラム、ならびに自動運転車両
CN116429116A (zh) 一种机器人定位方法及设备
CN112729294B (zh) 适用于机器人的视觉和惯性融合的位姿估计方法及系统
CN114279434A (zh) 一种建图方法、装置、电子设备和存储介质
CN113570716A (zh) 云端三维地图构建方法、系统及设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18927832
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established
    Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14.05.2021)
122 Ep: pct application non-entry in european phase
    Ref document number: 18927832
    Country of ref document: EP
    Kind code of ref document: A1