WO2021139590A1 - Indoor localization and navigation apparatus based on bluetooth and slam, and method therefor - Google Patents


Info

Publication number
WO2021139590A1
Authority
WO
WIPO (PCT)
Prior art keywords
module
bluetooth
indoor positioning
navigation device
time
Prior art date
Application number
PCT/CN2020/141624
Other languages
French (fr)
Chinese (zh)
Inventor
雷鸣
杜珣弤
Original Assignee
三个机器人公司
雷鸣
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三个机器人公司, 雷鸣
Publication of WO2021139590A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/206: Instruments for performing navigational calculations specially adapted for indoor navigation

Definitions

  • This application relates to an indoor positioning and navigation device and method based on Bluetooth and SLAM.
  • SLAM (simultaneous localization and mapping) is a classic problem in robotics: the robot starts moving from an unknown location in an unknown environment, localizes itself during the movement from position estimates and the map, and simultaneously builds an incremental map on the basis of that self-localization, thereby achieving autonomous positioning and navigation of the robot.
  • current indoor positioning and navigation technology achieves positioning and navigation with a variety of physical sensors, such as cameras and IMUs.
  • in a complex indoor environment, these sensors are susceptible to environmental factors, or have low intrinsic accuracy, resulting in low positioning accuracy. Each sensor is affected by different factors: a camera depends on illumination and cannot obtain feature points in the dark, while an IMU is affected by external factors such as man-made dragging and ground slippage, and accumulates error over long periods of operation.
  • the present application discloses an indoor positioning and navigation device, which is built into a movable intelligent platform or robot and includes: a Bluetooth module for obtaining the Mesh network information of the indoor positioning and navigation device, and for obtaining the addresses, attributes, RSSI (Received Signal Strength Indication), IQ (In-phase/Quadrature) data, angles, and arrival times of multiple Bluetooth nodes in the Mesh network.
  • the vision module is used to collect images around the robot, and perform grayscale processing, image correction, and feature point extraction and feature line segment extraction on the image.
  • the odometer module is used to calculate the change of the position of the robot at each time relative to the position of the previous time through relative positioning, and to estimate the position of the robot in real time.
  • the data storage and processing module is used to receive the data information collected and calculated by the Bluetooth module, the vision module and the odometer module.
  • the position fusion and estimation module is used to fuse the data in the data storage and processing module to obtain the real-time position of the indoor positioning and navigation device.
  • the map building module is used to store the real-time location of the indoor positioning and navigation device estimated by the location fusion and estimation module, and the three-dimensional environment map created based on the spatial location information of the Bluetooth nodes and feature points in the data storage and processing module, as well as path planning and
  • the motion control module is used to drive the robot and perform path planning and navigation according to the three-dimensional environment map created by the map building module.
  • this application discloses an indoor positioning and navigation method, including: calibrating a camera in a vision module; acquiring data collected by a Bluetooth module, the vision module, and an odometer module; performing Kalman filtering on the collected data; creating a three-dimensional environment map from the calculated real-time position of the indoor positioning and navigation device, the spatial positions of the Bluetooth nodes, and the feature points; and performing positioning and navigation according to the created three-dimensional environment map.
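The five method steps above can be outlined as a minimal control-loop sketch. Every class and method name here is an illustrative assumption, not an identifier from the patent, and the sensor readings are stubbed with fixed values.

```python
from dataclasses import dataclass, field

@dataclass
class IndoorNavPipeline:
    """Hypothetical skeleton of the claimed method; all names are assumed."""
    calibrated: bool = False
    map_points: list = field(default_factory=list)

    def calibrate_camera(self):
        # Step 1: checkerboard calibration of the vision module's camera (stubbed).
        self.calibrated = True

    def acquire(self):
        # Step 2: stubbed readings from the Bluetooth, vision, and odometer modules.
        return {"bluetooth": [(-60.0,)],
                "features": [(1.0, 2.0, 0.5)],
                "odom": (0.1, 0.0, 0.0)}

    def fuse(self, data):
        # Step 3: sensor fusion (e.g. Kalman filtering) yielding a device pose.
        dx, dy, dtheta = data["odom"]
        return (dx, dy, dtheta)

    def build_map(self, pose, data):
        # Step 4: add feature/Bluetooth-node positions to the 3D environment map.
        self.map_points.extend(data["features"])
        return self.map_points

    def run_once(self):
        # Step 5: positioning and navigation would consume the map built here.
        if not self.calibrated:
            self.calibrate_camera()
        data = self.acquire()
        pose = self.fuse(data)
        return pose, self.build_map(pose, data)

pipeline = IndoorNavPipeline()
pose, world_map = pipeline.run_once()
```

In a real device each stub would be backed by the corresponding module (111-117) described below.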
  • a computer device including a memory and one or more processors, and computer-readable instructions are stored in the memory.
  • when the computer-readable instructions are executed by the one or more processors, the one or more processors execute or implement the indoor positioning and navigation method disclosed in some embodiments of the present application.
  • one or more non-volatile storage media storing computer-readable instructions are provided.
  • when the computer-readable instructions are executed by one or more processors, the one or more processors execute or implement the indoor positioning and navigation method disclosed in some embodiments of the present application.
  • a computer program which, when executed by a processor, implements the indoor positioning and navigation method disclosed in some embodiments of the present application.
  • Fig. 1 is a block diagram of an indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
  • Fig. 2 is a working flow chart of the vision module in the indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
  • Fig. 3 is a flowchart of Kalman filtering of the position fusion and estimation module in the Bluetooth and SLAM indoor positioning and navigation device provided by one or more embodiments of the present application.
  • FIG. 4 is a positioning model diagram of a Bluetooth module in an indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
  • Fig. 5 is a motion model diagram of an odometer module in an indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
  • FIG. 6 is a working flow chart of an indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
  • FIG. 7 is a diagram of the internal structure of a computer device provided by one or more embodiments of the present application.
  • FIG. 1 is a block diagram of an indoor positioning and navigation device 100 based on Bluetooth and SLAM provided by an embodiment of the present application.
  • the indoor positioning and navigation device 100 based on Bluetooth and SLAM includes: a Bluetooth module 111, a vision module 112, an odometer module 113, a data storage and processing module 114, a position fusion and estimation module 115, a map construction module 116, and a path Planning and Motion Control 117.
  • the indoor positioning and navigation device 100 can be built into a sweeping robot or an indoor robot, and is used to implement map construction and precise positioning and navigation of the robot.
  • the Bluetooth module 111 is a set of basic circuits of a chip that integrates Bluetooth functions, which can be used for wireless network communication, and can realize data transmission and audio transmission functions.
  • the Bluetooth module 111 is located in the indoor positioning and navigation device 100 and has intelligent networking and positioning functions, specifically including the realization of networking and positioning of Bluetooth devices in the Bluetooth network.
  • the Bluetooth module 111 obtains the Mesh network where the indoor positioning and navigation device 100 is located, the address, attributes, RSSI (Received Signal Strength Indication, received signal strength indication), IQ (In-phase Quadrature, same direction orthogonality) of the Bluetooth node in the Bluetooth network ) Data, angle (including direction angle and elevation angle), time of arrival and other information.
  • RSSI: Received Signal Strength Indication; IQ: In-phase/Quadrature.
  • the unit of RSSI is dBm, and its value is generally negative; it reflects the distance between two Bluetooth nodes in the Bluetooth network.
  • the address information of the Bluetooth node is an identifier of the Bluetooth node, expressed in bytes; the attributes of the Bluetooth node include: friendly nodes, low-power nodes, relay nodes, standard nodes, and proxy nodes.
  • the Bluetooth module 111 obtains the TOA (Time of Arrival), RSSI, IQ data, and AOA (Angle of Arrival) from surrounding Bluetooth nodes to the robot (which contains the indoor positioning and navigation device 100) through its positioning function.
  • based on TOA, RSSI, phase, and AOA, the Bluetooth module 111 constructs an RSSI ranging model, a TOA ranging model, a phase-difference ranging model, and an AOA positioning model, whose equations are as follows:
  • (x, y, z) represents the spatial position of the indoor positioning and navigation device 100
  • the spatial position of Bluetooth node B_i is (X_bi, Y_bi, Z_bi)
  • θ_i is the angle of arrival between Bluetooth node B_i and the indoor positioning and navigation device 100
  • D_1i is the distance between the indoor positioning and navigation device 100 and Bluetooth node B_i obtained by the RSSI ranging model
  • D_2i is the distance between the indoor positioning and navigation device 100 and Bluetooth node B_i obtained by the TOA ranging model
  • D_3i is the distance between the indoor positioning and navigation device 100 and Bluetooth node B_i obtained by the phase-difference ranging model.
  • FIG. 4 is a positioning model diagram of the Bluetooth module in an indoor positioning and navigation device based on Bluetooth and SLAM provided by an embodiment of the present application. (X_bi, Y_bi, Z_bi) represents the coordinates of node B_i in the Bluetooth network, and B'_i is the projection of Bluetooth node B_i onto a two-dimensional plane.
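As a concrete illustration of how ranged distances D_i and node positions (X_bi, Y_bi, Z_bi) can yield the device position (x, y, z), the sketch below uses a standard log-distance path-loss model for RSSI ranging and linearized least-squares trilateration. The path-loss constants and node layout are assumptions for illustration; the patent's actual model equations are not reproduced in this text.

```python
import numpy as np

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exp=2.0):
    """Log-distance path-loss model (illustrative constants, not from the patent).

    tx_power_dbm: expected RSSI at 1 m; path_loss_exp: environment-dependent
    exponent (about 2 in free space, typically higher indoors).
    """
    return 10.0 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def trilaterate(anchors, distances):
    """Least-squares position from >= 4 node/distance pairs.

    Subtracting the first sphere equation from the others linearizes the
    system, which is then solved with numpy's lstsq.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1)
         - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Example: four ceiling-mounted Bluetooth nodes and a device at (2, 3, 0).
nodes = [(0, 0, 3.0), (5, 0, 3.0), (0, 5, 2.5), (5, 5, 2.8)]
true_pos = np.array([2.0, 3.0, 0.0])
dists = [np.linalg.norm(true_pos - np.array(n)) for n in nodes]
est = trilaterate(nodes, dists)
```

With noise-free distances the linear system recovers the true position exactly; real RSSI ranges would make the least-squares residual nonzero.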
  • the vision module 112 is used to collect images around the robot and perform grayscale processing, image correction, feature point extraction, and feature line segment extraction on the collected images; that is, the images are preprocessed to remove the influence of lens distortion on feature description, and the ORB (Oriented FAST and Rotated BRIEF) feature detection method is used to extract feature points and feature line segments from the images.
  • the wide-angle camera in the vision module 112 is used to collect image information of the environment where the robot is located.
  • a 12×9 black-and-white checkerboard calibration board with 10 mm × 10 mm squares is used to calibrate the wide-angle camera in the vision module 112 of the indoor positioning and navigation device 100.
  • this calibration method avoids the high equipment requirements and cumbersome operation of traditional methods, and achieves higher accuracy than current calibration methods.
  • the vision module 112 obtains the camera internal parameters and distortion parameter information.
  • the camera internal parameters are the horizontal focal length fx, the vertical focal length fy, the principal point abscissa u0, and the principal point ordinate v0.
  • the distortion parameters include, but are not limited to, the radial distortion parameters k1, k2, and k3 and the tangential distortion parameters p1 and p2.
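To show how these parameters are used, the sketch below projects a camera-frame 3D point to pixel coordinates with the standard Brown-Conrady model that fx, fy, u0, v0, k1-k3, p1, and p2 parameterize. This is the generic model recovered by checkerboard calibration, offered as an assumption; the patent does not spell out its projection equations.

```python
def project_point(Xc, Yc, Zc, fx, fy, u0, v0,
                  k1=0.0, k2=0.0, k3=0.0, p1=0.0, p2=0.0):
    """Project a camera-frame 3D point to pixel coordinates (Brown-Conrady)."""
    x, y = Xc / Zc, Yc / Zc                      # normalized image coordinates
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    # Radial then tangential distortion of the normalized coordinates.
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    # Intrinsics map distorted normalized coordinates to pixels.
    return fx * x_d + u0, fy * y_d + v0

# With zero distortion, a point on the optical axis maps to the principal point.
u, v = project_point(0.0, 0.0, 1.0, fx=500.0, fy=500.0, u0=320.0, v0=240.0)
```

Image correction (step 206 below) applies the inverse of this mapping so that extracted features obey the ideal pinhole model.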
  • the odometer module 113 is used to calculate, through relative positioning, the change of the robot's position at each time relative to its position at the previous time, thereby estimating the position in real time; that is, it calculates the robot's X-direction offset Δx, Y-direction offset Δy, and angular offset Δθ about the Z axis at each time relative to the previous time.
  • Figure 5 is a motion model diagram of the odometer module in the Bluetooth and SLAM-based indoor positioning and navigation device provided by an embodiment of the present application.
  • X(k-1) is the state quantity at time k-1
  • X(k) is the state quantity at the next time, that is, the state quantity at time k
  • u(k) is the offset of time k with respect to time k-1
  • the components of u(k) are the X-direction offset Δx, the Y-direction offset Δy, and the angular offset Δθ about the Z axis.
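The accumulation X(k) = f(X(k-1), u(k)) can be sketched as a dead-reckoning update. Treating (Δx, Δy) as expressed in the robot's body frame at time k-1 (and therefore rotated by the current heading) is a common convention assumed here; the patent does not fix the frame.

```python
import math

def odom_update(x, y, theta, dx, dy, dtheta):
    """Accumulate one odometry increment u(k) into the pose X(k-1).

    (dx, dy) are assumed to be body-frame offsets, so they are rotated by the
    heading theta into the world frame before being added.
    """
    x_new = x + dx * math.cos(theta) - dy * math.sin(theta)
    y_new = y + dx * math.sin(theta) + dy * math.cos(theta)
    return x_new, y_new, theta + dtheta

# Drive 1 m forward, turn 90 degrees, then drive 1 m forward again.
pose = (0.0, 0.0, 0.0)
pose = odom_update(*pose, 1.0, 0.0, math.pi / 2)
pose = odom_update(*pose, 1.0, 0.0, 0.0)
```

After the two moves the robot sits near (1, 1) facing along +Y, which is why slippage or dragging at either step would corrupt every later pose without an external correction.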
  • the data storage and processing module 114 is coupled with the Bluetooth module 111, the vision module 112, and the odometer module 113, and is configured to receive data information collected and calculated by the Bluetooth module 111, the vision module 112, and the odometer module 113.
  • these data information also include the wide-angle camera internal parameters and distortion parameters collected by the vision module.
  • the data calculated by the above data information includes the spatial position of the Bluetooth node, the spatial position of the feature point, and the real-time updated intelligent platform or robot position information obtained by the position fusion and estimation module.
  • the location fusion and estimation module 115 is coupled with the data storage and processing module 114 to fuse the data in the data storage and processing module 114.
  • the fusion algorithms include least squares method, LM algorithm, BA algorithm, and intelligent optimization algorithm (such as Genetic algorithm, particle swarm algorithm, ant colony algorithm, etc.), Kalman filter algorithm, etc.
  • this application uses one of these algorithms, taking the Kalman filter as an example, to obtain the real-time position of the indoor positioning and navigation device 100; after Kalman filtering, real-time accurate positioning of the indoor positioning and navigation device 100 can be achieved.
  • this embodiment should not be construed as limiting the application.
  • the map construction module 116 is coupled with the position fusion and estimation module 115, and is used to store the real-time position of the indoor positioning and navigation device 100 estimated by the position fusion and estimation module 115, and the space of Bluetooth nodes and feature points in the data storage and processing module 114 A three-dimensional map of the environment created by the location.
  • the path planning and motion control 117 is coupled with the map construction module 116, and is used for path planning and navigation according to the three-dimensional environment map stored in the map construction module 116.
  • the path planning and motion control 117 is used to drive the robot accordingly, and obtain the current position information of the robot in real time.
  • Each module is optionally implemented as a non-transitory computer-readable medium containing logic, stored instructions, firmware, and/or a combination thereof.
  • a processor may be provided to execute such instructions so that the existing indoor positioning and navigation device executes the method described herein.
  • Fig. 2 is a working flow chart of the vision module in the indoor positioning and navigation device based on Bluetooth and SLAM provided by an embodiment of the present application, and includes the following steps:
  • Step 202 The wide-angle camera in the vision module 112 collects image information around the indoor positioning and navigation device 100.
  • Step 204 The vision module 112 preprocesses the collected images, including grayscale processing of the images.
  • Step 206 The vision module 112 corrects the image distortion.
  • Step 208 Extract the feature point information from the image information using the ORB feature extraction provided by OpenCV. This method is fast and has a degree of robustness to noise and rotation. After processing an image with the ORB feature extraction method, a series of feature point data is obtained, and the feature information is stored in the feature database.
  • the ORB feature points in the image include lights, wall corners, etc.
  • Step 210 Extract a characteristic line segment in the image information.
  • the characteristic line segment is an LSD (Line Segment Detector) line segment, generally an edge of a wall or an edge line of an object with a certain length, such as the edge line of a square lamp.
  • Fig. 3 is a flowchart of the Kalman filtering of the position fusion and estimation module in the Bluetooth and SLAM-based indoor positioning and navigation device provided by an embodiment of the present application. Kalman filtering mainly includes the following steps:
  • Step 302 Sub-module initialization, which refers to defining and initializing the variables and matrices involved in Kalman filtering.
  • Step 304 The position fusion and estimation module establishes a motion equation and an observation equation. Specifically:
  • the posture change U_{k+1} = (Δx(k+1), Δy(k+1), Δθ(k+1)) of the indoor positioning and navigation device 100 at time k+1 is obtained from the data of the odometer module 113.
  • the system state X(k+1) at time k+1 is calculated from the system state X(k) at time k.
  • Q_{k+1} is the motion equation noise at time k+1, determined by the positioning accuracy of the odometer module itself; U_{k+1} is the attitude change at time k+1.
  • the system observation at time k+1 is then estimated, where:
  • B_i represents a Bluetooth node in the Bluetooth network with coordinates (X_bi, Y_bi, Z_bi), expressed relative to the system state quantity X(k+1)
  • the three-dimensional coordinates of the feature point are (X_fi, Y_fi, Z_fi), f is the focal length of the camera, W_{k+1} is the noise of the observation equation at time k+1, and H is the Jacobian matrix of the observation equation with respect to the state quantity.
  • Step 306 Update the covariance equation, the calculation formula is as follows:
  • P(k+1) is the covariance matrix at time k+1
  • R is the observed noise covariance matrix
  • Step 308 Update the gain matrix, the calculation formula is as follows:
  • K is the gain matrix
  • Step 310 Update the state vector, the calculation formula is as follows:
  • X(k+1) is the state vector at time k+1
  • L_{k+1} is the real observation at time k+1
  • f_i(u_i, v_i) is the image coordinate at time k+1 corresponding to the feature point F_i(X_fi, Y_fi, Z_fi). Due to wheel slippage, dragging, or cumulative error, a position obtained from the motion equation alone will be wrong; therefore the indoor positioning and navigation device 100 corrects it using the observation equation formed from other indoor information, such as feature points and Bluetooth nodes, improving the accuracy of the calculation.
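Steps 302-310 amount to one extended-Kalman-filter predict/update cycle. The sketch below implements that cycle with numpy for a planar pose state (x, y, θ); the direct (x, y) observation in the example stands in for the patent's Bluetooth/feature-point observation equation, and all numeric values (and the world-frame motion increment) are simplifying assumptions.

```python
import numpy as np

def ekf_step(X, P, u, z, h, H_jac, Q, R):
    """One predict/update cycle following steps 302-310.

    X, P     : state estimate and covariance at time k
    u, Q     : odometry increment and motion-equation noise covariance
    z, R     : stacked observations at time k+1 and their noise covariance
    h, H_jac : observation function and its Jacobian at the predicted state
    """
    X_pred = X + u                         # motion equation (step 304)
    P_pred = P + Q                         # covariance update (step 306)
    H = H_jac(X_pred)                      # Jacobian of the observation equation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)    # gain matrix (step 308)
    X_new = X_pred + K @ (z - h(X_pred))   # state-vector update (step 310)
    P_new = (np.eye(len(X)) - K @ H) @ P_pred
    return X_new, P_new

# Example: odometry reports a 1 m move in x; a position fix observes (1.05, 0).
h = lambda X: X[:2]                        # observe (x, y) directly
H_jac = lambda X: np.array([[1.0, 0.0, 0.0],
                            [0.0, 1.0, 0.0]])
X, P = np.zeros(3), np.eye(3)
Q, R = 0.01 * np.eye(3), 0.1 * np.eye(2)
X, P = ekf_step(X, P, np.array([1.0, 0.0, 0.0]),
                np.array([1.05, 0.0]), h, H_jac, Q, R)
```

The corrected x lands between the odometry prediction (1.0) and the observation (1.05), weighted by the two noise covariances, which is exactly the drift correction the paragraph above describes.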
  • FIG. 6 is a working flow chart of an indoor positioning and navigation device based on Bluetooth and SLAM provided by an embodiment of the present application.
  • FIG. 6 is described in conjunction with FIG. 1 and specifically includes the following steps:
  • Step 602 Calibrate the camera in the vision module 112.
  • Step 604 Obtain the data collected by the Bluetooth module 111, the vision module 112, and the odometer module 113.
  • Step 606 Fuse the collected data.
  • For Kalman filter fusion, refer to the flowchart of FIG. 3 for the specific filtering steps.
  • Step 608 The map construction module 116 creates a three-dimensional environment map according to the calculated real-time position of the indoor positioning and navigation device, the spatial position of the Bluetooth node, and the feature point.
  • Step 610 The path planning and motion control module 117 performs positioning and navigation according to the created three-dimensional environment map.
  • the Bluetooth and SLAM-based indoor positioning and navigation device and method of the present application detect and track ORB feature points, collect Bluetooth node information and odometer mileage information, and construct a scene map through Kalman filtering for accurate robot positioning, path planning, and navigation, which can greatly improve positioning accuracy. Compared with existing indoor positioning and navigation technology, the method disclosed in the present application not only effectively reduces the influence of environmental factors to improve positioning accuracy, but also provides a more reliable and accurate three-dimensional environment map.
  • the various modules in the aforementioned indoor positioning and navigation device, including the Bluetooth module, vision module, odometer module, data storage and processing module, position fusion and estimation module, map construction module, and path planning and motion control module, can be implemented in whole or in part by software, hardware, or a combination thereof.
  • the above-mentioned modules may be embedded in, or independent of, the processor of the computer device in hardware form, or stored in the memory of the computer device in software form, so that the processor can invoke and execute the operations corresponding to them.
  • a computer device is provided.
  • the computer device may be a movable intelligent platform or a robot, and its internal structure diagram may be as shown in FIG. 7.
  • the computer device includes a processor, a memory, a network interface, an input device, a Bluetooth device, and a driving device connected through a system bus.
  • the processor of the computer device is used to provide calculation and control capabilities.
  • the memory of the computer device includes a non-volatile storage medium and an internal memory.
  • the non-volatile storage medium stores an operating system and computer readable instructions.
  • the internal memory provides an environment for the operation of the operating system and computer-readable instructions in the non-volatile storage medium.
  • the network interface of the computer device is used to communicate with an external terminal or server through a network connection.
  • the input device of the computer device may be a touch layer covering the display screen, a button, trackball, or touchpad set on the housing of the computer device, or an external keyboard, touchpad, or mouse.
  • the Bluetooth device of the computer device can communicate with other devices with Bluetooth communication capabilities, and the motion device of the computer device can be any form of device that can drive the computer device to move.
  • the computer device may also include a display screen, which may be a liquid crystal display screen or an electronic ink display screen.
  • FIG. 7 is only a block diagram of a part of the structure related to the solution of the present application, and does not constitute a limitation on the computer device to which the solution of the present application is applied.
  • a specific computer device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
  • a computer device is provided, including a memory and one or more processors, with computer-readable instructions stored in the memory.
  • when the computer-readable instructions are executed by the one or more processors, the one or more processors execute a method including: calibrating the camera in the vision module; acquiring the data collected by the Bluetooth module, the vision module, and the odometer module; fusing the collected data; creating a three-dimensional environment map from the calculated real-time position of the indoor positioning and navigation device and the spatial positions of the Bluetooth nodes and feature points; and performing positioning and navigation according to the created three-dimensional environment map.
  • one or more computer program products including a storage medium are provided, and the storage medium stores computer-readable instructions.
  • when the computer-readable instructions are executed by one or more processors, the one or more processors implement a method including: calibrating the camera in the vision module; acquiring the data collected by the Bluetooth module, the vision module, and the odometer module; fusing the collected data; creating a three-dimensional environment map from the calculated real-time position of the indoor positioning and navigation device and the spatial positions of the Bluetooth nodes and feature points; and performing positioning and navigation according to the created three-dimensional environment map.
  • one or more non-volatile storage media storing computer-readable instructions are provided.
  • when the computer-readable instructions are executed by one or more processors, the one or more processors execute a method including: calibrating the camera in the vision module; acquiring the data collected by the Bluetooth module, the vision module, and the odometer module; fusing the collected data; creating a three-dimensional environment map from the calculated real-time position of the indoor positioning and navigation device and the spatial positions of the Bluetooth nodes and feature points; and performing positioning and navigation according to the created three-dimensional environment map.
  • one or more computer programs are provided which, when executed by one or more processors, execute a method including: calibrating the camera in the vision module; acquiring the data collected by the Bluetooth module, the vision module, and the odometer module; fusing the collected data; creating a three-dimensional environment map from the calculated real-time position of the indoor positioning and navigation device and the spatial positions of the Bluetooth nodes and feature points; and performing positioning and navigation according to the created three-dimensional environment map.
  • the method executed by the one or more processors is the indoor positioning and navigation method in some embodiments of the present application.
  • Non-volatile memory may include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory may include random access memory (RAM) or external cache memory.
  • RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
  • the indoor positioning and navigation device and method of the present application can construct a scene map from Bluetooth node information, image feature points, feature line segment information, and the real-time position information estimated by the odometer in the Mesh network, and can be used for precise positioning and path planning of robots.
  • multiple positioning sensors can be used for data fusion, which provides strong anti-interference capability, achieves higher-precision positioning, effectively reduces the influence of environmental factors to improve positioning accuracy, and provides a more reliable and accurate three-dimensional environment map, allowing wider extension and application.
  • Bluetooth is not affected by light or slippage, but it can be blocked by obstacles such as walls and tables, resulting in low positioning accuracy. Therefore, in some embodiments, multiple positioning sensors such as Bluetooth, camera, and odometer are combined so that their strengths compensate for each other's weaknesses.
  • the device or method of the present application constructs a scene map using Bluetooth node information, feature points and feature information collected by the vision module, and real-time location information estimated by the odometer module, so as to achieve precise positioning and navigation functions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Navigation (AREA)

Abstract

An indoor localization and navigation apparatus (100), comprising: a Bluetooth module (111), configured to acquire Mesh network information of the indoor localization and navigation apparatus (100), and acquire addresses, attributes, RSSI, IQ data, angles, and arrival times of a plurality of Bluetooth nodes in a Mesh network; a vision module (112), configured to acquire an image around a robot; an odometer module (113), configured to estimate the position of the robot in real time; a data storage and processing module (114), configured to receive data information acquired and calculated by the Bluetooth module (111), the vision module (112), and the odometer module (113); a position integration and estimation module (115), configured to integrate data in the data storage and processing module (114); a map construction module (116), configured to create a three-dimensional environment map according to spatial position information of the Bluetooth nodes and feature points; and a path planning and motion control module (117), configured to drive the robot and perform path planning and navigation according to the created three-dimensional environment map. The localization and navigation apparatus (100) can effectively improve the localization and navigation accuracy.

Description

基于蓝牙与SLAM的室内定位导航装置及其方法Indoor positioning and navigation device and method based on Bluetooth and SLAM 技术领域Technical field
本申请涉及一种基于蓝牙与SLAM的室内定位导航装置及其方法。This application relates to an indoor positioning and navigation device and method based on Bluetooth and SLAM.
背景技术Background technique
随着移动互联网技术、智能移动终端技术和物联网技术的发展,人们对室内定位服务的需求越来越多元化,使得室内定位导航技术成为智能定位服务领域的研究热点。SLAM(simultaneous localization and mapping,同步定位与建图)技术是机器人领域比较经典的问题,通常SLAM问题可以描述为:机器人在未知环境中从一个未知位置开始移动,在移动过程中根据位置估计和地图自身定位,同时在自身定位的基础上建造增量式地图,实现机器人的自主定位和导航。目前室内定位导航技术是通过各种物理传感器来实现定位导航的,如摄像头、IMU等,在室内复杂的环境中易受环境因素的影响或传感器自身精度低而导致定位精度不高的问题。比如,每个传感器受影响的因素都不一样,摄像头是光照,黑暗下摄像头获取不到特征点,IMU是外界因素,如人为拖拽、地面打滑等,长时间工作会存在累计误差。With the development of mobile Internet technology, smart mobile terminal technology and Internet of Things technology, people's demand for indoor positioning services is becoming more and more diversified, making indoor positioning and navigation technology a research hotspot in the field of intelligent positioning services. SLAM (simultaneous localization and mapping) technology is a classic problem in the robotics field. Generally, the SLAM problem can be described as: the robot starts to move from an unknown location in an unknown environment, and is based on the location estimation and map during the movement. Self-positioning, and at the same time, an incremental map is built on the basis of self-positioning to realize autonomous positioning and navigation of the robot. At present, indoor positioning and navigation technology uses a variety of physical sensors to achieve positioning and navigation, such as cameras, IMUs, etc. In the complex indoor environment, it is susceptible to environmental factors or the sensor itself is low in accuracy, resulting in low positioning accuracy. For example, the factors that are affected by each sensor are different. The camera is illuminated, and the camera cannot obtain feature points in the dark. The IMU is an external factor, such as man-made drag and ground slippage. There will be cumulative errors in long-term work.
Summary
In some embodiments, the present application discloses an indoor positioning and navigation device built into a movable intelligent platform or robot, comprising: a Bluetooth module, configured to obtain the Mesh network information of the indoor positioning and navigation device, including the addresses, attributes, RSSI (Received Signal Strength Indication), IQ (In-phase/Quadrature) data, angles, and times of arrival of multiple Bluetooth nodes in the Mesh network; a vision module, configured to capture images around the robot and to perform grayscale conversion, image rectification, and feature point and feature line segment extraction on the captured images; an odometer module, configured to compute, by relative positioning, the change of the robot's position at each moment relative to its position at the previous moment, so as to estimate the robot's position in real time; a data storage and processing module, configured to receive the data collected and computed by the Bluetooth module, the vision module, and the odometer module; a position fusion and estimation module, configured to fuse the data in the data storage and processing module to obtain the real-time position of the indoor positioning and navigation device; a map construction module, configured to store the real-time position of the indoor positioning and navigation device estimated by the position fusion and estimation module and a three-dimensional environment map created from the spatial positions of the Bluetooth nodes and feature points in the data storage and processing module; and a path planning and motion control module, configured to drive the robot and to perform path planning and navigation according to the three-dimensional environment map created by the map construction module.
In some embodiments, the present application discloses an indoor positioning and navigation method, comprising: calibrating a camera in a vision module; acquiring data collected by a Bluetooth module, the vision module, and an odometer module; performing Kalman filtering on the collected data; creating a three-dimensional environment map from the computed real-time position of the indoor positioning and navigation device and the spatial positions of the Bluetooth nodes and feature points; and performing positioning and navigation according to the created three-dimensional environment map.
In some embodiments, a computer device is provided, comprising a memory and one or more processors, the memory storing computer-readable instructions that, when executed by the one or more processors, cause the one or more processors to perform or implement the indoor positioning and navigation method disclosed in some embodiments of the present application.
In some embodiments, one or more non-volatile storage media storing computer-readable instructions are provided. When executed by one or more processors, the computer-readable instructions cause the one or more processors to perform or implement the indoor positioning and navigation method disclosed in some embodiments of the present application.
In some embodiments, a computer program is provided which, when executed by a processor, implements the indoor positioning and navigation method disclosed in some embodiments of the present application.
The details of one or more embodiments of the present application are set forth in the drawings and description below. Other features and advantages of the present application will become apparent from the description, the drawings, and the claims.
Description of the Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the drawings needed in the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a block diagram of an indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
Fig. 2 is a flowchart of the operation of the vision module in the indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
Fig. 3 is a flowchart of the Kalman filtering performed by the position fusion and estimation module in the indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
Fig. 4 is a positioning model diagram of the Bluetooth module in the indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
Fig. 5 is a motion model diagram of the odometer module in the indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
[Corrected under Rule 91, 04.03.2021]

Fig. 6 is a flowchart of the operation of the indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.

Fig. 7 is a diagram of the internal structure of a computer device provided by one or more embodiments of the present application.
Detailed Description
To make the technical solutions and beneficial effects of the present application clearer, the present application is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present application, not to limit it.
Fig. 1 is a block diagram of an indoor positioning and navigation device 100 based on Bluetooth and SLAM provided by an embodiment of the present application. As shown in Fig. 1, the indoor positioning and navigation device 100 based on Bluetooth and SLAM comprises: a Bluetooth module 111, a vision module 112, an odometer module 113, a data storage and processing module 114, a position fusion and estimation module 115, a map construction module 116, and a path planning and motion control module 117. The indoor positioning and navigation device 100 can be built into a sweeping robot or an indoor robot to implement map construction as well as precise positioning and navigation for the robot.
As shown in Fig. 1, the Bluetooth module 111 is a chip-based circuit assembly that integrates Bluetooth functions; it can be used for wireless network communication and supports data and audio transmission. The Bluetooth module 111 is located in the indoor positioning and navigation device 100 and provides intelligent networking and positioning functions, specifically the networking and positioning of Bluetooth devices in the Bluetooth network. The Bluetooth module 111 obtains the Mesh network in which the indoor positioning and navigation device 100 is located, and the addresses, attributes, RSSI (Received Signal Strength Indication), IQ (In-phase/Quadrature) data, angles (including azimuth and elevation), and times of arrival of the Bluetooth nodes in the Bluetooth network. RSSI is expressed in dBm and is generally negative; it reflects the distance between two Bluetooth nodes in the Bluetooth network. The address of a Bluetooth node is an identifier of that node, expressed in bytes. A Bluetooth node may have one of five attributes: friend node, low-power node, relay node, standard node, or proxy node.
Further, through its positioning function, the Bluetooth module 111 obtains the TOA (Time of Arrival), RSSI, IQ data, and AOA (Angle of Arrival) from the surrounding Bluetooth nodes to the robot (which contains the indoor positioning and navigation device 100). Using the RSSI ranging model and the phase-difference ranging model, the Bluetooth module 111 constructs a positioning model based on TOA, RSSI, phase, and AOA, whose equations are as follows:
√((X_bi − x)² + (Y_bi − y)² + (Z_bi − z)²) = D_1i = D_2i = D_3i
tan θ_i = (Y_bi − y) / (X_bi − x)
tan φ_i = (Z_bi − z) / √((X_bi − x)² + (Y_bi − y)²)  …………(1)
where (x, y, z) is the spatial position of the indoor positioning and navigation device 100; the spatial position of Bluetooth node B_i is (X_bi, Y_bi, Z_bi); θ_i is the azimuth angle between Bluetooth node B_i and the plane of the Bluetooth antenna array of the indoor positioning and navigation device 100; φ_i is the elevation angle between Bluetooth node B_i and that antenna array plane; D_1i is the distance between the indoor positioning and navigation device 100 and Bluetooth node B_i obtained from the RSSI ranging model; D_2i is that distance obtained from the TOA ranging model; and D_3i is that distance obtained from the phase-difference ranging model. Obtaining the best solution of the above positioning model by the least-squares method yields the spatial position of the indoor positioning and navigation device 100 relative to the Bluetooth nodes B_i, that is, the value of (x, y, z). The parameters θ_i, φ_i, D_1i, D_2i, and D_3i are illustrated in Fig. 4, which is a positioning model diagram of the Bluetooth module in the indoor positioning and navigation device based on Bluetooth and SLAM provided by an embodiment of the present application. (X_bi, Y_bi, Z_bi) are the coordinates of a node B_i in the Bluetooth network, and B′_i is the projection of Bluetooth node B_i onto the two-dimensional plane.
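As an illustration of how the least-squares solve of the positioning model might look numerically, the following Python sketch stacks the three range residuals (D_1i, D_2i, D_3i) and the azimuth and elevation residuals for each node and solves for (x, y, z). The node layout, measurement values, and function names are hypothetical and not part of the disclosure; in this synthetic example all three range models are given the same noise-free values.

```python
# Hypothetical least-squares solve for the device position (x, y, z); the
# node positions and measurements below are synthetic, not from the patent.
import numpy as np
from scipy.optimize import least_squares

def residuals(p, nodes, d1, d2, d3, az, el):
    x, y, z = p
    res = []
    for (bx, by, bz), r1, r2, r3, th, ph in zip(nodes, d1, d2, d3, az, el):
        dx, dy, dz = bx - x, by - y, bz - z
        rng = np.sqrt(dx * dx + dy * dy + dz * dz)
        res += [rng - r1, rng - r2, rng - r3]              # D_1i, D_2i, D_3i
        res.append(np.arctan2(dy, dx) - th)                # azimuth theta_i
        res.append(np.arctan2(dz, np.hypot(dx, dy)) - ph)  # elevation phi_i
    return res

# Four synthetic ceiling-mounted nodes; the "true" device position is known
# here only so that consistent measurements can be generated.
nodes = [(0.0, 0.0, 3.0), (5.0, 0.0, 3.0), (0.0, 5.0, 3.0), (5.0, 5.0, 3.0)]
true_p = np.array([1.0, 2.0, 0.5])
rng = [float(np.linalg.norm(np.array(n) - true_p)) for n in nodes]
az = [float(np.arctan2(n[1] - true_p[1], n[0] - true_p[0])) for n in nodes]
el = [float(np.arctan2(n[2] - true_p[2],
                       np.hypot(n[0] - true_p[0], n[1] - true_p[1])))
      for n in nodes]

sol = least_squares(residuals, x0=[0.0, 0.0, 0.0],
                    args=(nodes, rng, rng, rng, az, el))
print(np.round(sol.x, 3))   # recovers the device position
```

With noisy, mutually disagreeing D_1i/D_2i/D_3i values the same residual stacking yields a compromise estimate, which is the point of combining the three ranging models.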
The vision module 112 is configured to capture images around the robot and to perform grayscale conversion, image rectification, feature point extraction, and feature line segment extraction on the captured images; that is, the images are preprocessed to remove the influence of lens distortion on the feature descriptors, and the ORB (Oriented FAST and Rotated BRIEF) feature detection method is used to extract the feature points and feature line segments in the images. Specifically, a wide-angle camera in the vision module 112 captures image information of the environment where the robot is located. In the present application, a 12×9 black-and-white checkerboard calibration board with 10 mm × 10 mm squares is used to calibrate the wide-angle camera in the vision module 112 of the indoor positioning and navigation device 100. This calibration method avoids the high equipment requirements and cumbersome operation of traditional methods, while offering higher accuracy than current calibration methods. The vision module 112 obtains the camera intrinsics and distortion parameters; for example, the intrinsics are the horizontal focal length fx, the vertical focal length fy, the principal point abscissa u0, and the principal point ordinate v0, while the distortion parameters include the radial distortion parameters k1, k2, k3 and the tangential distortion parameters p1, p2, but are not limited to these.
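For illustration, the pinhole-plus-distortion model that these parameters (fx, fy, u0, v0, k1, k2, k3, p1, p2) describe can be written out directly. All numeric values below are made up, and the function is a sketch of the standard model, not the device's implementation.

```python
# Sketch of the camera model: project a 3-D point with intrinsics
# (fx, fy, u0, v0), radial distortion (k1, k2, k3) and tangential (p1, p2).
def project(pt, fx, fy, u0, v0, k1, k2, k3, p1, p2):
    X, Y, Z = pt
    xn, yn = X / Z, Y / Z                        # normalized coordinates
    r2 = xn * xn + yn * yn
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    xd = xn * radial + 2 * p1 * xn * yn + p2 * (r2 + 2 * xn * xn)
    yd = yn * radial + p1 * (r2 + 2 * yn * yn) + 2 * p2 * xn * yn
    return fx * xd + u0, fy * yd + v0            # pixel coordinates

u, v = project((0.1, -0.05, 1.0), fx=600, fy=600, u0=320, v0=240,
               k1=-0.1, k2=0.01, k3=0.0, p1=0.001, p2=-0.001)
print(round(u, 2), round(v, 2))
```

Checkerboard calibration estimates exactly these nine numbers; image rectification inverts the distortion part of this mapping.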
The odometer module 113 is configured to compute, by relative positioning, the change of the robot's position at each moment relative to its position at the previous moment, thereby estimating the position in real time; that is, it computes, for each moment, the offset Δx in the X direction, the offset Δy in the Y direction, and the angular offset Δφ about the Z axis relative to the previous moment. Fig. 5 is a motion model diagram of the odometer module in the indoor positioning and navigation device based on Bluetooth and SLAM provided by an embodiment of the present application. X(k−1) is the state at time k−1, and X(k) is the state at the next time, time k; u(k) is then the offset of time k relative to time k−1, whose components are the X offset Δx, the Y offset Δy, and the angular offset Δφ about the Z axis. The data storage and processing module 114 is coupled to the Bluetooth module 111, the vision module 112, and the odometer module 113, and is configured to receive the data collected and computed by those modules. Specifically, in addition to the attributes, addresses, RSSI, IQ data, angles (elevation and azimuth), and times of arrival of the Bluetooth nodes collected by the Bluetooth module, this data includes the wide-angle camera intrinsics and distortion parameters obtained by the vision module, the image coordinates and descriptors of the feature points, the feature line segments and their descriptors, and the position changes of the intelligent platform or robot collected by the odometer module. The data computed from this information includes the spatial positions of the Bluetooth nodes and of the feature points, as well as the real-time position of the intelligent platform or robot updated by the position fusion and estimation module.
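The odometer's relative-positioning model of Fig. 5 amounts to composing the previous pose with the body-frame increment u(k) = (Δx, Δy, Δφ). The short sketch below illustrates this with made-up increments (driving a 1 m square); it is not the device's actual code.

```python
# Dead-reckoning sketch of the Fig. 5 motion model: the pose at time k is the
# pose at k-1 composed with the body-frame increment (dx, dy, dphi).
import math

def integrate(pose, dx, dy, dphi):
    x, y, phi = pose
    # rotate the body-frame offset into the world frame, then accumulate
    xw = x + dx * math.cos(phi) - dy * math.sin(phi)
    yw = y + dx * math.sin(phi) + dy * math.cos(phi)
    return (xw, yw, phi + dphi)

pose = (0.0, 0.0, 0.0)
for step in [(1.0, 0.0, math.pi / 2)] * 4:   # four 1 m legs, 90-degree turns
    pose = integrate(pose, *step)
print(tuple(round(v, 6) for v in pose))
```

This open-loop accumulation is exactly where slip and drag errors build up over time, which is why the fusion stage corrects it with Bluetooth and visual observations.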
The position fusion and estimation module 115 is coupled to the data storage and processing module 114 and is configured to fuse the data in the data storage and processing module 114. Candidate fusion algorithms include the least-squares method, the LM algorithm, the BA algorithm, intelligent optimization algorithms (such as genetic algorithms, particle swarm optimization, and ant colony optimization), and Kalman filtering. The present application takes one of these algorithms, Kalman filtering, as an example to describe how the real-time position of the indoor positioning and navigation device 100 is obtained. Kalman filtering enables real-time, accurate positioning of the indoor positioning and navigation device 100. Those skilled in the art should understand that this embodiment does not limit the present application.
The map construction module 116 is coupled to the position fusion and estimation module 115 and is configured to store the real-time position of the indoor positioning and navigation device 100 estimated by the position fusion and estimation module 115, together with the three-dimensional environment map created from the spatial positions of the Bluetooth nodes and feature points in the data storage and processing module 114.
The path planning and motion control module 117 is coupled to the map construction module 116 and is configured to perform path planning and navigation according to the three-dimensional environment map stored in the map construction module 116. The path planning and motion control module 117 drives the robot accordingly and obtains the robot's current position information in real time. Each module is optionally implemented as logic, a non-transitory computer-readable medium containing stored instructions, firmware, and/or a combination thereof. For logic implemented with stored instructions and/or firmware, a processor may be provided to execute such instructions so that an indoor positioning and navigation device performs the methods described herein.
Fig. 2 is a flowchart of the operation of the vision module in the indoor positioning and navigation device based on Bluetooth and SLAM provided by an embodiment of the present application. It includes the following steps:
Step 202: the wide-angle camera in the vision module 112 captures image information around the indoor positioning and navigation device 100.
Step 204: the vision module 112 preprocesses the captured images, which specifically includes grayscale conversion of the images.
Step 206: the vision module 112 corrects the image distortion.
Step 208: the feature point information in the images is extracted using the ORB feature extraction provided by OpenCV. This method is fast and offers a degree of robustness to noise and rotation. After an image is processed with ORB feature extraction, a series of feature point data is obtained, and the feature information is stored in a feature database. ORB feature points in an image include lamps, wall corners, and the like.
Step 210: the feature line segments in the images are extracted. A feature line segment is an LSD line segment, generally the edge of a wall or the edge line of an object of some length, such as the edge line of a square lamp.
Fig. 3 is a flowchart of the Kalman filtering performed by the position fusion and estimation module in the indoor positioning and navigation device based on Bluetooth and SLAM provided by an embodiment of the present application. The Kalman filtering mainly includes the following steps:
Step 302: sub-module initialization, which specifically refers to defining and initializing the variables and matrices involved in the Kalman filtering.
Step 304: the position fusion and estimation module establishes the motion equation and the observation equation. Specifically:
The posture change U_(k+1) = (Δx(k+1), Δy(k+1), Δφ(k+1)) of the indoor positioning and navigation device 100 at time k+1 is the data obtained by the odometer module 113. From the system state X(k) at time k, the system state X(k+1|k) at time k+1 is predicted:

X(k+1|k) = X(k) + U_(k+1) + Q_(k+1) …………(2)

where Q_(k+1) is the motion equation noise at time k+1, which is determined by the positioning accuracy of the odometer module itself, and U_(k+1) is the posture change at time k+1.

Further, from the system state X(k+1|k) of the indoor positioning and navigation device 100 at time k+1, the predicted system observation Ẑ(k+1) at time k+1 is estimated:

Ẑ(k+1) = h(X(k+1|k)) + W_(k+1) …………(3)

where the observation function h contains, for each Bluetooth node B_i in the Bluetooth network with coordinates (X_bi, Y_bi, Z_bi), the azimuth angle θ_i and the elevation angle φ_i of B_i relative to the system state X(k+1|k):

tan θ_i = (Y_bi − y) / (X_bi − x),  tan φ_i = (Z_bi − z) / √((X_bi − x)² + (Y_bi − y)²)

and, for each feature point F_i with spatial three-dimensional coordinates (X_fi, Y_fi, Z_fi), the image coordinates of its projection through the camera, where f is the focal length of the camera. W_(k+1) is the observation equation noise at time k+1, and H is the Jacobian matrix of the observation equation with respect to the state.
Step 306: update the covariance, computed as follows:

P(k+1) = (P(k+1|k)^(−1) + H^T · R^(−1) · H)^(−1) …………(4)

where P(k+1) is the covariance matrix at time k+1 and R is the observation noise covariance matrix.
Step 308: update the gain matrix, computed as follows:

K = P(k+1) · H^T · R^(−1) …………(5)

where K is the gain matrix.
Step 310: update the state vector, computed as follows:

X(k+1) = X(k+1|k) + K · ΔL …………(6)

where X(k+1) is the state vector at time k+1 and ΔL = L_(k+1) − Ẑ(k+1); L_(k+1) is the actual observation at time k+1, and f_i(u_i, v_i) are the image coordinates corresponding to feature point F_i(X_fi, Y_fi, Z_fi) at time k+1. Because of wheel slippage, dragging, and accumulated error, the position obtained from the motion equation alone can be erroneous; the indoor positioning and navigation device 100 therefore applies a correction using the observation equations formed from other indoor information, such as feature points and Bluetooth nodes, which improves the accuracy of the computation.
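To make steps 302 through 310 concrete, the following numeric sketch runs one predict/update cycle under heavy simplifications: a 2-D position state and range-only observations of two fixed beacons, instead of the full azimuth/elevation/pixel observation model of equation (3). All positions, noise levels, and measurements are invented for illustration.

```python
# One predict/update cycle mirroring Eqs. (2) and (4)-(6), on a toy model.
import numpy as np

beacons = np.array([[0.0, 0.0], [4.0, 0.0]])   # two fixed landmarks

def h(x):                                       # predicted ranges
    return np.linalg.norm(beacons - x, axis=1)

def jac(x):                                     # Jacobian H of h at x
    d = x - beacons
    return d / np.linalg.norm(d, axis=1, keepdims=True)

x = np.array([0.0, 1.0])        # state X(k)
P = np.eye(2)                   # covariance P(k)
Q = np.eye(2) * 0.01            # motion noise
R = np.eye(2) * 0.05            # observation noise

u = np.array([1.0, 0.0])        # odometry increment U(k+1)
x_pred = x + u                  # Eq. (2): X(k+1|k) = X(k) + U(k+1)
P_pred = P + Q

z = h(np.array([1.2, 0.9]))     # actual observation L(k+1), from the "true" pose
H = jac(x_pred)
P_new = np.linalg.inv(np.linalg.inv(P_pred) + H.T @ np.linalg.inv(R) @ H)  # Eq. (4)
K = P_new @ H.T @ np.linalg.inv(R)                                         # Eq. (5)
x_new = x_pred + K @ (z - h(x_pred))                                       # Eq. (6)
print(np.round(x_new, 3))
```

The correction K·ΔL is what counteracts slip and drag: the odometry-only prediction (1, 1) is off, and the beacon ranges pull the estimate back toward the true pose (1.2, 0.9).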
Fig. 6 is a flowchart of the operation of the indoor positioning and navigation device based on Bluetooth and SLAM provided by an embodiment of the present application. Fig. 6 is described in conjunction with Fig. 1 and specifically includes the following steps:
Step 602: calibrate the camera in the vision module 112.
Step 604: obtain the data collected by the Bluetooth module 111, the vision module 112, and the odometer module 113.
Step 606: fuse the collected data. In an embodiment that uses Kalman filter fusion, refer to the method flowchart of Fig. 3 for the specific filtering steps.
Step 608: the map construction module 116 creates a three-dimensional environment map from the computed real-time position of the indoor positioning and navigation device and the spatial positions of the Bluetooth nodes and feature points.
Step 610: the path planning and motion control module 117 performs positioning and navigation according to the created three-dimensional environment map.
Advantageously, the Bluetooth- and SLAM-based indoor positioning and navigation device and method of the present application can detect and track ORB feature points and Bluetooth node information, collect mileage information with the odometer, and build a scene map through Kalman filtering for precise positioning, path planning, and navigation of the robot, which can greatly improve positioning accuracy. Compared with existing indoor positioning and navigation technology, the method disclosed in the present application not only effectively reduces the influence of environmental factors to improve positioning accuracy, but also provides a more reliable and more accurate three-dimensional environment map.
In some embodiments, each module in the aforementioned indoor positioning and navigation device, including the Bluetooth module, the vision module, the odometer module, the data storage and processing module, the position fusion and estimation module, the map construction module, and the path planning and motion control module, can be implemented in whole or in part by software, hardware, or a combination thereof. Each module may be embedded in hardware form in, or independent of, the processor of a computer device, or stored in software form in the memory of a computer device, so that the processor can invoke and perform the operations corresponding to each module.
In some embodiments, a computer device is provided. The computer device may be a movable intelligent platform or a robot, and its internal structure may be as shown in Fig. 7. The computer device includes a processor, a memory, a network interface, an input device, a Bluetooth device, and a drive device connected through a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and computer-readable instructions; the internal memory provides an environment for running the operating system and the computer-readable instructions in the non-volatile storage medium. The network interface of the computer device communicates with an external terminal or server through a network connection. When the computer-readable instructions are executed by the processor, an indoor positioning and navigation method is implemented. The input device of the computer device may be a touch layer covering the display screen, a button, trackball, or touchpad provided on the housing of the computer device, or an external keyboard, touchpad, or mouse. The Bluetooth device of the computer device can communicate with other devices that have Bluetooth communication capability, and the drive device of the computer device may be any form of device that can move the computer device. In addition, the computer device may further include a display screen, which may be a liquid crystal display or an electronic ink display.
Those skilled in the art can understand that the structure shown in Fig. 7 is merely a block diagram of a partial structure related to the solution of the present application and does not limit the computer device to which the solution of the present application is applied. A specific computer device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
In some embodiments, a computer device is provided, comprising a memory and one or more processors, the memory storing computer-readable instructions that, when executed by the one or more processors, cause the one or more processors to perform a method comprising: calibrating a camera in a vision module; acquiring data collected by a Bluetooth module, the vision module, and an odometer module; fusing the collected data; creating a three-dimensional environment map from the computed real-time position of the indoor positioning and navigation device and the spatial positions of the Bluetooth nodes and feature points; and performing positioning and navigation according to the created three-dimensional environment map.
In some embodiments, one or more computer program products are provided, each including a storage medium storing computer-readable instructions which, when executed by one or more processors, cause the one or more processors to perform a method including: calibrating a camera in a vision module; acquiring data collected by a Bluetooth module, the vision module, and an odometer module; fusing the collected data; creating a three-dimensional environment map based on the computed real-time position of the indoor positioning and navigation device and the spatial positions of Bluetooth nodes and feature points; and performing positioning and navigation according to the created three-dimensional environment map.
In some embodiments, one or more non-volatile storage media storing computer-readable instructions are provided. When executed by one or more processors, the computer-readable instructions cause the one or more processors to perform a method including: calibrating a camera in a vision module; acquiring data collected by a Bluetooth module, the vision module, and an odometer module; fusing the collected data; creating a three-dimensional environment map based on the computed real-time position of the indoor positioning and navigation device and the spatial positions of Bluetooth nodes and feature points; and performing positioning and navigation according to the created three-dimensional environment map.
In some embodiments, one or more computer programs are provided which, when executed by one or more processors, perform a method including: calibrating a camera in a vision module; acquiring data collected by a Bluetooth module, the vision module, and an odometer module; fusing the collected data; creating a three-dimensional environment map based on the computed real-time position of the indoor positioning and navigation device and the spatial positions of Bluetooth nodes and feature points; and performing positioning and navigation according to the created three-dimensional environment map.
In some embodiments, the method executed by the one or more processors is the indoor positioning and navigation method of some embodiments of the present application.
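The calibrate → acquire → fuse → map → navigate loop recited in these embodiments can be sketched as a minimal pipeline. The class and method names below are hypothetical placeholders for the claimed modules, and the fusion step is deliberately trivial; it merely stands in for the Kalman-filter fusion described in claims 10-12.

```python
# Illustrative skeleton of the claimed pipeline: calibrate -> acquire ->
# fuse -> build map -> localize/navigate. All names are hypothetical.

class IndoorNavPipeline:
    def __init__(self):
        self.map_points = []         # fused 3-D landmarks (Bluetooth nodes, feature points)
        self.pose = (0.0, 0.0, 0.0)  # device position estimate (x, y, z)

    def calibrate_camera(self):
        # Stand-in for checkerboard calibration (claim 8); returns intrinsics.
        return {"focal_length_mm": 3.6, "distortion": [0.0, 0.0]}

    def acquire(self):
        # Stand-in for Bluetooth / vision / odometer measurements.
        return {"bluetooth": [(1.0, 2.0, 3.0)],
                "features": [(0.5, 0.5, 2.0)],
                "odometry": (0.1, 0.0, 0.0)}

    def fuse(self, data):
        # Trivial fusion: dead-reckon with odometry only. A real system
        # would apply the Kalman update of claims 11-12 here.
        dx, dy, dz = data["odometry"]
        x, y, z = self.pose
        self.pose = (x + dx, y + dy, z + dz)
        return self.pose

    def build_map(self, data):
        # Store landmark positions for later path planning.
        self.map_points.extend(data["bluetooth"] + data["features"])
        return self.map_points

    def step(self):
        data = self.acquire()
        pose = self.fuse(data)
        self.build_map(data)
        return pose

pipeline = IndoorNavPipeline()
pipeline.calibrate_camera()
pose = pipeline.step()
```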
Those of ordinary skill in the art will understand that all or part of the processes in the methods of the above embodiments can be implemented by a computer program instructing relevant hardware. The computer program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the above method embodiments. Any reference to memory, storage, a database, or other media used in the embodiments provided in this application may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
It should be understood that although the steps in the flowcharts of FIGS. 2, 3, and 6 are displayed in sequence as indicated by the arrows, these steps are not necessarily executed in that order. Unless explicitly stated herein, there is no strict ordering constraint on these steps, and they may be executed in other orders. Moreover, at least some of the steps in FIGS. 2, 3, and 6 may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different times; nor is their execution order necessarily sequential, as they may be performed in turn or alternately with at least a part of other steps, or of the sub-steps or stages of other steps.
In some embodiments, the indoor positioning and navigation device and method of the present application construct a scene map from the Bluetooth node information in the Mesh network, image feature points and feature line segments, and the real-time position information estimated by the odometer, and can be used for precise localization and path planning of a robot. In some embodiments, to address the positioning accuracy of indoor positioning services, data from multiple positioning sensors are fused; this provides strong anti-interference capability, achieves higher-precision positioning, effectively reduces the influence of environmental factors, and yields a more reliable and accurate three-dimensional environment map suitable for broader applications. For example, Bluetooth is unaffected by lighting conditions and wheel slippage but is blocked by obstacles such as walls and tables, which degrades its positioning accuracy; therefore, in some embodiments, the strengths of the multiple positioning sensors involved, such as Bluetooth, the camera, and the odometer, are combined so that each compensates for the weaknesses of the others. In some embodiments, the device or method of the present application constructs the scene map from Bluetooth node information, the feature points and feature information collected by the vision module, and the real-time position information estimated by the odometer module, thereby achieving precise positioning and navigation.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as a combination of these technical features involves no contradiction, it should be considered within the scope of this specification. The above descriptions are merely preferred embodiments of this application and are not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principles of this application shall fall within its scope of protection.

Claims (15)

  1. An indoor positioning and navigation device, wherein the indoor positioning and navigation device is built into a movable intelligent platform or robot, comprising:
    a Bluetooth module configured to obtain Mesh network information of the indoor positioning and navigation device and obtain the addresses, attributes, RSSI, IQ data, angles, and times of arrival of multiple Bluetooth nodes in the Mesh network;
    a vision module configured to collect images around the robot and perform grayscale processing, image correction, feature point extraction, and feature line segment extraction on the collected images;
    an odometer module configured to compute, by relative positioning, the change of the robot's position at each moment relative to its position at the previous moment, and to estimate the robot's position in real time;
    a data storage and processing module configured to receive the data collected and computed by the Bluetooth module, the vision module, and the odometer module;
    a position fusion and estimation module configured to fuse the data in the data storage and processing module and obtain the real-time position of the indoor positioning and navigation device;
    a map building module configured to store the real-time position of the indoor positioning and navigation device estimated by the position fusion and estimation module, and a three-dimensional environment map created from the spatial position information of the Bluetooth nodes and feature points in the data storage and processing module; and
    a path planning and motion control module configured to drive the robot and perform path planning and navigation according to the three-dimensional environment map created by the map building module.
  2. An indoor positioning and navigation method, comprising:
    calibrating a camera in a vision module;
    acquiring data collected by a Bluetooth module, the vision module, and an odometer module;
    fusing the collected data;
    creating a three-dimensional environment map based on the computed real-time position of the indoor positioning and navigation device and the spatial positions of Bluetooth nodes and feature points; and
    performing positioning and navigation according to the created three-dimensional environment map.
  3. The method of claim 2, wherein the Bluetooth module obtains Mesh network information of the indoor positioning and navigation device and obtains the addresses, attributes, RSSI, IQ data, angles, and times of arrival of multiple Bluetooth nodes in the Mesh network.
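Claim 3 lists RSSI among the quantities the Bluetooth module collects. A common way to turn an RSSI reading into a range estimate (such as the D_1i of claim 7) is the log-distance path-loss model; the formula and the default parameters below are the standard model, offered as an illustration rather than text from this application.

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    """Estimate distance (meters) from RSSI via the log-distance path-loss model.

    tx_power_dbm is the calibrated RSSI at 1 m; path_loss_exponent is about 2
    in free space and larger indoors. Both defaults are illustrative.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# Example: a reading 20 dB below the 1 m reference implies ~10 m at n = 2.
d = rssi_to_distance(-79.0)
```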
  4. The method of claim 2 or 3, wherein the vision module collects images around the robot and performs grayscale processing, image correction, feature point extraction, and feature line segment extraction on the collected images.
  5. The method of any one of claims 2-4, wherein the odometer module computes, by relative positioning, the change of the robot's position at each moment relative to its position at the previous moment, and estimates the robot's position in real time.
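The odometer module's relative positioning is typically realized as dead reckoning. The sketch below assumes a differential-drive robot with two measured wheel-travel increments; this kinematic model is an assumption for illustration, not something recited in the claims.

```python
import math

def dead_reckon(pose, d_left, d_right, wheel_base):
    """Update a planar (x, y, heading) pose from left/right wheel increments.

    Simple differential-drive dead reckoning: one common way to realize the
    'position change relative to the previous moment' of the odometer module.
    """
    x, y, theta = pose
    d_center = (d_left + d_right) / 2.0        # distance traveled by midpoint
    d_theta = (d_right - d_left) / wheel_base  # heading change
    # Integrate along the mean heading over the step.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return (x, y, theta + d_theta)

pose = dead_reckon((0.0, 0.0, 0.0), 0.1, 0.1, 0.3)  # equal increments: straight ahead
```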
  6. The method of any one of claims 2-5, wherein the fusing of the data is performed by a position fusion and estimation module.
  7. The device or method of any one of claims 1-6, wherein the Bluetooth module, together with an RSSI ranging model and a phase difference ranging model, constructs positioning model equations based on TOA, RSSI, phase, and AOA, and computes the real-time position of the indoor positioning and navigation device through the equations
    Figure PCTCN2020141624-appb-100001
    where (x, y, z) represents the spatial position of the indoor positioning and navigation device, the spatial position of Bluetooth node B_i is (X_bi, Y_bi, Z_bi), θ_i is the azimuth angle between Bluetooth node B_i and the plane of the Bluetooth antenna array of the indoor positioning and navigation device,
    Figure PCTCN2020141624-appb-100002
    is the elevation angle between Bluetooth node B_i and the plane of the Bluetooth antenna array of the indoor positioning and navigation device, D_1i is the distance between the indoor positioning and navigation device and Bluetooth device B_i obtained by the RSSI ranging model, D_2i is the distance between the indoor positioning and navigation device and Bluetooth device B_i obtained by the TOA ranging model, and D_3i is the distance between the indoor positioning and navigation device and Bluetooth device B_i obtained by the phase difference ranging model.
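The positioning model equations of claim 7 amount to a system of distance (and angle) constraints on (x, y, z). Since the exact system appears only in the formula image, the sketch below shows one standard way to solve such range equations: Gauss-Newton least squares over per-node distances. It is an illustrative solver, not the application's recited model.

```python
import numpy as np

def trilaterate(anchors, distances, x0=None, iters=20):
    """Least-squares position estimate from anchor positions and ranges.

    Gauss-Newton on the range residuals r_i(x) - d_i: a standard approach
    to distance-equation systems like the claimed positioning model.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    # Start from the anchor centroid unless an initial guess is given.
    x = np.mean(anchors, axis=0) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(iters):
        diff = x - anchors                   # (n, 3) offsets to each anchor
        r = np.linalg.norm(diff, axis=1)     # predicted ranges
        J = diff / r[:, None]                # Jacobian of ranges w.r.t. x
        residual = r - d
        dx, *_ = np.linalg.lstsq(J, -residual, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < 1e-9:        # converged
            break
    return x
```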
  8. The device, method, non-volatile storage medium, or program of any one of claims 1-7, wherein the camera in the vision module is calibrated with a 12×9 black-and-white checkerboard calibration board with 10 mm × 10 mm squares; optionally, the camera in the vision module includes camera parameter and distortion parameter information.
  9. The indoor positioning and navigation device, method, non-volatile storage medium, or program of any one of claims 1-8, wherein the vision module is further configured to correct image distortion.
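Distortion correction as in claim 9 usually means inverting a lens distortion model. The sketch below assumes a two-term radial model (Brown-Conrady style), a common choice behind the "distortion parameters" of claim 8, and inverts it by fixed-point iteration; the model choice is an assumption for illustration, not part of the application.

```python
def apply_radial_distortion(x, y, k1, k2):
    """Map ideal normalized image coords to distorted coords.

    Two-term radial model: x_d = x * (1 + k1*r^2 + k2*r^4).
    """
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def undistort(xd, yd, k1, k2, iters=10):
    """Invert the radial model by fixed-point iteration.

    Start from the distorted coords and repeatedly divide by the distortion
    factor evaluated at the current undistorted estimate.
    """
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / scale, yd / scale
    return x, y
```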
  10. The device or method of any one of claims 1-9, wherein the fusing of the data comprises:
    fusing according to at least one selected from the group consisting of a least squares method, the LM algorithm, the BA algorithm, an intelligent optimization algorithm, and a Kalman filter algorithm,
    wherein the Kalman filter algorithm comprises at least one selected from the group consisting of initializing each module, establishing a motion equation, establishing an observation equation, updating a covariance matrix, updating a gain matrix, and updating a state vector.
  11. The device or method of any one of claims 1-10, wherein the fusing of the data comprises:
    estimating, through the formula
    X(k+1|k) = X(k) + U_{k+1} + Q_{k+1}
    the system state X(k+1|k) of the indoor positioning and navigation device at time k+1, where X(k) is the system state at time k, Q_{k+1} is the motion equation noise at time k+1, and U_{k+1} is the attitude change at time k+1;
    optionally,
    estimating, according to the system state X(k+1|k) at time k+1, the system observation at time k+1 through the formula
    Figure PCTCN2020141624-appb-100003
    that is,
    Figure PCTCN2020141624-appb-100004
    where
    Figure PCTCN2020141624-appb-100005
    and where the coordinates (X_bi, Y_bi, Z_bi) are the coordinates of Bluetooth node B_i in the Bluetooth network, whose azimuth angle θ_i and elevation angle
    Figure PCTCN2020141624-appb-100006
    are taken relative to the system state X(k+1|k) of the indoor positioning and navigation device at time k+1; F_i denotes one of the feature points, whose spatial three-dimensional coordinates are (X_fi, Y_fi, Z_fi); f is the focal length of the camera; W_{k+1} is the observation equation noise at time k+1; and H is the Jacobian matrix of the observation equation with respect to the state.
  12. The device or method of any one of claims 1-11, wherein the fusing of the data comprises:
    updating the covariance according to
    P(k+1) = (P(k+1|k)^(-1) + H^T * R^(-1) * H)^(-1)
    where P(k+1) is the covariance matrix at time k+1 and R is the observation noise covariance matrix;
    and/or
    updating the gain matrix according to
    K = P(k+1) * H^T * R^(-1)
    where K is the gain matrix;
    and/or
    updating the state vector according to
    X(k+1) = X(k+1|k) + K * ΔL
    where X(k+1) is the state vector at time k+1,
    Figure PCTCN2020141624-appb-100007
    L_{k+1} is the true observation at time k+1,
    Figure PCTCN2020141624-appb-100008
    and f_i(u_i, v_i) are the image coordinates corresponding to feature point F_i(X_fi, Y_fi, Z_fi) at time k+1.
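The three update formulas of claim 12 can be transcribed almost directly into code. The sketch below does so with NumPy, treating H, R, and the innovation ΔL as given inputs; how they are built from the Bluetooth and feature observations is defined by the formula images of claim 11, so their construction is not reproduced here.

```python
import numpy as np

def information_filter_update(P_pred, H, R, x_pred, delta_L):
    """One measurement update using the claim-12 formulas.

    P(k+1) = (P(k+1|k)^-1 + H^T R^-1 H)^-1   (covariance update)
    K      = P(k+1) H^T R^-1                  (gain update)
    X(k+1) = X(k+1|k) + K * delta_L           (state update; delta_L is the
                                               innovation between the real and
                                               predicted observations)
    """
    P_pred_inv = np.linalg.inv(P_pred)
    R_inv = np.linalg.inv(R)
    P = np.linalg.inv(P_pred_inv + H.T @ R_inv @ H)
    K = P @ H.T @ R_inv
    x = x_pred + K @ delta_L
    return x, P, K
```

This "information form" of the covariance and gain updates is algebraically equivalent to the familiar Kalman gain K = P_pred H^T (H P_pred H^T + R)^-1 form.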
  13. A non-volatile storage medium storing computer-readable instructions which, when executed by one or more processors, cause the one or more processors to perform the method of any one of claims 2-12.
  14. A computer program product, comprising a storage medium storing computer-readable instructions which, when executed by one or more processors, cause the one or more processors to perform the method of any one of claims 2-12.
  15. A computer program which, when executed by a processor, implements the method of any one of claims 2-12.
PCT/CN2020/141624 2020-01-06 2020-12-30 Indoor localization and navigation apparatus based on bluetooth and slam, and method therefor WO2021139590A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010009926.2A CN113074727A (en) 2020-01-06 2020-01-06 Indoor positioning navigation device and method based on Bluetooth and SLAM
CN202010009926.2 2020-01-06

Publications (1)

Publication Number Publication Date
WO2021139590A1 true WO2021139590A1 (en) 2021-07-15

Family

ID=76609029

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/141624 WO2021139590A1 (en) 2020-01-06 2020-12-30 Indoor localization and navigation apparatus based on bluetooth and slam, and method therefor

Country Status (2)

Country Link
CN (1) CN113074727A (en)
WO (1) WO2021139590A1 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113949999B (en) * 2021-09-09 2024-01-30 之江实验室 Indoor positioning navigation equipment and method
CN114136306B (en) * 2021-12-01 2024-05-07 浙江大学湖州研究院 Expandable device and method based on relative positioning of UWB and camera

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150268058A1 (en) * 2014-03-18 2015-09-24 Sri International Real-time system for multi-modal 3d geospatial mapping, object recognition, scene annotation and analytics
US20160349362A1 (en) * 2015-05-08 2016-12-01 5D Robotics, Inc. Mobile localization using sparse time-of-flight ranges and dead reckoning
CN108801265A (en) * 2018-06-08 2018-11-13 武汉大学 Multidimensional information synchronous acquisition, positioning and position service apparatus and system and method
WO2019000417A1 (en) * 2017-06-30 2019-01-03 SZ DJI Technology Co., Ltd. Map generation systems and methods
CN109541535A (en) * 2019-01-11 2019-03-29 浙江智澜科技有限公司 A method of AGV indoor positioning and navigation based on UWB and vision SLAM
CN110308729A (en) * 2019-07-18 2019-10-08 石家庄辰宙智能装备有限公司 The AGV combined navigation locating method of view-based access control model and IMU or odometer


Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113587917A (en) * 2021-07-28 2021-11-02 北京百度网讯科技有限公司 Indoor positioning method, device, equipment, storage medium and computer program product
CN113568413A (en) * 2021-08-19 2021-10-29 深圳中智永浩机器人有限公司 Robot safety guarantee method and device, computer equipment and storage medium
CN114001743A (en) * 2021-10-29 2022-02-01 京东方科技集团股份有限公司 Map drawing method, map drawing device, map drawing system, storage medium, and electronic apparatus
CN114025320A (en) * 2021-11-08 2022-02-08 易枭零部件科技(襄阳)有限公司 Indoor positioning method based on 5G signal
CN114205748B (en) * 2021-12-08 2023-03-10 珠海格力电器股份有限公司 Network configuration method and device, electronic equipment and storage medium
CN114205748A (en) * 2021-12-08 2022-03-18 珠海格力电器股份有限公司 Network configuration method and device, electronic equipment and storage medium
CN114510044A (en) * 2022-01-25 2022-05-17 北京圣威特科技有限公司 AGV navigation ship navigation method and device, electronic equipment and storage medium
CN115334448B (en) * 2022-08-15 2024-03-15 重庆大学 Accurate dynamic positioning method of unmanned self-following device based on Bluetooth and inertial sensor
CN115334448A (en) * 2022-08-15 2022-11-11 重庆大学 Accurate dynamic positioning method of unmanned self-following device based on Bluetooth and inertial sensor
CN115218907B (en) * 2022-09-19 2022-12-09 季华实验室 Unmanned aerial vehicle path planning method and device, electronic equipment and storage medium
CN115218907A (en) * 2022-09-19 2022-10-21 季华实验室 Unmanned aerial vehicle path planning method and device, electronic equipment and storage medium
CN115802282A (en) * 2022-12-16 2023-03-14 兰笺(苏州)科技有限公司 Wireless signal field co-location method and device
CN115802282B (en) * 2022-12-16 2024-06-07 兰笺(苏州)科技有限公司 Co-location method and device for wireless signal field
CN115808170A (en) * 2023-02-09 2023-03-17 宝略科技(浙江)有限公司 Indoor real-time positioning method integrating Bluetooth and video analysis
CN117119585A (en) * 2023-08-26 2023-11-24 江苏蓝策电子科技有限公司 Bluetooth positioning navigation system and method
CN117119585B (en) * 2023-08-26 2024-02-06 江苏蓝策电子科技有限公司 Bluetooth positioning navigation system and method
CN116954235A (en) * 2023-09-21 2023-10-27 深圳大工人科技有限公司 AGV trolley navigation control method and system
CN116954235B (en) * 2023-09-21 2023-11-24 深圳大工人科技有限公司 AGV trolley navigation control method and system

Also Published As

Publication number Publication date
CN113074727A (en) 2021-07-06

Similar Documents

Publication Publication Date Title
WO2021139590A1 (en) Indoor localization and navigation apparatus based on bluetooth and slam, and method therefor
CN109887057B (en) Method and device for generating high-precision map
CN111156998B (en) Mobile robot positioning method based on RGB-D camera and IMU information fusion
WO2021026850A1 (en) Qr code-based navigation attitude determining and positioning method and system
CN111089585A (en) Mapping and positioning method based on sensor information fusion
JP5992184B2 (en) Image data processing apparatus, image data processing method, and image data processing program
JP7300550B2 (en) METHOD AND APPARATUS FOR CONSTRUCTING SIGNS MAP BASED ON VISUAL SIGNS
CN111121754A (en) Mobile robot positioning navigation method and device, mobile robot and storage medium
CN110118556A (en) A kind of robot localization method and device based on covariance mixing together SLAM
CN112184812B (en) Method for improving identification and positioning precision of unmanned aerial vehicle camera to april tag and positioning method and system
CN113763548B (en) Vision-laser radar coupling-based lean texture tunnel modeling method and system
WO2019136613A1 (en) Indoor locating method and device for robot
WO2020019115A1 (en) Fusion mapping method, related device and computer readable storage medium
CN112967344B (en) Method, device, storage medium and program product for calibrating camera external parameters
CN112734765A (en) Mobile robot positioning method, system and medium based on example segmentation and multi-sensor fusion
US11067694B2 (en) Locating method and device, storage medium, and electronic device
CN114111776B (en) Positioning method and related device
WO2024027350A1 (en) Vehicle positioning method and apparatus, computer device and storage medium
CN114758011B (en) Zoom camera online calibration method fusing offline calibration results
CN111856499A (en) Map construction method and device based on laser radar
Choi et al. Monocular SLAM with undelayed initialization for an indoor robot
KR20220058846A (en) Robot positioning method and apparatus, apparatus, storage medium
CN113252066B (en) Calibration method and device for parameters of odometer equipment, storage medium and electronic device
CN111736137B (en) LiDAR external parameter calibration method, system, computer equipment and readable storage medium
Xue et al. Visual-Marker Based Localization for Flat-Variation Scene

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20912908

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20912908

Country of ref document: EP

Kind code of ref document: A1


32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 03/02/2023)