WO2021139590A1 - Indoor localization and navigation apparatus based on bluetooth and slam, and method therefor - Google Patents
- Publication number
- WO2021139590A1, PCT/CN2020/141624
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
Definitions
- This application relates to an indoor positioning and navigation device and method based on Bluetooth and SLAM.
- In SLAM (simultaneous localization and mapping), the robot starts moving from an unknown location in an unknown environment, localizes itself from the pose estimates and the map built during the movement, and at the same time builds an incremental map on the basis of that self-localization, realizing autonomous positioning and navigation of the robot.
- indoor positioning and navigation technology uses a variety of physical sensors to achieve positioning and navigation, such as cameras, IMUs, etc.
- In a complex indoor environment, each sensor is susceptible to different environmental factors, or its own accuracy is low, resulting in poor positioning accuracy. For example, the camera is affected by illumination and cannot obtain feature points in the dark; the IMU is affected by external factors such as man-made dragging and ground slippage, and accumulates error over long-term operation.
- The present application discloses an indoor positioning and navigation device, wherein the indoor positioning and navigation device is built into a movable intelligent platform or robot and includes: a Bluetooth module for obtaining information about the Mesh network in which the indoor positioning and navigation device is located, and for obtaining the addresses, attributes, RSSI (Received Signal Strength Indication), IQ (In-phase/Quadrature) data, angles, and arrival times of multiple Bluetooth nodes in the Mesh network.
- the vision module is used to collect images around the robot, and perform grayscale processing, image correction, and feature point extraction and feature line segment extraction on the image.
- the odometer module is used to calculate the change of the position of the robot at each time relative to the position of the previous time through relative positioning, and to estimate the position of the robot in real time.
- the data storage and processing module is used to receive the data information collected and calculated by the Bluetooth module, the vision module and the odometer module.
- the position fusion and estimation module is used to fuse the data in the data storage and processing module to obtain the real-time position of the indoor positioning and navigation device.
- the map building module is used to store the real-time location of the indoor positioning and navigation device estimated by the location fusion and estimation module, and the three-dimensional environment map created from the spatial location information of the Bluetooth nodes and feature points in the data storage and processing module.
- the motion control module is used to drive the robot and perform path planning and navigation according to the three-dimensional environment map created by the map building module.
- This application discloses an indoor positioning and navigation method, including: calibrating a camera in a vision module; acquiring data collected by a Bluetooth module, a vision module, and an odometer module; performing Kalman filtering on the collected data; creating a three-dimensional environment map from the calculated real-time position of the indoor positioning and navigation device and the spatial positions of the Bluetooth nodes and feature points; and performing positioning and navigation according to the created three-dimensional environment map.
- a computer device including a memory and one or more processors, and computer-readable instructions are stored in the memory.
- When the computer-readable instructions are executed by the one or more processors, the one or more processors execute or implement the indoor positioning and navigation method disclosed in some embodiments of the present application.
- one or more non-volatile storage media storing computer-readable instructions are provided.
- When the computer-readable instructions are executed by one or more processors, the one or more processors execute or implement the indoor positioning and navigation method disclosed in some embodiments of the present application.
- a computer program which, when executed by a processor, implements the indoor positioning and navigation method disclosed in some embodiments of the present application.
- Fig. 1 is a block diagram of an indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
- Fig. 2 is a working flow chart of the vision module in the indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
- Fig. 3 is a flowchart of the Kalman filtering of the position fusion and estimation module in the Bluetooth and SLAM-based indoor positioning and navigation device provided by one or more embodiments of the present application.
- FIG. 4 is a positioning model diagram of a Bluetooth module in an indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
- Fig. 5 is a motion model diagram of an odometer module in an indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
- FIG. 6 is a working flow chart of an indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
- FIG. 7 is a diagram of the internal structure of a computer device provided by one or more embodiments of the present application.
- FIG. 1 is a block diagram of an indoor positioning and navigation device 100 based on Bluetooth and SLAM provided by an embodiment of the present application.
- The indoor positioning and navigation device 100 based on Bluetooth and SLAM includes: a Bluetooth module 111, a vision module 112, an odometer module 113, a data storage and processing module 114, a position fusion and estimation module 115, a map construction module 116, and a path planning and motion control module 117.
- the indoor positioning and navigation device 100 can be built into a sweeping robot or an indoor robot, and is used to implement map construction and precise positioning and navigation of the robot.
- The Bluetooth module 111 is a basic circuit set of a chip integrating Bluetooth functions; it can be used for wireless network communication and realizes data transmission and audio transmission functions.
- the Bluetooth module 111 is located in the indoor positioning and navigation device 100 and has intelligent networking and positioning functions, specifically including the realization of networking and positioning of Bluetooth devices in the Bluetooth network.
- The Bluetooth module 111 obtains the Mesh network in which the indoor positioning and navigation device 100 is located, together with the address, attributes, RSSI (Received Signal Strength Indication), IQ (In-phase/Quadrature) data, angle (including direction angle and elevation angle), time of arrival, and other information of the Bluetooth nodes in the Bluetooth network.
- The unit of RSSI is dBm; the value is generally negative and can reflect the distance between two Bluetooth nodes in the Bluetooth network.
- the address information of the Bluetooth node is an identifier of the Bluetooth node, expressed in bytes; the attributes of the Bluetooth node include: friendly nodes, low-power nodes, relay nodes, standard nodes, and proxy nodes.
- The Bluetooth module 111 obtains the TOA (Time of Arrival), RSSI, IQ data, and AOA (Angle of Arrival) from surrounding Bluetooth nodes to the robot (including the indoor positioning and navigation device 100) through the positioning function.
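As a hedged illustration of how these quantities feed ranging (the patent does not reproduce its exact formulas in this text), RSSI is commonly converted to a distance estimate with a log-distance path-loss model. The reference power `tx_power_dbm` and path-loss exponent `n` below are assumed calibration constants, not values from the patent:

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, n=2.0):
    """Estimate distance in meters from an RSSI reading (dBm).

    tx_power_dbm: expected RSSI at 1 m (assumed calibration constant).
    n: path-loss exponent (~2 in free space, larger indoors).
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * n))

print(rssi_to_distance(-59.0))  # reference RSSI -> 1.0 m
print(rssi_to_distance(-79.0))  # 20 dB weaker -> 10.0 m with n = 2
```

Indoor multipath makes RSSI ranging noisy on its own, which is one reason the device also uses TOA, phase, and AOA measurements.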
- The Bluetooth module 111 constructs an RSSI ranging model and a phase difference ranging model, and builds positioning model equations based on TOA, RSSI, phase, and AOA, as follows:
- (x, y, z) represents the spatial position of the indoor positioning and navigation device 100
- the spatial position of the Bluetooth node B i is (X bi , Y bi , Z bi )
- θ i is the azimuth angle between the Bluetooth node B i and the Bluetooth antenna array plane of the indoor positioning and navigation device 100
- D 1i is the distance between the indoor positioning and navigation device 100 and the Bluetooth device B i obtained by the RSSI ranging model
- D 2i is the distance between the indoor positioning and navigation device 100 and the Bluetooth device B i obtained by the TOA ranging model
- D 3i is the distance between the indoor positioning and navigation device 100 and the Bluetooth device B i obtained by the phase difference ranging model.
- FIG. 4 is a positioning model diagram of a Bluetooth module in an indoor positioning and navigation device based on Bluetooth and SLAM provided by an embodiment of the present application. (X bi, Y bi, Z bi) represents the coordinates of node B i in the Bluetooth network, and B′ i is the projection of Bluetooth node B i onto a two-dimensional plane.
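Once distances such as D 1i, D 2i, and D 3i to several nodes are available, the device position can be estimated by trilateration. The following 2D sketch is an illustrative simplification (not the patent's model equations, which are omitted from this text): subtracting one anchor's range-circle equation from the others linearizes the problem into a 2×2 system.

```python
def trilaterate_2d(p1, p2, p3, d1, d2, d3):
    """Solve (x, y) from distances d1..d3 to known anchors p1..p3.

    Subtracting anchor 1's circle equation from anchors 2 and 3
    yields the linear system A @ [x, y] = b, solved by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero if the anchors are collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Anchors at (0,0), (4,0), (0,4); true position (1, 1).
x, y = trilaterate_2d((0, 0), (4, 0), (0, 4), 2**0.5, 10**0.5, 10**0.5)
print(round(x, 6), round(y, 6))  # 1.0 1.0
```

With noisy real-world distances, the three models' estimates would instead be fused (e.g. by least squares or Kalman filtering, as the description discusses below).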
- The vision module 112 is used to collect images around the robot and to perform grayscale processing, image correction, feature point extraction, and feature line segment extraction on the collected images; that is, the images are preprocessed to remove the influence of lens distortion on the feature description, and the ORB (Oriented FAST and Rotated BRIEF) feature detection method is used to extract feature points and feature line segments in the image.
- the wide-angle camera in the vision module 112 is used to collect image information of the environment where the robot is located.
- A 12×9 black-and-white checkerboard calibration board with 10 mm × 10 mm squares is used to calibrate the wide-angle camera in the vision module 112 of the indoor positioning and navigation device 100.
- the calibration method can avoid the shortcomings of high equipment requirements and cumbersome operation of the traditional method, and has higher accuracy compared with the current calibration method.
- the vision module 112 obtains the camera internal parameters and distortion parameter information.
- the camera internal parameters are the horizontal focal length fx, the vertical focal length fy, the principal point abscissa u0, and the principal point ordinate v0
- The distortion parameters include, but are not limited to, the radial distortion parameters k1, k2, and k3 and the tangential distortion parameters p1 and p2.
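These intrinsics and distortion parameters follow the standard radial/tangential (Brown–Conrady) camera model. A sketch of the forward distortion applied to a normalized image point; conversion to pixel coordinates via fx, fy, u0, v0 is omitted:

```python
def distort_point(x, y, k1, k2, k3, p1, p2):
    """Apply radial (k1, k2, k3) and tangential (p1, p2) distortion
    to a normalized image point (x, y)."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d

print(distort_point(0.1, 0.2, 0, 0, 0, 0, 0))  # (0.1, 0.2): no distortion
```

Image correction in Step 206 inverts this mapping (typically iteratively) so that straight scene edges stay straight, which matters for the line-segment features used later.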
- The odometer module 113 is used to calculate, through relative positioning, the change of the robot's position at each time relative to its position at the previous time, so as to estimate the position in real time; that is, to calculate for each time the X-direction offset Δx, the Y-direction offset Δy, and the angular offset Δθ about the Z axis relative to the previous time.
- Figure 5 is a motion model diagram of the odometer module in the Bluetooth and SLAM-based indoor positioning and navigation device provided by an embodiment of the present application.
- X(k-1) is the state quantity at time k-1
- X(k) is the state quantity at the next time, that is, the state quantity at time k
- u(k) is the offset of time k with respect to time k-1
- the values of u(k) are the X-direction offset Δx, the Y-direction offset Δy, and the angular offset Δθ about the Z axis.
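The motion model of Fig. 5 accumulates these per-step offsets into a pose. A minimal dead-reckoning sketch, assuming Δx and Δy are expressed in the robot frame of the previous time step (a common convention that the patent text does not spell out):

```python
import math

def integrate_odometry(state, u):
    """Propagate pose X(k-1) -> X(k) with odometry increment u(k).

    state: (x, y, theta) in the world frame at time k-1.
    u: (dx, dy, dtheta), with dx, dy assumed in the robot frame.
    """
    x, y, th = state
    dx, dy, dth = u
    # Rotate the body-frame increment into the world frame, then add.
    xw = x + dx * math.cos(th) - dy * math.sin(th)
    yw = y + dx * math.sin(th) + dy * math.cos(th)
    return (xw, yw, th + dth)

pose = (0.0, 0.0, 0.0)
pose = integrate_odometry(pose, (1.0, 0.0, math.pi / 2))  # forward 1 m, turn left
pose = integrate_odometry(pose, (1.0, 0.0, 0.0))          # forward 1 m again
print(pose[1])  # 1.0: the second step moved along world +Y
```

Pure dead reckoning like this drifts with slippage and dragging, which is exactly why the description fuses it with Bluetooth and vision observations.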
- The data storage and processing module 114 is coupled with the Bluetooth module 111, the vision module 112, and the odometer module 113, and is configured to receive the data information collected and calculated by the Bluetooth module 111, the vision module 112, and the odometer module 113.
- these data information also include the wide-angle camera internal parameters and distortion parameters collected by the vision module.
- the data calculated by the above data information includes the spatial position of the Bluetooth node, the spatial position of the feature point, and the real-time updated intelligent platform or robot position information obtained by the position fusion and estimation module.
- the location fusion and estimation module 115 is coupled with the data storage and processing module 114 to fuse the data in the data storage and processing module 114.
- the fusion algorithms include least squares method, LM algorithm, BA algorithm, and intelligent optimization algorithm (such as Genetic algorithm, particle swarm algorithm, ant colony algorithm, etc.), Kalman filter algorithm, etc.
- This application uses one of these algorithms, taking the Kalman filter as an example, to obtain the real-time position of the indoor positioning and navigation device 100; after Kalman filtering, real-time and accurate positioning of the device can be realized.
- This embodiment is illustrative and does not limit the present application.
- The map construction module 116 is coupled with the position fusion and estimation module 115 and is used to store the real-time position of the indoor positioning and navigation device 100 estimated by the position fusion and estimation module 115, and the three-dimensional environment map created from the spatial positions of the Bluetooth nodes and feature points in the data storage and processing module 114.
- The path planning and motion control module 117 is coupled with the map construction module 116 and is used for path planning and navigation according to the three-dimensional environment map stored in the map construction module 116.
- The path planning and motion control module 117 is also used to drive the robot accordingly and to obtain the current position information of the robot in real time.
- Each module is optionally implemented as logic, instructions stored on a non-transitory computer-readable medium, firmware, and/or a combination thereof.
- a processor may be provided to execute such instructions so that the existing indoor positioning and navigation device executes the method described herein.
- Fig. 2 is a working flow chart of the vision module in the Bluetooth and SLAM-based indoor positioning and navigation device provided by an embodiment of the present application. It includes the following steps:
- Step 202 The wide-angle camera in the vision module 112 collects image information around the indoor positioning and navigation device 100.
- Step 204 The vision module 112 preprocesses the collected images, including grayscale processing of the image.
- Step 206 The vision module 112 corrects the image distortion.
- Step 208 Extract the feature point information from the image information; the method used is the ORB feature extraction provided by OpenCV. This method is fast to compute and has a degree of robustness to noise and rotation. After processing the image with the ORB feature extraction method, a series of feature point data is obtained, and the feature information is stored in the feature database.
- the ORB feature points in the image include lights, wall corners, etc.
- Step 210 Extract a characteristic line segment in the image information.
- The characteristic line segment is an LSD (Line Segment Detector) line segment, generally the edge of a wall or an edge line of an object with a certain length, such as the edge line of a square lamp.
- Fig. 3 is a flowchart of the Kalman filtering of the position fusion and estimation module in the Bluetooth and SLAM-based indoor positioning and navigation device provided by an embodiment of the present application. Kalman filtering mainly includes the following steps:
- Step 302 sub-module initialization, specifically refers to the definition and initialization of variables and matrices involved in Kalman filtering.
- Step 304 The position fusion and estimation module establishes a motion equation and an observation equation. Specifically:
- The pose change U(k+1) = (Δx(k+1), Δy(k+1), Δθ(k+1)) of the indoor positioning and navigation device 100 at time k+1 is obtained from the odometer module 113 data.
- The system state X(k+1) at time k+1 is calculated from the system state X(k) at time k through the motion equation X(k+1) = X(k) + U(k+1) + Q(k+1).
- Q(k+1) is the motion equation noise at time k+1, determined by the positioning accuracy of the odometer module itself; U(k+1) is the pose change at time k+1.
- The system observation at time k+1 is then estimated from the predicted state, where:
- B i represents a Bluetooth node in the Bluetooth network; its coordinates (X bi, Y bi, Z bi) are observed relative to the system state quantity X(k+1).
- The three-dimensional coordinates of the feature point F i are (X fi, Y fi, Z fi), f is the focal length of the camera, W(k+1) is the observation equation noise at time k+1, and H is the Jacobian matrix of the observation equation with respect to the state quantity.
- Step 306 Update the covariance equation, the calculation formula is as follows:
- P(k+1) is the covariance matrix at k+1
- R is the observed noise covariance matrix
- Step 308 Update the gain matrix, the calculation formula is as follows:
- K is the gain matrix
- Step 310 Update the state vector, the calculation formula is as follows:
- X(k+1) is the state vector at time k+1
- L k+1 is the real observation at k+1
- f i (u i, v i) is the image coordinate at time k+1 corresponding to the feature point F i (X fi, Y fi, Z fi). Due to wheel slipping, dragging, or cumulative error, the position obtained from the equation of motion alone will be wrong. Therefore, the indoor positioning and navigation device 100 uses the observation equation composed of other indoor information, such as feature points and Bluetooth nodes, to correct it and improve the calculation accuracy.
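The predict/correct structure of Steps 304–310 can be illustrated with a scalar Kalman filter: the motion equation predicts the state from the odometry increment, and an external observation (e.g. a Bluetooth range) corrects it. This is a deliberate simplification of the multi-state filter above, with assumed noise values:

```python
def kalman_step(x, p, u, z, q=0.1, r=0.05):
    """One predict/correct cycle of a scalar Kalman filter.

    x, p: previous state estimate and its variance.
    u: odometry increment (the motion equation's input).
    z: external position observation, e.g. from Bluetooth ranging.
    q, r: motion and observation noise variances (assumed values).
    """
    # Predict with the motion equation: x(k+1) = x(k) + u(k+1) + noise.
    x_pred = x + u
    p_pred = p + q
    # Correct with the observation: gain, state, and covariance updates
    # mirroring Steps 306-310.
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

# Odometry says we moved 1.0 m, but a range observation reads 1.2 m;
# the fused estimate lands between them, weighted by the noise levels.
x, p = kalman_step(0.0, 1.0, 1.0, 1.2)
print(round(x, 3))  # 1.191
```

Because the observation pulls the drifting dead-reckoned prediction back toward the measured position, slippage and dragging errors stop accumulating, which is the correction effect described above.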
- FIG. 6 is a working flow chart of an indoor positioning and navigation device based on Bluetooth and SLAM provided by an embodiment of the present application.
- FIG. 6 will be described in conjunction with FIG. 1. Specifically include the following steps:
- Step 602 Calibrate the camera in the vision module 112.
- Step 604 Obtain data collected by the Bluetooth module 111, the vision module 112, and the odometer module 113.
- Step 606 Fusion of the collected data.
- For Kalman filter fusion, refer to the flowchart of FIG. 3 for the specific filtering steps.
- Step 608 The map construction module 116 creates a three-dimensional environment map according to the calculated real-time position of the indoor positioning and navigation device, the spatial position of the Bluetooth node, and the feature point.
- Step 610 The path planning and motion control module 117 performs positioning and navigation according to the created three-dimensional environment map.
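The Fig. 6 workflow can be sketched as a loop over the modules. All interfaces below are hypothetical stand-ins for the patent's modules, with hard-coded sensor readings standing in for real hardware:

```python
# Hypothetical module interfaces; the names are illustrative only,
# not APIs from the patent.
def read_bluetooth():   return {"node_1": {"rssi": -63.0}}
def read_camera():      return [("corner", (120, 85))]     # ORB-like features
def read_odometer():    return (0.05, 0.0, 0.01)           # (dx, dy, dtheta)

def slam_cycle(pose, world_map):
    """One cycle of the Fig. 6 workflow: collect, fuse, map, navigate."""
    bt, feats, odo = read_bluetooth(), read_camera(), read_odometer()
    # Fuse: here just dead-reckon; a real system would Kalman-filter
    # the Bluetooth/vision observations against this prediction.
    pose = (pose[0] + odo[0], pose[1] + odo[1], pose[2] + odo[2])
    world_map.append({"pose": pose, "bluetooth": bt, "features": feats})
    return pose, world_map

pose, world_map = (0.0, 0.0, 0.0), []
for _ in range(3):
    pose, world_map = slam_cycle(pose, world_map)
print(len(world_map))  # 3
```

Each cycle appends a map entry keyed by the fused pose; path planning (Step 610) would then query this accumulated map rather than raw sensor data.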
- The Bluetooth and SLAM-based indoor positioning and navigation device and method of the present application detect and track ORB feature points, collect Bluetooth node information and odometer mileage information, and construct a scene map through Kalman filtering for accurate robot positioning, path planning, and navigation, which can greatly improve positioning accuracy. Compared with existing indoor positioning and navigation technology, the method disclosed in the present application not only effectively reduces the influence of environmental factors to improve positioning accuracy, but also provides a more reliable and accurate three-dimensional environment map.
- The various modules in the aforementioned indoor positioning and navigation device, including the Bluetooth module, vision module, odometer module, data storage and processing module, position fusion and estimation module, map construction module, and path planning and motion control module, can be implemented in whole or in part by software, hardware, or a combination thereof.
- the above-mentioned modules may be embedded in the form of hardware or independent of the processor in the computer equipment, or may be stored in the memory of the computer equipment in the form of software, so that the processor can call and execute the operations corresponding to the above-mentioned modules.
- a computer device is provided.
- the computer device may be a movable intelligent platform or a robot, and its internal structure diagram may be as shown in FIG. 7.
- the computer equipment includes a processor, a memory, a network interface, an input device, a Bluetooth device and a driving device connected through a system bus.
- the processor of the computer device is used to provide calculation and control capabilities.
- the memory of the computer device includes a non-volatile storage medium and an internal memory.
- the non-volatile storage medium stores an operating system and computer readable instructions.
- the internal memory provides an environment for the operation of the operating system and computer-readable instructions in the non-volatile storage medium.
- the network interface of the computer device is used to communicate with an external terminal or server through a network connection.
- The input device of the computer device can be a touch layer covering the display screen, a button, trackball, or touchpad set on the housing of the computer device, or an external keyboard, touchpad, or mouse.
- the Bluetooth device of the computer device can communicate with other devices with Bluetooth communication capabilities, and the motion device of the computer device can be any form of device that can drive the computer device to move.
- the computer device may also include a display screen, and the display screen of the computer device may be a liquid crystal display screen or an electronic ink display screen.
- FIG. 7 is only a block diagram of a part of the structure related to the solution of the present application, and does not constitute a limitation on the computer device to which the solution of the present application is applied.
- A specific computer device may include more or fewer parts than shown in the figure, combine some parts, or have a different arrangement of parts.
- a computer device including a memory and one or more processors, and computer-readable instructions are stored in the memory.
- When the computer-readable instructions are executed by the one or more processors, the one or more processors execute a method including: calibrating the camera in the vision module; acquiring data collected by the Bluetooth module, vision module, and odometer module; fusing the collected data; creating a three-dimensional environment map from the calculated real-time position of the indoor positioning and navigation device and the spatial positions of the Bluetooth nodes and feature points; and performing positioning and navigation based on the created three-dimensional environment map.
- One or more computer program products are provided, each including a storage medium storing computer-readable instructions.
- When the computer-readable instructions are executed by one or more processors, the one or more processors implement a method including: calibrating the camera in the vision module; acquiring data collected by the Bluetooth module, vision module, and odometer module; fusing the collected data; creating a three-dimensional environment map from the calculated real-time position of the indoor positioning and navigation device and the spatial positions of the Bluetooth nodes and feature points; and performing positioning and navigation based on the created three-dimensional environment map.
- one or more non-volatile storage media storing computer-readable instructions are provided.
- When the computer-readable instructions are executed by one or more processors, the one or more processors execute a method including: calibrating the camera in the vision module; acquiring the data collected by the Bluetooth module, the vision module, and the odometer module; fusing the collected data; creating a three-dimensional environment map from the calculated real-time position of the indoor positioning and navigation device and the spatial positions of the Bluetooth nodes and feature points; and performing positioning and navigation based on the created three-dimensional environment map.
- One or more computer programs are provided which, when executed by one or more processors, execute a method including: calibrating the camera in the vision module; acquiring the data collected by the Bluetooth module, the vision module, and the odometer module; fusing the collected data; creating a three-dimensional environment map from the calculated real-time position of the indoor positioning and navigation device and the spatial positions of the Bluetooth nodes and feature points; and performing positioning and navigation based on the created three-dimensional environment map.
- the method executed by the one or more processors is the indoor positioning and navigation method in some embodiments of the present application.
- Non-volatile memory may include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
- Volatile memory may include random access memory (RAM) or external cache memory.
- RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM), etc.
- The indoor positioning and navigation device and method of the present application can construct a scene map based on the Bluetooth node information in the Mesh network, the image feature points and feature line segment information, and the real-time position information estimated by the odometer, and can be used for precise positioning and path planning of robots.
- Multiple positioning sensors can be used for data fusion, which provides strong anti-interference capability, realizes higher-precision positioning, effectively reduces the influence of environmental factors so as to improve positioning accuracy, and can provide a more reliable and accurate three-dimensional environment map, allowing the approach to be extended to wider applications.
- Bluetooth is not affected by light or slippage, but it can be blocked by obstacles such as walls and tables, resulting in low positioning accuracy. Therefore, in some embodiments, multiple positioning sensors such as Bluetooth, camera, and odometer are combined so that their advantages complement one another.
- The device or method of the present application constructs a scene map using the Bluetooth node information, the feature points and feature line segments collected by the vision module, and the real-time position information estimated by the odometer module, so as to achieve precise positioning and navigation.
Abstract
Description
[Corrected according to Rule 91 04.03.2021]
Fig. 6 is a working flow chart of an indoor positioning and navigation device based on Bluetooth and SLAM provided by one or more embodiments of the present application.
FIG. 7 is a diagram of the internal structure of a computer device provided by one or more embodiments of the present application.
Claims (15)
- An indoor positioning and navigation device, wherein the indoor positioning and navigation device is built into a movable intelligent platform or robot, comprising: a Bluetooth module for obtaining the Mesh network information of the indoor positioning and navigation device and obtaining the addresses, attributes, RSSI, IQ data, angles, and arrival times of multiple Bluetooth nodes in the Mesh network; a vision module for collecting images around the robot and performing grayscale processing, image correction, feature point extraction, and feature line segment extraction on the collected images; an odometer module for calculating, through relative positioning, the change of the robot's position at each time relative to its position at the previous time, and estimating the robot's position in real time; a data storage and processing module for receiving the data information collected and calculated by the Bluetooth module, the vision module, and the odometer module; a position fusion and estimation module for fusing the data in the data storage and processing module to obtain the real-time position of the indoor positioning and navigation device; a map building module for storing the real-time position of the indoor positioning and navigation device estimated by the position fusion and estimation module, and the three-dimensional environment map created from the spatial position information of the Bluetooth nodes and feature points in the data storage and processing module; and a path planning and motion control module for driving the robot and performing path planning and navigation according to the three-dimensional environment map created by the map building module.
- An indoor positioning and navigation method, comprising: calibrating the camera in the vision module; acquiring the data collected by the Bluetooth module, the vision module, and the odometer module; fusing the collected data; creating a three-dimensional environment map from the calculated real-time position of the indoor positioning and navigation device and the spatial positions of the Bluetooth nodes and feature points; and performing positioning and navigation according to the created three-dimensional environment map.
- 3. The method according to claim 2, wherein the Bluetooth module obtains Mesh network information for the indoor positioning and navigation device, and obtains the addresses, attributes, RSSI, IQ data, angles, and times of arrival of multiple Bluetooth nodes in the Mesh network.
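The claim lists RSSI among the quantities the Bluetooth module collects. One common way to turn an RSSI value into a range is the log-distance path-loss model; the reference power and exponent below are illustrative assumptions, and the patent does not spell out its RSSI ranging model:

```python
def rssi_to_distance(rssi_dbm: float,
                     rssi_at_1m: float = -59.0,
                     path_loss_exponent: float = 2.0) -> float:
    """Log-distance path-loss model: RSSI(d) = RSSI(1 m) - 10*n*log10(d),
    inverted to recover the distance d in metres."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exponent))

print(rssi_to_distance(-59.0))  # at the reference power → 1.0 m
print(rssi_to_distance(-79.0))  # 20 dB weaker, n = 2 → 10.0 m
```

In practice the reference power and exponent are fitted per environment, since walls and multipath change both.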
- 4. The method according to claim 2 or 3, wherein the vision module captures images around the robot and performs grayscale conversion, image rectification, feature point extraction, and feature line segment extraction on the captured images.
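The grayscale step of the vision module can be sketched with plain NumPy; the BT.601 luma weights used here are a common convention, an assumption rather than anything the patent names:

```python
import numpy as np

BT601 = np.array([0.299, 0.587, 0.114])  # common luma weights

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Weighted sum over the colour axis of an H x W x 3 image."""
    return rgb @ BT601

img = np.zeros((4, 4, 3))
img[..., 0] = 1.0            # a pure-red test image
gray = to_grayscale(img)
```

Feature point and line segment extraction would then run on `gray`; detectors such as ORB corners or LSD line segments (available in libraries like OpenCV) are typical choices, though the claim does not name one.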
- 5. The method according to any one of claims 2-4, wherein the odometer module computes, by relative positioning, the change of the robot's position at each time instant relative to the previous one, thereby estimating the robot's position in real time.
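Relative positioning as described in this claim is dead reckoning. A textbook differential-drive update, offered as a sketch rather than the patent's actual kinematic model, accumulates wheel displacements into a pose:

```python
import math

def integrate_odometry(pose, d_left, d_right, wheel_base):
    """Dead-reckon the next pose (x, y, theta) from left/right wheel
    displacements, using the midpoint heading for the translation."""
    x, y, theta = pose
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / wheel_base
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return (x, y, theta + d_theta)

# Equal wheel displacements: the robot drives 0.1 m straight ahead.
pose = integrate_odometry((0.0, 0.0, 0.0), 0.1, 0.1, 0.3)
```

Because each step only measures change relative to the previous pose, the error grows without bound, which is why the device fuses odometry with Bluetooth and vision.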
- 6. The method according to any one of claims 2-5, wherein the data fusion is performed by the position fusion and estimation module.
- 7. The device or method according to any one of claims 1-6, wherein the Bluetooth module, together with an RSSI ranging model and a phase difference ranging model, constructs positioning model equations based on TOA, RSSI, phase, and AOA, and computes from these equations the real-time position of the indoor positioning and navigation device, where (x, y, z) is the spatial position of the device; (X_bi, Y_bi, Z_bi) is the spatial position of Bluetooth node B_i; θ_i is the azimuth angle, and φ_i the elevation angle, between node B_i and the plane of the device's Bluetooth antenna array; D_1i is the distance between the device and node B_i obtained from the RSSI ranging model; D_2i is the distance obtained from the TOA ranging model; and D_3i is the distance obtained from the phase difference ranging model.
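The claim constrains the unknown position (x, y, z) by per-node distances D_1i, D_2i, D_3i and angles; its rendered equation system is not reproduced here. As a hedged illustration, the distance constraints alone can be solved by linearized least squares — a standard trilateration trick, not necessarily the patent's exact formulation:

```python
import numpy as np

def trilaterate(anchors: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """Solve ||p - a_i|| = d_i for p by subtracting the first anchor's
    equation from the rest, which cancels |p|^2 and leaves a linear
    system A p = b."""
    a0, d0 = anchors[0], distances[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (d0 ** 2 - distances[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(a0 ** 2))
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

anchors = np.array([[0.0, 0.0, 0.0],
                    [5.0, 0.0, 0.0],
                    [0.0, 5.0, 0.0],
                    [0.0, 0.0, 5.0]])
true_p = np.array([1.0, 2.0, 3.0])
distances = np.linalg.norm(anchors - true_p, axis=1)
estimate = trilaterate(anchors, distances)
```

With three independent distance estimates per node (RSSI, TOA, phase), each can contribute a row, and the least-squares solve weighs them jointly.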
- 8. The device, method, non-volatile storage medium, or program according to any one of claims 1-7, wherein the camera in the vision module is calibrated with a 12×9 black-and-white checkerboard calibration board whose squares are 10 mm × 10 mm; optionally, the camera in the vision module includes camera intrinsic parameters and distortion parameters.
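Checkerboard calibration recovers the intrinsic and distortion parameters this claim mentions. What the distortion parameters mean can be shown with the Brown-Conrady radial model, a common parameterization; the patent does not say which model it uses, and the k1, k2 values below are illustrative:

```python
def apply_radial_distortion(xy, k1: float, k2: float):
    """Forward Brown-Conrady radial model on normalized image
    coordinates: each point is scaled by 1 + k1*r^2 + k2*r^4."""
    x, y = xy
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return (x * scale, y * scale)

# Mild barrel distortion (negative k1) pulls points toward the centre.
xd, yd = apply_radial_distortion((0.1, 0.2), k1=-0.2, k2=0.05)
```

Calibration routines (e.g. OpenCV's `calibrateCamera`) estimate k1, k2 and the intrinsic matrix by observing the checkerboard from many poses; image rectification then inverts this model.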
- 9. The indoor positioning and navigation device, method, non-volatile storage medium, or program according to any one of claims 1-8, wherein the vision module is further configured to correct image distortion.
- 10. The device or method according to any one of claims 1-9, wherein fusing the data comprises:
  performing fusion using at least one selected from the group consisting of the least squares method, the LM (Levenberg-Marquardt) algorithm, the BA (bundle adjustment) algorithm, intelligent optimization algorithms, and the Kalman filter algorithm,
  wherein the Kalman filter algorithm comprises at least one of: initializing each module, establishing the motion equation, establishing the observation equation, updating the covariance matrix, updating the gain matrix, and updating the state vector.
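Of the fusion methods this claim lists, the least-squares option has a particularly compact form when the modules deliver independent position estimates with known covariances. The inverse-covariance weighting below is a standard result, sketched under the assumption of independent Gaussian errors:

```python
import numpy as np

def fuse_estimates(estimates, covariances):
    """Information-weighted least-squares fusion: sum the inverse
    covariances (information), then average the estimates with those
    weights. Returns the fused estimate and its covariance."""
    info = sum(np.linalg.inv(C) for C in covariances)
    weighted = sum(np.linalg.inv(C) @ x
                   for x, C in zip(estimates, covariances))
    P = np.linalg.inv(info)
    return P @ weighted, P

x_bt = np.array([1.0, 2.0])   # e.g. a Bluetooth position fix
x_vo = np.array([1.2, 1.8])   # e.g. a visual-odometry position fix
fused, P = fuse_estimates([x_bt, x_vo],
                          [np.eye(2) * 4.0, np.eye(2) * 1.0])
```

The less certain Bluetooth fix (variance 4) is down-weighted, so the fused position lies much closer to the visual estimate, and the fused covariance is smaller than either input's.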
- 11. The device or method according to any one of claims 1-10, wherein fusing the data comprises:
  estimating the system state X(k+1|k) of the indoor positioning and navigation device at time k+1 by the formula
  X(k+1|k) = X(k) + U_{k+1} + Q_{k+1}
  where X(k) is the system state at time k, Q_{k+1} is the motion equation noise at time k+1, and U_{k+1} is the attitude change at time k+1;
  optionally, estimating the system observation at time k+1 from the system state X(k+1|k) by the observation equation, wherein the coordinates (X_bi, Y_bi, Z_bi) of Bluetooth node B_i in the Bluetooth network determine its azimuth angle θ_i and elevation angle φ_i relative to the system state X(k+1|k) at time k+1; F_i denotes a feature point whose spatial three-dimensional coordinates are (X_fi, Y_fi, Z_fi); f is the focal length of the camera; W_{k+1} is the observation equation noise at time k+1; and H is the Jacobian matrix of the observation equation with respect to the state.
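The additive motion model X(k+1|k) = X(k) + U_{k+1} + Q_{k+1} and the bearing part of the observation model can be transcribed directly. The azimuth/elevation construction below is the standard geometric one; the claim's full observation equation, including the camera projection of feature points, is not reproduced here:

```python
import numpy as np

def predict_state(x_k, u_k1, q_k1=None):
    """X(k+1|k) = X(k) + U_{k+1} + Q_{k+1}; pass q_k1=None for the
    noise-free prediction so the arithmetic is checkable."""
    x = np.asarray(x_k, dtype=float) + np.asarray(u_k1, dtype=float)
    return x if q_k1 is None else x + np.asarray(q_k1, dtype=float)

def predicted_bearings(p, b):
    """Azimuth and elevation of Bluetooth node b as seen from
    position p: the geometric core of the observation equation."""
    dx, dy, dz = np.asarray(b, dtype=float) - np.asarray(p, dtype=float)
    azimuth = np.arctan2(dy, dx)
    elevation = np.arctan2(dz, np.hypot(dx, dy))
    return azimuth, elevation

x_pred = predict_state([1.0, 2.0, 0.1], [0.5, 0.0, 0.05])
az, el = predicted_bearings([0.0, 0.0, 0.0], [1.0, 1.0, 0.0])
```

Linearizing `predicted_bearings` (and the camera projection of each F_i) around X(k+1|k) is what produces the Jacobian H used in the next claim's update.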
- 12. The device or method according to any one of claims 1-11, wherein fusing the data comprises:
  updating the covariance by
  P(k+1) = (P(k+1|k)^{-1} + H^T R^{-1} H)^{-1}
  where P(k+1) is the covariance matrix at time k+1 and R is the observation noise covariance matrix;
  and/or updating the gain matrix by
  K = P(k+1) * H^T * R^{-1}
  where K is the gain matrix;
  and/or updating the state vector by
  X(k+1) = X(k+1|k) + K * ΔL
  where ΔL is the observation residual.
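The three formulas in this claim are the information-form Kalman update and can be transcribed almost literally; ΔL, the observation residual, is supplied by the caller:

```python
import numpy as np

def kf_update(x_pred, P_pred, H, R, dL):
    """Information-form Kalman update, matching the claim term by term:
    P(k+1) = (P(k+1|k)^-1 + H^T R^-1 H)^-1
    K      = P(k+1) H^T R^-1
    X(k+1) = X(k+1|k) + K * dL
    """
    R_inv = np.linalg.inv(R)
    P = np.linalg.inv(np.linalg.inv(P_pred) + H.T @ R_inv @ H)
    K = P @ H.T @ R_inv
    x = np.asarray(x_pred, dtype=float) + K @ np.asarray(dL, dtype=float)
    return x, P, K

# Scalar sanity case: equal prior and measurement variance halves the
# uncertainty and moves the state halfway toward the measurement.
x, P, K = kf_update([0.0], np.eye(1), np.eye(1), np.eye(1), [1.0])
```

This form is algebraically equivalent to the textbook gain-first update K = P(k+1|k) H^T (H P(k+1|k) H^T + R)^{-1}, but inverts in the state dimension rather than the measurement dimension.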
- 13. A non-volatile storage medium storing computer-readable instructions which, when executed by one or more processors, cause the one or more processors to perform the method according to any one of claims 2-12.
- 14. A computer program product comprising a storage medium storing computer-readable instructions which, when executed by one or more processors, cause the one or more processors to perform the method according to any one of claims 2-12.
- 15. A computer program which, when executed by a processor, implements the method according to any one of claims 2-12.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010009926.2A CN113074727A (en) | 2020-01-06 | 2020-01-06 | Indoor positioning navigation device and method based on Bluetooth and SLAM |
CN202010009926.2 | 2020-01-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021139590A1 true WO2021139590A1 (en) | 2021-07-15 |
Family
ID=76609029
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/141624 WO2021139590A1 (en) | 2020-01-06 | 2020-12-30 | Indoor localization and navigation apparatus based on bluetooth and slam, and method therefor |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113074727A (en) |
WO (1) | WO2021139590A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113949999B (en) * | 2021-09-09 | 2024-01-30 | 之江实验室 | Indoor positioning navigation equipment and method |
CN114136306B (en) * | 2021-12-01 | 2024-05-07 | 浙江大学湖州研究院 | Expandable device and method based on relative positioning of UWB and camera |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150268058A1 (en) * | 2014-03-18 | 2015-09-24 | Sri International | Real-time system for multi-modal 3d geospatial mapping, object recognition, scene annotation and analytics |
US20160349362A1 (en) * | 2015-05-08 | 2016-12-01 | 5D Robotics, Inc. | Mobile localization using sparse time-of-flight ranges and dead reckoning |
CN108801265A (en) * | 2018-06-08 | 2018-11-13 | 武汉大学 | Multidimensional information synchronous acquisition, positioning and position service apparatus and system and method |
WO2019000417A1 (en) * | 2017-06-30 | 2019-01-03 | SZ DJI Technology Co., Ltd. | Map generation systems and methods |
CN109541535A (en) * | 2019-01-11 | 2019-03-29 | 浙江智澜科技有限公司 | A method of AGV indoor positioning and navigation based on UWB and vision SLAM |
CN110308729A (en) * | 2019-07-18 | 2019-10-08 | 石家庄辰宙智能装备有限公司 | The AGV combined navigation locating method of view-based access control model and IMU or odometer |
- 2020
- 2020-01-06 CN CN202010009926.2A patent/CN113074727A/en active Pending
- 2020-12-30 WO PCT/CN2020/141624 patent/WO2021139590A1/en active Application Filing
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113587917A (en) * | 2021-07-28 | 2021-11-02 | 北京百度网讯科技有限公司 | Indoor positioning method, device, equipment, storage medium and computer program product |
CN113568413A (en) * | 2021-08-19 | 2021-10-29 | 深圳中智永浩机器人有限公司 | Robot safety guarantee method and device, computer equipment and storage medium |
CN114001743A (en) * | 2021-10-29 | 2022-02-01 | 京东方科技集团股份有限公司 | Map drawing method, map drawing device, map drawing system, storage medium, and electronic apparatus |
CN114025320A (en) * | 2021-11-08 | 2022-02-08 | 易枭零部件科技(襄阳)有限公司 | Indoor positioning method based on 5G signal |
CN114205748B (en) * | 2021-12-08 | 2023-03-10 | 珠海格力电器股份有限公司 | Network configuration method and device, electronic equipment and storage medium |
CN114205748A (en) * | 2021-12-08 | 2022-03-18 | 珠海格力电器股份有限公司 | Network configuration method and device, electronic equipment and storage medium |
CN114510044A (en) * | 2022-01-25 | 2022-05-17 | 北京圣威特科技有限公司 | AGV navigation ship navigation method and device, electronic equipment and storage medium |
CN115334448B (en) * | 2022-08-15 | 2024-03-15 | 重庆大学 | Accurate dynamic positioning method of unmanned self-following device based on Bluetooth and inertial sensor |
CN115334448A (en) * | 2022-08-15 | 2022-11-11 | 重庆大学 | Accurate dynamic positioning method of unmanned self-following device based on Bluetooth and inertial sensor |
CN115218907B (en) * | 2022-09-19 | 2022-12-09 | 季华实验室 | Unmanned aerial vehicle path planning method and device, electronic equipment and storage medium |
CN115218907A (en) * | 2022-09-19 | 2022-10-21 | 季华实验室 | Unmanned aerial vehicle path planning method and device, electronic equipment and storage medium |
CN115802282A (en) * | 2022-12-16 | 2023-03-14 | 兰笺(苏州)科技有限公司 | Wireless signal field co-location method and device |
CN115802282B (en) * | 2022-12-16 | 2024-06-07 | 兰笺(苏州)科技有限公司 | Co-location method and device for wireless signal field |
CN115808170A (en) * | 2023-02-09 | 2023-03-17 | 宝略科技(浙江)有限公司 | Indoor real-time positioning method integrating Bluetooth and video analysis |
CN117119585A (en) * | 2023-08-26 | 2023-11-24 | 江苏蓝策电子科技有限公司 | Bluetooth positioning navigation system and method |
CN117119585B (en) * | 2023-08-26 | 2024-02-06 | 江苏蓝策电子科技有限公司 | Bluetooth positioning navigation system and method |
CN116954235A (en) * | 2023-09-21 | 2023-10-27 | 深圳大工人科技有限公司 | AGV trolley navigation control method and system |
CN116954235B (en) * | 2023-09-21 | 2023-11-24 | 深圳大工人科技有限公司 | AGV trolley navigation control method and system |
Also Published As
Publication number | Publication date |
---|---|
CN113074727A (en) | 2021-07-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021139590A1 (en) | Indoor localization and navigation apparatus based on bluetooth and slam, and method therefor | |
CN109887057B (en) | Method and device for generating high-precision map | |
CN111156998B (en) | Mobile robot positioning method based on RGB-D camera and IMU information fusion | |
WO2021026850A1 (en) | Qr code-based navigation attitude determining and positioning method and system | |
CN111089585A (en) | Mapping and positioning method based on sensor information fusion | |
JP5992184B2 (en) | Image data processing apparatus, image data processing method, and image data processing program | |
JP7300550B2 (en) | METHOD AND APPARATUS FOR CONSTRUCTING SIGNS MAP BASED ON VISUAL SIGNS | |
CN111121754A (en) | Mobile robot positioning navigation method and device, mobile robot and storage medium | |
CN110118556A (en) | A kind of robot localization method and device based on covariance mixing together SLAM | |
CN112184812B (en) | Method for improving identification and positioning precision of unmanned aerial vehicle camera to april tag and positioning method and system | |
CN113763548B (en) | Vision-laser radar coupling-based lean texture tunnel modeling method and system | |
WO2019136613A1 (en) | Indoor locating method and device for robot | |
WO2020019115A1 (en) | Fusion mapping method, related device and computer readable storage medium | |
CN112967344B (en) | Method, device, storage medium and program product for calibrating camera external parameters | |
CN112734765A (en) | Mobile robot positioning method, system and medium based on example segmentation and multi-sensor fusion | |
US11067694B2 (en) | Locating method and device, storage medium, and electronic device | |
CN114111776B (en) | Positioning method and related device | |
WO2024027350A1 (en) | Vehicle positioning method and apparatus, computer device and storage medium | |
CN114758011B (en) | Zoom camera online calibration method fusing offline calibration results | |
CN111856499A (en) | Map construction method and device based on laser radar | |
Choi et al. | Monocular SLAM with undelayed initialization for an indoor robot | |
KR20220058846A (en) | Robot positioning method and apparatus, apparatus, storage medium | |
CN113252066B (en) | Calibration method and device for parameters of odometer equipment, storage medium and electronic device | |
CN111736137B (en) | LiDAR external parameter calibration method, system, computer equipment and readable storage medium | |
Xue et al. | Visual-Marker Based Localization for Flat-Variation Scene |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20912908 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20912908 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 03/02/2023) |