CN115265534A - Multi-sensor fusion positioning navigation method, device and system based on AprilTag code - Google Patents

Multi-sensor fusion positioning navigation method, device and system based on AprilTag code

Info

Publication number
CN115265534A
Authority
CN
China
Prior art keywords
robot
pose
code
grid map
data
Prior art date
Legal status
Pending
Application number
CN202210921400.0A
Other languages
Chinese (zh)
Inventor
任妮
张文翔
张兵园
贡宇
卢鑫羽
周玲莉
程雅雯
Current Assignee
Nanjing Sunongxin Data Technology Co ltd
Jiangsu Academy of Agricultural Sciences
Original Assignee
Nanjing Sunongxin Data Technology Co ltd
Jiangsu Academy of Agricultural Sciences
Priority date
Filing date
Publication date
Application filed by Nanjing Sunongxin Data Technology Co ltd, Jiangsu Academy of Agricultural Sciences
Priority to CN202210921400.0A
Publication of CN115265534A
Legal status: Pending

Classifications

    • G01C21/1652 — Navigation by dead reckoning; inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G01C21/1656 — Navigation by dead reckoning; inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • G01C21/206 — Instruments for performing navigational calculations specially adapted for indoor navigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a multi-sensor fusion positioning navigation method, device and system based on AprilTag codes. The method comprises the following steps: planning a path on a grid map according to an initial pose and a target point pose to obtain a planned path; controlling the robot to walk along the planned path based on a positioning algorithm while acquiring image data through a camera; when the camera acquires an image containing an AprilTag code, decoding the AprilTag code in the image by using the AprilTag library to obtain the ID data of the AprilTag code and the relative pose of the camera and the AprilTag code; calculating the accurate pose of the robot according to the relative pose and the pose data of the AprilTag code in the grid map; feeding the accurate pose back to the positioning algorithm, which relocates the robot on the grid map; and updating the moving path of the robot in the grid map. The technical scheme of the invention enables the robot to perform autonomous positioning and navigation in a greenhouse stably over long periods.

Description

Multi-sensor fusion positioning navigation method, device and system based on AprilTag code
Technical Field
The invention relates to the technical field of positioning and navigation of agricultural robots, in particular to a method, a device and a system for multi-sensor fusion positioning and navigation based on AprilTag codes.
Background
Against the background of agricultural intelligentization, autonomous operation equipment such as inspection robots has appeared in greenhouses. To realize autonomous navigation, movement and subsequent operation functions, such a robot must accurately sense its position in the greenhouse environment. The navigation function package under the Robot Operating System (ROS) is an existing basic technology for realizing robot positioning and navigation. At present, a robot in a greenhouse environment mainly uses a laser radar to obtain feature information of the surrounding environment, combines it with sensors such as a wheel encoder and an inertial measurement unit to obtain the motion information of the robot, and fuses the multi-source information with the navigation function package to realize positioning and navigation.
The prior art has the defect that when feature points in the environment are insufficient or the scene contains too many similar-looking regions, the reliability of the laser radar is poor, and the data of the wheel encoder, the inertial measurement unit and other sensors accumulate errors over time, so that the reliability of the sensor data decreases. The planting-ridge channels of an existing greenhouse present similar environments and cultivation shelves with few distinguishing features, so the existing fusion positioning navigation technology of a laser radar combined with sensors such as a wheel encoder and an inertial measurement unit cannot effectively solve the problem of the robot losing its position in the global map; as a result, the robot cannot perform autonomous positioning and navigation in the greenhouse stably over long periods.
The prior art also navigates robots by means of conventional two-dimensional codes. However, the coding structure of a traditional two-dimensional code is complex, so decoding is computationally expensive and slow, real-time performance is poor, and the codes are sensitive to illumination changes. In addition, a large number of codes must be laid on the ground and the position of each code must be precisely measured during deployment, which entails a heavy workload; moreover, the codes are easily soiled and worn, making maintenance difficult.
Disclosure of Invention
The purpose of the invention is as follows: in order to overcome the defects in the prior art, the invention provides a multi-sensor fusion positioning navigation method, device and system based on AprilTag codes, which enable a robot to perform autonomous positioning and navigation in a greenhouse stably over long periods.
The technical scheme is as follows: in order to achieve the above object, the multi-sensor fusion positioning navigation method based on AprilTag codes of the present invention includes:
performing path planning on the grid map according to the initial pose and the target point pose to obtain a planned path, wherein the planned path passes through at least one AprilTag code; each AprilTag code has corresponding pose data in the grid map, and ID data corresponding to different AprilTag codes are different;
controlling the robot to walk along the planned path based on a positioning algorithm, and simultaneously acquiring image data through a camera;
when the camera acquires an image containing an AprilTag code, decoding the AprilTag code in the image by using the AprilTag library to obtain ID data of the AprilTag code and the relative pose of the camera and the AprilTag code;
calculating the accurate pose of the robot according to the relative pose and pose data of the AprilTag code in the grid map;
feeding back the accurate pose to the positioning algorithm, and repositioning the robot on the grid map by the positioning algorithm;
and updating the moving path of the robot in the grid map.
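As a minimal sketch of how these steps fit together, the following Python fragment shows the plan-walk-detect-correct loop; the callables it accepts (plan_path, step_along_path, detect_tag and so on) are hypothetical placeholders for the planning, control, detection and localization components described above, not components named in the invention.

    # Hypothetical orchestration of the method's steps; all callables are placeholders.
    from typing import Callable, Optional, Tuple

    Pose = Tuple[float, float, float]  # (x, y, yaw) in the grid-map frame


    def navigate_with_apriltag_correction(
        plan_path: Callable[[Pose, Pose], list],               # path planning on the grid map
        step_along_path: Callable[[list], bool],                # one motion step under the positioning algorithm
        detect_tag: Callable[[], Optional[Tuple[int, Pose]]],   # (tag ID, camera-to-tag relative pose) or None
        tag_pose_in_map: Callable[[int], Pose],                 # pose data of the tag in the grid map, by ID
        fuse_robot_pose: Callable[[Pose, Pose], Pose],          # accurate robot pose from the two poses above
        relocalize: Callable[[Pose], None],                     # feed the accurate pose back to the positioning algorithm
        replan_from: Callable[[Pose, Pose], list],              # update the moving path in the grid map
        start: Pose,
        goal: Pose,
    ) -> None:
        path = plan_path(start, goal)
        while not step_along_path(path):                        # returns True when the goal is reached
            detection = detect_tag()
            if detection is None:
                continue                                        # keep walking on lidar/odometry only
            tag_id, cam_to_tag = detection
            robot_pose = fuse_robot_pose(cam_to_tag, tag_pose_in_map(tag_id))
            relocalize(robot_pose)                              # positioning algorithm relocates the robot
            path = replan_from(robot_pose, goal)                # updated moving path in the grid map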
Further, the method further comprises a grid map construction process, wherein the grid map construction process comprises the following steps:
controlling the robot to walk in the greenhouse by using an SLAM method, acquiring data through a sensor module, and establishing an original grid map according to the acquired data;
and executing editing operation, and editing the original grid map to obtain the grid map.
Further, the grid map construction process further includes:
and writing the corresponding pose data of each AprilTag code in the grid map into a pose transformation program in a control unit.
Further, the calculating the precise pose of the robot from the relative pose and the pose data of the AprilTag code in the grid map comprises:
calculating to obtain the relative pose of the robot and an AprilTag code through the TF transformation relation between a camera coordinate system and a robot chassis coordinate system;
inquiring the pose data of the AprilTag code in the grid map according to the ID data of the AprilTag code;
and calculating the accurate pose of the robot according to the relative pose and the pose data.
Further, the controlling the robot to walk along the planned path based on the positioning algorithm comprises:
controlling a laser radar to scan the surrounding environment to obtain point cloud information;
estimating the position of the robot at the current moment by utilizing an AMCL (Adaptive Monte Carlo Localization) algorithm based on the point cloud information, the data of other sensors and the grid map to obtain an estimated position;
and correcting the moving path of the robot in real time according to the estimated position.
Further, the repositioning the robot on the grid map by the positioning algorithm comprises:
redistributing the particle cloud through an AMCL positioning algorithm;
and point cloud data returned by the laser radar are acquired, and the position of the robot on the grid map is repositioned.
AprilTag code-based multi-sensor fusion positioning navigation device, comprising:
the planning module is used for planning a path on the grid map according to the initial pose and the pose of the target point to obtain a planned path, and the planned path passes through at least one AprilTag code; each AprilTag code has corresponding pose data in the grid map, and ID data corresponding to different AprilTag codes are different;
a navigation acquisition module for controlling the robot to walk along the planned path based on a positioning algorithm while acquiring image data by a camera;
a first computing module, configured to, when an image including an AprilTag code is captured by the camera, decode the AprilTag code in the image by using an AprilTag library, and obtain ID data of the AprilTag code and a relative pose of the camera and the AprilTag code;
a second calculation module for calculating an accurate pose of the robot from the relative pose and pose data of the AprilTag code in the grid map;
the positioning updating module is used for feeding back the accurate pose to the positioning algorithm, and the positioning algorithm relocates the robot on the grid map;
a path update module for updating a movement path of the robot in the grid map.
The multi-sensor fusion positioning system based on AprilTag codes comprises a robot for executing operations in a greenhouse and a control unit arranged inside and/or outside the robot; AprilTag codes are dispersedly attached to the ground in the greenhouse; the robot is provided with a sensor module and a camera, the lens of the camera facing downwards to acquire ground images; and the control unit can perform data interaction with the sensor module and the camera to execute the AprilTag-code-based multi-sensor fusion positioning navigation method.
Beneficial effects: with the AprilTag-code-based multi-sensor fusion positioning navigation method, device and system of the invention, the accurate 3D position and orientation of an AprilTag code relative to the camera, together with its ID data, can be calculated directly with the AprilTag library, and the calculation efficiency is high; this avoids the complex structure, high decoding complexity, long decoding time, poor real-time performance and sensitivity to illumination change of a traditional two-dimensional code. In addition, because the robot relies mainly on the sensor module for navigation, the AprilTag codes can be sparsely distributed, greatly reducing the number of codes required; it is only necessary for the planned path to pass through an AprilTag code each time a positioning correction is performed, so that the pose of the robot can be corrected in time while the robot is controlled to move, ensuring the accuracy of the robot's motion.
Drawings
FIG. 1 is a schematic layout of a greenhouse area;
FIG. 2 is a schematic view of the robot;
FIG. 3 is a schematic flow chart of a greenhouse agricultural robot positioning and navigation method for multi-sensor fusion based on AprilTag codes;
FIG. 4 is a schematic diagram of a greenhouse agricultural robot positioning navigation system based on AprilTag codes for multi-sensor fusion.
Detailed Description
The present invention will be further described with reference to the accompanying drawings.
The greenhouse agricultural robot positioning and navigation method based on AprilTag codes for multi-sensor fusion is implemented by a control unit that controls the operation of the robot 10. As shown in FIG. 2, the robot 10 is provided with a walking mechanism 11 and a sensor module, and further comprises a camera 12 for acquiring ground images. The camera 12 is an industrial CCD camera installed horizontally at the front of the robot chassis with its lens facing downwards; after installation, the orientation of the camera image must correspond to the actual environment, i.e. the top of the image corresponds to the direction of the vehicle head and the left side of the image corresponds to the left side of the vehicle body. In this embodiment, the sensor module includes a laser radar 13, an IMU and an odometer. As shown in FIG. 1, a plurality of AprilTag codes 20 are provided on the floor of the greenhouse in which the robot 10 operates; all AprilTag codes 20 are dispersedly fixed on the floor, and cultivation shelves 30 are arranged in the greenhouse. The control unit is a general term and may include a plurality of control elements capable of communicating with each other, such as an upper computer and a lower computer.
Based on the above, as shown in FIG. 3, the method of the invention for positioning and navigating the greenhouse agricultural robot 10 with multi-sensor fusion based on AprilTag codes comprises the following steps S101-S106:
step S101, planning a path on a grid map according to an initial pose and a target pose to obtain a planned path, wherein the planned path passes through at least one AprilTag code 20;
in this step, the grid map is constructed in advance through a map construction process, each aprilat code 20 has corresponding pose data in the grid map, and the ID data corresponding to different aprilat codes 20 are different; the planned path comprises a global path and a local path. When the robot 10 runs, the control unit executes tasks in the task list, each task has data of a task point pose, before step S101, the control unit may first group the tasks according to preset conditions, so that at least the tasks and the adjacent tasks thereof are combined into a stage task, and plan a path for the stage task when step S101 is executed, when the path is planned for the stage task, the task point pose corresponding to the last task in the stage task is a target point pose, and the task point poses corresponding to other tasks are path point poses, that is, when step S101 is executed, the planned path is required to pass through all path point poses in the stage task, in addition to passing through at least one aprilat code 20. By the scheme, the problem that a proper April Tag code 20 is difficult to find as a passing April Tag code 20 when a path is directly planned for a single task can be avoided, or a planned path needs to pass through a certain April Tag code 20 around a long path.
Step S102, controlling the robot 10 to walk along the planned path based on a positioning algorithm, and simultaneously acquiring image data through a camera;
in this step, the control unit controls the robot 10 to travel along the planned path by using the data detected by the sensor module, for example, the control unit controls the robot 10 to travel by using the point cloud information collected by the laser radar, or controls the robot 10 to move by fusing the data collected by the laser radar and the odometer. The camera simply captures images while the robot 10 is in motion.
Step S103, when the camera acquires an image containing an AprilTag code 20, decoding the AprilTag code 20 in the image by using the AprilTag library to obtain the ID data of the AprilTag code 20 and the relative pose between the camera and the AprilTag code 20;
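As an illustration of this decoding step, the following sketch assumes the pupil-apriltags Python bindings of the AprilTag library (the patent does not name a specific binding); the camera intrinsics and the 0.16 m tag size are illustrative values only, not values from the patent.

    # Sketch of step S103 using the pupil-apriltags bindings (an assumption).
    import cv2
    from pupil_apriltags import Detector

    detector = Detector(families="tag36h11")  # tag36h11 family, as used in the embodiment

    def decode_tag(frame_bgr, fx=600.0, fy=600.0, cx=320.0, cy=240.0, tag_size=0.16):
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        detections = detector.detect(
            gray,
            estimate_tag_pose=True,
            camera_params=(fx, fy, cx, cy),
            tag_size=tag_size,
        )
        # Return the ID and camera-to-tag rotation/translation of the first tag seen.
        for det in detections:
            return det.tag_id, det.pose_R, det.pose_t
        return None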
step S104, calculating the accurate pose of the robot 10 according to the relative pose and pose data of the AprilTag code 20 in the grid map;
step S105, feeding back the accurate pose to the positioning algorithm, and repositioning the robot 10 on the grid map by the positioning algorithm;
and step S106, updating the moving path of the robot 10 in the grid map.
In this step, specifically: the move_base package is called to update the global path and the local path of the robot 10 in the map in real time.
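For reference, the move_base interface is typically driven from a rospy node roughly as follows; this is a generic sketch of sending a goal pose in the map frame, not the patent's own code.

    #!/usr/bin/env python
    # Generic sketch of commanding move_base, which then maintains the global and local paths.
    import rospy
    import actionlib
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    def send_goal(x, y, quaternion):
        client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
        client.wait_for_server()

        goal = MoveBaseGoal()
        goal.target_pose.header.frame_id = "map"          # poses are given in the grid-map frame
        goal.target_pose.header.stamp = rospy.Time.now()
        goal.target_pose.pose.position.x = x
        goal.target_pose.pose.position.y = y
        (goal.target_pose.pose.orientation.x,
         goal.target_pose.pose.orientation.y,
         goal.target_pose.pose.orientation.z,
         goal.target_pose.pose.orientation.w) = quaternion

        client.send_goal(goal)                             # move_base replans global and local paths

    if __name__ == "__main__":
        rospy.init_node("goal_sender")
        send_goal(1.0, 2.0, (0.0, 0.0, 0.0, 1.0))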
After this step, the robot 10 is controlled to continue moving along the ridge channel according to the updated pose and moving path; the sensor module (such as the laser radar) continuously scans the environment between the ridges and returns detection data (such as point cloud information) to maintain the positioning information of the robot 10, and the camera continues to capture ground images until the next AprilTag code 20 is identified, whereupon steps S103-S106 are executed again to complete the relocation of the robot 10 and the correction of the navigation plan.
In the above steps, the AprilTag library is used to directly calculate the accurate 3D position and orientation of the AprilTag code 20 relative to the camera and the ID data of the AprilTag code 20, so the calculation efficiency is high; this avoids the long decoding time, poor real-time performance and sensitivity to illumination change caused by the complex structure and high decoding complexity of a traditional two-dimensional code. In addition, because the robot 10 relies mainly on the sensor module for navigation, the AprilTag codes 20 can be sparsely laid out and their number is greatly reduced; it is only necessary for the planned path to pass through an AprilTag code 20 each time a positioning correction is performed, so the pose of the robot 10 can be corrected in time while the robot 10 is controlled to move, ensuring the accuracy of its motion.
The grid map construction process includes the following steps S201 to S202:
step S201, controlling the robot 10 to walk in a greenhouse by using an SLAM method, collecting data through a sensor module, and establishing an original grid map according to the collected data;
in the step, the collected data specifically comprises data of a laser radar, an IMU and an encoder, which are fused with a milemeter, and the control unit establishes an original grid map in the greenhouse by using a Gmapping algorithm and stores the map.
Step S202, executing editing operation, editing the original grid map to obtain the grid map;
in this step, the purpose of editing operation is mainly to trim the original grid map, remove noise and fill in the blank defects, so that the map has no abnormal data affecting the use. The control unit can automatically execute editing operation or manually perform editing operation, in the latter case, kolourpaint software is used for preprocessing the established grid map under the ubuntu system, the control unit outputs the two-dimensional grid map firstly, the two-dimensional grid map is edited by an editing tool manually and then is input into the system, and the control unit receives the edited map as a preprocessed grid map.
Preferably, the grid map construction process further includes the following step: writing the pose data of each AprilTag code 20 in the grid map into a pose transformation program in the control unit.
In this step, waypoints are manually selected on the ground within the greenhouse ridges and AprilTag codes of the tag36h11 family are pasted at these points; the accurate pose data of each AprilTag code 20 is then measured against the existing grid map, and the pose data of each AprilTag code 20 is recorded into the pose transformation program in the control unit.
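By way of illustration, the pose data written into the pose transformation program could take the form of a simple lookup table keyed by tag ID; the IDs and map-frame coordinates below are hypothetical examples, not values from the patent.

    # Hypothetical tag pose table used by the pose transformation program.
    TAG_POSES_IN_MAP = {
        # tag_id: (x [m], y [m], yaw [rad]) in the grid-map frame
        0: (1.25, 0.50, 0.0),
        1: (1.25, 8.75, 0.0),
        2: (4.60, 0.50, 3.14159),
    }

    def lookup_tag_pose(tag_id):
        """Query the pose data of an AprilTag code in the grid map by its ID (step S402)."""
        return TAG_POSES_IN_MAP[tag_id]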
Preferably, the step S102 of controlling the robot 10 to walk along the planned path based on the positioning algorithm includes the following steps S301 to S303:
step S301, controlling a laser radar to scan the surrounding environment to obtain point cloud information;
Step S302, based on the point cloud information, the data of other sensors and the grid map, estimating the position of the robot 10 at the current moment by using the AMCL (Adaptive Monte Carlo Localization) algorithm to obtain an estimated position;
in this step, the data of the other sensors include IMU data and the fused odometry data of the encoder, and may also include other types of sensor data.
Step S303, correcting the planned path of the robot 10 in real time according to the estimated position.
Preferably, the step S104 of calculating the accurate pose of the robot 10 according to the relative pose and the pose data of the AprilTag code 20 in the grid map includes the following steps S401 to S403:
step S401, calculating to obtain the relative pose of the robot 10 and the AprilTag code 20 through the TF transformation relation between a camera coordinate system and a chassis coordinate system of the robot 10;
step S402, inquiring the position and pose data of the AprilTag code 20 in the grid map according to the ID data of the AprilTag code;
and S403, calculating the accurate pose of the robot 10 according to the relative pose and the pose data.
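A minimal numpy sketch of this transform chain is given below; the frame conventions and the helper that builds a 4x4 matrix are assumptions made for illustration, not the patent's concrete implementation.

    # Steps S401-S403: compose the camera-to-tag detection, the camera-to-chassis
    # extrinsic (the TF relation), and the tag's pose in the map to recover the robot pose.
    import numpy as np

    def to_homogeneous(R, t):
        """Build a 4x4 transform from a 3x3 rotation and a 3-vector translation."""
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = np.asarray(t).reshape(3)
        return T

    def robot_pose_in_map(T_cam_tag, T_base_cam, T_map_tag):
        # T_cam_tag : tag pose in the camera frame (from the AprilTag detection)
        # T_base_cam: camera pose in the robot chassis frame (from the TF tree / extrinsics)
        # T_map_tag : tag pose in the grid map (looked up by tag ID)
        T_tag_cam = np.linalg.inv(T_cam_tag)
        T_cam_base = np.linalg.inv(T_base_cam)
        # map <- tag <- camera <- chassis
        return T_map_tag @ T_tag_cam @ T_cam_base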
Preferably, the repositioning of the robot 10 on the grid map by the positioning algorithm in step S105 includes the following steps S501-S502:
step S501, feeding back the accurate pose to an AMCL positioning algorithm, and redistributing the particle cloud through the AMCL positioning algorithm;
step S502, point cloud data sent back by the laser radar is obtained, and the position of the robot 10 on the grid map is repositioned.
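In a ROS implementation, feeding the corrected pose back to AMCL is commonly done by publishing on the /initialpose topic, which causes AMCL to redistribute its particle cloud around the given pose; the following sketch and its covariance values are illustrative assumptions, not the patent's code.

    #!/usr/bin/env python
    # Sketch of steps S501-S502: hand the AprilTag-derived pose to AMCL for relocalization.
    import rospy
    from geometry_msgs.msg import PoseWithCovarianceStamped

    def relocalize(x, y, quaternion):
        pub = rospy.Publisher("/initialpose", PoseWithCovarianceStamped,
                              queue_size=1, latch=True)
        msg = PoseWithCovarianceStamped()
        msg.header.frame_id = "map"
        msg.header.stamp = rospy.Time.now()
        msg.pose.pose.position.x = x
        msg.pose.pose.position.y = y
        (msg.pose.pose.orientation.x,
         msg.pose.pose.orientation.y,
         msg.pose.pose.orientation.z,
         msg.pose.pose.orientation.w) = quaternion
        # Small covariance: the AprilTag-derived pose is trusted as an accurate correction.
        cov = [0.0] * 36
        cov[0] = 0.05    # x variance
        cov[7] = 0.05    # y variance
        cov[35] = 0.02   # yaw variance
        msg.pose.covariance = cov
        pub.publish(msg)   # AMCL redistributes its particle cloud around this pose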
Preferably, the grid map has a topological network formed by topological lines (shown by dotted lines in FIG. 1); the topological network extends along the passages in the greenhouse and passes through all AprilTag codes 20. The grouping of tasks into stage tasks can then be performed as follows, with an illustrative sketch after this paragraph. First, a batch of tasks is acquired, and each task is evaluated to obtain a reference path between the task point of that task (the point corresponding to its task-point pose) and the task point of the previous task; specifically, the projection of the evaluated task's task point onto its nearest topological line and the projection of the previous task's task point onto its nearest topological line are calculated, and the reference path is the shortest path along the topological network between the two projection points. It is then judged whether the length of the reference path is greater than a preset value and whether the reference path passes through at least one AprilTag code 20; if both judgments are yes, the evaluated task can be grouped independently, otherwise it cannot. Finally, tasks that cannot be grouped independently are merged with adjacent tasks into the same group, specifically: consecutive tasks that cannot be grouped independently are combined, and whether the combined task group can be grouped independently is judged by the above method; if yes, the task group is taken as a stage task, otherwise the task group is merged with the independently groupable task before or after it to form a stage task. Lastly, an isolated task that cannot be grouped independently (i.e. the tasks before and after it can both be grouped independently) is merged with the independently groupable task before or after it to form a stage task. With this method the tasks are evaluated by path reference, so evaluation and grouping can be completed quickly; compared with evaluating in advance through full path planning, the amount of computation is much smaller and the operating efficiency is improved.
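The following simplified Python sketch illustrates the grouping heuristic; the data structures and the way the reference-path quantities are precomputed are assumptions made for illustration, and the topological-network projection details are omitted.

    # Simplified sketch of the task-grouping heuristic (assumed data structures).
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Task:
        name: str
        ref_path_len: float       # length of the reference path from the previous task point
        ref_path_has_tag: bool    # whether that reference path passes at least one AprilTag code

    def can_stand_alone(task: Task, min_len: float) -> bool:
        """A task may form its own group only if both judgments are yes."""
        return task.ref_path_len > min_len and task.ref_path_has_tag

    def group_into_stage_tasks(tasks: List[Task], min_len: float) -> List[List[Task]]:
        groups: List[List[Task]] = []
        pending: List[Task] = []              # consecutive tasks that cannot stand alone
        for task in tasks:
            if can_stand_alone(task, min_len) and not pending:
                groups.append([task])
                continue
            pending.append(task)
            # Treat the merged run as a stage task once its accumulated reference
            # path is long enough and passes at least one AprilTag code.
            total_len = sum(t.ref_path_len for t in pending)
            has_tag = any(t.ref_path_has_tag for t in pending)
            if total_len > min_len and has_tag:
                groups.append(pending)
                pending = []
        if pending:                           # leftover run: merge with the previous stage task
            if groups:
                groups[-1].extend(pending)
            else:
                groups.append(pending)
        return groups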
The invention also provides a greenhouse agricultural robot positioning and navigation device 600 for multi-sensor fusion based on AprilTag codes (hereinafter the "positioning navigation device 600"). The positioning navigation device 600 may comprise, or be divided into, one or more program modules, which are stored in a storage medium and executed by one or more processors to implement the invention and realize the multi-sensor fusion positioning navigation method based on AprilTag codes 20. The program modules referred to in the embodiments of the present invention are series of computer program instruction segments capable of performing specific functions, and are better suited than the program itself to describing the execution of the multi-sensor fusion positioning navigation method in the storage medium. The functions of the program modules of this embodiment are described below; as shown in FIG. 4, the positioning navigation device 600 includes:
a planning module 610, configured to perform path planning on the grid map according to the initial pose and the target point pose to obtain a planned path, the planned path passing through at least one AprilTag code 20; each AprilTag code 20 has corresponding pose data in the grid map, and the ID data corresponding to different AprilTag codes 20 are different;
a navigation collection module 620 for controlling the robot 10 to walk along the planned path and collecting image data by a camera;
a first calculation module 630, configured to, when the camera acquires an image containing an AprilTag code 20, decode the AprilTag code 20 in the image by using the AprilTag library, and obtain the ID data of the AprilTag code 20 and the relative pose of the camera and the AprilTag code 20;
a second calculation module 640 for calculating a precise pose of the robot 10 from the relative pose and pose data of the AprilTag code 20 in the grid map;
a positioning update module 650 for feeding back the accurate pose to the positioning algorithm, which repositions the robot 10 on the grid map;
a path update module 660 for updating the planned path of the robot 10 in the grid map.
Other details of the positioning navigation device 600 for implementing the above AprilTag-code-based multi-sensor fusion positioning navigation method have been described in the previous embodiments; reference may be made to the corresponding content therein, which is not repeated here.
The above description is only of the preferred embodiments of the present invention, and it should be noted that: it will be apparent to those skilled in the art that various modifications and adaptations can be made without departing from the principles of the invention and these are intended to be within the scope of the invention.

Claims (8)

1. A multi-sensor fusion positioning navigation method based on AprilTag codes, characterized by comprising the following steps:
performing path planning on the grid map according to the initial pose and the target point pose to obtain a planned path, wherein the planned path passes through at least one AprilTag code; each AprilTag code has corresponding pose data in the grid map, and ID data corresponding to different AprilTag codes are different;
controlling the robot to walk along the planned path based on a positioning algorithm, and simultaneously acquiring image data through a camera;
when the camera acquires an image containing an AprilTag code, decoding the AprilTag code in the image by using the AprilTag library to obtain ID data of the AprilTag code and the relative pose of the camera and the AprilTag code;
calculating the accurate pose of the robot according to the relative pose and pose data of the AprilTag code in the grid map;
feeding back the accurate pose to the positioning algorithm, and repositioning the robot on the grid map by the positioning algorithm;
and updating the moving path of the robot in the grid map.
2. The AprilTag code-based multi-sensor fusion positioning and navigation method according to claim 1, further comprising a grid map construction process, said grid map construction process comprising:
controlling the robot to walk in the greenhouse by using an SLAM method, acquiring data through a sensor module, and establishing an original grid map according to the acquired data;
and executing editing operation, and editing the original grid map to obtain the grid map.
3. The AprilTag code based multi-sensor fusion positioning and navigation method according to claim 2, further comprising, after the grid mapping process:
and writing the corresponding pose data of each AprilTag code in the grid map into a pose transformation program in a control unit.
4. The AprilTag-code-based multi-sensor fusion positioning and navigation method of claim 1, wherein said calculating the precise pose of the robot from the relative pose and pose data of the AprilTag code in the grid map comprises:
calculating to obtain the relative pose of the robot and an AprilTag code through the TF transformation relation between a camera coordinate system and a robot chassis coordinate system;
inquiring the pose data of the AprilTag code in the grid map according to the ID data of the AprilTag code;
and calculating the accurate pose of the robot according to the relative pose and the pose data.
5. The AprilTag code-based multi-sensor fusion positioning navigation method according to claim 1, wherein the controlling the robot to walk along the planned path based on the positioning algorithm comprises:
controlling a laser radar to scan the surrounding environment to obtain point cloud information;
estimating the position of the robot at the current moment by utilizing an AMCL (Adaptive Monte Carlo Localization) algorithm based on the point cloud information, data of other sensors and the grid map to obtain an estimated position;
and correcting the moving path of the robot in real time according to the estimated position.
6. The AprilTag code-based multi-sensor fusion positioning and navigation method according to claim 5, wherein the repositioning of the robot on the grid map by the positioning algorithm comprises:
redistributing the particle cloud through an AMCL positioning algorithm;
and acquiring point cloud data transmitted back by the laser radar, and repositioning the position of the robot on the grid map.
7. A multi-sensor fusion positioning navigation device based on AprilTag codes, characterized in that it comprises:
the planning module is used for planning a path on the grid map according to the initial pose and the pose of the target point to obtain a planned path, and the planned path passes through at least one AprilTag code; each AprilTag code has corresponding pose data in the grid map, and ID data corresponding to different AprilTag codes are different;
a navigation acquisition module for controlling the robot to walk along the planned path based on a positioning algorithm while acquiring image data by a camera;
the camera comprises a first calculation module, a second calculation module and a third calculation module, wherein the first calculation module is used for decoding April Tag codes in images by utilizing an April Tag library when the images containing the April Tag codes are acquired by the camera, and obtaining ID data of the April Tag codes and relative poses of the camera and the April Tag codes;
a second calculation module for calculating a precise pose of the robot from the relative pose and pose data of the AprilTag code in the grid map;
the positioning updating module is used for feeding back the accurate pose to the positioning algorithm, and the positioning algorithm relocates the robot on the grid map;
a path update module for updating a movement path of the robot in the grid map.
8. An AprilTag-code-based multi-sensor fusion positioning system, characterized by comprising a robot for executing operations in a greenhouse and a control unit arranged inside and/or outside the robot; AprilTag codes are dispersedly attached to the ground in the greenhouse; the robot is provided with a sensor module and a camera, a lens of the camera facing downwards to acquire ground images; and the control unit is capable of data interaction with the sensor module and the camera to perform the AprilTag-code-based multi-sensor fusion positioning navigation method of any one of claims 1-6.
CN202210921400.0A 2022-08-02 2022-08-02 Multi-sensor fusion positioning navigation method, device and system based on AprilTag code Pending CN115265534A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210921400.0A CN115265534A (en) 2022-08-02 2022-08-02 Multi-sensor fusion positioning navigation method, device and system based on AprilTag code


Publications (1)

Publication Number Publication Date
CN115265534A true CN115265534A (en) 2022-11-01

Family

ID=83747992

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210921400.0A Pending CN115265534A (en) 2022-08-02 2022-08-02 Multi-sensor fusion positioning navigation method, device and system based on AprilTag code

Country Status (1)

Country Link
CN (1) CN115265534A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118010008A (en) * 2024-04-08 2024-05-10 西北工业大学 Binocular SLAM and inter-machine loop optimization-based double unmanned aerial vehicle co-location method
CN118010008B (en) * 2024-04-08 2024-06-07 西北工业大学 Binocular SLAM and inter-machine loop optimization-based double unmanned aerial vehicle co-location method

Similar Documents

Publication Publication Date Title
EP3672762B1 (en) Self-propelled robot path planning method, self-propelled robot and storage medium
US11288526B2 (en) Method of collecting road sign information using mobile mapping system
Harapanahalli et al. Autonomous Navigation of mobile robots in factory environment
JP5018458B2 (en) Coordinate correction method, coordinate correction program, and autonomous mobile robot
KR20170088228A (en) Map building system and its method based on multi-robot localization
CN113405544B (en) Mobile robot map building and positioning method and system
CN111881239A (en) Construction method, construction device, intelligent robot and readable storage medium
CN111762519B (en) Method and system for guiding picking robot operation and scheduling device
CN109445438A (en) Cruise control method and system based on the cruising device that map is shared
CN111487960A (en) Mobile robot path planning method based on positioning capability estimation
CN110716559A (en) Comprehensive control method for shopping mall and supermarket goods picking robot
CN112894758A (en) Robot cleaning control system, method and device and computer equipment
CN111857114A (en) Robot formation moving method, system, equipment and storage medium
CN115265534A (en) Multi-sensor fusion positioning navigation method, device and system based on AprilTag code
CN111168669A (en) Robot control method, robot, and readable storage medium
CN113654558A (en) Navigation method and device, server, equipment, system and storage medium
JP2022530246A (en) Simultaneous execution of self-position estimation and environmental map creation
CN116576859A (en) Path navigation method, operation control method and related device
CN116629106A (en) Quasi-digital twin method, system, equipment and medium for mobile robot operation scene
CN111580530A (en) Positioning method, positioning device, autonomous mobile equipment and medium
CN114281081B (en) Navigation system and navigation method of subway vehicle inspection robot and robot
CN113932825B (en) Robot navigation path width acquisition system, method, robot and storage medium
CN112729252B (en) Tunnel laser point cloud collection method based on robot platform and robot system
US20220196410A1 (en) Vehicle navigation
JP2021148698A (en) Automatic inspection device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination