CN114720978A - Method and mobile platform for simultaneous localization and mapping - Google Patents


Info

Publication number
CN114720978A
Authority
CN
China
Prior art keywords
mobile platform
data
environment
environmental
odometer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111224843.6A
Other languages
Chinese (zh)
Inventor
苏群翔
郭家瑞
陈水石
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ali Corp
Original Assignee
Ali Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ali Corp filed Critical Ali Corp
Priority to US17/551,148 (published as US20220214443A1)
Publication of CN114720978A

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method and a mobile platform for simultaneous localization and mapping are disclosed. The method is applied to a mobile platform and comprises: continuously acquiring and storing odometry data and environment sensing data while the mobile platform moves along a motion trajectory; merging the odometry data and the environment sensing data into environment information whenever a certain amount of both has been accumulated and stored; executing a simultaneous localization and mapping procedure according to a current map and the continuously obtained odometry data and environment information; and repeating these steps until the mobile platform completes the motion trajectory. In this way, a mobile platform applying the method can move without being equipped with a costly lidar: it collects odometry data and environment sensing data and synthesizes environment information comparable to lidar sensing data, thereby achieving accurate localization, reduced cost, and improved durability.

Description

Method and mobile platform for simultaneous localization and mapping
Technical Field
The present application relates to the field of mobile platform technology, and in particular, to a method and a mobile platform for simultaneous localization and mapping.
Background
Currently, mobile platforms generally adopt Simultaneous Localization and Mapping (SLAM) technology to generate an environment map and perform autonomous localization.
Among SLAM architectures, those based on laser radar (lidar) are the most mature and are widely applied to mobile platforms such as Automated Guided Vehicles (AGVs), Autonomous Mobile Robots (AMRs), autonomous vehicles, service robots, and sweeping robots. However, because a lidar-based SLAM architecture requires a lidar, which is structurally complex and expensive, mobile platforms that adopt it suffer from high product cost. In addition, because a lidar is driven by a motor, vibration easily causes it to malfunction, and its transmission parts wear out easily, resulting in low product durability and a high damage rate.
Disclosure of Invention
The embodiments of the present application provide a method and a mobile platform for simultaneous localization and mapping, which can effectively solve the problems of high product cost, low durability, and a high damage rate caused by the need to equip a lidar in conventional applications of SLAM technology.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, a method for simultaneous localization and mapping is provided, comprising the steps of: continuously acquiring and storing odometry data and environment sensing data while the mobile platform moves along a motion trajectory; merging the odometry data and the environment sensing data into environment information whenever a certain amount of both has been accumulated and stored; executing a simultaneous localization and mapping procedure according to a current map and the continuously obtained odometry data and environment information; and repeating the above steps until the mobile platform completes the motion trajectory.
In a second aspect, a mobile platform is provided, comprising: an odometer, an environment sensor, a memory, and a processor, wherein the memory is connected to the odometer and the environment sensor, and the processor is connected to the odometer, the environment sensor, and the memory. The odometer continuously acquires odometry data of the mobile platform while the mobile platform moves along a motion trajectory; the environment sensor continuously acquires environment sensing data of the mobile platform during that movement; and the memory continuously stores the odometry data and environment sensing data. The processor merges the odometry data and the environment sensing data into environment information whenever the memory has accumulated and stored a certain amount of both; executes a simultaneous localization and mapping procedure according to a current map and the continuously obtained odometry data and environment information; and repeats these steps until the mobile platform completes the motion trajectory.
In the embodiments of the present application, when the mobile platform applies the method for simultaneous localization and mapping, it can continuously collect odometry data and environment sensing data while moving, and synthesize from them surrounding-environment information approximating what a lidar would sense with a 360-degree scan. The resulting environment information is then supplied to the simultaneous localization and mapping procedure to update the mobile platform's pose and the map. Therefore, without being equipped with a costly lidar, the mobile platform can, by matching a relatively small amount of environment sensing data with the odometry data, restore environment information sufficient for stable simultaneous localization and mapping, thereby achieving accurate localization, reduced product cost, and improved product durability.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a block diagram of one embodiment of a mobile platform according to the present application;
FIG. 2 is a schematic diagram illustrating an exemplary configuration of an environmental sensor of a mobile platform according to the present application;
FIG. 3 is a schematic diagram illustrating sensing directions of the environmental sensor of the mobile platform of FIG. 2 at time T1 to T4;
FIG. 4 is a schematic diagram illustrating the moving direction of the moving platform of FIG. 2 at time points T1 to T4;
FIG. 5 is a diagram illustrating relative coordinate points accumulated in a relative coordinate system during movement of the moving platform of FIG. 2 from time T1 to time T4;
FIG. 6 is a block diagram of another embodiment of a mobile platform according to the present application;
FIG. 7 is a method flow diagram of one embodiment of a method for simultaneous localization and mapping in accordance with the present application; and
FIG. 8 is a flowchart of an embodiment of the method of the simultaneous localization and mapping procedure of step 230 in FIG. 7.
Detailed Description
Embodiments of the present invention will be described below with reference to the accompanying drawings. In the drawings, the same reference numerals indicate the same or similar components or process flows.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, values, method steps, operations, elements, and/or components, but do not preclude the presence or addition of further features, values, method steps, operations, elements, components, and/or groups thereof.
It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is described as being "directly connected" or "directly coupled" to another element, there are no intervening elements present.
Please refer to fig. 1, which is a block diagram illustrating an embodiment of a mobile platform according to the present application. As shown in fig. 1, the mobile platform 100 includes: an odometer 110, an environment sensor 120, a memory 130, and a processor 140, wherein the memory 130 is connected to the odometer 110 and the environment sensor 120, and the processor 140 is connected to the odometer 110, the environment sensor 120, and the memory 130. The number of environment sensors 120 and processors 140 may be, but is not limited to, one, and the connections between the memory 130 and the odometer 110, between the memory 130 and the environment sensor 120, between the processor 140 and the odometer 110, between the processor 140 and the environment sensor 120, and between the processor 140 and the memory 130 may be wired, but the present application is not limited thereto. For example, there may be a plurality of environment sensors 120 and processors 140. In one example with 3 environment sensors 120, their optical axes form predetermined angles (i.e., the 3 environment sensors 120 are disposed on the mobile platform 100 according to this angle data) so that the fields of view of two adjacent environment sensors 120 overlap (the angle between the optical axes of two adjacent environment sensors 120 is determined by a predetermined overlap percentage and a predetermined field-of-view size). Likewise, the connections listed above may instead be wireless.
In practical implementations, the odometer 110 may be an Inertial Measurement Unit (IMU) or a wheel odometer; the environment sensor 120 may be a sensor that measures distance using the Time of Flight principle, such as a laser ranging sensor, an infrared ranging sensor, an ultrasonic sensor, or an RGB-D camera; the memory 130 includes high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device; and the processor 140 may be a Reduced Instruction Set Computer (RISC) processor or a Micro Controller Unit (MCU), or may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In the present embodiment, the odometer 110 is configured to continuously collect odometer data of the mobile platform 100 when the mobile platform 100 moves based on the motion trajectory; the environment sensor 120 is configured to continuously acquire environment sensing data (i.e., distance-related information between the mobile platform 100 and a measured object (e.g., a wall, an obstacle, etc.) in the surrounding environment) for calculating a sensing distance and a sensing direction of the mobile platform 100 when the mobile platform 100 moves based on the motion trajectory; memory 130 is used to continuously store odometry data and environmental sensing data for mobile platform 100. The processor 140 is configured to combine the odometry data and the environmental sensing data to obtain environmental information whenever the memory 130 cumulatively stores a certain amount of odometry data and environmental sensing data (e.g., 150 odometry data and 150 environmental sensing data); executing a simultaneous positioning and map building program according to a current map and continuously obtained odometer data and environmental information; and repeatedly executing the steps until the mobile platform 100 completes the motion track. Where the motion trajectory is not limited to any form of movement, the current map may be stored in memory 130 or an internal memory of processor 140.
In one embodiment, whenever a certain amount of odometry data and environment sensing data has been accumulated and stored, the processor 140 is further configured to convert and integrate each piece of accumulated environment sensing data into the environment information, based on the odometry data acquired at the merging time point together with each piece of accumulated odometry data. The environment information is, for example, a distance measured at each angle (or in each direction) with the mobile platform 100 as the center, so it corresponds to information covering the environment around the mobile platform 100. In other words, the environment information is substantially the same as what the mobile platform would obtain by scanning its own surroundings. The sensing direction in which the environment sensor 120 collects environment sensing data is related to where the environment sensor 120 is disposed on the mobile platform 100; in this embodiment, if the center of the mobile platform 100 serves as the reference point and the moving direction of the mobile platform 100 is taken as 0 degrees, the sensing direction of the environment sensor 120 (e.g., its optical-axis direction) can likewise be represented by an angle.
For example, please refer to fig. 2, which is a schematic configuration diagram of an environment sensor of a mobile platform according to an embodiment of the present application. As shown in fig. 2, when the number of the environment sensors 120 is 3, the sensing directions of the 3 environment sensors 120 may be 0 degrees (i.e., the optical axes of the environment sensors 120 point to the advancing direction of the mobile platform 100, as shown by the dashed arrow 50 in fig. 2), 90 degrees (i.e., the optical axes of the environment sensors 120 point to the right side of the mobile platform 100, as shown by the dashed arrow 52 in fig. 2), and-90 degrees (i.e., the optical axes of the environment sensors 120 point to the left direction of the mobile platform 100, as shown by the dashed arrow 54 in fig. 2), respectively. Each odometry data may be represented by (X, Y, Θ), where X, Y represents a position of the mobile platform 100 in a two-dimensional plane under a Relative Coordinate System, X represents a horizontal axis position in the Relative Coordinate System (which may refer to a Coordinate System with the mobile platform 100 as an origin of the Coordinate System), Y represents a vertical axis position in the Relative Coordinate System, and Θ represents a yaw angle (i.e., a direction of the two-dimensional plane on which the X axis and the Y axis are located) of the mobile platform 100. Therefore, the processor 140 can obtain the distance value measured by the environmental sensor 120 at each data acquisition point according to the odometry data acquired by the mobile platform 100 at each data acquisition point and the respective sensing directions of the 3 environmental sensors 120. 
The distance values may be further converted into relative coordinate points between each environmental sensor 120 and the environment sensed by the environmental sensor in the relative coordinate system, and when enough data is accumulated, the processor 140 may utilize the current odometer data and match the accumulated odometer data to convert and combine the relative coordinate points into environmental information (information representing the environment around the mobile platform 100), thereby simulating an effect equivalent or similar to 360 degree lidar or laser scanning. In this way, stable execution of the simultaneous localization and mapping procedure can be guaranteed.
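To make this conversion concrete, the following sketch (an illustration only, not the patent's implementation; the function name and the mount-angle sign convention are assumptions) turns one range reading into a relative coordinate point using the (X, Y, Θ) odometry pose and the sensor's optical-axis angle:

```python
import math

def reading_to_point(pose, mount_angle_deg, distance):
    """Convert one range reading into a point in the relative (odometry) frame.

    pose: (x, y, theta) odometry pose of the platform, theta in radians.
    mount_angle_deg: sensor optical-axis angle relative to the heading
                     (0 = forward, 90 = right, -90 = left, as in fig. 2).
    distance: range value reported by the environment sensor.
    """
    x, y, theta = pose
    # Assumed convention: positive mount angles rotate clockwise (to the
    # right of the heading), so the beam direction is theta - mount_angle.
    beam = theta - math.radians(mount_angle_deg)
    return (x + distance * math.cos(beam), y + distance * math.sin(beam))

# Platform at the origin heading along +x; the 0-degree sensor sees an
# obstacle 2 m ahead.
print(reading_to_point((0.0, 0.0, 0.0), 0, 2.0))  # → (2.0, 0.0)
```

Applying this to each stored reading, with the odometry pose recorded at the same data-acquisition point, yields the accumulated relative coordinate points described above.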
More specifically, please refer to fig. 3 to 5: fig. 3 is a schematic diagram of the sensing directions of the environment sensors of the mobile platform of fig. 2 at time points T1 to T4, fig. 4 is a schematic diagram of the moving directions of the mobile platform of fig. 2 at time points T1 to T4, and fig. 5 is a schematic diagram of the relative coordinate points accumulated in the relative coordinate system while the mobile platform of fig. 2 moves from time T1 to time T4. The positions at which the 3 environment sensors 120 are disposed on the mobile platform 100 are fixed; when the mobile platform 100 moves from time T1 to time T4 (its positions are shown as circles and its moving directions as arrows in fig. 4), the sensing directions of the 3 environment sensors 120 change accordingly, as shown in fig. 3 and 4. The processor 140 may obtain the distance value sensed by each environment sensor 120 at time points T1 to T4 according to the sensing directions of the 3 environment sensors 120 at those time points and the odometry data (which may include data related to the moving direction and position of the mobile platform 100) obtained by the odometer 110 at the same time points.
The distance value may be described by a relative coordinate system (i.e., the distance value is represented by relative coordinate points in the relative coordinate system), so as to obtain the relative coordinate points accumulated during the movement of the mobile platform 100 from time T1 to time T4 as shown in fig. 5. Here, P1a, P1b, P1c are relative coordinate points in the relative coordinate system sensed by the 3 environmental sensors 120 at the time point T1, P2a, P2b, P2c are relative coordinate points in the relative coordinate system sensed by the 3 environmental sensors 120 at the time point T2, P3a, P3b, P3c are relative coordinate points in the relative coordinate system sensed by the 3 environmental sensors 120 at the time point T3, and P4a, P4b, P4c are relative coordinate points in the relative coordinate system sensed by the 3 environmental sensors 120 at the time point T4. It should be noted that, since the moving direction of the mobile platform 100 at the time points T1 and T2 is forward, the relative coordinate points P1b and P2b are overlapped.
When a certain amount of odometry data and environmental sensing data are accumulated and stored at time points T1 to T4, the processor 140 may convert the environmental information corresponding to the mobile platform 100 at time point T4 based on the odometry data obtained at the merging time point T4 and using the relative coordinate points accumulated at time points T1 to T4, so as to simulate the environmental information sensed at time point T4 by 360-degree laser radar or laser scanning. By analogy, when the time points T5 to T8 have accumulated a certain amount of odometry data and environment sensing data, the processor 140 may utilize the relative coordinate points accumulated at the time points T5 to T8 and convert the relative coordinate points accumulated at the time points T5 to T8 into the corresponding environment information of the mobile platform 100 at the time point T8 based on the odometry data acquired at the time point T8. Finally, the processor 140 may integrate the environment information obtained by the mobile platform 100 during the moving process, so as to obtain a map of the moving environment in which the mobile platform 100 is located.
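The merge step at a time point such as T4 can be sketched as re-expressing the accumulated relative coordinate points in the platform's frame at the merge time point, producing (bearing, range) pairs like a 360-degree scan. This is a minimal illustration; the function name and the bearing/range output representation are assumptions:

```python
import math

def to_scan_frame(points, merge_pose):
    """Re-express accumulated relative coordinate points in the platform's
    frame at the merge time point, producing (bearing_deg, range) pairs
    that approximate one 360-degree scan.

    points: iterable of (px, py) relative coordinate points (e.g. P1a..P4c).
    merge_pose: (x, y, theta) odometry pose at the merge time point (T4).
    """
    mx, my, mtheta = merge_pose
    scan = []
    for px, py in points:
        dx, dy = px - mx, py - my
        # Rotate by -theta so bearings are measured from the current heading.
        sx = math.cos(-mtheta) * dx - math.sin(-mtheta) * dy
        sy = math.sin(-mtheta) * dx + math.cos(-mtheta) * dy
        scan.append((math.degrees(math.atan2(sy, sx)), math.hypot(sx, sy)))
    return scan
```

A point one unit ahead-left of an unrotated platform, for instance, comes out at a bearing of 45 degrees and a range of √2, exactly as a lidar beam in that direction would report it.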
In an embodiment, the simultaneous localization and mapping procedure comprises: updating the amount of movement of the mobile platform 100 based on the currently acquired odometry data and the current map; and correcting the posture of the mobile platform 100 according to the environmental information and the updated movement amount of the mobile platform 100, and updating the current map. The processor 140 may correct the posture of the mobile platform 100 through a simultaneous localization and mapping algorithm according to the environment information and the updated moving amount of the mobile platform 100, and update the current map.
In an embodiment, the processor 140 is further configured to obtain an initial map as the current map and an initial pose of the mobile platform 100 according to the obtained first environment information. That is, the initial map and initial pose of the mobile platform 100 may be constructed or initialized according to the first context information.
In one embodiment, in addition to merging whenever the memory 130 has accumulated and stored a certain amount of odometry data and environment sensing data, the processor 140 is configured to merge the odometry data and environment sensing data accumulated and stored within a default time whenever that default time elapses, to obtain the environment information. The default time may be, but is not limited to, 300 ms, and may be adjusted according to actual requirements.
In one embodiment, in addition to merging whenever the memory 130 has accumulated and stored a certain amount of odometry data and environment sensing data, the processor 140 is further configured, while the mobile platform 100 moves along the motion trajectory, to merge the odometry data and environment sensing data accumulated while the Field of View (FOV) of the environment sensor 120 changes, whenever it determines from the posture of the mobile platform 100 and the continuously obtained odometry data that the change exceeds a default angle, to obtain the environment information. The default angle may be, but is not limited to, 90 degrees, and may be adjusted according to actual requirements.
For example, when the number of the environmental sensors 120 is 1, the default angle may be, but is not limited to, 90 degrees; when the number of the environment sensors 120 is 3, and the angle data of the 3 environment sensors 120 disposed on the mobile platform 100 may include that the optical axes of the 3 environment sensors 120 are 90 degrees, 0 degrees (i.e., the forward direction of the mobile platform 100) and-90 degrees, respectively, the default angle may be, but is not limited to, 30 degrees.
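A minimal way to detect such a field-of-view change from odometry alone is to track the accumulated yaw, since the sensors are rigidly mounted and their optical axes rotate with the platform. The sketch below is illustrative; the function name and the max-minus-min threshold test are assumptions:

```python
def fov_change_exceeded(yaw_history_deg, default_angle_deg=90):
    """Check whether the sensors' field of view has swept past the default
    angle, using yaw values accumulated from odometry (the sensors are fixed
    to the body, so their optical axes rotate with the platform).

    90 degrees is the example default from the text; with 3 sensors whose
    optical axes are 90 degrees apart, the text suggests 30 degrees instead.
    """
    if not yaw_history_deg:
        return False
    return max(yaw_history_deg) - min(yaw_history_deg) > default_angle_deg
```

With the 3-sensor arrangement, passing `default_angle_deg=30` reflects the smaller threshold suggested above, since the overlapping fields of view already cover a wider arc.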
As can be seen from the above embodiments, the processor 140 may combine the accumulated odometry data and the environmental sensing data into the environmental information whenever the memory 130 accumulatively stores a certain amount of odometry data and environmental sensing data, whenever a default time elapses, or whenever it is determined that the angle of view of the environmental sensor 120 of the mobile platform 100 is changed beyond a certain default angle.
However, in other embodiments, the processor 140 may instead require two of these conditions together: merging the accumulated odometry data and environment sensing data into the environment information only when the memory 130 has accumulated and stored a certain amount of both and the angle of view of the environment sensor 120 has changed beyond the default angle; or only when a certain amount of both has been accumulated and stored and the default time has elapsed; or only when the angle of view has changed beyond the default angle and the default time has elapsed. In still other embodiments, all three conditions may be required: the processor 140 merges the accumulated odometry data and environment sensing data into the environment information only when a certain amount of both has been accumulated and stored in the memory 130, the default time has elapsed, and the angle of view of the environment sensor 120 has changed beyond the default angle.
Therefore, in the present application, the processor 140 may decide to combine the accumulated odometry data and the environmental sensing data into the environmental information according to any one or a combination of the accumulated amount of data (i.e., the accumulated amount of the odometry data and the environmental sensing data), the time for collecting the data (i.e., the default time), and the angle of view change (i.e., the angle of view of the environmental sensor 120 is changed beyond the default angle).
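These trigger policies can be sketched as a single predicate over the three criteria. The thresholds shown (150 samples, 300 ms, 90 degrees) follow the examples in the text, while the function name and the `mode` parameter are assumptions:

```python
def should_merge(n_samples, elapsed_ms, fov_change_deg,
                 count_threshold=150, time_threshold_ms=300,
                 angle_threshold_deg=90, mode="any"):
    """Decide whether to merge the accumulated data into environment
    information.  mode="any" ORs the three criteria (each alone triggers
    a merge); mode="all" ANDs them, matching the combined embodiments."""
    checks = (n_samples >= count_threshold,
              elapsed_ms >= time_threshold_ms,
              fov_change_deg > angle_threshold_deg)
    return any(checks) if mode == "any" else all(checks)
```

Two-condition embodiments would simply drop the unused check; the OR/AND split here is enough to show how the combinations in the preceding paragraphs differ.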
In one embodiment, the odometer data continuously collected by the odometer 110 and the environment sensing data continuously collected by the environment sensor 120 are stored in the memory 130 by the processor 140 (as shown in fig. 6, fig. 6 is a block diagram of another embodiment of the mobile platform according to the present application). In another embodiment, the odometer data continuously collected by the odometer 110 and the environmental sensing data continuously collected by the environmental sensor 120 are stored directly to the memory 130 (as shown in FIG. 1).
In one embodiment, memory 130 is an internal memory of processor 140 (shown in FIG. 6).
In one embodiment, the processor 140 may include a merging unit 142 and a simultaneous localization and mapping unit 144, the merging unit 142 being connected to the simultaneous localization and mapping unit 144 (as shown in fig. 6); the memory 130 may store program code that the processor 140 executes to implement the merging unit 142 and the simultaneous localization and mapping unit 144. The merging unit 142 is configured to merge the odometry data and the environment sensing data into environment information whenever the memory 130 has accumulated and stored a certain amount of both; the simultaneous localization and mapping unit 144 is configured to perform the simultaneous localization and mapping procedure based on the current map and the continuously obtained odometry data and environment information.
Referring to fig. 1 and 7, fig. 7 is a flowchart illustrating an embodiment of a method for simultaneous localization and mapping according to the present application. In this embodiment, the method for simultaneous localization and mapping is applicable to the mobile platform 100, and includes the following steps: continuously collecting and storing odometry data and environmental sensing data of the mobile platform 100 when the mobile platform 100 moves based on the motion trajectory (step 210); combining a certain amount of odometer data and environment sensing data each time they are cumulatively stored to obtain environment information (step 220); performing a simultaneous localization and mapping procedure based on the current map and continuously obtained odometry data and environmental information (step 230); and repeating steps 210 to 230 until the mobile platform 100 completes the motion track (step 240). Wherein step 210 is performed by the odometer 110 and the environmental sensor 120; steps 220 and 230 are performed by memory 130 and processor 140; for a detailed description, reference may be made to the above description of the mobile platform 100, which is not repeated herein.
In one embodiment, step 220 may comprise: whenever a certain amount of odometry data and environment sensing data has been accumulated and stored, converting and integrating each piece of accumulated environment sensing data into environment information based on the odometry data obtained at the merging time point together with each piece of accumulated odometry data; the environment information is substantially the same as what the mobile platform would obtain by scanning the surrounding environment with itself as the center. In this way, environment information simulating a 360-degree lidar or laser scan can be obtained; for details, refer to the related description of the mobile platform 100, which is not repeated herein.
In one embodiment, referring to fig. 1, fig. 7 and fig. 8, fig. 8 is a flowchart of an embodiment of a method of the simultaneous localization and mapping procedure of step 230 in fig. 7. As shown in fig. 8, the simultaneous localization and mapping procedure of step 230 includes: updating the amount of movement of the mobile platform 100 based on the currently acquired odometry data and the current map (step 310); and correcting the posture of the mobile platform 100 according to the environment information and the updated moving amount of the mobile platform 100, and updating the current map (step 320). Wherein step 320 may include: the attitude of the mobile platform 100 is corrected by a simultaneous localization and mapping algorithm according to the environmental information and the updated amount of movement of the mobile platform 100, and the current map is updated.
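The two-step loop of steps 310 and 320 can be sketched as a predict-then-correct iteration. The scan-matching routine is left pluggable because the text does not fix a particular SLAM algorithm; all names here are assumptions:

```python
def predict_pose(pose, odom_delta):
    """Step 310: update the platform's movement amount from odometry alone."""
    return tuple(p + d for p, d in zip(pose, odom_delta))

def correct_pose(predicted, scan, current_map, scan_match):
    """Step 320: correct the predicted pose against the map using the merged
    environment information, then record the observation in the map."""
    corrected = scan_match(predicted, scan, current_map)
    current_map.append((corrected, scan))
    return corrected, current_map

# Toy usage: an identity matcher stands in for a real scan-matching SLAM
# algorithm; the map is just a list of (pose, scan) observations.
identity_match = lambda pose, scan, m: pose
pose, world = correct_pose(predict_pose((0.0, 0.0, 0.0), (1.0, 0.0, 0.0)),
                           [], [], identity_match)
print(pose)  # → (1.0, 0.0, 0.0)
```

In a real system `scan_match` would be the simultaneous localization and mapping algorithm mentioned in step 320, and `current_map` an occupancy grid rather than a list; the structure of the loop is what matters here.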
In an embodiment, the method for simultaneous localization and mapping may further comprise: acquiring an initial map serving as the current map, and an initial pose of the mobile platform 100, according to the first obtained environment information. That is, the initial map and the initial pose of the mobile platform 100 may be constructed or initialized from the first environment information.
In an embodiment, the method for simultaneous localization and mapping may further comprise: each time a default time elapses, merging the odometry data and the environment sensing data cumulatively stored during that default time to obtain the environment information. For details, reference may be made to the above description of the mobile platform 100, which is not repeated herein.
In an embodiment, the method for simultaneous localization and mapping may further comprise: while the mobile platform 100 moves based on the motion trajectory, when it is determined, based on the pose of the mobile platform 100 and the continuously obtained odometry data, that the view angle of the environment sensor of the mobile platform 100 has changed beyond a default angle, merging the odometry data and the environment sensing data accumulated during the period in which the view angle changed beyond the default angle to obtain the environment information. For details, reference may be made to the above description of the mobile platform 100, which is not repeated herein.
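The view-angle trigger can be checked from the heading component of successive odometry samples. The sketch below is our own reading of this condition, with each increment wrapped to (-pi, pi] so that crossings of the angular cut do not inflate the total:

```python
import math

def view_angle_exceeded(headings, default_angle):
    """Return True when the sensor's view direction, taken here as the
    platform heading from successive odometry samples, has swept more
    than default_angle radians since the last merge."""
    swept = 0.0
    for a, b in zip(headings, headings[1:]):
        # Wrap each increment into (-pi, pi] before accumulating.
        step = (b - a + math.pi) % (2.0 * math.pi) - math.pi
        swept += abs(step)
    return swept > default_angle
```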
In an embodiment, the method for simultaneous localization and mapping may further comprise: when the memory 130 has cumulatively stored a certain amount of odometry data and environment sensing data and it is determined that the view angle of the environment sensor 120 of the mobile platform 100 has changed beyond a default angle, merging the accumulated odometry data and environment sensing data to obtain the environment information.
In an embodiment, the method for simultaneous localization and mapping may further comprise: when the memory 130 has cumulatively stored a certain amount of odometry data and environment sensing data and a default time has elapsed, merging the accumulated odometry data and environment sensing data to obtain the environment information.
In an embodiment, the method for simultaneous localization and mapping may further comprise: when it is determined that the view angle of the environment sensor 120 of the mobile platform 100 has changed beyond a default angle and a default time has elapsed, merging the accumulated odometry data and environment sensing data to obtain the environment information.
In an embodiment, the method for simultaneous localization and mapping may further comprise: when the memory 130 has cumulatively stored a certain amount of odometry data and environment sensing data, a default time has elapsed, and it is determined that the view angle of the environment sensor 120 of the mobile platform 100 has changed beyond a default angle, merging the accumulated odometry data and environment sensing data to obtain the environment information.
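The variants above differ only in which merge triggers must hold simultaneously. One way to express the strictest variant, with hypothetical names for the three thresholds, is a simple conjunction; the other embodiments drop one or two of the terms:

```python
def should_merge(sample_count, elapsed, swept_angle,
                 min_count, default_time, default_angle):
    """Merge only when the amount of stored data, the elapsed time, and
    the view-angle change all exceed their thresholds (corresponding to
    the embodiment that combines all three conditions)."""
    return (sample_count >= min_count
            and elapsed >= default_time
            and swept_angle > default_angle)
```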
In summary, in the embodiments of the present application, a mobile platform applying the method for simultaneous localization and mapping can move while executing the simultaneous localization and mapping procedure, synthesizing the continuously collected odometry data and environment sensing data into environment information similar to lidar sensing data. The mobile platform can therefore achieve accurate localization without a comparatively expensive lidar, reducing product cost and improving product durability.
Although the drawings of the present application include the elements described above, additional elements may still be used to achieve better technical results without departing from the spirit of the invention.
While the invention has been described using the above embodiments, it should be noted that these descriptions are not intended to limit the invention. Rather, this invention encompasses modifications and similar arrangements as would be apparent to one skilled in the art. The scope of the claims is, therefore, to be construed in the broadest possible manner to cover all such modifications and similar arrangements.

Claims (18)

1. A method for simultaneous localization and mapping for a mobile platform, comprising the steps of:
continuously acquiring and storing odometer data and environmental sensing data of the mobile platform while the mobile platform moves based on a motion trajectory;
merging the odometer data and the environmental sensing data to obtain environmental information each time a certain amount of them has been cumulatively stored;
executing a simultaneous localization and mapping procedure according to a current map and the continuously obtained odometer data and environmental information; and
repeating the above steps until the mobile platform completes the motion trajectory.
2. The method for simultaneous localization and mapping of claim 1, further comprising:
acquiring an initial map serving as the current map, and an initial pose of the mobile platform, according to the first obtained environmental information.
3. The method for simultaneous localization and mapping of claim 1, further comprising:
each time a default time elapses, merging the odometer data and the environmental sensing data cumulatively stored during the default time to obtain the environmental information.
4. The method for simultaneous localization and mapping of claim 1, further comprising:
while the mobile platform moves based on the motion trajectory, when it is determined, based on the pose of the mobile platform and the continuously obtained odometer data, that the view angle of an environment sensor of the mobile platform has changed beyond a default angle, merging the odometer data and the environmental sensing data accumulated during the period in which the view angle changed beyond the default angle to obtain the environmental information.
5. The method for simultaneous localization and mapping according to claim 1, wherein the simultaneous localization and mapping procedure comprises:
updating a movement amount of the mobile platform based on the currently acquired odometer data and the current map; and
correcting the pose of the mobile platform according to the environmental information and the updated movement amount of the mobile platform, and updating the current map.
6. The method for simultaneous localization and mapping according to claim 5, wherein the step of correcting the pose of the mobile platform according to the environmental information and the updated movement amount of the mobile platform and updating the current map comprises:
correcting the pose of the mobile platform through a simultaneous localization and mapping algorithm according to the environmental information and the updated movement amount of the mobile platform, and updating the current map.
7. The method for simultaneous localization and mapping according to claim 1, wherein the mobile platform includes an environment sensor for continuously collecting the environmental sensing data, used to calculate a sensing distance and a sensing direction, while the mobile platform moves based on the motion trajectory, and the step of merging the odometer data and the environmental sensing data to obtain the environmental information each time a certain amount of them has been cumulatively stored comprises:
each time a certain amount of the odometer data and the environmental sensing data has been cumulatively stored, converting and merging each piece of the accumulated environmental sensing data into the environmental information based on the odometer data obtained at the merge time point together with the piece of the accumulated odometer data collocated with it, the environmental information being substantially the same as information obtained by the mobile platform scanning the surrounding environment centered on itself.
8. A mobile platform, comprising:
an odometer for continuously acquiring odometer data of the mobile platform while the mobile platform moves based on a motion trajectory;
an environment sensor for continuously acquiring environmental sensing data, used to calculate a sensing distance and a sensing direction, while the mobile platform moves based on the motion trajectory;
a memory, connected to the odometer and the environment sensor, for continuously storing the odometer data and the environmental sensing data of the mobile platform; and
a processor, connected to the odometer, the environment sensor and the memory, for merging the odometer data and the environmental sensing data to obtain environmental information each time the memory cumulatively stores a certain amount of the odometer data and the environmental sensing data; executing a simultaneous localization and mapping procedure according to a current map, the continuously obtained odometer data and the environmental information; and repeating these operations until the mobile platform completes the motion trajectory.
9. The mobile platform of claim 8, wherein the environmental sensor is a laser range sensor, an infrared range sensor, an ultrasonic sensor, or an RGBD camera.
10. The mobile platform of claim 8, wherein the odometer is an inertial measurement unit or a wheel odometer.
11. The mobile platform of claim 8, wherein the processor is further configured to acquire an initial map serving as the current map, and an initial pose of the mobile platform, according to the first obtained environmental information.
12. The mobile platform of claim 8, wherein the processor is further configured to merge the odometer data and the environmental sensing data cumulatively stored during a default time to obtain the environmental information each time the default time elapses.
13. The mobile platform of claim 8, wherein, while the mobile platform moves based on the motion trajectory, when it is determined, based on the pose of the mobile platform and the continuously obtained odometer data, that the view angle of the environment sensor of the mobile platform has changed beyond a default angle, the processor is further configured to merge the odometer data and the environmental sensing data accumulated during the period in which the view angle changed beyond the default angle to obtain the environmental information.
14. The mobile platform of claim 8, wherein the simultaneous localization and mapping procedure comprises: updating a movement amount of the mobile platform based on the currently acquired odometer data and the current map; and correcting the pose of the mobile platform according to the environmental information and the updated movement amount, and updating the current map.
15. The mobile platform of claim 14, wherein the processor is further configured to correct the pose of the mobile platform through a simultaneous localization and mapping algorithm according to the environmental information and the updated movement amount of the mobile platform, and to update the current map.
16. The mobile platform of claim 8, wherein the processor is further configured to, each time a certain amount of the odometer data and the environmental sensing data has been cumulatively stored, convert and merge each piece of the accumulated environmental sensing data into the environmental information based on the odometer data obtained at the merge time point together with the piece of the accumulated odometer data collocated with it, the environmental information being substantially the same as information obtained by the mobile platform scanning the surrounding environment centered on itself.
17. The mobile platform of claim 8, wherein the odometer data continuously acquired by the odometer and the environmental sensing data continuously acquired by the environment sensor are stored into the memory by the processor.
18. The mobile platform of claim 8, wherein the memory is an internal memory of the processor.
CN202111224843.6A 2021-01-06 2021-10-21 Method and mobile platform for simultaneous localization and mapping Pending CN114720978A (en)

Priority Applications (1)

Application US17/551,148 (publication US20220214443A1, en), priority date 2021-01-06, filed 2021-12-14: Method for simultaneous localization and mapping and mobile platform using the same

Applications Claiming Priority (2)

US202163134566P, priority date 2021-01-06, filed 2021-01-06
US63/134,566, priority date 2021-01-06

Publications (1)

CN114720978A, published 2022-07-08

Family

ID=82233989

Family Applications (1)

CN202111224843.6A (publication CN114720978A, pending), priority date 2021-01-06, filed 2021-10-21: Method and mobile platform for simultaneous localization and mapping

Country Status (1)

Country Link
CN (1) CN114720978A (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110039037A (en) * 2009-10-09 2011-04-15 고려대학교 산학협력단 A simultaneous localization and map building method of mobile robot using vanishing point
CN103105852A (en) * 2011-11-14 2013-05-15 联想(北京)有限公司 Method and device for displacement computing and method and device for simultaneous localization and mapping
CN108052103A (en) * 2017-12-13 2018-05-18 中国矿业大学 The crusing robot underground space based on depth inertia odometer positions simultaneously and map constructing method
US20180172451A1 (en) * 2015-08-14 2018-06-21 Beijing Evolver Robotics Co., Ltd Method and system for mobile robot to self-establish map indoors
CN108508894A (en) * 2018-04-03 2018-09-07 中科微至智能制造科技江苏有限公司 A kind of robot localization method based on two-dimensional laser
US20180297207A1 (en) * 2017-04-14 2018-10-18 TwoAntz, Inc. Visual positioning and navigation device and method thereof
US10175340B1 (en) * 2018-04-27 2019-01-08 Lyft, Inc. Switching between object detection and data transfer with a vehicle radar
CN109341694A (en) * 2018-11-12 2019-02-15 哈尔滨理工大学 A kind of autonomous positioning air navigation aid of mobile sniffing robot
CN109916411A (en) * 2019-03-29 2019-06-21 韦云智 A kind of method of the indoor positioning navigation of robot
US20190204838A1 (en) * 2017-12-30 2019-07-04 Lyft, Inc. Localization Based on Sensor Data
CN111007522A (en) * 2019-12-16 2020-04-14 深圳市三宝创新智能有限公司 Position determination system of mobile robot
US20200125845A1 (en) * 2018-10-22 2020-04-23 Lyft, Inc. Systems and methods for automated image labeling for images captured from vehicles
CN111089585A (en) * 2019-12-30 2020-05-01 哈尔滨理工大学 Mapping and positioning method based on sensor information fusion
US20200309529A1 (en) * 2019-03-29 2020-10-01 Trimble Inc. Slam assisted ins


Similar Documents

Publication Publication Date Title
CN110673115B (en) Combined calibration method, device, equipment and medium for radar and integrated navigation system
CN109975792B (en) Method for correcting point cloud motion distortion of multi-line laser radar based on multi-sensor fusion
CN111532257B (en) Method and system for compensating for vehicle calibration errors
Zhang et al. Localization and navigation using QR code for mobile robot in indoor environment
US8195331B2 (en) Method, medium, and apparatus for performing path planning of mobile robot
Chen et al. Qualitative vision-based path following
US8135562B2 (en) System, method and medium calibrating gyrosensors of mobile robots
US9274526B2 (en) Autonomous vehicle and method of estimating self position of autonomous vehicle
CN111881239B (en) Construction method, construction device, intelligent robot and readable storage medium
US11279045B2 (en) Robot pose estimation method and apparatus and robot using the same
US11852484B2 (en) Method for determining the orientation of a robot, orientation determination apparatus of a robot, and robot
CN110986988B (en) Track calculation method, medium, terminal and device integrating multi-sensor data
US9122278B2 (en) Vehicle navigation
CN110673608A (en) Robot navigation method
CN111288971B (en) Visual positioning method and device
CN110763224A (en) Navigation method and navigation system for automatic guided transport vehicle
CN113561963A (en) Parking method and device and vehicle
WO2020184013A1 (en) Vehicle control device
CN111469130A (en) Robot control method and device, storage medium and processor
CN113252066B (en) Calibration method and device for parameters of odometer equipment, storage medium and electronic device
CN113256728B (en) IMU equipment parameter calibration method and device, storage medium and electronic device
CN114720978A (en) Method and mobile platform for simultaneous localization and mapping
US20190346850A1 (en) Method of acquiring image for recognizing position and robot implementing the same
Gasparino et al. Improved localization in a corn crop row using a rotated laser rangefinder for three-dimensional data acquisition
KR20230123060A (en) Robot monitoring apparatus and robot monitoring method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination