CN110444102B - Map construction method and device and unmanned equipment


Info

Publication number
CN110444102B
Authority
CN
China
Prior art keywords
map
environment
coordinate system
switching
pose
Prior art date: 2018-05-02
Legal status
Active
Application number
CN201810408970.3A
Other languages
Chinese (zh)
Other versions
CN110444102A (en)
Inventor
门春雷
刘艳光
巴航
张文凯
徐进
韩微
郝尚荣
郑行
陈明轩
Current Assignee
Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee
Beijing Jingdong Qianshi Technology Co Ltd
Priority date: 2018-05-02
Filing date: 2018-05-02
Publication date: 2021-10-01
Application filed by Beijing Jingdong Qianshi Technology Co Ltd
Priority to CN201810408970.3A
Publication of CN110444102A
Application granted
Publication of CN110444102B

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00: Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/003: Maps
    • G09B29/005: Map projections or methods associated specifically therewith

Abstract

The disclosure provides a map construction method and device and an unmanned device, and relates to the technical field of unmanned driving. The map construction method comprises the following steps: determining the environment type of the environment in which the unmanned device is located, the environment type being an indoor bright environment, an outdoor bright environment, or a dark environment; constructing a map using the map construction mode associated with that environment type; and, when the environment type changes, switching the map construction mode in use and calibrating the maps built before and after the switch. The method switches to the appropriate map construction mode as soon as the environment changes, accounts for the map discontinuities that arise because different detectors differ in scale and mounting position, and performs map calibration so that the maps before and after the switch join seamlessly, improving the adaptability of map construction to environmental change.

Description

Map construction method and device and unmanned equipment
Technical Field
The disclosure relates to the technical field of unmanned driving, in particular to a map construction method and device and unmanned equipment.
Background
SLAM (Simultaneous Localization and Mapping) refers to the problem of a robot carrying sensors moving through an unknown environment without prior information such as an environment map: using only the sensors' observations of the environment, the robot incrementally and dynamically builds a map model of the current environment while simultaneously determining its own position within the map being built. Based on SLAM, an unmanned device can therefore complete mapping and localization, and achieve autonomous navigation and path planning, even in a completely unknown environment with no prior information.
The environments that unmanned devices face are diverse and complex. For a robot to be autonomous in such environments, a reliable and robust SLAM system is one of the indispensable key technologies.
Disclosure of Invention
The inventors have found that related SLAM systems rely on data from a single sensor, such as laser-based GMapping, ORB-SLAM based on monocular vision (ORB: Oriented FAST and Rotated BRIEF, a fast feature-point extraction and description algorithm), and RGBD-SLAM based on RGBD (Red, Green, Blue, and Depth) cameras. When the operating environment meets certain conditions, such a system completes mapping and localization well. For example, for a two-dimensional mapping task in a relatively static indoor scene, a robot equipped with a sufficiently accurate lidar can largely accomplish the task with an existing SLAM system such as GMapping. However, when a SLAM system must operate for a long time, and especially when the scene changes during operation, for example when illumination conditions or the openness of the environment change, such systems cannot handle the scene change. In practice, scene changes are very common. Mapping across multiple scenes is therefore an important step in taking SLAM from theory to practical application.
Some embodiments of the present disclosure aim to improve the adaptability of mapping to the environment.
According to some aspects of the present disclosure, a map construction method is provided, comprising: determining the environment type of the environment in which the unmanned device is located, the environment type being an indoor bright environment, an outdoor bright environment, or a dark environment; constructing a map using the map construction mode associated with that environment type; and, when the environment type changes, switching the map construction mode in use and calibrating the maps before and after the switch.
Optionally, the map construction modes include at least two of the following: when the environment type is an indoor bright environment, constructing a map from depth data detected by a depth camera; when the environment type is an outdoor bright environment, constructing a map from image data detected by a visual detector; and when the environment type is a dark environment, constructing a map from distance data detected by a laser detector.
Optionally, calibrating the maps before and after switching comprises: unifying the maps constructed before and after the switch into the same coordinate system according to the continuity of the pose of the unmanned device at the switching moment.
Optionally, calibrating the maps before and after switching comprises: acquiring the final pose of the unmanned device, in the global coordinate system, under the map construction mode stopped at the switching moment, the global coordinate system taking the unmanned device's position at start-up as its origin; acquiring the initial pose of the unmanned device, in the local coordinate system, under the map construction mode started at the switching moment; determining, from the final pose and the initial pose, a coordinate system transformation matrix that transforms the initial pose into the final pose; and unifying the map constructed after the switch into the global coordinate system according to the coordinate system transformation matrix.
Optionally, determining, from the final pose and the initial pose, a coordinate system transformation matrix that transforms the initial pose into the final pose comprises: according to the formula

T_global = T_conversion · T_local

determining the coordinate transformation matrix T_conversion, where T_global is the pose of the unmanned device in the global coordinate system and T_local is the pose of the unmanned device in the local coordinate system; the coordinate transformation matrix T_conversion comprises a rotation matrix R_conversion and a translation vector t_conversion, which are determined according to the formula

c_global = R_conversion · c_local + t_conversion

where c_global is the coordinates of the unmanned device in the global coordinate system and c_local is the coordinates of the unmanned device in the local coordinate system.
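For illustration, a minimal sketch of this calibration step, assuming poses are represented as 4x4 homogeneous transforms; the function and variable names are illustrative and not part of the disclosure:

```python
import numpy as np

def compute_conversion(T_global_final: np.ndarray,
                       T_local_initial: np.ndarray) -> np.ndarray:
    """Solve T_global = T_conversion · T_local for T_conversion.

    Because the pose of the unmanned device is continuous across the switch,
    the final pose of the stopped mode (expressed in the global frame) and
    the initial pose of the started mode (expressed in its local frame)
    describe the same physical pose, so T_conversion = T_global · T_local^(-1).
    """
    return T_global_final @ np.linalg.inv(T_local_initial)

def local_to_global(T_conversion: np.ndarray, c_local: np.ndarray) -> np.ndarray:
    """Apply c_global = R_conversion · c_local + t_conversion to one 3-D point."""
    R, t = T_conversion[:3, :3], T_conversion[:3, 3]
    return R @ c_local + t
```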
Optionally, calibrating the maps before and after switching further comprises, when the unmanned device enters an outdoor bright environment: determining a scale factor for the image data detected by the visual detector according to the coordinates of the unmanned device under the map construction mode stopped at the switching moment; and constructing a map in the local coordinate system according to the scale factor and the image data.
Optionally, the map construction method further comprises: if the environment type when the unmanned device is started is an outdoor bright environment, determining a scale factor for the image data according to detection data from a distance sensor; and constructing a map in the local coordinate system according to the scale factor and the image data.
Optionally, the map construction method further comprises: if the environment type when the unmanned device is started is an outdoor bright environment, constructing a map in a local coordinate system from the image data; when an environment switch occurs, determining a scale factor for the image data according to the coordinates of the unmanned device under the map construction mode started at the switching moment; and correcting the scale of the map constructed from the image data according to the scale factor, so as to generate a map in the global coordinate system and unify the maps constructed before and after the switch into the global coordinate system.
The method switches to the appropriate map construction mode as soon as the environment changes, accounts for the map discontinuities caused by detectors differing in scale and position, and performs map calibration so that the maps before and after the switch join seamlessly, improving the adaptability of map construction to environmental change.
According to other aspects of the present disclosure, a map building apparatus is provided, comprising: an environment determination module configured to determine the environment type of the environment in which the unmanned device is located, the environment type being an indoor bright environment, an outdoor bright environment, or a dark environment; a map building module configured to build a map using the map construction mode associated with the environment type; a switching module configured to switch the map construction mode in use when the environment type changes; and a calibration module configured to calibrate the maps before and after the switch.
Optionally, the map construction modes include at least two of the following: when the environment type is an indoor bright environment, constructing a map from depth data detected by a depth camera; when the environment type is an outdoor bright environment, constructing a map from image data detected by a visual detector; and when the environment type is a dark environment, constructing a map from distance data detected by a laser detector.
Optionally, the calibration module is configured to unify the maps constructed before and after the switch into the same coordinate system according to the continuity of the pose of the unmanned device at the switching moment.
Optionally, the calibration module is configured to: acquire the final pose of the unmanned device, in the global coordinate system, under the map construction mode stopped at the switching moment, the global coordinate system taking the unmanned device's position at start-up as its origin; acquire the initial pose of the unmanned device, in the local coordinate system, under the map construction mode started at the switching moment; determine, from the final pose and the initial pose, a coordinate system transformation matrix that transforms the initial pose into the final pose; and unify the map constructed after the switch into the global coordinate system according to the coordinate system transformation matrix.
Optionally, determining, from the final pose and the initial pose, the coordinate system transformation matrix that transforms the initial pose into the final pose comprises: according to the formula

T_global = T_conversion · T_local

determining the coordinate transformation matrix T_conversion, where T_global is the pose of the unmanned device in the global coordinate system and T_local is the pose of the unmanned device in the local coordinate system; the coordinate transformation matrix T_conversion comprises a rotation matrix R_conversion and a translation vector t_conversion, which are determined according to the formula

c_global = R_conversion · c_local + t_conversion

where c_global is the coordinates of the unmanned device in the global coordinate system and c_local is the coordinates of the unmanned device in the local coordinate system.
Optionally, the calibration module is further configured to, when the unmanned device enters an outdoor bright environment: determine a scale factor for the image data detected by the visual detector according to the coordinates of the unmanned device under the map construction mode stopped at the switching moment; and construct a map in the local coordinate system according to the scale factor and the image data.
Optionally, the map building module is further configured to, if the environment type when the unmanned device is started is an outdoor bright environment, determine a scale factor for the image data according to detection data from a distance sensor, and build a map in the local coordinate system according to the scale factor and the image data.
Optionally, the calibration module is further configured to: if the environment type when the unmanned device is started is an outdoor bright environment, construct a map in a local coordinate system from the image data; when an environment switch occurs, determine a scale factor for the image data according to the coordinates of the unmanned device under the map construction mode started at the switching moment; and correct the scale of the map constructed from the image data according to the scale factor, so as to generate a map in the global coordinate system and unify the maps constructed before and after the switch into the global coordinate system.
According to still further aspects of the present disclosure, a map construction apparatus is provided, including: a memory; and a processor coupled to the memory, the processor configured to perform any of the above map construction methods based on instructions stored in the memory.
The apparatus switches to the appropriate map construction mode as soon as the environment changes, accounts for the map discontinuities caused by detectors differing in scale and position, and performs map calibration so that the maps before and after the switch join seamlessly, improving the adaptability of map construction to environmental change.
According to still further aspects of the disclosure, a computer-readable storage medium is provided, on which computer program instructions are stored; when executed by a processor, the instructions implement the steps of any of the above map construction methods.
By executing the instructions on the computer-readable storage medium, the map construction mode can be switched in time when the environment changes, the map discontinuities caused by detectors differing in scale and position are accounted for, and map calibration joins the maps before and after the switch, improving the adaptability of map construction to environmental change.
Further, according to some aspects of the present disclosure, an unmanned device is provided, comprising: an environment sensor configured to detect the environment in which the unmanned device is located; at least two of a laser detector, a depth camera, or a visual detector, configured to provide detection data for constructing a map; and a map building apparatus as described in any of the above.
The unmanned device can detect the environment in which it is located, switch to the appropriate map construction mode in time when the environment changes, account for the map discontinuities caused by detectors differing in scale and position, and perform map calibration so that the maps before and after the switch join seamlessly, improving the adaptability of map construction to environmental change.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this disclosure, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure and not to limit the disclosure. In the drawings:
fig. 1 is a flow diagram of some embodiments of a mapping method of the present disclosure.
Fig. 2 is a flow diagram of some embodiments of map calibration in a mapping method of the present disclosure.
FIG. 3 is a flow chart of further embodiments of map calibration in a mapping method of the present disclosure.
Fig. 4 is a schematic diagram of some embodiments of a mapping apparatus of the present disclosure.
FIG. 5 is a schematic diagram of further embodiments of a mapping apparatus of the present disclosure.
Fig. 6 is a schematic diagram of further embodiments of a mapping apparatus of the present disclosure.
Fig. 7 is a schematic diagram of some embodiments of the unmanned device of the present disclosure.
Detailed Description
The technical solution of the present disclosure is further described in detail by the accompanying drawings and examples.
A flow diagram of some embodiments of a mapping method of the present disclosure is shown in fig. 1.
In step 101, the environment type of the environment in which the unmanned device is located is determined. In some embodiments, environment types can be distinguished by the openness of the environment and its illumination level, for example into indoor bright, outdoor bright, and dark environments, with the environment of the unmanned device determined by sensors. In some embodiments, the environment type may be determined by a photosensitive sensing device combined with a distance detection device.
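As a concrete illustration, one such rule might be sketched as follows; the thresholds, names, and sensor interface are assumptions for illustration only, not values from the disclosure:

```python
# Illustrative thresholds; a real system would tune these per sensor.
AMBIENT_LIGHT_DARK_LUX = 10.0  # below this, the scene is treated as dark
OPEN_SPACE_RANGE_M = 30.0      # median range beyond this suggests open outdoor space

def classify_environment(ambient_lux: float, median_range_m: float) -> str:
    """Combine a photosensitive reading with a distance reading to pick a type."""
    if ambient_lux < AMBIENT_LIGHT_DARK_LUX:
        return "dark"
    if median_range_m > OPEN_SPACE_RANGE_M:
        return "outdoor_bright"
    return "indoor_bright"
```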
In step 102, a map is constructed according to the environment type by adopting an associated map construction mode.
Because the laser detector suffers interference under strong light and is better suited to dark environments, distance data detected by the laser detector are used to construct the map when the environment type is a dark environment, for example using GMapping.
Under stronger light, detection is better performed with a camera. Commonly used cameras include cameras that capture two-dimensional images and cameras with depth detection capability (e.g., binocular cameras, 3D cameras, RGBD cameras, Kinect cameras, etc.).
Because a depth camera can acquire depth information directly, it obtains distance data more directly and accurately in an indoor environment; therefore, when the environment type is an indoor bright environment, the map is constructed from the depth data detected by the depth camera, for example using RGBD-SLAM.
In an outdoor environment, the space is too open and the depth detection range of a depth camera is limited, so it is more suitable to construct the map with a camera capturing two-dimensional images, via image processing, for example using ORB-SLAM.
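Summarizing the three cases above as a dispatch table (a sketch with illustrative mode identifiers):

```python
# One mapping mode per environment type, as described above.
MODE_BY_ENVIRONMENT = {
    "dark": "laser_slam",             # lidar range data, e.g. GMapping
    "indoor_bright": "rgbd_slam",     # depth-camera data, e.g. RGBD-SLAM
    "outdoor_bright": "visual_slam",  # monocular images, e.g. ORB-SLAM
}

def select_mode(environment_type: str) -> str:
    """Return the map construction mode associated with the environment type."""
    return MODE_BY_ENVIRONMENT[environment_type]
```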
In step 103, when the environment type changes, the map construction mode in use is switched and the maps before and after the switch are calibrated. In some embodiments, because the detectors of different map construction modes are not mounted in exactly the same position, and each mode has its own coordinate system, the maps before and after the switch need to be calibrated so that they join seamlessly. In some embodiments, the maps constructed before and after the switch can be unified into the same coordinate system according to the continuity of the unmanned device's pose at the switching moment.
The method switches to the appropriate map construction mode as soon as the environment changes, accounts for the map discontinuities caused by detectors differing in scale and position, and performs map calibration so that the maps before and after the switch join seamlessly, improving the adaptability of map construction to environmental change.
In some embodiments, a scene classification algorithm for robot scene detection, based on a convolutional neural network with Bayesian-filtering optimization, can be used to detect the three environments (indoor, outdoor, and dark) efficiently and accurately. The environment detection model acquires an image of the current environment, feeds it to a convolutional-neural-network environment classifier to obtain a classification result, and finally applies Bayesian filtering to incorporate the temporal and spatial correlation between consecutive images, improving the stability and accuracy of the classification result.
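One common way to realize such filtering is a recursive Bayes update over the per-frame classifier scores; the sketch below, including the transition probabilities, is an assumption for illustration rather than the disclosure's concrete algorithm:

```python
import numpy as np

CLASSES = ["indoor_bright", "outdoor_bright", "dark"]
STAY = 0.9  # assumed probability that the environment type persists between frames
# Row-stochastic transition matrix: STAY on the diagonal, the rest split evenly.
TRANSITION = np.full((3, 3), (1 - STAY) / 2)
np.fill_diagonal(TRANSITION, STAY)

def bayes_update(belief: np.ndarray, cnn_probs: np.ndarray) -> np.ndarray:
    """Fuse the temporal prior with the current frame's CNN class scores."""
    predicted = TRANSITION.T @ belief   # propagate the belief one frame forward
    posterior = predicted * cnn_probs   # weight by the classifier's output
    return posterior / posterior.sum()  # renormalize to a probability distribution
```

Smoothing the belief over consecutive frames in this way suppresses one-frame misclassifications, which is what prevents spurious back-and-forth switching of the map construction mode.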
In this way, the accuracy of environment-type determination is improved, the influence of errors is reduced, repeated switching of the map construction mode is avoided, the amount of computation is reduced, and the accuracy of map construction is improved.
A flow diagram of some embodiments of map calibration in a mapping method of the present disclosure is shown in fig. 2.
In step 201, the final pose of the unmanned device under the map construction mode stopped at the switching moment is acquired in the global coordinate system; the global coordinate system takes the unmanned device's position at start-up as its origin.
In some embodiments, the local coordinate system used at start-up may serve as the global coordinate system; in other embodiments, the local coordinate system used at start-up may first be calibrated in scale, and be used as the global coordinate system once it agrees with real-world dimensions.
In step 202, the initial pose of the unmanned device in the local coordinate system under the map construction mode started at the switching moment is acquired.
Because the state of the unmanned device is continuous, the final pose under the map construction mode closed at the switching moment is the same as, or very close to, the initial pose under the mode opened; the transformation between the coordinate systems before and after the switch can therefore be obtained by taking the pose at the switching moment as a reference.
In step 203, a coordinate system transformation matrix for transforming the initial pose to the final pose is determined according to the final pose and the initial pose.
In some embodiments, the final pose is expressed in global coordinates and the initial pose in the post-switch local coordinates. The coordinate transformation matrix T_conversion can be determined according to the formula

T_global = T_conversion · T_local

where T_global is the pose of the unmanned device in the global coordinate system and T_local is the pose of the unmanned device in the local coordinate system. The coordinate transformation matrix T_conversion comprises a rotation matrix R_conversion and a translation vector t_conversion, which can be determined according to the formula

c_global = R_conversion · c_local + t_conversion

where c_global is the coordinates of the unmanned device in the global coordinate system and c_local is the coordinates of the unmanned device in the local coordinate system.
In step 204, the map constructed after the switch is unified into the global coordinate system according to the coordinate system transformation matrix. In some embodiments, the pose of the unmanned device recorded in local coordinates may be converted into the global coordinate system incrementally as the map is built; alternatively, the post-switch local coordinate system may be corrected with the obtained transformation matrix so that map construction proceeds directly in global coordinates.
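Continuing the earlier sketch, the whole post-switch map can be re-expressed in the global frame in one batch operation (again with assumed names):

```python
import numpy as np

def unify_map(T_conversion: np.ndarray, local_points: np.ndarray) -> np.ndarray:
    """Re-express an (N, 3) array of local map points in the global frame."""
    R, t = T_conversion[:3, :3], T_conversion[:3, 3]
    return local_points @ R.T + t  # row-vector form of c_global = R·c_local + t
```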
By the method, the maps constructed before and after switching can be unified under the global coordinate system by utilizing the continuity of the pose of the unmanned equipment at the switching moment, so that the continuous maps are constructed under the global coordinate system, and the unmanned equipment can be positioned more accurately.
In some embodiments, because the scale of a visual image is ambiguous, the scale of a map constructed from visual images is affected. A flow chart of further embodiments of map calibration in the mapping method of the present disclosure is shown in fig. 3.
In step 301, it is determined whether the switch is from a dark environment or an indoor bright environment to an outdoor bright environment. If the switch target is an outdoor bright environment, the map will be constructed from image data detected by the visual detector, and step 302 is performed.
In step 302, a scale factor for the image data detected by the visual detector is determined from the coordinates of the unmanned device under the map construction mode stopped at the switching moment. In some embodiments, if the map was constructed from laser-detector distance data before the switch, the detection distance is taken from the laser detector's result; if the map was constructed from depth-camera depth data before the switch, the detection distance is taken from the depth camera's result. The detected distance may be the height of the unmanned device, its distance from an obstacle, or a z-axis coordinate. Calibrating the corresponding height or obstacle distance in the image data detected by the visual detector against this metric distance yields the scale factor of the image data.
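A minimal sketch of this scale calibration, assuming the same quantity (e.g. height above ground) is available both as a metric distance from the stopped mode and in the visual map's arbitrary units; the names are illustrative:

```python
import numpy as np

def scale_factor(metric_distance_m: float, visual_distance_units: float) -> float:
    """Units-to-metres factor for the monocular visual map."""
    return metric_distance_m / visual_distance_units

def rescale_map(points: np.ndarray, s: float) -> np.ndarray:
    """Rescale (N, 3) visual-map points to metric units."""
    return points * s
```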
In step 303, a map in the local coordinate system is constructed from the scale factor and the image data, so that the local coordinate system has an absolute scale consistent with reality.
In step 304, the map constructed after switching is unified into a global coordinate system.
In this way, the visual image's inability to provide distance information directly is compensated, the scale is kept uniform before and after the switch, and the accuracy and continuity of map construction are improved.
In some embodiments, if the start-up environment of the unmanned device is an outdoor bright environment, the scale factor cannot be obtained directly from the visual image. In some embodiments, distance data can be detected with a distance sensor and the scale factor derived from it, so that the map constructed in the local coordinate system matches real dimensions as closely as possible; a map with accurate scale can then be built even if the unmanned device never switches environments.
In some embodiments, when the start-up environment of the unmanned device is an outdoor bright environment and the device later switches to a dark or indoor bright environment, the continuity of the pose at the switching moment can be used to correct the scale of the pre-switch global coordinate system from the distance information obtained by the post-switch map construction mode, making the global scale more accurate; the post-switch mode is then unified into the global coordinate system for mapping. In some embodiments, the map previously constructed in the outdoor bright environment can also be corrected with the scale factor of the rectified global coordinate system, improving map accuracy at every stage.
A schematic diagram of some embodiments of the mapping apparatus of the present disclosure is shown in fig. 4.
The environment determination module 401 can determine the type of environment in which the unmanned device is located. In some embodiments, environment types can be distinguished by the openness of the environment and its illumination level, for example into indoor bright, outdoor bright, and dark environments, with the environment of the unmanned device determined by sensors. In some embodiments, the environment type may be determined by a photosensitive sensing device combined with a distance detection device.
The map building module 402 can build a map using the map construction mode associated with the environment type. Because the laser detector suffers interference under strong illumination and is better suited to dark environments, it supplies the distance data for map construction when the environment type is a dark environment. Because a depth camera can acquire depth information directly, and therefore obtains distance data directly and accurately indoors, the map is constructed from its depth data when the environment type is an indoor bright environment. In an outdoor environment, the space is too open and the depth camera's detection range is limited, so a camera capturing two-dimensional images, combined with image processing, is more suitable.
The switching module 403 can switch the map construction mode in use when the environment type changes, including: constructing a map from laser-detector distance data in a dark environment; constructing a map from visual-detector image data in an outdoor bright environment; and constructing a map from depth-camera depth data in an indoor bright environment. When the type of environment in which the unmanned device is located changes, the corresponding map construction mode is switched in.
The calibration module 404 can calibrate the maps before and after the map construction mode is switched when the environment type changes. Because the detectors of different map construction modes are not mounted in exactly the same position, and each mode has its own coordinate system, the maps before and after the switch must be calibrated so that they join seamlessly. In some embodiments, the maps constructed before and after the switch can be unified into the same coordinate system according to the continuity of the unmanned device's pose at the switching moment.
The apparatus switches to the appropriate map construction mode as soon as the environment changes, accounts for the map discontinuities caused by detectors differing in scale and position, and performs map calibration so that the maps before and after the switch join seamlessly, improving the adaptability of map construction to environmental change.
In some embodiments, the environment determination module 401 comprises a sensing-data acquisition unit, a convolutional-neural-network unit, and a Bayesian-filtering optimization unit, and can detect the three environment types (indoor, outdoor, and dark) efficiently and accurately using an environment classification algorithm for robot environment detection based on a convolutional neural network with Bayesian-filtering optimization. The module 401 acquires an image of the current environment through the sensing-data acquisition unit, inputs it to the environment classifier of the convolutional-neural-network unit to obtain a classification result, and finally, through the Bayesian-filtering unit, incorporates the temporal and spatial correlation between consecutive images, improving the stability and accuracy of the classification result.
The apparatus can improve the accuracy of environment-type determination, reduce the influence of errors, avoid repeated switching of the map construction mode, reduce the amount of computation, and improve the accuracy of map construction.
In some embodiments, the calibration module 404 can obtain the final pose of the unmanned device in the global coordinate system under the map construction mode stopped at the switching moment, and the initial pose of the unmanned device in the local coordinate system under the mode started at the switching moment.
Because the state of the unmanned device is continuous, the final pose under the map construction mode closed at the switching moment is the same as, or very close to, the initial pose under the mode opened; the transformation between the coordinate systems before and after the switch can therefore be obtained by taking the pose at the switching moment as a reference.
In some embodiments, a coordinate system transformation matrix that transforms the initial pose into the final pose may be determined from the final pose and the initial pose. For example, with the final pose expressed in global coordinates and the initial pose in the post-switch local coordinates, the coordinate transformation matrix T_conversion is determined according to the formula

T_global = T_conversion · T_local

where T_global is the pose of the unmanned device in the global coordinate system and T_local is the pose of the unmanned device in the local coordinate system. The coordinate transformation matrix T_conversion comprises a rotation matrix R_conversion and a translation vector t_conversion, determined according to the formula

c_global = R_conversion · c_local + t_conversion

where c_global is the coordinates of the unmanned device in the global coordinate system and c_local is the coordinates of the unmanned device in the local coordinate system.
The map constructed after the switch is unified into the global coordinate system according to the coordinate system transformation matrix. In some embodiments, the pose of the unmanned device recorded in local coordinates may be converted into the global coordinate system incrementally as the map is built, or map construction may proceed directly in global coordinates by correcting the post-switch local coordinate system with the transformation matrix obtained above.
The device can unify maps constructed before and after switching to a global coordinate system by using the continuity of the pose of the unmanned equipment at the switching moment, so that a continuous map is constructed in the global coordinate system, and the unmanned equipment can be positioned more accurately.
In some embodiments, when the target mode of the switch constructs the map from image data detected by a visual detector, the scale of the visual image is ambiguous; the calibration module 404 therefore first determines a scale factor for the image data from the coordinates of the unmanned device under the map construction mode stopped at the switching moment, then constructs a map in the local coordinate system from the scale factor and the image data so that the local coordinate system has an absolute scale consistent with reality, and unifies the post-switch map into the global coordinate system. The apparatus compensates for the visual image's inability to provide distance information directly, keeps the scale uniform before and after the switch, and improves the accuracy and continuity of map construction.
In some embodiments, when the start-up environment of the unmanned device is an outdoor bright environment, the map construction module may detect distance data with a distance sensor and derive a scale factor from it, so that the map constructed in the local coordinate system matches real dimensions as closely as possible and an accurate map can be built even when no environment switch occurs.
In some embodiments, when the environment determination module determines that the unmanned device, started in an outdoor bright environment, has switched to a dark or indoor bright environment, the calibration module may use the continuity of the pose at the switching moment to correct the scale of the pre-switch global coordinate system from the distance information obtained by the post-switch map construction mode, making the global scale more accurate, and unify the post-switch mode into map construction under the global coordinate system. In some embodiments, the calibration module may further correct the map previously constructed in the outdoor bright environment with the scale factor of the rectified global coordinate system, improving map accuracy at every stage.
A schematic structural diagram of an embodiment of the map building apparatus of the present disclosure is shown in fig. 5. The map building apparatus comprises a memory 501 and a processor 502. The memory 501 may be a magnetic disk, flash memory, or any other non-volatile storage medium, and stores the instructions of the corresponding embodiments of the map construction method above. The processor 502, coupled to the memory 501, may be implemented as one or more integrated circuits, such as a microprocessor or microcontroller, and is configured to execute the instructions stored in the memory, improving the adaptability of map construction to environmental changes.
In some embodiments, as also shown in fig. 6, the map building apparatus 600 includes a memory 601 and a processor 602. The processor 602 is coupled to the memory 601 by a BUS 603. The map building apparatus 600 may also be connected to an external storage 605 via a storage interface 604 for calling external data, and may also be connected to a network or another computer system (not shown) via a network interface 606. And will not be described in detail herein.
In this embodiment, instructions are stored in the memory and processed by the processor, improving the adaptability of map construction to environmental changes.
In another embodiment, a computer-readable storage medium has stored thereon computer program instructions which, when executed by a processor, implement the steps of the method in the corresponding embodiment of the mapping method. As will be appreciated by one skilled in the art, embodiments of the present disclosure may be provided as a method, apparatus, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable non-transitory storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
A schematic diagram of some embodiments of the unmanned aerial device of the present disclosure is shown in fig. 7. The map building apparatus 70 may be any of the map building apparatuses described above. The laser detector 72 can provide detection data for the constructed map in a dark environment, the depth camera 73 can provide detection data for the constructed map in an indoor light environment, and the vision detector 74 can provide detection data for the constructed map in an outdoor light environment.
The environment sensor 71 can determine the type of environment in which the unmanned device is located according to the openness and illumination level of the environment. In some embodiments, the environment types may include indoor bright, outdoor bright, and dark environments. In some embodiments, the environment sensor 71 may comprise a photosensitive sensing device and a distance detection device; in some embodiments, the distance detection device may be the laser detector 72 or the depth camera 73, and the photosensitive sensing device may be the vision detector 74. In other embodiments, the environment sensor 71 may be an image acquisition device that captures the environment of the unmanned device and detects the three environments (indoor, outdoor, and dark) efficiently and accurately with a scene classification algorithm for robot scene detection based on a convolutional neural network with Bayesian-filtering optimization. In some embodiments, the vision detector 74 may be kept continuously active, providing detection data for map construction in outdoor bright environments and a data basis for environment-type determination under the other environment types, thereby reducing the number of devices the unmanned device must carry.
The unmanned device can detect the environment in which it is located, switch to the appropriate map construction mode in time when the environment changes, account for the map discontinuities caused by detectors differing in scale and position, and perform map calibration so that the maps before and after the switch join seamlessly, improving the adaptability of map construction to environmental change.
The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Thus far, the present disclosure has been described in detail. Some details that are well known in the art have not been described in order to avoid obscuring the concepts of the present disclosure. It will be fully apparent to those skilled in the art from the foregoing description how to practice the presently disclosed embodiments.
The methods and apparatus of the present disclosure may be implemented in a number of ways. For example, the methods and apparatus of the present disclosure may be implemented by software, hardware, firmware, or any combination of software, hardware, and firmware. The above-described order for the steps of the method is for illustration only, and the steps of the method of the present disclosure are not limited to the order specifically described above unless specifically stated otherwise. Further, in some embodiments, the present disclosure may also be embodied as programs recorded in a recording medium, the programs including machine-readable instructions for implementing the methods according to the present disclosure. Thus, the present disclosure also covers a recording medium storing a program for executing the method according to the present disclosure.
Finally, it should be noted that the above examples are intended only to illustrate the technical solutions of the present disclosure, not to limit them. Although the present disclosure has been described in detail with reference to preferred embodiments, those of ordinary skill in the art will understand that the specific embodiments may still be modified, or some technical features equivalently substituted, without departing from the spirit of the disclosure, and all such modifications are intended to fall within the scope of the claims of this disclosure.

Claims (15)

1. A map construction method, comprising:
determining the environment type of the environment in which the unmanned device is located, wherein the environment type is one of an indoor bright environment, an outdoor bright environment, and a dark environment;
constructing a map by adopting a related map construction mode according to the environment type;
when the environment type changes, switching the used map construction mode and calibrating the map before and after switching, wherein the method comprises the following steps:
when the unmanned device enters an outdoor bright environment: determining a scale factor of the image data detected by a visual detector according to the coordinates of the unmanned device in the map construction mode stopped at the switching moment; constructing a map under a local coordinate system according to the scale factor and the image data;
further comprising: and unifying the maps constructed before and after switching to the same coordinate system according to the continuity of the pose of the unmanned equipment at the switching moment.
2. The method of claim 1, wherein the map construction modes include at least two of the following:
under the condition that the environment type is an indoor bright environment, a map is constructed according to depth data detected by a depth camera;
under the condition that the environment type is an outdoor bright environment, a map is constructed according to image data detected by a visual detector;
and under the condition that the environment type is a dark environment, a map is constructed by adopting the distance data detected by the laser detector.
3. The method of claim 1, wherein calibrating the maps before and after switching comprises:
acquiring the final pose of the unmanned equipment in a global coordinate system in a map construction mode stopped at a switching moment, wherein the global coordinate system takes the position of the unmanned equipment when the unmanned equipment is started as a coordinate origin;
acquiring an initial pose of the unmanned equipment in a local coordinate system in a map construction mode started at a switching moment;
determining a coordinate system transformation matrix for transforming the initial pose into the final pose according to the final pose and the initial pose;
unifying the constructed map after switching to the global coordinate system according to the coordinate system transformation matrix.
4. The method of claim 3, wherein the determining a coordinate system transformation matrix that transforms the initial pose to the final pose as a function of the final pose and the initial pose comprises:
according to the formula
Tglobal=Tconversion·Tlocal
Determining a coordinate transformation matrix TconversionWherein, TglobalFor the pose, T, of the drone in a global coordinate systemlocalThe pose of the unmanned equipment under a local coordinate system is obtained; coordinate transformation matrix TconversionComprising a rotation matrix RconversionAnd a translation vector tconversionAccording to the formula
cglobal=Rconversion·clocal+tconversion
Determining a rotation matrix RconversionAnd a translation vector tconversion,cglobalAs coordinates of the unmanned aerial vehicle in a global coordinate system, clocalCoordinates of the unmanned equipment in a local coordinate system are obtained.
5. The method of claim 2, further comprising:
if the environment type when the unmanned equipment is started is an outdoor bright environment, then,
determining a scale factor of the image data according to detection data of a distance sensor;
and constructing a map under a local coordinate system according to the scale factor and the image data.
6. The method of claim 2 or 5, further comprising:
if the environment type when the unmanned equipment is started is an outdoor bright environment, constructing a map under a local coordinate system according to image data;
when environment switching occurs, determining a scale factor of the image data according to the coordinates of the unmanned equipment in a map construction mode started at the switching moment;
and correcting the scale of the map constructed according to the image data according to the scale factor to generate a map under a global coordinate system, and unifying the constructed maps before and after switching under the global coordinate system.
7. A map building apparatus comprising:
an environment determination module configured to determine an environment type of an environment in which the unmanned device is located, the environment type including an indoor bright environment, an outdoor bright environment, and a dark environment;
the map building module is configured to build a map by adopting an associated map building mode according to the environment type;
the switching module is configured to switch the used map construction mode when the environment type changes;
a calibration module configured to calibrate a map before and after switching, comprising:
when the unmanned device enters an outdoor bright environment: determining a scale factor of the image data detected by a visual detector according to the coordinates of the unmanned device in the map construction mode stopped at the switching moment; constructing a map under a local coordinate system according to the scale factor and the image data;
further comprising: and unifying the maps constructed before and after switching to the same coordinate system according to the continuity of the pose of the unmanned equipment at the switching moment.
8. The apparatus of claim 7, wherein the map construction modes include at least two of the following:
under the condition that the environment type is an indoor bright environment, a map is constructed according to depth data detected by a depth camera;
under the condition that the environment type is an outdoor bright environment, a map is constructed according to image data detected by a visual detector;
and under the condition that the environment type is a dark environment, a map is constructed by adopting the distance data detected by the laser detector.
9. The apparatus of claim 7, wherein the calibration module is configured to:
acquiring the final pose of the unmanned equipment in a global coordinate system in a map construction mode stopped at a switching moment, wherein the global coordinate system takes the position of the unmanned equipment when the unmanned equipment is started as a coordinate origin;
acquiring an initial pose of the unmanned equipment in a local coordinate system in a map construction mode started at a switching moment;
determining a coordinate system transformation matrix for transforming the initial pose into the final pose according to the final pose and the initial pose;
unifying the constructed map after switching to the global coordinate system according to the coordinate system transformation matrix.
10. The apparatus of claim 9, wherein the determining a coordinate system transformation matrix to transform the initial pose to the final pose as a function of the final pose and the initial pose comprises:
according to the formula
Tglobal=Tconversion·Tlocal
Determining a coordinate transformation matrix TconversionWherein, TglobalFor the pose, T, of the drone in a global coordinate systemlocalThe pose of the unmanned equipment under a local coordinate system is obtained; coordinate transformation matrix TconversionComprising a rotation matrix RconversionAnd a translation vector tconversionAccording to the formula
cglobal=Rconversion·clocal+tconversion
Determining a rotation matrix RconversionAnd a translation vector tconversion,cglobalAs coordinates of the unmanned aerial vehicle in a global coordinate system, clocalCoordinates of the unmanned equipment in a local coordinate system are obtained.
11. The apparatus of claim 8, wherein,
the map building module is further configured to determine a scale factor of the image data according to detection data of a distance sensor if the environment type when the unmanned device is started is an outdoor bright environment, and build a map under a local coordinate system according to the scale factor and the image data.
12. The apparatus of claim 8 or 11,
the calibration module is further configured to:
if the environment type when the unmanned equipment is started is an outdoor bright environment, constructing a map under a local coordinate system according to image data;
when environment switching occurs, determining a scale factor of the image data according to the coordinates of the unmanned equipment in a map construction mode started at the switching moment;
and correcting the scale of the map constructed according to the image data according to the scale factor to generate a map under a global coordinate system, and unifying the constructed maps before and after switching under the global coordinate system.
13. A map building apparatus comprising:
a memory; and
a processor coupled to the memory, the processor configured to perform the method of any of claims 1-6 based on instructions stored in the memory.
14. A computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the method of any one of claims 1 to 6.
15. An unmanned device comprising:
an environment sensor configured to detect an environment in which the unmanned device is located;
at least two of a laser detector, a depth camera, or a visual detector configured to provide detection data for constructing a map;
and
the map building apparatus of any one of claims 7 to 13.
CN201810408970.3A 2018-05-02 2018-05-02 Map construction method and device and unmanned equipment Active CN110444102B (en)

Priority Applications (1)

Application Number: CN201810408970.3A; Priority Date: 2018-05-02; Filing Date: 2018-05-02; Title: Map construction method and device and unmanned equipment


Publications (2)

Publication Number Publication Date
CN110444102A CN110444102A (en) 2019-11-12
CN110444102B true CN110444102B (en) 2021-10-01

Family ID: 68427612

Family Applications (1)

Application Number: CN201810408970.3A; Status: Active; Publication: CN110444102B (en); Title: Map construction method and device and unmanned equipment

Country Status (1)

CN: CN110444102B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110986920B (en) * 2019-12-26 2021-06-22 武汉万集信息技术有限公司 Positioning navigation method, device, equipment and storage medium


Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US9976860B2 (en) * 2013-04-16 2018-05-22 Apple Inc. Seamless transition from outdoor to indoor mapping
IL227860B (en) * 2013-08-08 2019-05-30 Israel Aerospace Ind Ltd Classification of environment elements
CN107917712B (en) * 2017-11-16 2020-07-28 苏州艾吉威机器人有限公司 Synchronous positioning and map construction method and device

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
CN103900583A (en) * 2012-12-25 2014-07-02 联想(北京)有限公司 Device and method used for real-time positioning and map building
CN103786806A (en) * 2014-01-20 2014-05-14 北京航空航天大学 Multifunctional leg-and-wheel combination robot and multi-movement-mode intelligent switching method thereof
CN107160395A (en) * 2017-06-07 2017-09-15 中国人民解放军装甲兵工程学院 Map constructing method and robot control system
CN107392547A (en) * 2017-08-08 2017-11-24 北京京东尚科信息技术有限公司 Goods handling method, apparatus, system and computer-readable recording medium
CN107421465A (en) * 2017-08-18 2017-12-01 大连理工大学 A kind of binocular vision joining method based on laser tracker

Non-Patent Citations (3)

Title
ORB-SLAM: A Versatile and Accurate Monocular SLAM System; Raúl Mur-Artal, J. M. M. Montiel, Juan D. Tardós; IEEE Transactions on Robotics; 2015-08-24; pp. 1147-1163 *
SceneSLAM: A SLAM Framework Combined with Scene Detection; Zhehang Tong, Dianxi Shi, Shaowu Yang; 2017 IEEE International Conference on Robotics and Biomimetics (ROBIO); 2018-03-26; pp. 487-494 *
A Survey of Monocular Simultaneous Localization and Mapping Methods (in Chinese); Liu Haomin, Zhang Guofeng, Bao Hujun; Journal of Computer-Aided Design & Computer Graphics; 2016-06-30; pp. 855-868 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210302

Address after: Room a1905, 19 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Beijing Jingdong Qianshi Technology Co.,Ltd.

Address before: 101, 1st floor, building 2, yard 20, Suzhou street, Haidian District, Beijing 100080

Applicant before: Beijing Jingbangda Trading Co.,Ltd.

Effective date of registration: 20210302

Address after: 101, 1st floor, building 2, yard 20, Suzhou street, Haidian District, Beijing 100080

Applicant after: Beijing Jingbangda Trading Co.,Ltd.

Address before: 100195 Beijing Haidian Xingshikou Road 65 West Cedar Creative Garden 4 District 11 Building East 1-4 Floor West 1-4 Floor

Applicant before: BEIJING JINGDONG SHANGKE INFORMATION TECHNOLOGY Co.,Ltd.

Applicant before: BEIJING JINGDONG CENTURY TRADING Co.,Ltd.

GR01 Patent grant