WO2023273415A1 - Unmanned aerial vehicle-based positioning method and apparatus, storage medium, electronic device and product - Google Patents


Info

Publication number
WO2023273415A1
Authority
WO
WIPO (PCT)
Prior art keywords
positioning data
terminal device
three-dimensional map
initial
target
Prior art date
Application number
PCT/CN2022/080773
Other languages
English (en)
Chinese (zh)
Inventor
黄晓庆
张站朝
董文锋
马世奎
Original Assignee
达闼机器人股份有限公司
Priority date
Filing date
Publication date
Application filed by 达闼机器人股份有限公司
Publication of WO2023273415A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05: Geographic models
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10028: Range image; Depth image; 3D point clouds
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30241: Trajectory

Definitions

  • the present disclosure relates to the technical field of wireless positioning, and in particular, to a positioning method, device, storage medium, electronic equipment and product based on a drone.
  • In the related art, a robot can use various information collection devices to collect environmental information about the physical environment in which it is located, and use a global map of its range of activity to perform positioning.
  • The global map is usually acquired by controlling the robot in advance to collect environmental information at every place within its range of activity so as to establish the global map.
  • The accuracy of a global map established in this way is low.
  • Moreover, the efficiency of collecting the environmental information is low, which reduces the efficiency of generating the global map.
  • the purpose of the present disclosure is to provide a positioning method, device, storage medium, electronic equipment and products based on drones, so as to solve related technical problems existing in the prior art.
  • a positioning method based on a drone is provided, which is applied to a terminal device, and the method includes:
  • acquiring a three-dimensional map of a preset area sent by a control platform, where the three-dimensional map is determined according to a plurality of initial positioning data sets and the height of the terminal device, each of the initial positioning data sets corresponds to a collection track in the preset area, the initial positioning data set includes a plurality of initial positioning data collected by the UAV according to the corresponding collection track, each of the initial positioning data includes point cloud data and image data, and the projections of the collection tracks on the horizontal plane are the same;
  • acquiring target positioning data, the target positioning data including point cloud data and image data collected by the terminal device at the current moment; and determining, according to the three-dimensional map and the target positioning data, a target pose of the terminal device in the three-dimensional map.
  • each of the initial positioning data sets includes a plurality of initial positioning data collected by the UAV in oblique photography.
  • the determining the target pose of the terminal device in the three-dimensional map according to the three-dimensional map and the target positioning data includes:
  • determining, according to the target positioning data, three-dimensional environment features of the environment where the terminal device is currently located, where the three-dimensional environment features include point cloud features and image features; and matching the three-dimensional environment features with the three-dimensional map to determine the target pose.
  • the method further includes:
  • determining, according to the target pose and the location information of the destination of the terminal device and according to a preset path planning algorithm, a target trajectory that satisfies a specified constraint condition, and moving according to the target trajectory, where the specified constraint condition is determined according to the projection of the collection track on the horizontal plane.
  • determining a target trajectory that satisfies a specified constraint condition includes:
  • determining at least one initial trajectory according to the target pose and the location information of the destination and according to the preset path planning algorithm; and using, as the target trajectory, the initial trajectory with the highest matching degree with the projection of the collection track on the horizontal plane.
  • a positioning method based on a drone is provided, which is applied to a control platform, and the method includes:
  • acquiring a plurality of initial positioning data sets sent by the UAV, where each of the initial positioning data sets corresponds to a collection trajectory in a preset area, the initial positioning data set includes a plurality of initial positioning data collected by the UAV according to the corresponding collection trajectory, each of the initial positioning data includes point cloud data and image data, and the projection of each of the collection trajectories on the horizontal plane is the same;
  • determining a three-dimensional map of the preset area according to the plurality of initial positioning data sets and the height of a terminal device; and
  • sending the three-dimensional map to the terminal device, so that the terminal device determines a target pose of the terminal device in the three-dimensional map according to the three-dimensional map and target positioning data, where the target positioning data includes the point cloud data and image data collected by the terminal device at the current moment.
  • each of the initial positioning data sets includes a plurality of initial positioning data collected by the UAV in oblique photography.
  • the determination of the three-dimensional map of the preset area according to the plurality of initial positioning data sets and the height of the terminal device includes:
  • determining a converted positioning data set according to the height of the terminal device and the plurality of initial positioning data sets, where the converted positioning data set includes a plurality of converted positioning data on a converted collection track, the converted positioning data includes point cloud data and image data, the height of the converted collection track is the same as the height of the terminal device, and the projection of the converted collection track on the horizontal plane is the same as the projection of each of the collection tracks on the horizontal plane; and
  • the three-dimensional map is generated according to the converted positioning data set.
  • the determining the converted positioning data set according to the height of the terminal device and multiple initial positioning data sets includes:
  • the generating the three-dimensional map according to the converted positioning data set includes:
  • the method also includes:
  • a positioning device based on a drone is provided, which is applied to a terminal device, and the device includes:
  • a first acquisition module, configured to acquire the three-dimensional map of the preset area sent by the control platform, where the three-dimensional map is determined according to a plurality of initial positioning data sets and the height of the terminal device, each of the initial positioning data sets corresponds to a collection trajectory in the preset area, the initial positioning data set includes a plurality of initial positioning data collected by the UAV according to the corresponding collection trajectory, each of the initial positioning data includes point cloud data and image data, and the projections of the collection trajectories on the horizontal plane are the same;
  • the second acquisition module is used to acquire target positioning data, and the target positioning data includes point cloud data and image data collected by the terminal device at the current moment;
  • a determining module configured to determine the target pose of the terminal device in the three-dimensional map according to the three-dimensional map and the target positioning data.
  • each of the initial positioning data sets includes a plurality of initial positioning data collected by the UAV in oblique photography.
  • the determination module includes:
  • the feature extraction submodule is used to determine the three-dimensional environment features of the environment where the terminal device is located at the current moment according to the target positioning data, and the three-dimensional environment features include point cloud features and image features;
  • a matching submodule configured to match the 3D environment features with the 3D map to determine the target pose.
  • the device also includes:
  • a control module configured to, after the target pose of the terminal device in the three-dimensional map is determined according to the three-dimensional map and the target positioning data, determine, according to the target pose and the location information of the destination of the terminal device and according to a preset path planning algorithm, a target trajectory that satisfies a specified constraint condition, and move according to the target trajectory.
  • the specified constraint condition is determined according to the projection of the collection trajectory on the horizontal plane.
  • the control module is configured to:
  • determine at least one initial trajectory according to the target pose and the location information of the destination and according to the preset path planning algorithm; and use, as the target trajectory, the initial trajectory with the highest matching degree with the projection of the collection trajectory on the horizontal plane.
  • a positioning device based on a drone is provided, which is applied to a control platform, and the device includes:
  • an acquisition module configured to acquire a plurality of initial positioning data sets sent by the UAV, where each of the initial positioning data sets corresponds to a collection trajectory in a preset area, the initial positioning data set includes a plurality of initial positioning data collected by the UAV according to the corresponding collection trajectory, each of the initial positioning data includes point cloud data and image data, and the projection of each of the collection trajectories on the horizontal plane is the same;
  • a determining module configured to determine the three-dimensional map of the preset area according to the plurality of initial positioning data sets and the height of the terminal device;
  • a sending module configured to send the three-dimensional map to the terminal device, so that the terminal device determines the target pose of the terminal device in the three-dimensional map according to the three-dimensional map and target positioning data, where the target positioning data includes point cloud data and image data collected by the terminal device at the current moment.
  • each of the initial positioning data sets includes a plurality of initial positioning data collected by the UAV in oblique photography.
  • the determination module includes:
  • a first determination sub-module configured to determine a converted positioning data set according to the height of the terminal device and a plurality of the initial positioning data sets, where the converted positioning data set includes a plurality of converted positioning data on a converted collection track, the converted positioning data includes point cloud data and image data, the height of the converted collection track is the same as the height of the terminal device, and the projection of the converted collection track on the horizontal plane is the same as the projection of each of the collection tracks on the horizontal plane;
  • the second determining submodule is configured to generate the three-dimensional map according to the converted positioning data set.
  • the first determining submodule is configured to:
  • for each of the initial positioning data sets, determine, according to each initial positioning data included in the initial positioning data set and the height of the terminal device, an initial converted positioning data set corresponding to the initial positioning data set; and fuse the initial converted positioning data sets corresponding to the initial positioning data sets to obtain the converted positioning data set;
  • the second determining submodule is used for:
  • the acquisition module is also used for:
  • a non-transitory computer-readable storage medium on which a computer program is stored, where when the program is executed by a processor, the steps of the method described in the first aspect of the embodiments of the present disclosure are implemented.
  • an electronic device including:
  • a processor configured to execute the computer program in the memory, so as to implement the steps of the method described in the first aspect of the embodiments of the present disclosure.
  • a non-transitory computer-readable storage medium on which a computer program is stored, where when the program is executed by a processor, the steps of the method described in the second aspect of the embodiments of the present disclosure are implemented.
  • an electronic device including:
  • a processor configured to execute the computer program in the memory, so as to implement the steps of the method described in the second aspect of the embodiments of the present disclosure.
  • a computer program product including a computer program executable by a programmable device, where the computer program, when executed by the programmable device, performs the steps of the method described in the first aspect of the embodiments of the present disclosure.
  • a computer program product including a computer program executable by a programmable device, where the computer program, when executed by the programmable device, performs the steps of the method described in the second aspect of the embodiments of the present disclosure.
  • the control platform in the present disclosure first acquires a plurality of initial positioning data sets sent by the UAV, where each initial positioning data set includes a plurality of initial positioning data collected by the UAV according to the corresponding collection trajectory; then, according to the multiple initial positioning data sets and the height of the terminal device, it determines the three-dimensional map of the preset area, and finally sends the three-dimensional map to the terminal device.
  • the terminal device determines the target pose of the terminal device in the three-dimensional map according to the three-dimensional map and the target positioning data collected at the current moment.
  • This disclosure utilizes the initial positioning data sets collected by drones on multiple collection tracks to determine a three-dimensional map, which can improve the accuracy, collection efficiency, and scope of application of the three-dimensional map.
  • the pose of the terminal device is determined through the three-dimensional map, which can improve the positioning accuracy of the terminal device.
  • Fig. 1 is a schematic diagram of a positioning system of a terminal device according to an exemplary embodiment
  • Fig. 2 is a flow chart of a positioning method based on a drone shown according to an exemplary embodiment
  • Fig. 3 is a flow chart of another UAV-based positioning method shown according to an exemplary embodiment
  • Fig. 4 is a flow chart of another UAV-based positioning method shown according to an exemplary embodiment
  • Fig. 5 is a flow chart of another UAV-based positioning method shown according to an exemplary embodiment
  • Fig. 6 is a flow chart of a positioning method based on a drone shown according to an exemplary embodiment
  • Fig. 7 is a flow chart of another UAV-based positioning method shown according to an exemplary embodiment
  • Fig. 8 is a flow chart of another drone-based positioning method shown according to an exemplary embodiment
  • Fig. 9 is a block diagram of a positioning device based on a drone according to an exemplary embodiment
  • Fig. 10 is a block diagram of another drone-based positioning device according to an exemplary embodiment
  • Fig. 11 is a block diagram of another drone-based positioning device according to an exemplary embodiment
  • Fig. 12 is a block diagram of a drone-based positioning device according to an exemplary embodiment
  • Fig. 13 is a block diagram of another drone-based positioning device according to an exemplary embodiment
  • Fig. 14 is a block diagram of an electronic device according to an exemplary embodiment
  • Fig. 15 is a block diagram of an electronic device according to an exemplary embodiment.
  • the application scenario may be a terminal device positioning system, which includes a drone, a terminal device and a control platform, as shown in FIG. 1 .
  • The UAVs can be equipped with information collection devices, including but not limited to image collection devices (such as a depth camera or a binocular camera), laser radar, etc., which are used to collect the initial positioning data sets mentioned below.
  • There can be one or more terminal devices, and a terminal device can be any device that needs to be positioned within the preset area, such as a robot, which can be any kind of smart device, such as a sweeping robot, a smart assistant or a robotic arm, which is not specifically limited in the present disclosure.
  • The terminal equipment is also equipped with information acquisition devices, including but not limited to image acquisition devices (such as a depth camera or a binocular camera), laser radar, an IMU (Inertial Measurement Unit), etc., for collecting the target positioning data mentioned below.
  • the control platform can be understood as a server or cloud platform, which is used to generate, store, and update the three-dimensional map mentioned later.
  • The wireless communication protocols can include but are not limited to: 5G (the fifth generation mobile communication technology), 4G (the fourth generation mobile communication technology), WLAN (Wireless Local Area Network), etc.
  • Fig. 2 is a flow chart of a positioning method based on a drone according to an exemplary embodiment. As shown in Fig. 2, the method is applied to a terminal device and includes the following steps:
  • Step 101 obtain the three-dimensional map of the preset area sent by the control platform, where the three-dimensional map is determined according to multiple initial positioning data sets and the height of the terminal device, each initial positioning data set corresponds to a collection track in the preset area, and the initial positioning data set includes multiple initial positioning data collected by the UAV according to the corresponding collection track.
  • Each initial positioning data includes point cloud data and image data, and the projection of each acquisition trajectory on the horizontal plane is the same.
  • the terminal device can obtain a three-dimensional map of the preset area from the control platform.
  • the three-dimensional map can reflect the environmental information of each place in the preset area.
  • The three-dimensional map can be a visual feature map, a grid map, or a combination of a visual feature map and a grid map, which is not specifically limited in the present disclosure.
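  • The disclosure leaves the exact data layout of the three-dimensional map open. Purely as an illustration, a minimal sketch of one possible representation that combines a visual feature map and a grid map could look like the following; all class and field names are assumptions introduced for this example, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class VisualFeatureMap:
    # Illustrative layout: N landmark positions in the map frame and one
    # descriptor per landmark (e.g. 128-dimensional SIFT-like vectors).
    points_xyz: np.ndarray    # shape (N, 3)
    descriptors: np.ndarray   # shape (N, D)

@dataclass
class GridMap:
    # Illustrative occupancy grid with a resolution in metres per cell.
    occupancy: np.ndarray     # shape (H, W), occupancy probabilities in [0, 1]
    resolution: float
    origin_xy: np.ndarray     # map-frame coordinates of cell (0, 0)

@dataclass
class ThreeDimensionalMap:
    # The disclosure allows a visual feature map, a grid map, or both.
    visual: Optional[VisualFeatureMap] = None
    grid: Optional[GridMap] = None
```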
  • the control platform can first obtain multiple initial positioning data sets sent by the drone, and combine the height of the terminal device to generate a three-dimensional map suitable for the terminal device.
  • each initial positioning data set corresponds to a collection track in the preset area
  • the projection of each collection track on the horizontal plane is the same
  • The height of each collection track can be different; that is to say, the multiple collection tracks are distributed at different heights within the preset area.
  • the initial positioning data set includes multiple initial positioning data collected by the UAV according to the corresponding collection trajectory. It can be understood that the collection trajectory includes multiple collection locations. An initial positioning data is collected at each collection position, and the initial positioning data includes point cloud data and image data collected by the UAV at the collection position.
  • Since the UAV can fly in any direction in space, compared with terminal equipment that can only move on the ground, the UAV has a larger range of activity and fewer airspace restrictions, and can effectively traverse every place in the preset area. Moreover, because the UAV moves with high flexibility and at high speed, it can quickly traverse every place in the preset area. Therefore, the accuracy of the three-dimensional map determined from the initial positioning data sets collected by the UAV is high, and the efficiency of generating the three-dimensional map is high. Furthermore, the 3D map is determined based on multiple initial positioning data sets and the height of the terminal device. The multiple initial positioning data sets can reflect the environmental information collected by the UAV at different heights; by observing the preset area at different heights, more abundant point cloud data and image data can be obtained.
  • the height of the terminal device can be understood as the height of the information collection device set on the terminal device, that is, the height at which the terminal device collects the target positioning data mentioned later.
  • Step 102 acquiring target positioning data, the target positioning data includes point cloud data and image data collected by the terminal device at the current moment.
  • Step 103 according to the three-dimensional map and the target positioning data, determine the target pose of the terminal device in the three-dimensional map.
  • the terminal device can collect point cloud data and image data at the current moment through the information collection device set on it as the target positioning data.
  • the target positioning data can reflect the environmental information of the current environment where the terminal device is located.
  • the terminal device can determine the target pose of the terminal device in the 3D map according to the 3D map and the target positioning data.
  • The target pose can include the position (coordinate values) and posture (direction or angle) of the terminal device in the 3D map.
  • the target positioning data can be matched with the three-dimensional map, so as to determine the target pose according to the position in the three-dimensional map with the highest matching degree with the target positioning data. Due to the high accuracy of the three-dimensional map, correspondingly, determining the pose of the terminal device through the three-dimensional map can improve the positioning accuracy of the terminal device.
  • each initial positioning data set includes a plurality of initial positioning data collected by the UAV in oblique photography.
  • The initial positioning data included in each initial positioning data set may be collected by the UAV by means of oblique photography; that is, each initial positioning data includes the point cloud data and image data collected by the UAV, using oblique photography, at a collection position on the corresponding collection trajectory.
  • Multiple initial positioning data sets collected by oblique photography can reflect the environmental information collected by the UAV at different heights and angles, providing richer and denser point cloud data and image data, thereby further improving the accuracy of the 3D map.
  • Fig. 3 is a flow chart of another UAV-based positioning method shown according to an exemplary embodiment. As shown in Fig. 3 , the implementation of step 103 may include:
  • Step 1031 according to the target positioning data, determine the three-dimensional environment features of the environment where the terminal device is currently located, and the three-dimensional environment features include point cloud features and image features.
  • Step 1032 matching the 3D environment features with the 3D map to determine the target pose.
  • Specifically, feature extraction may be performed on the target positioning data to obtain 3D environment features that can reflect the current environment of the terminal device, where the 3D environment features may include point cloud features extracted from the point cloud data included in the target positioning data and image features extracted from the image data included in the target positioning data. Afterwards, the 3D environment features can be matched with the 3D map to determine the target pose.
  • If the 3D map is a visual feature map, image features can be extracted from the image data included in the target positioning data; the image features can be feature points (for example, SIFT feature points or SURF feature points), and the extracted image features are then matched against the visual feature map to determine the target pose.
  • If the 3D map is a grid map, point cloud features can be extracted from the point cloud data included in the target positioning data.
  • The point cloud features can be, for example, geometric features or intensity features, and the extracted point cloud features are then matched with the grid map to determine the target pose.
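  • The disclosure does not fix a particular matching algorithm. As one hedged, illustrative realisation of the visual-feature-map branch described above, the sketch below matches query descriptors against a visual feature map and recovers the pose with a PnP solver; OpenCV is assumed to be available, and all function and variable names are illustrative rather than part of the disclosure.

```python
import numpy as np
import cv2  # assumed available; any descriptor matcher / PnP solver would do

def estimate_target_pose(map_points_xyz, map_descriptors,
                         query_keypoints_uv, query_descriptors,
                         camera_matrix):
    """Match image features against the visual feature map and solve a PnP
    problem to obtain the terminal device's position and attitude in the map."""
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(query_descriptors.astype(np.float32),
                           map_descriptors.astype(np.float32), k=2)
    # Lowe-style ratio test to keep only distinctive matches.
    good = [p[0] for p in knn if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    if len(good) < 6:
        return None  # not enough correspondences to estimate a pose

    obj_pts = np.float32([map_points_xyz[m.trainIdx] for m in good])
    img_pts = np.float32([query_keypoints_uv[m.queryIdx] for m in good])
    ok, rvec, tvec, _ = cv2.solvePnPRansac(obj_pts, img_pts, camera_matrix, None)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)        # world-to-camera rotation
    position = (-R.T @ tvec).ravel()  # camera position in the map frame
    return position, R.T              # target pose: position and attitude
```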
  • Fig. 4 is a flowchart of another UAV-based positioning method shown according to an exemplary embodiment. As shown in Fig. 4, after step 103, the method may further include:
  • Step 104 according to the target pose and the location information of the destination of the terminal device, and according to the preset path planning algorithm, determine the target trajectory that satisfies the specified constraint condition, and move according to the target trajectory, where the specified constraint condition is determined according to the projection of the collection trajectory on the horizontal plane.
  • the target trajectory can also be planned according to the target pose and the preset destination location information of the terminal device, and the terminal device can be controlled to move according to the target trajectory.
  • the destination can be dynamically adjusted according to the tasks to be performed by the terminal device, or can be preset according to specific requirements, which is not limited in the present disclosure.
  • Specifically, the position indicated by the target pose can be used as the starting point and the destination as the end point, and the target trajectory satisfying the specified constraints can be determined according to the preset path planning algorithm, where the path planning algorithm can be, for example, a graph search method, the RRT (Rapidly Exploring Random Tree) algorithm, the artificial potential field method, etc., which is not limited in this disclosure.
  • The specified constraint condition is determined according to the projection of any collection trajectory on the horizontal plane and is used to ensure that the target trajectory matches the projection of the collection trajectory on the horizontal plane as closely as possible, that is to say, to ensure that the target trajectory matches the three-dimensional map as closely as possible, so that during the process of the terminal device moving according to the target trajectory, the accuracy of positioning according to the three-dimensional map is high.
  • the specified constraint condition may be, for example, that the matching degree with the projection of the collection trajectory on the horizontal plane is the highest, or that the matching degree with the projection of the collection trajectory on the horizontal plane satisfies a preset matching degree threshold.
  • Fig. 5 is a flow chart of another UAV-based positioning method shown according to an exemplary embodiment. As shown in Fig. 5, step 104 may be implemented through the following steps:
  • Step 1041 Determine at least one initial trajectory according to the target pose and the location information of the destination according to a preset path planning algorithm.
  • Step 1042 The initial trajectory with the highest matching degree with the projection of the collected trajectory on the horizontal plane is taken as the target trajectory.
  • Specifically, the position indicated by the target pose can be used as the starting point and the destination as the end point, and then the starting point, the end point and the three-dimensional map are input into the preset path planning algorithm, so that the path planning algorithm outputs at least one initial trajectory, where each initial trajectory has the same start and end points.
  • the matching degree of each initial trajectory and the projection of the collected trajectory on the horizontal plane is sequentially determined, and finally the initial trajectory with the highest matching degree can be used as the target trajectory.
  • a preset number of initial trajectories with the highest matching degrees may also be used as target trajectories, and initial trajectories whose matching degree meets a preset matching degree threshold may also be used as target trajectories, which is not specifically limited in the present disclosure.
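  • The disclosure also leaves open how the matching degree between a candidate trajectory and the projection of the collection trajectory is computed. One simple proxy, sketched below purely as an assumption, scores each initial trajectory by the mean distance of its waypoints to the projected collection trajectory and keeps the best-scoring one, as in step 1042.

```python
import numpy as np

def matching_degree(candidate_xy, projection_xy):
    """Higher is better: the negative mean nearest-point distance between the
    candidate trajectory and the collection trajectory's horizontal projection."""
    d = np.linalg.norm(candidate_xy[:, None, :] - projection_xy[None, :, :], axis=-1)
    return -d.min(axis=1).mean()

def select_target_trajectory(initial_trajectories_xy, projection_xy):
    """Pick, among the initial trajectories output by the path planning
    algorithm, the one with the highest matching degree (all inputs are
    (N, 2) arrays of horizontal coordinates)."""
    scores = [matching_degree(t, projection_xy) for t in initial_trajectories_xy]
    return initial_trajectories_xy[int(np.argmax(scores))]
```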
  • the control platform first obtains multiple initial positioning data sets sent by the UAV, where each initial positioning data set includes multiple initial positioning data collected by the UAV according to the corresponding collection trajectory; then, according to the multiple initial positioning data sets and the height of the terminal device, it determines the three-dimensional map of the preset area, and finally sends the three-dimensional map to the terminal device.
  • the terminal device determines the target pose of the terminal device in the three-dimensional map according to the three-dimensional map and the target positioning data collected at the current moment.
  • This disclosure utilizes the initial positioning data sets collected by drones on multiple collection tracks to determine a three-dimensional map, which can improve the accuracy, collection efficiency, and scope of application of the three-dimensional map.
  • the pose of the terminal device is determined through the three-dimensional map, which can improve the positioning accuracy of the terminal device.
  • Fig. 6 is a flowchart of a positioning method based on a drone according to an exemplary embodiment. As shown in Fig. 6, the method is applied to a control platform, including:
  • Step 201 obtain multiple initial positioning data sets sent by the UAV, where each initial positioning data set corresponds to a collection trajectory in a preset area, the initial positioning data set includes multiple initial positioning data collected by the UAV according to the corresponding collection trajectory, each initial positioning data includes point cloud data and image data, and the projection of each collection trajectory on the horizontal plane is the same.
  • the control platform is used to generate, store, and update a three-dimensional map within a preset area.
  • the three-dimensional map can reflect the environmental information of each place in the preset area.
  • the three-dimensional map can be a visual feature map, a grid map, or a combination of a visual feature map and a grid map, which is not specifically limited in this disclosure.
  • The control platform first obtains multiple initial positioning data sets sent by the UAV. Among them, each initial positioning data set corresponds to a collection track in the preset area, the projection of each collection track on the horizontal plane is the same, and the height of each collection track can be different; that is to say, the multiple collection tracks are distributed at different heights within the preset area.
  • the initial positioning data set includes multiple initial positioning data collected by the UAV according to the corresponding collection trajectory. It can be understood that the collection trajectory includes multiple collection locations. An initial positioning data is collected at each collection position, and the initial positioning data includes point cloud data and image data collected by the UAV at the collection position.
  • Since the UAV can fly in any direction in space, compared with terminal equipment that can only move on the ground, the UAV has a larger range of activity and fewer airspace restrictions, and can effectively traverse every place in the preset area. Moreover, because the UAV moves with high flexibility and at high speed, it can quickly traverse every place in the preset area. Therefore, it is more efficient to collect the initial positioning data sets by UAV, and the accuracy of the information contained is higher.
  • multiple initial positioning data sets can reflect the environmental information collected by UAVs at different heights. It can be understood that UAVs can obtain more abundant point cloud data and image data when observing preset areas at different heights. That is to say, multiple initial positioning data sets contain more information, and can reflect the environmental information of each collection location in a denser and finer-grained manner.
  • Step 202 determine a three-dimensional map of the preset area according to multiple initial positioning data sets and the height of the terminal device.
  • Step 203 Send the three-dimensional map to the terminal device, so that the terminal device determines the target pose of the terminal device in the three-dimensional map according to the three-dimensional map and target positioning data.
  • The target positioning data includes point cloud data and image data collected by the terminal device at the current moment.
  • the control platform can combine the height of the terminal device to generate a three-dimensional map of the preset area.
  • The three-dimensional map is suitable for the height of the terminal device; that is, the three-dimensional map can reflect the environmental information of each place in the preset area as observed at the height of the terminal device.
  • the height of the terminal device can be understood as the height of the information collection device provided on the terminal device, that is, the height at which the terminal device collects target positioning data.
  • The initial positioning data included in each initial positioning data set are the point cloud data and image data collected by the UAV on the corresponding collection track; that is to say, each initial positioning data set is obtained by the UAV observing at the height corresponding to the collection track.
  • If one of the multiple collection tracks has the same height as the terminal device, 3D reconstruction can be performed directly according to the initial positioning data set corresponding to that collection track to obtain the 3D map. If there is no collection track with the same height as the terminal device among the multiple collection tracks, then the multiple initial positioning data sets can be converted according to preset rules to obtain a positioning data set observed at the height of the terminal device, and finally three-dimensional reconstruction is performed according to the positioning data set observed at the height of the terminal device to obtain the three-dimensional map.
  • Since the UAV collects the initial positioning data sets more efficiently and the information contained is of higher accuracy, the three-dimensional map determined according to the initial positioning data sets has high accuracy.
  • Furthermore, since the multiple initial positioning data sets contain more information, the environmental information of each collection location can be reflected in a denser and finer-grained manner. On this basis, combined with the height of the terminal equipment, a 3D map suitable for the terminal equipment is obtained, which can further improve the accuracy of the 3D map.
  • the three-dimensional map can be stored.
  • The control platform may send the stored 3D map to the terminal device when the terminal device requests the 3D map, or may send the stored 3D map to the terminal device according to a preset first cycle (for example, every 60 minutes). Further, the UAV can also collect multiple initial positioning data sets according to a preset second cycle (for example, every 24 hours) and send them to the control platform.
  • When the control platform receives new initial positioning data sets, it can repeat step 201 to step 203 to update the three-dimensional map, and send the updated three-dimensional map to the terminal device.
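  • As a hedged illustration of the update behaviour just described (the first cycle of, for example, 60 minutes, and the repetition of steps 201 to 203 when new data arrive), the following sketch shows one possible scheduling loop; every method on the hypothetical `platform` object is an assumption introduced only for this example.

```python
import time

FIRST_CYCLE_S = 60 * 60  # e.g. push the stored 3D map to terminal devices every 60 minutes

def control_platform_loop(platform):
    """Illustrative scheduling only: rebuild the map whenever the UAV delivers
    new initial positioning data sets, and push the stored map periodically."""
    last_push = 0.0
    while True:
        now = time.time()
        if platform.has_new_initial_positioning_data():
            platform.rebuild_three_dimensional_map()   # repeat steps 201 to 203
            platform.send_map_to_terminal_devices()    # push the updated map
            last_push = now
        elif now - last_push >= FIRST_CYCLE_S:
            platform.send_map_to_terminal_devices()    # periodic push
            last_push = now
        time.sleep(1.0)
```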
  • After the terminal device receives the 3D map, it can collect the point cloud data and image data at the current moment through the information collection device set on it as the target positioning data, and then determine the target pose of the terminal device in the 3D map according to the 3D map and the target positioning data.
  • The target pose may include the position (coordinate values) and posture (direction or angle) of the terminal device in the three-dimensional map.
  • the terminal device can match the target positioning data with the three-dimensional map, so as to determine the target pose according to the position in the three-dimensional map with the highest matching degree with the target positioning data. Due to the high accuracy of the three-dimensional map, correspondingly, determining the pose of the terminal device through the three-dimensional map can improve the positioning accuracy of the terminal device.
  • each initial positioning data set includes a plurality of initial positioning data collected by the UAV in oblique photography.
  • The initial positioning data included in each initial positioning data set may be collected by the UAV by means of oblique photography; that is, each initial positioning data includes the point cloud data and image data collected by the UAV, using oblique photography, at a collection position on the corresponding collection trajectory.
  • Multiple initial positioning data sets collected by oblique photography can reflect the environmental information collected by the UAV at different heights and angles, providing richer and denser point cloud data and image data, thereby further improving the accuracy of the 3D map.
  • Fig. 7 is a flow chart of another UAV-based positioning method shown according to an exemplary embodiment. As shown in Fig. 7, the height of each collection track is different, and step 202 may include the following steps:
  • Step 2021 according to the height of the terminal device and the multiple initial positioning data sets, determine the converted positioning data set, where the converted positioning data set includes multiple converted positioning data on the converted collection track, the converted positioning data includes point cloud data and image data, the height of the converted collection track is the same as that of the terminal device, and the projection of the converted collection track on the horizontal plane is the same as the projection of each collection track on the horizontal plane.
  • Step 2022 generate a three-dimensional map according to the converted positioning data set.
  • The converted positioning data set can be understood as the positioning data set that would be obtained by observing at the height of the terminal device, and it includes multiple converted positioning data on the converted collection track.
  • the conversion collection track includes a plurality of conversion collection positions, which correspond to the collection positions included in any collection track one by one.
  • Each converted positioning data includes point cloud data and image data collected at a converted collection position at the height of the terminal device.
  • the height of the converted collection track is the same as that of the terminal device, and the projection of the converted collection track on the horizontal plane is the same as the projection of each collection track on the horizontal plane. That is to say, the converted acquisition trajectory is parallel to any acquisition trajectory, and the corresponding height is the same as the height of the terminal device.
  • If one of the multiple collection tracks has the same height as the terminal device, that collection track can be used directly as the converted collection track. If there is no collection track with the same height as the terminal device among the multiple collection tracks, then the multiple collection tracks may be converted according to a preset rule to obtain the converted collection track. Finally, 3D reconstruction can be performed according to the converted positioning data set, using a preset 3D reconstruction algorithm, to obtain the 3D map.
  • step 2021 may be implemented through the following steps:
  • Step 1) For each initial positioning data set, according to each initial positioning data included in the initial positioning data set and the height of the terminal device, determine an initial converted positioning data set corresponding to the initial positioning data set.
  • Step 2) Fusing the initial converted positioning data set corresponding to each initial positioning data set to obtain the converted positioning data set.
  • the initial converted positioning data set corresponding to the initial positioning data set may be determined according to each initial positioning data included in the initial positioning data set and the height of the terminal device.
  • For example, suppose the height of the collection trajectory corresponding to a certain initial positioning data set is 5 m and the set includes 100 initial positioning data; that is to say, each initial positioning data is observed by the drone at a height of 5 m.
  • If the height of the terminal device is 2 m, then each of the 100 initial positioning data can be converted, according to the triangular transformation method, into positioning data observed at a height of 2 m, so as to obtain the initial converted positioning data set corresponding to that initial positioning data set.
  • The initial converted positioning data set includes 100 initial converted positioning data, that is, the initial converted positioning data correspond to the initial positioning data one by one.
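  • The "triangular transformation method" is not detailed in the disclosure. Under the assumption that each initial positioning datum stores its point cloud in the sensor frame and that the converted viewpoint lies directly below the original one (same horizontal position and attitude), the per-datum conversion reduces to a vertical shift, as in this sketch (all names are illustrative).

```python
import numpy as np

def convert_positioning_data(points_sensor_frame, track_height, terminal_height):
    """Re-express a point cloud captured from the collection height (e.g. 5 m)
    as if it had been observed from the terminal device's height (e.g. 2 m),
    assuming the two viewpoints differ only by a vertical translation."""
    dz = track_height - terminal_height      # e.g. 5 m - 2 m = 3 m
    converted = points_sensor_frame.copy()
    converted[:, 2] += dz                    # points sit higher relative to a lower sensor
    return converted

# Example: one of the 100 initial positioning data of the 5 m collection track,
# converted for a terminal device whose sensors sit at 2 m.
cloud_5m = np.array([[1.0, 0.0, -5.0], [2.0, 1.0, -4.2]])
cloud_2m = convert_positioning_data(cloud_5m, track_height=5.0, terminal_height=2.0)
# cloud_2m[:, 2] is now [-2.0, -1.2]
```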
  • multiple initial converted positioning data sets can be fused according to the preset rules to obtain the converted positioning data set.
  • The converted collection positions on the converted collection track correspond one-to-one to the collection positions included in any collection track.
  • the number of converted positioning data included in the converted positioning data set is the same as the number of initial positioning data included in any initial positioning data set.
  • For example, the multiple initial converted positioning data sets may be averaged to obtain the converted positioning data set.
  • The weight corresponding to each initial converted positioning data set can be determined according to the height difference between the height of the collection trajectory corresponding to that initial converted positioning data set and the height of the terminal device; for example, the weight can be inversely correlated with the height difference.
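  • As a hedged illustration of the inverse-height-difference weighting suggested above (the actual fusion rule is left open by the disclosure), the sketch below fuses the corresponding converted positioning data of several initial converted positioning data sets, giving tracks whose height is closer to the terminal device a larger weight.

```python
import numpy as np

def fuse_converted_sets(converted_sets, track_heights, terminal_height):
    """Weighted fusion of initial converted positioning data sets.

    converted_sets: one array per collection track, all with the same shape
        (num_positions, ...) so that positions correspond one to one.
    track_heights: the original height of each collection track.
    """
    diffs = np.abs(np.asarray(track_heights, dtype=float) - terminal_height)
    weights = 1.0 / (1.0 + diffs)               # illustrative inverse correlation
    weights /= weights.sum()                    # normalise so the weights sum to 1
    stacked = np.stack(converted_sets, axis=0)  # (num_tracks, num_positions, ...)
    return np.tensordot(weights, stacked, axes=([0], [0]))  # weighted average

# Example: tracks at 5 m, 8 m and 12 m fused for a terminal device at 2 m.
sets = [np.random.rand(100, 3) for _ in range(3)]
fused = fuse_converted_sets(sets, track_heights=[5.0, 8.0, 12.0], terminal_height=2.0)
```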
  • The implementation of step 2022 may be: according to a preset 3D reconstruction algorithm, 3D reconstruction is performed according to the converted positioning data set to obtain the 3D map.
  • For example, 3D reconstruction can be performed according to the converted positioning data set and the converted collection trajectory, using a preset 3D reconstruction algorithm, to obtain the 3D map.
  • the collection trajectory includes not only the coordinates of each collection location on the collection trajectory, but also the posture of the UAV when it is at the collection location.
  • Correspondingly, the converted collection trajectory also includes the coordinates of each converted collection position, and the predicted attitude of the terminal device when it is located at the converted collection position.
  • the predicted posture of the terminal device at the converted collection position may be the same as that of the UAV when it is located at the collection position corresponding to the converted collection position.
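  • A minimal sketch of how the converted collection trajectory described above might be assembled, assuming each pose is stored as horizontal coordinates, a height and an attitude (all names are illustrative assumptions):

```python
def build_converted_trajectory(collection_trajectory, terminal_height):
    """Same horizontal coordinates and attitude as the UAV's collection
    trajectory, but at the height of the terminal device."""
    converted = []
    for pose in collection_trajectory:       # pose: dict with "x", "y", "z", "attitude"
        converted.append({
            "x": pose["x"],                  # same projection on the horizontal plane
            "y": pose["y"],
            "z": terminal_height,            # height of the terminal device
            "attitude": pose["attitude"],    # predicted attitude = UAV attitude
        })
    return converted
```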
  • Fig. 8 is a flow chart of another UAV-based positioning method shown according to an exemplary embodiment. As shown in Fig. 8, the method may also include:
  • Step 204 acquiring the position of the positioning sensor set on the terminal device.
  • Step 205 determine the height of the terminal device according to the position of the positioning sensor.
  • the position of the positioning sensor set on the terminal device can be obtained first, and then the height of the terminal device can be determined according to the position of the positioning sensor.
  • The positioning sensor can be understood as an information collection device provided on the terminal device, including but not limited to an image collection device (such as a depth camera or a binocular camera), a laser radar, an IMU, and the like. If only one positioning sensor is provided on the terminal device, or multiple positioning sensors are at the same position, then the height of the positioning sensor can be directly determined according to the position of the positioning sensor and used as the height of the terminal device. If multiple positioning sensors are installed on the terminal device and their positions are not the same, then the height of each positioning sensor can be determined according to its position, and the average of the heights of the multiple positioning sensors can be used as the height of the terminal device.
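  • For instance, with several sensors mounted at different heights, the terminal device height could be computed as their mean, as in this trivial sketch of the rule just described (the sensor heights are made-up example values):

```python
def terminal_device_height(sensor_heights):
    """Single sensor: its own height; several sensors: the average height."""
    if len(sensor_heights) == 1:
        return sensor_heights[0]
    return sum(sensor_heights) / len(sensor_heights)

# Example: a depth camera at 1.2 m, a lidar at 0.9 m and an IMU at 0.6 m.
height = terminal_device_height([1.2, 0.9, 0.6])   # 0.9 m
```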
  • the control platform first obtains multiple initial positioning data sets sent by the UAV, where each initial positioning data set includes multiple initial positioning data collected by the UAV according to the corresponding collection trajectory; then, according to the multiple initial positioning data sets and the height of the terminal device, it determines the three-dimensional map of the preset area, and finally sends the three-dimensional map to the terminal device.
  • the terminal device determines the target pose of the terminal device in the three-dimensional map according to the three-dimensional map and the target positioning data collected at the current moment.
  • This disclosure utilizes the initial positioning data sets collected by drones on multiple collection tracks to determine a three-dimensional map, which can improve the accuracy, collection efficiency, and scope of application of the three-dimensional map.
  • the pose of the terminal device is determined through the three-dimensional map, which can improve the positioning accuracy of the terminal device.
  • Fig. 9 is a block diagram of a positioning device based on a drone according to an exemplary embodiment. As shown in Fig. 9, the device 300 is applied to a terminal device, including:
  • the first acquisition module 301 is configured to acquire the three-dimensional map of the preset area sent by the control platform, where the three-dimensional map is determined according to multiple initial positioning data sets and the height of the terminal device, each initial positioning data set corresponds to a collection trajectory in the preset area, the initial positioning data set includes a plurality of initial positioning data collected by the UAV according to the corresponding collection trajectory, each initial positioning data includes point cloud data and image data, and the projection of each collection trajectory on the horizontal plane is the same.
  • the second acquisition module 302 is configured to acquire target positioning data, and the target positioning data includes point cloud data and image data collected by the terminal device at the current moment.
  • the determination module 303 is configured to determine the target pose of the terminal device in the three-dimensional map according to the three-dimensional map and target positioning data.
  • each initial positioning data set includes a plurality of initial positioning data collected by the UAV in oblique photography.
  • Fig. 10 is a block diagram of another UAV-based positioning device according to an exemplary embodiment.
  • the determination module 303 may include:
  • the feature extraction sub-module 3031 is configured to determine the three-dimensional environment features of the environment where the terminal device is currently located according to the target positioning data.
  • the three-dimensional environment features include point cloud features and image features.
  • the matching sub-module 3032 is used to match the 3D environment features with the 3D map to determine the target pose.
  • Fig. 11 is a block diagram of another drone-based positioning device according to an exemplary embodiment. As shown in Fig. 11, the device 300 may also include:
  • the control module 304 is configured to, after the target pose of the terminal device in the three-dimensional map is determined according to the three-dimensional map and the target positioning data, determine, according to the target pose and the location information of the destination of the terminal device and according to a preset path planning algorithm, the target trajectory that satisfies the specified constraint conditions, and move according to the target trajectory.
  • the specified constraint conditions are determined according to the projection of the collected trajectory on the horizontal plane.
  • control module 304 may be used to:
  • At least one initial trajectory is determined according to a preset path planning algorithm.
  • the initial trajectory with the highest matching degree with the projection of the collected trajectory on the horizontal plane is taken as the target trajectory.
  • the control platform first obtains multiple initial positioning data sets sent by the UAV, where each initial positioning data set includes multiple initial positioning data collected by the UAV according to the corresponding collection trajectory; then, according to the multiple initial positioning data sets and the height of the terminal device, it determines the three-dimensional map of the preset area, and finally sends the three-dimensional map to the terminal device.
  • the terminal device determines the target pose of the terminal device in the three-dimensional map according to the three-dimensional map and the target positioning data collected at the current moment.
  • This disclosure utilizes the initial positioning data sets collected by drones on multiple collection tracks to determine a three-dimensional map, which can improve the accuracy, collection efficiency, and scope of application of the three-dimensional map.
  • the pose of the terminal device is determined through the three-dimensional map, which can improve the positioning accuracy of the terminal device.
  • Fig. 12 is a block diagram of a positioning device based on a drone according to an exemplary embodiment. As shown in Fig. 12, the device 400 is applied to a control platform, including:
  • the acquisition module 401 is configured to acquire multiple initial positioning data sets sent by the UAV, where each initial positioning data set corresponds to a collection trajectory in a preset area, the initial positioning data set includes multiple initial positioning data collected by the UAV according to the corresponding collection trajectory, each initial positioning data includes point cloud data and image data, and the projection of each collection trajectory on the horizontal plane is the same.
  • the determining module 402 is configured to determine a three-dimensional map of a preset area according to multiple initial positioning data sets and the height of the terminal device.
  • the sending module 403 is configured to send the three-dimensional map to the terminal device, so that the terminal device determines the target pose of the terminal device in the three-dimensional map according to the three-dimensional map and target positioning data, and the target positioning data includes points collected by the terminal device at the current moment Cloud data and image data.
  • each initial positioning data set includes a plurality of initial positioning data collected by the UAV in oblique photography.
  • Fig. 13 is a block diagram of another UAV-based positioning device according to an exemplary embodiment. As shown in Fig. 13 , the heights of each collection track are different.
  • Determining module 402 may include:
  • the first determination sub-module 4021 is used to determine the converted positioning data set according to the height of the terminal device and the multiple initial positioning data sets.
  • The converted positioning data set includes multiple converted positioning data on the converted collection track, and the converted positioning data includes point cloud data and image data. The height of the converted collection track is the same as the height of the terminal device, and the projection of the converted collection track on the horizontal plane is the same as the projection of each collection track on the horizontal plane.
  • the second determination sub-module 4022 is configured to generate a three-dimensional map according to the converted positioning data set.
  • the first determining submodule 4021 may be used to:
  • an initial converted positioning data set corresponding to the initial positioning data set is determined according to each initial positioning data included in the initial positioning data set and the height of the terminal device.
  • the initial converted positioning data set corresponding to each initial positioning data set is fused to obtain the converted positioning data set.
  • the second determining submodule 4022 can be used for:
  • according to a preset 3D reconstruction algorithm, 3D reconstruction is performed according to the converted positioning data set to obtain the 3D map.
  • the obtaining module 401 may also be used to:
  • acquire the position of the positioning sensor provided on the terminal device, and determine the height of the terminal device according to the position of the positioning sensor.
  • the control platform first obtains multiple initial positioning data sets sent by the UAV, where each initial positioning data set includes multiple initial positioning data collected by the UAV according to the corresponding collection trajectory; then, according to the multiple initial positioning data sets and the height of the terminal device, it determines the three-dimensional map of the preset area, and finally sends the three-dimensional map to the terminal device.
  • the terminal device determines the target pose of the terminal device in the three-dimensional map according to the three-dimensional map and the target positioning data collected at the current moment.
  • This disclosure utilizes the initial positioning data sets collected by drones on multiple collection tracks to determine a three-dimensional map, which can improve the accuracy, collection efficiency, and scope of application of the three-dimensional map.
  • the pose of the terminal device is determined through the three-dimensional map, which can improve the positioning accuracy of the terminal device.
  • Fig. 14 is a block diagram of an electronic device 500 according to an exemplary embodiment.
  • the electronic device 500 may include: a processor 501 and a memory 502 .
  • the electronic device 500 may also include one or more of a multimedia component 503 , an input/output (I/O) interface 504 , and a communication component 505 .
  • the processor 501 is used to control the overall operation of the electronic device 500, so as to complete all or part of the steps in the above-mentioned positioning method based on the drone applied to the terminal device.
  • the memory 502 is used to store various types of data to support the operation of the electronic device 500, for example, these data may include instructions for any application or method operating on the electronic device 500, and application-related data, Such as contact data, sent and received messages, pictures, audio, video, etc.
  • the memory 502 can be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, a magnetic disk or an optical disk.
  • Multimedia components 503 may include screen and audio components.
  • the screen can be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals.
  • an audio component may include a microphone for receiving external audio signals.
  • the received audio signal may be further stored in the memory 502 or sent through the communication component 505.
  • the audio component also includes at least one speaker for outputting audio signals.
  • the I/O interface 504 provides an interface between the processor 501 and other interface modules, which may be a keyboard, a mouse, buttons, and the like. These buttons can be virtual buttons or physical buttons.
  • the communication component 505 is used for wired or wireless communication between the electronic device 500 and other devices.
  • the communication component 505 may include: a Wi-Fi module, a Bluetooth module, an NFC module and the like.
  • the electronic device 500 may be implemented by one or more Application Specific Integrated Circuits (ASIC), Digital Signal Processors (DSP), Digital Signal Processing Devices (DSPD), Programmable Logic Devices (PLD), Field Programmable Gate Arrays (FPGA), controllers, microcontrollers, microprocessors or other electronic components, to implement the above-mentioned drone-based positioning method applied to the terminal device.
  • a computer-readable storage medium including program instructions is also provided.
  • when the program instructions are executed by a processor, the steps of the above-mentioned drone-based positioning method applied to the terminal device are implemented.
  • the computer-readable storage medium can be the above-mentioned memory 502 including program instructions, and the above-mentioned program instructions can be executed by the processor 501 of the electronic device 500 to complete the above-mentioned drone-based positioning method applied to the terminal device.
  • Fig. 15 is a block diagram of an electronic device 600 according to an exemplary embodiment.
  • the electronic device 600 may be provided as a server.
  • the electronic device 600 includes a processor 622, the number of which may be one or more, and a memory 632 for storing computer programs executable by the processor 622.
  • the computer program stored in memory 632 may include one or more modules each corresponding to a set of instructions.
  • the processor 622 can be configured to execute the computer program to implement the above-mentioned drone-based positioning method applied to the control platform.
  • the electronic device 600 may further include a power supply component 626 and a communication component 650, the power supply component 626 may be configured to perform power management of the electronic device 600, and the communication component 650 may be configured to implement communication of the electronic device 600, for example, wired or wireless communication.
  • the electronic device 600 may further include an input/output (I/O) interface 658.
  • the electronic device 600 can operate based on an operating system stored in the memory 632, such as Windows Server™, Mac OS X™, Unix™, Linux™, and so on.
  • a computer-readable storage medium including program instructions is also provided.
  • when the program instructions are executed by a processor, the steps of the above-mentioned UAV-based positioning method applied to the control platform are implemented.
  • the non-transitory computer-readable storage medium can be the above-mentioned memory 632 including program instructions, and the above-mentioned program instructions can be executed by the processor 622 of the electronic device 600 to complete the above-mentioned drone-based positioning method applied to the control platform.
  • a computer program product is also provided, comprising a computer program executable by a programmable device, the computer program having code portions for performing the above-mentioned UAV-based positioning method when executed by the programmable device.
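
To make the height-conversion step referenced above concrete, the following is a minimal Python sketch. It assumes that each initial positioning datum carries its point cloud in a shared world frame together with the height of the acquisition trajectory on which it was collected, and that the conversion is a simple vertical translation to the terminal device's height; the class and function names (`PositioningData`, `convert_to_terminal_height`, `trajectory_height`) are illustrative and do not come from the disclosure.

```python
from dataclasses import dataclass
from typing import List

import numpy as np


@dataclass
class PositioningData:
    """Hypothetical container for one initial positioning datum: a point cloud
    (in a shared world frame) and the height of the acquisition trajectory on
    which it was collected."""
    points: np.ndarray        # (N, 3) array of x, y, z coordinates
    trajectory_height: float  # height of the acquisition trajectory, in metres


def convert_to_terminal_height(data_set: List[PositioningData],
                               terminal_height: float) -> List[PositioningData]:
    """Shift each datum vertically so that its acquisition trajectory has the
    same height as the terminal device, leaving the horizontal projection of
    the trajectory unchanged (one possible reading of the conversion step)."""
    converted = []
    for datum in data_set:
        dz = terminal_height - datum.trajectory_height
        shifted = datum.points + np.array([0.0, 0.0, dz])
        converted.append(PositioningData(points=shifted,
                                         trajectory_height=terminal_height))
    return converted
```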
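The fusion of the initial converted positioning data sets and the subsequent reconstruction of the 3D map can be sketched in the same spirit. The snippet below merely concatenates the converted point clouds from all acquisition trajectories and collapses them onto a voxel grid to obtain a coarse map; it is one possible stand-in for a full reconstruction pipeline, not the specific algorithm of the disclosure.

```python
from typing import List

import numpy as np


def fuse_point_clouds(clouds: List[np.ndarray]) -> np.ndarray:
    """Merge the (N_i, 3) point clouds of all converted positioning data sets
    into a single (sum N_i, 3) array."""
    return np.vstack(clouds)


def voxel_map(points: np.ndarray, voxel_size: float = 0.10) -> np.ndarray:
    """Collapse the fused cloud onto a voxel grid, keeping one representative
    point (the centroid) per occupied voxel -- a crude stand-in for a full
    3D reconstruction."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    n_voxels = inverse.max() + 1
    sums = np.zeros((n_voxels, 3))
    counts = np.zeros(n_voxels)
    np.add.at(sums, inverse, points)   # sum the points falling in each voxel
    np.add.at(counts, inverse, 1.0)    # count the points per voxel
    return sums / counts[:, None]
```

A coarse map could then be produced with, for example, `voxel_map(fuse_point_clouds([d.points for d in converted_set]))`, where `converted_set` is the output of the height-conversion sketch above.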
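For the positioning step on the terminal device, a classical way to estimate the target pose from the point-cloud part of the target positioning data is point-to-point ICP registration against the 3D map. The sketch below uses nearest-neighbour correspondences and a Kabsch/SVD update and ignores the image data entirely, so it illustrates the registration idea only and should not be read as the patented method.

```python
from typing import Tuple

import numpy as np
from scipy.spatial import cKDTree


def icp_pose(map_points: np.ndarray, scan: np.ndarray,
             iterations: int = 30) -> Tuple[np.ndarray, np.ndarray]:
    """Estimate the rigid transform (R, t) that places the terminal device's
    current scan into the 3D map, using point-to-point ICP."""
    tree = cKDTree(map_points)         # nearest-neighbour search structure
    R, t = np.eye(3), np.zeros(3)      # accumulated rotation and translation
    current = scan.copy()
    for _ in range(iterations):
        # 1) correspondences: closest map point for every scan point
        _, idx = tree.query(current)
        target = map_points[idx]
        # 2) Kabsch/SVD: best rigid step for these correspondences
        mu_s, mu_t = current.mean(axis=0), target.mean(axis=0)
        H = (current - mu_s).T @ (target - mu_t)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R_step = Vt.T @ D @ U.T
        t_step = mu_t - R_step @ mu_s
        # 3) apply the step and accumulate the total transform
        current = current @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step
    return R, t
```

In practice such a scheme would be seeded with a coarse initial guess (for example from the image data or the last known pose), since plain ICP only converges locally.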
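Finally, because the point clouds and images come from a positioning sensor mounted at a known position on the terminal device, the pose returned by a registration step like the one above is the sensor's pose rather than the terminal device's own pose. The small sketch below applies a known sensor-to-body mounting transform to obtain the body pose; the argument names are assumptions made for illustration.

```python
import numpy as np


def terminal_pose_from_sensor_pose(R_map_sensor: np.ndarray, t_map_sensor: np.ndarray,
                                   R_body_sensor: np.ndarray, t_body_sensor: np.ndarray):
    """Given the sensor pose in the map frame and the sensor's mounting pose in
    the terminal body frame, return the terminal body pose in the map frame:
    T_map_body = T_map_sensor * inverse(T_body_sensor)."""
    R_sensor_body = R_body_sensor.T                    # invert the mounting rotation
    t_sensor_body = -R_body_sensor.T @ t_body_sensor   # invert the mounting offset
    R_map_body = R_map_sensor @ R_sensor_body
    t_map_body = R_map_sensor @ t_sensor_body + t_map_sensor
    return R_map_body, t_map_body
```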

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Processing Or Creating Images (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention relates to the technical field of wireless positioning, and relates to a positioning method and apparatus based on an unmanned aerial vehicle, a storage medium, an electronic device and a product. The method is applied to a terminal device and comprises: obtaining a three-dimensional map of a preset area sent by a control platform, the three-dimensional map being determined according to a plurality of initial positioning data sets and the height of the terminal device, each initial positioning data set corresponding to one collection trajectory in the preset area, the initial positioning data sets comprising a plurality of pieces of initial positioning data collected by the unmanned aerial vehicle according to the corresponding collection trajectories, each piece of initial positioning data comprising point cloud data and image data, and the projection of each collection trajectory on the horizontal plane being the same; obtaining target positioning data, the target positioning data comprising point cloud data and image data collected by the terminal device at the current moment; and determining a target pose of the terminal device in the three-dimensional map according to the three-dimensional map and the target positioning data. The present invention can improve the positioning accuracy of the terminal device.
PCT/CN2022/080773 2021-06-30 2022-03-14 Procédé et appareil de positionnement basés sur un véhicule aérien sans pilote, support de stockage, dispositif électronique et produit WO2023273415A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110739500.7A CN115222808B (zh) 2021-06-30 2021-06-30 基于无人机的定位方法、装置、存储介质和电子设备
CN202110739500.7 2021-06-30

Publications (1)

Publication Number Publication Date
WO2023273415A1 true WO2023273415A1 (fr) 2023-01-05

Family

ID=83606021

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/080773 WO2023273415A1 (fr) 2021-06-30 2022-03-14 Procédé et appareil de positionnement basés sur un véhicule aérien sans pilote, support de stockage, dispositif électronique et produit

Country Status (2)

Country Link
CN (1) CN115222808B (fr)
WO (1) WO2023273415A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116400733A (zh) * 2023-05-06 2023-07-07 北京理工大学 侦察无人机自适应调整随机树全覆盖路径规划方法

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116758157B (zh) * 2023-06-14 2024-01-30 深圳市华赛睿飞智能科技有限公司 一种无人机室内三维空间测绘方法、系统及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190295423A1 (en) * 2018-03-26 2019-09-26 D2, Llc Method and system for generating aerial imaging flight path
CN111696199A (zh) * 2020-06-14 2020-09-22 荆门汇易佳信息科技有限公司 同步定位建图的地空融合精密三维建模方法
WO2020198167A1 (fr) * 2019-03-22 2020-10-01 Solfice Research, Inc. Système et procédé de co-enregistrement et de localisation de données cartographiques
CN112184890A (zh) * 2020-10-14 2021-01-05 佳都新太科技股份有限公司 一种应用于电子地图中的摄像头精准定位方法及处理终端
CN112461210A (zh) * 2020-12-18 2021-03-09 湖南大学 一种空地协同建筑测绘机器人系统及其测绘方法

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109708636B (zh) * 2017-10-26 2021-05-14 广州极飞科技股份有限公司 导航图配置方法、避障方法以及装置、终端、无人飞行器
CN109521774B (zh) * 2018-12-27 2023-04-07 南京芊玥机器人科技有限公司 一种基于强化学习的喷涂机器人轨迹优化方法
CN110260857A (zh) * 2019-07-02 2019-09-20 北京百度网讯科技有限公司 视觉地图的校准方法、装置及存储介质
CN110675431B (zh) * 2019-10-08 2020-09-11 中国人民解放军军事科学院国防科技创新研究院 一种融合图像和激光点云的三维多目标跟踪方法
US11636618B2 (en) * 2019-11-14 2023-04-25 Samsung Electronics Co., Ltd. Device and method with simultaneous implementation of localization and mapping
CN111263308A (zh) * 2020-01-15 2020-06-09 上海交通大学 定位数据采集方法及系统
CN111415388B (zh) * 2020-03-17 2023-10-24 Oppo广东移动通信有限公司 一种视觉定位方法及终端
CN111442722B (zh) * 2020-03-26 2022-05-17 达闼机器人股份有限公司 定位方法、装置、存储介质及电子设备
CN111984021A (zh) * 2020-07-21 2020-11-24 武汉智会创新科技有限公司 无人机的控制方法及系统、无人机设备、远程控制设备
CN112810625B (zh) * 2021-04-19 2021-07-30 北京三快在线科技有限公司 一种轨迹修正的方法及装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190295423A1 (en) * 2018-03-26 2019-09-26 D2, Llc Method and system for generating aerial imaging flight path
WO2020198167A1 (fr) * 2019-03-22 2020-10-01 Solfice Research, Inc. Système et procédé de co-enregistrement et de localisation de données cartographiques
CN111696199A (zh) * 2020-06-14 2020-09-22 荆门汇易佳信息科技有限公司 同步定位建图的地空融合精密三维建模方法
CN112184890A (zh) * 2020-10-14 2021-01-05 佳都新太科技股份有限公司 一种应用于电子地图中的摄像头精准定位方法及处理终端
CN112461210A (zh) * 2020-12-18 2021-03-09 湖南大学 一种空地协同建筑测绘机器人系统及其测绘方法

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116400733A (zh) * 2023-05-06 2023-07-07 北京理工大学 侦察无人机自适应调整随机树全覆盖路径规划方法
CN116400733B (zh) * 2023-05-06 2023-10-20 北京理工大学 侦察无人机自适应调整随机树全覆盖路径规划方法

Also Published As

Publication number Publication date
CN115222808A (zh) 2022-10-21
CN115222808B (zh) 2023-10-20

Similar Documents

Publication Publication Date Title
CN111442722B (zh) 定位方法、装置、存储介质及电子设备
EP3397554B1 (fr) Système et procédé d'utilisation d'un réseau de caméras multiples pour capturer des scènes statiques et/ou animées
WO2023273415A1 (fr) Procédé et appareil de positionnement basés sur un véhicule aérien sans pilote, support de stockage, dispositif électronique et produit
WO2017211029A1 (fr) Procédé et dispositif de planification de trajectoire de vol pour véhicule aérien sans pilote
Israr et al. Internet of things (IoT)-Enabled unmanned aerial vehicles for the inspection of construction sites: a vision and future directions
CN110246182B (zh) 基于视觉的全局地图定位方法、装置、存储介质和设备
CN103389699B (zh) 基于分布式智能监测控制节点的机器人监控及自主移动系统的运行方法
CN109542119B (zh) 飞行器航线规划方法及系统
JP2020030204A (ja) 距離測定方法、プログラム、距離測定システム、および可動物体
US20200012756A1 (en) Vision simulation system for simulating operations of a movable platform
Loianno et al. Flying smartphones: Automated flight enabled by consumer electronics
IL269560A (en) Distributed device mapping
WO2021081960A1 (fr) Procédé, dispositif et système de planification d'itinéraire et support de stockage
CN112581535B (zh) 机器人定位方法、装置、存储介质及电子设备
CN113959444A (zh) 用于无人设备的导航方法、装置、介质及无人设备
CN112991440A (zh) 车辆的定位方法和装置、存储介质和电子装置
US20210357620A1 (en) System, moving object, and information processing apparatus
US20210156710A1 (en) Map processing method, device, and computer-readable storage medium
CN114612622A (zh) 机器人三维地图位姿显示方法、装置、设备及存储介质
WO2023088127A1 (fr) Procédé de navigation en intérieur, serveur, appareil et terminal
CN112106112A (zh) 一种点云融合方法、设备、系统及存储介质
Evangeliou et al. Visual collaboration leader-follower uav-formation for indoor exploration
WO2023096588A1 (fr) Système, procédé et programme informatique pour surveillance de progression de construction
CN114571460A (zh) 机器人控制方法、装置及存储介质
Zhang et al. Leader-Follower cooperative localization based on VIO/UWB loose coupling for AGV group

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22831246

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE