CN111272172A - Unmanned aerial vehicle indoor navigation method, device, equipment and storage medium
- Publication number
- CN111272172A (application number CN202010089061.5A)
- Authority
- CN
- China
- Prior art keywords
- aerial vehicle
- unmanned aerial
- model
- information
- navigation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S11/00—Systems for determining distance or velocity not using reflection or reradiation
- G01S11/12—Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Abstract
The invention discloses an unmanned aerial vehicle indoor navigation method, device, equipment and storage medium, wherein the method comprises the following steps: when a navigation request of an unmanned aerial vehicle is received, acquiring an initial position and a target position of the unmanned aerial vehicle; acquiring running state information of the unmanned aerial vehicle through a first acquisition device in the unmanned aerial vehicle, and acquiring surrounding environment information at different heights and different shooting angles at the initial position through a second acquisition device in the unmanned aerial vehicle; constructing a characteristic point cloud of a shooting object according to the running state information and the surrounding environment information, and processing the characteristic point cloud to obtain a point cloud model of the shooting object; and superposing the point cloud model and a preset BIM model to obtain a superposed model, generating a navigation route reaching the target position according to the superposed model, and controlling the unmanned aerial vehicle to operate according to the navigation route. The invention realizes indoor automatic navigation of the unmanned aerial vehicle and improves the precision of its indoor navigation.
Description
Technical Field
The invention relates to the field of unmanned aerial vehicle navigation, in particular to an unmanned aerial vehicle indoor navigation method, device, equipment and storage medium.
Background
Existing unmanned aerial vehicles capable of autonomous flight depend on satellite signals for position information and then plan a path on an existing map according to that information. However, an unmanned aerial vehicle cannot receive GPS signals in sheltered environments such as indoors or in tunnels, so indoor positioning and navigation of unmanned aerial vehicles is generally achieved in the following ways:
1. based on Bluetooth low-power consumption or wireless network positioning, in order to achieve high precision, a Bluetooth signal source grid needs to be deployed indoors in advance, and then the positioning is carried out by calculating signal intensity at a client, so that the indoor positioning cost is high;
2. based on the positioning of a cellular network, the technology does not require any preposed network deployment, and equipment which can receive mobile phone signals can be positioned by comparing the signals of a plurality of base stations. But it is difficult to ensure positioning accuracy;
3. the positioning technology based on radio frequency identification requires that radio frequency identification equipment is deployed in advance, and the accuracy of indoor navigation positioning is not ideal;
4. visual positioning, this kind of positioning system can utilize the camera to gather the image of surrounding environment, thereby confirm the position through the information contrast with type in advance, and the same overall arrangement's of unable different subregion (for example the same different rooms of hotel fitment) of confirming of this kind of mode, because the changeable of environment, in relatively narrow and small space, unmanned aerial vehicle probably can't be smooth keeps away the barrier.
In all of the above approaches, the unmanned aerial vehicle cannot obtain the actual state of itself and its surroundings during flight, and cannot react in time when a sudden situation arises around it, so it is difficult for the unmanned aerial vehicle to perform autonomous positioning and navigation in an indoor environment.
Disclosure of Invention
The invention mainly aims to provide an indoor navigation method, device, equipment and storage medium for an unmanned aerial vehicle, and aims to solve the technical problem that the current indoor navigation method for the unmanned aerial vehicle is high in cost and low in precision.
In order to achieve the above object, the invention provides an unmanned aerial vehicle indoor navigation method, which comprises the following steps:
when receiving a navigation request of an unmanned aerial vehicle, acquiring an initial position and a target position of the unmanned aerial vehicle;
acquiring running state information of the unmanned aerial vehicle through a first acquisition device in the unmanned aerial vehicle, and acquiring surrounding environment information of different heights and different shooting angles at the initial position through a second acquisition device in the unmanned aerial vehicle;
constructing a characteristic point cloud of a shooting object according to the running state information and the surrounding environment information, and processing the characteristic point cloud to obtain a point cloud model of the shooting object;
and superposing the point cloud model and a preset BIM model to obtain a superposed model, generating a navigation route reaching the target position according to the superposed model, and controlling the unmanned aerial vehicle to operate according to the navigation route.
In an embodiment, the step of obtaining the initial position and the destination position of the drone when receiving the drone navigation request includes:
when an unmanned aerial vehicle navigation request is received, acquiring a building identifier of the building where the unmanned aerial vehicle is currently located and a preset BIM (Building Information Modeling) model associated with the building identifier;
and constructing a three-dimensional coordinate system based on the preset BIM model, taking the three-dimensional coordinate of the current position of the unmanned aerial vehicle as the initial position of the unmanned aerial vehicle, acquiring a navigation destination corresponding to the unmanned aerial vehicle navigation request, and taking the three-dimensional coordinate of the navigation destination as the target position of the unmanned aerial vehicle.
In one embodiment, the first acquisition device comprises at least one inertial measurer and at least one sensing camera, and the second acquisition device comprises at least one depth camera;
the step of acquiring the running state information of the unmanned aerial vehicle through the first acquisition device in the unmanned aerial vehicle, and acquiring the surrounding environment information at different heights and different shooting angles at the initial position through the second acquisition device in the unmanned aerial vehicle includes:
adjusting the height and the shooting angle of the unmanned aerial vehicle at the initial position, acquiring direction information of the unmanned aerial vehicle through the inertial measurer, acquiring a characteristic image through the sensing camera, analyzing the characteristic image to obtain relative position information of the unmanned aerial vehicle, and taking the direction information and the relative position information as running state information of the unmanned aerial vehicle;
the method comprises the steps of emitting an infrared pulse to a shooting object through the depth camera, receiving the infrared pulse reflected by the shooting object and the reflection time of the infrared pulse, processing the reflection time to obtain depth image information of the shooting object, and using the depth image information as surrounding environment information.
In an embodiment, the step of constructing a feature point cloud of a photographic object according to the operating state information and the ambient environment information, and processing the feature point cloud to obtain a point cloud model of the photographic object includes:
extracting direction information and relative position information in the running state information, and iterating the direction information and the relative position information to obtain an attitude change value of a first acquisition device in the unmanned aerial vehicle;
extracting depth image information in the surrounding environment information, and iterating the depth image information according to the attitude change value to obtain a characteristic point cloud of the unmanned aerial vehicle shooting object;
and processing the characteristic point cloud through a preset SLAM algorithm to obtain a point cloud model of the shooting object.
In an embodiment, the step of superimposing the point cloud model and a preset BIM model to obtain a superimposed model, generating a navigation route to the destination location according to the superimposed model, and controlling the unmanned aerial vehicle to operate according to the navigation route includes:
determining a reference position corresponding to the initial position in a preset BIM model, and comparing edge information in the point cloud model with edge information at the reference position in the preset BIM model to obtain the minimum distance between the point cloud model and the preset BIM model;
superposing the point cloud model and a preset BIM model according to the minimum distance to obtain a superposed model;
and tracing a path from the initial position according to the superposition model to obtain a navigation route reaching the target position, and controlling the unmanned aerial vehicle to operate according to the navigation route.
In an embodiment, the step of tracing a path from the initial position according to the overlay model to obtain a navigation route to the destination position and controlling the drone to operate according to the navigation route includes:
tracing a path in the superposed model from the initial position towards the target position, and judging whether an obstacle exists in the traced path, whether the repetition rate of the traced path is greater than a preset repetition rate, and/or whether at least two traced paths exist;
if an obstacle exists in the traced path, changing the tracing direction of the path; if the repetition rate of the traced path is greater than the preset repetition rate, abandoning the traced path; and/or, if at least two traced paths are obtained, taking the traced path with the shortest distance as the navigation path of the unmanned aerial vehicle, and controlling the unmanned aerial vehicle to operate according to the navigation path.
In an embodiment, after the step of superimposing the point cloud model and a preset BIM model to obtain a superimposed model, generating a navigation route to the destination location according to the superimposed model, and controlling the unmanned aerial vehicle to operate according to the navigation route, the method includes:
when the unmanned aerial vehicle is monitored to deviate from the navigation route, sending a route control instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle returns to the navigation route;
if the unmanned aerial vehicle does not return to the navigation route within a preset time period, sending an information acquisition instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle feeds back current operation parameters;
and receiving the current operation parameters fed back by the unmanned aerial vehicle, and outputting prompt information if the current operation parameters are abnormal.
In addition, in order to achieve the above object, the present invention further provides an indoor navigation device for an unmanned aerial vehicle, including:
the request receiving module is used for acquiring the initial position and the target position of the unmanned aerial vehicle when receiving the navigation request of the unmanned aerial vehicle;
the information acquisition module is used for acquiring the running state information of the unmanned aerial vehicle through a first acquisition device in the unmanned aerial vehicle and acquiring the surrounding environment information of different heights and different shooting angles at the initial position through a second acquisition device in the unmanned aerial vehicle;
the model building module is used for building a characteristic point cloud of a shooting object according to the running state information and the surrounding environment information, and processing the characteristic point cloud to obtain a point cloud model of the shooting object;
and the route generation module is used for superposing the point cloud model and a preset BIM model to obtain a superposed model, generating a navigation route reaching the target position according to the superposed model, and controlling the unmanned aerial vehicle to operate according to the navigation route.
In addition, in order to achieve the above object, the invention further provides unmanned aerial vehicle indoor navigation equipment.
The unmanned aerial vehicle indoor navigation equipment includes: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein:
the computer program, when executed by the processor, implements the steps of the drone indoor navigation method as described above.
In addition, to achieve the above object, the present invention also provides a computer storage medium;
the computer storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the unmanned aerial vehicle indoor navigation method as described above.
According to the unmanned aerial vehicle indoor navigation method, device, equipment and storage medium, when a terminal receives a navigation request of the unmanned aerial vehicle, it obtains an initial position and a target position of the unmanned aerial vehicle; acquires running state information of the unmanned aerial vehicle through a first acquisition device in the unmanned aerial vehicle, and acquires surrounding environment information at different heights and different shooting angles at the initial position through a second acquisition device in the unmanned aerial vehicle; constructs a characteristic point cloud of a shooting object according to the running state information and the surrounding environment information, and processes the characteristic point cloud to obtain a point cloud model of the shooting object; and superposes the point cloud model and a preset BIM model to obtain a superposed model, generates a navigation route reaching the target position according to the superposed model, and controls the unmanned aerial vehicle to operate according to the navigation route. The technical scheme of this embodiment requires no extra hardware cost: information on navigation obstacles such as building components, doors and windows is accurately determined through the superposed model, so that the unmanned aerial vehicle can avoid obstacles better, the accident rate is reduced, indoor automatic navigation of the unmanned aerial vehicle is realized, and the precision of indoor navigation is improved.
Drawings
FIG. 1 is a schematic diagram of an apparatus in a hardware operating environment according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a first embodiment of the indoor navigation method of the unmanned aerial vehicle according to the present invention;
fig. 3 is a schematic functional module diagram of an indoor navigation device of an unmanned aerial vehicle according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, fig. 1 is a schematic structural diagram of a terminal (also called an unmanned aerial vehicle indoor navigation device, where the unmanned aerial vehicle indoor navigation device may be formed by an independent unmanned aerial vehicle indoor navigation device, or formed by combining other devices with the unmanned aerial vehicle indoor navigation device) in a hardware operating environment according to an embodiment of the present invention.
The terminal of the embodiment of the invention can be a fixed terminal or a mobile terminal, such as an intelligent air conditioner with a networking function, an intelligent electric lamp, an intelligent power supply, an intelligent sound box, an automatic driving automobile, a Personal Computer (PC), a smart phone, a tablet computer, an electronic book reader, a portable computer and the like.
As shown in fig. 1, the terminal may include: a processor 1001 such as a Central Processing Unit (CPU), a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002, where the communication bus 1002 is used to enable connection and communication between these components. The user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard), and optionally may also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wi-Fi, Wireless Fidelity, interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory), and may alternatively be a storage device separate from the processor 1001.
Optionally, the terminal may further include a camera, a radio frequency (RF) circuit, a sensor, an audio circuit, and a WiFi module; the input unit may include, besides a keyboard, a display screen and a touch screen; and the wireless network interface may include, besides WiFi, Bluetooth, a probe, and the like. The sensors include, for example, light sensors, motion sensors, and other sensors; in particular, the light sensor may include an ambient light sensor and a proximity sensor. Of course, the terminal may also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described herein again.
Those skilled in the art will appreciate that the terminal structure shown in fig. 1 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, the memory 1005, as a computer storage medium, may include an operating system, a network communication module, a user interface module, and a computer program. The storage medium (also called a computer storage medium, computer medium, readable storage medium, computer readable storage medium, or direct storage medium) may be a non-volatile readable storage medium such as a RAM, a magnetic disk, or an optical disk, and the computer software product stored in it includes several instructions for enabling a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the method according to the embodiments of the present invention.
In the terminal shown in fig. 1, the network interface 1004 is mainly used for connecting to a backend server and performing data communication with the backend server; the user interface 1003 is mainly used for connecting a client (user side) and performing data communication with the client; and the processor 1001 may be configured to call the computer program stored in the memory 1005 and execute the steps of the indoor navigation method of the drone provided by the following embodiments of the present invention.
The embodiment of the unmanned aerial vehicle indoor navigation method is provided based on the hardware operation environment.
In the embodiments of the unmanned aerial vehicle indoor navigation method, given two points in an indoor space and a point cloud model generated in real time, an indoor navigation route for the unmanned aerial vehicle can be planned automatically so that the unmanned aerial vehicle can navigate. Specifically:
referring to fig. 2, in a first embodiment of an indoor navigation method for an unmanned aerial vehicle according to the present invention, the indoor navigation method for an unmanned aerial vehicle includes:
and step S10, acquiring the initial position and the destination position of the unmanned aerial vehicle when the unmanned aerial vehicle navigation request is received.
The unmanned aerial vehicle indoor navigation method is applied to a terminal (also called unmanned aerial vehicle indoor navigation equipment). The terminal is in communication connection with the unmanned aerial vehicle and can control it. Acquisition devices are preset in the unmanned aerial vehicle and are used to collect the surrounding environment information and the running state information of the unmanned aerial vehicle (the running state information covers its running direction and relative displacement). The acquisition devices include, but are not limited to, an inertial measurer, a perception camera and a depth camera, and the specific numbers of inertial measurers, perception cameras and depth cameras are not limited. The unmanned aerial vehicle sends the collected surrounding environment information and running state information to the terminal, and the terminal obtains an indoor navigation route for the unmanned aerial vehicle by processing this information.
Specifically, the terminal receives the unmanned aerial vehicle navigation request; the triggering mode of the request is not specifically limited. The request can be actively triggered by a user, for example by clicking a button corresponding to unmanned aerial vehicle navigation on the terminal display interface. It can also be triggered automatically by the terminal according to a preset triggering condition, for example: trigger a navigation request every morning to acquire floor information of building xxx; when the preset time arrives, the terminal automatically triggers the navigation request.
When the terminal receives an unmanned aerial vehicle navigation request, it acquires the initial position and the target position of the unmanned aerial vehicle, both of which can be set by the user; for example, the user enters on the terminal the initial position of the drone as Room 1, Floor 3 of Building xxx, and the target position as Room 3, Floor 2 of Building xxx.
The embodiment provides a specific implementation mode for determining the initial position and the target position of the unmanned aerial vehicle, which comprises the following steps:
a1, when receiving a navigation request of an unmanned aerial vehicle, acquiring a building identifier of a building where the unmanned aerial vehicle is currently located and a preset BIM associated with the building identifier;
step a2, constructing a three-dimensional coordinate system based on the preset BIM model, taking the three-dimensional coordinate of the current position of the unmanned aerial vehicle as the initial position of the unmanned aerial vehicle, acquiring a navigation destination corresponding to the unmanned aerial vehicle navigation request, and taking the three-dimensional coordinate of the navigation destination as the target position of the unmanned aerial vehicle.
That is, when the terminal receives a navigation request of the unmanned aerial vehicle, it obtains a building identifier of the building where the unmanned aerial vehicle is currently located (the building identifier is identification information that uniquely identifies the building, such as the building name or the building position information) and the preset BIM model associated with that identifier (the preset BIM model is a Building Information Modeling model associated in advance with the building identifier and containing a digital representation of the building). The terminal constructs a three-dimensional coordinate system based on the preset BIM model, takes the three-dimensional coordinate of the current position of the unmanned aerial vehicle as its initial position, obtains the navigation destination corresponding to the navigation request, and takes the three-dimensional coordinate of the navigation destination as the target position of the unmanned aerial vehicle.
In this embodiment, a three-dimensional coordinate system is constructed based on a BIM model, and an initial position and a target position of the unmanned aerial vehicle are accurately determined, so as to realize accurate navigation of the unmanned aerial vehicle.
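For illustration, the position handling above reduces to expressing positions as three-dimensional coordinates in a frame anchored to the preset BIM model. The following Python sketch assumes a hypothetical lookup table from named rooms to BIM-frame coordinates; the table entries and all names are illustrative, not data from the disclosure.

```python
from typing import Dict, Tuple

Coordinate = Tuple[float, float, float]  # (x, y, z) in the BIM-derived frame, in metres

# Hypothetical lookup from navigation destinations to BIM-frame coordinates;
# in practice these would be read out of the preset BIM model itself.
ROOM_COORDINATES: Dict[str, Coordinate] = {
    "building-xxx/floor-3/room-1": (12.0, 4.5, 9.0),
    "building-xxx/floor-2/room-3": (3.0, 18.0, 6.0),
}

def resolve_positions(current: Coordinate, destination: str) -> Tuple[Coordinate, Coordinate]:
    """Return (initial_position, target_position) for a navigation request."""
    return current, ROOM_COORDINATES[destination]
```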
Step S20, acquiring the running state information of the unmanned aerial vehicle through a first acquisition device in the unmanned aerial vehicle, and acquiring the surrounding environment information at different heights and different shooting angles at the initial position through a second acquisition device in the unmanned aerial vehicle.
Acquisition devices are arranged in the unmanned aerial vehicle and are divided, according to purpose, into a first acquisition device and a second acquisition device. The first acquisition device is used to collect the running state information of the unmanned aerial vehicle and comprises at least one inertial measurer and at least one perception camera; the second acquisition device is used to collect the surrounding environment information and comprises at least one depth camera. For example, the unmanned aerial vehicle carries 1 inertial measurer, 1 depth camera and 4 environment perception cameras, where the inertial measurer senses the direction information of the unmanned aerial vehicle, the environment perception cameras obtain its relative displacement, and the depth camera senses the depth image information of the shooting object. Specifically, the steps include:
b1, adjusting the height and shooting angle of the unmanned aerial vehicle at the initial position, acquiring direction information of the unmanned aerial vehicle through the inertial measurer, acquiring a characteristic image through the perception camera, analyzing the characteristic image to obtain relative position information of the unmanned aerial vehicle, and taking the direction information and the relative position information as the running state information of the unmanned aerial vehicle;
step b2, emitting infrared pulses to a shooting object through the depth camera, receiving the infrared pulses reflected by the shooting object together with their reflection times, processing the reflection times to obtain depth image information of the shooting object, and taking the depth image information as the surrounding environment information.
In this embodiment, the terminal controls the unmanned aerial vehicle to autonomously adjust its height and shooting angle near the initial position and to shoot from multiple angles to obtain feature images. The terminal extracts the feature points of the feature images and analyzes them to obtain the relative position information of the unmanned aerial vehicle, and takes the direction information and the relative position information as the running state information. The terminal also transmits infrared pulses through the depth camera and obtains the depth image information, i.e. the distance from the object surface to the camera, by calculating the reflection time.
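The depth recovery described above follows the standard time-of-flight relation: the pulse's round-trip time, multiplied by the speed of light and halved, gives the distance from the object surface to the camera. A minimal Python sketch, with illustrative array and function names:

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def depth_from_reflection_times(reflection_times: np.ndarray) -> np.ndarray:
    """Convert per-pixel round-trip times of the infrared pulses into depths.

    Each pulse travels to the object surface and back, so the one-way
    distance (the depth) is half the speed of light times the round trip.
    `reflection_times` is assumed to be an (H, W) array in seconds.
    """
    return 0.5 * SPEED_OF_LIGHT * reflection_times
```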
Step S30, constructing the characteristic point cloud of the shooting object according to the running state information and the surrounding environment information, and processing the characteristic point cloud to obtain the point cloud model of the shooting object.
The terminal converts the depth image information in the surrounding environment information into a three-dimensional feature point cloud and fuses the feature point cloud into a three-dimensional grid to obtain a point cloud model. That is, the terminal casts rays from the current acquisition-device position and finds their intersections with the previous three-dimensional feature point cloud to obtain the point cloud under the current frame's view angle; meanwhile, the terminal calculates the normal vectors of the point cloud model for registering the input depth image information of the next frame, and loops continuously to obtain feature point clouds under different view angles, thereby reconstructing the complete scene surface of the shooting object to form the point cloud model. Specifically, step S30 includes:
b1, extracting direction information and relative position information in the running state information, and iterating the direction information and the relative position information to obtain an attitude change value of a first acquisition device in the unmanned aerial vehicle;
b2, extracting depth image information in the surrounding environment information, and iterating the depth image information according to the attitude change value to obtain a characteristic point cloud of the unmanned aerial vehicle shooting object;
step b3, processing the characteristic point cloud through a preset SLAM algorithm to obtain a point cloud model of the shooting object.
Specifically, the terminal extracts the direction information and the relative position information from the running state information and iterates them to obtain the attitude change value of the first acquisition device in the unmanned aerial vehicle; the terminal extracts the depth image information from the surrounding environment information and iterates it according to the attitude change value to obtain the characteristic point cloud of the unmanned aerial vehicle's shooting object; and the terminal processes the characteristic point cloud through a preset SLAM (simultaneous localization and mapping) algorithm to obtain a point cloud model of the shooting object.
In this embodiment, the terminal determines the attitude change value of the unmanned aerial vehicle, uses it to construct the characteristic point cloud of the shooting object, obtains a rough point cloud model of the shooting target with the SLAM algorithm, and identifies in real time, from the point cloud model, the obstacle information corresponding to non-building components.
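As an illustration of how depth image information and an attitude (pose) estimate combine into a feature point cloud, the following Python sketch back-projects a depth image through a pinhole camera model and transforms the points into the world frame. The intrinsics and the 4x4 pose matrix are assumed inputs; a full SLAM pipeline would additionally register successive frames against each other.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy, pose):
    """Back-project a depth image into a world-frame feature point cloud.

    depth:          (H, W) array of depths in metres from the depth camera.
    fx, fy, cx, cy: assumed pinhole intrinsics of the depth camera.
    pose:           4x4 camera-to-world transform built from the iterated
                    direction / relative-position (attitude change) estimates.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))       # pixel coordinates
    x = (u - cx) * depth / fx                            # back-project to camera frame
    y = (v - cy) * depth / fy
    pts_cam = np.stack([x, y, depth, np.ones_like(depth)], axis=-1).reshape(-1, 4)
    pts_world = (pose @ pts_cam.T).T[:, :3]              # transform into world frame
    return pts_world[depth.reshape(-1) > 0]              # drop invalid zero-depth pixels
```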
Step S40, superposing the point cloud model and a preset BIM model to obtain a superposed model, generating a navigation route reaching the target position according to the superposed model, and controlling the unmanned aerial vehicle to operate according to the navigation route.
A BIM model is preset in the terminal. The terminal determines building component information from the preset BIM model and non-building component information from the point cloud model; it superposes the BIM model and the point cloud model to obtain a superposed model in which both kinds of navigation obstacles are present, generates a navigation route reaching the target position according to the superposed model, and controls the unmanned aerial vehicle to operate according to the navigation route. Specifically, step S40 includes:
step c1, determining a reference position corresponding to the initial position in a preset BIM model, and comparing edge information in the point cloud model with edge information at the reference position in the preset BIM model to obtain the minimum distance between the point cloud model and the preset BIM model;
step c2, overlapping the point cloud model and a preset BIM model according to the minimum distance to obtain an overlapped model;
and c3, tracing the path from the initial position according to the superposition model to obtain a navigation route reaching the target position, and controlling the unmanned aerial vehicle to operate according to the navigation route.
The terminal determines a reference position corresponding to the initial position in the preset BIM model and compares the image of the shooting object in the point cloud model with the image at the reference position in the preset BIM model to obtain the minimum distance between the edge feature points of the two images; the terminal then superposes the point cloud model and the preset BIM model according to this minimum distance to obtain the superposed model. Finally, the terminal traces a path from the initial position according to the superposed model to obtain a navigation route reaching the target position, and controls the unmanned aerial vehicle to operate according to the navigation route.
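One simple way to realize the edge comparison and superposition described above is sketched below: given edge points extracted from the point cloud model and from the BIM model at the reference position, a nearest-neighbour query yields per-point minimum distances, and the mean displacement gives a shift that superposes the two models. This single translation step is an illustrative assumption; a production system would use a full registration method such as ICP.

```python
import numpy as np
from scipy.spatial import cKDTree

def superpose_by_edge_distance(cloud_edges, bim_edges):
    """Superpose point cloud edges on BIM edges via a nearest-neighbour shift.

    cloud_edges: (N, 3) edge points extracted from the point cloud model.
    bim_edges:   (M, 3) edge points at the reference position in the BIM model.
    Returns the shifted cloud edges and the residual minimum edge distance.
    """
    tree = cKDTree(bim_edges)
    _, nearest = tree.query(cloud_edges)                 # closest BIM edge per point
    offset = (bim_edges[nearest] - cloud_edges).mean(axis=0)
    aligned = cloud_edges + offset                       # superpose the two models
    residual = tree.query(aligned)[0].min()              # minimum distance after shift
    return aligned, residual
```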
According to the technical scheme of the embodiment, extra hardware cost is not needed, information of unmanned aerial vehicle navigation obstacles such as building components and doors and windows is accurately determined through the superposition model, so that the unmanned aerial vehicle can better avoid the obstacles, the accident rate is reduced, the indoor automatic navigation of the unmanned aerial vehicle is realized, and the precision of the indoor navigation of the unmanned aerial vehicle is improved.
Further, on the basis of the first embodiment of the invention, a second embodiment of the indoor navigation method of the unmanned aerial vehicle is provided.
This embodiment is a refinement of step S40 in the first embodiment, and differs from the first embodiment of the present invention in the following:
Step S41, tracing a path in the superposed model from the initial position towards the target position, and judging whether an obstacle exists in the traced path, whether the repetition rate of the traced path is greater than a preset repetition rate, and/or whether at least two traced paths exist.
The terminal traces a path in the superposed model from the initial position towards the target position. The superposed model includes building component information and non-building component information, which the terminal treats as obstacles and avoids while tracing. Specifically, the terminal judges whether an obstacle exists in the traced path (the obstacle may be a wall, a lamp, an ornament, or the like), whether the repetition rate of the traced path is greater than a preset repetition rate (the preset repetition rate is a preset ratio of the repeated portion to the full path, for example 30%), and/or whether at least two traced paths exist.
Step S42, if an obstacle exists in the traced path, changing the tracing direction of the path; if the repetition rate of the traced path is greater than the preset repetition rate, abandoning the traced path; and/or, if at least two traced paths are obtained, taking the traced path with the shortest distance as the navigation path of the unmanned aerial vehicle and controlling the unmanned aerial vehicle to operate according to the navigation path.
If an obstacle exists in the traced path, the terminal determines that the path has reached a dead end and changes the tracing direction; if the repetition rate of the traced path is greater than the preset repetition rate, the terminal judges the path to be repetitive and abandons it; and/or, if at least two traced paths are obtained, the terminal takes the shortest traced path as the navigation path of the unmanned aerial vehicle and controls the unmanned aerial vehicle to operate according to it. The route generation mode provided in this embodiment effectively guarantees the rationality of the navigation route and makes unmanned aerial vehicle navigation more accurate.
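The tracing rules above (change direction at obstacles, discard repetitive traces, keep the shortest of several candidates) are satisfied, for example, by a breadth-first search over an occupancy grid built from the superposed model. The grid representation and function names below are illustrative assumptions:

```python
from collections import deque

def trace_path(grid, start, goal):
    """Breadth-first path tracing on a 3D occupancy grid (illustrative).

    grid[z][y][x] is True where the superposed model contains an obstacle
    (a building or non-building component). The search changes direction
    whenever a neighbour is blocked, never revisits a cell (so the repetition
    rate of a returned trace is zero), and the first trace to reach the goal
    is also the shortest, matching the selection rule above.
    """
    moves = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    queue = deque([(start, [start])])
    visited = {start}
    while queue:
        (x, y, z), path = queue.popleft()
        if (x, y, z) == goal:
            return path  # shortest obstacle-free trace
        for dx, dy, dz in moves:
            nx, ny, nz = x + dx, y + dy, z + dz
            inside = (0 <= nz < len(grid) and 0 <= ny < len(grid[0])
                      and 0 <= nx < len(grid[0][0]))
            if inside and not grid[nz][ny][nx] and (nx, ny, nz) not in visited:
                visited.add((nx, ny, nz))
                queue.append(((nx, ny, nz), path + [(nx, ny, nz)]))
    return None  # no route to the target position exists
```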
Further, on the basis of the above embodiment of the present invention, a third embodiment of the indoor navigation method for the unmanned aerial vehicle of the present invention is provided.
This embodiment covers steps performed after step S40 of the first embodiment, and differs from the first embodiment of the present invention in the following:
step S50, when the unmanned aerial vehicle is monitored to deviate from the navigation route, a route control instruction is sent to the unmanned aerial vehicle, so that the unmanned aerial vehicle returns to the navigation route.
The terminal monitors the operation path information of the unmanned aerial vehicle in real time; the operation path information includes the operation speed, operation route, operation time and the like. The terminal judges from this information whether the unmanned aerial vehicle deviates from the navigation route, and if so, it sends a route control instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle returns to the navigation route according to the instruction.
Step S60, if the unmanned aerial vehicle does not return to the navigation route within a preset time period, an information acquisition instruction is sent to the unmanned aerial vehicle so that the unmanned aerial vehicle can feed back current operation parameters.
If the unmanned aerial vehicle does not return to the navigation route within the preset time period (the preset time period is set according to the specific scene, for example 2 minutes), the terminal sends an information acquisition instruction to the unmanned aerial vehicle; on receiving it, the unmanned aerial vehicle collects its current operation parameters, including the operation time and operation route, and feeds them back to the terminal.
Step S70, receiving the current operation parameters fed back by the unmanned aerial vehicle, and outputting prompt information if the current operation parameters are abnormal.
The terminal receives the current operation parameters fed back by the unmanned aerial vehicle, compares them with preset standard operation parameters, and judges whether they meet the standard. If they do not, the terminal determines that the current operation parameters are abnormal, judges that the unmanned aerial vehicle has a fault, and outputs prompt information. In this embodiment the terminal monitors the running state of the unmanned aerial vehicle, and when a fault occurs, it outputs prompt information in real time so that the unmanned aerial vehicle can be maintained promptly.
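The monitoring logic of steps S50 to S70 can be sketched as a simple supervision loop. The drone handle, its methods, and the thresholds below are hypothetical; the actual interface depends on the communication link between the terminal and the unmanned aerial vehicle.

```python
import time
import numpy as np

DEVIATION_TOLERANCE_M = 0.5  # illustrative deviation threshold (assumption)
RETURN_TIMEOUT_S = 120       # "preset time period", e.g. two minutes

def distance_to_route(position, route):
    """Distance from the drone to the nearest waypoint of the navigation route."""
    return min(np.linalg.norm(np.asarray(position) - np.asarray(p)) for p in route)

def monitor(drone, route, standard_params):
    """Supervision loop mirroring steps S50-S70 (illustrative).

    `drone` is a hypothetical handle assumed to expose position(),
    reached_target(), send_route_control() and request_parameters().
    """
    while not drone.reached_target():
        if distance_to_route(drone.position(), route) > DEVIATION_TOLERANCE_M:
            drone.send_route_control(route)              # step S50: steer back
            deadline = time.time() + RETURN_TIMEOUT_S
            while (time.time() < deadline and
                   distance_to_route(drone.position(), route) > DEVIATION_TOLERANCE_M):
                time.sleep(1)
            if distance_to_route(drone.position(), route) > DEVIATION_TOLERANCE_M:
                params = drone.request_parameters()      # step S60: fetch parameters
                abnormal = any(params.get(k) != v for k, v in standard_params.items())
                if abnormal:                             # step S70: prompt the user
                    print("Prompt: abnormal drone operation parameters:", params)
        time.sleep(1)
```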
In addition, referring to fig. 3, an embodiment of the present invention further provides an indoor navigation device for an unmanned aerial vehicle, where the indoor navigation device for an unmanned aerial vehicle includes:
the request receiving module 10 is configured to obtain an initial position and a target position of the unmanned aerial vehicle when receiving a navigation request of the unmanned aerial vehicle;
the information acquisition module 20 is configured to acquire the operating state information of the unmanned aerial vehicle through a first acquisition device in the unmanned aerial vehicle, and acquire the ambient environment information at different heights and different shooting angles at the initial position through a second acquisition device in the unmanned aerial vehicle;
the model building module 30 is used for building a feature point cloud of a shooting object according to the running state information and the surrounding environment information, and processing the feature point cloud to obtain a point cloud model of the shooting object;
and the route generation module 40 is used for superposing the point cloud model and a preset BIM model to obtain a superposed model, generating a navigation route reaching the target position according to the superposed model, and controlling the unmanned aerial vehicle to operate according to the navigation route.
In one embodiment, the first acquisition device comprises at least one inertial measurer and at least one sensing camera, and the second acquisition device comprises at least one depth camera;
the information collecting module 20 includes:
the first acquisition module is used for adjusting the height and the shooting angle of the unmanned aerial vehicle at the initial position, acquiring direction information of the unmanned aerial vehicle through the inertia measurer, acquiring a characteristic image through the sensing camera, analyzing the characteristic image to obtain relative position information of the unmanned aerial vehicle, and taking the direction information and the relative position information as running state information of the unmanned aerial vehicle;
the second acquisition module is used for transmitting infrared pulses to a shooting object through the depth camera, receiving the infrared pulses reflected by the shooting object and the reflection time of the infrared pulses, processing the reflection time to obtain depth image information of the shooting object, and taking the depth image information as surrounding environment information.
In one embodiment, the model building module 30 includes:
the attitude calculation unit is used for extracting direction information and relative position information in the running state information, and iterating the direction information and the relative position information to obtain an attitude change value of a first acquisition device in the unmanned aerial vehicle;
the point cloud determining unit is used for extracting depth image information in the surrounding environment information, iterating the depth image information according to the attitude change value and obtaining the characteristic point cloud of the unmanned aerial vehicle shooting object;
and the model generation unit is used for processing the characteristic point cloud through a preset SLAM algorithm to obtain a point cloud model of the shooting object.
In one embodiment, the route generation module 40 includes:
the information comparison submodule is used for determining a reference position corresponding to the initial position in a preset BIM model, and comparing edge information in the point cloud model with edge information at the reference position in the preset BIM model to obtain the minimum distance between the point cloud model and the preset BIM model;
the model superposition submodule is used for superposing the point cloud model and a preset BIM model according to the minimum distance to obtain a superposition model;
and the route generation submodule is used for tracing a route from the initial position according to the superposition model to obtain a navigation route reaching the target position and controlling the unmanned aerial vehicle to operate according to the navigation route.
In one embodiment, the route generation submodule includes:
a tracing judging unit, configured to trace a path in the superposed model from the initial position towards the target position, and to judge whether an obstacle exists in the traced path, whether the repetition rate of the traced path is greater than a preset repetition rate, and/or whether at least two traced paths exist;
a control operation unit, configured to change the tracing direction of the path if an obstacle exists in the traced path; to abandon the traced path if its repetition rate is greater than the preset repetition rate; and/or, if at least two traced paths are obtained, to take the traced path with the shortest distance as the navigation path of the unmanned aerial vehicle and control the unmanned aerial vehicle to operate according to it.
In one embodiment, the unmanned aerial vehicle indoor navigation device comprises:
the route monitoring module is used for sending a route control instruction to the unmanned aerial vehicle when the unmanned aerial vehicle is monitored to deviate from the navigation route so as to enable the unmanned aerial vehicle to return to the navigation route;
the instruction sending module is used for sending an information acquisition instruction to the unmanned aerial vehicle to enable the unmanned aerial vehicle to feed back current operation parameters if the unmanned aerial vehicle does not return to the navigation route within a preset time period;
and the prompt output module is used for receiving the current operation parameters fed back by the unmanned aerial vehicle, and outputting prompt information if the current operation parameters are abnormal.
The steps implemented by each functional module of the indoor navigation device of the unmanned aerial vehicle can refer to each embodiment of the indoor navigation method of the unmanned aerial vehicle, and are not described herein again.
In addition, the embodiment of the invention also provides a computer storage medium.
The computer storage medium stores thereon a computer program, which when executed by a processor implements the operations of the unmanned aerial vehicle indoor navigation method provided by the above embodiments.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity/action/object from another entity/action/object without necessarily requiring or implying any actual such relationship or order between such entities/actions/objects; the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
For the apparatus embodiment, since it is substantially similar to the method embodiment, it is described relatively simply, and reference may be made to some descriptions of the method embodiment for relevant points. The above-described apparatus embodiments are merely illustrative, in that elements described as separate components may or may not be physically separate. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the invention. One of ordinary skill in the art can understand and implement it without inventive effort.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
Claims (10)
1. An indoor navigation method of an unmanned aerial vehicle is characterized by comprising the following steps:
when receiving a navigation request of an unmanned aerial vehicle, acquiring an initial position and a target position of the unmanned aerial vehicle;
acquiring running state information of the unmanned aerial vehicle through a first acquisition device in the unmanned aerial vehicle, and acquiring surrounding environment information of different heights and different shooting angles at the initial position through a second acquisition device in the unmanned aerial vehicle;
constructing a characteristic point cloud of a shooting object according to the running state information and the surrounding environment information, and processing the characteristic point cloud to obtain a point cloud model of the shooting object;
and superposing the point cloud model and a preset BIM model to obtain a superposed model, generating a navigation route reaching the target position according to the superposed model, and controlling the unmanned aerial vehicle to operate according to the navigation route.
2. The indoor navigation method of the drone of claim 1, wherein the step of obtaining the initial position and the destination position of the drone upon receiving the drone navigation request includes:
when an unmanned aerial vehicle navigation request is received, acquiring a building identifier of the building where the unmanned aerial vehicle is currently located and a preset BIM (Building Information Modeling) model associated with the building identifier;
and constructing a three-dimensional coordinate system based on the preset BIM model, taking the three-dimensional coordinate of the current position of the unmanned aerial vehicle as the initial position of the unmanned aerial vehicle, acquiring a navigation destination corresponding to the unmanned aerial vehicle navigation request, and taking the three-dimensional coordinate of the navigation destination as the target position of the unmanned aerial vehicle.
3. The unmanned aerial vehicle indoor navigation method of claim 1, wherein the first acquisition device comprises at least one inertial measurement unit and at least one sensing camera, and the second acquisition device comprises at least one depth camera;
the step of acquiring the running state information of the unmanned aerial vehicle through the first acquisition device, and acquiring, through the second acquisition device, the surrounding environment information of different heights and different shooting angles at the initial position comprises:
adjusting the height and the shooting angle of the unmanned aerial vehicle at the initial position, acquiring direction information of the unmanned aerial vehicle through the inertial measurement unit, acquiring a feature image through the sensing camera, analyzing the feature image to obtain relative position information of the unmanned aerial vehicle, and taking the direction information and the relative position information as the running state information of the unmanned aerial vehicle;
and emitting an infrared pulse to a shooting object through the depth camera, receiving the infrared pulse reflected by the shooting object, recording the reflection time of the infrared pulse, processing the reflection time to obtain depth image information of the shooting object, and taking the depth image information as the surrounding environment information.
4. The unmanned aerial vehicle indoor navigation method of claim 1, wherein the step of constructing a feature point cloud of a shooting object according to the running state information and the surrounding environment information, and processing the feature point cloud to obtain a point cloud model of the shooting object comprises:
extracting the direction information and the relative position information from the running state information, and iterating the direction information and the relative position information to obtain an attitude change value of the first acquisition device in the unmanned aerial vehicle;
extracting the depth image information from the surrounding environment information, and iterating the depth image information according to the attitude change value to obtain the feature point cloud of the shooting object;
and processing the feature point cloud through a preset SLAM algorithm to obtain the point cloud model of the shooting object.
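One plausible reading of claim 4 is that each depth image is back-projected through a pinhole camera model and transformed by the iterated attitude before being merged into the feature point cloud. The snippet below sketches that accumulation under assumed placeholder intrinsics; it is not the claimed SLAM pipeline itself:

```python
import numpy as np

fx = fy = 525.0          # focal lengths in pixels (placeholders)
cx, cy = 31.5, 23.5      # principal point for a 64x48 toy image

def depth_to_points(depth, R, t):
    """Back-project a depth image and move it into the world frame."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    pts = pts[pts[:, 2] > 0]          # drop pixels with no depth return
    return pts @ R.T + t              # apply the iterated attitude

# Merge scans taken at two heights into one feature point cloud.
poses = [(np.eye(3), np.zeros(3)),
         (np.eye(3), np.array([0.0, 0.0, 0.5]))]
cloud = np.vstack([depth_to_points(np.random.rand(48, 64) * 4.0, R, t)
                   for R, t in poses])
print(cloud.shape)
```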
5. The unmanned aerial vehicle indoor navigation method of claim 1, wherein the step of superposing the point cloud model and the preset BIM model to obtain a superposed model, generating a navigation route reaching the target position according to the superposed model, and controlling the unmanned aerial vehicle to operate according to the navigation route comprises:
determining a reference position corresponding to the initial position in the preset BIM model, and comparing edge information in the point cloud model with edge information at the reference position in the preset BIM model to obtain the minimum distance between the point cloud model and the preset BIM model;
superposing the point cloud model and the preset BIM model according to the minimum distance to obtain the superposed model;
and tracing a path from the initial position according to the superposed model to obtain a navigation route reaching the target position, and controlling the unmanned aerial vehicle to operate according to the navigation route.
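Claim 5's superposition can be sketched as an iterative nearest-neighbour alignment that shifts the point-cloud edges until their distance to the BIM edges stops shrinking. Only a translation is estimated here, whereas a complete implementation (e.g. ICP) would also refine rotation; both inputs are assumed to be edge point sets already:

```python
import numpy as np
from scipy.spatial import cKDTree

def superpose(cloud_edges, bim_edges, iterations=10):
    """Estimate the offset that minimises cloud-to-BIM edge distance."""
    tree = cKDTree(bim_edges)
    offset = np.zeros(3)
    for _ in range(iterations):
        _, idx = tree.query(cloud_edges + offset)
        offset += (bim_edges[idx] - (cloud_edges + offset)).mean(axis=0)
    dists, _ = tree.query(cloud_edges + offset)
    return offset, dists.mean()       # alignment and residual minimum distance

bim = np.random.rand(500, 3) * 10.0                 # toy BIM edge points
cloud = bim[:200] + np.array([0.3, -0.2, 0.05])     # same geometry, shifted
offset, residual = superpose(cloud, bim)
print(offset.round(2), round(residual, 3))          # offset near [-0.3, 0.2, -0.05]
```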
6. The unmanned aerial vehicle indoor navigation method of claim 5, wherein the step of tracing a path from the initial position according to the superposed model to obtain a navigation route reaching the target position and controlling the unmanned aerial vehicle to operate according to the navigation route comprises:
tracing a path in the superposed model from the initial position toward the target position, and judging whether an obstacle exists in the traced path, whether the repetition rate of the traced path is greater than a preset repetition rate, and/or whether at least two traced paths exist;
changing the path tracing direction if an obstacle exists in the traced path; abandoning the traced path if the repetition rate of the traced path is greater than the preset repetition rate; and/or, if at least two traced paths are obtained, taking the traced path with the shortest distance as the navigation route of the unmanned aerial vehicle, and controlling the unmanned aerial vehicle to operate according to the navigation route.
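Claim 6's tracing rules (detour around obstacles, avoid repeated cells, prefer the shortest candidate) are all satisfied by a breadth-first search over an occupancy grid derived from the superposed model; a 2D toy grid stands in for the model here:

```python
from collections import deque

def trace_path(grid, start, target):
    """Shortest obstacle-free path on a grid (1 = obstacle, 0 = free)."""
    h, w = len(grid), len(grid[0])
    queue, parents = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == target:                       # reconstruct the route
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            # Change tracing direction at obstacles; never revisit a cell.
            if 0 <= nr < h and 0 <= nc < w and grid[nr][nc] == 0 \
                    and nxt not in parents:
                parents[nxt] = cell
                queue.append(nxt)
    return None                                  # no traceable route

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(trace_path(grid, (0, 0), (2, 0)))          # detours around the wall
```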
7. The unmanned aerial vehicle indoor navigation method of any one of claims 1 to 6, wherein after the steps of superposing the point cloud model and the preset BIM model to obtain a superposed model, generating a navigation route reaching the target position according to the superposed model, and controlling the unmanned aerial vehicle to operate according to the navigation route, the method further comprises:
when the unmanned aerial vehicle is monitored to deviate from the navigation route, sending a route control instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle returns to the navigation route;
if the unmanned aerial vehicle does not return to the navigation route within a preset time period, sending an information acquisition instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle feeds back current operation parameters;
and receiving the current operation parameters fed back by the unmanned aerial vehicle, and outputting prompt information if the current operation parameters are abnormal.
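The supervision steps of claim 7 reduce to a small decision function on the controller side; the thresholds and the parameter format below are hypothetical placeholders:

```python
DEVIATION_LIMIT_M = 0.5     # how far off the route counts as deviation
RETURN_TIMEOUT_S = 10.0     # preset time period to rejoin the route

def deviation_action(distance_to_route, seconds_off_route, operation_params=None):
    """Pick the controller's next step for a possibly deviating drone."""
    if distance_to_route <= DEVIATION_LIMIT_M:
        return "on_route"
    if seconds_off_route <= RETURN_TIMEOUT_S:
        return "send_route_control_instruction"      # steer it back first
    if operation_params is None:
        return "send_information_acquisition_instruction"
    if operation_params.get("abnormal"):
        return "output_prompt_information"           # surface the fault
    return "keep_monitoring"

print(deviation_action(1.2, 3.0))                        # route control
print(deviation_action(1.2, 15.0))                       # ask for parameters
print(deviation_action(1.2, 15.0, {"abnormal": True}))   # prompt information
```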
8. An unmanned aerial vehicle indoor navigation apparatus, characterized in that the unmanned aerial vehicle indoor navigation apparatus comprises:
the request receiving module is used for acquiring the initial position and the target position of the unmanned aerial vehicle when receiving the navigation request of the unmanned aerial vehicle;
the information acquisition module is used for acquiring the running state information of the unmanned aerial vehicle through a first acquisition device in the unmanned aerial vehicle and acquiring the surrounding environment information of different heights and different shooting angles at the initial position through a second acquisition device in the unmanned aerial vehicle;
the model building module is used for constructing a feature point cloud of a shooting object according to the running state information and the surrounding environment information, and processing the feature point cloud to obtain a point cloud model of the shooting object;
and the route generation module is used for superposing the point cloud model and a preset BIM model to obtain a superposed model, generating a navigation route reaching the target position according to the superposed model, and controlling the unmanned aerial vehicle to operate according to the navigation route.
9. An unmanned aerial vehicle indoor navigation device, characterized in that the unmanned aerial vehicle indoor navigation device comprises: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein:
the computer program, when executed by the processor, implements the steps of the unmanned aerial vehicle indoor navigation method according to any one of claims 1 to 7.
10. A computer storage medium, characterized in that the computer storage medium stores a computer program which, when executed by a processor, implements the steps of the unmanned aerial vehicle indoor navigation method according to any one of claims 1 to 7.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010089061.5A CN111272172A (en) | 2020-02-12 | 2020-02-12 | Unmanned aerial vehicle indoor navigation method, device, equipment and storage medium |
PCT/CN2020/085853 WO2021159603A1 (en) | 2020-02-12 | 2020-04-21 | Indoor navigation method and apparatus for unmanned aerial vehicle, device and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010089061.5A CN111272172A (en) | 2020-02-12 | 2020-02-12 | Unmanned aerial vehicle indoor navigation method, device, equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111272172A true CN111272172A (en) | 2020-06-12 |
Family
ID=70997022
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010089061.5A Pending CN111272172A (en) | 2020-02-12 | 2020-02-12 | Unmanned aerial vehicle indoor navigation method, device, equipment and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111272172A (en) |
WO (1) | WO2021159603A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113706716B (en) * | 2021-10-21 | 2022-01-07 | 湖南省交通科学研究院有限公司 | Highway BIM modeling method utilizing unmanned aerial vehicle oblique photography |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104236548B (en) * | 2014-09-12 | 2017-04-05 | 清华大学 | Autonomous navigation method in a kind of MAV room |
US10665115B2 (en) * | 2016-01-05 | 2020-05-26 | California Institute Of Technology | Controlling unmanned aerial vehicles to avoid obstacle collision |
CA3012049A1 (en) * | 2016-01-20 | 2017-07-27 | Ez3D, Llc | System and method for structural inspection and construction estimation using an unmanned aerial vehicle |
US11397088B2 (en) * | 2016-09-09 | 2022-07-26 | Nanyang Technological University | Simultaneous localization and mapping methods and apparatus |
CN108303099B (en) * | 2018-06-14 | 2018-09-28 | 江苏中科院智能科学技术应用研究院 | Autonomous navigation method in unmanned plane room based on 3D vision SLAM |
CN109410327B (en) * | 2018-10-09 | 2022-05-17 | 广东博智林机器人有限公司 | BIM and GIS-based three-dimensional city modeling method |
CN109540142B (en) * | 2018-11-27 | 2021-04-06 | 达闼科技(北京)有限公司 | Robot positioning navigation method and device, and computing equipment |
2020
- 2020-02-12: CN application CN202010089061.5A filed, published as CN111272172A (status: Pending)
- 2020-04-21: PCT application PCT/CN2020/085853 filed, published as WO2021159603A1 (Application Filing)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160034013A (en) * | 2014-09-19 | 2016-03-29 | 한국건설기술연구원 | System and method for construction site management by using unmaned aerial vehicle |
CN106441286A (en) * | 2016-06-27 | 2017-02-22 | 上海大学 | Unmanned aerial vehicle tunnel inspection system based on BIM technology |
CN109410330A (en) * | 2018-11-12 | 2019-03-01 | 中国十七冶集团有限公司 | One kind being based on BIM technology unmanned plane modeling method |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111741263A (en) * | 2020-06-18 | 2020-10-02 | 广东电网有限责任公司 | Multi-view situation perception navigation method for substation inspection unmanned aerial vehicle |
CN111880566A (en) * | 2020-07-28 | 2020-11-03 | 中国银行股份有限公司 | Unmanned aerial vehicle-based home-entry money receiving and delivering method and device, storage medium and equipment |
CN113485438A (en) * | 2021-07-30 | 2021-10-08 | 南京石知韵智能科技有限公司 | Intelligent planning method and system for space monitoring path of unmanned aerial vehicle |
WO2023173409A1 (en) * | 2022-03-18 | 2023-09-21 | 深圳市大疆创新科技有限公司 | Display method and apparatus for information, comparison method and apparatus for models, and unmanned aerial vehicle system |
CN117130392A (en) * | 2023-10-26 | 2023-11-28 | 深圳森磊弘泰消防科技有限公司 | Unmanned aerial vehicle for indoor positioning navigation based on BIM data and control method |
CN117130392B (en) * | 2023-10-26 | 2024-02-20 | 深圳森磊弘泰消防科技有限公司 | Unmanned aerial vehicle for indoor positioning navigation based on BIM data and control method |
Also Published As
Publication number | Publication date |
---|---|
WO2021159603A1 (en) | 2021-08-19 |
Similar Documents
Publication | Title
---|---
CN111272172A (en) | Unmanned aerial vehicle indoor navigation method, device, equipment and storage medium
US11100260B2 (en) | Method and apparatus for interacting with a tag in a wireless communication area
US10670408B2 (en) | System for sensing interior spaces to auto-generate a navigational map
US11640486B2 (en) | Architectural drawing based exchange of geospatial related digital content
US20160300389A1 (en) | Correlated immersive virtual simulation for indoor navigation
US11610033B2 (en) | Method and apparatus for augmented reality display of digital content associated with a location
JP2017509939A (en) | Method and system for generating a map including sparse and dense mapping information
CN110741395B (en) | On-site command vision
US20190156568A1 (en) | System and method of scanning an environment and generating two dimensional images of the environment
EP3527939A1 (en) | A system and method of on-site documentation enhancement through augmented reality
KR102221981B1 (en) | Method, device and system for mapping position detections to a graphical representation
TW202104929A (en) | Measurement apparatus and measurement system
EP4068218A1 (en) | Automated update of object-models in geometrical digital representation
CN113906481A (en) | Imaging display method, remote control terminal, device, system and storage medium
CN111596259A (en) | Infrared positioning system, positioning method and application thereof
JP7004374B1 (en) | Movement route generation method and program of moving object, management server, management system
CN112799418B (en) | Control method, control device, remote control equipment and readable storage medium
US20240219542A1 (en) | Auto-level step for extrinsic calibration
CN117589153B (en) | Map updating method and robot
KR102550637B1 (en) | A system for tracking an object in physical space using aligned frames of reference
KR20240104890A (en) | Method of generating 3D topographic data with multiple drones
CA2970940C (en) | System for sensing interior spaces to auto-generate a navigational map
JP2021131713A (en) | Own position estimating system and moving body
CN118425974A (en) | Indoor environment monitoring method and device and production line digital twin system
CN118447219A (en) | Method and system for generating scan data of a region of interest
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20200612 |