CN114877872B - Unmanned aerial vehicle, operating system thereof, method, medium and equipment for generating map - Google Patents

Unmanned aerial vehicle, operating system thereof, method, medium and equipment for generating map

Info

Publication number
CN114877872B
CN114877872B (application CN202210764782.0A)
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
camera
video data
mapping
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210764782.0A
Other languages
Chinese (zh)
Other versions
CN114877872A (en)
Inventor
张乐
姜殿元
潘嫱
刘岩
王文琦
张雪泽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jinri Lantian Technology Co ltd
Original Assignee
Beijing Jinri Lantian Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jinri Lantian Technology Co ltd filed Critical Beijing Jinri Lantian Technology Co ltd
Priority to CN202210764782.0A priority Critical patent/CN114877872B/en
Publication of CN114877872A publication Critical patent/CN114877872A/en
Application granted granted Critical
Publication of CN114877872B publication Critical patent/CN114877872B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/36Videogrammetry, i.e. electronic processing of video signals from a single source or from different sources to give parallax or range information

Abstract

The present disclosure provides a method for generating a map based on data collected by an unmanned aerial vehicle. The unmanned aerial vehicle includes an unmanned aerial vehicle body, a camera group arranged on the body, and a communication module group arranged on the body. The method for generating a map includes: receiving attitude information of the unmanned aerial vehicle, first mapping video data and second mapping video data sent by the unmanned aerial vehicle; and generating a three-dimensional map of the place to be mapped according to the attitude information of the unmanned aerial vehicle, the first mapping video data and the second mapping video data. The present disclosure also relates to a drone, an operating system of the drone, a computer-readable storage medium and an electronic device.

Description

Unmanned aerial vehicle, operating system thereof, method, medium and equipment for generating map
Technical Field
The invention relates to the field of unmanned aerial vehicle equipment, in particular to a method for generating a map based on data acquired by an unmanned aerial vehicle, the unmanned aerial vehicle, an operating system of the unmanned aerial vehicle, a computer readable storage medium and electronic equipment.
Background
With the development of unmanned aerial vehicle technology, drones have been widely applied to photography, exploration and surveying. However, when a drone is used to photograph and survey sites such as tunnels and mines, many limitations remain, and image information that accurately reflects such sites cannot be obtained.
Disclosure of Invention
An object of the present disclosure is to provide a method of generating a map based on data acquired by an unmanned aerial vehicle, an operating system of the unmanned aerial vehicle, a computer-readable storage medium, and an electronic device.
As an aspect of the present disclosure, a method for generating a map based on data collected by an unmanned aerial vehicle is provided, wherein the unmanned aerial vehicle comprises an unmanned aerial vehicle body, a camera group arranged on the unmanned aerial vehicle body, and a communication module group arranged on the unmanned aerial vehicle body, wherein the camera group comprises a first control camera, a first mapping camera and a second mapping camera, the first control camera is arranged at the front end of the unmanned aerial vehicle body, the first mapping camera and the second mapping camera are arranged at intervals on the unmanned aerial vehicle body,
the first control camera is used for shooting an image in front of the unmanned aerial vehicle body and generating first control video data and attitude information of the unmanned aerial vehicle;
the first mapping camera is used for shooting images of a place where the unmanned aerial vehicle passes and generating first mapping video data;
the second mapping camera is used for shooting images of a place where the unmanned aerial vehicle passes and generating second mapping video data;
the communication module group comprises a first communication module and a second communication module, the first communication module is used for sending the first control video data to a first operation device and a second operation device, and the second communication module is used for sending the first mapping video data and the second mapping video data to a mapping device;
the method of generating a map comprises:
receiving attitude information of the unmanned aerial vehicle, the first mapping video data and the second mapping video data sent by the unmanned aerial vehicle;
and generating a three-dimensional map of the to-be-mapped place according to the attitude information of the unmanned aerial vehicle, the first mapping video data and the second mapping video data.
Optionally, generating a three-dimensional map of the place to be mapped according to the attitude information of the drone, the first mapping video data, and the second mapping video data, includes:
adjusting images at corresponding shooting time in the first mapping video data and the second mapping video data according to the attitude information of the unmanned aerial vehicle at each shooting time to obtain an adjusted first video image and an adjusted second video image, wherein the rotation angles of the first video image and the second video image relative to a reference standard are consistent;
integrating the first video image and the second video image with the same shooting time to obtain mapping sub-images of all shooting times;
and sequencing all the mapping sub-images along the flight path of the unmanned aerial vehicle in time order to obtain the three-dimensional map.
Optionally, integrating the first video image and the second video image with the same shooting time to obtain the mapping sub-image at each shooting time includes:
determining image information of boundary elements of the place to be mapped in the first video image and the second video image;
and generating boundary elements of the place to be mapped so as to obtain the mapping sub-image.
As a second aspect of the present disclosure, an unmanned aerial vehicle is provided. The unmanned aerial vehicle includes an unmanned aerial vehicle body, a camera group arranged on the unmanned aerial vehicle body, and a communication module group arranged on the unmanned aerial vehicle body, wherein the camera group includes a first control camera, a first mapping camera and a second mapping camera, the first control camera is arranged at the front end of the unmanned aerial vehicle body, and the first mapping camera and the second mapping camera are arranged at intervals on the unmanned aerial vehicle body.
The first control camera is used for shooting an image in front of the unmanned aerial vehicle body and generating first control video data and attitude information of the unmanned aerial vehicle;
the first mapping camera is used for shooting images of places where the unmanned aerial vehicle passes and generating first mapping video data;
the second mapping camera is used for shooting images of places where the unmanned aerial vehicle passes and generating second mapping video data;
the communication module group comprises a first communication module and a second communication module, the first communication module is used for sending the first control video data to a first operation device and a second operation device, and the second communication module is used for sending the first mapping video data and the second mapping video data to a mapping device.
Optionally, the unmanned aerial vehicle further includes a rear camera arranged at the rear end of the unmanned aerial vehicle body. The rear camera is used for shooting an image behind the unmanned aerial vehicle body and generating second control video data, and the first communication module is used for sending the second control video data to the first operating device.
Optionally, the first control camera is further configured to adjust a shooting direction of the first control camera according to the received first control camera control signal, and control the rear camera to move cooperatively.
Optionally, the first control camera includes a binocular camera.
Optionally, at least one of the first and second mapping cameras is a 360 degree panoramic camera.
Optionally, the first mapping camera and the second mapping camera are both 360-degree panoramic cameras, a first receiving recess and a second receiving recess are formed on the unmanned aerial vehicle body and arranged at intervals, the first mapping camera is arranged in the first receiving recess, and the second mapping camera is arranged in the second receiving recess.
Optionally, the drone comprises a storage device for storing the first mapping video data and the second mapping video data.
Optionally, the drone further comprises an auxiliary detection module and a MEMS gyroscope-accelerometer;
the auxiliary detection module is used for sending out detection signals so as to determine the position information of the unmanned aerial vehicle;
the MEMS gyroscope accelerator is used for determining the attitude information of the unmanned aerial vehicle, and the first communication module is further used for sending the attitude information of the unmanned aerial vehicle to the first operating device and the second operating device.
Optionally, the unmanned aerial vehicle further comprises a light emitting element, and the light emitting element is arranged on the unmanned aerial vehicle body.
Optionally, the first communication module is further configured to receive a camera group control signal sent by the second operating device, where the camera group control signal is used to control cameras in the camera group.
As a third aspect of the present disclosure, there is provided a drone operating system for operating the drone provided by the second aspect of the present disclosure, the drone operating system comprising a first operating device and a second operating device.
The first operating device comprises a third communication module and a first display module, the first operating device is used for sending an operating signal to the unmanned aerial vehicle through the third communication module, the first operating device is also used for receiving the first control video data through the third communication module, and the first display module is used for displaying according to the first control video data;
the second operation device comprises a fourth communication module and a second display module, the fourth communication module is used for receiving the first control video data, and the second display module is used for displaying according to the first control video data.
Optionally, the first display module comprises a VR headset for VR display according to the first control video data.
Optionally, the unmanned aerial vehicle includes a rear camera, the rear camera is configured to capture an image behind the unmanned aerial vehicle body and generate second control video data, and the first communication module is configured to send the second control video data to the first operating device;
the VR helmet is used for displaying according to the second control video data, and a display picture corresponding to the second control video data is positioned above a display picture corresponding to the first control video data.
Optionally, the second operating device is further configured to send a camera group control signal to the unmanned aerial vehicle through the fourth communication module, where the camera group control signal is used to control cameras in the camera group.
As a fourth aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon an executable program capable of implementing the method provided by the first aspect of the present disclosure when the executable program is called.
As a fifth aspect of the present disclosure, there is provided an electronic apparatus comprising:
a storage module having an executable program stored thereon;
one or more processors capable of implementing the method provided by the first aspect of the disclosure when the one or more processors invoke the executable program.
When a person watches a nearby object for a long time, the eyes tire of accommodating, and one eyeball may drift outward so that the two eyes can no longer converge (i.e., the two eyeballs can no longer fixate on one target together), blurring vision. Accordingly, if the pilot stares at the first display module at close range for a long time, the same accommodation fatigue and blurred vision occur. In a relatively closed environment such as a tunnel, a single mishandling caused by such fatigue can make the drone strike the tunnel wall and ultimately fail. An assistant can also watch the image corresponding to the first control video data through the second display module of the second operating device, and can warn the pilot as soon as the drone deviates from the preset route and risks hitting a wall.
After receiving the assistant's warning, the pilot can correct the drone's flight route or flight attitude through the operating device, ensuring that the drone is not damaged by a collision during flight.
The method of generating a map provided in the present disclosure is based on data provided by a drone. Since the drone flies forward, the first mapping video data and the second mapping video data can include all the information of the site through which the drone passed. The deflection angles of the frames in the first mapping video data and in the second mapping video data relative to the vertical and horizontal directions can be determined from the attitude information of the drone. By adjusting the frames in both mapping video streams to the vertical and horizontal directions according to the attitude information and stitching the adjusted images, a three-dimensional map of the site through which the drone flew (i.e., the place to be mapped) can be obtained.
The unmanned aerial vehicle provided by the present disclosure includes not only the first control camera but also the first mapping camera and the second mapping camera arranged on the side of the unmanned aerial vehicle body. Both mapping cameras shoot images of the site through which the drone passes. Because the shooting angle of the first mapping camera differs from that of the second mapping camera, the two cameras arranged at intervals capture image information of the site more comprehensively. By processing and integrating the first mapping video data and the second mapping video data, a three-dimensional map of the site through which the drone passed can be derived, and this three-dimensional map presents the dimensional information of the place to be mapped more truly and completely.
The first mapping video data and the second mapping video data used for mapping carry a large amount of image information and occupy considerable bandwidth during transmission. In the embodiment provided by the present disclosure, the drone includes at least two communication modules, the first communication module and the second communication module, so that the video data used for drone operation (i.e., the first control video data) and the video data used for mapping (i.e., the first mapping video data and the second mapping video data) are transmitted through different communication modules. The mapping video data therefore does not occupy the bandwidth used to transmit the control video data, the first operating device and the second operating device can acquire the first control video data in time, the environment and state of the drone can be monitored in real time, and the drone can be operated accurately.
Drawings
Fig. 1 is a top view of a drone provided by the present disclosure;
fig. 2 is a bottom view of a drone provided by the present disclosure;
fig. 3 is a schematic diagram of a drone operating system provided by the present disclosure;
fig. 4 is a schematic view of a VR headset of a first operating device;
FIG. 5 is a schematic view of an operating handle of the first operating device;
FIG. 6 is a flow chart of a method of generating a mapping image provided by the present disclosure;
FIG. 7 is a flowchart of one embodiment of step S320;
FIG. 8 is a flowchart of one embodiment of step S322.
Detailed Description
For those skilled in the art to better understand the technical solution of the present disclosure, the method for generating a map based on data collected by a drone, the operating system of the drone, the computer-readable storage medium, and the electronic device provided in the present disclosure are described in detail below with reference to the accompanying drawings.
Example embodiments will be described more fully hereinafter with reference to the accompanying drawings; they may, however, be embodied in different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Embodiments of the present disclosure and features of embodiments may be combined with each other without conflict.
As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising", when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and the present disclosure, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As a first aspect of the present disclosure, a method of generating a map based on data collected by a drone is provided. As shown in fig. 1 and 2, the unmanned aerial vehicle includes an unmanned aerial vehicle body 100, a camera group disposed on the unmanned aerial vehicle body 100, and a communication module group disposed on the unmanned aerial vehicle body 100, wherein the camera group includes a first control camera 210, a first surveying and mapping camera 221, and a second surveying and mapping camera 222, the first control camera 210 is disposed at the front end of the unmanned aerial vehicle body 100, and the first surveying and mapping camera 221 and the second surveying and mapping camera 222 are disposed at intervals on the unmanned aerial vehicle body 100.
The first control camera 210 is used for shooting an image in front of the unmanned aerial vehicle body 100 and generating the first control video data and the attitude information of the unmanned aerial vehicle.
The first mapping camera 221 is configured to capture an image of a place where the drone passes and generate first mapping video data.
The second mapping camera 222 is used to capture images of the location where the drone passes and generate second mapping video data.
As shown in fig. 3, the communication module group includes a first communication module 410 and a second communication module 420, the first communication module 410 is configured to transmit the first control video data to a first operating device 510 and a second operating device 520, and the second communication module 420 is configured to transmit the first mapping video data and the second mapping video data to a mapping device 600.
As shown in fig. 6, the method includes:
in step S310, receiving the attitude information of the unmanned aerial vehicle, the first mapping video data and the second mapping video data sent by the unmanned aerial vehicle, where the unmanned aerial vehicle is the one provided in the second aspect of the present disclosure, the attitude information is sent by the first communication module of the unmanned aerial vehicle, and the first mapping video data and the second mapping video data are both sent by the second communication module of the unmanned aerial vehicle;
in step S320, a three-dimensional map of the place to be mapped is generated according to the attitude information of the drone, the first mapping video data, and the second mapping video data.
Since the drone flies forward, the first mapping video data and the second mapping video data can include all the information of the site through which the drone passes. The deflection angles of the frames in the first mapping video data and in the second mapping video data relative to the vertical and horizontal directions can be determined from the attitude information of the drone. By adjusting the frames in both mapping video streams to the vertical and horizontal directions according to the attitude information and stitching the adjusted images, a three-dimensional map of the site through which the drone flew (i.e., the place to be mapped) can be obtained.
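The attitude-based adjustment described above amounts to rotating each frame by the inverse of the drone's roll so that both mapping streams share one reference orientation. A minimal sketch in Python (the function name and the point-list representation of a frame are illustrative assumptions, not from the patent; a real implementation would warp whole images, e.g. with an affine transform):

```python
import math

def align_frame(frame_points, roll_deg):
    """Rotate 2D pixel coordinates by -roll so the frame's vertical axis
    matches the gravity reference. frame_points is a list of (x, y)
    coordinates standing in for a full image."""
    theta = math.radians(-roll_deg)
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y) for x, y in frame_points]
```

For example, a frame captured while the drone was rolled 90 degrees is counter-rotated by 90 degrees, after which frames from both mapping cameras have a consistent rotation angle relative to the reference standard.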
The method of generating the mapping may be performed by a dedicated mapping device, and thus the mapping device may communicate with the first communication module and the second communication module of the drone to obtain the drone pose information, the first mapping video data, and the second mapping video data.
In the present disclosure, how to specifically perform step S320 is not particularly limited. As an alternative implementation, as shown in fig. 7, step S320 may specifically include:
in step S321, adjusting images at corresponding shooting times in the first mapping video data and the second mapping video data according to the pose information of the drone at each shooting time to obtain an adjusted first video image and an adjusted second video image, where rotation angles of the first video image and the second video image with respect to a reference standard are consistent;
in step S322, integrating the first video image and the second video image with the same shooting time to obtain mapping sub-images at each shooting time;
in step S323, all the mapping sub-images are sequenced along the flight trajectory of the drone in time order to obtain the three-dimensional map.
The "reference standard" above may be a coordinate system formed by the vertical and horizontal directions, or another coordinate system. After the first mapping camera and the second mapping camera are turned on, as the drone flies, at any given moment the two cameras shoot images of different parts of the same scene; by stitching the images of different parts taken at the same moment, all the graphic information of that scene can be obtained. That is, the mapping sub-image at any moment can reflect all the graphic information at that position of the site to be surveyed.
After the drone completes its flight, all the images taken at every shooting time are adjusted to obtain the mapping sub-images, and the mapping sub-images are sequenced along the flight trajectory of the drone in time order; the resulting three-dimensional image information is the three-dimensional map of the place to be mapped.
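The integration and sequencing steps above (pairing frames that share a shooting time, then ordering the resulting sub-images along the trajectory) can be sketched as follows. This is a simplified illustration only; the dict-of-timestamps representation of each video stream is an assumption, not the patent's data format:

```python
def build_map_slices(first_frames, second_frames):
    """first_frames / second_frames: dicts mapping shooting time -> frame.
    Pair the two streams at each common shooting time (the 'integration'
    step), then return the pairs sorted by time, which orders the mapping
    sub-images along the flight trajectory (the 'sequencing' step)."""
    common_times = sorted(set(first_frames) & set(second_frames))
    return [(t, (first_frames[t], second_frames[t])) for t in common_times]
```

Frames without a counterpart in the other stream are dropped here; a production pipeline would more likely interpolate attitude and timestamps rather than discard data.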
Typically, the map is a line drawing. The lines in the map are usually boundary elements of the site to be mapped, such as the upper edge of a wall, the lower edge of a wall, or a corner where walls meet. Correspondingly, as shown in fig. 8, step S322 may specifically include:
in step S322a, determining image information of a boundary element of the place to be mapped in the first video image and the second video image;
in step S322b, generating the boundary element of the place to be mapped to obtain the mapping sub-image.
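A crude one-dimensional stand-in for the boundary-element determination of step S322a: scan a row of pixel intensities for jumps above a threshold, which is roughly where a wall edge would register in a video image. The function and the threshold value are illustrative assumptions; a real system would use a proper 2D edge detector:

```python
def boundary_columns(row, threshold=50):
    """Return the indices in a row of grayscale intensities where the
    jump between adjacent pixels exceeds the threshold — a minimal
    analogue of locating boundary elements such as wall edges."""
    return [i for i in range(1, len(row))
            if abs(row[i] - row[i - 1]) > threshold]
```

Joining the boundary points found across rows and frames would yield the line elements from which the map's line drawing is generated in step S322b.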
As a second aspect of the present disclosure, a drone is provided. As shown in fig. 1 and 2, the unmanned aerial vehicle includes an unmanned aerial vehicle body 100, a camera group disposed on the unmanned aerial vehicle body 100, and a communication module group disposed on the unmanned aerial vehicle body 100, wherein the camera group includes a first control camera 210, a first surveying and mapping camera 221, and a second surveying and mapping camera 222, the first control camera 210 is disposed at the front end of the unmanned aerial vehicle body 100, and the first surveying and mapping camera 221 and the second surveying and mapping camera 222 are disposed at intervals on the unmanned aerial vehicle body 100.
The first control camera 210 is used for shooting an image in front of the unmanned aerial vehicle body 100 and generating the first control video data and the attitude information of the unmanned aerial vehicle.
The first mapping camera 221 is configured to capture images of a place where the drone passes and generate first mapping video data.
The second mapping camera 222 is used to capture images of the location where the drone passes and generate second mapping video data.
As shown in fig. 3, the communication module group includes a first communication module 410 and a second communication module 420, the first communication module 410 is configured to transmit the first control video data to a first operating device 510 and a second operating device 520, and the second communication module 420 is configured to transmit the first mapping video data and the second mapping video data to a mapping device 600.
As described above, the drone provided by the present disclosure corresponds to two operating devices, one being the first operating device 510 and the other being the second operating device 520. One of the first operating device 510 and the second operating device 520 is a main operating device, and the other is an auxiliary operating device.
For convenience of description, the working principle of the drone and its operating system provided by the present disclosure is described below taking the first operating device 510 as the main operating device. The first operating device 510 is operated by a pilot; that is, the pilot can send control signals to the drone through the first operating device 510 to control flight parameters such as flight attitude, flight speed and flight altitude. The first operating device 510 has a first display module; after the first operating device 510 receives, through the first communication module 410, the first control video data sent by the drone, the first display module displays it, so that the pilot can see the drone's current environment and the environment ahead of it and can operate the drone accordingly.
The second operating device 520 is operated by an assistant and includes a second display module. After receiving the first control video data, the second operating device displays it. That is, the first display module of the first operating device 510 and the second display module of the second operating device 520 display the same content.
When a person watches a close-distance object for a long time, eyeballs are tired to adjust, and the phenomenon that one eyeball inclines outwards and two eyes cannot be kept converged (namely, two eyeballs cannot watch one target together) occurs, so that the visual objects are blurred. Accordingly, if the flyer looks at the first display module in a close range for a long time to watch the image, the above phenomena of eyeball regulation fatigue and blurred vision also occur. For a relatively closed environment such as a tunnel, a flyer may mishandle once it is in mind, causing the drone to strike the tunnel wall and eventually break down. The assistant personnel can also watch the image corresponding to the video data for the first control through a second display module of the second operating device, and once the unmanned aerial vehicle deviates from a preset air route and the wall collision risk is found, the assistant personnel can remind the flyer.
After receiving the assistant's warning, the flyer can correct the flight route or flight attitude of the drone through the operating device, ensuring that the drone is not damaged by a collision during flight.
The drone provided by the present disclosure includes not only the first control camera 210 but also the first mapping camera 221 and the second mapping camera 222 disposed on the side of the drone body 100. Both the first mapping camera 221 and the second mapping camera 222 are used for shooting images of the places the drone passes through. Because the shooting angle of the first mapping camera 221 differs from that of the second mapping camera 222, the first mapping camera 221 and the second mapping camera 222, arranged at an interval, can obtain more comprehensive image information of the places the drone passes through. By processing and integrating the first mapping video data and the second mapping video data, a three-dimensional map of the places the drone passes through can be derived, and this three-dimensional map can present the dimension information of the place to be mapped more truly and completely.
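The disclosure does not fix a particular integration algorithm. As a minimal sketch of the idea of merging the two mapping streams under a shared orientation (assuming, purely for illustration, that per-frame features have already been extracted as 2-D point sets and that attitude is reduced to a single roll angle):

```python
import numpy as np

def derotate(points, roll_rad):
    """Rotate feature points back by the drone's roll so both
    mapping streams share one reference orientation."""
    c, s = np.cos(-roll_rad), np.sin(-roll_rad)
    R = np.array([[c, -s], [s, c]])
    return points @ R.T

def build_map(frames_cam1, frames_cam2, attitude):
    """frames_cam*: {timestamp: (N, 2) feature arrays};
    attitude: {timestamp: roll angle in radians}.
    Returns mapping sub-images ordered along the flight path (by time)."""
    sub_images = []
    for t in sorted(frames_cam1):
        a = attitude[t]
        # integrate the two adjusted views shot at the same time
        merged = np.vstack([derotate(frames_cam1[t], a),
                            derotate(frames_cam2[t], a)])
        sub_images.append((t, merged))
    return sub_images
```

A production pipeline would of course work on full panoramic frames and a three-axis attitude, but the structure — derotate per attitude, merge per timestamp, order along the flight path — is the same.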
The first mapping video data and the second mapping video data used for mapping carry a large amount of image information and occupy a large bandwidth during transmission. As shown in fig. 3, in the embodiment provided by the present disclosure, the drone includes at least two communication modules, namely a first communication module 410 and a second communication module 420, so that the video data for drone operation (i.e., the first control video data) and the video data for mapping (i.e., the first mapping video data and the second mapping video data) can be transmitted through different communication modules. That is, the video data for mapping does not occupy the bandwidth used for transmitting the video data for drone operation, so that the first operating device 510 and the second operating device 520 can acquire the first control video data in time, the environment where the drone is located and the state of the drone can be monitored in real time, and the drone can be operated accurately.
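The separation of the two traffic classes can be sketched as a simple router (the channel names and frame-kind labels are our own illustration; the disclosure only requires that control and mapping video travel over separate communication modules):

```python
from queue import Queue

# Hypothetical stand-ins for the two physical communication modules.
CHANNELS = {"control": Queue(), "mapping": Queue()}

def route_frame(frame_kind, payload):
    """Send control frames over the first module's channel and mapping
    frames over the second, so heavy mapping traffic never delays the
    low-latency control stream."""
    if frame_kind in ("first_control", "second_control"):
        CHANNELS["control"].put(payload)
    elif frame_kind in ("first_mapping", "second_mapping"):
        CHANNELS["mapping"].put(payload)
    else:
        raise ValueError(f"unknown frame kind: {frame_kind}")
```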
In the present disclosure, the specific type of mapping device is not particularly limited. As an alternative, the mapping device may be an electronic device with computing capabilities that is independent of the first operating device 510 and the second operating device 520.
To further help the flyer understand the environment in which the drone is located, optionally, the drone can also include a rear camera disposed at the rear end of the drone body. The rear camera is used for shooting images behind the drone body and generating second control video data, and the first communication module is used for sending the second control video data to the first operating device.
Upon receiving the second control video data, the first operating device 510 can display it. Through the display image corresponding to the second control video data, the flyer can see the environment behind the drone, which helps the flyer judge the environment around the drone more accurately.
As an alternative embodiment, the first display module of the first operating device 510 may include a VR headset (see fig. 4), and the first operating device 510 may control the VR headset to perform VR display. When the flyer wears the VR headset, a VR image corresponding to the first control video data may be displayed directly in front of the flyer, and a VR image corresponding to the second control video data may be displayed above it, which facilitates viewing.
In the present disclosure, the first control camera 210 is movable. The flyer can send a first control camera control signal to the drone through the first operating device 510 or the second operating device 520, so that the first control camera 210 adjusts its shooting direction according to the received signal and the rear camera is controlled to move cooperatively. That is, when the first control camera 210 moves, the rear camera also moves in coordination, ensuring that the rear camera captures an image corresponding to that of the first control camera.
In the present disclosure, the shooting angle of each camera in the camera group may be controlled by the first operating device 510 or by the second operating device 520. As a preferred embodiment, the assistant sends the camera group control signal to the first communication module 410 through the second operating device 520, so that the flyer can concentrate on controlling the drone and avoid collision or damage when flying in a closed environment. On this basis, the assistant can obtain more comprehensive video data by controlling, through the second operating device 520, the shooting angle of each camera in the camera group, especially the angles of the first mapping camera 221 and the second mapping camera 222, without disturbing the flyer, which improves the safety of the drone during mapping.
Correspondingly, the drone can receive the camera group control signal through the first communication module 410, and the controller of the drone controls the corresponding camera according to the camera group control signal.
In the present disclosure, the specific type of the first control camera 210 is not particularly limited. When the first display module of the first operating device 510 is a VR headset, the first control camera may be a binocular camera. The effect of the images shot by a binocular camera is similar to direct viewing with human eyes. When the VR image is formed from the first control video data obtained by a binocular camera, the flyer can perceive the environment of the drone more realistically and thus control the drone more accurately.
In order to obtain more accurate mapping information, optionally, at least one of the first mapping camera 221 and the second mapping camera 222 is a 360-degree panoramic camera.
To obtain more comprehensive venue video image information, optionally, the first mapping camera 221 and the second mapping camera 222 are both 360 degree panoramic cameras.
Of course, the present disclosure is not limited thereto; for example, one of the first mapping camera 221 and the second mapping camera 222 may be a 360-degree panoramic camera while the other is not. For example, the first mapping camera 221 may be a 360-degree panoramic camera, and the second mapping camera 222 may be a camera group including a second upper camera and a second lower camera, wherein the shooting direction of the second upper camera faces above the drone, and the shooting direction of the second lower camera faces below the drone. In such embodiments, the second mapping camera, including the second upper camera and the second lower camera, may simulate a binocular camera. In order to obtain more comprehensive video information, optionally, both the second upper camera and the second lower camera are wide-angle cameras. Preferably, the shooting angle of each wide-angle camera is not less than 120 degrees, and further preferably 150 degrees, so that more comprehensive video information can be obtained and the stereoscopic map can be generated more easily from the mapping data provided by the first mapping camera 221 and the second mapping camera 222.
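As an illustrative calculation (our own, not from the disclosure) of why a larger shooting angle is preferred: with one wide-angle camera facing straight up and one facing straight down, each with field of view f, the pair covers 2f of the 360-degree vertical circle, leaving two uncovered bands of (360 − 2f)/2 degrees around the horizon:

```python
def uncovered_horizon_band(fov_deg):
    """Width in degrees of each of the two uncovered bands around the
    horizon when two cameras with the given field of view face straight
    up and straight down."""
    covered = min(2 * fov_deg, 360)
    return max((360 - covered) / 2, 0)
```

At 120 degrees each band is 60 degrees wide; at 150 degrees it shrinks to 30 degrees, which may explain why the larger angle is further preferred.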
In the present disclosure, how the first surveying and mapping camera 221 and the second surveying and mapping camera 222 are disposed on the drone body is not particularly limited. As an optional embodiment, a first receiving groove and a second receiving groove are formed on the drone body 100, and the first receiving groove and the second receiving groove are arranged at intervals, the first surveying and mapping camera 221 is arranged in the first receiving groove, and the second surveying and mapping camera 222 is arranged in the second receiving groove.
In the present disclosure, the depth directions of the first receiving groove and the second receiving groove are not particularly limited, as long as the first mapping camera 221 and the second mapping camera 222 can be accommodated and the images captured by them are not affected. As an optional embodiment, the depth directions of both the first receiving groove and the second receiving groove may be consistent with the thickness direction of the drone body. The first receiving groove and the second receiving groove may be through grooves that penetrate the drone body, or sunken grooves that do not penetrate the drone body.
It should be noted that when the first mapping camera 221 and the second mapping camera 222 are 360-degree panoramic cameras and are disposed in the first receiving groove and the second receiving groove respectively, the part of the first mapping camera 221 exposed from the first receiving groove and the part of the second mapping camera 222 exposed from the second receiving groove are sufficient for panoramic shooting without being excessively blocked. In the present disclosure, when the first mapping camera 221 and the second mapping camera 222 are both 360-degree panoramic cameras, their shooting angles can be controlled to obtain more comprehensive mapping images, compensating for the limited attitude changes of the drone, so that a three-dimensional map reflecting the place to be mapped more truly and comprehensively can be generated.
For example, the diameter of the narrowest part of a mine may be only 60 cm or even less, while the width of the drone is about 40 cm. Therefore, when the drone flies in a mine, it cannot make many attitude changes. Because the first mapping camera 221 and the second mapping camera 222 are 360-degree panoramic cameras, the mine can still be shot comprehensively, and a three-dimensional map that more truly and comprehensively reflects the place to be mapped can be generated from the mapping images.
As an alternative embodiment, the drone may include a storage device disposed on the drone body 100 and configured to store the first mapping video data and the second mapping video data. The drone keeps the first mapping video data and the second mapping video data in the storage device; after the drone finishes mapping the place to be mapped, the stored first mapping video data and second mapping video data are exported and processed, and a three-dimensional map of the whole place to be mapped can be generated.
In the present disclosure, the storage device may serve as a backup: in case of a communication failure of the second communication module, the three-dimensional map may be generated using the first mapping video data and the second mapping video data stored in the storage device.
In this disclosure, there is no particular limitation on how to locate the drone. For example, a GPS positioning module may be included on the drone. However, the drone provided by the present disclosure is intended for mapping closed environments, where GPS signals are weak. To facilitate positioning the drone in a closed environment, as an optional implementation, the drone can also include an auxiliary detection module. The auxiliary detection module is used for sending out detection signals so as to determine the position information of the drone.
In the present disclosure, there is no particular limitation on how to determine the attitude information of the drone. Optionally, the drone may include a MEMS gyroscope/accelerometer, through which the attitude information of the drone may be determined. As an optional implementation, the attitude information of the drone may be transmitted to the first operating device and the second operating device through the first communication module.
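The disclosure leaves the estimation method open; a common way to derive attitude from a MEMS gyroscope/accelerometer is a complementary filter, sketched below for a single roll axis (the gain `alpha` and the sensor layout are our assumptions, not taken from the disclosure):

```python
import math

def complementary_roll(roll_prev, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """Fuse the gyroscope's fast but drifting integration with the
    accelerometer's noisy but drift-free gravity reference."""
    roll_gyro = roll_prev + gyro_rate * dt      # integrate angular rate
    roll_accel = math.atan2(accel_y, accel_z)   # gravity-based estimate
    return alpha * roll_gyro + (1 - alpha) * roll_accel
```

Called once per sensor sample, the filter keeps short-term responsiveness from the gyroscope while the accelerometer term slowly corrects the accumulated drift.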
In this disclosure, both the attitude information of the drone and the first control video data are sent to the first operating device and the second operating device through the first communication module, which helps the flyer and the assistant obtain the attitude information of the drone in time and helps the flyer judge the attitude of the drone accurately.
In order to facilitate the flying of the drone in the closed environment, as an optional embodiment, the drone further comprises a light emitting element 310, the light emitting element 310 being disposed on the drone body 100. In the present disclosure, how to control the light emitting element 310 to emit light is not particularly limited. For example, a light sensing device may be provided on the drone, and the control device of the drone controls the light emitting element to emit light when the light sensing device determines that the ambient brightness in which the drone is located is lower than a predetermined brightness.
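The ambient-brightness control just described amounts to a threshold comparison; as a minimal sketch (the threshold value and the sensor interface are our assumptions, not specified by the disclosure):

```python
PREDETERMINED_BRIGHTNESS_LUX = 50.0  # assumed threshold value

def update_light(ambient_lux, predetermined=PREDETERMINED_BRIGHTNESS_LUX):
    """Return True (light on) when the sensed ambient brightness falls
    below the predetermined brightness, as the control device would
    decide for the light emitting element."""
    return ambient_lux < predetermined
```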
In the present disclosure, the specific type of the light emitting element is not particularly limited. In order to provide sufficient shooting light for the camera group, optionally, the light emitting element may include a light strip disposed around the drone body.
In this disclosure, the light strip can be disposed on the side surface of the drone body, the front end surface of the drone body, and the rear end surface of the drone body. Of course, the light strip can also be disposed at other positions on the drone body.
In the embodiment shown in fig. 1 and 2, the light emitting elements are LEDs. Light emitting elements 310 are respectively disposed on the four rounded corners of the drone body, so as to provide the necessary light source for the flight and video shooting of the drone.
In the present disclosure, there is no particular limitation on how to control the flight parameters of the drone. For example, the drone operation signal may be transmitted to the first communication module 410 of the drone through the first operating device 510 or the second operating device 520. A controller is disposed in the drone body 100; after receiving the drone operation signal, the controller can control the drone to fly according to the flight parameters carried in the signal.
As a third aspect of the present disclosure, there is provided a drone operating system for operating a drone provided by the first aspect of the present disclosure, the drone operating system comprising a first operating device 510 and a second operating device 520, as shown in fig. 3.
The first operating device 510 includes a third communication module and a first display module, the first operating device 510 is configured to send an operation signal to the unmanned aerial vehicle through the third communication module, and the first operating device 510 is further configured to receive the first control video data through the third communication module, and the first display module is configured to display according to the first control video data. It is noted that on the drone side, the operation signal may be received by the first communication module 410 of the drone.
The second operating device 520 includes a fourth communication module for receiving the first control video data and a second display module for displaying according to the first control video data.
As described above, the first operating device 510 may be used as the main operating device and is operated by the flyer; that is, the flyer may send a control signal to the drone through the first operating device 510 to control flight parameters such as the flight attitude, flight speed, and flight altitude of the drone. The first operating device 510 has a first display module; after the first operating device receives, through the first communication module, the first control video data sent by the drone, the first display module can display according to the first control video data, so that the flyer can clearly see the environment in which the drone is currently located and the environment in front of the drone, and can operate the drone accordingly.
The second operating device 520 is operated by an assistant and includes a second display module. Upon receiving the first control video data, the second operating device 520 can display it. That is, the contents displayed by the first display module of the first operating device 510 and the second display module of the second operating device 520 may be the same.
When a person watches a close object for a long time, the eyeballs become fatigued from accommodation; one eyeball may drift outward so that the two eyes can no longer stay converged (that is, the two eyeballs can no longer fixate on one target together), and vision becomes blurred. Accordingly, if the flyer stares at the first display module at close range for a long time, the same accommodation fatigue and blurred vision can occur. In a relatively closed environment such as a tunnel, a momentary lapse of attention may cause the flyer to mishandle the drone, so that the drone strikes the tunnel wall and is ultimately destroyed. The assistant can also watch the image corresponding to the first control video data through the second display module of the second operating device, and can remind the flyer as soon as the drone deviates from the preset route or a wall-collision risk is detected.
After receiving the assistant's warning, the flyer can correct the flight route or flight attitude of the drone through the operating device, ensuring that the drone is not damaged by a collision during flight.
In the present disclosure, the first display module is not particularly limited, and for example, as shown in fig. 4, the first display module of the first operating device may include a display panel. As described above, the first display module may include a VR headset for VR display according to the first control-purpose video data.
When the first display module of the first operating device serving as the main operating device includes a VR headset, the second display module of the second operating device serving as the auxiliary operating device may be a display panel. The display panel may be a display panel such as a liquid crystal display panel, an LED display panel, or the like.
In addition to the VR headset, the first operating device may also include an operating handle as shown in fig. 5. The flyer controls the unmanned aerial vehicle through the operating handle. Of course, the present disclosure is not so limited.
As an optional implementation, the drone includes a rear camera, and the second control video data is shot by the rear camera. When the first display module is a VR headset, it can also display according to the second control video data, with the display picture corresponding to the second control video data located above the display picture corresponding to the first control video data. The flyer may use the picture corresponding to the second control video data as a reference picture for controlling the flight of the drone.
As described above, to ensure that the flyer concentrates on flying the drone, in the present disclosure the camera group on the drone may be controlled by the second operating device. That is to say, the second operating device is also used for sending a camera group control signal to the drone through the fourth communication module, the camera group control signal being used for controlling the cameras in the camera group.
As a fourth aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon an executable program capable of implementing the method provided by the first aspect of the present disclosure when the executable program is called.
As a fifth aspect of the present disclosure, there is provided an electronic apparatus comprising:
a storage module having an executable program stored thereon;
one or more processors capable of implementing the method provided by the first aspect of the disclosure when the one or more processors invoke the executable program.
One of ordinary skill in the art will appreciate that all or some of the steps in the methods, systems, and functional modules/units in the apparatus disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed by several physical components in cooperation. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, as is well known to those of ordinary skill in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. 
In addition, communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media as known to those skilled in the art.
Example embodiments have been disclosed herein, and although specific terms are employed, they are used and should be interpreted in a generic and descriptive sense only and not for purposes of limitation. In some instances, features, characteristics and/or elements described in connection with a particular embodiment may be used alone or in combination with features, characteristics and/or elements described in connection with other embodiments, unless expressly stated otherwise, as would be apparent to one skilled in the art. Accordingly, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the disclosure as set forth in the appended claims.

Claims (19)

1. A method for generating a map based on data collected by an unmanned aerial vehicle, wherein the unmanned aerial vehicle comprises an unmanned aerial vehicle body, a camera group arranged on the unmanned aerial vehicle body, and a communication module group arranged on the unmanned aerial vehicle body, the camera group comprises a first control camera, a first mapping camera and a second mapping camera, the first control camera is arranged at the front end of the unmanned aerial vehicle body, and the first mapping camera and the second mapping camera are arranged on the unmanned aerial vehicle body at intervals,
the first control camera is used for shooting an image in front of the unmanned aerial vehicle body and generating first control video data and attitude information of the unmanned aerial vehicle;
the first mapping camera is used for shooting images of places where the unmanned aerial vehicle passes and generating first mapping video data;
the second mapping camera is used for shooting images of places where the unmanned aerial vehicle passes and generating second mapping video data;
the communication module group comprises a first communication module and a second communication module, the first communication module is used for sending the first control video data to a first operation device and a second operation device, the second communication module is used for sending the first mapping video data and the second mapping video data to a mapping device, the first operation device comprises a third communication module and a first display module, the first operation device is used for sending an operation signal to the unmanned aerial vehicle through the third communication module, the first operation device is also used for receiving the first control video data through the third communication module, and the first display module is used for displaying according to the first control video data; the second operation device comprises a fourth communication module and a second display module, the fourth communication module is used for receiving the first control video data, and the second display module is used for displaying according to the first control video data;
the method of generating a map comprises:
receiving attitude information of the unmanned aerial vehicle, the first mapping video data and the second mapping video data sent by the unmanned aerial vehicle;
and generating a three-dimensional map of the place to be mapped according to the attitude information of the unmanned aerial vehicle, the first mapping video data and the second mapping video data.
2. The method of claim 1, wherein generating a three-dimensional map of the place to be mapped from the attitude information of the drone, the first mapping video data, and the second mapping video data comprises:
adjusting images at corresponding shooting time in the first mapping video data and the second mapping video data according to the attitude information of the unmanned aerial vehicle at each shooting time to obtain an adjusted first video image and an adjusted second video image, wherein the rotation angles of the first video image and the second video image relative to a reference standard are consistent;
integrating the first video image and the second video image with the same shooting time to obtain mapping sub-images of all shooting times;
and sequencing all mapping sub-images along the flight path of the unmanned aerial vehicle in time order to obtain the three-dimensional map.
3. The method of claim 2, wherein integrating the first video image and the second video image with the same shooting time to obtain the mapping sub-image of each shooting time comprises:
determining image information of boundary elements of the place to be mapped in the first video image and the second video image;
and generating boundary elements of the place to be mapped so as to obtain the mapping sub-image.
4. An unmanned aerial vehicle, the unmanned aerial vehicle comprises an unmanned aerial vehicle body, a camera group arranged on the unmanned aerial vehicle body and a communication module group arranged on the unmanned aerial vehicle body, and is characterized in that the camera group comprises a first control camera, a first surveying and mapping camera and a second surveying and mapping camera, the first control camera is arranged at the front end of the unmanned aerial vehicle body, the first surveying and mapping camera and the second surveying and mapping camera are arranged on the unmanned aerial vehicle body at intervals,
the first control camera is used for shooting an image in front of the unmanned aerial vehicle body and generating first control video data and attitude information of the unmanned aerial vehicle;
the first mapping camera is used for shooting images of places where the unmanned aerial vehicle passes and generating first mapping video data;
the second mapping camera is used for shooting images of places where the unmanned aerial vehicle passes and generating second mapping video data;
the communication module group includes a first communication module for transmitting the first control video data to a first operating device and a second operating device, and a second communication module for transmitting the first mapping video data and the second mapping video data to a mapping device, wherein,
the first operating device comprises a third communication module and a first display module, the first operating device is used for sending an operating signal to the unmanned aerial vehicle through the third communication module, the first operating device is also used for receiving the first control video data through the third communication module, and the first display module is used for displaying according to the first control video data; the second operation device comprises a fourth communication module and a second display module, the fourth communication module is used for receiving the first control video data, and the second display module is used for displaying according to the first control video data.
5. The unmanned aerial vehicle of claim 4, further comprising a rear camera disposed at a rear end of the unmanned aerial vehicle body, the rear camera being configured to capture images behind the unmanned aerial vehicle body and generate second control video data, and the first communication module being configured to send the second control video data to the first operating device.
6. The UAV of claim 5, wherein the first control camera is further configured to adjust a shooting direction of the first control camera according to the received first control camera control signal, and control the rear camera to move cooperatively.
7. The drone of claim 5, wherein the first control camera comprises a binocular camera.
8. A drone as claimed in any one of claims 4 to 7, wherein each of the first and second mapping cameras is a 360 degree panoramic camera; or alternatively,
first survey and drawing camera is 360 degrees panorama cameras, the second survey and drawing camera includes camera and second lower camera on the second, the shooting direction orientation of camera on the second unmanned aerial vehicle's top, the shooting direction orientation of camera under the second unmanned aerial vehicle's below.
9. The unmanned aerial vehicle of claim 8, wherein the unmanned aerial vehicle body has a first receiving groove and a second receiving groove formed thereon, the first receiving groove and the second receiving groove being spaced apart, the first mapping camera being disposed in the first receiving groove, the second mapping camera being disposed in the second receiving groove.
10. A drone according to any one of claims 4 to 7, characterised in that the drone includes storage means for storing the first and second mapping video data.
11. A drone according to any one of claims 4 to 7, further comprising an auxiliary detection module and a MEMS gyroscope and accelerometer;
the auxiliary detection module is used for sending out detection signals so as to determine the position information of the unmanned aerial vehicle;
the MEMS gyroscope and accelerometer is used for determining the attitude information of the unmanned aerial vehicle, and the first communication module is further used for sending the attitude information of the unmanned aerial vehicle to the first operating device and the second operating device.
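Claim 11's attitude determination from a MEMS gyroscope and accelerometer is commonly done by fusing the two sensors, since the gyro drifts and the accelerometer is noisy. A minimal complementary-filter sketch for one axis (the function name, gains, and axis convention are assumptions, not from the patent):

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel, dt, alpha=0.98):
    """One update step of a complementary filter for pitch (radians).

    pitch_prev: previous pitch estimate (rad)
    gyro_rate:  pitch angular rate from the MEMS gyroscope (rad/s)
    accel:      (ax, ay, az) specific force from the accelerometer (m/s^2)
    alpha:      weight on the integrated gyro vs. the accelerometer tilt
    """
    ax, ay, az = accel
    # Tilt implied by gravity direction, valid when the vehicle is not accelerating.
    accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return alpha * (pitch_prev + gyro_rate * dt) + (1 - alpha) * accel_pitch

# Level, stationary vehicle: gravity along +z, no rotation.
# A stale 0.1 rad estimate decays toward the accelerometer's 0 rad reading.
pitch = 0.1
for _ in range(200):
    pitch = complementary_filter(pitch, gyro_rate=0.0,
                                 accel=(0.0, 0.0, 9.81), dt=0.01)
```

In practice an onboard flight controller would run this (or an extended Kalman filter) per axis before the attitude is transmitted to the operating devices.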
12. A drone according to any one of claims 4 to 7, further comprising a light emitting element provided on the drone body.
13. The unmanned aerial vehicle of any one of claims 4 to 7, wherein the first communication module is further configured to receive a camera group control signal sent by the second operating device, and the camera group control signal is configured to control cameras in the camera group.
14. A drone operating system for operating a drone according to any one of claims 4 to 13, the drone operating system comprising a first operating device and a second operating device,
the first operating device comprises a third communication module and a first display module, the first operating device is used for sending an operating signal to the unmanned aerial vehicle through the third communication module, the first operating device is also used for receiving the first control video data through the third communication module, and the first display module is used for displaying according to the first control video data;
the second operation device comprises a fourth communication module and a second display module, the fourth communication module is used for receiving the first control video data, and the second display module is used for displaying according to the first control video data.
15. The drone operating system of claim 14, wherein the first display module includes a VR headset for VR display of the first control-purpose video data.
16. The unmanned aerial vehicle operating system of claim 15, wherein the unmanned aerial vehicle comprises a rear camera for capturing images behind the unmanned aerial vehicle body and generating second control video data, and the first communication module is configured to send the second control video data to the first operating device;
the VR helmet is used for displaying according to the second control video data, and a display picture corresponding to the second control video data is positioned above a display picture corresponding to the first control video data.
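Claim 16 stacks the rear-camera picture above the front-camera picture inside the VR headset's view. A minimal layout sketch of that arrangement (the 1/4-height split and coordinate convention are assumptions, not specified by the patent):

```python
# Illustrative sketch of the claim-16 layout: the rear-camera picture is
# drawn in a strip above the front-camera picture inside one VR viewport.
def stacked_layout(width: int, height: int, rear_fraction: float = 0.25):
    """Return (rear_rect, front_rect) as (x, y, w, h) tuples, y=0 at the top."""
    rear_h = int(height * rear_fraction)
    rear_rect = (0, 0, width, rear_h)                 # rear view on top
    front_rect = (0, rear_h, width, height - rear_h)  # front view below
    return rear_rect, front_rect

rear, front = stacked_layout(1920, 1080)
print(rear)   # (0, 0, 1920, 270)
print(front)  # (0, 270, 1920, 810)
```

A real renderer would map these rectangles into each eye's projection; only the relative placement (rear above front) is what the claim specifies.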
17. A drone operating system according to any one of claims 14 to 16, wherein the second operating means is further arranged to send camera group control signals to the drone via the fourth communication module, the camera group control signals being arranged to control the cameras in the camera group.
18. A computer-readable storage medium having an executable program stored thereon, wherein the executable program, when invoked, is capable of implementing the method of any one of claims 1 to 3.
19. An electronic device, comprising:
a storage module having an executable program stored thereon;
one or more processors capable of implementing the method of any one of claims 1 to 3 when said executable program is called by said one or more processors.
CN202210764782.0A 2022-07-01 2022-07-01 Unmanned aerial vehicle, operating system thereof, method, medium and equipment for generating map Active CN114877872B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210764782.0A CN114877872B (en) 2022-07-01 2022-07-01 Unmanned aerial vehicle, operating system thereof, method, medium and equipment for generating map

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210764782.0A CN114877872B (en) 2022-07-01 2022-07-01 Unmanned aerial vehicle, operating system thereof, method, medium and equipment for generating map

Publications (2)

Publication Number Publication Date
CN114877872A CN114877872A (en) 2022-08-09
CN114877872B true CN114877872B (en) 2022-10-14

Family

ID=82682778

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210764782.0A Active CN114877872B (en) 2022-07-01 2022-07-01 Unmanned aerial vehicle, operating system thereof, method, medium and equipment for generating map

Country Status (1)

Country Link
CN (1) CN114877872B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115574872B (en) * 2022-12-09 2023-03-24 北京今日蓝天科技有限公司 Mapping system based on dynamic net arrangement, well climate control method and medium
CN116758157B (en) * 2023-06-14 2024-01-30 深圳市华赛睿飞智能科技有限公司 Unmanned aerial vehicle indoor three-dimensional space mapping method, system and storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9618934B2 (en) * 2014-09-12 2017-04-11 4D Tech Solutions, Inc. Unmanned aerial vehicle 3D mapping system
CN105928493A (en) * 2016-04-05 2016-09-07 王建立 Binocular vision three-dimensional mapping system and method based on UAV
KR101855864B1 (en) * 2016-11-18 2018-06-20 주식회사 포스코건설 3d mapping technique construction site management system using drone for considering heavy construction equipment
KR101885184B1 (en) * 2017-08-02 2018-08-06 신용겸 Drones for aerial photogrammetry
CN112470092B (en) * 2018-11-21 2022-11-08 广州极飞科技股份有限公司 Surveying and mapping system, surveying and mapping method, device, equipment and medium
CN112469967B (en) * 2018-11-21 2023-12-26 广州极飞科技股份有限公司 Mapping system, mapping method, mapping device, mapping apparatus, and recording medium
WO2020103021A1 (en) * 2018-11-21 2020-05-28 广州极飞科技有限公司 Planning method and apparatus for surveying and mapping sampling points, control terminal and storage medium
CN109737924A (en) * 2019-02-28 2019-05-10 华南机械制造有限公司 Three-dimensional mapping system based on unmanned plane
CN114132501A (en) * 2021-12-20 2022-03-04 辽宁工程技术大学 A microminiature unmanned aerial vehicle for low latitude remote sensing survey and drawing task

Also Published As

Publication number Publication date
CN114877872A (en) 2022-08-09

Similar Documents

Publication Publication Date Title
CN114877872B (en) Unmanned aerial vehicle, operating system thereof, method, medium and equipment for generating map
US10678238B2 (en) Modified-reality device and method for operating a modified-reality device
US9947230B2 (en) Planning a flight path by identifying key frames
Azuma A survey of augmented reality
EP2557468A2 (en) Remote control system
US20100292868A1 (en) System and method for navigating a remote control vehicle past obstacles
CN109154499A (en) System and method for enhancing stereoscopic display
SE527257C2 (en) Device and method for presenting an external image
KR101896654B1 (en) Image processing system using drone and method of the same
US20200012335A1 (en) Individual visual immersion device for a moving person with management of obstacles
US20200118337A1 (en) Environmental mapping for augmented reality
US20210034052A1 (en) Information processing device, instruction method for prompting information, program, and recording medium
JP3477441B2 (en) Image display device
EP2523062B1 (en) Time phased imagery for an artificial point of view
CN110880161B (en) Depth image stitching and fusion method and system for multiple hosts and multiple depth cameras
JP6831949B2 (en) Display control system, display control device and display control method
JP7435599B2 (en) Information processing device, information processing method, and program
KR102181809B1 (en) Apparatus and method for checking facility
US11669088B2 (en) Apparatus, method and software for assisting human operator in flying drone using remote controller
CN113311855B (en) Aircraft monitoring method and device, computer storage medium and computer device
KR20180060403A (en) Control apparatus for drone based on image
US20220166917A1 (en) Information processing apparatus, information processing method, and program
CN115442510A (en) Video display method and system for view angle of unmanned aerial vehicle
JP6890759B2 (en) Flight route guidance system, flight route guidance device and flight route guidance method
KR101027533B1 (en) Apparatus and method for monitoring image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant