CN113906362A - Surveying camera control method, surveying camera, unmanned aerial vehicle and surveying system


Info

Publication number
CN113906362A
Authority
CN
China
Prior art keywords
lens
mapping
camera
mapping camera
focusing
Prior art date
Legal status
Pending
Application number
CN202080040708.3A
Other languages
Chinese (zh)
Inventor
吴利鑫
朱玲龙
何纲
方朝晖
黄振昊
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN113906362A

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Studio Devices (AREA)

Abstract

A control method of a surveying camera (412, 50), a drone (41), and a surveying system. The surveying camera (412, 50) is mounted on the drone (41), and the method comprises: S202, while the drone (41) performs a surveying and mapping operation along a target route, determining the target position of the focusing lens group of the lens (51) of the surveying camera (412, 50) when the lens (51) is focused to infinity; S204, adjusting the focusing lens group to the target position and locking focus, and controlling the surveying camera (412, 50) to capture images of the mapping area while the drone (41) flies along the target route. Because the focusing lens group of the surveying camera (412, 50) is not physically locked, the target position of the focusing lens group for infinity focus in the current working environment can be determined automatically while the mapping task is being executed, and the focusing lens group can be adjusted accordingly, which avoids blurred images caused by drift of the focus position of the surveying camera (412, 50) with temperature change.

Description

Surveying camera control method, surveying camera, unmanned aerial vehicle and surveying system
Technical Field
The application relates to the technical field of surveying and mapping, in particular to a control method of a surveying and mapping camera, the surveying and mapping camera, an unmanned aerial vehicle and a surveying and mapping system.
Background
Unmanned aerial vehicles are widely used in the surveying and mapping field. A surveying camera is usually mounted on the drone, and while the drone flies along a predetermined route the camera captures images of the mapping area. At present, the lens of a surveying camera is generally non-detachable; before the camera leaves the factory, the position of the focusing lens group at which the lens is focused to infinity is calibrated, and the focusing lens group is then fixed at that position by physical locking. Because the drone must operate in different working environments whose temperatures vary, the focus position drifts with temperature, so the captured images of the mapping area become blurred and cannot be used for modeling. A solution to the focus-position drift caused by temperature variation is therefore needed to ensure that the images acquired during mapping are sharp.
Disclosure of Invention
In view of the above, the present application provides a control method of a surveying camera, a surveying camera, an unmanned aerial vehicle and a surveying system.
According to a first aspect of the present application, there is provided a method of controlling a surveying camera mounted on an unmanned aerial vehicle, the method comprising:
while the unmanned aerial vehicle performs a surveying and mapping operation along a target route, determining the target position of the focusing lens group of the lens of the surveying camera when the lens is focused to infinity;
and adjusting the focusing lens group to the target position and locking focus, so as to control the surveying camera to capture images of a mapping area while the unmanned aerial vehicle flies along the target route.
According to a second aspect of the present application, there is provided a surveying camera mounted on a drone, the surveying camera comprising a processor, a memory, and a computer program stored in the memory, the processor implementing the following steps when executing the computer program:
while the unmanned aerial vehicle performs a surveying and mapping operation along a target route, determining the target position of the focusing lens group of the lens of the surveying camera when the lens is focused to infinity;
and adjusting the focusing lens group to the target position and locking focus, so as to control the surveying camera to capture images of a mapping area while the unmanned aerial vehicle flies along the target route.
According to a third aspect of the present application there is provided a drone comprising a mapping camera of the second aspect described above.
According to a fourth aspect of the present application, there is provided a surveying and mapping system comprising an unmanned aerial vehicle and a control terminal, the control terminal being equipped with a designated APP, the unmanned aerial vehicle comprising a surveying and mapping camera;
the designated APP is used for receiving a target route input by a user and sending the target route to the unmanned aerial vehicle;
the mapping camera comprises a processor, a memory, and a computer program stored in the memory for execution by the processor, the computer program when executed by the processor implementing the steps of:
while the unmanned aerial vehicle performs a surveying and mapping operation along the target route, determining the target position of the focusing lens group of the lens of the surveying camera when the lens is focused to infinity;
and adjusting the focusing lens group to the target position and locking focus, so as to control the surveying camera to capture images of a mapping area while the unmanned aerial vehicle flies along the target route.
According to a fifth aspect of the present application, there is provided a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the surveying camera control method mentioned in the first aspect above.
With the above scheme, while the unmanned aerial vehicle executes the surveying and mapping task along the target route, the target position of the focusing lens group when the lens of the surveying camera carried on the drone is focused to infinity can be determined automatically; the focusing lens group is then adjusted to the target position and focus is locked, so that the surveying camera can capture images of the mapping area. Because the focusing lens group of the surveying camera is not physically locked, the target position for infinity focus in the current working environment can be determined automatically during the mapping task, and the position of the focusing lens group can be adjusted accordingly, which avoids blurred images caused by drift of the camera's focus position with temperature change.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive labor.
Fig. 1 is a schematic diagram of an infinity focusing range of a camera according to an embodiment of the present application.
FIG. 2 is a flow chart of a mapping camera control method according to an embodiment of the present application.
FIG. 3(a) is a schematic illustration of a target route according to an embodiment of the present application.
FIG. 3(b) is a schematic diagram of adding a buffer area outside of a target route according to an embodiment of the present application.
Fig. 4 is a schematic diagram of an application scenario according to an embodiment of the present application.
FIG. 5 is a schematic diagram of a logical structure of a mapping camera according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
When a camera captures images, the photographed target is focused so that the focal point lies in the plane of the target, which ensures a sharp image of the target. A camera lens generally includes several lens groups, and the focus position can be adjusted by changing the distance between one or more lens groups and the others. The lens group used to change the focus position is called the focusing lens group; moving it shifts the focal point forward or backward so that it can be aligned with the photographed target.
A camera lens also has an infinite distance threshold, which depends on the camera's aperture, pixel size, sensor, and so on. When the distance between the photographed object and the lens exceeds this threshold and the lens is focused accordingly, any object farther than the threshold is imaged sharply. For example, if the infinite distance threshold of a certain surveying camera is 100 m, then when the camera focuses on a target more than 100 m away, everything from 100 m to infinity is captured sharply. As shown in Fig. 1, when the distance between the photographed object and the lens of the surveying camera is greater than the infinite distance threshold d, the object is said to be within the infinity focusing range of the lens and its image is sharp; when the distance is less than d, the object is outside the infinity focusing range and its image is blurred.
To image objects beyond the infinite distance threshold sharply, the camera is usually focused at infinity (also referred to as focusing to infinity): a focusing target located within the camera's infinity focusing range is selected, the camera focuses on it so that the focal point is aligned with it, and the resulting position of the focusing lens group is recorded. This is what focusing to infinity means here.
When a drone carries a surveying camera to capture images of a mapping area, the distance between the lens and the photographed targets in the area is large, so the targets are usually within the lens's infinity focusing range. Therefore, before the surveying camera leaves the factory, infinity focusing calibration is performed: the position of the focusing lens group when the lens is focused to infinity is determined, and the group is then fixed at that position by physical locking such as glue or bolts, so that it cannot move during later operation. However, because the temperature of the drone's working environment varies, the focus position of the lens drifts with ambient temperature, the captured images of the mapping area become blurred, and modeling becomes impossible.
On this basis, an embodiment of the application provides a control method for a surveying camera carried on an unmanned aerial vehicle. So that the position of the focusing lens group, and therefore the focus position, can be adjusted at any time during mapping to counter focus drift caused by changes in the operating temperature, the focusing lens group of the surveying camera in this embodiment is not fixed by physical locking. Instead, the position of the focusing lens group at which the camera focuses to infinity in the current environment is determined automatically during mapping, the group is adjusted to that position, and focus is locked, ensuring that the captured images of the mapping area are sharp.
The surveying camera and the drone in this embodiment may be integrated into a single device; alternatively, they may be two independent devices, in which case the surveying camera may be communicatively connected to the drone through a physical interface or through a wireless communication network. This embodiment does not limit the connection.
In the case where the drone and the surveying camera are an integral device, the integral device may share a set of processors, and thus the surveying camera control method may be executed by the shared processor. In a scenario where the drone and the mapping camera are two separate devices, the control method may be performed by the mapping camera. Of course, in certain embodiments, some of the processing steps of the control method may be performed by the mapping camera and some of the processing steps may be performed by the drone.
The lens of the surveying camera in this embodiment may be a fixed-focus lens, i.e. its focal length is not adjustable during use; in some scenarios, to meet different shooting requirements, it may instead be a zoom lens.
Specifically, the control method of the surveying and mapping camera, as shown in fig. 2, includes the following steps:
s202, in the process that the unmanned aerial vehicle conducts surveying and mapping operation according to a target route, determining a target position where a focusing lens group of a lens is located when the lens of the surveying and mapping camera is focused to infinity;
s204, adjusting the focusing mirror group to the target position, carrying out focusing locking, and controlling the mapping camera to carry out image acquisition on a mapping area in the process that the unmanned aerial vehicle flies according to the target route.
Usually, the user controls the drone through a control terminal on which a designated APP is installed; through this APP the user can set the target route for the mapping task and control the drone's flight. After the user sets the target route, the drone operates along it. Since the mapping area is generally within the infinity focusing range of the surveying camera, the camera can automatically determine, while the drone works along the target route, the target position of the focusing lens group at which the lens focuses to infinity in the current environment. The focusing lens group is normally adjustable within a certain range of positions: a drive mechanism inside the surveying camera moves the group, and different positions of the group correspond to different focus positions of the camera. Because the working environment changes, the focus position at infinity focus also changes, so the target position of the focusing lens group must be determined from the actual working conditions. The timing of this determination can be chosen according to actual needs: before each mapping task, before each image is captured (to ensure every image is sharp), when the drone reaches a specified position on the route, or once every fixed time interval. This is not limited here.
Once the target position of the focusing lens group for infinity focus is determined, the drive mechanism in the surveying camera moves the group to that position and focus is locked. After focus locking, the focus position of the lens is fixed, and sharp images of the mapping area can then be captured while the drone flies along the target route.
In other words, the focusing lens group is not calibrated and physically locked before the surveying camera leaves the factory, which would fix the infinity-focus position once and for all. Instead, during mapping, the target position of the focusing lens group for infinity focus is determined automatically at any time according to the actual working environment, and the group's position is adjusted accordingly, which avoids focus-position drift and blurred images caused by temperature change. The two steps are summarized in the sketch below.
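The following minimal sketch only illustrates the control flow of steps S202 and S204; all object and method names (run_mapping_job, estimate_infinity_focus_position, move_focus_group, lock_focus and so on) are hypothetical and do not correspond to any real camera API.

```python
def run_mapping_job(camera, drone, target_route):
    # S202: while the drone performs the mapping operation along the target
    # route, determine where the focusing lens group must sit for the lens
    # to be focused at infinity under the current working conditions.
    target_position = camera.estimate_infinity_focus_position()

    # S204: move the focusing lens group to that position, lock focus, and
    # capture images of the mapping area while flying the target route.
    camera.move_focus_group(target_position)
    camera.lock_focus()
    for waypoint in target_route:
        drone.fly_to(waypoint)
        camera.capture_image()
```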
In certain embodiments, the drone may include a pan-tilt, and the surveying camera may be carried on the drone by the pan-tilt, which may be a single-axis or multi-axis pan-tilt. In some embodiments a three-axis pan-tilt is used, and while images of the mapping area are captured, the pan-tilt is rotated so that the surveying camera photographs the mapping area at different angles.
In some embodiments, to determine the target position of the focusing lens group when the lens of the surveying camera focuses to infinity, the position of the focusing lens group at infinity focus may be calibrated in advance at different temperatures to obtain calibration data. The target position is then determined from the temperature of the environment in which the lens is located and from the calibration data. For example, before the surveying camera leaves the factory, a calibration board is used to calibrate infinity focus: the position of the focusing lens group at infinity focus is determined at different temperatures and the results are stored in the camera. Alternatively, after delivery the user may calibrate the camera and store the calibration data in it. When the surveying camera performs a mapping operation, the temperature of the environment of the lens is determined and the calibration data are interpolated at that temperature to obtain the target position of the focusing lens group, as sketched below.
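As an illustration of the interpolation step, the sketch below looks up a focusing-lens-group position for the measured lens temperature; the calibration table values, the units (motor steps, degrees Celsius) and the function name are assumptions made for the example.

```python
import bisect

# Hypothetical calibration table: focusing-lens-group position (motor steps)
# at infinity focus, measured at several temperatures before delivery.
CALIBRATION = [(-10.0, 1520), (0.0, 1505), (20.0, 1480), (40.0, 1462)]

def infinity_position_from_temperature(temp_c, table=CALIBRATION):
    """Linearly interpolate the calibrated positions at the measured lens temperature."""
    temps = [t for t, _ in table]
    if temp_c <= temps[0]:
        return table[0][1]
    if temp_c >= temps[-1]:
        return table[-1][1]
    i = bisect.bisect_left(temps, temp_c)
    (t0, p0), (t1, p1) = table[i - 1], table[i]
    return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
```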
The temperature of the environment of the lens may be taken as the temperature of the mapping area obtained by the drone from the network; in some embodiments, a temperature sensor may instead be placed in or near the lens of the surveying camera, and the lens temperature obtained from that sensor.
Determining the target position of the focusing lens group from the calibration data and the lens temperature may be performed before each task the drone executes, after the detected change in ambient temperature exceeds a certain threshold, or at preset time intervals; the timing can be set flexibly according to actual operating requirements.
In some embodiments, the target position of the focusing lens group at infinity focus may instead be determined by autofocus. For example, the surveying camera may select an object within the lens's infinity focusing range as the focusing target and perform an autofocus operation to determine the target position. In this way no infinity focusing calibration is needed before the camera leaves the factory, and determining the target position is simplified.
Since determining the target position from calibration data alone, or from autofocus alone, may be inaccurate, in some embodiments the two are combined to obtain a more accurate result: the target position is determined both from the lens temperature and calibration data and by autofocus, and the two results are averaged or combined with weights to obtain the final target position of the focusing lens group, as sketched below.
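A minimal sketch of how the two estimates might be fused; the equal default weights are purely illustrative and not prescribed by the application.

```python
def fuse_target_position(pos_from_calibration, pos_from_autofocus,
                         w_calibration=0.5, w_autofocus=0.5):
    """Weighted combination of the calibration-based and autofocus-based positions."""
    total = w_calibration + w_autofocus
    return (w_calibration * pos_from_calibration
            + w_autofocus * pos_from_autofocus) / total
```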
When the target position is determined by autofocus, the focusing target must lie within the infinity focusing range of the lens. Surveying cameras usually have no positioning or ranging capability, whereas the drone generally does, for example through its binocular vision sensors, laser radar, or GPS. In some embodiments, therefore, the drone's positioning or ranging function is used to determine whether the focusing target lies within the infinity focusing range of the lens. If it does, an instruction to focus to infinity is sent to the surveying camera, which performs the autofocus operation upon receiving it so as to determine the target position. If the focusing target is not within the infinity focusing range, the user can be prompted to adjust the drone's flight height or the camera's aperture so that it is.
The focusing target may be chosen according to the actual situation; for example, a photographed target in the mapping area may serve as the focusing target, or some designated target may be used. This is not limited here. To judge whether the focusing target is within the lens's infinity focusing range, it can be checked whether the drone is at a designated position; when it is, the focusing target is considered to be within the infinity focusing range, and the instruction to focus to infinity is sent to the surveying camera. The designated position may be a position on the target route, or a position at a certain height above the take-off point. For example, the surveying camera may use a building in front of the drone as the focusing target; during flight, a ranging device on the drone measures the distance to the building, and when that distance is detected to exceed the infinite distance threshold, the instruction to focus to infinity is sent so that the camera performs autofocus and determines the target position. In some embodiments, after the drone is determined to be at the designated position, the flight control system of the drone transmits the focus-to-infinity instruction to the surveying camera via a preset data transmission protocol. For example, when the drone and the surveying camera are two independent devices, they may exchange data through a predefined physical interface and data transmission protocol, or through a predefined wireless data transmission protocol; once the flight control system detects that the drone is at the designated position, it sends the instruction through the predefined protocol so that the camera performs autofocus.
When the flight control system sends the focus-to-infinity instruction in the above manner, the transmission link may be long and possibly unstable, so the instruction is delivered slowly. In some embodiments, to improve delivery efficiency, the instruction may instead be triggered through a pre-designed hardware trigger interface on the surveying camera: after the flight control system determines that the drone is at the designated position, it triggers the focus-to-infinity instruction purely in hardware, for example by pulling the level of the hardware interface high or low, which avoids the delay of sending the instruction over a transmission link. In addition, when the drone flies along the target route, the photographed targets on the ground are usually within the infinity focusing range of the lens, so to improve efficiency a photographed target in the mapping area can be used directly as the focusing target while the drone is on the target route, and the camera performs autofocus to determine the target position. To ensure that the images captured along the target route are sharp, autofocus can be performed once before the task is executed. Therefore, in some embodiments, the designated position may be the starting waypoint of the drone's target route: when the flight control system detects that the drone has reached the starting waypoint, it sends the focus-to-infinity instruction to the surveying camera; upon receiving it, the camera completes autofocus to infinity, determines the target position of the focusing lens group, adjusts the group to that position, and locks focus, so that during the flight along the target route the camera captures sharp images with the determined focusing lens group position. A sketch of this trigger logic follows.
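The following sketch illustrates the trigger logic on the flight-control side, assuming hypothetical drone and camera interfaces (at_position, range_to_focus_target, send_command); whether the command travels over a data protocol or a hardware trigger line is abstracted behind send_command.

```python
def maybe_trigger_infinity_focus(drone, camera, designated_position,
                                 infinity_threshold_m):
    """Send the focus-to-infinity command once the drone reaches the designated
    position and the focusing target is beyond the infinite distance threshold."""
    if not drone.at_position(designated_position):
        return
    distance_m = drone.range_to_focus_target()    # e.g. stereo vision / laser radar
    if distance_m > infinity_threshold_m:
        camera.send_command("FOCUS_TO_INFINITY")  # data protocol or hardware trigger line
    else:
        drone.notify_user("Focus target too close: raise the altitude or adjust the aperture")
```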
In some embodiments, the designated position may also be a buffer area located outside the target route. A target route generally comprises several route segments, and in some scenes the temperature varies considerably between the mapping areas covered by different segments; if autofocus is performed only once at the starting waypoint, the focus position may still drift with temperature during later image capture and the images become unclear. In addition, when a drone currently switches between two segments, it usually begins to decelerate before reaching the end of the previous segment in order to adjust its heading and switch to the next segment. As shown in Fig. 3(a), suppose the target route comprises route segments 1, 2 and 3 and point A is the starting waypoint: the drone begins to decelerate before it reaches the end waypoint C of segment 1, for example at point B, so that its speed drops to zero at C, after which the heading is adjusted and the drone switches to segment 2. During this deceleration (the stretch from B to C in the figure), the pan-tilt carrying the surveying camera still rotates to different directions to capture images of the mapping area, so the rapid deceleration can cause the pan-tilt to hit its rotation limits and also degrades the quality of the captured images.
To avoid these problems, a buffer area can be added at the end waypoint of each route segment, as shown in Fig. 3(b). The buffer area is an extension of the segment beyond its end waypoint (the dashed part in the figure) and may be a straight or curved area; this application does not limit its shape. The drone switches between two segments via the buffer area: when it reaches the end waypoint C of the previous segment (for example segment 1), it begins to decelerate, continues to fly for a stretch while adjusting its heading, and the area it covers outside the segment is the buffer area. The drone therefore does not decelerate while still on a segment, which prevents the pan-tilt from hitting its rotation limits and preserves image quality. At the same time, adding the buffer area outside the target route provides time for the drone to autofocus without affecting image capture of the mapping area. In some embodiments, image capture is stopped while the drone is in the buffer area, which avoids interfering with autofocus. A sketch of this buffer-area handling is given below.
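A sketch of the buffer-area behaviour described above, assuming hypothetical pan-tilt and camera interfaces; it only illustrates the order of operations (stop capture, pause the pan-tilt, refocus and re-lock, then switch segments).

```python
def handle_buffer_zone(drone, pan_tilt, camera):
    """Order of operations while the drone is inside a buffer area."""
    camera.stop_capture()              # no images are taken in the buffer area
    pan_tilt.pause_rotation()          # avoid hitting rotation limits while decelerating
    pan_tilt.point_lens_down()         # lens toward the ground for refocusing
    camera.send_command("FOCUS_TO_INFINITY")
    camera.lock_focus()
    drone.turn_to_next_segment()       # adjust heading for the next route segment
    camera.resume_capture()
```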
Of course, in some embodiments, besides autofocusing and locking focus at the starting waypoint and in the buffer areas, the surveying camera may also autofocus and lock focus at other waypoints of the target route, for example at regular time intervals or once every preset number of waypoints; this application does not limit this.
In some embodiments, if the surveying camera is carried on the drone by the pan-tilt, the camera may point in various directions as the pan-tilt rotates. Therefore, before the surveying camera performs the autofocus operation to determine the target position, the pan-tilt can be rotated so that the lens faces the ground, and autofocus is performed afterwards.
The lenses currently used on surveying cameras are non-detachable; when a lens fails or is damaged the camera must be returned to the factory for repair, the user cannot replace the lens, maintenance cost is high, and fault tolerance is poor. To overcome this, in some embodiments the lens of the surveying camera is designed as an interchangeable lens that the user can replace. After the lens is replaced, the parameters corresponding to the lens change, so the lens parameters stored in the surveying camera no longer match the newly mounted lens.
In some embodiments, the lens parameters include the target position of the focusing lens group at infinity focus, the internal parameters of the lens (such as focal length and sensor center offset), and the lens's distortion correction parameters. When the lens is replaced these parameters change, and if the parameters of the previous lens stored in the system were still used, the captured images would be degraded.
In some embodiments, each lens is given a lens identifier that uniquely identifies it, such as an SN number or another identifying number or symbol. The lens parameters of each lens may be bound to its lens identifier, and the body of each surveying camera may be bound to its lens, for example by storing the lens identifier in the camera body. When a lens is mounted on the camera body, the surveying camera can check whether the lens identifier of the current lens matches the identifier stored in advance; if not, the lens is determined to have been replaced.
In some embodiments, the camera body and the lens exchange data through a predefined data transmission protocol to which a lens-identifier field is added. When the lens is mounted on the camera body, the lens sends a message to the camera, the body reads the lens identifier from the protocol, and whether the lens has been swapped is determined from that identifier.
In some embodiments, the lens parameters of several lenses are pre-stored in the surveying camera; when a lens replacement is detected, the parameters of the new lens are looked up among the pre-stored parameters using its lens identifier, and the surveying camera then captures images with those parameters, as sketched below.
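As an illustration of the lens-identifier lookup, the sketch below detects a lens swap from the reported SN and applies pre-stored parameters; the SN values, parameter fields and camera methods are invented for the example.

```python
# Hypothetical pre-stored parameter table keyed by lens SN.
LENS_PARAMETERS = {
    "SN-0001": {"infinity_focus_position": 1480, "focal_length_mm": 35.0,
                "sensor_center_offset": (0.2, -0.1), "distortion": [0.010, -0.002]},
    "SN-0002": {"infinity_focus_position": 1512, "focal_length_mm": 50.0,
                "sensor_center_offset": (0.0, 0.1), "distortion": [0.008, -0.001]},
}

def on_lens_mounted(camera, reported_lens_sn):
    """Detect a lens swap from the SN reported over the body-lens protocol."""
    if reported_lens_sn == camera.stored_lens_sn:
        return                                       # same lens, nothing to do
    params = LENS_PARAMETERS.get(reported_lens_sn)
    if params is None:
        camera.prompt_user("Lens parameters do not match: please calibrate the new lens")
        return
    camera.apply_lens_parameters(params)             # use the pre-stored parameters
    camera.stored_lens_sn = reported_lens_sn
```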
In some embodiments, after a lens replacement is detected, the user may be prompted through the interactive interface of the drone's control terminal to calibrate the parameters of the new lens. For example, a pop-up can tell the user that the lens parameters stored in the system do not match the current lens and ask the user to calibrate them; during this self-calibration, detailed calibration steps or demonstration videos can be shown through the APP on the control terminal to guide the user.
In some embodiments, several schemes are available for calibrating the target position of the focusing lens group at infinity focus for the replaced lens. For example, calibration can be done with a calibration board, or the drone can be controlled to execute a specific task and the target position calibrated during that task. Multiple calibration schemes can therefore be shown on the interactive interface of the control terminal, the user selects one, and after receiving the user's choice the surveying camera calibrates the target position according to the selected scheme.
In some embodiments, the target position of the focusing lens group at infinity focus for the replaced lens may be calibrated with a calibration board: the APP prompts the user to place the board within the surveying camera's infinity focusing range, and the user then triggers infinity focusing through a control in the APP, which determines the target position of the focusing lens group.
Because calibration with a calibration board requires a relatively large site, in some embodiments the target position is instead calibrated with a specific flight task: the drone is flown to a specified height, the lens of the surveying camera is pointed at the ground, and an autofocus operation calibrates the target position. Once the drone is at the specified height, objects on the ground lie within the camera's infinity focusing range. The specified height may be determined from the infinite distance threshold calculated with the following equation (1):
d_Inf = f² / (F · D)    (1)

where d_Inf is the infinite distance threshold: when the shooting distance is greater than d_Inf and focusing is performed, the resulting position of the focusing lens group is the infinity focusing position; f is the focal length of the lens; D is the diameter of the sensor's circle of confusion, generally taken as 2 sensor pixel widths; and F is the lens aperture value.
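Equation (1) can be evaluated as follows; the example numbers (35 mm focal length, f/5.6, 2.4 µm pixel pitch) are illustrative and not taken from the application.

```python
def infinite_distance_threshold_m(focal_length_mm, aperture_f_number, pixel_pitch_um):
    """Equation (1): d_Inf = f^2 / (F * D), with D taken as two sensor pixel widths."""
    f_m = focal_length_mm / 1000.0            # focal length in metres
    d_coc_m = 2.0 * pixel_pitch_um / 1e6      # circle of confusion, ~2 pixels, in metres
    return f_m ** 2 / (aperture_f_number * d_coc_m)

# Illustrative numbers: a 35 mm lens at f/5.6 with a 2.4 um pixel pitch gives
# 0.035**2 / (5.6 * 4.8e-6) ~= 45.6 m, so the specified flight height should
# exceed roughly 46 m above the ground.
```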
For the camera's internal parameters and distortion correction parameters, a small-area oblique mapping operation can be performed once, and the parameters are computed by the self-calibration iterative algorithm of the modeling software.
In some embodiments, after the user completes calibration of the lens parameters of the replaced lens, the lens identifier of the replaced lens and the lens parameters of the replaced lens obtained by the user calibration may be stored in the system of the mapping camera correspondingly. For example, the calibrated lens parameters may be bound to the SN number of the lens after replacement, and then stored in the mapping camera.
Since performing autofocus in real time to determine the infinity-focus position of the focusing lens group would seriously reduce operating efficiency, the target route is normally planned so that the photographed targets in the mapping area stay within the camera's infinity focusing range. For mapping areas with large elevation variation, the drone's terrain-following flight function can be used to keep the photographed targets within that range at all times. In some embodiments, because the camera's infinite distance threshold depends on its aperture, the distance between the surveying camera and the photographed target can also be monitored in real time by the drone's ranging device; if the target is found to lie outside the infinity focusing range, the camera's aperture is adjusted, which changes the infinite distance threshold and brings the target back inside the range before its image is captured. This allows the surveying camera to be used in scenes with large height variation; a sketch of such an aperture adjustment follows.
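A sketch of the aperture adjustment just described, reusing the infinite_distance_threshold_m helper from the sketch under equation (1); the list of aperture stops and the camera attributes are assumptions.

```python
# Requires infinite_distance_threshold_m() from the equation (1) sketch above.

def keep_target_in_infinity_range(camera, distance_to_target_m,
                                  f_stops=(2.8, 4.0, 5.6, 8.0, 11.0)):
    """Pick the widest aperture whose infinite distance threshold is below the
    measured shooting distance, so the photographed target stays inside the
    infinity focusing range; returns False if no listed aperture works."""
    for f_number in f_stops:                  # widest aperture first
        d_inf = infinite_distance_threshold_m(camera.focal_length_mm,
                                              f_number, camera.pixel_pitch_um)
        if d_inf < distance_to_target_m:
            camera.set_aperture(f_number)
            return True
    return False
```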
In some embodiments, to ensure that the images captured by the surveying camera are sharp while the drone flies the target route, the flight height for each waypoint can be determined from an elevation map of the mapping area when the target route is generated. The elevation map reflects the height of each object on the ground, and the drone's flight height is chosen from those heights so that every object on the ground lies within the camera's infinity focusing range.
In some embodiments, to ensure sharp images during the flight, the shooting distance of the surveying camera to the photographed targets can be checked when the user sets the target route. For example, when the user enters a flight height for a waypoint, the shooting distance at that waypoint is computed from the flight height; if it is less than a preset distance, determined from the infinite distance threshold, the photographed target is considered to lie outside the camera's infinity focusing range, the user is told that focusing is not possible, and the user can adjust the entered flight height. In this way, at every position on the target route the photographed targets in the mapping area are within the camera's infinity focusing range and can be imaged sharply. A sketch of this check follows.
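A sketch of the flight-height check described above; the safety margin and the way ground elevation is supplied are assumptions for the example.

```python
def check_waypoint_height(flight_height_m, ground_elevation_m,
                          infinity_threshold_m, margin=1.1):
    """Return (ok, message) for a user-entered flight height at one waypoint."""
    shooting_distance_m = flight_height_m - ground_elevation_m
    preset_distance_m = infinity_threshold_m * margin   # preset distance from the threshold
    if shooting_distance_m < preset_distance_m:
        return False, "Cannot focus: increase the flight height for this waypoint"
    return True, ""
```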
In some embodiments, the surveying camera may support lenses of different focal lengths. Mounting lenses with different focal lengths meets the shooting requirements of different scenes; for example, a longer focal length can be used for fine, close-up-style modeling of distant objects.
To further explain the surveying camera control method of the present application, it is explained below in conjunction with a specific embodiment.
Fig. 4 is a schematic view of an application scenario of an embodiment of the present application. The surveying drone 41 includes a three-axis pan-tilt 411, which carries a surveying camera 412. The lens of the surveying camera may be a fixed-focus or zoom lens, it is detachable, and the camera can support lenses of different focal lengths to suit different shooting scenes. A temperature sensor is arranged in or near the lens to detect the temperature of the environment in which the lens is located. The user controls the drone through the control terminal 42, on which a designated APP is installed; through this APP the user can set the target route for the drone's mapping operation.
To avoid drift of the lens's focus position with temperature, the focusing lens group is not physically locked; during the mapping task the surveying camera automatically determines the position of the focusing lens group at which the lens focuses to infinity, adjusts the group to that position, and locks focus before completing image capture of the mapping area.
The position of the focusing lens group when the lens is focused to infinity can be determined by the following methods:
(1) Determining the position of the focusing lens group from the temperature of the environment in which the lens is located and the calibration data
Before the lens of the surveying camera leaves the factory, infinity focusing calibration is performed at different temperatures to obtain the position of the focusing lens group at infinity focus at each temperature, and the calibration data are stored in the camera system. In actual operation, the temperature sensor inside or near the lens measures the lens temperature, and the position of the focusing lens group at infinity focus is computed by interpolating the calibrated data.
The timing for determining the position of the focusing lens group in this way may be set according to actual requirements, for example, when the detected temperature change exceeds a preset threshold, or after every preset time, or when the unmanned aerial vehicle is located at a specified position of the target route, which is not limited herein.
(2) Determining the position of the focusing lens group by autofocus
The user sets the target route for the drone's mapping task through the APP on the control terminal; waypoints of the target route can be entered by tapping points on the map, and the flight height for each waypoint can be entered by the user. The APP can obtain an elevation map of the mapping area and determine the height of each object in it; then, from the camera's infinite distance threshold and these heights, it determines whether every object in the mapping area will be within the camera's infinity focusing range when the drone flies at the height the user entered, and if not it tells the user that focusing is not possible so that the user can adjust the entered height. The flight height can also be determined automatically by the APP from the elevation map, the camera's infinite distance threshold, and the required shooting accuracy.
After the user confirms the target route through the APP, the APP sends it to the drone's flight control system, which flies the drone to the starting waypoint of the route. When the drone reaches the starting waypoint, the flight control system rotates the pan-tilt so that the lens of the surveying camera faces the ground, and then sends the focus-to-infinity instruction to the surveying camera's system; on receiving it, the camera performs the autofocus operation.
To avoid drift of the camera's focus position caused by large temperature differences within the same mapping area, a buffer area can also be placed at the end of each route segment. The buffer area is an extension of the segment beyond its end point and is used for switching the drone between segments: when the drone is in the buffer area it begins to decelerate and change heading while pan-tilt rotation and image capture are paused. When the flight control system detects that the drone is in the buffer area, it rotates the pan-tilt so that the lens faces the ground and then sends the focus-to-infinity instruction to the surveying camera system, so that the camera autofocuses and locks focus again. The buffer area gives the camera time to refocus, and pausing the pan-tilt there avoids the risk of hitting mechanical stops or software limits when the pan-tilt swings back during rapid deceleration.
Because the lens in this embodiment is interchangeable, after a lens replacement the pre-calibrated lens parameters stored in the surveying camera system no longer match the mounted lens. For the surveying camera, the lens parameters that need calibration are the position of the focusing lens group at infinity focus, the camera's internal parameters (including sensor center offset, focal length, and so on), and the distortion correction parameters. If the calibrated lens parameters are computed by an external system, they can be imported into the surveying camera system through the APP, the camera's storage interface, or the camera's SD card.
The embodiment of the application adopts the following modes to solve the matching problem of the lens and the lens parameters:
During production, the surveying camera and its lens are paired one to one, and the factory-calibrated lens parameters are written into the camera system together with the lens SN. A lens SN field is added to the data transmission protocol between the lens and the camera body. After the camera system starts, it reads the lens SN and compares it with the SN stored in the body; if the user has changed the lens, the user is prompted through the APP on the control terminal to perform a self-calibration operation.
The user can self-calibrate the position of the focusing lens group at infinity focus in either of the following ways:
(1) Calibrating with a calibration board: the board is placed within the camera's infinity focusing range, focusing calibration is triggered through the APP, and after successful focusing the lens SN and the calibration parameters are automatically written into the camera system.
(2) Calibrating with a specific route task, which saves the site: the flight control system flies the drone above the take-off point or above a shooting point specified by the user, at a flight height chosen from the camera's infinite distance threshold so that the ground lies within the infinity focusing range. After the drone reaches the designated position, the flight control system triggers the camera to focus, and on successful focusing the camera system writes the focusing lens group position and the lens SN into the system.
For the camera's internal parameters and distortion correction parameters, a small-area oblique mapping operation can be performed once and the parameters computed by the self-calibration iterative algorithm of the modeling software; the resulting parameter file can be imported into the camera system through the APP, the camera's storage interface, or the camera's SD card, and bound to the lens SN.
Because the surveying camera of this embodiment does not physically lock the focusing lens group, the lens can complete focusing at different shooting distances, which widens its application, for example to close-range photography or continuous autofocus during video recording. Autofocusing and focus-locking along the route compensates for focus temperature drift caused by changes in the operating environment, and combining temperature with calibration data to determine the infinity-focus position of the focusing lens group reduces the probability of focusing failure. In addition, with an interchangeable lens only the lens needs replacing when it is damaged, which lowers maintenance cost and improves fault tolerance; and because lenses of different focal lengths can be mounted, a longer focal length can be used for fine, close-up-style modeling of distant objects.
In addition, the present application also provides a surveying and mapping camera mounted on an unmanned aerial vehicle. As shown in Fig. 5, the surveying and mapping camera 50 includes a lens 51, a processor 52, a memory 53, and a computer program stored in the memory 53 and executable by the processor 52; when the processor 52 executes the computer program, the following steps are implemented:
in the process that the unmanned aerial vehicle carries out surveying and mapping operation according to a target route, determining a target position where a focusing lens group of a lens is located when the lens of the surveying and mapping camera is focused to infinity;
and adjusting the focusing lens group to the target position and performing focus locking, so as to control the mapping camera to acquire images of a mapping area while the unmanned aerial vehicle flies along the target route.
In some embodiments, the processor is configured to determine the target position at which the focusing lens group of the lens is located when the lens of the mapping camera is focused to infinity, and is specifically configured to:
determining the target position according to the temperature of the environment in which the lens of the mapping camera is located and calibration data, wherein the calibration data represents the position of the focusing lens group when the lens of the mapping camera is focused to infinity at different temperatures (a temperature-lookup sketch follows below); and/or
determining the target position through an automatic focusing operation.
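For the temperature-based branch, one plausible realization is a lookup into the per-temperature calibration data with interpolation between calibrated points. The table values and the linear interpolation below are illustrative assumptions; the document only states that the calibration data relates temperature to the focusing lens group position at infinity.

```python
# Illustrative temperature lookup for the infinity-focus target position.
from bisect import bisect_left

# Example calibration table: temperature (deg C) -> focusing lens group position (steps).
CALIBRATION = [(-10, 1520), (0, 1508), (20, 1490), (40, 1475)]

def infinity_focus_position(temperature_c: float) -> float:
    temps = [t for t, _ in CALIBRATION]
    positions = [p for _, p in CALIBRATION]
    if temperature_c <= temps[0]:
        return positions[0]
    if temperature_c >= temps[-1]:
        return positions[-1]
    i = bisect_left(temps, temperature_c)
    # Linear interpolation between the two nearest calibrated temperatures.
    t0, t1 = temps[i - 1], temps[i]
    p0, p1 = positions[i - 1], positions[i]
    return p0 + (p1 - p0) * (temperature_c - t0) / (t1 - t0)

# Example: with the lens at 12 deg C the target position falls between the
# 0 deg C and 20 deg C calibration points.
print(infinity_focus_position(12.0))
```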
In certain embodiments, the mapping camera includes a temperature sensor by which the temperature of the environment in which the lens of the mapping camera is located is acquired.
In some embodiments, the processor, when determining the target position in an auto-focus manner, is specifically configured to:
when an instruction to focus to infinity is received, performing an automatic focusing operation to determine the target position, wherein the instruction to focus to infinity is triggered when the unmanned aerial vehicle is located at a specified position, at which a focusing target lies within the infinite focusing range of the lens.
In certain embodiments, the focus to infinity instruction is sent by a flight control system of the drone to the mapping camera through a preset data transfer protocol when the drone is determined to be in the designated location; or
The focus to infinity instruction is sent by a flight control system of the drone through a hardware trigger interface to the mapping camera upon determining that the drone is in the designated location.
In some embodiments, the focus target is a photographed target within the mapping area, and the specified position includes:
a starting waypoint of the target route; and/or
a buffer zone outside the target route; wherein the target route comprises a plurality of route segments, the buffer zone is an extension beyond the end waypoint of each route segment, and the buffer zone is used by the unmanned aerial vehicle when switching between two route segments.
In certain embodiments, the mapping camera stops capturing images while the drone is in the buffer zone.
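A simple way to picture how capture and focusing could be coordinated around the buffer zones is sketched below. The waypoint/state API is hypothetical; the document only states that the buffer zone connects route segments, that the focus-to-infinity instruction can be triggered there, and that image capture stops while the drone is inside it.

```python
# Sketch of gating capture and focus locking around buffer zones
# (hypothetical route/camera objects).
def on_position_update(drone_state, camera, route) -> None:
    if route.in_buffer_zone(drone_state.position):
        # Buffer zone: the drone is switching between route segments, so stop capture.
        camera.stop_capture()
        if not camera.focus_locked:
            # Ground targets are within the infinite focusing range here, so the
            # focus-to-infinity instruction can be triggered and the result locked.
            camera.autofocus_to_infinity()
            camera.lock_focus()
    else:
        # Back on a route segment: resume image acquisition with the locked focus.
        camera.start_capture()
```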
In some embodiments, the drone includes a pan-tilt through which the mapping camera is carried on the drone.
In some embodiments, the processor is further configured to, prior to performing the autofocus operation:
and controlling the pan-tilt to rotate so as to adjust the lens of the mapping camera to face the ground.
In certain embodiments, the processor is further configured to:
when the lens of the mapping camera is detected to be replaced, prompting a user to calibrate the lens parameters of the replaced lens through an interactive interface of a control terminal of the unmanned aerial vehicle.
In some embodiments, the lens parameters include one or more of:
the target position of the focusing lens group when the lens is focused to infinity, the internal parameters of the lens, and the distortion correction parameters of the lens.
In some embodiments, where the lens parameter is the target position, after prompting the user through the interactive interface of the control terminal of the unmanned aerial vehicle to calibrate the lens parameters of the replaced lens, the processor is further configured to:
receiving a calibration scheme which is selected by a user through the interactive interface and used for calibrating the target position;
and calibrating the target position based on the calibration scheme.
In some embodiments, the calibration scheme comprises:
calibrating the target position by using a calibration plate; and/or
and controlling the unmanned aerial vehicle to fly to a specified height, adjusting the lens to face the ground, and performing an automatic focusing operation to calibrate the target position, wherein after the unmanned aerial vehicle flies to the specified height, objects on the ground are located within the infinite focusing range of the surveying and mapping camera.
In some embodiments, the lens identification of the lens is bound to the lens parameters, and the processor, when detecting that the lens of the mapping camera has been replaced, is specifically configured to:
and when the lens identification of the current lens is detected to be inconsistent with the lens identification stored by the mapping camera, determining that the lens of the mapping camera is replaced.
In certain embodiments, the data transfer protocol of the lens of the mapping camera and the body of the mapping camera includes a lens identification field.
In some embodiments, after prompting the user to calibrate the lens parameters of the replaced lens, the processor is further configured to:
and correspondingly storing the lens identification of the replaced lens and the lens parameters of the replaced lens obtained by user calibration.
In certain embodiments, the processor is further configured to:
and when the lens of the mapping camera is detected to be replaced, searching the lens parameters of the replaced lens from the prestored lens parameters of various lenses based on the lens identification of the replaced lens.
In some embodiments, the processor is configured to, when the drone performs image acquisition on a mapping area during flight according to a target route, specifically:
when the photographed target is determined to be located outside the infinite focusing range of the mapping camera, adjusting the aperture of the mapping camera so that the photographed target falls within the infinite focusing range, and then acquiring an image of the photographed target.
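In conventional optics terms, the near limit of the range that stays acceptably sharp with the focus locked at infinity is the hyperfocal distance H = f^2/(N*c) + f, which shrinks as the aperture is stopped down (larger f-number N). The sketch below uses this standard relation to pick an aperture; the specific f-stop sequence, circle of confusion, and the assumption that the camera exposes such a selection step are illustrative and not taken from this document.

```python
# Sketch: stop the aperture down until the photographed target falls inside
# the range that is sharp with focus locked at infinity (illustrative values).
F_STOPS = [2.8, 4.0, 5.6, 8.0, 11.0]

def hyperfocal_m(focal_length_mm: float, f_number: float, coc_mm: float = 0.015) -> float:
    """Hyperfocal distance in metres for the given focal length and f-number."""
    return (focal_length_mm ** 2 / (f_number * coc_mm) + focal_length_mm) / 1000.0

def choose_aperture(target_distance_m: float, focal_length_mm: float) -> float:
    """Return the widest f-stop for which the target is beyond the hyperfocal distance."""
    for n in F_STOPS:
        # With focus locked at infinity, objects farther than H are acceptably sharp.
        if target_distance_m >= hyperfocal_m(focal_length_mm, n):
            return n
    return F_STOPS[-1]  # otherwise stop down as far as the lens allows

# Example: a 35 mm lens photographing a target about 20 m away selects f/5.6.
print(choose_aperture(target_distance_m=20.0, focal_length_mm=35.0))
```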
In some embodiments, the target route is determined based on:
acquiring an elevation map of the mapping area;
and determining the flight height of the unmanned aerial vehicle according to the elevation map so as to generate the target route according to the flight height.
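One way the flight height could be derived from the elevation map is sketched below. Treating the route height as "highest terrain plus a requested height above ground, never below the camera's infinite distance threshold" is an assumption made for illustration; the document does not specify the exact rule.

```python
# Illustrative sketch: derive a route flight height from an elevation map.
def plan_flight_height(elevation_map_m, desired_agl_m: float,
                       infinity_threshold_m: float) -> float:
    highest_terrain = max(max(row) for row in elevation_map_m)
    # Keep the requested height above ground over the highest terrain, and never
    # drop below the mapping camera's infinite distance threshold.
    agl = max(desired_agl_m, infinity_threshold_m)
    return highest_terrain + agl

# Example: terrain peaks at 135 m, the user asks for 100 m above ground,
# and the camera's infinite distance threshold is 60 m.
elevation = [[120.0, 128.5], [131.0, 135.0]]
print(plan_flight_height(elevation, desired_agl_m=100.0, infinity_threshold_m=60.0))  # 235.0
```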
In certain embodiments, the mapping camera supports lenses of different focal lengths.
For details of the specific implementation of acquiring the image of the mapping area by using the mapping camera, reference may be made to the description of each embodiment of the foregoing method, which is not described herein again.
In addition, the present application further provides an unmanned aerial vehicle, which includes the mapping camera of any one of the above embodiments.
For details of the specific implementation of acquiring the image of the mapping area by using the mapping camera, reference may be made to the description of each embodiment of the foregoing method, which is not described herein again.
Further, the application also provides a surveying and mapping system, which comprises an unmanned aerial vehicle and a control terminal, wherein the control terminal is provided with a designated APP, and the unmanned aerial vehicle comprises a surveying and mapping camera;
the designated APP is used for receiving a target route input by a user and sending the target route to the unmanned aerial vehicle;
the mapping camera comprises a processor, a memory, and a computer program stored in the memory for execution by the processor, the computer program when executed by the processor implementing the steps of:
in the process that the unmanned aerial vehicle conducts the surveying and mapping operation according to the target route, determining a target position where a focusing lens group of a lens is located when the lens of the surveying and mapping camera is focused to infinity;
and adjusting the focusing lens group to the target position and performing focus locking, so as to control the mapping camera to acquire images of a mapping area while the unmanned aerial vehicle flies along the target route.
In some embodiments, the designated APP is further configured to prompt a user to calibrate a lens parameter of a lens after replacement through an interactive interface when the lens of the mapping camera is replaced.
In some embodiments, the designated APP is further configured to display a calibration scheme for calibrating the lens parameters to a user through an interactive interface, receive an instruction for the user to select the calibration scheme, and prompt the user to calibrate the lens parameters based on the calibration scheme selected by the user.
In some embodiments, the designated APP is further configured to receive a flying height input by the user and to determine, according to the flying height, the shooting distance at which the surveying camera acquires images of the photographed target in the surveying area; when the shooting distance is smaller than a preset distance, the designated APP prompts the user through an interactive interface that focusing is not possible.
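The APP-side check can be pictured as below. Estimating the shooting distance as flying height minus terrain height, and comparing it against a preset minimum (infinite distance threshold), are assumptions for illustration; the function names and warning text are likewise hypothetical.

```python
# Sketch of the APP check: warn when the shooting distance implied by the
# flying height is too short for the mapping camera to focus.
def check_focusable(flying_height_m: float, terrain_height_m: float,
                    preset_min_distance_m: float, warn) -> bool:
    shooting_distance = flying_height_m - terrain_height_m
    if shooting_distance < preset_min_distance_m:
        warn(f"Shooting distance {shooting_distance:.0f} m is below the "
             f"{preset_min_distance_m:.0f} m minimum; the camera cannot focus. "
             "Please increase the flying height.")
        return False
    return True

# Example: flying at 80 m over 30 m terrain with a 60 m minimum distance triggers the warning.
check_focusable(80.0, 30.0, 60.0, warn=print)
```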
For details of the specific implementation of acquiring the image of the mapping area by using the mapping camera, reference may be made to the description of each embodiment of the foregoing method, which is not described herein again.
Accordingly, the present specification further provides a computer storage medium in which a program is stored; when the program is executed by a processor, the control method of the surveying and mapping camera in any one of the above embodiments is implemented.
Embodiments of the present description may take the form of a computer program product embodied on one or more storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having program code embodied therein. Computer-usable storage media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of the storage medium of the computer include, but are not limited to: phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technologies, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium, may be used to store information that may be accessed by a computing device.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The method and apparatus provided by the embodiments of the present invention have been described in detail above. Specific examples are used herein to explain the principles and implementations of the invention, and the description of the embodiments is intended only to help in understanding the method and its core idea. A person skilled in the art may, following the idea of the present invention, make changes to the specific implementations and the scope of application; in summary, the content of this specification should not be construed as limiting the present invention.

Claims (45)

1. A method of controlling a mapping camera, the mapping camera being mounted on an unmanned aerial vehicle, the method comprising:
in the process that the unmanned aerial vehicle carries out surveying and mapping operation according to a target route, determining a target position where a focusing lens group of a lens is located when the lens of the surveying and mapping camera is focused to infinity;
and adjusting the focusing lens group to the target position and performing focus locking, so as to control the mapping camera to acquire images of a mapping area while the unmanned aerial vehicle flies along the target route.
2. The method of claim 1, wherein determining the target position of the lens of the mapping camera when the lens is focused to infinity comprises:
determining the target position according to the temperature of the environment where the lens of the mapping camera is located and calibration data, wherein the calibration data is used for representing the position of the focusing lens group when the lens of the mapping camera focuses to infinity at different temperatures, and/or
And determining the target position in an automatic focusing mode.
3. The method according to claim 2, characterized in that the mapping camera comprises a temperature sensor by means of which the temperature of the environment in which the lens of the mapping camera is located is acquired.
4. The method of claim 2 or 3, wherein determining the target position by means of auto-focusing comprises:
when an instruction of focusing to infinity is received, executing automatic focusing operation to determine the target position, wherein the instruction of focusing to infinity is triggered when the unmanned aerial vehicle is located at a specified position, and when the unmanned aerial vehicle is located at the specified position, a focusing target is located in an infinite focusing range of the lens.
5. The method of claim 4, wherein the focus to infinity instruction is sent by a flight control system of the drone to the mapping camera through a preset data transfer protocol upon determining that the drone is in the designated location; or
The focus to infinity instruction is triggered by a flight control system of the drone through a hardware trigger interface on the mapping camera upon determining that the drone is in the specified location.
6. The method according to claim 4 or 5, wherein the focusing target is a photographed target within the mapping area, and the specified position includes:
a starting waypoint of the target route; and/or
a buffer zone outside the target route; wherein the target route comprises a plurality of route segments, the buffer zone is an extension beyond the end waypoint of each route segment, and the buffer zone is used by the unmanned aerial vehicle when switching between two route segments.
7. The method of claim 6, wherein the mapping camera stops capturing images while the drone is in the buffer zone.
8. The method of any of claims 4-7, wherein the drone includes a pan-tilt, and the mapping camera is carried on the drone by the pan-tilt.
9. The method of claim 8, further comprising, prior to performing the autofocus operation:
and controlling the pan-tilt to rotate so as to adjust the lens of the mapping camera to face the ground.
10. The method according to any one of claims 1-9, further comprising:
when the lens of the mapping camera is detected to be replaced, prompting a user to calibrate the lens parameters of the replaced lens through an interactive interface of a control terminal of the unmanned aerial vehicle.
11. The method of claim 10, wherein the lens parameters comprise one or more of:
the target position of the focusing lens group when the lens is focused to infinity, the internal parameters of the lens, and the distortion correction parameters of the lens.
12. The method according to claim 11, wherein the lens parameter is the target position, and after the user is prompted to calibrate the lens parameter of the replaced lens through an interactive interface of a control terminal of the drone, the method further comprises:
receiving a calibration scheme which is selected by a user through the interactive interface and used for calibrating the target position;
and calibrating the target position based on the calibration scheme.
13. The method of claim 12, wherein the calibration scheme comprises:
calibrating the target position by using a calibration plate; and/or
and controlling the unmanned aerial vehicle to fly to a specified height, adjusting the lens to face the ground, and performing an automatic focusing operation to calibrate the target position, wherein after the unmanned aerial vehicle flies to the specified height, objects on the ground are located within the infinite focusing range of the surveying and mapping camera.
14. The method according to any one of claims 10-13, wherein a lens identification of the lens is bound to the lens parameters, and wherein detecting that a lens change of the mapping camera has occurred comprises:
and when the lens identification of the current lens is detected to be inconsistent with the lens identification stored by the mapping camera, determining that the lens of the mapping camera is replaced.
15. The method of claim 14, wherein a data transfer protocol between the lens of the mapping camera and the body of the mapping camera includes a lens identification field.
16. The method according to any one of claims 10 to 15, wherein after prompting the user to calibrate the lens parameters of the replaced lens, the method further comprises:
and correspondingly storing the lens identification of the replaced lens and the lens parameters of the replaced lens obtained by user calibration.
17. The method according to any one of claims 1-9, further comprising:
and when the lens of the mapping camera is detected to be replaced, searching the lens parameters of the replaced lens from the prestored lens parameters of various lenses based on the lens identification of the replaced lens.
18. The method of claim 1, wherein image acquisition of the survey area during flight of the drone by the target route comprises:
when the shot target is determined to be located outside the infinite focusing range of the mapping camera, the aperture of the mapping camera is adjusted, so that the shot target is located within the infinite focusing range of the mapping camera, and then the image of the shot target is acquired.
19. The method of any one of claims 1-18, wherein the target route is determined based on:
acquiring an elevation map of the mapping area;
and determining the flight height of the unmanned aerial vehicle according to the elevation map so as to generate the target route according to the flight height.
20. The method of any of claims 1-19, wherein the mapping camera supports lenses of different focal lengths.
21. A mapping camera mounted on a drone, the mapping camera comprising a processor, a memory, and a computer program stored in the memory and executable by the processor, wherein the processor, when executing the computer program, implements the following steps:
in the process that the unmanned aerial vehicle carries out surveying and mapping operation according to a target route, determining a target position where a focusing lens group of a lens is located when the lens of the surveying and mapping camera is focused to infinity;
and adjusting the focusing lens group to the target position and performing focus locking, so as to control the mapping camera to acquire images of a mapping area while the unmanned aerial vehicle flies along the target route.
22. The surveying camera according to claim 21, wherein the processor is configured to determine the target position of the focusing lens group of the lens when the lens of the surveying camera is focused to infinity, and is specifically configured to:
determining the target position according to the temperature of the environment where the lens of the mapping camera is located and calibration data, wherein the calibration data is used for representing the position of the focusing lens group when the lens of the mapping camera focuses to infinity at different temperatures, and/or
And determining the target position in an automatic focusing mode.
23. Mapping camera according to claim 22, characterized in that the mapping camera comprises a temperature sensor by means of which the temperature of the environment in which the lens of the mapping camera is located is acquired.
24. Mapping camera according to claim 22 or 23, wherein the processor, when determining the target position in an autofocus manner, is specifically configured to:
when an instruction of focusing to infinity is received, executing automatic focusing operation to determine the target position, wherein the instruction of focusing to infinity is triggered when the unmanned aerial vehicle is located at a specified position, and when the unmanned aerial vehicle is located at the specified position, a focusing target is located in an infinite focusing range of the lens.
25. The mapping camera of claim 24, wherein the focus to infinity instruction is sent by the drone's flight control system to the mapping camera via a preset data transfer protocol upon determining that the drone is in the designated location; or
The focus to infinity instruction is triggered by a flight control system of the drone through a hardware trigger interface on the mapping camera upon determining that the drone is in the specified location.
26. The surveying camera according to claim 24 or 25, wherein the in-focus target is a photographed target within the surveying area, the specified position includes:
a starting waypoint of the target route; and/or
a buffer zone outside the target route; wherein the target route comprises a plurality of route segments, the buffer zone is an extension beyond the end waypoint of each route segment, and the buffer zone is used by the unmanned aerial vehicle when switching between two route segments.
27. The mapping camera of claim 26, wherein the mapping camera stops capturing images while the drone is in the buffer region.
28. A mapping camera according to any of claims 24-27, wherein the drone includes a pan-tilt through which the mapping camera is carried on the drone.
29. The mapping camera of claim 28, wherein the processor, prior to performing the autofocus operation, is further configured to:
and controlling the pan-tilt to rotate so as to adjust the lens of the mapping camera to face the ground.
30. The mapping camera of any of claims 21-29, wherein the processor is further configured to:
when the lens of the mapping camera is detected to be replaced, prompting a user to calibrate the lens parameters of the replaced lens through an interactive interface of a control terminal of the unmanned aerial vehicle.
31. The mapping camera of claim 30, wherein the lens parameters include one or more of:
the target position of the focusing lens group when the lens is focused to infinity, the internal parameters of the lens, and the distortion correction parameters of the lens.
32. The surveying and mapping camera according to claim 31, wherein the lens parameter is the target position, and after the user is prompted to calibrate the lens parameters of the replaced lens through the interactive interface of the control terminal of the drone, the processor is further configured to:
receiving a calibration scheme which is selected by a user through the interactive interface and used for calibrating the target position;
and calibrating the target position based on the calibration scheme.
33. The mapping camera of claim 32, wherein the calibration scheme comprises:
calibrating the target position by using a calibration plate; and/or
and controlling the unmanned aerial vehicle to fly to a specified height, adjusting the lens to face the ground, and performing an automatic focusing operation to calibrate the target position, wherein after the unmanned aerial vehicle flies to the specified height, objects on the ground are located within the infinite focusing range of the surveying and mapping camera.
34. A mapping camera according to any of claims 30-33, wherein the lens identification of the lens is bound to the lens parameters, and the processor is configured to detect that the lens of the mapping camera is replaced, and in particular to:
and when the lens identification of the current lens is detected to be inconsistent with the lens identification stored by the mapping camera, determining that the lens of the mapping camera is replaced.
35. The mapping camera of claim 34, wherein a data transfer protocol between the lens of the mapping camera and the body of the mapping camera includes a lens identification field.
36. A mapping camera according to any of claims 30-35, wherein the processor is configured to prompt the user to calibrate lens parameters of the replaced lens, and further configured to:
and correspondingly storing the lens identification of the replaced lens and the lens parameters of the replaced lens obtained by user calibration.
37. The mapping camera of any of claims 21-29, wherein the processor is further configured to:
and when the lens of the mapping camera is detected to be replaced, searching the lens parameters of the replaced lens from the prestored lens parameters of various lenses based on the lens identification of the replaced lens.
38. The surveying camera according to claim 21, wherein the processor is configured to, when the unmanned aerial vehicle is flying along a target route for image acquisition of a surveying area, in particular:
when the shot target is determined to be located outside the infinite focusing range of the mapping camera, the aperture of the mapping camera is adjusted, so that the shot target is located within the infinite focusing range of the mapping camera, and then the image of the shot target is acquired.
39. The mapping camera of any of claims 21-38, wherein the target course is determined based on:
acquiring an elevation map of the mapping area;
and determining the flight height of the unmanned aerial vehicle according to the elevation map so as to generate the target route according to the flight height.
40. A mapping camera according to any of claims 22-39, wherein the mapping camera supports lenses of different focal lengths.
41. A drone, characterized in that it comprises a surveying camera according to any one of claims 21-40.
42. A surveying and mapping system is characterized by comprising an unmanned aerial vehicle and a control terminal, wherein the control terminal is provided with a designated APP, and the unmanned aerial vehicle comprises a surveying and mapping camera;
the designated APP is used for receiving a target route input by a user and sending the target route to the unmanned aerial vehicle;
the mapping camera comprises a processor, a memory, and a computer program stored in the memory for execution by the processor, the computer program when executed by the processor implementing the steps of:
in the process that the unmanned aerial vehicle conducts the surveying and mapping operation according to the target route, determining a target position where a focusing lens group of a lens is located when the lens of the surveying and mapping camera is focused to infinity;
and adjusting the focusing lens group to the target position and performing focus locking, so as to control the mapping camera to acquire images of a mapping area while the unmanned aerial vehicle flies along the target route.
43. The system according to claim 42, wherein the designated APP is further configured to prompt a user via an interactive interface to calibrate lens parameters of a lens after replacement when the lens of the surveying camera is replaced.
44. The mapping system according to claim 42, wherein the designated APP is further configured to display a calibration scheme for calibrating the lens parameters to a user through an interactive interface, receive an instruction for selecting the calibration scheme from the user, and prompt the user to calibrate the lens parameters based on the calibration scheme selected by the user.
45. The mapping system of claim 42, wherein the designated APP is further configured to: receive a flying height input by the user; determine, according to the flying height, the shooting distance at which the mapping camera acquires images of a photographed target in the mapping area; and, when the shooting distance is smaller than a preset distance, prompt the user through an interactive interface that focusing is not possible.
CN202080040708.3A 2020-10-13 2020-10-13 Surveying camera control method, surveying camera, unmanned aerial vehicle and surveying system Pending CN113906362A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/120703 WO2022077236A1 (en) 2020-10-13 2020-10-13 Control method for mapping camera, mapping camera, unmanned aerial vehicle, and mapping system

Publications (1)

Publication Number Publication Date
CN113906362A true CN113906362A (en) 2022-01-07

Family

ID=79186972

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080040708.3A Pending CN113906362A (en) 2020-10-13 2020-10-13 Surveying camera control method, surveying camera, unmanned aerial vehicle and surveying system

Country Status (2)

Country Link
CN (1) CN113906362A (en)
WO (1) WO2022077236A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114993263B (en) * 2022-05-26 2023-11-21 邓州市邓房测绘有限公司 High-precision unmanned aerial vehicle mapping system for building based on level point positioning

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4377744A (en) * 1980-07-14 1983-03-22 The United States Of America As Represented By The Secretary Of The Navy Remote lens focusing system for an aerial camera
CN203704924U (en) * 2014-02-20 2014-07-09 北京天元四维科技有限公司 Digital aerial photography real-time gesture precision control sensing device
CN104730539A (en) * 2015-03-06 2015-06-24 河南四维远见信息技术有限公司 Low-altitude light and small infrared and laser radar integrated system
CN204556829U (en) * 2015-03-06 2015-08-12 河南四维远见信息技术有限公司 A kind of small low-altitude light is infrared with laser radar integrated system
CN108141522A (en) * 2015-11-30 2018-06-08 深圳市大疆创新科技有限公司 Imaging system and method
CN109905604A (en) * 2019-03-29 2019-06-18 深圳市道通智能航空技术有限公司 Focusing method, device, capture apparatus and aircraft

Also Published As

Publication number Publication date
WO2022077236A1 (en) 2022-04-21

Similar Documents

Publication Publication Date Title
WO2018072657A1 (en) Image processing method, image processing device, multi-camera photographing device, and aerial vehicle
WO2017120771A1 (en) Depth information acquisition method and apparatus, and image collection device
WO2019113966A1 (en) Obstacle avoidance method and device, and unmanned aerial vehicle
KR101450702B1 (en) System for editing taken air photograph by maintainance vertical position against earth surface
JP2007116666A (en) Surveillance camera apparatus and surveillance camera system
CN107135349A (en) Picture pick-up device, lens unit, camera system and its control method
KR101214081B1 (en) Image expression mapping system using space image and numeric information
US11022858B2 (en) Multiple camera apparatus and method for synchronized autofocus
WO2017117749A1 (en) Follow focus system and method based on multiple ranging approaches, and photographing system
KR20090105290A (en) GPS Based Digital Orthogonal Metric Aerial Photograph Auto-Control System
JP7136079B2 (en) Information processing device, information processing method and information processing program
US20200145568A1 (en) Electro-optical imager field of regard coverage using vehicle motion
WO2021031159A1 (en) Match photographing method, electronic device, unmanned aerial vehicle and storage medium
JP2018007051A (en) Photographing apparatus, movable photographing device, photographing mobile body, and mobile body photographing control device
CN105704393A (en) Image-capturing apparatus and image-capturing direction control method
WO2019205087A1 (en) Image stabilization method and device
CN101315509B (en) Imaging apparatus
CN113906362A (en) Surveying camera control method, surveying camera, unmanned aerial vehicle and surveying system
US11310423B2 (en) Image capturing method and image capturing apparatus
CN108419052A (en) A kind of more unmanned plane method for panoramic imaging
JP5277600B2 (en) Overhead radiography system and method
KR101249914B1 (en) System for editing taken air photograph by maintainance vertical position against earth surface
JP2019219874A (en) Autonomous moving and imaging control system and autonomous moving body
JP7468523B2 (en) MOBILE BODY, POSITION ESTIMATION METHOD, AND PROGRAM
WO2022000211A1 (en) Photography system control method, device, movable platform, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination