WO2022016348A1 - Device control method, apparatus, and computer-readable storage medium


Info

Publication number: WO2022016348A1
Application number: PCT/CN2020/103156
Authority: WIPO (PCT)
Prior art keywords: route, heading, UAV, interval, camera
Other languages: English (en), French (fr)
Inventors: 黄振昊, 何纲, 方朝晖
Original assignee: 深圳市大疆创新科技有限公司
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/103156 (WO2022016348A1)
Priority to CN202080042367.3A (CN113950610B)
Publication of WO2022016348A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/20 Instruments for performing navigational calculations
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10 Simultaneous control of position or course in three dimensions
    • G05D 1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft

Definitions

  • the present application relates to the technical field of drone control, and in particular, to a device control method, device, and computer-readable storage medium.
  • UAVs are widely used in the field of surveying and mapping, to realize the surveying and mapping of the operating area by taking pictures of the operating area through the camera of the UAV.
  • the UAV's route planning and mission efficiency are very important.
  • in the prior art, the UAV and the camera typically use preset, fixed working parameters to plan the route for subsequent shooting. To improve the efficiency of the shooting task, the performance of the drone and its camera has to be improved, for example by increasing the working power of the drone to raise its flight speed, or by improving the shooting accuracy of the camera to meet the demand for mapping results.
  • the present application provides a device control method, device, and computer-readable storage medium, which can solve the problem in the prior art that simply increasing the performance of an unmanned aerial vehicle and its camera to achieve task efficiency optimization will lead to a substantial increase in surveying and mapping costs.
  • an embodiment of the present application provides a device control method, including:
  • the relative direction relationship between the frame direction and the heading is adjusted, and the route is re-planned.
  • an embodiment of the present application provides a device control method, including:
  • an embodiment of the present application provides a device control apparatus, including: an acquisition module and a processing module;
  • the acquisition module is used to acquire the operation area where the drone performs the shooting task
  • the processing module is configured to plan a route in the operation area according to the relative directional relationship between the frame direction of the camera of the UAV and the heading of the UAV;
  • the relative direction relationship between the frame direction and the heading is adjusted, and the route is re-planned.
  • an embodiment of the present application provides a device control apparatus, including: an acquisition module and a processing module;
  • the acquisition module is used to acquire the operation area where the drone performs the shooting task
  • the processing module is configured to, in a plurality of different relative directional relationships between the frame direction of the drone when photographing the work area and the heading of the drone, for each planning a route in the operation area according to the relative direction relationship, and determining the mission parameters when the drone performs the shooting task along the planned route;
  • the present application provides a computer-readable storage medium, the computer-readable storage medium comprising instructions that, when executed on a computer, cause the computer to perform the method described in the above aspects.
  • the present application provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method described in the above aspects.
  • before the UAV performs the shooting task, the present application plans a corresponding flight route and calculates the task parameters for the UAV to perform the shooting task along the route; in this way, the operation efficiency of performing the shooting task according to the relative direction relationship and the route is given a reference measure, and the task parameters can be further judged.
  • FIG. 1 is a system architecture diagram corresponding to a device control method provided by an embodiment of the present application
  • FIG. 2 is a scene diagram of a device control method provided by an embodiment of the present application.
  • FIG. 3 is a scene diagram of another device control method provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a route provided by an embodiment of the present application.
  • FIG. 6 is a specific flowchart of a device control method provided by an embodiment of the present application.
  • FIG. 7 is an imaging schematic diagram of a camera provided by an embodiment of the present application.
  • FIG. 8 is an azimuth relationship diagram between two adjacent images captured by a camera according to an embodiment of the present application.
  • FIG. 9 is an orientation relationship diagram between two adjacent images captured by another camera according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of another route provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of another route provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of another route provided by an embodiment of the present application.
  • FIG. 13 is a flowchart of another device control method provided by an embodiment of the present application.
  • FIG. 14 is a block diagram of a device control apparatus provided by an embodiment of the present application.
  • FIG. 15 is a block diagram of another device control apparatus provided by an embodiment of the present application.
  • FIG. 1 shows a system architecture diagram corresponding to a device control method provided by an embodiment of the present application, including: an unmanned aerial vehicle 10 and a control device 20, and the unmanned aerial vehicle 10 may include a camera 11.
  • the control device 20 is connected to the UAV 10 by wire or wirelessly; the control device 20 can acquire data, such as operating parameters and control instructions, and control the UAV 10 and the camera 11 to operate by processing the data.
  • the control device 20 may be integrated on the drone 10, or may be set independently of the drone 10, which is not limited in this embodiment of the present application.
  • FIG. 2 shows a scene diagram of a device control method provided by an embodiment of the present application.
  • the camera 11 is used as a load of the drone 10 to perform a photographing task facing the work area 30 .
  • the orientation of the camera 11 can be represented by the relative directional relationship between the frame direction and the heading of the UAV 10.
  • the frame direction can refer to the extension direction of the long side of a single image captured by the camera, or the direction parallel to the normal direction of the long side; it can also refer to the extension direction of the short side of a single image captured by the camera, or the direction parallel to the normal direction of the short side.
  • the rectangular area corresponding to the single image captured by the camera forms a mapping relationship with the scene area actually covered by the captured image.
  • the above frame direction may also refer to the extending direction of the long side of the rectangular coverage area 31 of the single image captured by the camera, or the direction parallel to the normal direction of the long side. It may also refer to the extension direction of the short side of the rectangular coverage area 31 of the single image captured by the camera, or the direction parallel to the normal direction of the short side.
  • the frame direction may also include the direction of a certain reference line in a single image or in the rectangular coverage area 31 , for example, the extension direction of the diagonal of the rectangle, or a preset angle with the long side. other directions.
  • the frame direction is taken as the extending direction of the long sides of the rectangular coverage area 31 or the direction parallel to the normal direction of the long sides for description.
  • FIG. 2 shows that the shooting attitude of the camera 11 is such that the direction parallel to the normal direction of the long side of the rectangular coverage area 31 is parallel to the heading X of the drone 10.
  • FIG. 3 shows that the shooting posture of the camera 11 is that the extending direction of the long side of the rectangular coverage area 31 is parallel to the heading X of the UAV 10 .
  • in this case, the rectangular coverage area 31 can only cover one side of the operation area 30 in a single pass, and the other side of the operation area has not yet been surveyed and mapped.
  • the planned route 32 therefore has to detour to the area on the other side.
  • as a result, the route 32 of the UAV 10 has a long total length and takes a long time to fly.
  • a task parameter condition may be preset, and the task parameters when the UAV performs the shooting task are required to meet the task parameter condition before the shooting task is executed.
  • the control device can plan the route in the operation area based on the relative directional relationship between the frame direction of the UAV's camera and the UAV's heading, and determine the mission parameters when the UAV performs the shooting task along the route. If the task parameter conditions are met, the control device further controls the UAV to perform the shooting task according to the above-mentioned relative direction relationship and route; if the task parameters do not meet the task parameter conditions, the control device controls the UAV to adjust the relative direction relationship between the frame direction and the heading, and re-plans the route until the mission parameters meet the mission parameter conditions, and then controls the UAV to perform the shooting task according to the new relative direction relationship and route.
  • the task parameter conditions can be preset, and the task parameters when the UAV performs the shooting task must meet the task parameter conditions.
  • in another implementation, for each of a plurality of different relative directional relationships between the frame direction of the camera of the drone and the heading of the drone, the control device can plan a corresponding route in the operation area and determine the mission parameters for the drone to perform the shooting task along each route; then, after the control device determines, among all the relative direction relationships, one or more target relative direction relationships and target routes that meet the task parameter conditions, it can automatically control the unmanned aerial vehicle to perform the shooting task according to the target relative direction relationship and corresponding target route with the optimal mission parameters, or, according to the user's choice, control the drone to perform the shooting task according to the target relative direction relationship selected by the user and the corresponding target route.
  • in this way, before the drone performs the shooting task, a corresponding route is planned and the mission parameters of the unmanned aerial vehicle are calculated for each relative directional relationship between the frame direction of the camera and the heading of the drone.
  • the relative directional relationships and routes with higher operational efficiency can thus be screened out, and the UAV can then be controlled to execute the shooting task according to a relative directional relationship and route with higher operational efficiency, improving work efficiency.
  • FIG. 4 is a flowchart of a device control method provided by an embodiment of the present application. As shown in FIG. 4 , the method may include:
  • Step 101 Obtain the operation area where the drone performs the shooting task.
  • the operation area where the UAV performs the shooting task is usually known, and the control device of the UAV can receive the coordinates of the operation area and store it.
  • the contour of the working area may be a regular shape or an irregular shape, which is not limited in this embodiment of the present application.
  • Step 102 Plan a route in the operation area according to the relative directional relationship between the frame direction of the camera of the drone and the heading of the drone.
  • the route is planned in the work area according to the relative directional relationship.
  • the route planning can be realized based on the size of the work area and the position interval of the UAV when it moves along the course to capture two adjacent images.
  • FIG. 5 shows a schematic diagram of a route provided by an embodiment of the present application.
  • the unmanned aerial vehicle needs to turn back several times in the operation area 30,
  • performing a detour operation so that the photographed images cover the entire operation area; the route 32 planned for the UAV therefore usually includes multiple single paths, and the route 32 in FIG. 5 has three single paths.
  • the position interval can include the heading interval and the side interval.
  • the heading interval and the side interval can be calculated according to the heading overlap rate and the side overlap rate.
  • the heading overlap rate, the side overlap rate and the size of the operation area are known parameters, which can be obtained when the shooting task is determined.
  • specifically, the circumscribed rectangle of the operation area can be determined first; according to the length of the circumscribed rectangle and the heading interval, the length of a single path required for the UAV to move along the course in the operation area is determined.
  • then, according to the width of the circumscribed rectangle and the lateral spacing, the number of single paths required for the UAV to move along the course in the work area is determined, and the multiple single paths are arranged in turn so that
  • the initial route is obtained; finally, according to the outline of the operation area within the circumscribed rectangle, the initial route is fine-tuned so that it is located entirely in the operation area, and the planned route for the operation area is obtained.
  • Step 103 Determine mission parameters when the UAV performs the photographing mission along the route.
  • the task parameters are used to measure the efficiency of the UAV when the UAV performs the shooting task according to the current relative direction relationship and the route.
  • the task parameters may include: the total length of the route, the time required to complete the route, the number of images taken by the camera to complete the route, and so on.
  • the mission parameters may also include parameters such as flight height above the ground, ground resolution, and camera intrinsic parameters. Since these parameters are generally predefined before the flight, they can either be changed according to the actual situation or kept fixed while parameters such as the route, the completion time, and the number of images are adjusted.
  • the total length of the route can be obtained; according to the moving speed and total length of the route when the UAV performs the shooting task, the time required to complete the route can be obtained; according to the heading distance when the UAV performs the shooting task and the total length of the route, the number of images captured by the camera to complete the route can be obtained.
  • Step 104 In the case that the task parameters do not meet the preset task parameter conditions, adjust the relative direction relationship between the frame direction and the heading, and re-plan the route.
  • when the mission parameters include the total length of the route, the time required to complete the route, and the number of images captured by the camera to complete the route,
  • mission efficiency requires the total length of the route to be as short as possible, the time required to complete the route to be as short as possible, and the route to be completed with as few images from the camera as possible.
  • the mission parameter conditions can therefore be set according to the specific mission parameters and actual needs; when the mission parameters calculated according to the current relative direction relationship satisfy the mission parameter conditions, the UAV can be controlled according to the current relative direction relationship and the corresponding route to
  • execute the shooting task. If the task parameters calculated according to the current relative direction relationship do not meet the task parameter conditions, the UAV is flexibly controlled to change the relative direction relationship between the camera's frame direction and the UAV's heading, so that
  • a new route and new mission parameters are obtained; once the mission parameters of the new route meet the mission parameter conditions, the UAV is controlled to perform the shooting task according to the new relative direction relationship and the new route.
  • controlling the UAV to change the relative directional relationship between the frame direction of the camera and the heading of the UAV can be achieved by controlling the UAV to rotate the camera, for example, when the camera is installed on the gimbal of the UAV
  • the gimbal can be controlled to drive the camera to rotate, thereby changing the relative directional relationship between the frame direction of the camera and the heading of the drone.
  • the task parameter conditions can be set as follows: the time spent performing the shooting task cannot exceed 1 hour, and the images captured during the shooting task cannot exceed 10,000. Then, after the corresponding task parameters are obtained according to the route planned according to the current relative direction relationship, it can be determined whether the task parameters meet the task parameter conditions according to the above task parameter conditions.
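  • As an illustration of this check only, the following Python sketch computes the two task parameters from a route's total length and tests them against the thresholds of the example above; the route length, flight speed, and heading interval used in the example call are hypothetical values.

```python
# Minimal sketch of the task-parameter check of step 104, assuming the
# thresholds from the example above (1 hour, 10,000 images). The route
# length, flight speed and heading interval are inputs; they would come
# from the planning steps described later in the text.

MAX_TIME_S = 3600.0      # allowed task duration: 1 hour
MAX_IMAGES = 10_000      # allowed number of captured images

def task_parameters(total_length_m: float, speed_m_s: float,
                    heading_interval_m: float) -> dict:
    """Task parameters for a planned route: completion time and photo count."""
    return {
        "time_s": total_length_m / speed_m_s,
        "num_images": total_length_m / heading_interval_m,
    }

def meets_conditions(params: dict) -> bool:
    return params["time_s"] <= MAX_TIME_S and params["num_images"] <= MAX_IMAGES

# Example: a 45 km route flown at 15 m/s with a 20 m heading interval.
params = task_parameters(45_000.0, 15.0, 20.0)
print(params, meets_conditions(params))   # time 3000 s, 2250 images -> True
```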
  • in summary, before the UAV performs the shooting task, the corresponding route is planned according to the relative directional relationship between the frame direction of the UAV's camera and the heading of the UAV, and the mission parameters for the UAV to perform the shooting task along the route are calculated; in this way, the operation efficiency of performing the shooting task according to that relative direction relationship and route is given a reference measure, which can be used for further evaluation of whether the requirements are met.
  • if the requirements are not met, the relative directional relationship between the camera's frame direction and the UAV's heading can be changed by flexibly controlling the UAV, so that
  • the direction relationship finally meets the needs and the attitude of the camera relative to the heading is optimized, thereby improving the operational efficiency of the UAV.
  • FIG. 6 is a specific flowchart of a device control method provided by an embodiment of the present application, and the method may include:
  • Step 201 Obtain the operation area where the drone performs the shooting task.
  • step 201 reference may be made to the foregoing step 101, and details are not repeated here.
  • Step 202 According to the relative direction relationship, determine the position interval of the drone when the drone moves along the heading to capture two adjacent images.
  • the camera in order to ensure the continuity of the front and rear and left and right of the picture in the UAV surveying and mapping results, it is necessary for the camera to have a certain overlapping area between the two adjacent images, which are adjacent to the front and rear, and adjacent to the left and right. Specifically, it is necessary to determine the position interval of the UAV when the UAV moves along the course to capture two adjacent images, and realize the planning of the route according to the position interval.
  • the position interval includes: the position interval of the drone when moving on a single path of the flight route to take two adjacent images, and the separation distance between the images on two adjacent single paths of the flight route.
  • step 202 may specifically include:
  • Sub-step 2021 Determine the shooting overlap ratio of the camera according to the relative direction relationship.
  • the shooting overlap rate of the camera of the drone can be further set according to the relative direction relationship; the shooting overlap rate is used to limit, when the camera shoots images,
  • the area ratio of the overlapping area between two adjacent images. For example, in aerial surveying and mapping, the shooting overlap rate between two adjacent images is generally 60%, that is, the ratio of the length of the overlapping area to the length of the image is 60%.
  • Sub-step 2022 Determine the position interval according to the shooting overlap rate, the ground resolution of the camera, and the flying height of the drone.
  • according to the ground resolution of the camera and the flying height of the drone, the size of the rectangular coverage area on the ground corresponding to an image captured by the camera can be determined; combined with the shooting
  • overlap rate, the position interval of the drone when it moves along the heading to take two adjacent images is obtained.
  • the position interval includes: a heading interval and a side interval
  • the shooting overlap rate includes: a heading overlap rate and a side overlap rate
  • sub-step 2022 may specifically include:
  • Sub-step A1 Determine the lengths of the short side and the long side of the frame reference area of the camera according to the ground resolution and the flying height, and the frame reference area is rectangular in shape.
  • FIG. 7 shows a schematic imaging diagram of a camera provided by an embodiment of the present application, in which the total size of the image sensor of the camera is s, the size of a single pixel is d, the corresponding ground resolution is D, the flying height is H, and the corresponding camera focal length is f.
  • the size of the frame reference area of the camera, that is, the rectangular coverage area on the ground corresponding to an image captured by the camera, follows from the similar-triangle relationship D/d = H/f,
  • so the size S of the frame reference area can be obtained as S = s × H/f = s × D/d.
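  • The relation above can be sketched as follows; this is a minimal illustration assuming the usual pinhole-camera similar-triangle model, and the sensor size, pixel size, focal length, and flying height in the example are illustrative values, not values from this application.

```python
# Sketch of the FIG. 7 relation, assuming the pinhole-camera model:
# ground resolution D = d * H / f, and the ground footprint of one frame
# is S = s * H / f = s * D / d. All numeric values are illustrative.

def frame_reference_area(sensor_w_mm, sensor_h_mm, pixel_size_um,
                         flight_height_m, focal_length_mm):
    """Return (long side, short side) of the ground footprint in metres, plus the GSD."""
    scale = flight_height_m / focal_length_mm          # H / f (mm cancels)
    gsd_m = (pixel_size_um / 1000.0) * scale           # D = d * H / f
    long_m = max(sensor_w_mm, sensor_h_mm) * scale     # S_long = s_long * H / f
    short_m = min(sensor_w_mm, sensor_h_mm) * scale    # S_short = s_short * H / f
    return long_m, short_m, gsd_m

# Illustrative full-frame-like values: 36 x 24 mm sensor, 4.5 um pixels,
# 35 mm lens flown at 100 m above ground.
print(frame_reference_area(36.0, 24.0, 4.5, 100.0, 35.0))
# -> roughly a 102.9 m x 68.6 m footprint at about 1.3 cm GSD
```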
  • Sub-step A2 Determine the heading interval according to the length of the short side of the frame reference area and the heading overlap ratio.
  • the position interval includes: the heading interval and the side interval
  • the shooting overlap rate includes: the heading overlap rate and the side overlap rate.
  • the heading interval and the heading overlap rate reflect the overlapping characteristics between two images adjacent front-and-back along the course when the camera captures images;
  • the side interval and the side overlap rate reflect the overlapping characteristics between two images adjacent left-and-right when the camera captures images.
  • the frame direction includes: the extension direction of the long side of the frame reference area, or the direction parallel to the normal direction of the long side, or the extension direction of the short side of the frame reference area, or the direction parallel to the normal direction of the short side.
  • taking the case where the frame direction of the camera of the drone is the short-side direction or the long-side direction of the frame reference area as an example,
  • two kinds of relative direction relationships are commonly used: the extension direction of the long side of the frame reference area is parallel to the heading of the drone, or
  • the direction parallel to the normal direction of the long side is parallel to the heading of the drone.
  • FIG. 8 shows an orientation relationship diagram between two adjacent images captured by a camera provided by an embodiment of the present application, in which the direction parallel to the normal direction of the long side of the frame reference area of the camera is parallel to the heading X of the drone, and the camera captures two adjacent images; the frame reference areas of the two adjacent images are the area ABKF and the area EJCD, respectively, the long-side dimension of the frame reference area is S_long, and the short-side dimension is S_short.
  • a heading overlap area EJKF is created between the area ABKF and the area EJCD.
  • FIG. 9 shows an azimuth relationship diagram between two adjacent images captured by another camera provided by an embodiment of the present application, in which the extension direction of the long side of the frame reference area of the camera is parallel to the heading X of the UAV, and the camera captures two adjacent images; the frame reference areas of the two adjacent images are the area A'B'K'F' and the area E'J'C'D', respectively, the long-side dimension of the frame reference area is S_long, and the short-side dimension is S_short.
  • a heading overlap area E'J'K'F' is created between area A'B'K'F' and area E'J'C'D'.
  • accordingly, when the extension direction of the long side is parallel to the heading (FIG. 9), the heading interval is S_long × (1-P%); when the direction parallel to the normal direction of the long side is parallel to the heading (FIG. 8), the heading interval is S_short × (1-P%), where P% denotes the heading overlap rate.
  • Sub-step A3 Determine the lateral interval according to the length of the long side of the frame reference area and the lateral overlap ratio.
  • taking the case where the frame direction of the camera of the drone is the extension direction of the long side of the frame reference area or the direction parallel to the normal direction of the long side as an example,
  • two kinds of relative directional relationships are commonly used:
  • the extension direction of the long side of the frame reference area is parallel to the heading of the UAV, or the direction parallel to the normal direction of the long side is parallel to the heading of the UAV.
  • the lateral spacing is solved for these two relative directional relationships as follows:
  • the direction parallel to the normal direction of the long side of the frame reference area of the camera is parallel to the heading X of the drone, and the camera has captured two adjacent images on the left and right;
  • the frame reference areas of the two adjacent images are respectively the area ABKF and the area ILMH; the long-side dimension of the frame reference area is S_long, and the short-side dimension is S_short.
  • a side overlap area IBKH is created between the area ABKF and the area ILMH.
  • the lateral interval KM is therefore S_long × (1-Q%), where Q% denotes the side overlap rate.
  • in the second case, the long-side extension direction of the frame reference area of the camera is parallel to the heading X of the drone, and the camera captures two images adjacent to the left and right; the frame reference areas of the two adjacent images are respectively the area A'B'K'F' and the area I'L'M'H', the long-side dimension of the frame reference area is S_long, and the short-side dimension is S_short.
  • a side overlap area I'B'K'H' is created between the area A'B'K'F' and the area I'L'M'H', and the lateral interval in this case is S_short × (1-Q%).
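  • A minimal sketch of these interval formulas is given below, assuming P% is the heading overlap rate and Q% the side overlap rate as reconstructed above; the footprint and overlap values in the example calls are illustrative.

```python
# Sketch of the interval formulas above. "normal" means the long-side
# normal is parallel to the heading (FIG. 8); "parallel" means the
# long-side extension is parallel to the heading (FIG. 9).

def shooting_intervals(s_long_m, s_short_m, heading_overlap, side_overlap,
                       orientation="normal"):
    """Return (heading interval m, side interval n) for one orientation."""
    if orientation == "normal":        # long-side normal parallel to heading
        m = s_short_m * (1.0 - heading_overlap)   # advance along the short side
        n = s_long_m * (1.0 - side_overlap)       # adjacent lines span the long side
    else:                              # long-side extension parallel to heading
        m = s_long_m * (1.0 - heading_overlap)
        n = s_short_m * (1.0 - side_overlap)
    return m, n

# 80% heading / 60% side overlap on a 102.9 m x 68.6 m footprint.
print(shooting_intervals(102.9, 68.6, 0.80, 0.60, "normal"))    # ~ (13.7, 41.2)
print(shooting_intervals(102.9, 68.6, 0.80, 0.60, "parallel"))  # ~ (20.6, 27.4)
```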
  • step 202 may specifically include:
  • Sub-step 2023 Obtain the minimum time interval between two adjacent exposures of the camera.
  • in another implementation, the minimum time interval between two adjacent exposures of the camera may be further determined, where the minimum time interval is the time interval at which the drone, moving along the course, captures two adjacent images; the minimum time interval can be set according to the actual needs of users, or according to the actual frame rate of the camera sensor (the maximum number of exposures per unit time).
  • the size of the minimum time interval affects the fineness of the picture in the final surveying and mapping result, and the user can set it according to the requirements of cost and accuracy.
  • the user can also set the shutter time for each exposure.
  • the minimum time interval between two adjacent exposures of the camera is also limited by the hardware performance of the camera.
  • the shutter time of the camera exposure will affect the perception of light by the camera sensor sensor. In order to ensure that each exposure, the sensor can perceive the expected amount of incoming light, and the user can adjust the shutter time.
  • Sub-step 2024 Determine the position interval as the product of the minimum time interval between two adjacent exposures of the camera and the maximum flight speed of the drone.
  • the product of the minimum time interval between the two adjacent exposures of the camera and the maximum flight speed of the UAV can be used as the position interval of the UAV when the UAV moves along the heading to capture two adjacent images. , so that the position interval is obtained by another implementation.
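  • A short sketch of this alternative computation, with illustrative values for the minimum exposure interval and the maximum flight speed:

```python
# Sketch of sub-steps 2023-2024: when the heading interval is driven by
# camera timing rather than overlap, it is taken as the minimum exposure
# interval times the maximum flight speed. Values are illustrative.

def min_heading_interval(min_exposure_interval_s: float,
                         max_flight_speed_m_s: float) -> float:
    return min_exposure_interval_s * max_flight_speed_m_s

print(min_heading_interval(2.0, 15.0))   # a 2 s interval at 15 m/s -> 30 m
```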
  • Step 203 Plan a route in the operation area according to the position interval.
  • in the case where the position interval includes the heading interval m and the lateral interval n, and the size of the operation area 30 is known, the length of a single path of the route 32 can be obtained according to the length of the operation area 30 and the heading interval m, and the number of single paths required for the route 32 can be obtained according to the width of the operation area 30 and the side spacing n.
  • after the length and number of the single paths of the route are known, the route can be obtained by connecting the multiple single paths end to end in sequence.
  • the route includes at least one single path; step 203 may specifically include:
  • Sub-step 2031 Determine the size of the circumscribed rectangle of the working area.
  • the shape of the planned work area is generally not a regular shape.
  • sub-step 2031 may specifically include:
  • Sub-step B1 According to the heading, establish a circumscribed rectangle of the work area, the extending direction of the long side of the circumscribed rectangle being parallel to the moving direction, or the direction parallel to the normal direction of the long side of the circumscribed rectangle being parallel to the moving direction.
  • Sub-step B2 Determine the size of the circumscribed rectangle.
  • FIG. 10 shows a schematic diagram of initial route planning provided by an embodiment of the present application, in which the operation area 30 is a hexagon.
  • a circumscribed rectangle 33 of the work area 30 is established, wherein the extending direction of the long sides of the circumscribed rectangle 33 is kept parallel to the moving direction X.
  • the direction parallel to the normal direction of the long side of the circumscribed rectangle may also be kept parallel to the moving direction X, which is not limited in the present application.
  • the size of the circumscribed rectangle 33 can be obtained according to the size of the work area 30 .
  • Sub-step 2032 Determine the length of a single path required by the route according to the size of the circumscribed rectangle and the heading interval.
  • the final planned route 32 may include multiple single paths connected end to end, and two adjacent single paths are connected by a turning route.
  • Sub-step 2033 Determine the number of single paths required for the flight route according to the size of the circumscribed rectangle and the sideways interval.
  • the number of single paths required for the route is N = [L_width / n], where [ ] is a round-up symbol and L_width is the width of the circumscribed rectangle. As shown in FIG. 10, the number of single paths required for the route is 5.
  • Sub-step 2034 According to the side interval, the heading interval, the length of the single path, and the number of the single paths, plan the route in the operation area, with adjacent single paths of the route spaced apart from each other by the side interval.
  • the initial route 34 can be obtained.
  • the total length of the initial route 34 = the number of single paths N × the length of a single path L_single + the width of the circumscribed rectangle L_width.
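  • The sizing arithmetic above can be sketched as follows; the circumscribed-rectangle dimensions and side interval in the example are illustrative, and the single-path length is assumed equal to the rectangle length for simplicity.

```python
# Sketch of the route sizing for a circumscribed rectangle of length
# L_length and width L_width: each single path is assumed to run the
# length of the rectangle, the number of paths is N = ceil(L_width / n),
# and the initial-route length adds the cross-track travel N * L_single + L_width.

import math

def initial_route_size(rect_length_m, rect_width_m, side_interval_m):
    single_path_len = rect_length_m                          # one pass along the rectangle
    num_paths = math.ceil(rect_width_m / side_interval_m)    # N = ceil(L_width / n)
    total_len = num_paths * single_path_len + rect_width_m   # N * L_single + L_width
    return single_path_len, num_paths, total_len

# A 900 m x 200 m circumscribed rectangle with a 41 m side interval.
print(initial_route_size(900.0, 200.0, 41.0))   # -> (900.0, 5, 4700.0)
```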
  • the route 34 obtained in this way is only the initial route, and part of it lies outside the operation area 30.
  • the specific adjustment process is as follows:
  • sub-step 2034 may specifically include:
  • Sub-step C1 According to the side interval, the heading interval, the length of the single path, and the number of the single paths, plan an initial route in the circumscribed rectangle, and determine the intersection points of the initial route and the boundary of the work area within the circumscribed rectangle.
  • the intersection points a of the initial route 34 and the boundary of the operation area 30 within the circumscribed rectangle 33 can be further determined.
  • Sub-step C2 Move the intersection point by a preset distance along a target direction, the target direction being a direction parallel to the path where the intersection point is located, and the target direction being a direction toward the interior of the work area or a direction away from the interior of the work area.
  • FIG. 11 shows a schematic diagram of a final route planning provided by an embodiment of the present application, wherein after moving the seven intersection points in FIG. 10 , a new intersection b as shown in FIG. 11 is obtained.
  • the intersection point a is moved along the target direction to obtain a new intersection point b, where the target direction may include: a direction parallel to the path where the intersection point a is located, the target direction being a direction toward the interior of the work area 30 or a direction away from the interior of the work area 30.
  • the target direction can also include any direction set by the user.
  • Sub-step C3 Connect the moved intersection points in series in sequence to obtain the route.
  • the final flight route 32 can be obtained by sequentially connecting the moved intersection points b in series. It can be seen that, compared with the initial route 34 in FIG. 10, the final route 32 in FIG. 12 is located entirely in the operation area 30, so as to meet, as far as possible, the requirement that the UAV operate within the operation area 30.
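  • A minimal geometric sketch of the intersection-moving step (sub-steps C1-C3) is shown below; it only shifts one point along the unit direction of its single path by a preset distance, and the coordinates, direction, and distance used in the example are illustrative.

```python
# Sketch of sub-step C2, assuming each intersection point is shifted a
# preset distance along the unit direction of the single path it lies on
# (toward the interior of the work area). Geometry only; no boundary
# test is performed here.

import math

def move_intersection(point, path_direction, distance_m, toward_interior=True):
    """Shift an intersection point along (or against) its path direction."""
    dx, dy = path_direction
    norm = math.hypot(dx, dy)
    ux, uy = dx / norm, dy / norm
    sign = 1.0 if toward_interior else -1.0
    return (point[0] + sign * distance_m * ux,
            point[1] + sign * distance_m * uy)

# Intersection a = (120.0, 35.0) on a path heading east, moved 5 m inward.
print(move_intersection((120.0, 35.0), (1.0, 0.0), 5.0))   # -> (125.0, 35.0)
```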
  • Step 204 Determine mission parameters when the UAV performs the photographing mission along the route.
  • for the details of step 204, reference may be made to the above-mentioned step 103, which will not be repeated here.
  • the task parameter includes any one of the total length of the route, the estimated operation time of the UAV to complete the route, and the estimated number of pictures taken by the camera when the route is completed.
  • These three parameters are important parameters that affect the cost and quality of the shooting task, so the cost-effectiveness of the route can be pre-judged based on the three parameters of the acquired route.
  • the mission parameters can also include other types of parameters, such as the power consumption of the UAV and the number of obstacles on the route.
  • the task parameters include the estimated operation time for the UAV to complete the route, and step 204 may specifically include:
  • Sub-step 2041 Determine the ratio of the total length of the route to the target speed as the estimated operation time.
  • when the operating speed of the UAV is less than or equal to the maximum moving speed of the UAV, the target speed is the operating speed; when the operating speed of the UAV is greater than the maximum moving speed of the UAV, the target speed is the maximum moving speed; the operating speed is the speed at which the drone moves while respecting the minimum time interval between two adjacent exposures of the camera.
  • in the field of surveying and mapping, the camera continuously captures images in the work area, and the camera needs to set the minimum time interval t required between two adjacent exposures, so as to ensure the continuity of the captured images.
  • when the drone moves according to the minimum time interval t between two adjacent exposures of the camera, it has an operating speed V1; according to the power of the drone, it also has a maximum flight speed V2.
  • these parameters satisfy: heading interval m = minimum time interval t × operating speed V1, so the operating speed V1 = heading interval m / minimum time interval t.
  • the operation speed V1 of the UAV and the maximum flight speed V2 of the UAV can be compared, and when the operation speed V1 of the UAV is less than or equal to the maximum moving speed V2 of the UAV, the operation speed V1 is determined as the target speed V, and the ratio of the total length of the route to the target speed V is taken as the estimated operation time for the UAV to complete the route.
  • otherwise, when the operation speed V1 is greater than the maximum moving speed V2, the maximum moving speed V2 is determined as the target speed V, and the ratio of the total length of the route to the target speed V is used as the estimated operation time for the UAV to complete the route;
  • that is, the estimated operation time for the UAV to complete the route is calculated within the rated speed range of the UAV flight.
  • the task parameter includes the estimated number of photos taken by the drone to complete the route
  • Step 204 may specifically include:
  • Sub-step 2042 Determine the expected number of photos by taking the ratio of the total length of the route to the position interval corresponding to the route.
  • the ratio of the total length of the route to the heading interval corresponding to the route can be used to determine the expected number of photos taken.
  • the smaller the expected number of photos, the better the efficiency of surveying and mapping.
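  • The estimated operation time and expected photo count can be sketched together as below, assuming the working speed relation V1 = m / t reconstructed above; the numeric inputs in the example are illustrative.

```python
# Sketch of sub-steps 2041-2042: the working speed V1 = m / t is capped
# by the maximum flight speed V2, the estimated operation time is the
# route length over that target speed, and the expected photo count is
# the route length over the heading interval. Values are illustrative.

def estimate_time_and_photos(total_length_m, heading_interval_m,
                             min_exposure_interval_s, max_flight_speed_m_s):
    v1 = heading_interval_m / min_exposure_interval_s   # working speed V1 = m / t
    v = min(v1, max_flight_speed_m_s)                   # target speed V
    return total_length_m / v, total_length_m / heading_interval_m

# 4.7 km route, 20 m heading interval, 2 s minimum exposure interval,
# 15 m/s maximum flight speed: V1 = 10 m/s <= V2, so V = 10 m/s.
print(estimate_time_and_photos(4700.0, 20.0, 2.0, 15.0))  # -> (470.0, 235.0)
```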
  • Step 205 In the case that the task parameters do not meet the preset task parameter conditions, adjust the relative directional relationship between the frame direction and the heading, and re-plan the route.
  • for the details of step 205, reference may be made to the above-mentioned step 104, which will not be repeated here.
  • step 205 may specifically include:
  • Sub-step 2051 In the case that the value of the task parameter is greater than or equal to the task parameter threshold corresponding to the task parameter, determine that the task parameter does not meet the preset task parameter condition.
  • when the mission parameters include any one of the total length of the route, the estimated operation time for the UAV to complete the route, and the estimated number of pictures taken by the camera when the route is completed,
  • in the process of comparing the value of a mission parameter with the task parameter threshold corresponding to that parameter, a single comparison can be performed. For example:
  • when the mission parameter is the estimated operation time, the task parameter threshold is a time value;
  • when the mission parameter is the estimated number of pictures, the mission parameter threshold is a quantity value;
  • when the mission parameter is the total length of the route, the mission parameter threshold is a distance value; the current route length is then compared with the mission parameter threshold, and if the current route length is greater than or equal to the mission parameter threshold, it is considered that the current route would take the UAV too long to complete and does not meet the preset mission
  • parameter conditions, so the relative direction relationship and the route need to be re-planned.
  • the three mission parameters of the total length of the route, the estimated operation time for the UAV to complete the route, and the estimated number of pictures taken by the camera when the route is completed have different degrees of importance; for example, the total length of the route, the estimated operation time, and the estimated number of photos are in decreasing order of importance. Therefore, it is also possible to set weight values for the three task parameters respectively, and add the products of each task parameter and its weight value to obtain the value of the task parameter.
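  • As a sketch of such a weighted combination (the weights, the threshold, and the idea of summing raw values with different units are purely illustrative; in practice the parameters would normally be normalised first):

```python
# Illustrative weighted combination of the three task parameters, with
# weights that decrease from route length to operation time to photo
# count; the combined score is then compared against a single threshold.

def weighted_score(total_length_m, time_s, num_photos,
                   weights=(0.5, 0.3, 0.2)):
    """Combine the three task parameters into one weighted value."""
    w_len, w_time, w_photos = weights
    return w_len * total_length_m + w_time * time_s + w_photos * num_photos

score = weighted_score(4700.0, 470.0, 235.0)
print(score, score < 2600.0)   # 2538.0 -> meets a hypothetical 2600 threshold
```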
  • Sub-step 2052 Control the rotation of the camera to obtain a new relative direction relationship between the frame direction and the heading.
  • the rotation operation may be to keep the camera facing the work area and rotate counterclockwise or clockwise.
  • sub-step 2052 may specifically include:
  • Sub-step D1 controlling the pan/tilt to drive the camera to rotate, to obtain a new relative directional relationship between the frame direction and the heading.
  • a rotation angle can be set, so that the gimbal can calculate the rotation amount according to the rotation angle, and the gimbal operates according to the rotation amount to drive the camera to rotate to obtain a new relative relationship between the frame direction and the heading. directional relationship.
  • the camera needs to be rotated 90 degrees counterclockwise or clockwise.
  • Sub-step 2053 In the case where the value of the task parameter of the new route, planned according to the new relative direction relationship, is less than the task parameter threshold corresponding to the task parameter, control the UAV to perform the shooting task according to the new relative directional relationship and the new route.
  • the drone can be controlled to perform the shooting task according to the new relative direction relationship and the new route.
  • the process of sub-step 2052 needs to be continued to re-determine a new relative direction relationship and a new route, until the value of the mission parameter of the new route is less than the mission parameter threshold corresponding to the mission parameter.
  • in summary, before the UAV performs the shooting task, the corresponding route is planned according to the relative directional relationship between the frame direction of the UAV's camera and the heading of the UAV, and the task parameters for the UAV to perform the shooting task along the route are calculated; in this way, the operation efficiency of performing the shooting task according to that relative direction relationship and route is given a reference measure, which can be used for further evaluation of whether the requirements are met.
  • if the requirements are not met, the relative directional relationship between the camera's frame direction and the UAV's heading can be changed by flexibly controlling the UAV, so that
  • the direction relationship finally meets the needs and the attitude of the camera relative to the heading is optimized, thereby improving the operational efficiency of the UAV.
  • FIG. 13 is a flowchart of another device control method provided by an embodiment of the present application, and the method may include:
  • Step 301 Obtain the operation area where the drone performs the shooting task.
  • for the details of step 301, reference may be made to the above-mentioned step 101, which will not be repeated here.
  • Step 302 In a plurality of different relative directional relationships between the frame direction of the drone when shooting the work area and the heading of the drone, for each of the relative directional relationships Plan a route in the operation area, and determine mission parameters when the UAV performs the shooting task along the planned route.
  • a plurality of relative directional relationships may be preset, a corresponding flight route may be planned for each relative directional relationship in advance, and task parameters corresponding to each flight route may be determined.
  • corresponding routes can be respectively planned for the relative direction relationship shown in FIG. 2 and the relative direction relationship shown in FIG. 3 , and the task parameters corresponding to each route can be determined.
  • step 302 may specifically include:
  • Sub-step 3021 For each of the relative directional relationships, determine the position interval of the UAV when the UAV moves along the heading under that relative directional relationship to capture two adjacent images.
  • sub-step 3021 may specifically include:
  • Sub-step E1 for each of the relative azimuth relationships, determine the shooting overlap ratio of the cameras respectively.
  • Sub-step E2 Determine the position interval corresponding to each of the relative azimuth relationships according to the shooting overlap rate, the ground resolution of the camera, and the flying height of the UAV.
  • the position interval includes: a heading interval and a side interval
  • the shooting overlap rate includes: a heading overlap rate and a side overlap rate
  • sub-step E2 may specifically include:
  • Sub-step E21 According to the ground resolution and the flying height, determine the lengths of the short side and the long side of the frame reference area of the camera, the frame reference area being rectangular in shape.
  • Sub-step E22 Determine the heading interval corresponding to each of the relative azimuth relationships according to the length of the short side of the frame reference area and the heading overlap ratio.
  • Sub-step E23 Determine the lateral interval corresponding to each of the relative orientation relationships according to the length of the long side of the frame reference area and the lateral overlap ratio.
  • sub-step 3021 reference may be made to the above-mentioned step 202, which will not be repeated here.
  • Sub-steps E1-E2 may refer to the above-mentioned sub-steps 2021-2022 for details, which will not be repeated here.
  • Sub-steps E21-E23 may refer to the above-mentioned sub-steps A1-A3 for details, which will not be repeated here.
  • Sub-step 3022 Plan a route in the operation area according to the position interval.
  • the route includes at least one single path; sub-step 3022 may specifically include:
  • Sub-step F1 Determine the size of the circumscribed rectangle of the working area.
  • sub-step F1 may specifically include:
  • Sub-step F11 According to the heading, establish a circumscribed rectangle of the work area, the extension direction of the long side of the circumscribed rectangle being parallel to the moving direction, or the direction parallel to the normal direction of the long side of the circumscribed rectangle being parallel to the moving direction.
  • Sub-step F12 Determine the size of the circumscribed rectangle.
  • Sub-step F2 Determine the length of a single path required by the route according to the size of the circumscribed rectangle and the heading interval.
  • Sub-step F3 Determine the number of single paths required for the route according to the size of the circumscribed rectangle and the side spacing.
  • Sub-step F4 According to the side interval, the heading interval, the length of the single path and the number of the single paths, plan the route in the operation area, with adjacent single paths of the route spaced apart from each other by the side interval.
  • sub-step F4 may specifically include:
  • Sub-step F41 According to the side interval, the heading interval, the length of the single path and the number of the single paths, plan an initial route in the circumscribed rectangle, and determine the intersection points of the initial route and the boundary of the work area within the circumscribed rectangle.
  • Sub-step F42 Move the intersection point by a preset distance along a target direction, the target direction being a direction parallel to the path where the intersection point is located, and the target direction being a direction toward the interior of the work area or a direction away from the interior of the work area.
  • Sub-step F43 Connect the moved intersection points in series in sequence to obtain the route.
  • the frame direction includes: the extension direction of the long side of the frame reference area, or the direction parallel to the normal direction of the long side, or the extension direction of the short side of the frame reference area, or the direction parallel to the normal direction of the short side.
  • sub-step 3022 reference may be made to the above-mentioned step 203, which will not be repeated here.
  • Sub-steps F1-F4 may refer to the above-mentioned sub-steps 2031-2034 for details, which will not be repeated here.
  • Sub-steps F11-F12 may refer to the above-mentioned sub-steps B1-B2 for details, which will not be repeated here.
  • Sub-steps F41-F43 may refer to the above-mentioned sub-steps C1-C3 for details, which will not be repeated here.
  • sub-step 3021 may specifically include:
  • Sub-step G1 Obtain the minimum time interval between two adjacent exposures of the camera.
  • Sub-step G2 Determine the position interval as the product of the minimum time interval between two adjacent exposures of the camera and the maximum flying speed of the drone.
  • sub-steps G1-G2 reference may be made to the above-mentioned sub-steps 2023-2024, which will not be repeated here.
  • the task parameter includes any one of the total length of the route, the estimated operation time of the UAV to complete the route, and the expected number of pictures taken by the camera when the route is completed.
  • the value of the target task parameter is the minimum value among the values of all task parameters.
  • the task parameter includes the estimated operation time for the UAV to complete the route
  • step 302 may specifically include:
  • Sub-step 3023 Determine the ratio of the total length of the route to the target speed as the estimated operation time.
  • when the operating speed of the UAV is less than or equal to the maximum moving speed of the UAV, the target speed is the operating speed; when the operating speed of the UAV is greater than the maximum moving speed of the UAV, the target speed is the maximum moving speed; the operating speed is the speed at which the drone moves while respecting the minimum time interval between two adjacent exposures of the camera.
  • sub-step 3023 reference may be made to the above-mentioned step 2041, which will not be repeated here.
  • the task parameters include the estimated number of photos taken by the drone to complete the route
  • step 302 may specifically include:
  • Sub-step 3024 Determine the expected number of photos by taking the ratio of the total length of the route to the position interval corresponding to the route.
  • step 3024 reference may be made to the above-mentioned step 2042, which will not be repeated here.
  • Step 303 Determine the target relative azimuth relationship and the corresponding target route corresponding to the target mission parameters that meet the preset mission parameter conditions, the target relative azimuth relationship and the target route being used to control the UAV to perform the shooting mission.
  • step 303 may specifically include:
  • Sub-step 3031 In the case that the current relative directional relationship between the frame direction of the camera and the heading of the UAV does not match the relative azimuth relationship corresponding to the target task parameters, control the gimbal to drive The camera rotates, adjusts the current relative direction relationship to the relative orientation relationship corresponding to the target task parameters, and controls the UAV to perform the shooting task according to the route corresponding to the target task parameters.
  • in this way, the relative directional relationships and flight routes with higher operational efficiency can be screened out, and the UAV can then be controlled to
  • perform the shooting task according to the relative direction relationship and flight route with higher operational efficiency, thereby improving operational efficiency.
  • for example, after routes are planned for the relative direction relationships of FIG. 2 and FIG. 3, the two groups of task parameters can be compared to determine which of them meets the preset task parameter
  • conditions, and the target relative azimuth relationship and the corresponding target route corresponding to those target task parameters are used to control the UAV to perform the shooting task. According to the route planning in FIG. 2 and FIG. 3, the route in FIG. 2 is better.
  • in practice, the UAV can be automatically controlled to perform the shooting task according to the target relative directional relationship and corresponding target route with the optimal task parameters, or, according to
  • the user's selection, the UAV is controlled to perform the shooting task according to the target relative direction relationship selected by the user and the corresponding target route.
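  • A sketch of this screening over candidate relative direction relationships is shown below; the two candidate entries and their task parameter values are illustrative stand-ins for the FIG. 2 / FIG. 3 comparison, and the selection criterion (shortest estimated operation time) is one possible choice.

```python
# Sketch of the screening in FIG. 13: plan a route for each candidate
# relative direction relationship, compute its task parameters, and keep
# the candidate with the best value (here: shortest estimated time).
# The per-candidate numbers are illustrative only.

candidates = {
    "long-side normal parallel to heading": {"length_m": 4700.0, "time_s": 470.0, "photos": 235},
    "long-side extension parallel to heading": {"length_m": 5600.0, "time_s": 560.0, "photos": 272},
}

best_name, best_params = min(candidates.items(), key=lambda kv: kv[1]["time_s"])
print(best_name, best_params)   # the shorter-time orientation is selected
```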
  • for the details of sub-step 3031, reference may be made to the above-mentioned step 2042, which will not be repeated here.
  • in summary, before the drone performs the shooting task, the corresponding route is planned according to the relative directional relationship between the frame direction of the camera of the drone and the heading of the drone, and
  • the task parameters for the UAV to perform the shooting task along the route are calculated; in this way, the operation efficiency of performing the shooting task according to the relative direction relationship and the route is given a reference measure, and it can be further evaluated whether
  • the operation efficiency corresponding to the relative direction relationship and the route meets the requirements; if the requirements are not met, the relative directional relationship between the camera's frame direction and the UAV's heading can be changed by flexibly controlling the drone so as to finally meet the demand, realizing the optimization of the attitude of the camera relative to the heading and thereby improving the operation efficiency of the UAV.
  • in addition, the relative directional relationships and routes with higher operational efficiency can be screened out, and the UAV can subsequently be controlled to execute the shooting task according to a relative direction relationship and route with higher operational efficiency, thereby improving operational efficiency.
  • FIG. 14 is a block diagram of a device control apparatus provided by an embodiment of the present application.
  • the device control apparatus 400 may include: an acquisition module 401 and a processing module 402;
  • the acquiring module 401 is used to perform: acquiring the operation area where the drone performs the shooting task;
  • the processing module 402 is used to execute:
  • the relative direction relationship between the frame direction and the heading is adjusted, and the route is re-planned.
  • processing module 402 is specifically used for:
  • a route is planned in the work area according to the position interval.
  • processing module 402 is specifically used for:
  • the position interval is determined according to the shooting overlap rate, the ground resolution of the camera and the flying height of the drone.
  • the position interval includes: a heading interval and a side interval
  • the shooting overlap rate includes: a heading overlap rate and a side overlap rate
  • the processing module 402 is specifically used for:
  • the frame reference area is in the shape of a rectangle
  • the lateral interval is determined according to the length of the long side of the frame reference area and the lateral overlap ratio.
  • the route includes at least one single path; the processing module 402 is specifically configured to:
  • the route is planned in the operation area, and the adjacent single paths of the route are separated from each other by the side interval.
  • processing module 402 is specifically used for:
  • according to the side interval, the heading interval, the length of the single path and the number of single paths, an initial route is planned in the circumscribed rectangle, and the intersections of the initial route with the boundary of the work area within the circumscribed rectangle are determined;
  • the intersection points are moved by a preset distance along a target direction, where the target direction is a direction parallel to the path on which the intersection point is located, and points toward the interior of the work area or away from the interior of the work area;
  • processing module 402 is specifically used for:
  • a circumscribed rectangle of the work area is established according to the heading, such that the long side of the circumscribed rectangle extends parallel to the moving direction, or the direction parallel to the normal of the long side of the circumscribed rectangle is parallel to the moving direction;
  • the size of the circumscribed rectangle is determined.
  • the frame direction includes: the extension direction of the long side of the frame reference area, or the direction parallel to the normal of the long side, or the extension direction of the short side of the frame reference area, or the direction parallel to the normal of the short side.
  • processing module 402 is specifically used for:
  • the position interval is determined as the product of the minimum time interval between the two adjacent exposures of the camera and the maximum flying speed of the drone.
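The product described above is simple enough to state as a one-line helper. The function name and the example numbers are assumptions for illustration only.

```python
# Hedged sketch of the bullet above: the along-track position interval can be
# taken as (minimum time between two exposures) x (maximum flight speed).
def position_interval(min_exposure_interval_s: float, max_speed_m_s: float) -> float:
    return min_exposure_interval_s * max_speed_m_s

# e.g. a camera that needs 2 s between shots on a UAV limited to 15 m/s
# gives a 30 m interval between adjacent image centres.
assert position_interval(2.0, 15.0) == 30.0
```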
  • the task parameter includes any one of the total length of the route, the estimated operation time of the UAV to complete the route, and the expected number of pictures taken by the camera when the route is completed.
  • the processing module 402 is specifically configured to:
  • in the case where the value of the task parameter of the new route planned according to the new relative direction relationship is smaller than the task parameter threshold corresponding to the task parameter, the UAV is controlled to perform the shooting task according to the new relative direction relationship and the new route.
  • the drone is equipped with a gimbal, and the gimbal is equipped with the camera; the processing module 402 is specifically used for:
  • the gimbal is controlled to drive the camera to rotate, so as to obtain a new relative directional relationship between the frame direction and the heading.
  • the task parameter includes the estimated operation time of the UAV to complete the route
  • the processing module 402 is specifically used for:
  • in the case where the operating speed of the UAV is less than or equal to the maximum moving speed of the UAV, the target speed is the operating speed; in the case where the operating speed of the UAV is greater than the maximum moving speed of the UAV, the target speed is the maximum moving speed, where the operating speed is the speed at which the drone moves according to the minimum time interval between two adjacent exposures of the camera.
  • the task parameter includes the estimated number of photos taken by the drone to complete the route
  • the processing module 402 is specifically configured to:
  • the estimated number of photos is determined by the ratio of the total length of the route to the position interval corresponding to the route.
  • before the drone performs the shooting task, the device control apparatus plans a corresponding route according to the relative directional relationship between the frame direction of the camera of the drone and the heading of the drone, and calculates the task parameters for the UAV performing the shooting task along the route, so that when the UAV subsequently performs the shooting task according to the relative direction relationship and the route, the resulting operation efficiency has a reference measure. By further evaluating the task parameters, it can be determined whether the operation efficiency corresponding to the relative direction relationship and the route meets the requirements; if not, the relative directional relationship between the camera's frame direction and the UAV's heading can be changed by flexibly controlling the drone until the requirements are met, optimizing the attitude of the camera relative to the heading and thereby improving the operational efficiency of the UAV.
  • FIG. 15 is a block diagram of another device control apparatus provided by an embodiment of the present application.
  • the device control apparatus 500 may include: an acquisition module 501 and a processing module 502;
  • the acquiring module 501 is used to perform: acquiring the operation area where the drone performs the shooting task;
  • the processing module 502 is configured to perform: among a plurality of different relative directional relationships between the frame direction of the drone's camera when photographing the work area and the heading of the drone, for each relative direction relationship, planning a route in the operation area, and determining the task parameters when the UAV performs the shooting task along the planned route;
  • processing module 502 is specifically used for:
  • a route is planned in the work area according to the position interval.
  • processing module 502 is specifically used for:
  • the position interval corresponding to each of the relative azimuth relationships is determined according to the shooting overlap ratio, the ground resolution of the camera, and the flying height of the UAV.
  • the position interval includes: a heading interval and a side interval
  • the shooting overlap rate includes: a heading overlap rate and a side overlap rate
  • the processing module 502 is specifically used for:
  • the frame reference area is in the shape of a rectangle
  • the lateral interval corresponding to each of the relative orientation relationships is determined.
  • the route includes at least one single path; the processing module 502 is specifically configured to:
  • the route is planned in the operation area, and adjacent single paths of the route are separated by the side interval.
  • processing module 502 is specifically used for:
  • according to the side interval, the heading interval, the length of the single path and the number of single paths, an initial route is planned in the circumscribed rectangle, and the intersections of the initial route with the boundary of the work area within the circumscribed rectangle are determined;
  • the intersection points are moved by a preset distance along a target direction, where the target direction is a direction parallel to the path on which the intersection point is located, and points toward the interior of the work area or away from the interior of the work area;
  • processing module 502 is specifically used for:
  • a circumscribed rectangle of the work area is established according to the heading, such that the long side of the circumscribed rectangle extends parallel to the moving direction, or the direction parallel to the normal of the long side of the circumscribed rectangle is parallel to the moving direction;
  • the size of the circumscribed rectangle is determined.
  • the frame direction includes: the extension direction of the long side of the frame reference area, or the direction parallel to the normal of the long side, or the extension direction of the short side of the frame reference area, or the direction parallel to the normal of the short side.
  • processing module 502 is specifically used for:
  • the position interval is determined as the product of the minimum time interval between the two adjacent exposures of the camera and the maximum flying speed of the drone.
  • the processing module is specifically configured such that the task parameters include any one of the total length of the route, the estimated operation time for the drone to complete the route, and the estimated number of photos taken by the camera when the route is completed.
  • the value of the target task parameter is the minimum value among the values of all task parameters.
  • the task parameter includes the estimated operation time of the UAV to complete the route
  • the processing module 502 is specifically used for:
  • in the case where the operating speed of the UAV is less than or equal to the maximum moving speed of the UAV, the target speed is the operating speed; in the case where the operating speed of the UAV is greater than the maximum moving speed of the UAV, the target speed is the maximum moving speed, where the operating speed is the speed at which the drone moves according to the minimum time interval between two adjacent exposures of the camera.
  • the task parameter includes the estimated number of photos taken by the UAV to complete the route
  • the processing module 502 is specifically configured to:
  • the estimated number of photos is determined by the ratio of the total length of the route to the position interval corresponding to the route.
  • the drone is equipped with a gimbal, and the gimbal is equipped with the camera; the processing module 502 is specifically used for:
  • the gimbal is controlled to drive the camera to rotate, the current relative direction relationship is adjusted to the relative orientation relationship corresponding to the target task parameter, and the UAV is controlled to perform the shooting task according to the route corresponding to the target task parameter.
  • before the drone performs the shooting task, the device control apparatus plans a corresponding route according to the relative directional relationship between the frame direction of the camera of the drone and the heading of the drone, and calculates the task parameters for the UAV performing the shooting task along the route, so that when the UAV subsequently performs the shooting task according to the relative direction relationship and the route, the resulting operation efficiency has a reference measure. By further evaluating the task parameters, it can be determined whether the operation efficiency corresponding to the relative direction relationship and the route meets the requirements; if not, the relative directional relationship between the camera's frame direction and the UAV's heading can be changed by flexibly controlling the drone until the requirements are met, optimizing the attitude of the camera relative to the heading and thereby improving the operational efficiency of the UAV.
  • in addition, the relative directional relationships and routes with higher operational efficiency can be screened out, and the UAV can subsequently be controlled to perform the shooting task according to the more efficient relative direction relationship and route, thereby improving operational efficiency.
  • Embodiments of the present application further provide a computer-readable storage medium on which a computer program is stored.
  • When the computer program is executed by a processor, each process of the foregoing device control method embodiments can be implemented and the same technical effect achieved; to avoid repetition, details are not repeated here.
  • the computer-readable storage medium such as read-only memory (Read-Only Memory, referred to as ROM), random access memory (Random Access Memory, referred to as RAM), magnetic disk or optical disk and so on.
  • the acquiring module may be an interface connecting the external control terminal and the device control device.
  • the external control terminal may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a control terminal with an identification module, audio input/output (I/O) ports, video I/O ports, headphone ports, and more.
  • the acquisition module may be used to receive input (e.g., data information, power, etc.) from an external control terminal and transmit the received input to one or more elements within the device control device, or may be used to transfer data between the device control device and the external control terminal.
  • For example, at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
  • the processor is the control center of the control terminal. It uses various interfaces and lines to connect the various parts of the entire control terminal, and, by running or executing the software programs and/or modules stored in the memory and calling the data stored in the memory, performs the various functions of the control terminal and processes data, so as to monitor the control terminal as a whole.
  • the processor may include one or more processing units; preferably, the processor may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor.
  • the embodiments of the present application may be provided as a method, a control terminal, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
  • These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing terminal device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture comprising an instruction control terminal, the instruction control terminal implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A device control method, apparatus (400, 500) and computer-readable storage medium. The method includes: acquiring the operation area (30, 33) in which a UAV (10) performs a shooting task (101); planning a route (32, 34) in the operation area (30, 33) according to the relative direction relationship between the frame direction of the camera (11) of the UAV (10) and the heading of the UAV (10) (102); determining the task parameters for the UAV (10) performing the shooting task along the route (32, 34) (103); and, if the task parameters do not meet a preset task parameter condition, adjusting the relative direction relationship between the frame direction and the heading and re-planning the route (32, 34) (104). Before the UAV (10) performs the shooting task according to the relative direction relationship and the route (32, 34), the estimated operation efficiency is thus given a reference measure; by evaluating the task parameters, when the efficiency does not meet the requirements, the relative direction relationship can be changed by flexibly controlling the UAV (10) so that the requirements are finally met, optimizing the attitude of the camera (11) relative to the heading and thereby improving the operation efficiency of the UAV (10).

Description

设备控制方法、装置及计算机可读存储介质 技术领域
本申请涉及无人机控制技术领域,特别是涉及一种设备控制方法、装置及计算机可读存储介质。
背景技术
无人机被广泛应用于测绘领域,以通过无人机的相机对作业区域的拍摄,实现对作业区域的测绘。
无人机执行拍摄任务时,无人机的航线规划以及任务效率至关重要,相关技术中,相机对作业区域进行拍摄时,无人机和相机可以通过预设定好的固定工作参数进行航线规划,以供后续进行拍摄,若要对拍摄任务的任务效率进行提升,则可以对无人机及其相机的性能进行提升,如,增加无人机的工作功率,以提升无人机的飞行速度;提高相机的拍摄精度,以满足对测绘结果的需求。
但是,目前方案中,单纯增加无人机及其相机的性能,会导致测绘成本大幅上升,而在无人机及其相机固定的情况下,限制了对无人机作业效率的提升,使得对拍摄任务的效率优化工作难以进行。
发明内容
本申请提供一种设备控制方法、装置及计算机可读存储介质,可以解决现有技术中单纯增加无人机及其相机的性能以实现任务效率的优化,会导致测绘成本大幅上升的问题。
第一方面,本申请实施例提供了一种设备控制方法,包括:
获取无人机执行拍摄任务的作业区域;
根据所述无人机的相机的画幅方向和所述无人机的航向之间的相对方向关系,在所述作业区域规划航线;
确定所述无人机沿所述航线执行所述拍摄任务时的任务参数;
在所述任务参数不满足预设任务参数条件的情况下,调整所述画幅方向与所述航向之间的相对方向关系,并重新规划航线。
第二方面,本申请实施例提供了一种设备控制方法,包括:
获取无人机执行拍摄任务的作业区域;
在所述无人机的相机在拍摄所述作业区域时的画幅方向和所述无人机的航向之间的多个不同的相对方向关系中,针对每一种所述相对方向关系在所述作业区域规划航线,并确定所述无人机沿规划的所述航线执行所述拍摄任务时的任务参数;
确定符合预设任务参数条件的目标任务参数对应的目标相对方位关系和对应的目标航线,所述目标相对方位关系和所述目标航线用于控制所述无人机执行所述拍摄任务。
第三方面,本申请实施例提供了一种设备控制装置,包括:获取模块和处理器;
所述获取模块用于,获取无人机执行拍摄任务的作业区域;
所述处理模块用于,根据所述无人机的相机的画幅方向和所述无人机的航向之间的相对方向关系,在所述作业区域规划航线;
确定所述无人机沿所述航线执行所述拍摄任务时的任务参数;
在所述任务参数不满足预设任务参数条件的情况下,调整所述画幅方向与所述航向之间的相对方向关系,并重新规划航线。
第四方面,本申请实施例提供了一种设备控制装置,包括:获取模块和处理器;
所述获取模块用于,获取无人机执行拍摄任务的作业区域;
所述处理模块用于,在所述无人机的相机在拍摄所述作业区域时的画幅方向和所述无人机的航向之间的多个不同的相对方向关系中,针对每一种所述相对方向关系在所述作业区域规划航线,并确定所述无人机沿规划的所述航线执行所述拍摄任务时的任务参数;
确定符合预设任务参数条件的目标任务参数对应的目标相对方位关系和对应的目标航线,所述目标相对方位关系和所述目标航线用于控制所述无人机执行所述拍摄任务。
第五方面,本申请提供一种计算机可读存储介质,所述计算机可读存储介质包括指令,当其在计算机上运行时,使得计算机执行上述方面所述的方 法。
第六方面,本申请提供一种计算机程序产品,所述计算机程序产品包括指令,当其在计算机上运行时,使得计算机执行上述方面所述的方法。
在本申请实施例中,本申请通过在无人机执行拍摄任务之前,通过根据无人机的相机的画幅方向和无人机的航向之间的相对方向关系,规划对应的航线,以及计算无人机沿航线执行拍摄任务时的任务参数;使得无人机后续在按照相对方向关系和航线执行拍摄任务时,对应产生的作业效率被赋予了可参考的度量,进一步对任务参数进行判断,可以确定相对方向关系和航线对应的作业效率是否满足需求,并在不满足需求的情况下,通过灵活控制无人机改变相机的画幅方向和无人机的航向之间的相对方向关系,来最终满足需求,实现对相机相对于航向的姿态的优化,从而提高无人机的作业效率。
附图说明
图1是本申请实施例提供的一种设备控制方法对应的系统架构图;
图2是本申请实施例提供的一种设备控制方法的场景图;
图3是本申请实施例提供的另一种设备控制方法的场景图;
图4是本申请实施例提供的一种设备控制方法的流程图;
图5是本申请实施例提供的一种航线示意图;
图6是本申请实施例提供的一种设备控制方法的具体流程图;
图7是本申请实施例提供的一种相机的成像示意图;
图8是本申请实施例提供的一种相机拍摄的相邻两张图像之间的方位关系图;
图9是本申请实施例提供的另一种相机拍摄的相邻两张图像之间的方位关系图;
图10是本申请实施例提供的另一种航线示意图;
图11是本申请实施例提供的另一种航线示意图;
图12是本申请实施例提供的另一种航线示意图;
图13是本申请实施例提供的另一种设备控制方法的流程图;
图14是本申请实施例提供的一种设备控制装置的框图;
图15是本申请实施例提供的另一种设备控制装置的框图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。
在本申请实施例中,参照图1,其示出了本申请实施例提供的一种设备控制方法对应的系统架构图,包括:无人机10、控制设备20,无人机10可以包括相机11。设备控制20与无人机10有线或无线连接,设备控制20可以获取数据,如,作业参数、控制指令等,并通过对数据的处理,控制无人机10以及相机11运行。需要说明的是,设备控制20可以集成设置在无人机10上,也可以独立于无人机10单独设置,本申请实施例对此不作限定。
其中,参照图2,其示出了本申请实施例提供的一种设备控制方法的场景图,相机11作为无人机10的负载,用于面向作业区域30进行拍摄任务。
具体的,在相机11的地面分辨率(GSD,Ground Sampling Distance)固定的情况下,相机11拍摄的单张图像具有一个对应地面的矩形覆盖区域31,相机11的拍摄姿态影响该矩形覆盖区域31的方位,且相机11的拍摄姿态可以用画幅方向与无人机10的航向之间的相对方向关系来表示,画幅方向可以是指,相机拍摄的单张图像的长边延伸方向,或者与所述长边的法线方向平行的方向。也可以指,相机拍摄的单张图像的短边延伸方向,或者与所述短边的法线方向平行的方向。
此外,相机拍摄单张图像对应的矩形区域,与拍摄得到的影像实际覆盖的场景区域之间形成映射关系。上述画幅方向也可以指相机拍摄的单张图像的矩形覆盖区域31的长边延伸方向,或者与所述长边的法线方向平行的方向。也可以指,相机拍摄的单张图像的矩形覆盖区域31的短边延伸方向,或者与所述短边的法线方向平行的方向。
另外,根据实际需求,画幅方向也可以包括单张图像中或矩形覆盖区域31中的某一参考线的方向,例如,矩形对角线的延伸方向,或者是与长边呈预设夹角的其他方向。
在本申请实施例中,以画幅方向为矩形覆盖区域31的长边延伸方向,或者与所述长边的法线方向平行的方向进行说明。其中,图2示出了相机11 的拍摄姿态为:使与矩形覆盖区域31的长边的法线方向平行的方向和无人机10的航向X平行。图3示出了相机11的拍摄姿态为:使矩形覆盖区域31的长边延伸方向与无人机10的航向X平行。
在图2中可以看出,若无人机10的相机11保持当前姿态,且无人机10沿航向X从作业区域30的一个短边飞至另一短边,则矩形覆盖区域31所经过的区域几乎可以覆盖整个作业区域30,整个拍摄任务中,无人机10的航线32经过的航程较短、耗时较少。
在图3中可以看出,若无人机10的相机11保持当前姿态,且无人机10沿航向X从作业区域30的一个短边飞至另一短边,则矩形覆盖区域31所经过的区域仅能覆盖作业区域30的一侧,而另一侧还未完成测绘,若要对另一侧完成测绘,无人机10在抵达作业区域30的另一短边时,规划的航线32还需继续向另一侧所在区域绕行。整个拍摄任务中,无人机10的航线32经过的航程较长、耗时较长。
由此可见,图2中无人机的执行效率明显大于图3中无人机的执行效率,因此,无人机的相机的画幅方向和无人机的航向之间的相对方向关系不同,会导致无人机的航线规划,以及按照所规划的航线完成拍摄任务的任务参数(航程、拍摄图像的数量、耗时等)不同,从而影响到无人机执行拍摄任务的效率。
在本申请实施例中,一种实现方式中,可以预设一个任务参数条件,并设定无人机执行拍摄任务时的任务参数需满足该任务参数条件,则无人机在执行拍摄任务之前,控制设备可以基于无人机的相机的画幅方向和无人机的航向之间的相对方向关系,在作业区域规划航线,并确定无人机沿航线执行拍摄任务时的任务参数,若任务参数满足任务参数条件,则控制设备进一步控制无人机按照上述相对方向关系和航线执行拍摄任务;若任务参数不满足任务参数条件,则控制设备控制无人机调整画幅方向与航向之间的相对方向关系,并重新规划航线,直至任务参数满足任务参数条件后,按照新的相对方向关系和航线控制无人机执行拍摄任务。
在另一种实现方式中,可以预设任务参数条件,并设定无人机执行拍摄任务时的任务参数需满足该任务参数条件,无人机在执行拍摄任务之前,控制设备可以基于无人机的相机的画幅方向和无人机的航向之间的多个不同的相对方向关系,在作业区域分别规划对应的航线,并确定无人机沿每个航线执行拍摄任务时的任务参数,之后,控制设备在所有相对方向关系中确定了符合任务参数条件的一个或多个目标相对方向关系和目标航线后,可以自 动按照任务参数最优的目标相对方向关系和对应的目标航线,控制无人机执行拍摄任务,或根据用户的选择,按照用户选择的目标相对方向关系和对应的目标航线控制无人机执行拍摄任务。
因此,在本申请实施例中,在无人机执行拍摄任务之前,通过根据无人机的相机的画幅方向和无人机的航向之间的相对方向关系,规划对应的航线,以及计算无人机沿航线执行拍摄任务时的任务参数;可以使得后续无人机在按照相对方向关系和航线执行拍摄任务时,对应产生的作业效率具有了可参考的度量,进一步对任务参数进行判断,可以确定相对方向关系和航线对应的作业效率是否满足需求,并在不满足需求的情况下,通过灵活控制无人机改变相机的画幅方向和无人机的航向之间的相对方向关系,来最终满足需求,实现对相机相对于航向的姿态的优化,从而提高无人机的作业效率。
另外,对每组相对方向关系以及航线对应的任务参数进行比较,可以筛选出作业效率较高的相对方向关系以及航线,后续则可以控制无人机按照作业效率较高的相对方向关系以及航线执行拍摄任务,从而提高作业效率。
图4是本申请实施例提供的一种设备控制方法的流程图,如图4所示,该方法可以包括:
步骤101、获取无人机执行拍摄任务的作业区域。
在实际应用中,无人机执行拍摄任务的作业区域通常是已知的,无人机的控制设备可以接收该作业区域的坐标并进行存储。根据实际需求,该作业区域的轮廓可以为规则形状或不规则形状,本申请实施例对此不作限定。
步骤102、根据所述无人机的相机的画幅方向和所述无人机的航向之间的相对方向关系,在所述作业区域规划航线。
具体的,根据相对方向关系在作业区域规划航线,具体可以通过根据作业区域的尺寸,以及相机沿航向移动以拍摄相邻两张图像时无人机的位置间隔实现航线的规划。
参照图5,其示出了本申请实施例提供的一种航线示意图,由于在实际航拍测绘领域中,作业区域30的面积一般较大,则无人机需要在作业区域30中进行多次折返绕行操作,使拍摄画面覆盖整个作业区域,这就使得针对无人机规划的航线32通常包括多条单条路径,图5中的航线32具有3条单条路径。
位置间隔可以包括航向间隔和旁向间隔,其中,无人机在航拍测绘过程中,为保证测绘结果中画面的连续性,相机在拍摄图像时,要求相邻两张图 像之间沿航向对所摄地面有一定的重叠,称之为航向重叠,则相邻两张图像在航向上的间隔距离称为航向间隔;另外,无人机在航拍测绘过程中,要求相机在航线的两相邻单条路径上分别所拍摄的图像之间也需要有一定的影像重叠,这种重叠称之为旁向重叠,则航线的两相邻单条路径上的图像之间的间隔距离称为旁向间隔。航向间隔和旁向间隔可以根据航向重叠率和旁向重叠率计算得出,航向重叠率、旁向重叠率以及作业区域的尺寸为已知参数,在确定了拍摄任务时即可得到。
进一步的,在确定了相机的画幅方向和无人机的航向之间的相对方向关系后,可以确定作业区域的外接矩形,并根据外接矩形的长度和航向间隔,确定无人机在作业区域中沿航向移动时所需的单条路径的长度;之后再根据外接矩形的宽度和旁向间隔,确定无人机在作业区域中沿航向移动时所需的单条路径的数量;将多条单条路径依次首尾相连后,得到初始航线;最后再根据作业区域在外接矩形中的轮廓,对初始航线进行微调,使其全部位于作业区域中,得到针对作业区域所规划的航线。
步骤103、确定所述无人机沿所述航线执行所述拍摄任务时的任务参数。
在本申请实施例中,任务参数用于度量无人机按照当前的相对方向关系以及航线执行拍摄任务时的效率,如,任务参数可以包括:航线总长度、完成航线所需时间、完成航线相机所拍摄图像的数量等。
此外,任务参数还可以包括,离地飞行高度,地面分辨率,相机内参等参数。由于这几种任务参数一般在飞行时是预定义好的,可以根据实际情况变更这几种任务参数,也可以固定这几种任务参数配置,如调节航线、完成时间、图像张数等参数。
其中,在航线规划完成后,即可得到航线总长度;根据无人机执行拍摄任务时的移动速度和航线总长度,可以得到完成航线所需时间;根据无人机执行拍摄任务时的航向间距和航线总长度,可以得到完成航线相机所拍摄图像的数量。
步骤104、在所述任务参数不满足预设任务参数条件的情况下,调整所述画幅方向与所述航向之间的相对方向关系,并重新规划航线。
在任务参数包括航线总长度、完成航线所需时间、完成航线相机所拍摄图像的数量的情况下,针对拍摄任务时的效率的需求,要求航线总长度尽可能短、完成航线所需时间尽可能短、完成航线相机所拍摄图像的数量尽可能少。
因此,可以针对具体的任务参数以及实际需求,设定任务参数条件,并 在根据当前相对方向关系计算得到的任务参数满足任务参数条件的情况下,控制无人机按照当前相对方向关系和对应航线执行拍摄任务;在根据当前相对方向关系计算得到的任务参数不满足任务参数条件的情况下,则通过灵活控制无人机改变相机的画幅方向和无人机的航向之间的相对方向关系,来得到新的航线和任务参数,直至新的航线的任务参数满足任务参数条件后,控制无人机按照新的相对方向关系和新的航线执行拍摄任务。
具体的,控制无人机改变相机的画幅方向和无人机的航向之间的相对方向关系,可以通过控制无人机对相机进行旋转实现,如,在相机安装在无人机的云台上时,可以控制云台带动相机旋转,从而改变相机的画幅方向和无人机的航向之间的相对方向关系。
例如,可以根据实际需求,将任务参数条件设定为:执行拍摄任务所花费的时间不能超过1小时、执行拍摄任务所拍摄的图像不能超过1万张。则根据当前的相对方向关系规划的路线求得对应的任务参数后,可以依据上述任务参数条件,判断任务参数是否满足任务参数条件。
综上,本申请实施例提供的一种设备控制方法,通过在无人机执行拍摄任务之前,通过根据无人机的相机的画幅方向和无人机的航向之间的相对方向关系,规划对应的航线,以及计算无人机沿航线执行拍摄任务时的任务参数;使得后续无人机在按照相对方向关系和航线执行拍摄任务时,对应产生的作业效率被赋予了可参考的度量,进一步对任务参数进行判断,可以确定相对方向关系和航线对应的作业效率是否满足需求,并在不满足需求的情况下,通过灵活控制无人机改变相机的画幅方向和无人机的航向之间的相对方向关系,来最终满足需求,实现对相机相对于航向的姿态的优化,从而提高无人机的作业效率。
图6是本申请实施例提供的一种设备控制方法的具体流程图,该方法可以包括:
步骤201、获取无人机执行拍摄任务的作业区域。
具体的,步骤201具体可以参照上述步骤101,此处不再赘述。
步骤202、根据所述相对方向关系,确定所述无人机沿所述航向移动以拍摄相邻两张图像时所述无人机的位置间隔。
在本申请实施例中,为了保证无人机测绘结果中画面的前后、左右连续性,需要相机在拍摄图像时,要求前后相邻、左右相邻的两张图像之间具有 一定的重叠区域,具体的,则需要确定无人机沿航向移动以拍摄相邻两张图像时无人机的位置间隔,并根据该位置间隔实现航线的规划。该位置间隔包括:在航线的一个单条路径上移动以拍摄相邻两张图像时无人机的位置间隔,以及在航线的两个相邻单条路径上的图像之间的间隔距离。
可选的,在一种实现方式中,步骤202具体可以包括:
子步骤2021、根据所述相对方向关系,确定所述相机的拍摄重叠率。
在本申请实施例中,当拍摄任务、作业区域和相对方向关系确定后,可以根据相对方向关系,进一步设定无人机的相机的拍摄重叠率,拍摄重叠率用于限定相机在拍摄图像时,前后相邻、左右相邻的两张图像之间重叠区域的面积占比。如,航空测绘中,前后相邻的两张图像之间拍摄重叠率一般为60%,即重叠区域的长度与图像的长度的比值为60%。
子步骤2022、根据所述拍摄重叠率、所述相机的地面分辨率和所述无人机的飞行高度,确定所述位置间隔。
具体的,根据相机的地面分辨率和无人机的飞行高度,可以确定出相机拍摄的一张图像对应地面的矩形覆盖区域的尺寸,在确定了矩形覆盖区域的尺寸后,即可进一步根据拍摄重叠率,得到无人机沿航向移动以拍摄相邻两张图像时无人机的位置间隔。
可选的,位置间隔包括:航向间隔和旁向间隔,所述拍摄重叠率包括:航向重叠率和旁向重叠率,子步骤2022具体可以包括:
子步骤A1、根据所述地面分辨率和所述飞行高度,确定所述相机的画幅参考区域的短边和长边的长度,所述画幅参考区域的形状为矩形。
具体的,参照图7,其示出了本申请实施例提供的一种相机的成像示意图,其中,相机的图像传感器的总尺寸为s,单个像元尺寸为d,在对应的地面分辨率为D,飞行高度为H的情况下,相机的画幅参考区域(即相机拍摄的一张图像对应地面的矩形覆盖区域)的尺寸为S,对应的相机焦距为f。
根据图7中示出的相似三角形关系,上述参数之间具有的关系为:
s/S=f/H=d/D;
则在相机的图像传感器的总尺寸s、单个像元尺寸d、地面分辨率D、飞行高度H、相机焦距f已知的情况下,则可以求出画幅参考区域的尺寸S。
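The similar-triangle relation above (s/S = f/H = d/D) can be sketched in a few lines. The helper names and the example sensor figures below are assumptions for illustration, not values from the application.

```python
# Sketch of the relation s/S = f/H = d/D from the passage above
# (s: sensor size, S: ground footprint, f: focal length, H: flight height,
#  d: pixel pitch, D: ground sampling distance). All lengths in metres.

def ground_footprint(sensor_size_m: float, focal_length_m: float, height_m: float) -> float:
    """S = s * H / f : footprint of one image side on the ground."""
    return sensor_size_m * height_m / focal_length_m

def ground_resolution(pixel_size_m: float, focal_length_m: float, height_m: float) -> float:
    """D = d * H / f : ground sampling distance per pixel."""
    return pixel_size_m * height_m / focal_length_m

# Example (illustrative numbers only): 24 mm x 36 mm sensor, 35 mm lens, 100 m height.
S_long = ground_footprint(0.036, 0.035, 100.0)   # ~102.9 m along the long side
S_short = ground_footprint(0.024, 0.035, 100.0)  # ~68.6 m along the short side
```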
子步骤A2、根据所述画幅参考区域的短边的长度以及所述航向重叠率,确定所述航向间隔。
具体的,位置间隔包括:航向间隔和旁向间隔,拍摄重叠率包括:航向重叠率和旁向重叠率,航向间隔与航向重叠率反映了相机在拍摄图像时前后相邻的两张图像之间重叠特性,旁向间隔与旁向重叠率反映了相机在拍摄图像时左右相邻的两张图像之间重叠特性。
可选的,画幅方向包括:所述画幅参考区域的长边延伸方向,或与所述长边的法线方向平行的方向,或所述画幅参考区域的短边延伸方向,或与所述短边的法线方向平行的方向。
在本申请实施例中,假设无人机的相机的画幅方向为画幅参考区域的短边方向和长边方向,常用两种相对方向关系包括无人机的画幅参考区域的长边延伸方向和无人机的航向平行,以及与长边的法线方向平行的方向和无人机的航向平行,先通过这两种相对方向关系分别进行航向间隔的求解:
参照图8,其示出了本申请实施例提供的一种相机拍摄的相邻两张图像之间的方位关系图,其中,与相机的画幅参考区域的长边的法线方向平行的方向和无人机的航向X平行,相机拍摄了前后相邻的两张图像;前后相邻的两张图像的画幅参考区域分别为区域ABKF和区域EJCD,画幅参考区域的长边尺寸为S_长,短边尺寸为S_短。区域ABKF和区域EJCD之间产生航向重叠区域EJKF。
在设定了航向重叠率为P%的情况下,航向间距AE=S_短×(1-P%)。
参照图9,其示出了本申请实施例提供的另一种相机拍摄的相邻两张图像之间的方位关系图,其中,相机的画幅参考区域的长边延伸方向和无人机的航向X平行,相机拍摄了前后相邻的两张图像;前后相邻的两张图像的画幅参考区域分别为区域A’B’K’F’和区域E’J’C’D’,画幅参考区域的长边尺寸为S_长,短边尺寸为S_短。区域A’B’K’F’和区域E’J’C’D’之间产生航向重叠区域E’J’K’F’。
在设定了航向重叠率为P%的情况下,航向间距HK=S_长×(1-P%)。
子步骤A3、根据所述画幅参考区域的长边的长度以及所述旁向重叠率,确定所述旁向间隔。
在本申请实施例中,假设无人机的相机的画幅方向为画幅参考区域的长边延伸方向和与长边的法线方向平行的方向,常用两种相对方向关系包括无人机的画幅参考区域的长边延伸方向和无人机的航向平行,以及与长边的法线方向平行的方向和无人机的航向平行,先通过这两种相对方向关系分别进行旁向间隔的求解:
参照图8,其中,与相机的画幅参考区域的长边的法线方向平行的方向和无人机的航向X平行,相机拍摄了左右相邻的两张图像;左右相邻的两张图像的画幅参考区域分别为区域ABKF和区域ILMH,画幅参考区域的长边尺寸为S_长,短边尺寸为S_短。区域ABKF和区域ILMH之间产生旁向重叠区域IBKH。
在设定了旁向重叠率为Q%的情况下,旁向间距KM=S_长×(1-Q%)。
参照图9,其中,相机的画幅参考区域的长边延伸方向和无人机的航向X平行,相机拍摄了左右相邻的两张图像;左右相邻的两张图像的画幅参考区域分别为区域A’B’K’F’和区域I’L’M’H’,画幅参考区域长边尺寸为S_长,短边尺寸为S_短。区域A’B’K’F’和区域I’L’M’H’之间产生旁向重叠区域I’B’K’H’。
在设定了旁向重叠率为Q%的情况下,旁向间距K’M’=S_短×(1-Q%)。
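Putting the four cases above together, a hedged sketch of the interval computation might look as follows; which footprint edge lies along the heading is exactly what the frame-direction/heading relationship decides. Function and parameter names are illustrative assumptions.

```python
# Sketch of the interval formulas above. If the long edge of the footprint lies
# along the heading (Fig. 9 case) the heading interval uses S_long, otherwise
# (Fig. 8 case) it uses S_short; the side interval uses the other edge.

def intervals(S_long, S_short, heading_overlap, side_overlap, long_edge_along_track):
    along = S_long if long_edge_along_track else S_short
    across = S_short if long_edge_along_track else S_long
    heading_interval = along * (1.0 - heading_overlap)   # e.g. AE or HK in the text
    side_interval = across * (1.0 - side_overlap)        # e.g. KM or K'M'
    return heading_interval, side_interval

# With S_long = 102.9 m, S_short = 68.6 m, 60 % forward and 30 % side overlap:
# Fig. 8 orientation -> heading ~27.4 m, side ~72.0 m
# Fig. 9 orientation -> heading ~41.2 m, side ~48.0 m
```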
可选的,在另一种实现方式中,步骤202具体可以包括:
子步骤2023、获取所述相机相邻两次曝光的最小时间间隔。
在本申请实施例中,可以进一步确定相机相邻两次曝光的最小时间间隔,其中,该最小时间间隔为无人机沿航向移动以拍摄相邻两张图像时的时间间隔,该最小时间间隔可以根据用户实际需求进行设定。也可以根据相机传感器的实际帧率(单位时间最多曝光次数)来设定。
该最小时间间隔的大小影响最终测绘结果中画面的精细度,用户可以根据成本和精度的需求进行设定。
用户还可以设置每次曝光的快门时间。此外,相机相邻两次曝光的最小时间间隔也受限于相机的硬件性能。相机曝光的快门时间,会影响相机传感器sensor对光线的感知程度。为了保证每次曝光,sensor能够感知到符合预期的进光量,用户可以调节快门时间。
子步骤2024、将所述相机相邻两次曝光的最小时间间隔和所述无人机的最大飞行速度的乘积,确定为所述位置间隔。
具体的,相机相邻两次曝光的最小时间间隔和无人机的最大飞行速度的乘积,可以作为无人机沿所述航向移动以拍摄相邻两张图像时所述无人机的位置间隔,从而通过另一种实现方式,求得了位置间隔。
步骤203、根据所述位置间隔在所述作业区域规划航线。
在本申请实施例中,由于无人机可以按照图5所示的航线32,航线32包括3条单条路径,在矩形的作业区域30中进行多次折返绕行操作,则在位置间隔包括航向间隔m和旁向间隔n的情况下,通过得知作业区域30的尺寸,即可根据作业区域30的长度和航向间隔m,求得航线32的一条单条路径的长度;根据作业区域30的宽度和旁向间隔n,可以求得航线32所需单条路径的数量。
在得知了航线的一条单条路径的长度和数量后,可以将多条单条路径依次首尾相连后,得到航线。
可选的,航线包括至少一条单条路径;步骤203具体可以包括:
子步骤2031、确定所述作业区域的外接矩形的尺寸。
在本申请实施例中,在实际应用中,由于地形和拍摄目标的分布的影响,所规划的作业区域的形状一般不为规则形状,则在作业区域为非矩形形状的情况下,需要确定作业区域的外接矩形的尺寸,以便通过外接矩形进行初始航线的规划。
可选的,子步骤2031具体可以包括:
子步骤B1、根据所述航向,建立所述作业区域的外接矩形,所述外接矩形的长边延伸方向与所述移动方向平行,或与所述外接矩形的长边的法线方向平行的方向与所述移动方向平行。
子步骤B2、确定所述外接矩形的尺寸。
具体的,参照图10,其示出了本申请实施例提供的一种初始航线的规划示意图,其中,作业区域30为六边形,为了在该不规则的作业区域30中规划航线,首先根据航向X,建立作业区域30的外接矩形33,其中,保持外接矩形33的长边延伸方向与移动方向X平行。另外,也可以保持与所述外接矩形的长边的法线方向平行的方向与移动方向X平行,本申请对此不作限定。
在作业区域30的外接矩形33构建完毕后,根据作业区域30的尺寸, 可以得出外接矩形33的尺寸。
子步骤2032、根据所述外接矩形的尺寸、所述航向间隔,确定所述航线所需的单条路径的长度。
具体的,参照上述图5中对航线以及航线包括的单条路径的描述,最终规划出的航线32可以包括多个首尾依次相连的单条路径,且两个相邻单条路径32之间通过转弯路径连接。
在根据相对方向关系求得航向间隔m和旁向间隔n之后,可以进一步推出单条路径的长度L_单=外接矩形的长度L_外长-2×m。转弯路径的总长度L_转=外接矩形宽度L_外宽。
子步骤2033、根据所述外接矩形的尺寸、所述旁向间隔,确定所述航线所需的单条路径的数量。
进一步的,航线所需的单条路径的数量N=[L_外宽/n],其中,[]为向上取整符号。如图10,航线所需的单条路径的数量为5。
子步骤2034、根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述作业区域中规划得到所述航线,所述航线的相邻单条路径之间间隔所述旁向间隔。
在本申请实施例中,参照图10,在得出单条路径的长度和单条路径的数量之后,再根据旁向间隔和航向间隔,将所有单条路径等间隔地布置在外接矩形30中,相邻单条路径之间间隔距离为旁向间隔,且单条路径的端部与外接矩形30的对应短边之间的距离为航向间隔。完成布置后,可以得到初始的航线34。初始的航线34的总长度=单条路径的数量N×单条路径的长度L_单+外接矩形宽度L_外宽。
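Under the assumptions of the passage above (single-path length L_单 = L_外长 - 2×m, path count N = [L_外宽/n] rounded up, total length = N×L_单 + L_外宽), a minimal sketch of the initial-route arithmetic is:

```python
# Hedged sketch of the initial-route maths above; names are illustrative only.
import math

def initial_route_stats(rect_length, rect_width, heading_interval_m, side_interval_n):
    single_path_len = rect_length - 2 * heading_interval_m   # L_single = L_rect - 2*m
    num_paths = math.ceil(rect_width / side_interval_n)      # N = ceil(W_rect / n)
    total_len = num_paths * single_path_len + rect_width     # N*L_single + turning paths
    return single_path_len, num_paths, total_len

# Example: a 500 m x 200 m bounding rectangle with m = 40 m and n = 50 m
# gives 420 m single paths, 4 of them, and 1880 m in total.
print(initial_route_stats(500.0, 200.0, 40.0, 50.0))
```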
但是,此时得到航线34为初始的航线,其部分路径处于作业区域33之外,若要进一步提高航线的精度,满足航线尽可能位于作业区域33中的需求,则还需要对初始的航线34进行进一步的调整,具体调整过程如下:
可选的,子步骤2034具体可以包括:
子步骤C1、根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述外接矩形中规划得到初始航线,并确定所述初始航线与所述外接矩形中的作业区域的边界的交点。
参照图10,在外接矩形30中规划得到初始的航线34之后,可以进一步确定初始的航线34与外接矩形30中的作业区域33的边界的交点a。在图 10中,一共有7个交点a。
子步骤C2、将所述交点沿目标方向移动预设距离值,所述目标方向为与所述交点所在的路径平行的方向,且所述目标方向为朝向所述作业区域内部的方向或背离所述作业区域内部的方向。
参照图11,其示出了本申请实施例提供的一种最终航线的规划示意图,其中,在将图10中的7个交点进行移动后,得到如图11所示的新的交点b。
具体的,交点a沿目标方向移动得到新的交点b,其中,目标方向可以包括:与交点a所在的路径平行的方向,且目标方向为朝向作业区域33内部的方向或背离作业区域33内部的方向,另外,根据实际需求,目标方向也可以包括用户设定的任一方向。
子步骤C3、将移动后的交点依次串联连接起来,得到所述航线。
具体的,进一步参照图12,在将移动后的交点b依次串联连接起来,可以得到最终的航线32。可见,图12中最终的航线32相较于图10中初始的航线34,其全部位于作业区域33中,从而满足无人机尽可能在作业区域33中运行的需求。
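A simplified, heavily assumed sketch of sub-steps C1-C3: suppose each single path has already been clipped to the work-area boundary and is represented by its two boundary intersection points; each point is moved a preset distance along its own path towards the interior, and the moved points are chained in serpentine order to give the final route. The representation and names are assumptions, not the application's own data structures.

```python
# Simplified sketch: horizontal single paths, each given as the pair of points
# where it meets the work-area boundary; every intersection point is shifted a
# preset distance inward along the path, then the points are chained in
# serpentine order to form the final route.

def adjust_and_chain(clipped_paths, shift):
    """clipped_paths: list of ((x1, y), (x2, y)) boundary intersections per path,
    with x1 < x2 and paths ordered by y. Returns the waypoint list of the route."""
    waypoints = []
    for i, ((x1, y), (x2, _)) in enumerate(clipped_paths):
        a, b = (x1 + shift, y), (x2 - shift, y)   # move both ends toward the interior
        pair = (a, b) if i % 2 == 0 else (b, a)   # alternate direction each pass
        waypoints.extend(pair)
    return waypoints
```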
步骤204、确定所述无人机沿所述航线执行所述拍摄任务时的任务参数。
具体的,步骤204具体可以参数上述步骤103,此处不再赘述。
可选的,任务参数包括所述航线的总长度、所述无人机完成所述航线的预计作业时间、完成所述航线时所述相机的预计拍照数量中的任一种。这三种参数是影响拍摄任务的成本和质量的重要参数,因此可以基于获取航线的这三种参数,来前置判断该航线的性价比。需要说明的是,任务参数还可以包括其他类型的参数,如无人机耗电量、航线上障碍物数量等。
可选的,所述任务参数包括所述无人机完成所述航线的预计作业时间,步骤204具体可以包括:
子步骤2041、将所述航线的总长度与目标速度的比值,确定为所述预计作业时间。
其中,在所述无人机的作业速度小于或等于所述无人机的最大移动速度的情况下,所述目标速度为所述作业速度;在所述无人机的作业速度大于所述无人机的最大移动速度的情况下,所述目标速度为所述最大移动速度,所述作业速度为所述无人机按照所述相机相邻两次曝光的最小时间间隔移动时的速度。
在本申请实施例中,在测绘领域中,相机在作业区域中持续拍摄图像,相机需要设定相邻两次曝光所需的最小时间间隔t,从而保证拍摄画面的连续性,具体的,无人机在按照相机相邻两次曝光的最小时间间隔t的参数移动时,具有作业速度V1,另外根据无人机的功率,其也具有最大飞行速度V2。并且,这些参数满足:航向间隔m≥最小时间间隔t×作业速度V1,则作业速度V1≤(航向间隔m/最小时间间隔t)。
进一步的,可以对无人机的作业速度V1以及无人机的最大飞行速度V2进行比较,在无人机的作业速度V1小于或等于无人机的最大移动速度V2的情况下,将作业速度V1确定为目标速度V,并将航线的总长度与目标速度V的比值,作为无人机完成航线的预计作业时间。
在无人机的作业速度V1大于无人机的最大移动速度V2的情况下,将最大移动速度V2确定为目标速度V,并将航线的总长度与目标速度V的比值,作为无人机完成航线的预计作业时间,即在无人机飞行的额定速度范围内,进行无人机完成航线的预计作业时间的计算。
可选的,所述任务参数包括所述无人机完成所述航线的预计拍照数量,
步骤204具体可以包括:
子步骤2042、将所述航线的总长度与所述航线对应的位置间隔的比值,确定所述预计拍照数量。
具体的,在该步骤中,可以将航线的总长度与航线对应的航向间隔的比值,确定预计拍照数量,在测绘领域中,保证同样测绘精度的情况下,预计拍照数量越小,说明测绘效率越高,成本越低。
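A short sketch tying together the two task parameters above, with the operating speed V1 = m/t capped by the maximum speed V2; all names and the example figures are illustrative assumptions.

```python
# Sketch of the task parameters above: target speed = min(V1, V2) with
# V1 = heading interval / minimum exposure interval; estimated time =
# total length / target speed; estimated photo count = total length /
# heading interval.

def task_parameters(total_len, heading_interval_m, min_exposure_interval_t, max_speed_v2):
    v1 = heading_interval_m / min_exposure_interval_t  # speed allowed by the camera
    v = min(v1, max_speed_v2)                          # target speed within the rated range
    est_time_s = total_len / v
    est_photos = total_len / heading_interval_m
    return v, est_time_s, est_photos

# 1880 m route, 27.4 m heading interval, 2 s between exposures, 15 m/s max speed
# -> v1 = 13.7 m/s is the limit, ~137 s of flight, ~69 photos (rounded up in practice).
```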
步骤205、在所述任务参数不满足预设任务参数条件的情况下,调整所述画幅方向与所述航向之间的相对方向关系,并重新规划航线。
具体的,步骤205具体可以参数上述步骤104,此处不再赘述。
可选的,步骤205具体可以包括:
子步骤2051、在所述任务参数的值大于或等于所述任务参数对应的任务参数阈值的情况下,确定所述任务参数不满足预设的任务参数条件。
在任务参数包括所述航线的总长度、所述无人机完成所述航线的预计作业时间、完成所述航线时所述相机的预计拍照数量中的任一种的情况下,在 进行任务参数的值与任务参数对应的任务参数阈值的比较的过程中,可以进行单一对比,如:
如果任务参数阈值为时间值,则对比当前预计作业时间的值与任务参数阈值,若当前预计作业时间的值大于或等于任务参数阈值,则认为无人机完成当前航线所需时间过长,不满足预设的任务参数条件,需要重新规划相对方向关系和航线。
如果任务参数阈值为数量值,则对比当前预计拍照数量与任务参数阈值,若当前预计拍照数量大于或等于任务参数阈值,则认为无人机完成当前航线所需拍照数量过多,导致成本较高,不满足预设的任务参数条件,需要重新规划相对方向关系和航线。
如果任务参数阈值为距离值,则对比当前航线长度与任务参数阈值,若当前当前航线长度大于或等于任务参数阈值,则认为无人机完成当前航线所需航程过长,不满足预设的任务参数条件,需要重新规划相对方向关系和航线。
另外,航线的总长度、所述无人机完成所述航线的预计作业时间、完成所述航线时所述相机的预计拍照数量这三个任务参数具有不同的重要程度,如,航线的总长度、预计作业时间、预计拍照数量的重要性依次递减,因此,还可以对三个任务参数分别设定权重值,并将每个任务参数与权重值的乘积进行加和,得到任务参数的值,并根据实际需求,设定一个任务参数阈值,在加权求和得到的任务参数的值大于或等于所述任务参数对应的任务参数阈值的情况下,确定任务参数不满足预设的任务参数条件,这样可以综合考虑各个任务参数的重要性,提高了判断精度。
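The weighted check described above might be sketched as follows; the weights and the threshold are user-chosen, and the values shown are assumptions for illustration only.

```python
# Hedged sketch of the weighted task-parameter condition: each parameter is
# multiplied by its weight, the products are summed, and the sum is compared
# with a single threshold.

def meets_condition(total_len, est_time, est_photos, weights=(0.5, 0.3, 0.2), threshold=2000.0):
    score = weights[0] * total_len + weights[1] * est_time + weights[2] * est_photos
    return score < threshold   # the condition is NOT met when score >= threshold

print(meets_condition(1880.0, 137.2, 68.6))  # score ~= 994.9 -> True
```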
子步骤2052、控制所述相机旋转,得到所述画幅方向与所述航向之间的新的相对方向关系。
具体的,该旋转操作可以为保持相机面向作业区域,且逆时针或顺时针旋转。
可选的,所述无人机搭载云台,所述云台搭载所述相机,子步骤2052具体可以包括:
子步骤D1、控制所述云台带动所述相机旋转,得到所述画幅方向与所 述航向之间的新的相对方向关系。
在本申请实施例中,可以设定一个旋转角度,从而使得云台可以根据该旋转角度计算得到旋转量,云台根据旋转量运作,带动相机旋转,得到画幅方向与航向之间的新的相对方向关系。
例如,参照图2和图3,将图2中的相机姿态旋转至图3中的相机姿态,需要将相机逆时针或顺时针旋转90度。
子步骤2053、在根据所述新的相对方向关系规划得到新的航线的任务参数的值,小于所述任务参数对应的任务参数阈值的情况下,以供所述无人机根据所述新的相对方向关系和所述新的航线,控制所述无人机执行拍摄任务。
本申请实施例中,在根据新的相对方向关系规划得到新的航线之后,在新的航线的任务参数的值小于任务参数对应的任务参数阈值的情况下,认为新的相对方向关系和新的航线满足需求,则可以控制无人机根据新的相对方向关系和新的航线执行拍摄任务。
在根据新的相对方向关系规划得到新的航线之后,在新的航线的任务参数的值大于或等于任务参数对应的任务参数阈值的情况下,认为新的相对方向关系和新的航线还是不满足需求,则需要继续进行子步骤2052的流程,重新确定新的相对方向关系和新的航线,直至新的航线的任务参数的值小于任务参数对应的任务参数阈值。
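The adjust-and-replan loop of step 205 (sub-steps 2051-2053) can be sketched as below. plan_route, evaluate and rotate_camera are assumed helpers standing in for route planning, task-parameter estimation and gimbal control; they are not functions defined by the application.

```python
# Sketch of the adjust-and-replan loop: try candidate relative-direction
# angles until the planned route's task parameter drops below the threshold.

def adjust_until_satisfied(angles, plan_route, evaluate, rotate_camera, threshold):
    """angles: candidate relative-direction angles to try in order."""
    for angle in angles:
        rotate_camera(angle)              # e.g. command the gimbal to this yaw offset
        route = plan_route(angle)
        if evaluate(route) < threshold:   # task parameter condition satisfied
            return angle, route
    return None                           # no candidate met the condition
```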
综上,本申请实施例提供的一种设备控制方法,通过在无人机执行拍摄任务之前,通过根据无人机的相机的画幅方向和无人机的航向之间的相对方向关系,规划对应的航线,以及计算无人机沿航线执行拍摄任务时的任务参数;使得后续无人机在按照相对方向关系和航线执行拍摄任务时,对应产生的作业效率被赋予了可参考的度量,进一步对任务参数进行判断,可以确定相对方向关系和航线对应的作业效率是否满足需求,并在不满足需求的情况下,通过灵活控制无人机改变相机的画幅方向和无人机的航向之间的相对方向关系,来最终满足需求,实现对相机相对于航向的姿态的优化,从而提高无人机的作业效率。
图13是本申请实施例提供的另一种设备控制方法的流程图,该方法可以包括:
步骤301、获取无人机执行拍摄任务的作业区域。
具体的步骤301具体可以参照上述步骤101,此处不再赘述。
步骤302、在所述无人机的相机在拍摄所述作业区域时的画幅方向和所述无人机的航向之间的多个不同的相对方向关系中,针对每一种所述相对方向关系在所述作业区域规划航线,并确定所述无人机沿规划的所述航线执行所述拍摄任务时的任务参数。
在本申请实施例中,在无人机执行拍摄任务之前,可以预先设定多个相对方向关系,提前针对每一个相对方向关系规划对应的航线,并确定每一个航线对应的任务参数。
如,在无人机执行拍摄任务之前,可以针对图2所示的相对方向关系和图3所示的相对方向关系,分别规划对应的航线,且确定每一个航线对应的任务参数。
可选的,在一种实现方式中,步骤302具体可以包括:
子步骤3021、针对每一种所述相对方向关系,分别确定所述无人机沿所述相对方向关系中的航向移动,以拍摄相邻两张图像时所述无人机的位置间隔。
可选的,子步骤3021具体可以包括:
子步骤E1、针对每一种所述相对方位关系,分别确定所述相机的拍摄重叠率。
子步骤E2、根据所述拍摄重叠率、所述相机的地面分辨率和所述无人机的飞行高度,确定每一种所述相对方位关系对应的位置间隔。
可选的,所述位置间隔包括:航向间隔和旁向间隔,所述拍摄重叠率包括:航向重叠率和旁向重叠率,子步骤E2具体可以包括:
子步骤E21、根据所述地面分辨率和所述飞行高度,确定所述相机的画幅参考区域的短边和长边的长度,所述画幅参考区域的形状为矩形。
子步骤E22、根据所述画幅参考区域的短边的长度以及所述航向重叠率,确定每一种所述相对方位关系对应的航向间隔。
子步骤E23、根据所述画幅参考区域的长边的长度以及所述旁向重叠率,确定每一种所述相对方位关系对应的旁向间隔。
具体的,子步骤3021具体可以参照上述步骤202,此处不再赘述。子步骤E1-E2具体可以参照上述子步骤2021-2022,此处不再赘述。子步骤 E21-E23具体可以参照上述子步骤A1-A3,此处不再赘述。
子步骤3022、根据所述位置间隔在所述作业区域规划航线。
可选的,航线包括至少一条单条路径;子步骤3022具体可以包括:
子步骤F1、确定所述作业区域的外接矩形的尺寸。
可选的,子步骤F1具体可以包括:
子步骤F11、根据所述航向,建立所述作业区域的外接矩形,所述外接矩形的长边延伸方向与所述移动方向平行,或与所述外接矩形的长边的法线方向平行的方向与所述移动方向平行。
子步骤F12、确定所述外接矩形的尺寸。
子步骤F2、根据所述外接矩形的尺寸、所述航向间隔,确定所述航线所需的单条路径的长度。
子步骤F3、根据所述外接矩形的尺寸、所述旁向间隔,确定所述航线所需的单条路径的数量。
子步骤F4、根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述作业区域中规划得到所述航线,所述航线的相邻单条路径之间间隔所述旁向间隔。
可选的,子步骤F4具体可以包括:
子步骤F41、根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述外接矩形中规划得到初始航线,并确定所述初始航线与所述外接矩形中的作业区域的边界的交点。
子步骤F42、将所述交点沿目标方向移动预设距离值,所述目标方向为与所述交点所在的路径平行的方向,且所述目标方向为朝向所述作业区域内部的方向或背离所述作业区域内部的方向。
子步骤F43、将移动后的交点依次串联连接起来,得到所述航线。
可选的,所述画幅方向包括:所述画幅参考区域的长边延伸方向,或与所述长边的法线方向平行的方向,或所述画幅参考区域的短边延伸方向,或与所述短边的法线方向平行的方向。
具体的,子步骤3022具体可以参照上述步骤203,此处不再赘述。子步骤F1-F4具体可以参照上述子步骤2031-2034,此处不再赘述。子步骤F11-F12 具体可以参照上述子步骤B1-B2,此处不再赘述。子步骤F41-F43具体可以参照上述子步骤C1-C3,此处不再赘述。
可选的,子步骤3021具体可以包括:
子步骤G1、获取所述相机相邻两次曝光的最小时间间隔。
子步骤G2、将所述相机相邻两次曝光的最小时间间隔和所述无人机的最大飞行速度的乘积,确定为所述位置间隔。
具体的,子步骤G1-G2具体可以参照上述子步骤2023-2024,此处不再赘述。
可选的,所述任务参数包括所述航线的总长度、所述无人机完成所述航线的预计作业时间、完成所述航线时所述相机的预计拍照数量中的任一种。
可选的,所述目标任务参数的值为所有任务参数的值中的最小值。
可选的,所述任务参数包括所述无人机完成所述航线的预计作业时间,步骤302具体可以包括:
子步骤3023、将所述航线的总长度与目标速度的比值,确定为所述预计作业时间。
其中,在所述无人机的作业速度小于或等于所述无人机的最大移动速度的情况下,所述目标速度为所述作业速度;在所述无人机的作业速度大于所述无人机的最大移动速度的情况下,所述目标速度为所述最大移动速度,所述作业速度为所述无人机按照所述相机相邻两次曝光的最小时间间隔移动时的速度。
具体的,子步骤3023具体可以参照上述步骤2041,此处不再赘述。
可选的,所述任务参数包括所述无人机完成所述航线的预计拍照数量,步骤302具体可以包括:
子步骤3024、将所述航线的总长度与所述航线对应的位置间隔的比值,确定所述预计拍照数量。
具体的,子步骤3024具体可以参照上述步骤2042,此处不再赘述。
步骤303、确定符合预设任务参数条件的目标任务参数对应的目标相对方位关系和对应的目标航线,所述目标相对方位关系和所述目标航线用于控制所述无人机执行所述拍摄任务。
可选的,所述无人机搭载云台,所述云台搭载所述相机;步骤303具体可以包括:
子步骤3031、在所述相机的画幅方向和所述无人机的航向之间的当前相对方向关系,与所述目标任务参数对应的相对方位关系不匹配的情况下,控制所述云台带动所述相机旋转,将所述当前相对方向关系调整为所述目标任务参数对应的相对方位关系,并按照所述目标任务参数对应的航线,控制所述无人机执行所述拍摄任务。
在本申请实施例中,通过对每组相对方向关系以及航线对应的任务参数进行比较,可以筛选出作业效率较高的相对方向关系以及航线,后续则可以控制无人机按照作业效率较高的相对方向关系以及航线执行拍摄任务,从而提高作业效率。
如,在得到图2所示的相对方向关系、航线、任务参数,以及图3所示的相对方向关系、航线、任务参数之后,可以将两组任务参数进行比较,确定其中符合预设任务参数条件的目标任务参数对应的目标相对方位关系和对应的目标航线,并采用目标相对方位关系和对应的目标航线控制无人机执行拍摄任务,根据图2和图3的航线规划,图2的航线更优。
需要说明的是,在确定得到了多个目标相对方向关系和目标航线的情况下,可以自动按照任务参数最优的目标相对方向关系和对应的目标航线,控制无人机执行拍摄任务,或根据用户的选择,按照用户选择的目标相对方向关系和对应的目标航线控制无人机执行拍摄任务。
具体的,子步骤3031具体可以参照上述步骤2042,此处不再赘述。
综上,本申请实施例提供的一种设备控制方法,在无人机执行拍摄任务之前,通过根据无人机的相机的画幅方向和无人机的航向之间的相对方向关系,规划对应的航线,以及计算无人机沿航线执行拍摄任务时的任务参数;使得后续无人机在按照相对方向关系和航线执行拍摄任务时,对应产生的作业效率被赋予了可参考的度量,进一步对任务参数进行判断,可以确定相对方向关系和航线对应的作业效率是否满足需求,并在不满足需求的情况下,通过灵活控制无人机改变相机的画幅方向和无人机的航向之间的相对方向关系,来最终满足需求,实现对相机相对于航向的姿态的优化,从而提高无人机的作业效率。
另外,通过预设多组相对方向关系,并对每组相对方向关系以及航线对 应的任务参数进行比较,可以筛选出作业效率较高的相对方向关系以及航线,后续则可以控制无人机按照作业效率较高的相对方向关系以及航线执行拍摄任务,从而提高作业效率。
图14是本申请实施例提供的一种设备控制装置的框图,如图14所示,该设备控制装置400可以包括:获取模块401和处理模块402;
所述获取模块401用于执行:获取无人机执行拍摄任务的作业区域;
所述处理模块402用于执行:
根据所述无人机的相机的画幅方向和所述无人机的航向之间的相对方向关系,在所述作业区域规划航线;
确定所述无人机沿所述航线执行所述拍摄任务时的任务参数;
在所述任务参数不满足预设任务参数条件的情况下,调整所述画幅方向与所述航向之间的相对方向关系,并重新规划航线。
可选的,所述处理模块402具体用于:
根据所述相对方向关系,确定所述无人机沿所述航向移动以拍摄相邻两张图像时所述无人机的位置间隔;
根据所述位置间隔在所述作业区域规划航线。
可选的,所述处理模块402具体用于:
根据所述相对方向关系,确定所述相机的拍摄重叠率;
根据所述拍摄重叠率、所述相机的地面分辨率和所述无人机的飞行高度,确定所述位置间隔。
可选的,所述位置间隔包括:航向间隔和旁向间隔,所述拍摄重叠率包括:航向重叠率和旁向重叠率,所述处理模块402具体用于:
根据所述地面分辨率和所述飞行高度,确定所述相机的画幅参考区域的短边和长边的长度,所述画幅参考区域的形状为矩形;
根据所述画幅参考区域的短边的长度以及所述航向重叠率,确定所述航向间隔;
根据所述画幅参考区域的长边的长度以及所述旁向重叠率,确定所述旁向间隔。
可选的,所述航线包括至少一条单条路径;所述处理模块402具体用于:
确定所述作业区域的外接矩形的尺寸;
根据所述外接矩形的尺寸、所述航向间隔,确定所述航线所需的单条路径的长度;
根据所述外接矩形的尺寸、所述旁向间隔,确定所述航线所需的单条路径的数量;
根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述作业区域中规划得到所述航线,所述航线的相邻单条路径之间间隔所述旁向间隔。
可选的,所述处理模块402具体用于:
根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述外接矩形中规划得到初始航线,并确定所述初始航线与所述外接矩形中的作业区域的边界的交点;
将所述交点沿目标方向移动预设距离值,所述目标方向为与所述交点所在的路径平行的方向,且所述目标方向为朝向所述作业区域内部的方向或背离所述作业区域内部的方向;
将移动后的交点依次串联连接起来,得到所述航线。
可选的,所述处理模块402具体用于:
根据所述航向,建立所述作业区域的外接矩形,所述外接矩形的长边延伸方向与所述移动方向平行,或与所述外接矩形的长边的法线方向平行的方向与所述移动方向平行;
确定所述外接矩形的尺寸。
可选的,所述画幅方向包括:所述画幅参考区域的长边延伸方向,或与所述长边的法线方向平行的方向,或所述画幅参考区域的短边延伸方向,或与所述短边的法线方向平行的方向。
可选的,所述处理模块402具体用于:
获取所述相机相邻两次曝光的最小时间间隔;
将所述相机相邻两次曝光的最小时间间隔和所述无人机的最大飞行速度的乘积,确定为所述位置间隔。
可选的,所述任务参数包括所述航线的总长度、所述无人机完成所述航 线的预计作业时间、完成所述航线时所述相机的预计拍照数量中的任一种。
可选的,所述在所述任务参数不满足预设的任务参数条件的情况下,所述处理模块402具体用于:
在所述任务参数的值大于或等于所述任务参数对应的任务参数阈值的情况下,确定所述任务参数不满足预设的任务参数条件;
控制所述相机旋转,得到所述画幅方向与所述航向之间的新的相对方向关系;
在根据所述新的相对方向关系规划得到新的航线的任务参数的值,小于所述任务参数对应的任务参数阈值的情况下,以供所述无人机根据所述新的相对方向关系和所述新的航线,控制所述无人机执行拍摄任务。
可选的,所述无人机搭载云台,所述云台搭载所述相机;所述处理模块402具体用于:
控制所述云台带动所述相机旋转,得到所述画幅方向与所述航向之间的新的相对方向关系。
可选的,所述任务参数包括所述无人机完成所述航线的预计作业时间,所述处理模块402具体用于:
将所述航线的总长度与目标速度的比值,确定为所述预计作业时间;
其中,在所述无人机的作业速度小于或等于所述无人机的最大移动速度的情况下,所述目标速度为所述作业速度;在所述无人机的作业速度大于所述无人机的最大移动速度的情况下,所述目标速度为所述最大移动速度,所述作业速度为所述无人机按照所述相机相邻两次曝光的最小时间间隔移动时的速度。
可选的,所述任务参数包括所述无人机完成所述航线的预计拍照数量,所述处理模块402具体用于:
将所述航线的总长度与所述航线对应的位置间隔的比值,确定所述预计拍照数量。
综上,本申请实施例提供的设备控制装置,通过在无人机执行拍摄任务之前,通过根据无人机的相机的画幅方向和无人机的航向之间的相对方向关系,规划对应的航线,以及计算无人机沿航线执行拍摄任务时的任务参数;使得后续无人机在按照相对方向关系和航线执行拍摄任务时,对应产生的作 业效率被赋予了可参考的度量,进一步对任务参数进行判断,可以确定相对方向关系和航线对应的作业效率是否满足需求,并在不满足需求的情况下,通过灵活控制无人机改变相机的画幅方向和无人机的航向之间的相对方向关系,来最终满足需求,实现对相机相对于航向的姿态的优化,从而提高无人机的作业效率。
图15是本申请实施例提供的一种设备控制装置的框图,如图15所示,该设备控制装置500可以包括:获取模块501和处理模块502;
所述获取模块401用于执行:获取无人机执行拍摄任务的作业区域;
所述处理模块402用于执行:在所述无人机的相机在拍摄所述作业区域时的画幅方向和所述无人机的航向之间的多个不同的相对方向关系中,针对每一种所述相对方向关系在所述作业区域规划航线,并确定所述无人机沿规划的所述航线执行所述拍摄任务时的任务参数;
确定符合预设任务参数条件的目标任务参数对应的目标相对方位关系和对应的目标航线,所述目标相对方位关系和所述目标航线用于控制所述无人机执行所述拍摄任务。
可选的,所述处理模块502具体用于:
针对每一种所述相对方向关系,分别确定所述无人机沿所述相对方向关系中的航向移动,以拍摄相邻两张图像时所述无人机的位置间隔;
根据所述位置间隔在所述作业区域规划航线。
可选的,所述处理模块502具体用于:
针对每一种所述相对方位关系,分别确定所述相机的拍摄重叠率;
根据所述拍摄重叠率、所述相机的地面分辨率和所述无人机的飞行高度,确定每一种所述相对方位关系对应的位置间隔。
可选的,所述位置间隔包括:航向间隔和旁向间隔,所述拍摄重叠率包括:航向重叠率和旁向重叠率,所述处理模块502具体用于:
根据所述地面分辨率和所述飞行高度,确定所述相机的画幅参考区域的短边和长边的长度,所述画幅参考区域的形状为矩形;
根据所述画幅参考区域的短边的长度以及所述航向重叠率,确定每一种所述相对方位关系对应的航向间隔;
根据所述画幅参考区域的长边的长度以及所述旁向重叠率,确定每一种所述相对方位关系对应的旁向间隔。
可选的,所述航线包括至少一条单条路径;所述处理模块502具体用于:
确定所述作业区域的外接矩形的尺寸;
根据所述外接矩形的尺寸、所述航向间隔,确定所述航线所需的单条路径的长度;
根据所述外接矩形的尺寸、所述旁向间隔,确定所述航线所需的单条路径的数量;
根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述作业区域中规划得到所述航线,所述航线的相邻单条路径之间间隔所述旁向间隔。
可选的,所述处理模块502具体用于:
根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述外接矩形中规划得到初始航线,并确定所述初始航线与所述外接矩形中的作业区域的边界的交点;
将所述交点沿目标方向移动预设距离值,所述目标方向为与所述交点所在的路径平行的方向,且所述目标方向为朝向所述作业区域内部的方向或背离所述作业区域内部的方向;
将移动后的交点依次串联连接起来,得到所述航线。
可选的,所述处理模块502具体用于:
根据所述航向,建立所述作业区域的外接矩形,所述外接矩形的长边延伸方向与所述移动方向平行,或与所述外接矩形的长边的法线方向平行的方向与所述移动方向平行;
确定所述外接矩形的尺寸。
可选的,所述画幅方向包括:所述画幅参考区域的长边延伸方向,或与所述长边的法线方向平行的方向,或所述画幅参考区域的短边延伸方向,或与所述短边的法线方向平行的方向。
可选的,所述处理模块502具体用于:
获取所述相机相邻两次曝光的最小时间间隔;
将所述相机相邻两次曝光的最小时间间隔和所述无人机的最大飞行速度的乘积,确定为所述位置间隔。
可选的,所述处理模块具体用于:所述任务参数包括所述航线的总长度、所述无人机完成所述航线的预计作业时间、完成所述航线时所述相机的预计拍照数量中的任一种。
可选的,所述目标任务参数的值为所有任务参数的值中的最小值。
可选的,所述任务参数包括所述无人机完成所述航线的预计作业时间,所述处理模块402具体用于:
将所述航线的总长度与目标速度的比值,确定为所述预计作业时间;
其中,在所述无人机的作业速度小于或等于所述无人机的最大移动速度的情况下,所述目标速度为所述作业速度;在所述无人机的作业速度大于所述无人机的最大移动速度的情况下,所述目标速度为所述最大移动速度,所述作业速度为所述无人机按照所述相机相邻两次曝光的最小时间间隔移动时的速度。
可选的,所述任务参数包括所述无人机完成所述航线的预计拍照数量,所述处理模块502具体用于:
将所述航线的总长度与所述航线对应的位置间隔的比值,确定所述预计拍照数量。
可选的,所述无人机搭载云台,所述云台搭载所述相机;所述处理模块502具体用于:
在所述相机的画幅方向和所述无人机的航向之间的当前相对方向关系,与所述目标任务参数对应的相对方位关系不匹配的情况下,控制所述云台带动所述相机旋转,将所述当前相对方向关系调整为所述目标任务参数对应的相对方位关系,并按照所述目标任务参数对应的航线,控制所述无人机执行所述拍摄任务。
综上,本申请实施例提供的设备控制装置,通过在无人机执行拍摄任务之前,通过根据无人机的相机的画幅方向和无人机的航向之间的相对方向关系,规划对应的航线,以及计算无人机沿航线执行拍摄任务时的任务参数;使得后续无人机在按照相对方向关系和航线执行拍摄任务时,对应产生的作业效率被赋予了可参考的度量,进一步对任务参数进行判断,可以确定相对 方向关系和航线对应的作业效率是否满足需求,并在不满足需求的情况下,通过灵活控制无人机改变相机的画幅方向和无人机的航向之间的相对方向关系,来最终满足需求,实现对相机相对于航向的姿态的优化,从而提高无人机的作业效率。
另外,通过预设多组相对方向关系,并对每组相对方向关系以及航线对应的任务参数进行比较,可以筛选出作业效率较高的相对方向关系以及航线,后续则可以控制无人机按照作业效率较高的相对方向关系以及航线执行拍摄任务,从而提高作业效率。
本申请实施例还提供一种计算机可读存储介质,计算机可读存储介质上存储有计算机程序,该计算机程序被处理器执行时实现上述设备控制方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。其中,所述的计算机可读存储介质,如只读存储器(Read-Only Memory,简称ROM)、随机存取存储器(Random Access Memory,简称RAM)、磁碟或者光盘等。
获取模块可以为外部控制终端与设备控制装置连接的接口。例如,外部控制终端可以包括有线或无线头戴式耳机端口、外部电源(或电池充电器)端口、有线或无线数据端口、存储卡端口、用于连接具有识别模块的控制终端的端口、音频输入/输出(I/O)端口、视频I/O端口、耳机端口等等。获取模块可以用于接收来自外部控制终端的输入(例如,数据信息、电力等等)并且将接收到的输入传输到设备控制装置内的一个或多个元件或者可以用于在设备控制装置和外部控制终端之间传输数据。
例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。
处理器是控制终端的控制中心,利用各种接口和线路连接整个控制终端的各个部分,通过运行或执行存储在存储器内的软件程序和/或模块,以及调用存储在存储器内的数据,执行控制终端的各种功能和处理数据,从而对控制终端进行整体监控。处理器可包括一个或多个处理单元;优选的,处理器可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器中。
本说明书中的各个实施例均采用递进的方式描述,每个实施例重点说明的都是与其他实施例的不同之处,各个实施例之间相同相似的部分互相参见即可。
本领域内的技术人员应明白,本申请的实施例可提供为方法、控制终端、或计算机程序产品。因此,本申请可采用完全硬件实施例、完全软件实施例、或结合软件和硬件方面的实施例的形式。而且,本申请可采用在一个或多个其中包含有计算机可用程序代码的计算机可用存储介质(包括但不限于磁盘存储器、CD-ROM、光学存储器等)上实施的计算机程序产品的形式。
本申请是参照根据本申请的方法、终端设备(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其他可编程数据处理终端设备的处理器以产生一个机器,使得通过计算机或其他可编程数据处理终端设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的控制终端。
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理终端设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令控制终端的制造品,该指令控制终端实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程数据处理终端设备上,使得在计算机或其他可编程终端设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程终端设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
尽管已描述了本申请的优选实施例,但本领域内的技术人员一旦得知了基本创造性概念,则可对这些实施例做出另外的变更和修改。所以,所附权利要求意欲解释为包括优选实施例以及落入本申请范围的所有变更和修改。
最后,还需要说明的是,在本文中,诸如第一和第二等之类的关系术语仅仅用来将一个实体或者操作与另一个实体或操作区分开来,而不一定要求或者暗示这些实体或操作之间存在任何这种实际的关系或者顺序。而且,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者终端设备不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者终端设备所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括所述要素的过程、方法、物品或者终 端设备中还存在另外的相同要素。
以上对本申请进行了详细介绍,本文中应用了具体个例对本申请的原理及实施方式进行了阐述,以上实施例的说明只是用于帮助理解本申请的方法及其核心思想;同时,对于本领域的一般技术人员,依据本申请的思想,在具体实施方式及应用范围上均会有改变之处,综上所述,本说明书内容不应理解为对本申请的限制。

Claims (57)

  1. 一种设备控制方法,其特征在于,所述方法包括:
    获取无人机执行拍摄任务的作业区域;
    根据所述无人机的相机的画幅方向和所述无人机的航向之间的相对方向关系,在所述作业区域规划航线;
    确定所述无人机沿所述航线执行所述拍摄任务时的任务参数;
    在所述任务参数不满足预设任务参数条件的情况下,调整所述画幅方向与所述航向之间的相对方向关系,并重新规划航线。
  2. 根据权利要求1所述的方法,其特征在于,根据所述无人机的相机的画幅方向和所述无人机的航向之间的相对方向关系,在所述作业区域规划航线,包括:
    根据所述相对方向关系,确定所述无人机沿所述航向移动以拍摄相邻两张图像时所述无人机的位置间隔;
    根据所述位置间隔在所述作业区域规划航线。
  3. 根据权利要求2所述的方法,其特征在于,所述根据所述相对方向关系,确定所述无人机沿所述航向移动以拍摄相邻两张图像时所述无人机的位置间隔,包括:
    根据所述相对方向关系,确定所述相机的拍摄重叠率;
    根据所述拍摄重叠率、所述相机的地面分辨率和所述无人机的飞行高度,确定所述位置间隔。
  4. 根据权利要求3所述的方法,其特征在于,所述位置间隔包括:航向间隔和旁向间隔,所述拍摄重叠率包括:航向重叠率和旁向重叠率,所述根据所述拍摄重叠率、所述相机的地面分辨率和所述无人机的飞行高度,确定所述位置间隔,包括:
    根据所述地面分辨率和所述飞行高度,确定所述相机的画幅参考区域的短边和长边的长度,所述画幅参考区域的形状为矩形;
    根据所述画幅参考区域的短边的长度以及所述航向重叠率,确定所述航向间隔;
    根据所述画幅参考区域的长边的长度以及所述旁向重叠率,确定所述旁 向间隔。
  5. 根据权利要求4所述的方法,其特征在于,所述航线包括至少一条单条路径;
    所述根据所述位置间隔在所述作业区域规划航线,包括:
    确定所述作业区域的外接矩形的尺寸;
    根据所述外接矩形的尺寸、所述航向间隔,确定所述航线所需的单条路径的长度;
    根据所述外接矩形的尺寸、所述旁向间隔,确定所述航线所需的单条路径的数量;
    根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述作业区域中规划得到所述航线,所述航线的相邻单条路径之间间隔所述旁向间隔。
  6. 根据权利要求5所述的方法,其特征在于,所述根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述作业区域中规划得到所述航线,包括:
    根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述外接矩形中规划得到初始航线,并确定所述初始航线与所述外接矩形中的作业区域的边界的交点;
    将所述交点沿目标方向移动预设距离值,所述目标方向为与所述交点所在的路径平行的方向,且所述目标方向为朝向所述作业区域内部的方向或背离所述作业区域内部的方向;
    将移动后的交点依次串联连接起来,得到所述航线。
  7. 根据权利要求5所述的方法,其特征在于,所述确定所述作业区域的外接矩形的尺寸,包括:
    根据所述航向,建立所述作业区域的外接矩形,所述外接矩形的长边延伸方向与所述移动方向平行,或与所述外接矩形的长边的法线方向平行的方向与所述移动方向平行;
    确定所述外接矩形的尺寸。
  8. 根据权利要求4所述的方法,其特征在于,所述画幅方向包括:所 述画幅参考区域的长边延伸方向,或与所述长边的法线方向平行的方向,或所述画幅参考区域的短边延伸方向,或与所述短边的法线方向平行的方向。
  9. 根据权利要求2所述的方法,其特征在于,所述根据所述相对方向关系,确定所述无人机沿所述航向移动以拍摄相邻两张图像时所述无人机的位置间隔,包括:
    获取所述相机相邻两次曝光的最小时间间隔;
    将所述相机相邻两次曝光的最小时间间隔和所述无人机的最大飞行速度的乘积,确定为所述位置间隔。
  10. 根据权利要求1所述的方法,其特征在于,所述任务参数包括所述航线的总长度、所述无人机完成所述航线的预计作业时间、完成所述航线时所述相机的预计拍照数量中的任一种。
  11. 根据权利要求10所述的方法,其特征在于,所述在所述任务参数不满足预设的任务参数条件的情况下,调整所述画幅方向与所述航向之间的相对方向关系,包括:
    在所述任务参数的值大于或等于所述任务参数对应的任务参数阈值的情况下,确定所述任务参数不满足预设的任务参数条件;
    控制所述相机旋转,得到所述画幅方向与所述航向之间的新的相对方向关系;
    在根据所述新的相对方向关系规划得到新的航线的任务参数的值,小于所述任务参数对应的任务参数阈值的情况下,以供所述无人机根据所述新的相对方向关系和所述新的航线,控制所述无人机执行拍摄任务。
  12. 根据权利要求10所述的方法,其特征在于,所述无人机搭载云台,所述云台搭载所述相机;所述控制所述相机旋转,得到所述画幅方向与所述航向之间的新的相对方向关系,包括:
    控制所述云台带动所述相机旋转,得到所述画幅方向与所述航向之间的新的相对方向关系。
  13. 根据权利要求10所述的方法,其特征在于,所述任务参数包括所述无人机完成所述航线的预计作业时间,
    所述确定所述无人机沿所述航线执行所述拍摄任务时的任务参数,包括:
    将所述航线的总长度与目标速度的比值,确定为所述预计作业时间;
    其中,在所述无人机的作业速度小于或等于所述无人机的最大移动速度的情况下,所述目标速度为所述作业速度;在所述无人机的作业速度大于所述无人机的最大移动速度的情况下,所述目标速度为所述最大移动速度,所述作业速度为所述无人机按照所述相机相邻两次曝光的最小时间间隔移动时的速度。
  14. 根据权利要求10所述的方法,其特征在于,所述任务参数包括所述无人机完成所述航线的预计拍照数量,
    所述确定所述无人机沿所述航线执行所述拍摄任务时的任务参数,包括:
    将所述航线的总长度与所述航线对应的位置间隔的比值,确定所述预计拍照数量。
  15. 一种设备控制方法,其特征在于,所述方法包括:
    获取无人机执行拍摄任务的作业区域;
    在所述无人机的相机在拍摄所述作业区域时的画幅方向和所述无人机的航向之间的多个不同的相对方向关系中,针对每一种所述相对方向关系在所述作业区域规划航线,并确定所述无人机沿规划的所述航线执行所述拍摄任务时的任务参数;
    确定符合预设任务参数条件的目标任务参数对应的目标相对方位关系和对应的目标航线,所述目标相对方位关系和所述目标航线用于控制所述无人机执行所述拍摄任务。
  16. 根据权利要求15所述的方法,其特征在于,所述针对每一种所述相对方向关系在所述作业区域规划航线,包括:
    针对每一种所述相对方向关系,分别确定所述无人机沿所述相对方向关系中的航向移动,以拍摄相邻两张图像时所述无人机的位置间隔;
    根据所述位置间隔在所述作业区域规划航线。
  17. 根据权利要求16所述的方法,其特征在于,所述针对每一种所述相对方向关系,分别确定所述无人机沿所述相对方向关系中的航向移动,以 拍摄相邻两张图像时所述无人机的位置间隔,包括:
    针对每一种所述相对方位关系,分别确定所述相机的拍摄重叠率;
    根据所述拍摄重叠率、所述相机的地面分辨率和所述无人机的飞行高度,确定每一种所述相对方位关系对应的位置间隔。
  18. 根据权利要求17所述的方法,其特征在于,所述位置间隔包括:航向间隔和旁向间隔,所述拍摄重叠率包括:航向重叠率和旁向重叠率,所述根据所述拍摄重叠率、所述相机的地面分辨率和所述无人机的飞行高度,确定每一种所述相对方位关系对应的位置间隔,包括:
    根据所述地面分辨率和所述飞行高度,确定所述相机的画幅参考区域的短边和长边的长度,所述画幅参考区域的形状为矩形;
    根据所述画幅参考区域的短边的长度以及所述航向重叠率,确定每一种所述相对方位关系对应的航向间隔;
    根据所述画幅参考区域的长边的长度以及所述旁向重叠率,确定每一种所述相对方位关系对应的旁向间隔。
  19. 根据权利要求18所述的方法,其特征在于,所述航线包括至少一条单条路径;
    所述根据所述位置间隔在所述作业区域规划航线,包括:
    确定所述作业区域的外接矩形的尺寸;
    根据所述外接矩形的尺寸、所述航向间隔,确定所述航线所需的单条路径的长度;
    根据所述外接矩形的尺寸、所述旁向间隔,确定所述航线所需的单条路径的数量;
    根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述作业区域中规划得到所述航线,所述航线的相邻单条路径之间间隔所述旁向间隔。
  20. 根据权利要求19所述的方法,其特征在于,所述根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述作业区域中规划得到所述航线,包括:
    根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路 径的数量,在所述外接矩形中规划得到初始航线,并确定所述初始航线与所述外接矩形中的作业区域的边界的交点;
    将所述交点沿目标方向移动预设距离值,所述目标方向为与所述交点所在的路径平行的方向,且所述目标方向为朝向所述作业区域内部的方向或背离所述作业区域内部的方向;
    将移动后的交点依次串联连接起来,得到所述航线。
  21. 根据权利要求19所述的方法,其特征在于,所述确定所述作业区域的外接矩形的尺寸,包括:
    根据所述航向,建立所述作业区域的外接矩形,所述外接矩形的长边延伸方向与所述移动方向平行,或与所述外接矩形的长边的法线方向平行的方向与所述移动方向平行;
    确定所述外接矩形的尺寸。
  22. 根据权利要求18所述的方法,其特征在于,所述画幅方向包括:所述画幅参考区域的长边延伸方向,或与所述长边的法线方向平行的方向,或所述画幅参考区域的短边延伸方向,或与所述短边的法线方向平行的方向。
  23. 根据权利要求16所述的方法,其特征在于,所述针对每一种所述相对方位关系,分别确定所述无人机沿所述航向移动以拍摄相邻两张图像时所述无人机的位置间隔,包括:
    获取所述相机相邻两次曝光的最小时间间隔;
    将所述相机相邻两次曝光的最小时间间隔和所述无人机的最大飞行速度的乘积,确定为所述位置间隔。
  24. 根据权利要求15所述的方法,其特征在于,所述任务参数包括所述航线的总长度、所述无人机完成所述航线的预计作业时间、完成所述航线时所述相机的预计拍照数量中的任一种。
  25. 根据权利要求24所述的方法,其特征在于,所述目标任务参数的值为所有任务参数的值中的最小值。
  26. 根据权利要求24所述的方法,其特征在于,所述任务参数包括所述无人机完成所述航线的预计作业时间,
    所述确定所述无人机沿规划的所述航线执行所述拍摄任务时的任务参 数,包括:
    将所述航线的总长度与目标速度的比值,确定为所述预计作业时间;
    其中,在所述无人机的作业速度小于或等于所述无人机的最大移动速度的情况下,所述目标速度为所述作业速度;在所述无人机的作业速度大于所述无人机的最大移动速度的情况下,所述目标速度为所述最大移动速度,所述作业速度为所述无人机按照所述相机相邻两次曝光的最小时间间隔移动时的速度。
  27. 根据权利要求24所述的方法,其特征在于,所述任务参数包括所述无人机完成所述航线的预计拍照数量,
    所述确定所述无人机沿规划的所述航线执行所述拍摄任务时的任务参数,包括:
    将所述航线的总长度与所述航线对应的位置间隔的比值,确定所述预计拍照数量。
  28. 根据权利要求15所述的方法,其特征在于,所述无人机搭载云台,所述云台搭载所述相机;所述根据符合预设任务参数条件的目标任务参数对应的相对方位关系和对应的航线,控制所述无人机执行所述拍摄任务,包括:
    在所述相机的画幅方向和所述无人机的航向之间的当前相对方向关系,与所述目标任务参数对应的相对方位关系不匹配的情况下,控制所述云台带动所述相机旋转,将所述当前相对方向关系调整为所述目标任务参数对应的相对方位关系,并按照所述目标任务参数对应的航线,控制所述无人机执行所述拍摄任务。
  29. 一种设备控制装置,其特征在于,所述装置包括:获取模块和处理模块;
    所述获取模块用于,获取无人机执行拍摄任务的作业区域;
    所述处理模块用于,根据所述无人机的相机的画幅方向和所述无人机的航向之间的相对方向关系,在所述作业区域规划航线;
    确定所述无人机沿所述航线执行所述拍摄任务时的任务参数;
    在所述任务参数不满足预设任务参数条件的情况下,调整所述画幅方向 与所述航向之间的相对方向关系,并重新规划航线。
  30. 根据权利要求29所述的装置,其特征在于,所述处理模块具体用于:
    根据所述相对方向关系,确定所述无人机沿所述航向移动以拍摄相邻两张图像时所述无人机的位置间隔;
    根据所述位置间隔在所述作业区域规划航线。
  31. 根据权利要求30所述的装置,其特征在于,所述处理模块具体用于:
    根据所述相对方向关系,确定所述相机的拍摄重叠率;
    根据所述拍摄重叠率、所述相机的地面分辨率和所述无人机的飞行高度,确定所述位置间隔。
  32. 根据权利要求31所述的装置,其特征在于,所述位置间隔包括:航向间隔和旁向间隔,所述拍摄重叠率包括:航向重叠率和旁向重叠率,所述处理模块具体用于:
    根据所述地面分辨率和所述飞行高度,确定所述相机的画幅参考区域的短边和长边的长度,所述画幅参考区域的形状为矩形;
    根据所述画幅参考区域的短边的长度以及所述航向重叠率,确定所述航向间隔;
    根据所述画幅参考区域的长边的长度以及所述旁向重叠率,确定所述旁向间隔。
  33. 根据权利要求32所述的装置,其特征在于,所述航线包括至少一条单条路径;所述处理模块具体用于:
    确定所述作业区域的外接矩形的尺寸;
    根据所述外接矩形的尺寸、所述航向间隔,确定所述航线所需的单条路径的长度;
    根据所述外接矩形的尺寸、所述旁向间隔,确定所述航线所需的单条路径的数量;
    根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述作业区域中规划得到所述航线,所述航线的相邻单条路径 之间间隔所述旁向间隔。
  34. 根据权利要求33所述的装置,其特征在于,所述处理模块具体用于:
    根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述外接矩形中规划得到初始航线,并确定所述初始航线与所述外接矩形中的作业区域的边界的交点;
    将所述交点沿目标方向移动预设距离值,所述目标方向为与所述交点所在的路径平行的方向,且所述目标方向为朝向所述作业区域内部的方向或背离所述作业区域内部的方向;
    将移动后的交点依次串联连接起来,得到所述航线。
  35. 根据权利要求33所述的装置,其特征在于,所述处理模块具体用于:
    根据所述航向,建立所述作业区域的外接矩形,所述外接矩形的长边延伸方向与所述移动方向平行,或与所述外接矩形的长边的法线方向平行的方向与所述移动方向平行;
    确定所述外接矩形的尺寸。
  36. 根据权利要求32所述的装置,其特征在于,所述画幅方向包括:所述画幅参考区域的长边延伸方向,或与所述长边的法线方向平行的方向,或所述画幅参考区域的短边延伸方向,或与所述短边的法线方向平行的方向。
  37. 根据权利要求29所述的装置,其特征在于,所述处理模块具体用于:
    获取所述相机相邻两次曝光的最小时间间隔;
    将所述相机相邻两次曝光的最小时间间隔和所述无人机的最大飞行速度的乘积,确定为所述位置间隔。
  38. 根据权利要求29所述的装置,其特征在于,所述任务参数包括所述航线的总长度、所述无人机完成所述航线的预计作业时间、完成所述航线时所述相机的预计拍照数量中的任一种。
  39. 根据权利要求38所述的装置,其特征在于,所述在所述任务参数不满足预设的任务参数条件的情况下,所述处理模块具体用于:
    在所述任务参数的值大于或等于所述任务参数对应的任务参数阈值的情况下,确定所述任务参数不满足预设的任务参数条件;
    控制所述相机旋转,得到所述画幅方向与所述航向之间的新的相对方向关系;
    在根据所述新的相对方向关系规划得到新的航线的任务参数的值,小于所述任务参数对应的任务参数阈值的情况下,以供所述无人机根据所述新的相对方向关系和所述新的航线,控制所述无人机执行拍摄任务。
  40. 根据权利要求38所述的装置,其特征在于,所述无人机搭载云台,所述云台搭载所述相机;所述处理模块具体用于:
    控制所述云台带动所述相机旋转,得到所述画幅方向与所述航向之间的新的相对方向关系。
  41. 根据权利要求38所述的装置,其特征在于,所述任务参数包括所述无人机完成所述航线的预计作业时间,所述处理模块具体用于:
    将所述航线的总长度与目标速度的比值,确定为所述预计作业时间;
    其中,在所述无人机的作业速度小于或等于所述无人机的最大移动速度的情况下,所述目标速度为所述作业速度;在所述无人机的作业速度大于所述无人机的最大移动速度的情况下,所述目标速度为所述最大移动速度,所述作业速度为所述无人机按照所述相机相邻两次曝光的最小时间间隔移动时的速度。
  42. 根据权利要求38所述的装置,其特征在于,所述任务参数包括所述无人机完成所述航线的预计拍照数量,所述处理模块具体用于:
    将所述航线的总长度与所述航线对应的位置间隔的比值,确定所述预计拍照数量。
  43. 一种设备控制装置,其特征在于,所述装置包括:获取模块和处理模块;
    所述获取模块用于,获取无人机执行拍摄任务的作业区域;
    所述处理模块用于,在所述无人机的相机在拍摄所述作业区域时的画幅方向和所述无人机的航向之间的多个不同的相对方向关系中,针对每一种所 述相对方向关系在所述作业区域规划航线,并确定所述无人机沿规划的所述航线执行所述拍摄任务时的任务参数;
    确定符合预设任务参数条件的目标任务参数对应的目标相对方位关系和对应的目标航线,所述目标相对方位关系和所述目标航线用于控制所述无人机执行所述拍摄任务。
  44. 根据权利要求43所述的装置,其特征在于,所述处理模块具体用于:
    针对每一种所述相对方向关系,分别确定所述无人机沿所述相对方向关系中的航向移动,以拍摄相邻两张图像时所述无人机的位置间隔;
    根据所述位置间隔在所述作业区域规划航线。
  45. 根据权利要求44所述的装置,其特征在于,所述处理模块具体用于:
    针对每一种所述相对方位关系,分别确定所述相机的拍摄重叠率;
    根据所述拍摄重叠率、所述相机的地面分辨率和所述无人机的飞行高度,确定每一种所述相对方位关系对应的位置间隔。
  46. 根据权利要求45所述的装置,其特征在于,所述位置间隔包括:航向间隔和旁向间隔,所述拍摄重叠率包括:航向重叠率和旁向重叠率,所述处理模块具体用于:
    根据所述地面分辨率和所述飞行高度,确定所述相机的画幅参考区域的短边和长边的长度,所述画幅参考区域的形状为矩形;
    根据所述画幅参考区域的短边的长度以及所述航向重叠率,确定每一种所述相对方位关系对应的航向间隔;
    根据所述画幅参考区域的长边的长度以及所述旁向重叠率,确定每一种所述相对方位关系对应的旁向间隔。
  47. 根据权利要求46所述的装置,其特征在于,所述航线包括至少一条单条路径;所述处理模块具体用于:
    确定所述作业区域的外接矩形的尺寸;
    根据所述外接矩形的尺寸、所述航向间隔,确定所述航线所需的单条路径的长度;
    根据所述外接矩形的尺寸、所述旁向间隔,确定所述航线所需的单条路径的数量;
    根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述作业区域中规划得到所述航线,所述航线的相邻单条路径之间间隔所述旁向间隔。
  48. 根据权利要求47所述的装置,其特征在于,所述处理模块具体用于:
    根据所述旁向间隔、所述航向间隔、所述单条路径的长度和所述单条路径的数量,在所述外接矩形中规划得到初始航线,并确定所述初始航线与所述外接矩形中的作业区域的边界的交点;
    将所述交点沿目标方向移动预设距离值,所述目标方向为与所述交点所在的路径平行的方向,且所述目标方向为朝向所述作业区域内部的方向或背离所述作业区域内部的方向;
    将移动后的交点依次串联连接起来,得到所述航线。
  49. 根据权利要求47所述的装置,其特征在于,所述处理模块具体用于:
    根据所述航向,建立所述作业区域的外接矩形,所述外接矩形的长边延伸方向与所述移动方向平行,或与所述外接矩形的长边的法线方向平行的方向与所述移动方向平行;
    确定所述外接矩形的尺寸。
  50. 根据权利要求46所述的装置,其特征在于,所述画幅方向包括:所述画幅参考区域的长边延伸方向,或与所述长边的法线方向平行的方向,或所述画幅参考区域的短边延伸方向,或与所述短边的法线方向平行的方向。
  51. 根据权利要求44所述的装置,其特征在于,所述处理模块具体用于:
    获取所述相机相邻两次曝光的最小时间间隔;
    将所述相机相邻两次曝光的最小时间间隔和所述无人机的最大飞行速度的乘积,确定为所述位置间隔。
  52. 根据权利要求43所述的装置,其特征在于,所述处理模块具体用 于:所述任务参数包括所述航线的总长度、所述无人机完成所述航线的预计作业时间、完成所述航线时所述相机的预计拍照数量中的任一种。
  53. 根据权利要求52所述的装置,其特征在于,所述目标任务参数的值为所有任务参数的值中的最小值。
  54. 根据权利要求52所述的装置,其特征在于,所述任务参数包括所述无人机完成所述航线的预计作业时间,所述处理模块具体用于:
    将所述航线的总长度与目标速度的比值,确定为所述预计作业时间;
    其中,在所述无人机的作业速度小于或等于所述无人机的最大移动速度的情况下,所述目标速度为所述作业速度;在所述无人机的作业速度大于所述无人机的最大移动速度的情况下,所述目标速度为所述最大移动速度,所述作业速度为所述无人机按照所述相机相邻两次曝光的最小时间间隔移动时的速度。
  55. 根据权利要求52所述的装置,其特征在于,所述任务参数包括所述无人机完成所述航线的预计拍照数量,所述处理模块具体用于:
    将所述航线的总长度与所述航线对应的位置间隔的比值,确定所述预计拍照数量。
  56. 根据权利要求43所述的装置,其特征在于,所述无人机搭载云台,所述云台搭载所述相机;所述处理模块具体用于:
    在所述相机的画幅方向和所述无人机的航向之间的当前相对方向关系,与所述目标任务参数对应的相对方位关系不匹配的情况下,控制所述云台带动所述相机旋转,将所述当前相对方向关系调整为所述目标任务参数对应的相对方位关系,并按照所述目标任务参数对应的航线,控制所述无人机执行所述拍摄任务。
  57. 一种计算机可读存储介质,其特征在于,包括指令,当其在计算机上运行时,使得所述计算机执行权利要求1至28中任一项所述的设备控制方法。
PCT/CN2020/103156 2020-07-21 2020-07-21 设备控制方法、装置及计算机可读存储介质 WO2022016348A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/103156 WO2022016348A1 (zh) 2020-07-21 2020-07-21 设备控制方法、装置及计算机可读存储介质
CN202080042367.3A CN113950610B (zh) 2020-07-21 2020-07-21 设备控制方法、装置及计算机可读存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/103156 WO2022016348A1 (zh) 2020-07-21 2020-07-21 设备控制方法、装置及计算机可读存储介质

Publications (1)

Publication Number Publication Date
WO2022016348A1 true WO2022016348A1 (zh) 2022-01-27

Family

ID=79326127

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/103156 WO2022016348A1 (zh) 2020-07-21 2020-07-21 设备控制方法、装置及计算机可读存储介质

Country Status (2)

Country Link
CN (1) CN113950610B (zh)
WO (1) WO2022016348A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116320774A (zh) * 2023-04-06 2023-06-23 北京四维远见信息技术有限公司 高效利用航摄影像的方法、装置、设备及存储介质
CN117151311A (zh) * 2023-10-31 2023-12-01 天津云圣智能科技有限责任公司 测绘参数的优化处理方法、装置、电子设备及存储介质

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114268742B (zh) * 2022-03-01 2022-05-24 北京瞭望神州科技有限公司 一种天眼芯片处理装置
CN115278074B (zh) * 2022-07-26 2023-05-12 城乡院(广州)有限公司 基于宗地红线的无人机拍摄方法、装置、设备及存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106477038A (zh) * 2016-12-20 2017-03-08 北京小米移动软件有限公司 图像拍摄方法及装置、无人机
CN106887028A (zh) * 2017-01-19 2017-06-23 西安忠林世纪电子科技有限公司 实时显示航拍照片覆盖区域的方法及系统
CN108225318A (zh) * 2017-11-29 2018-06-29 农业部南京农业机械化研究所 基于图像质量的航空遥感路径规划方法及系统
CN109032165A (zh) * 2017-07-21 2018-12-18 广州极飞科技有限公司 无人机航线的生成方法和装置
US20200117197A1 (en) * 2018-10-10 2020-04-16 Parrot Drones Obstacle detection assembly for a drone, drone equipped with such an obstacle detection assembly and obstacle detection method
CN111033419A (zh) * 2018-12-03 2020-04-17 深圳市大疆创新科技有限公司 飞行器的航线规划方法、控制台、飞行器系统及存储介质

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110244765B (zh) * 2019-06-27 2023-02-28 深圳市道通智能航空技术股份有限公司 一种飞行器航线轨迹生成方法、装置、无人机及存储介质

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106477038A (zh) * 2016-12-20 2017-03-08 北京小米移动软件有限公司 图像拍摄方法及装置、无人机
CN106887028A (zh) * 2017-01-19 2017-06-23 西安忠林世纪电子科技有限公司 实时显示航拍照片覆盖区域的方法及系统
CN109032165A (zh) * 2017-07-21 2018-12-18 广州极飞科技有限公司 无人机航线的生成方法和装置
CN108225318A (zh) * 2017-11-29 2018-06-29 农业部南京农业机械化研究所 基于图像质量的航空遥感路径规划方法及系统
US20200117197A1 (en) * 2018-10-10 2020-04-16 Parrot Drones Obstacle detection assembly for a drone, drone equipped with such an obstacle detection assembly and obstacle detection method
CN111033419A (zh) * 2018-12-03 2020-04-17 深圳市大疆创新科技有限公司 飞行器的航线规划方法、控制台、飞行器系统及存储介质

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116320774A (zh) * 2023-04-06 2023-06-23 北京四维远见信息技术有限公司 高效利用航摄影像的方法、装置、设备及存储介质
CN116320774B (zh) * 2023-04-06 2024-03-19 北京四维远见信息技术有限公司 高效利用航摄影像的方法、装置、设备及存储介质
CN117151311A (zh) * 2023-10-31 2023-12-01 天津云圣智能科技有限责任公司 测绘参数的优化处理方法、装置、电子设备及存储介质
CN117151311B (zh) * 2023-10-31 2024-02-02 天津云圣智能科技有限责任公司 测绘参数的优化处理方法、装置、电子设备及存储介质

Also Published As

Publication number Publication date
CN113950610A (zh) 2022-01-18
CN113950610B (zh) 2024-04-16

Similar Documents

Publication Publication Date Title
WO2022016348A1 (zh) 设备控制方法、装置及计算机可读存储介质
CN111006671B (zh) 输电线路精细化巡检智能航线规划方法
WO2020014909A1 (zh) 拍摄方法、装置和无人机
WO2021189456A1 (zh) 无人机巡检方法、装置及无人机
CN107329490B (zh) 无人机避障方法及无人机
WO2020103110A1 (zh) 一种基于点云地图的图像边界获取方法、设备及飞行器
WO2019113966A1 (zh) 一种避障方法、装置和无人机
WO2019104641A1 (zh) 无人机、其控制方法以及记录介质
WO2021212445A1 (zh) 拍摄方法、可移动平台、控制设备和存储介质
US11741571B2 (en) Voronoi cropping of images for post field generation
KR102195051B1 (ko) 드론의 영상 정보를 이용한 공간 정보 생성 시스템 및 방법과, 이를 위한 컴퓨터 프로그램
WO2021037286A1 (zh) 一种图像处理方法、装置、设备及存储介质
CN110337668B (zh) 图像增稳方法和装置
CN107211114A (zh) 跟踪摄影控制装置、跟踪摄影系统、相机、终端装置、跟踪摄影方法及跟踪摄影程序
JP7310811B2 (ja) 制御装置および方法、並びにプログラム
WO2023115342A1 (zh) 一种带状目标的无人机航测方法、装置、系统及存储介质
WO2020237478A1 (zh) 一种飞行规划方法及相关设备
WO2020220158A1 (zh) 一种无人机的控制方法、无人机及计算机可读存储介质
Greatwood et al. Perspective correcting visual odometry for agile mavs using a pixel processor array
WO2020237422A1 (zh) 航测方法、飞行器及存储介质
CN114428510A (zh) 环绕航线修正方法及系统
CN114545963A (zh) 一种优化多无人机全景监控视频的方法、系统及电子设备
WO2021168707A1 (zh) 对焦方法、装置及设备
CN113791640A (zh) 一种图像获取方法、装置、飞行器和存储介质
CN114659501A (zh) 一种对摄像机的参数处理方法、装置及图像处理设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20945783

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20945783

Country of ref document: EP

Kind code of ref document: A1