WO2020001629A1 - Information processing device, flight path generation method, program, and recording medium

Information processing device, flight path generation method, program, and recording medium

Info

Publication number
WO2020001629A1
WO2020001629A1 (PCT/CN2019/093764)
Authority
WO
WIPO (PCT)
Prior art keywords
input line
output curve
point
flight path
generating
Prior art date
Application number
PCT/CN2019/093764
Other languages
English (en)
French (fr)
Inventor
沈思杰
顾磊
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN201980005106.1A (published as CN111226093A)
Publication of WO2020001629A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00 Constructional aspects of UAVs
    • B64U20/80 Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87 Mounting of imaging devices, e.g. mounting of gimbals
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/22 Plotting boards
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 Type of UAV
    • B64U10/10 Rotorcrafts
    • B64U10/13 Flying platforms
    • B64U10/14 Flying platforms with four distinct rotor axes, e.g. quadcopters
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 UAVs characterised by their flight controls
    • B64U2201/20 Remote controls
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 UAVs characterised by their flight controls
    • B64U2201/20 Remote controls
    • B64U2201/202 Remote controls using tethers for connecting to ground station
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U30/00 Means for producing lift; Empennages; Arrangements thereof
    • B64U30/20 Rotors; Rotor supports

Definitions

  • the present disclosure relates to an information processing device, a flight path generation method, a program, and a recording medium that generate a flight path for a flying body.
  • when the flight path is set automatically, the flight path may include a straight path and a curved path.
  • a straight path can be set so that it passes through two points.
  • a curved path can be set so that it becomes a restricted quadratic Bezier curve.
  • a restricted quadratic Bezier curve means a quadratic Bezier curve that passes through two points and is symmetric with respect to the vertical bisector of those two points.
  • when a drone flies along a complicated, winding road, it is difficult to make it fly along the road, and it is difficult to set a complicated curved flight path for it to fly. It is therefore desirable to be able to set the flight path in a shape with fewer restrictions and more freedom.
  • an information processing device generates an output curve whose start point is the start point of an input line representing a first path and whose end point is the end point of the input line, calculates the cumulative value of the distance between each point on the input line and the output curve, and generates the flight path based on the output curve and the cumulative value of the distance.
  • when the cumulative value is less than a first threshold, the processing unit may generate the flight path represented by the output curve.
  • when the cumulative value is greater than or equal to the first threshold, the processing unit may determine a first point having the longest distance among the points on the input line, divide the input line at the first point to generate a first input line portion and a second input line portion, generate a first output curve portion of the output curve whose start point is the start point of the first input line portion and whose end point is the end point of the first input line portion, and generate a second output curve portion of the output curve whose start point is the start point of the second input line portion and whose end point is the end point of the second input line portion.
  • the processing unit may calculate the cumulative value of the distance between each point on the first input line portion and the first output curve portion, and repeat the division of the input line portion and the generation of the output curve portions corresponding to the divided input line portions until the cumulative value of the distance is less than the first threshold.
  • the processing unit may connect a plurality of generated output curve parts to generate a flight path.
  • the processing unit may generate, based on the start point of the input line, the end point of the input line, and an equidistant point, which is a point on the vertical bisector of the start point and the end point and is equidistant from them, an output curve that passes through the start point and the end point of the input line and is symmetric with respect to the vertical bisector; calculate, based on the cumulative value of the distance, a derivative function for moving the equidistant point on the vertical bisector; and determine, based on the calculated value of the derivative function, whether to deform the output curve.
  • when the calculated value of the derivative function is less than a second threshold, the processing unit may keep the output curve unchanged.
  • when the calculated value of the derivative function is greater than or equal to the second threshold, the processing unit may move the equidistant point on the vertical bisector according to the calculated value of the derivative function, and deform the output curve according to the start point of the input line, the end point of the input line, and the moved equidistant point.
  • the processing unit may repeatedly move the equidistant points and deform the output curve until the calculated value of the derivative function is less than the second threshold.
  • the output curve can be a quadratic Bezier curve with the start point of the input line, the end point of the input line, and equidistant points as control points.
  • the information processing apparatus may further include a communication section.
  • the processing unit may transmit the information of the flight path to the flying body via the communication unit.
  • the information processing apparatus may further include a communication section.
  • the processing unit may send the information of the start point of the input line, the end point of the input line, and equidistant points to the flying body via the communication unit.
  • the information processing apparatus may further include a display unit.
  • the processing unit may display information on the flight path via the display unit.
  • the information processing apparatus may further include a display unit.
  • the processing unit may display information of the start point of the input line, the end point of the input line, and equidistant points via the display portion.
  • the information processing device may be a flying object.
  • the processing unit may control the flight of the flying body according to the flight path.
  • a flight path generation method in an information processing device that generates a flight path for the flight of a flying body includes: a step of acquiring an input line representing a first path; a step of generating an output curve whose start point is the start point of the input line and whose end point is the end point of the input line; a step of calculating a cumulative value of the distance between each point on the input line and the output curve; and a step of generating the flight path based on the output curve and the cumulative value of the distance.
  • the step of generating the flight path may include the step of generating the flight path shown by the output curve when the accumulated value is less than the first threshold.
  • the step of generating the flight path may include: when the cumulative value is greater than or equal to the first threshold, a step of determining a first point having the longest distance among the points on the input line; a step of dividing the input line at the first point to generate a first input line portion and a second input line portion; a step of generating a first output curve portion of the output curve whose start point is the start point of the first input line portion and whose end point is the end point of the first input line portion; and a step of generating a second output curve portion of the output curve whose start point is the start point of the second input line portion and whose end point is the end point of the second input line portion.
  • the step of calculating the accumulated value of the distance may include the step of calculating the accumulated value of the distance between each point in the first input line portion and the first output curve portion.
  • the step of generating the flight path may include the step of repeatedly dividing the first input line portion and generating the output curve portion corresponding to the divided first input line portion until the cumulative value of the distance is less than the first threshold value.
  • the step of generating a flight path may include the step of connecting the plurality of output curve portions generated to generate a flight path.
  • the step of generating the output curve may include: a step of generating, based on the start point of the input line, the end point of the input line, and an equidistant point on the vertical bisector of the start point and the end point, an output curve that passes through the start point and the end point of the input line and is symmetric with respect to the vertical bisector; a step of calculating, based on the cumulative value of the distance, a derivative function for moving the equidistant point on the vertical bisector; and a step of determining, based on the calculated value of the derivative function, whether to deform the output curve.
  • the step of generating the output curve may include a step of keeping the output curve unchanged when the calculated value of the derivative function is less than the second threshold.
  • the step of generating the output curve may include: when the calculated value of the derivative function is greater than or equal to the second threshold, a step of moving the equidistant point on the vertical bisector according to the calculated value of the derivative function; and a step of deforming the output curve according to the start point of the input line, the end point of the input line, and the moved equidistant point.
  • the step of generating the output curve may include the steps of repeatedly moving the equidistant points and deforming the output curve until the calculated value of the derivative function is less than the second threshold.
  • the output curve can be a quadratic Bezier curve with the start point of the input line, the end point of the input line, and equidistant points as control points.
  • the flight path generation method may further include the step of transmitting the information of the flight path to the flying body.
  • the method for generating a flight path may further include the step of sending the information of the start point of the input line, the end point of the input line, and equidistant points to the flying body.
  • the flight path generating method may further include the step of displaying information of the flight path.
  • the flight path generating method may further include the step of displaying information of an input line starting point, an input line ending point, and equidistant points.
  • the information processing device may be a flying object.
  • the flight path generation method may further include the step of controlling the flight of the flying body according to the flight path.
  • a program causes an information processing device that generates a flight path for the flight of a flying body to perform: a step of acquiring an input line representing a first path; a step of generating an output curve whose start point is the start point of the input line and whose end point is the end point of the input line; a step of calculating a cumulative value of the distance between each point on the input line and the output curve; and a step of generating the flight path based on the output curve and the cumulative value of the distance.
  • a recording medium is a computer-readable recording medium that records a program for causing an information processing device that generates a flight path for the flight of a flying body to perform the steps described above.
  • FIG. 1 is a schematic diagram showing a first configuration example of a flying body system in the embodiment.
  • FIG. 2 is a schematic diagram showing a second configuration example of the flying body system in the embodiment.
  • FIG. 3 is a diagram showing an example of a specific appearance of an unmanned aircraft.
  • FIG. 4 is a block diagram showing an example of a hardware configuration of an unmanned aircraft.
  • FIG. 5 is a block diagram showing an example of a hardware configuration of a terminal.
  • FIG. 6 is a flowchart showing an operation example when a flight path is generated by a terminal.
  • FIG. 7 is a diagram showing an example of an input line Q that generates a flight path.
  • FIG. 8 is a diagram showing an example of an output curve C generated by a first-time curve fitting.
  • FIG. 9 is a diagram showing an example of an error E between an input line Q and an output curve C.
  • FIG. 10 is a diagram showing an example of the division point K of the input line Q.
  • FIG. 11 is a diagram showing an example of a plurality of input line portions q1 and q2 obtained by performing division at the division point K.
  • FIG. 12 is a diagram showing an example of an output curve portion c1 generated by curve fitting of the input line portion q1.
  • FIG. 13 is a diagram showing an example of an output curve portion c2 generated by curve fitting of the input line portion q2.
  • FIG. 14 is a diagram showing an example of an error E2 between the input line portion q2 and the output curve portion c2.
  • FIG. 15 is a diagram showing an example of the division point K2 of the input line portion q2.
  • FIG. 16 is a diagram showing an example of a plurality of input line portions q2_1 and q2_2 obtained by division at the division point K2.
  • FIG. 17 is a diagram showing a flight path formed by combining output curve portions and control points used in generating each output curve portion c.
  • FIG. 18 is a flowchart showing a specific operation example when performing curve fitting through a terminal.
  • FIG. 19 is a diagram showing an example of the start point P0 of the input line Q, the midpoint as the equidistant point P1, and the end point P2.
  • FIG. 20 is a diagram showing an example of an integrated value D of the distance between each point p on the input line Q and a straight line as the output curve C.
  • FIG. 21 is a diagram showing a setting example of the V axis along the vertical bisector L1.
  • FIG. 22 is a diagram showing an example of a moving distance d of an equidistant point P1 moving on the V axis.
  • FIG. 23 is a diagram showing an example of an equidistant point P1 after moving on the V axis and an output curve C changed according to the equidistant point P1.
  • the flying object is an unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle) as an example.
  • Unmanned aircraft includes aircraft moving in the air.
  • the unmanned aircraft is also marked as "UAV".
  • the information processing device may take a terminal as an example, but may also be another device (for example, a transmitter, a server, or an unmanned aircraft).
  • the flight path generation method specifies operations in the information processing apparatus.
  • the program is, for example, a program that causes the information processing device to perform various processes, and is recorded in a recording medium.
  • FIG. 1 is a schematic diagram showing a first configuration example of the flying body system 10 in the embodiment.
  • the flying body system 10 includes an unmanned aircraft 100 and a terminal 80.
  • the unmanned aircraft 100 and the terminal 80 may communicate with each other through wired communication or wireless communication (for example, wireless LAN (Local Area Network)).
  • the terminal 80 is exemplified as a portable terminal (for example, a smart phone or a tablet terminal).
  • the configuration of the flying body system may include an unmanned aircraft, a transmitter (radio control transmitter), and a portable terminal.
  • the user can use left and right joysticks arranged in front of the transmitter to instruct the control of the flight of the unmanned aircraft.
  • the unmanned aircraft, the transmitter, and the portable terminal can communicate with each other through wired communication or wireless communication.
  • FIG. 2 is a schematic diagram showing a second configuration example of the flying body system 10 in the embodiment.
  • in the second configuration example, the terminal 80 is a PC.
  • even when the terminal 80 is a PC, it may have the same functions as the portable terminal described above.
  • FIG. 3 is a diagram showing an example of a specific appearance of the unmanned aircraft 100.
  • a perspective view of the unmanned aircraft 100 when flying in the moving direction STV0 is shown.
  • the unmanned aircraft 100 is an example of a moving body.
  • the roll axis is set to a direction parallel to the ground and along the moving direction STV0 (refer to the x-axis).
  • the pitch axis is set to a direction parallel to the ground and perpendicular to the roll axis (refer to the y-axis).
  • the yaw axis is set to be perpendicular to the ground and perpendicular to the roll axis and the pitch axis (see z axis).
  • the unmanned aerial vehicle 100 is configured to include a UAV body 102, a gimbal 200, an imaging unit 220, and a plurality of imaging units 230.
  • the UAV body 102 includes a plurality of rotors (propellers).
  • the UAV body 102 controls the rotation of a plurality of rotors to fly the unmanned aircraft 100.
  • the UAV body 102 uses, for example, four rotors to fly the unmanned aircraft 100.
  • the number of rotors is not limited to four.
  • the unmanned aircraft 100 may be a fixed-wing aircraft without a rotor.
  • the imaging unit 220 may be an imaging camera that captures a subject included in a desired imaging range of an aerial photography target (for example, an aerial view of the ground, a landscape such as a mountain or a river, or a building).
  • the plurality of imaging units 230 may be a sensing camera that captures the surroundings of the drone 100 in order to control the flight of the drone 100.
  • the two camera units 230 may be provided on the nose of the unmanned aircraft 100, that is, on the front side. Furthermore, the other two imaging units 230 may be provided on the bottom surface of the drone 100.
  • the two image pickup units 230 on the front side may be paired and function as a so-called stereo camera.
  • the two imaging units 230 on the bottom surface side may be paired to function as a stereo camera.
  • three-dimensional space data (three-dimensional shape data) of the surroundings of the unmanned aircraft 100 may be generated based on the images captured by the plurality of imaging units 230.
  • the number of imaging units 230 included in the unmanned aerial vehicle 100 is not limited to four.
  • the unmanned aerial vehicle 100 only needs to include at least one imaging unit 230.
  • the unmanned aircraft 100 may include at least one camera 230 on the nose, tail, side, bottom, and top surfaces of the unmanned aircraft 100, respectively.
  • the angle of view settable in the imaging section 230 may be greater than the angle of view settable in the imaging section 220.
  • the imaging unit 230 may include a single focus lens or a fisheye lens.
  • FIG. 4 is a block diagram showing an example of a hardware configuration of the unmanned aircraft 100.
  • the unmanned aircraft 100 includes a UAV control unit 110, a communication interface 150, a memory 160, a memory 170, a gimbal 200, a rotor mechanism 210, an imaging unit 220, imaging units 230, a GPS receiver 240, an inertial measurement unit (IMU: Inertial Measurement Unit) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser measuring instrument 290.
  • the UAV control unit 110 is composed of, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor).
  • the UAV control unit 110 performs signal processing for overall control of operations of each part of the unmanned aerial vehicle 100, input / output processing of data with other parts, calculation processing of data, and data storage processing.
  • the UAV control unit 110 controls the flight of the unmanned aircraft 100 in accordance with a program stored in the memory 160.
  • the UAV control unit 110 may control flight.
  • the UAV control unit 110 can take aerial images.
  • the UAV control unit 110 acquires position information indicating the position of the unmanned aircraft 100.
  • the UAV control unit 110 may obtain position information indicating the latitude, longitude, and altitude where the unmanned aircraft 100 is located from the GPS receiver 240.
  • the UAV control unit 110 may obtain latitude and longitude information indicating the latitude and longitude of the unmanned aircraft 100 from the GPS receiver 240 and altitude information indicating the altitude of the unmanned aircraft 100 from the barometric altimeter 270 as position information.
  • the UAV control unit 110 may obtain the distance between the radiation point of the ultrasonic wave generated by the ultrasonic sensor 280 and the reflection point of the ultrasonic wave as height information.
  • the UAV control unit 110 may acquire orientation information indicating the orientation of the unmanned aircraft 100 from the magnetic compass 260.
  • the orientation information may be expressed in an orientation corresponding to the orientation of the nose of the unmanned aircraft 100, for example.
  • the UAV control unit 110 can acquire position information indicating the position where the unmanned aerial vehicle 100 should exist when the imaging unit 220 captures an imaging range that should be captured.
  • the UAV control unit 110 may obtain position information indicating a position where the unmanned aircraft 100 should exist from the memory 160.
  • the UAV control unit 110 may obtain position information indicating a position where the unmanned aerial vehicle 100 should exist from another device via the communication interface 150.
  • the UAV control unit 110 may refer to the three-dimensional map database to specify a position where the unmanned aircraft 100 can exist, and obtain the position as position information indicating a position where the unmanned aircraft 100 should exist.
  • the UAV control unit 110 can acquire imaging range information indicating the respective imaging ranges of the imaging unit 220 and the imaging unit 230.
  • the UAV control unit 110 may obtain angle information indicating the angles of view of the imaging unit 220 and the imaging unit 230 from the imaging unit 220 and the imaging unit 230 as parameters for specifying an imaging range.
  • the UAV control section 110 may acquire information indicating the imaging directions of the imaging section 220 and the imaging section 230 as parameters for specifying the imaging range.
  • the UAV control unit 110 may acquire, for example, posture information indicating the posture state of the imaging unit 220 from the gimbal 200 as the information indicating the imaging direction of the imaging unit 220.
  • the posture information of the imaging unit 220 may indicate the rotation angle of the gimbal 200 from the reference rotation angles of the pitch axis and the yaw axis.
  • the UAV control unit 110 may acquire position information indicating the position where the unmanned aircraft 100 is located as a parameter for specifying an imaging range.
  • the UAV control unit 110 may acquire imaging range information by generating imaging range information indicating the geographic range captured by the imaging unit 220, based on the angles of view and imaging directions of the imaging unit 220 and the imaging unit 230 and the position of the unmanned aircraft 100.
  • the UAV control unit 110 may acquire imaging range information from the memory 160.
  • the UAV control unit 110 can acquire imaging range information via the communication interface 150.
  • the UAV control unit 110 controls the gimbal 200, the rotor mechanism 210, the imaging unit 220, and the imaging unit 230.
  • the UAV control unit 110 may control the imaging range of the imaging unit 220 by changing the imaging direction or viewing angle of the imaging unit 220.
  • the UAV control unit 110 may control the imaging range of the imaging unit 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
  • the imaging range refers to a geographic range captured by the imaging section 220 or the imaging section 230.
  • the camera range is defined by latitude, longitude, and altitude.
  • the imaging range may be a range of three-dimensional spatial data defined by latitude, longitude, and height.
  • the imaging range may be a range of two-dimensional spatial data defined by latitude and longitude.
  • the imaging range can be specified according to the angle of view and imaging direction of the imaging unit 220 or the imaging unit 230 and the position where the unmanned aerial vehicle 100 is located.
  • the imaging directions of the imaging section 220 and the imaging section 230 may be defined according to the azimuth and depression angle of the front face of the imaging lens where the imaging section 220 and the imaging section 230 are provided.
  • the imaging direction of the imaging unit 220 may be a direction specified according to the orientation of the nose of the unmanned aircraft 100 and the posture state of the imaging unit 220 relative to the gimbal 200.
  • the imaging direction of the imaging section 230 may be a direction specified according to the orientation of the nose of the unmanned aircraft 100 and the position where the imaging section 230 is provided.
  • the UAV control unit 110 may specify a surrounding environment of the unmanned aircraft 100 by analyzing a plurality of images captured by the plurality of imaging units 230.
  • the UAV control unit 110 may control the flight according to the surrounding environment of the unmanned aircraft 100, for example, avoiding obstacles.
  • the UAV control unit 110 can acquire stereo information (three-dimensional information) indicating a stereo shape (three-dimensional shape) of an object existing around the unmanned aircraft 100.
  • the object may be, for example, a part of a landscape such as a building, a road, a vehicle, or a tree.
  • the stereo information is, for example, three-dimensional spatial data.
  • the UAV control unit 110 may generate stereo information indicating a stereo shape of an object existing around the unmanned aerial vehicle 100 from each of the images obtained by the plurality of imaging units 230 to obtain the stereo information.
  • the UAV control unit 110 may acquire stereoscopic information indicating a stereoscopic shape of an object existing around the drone 100 by referring to a three-dimensional map database stored in the memory 160 or the memory 170.
  • the UAV control unit 110 may obtain stereoscopic information related to the stereoscopic shape of an object existing around the unmanned aerial vehicle 100 by referring to a three-dimensional map database managed by a server existing on the network.
  • the UAV control unit 110 controls the flight of the unmanned aircraft 100 by controlling the rotor mechanism 210. That is, the UAV control unit 110 controls the rotor mechanism 210 to control the position including the latitude, longitude, and altitude of the unmanned aircraft 100.
  • the UAV control unit 110 may control the imaging range of the imaging unit 220 by controlling the flight of the unmanned aircraft 100.
  • the UAV control unit 110 may control a viewing angle of the imaging unit 220 by controlling a zoom lens included in the imaging unit 220.
  • the UAV control unit 110 may use the digital zoom function of the imaging unit 220 to control the angle of view of the imaging unit 220 through digital zoom.
  • the UAV control unit 110 can move the unmanned aircraft 100 to a specified position at a specified date and time, so that the imaging unit 220 can capture the desired imaging range under a desired environment.
  • the communication interface 150 communicates with the terminal 80.
  • the communication interface 150 can perform wireless communication through any wireless communication method.
  • the communication interface 150 can perform wired communication by using any wired communication method.
  • the communication interface 150 may transmit the aerial image and the additional information (metadata) related to the aerial image to the terminal 80.
  • the communication interface 150 may acquire the flight control instruction information from the terminal 80.
  • the instruction information of the flight control may include information such as a flight path for the flight of the unmanned aircraft 100, a flight point (Waypoint) for generating the flight path, a control point as a basis for generating the flight path, and the like.
  • the memory 160 stores the UAV control unit 110 to the gimbal 200, the rotor mechanism 210, the camera unit 220, the camera unit 230, the GPS receiver 240, the inertial measurement device 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measuring instrument 290 Programs and the like required for control.
  • the memory 160 may be a computer-readable recording medium and may include at least one of SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), and USB (Universal Serial Bus) memory.
  • the memory 170 may include at least one of a HDD (Hard Disk Drive), an SSD (Solid State Drive), an SD memory card, a USB memory, and other memories.
  • the memory 170 can store various information and various data.
  • the memory 170 can be detached from the unmanned aircraft 100.
  • the memory 170 may record aerial images.
  • the gimbal 200 may rotatably support the imaging unit 220 around a yaw axis, a pitch axis, and a roll axis.
  • the gimbal 200 can rotate the imaging unit 220 around at least one of a yaw axis, a pitch axis, and a roll axis, thereby changing the imaging direction of the imaging unit 220.
  • the rotor mechanism 210 includes a plurality of rotors and a plurality of drive motors for rotating the rotors.
  • the rotor mechanism 210 is controlled to rotate by the UAV control unit 110 to fly the unmanned aircraft 100.
  • the imaging unit 220 captures a subject within a desired imaging range and generates data of a captured image.
  • the image data (for example, aerial image) obtained by the imaging of the imaging unit 220 may be stored in a memory or the memory 170 of the imaging unit 220.
  • the imaging unit 230 captures the surroundings of the drone 100 and generates data of a captured image.
  • the image data of the imaging unit 230 may be stored in the memory 170.
  • the GPS receiver 240 receives a plurality of signals indicating the time transmitted from a plurality of navigation satellites (ie, GPS satellites) and the position (coordinates) of each GPS satellite.
  • the GPS receiver 240 calculates the position of the GPS receiver 240 (that is, the position of the unmanned aircraft 100) based on the received multiple signals.
  • the GPS receiver 240 outputs the position information of the unmanned aircraft 100 to the UAV control section 110.
  • the UAV control unit 110 may calculate the position information of the GPS receiver 240 instead of the GPS receiver 240. In this case, the UAV control unit 110 receives information indicating the time and the position of each GPS satellite included in the plurality of signals received by the GPS receiver 240.
  • the inertial measurement device 250 detects the attitude of the unmanned aerial vehicle 100 and outputs the detection result to the UAV control unit 110.
  • the inertial measurement device 250 can detect, as the attitude of the unmanned aircraft 100, the accelerations in the three axial directions of front-rear, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw.
  • the magnetic compass 260 detects the heading of the drone 100 and outputs the detection result to the UAV control unit 110.
  • the barometric altimeter 270 detects the flying height of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
  • the ultrasonic sensor 280 transmits ultrasonic waves, detects ultrasonic waves reflected from the ground and objects, and outputs the detection results to the UAV control unit 110.
  • the detection result may show the distance from the unmanned aircraft 100 to the ground, that is, the altitude.
  • the detection result may show the distance from the unmanned aircraft 100 to an object (subject).
  • the laser measuring instrument 290 irradiates the object with laser light, receives the reflected light reflected by the object, and measures the distance between the unmanned aerial vehicle 100 and the object (subject) by the reflected light.
  • a time-of-flight method may be used.
  • FIG. 5 is a block diagram showing an example of a hardware configuration of the terminal 80.
  • the terminal 80 includes a terminal control section 81, an operation section 83, a communication section 85, a memory 87, a display section 88, and a memory 89.
  • the terminal 80 may be held by a user who wishes to instruct flight control of the unmanned aircraft 100.
  • the terminal control unit 81 is configured using, for example, a CPU, an MPU, or a DSP.
  • the terminal control section 81 performs signal processing for overall control of operations of each section of the terminal 80, data input and output processing with other sections, data calculation processing, and data storage processing.
  • the terminal control unit 81 can acquire data and information from the unmanned aircraft 100 via the communication unit 85.
  • the terminal control unit 81 may acquire data and information input via the operation unit 83.
  • the terminal control unit 81 may acquire data and information stored in the memory 87.
  • the terminal control unit 81 may transmit data and information to the unmanned aircraft 100 via the communication unit 85.
  • the terminal control unit 81 may send data and information to the display unit 88 and cause the display unit 88 to display display information based on the data and information.
  • the information displayed on the display unit 88 and the information transmitted to the unmanned aircraft 100 via the communication unit 85 may include a flight path along which the unmanned aircraft 100 flies, waypoints for generating the flight path, control points serving as a basis for generating the flight path, and other information.
  • the terminal control unit 81 may execute an application program for generating a flight path.
  • the terminal control unit 81 may generate various data used in the application.
  • the operation unit 83 receives and acquires data and information input by a user of the terminal 80.
  • the operation unit 83 may include input devices such as buttons, keys, a touch display screen, and a microphone.
  • the operation section 83 and the display section 88 are constituted by a touch display screen.
  • the operation section 83 may accept a touch operation, a click operation, a drag operation, and the like.
  • the communication unit 85 performs wireless communication with the unmanned aircraft 100 through various wireless communication methods.
  • the wireless communication method of the wireless communication may include, for example, communication via a wireless LAN, Bluetooth (registered trademark), or a public wireless network.
  • the communication unit 85 can perform wired communication using any wired communication method.
  • the memory 87 may include, for example, a ROM that stores a program defining the operation of the terminal 80 and data of set values, and a RAM that temporarily stores various information and data used when the terminal control unit 81 performs processing.
  • the memory 87 may include a memory other than a ROM and a RAM.
  • the memory 87 may be provided inside the terminal 80.
  • the memory 87 may be detachably provided in the terminal 80.
  • the program may include an application program.
  • the display unit 88 is configured by, for example, an LCD (Liquid Crystal Display), and displays various information and data output from the terminal control unit 81.
  • the display unit 88 can display various data and information related to execution of the application.
  • the memory 89 accumulates and stores various data and information.
  • the memory 89 may be an HDD, an SSD, an SD card, a USB memory, or the like.
  • the memory 89 may be provided inside the terminal 80.
  • the memory 89 may be detachably provided in the terminal 80.
  • the memory 89 may store aerial images and additional information acquired from the unmanned aircraft 100. Additional information may be stored in the memory 87.
  • the processing performed by the terminal 80 may be performed by the transmitter. Since the transmitter has the same constituent parts as the terminal 80, it will not be described in detail.
  • the transmitter includes a control section, an operation section, a communication section, a display section, a memory, and the like. When the flying body system 10 has a transmitter, the terminal 80 may not be provided.
  • the terminal control section 81 is an example of a processing section.
  • the terminal control unit 81 can generate a flight path corresponding to a complicated path by performing processing related to the generation of the flight path.
  • the terminal control unit 81 acquires a path FR1 where the unmanned aircraft 100 is expected to fly.
  • the path FR1 may be a path of any shape, a path of a complicated shape, or a path that is difficult for the unmanned aircraft 100 to fly smoothly.
  • the terminal control unit 81 can accept a user operation via the operation unit 83, and generate and acquire the route FR1 based on the user operation.
  • the terminal control unit 81 can acquire the information of the path FR1 stored in advance from the memory 87 and the like (the memory 87 and the memory 89).
  • the terminal control unit 81 can access the external map server storing the map information via the communication unit 85, send identification information for identifying the route FR1, and receive and acquire the information of the route FR1.
  • the path FR1 can be represented by an input line Q.
  • the terminal control unit 81 generates the output curve C by using the start point P0 and the end point P2 of the input line Q as the start point P0 and the end point P2 of the output curve C having a predetermined shape.
  • the output curve C may be a curve having a more simplified shape than the input line Q (path FR1), and may be at least a part of a path that the unmanned aircraft 100 can fly smoothly.
  • the output curve C may be a quadratic Bezier curve or other curves.
  • the output curve C is mainly illustrated as a quadratic Bezier curve. Generating an output curve C based on an input line Q or an input line portion q described later is also referred to as curve fitting.
  • the quadratic Bezier curve can be expressed by, for example, the following formula (0): B(t) = (1 - t)^2 * P0 + 2(1 - t)t * P1 + t^2 * P2, with 0 ≤ t ≤ 1 ... (0)
  • t is a parameter.
  • B (t) is a point on a quadratic Bezier curve.
  • P0, P1, and P2 are control points for generating a quadratic Bezier curve.
  • P0 and P2 in the formula (0) correspond to the start point P0 and the end point P2 of the output curve C.
  • P1 in formula (0) corresponds to the equidistant point P1, a point at equal distances from the start point P0 and the end point P2.
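  • as a reference for the formulas that follow, the sketch below shows one way formula (0) can be evaluated numerically. It is a minimal Python illustration, not code from the patent; the function names (bezier_point, sample_output_curve), the use of NumPy, and the choice of evenly spaced parameter samples are assumptions made for this sketch.

        import numpy as np

        def bezier_point(p0, p1, p2, t):
            """Evaluate the quadratic Bezier curve B(t) of formula (0) for control
            points P0, P1, P2 and a parameter t in [0, 1]."""
            p0, p1, p2 = (np.asarray(p, dtype=float) for p in (p0, p1, p2))
            return (1.0 - t) ** 2 * p0 + 2.0 * (1.0 - t) * t * p1 + t ** 2 * p2

        def sample_output_curve(p0, p1, p2, n=50):
            """Sample n points along the output curve C; the samples are used later to
            approximate distances between the input line Q and the curve."""
            ts = np.linspace(0.0, 1.0, n)
            return np.stack([bezier_point(p0, p1, p2, t) for t in ts])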
  • the terminal control unit 81 calculates a difference (error E) between the input line Q and the output curve C.
  • the error E can be expressed by, for example, the following formula (1): E = ∫ dist(p, C) dp ... (1), where p moves over the input line Q and dist(p, C) is the distance between the point p and the output curve C.
  • the error E shown in formula (1) can be expressed by, for example, the area of the dotted region in FIG. 9.
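  • the integral of formula (1) can be approximated by sampling both lines, as in the following sketch. It assumes 2D points and approximates dist(p, C) by the distance from p to the nearest sampled point of C; the discrete sum, the sampling density, and the name fitting_error are choices made for this illustration, not details taken from the patent.

        import numpy as np

        def fitting_error(input_pts, curve_pts):
            """Approximate the error E of formula (1): the cumulative value of the
            distance dist(p, C) over the sampled points p of the input line Q."""
            input_pts = np.asarray(input_pts, dtype=float)
            curve_pts = np.asarray(curve_pts, dtype=float)
            # Pairwise distances: rows are input points p, columns are samples of C.
            d = np.linalg.norm(input_pts[:, None, :] - curve_pts[None, :, :], axis=-1)
            return float(d.min(axis=1).sum())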
  • the terminal control unit 81 specifies the division point K, which is the point on the input line Q that is farthest from the output curve C. That is, it calculates the position of p where the value of dist(p, C) is the largest.
  • the division point K is not limited to the longest distance, and may be any one of a plurality of positions with a distance greater than or equal to the threshold th4.
  • the terminal control unit 81 divides the input line Q into two input line portions q1 and q2 at the position of the division point K.
  • the division point K is a point serving as a reference for dividing the input line Q.
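  • a possible way of locating the division point K and splitting the input line Q is sketched below. Keeping the division point K in both portions so that q1 and q2 remain connected is an assumption of this sketch, and the names find_division_point and divide_input_line are illustrative, not from the patent.

        import numpy as np

        def find_division_point(input_pts, curve_pts):
            """Index of the division point K: the point p on the input line Q whose
            distance dist(p, C) to the output curve C is the largest."""
            input_pts = np.asarray(input_pts, dtype=float)
            curve_pts = np.asarray(curve_pts, dtype=float)
            dists = np.linalg.norm(input_pts[:, None, :] - curve_pts[None, :, :], axis=-1).min(axis=1)
            return int(np.argmax(dists))

        def divide_input_line(input_pts, k):
            """Divide the input line Q at the division point K into the two input line
            portions q1 and q2; K is shared so that the portions stay connected."""
            return input_pts[: k + 1], input_pts[k:]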
  • the terminal control unit 81 may perform curve fitting on the two input line portions q1 and q2 to generate output curve portions c1 and c2 based on the input line portions q1 and q2.
  • the terminal control unit 81 may repeatedly perform separation and curve fitting of the input line portion q to generate input line portions q3, q4, q5, ..., and output curve portions c3, c4, ....
  • the terminal control section 81 may divide the input line Q using one or more division points, generate a plurality of input line portions q, and generate a plurality of output curve portions c corresponding to the plurality of input line portions q.
  • the plurality of output curve portions c actually become the basis of the flight path FR2 for the unmanned aircraft 100 to fly.
  • the output curve portion c can also be said to be a part of the output curve C.
  • the matters described for the input line Q and the output curve C also apply to the input line portion q and the output curve portion c.
  • the starting point P0 and the ending point P2 of the input line portion q may be used as the starting point P0 and the ending point P2 to generate the output curve portion c.
  • the input line portion q may be divided at the division point K, thereby generating a plurality of input line portions q.
  • the unmanned aircraft 100 can fly on the flight path FR2 whose shape is close to the desired path FR1.
  • the terminal control unit 81 may generate the flight path FR2 based on the output curve C and the plurality of output curve portions c. For example, the terminal control unit 81 may use the output curve C as the flight path FR2. The terminal control unit 81 may connect adjacent output curve portions c of the plurality of output curve portions c to each other to generate a flight path FR2. In this case, the terminal control unit 81 connects the respective start points P0 and end points P2 of the adjacent output curve portions c to generate the flight path FR2. The unmanned aircraft 100 flies according to the generated flight path FR2. Therefore, the start point P0 and the end point P2 of the output curve C and the output curve portion c become the flight positions through which the unmanned aircraft 100 passes. This flight position is also called a waypoint.
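  • connecting the output curve portions c as described above amounts to collecting their shared start and end points as the waypoints WP, roughly as in the minimal sketch below. It assumes the fitted portions are already in path order and that each portion's end point P2 coincides with the next portion's start point P0; the helper name flight_path_waypoints is illustrative, not from the patent.

        def flight_path_waypoints(segments):
            """Connect adjacent output curve portions c into one flight path FR2 and
            return the waypoints WP (the shared start points P0 and end points P2).

            `segments` is a list of (P0, P1, P2) control-point triples in path order.
            """
            if not segments:
                return []
            waypoints = [segments[0][0]]                      # start point of the first portion
            waypoints += [p2 for (_p0, _p1, p2) in segments]  # end point of every portion
            return waypoints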
  • the terminal control unit 81 may store the fitting result of the curve fitting in the memory 87 or the like.
  • the terminal control unit 81 may display the fitting result via the display unit 88.
  • the terminal control section 81 may transmit the fitting result to the unmanned aircraft 100 via the communication section 85.
  • the fitting result may include information of each way point WP and information of the flight path FR2.
  • the flight path FR2 can be represented by a combination of an output curve C and a plurality of output curve portions c.
  • the fitting result may include control points (starting point P0, equidistant point P1, and ending point P2) that are the basis of the generation of the output curve C and the output curve portion c.
  • the terminal 80 can confirm the position of the flight path FR2 and the path point WP derived from the terminal 80 by displaying the fitting result.
  • the terminal 80 can notify the unmanned aircraft 100 of the fitting result, so that the unmanned aircraft 100 can fly according to the fitted flight path FR2.
  • the terminal control unit 81 sets an equidistant point P1 equidistant from the start point P0 and the end point P2 of the input line Q as an initial setting.
  • the equidistant point P1 as an initial value may be the center point of the starting point P0 and the ending point P2, that is, the midpoint.
  • These starting points P0, equidistant points P1, and ending points P2 are the three control points in the quadratic Bezier curve as the output curve C.
  • with the equidistant point P1 at this initial midpoint position, the quadratic Bezier curve is a straight line.
  • the equidistant point P1 is located on a vertical bisector L1 that bisects an imaginary line connecting the start point P0 and the end point P2.
  • the terminal control unit 81 may calculate an integrated value D of the distance between the point p on the input line Q and the output curve C.
  • the accumulated value D can be expressed by, for example, the following formula (2): D = ∫ dist(p, C) dp ... (2)
  • the point p can be moved arbitrarily on the input line Q.
  • the terminal control unit 81 calculates a derivative function for deriving the moving distance d of the equidistant point P1 on the V axis.
  • the derivative function can be expressed by, for example, the following formula (3): the derivative dD/dv of the accumulated value D of the distance with respect to the position v on the V axis ... (3)
  • v is a parameter.
  • the value of the derivative function ⁇ (the calculated value of the derivative function) is the moving distance d.
  • the derivative function represents the differential value of the accumulated value D of the distance, that is, the amount by which the accumulated value D changes when the equidistant point is moved slightly in the V-axis direction.
  • when the equidistant point P1 is at its initial value, the midpoint between the start point P0 and the end point P2, v = 0.
  • the derivative function is a value obtained by differentiating the accumulated value D of the distance with respect to the variable v.
  • the value of the derivative function is determined by the position on the V axis. Once the position v on the V axis is determined, the terminal control unit 81 can calculate the value of the derivative function at that position, and thus the value of the derivative function at the equidistant point P1.
  • the terminal control unit 81 may move the equidistant point P1 on the V axis by only a moving distance d in the positive direction or the negative direction, so that the cumulative value D of the distance becomes small (for example, minimized).
  • the equidistant point P1 can be moved on the V axis by a moving distance d that includes the moving direction: d is the value of the derivative function, and a positive value means the positive direction of the V axis while a negative value means the negative direction of the V axis.
  • the threshold value th1 is also referred to as a specified fluctuation value.
  • the terminal control unit 81 may calculate the output curve C as a quadratic Bezier curve whose control points are the three points: the start point P0, the equidistant point P1 after being moved by the moving distance d, and the end point P2.
  • the current output curve C is closer to the shape of the original input line Q than the previous output curve C. Therefore, the terminal 80 can make the unmanned aircraft 100 fly along the flight path FR2 having a shape closer to the desired path FR1.
  • when the calculated value of the derivative function is less than the threshold th1, the terminal control unit 81 may omit the movement of the equidistant point P1 and omit the calculations related to further curve fitting. As a result, the terminal 80 can shorten the time required for curve fitting, and thus the time required for generating the flight path FR2.
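  • a single round of this curve fitting, i.e. moving the equidistant point P1 along the V axis until the value of the derivative function falls below the threshold th1, could look like the sketch below. It reuses sample_output_curve and fitting_error from the earlier sketches, estimates the derivative by a finite difference, and shrinks the moving distance whenever a move would not reduce the accumulated value D; these numerical choices and the 2D-only V-axis construction are assumptions of the sketch rather than details from the patent.

        import numpy as np

        def fit_segment(input_pts, th1=1e-6, max_iter=100):
            """Fit one output curve C to an input line Q (or input line portion q):
            P0 and P2 are fixed to the ends of Q, and the equidistant point P1 is moved
            along the V axis (the vertical bisector L1 of P0P2) so that the accumulated
            distance D decreases. Returns the control points (P0, P1, P2)."""
            input_pts = np.asarray(input_pts, dtype=float)
            p0, p2 = input_pts[0], input_pts[-1]
            chord = p2 - p0
            length = np.linalg.norm(chord)
            mid = (p0 + p2) / 2.0
            if length == 0.0:                        # degenerate portion: nothing to fit
                return p0, mid, p2
            v_axis = np.array([-chord[1], chord[0]]) / length  # unit vector of the V axis (2D)

            def accumulated_d(v):
                """Accumulated value D of formula (2) with P1 at position v on the V axis."""
                curve = sample_output_curve(p0, mid + v * v_axis, p2)
                return fitting_error(input_pts, curve)

            v, step = 0.0, length / 4.0              # P1 starts at the midpoint (v = 0)
            for _ in range(max_iter):
                eps = 1e-4 * length
                # Finite-difference estimate of the derivative function of formula (3).
                phi = (accumulated_d(v + eps) - accumulated_d(v - eps)) / (2.0 * eps)
                if abs(phi) < th1:                   # below threshold th1: keep the curve unchanged
                    break
                candidate = v - step * np.sign(phi)  # move P1 so that D becomes smaller
                if accumulated_d(candidate) < accumulated_d(v):
                    v = candidate
                else:
                    step *= 0.5                      # the move overshot: reduce the moving distance
            return p0, mid + v * v_axis, p2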
  • the terminal control unit 81 may store the curve-related information related to the derived output curve C in the memory 87 or the like.
  • the terminal control unit 81 may display the curve-related information via the display unit 88.
  • the curve related information may include the derived output curve C and three control points (starting point P0, equidistant point P1, and ending point P2) for deriving the output curve C.
  • FIG. 6 is a flowchart showing an operation example when the flight path FR2 is generated by the terminal 80.
  • the terminal control unit 81 acquires an input line Q indicating a path FR1 in which the unmanned aircraft 100 is desired to fly (S11).
  • FIG. 7 is a diagram showing an example of an input line Q that generates a flight path FR2.
  • the terminal control unit 81 uses the starting point P0 and the ending point P2 of the input line Q as the starting point P0 and the ending point P2 of the output curve C, and generates an output curve C as a quadratic Bezier curve (S12). That is, the terminal control unit 81 generates an output curve C based on the input line Q by curve fitting.
  • FIG. 8 is a diagram showing an example of an output curve C generated by a first-time curve fitting.
  • the terminal control unit 81 calculates an error E which is a difference between the input line Q and the output curve C. With respect to the error E, the terminal control section 81 acquires a specified error Es as a threshold value th2 to be compared with the error E. The terminal control unit 81 determines whether the error E is smaller than the specified error Es (S13).
  • FIG. 9 is a diagram showing an example of an error E between an input line Q and an output curve C.
  • the error E is represented by the cumulative value of the distance between each point p of the input line Q and the output curve C; the area indicated by the diagonal dotted lines in FIG. 9 corresponds to the error E.
  • the designation error Es may be stored in the memory 87 or the like and acquired from the memory 87 or the like, or may be input and acquired by a user operation via the operation unit 83.
  • the specified error Es may be a fixed value or a variable value.
  • the specified error Es is an index of how closely the shape of the output curve C approximates the shape of the input line Q.
  • the specified error Es can be set based on, for example, the geographic characteristics of the flight path FR2 represented by the output curve C (e.g., areas that are difficult to fly over because of terrain, buildings, or wind) and the planned flight characteristics of the unmanned aircraft 100 (e.g., its flight speed).
  • the terminal control unit 81 stores the fitting result of the curve fitting (for example, the flight path FR2, each path point WP, and each control point) in the memory 87 and the like (S14).
  • when the error E is smaller than the specified error Es, the terminal control unit 81 may determine that the shape of the input line Q and the shape of the output curve C are sufficiently similar for the unmanned aircraft 100 to fly along the output curve C. That is, although the quadratic Bezier curve serving as the output curve C differs from the actual input line, the difference is within the allowable range. The terminal control unit 81 therefore stores the information of the output curve C and of each control point (for example, the start point P0, the equidistant point P1, and the end point P2) used to generate the output curve C.
  • the terminal control unit 81 calculates the point that is farthest from the output curve C among the points of the input line Q (S15).
  • the point p on the input line Q whose calculated distance to the output curve C is the largest is the division point K.
  • FIG. 10 is a diagram showing an example of the division point K of the input line Q.
  • the terminal control unit 81 can determine that the shape of the input line Q and the shape of the output curve C are not sufficiently approximated. In this case, the terminal control section 81 improves the output curve C so that the error E is smaller than the specified error Es, and brings the shape of the output curve C closer to the shape of the input line Q.
  • the terminal control unit 81 divides the input line Q at the position of the division point K, and generates input line portions q1 and q2 as two curves (S16).
  • FIG. 11 is a diagram showing an example of a plurality of input line portions q1 and q2 obtained by performing division at the division point K.
  • the terminal control unit 81 then proceeds to the process of S11. That is, the terminal control unit 81 performs the curve fitting from S11 onward on the input line portions q generated in S16 (the input line portions q1 and q2 in the first iteration).
  • the terminal control unit 81 calculates the error between each input line portion q and its output curve portion c as the error E2, and compares it with the specified error Es.
  • when the error E2 is smaller than the specified error Es, the fitting result is saved and the processing of FIG. 6 (that is, the curve fitting) is ended.
  • when the error E2 is greater than or equal to the specified error Es, the terminal control unit 81 may again derive a division point K2 within the input line portion q, further divide the input line portion q, and continue the curve fitting on the divided input line portions q.
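  • taken together, the loop of S11 to S16 can be sketched as a recursive divide-and-fit routine like the one below, which reuses fit_segment, sample_output_curve, fitting_error, find_division_point, and divide_input_line from the earlier sketches. The recursion, the guard against degenerate splits, and the name fit_flight_path are illustrative choices, not details from the patent.

        def fit_flight_path(input_pts, Es, segments=None):
            """Fit the input line Q: keep the output curve when its error E is below the
            specified error Es, otherwise divide at the division point K and recurse on
            the resulting input line portions. Returns (P0, P1, P2) triples in path order."""
            if segments is None:
                segments = []
            p0, p1, p2 = fit_segment(input_pts)
            curve = sample_output_curve(p0, p1, p2)
            if fitting_error(input_pts, curve) < Es or len(input_pts) <= 3:
                segments.append((p0, p1, p2))            # error below Es: keep this output curve portion
            else:
                k = find_division_point(input_pts, curve)
                k = min(max(k, 1), len(input_pts) - 2)   # guard against degenerate splits
                q1, q2 = divide_input_line(input_pts, k)
                fit_flight_path(q1, Es, segments)        # fit the first input line portion
                fit_flight_path(q2, Es, segments)        # then the second, keeping path order
            return segments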
  • FIG. 12 is a diagram showing an example of an output curve portion c1 generated by curve fitting of the input line portion q1.
  • for the input line portion q1, the error E2 is smaller than the specified error Es. Therefore, the division point K2 is not further derived and no further curve fitting is performed on it; the output curve portion c1 is used as a part of the final flight path FR2.
  • the input line portion q1 is indicated by a dotted line.
  • FIG. 13 is a diagram showing an example of an output curve portion c2 generated by curve fitting of the input line portion q2.
  • FIG. 14 is a diagram showing an example of an error E2 between the input line portion q2 and the output curve portion c2.
  • the method of deriving the error E2 is the same as the method of deriving the initial error E.
  • For this output curve portion c2, the error E2 is greater than or equal to the specified error Es. Therefore, a division point K2 is further derived for the output curve portion c2, a plurality of input line portions q2_1 and q2_2 are generated, and the curve fitting continues on the input line portions q2_1 and q2_2.
  • FIG. 15 is a diagram showing an example of the division point K2 of the input line portion q2.
  • FIG. 16 is a diagram showing an example of a plurality of input line portions q2_1 and q2_2 obtained by division at the division point K2.
  • In FIG. 13 to FIG. 15, the input line portion q2 is indicated by a dotted line.
  • The terminal control unit 81 may continue the curve fitting until the error E is smaller than the specified error Es for all the generated output curve portions c. Alternatively, the terminal control unit 81 may end the curve fitting when the proportion of the generated input line portions q whose error E is smaller than the specified error Es is greater than or equal to a threshold th3.
  • When the error E is smaller than the specified error Es for each input line portion q, the terminal control unit 81 connects the generated output curve portions c to generate the flight path FR2 (a minimal sketch of this split-and-connect procedure is given after this list). In addition, when the curve fitting is performed only once and no input line portion q is generated from the original input line Q, the output curve C obtained by curve fitting the original input line Q becomes the flight path FR2.
  • FIG. 17 is a diagram showing a flight path FR2 formed by combining the output curve portions c and control points used in the generation of each output curve portion c.
  • the control points include a starting point P0, an equidistant point P1, and an ending point P2.
  • In FIG. 17, the output curve portions c11 to c19 are connected to generate the flight path FR2.
  • In FIG. 17, the starting point P0, the equidistant point P1, and the end point P2 of the output curve portion c12 are shown as representative examples, but the same applies to the starting point P0, the equidistant point P1, and the end point P2 of the other output curve portions c.
  • the start point P0 in the output curve portion c12 becomes the end point P2 in the output curve portion c11.
  • the end point P2 in the output curve portion c12 becomes the start point P0 in the output curve portion c13.
  • The flight positions (waypoints WP) through which the unmanned aircraft 100 flies are the start points P0 and the end points P2 of the output curve portions c. Therefore, the number of waypoints WP changes according to the number of output curve portions c derived from the input line Q.
  • The smaller the number of waypoints WP, the fewer the flight positions that the unmanned aircraft 100 has to pass through, and the higher the flight efficiency.
  • The greater the number of waypoints WP, the shorter each output curve portion c, and the smaller the error E.
  • In this way, the terminal control section 81 (an example of a processing section) acquires the input line Q representing the path FR1 (an example of the first path).
  • the terminal control unit 81 generates an output curve C with the start point of the input line Q as the starting point P0 and the end point of the input line Q as the end point P2.
  • the terminal control unit 81 calculates an error E between the input line Q and the output curve C (an example of an integrated value of the distance between each point p in the input line Q and the output curve C).
  • the terminal control unit 81 generates a flight path FR2 based on the output curve C and the error E.
  • Thereby, even when the path FR1 has a complicated curved shape along which the unmanned aircraft 100 can hardly fly accurately, the terminal 80 can generate a flight path FR2 that approximates the shape of the path FR1 and along which the unmanned aircraft 100 can fly.
  • the terminal 80 generates a flight path by referring to the error E between the input line Q and the output curve C, and thus can generate the flight path FR2 while adjusting the approximation of the input line Q and the output curve C.
  • In this case, compared with simply replacing the path FR1 with a single quadratic Bezier curve, the shape of the output curve C can be adjusted as needed.
  • the terminal 80 can generate the flight path FR2 in a shape with fewer restrictions and more freedom.
  • In addition, when the error E is smaller than the specified error Es (an example of the first threshold), the terminal control section 81 may generate the flight path FR2 represented by the output curve C.
  • Thereby, when the shape of the path FR1 and the shape of the output curve C are close, the terminal 80 can use the shape of the output curve C as the shape of the flight path FR2. Accordingly, the terminal 80 can easily generate the flight path FR2 from the path FR1.
  • In addition, when the error E is greater than or equal to the specified error Es, the terminal control section 81 may determine, among the points p in the input line Q, a division point K1 (an example of the first point) whose distance to the output curve C is greater than or equal to the threshold th4.
  • The terminal control section 81 may divide the input line Q at the division point K, and generate an input line portion q1 (an example of a first input line portion) and an input line portion q2 (an example of a second input line portion) in the input line Q.
  • the terminal control unit 81 may generate an output curve portion c1 (an example of a first output curve portion) in the output curve C with the start point of the input line portion q1 as the starting point P0 and the end point of the input line portion q1 as the end point P2.
  • The terminal control unit 81 may generate an output curve portion c2 (an example of a second output curve portion) in the output curve C with the start point of the input line portion q2 as the starting point P0 and the end point of the input line portion q2 as the end point P2.
  • Thereby, when the shape of the path FR1 and the shape of the output curve C are not similar, the terminal 80 can divide the input line Q at the division point K.
  • The division point K is a point on the input line Q that is relatively far from the output curve C.
  • In addition, the division point K is the start point or end point of the divided input line portions q1 and q2, is the start point P0 or end point P2 of the corresponding output curve portion, and is a waypoint. Therefore, by generating the output curve portions c1 and c2 based on the divided input line portions q1 and q2, the shape of the flight path FR2 comes closer to the shape of the input line Q and the error E is reduced. Therefore, the unmanned aircraft 100 can fly along a flight path FR2 that is close to the path FR1 desired by the user.
  • the terminal control unit 81 may calculate an error E2 between the input line portion q and the output curve portion c (an example of an integrated value of the distance between each point p in the input line portion q1 and the output curve portion c1). The terminal control unit 81 may repeatedly divide the input line portion q and generate the output curve portion c corresponding to the divided input line portion q until the error E2 is smaller than the specified error Es.
  • the terminal 80 can make the shape of the flight path FR2 close to the shape of the path FR1 desired by the user until the error E2 is smaller than the specified error Es.
  • In addition, at the stage where the shape of the flight path FR2 has come sufficiently close to the shape of the path FR1 desired by the user, the terminal 80 can shorten the time required for generating the flight path by ending the curve fitting.
  • the terminal control unit 81 may connect the plurality of generated output curve portions c to generate the flight path FR2.
  • the terminal 80 can finely adjust the input line Q using the division point K to generate each output curve portion c, and connect each output curve portion c to generate the flight path FR2.
  • FIG. 18 is a flowchart showing a specific operation example when performing curve fitting through a terminal.
  • the curve fitting may be a curve fitting performed in S12 of FIG. 6.
  • The terminal control unit 81 calculates, among the equidistant points P1 located at an equal distance from the start point P0 and the end point P2 of the input line Q, the midpoint located midway between the start point P0 and the end point P2 (S21).
  • FIG. 19 is a diagram showing an example of the start point P0 of the input line Q, the midpoint as the equidistant point P1, and the end point P2.
  • the terminal control unit 81 calculates an integrated value D of the distance between each point p on the input line Q and the output curve C (S22).
  • FIG. 20 is a diagram showing an example of an integrated value D of the distance between each point p on the input line Q and a straight line as the output curve C.
  • the terminal control unit 81 calculates a vertical bisector L1 that bisects a line segment connecting the start point P0 and the end point P2 of the input line Q vertically, and sets the vertical bisector L1 as the V axis.
  • FIG. 21 is a diagram showing a setting example of the V axis along the vertical bisector L1.
  • For the accumulated distance D, the terminal control unit 81 calculates a derivative function σ for deriving the moving distance d of the equidistant point P1 (initially the midpoint) on the V axis (S23).
  • The terminal control unit 81 calculates the value of the derivative function σ at the equidistant point P1 (for example, the midpoint in the first pass of the flowchart of FIG. 18).
  • the value of the derivative function ⁇ corresponds to the moving distance d.
  • the terminal control unit 81 determines whether the value of the derivative function ⁇ is smaller than a specified fluctuation value as the threshold value th1 (S24).
  • the specified fluctuation value may be stored in and obtained from the memory 87 or the like, or may be input and acquired through a user operation via the operation unit 83.
  • the specified fluctuation value can be a fixed value or a variable value.
  • the specified fluctuation value can be determined by referring to the calculation efficiency of the output curve C.
  • That is, when the value of the derivative function σ is small, the change in the accumulated distance D caused by moving the equidistant point P1 on the V axis is small, so the benefit of moving the equidistant point P1 is small and the calculation efficiency of the output curve C is relatively low. When the value of the derivative function σ is large, the change in the accumulated distance D caused by moving the equidistant point P1 on the V axis is large, so the benefit of moving the equidistant point P1 is greater and the calculation efficiency of the output curve C is relatively high. The specified fluctuation value can be determined with reference to this calculation efficiency.
  • When the value of the derivative function σ is smaller than the specified fluctuation value, the terminal control unit 81 stores the curve-related information of the generated output curve C or output curve portion c (for example, the output curve C and the control points of the output curve C, namely the starting point P0, the equidistant point P1, and the end point P2) in the memory 87 or the like (S25).
  • That is, when the value of the derivative function σ is smaller than the specified fluctuation value, the change in the accumulated distance D caused by moving the equidistant point P1 on the V axis is small, so even if time were spent repeatedly moving the equidistant point P1 on the V axis, the terminal control unit 81 can determine that the shape of the output curve C would hardly come closer to the shape of the input line Q. Therefore, the terminal control unit 81 stores the information of the output curve C and of the control points of the quadratic Bezier curve used to generate the output curve C. That is, the equidistant point P1 is not moved, and the output curve C or the output curve portion c is not deformed.
  • When the value of the derivative function σ is greater than or equal to the specified fluctuation value, the terminal control unit 81 moves the equidistant point P1 along the vertical bisector L1, that is, along the V axis. In this case, the terminal control unit 81 moves the equidistant point P1 by the value of the calculated derivative function σ, that is, by the moving distance d. That is, the output curve C or the output curve portion c is deformed.
  • FIG. 22 is a diagram showing an example of a moving distance d of an equidistant point P1 moving on the V axis.
  • In FIG. 22, the equidistant point P1 is moved from the initial position (the position of the midpoint between the start point P0 and the end point P2) to the position of the point P1'.
  • FIG. 23 is a diagram showing an example of the equidistant point P1' after moving on the V axis and the output curve C' changed according to the moved equidistant point P1'.
  • The terminal control unit 81 generates the output curve C' based on the starting point P0, the moved equidistant point P1', and the ending point P2.
  • In this way, when the value of the derivative function σ is greater than or equal to the specified fluctuation value, the terminal control unit 81 can determine that the change in the accumulated distance D caused by moving the equidistant point P1 on the V axis is large and that the shape of the output curve C can be brought closer to the shape of the input line Q. In this case, the terminal control unit 81 can take the time to repeatedly move the equidistant point P1 on the V axis, so as to bring the shape of the output curve C closer to the shape of the input line Q and improve the output curve C.
  • After the equidistant point P1 is moved, the process proceeds to S22, and the terminal control unit 81 repeats the curve fitting process. This repetition may be continued until the value of the derivative function σ for moving the equidistant point P1 becomes smaller than the specified fluctuation value (a minimal sketch of this iterative adjustment is given after this list).
  • In this way, the terminal control unit 81 may generate, based on the start point P0 of the input line Q, the end point P2 of the input line Q, and the equidistant point P1 (a point on the vertical bisector L1 connecting points equidistant from the start point P0 and the end point P2), an output curve C that passes through the start point P0 of the input line Q and the end point P2 of the input line Q and that is symmetric with respect to the vertical bisector L1.
  • the terminal control unit 81 may calculate a derivative function ⁇ for moving the equidistant point P1 on the vertical bisector L1 based on the accumulated value D of the distance.
  • The terminal control unit 81 may determine whether or not to deform the output curve C based on the calculated value of the derivative function σ.
  • the terminal 80 can determine whether to deform the output curve C by calculating the derivative function ⁇ and referring to the calculation efficiency for reducing the difference between the input line Q and the output curve C (corresponding to the cumulative value D of the distance). Therefore, the terminal 80 can balance the reduction of the difference between the input line Q and the output curve C and the deformation efficiency of the output curve C.
  • In addition, when the calculated value of the derivative function σ is smaller than the specified fluctuation value (an example of the second threshold), the terminal control section 81 may keep the output curve C unchanged.
  • In this case, the terminal 80 can determine that even if the equidistant point P1 is moved, the accumulated distance D will hardly change, so it can omit the movement of the equidistant point P1, omit the deformation of the output curve C, and omit the redundant calculations involved in the curve fitting. As a result, the terminal 80 can shorten the time required for the curve fitting, and thus the time required for generating the flight path FR2.
  • In addition, when the calculated value of the derivative function σ is greater than or equal to the specified fluctuation value, the terminal control unit 81 may move the equidistant point P1 on the vertical bisector L1 according to the calculated value of the derivative function σ, and deform the output curve C according to the starting point P0 of the input line Q, the end point P2 of the input line Q, and the moved equidistant point P1 (P1').
  • The result of the deformation is, for example, the output curve C'.
  • In this case, when the equidistant point P1 is moved, the terminal 80 can determine that the accumulated distance D changes greatly, and by deforming the output curve C it can greatly reduce the difference between the input line Q and the output curve C.
  • the terminal control unit 81 may repeat the movement of the equidistant point P1 and the deformation of the output curve C until the calculated value of the derivative function ⁇ is equal to or less than a specified fluctuation value.
  • the terminal 80 can optimize the shape of the output curve so that the difference between the input line Q and the output curve C becomes smaller (for example, minimized).
  • the output curve C may be a quadratic Bezier curve with the start point P0 of the input line Q, the end point P2 of the input line Q, and the equidistant point P1 as control points.
  • the output curve portion may also be a quadratic Bezier curve with the start point P0 of the input line portion q, the end point P2 of the input line portion q, and the equidistant point P1 as control points.
  • Thereby, the terminal 80 can partially use the well-known quadratic Bezier curve and easily generate the flight path FR2 from the output curve C and the output curve portions c.
  • In addition, although the quadratic Bezier curve is exemplified as the output curve C, the present disclosure is not limited to this. For example, the output curve C may be a Bezier curve of degree three or higher, or a curve other than a Bezier curve.
  • the terminal control section 81 may transmit the fitting result to the unmanned aircraft 100 via the communication section 85.
  • the UAV control unit 110 may obtain a fitting result from the terminal 80 via the communication interface 150.
  • The fitting result may include the flight path FR2, the waypoints WP, the output curve C, the control points (starting point P0, equidistant point P1, end point P2) on which the generation of each output curve portion c is based, and the like (an illustrative data layout for such a result is sketched after this list).
  • When the UAV control unit 110 obtains the information of the flight path FR2, it can control the unmanned aircraft 100 to fly along the flight path FR2. As a result, the unmanned aircraft 100 itself does not need to generate the flight path FR2, which reduces the processing load on the unmanned aircraft 100 involved in generating the flight path and enables it to fly along a free-form flight path FR2 with fewer restrictions.
  • When the UAV control unit 110 acquires the output curve C and the information of the control points on which the generation of the output curve portions c is based, it can generate the flight path FR2 based on the information of the control points (a minimal sketch of rebuilding the path from received control points is given after this list).
  • the method of generating the control point-based flight path FR2 may be the same as when the terminal 80 generates.
  • the UAV control unit 110 may control the unmanned aircraft 100 to fly along the generated flight path FR2.
  • Thereby, the unmanned aerial vehicle 100 itself does not need to derive the control points for generating the flight path FR2, which reduces the processing load on the unmanned aerial vehicle 100 involved in deriving the control points and enables it to fly along a free-form flight path FR2 with fewer restrictions.
  • When the UAV control unit 110 obtains the information of the waypoints WP, it can generate the flight path FR2 based on the information of the waypoints WP.
  • the UAV control unit 110 may control the unmanned aircraft 100 to fly along the flight path FR2 passing the waypoint WP.
  • the unmanned aircraft 100 itself does not need to derive the waypoint WP with good flight efficiency, and can reduce the processing load involved in the waypoint WP derivation, so that it can fly along the flight path FR2 with less restrictions and a free shape.
  • Furthermore, instead of the terminal 80, the UAV control unit 110 may have the functions related to flight path generation that the terminal control unit 81 of the terminal 80 has. In this case, each operation of the terminal control section 81 described above may be performed by the UAV control unit 110, and the UAV control unit 110 may control the unmanned aircraft 100 to fly along the flight path FR2 that it has generated itself.
  • Thereby, the unmanned aircraft 100 can, by itself, complete everything from the generation of the flight path FR2 to the flight along the flight path FR2, which simplifies the system configuration of the flying body system 10. That is, the terminal 80 can be omitted.
  • In the above embodiment, although an unmanned aircraft is shown as the moving body, the present disclosure is not limited to this, and can also be applied to a camera-equipped unmanned automobile, a camera-equipped bicycle, a camera-equipped gimbal device held by a person while moving, and so on.
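
A minimal sketch (in Python, with illustrative names; not taken from the patent text) of how the error E of formula (1) and the division point K described above could be computed for a sampled input line Q and a quadratic Bezier output curve C per formula (0). The integral over the input line is approximated by a sum over its sampled points.

```python
import numpy as np

def bezier2(p0, p1, p2, t):
    """Quadratic Bezier B(t) = (1-t)^2*P0 + 2(1-t)t*P1 + t^2*P2 for t in [0, 1]."""
    t = np.atleast_1d(np.asarray(t, dtype=float))[:, None]
    return (1.0 - t) ** 2 * p0 + 2.0 * (1.0 - t) * t * p1 + t ** 2 * p2

def dist_to_curve(point, p0, p1, p2, n_curve=200):
    """Approximate dist(p, C) as the distance to the nearest of n_curve sampled curve points."""
    curve = bezier2(p0, p1, p2, np.linspace(0.0, 1.0, n_curve))
    return float(np.min(np.linalg.norm(curve - point, axis=1)))

def error_and_division_point(input_line, p0, p1, p2):
    """Discrete stand-ins for the error E of formula (1) and the farthest point K on the input line."""
    dists = np.array([dist_to_curve(p, p0, p1, p2) for p in input_line])
    k = int(np.argmax(dists))          # index of the division point K on the input line
    return float(dists.sum()), k       # (error E, index of K)
```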
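
A sketch of the split-and-connect procedure described above, assuming a helper fit_equidistant_point that places the control point P1 on the vertical bisector (one possible helper is sketched next). It reuses bezier2, dist_to_curve and error_and_division_point from the previous sketch: the recursion splits the sampled input line at the division point K while the error is greater than or equal to the specified error Es, then connects the resulting output curve portions c and collects their start and end points as waypoints WP.

```python
import numpy as np

def fit_portion(segment, Es, fit_equidistant_point):
    """Fit one quadratic Bezier portion; split at K and recurse while the error is >= Es."""
    p0, p2 = segment[0], segment[-1]
    p1 = fit_equidistant_point(segment)             # control point P1 on the vertical bisector
    error, k = error_and_division_point(segment, p0, p1, p2)
    if error < Es or k in (0, len(segment) - 1):    # close enough, or nothing left to split
        return [(p0, p1, p2)]
    # The division point K becomes the end of the first part and the start of the second.
    return (fit_portion(segment[:k + 1], Es, fit_equidistant_point)
            + fit_portion(segment[k:], Es, fit_equidistant_point))

def flight_path(input_line, Es, fit_equidistant_point, samples_per_portion=50):
    """Connect the fitted output curve portions c into the flight path FR2."""
    segment = np.asarray(input_line, dtype=float)
    portions = fit_portion(segment, Es, fit_equidistant_point)
    waypoints = [portions[0][0]] + [p2 for (_, _, p2) in portions]   # start/end points as WP
    ts = np.linspace(0.0, 1.0, samples_per_portion)
    path = np.concatenate([bezier2(p0, p1, p2, ts) for (p0, p1, p2) in portions])
    return path, waypoints, portions
```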
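
A sketch, under stated assumptions, of the iterative adjustment of the equidistant point P1 (S21 to S25): P1 starts at the midpoint of P0 and P2, the derivative function σ = dD/dv of the accumulated distance D is estimated numerically, and P1 is moved along the vertical bisector until |σ| falls below the specified fluctuation value. The update `v -= sigma` is one possible reading of "move P1 by the value of σ so that D becomes smaller"; a damped step could equally be used, and the default thresholds are illustrative. dist_to_curve is reused from the first sketch.

```python
import numpy as np

def fit_equidistant_point(segment, fluctuation=1e-3, dv=1e-3, max_iter=200):
    """Place the control point P1 on the vertical bisector of P0P2 so that D becomes small."""
    p0, p2 = segment[0], segment[-1]
    mid = 0.5 * (p0 + p2)
    chord = p2 - p0
    v_axis = np.array([-chord[1], chord[0]])
    v_axis = v_axis / (np.linalg.norm(v_axis) + 1e-12)   # unit vector along the V axis

    def D(v):  # accumulated distance for P1 = mid + v * v_axis (formula (2), discretized)
        p1 = mid + v * v_axis
        return sum(dist_to_curve(p, p0, p1, p2) for p in segment)

    v = 0.0                                              # initial P1 is the midpoint (v = 0)
    for _ in range(max_iter):
        sigma = (D(v + dv) - D(v - dv)) / (2.0 * dv)     # numerical derivative dD/dv
        if abs(sigma) < fluctuation:                     # below the specified fluctuation value
            break
        v -= sigma                                       # assumed step: move P1 so that D decreases
    return mid + v * v_axis
```

For example, on a synthetic wavy path standing in for a hand-drawn FR1 (values are arbitrary):

```python
theta = np.linspace(0.0, np.pi, 120)
input_line = np.stack([theta, 0.3 * np.sin(3.0 * theta)], axis=1)
path, waypoints, portions = flight_path(input_line, Es=0.5,
                                        fit_equidistant_point=fit_equidistant_point)
print(len(portions), "output curve portions ->", len(waypoints), "waypoints")
```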
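
An illustrative data layout, assumed here for the sake of example and not a defined interface of the patent or of any flight-control SDK, for bundling the fitting result that the terminal 80 stores or transmits:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class OutputCurvePortion:
    p0: Point   # start point P0 (a waypoint WP)
    p1: Point   # equidistant point P1 on the vertical bisector
    p2: Point   # end point P2 (a waypoint WP)

@dataclass
class FittingResult:
    portions: List[OutputCurvePortion] = field(default_factory=list)   # output curve portions c

    @property
    def waypoints(self) -> List[Point]:
        """Waypoints WP are the start/end points of the connected portions."""
        if not self.portions:
            return []
        return [self.portions[0].p0] + [c.p2 for c in self.portions]
```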
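
A minimal sketch, assuming the FittingResult layout above and the bezier2 helper from the first sketch, of how the aircraft side could rebuild the flight path FR2 from the received control points, mirroring the terminal-side generation; no real flight-control API is called.

```python
import numpy as np

def rebuild_flight_path(result: FittingResult, samples_per_portion: int = 50) -> np.ndarray:
    """Evaluate each quadratic Bezier portion from its control points and concatenate them."""
    ts = np.linspace(0.0, 1.0, samples_per_portion)
    segments = [bezier2(np.asarray(c.p0, dtype=float),
                        np.asarray(c.p1, dtype=float),
                        np.asarray(c.p2, dtype=float), ts)
                for c in result.portions]
    return np.concatenate(segments) if segments else np.empty((0, 2))
```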

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

能够以更少的制约、更自由的形状来设定飞行路径。一种用于生成飞行体飞行的飞行路径的信息处理装置,其包含处理部,所述处理部获取第一路径的输入线,生成以输入线的起点为起点,以输入线的终点为终点的输出曲线,计算出输入线中的各点与输出曲线之间的距离的累计值,根据输出曲线以及距离的累计值来生成飞行路径。

Description

信息处理装置、飞行路径生成方法、程序以及记录介质 技术领域
本公开涉及一种生成用于飞行体飞行的飞行路径的信息处理装置、飞行路径生成方法、程序以及记录介质。
背景技术
作为指示无人机的飞行的控制的方式,可以考虑使用遥控器来指示飞行方向,使无人机按照该指示飞行的方式(手动式)、使无人机按照事先设定的飞行路径飞行的方式(自动式)。作为采用自动式的无人机,已知有一边通过预先设定的固定路径一边进行拍摄的平台(无人机)(参照专利文献1:日本特开2010-61216号公报)。
发明内容
【发明所要解决的技术问题】
如专利文献1所示,当以自动式设定飞行路径时,飞行路径中可以包含直线路径和曲线路径。当设定直线路径时,例如可以以通过两个点的方式来设定直线路径。当设定曲线路径时,可以以使其成为受制约的二次贝塞尔曲线的方式来设定曲线路径。在受制约的二次贝塞尔曲线中,意味着通过两个点的二次贝塞尔曲线相对于该两个点的垂直平分线需要具有对称性。例如,在无人机在复杂曲折的道路上飞行时,难以使其沿着道路飞行,难以设定复杂的曲线形状的飞行路径来使无人驾驶航空器飞行。因此,期望能够以制约更少的、更自由的形状来设定飞行路径。
【用于解决问题的技术手段】
在一个方面中,一种生成用于飞行体飞行的飞行路径的信息处理装置,其包含处理部,处理部获取表示第一路径的输入线,生成以输入线的起点为起点、以输入线的终点为终点的输出曲线,计算出输入线中的各点与输出曲线之间的距离的累计值,并根据输出曲线和距离的累计值生成飞行路径。
当累计值小于第一阈值时,处理部可以生成输出曲线所示的飞行路径。
当累计值大于等于第一阈值时,处理部确定输入线中的各点中距离最长的 第一点,并在第一点处分割输入线,在输入线上生成第一输入线部分和第二输入线部分,生成以第一输入线部分的起点为起点,以第一输入线部分的终点为终点的、输出曲线中的第一输出曲线部分,生成以第二输入线部分的起点为起点,以第二输入线部分的终点为终点的、输出曲线中的第二输出曲线部分。
处理部可以计算出第一输入线部分中的各点与第一输出曲线部分之间的距离的累计值,并反复进行第一输入线部分的分割以及与所分割的第一输入线部分相对应的输出曲线部分的生成,直到距离的累计值小于第一阈值。
处理部可以连接所生成的多个输出曲线部分来生成飞行路径。
处理部可以根据输入线的起点、输入线的终点、以及连结与起点及终点等距离的点的垂直平分线上的点即等距离点,生成通过输入线的起点和输入线的终点、相对于垂直平分线对称的输出曲线,基于距离的累计值,计算出用于使等距离点在垂直平分线上移动的导函数,并根据导函数的计算值,确定输出曲线有无变形。
当导函数的计算值小于第二阈值时,处理部可以使输出曲线不变。
当导函数的计算值大于等于第二阈值时,处理部可以根据导函数的计算值,使等距离点在垂直平分线上移动,并根据输入线的起点、输入线的终点、以及所移动的等距离点,使输出曲线变形。
处理部可以反复进行等距离点的移动以及输出曲线的变形,直到导函数的计算值小于第二阈值。
输出曲线可以是以输入线的起点、输入线的终点、以及等距离点作为控制点的二次贝塞尔曲线。
信息处理装置还可以包含通信部。处理部可以经由通信部将飞行路径的信息发送到飞行体。
信息处理装置还可以包含通信部。处理部可以经由通信部将输入线的起点、输入线的终点以及等距离点的信息发送到飞行体。
信息处理装置还可以包含显示部。处理部可以经由显示部显示飞行路径的信息。
信息处理装置还可以包含显示部。处理部可以经由显示部显示输入线的起点、输入线的终点以及等距离点的信息。
信息处理装置可以是飞行体。处理部可以按照飞行路径来控制飞行体的飞 行。
在一个方面中,一种生成用于飞行体飞行的飞行路径的信息处理装置中的飞行路径生成方法,其具有:获取表示第一路径的输入线的步骤;生成以输入线的起点为起点、以输入线的终点为终点的输出曲线的步骤;计算出输入线中的各点与输出曲线之间的距离的累计值的步骤;以及根据输出曲线和距离的累计值,生成飞行路径的步骤。
生成飞行路径的步骤可以包括当累计值小于第一阈值时,生成输出曲线所示的飞行路径的步骤。
生成飞行路径的步骤可以包括:当累计值大于等于第一阈值时,确定输入线中的各点中距离最长的第一点的步骤;在第一点处分割输入线,在输入线上生成第一输入线部分和第二输入线部分的步骤;生成以第一输入线部分的起点为起点,以第一输入线部分的终点为终点的、输出曲线中的第一输出曲线部分的步骤;以及生成以第二输入线部分的起点为起点,以第二输入线部分的终点为终点的、输出曲线中的第二输出曲线部分的步骤。
计算出距离的累计值的步骤可以包括计算出第一输入线部分中的各点与第一输出曲线部分之间的距离的累计值的步骤。生成飞行路径的步骤可以包括反复进行第一输入线部分的分割以及与所分割的第一输入线部分相对应的输出曲线部分的生成,直到距离的累计值小于第一阈值的步骤。
生成飞行路径的步骤可以包括连接所生成的多个输出曲线部分来生成飞行路径的步骤。
生成输出曲线的步骤可以包括:根据输入线的起点、输入线的终点、以及连结与起点及终点等距离的点的垂直平分线上的点即等距离点,生成通过输入线的起点和输入线的终点、相对于垂直平分线对称的输出曲线的步骤;基于距离的累计值,计算出用于使等距离点在垂直平分线上移动的导函数的步骤;以及根据导函数的计算值,确定输出曲线有无变形的步骤。
生成输出曲线的步骤可以包括当导函数的计算值小于第二阈值时,使输出曲线不变的步骤。
生成输出曲线的步骤可以包括:当导函数的计算值大于等于第二阈值时,根据导函数的计算值,使等距离点在垂直平分线上移动的步骤;以及根据输入线的起点、输入线的终点、以及所移动的等距离点,使输出曲线变形的步骤。
生成输出曲线的步骤可以包括反复进行等距离点的移动以及输出曲线的变形,直到导函数的计算值小于第二阈值的步骤。
输出曲线可以是以输入线的起点、输入线的终点、以及等距离点作为控制点的二次贝塞尔曲线。
飞行路径生成方法还可以包括将飞行路径的信息发送到飞行体的步骤。
飞行路径生成方法还可以包括将输入线的起点、输入线的终点以及等距离点的信息发送到飞行体的步骤。
飞行路径生成方法还可以包括显示飞行路径的信息的步骤。
飞行路径生成方法还可以包括显示输入线的起点、输入线的终点以及等距离点的信息的步骤。
信息处理装置可以是飞行体。飞行路径生成方法还可以包括根据按照飞行路径控制飞行体的飞行的步骤。
在一个方面中,一种程序,其用于使生成用于飞行体飞行的飞行路径的信息处理装置执行以下步骤:获取表示第一路径的输入线的步骤;生成以输入线的起点为起点、以输入线的终点为终点的输出曲线的步骤;计算出输入线中的各点与输出曲线之间的距离的累计值的步骤;以及根据输出曲线和距离的累计值,生成飞行路径的步骤。
在一个方面中,一种记录介质,其是计算机可读记录介质并记录有用于使生成用于飞行体飞行的飞行路径的信息处理装置执行以下步骤的程序:
获取表示第一路径的输入线的步骤;生成以输入线的起点为起点、以输入线的终点为终点的输出曲线的步骤;计算出输入线中的各点与输出曲线之间的距离的累计值的步骤;以及根据输出曲线和距离的累计值,生成飞行路径的步骤。
此外,上述的发明内容中并未穷举本公开的所有特征。另外,这些特征群的子集也可形成发明。
附图说明
图1是示出实施方式中的飞行体系统的第一构成示例的示意图。
图2是示出实施方式中的飞行体系统的第二构成示例的示意图。
图3是示出无人驾驶航空器的具体的外观的一个示例的图。
图4是示出无人驾驶航空器的硬件构成的一个示例的框图。
图5是示出终端的硬件构成的一个示例的框图。
图6是示出通过终端生成飞行路径时的操作示例的流程图。
图7是示出生成飞行路径的输入线Q的一个示例的图。
图8是示出通过初次的曲线拟合所生成的输出曲线C的一个示例的图。
图9是示出输入线Q和输出曲线C之间的误差E的一个示例的图。
图10是示出输入线Q的分割点K的一个示例的图。
图11是示出在分割点K处进行分割而获得的多个输入线部分q1、q2的一个示例的图。
图12是示出通过输入线部分q1的曲线拟合所生成的输出曲线部分c1的一个示例的图。
图13是示出通过输入线部分q2的曲线拟合所生成的输出曲线部分c2的一个示例的图。
图14是示出输入线部分q2和输出曲线部分c2之间的误差E2的一个示例的图。
图15是示出输入线部分q2的分割点K2的一个示例的图。
图16是示出在分割点K2处进行分割而获得的多个输入线部分q2_1、q2_2的一个示例的图。
图17是示出将输出曲线部分合成而形成的飞行路径以及在各个输出曲线部分c的生成中使用的控制点的图。
图18是示出通过终端进行曲线拟合时的具体的操作示例的流程图。
图19是示出输入线Q的起点P0、作为等距离点P1的中点以及终点P2的一个例子的图。
图20是示出输入线Q上的各点p与作为输出曲线C的直线之间的距离的累计值D的一个示例的图。
图21是示出沿着垂直平分线L1的V轴的设定示例的图。
图22是示出在V轴上移动的等距离点P1的移动距离d的一个示例的图。
图23是示出在V轴上移动后的等距离点P1以及根据等距离点P1变更的输出曲线C的一个示例的图。
【附图标记说明】
10:飞行体系统
80:终端
81:终端控制部
83:操作部
85:通信部
87:内存
88:显示部
89:储存器
100:无人驾驶航空器
110:UAV控制部
150:通信接口
160:内存
170:存储器
200:万向节
210:旋翼机构
220、230:摄像部
240:GPS接收器
250:惯性测量装置
260:磁罗盘
270:气压高度计
280:超声波传感器
290:激光测量仪
C:输出曲线
c1、c2:输出曲线部分
FR1:路径
FR2:飞行路径
K:分割点
p:输入线上的点
P0:起点
P1:等距离点
P2:终点
Q:输入线
q1、q2:输入线部分
具体实施方式
以下,通过本发明的实施方式来对本公开进行说明,但是以下实施方式并非限制权利要求书所涉及的发明。实施方式中说明的特征的所有组合未必是发明的解决方案所必须的。
在权利要求书、说明书、说明书附图以及说明书摘要中包含作为著作权所保护对象的事项。任何人只要如专利局的文档或者记录所表示的那样进行这些文件的复制,著作权人就无法异议。但是,在除此以外的情况下,保留一切的著作权。
在以下实施方式中,飞行体以无人驾驶航空器(UAV:Unmanned Aerial Vehicle)为例。无人驾驶航空器包括在空中移动的航空器。在本说明书的附图中,无人驾驶航空器也标记为“UAV”。信息处理装置,例如可以以终端为例,但也可以是其他装置(例如发送器、服务器、无人驾驶航空器)。飞行路径生成方法规定了信息处理装置中的操作。此外,记录介质中记录有程序(例如使信息处理装置执行各种处理的程序)。
图1是示出实施方式中的飞行体系统10的第一构成示例的示意图。飞行体系统10包含无人驾驶航空器100以及终端80。无人驾驶航空器100和终端80之间可以通过有线通信或无线通信(例如,无线LAN(Local Area Network))互相通信。在图1中,例示了终端80是便携式终端(例如智能手机、平板终端)。
另外,飞行体系统的构成可以为包含无人驾驶航空器、发送器(无线电控制发送器)以及便携式终端。当包含发送器时,用户能够使用布置在发送器的前面的左右控制杆来指示无人驾驶航空器的飞行的控制。另外,在此情况下,无人驾驶航空器、发送器以及便携式终端之间能够通过有线通信或者无线通信相互通信。
图2是示出实施方式中的飞行体系统10的第二构成示例的示意图。在图2中,例示了终端80是PC。在图1和图2的任意一个中,终端80具有的功能可以相同。
图3是示出无人驾驶航空器100的具体的外观的一个示例的图。在图3中,示出了无人驾驶航空器100在移动方向STV0飞行时的立体图。无人驾驶航空器100为移动体的一个示例。
如图3所示,设定滚转轴为与地面平行且沿着移动方向STV0的方向(参照x轴)。在此情况下,设定俯仰轴为与地面平行且与滚转轴垂直的方向(参照y轴),另外,设定偏航轴为与地面垂直且与滚转轴以及俯仰轴垂直的方向(参照z轴)。
无人驾驶航空器100的构成为包括UAV主体102、万向节200、摄像部220、多个摄像部230。
UAV主体102包含多个旋翼(螺旋浆)。UAV主体102通过控制多个旋翼的旋转而使无人驾驶航空器100飞行。UAV主体102使用例如四个旋翼使无人驾驶航空器100飞行。旋翼的数量并不限于四个。另外,无人驾驶航空器100可以是没有旋翼的固定翼飞机。
摄像部220可以是对包含在所希望的摄像范围内的被摄体(例如,作为航拍对象的上空的景象、山川、河流等的景色、地面的建筑物)进行拍摄的摄像用相机。
多个摄像部230可以是为了控制无人驾驶航空器100的飞行而对无人驾驶航空器100的周围进行拍摄的传感用相机。两个摄像部230可以设置于无人驾驶航空器100的机头、即正面。进而,其他两个摄像部230可以设置于无人驾驶航空器100的底面。正面侧的两个摄像部230可以成对,起到所谓立体相机的作用。底面侧的两个摄像部230也可以成对,起到立体相机的作用。可以基于由多个摄像部230拍摄的图像来生成无人驾驶航空器100的周围的三维空间数据(三维形状数据)。另外,无人驾驶航空器100所包含的摄像部230的数量不限于四个。无人驾驶航空器100只要包含至少一个摄像部230即可。无人驾驶航空器100可以在无人驾驶航空器100的机头、机尾、侧面、底面及顶面分别包含至少一个摄像部230。摄像部230中可设定的视角可大于摄像部220中可设定的视角。摄像部230可以具有单焦点镜头或鱼眼镜头。
图4是示出无人驾驶航空器100的硬件构成的一个示例的框图。无人驾驶航空器100的构成为包括UAV控制部110、通信接口150、内存160、存储器170、万向节200、旋翼机构210、摄像部220、摄像部230、GPS接收器240、 惯性测量装置(IMU:Inertial Measurement Unit)250、磁罗盘260、气压高度计270、超声波传感器280、激光测量仪290。
UAV控制部110例如由CPU(Central Processing Unit:中央处理器)、MPU(Micro Processing Unit:微处理器)或DSP(Digital Signal Processor:数字信号处理器)构成。UAV控制部110执行用于总体控制无人驾驶航空器100的各部分的操作的信号处理、与其它各部分之间的数据的输入输出处理、数据的运算处理以及数据的存储处理。
UAV控制部110按照存储于内存160中的程序来控制无人驾驶航空器100的飞行。UAV控制部110可以控制飞行。UAV控制部110可以航拍图像。
UAV控制部110获取表示无人驾驶航空器100的位置的位置信息。UAV控制部110可以从GPS接收器240获取表示无人驾驶航空器100所在的纬度、经度以及高度的位置信息。UAV控制部110可以分别从GPS接收器240获取表示无人驾驶航空器100所在的纬度以及经度的纬度经度信息、并从气压高度计270获取表示无人驾驶航空器100所在的高度的高度信息,作为位置信息。UAV控制部110可以获取超声波传感器280产生的超声波的放射点与超声波的反射点之间的距离,作为高度信息。
UAV控制部110可以从磁罗盘260获取表示无人驾驶航空器100的朝向的朝向信息。朝向信息可以用例如与无人驾驶航空器100的机头的朝向相对应的方位来表示。
UAV控制部110可以获取位置信息,该位置信息表示在摄像部220对应该拍摄的摄像范围进行拍摄时,无人驾驶航空器100应该存在的位置。UAV控制部110可以从内存160获取表示无人驾驶航空器100应该存在的位置的位置信息。UAV控制部110可以经由通信接口150从其他装置获取表示无人驾驶航空器100应该存在的位置的位置信息。UAV控制部110可以参照三维地图数据库,来指定无人驾驶航空器100能够存在的位置,并获取该位置作为表示无人驾驶航空器100应该存在的位置的位置信息。
UAV控制部110可以获取表示摄像部220以及摄像部230的各自的摄像范围的摄像范围信息。UAV控制部110可以从摄像部220以及摄像部230获取表示摄像部220以及摄像部230的视角的视角信息,作为用于指定摄像范围的参数。UAV控制部110可以获取表示摄像部220以及摄像部230的摄像方向的信 息,作为用于指定摄像范围的参数。UAV控制部110例如可以从万向节200获取表示摄像部220的姿势状态的姿势信息,作为表示摄像部220的摄像方向的信息。摄像部220的姿势信息可以示出万向节200的从俯仰轴和偏航轴的基准旋转角度开始的旋转角度。
UAV控制部110可以获取表示无人驾驶航空器100所在的位置的位置信息,作为用于指定摄像范围的参数。UAV控制部110可以根据摄像部220和摄像部230的视角和摄像方向、以及无人驾驶航空器100所在的位置,通过划定表示摄像部220拍摄的地理范围的摄像范围并生成摄像范围信息,来获取摄像范围信息。
UAV控制部110可以从内存160获取摄像范围信息。UAV控制部110可以经由通信接口150获取摄像范围信息。
UAV控制部110控制万向节200、旋翼机构210、摄像部220以及摄像部230。UAV控制部110可以通过变更摄像部220的摄像方向或视角来控制摄像部220的摄像范围。UAV控制部110可以通过控制万向节200的旋转机构来控制万向节200所支持的摄像部220的摄像范围。
摄像范围是指由摄像部220或摄像部230拍摄的地理范围。摄像范围由纬度、经度和高度定义。摄像范围可以是由纬度、经度和高度定义的三维空间数据的范围。摄像范围可以是由纬度和经度定义的二维空间数据的范围。摄像范围可以根据摄像部220或摄像部230的视角和摄像方向、以及无人驾驶航空器100所在的位置而指定。摄像部220和摄像部230的摄像方向可以根据设置有摄像部220和摄像部230的摄像镜头的正面所朝的方位和俯角来定义。摄像部220的摄像方向可以是根据无人驾驶航空器100的机头的方位和相对于万向节200的摄像部220姿势状态而指定的方向。摄像部230的摄像方向可以是根据无人驾驶航空器100的机头的方位和设置有摄像部230的位置而指定的方向。
UAV控制部110可以通过分析由多个摄像部230拍摄到的多个图像,来指定无人驾驶航空器100的周围的环境。UAV控制部110可以根据无人驾驶航空器100的周围的环境,例如避开障碍物来控制飞行。
UAV控制部110可以获取表示存在于无人驾驶航空器100周围的对象的立体形状(三维形状)的立体信息(三维信息)。对象例如可以是建筑物、道路、车辆、树木等风景的一部分。立体信息例如是三维空间数据。UAV控制部110 可以根据由多个摄像部230得到的各个图像,生成表示存在于无人驾驶航空器100的周围的对象的立体形状的立体信息,从而获取立体信息。UAV控制部110可以通过参照存储在内存160或存储器170中的三维地图数据库,来获取表示存在于无人驾驶航空器100的周围的对象的立体形状的立体信息。UAV控制部110可以通过参照网络上存在的由服务器所管理的三维地图数据库,来获取与存在于无人驾驶航空器100的周围的对象的立体形状相关的立体信息。
UAV控制部110通过控制旋翼机构210来控制无人驾驶航空器100的飞行。即,UAV控制部110通过控制旋翼机构210来对包括无人驾驶航空器100的纬度、经度以及高度的位置进行控制。UAV控制部110可以通过控制无人驾驶航空器100的飞行来控制摄像部220的摄像范围。UAV控制部110可以通过控制摄像部220所包含的变焦镜头来控制摄像部220的视角。UAV控制部110可以利用摄像部220的数字变焦功能,通过数字变焦来控制摄像部220的视角。
当摄像部220固定于无人驾驶航空器100,不能移动摄像部220时,UAV控制部110可以通过使无人驾驶航空器100在指定的日期向指定的位置移动,使摄像部220在所希望的环境下对所希望的摄像范围进行拍摄。或者,即使当摄像部220没有变焦功能,无法变更摄像部220视角时,UAV控制部110也可以通过使无人驾驶航空器100在指定的日期向指定的位置移动,使摄像部220在所希望的环境下对所希望的摄像范围进行拍摄。
通信接口150与终端80进行通信。通信接口150可以通过任意的无线通信方式进行无线通信。通信接口150可以通过任意的有线通信方式进行有线通信。通信接口150可以将航拍图像、与航拍图像相关的附加信息(元数据)发送到终端80。通信接口150可以从终端80获取飞行控制的指示信息。飞行控制的指示信息可以包括用于无人驾驶航空器100飞行的飞行路径、用于生成飞行路径的飞行位置(Waypoint)、作为飞行路径的生成的基础的控制点等信息。
内存160存储UAV控制部110对万向节200、旋翼机构210、摄像部220、摄像部230、GPS接收器240、惯性测量装置250、磁罗盘260、气压高度计270、超声波传感器280以及激光测量仪290进行控制所需的程序等。内存160可以是计算机可读记录介质,可以包括SRAM(Static Random Access Memory:静态随机存取存储器)、DRAM(Dynamic Random Access Memory:动态随机存取存储器)、EPROM(Erasable Programmable Read Only Memory:可擦除可编程 只读存储器)、EEPROM(Electrically Erasable Programmable Read-Only Memory:电可擦除可编程只读存储器)、以及USB(Universal Serial Bus:通用串行总线)存储器等闪存中的至少一个。内存160可以从无人驾驶航空器100上拆卸下来。内存160可以作为作业用内存进行工作。
存储器170可以包括HDD(Hard Disk Drive:硬盘驱动器)、SSD(Solid State Drive:固态硬盘)、SD内存卡、USB存储器、其他的存储器中的至少一个。存储器170可以保存各种信息、各种数据。存储器170可以从无人驾驶航空器100上拆卸下来。存储器170可以记录航拍图像。
万向节200可以以偏航轴、俯仰轴以及滚转轴为中心可旋转地支持摄像部220。万向节200可以使摄像部220以偏航轴、俯仰轴以及滚转轴中的至少一个为中心旋转,从而变更摄像部220的摄像方向。
旋翼机构210具有多个旋翼和使多个旋翼旋转的多个驱动电机。旋翼机构210通过UAV控制部110控制旋转,从而使无人驾驶航空器100飞行。
摄像部220对所希望的摄像范围内的被摄体进行拍摄并生成摄像图像的数据。通过摄像部220的摄像而得到的图像数据(例如航拍图像)可以存储于摄像部220具有的内存、或存储器170中。
摄像部230对无人驾驶航空器100的周围进行拍摄并生成摄像图像的数据。摄像部230的图像数据可以存储于存储器170中。
GPS接收器240接收表示从多个导航卫星(即GPS卫星)发送的时间以及各GPS卫星的位置(坐标)的多个信号。GPS接收器240根据接收到的多个信号,计算出GPS接收器240的位置(即无人驾驶航空器100的位置)。GPS接收器240将无人驾驶航空器100的位置信息输出到UAV控制部110。另外,可以由UAV控制部110代替GPS接收器240来进行GPS接收器240的位置信息的计算。在此情况下,在UAV控制部110中输入有GPS接收器240所接收到的多个信号中包含的表示时间以及各GPS卫星的位置的信息。
惯性测量装置250检测无人驾驶航空器100的姿势,并将检测结果输出到UAV控制部110。惯性测量装置250可以检测无人驾驶航空器100的前后、左右、以及上下的三轴方向的加速度以及俯仰轴、滚转轴和偏航轴三轴方向的角速度,作为无人驾驶航空器100的姿势。
磁罗盘260检测无人驾驶航空器100的机头的方位,并将检测结果输出到 UAV控制部110。
气压高度计270检测无人驾驶航空器100的飞行高度,并将检测结果输出到UAV控制部110。
超声波传感器280发射超声波,检测地面、物体反射的超声波,并将检测结果输出到UAV控制部110。检测结果可以示出从无人驾驶航空器100到地面的距离,即高度。检测结果可以示出从无人驾驶航空器100到物体(被摄体)的距离。
激光测量仪290对物体照射激光,接收物体反射的反射光,并通过反射光测量无人驾驶航空器100与物体(被摄体)之间的距离。作为基于激光的距离测量方法的一个示例,可以为飞行时间法。
图5是示出终端80的硬件构成的一个示例的框图。终端80包含终端控制部81、操作部83、通信部85、内存87、显示部88以及存储器89。终端80可以由希望指示无人驾驶航空器100的飞行控制的用户所持有。
终端控制部81例如采用CPU、MPU或DSP构成。终端控制部81进行用于整体控制终端80各部的操作的信号处理、与其它各部之间的数据的输入输出处理、数据的运算处理以及数据的存储处理。
终端控制部81可以经由通信部85获取来自无人驾驶航空器100的数据、信息。终端控制部81也可以获取经由操作部83输入的数据、信息。终端控制部81也可以获取保存在内存87中的数据、信息。终端控制部81可以经由通信部85向无人驾驶航空器100发送数据、信息。终端控制部81也可以将数据、信息发送到显示部88,并使显示部88显示基于此数据、信息的显示信息。显示部88所显示的信息、通过通信部85向无人驾驶航空器100发送的信息可以包括用于无人驾驶航空器100飞行的飞行路径、用于生成飞行路径的飞行位置(Waypoint)、作为飞行路径的生成的基础的控制点等信息。
终端控制部81也可以执行用于生成飞行路径的应用程序。终端控制部81也可以生成应用程序中使用的各种数据。
操作部83接受并获取由终端80的用户输入的数据、信息。操作部83也可以包括按钮、按键、触控显示屏、话筒等输入装置。这里主要示出了操作部83和显示部88由触控显示屏构成。在此情况下,操作部83可以接受触控操作、点击操作、拖动操作等。
通信部85通过各种无线通信方式与无人驾驶航空器100之间进行无线通信。该无线通信的无线通信方式例如可以包括通过无线LAN、Bluetooth(注册商标)、或公共无线网络进行的通信。通信部85可以通过任意的有线通信方式进行有线通信。
内存87例如可以具有规定终端80的操作的程序、存储设定值的数据的ROM、暂时保存终端控制部81进行处理时所使用的各种信息、数据的RAM。内存87可以包括ROM和RAM以外的内存。内存87可以设置在终端80的内部。内存87可以可拆卸地设置在终端80中。程序可以包括应用程序。
显示部88例如由LCD(Liquid Crystal Display:液晶显示器)构成,显示从终端控制部81输出的各种信息、数据。显示部88可以显示与应用程序的执行相关的各种数据、信息。
存储器89存储并保存各种数据、信息。存储器89可以是HDD、SSD、SD卡、USB存储器等。存储器89可以设置在终端80的内部。存储器89可以可拆卸地设置在终端80中。存储器89可以保存从无人驾驶航空器100获取的航拍图像、附加信息。附加信息可以保存在内存87中。
另外,当飞行体系统10包含发送器(无线电控制发送器)时,终端80执行的处理也可以由发送器执行。由于发送器具有与终端80相同的构成部,故不再详细说明。发送器具有控制部、操作部、通信部、显示部、内存等。当飞行体系统10具有发送器时,也可以不设置终端80。
接着,对终端80的终端控制部81具有的与飞行路径生成相关的功能进行说明。终端控制部81为处理部的一个示例。终端控制部81通过进行与飞行路径的生成有关的处理,能够生成与复杂的路径相对应的飞行路径。
终端控制部81获取希望无人驾驶航空器100飞行的路径FR1。路径FR1可以是任意形状的路径,可以是形状复杂的路径,也可以是无人驾驶航空器100难以平稳飞行的路径。终端控制部81可以经由操作部83接受用户操作,基于用户操作生成并获取路径FR1。终端控制部81可以从内存87等(内存87、存储器89)获取预先保存的路径FR1的信息。终端控制部81可以经由通信部85访问保存有地图信息的外部的地图服务器,发送用于识别路径FR1的识别信息,接收并获取路径FR1的信息。路径FR1可以用输入线Q来表示。
终端控制部81以输入线Q的起点P0以及终点P2作为具有预定形状的输 出曲线C的起点P0以及终点P2,来生成输出曲线C。输出曲线C可以是具有比输入线Q(路径FR1)更简化的形状的曲线,可以是无人驾驶航空器100能够平稳飞行的路径的至少一部分。输出曲线C可以是二次贝塞尔曲线、其他曲线。在此,主要例示出输出曲线C为二次贝塞尔曲线。也将基于输入线Q或基于后述的输入线部分q而生成输出曲线C称为曲线拟合。
二次贝塞尔曲线例如可以用以下的公式(0)来表示。
B(t) = (1-t)²P0 + 2(1-t)tP1 + t²P2, 0 ≤ t ≤ 1.   …(0)
其中,t是参数。B(t)是二次贝塞尔曲线上的点。P0、P1、P2为用于生成二次贝塞尔曲线的控制点。公式(0)中的P0以及P2相当于输出曲线C的起点P0以及终点P2。公式(0)中的P1相当于距起点P0以及终点P2等距离的等距离点。
终端控制部81计算出输入线Q与输出曲线C之间的差分(误差E)。误差E可以用以下的公式(1)来表示。
E = ∫_{p∈Q} dist(p, C)   …(1)
其中,p是输入线Q上的点。dist(p,C)是输出曲线C与输入线Q上的点p的差分(距离)。因此,公式(1)所示的误差E例如可以用包含图9的虚线部分的面积来表示。
终端控制部81在输入线Q上确定与输出曲线C的距离最远(最长)的分割点K。即,可以计算出dist(p,C)的值最大的p的位置。另外,分割点K并不限于距离最长,也可以是距离大于等于阈值th4的多个位置中的任一个。
终端控制部81在分割点K的位置将输入线Q分割为两个输入线部分q1、q2。分割点K是作为分割输入线Q的基准的点。然后,终端控制部81可以对两个输入线部分q1、q2进行曲线拟合,生成基于输入线部分q1、q2的输出曲线部分c1、c2。另外,终端控制部81可以反复实施输入线部分q的分离、曲线拟合,来生成输入线部分q3、q4、q5、...,输出曲线部分c3、c4、...。
即,终端控制部81可以使用一个或更多的分割点来分割输入线Q,生成多个输入线部分q,并生成与多个输入线部分q相对应的多个输出曲线部分c。多 个输出曲线部分c实际上成为用于无人驾驶航空器100飞行的飞行路径FR2的基础。输出曲线部分c也可以说是一部分输出曲线C。
因此,适用于输入线Q以及输出曲线C的点,也同样能够适用于输入线部分q以及输出曲线部分c。例如,可以将输入线部分q的起点P0以及终点P2作为起点P0以及终点P2,来生成输出曲线部分c。可以在分割点K处分割输入线部分q,进而生成多个输入线部分q。
输入线部分q的分离、曲线拟合的次数越少,飞行路径FR2越会成为由实施曲线拟合之前的原来的输入线Q的形状所简化的路径。因此,无人驾驶航空器100能够在所希望的路径FR1的形状得以简化的飞行路径FR2上,减少飞行方向的变更量而飞行。
另外,输入线部分q的分离、曲线拟合的次数越多,飞行路径FR2越接近实施曲线拟合之前的原来的输入线Q的形状。因此,无人驾驶航空器100能够在形状接近所希望的路径FR1的飞行路径FR2上飞行。
终端控制部81可以根据输出曲线C、多个输出曲线部分c来生成飞行路径FR2。例如,终端控制部81可以将输出曲线C作为飞行路径FR2。终端控制部81可以将多个输出曲线部分c中的相邻的输出曲线部分c彼此连接,生成飞行路径FR2。在此情况下,终端控制部81连接相邻的输出曲线部分c的各自的起点P0以及终点P2,来生成飞行路径FR2。无人驾驶航空器100按照所生成的飞行路径FR2飞行。因此,输出曲线C、输出曲线部分c的起点P0以及终点P2成为无人驾驶航空器100飞行时通过的飞行位置。也将该飞行位置称为路径点。
终端控制部81可以将曲线拟合的拟合结果保存在内存87等中。终端控制部81可以经由显示部88显示拟合结果。终端控制部81可以经由通信部85将拟合结果发送到无人驾驶航空器100。拟合结果可以包括各路径点WP的信息、飞行路径FR2的信息。飞行路径FR2可以用输出曲线C、多个输出曲线部分c的组合来表示。另外,拟合结果可以包括成为输出曲线C、输出曲线部分c的生成的基础的控制点(起点P0、等距离点P1、终点P2)。
终端80通过显示拟合结果,能够确认由终端80导出的飞行路径FR2、路径点WP的位置。终端80通过将拟合结果通知给无人驾驶航空器100,能够使无人驾驶航空器100按照所拟合的飞行路径FP2飞行。
以下,对曲线拟合的具体实例进行说明。
当进行曲线拟合时,终端控制部81设定与输入线Q的起点P0和终点P2等距离的等距离点P1,作为初始设定。等距离点P1作为初始值可以是起点P0和终点P2的中心点即中点。这些起点P0、等距离点P1、终点P2为作为输出曲线C的二次贝塞尔曲线中的三个控制点。当等距离点P1为起点P0与终点P2的中点时,二次贝塞尔曲线为直线。另外,等距离点P1位于将连结起点P0和终点P2的假想线平分的垂直平分线L1上。将沿着垂直平分线L1的轴设为V轴。在此情况下,等距离点P1能够在V轴上移动。
终端控制部81可以计算出输入线Q上的点p与输出曲线C之间的距离的累计值D。该累计值D可以用以下的公式(2)来表示。点P能够在输入线Q上任意移动。
D = ∫_{p∈Q} dist(p, C)   …(2)
另外,p是输入线Q的点。dist(p,C)是输出曲线C与输入线Q上的各点p之间的距离。因此,公式(2)所示的距离的累计值D可以用包含图20的虚线部分的面积来表示。另外,比较公式(1)、(2)后可以理解,公式(2)具有与公式(1)相同的形式。
关于距离的累计值D,终端控制部81计算出用于导出V轴上的等距离点P1的移动距离d的导函数σ。导函数σ可以用以下的公式(3)来表示。在公式(3)的函数σ中,v是参数。导函数σ的值(导函数的计算值)为移动距离d。
σ(v) = dD(v)/dv   …(3)
另外,如公式(3)所示,导函数σ表示距离的累计值D的微分值,并表示相对于V轴方向微小移动的距离的累计值D的变化量。另外,当等距离点p1是初始值即起点P0和终点P2的中点时,v=0。由于导函数σ是使用变量v对距离的累计值D进行微分的值,所以根据V轴上的位置来确定导函数σ的值。由 于V轴上的位置是确定的,所以终端控制部81能够计算出V轴上的位置v处的导函数σ的值,并能够计算出等距离点P1处的导函数σ的值。
当导函数σ的值大于等于阈值th1时,相对于在V轴上的微小移动的距离的累计值D的变化较大。即,在该时刻的等距离点P1的位置上,当使等距离点P1移动时,距离的累计值D会大幅变化。因此,在此情况下,终端控制部81可以使等距离点P1在V轴上向正方向或负方向仅移动一移动距离d,以使距离的累计值D变小(例如最小化)。在此情况下,可以使等距离点P1在V轴上包含移动方向地移动一移动距离d。d是导函数σ的值,正数为V轴的正方向,负数为V轴的负方向。阈值th1也被称为指定变动值。
然后,终端控制部81可以计算出以起点P0、移动一移动距离d后的等距离点P1、终点P2这三个点作为控制点的二次贝塞尔曲线的输出曲线C。在此情况下,本次的输出曲线C比上次的输出曲线C更接近原来的输入线Q的形状。因此,终端80能够使无人驾驶航空器100沿着更接近所希望的路径FR1的形状的飞行路径FR2飞行。
当导函数σ的值小于阈值th1时,相对于在V轴上的微小移动的距离的累计值D的变化较小。即,在该时刻的等距离点P1的位置上,即便使等距离点P1移动,距离的累计值D也不怎么变化。因此,即便不使等距离点P1移动,所希望的路径FR1与飞行路径FR2的差分也不怎么变化。所以,终端控制部81可以省略等距离点P1的移动,省略多余的曲线拟合所涉及的计算。由此,终端80能够缩短曲线拟合所需的时间,因而能够缩短飞行路径FR2的生成所需的时间。
终端控制部81可以将与所导出的输出曲线C相关的曲线相关信息保存于内存87等中。终端控制部81可以经由显示部88显示曲线关联信息。该曲线关联信息可以包括所导出的输出曲线C、用于导出输出曲线C的三个控制点(起点P0、等距离点P1、终点P2)。
以下,对飞行体系统10的操作示例进行说明。
图6是示出通过终端80生成飞行路径RF2时的操作示例的流程图。
首先,终端控制部81获取表示希望无人驾驶航空器100飞行的路径FR1的输入线Q(S11)。图7是示出生成飞行路径FR2的输入线Q的一个示例的图。
终端控制部81以输入线Q的起点P0以及终点P2作为输出曲线C的起点 P0以及终点P2,生成作为二次贝塞尔曲线的输出曲线C(S12)。即,终端控制部81通过曲线拟合,基于输入线Q生成输出曲线C。图8是示出通过初次的曲线拟合所生成的输出曲线C的一个示例的图。
终端控制部81计算出作为输入线Q与输出曲线C之间的差分的误差E。针对误差E,终端控制部81获取作为与误差E进行比较的阈值th2的指定误差Es。终端控制部81判断误差E是否小于指定误差Es(S13)。
图9是示出输入线Q和输出曲线C之间的误差E的一个示例的图。由于误差E是用输入线Q的各点p与输出曲线C的距离的累计值来表示,因此图9中用斜虚线表示的区域便为误差E。
另外,指定误差Es例如可以保存在内存87等中并从内存87等获取,也可以经由操作部83通过用户操作进行输入并获取。指定误差Es可以是固定值,也可以是可变值。指定误差Es成为使输出曲线C的形状以何种程度近似于输入线Q的形状的指标。指定误差Es例如可以根据输出曲线C表示的飞行路径FR2的地理特性(例如实际上由于地形、建筑物而难以飞行的区域、风力)、无人驾驶航空器100的飞行预定特性(例如无人驾驶航空器100的飞行预定速度)来确定。
当误差E小于指定误差Es时,终端控制部81将曲线拟合的拟合结果(例如,飞行路径FR2、各路径点WP、各控制点)保存在内存87等中(S14)。
即,当误差E小于指定误差Es时,终端控制部81可以判断输入线Q的形状与输出曲线C的形状充分近似,使得无人驾驶航空器100能够飞行。例如,作为输出曲线C的二次贝塞尔曲线与实际曲线虽不相同,但在容许范围内。由此,终端控制部81保存输出曲线C、用于生成输出曲线C的各控制点(例如起点P0、等距离点P1、终点P2)的信息。
当误差E大于等于指定误差Es时,终端控制部81计算出输入线Q的各点中的到输出曲线C的距离最远的最远点作为分割点K(S15)。换言之,计算出的多个输入线Q的各点p与输出曲线C之间的距离中的最长距离的输入线Q上的点p是分割点K。图10是示出输出曲线C的分割点K的一个示例的图。
即,当误差E大于等于指定误差Es时,终端控制部81能够判断输入线Q的形状与输出曲线C的形状并不充分近似。在此情况下,终端控制部81改善输出曲线C,以使误差E小于指定误差Es,并使输出曲线C的形状进一步接近 输入线Q的形状。
终端控制部81在分割点K的位置分割输入线Q,生成输入线部分q1、q2作为两条曲线(S16)。图11是示出在分割点K处进行分割而获得的多个输入线部分q1、q2的一个示例的图。
当S16的处理结束时,终端控制部81进入S11的处理。即,终端控制部81针对在S16中生成的输入线部分q(初次为输入线部分q1、q2)进行S11的曲线拟合。
对于第二次以后(即图6的流程图的第二次迭代以后)的曲线拟合,与初次相同,终端控制部81计算出输入线部分q与输出曲线部分c的误差作为误差E2,并与指定误差Es进行比较。当误差E2小于指定误差Es时,保存拟合结果并结束图6的处理(即曲线拟合)。
另一方面,在第二次以后的曲线拟合结果中,当误差E2不小于指定误差Es时,终端控制部81可以再次导出分割点K2作为输入线部分q中的分割点,并进一步分割输入线部分q,对分割后的输入线部分q继续进行曲线拟合。
图12是示出通过输入线部分q1的曲线拟合所生成的输出曲线部分c1的一个示例的图。对于该输出曲线部分c1,误差E2小于指定误差Es。因此,由于输出曲线部分c1不会进一步导出分割点K2并进行曲线拟合,所以该输出曲线部分c1被用作最终的飞行路径FR2的一部分。另外,在图12中,用虚线表示输入线部分q1。
图13是示出通过输入线部分q2的曲线拟合所生成的输出曲线部分c2的一个示例的图。图14是示出输入线部分q2和输出曲线部分c2之间的误差E2的一个示例的图。误差E2的导出方法与初次的误差E的导出方法相同。对于该输出曲线部分c2,作为误差E2的误差E2大于等于指定误差Es。因此,输出曲线部分c2进一步导出分割点K2,进而生成多个输入线部分q2_1、q2_2,对输入线部分q2_1、q2_2继续进行曲线拟合。图15是示出输入线部分q2的分割点K2的一个示例的图。图16是示出在分割点K2处进行分割而获得的多个输入线部分q2_1、q2_2的一个示例的图。另外,在图13至图15中,用虚线表示输入线部分q2。
终端控制部81可以继续进行该曲线拟合,直到对于所生成的所有输出曲线部分c,误差E均小于指定误差Es。另外,关于所生成的输入线部分q的数量, 当误差E小于指定误差Es的输入线部分q的数量的比例大于等于阈值th3时,终端控制部81也可以结束曲线拟合。
针对各输入线部分q,当误差E小于指定误差Es时,终端控制部81连接所生成的各输出曲线部分c,来生成飞行路径FR2。另外,当曲线拟合为一次,且并未从原来的输入线Q生成输入线部分q时,从原来的输入线Q通过曲线拟合得到的输出曲线C为飞行路径FR2。
图17是示出将输出曲线部分c合成而形成的飞行路径FR2以及在各个输出曲线部分c的生成中使用的控制点的图。控制点包括起点P0,等距离点P1、终点P2。在图17中,将输出曲线部分c11~c19连接来生成飞行路径FR2。在图17中,代表性地示出了输出曲线部分c12中的起点P0、等距离点P1、终点P2,但其他的输出曲线部分c的起点P0、等距离点P1、终点P2也同样如此。例如,输出曲线部分c12中的起点P0成为输出曲线部分c11中的终点P2。例如,输出曲线部分c12中的终点P2成为输出曲线部分c13中的起点P0。
无人驾驶航空器100飞行的飞行位置(路径点WP)是输出曲线部分c的起点P0、终点P2。因此,根据从输入线Q导出的输出曲线部分c的数量,路径点WP的数量会发生变化。路径点WP的数量越少,无人驾驶航空器100应通过的飞行位置的数量会越少,飞行效率提高。路径点WP的数量越多,一个输出曲线部分c的长度越短,误差E越小。
这样,终端控制部81(处理部的一个示例)获取表示路径FR1(第一路径的一个示例)的输入线Q。终端控制部81生成以输入线Q的起点为起点P0、以输入线Q的终点为终点P2的输出曲线C。终端控制部81计算出输入线Q与输出曲线C之间的误差E(输入线Q中的各点p与输出曲线C之间的距离的累计值的一个示例)。终端控制部81根据输出曲线C和误差E,生成飞行路径FR2。
由此,即便当路径FR1是具有复杂的曲线形状的路径,无人驾驶航空器100难以准确地沿着路径FR1飞行时,终端80也能够生成与路径FR1的形状近似的、无人驾驶航空器100能够飞行的飞行路径FR2。另外,终端80是参考输入线Q和输出曲线C之间的误差E来生成飞行路径的,因此能够边调整输入线Q和输出曲线C的近似情况边生成飞行路径FR2。在此情况下,与单纯地将一个路径FR1置换为一个二次贝塞尔曲线的情况相比,能够根据需要调整输出曲线C的形状。由此,终端80能够以制约更少的、更自由的形状生成飞行路径FR2。
此外,当误差E小于指定误差Es(第一阈值的一个示例)时,终端控制部81可以生成用输出曲线C表示的飞行路径FR2。
由此,当路径FR1与输出曲线C的形状相近时,终端80能够将输出曲线C的形状用作飞行路径FR2的形状。从而,终端80能够容易地从路径FR1生成飞行路径FR2。
另外,当误差E大于等于指定误差Es时,终端控制部81可以确定输入线Q中的各点p中的距离大于等于阈值th4的分割点K1(第一点的一个示例)。终端控制部81可以在分割点K处分割输入线Q,并在输入线Q中生成输入线部分q1(第一输入线部分的一个示例)和输入线部分q2(第二输入线部分的一个示例)。终端控制部81可以生成以输入线部分q1的起点为起点P0,以输入线部分q1的终点为终点P2的输出曲线C中的输出曲线部分c1(第一输出曲线部分的一个示例)。终端控制部81可以生成以输入线部分q2的起点为起点P0,以输入线部分q2的终点为终点P2的输出曲线C中的输出曲线部分c2(第二输出曲线部分的一个示例)。
由此,当路径FR1与输出曲线C的形状不近似时,终端80能够在分割点K处分割输入线Q。分割点K是距输入线Q的距离相对远的点。另外,分割点K是所分割的输入线部分q1、q2的起点或终点,是对应的输出曲线部分c1的起点P0或终点P2,是路径点。因此,通过基于所分割的输入线部分q1、q2来生成输出曲线部分c1、c2,飞行路径Fr2的形状接近输入线Q的形状,误差E减小。因此,无人驾驶航空器100能够在接近用户所希望的路径FR1的飞行路径FR2上飞行。
另外,终端控制部81可以计算出输入线部分q与输出曲线部分c之间的误差E2(输入线部分q1中的各点p与输出曲线部分c1之间的距离的累计值的一个示例)。终端控制部81可以反复进行输入线部分q的分割以及与分割后的输入线部分q相对应的输出曲线部分c的生成,直到误差E2小于指定误差Es。
由此,终端80能够使飞行路径FR2的形状接近用户所希望的路径FR1的形状,直到误差E2小于指定误差Es。另外,在某种程度上使飞行路径FR2的形状接近用户所希望的路径FR1的形状的阶段,终端80能够通过结束曲线拟合,缩短飞行路径的生成所需的时间。
另外,终端控制部81可以连接所生成的多个输出曲线部分c,来生成飞行 路径FR2。
由此,终端80能够利用分割点K细微地调整输入线Q来生成各输出曲线部分c,并连接各输出曲线部分c来生成飞行路径FR2。
图18是示出通过终端进行曲线拟合时的具体的操作示例的流程图。曲线拟合可以是在图6的S12中进行的曲线拟合。
终端控制部81计算出位于距输入线Q的起点P0和终点P2等距离处的等距离点p1中的、位于起点P0和终点P2的中间的中点(S21)。图19是示出输入线Q的起点P0、作为等距离点P1的中点以及终点P2的一个例子的图。
终端控制部81计算出输入线Q上的各点p与输出曲线C之间的距离的累计值D(S22)。图20是示出输入线Q上的各点p与作为输出曲线C的直线之间的距离的累计值D的一个示例的图。
终端控制部81计算出将连结输入线Q的起点P0和终点P2的线段垂直平分的垂直平分线L1,并将该垂直平分线L1设为V轴。图21是示出沿着垂直平分线L1的V轴的设定示例的图。
对于距离的累计值D,终端控制部81计算出用于导出V轴上的等距离点P1的中点的移动距离d的导函数σ(S23)。终端控制部81计算出等距离点P1(例如图18的流程图的第一周中的中点)处的导函数σ的值。导函数σ的值相当于移动距离d。
终端控制部81判断导函数σ的值是否小于作为阈值th1的指定变动值(S24)。
另外,指定变动值例如可以保存在内存87等中并从内存87等获取,也可以经由操作部83通过用户操作进行输入并获取。指定变动值可以是固定值,也可以是可变值。指定变动值可以参考输出曲线C的计算效率来确定。
即,当导电函数σ的值较小时,相对于V轴上的等距离点P1的移动的距离的累计值D的变化量较小,因此使等距离点P1移动的优点较小,输出曲线C的计算效率比较低。当导电函数σ的值较大时,相对于V轴上的等距离点P1的移动的距离的累计值D的变化量较大,因此使等距离点P1移动的优点较大,输出曲线C的计算效率比较高。可以参考该计算效率来确定指定变动值。
当导函数σ的值小于指定变动值时,终端控制部81将所生成的输出曲线C或输出曲线部分c的曲线相关信息(例如输出曲线C、作为输出曲线C的各控 制点的起点P0、等距离点P1、终点P2)保存在内存87等中(S25)。
当导函数σ的值小于指定变动值时,相对于V轴上的等距离点P1的移动的距离的累计值D的变化量较小,即便是花费时间反复进行V轴上的等距离点P1的移动,终端控制部81也能够判断出输出曲线C的形状几乎不接近输入线Q的形状。因此,终端控制部81保存输出曲线C及用于生成输出曲线C的二次贝塞尔曲线的各控制点的信息。即,等距离点P1不移动,输出曲线C或输出曲线部分c不变形。
当导函数σ的值大于等于指定变动值时,终端控制部81使等距离点P1沿着垂直平分线L1即V轴移动。在此情况下,终端控制部81使等距离点P1移动计算出的导函数σ的值即移动距离d的量。即,输出曲线C或输出曲线部分c变形。
图22是示出在V轴上移动的等距离点P1的移动距离d的一个示例的图。在图22中,等距离点P1从最初的位置(始点P0和终点P2的中点的位置)移动到了点p1’的位置。另外,图23是示出在V轴上移动后的等距离点P1’以及根据等距离点P1变更的输出曲线C’的一例的图。终端控制部81根据起点P0、移动后的等距离点P1’以及终点P2生成输出曲线C’。
这样,当导函数σ的值大于等于指定变动值时,终端控制部81能够判断出相对于V轴上的等距离点P1的移动的距离的累计值D的变化量大,输出曲线C的形状接近输入线Q的形状。在此情况下,终端控制部81可以通过花费时间反复进行V轴上的等距离点P1的移动,使输出曲线C的形状进一步接近输入线Q的形状,改善输出曲线C。
在等距离点P1移动后,进入S22的处理,终端控制部81反复进行曲线拟合的处理。该反复可以持续,直到用于使等距离点P1移动的导函数σ的值小于指定变动值。
这样,终端控制部81可以根据输入线Q的起点P0、输入线Q的终点P2、以及连结与起点P0和终点P2等距离的点的垂直平分线L1上的点即等距离点P1,生成通过输入线Q的起点P0和输入线Q的终点P2、相对于垂直平分线L1对称的输出曲线C。终端控制部81可以基于距离的累计值D,计算出用于在垂直平分线L1上使等距离点P1移动的导函数σ。终端控制部81可以根据函数σ的计算值来确定输出曲线C有无变形。
由此,终端80能够通过计算出导函数σ,参考用于缩小输入线Q与输出曲线C之间的差分(相当于距离的累计值D)的运算效率来确定是否使输出曲线C变形。因此,终端80能够兼顾输入线Q与输出曲线C之间的差分的降低和输出曲线C的变形效率。
另外,当导函数σ的计算值小于指定变动值(第二阈值的一个示例)时,终端控制部81可以使输出曲线C不变。
在此情况下,终端80能够判断出即便使等距离点P1移动,距离的累计值D也几乎不会变化,因此能够省略等距离点P1的移动,省略输出曲线C的变形,并省略多余的曲线拟合所涉及的计算。由此,终端80能够缩短曲线拟合所需的时间,因而能够缩短飞行路径FR2的生成所需的时间。
另外,当导函数σ的计算值大于等于指定变动值时,终端控制部81可以根据导函数σ的计算值,使等距离点P1在垂直平分线L1上移动,并根据输入线Q的起点P0、输入线Q0的终点P2以及所移动的等距离点P1(P1’)使输出曲线C变形。变形的结果,例如成为输出曲线C’。
在此情况下,当使等距离点P1移动时,终端80能够判断出距离的累计值D大幅变化,通过使输出曲线C变形,能够大幅降低输入线Q与输出曲线C之间的差分。
另外,终端控制部81可以反复进行等距离点P1的移动以及输出曲线C的变形,直到导函数σ的计算值在指定变动值以下。
由此,终端80能够使输出曲线的形状最佳化,以使输入线Q与输出曲线C的差分变小(例如最小化)。
另外,输出曲线C可以是以输入线Q的起点P0、输入线Q的终点P2、等距离点P1为控制点的二次贝塞尔曲线。此外,输出曲线部分也同样可以是以输入线部分q的起点P0、输入线部分q的终点P2、等距离点P1为控制点的二次贝塞尔曲线。
由此,终端80能够部分地使用公知的二次贝塞尔曲线,根据输出曲线C、输出曲线部分c很容易地生成飞行路径FR2。
另外,作为输出曲线C例示了二次贝塞尔曲线,但并不限于此。例如,输出曲线C可以是三次或更多的贝塞尔曲线,也可以是贝塞尔曲线以外的曲线。
接着,对无人驾驶航空器100中的飞行路径的设定进行说明。
在终端80中,终端控制部81可以经由通信部85将拟合结果发送到无人驾驶航空器100。在无人驾驶航空器100中,UAV控制部110可以经由通信接口150从终端80获取拟合结果。拟合结果可以包括飞行路径FR2、路径点WP、输出曲线C、作为输出曲线部分c的生成的基础的控制点(起点P0、等距离点P1、终点P2)等。
当UAV控制部110获取到飞行路径FR2的信息时,可以控制无人驾驶航空器100沿飞行路径FR2飞行。由此,无人驾驶航空器100自身不必生成飞行路径FR2,便能够降低无人驾驶航空器100的飞行路径的生成所涉及的处理负荷,使其能够沿制约更少、具有自由形状的飞行路径FR2飞行。
当UAV控制部110在获取输出曲线C、作为输出曲线部分c的生成的基础的控制点的信息时,可以根据控制点的信息生成飞行路径FR2。基于控制点的飞行路径FR2的生成方法可以与终端80生成时相同。UAV控制部110可以控制无人驾驶航空器100沿着所生成的飞行路径FR2飞行。由此,无人驾驶航空器100自身不必导出用于生成飞行路径FR2的控制点,便能够降低无人驾驶航空器100的控制点的导出所涉及的处理负荷,使其能够沿制约更少、具有自由形状的飞行路径FR2飞行。
当UAV控制部110在获取到路径点WP的信息时,可以根据路径点WP的信息生成飞行路径FR2。UAV控制部110可以控制无人驾驶航空器100沿着通过路径点WP的飞行路径FR2飞行。由此,无人驾驶航空器100自身不必导出飞行效率良好的路径点WP,便可以降低路径点WP的导出所涉及的处理负荷,使其沿着制约更少、具有自由形状的飞行路径FR2飞行。
进而,UAV控制部110也可以代替终端80,具有终端80的终端控制部81具有的、与飞行路径的生成有关的功能。在此情况下,上述终端控制部81的各个操作可以由UAV控制部110执行。在此情况下,UAV控制部110可以控制无人驾驶航空器100沿着自身生成的飞行路径FR2飞行。
由此,无人驾驶航空器100能够通过无人驾驶航空器100自身完成从飞行路径FR2的生成到沿着飞行路径FR2的飞行,并能够简化飞行体系统10的系统构成。即,能够省略终端80。
以上使用实施方式对本公开进行了说明,但是本公开的技术范围并不限于上述实施方式所记载的范围。对本领域普通技术人员来说,显然可对上述实施 方式加以各种变更或改良。从权利要求书的记载即可明白,加以了这样的变更或改良的方式都可包含在本公开的技术范围之内。
权利要求书、说明书以及说明书附图中所示的装置、系统、程序和方法中的操作、顺序、步骤、以及阶段等各项处理的执行顺序,只要没有特别明示“在...之前”、“事先”等,且只要前面处理的输出并不用在后面的处理中,即可以以任意顺序实现。关于权利要求书、说明书以及附图中的操作流程,为方便起见而使用“首先”、“接着”等进行了说明,但并不意味着必须按照这样的顺序实施。
在上述实施方式中,作为移动体,虽然示出了无人驾驶航空器,但本公开并不限于此,也能够应用于搭载有相机的无人驾驶汽车、搭载有相机的自行车、人在移动的同时把持的带相机的万向节装置等。

Claims (32)

  1. 一种生成用于飞行体飞行的飞行路径的信息处理装置,其特征在于,其包含处理部,所述处理部用于:
    获取表示第一路径的输入线;
    生成以所述输入线的起点为起点、以所述输入线的终点为终点的输出曲线;
    计算出所述输入线中的各点与所述输出曲线之间的距离的累计值;以及
    根据所述输出曲线和所述距离的累计值,生成所述飞行路径。
  2. 根据权利要求1所述的信息处理装置,其特征在于,当所述累计值小于第一阈值时,所述处理部生成所述输出曲线所示的所述飞行路径。
  3. 根据权利要求1所述的信息处理装置,其特征在于:
    当所述累计值大于等于第一阈值时,所述处理部确定所述输入线中的各点中所述距离最长的第一点;
    在所述第一点处分割所述输入线,在所述输入线上生成第一输入线部分和第二输入线部分;
    生成以所述第一输入线部分的起点为起点,以所述第一输入线部分的终点为终点的、所述输出曲线中的第一输出曲线部分;
    生成以所述第二输入线部分的起点为起点,以所述第二输入线部分的终点为终点的、所述输出曲线中的第二输出曲线部分。
  4. 根据权利要求3所述的信息处理装置,其特征在于:
    所述处理部计算出所述第一输入线部分中的各点与所述第一输出曲线部分之间的距离的累计值;
    反复进行所述第一输入线部分的分割以及与所分割的所述第一输入线部分相对应的输出曲线部分的生成,直到所述距离的累计值小于所述第一阈值。
  5. 根据权利要求4所述的信息处理装置,其特征在于:
    所述处理部连接所生成的多个输出曲线部分来生成所述飞行路径。
  6. 根据权利要求1至5中任一项所述的信息处理装置,其特征在于,所述处理部用于:
    根据所述输入线的起点、所述输入线的终点、以及连结与所述起点及所述终点等距离的点的垂直平分线上的点即等距离点,来生成通过所述输入线的起 点和所述输入线的终点、相对于所述垂直平分线对称的所述输出曲线;
    基于所述距离的累计值,计算出用于使所述等距离点在所述垂直平分线上移动的导函数,以及
    根据所述导函数的计算值,确定所述输出曲线有无变形。
  7. 根据权利要求6所述的信息处理装置,其特征在于:
    在所述导函数的计算值小于第二阈值时,所述处理部使所述输出曲线不变。
  8. 根据权利要求6所述的信息处理装置,其特征在于,当所述导函数的计算值大于等于第二阈值时,所述处理部用于:
    根据所述导函数的计算值,使所述等距离点在所述垂直平分线上移动;以及
    根据所述输入线的起点、所述输入线的终点、以及所移动的所述等距离点,使所述输出曲线变形。
  9. 根据权利要求8所述的信息处理装置,其特征在于,所述处理部反复进行所述等距离点的移动以及所述输出曲线的变形,直到所述导函数的计算值小于所述第二阈值。
  10. 根据权利要求6至9中任一项所述的信息处理装置,其特征在于,所述输出曲线是以所述输入线的起点、所述输入线的终点、所述等距离点作为控制点的二次贝塞尔曲线。
  11. 根据权利要求1至10中任一项所述的信息处理装置,其特征在于,还包含通信部,
    所述处理部经由通信部将所述飞行路径的信息发送到所述飞行体。
  12. 根据权利要求6至10中任一项所述的信息处理装置,其特征在于,还包含通信部,
    所述处理部经由所述通信部将所述输入线的起点、所述输入线的终点以及所述等距离点的信息发送到所述飞行体。
  13. 根据权利要求1至10中任一项所述的信息处理装置,其特征在于,还包含显示部,
    所述处理部经由所述显示部显示所述飞行路径的信息。
  14. 根据权利要求6至10中任一项所述的信息处理装置,其特征在于,还包含显示部,
    所述处理部经由所述显示部显示所述输入线的起点、所述输入线的终点以及所述等距离点的信息。
  15. 根据权利要求1至10中任一项所述的信息处理装置,其特征在于:
    所述信息处理装置是所述飞行体,
    所述处理部按照所述飞行路径来控制所述飞行体的飞行。
  16. 一种生成用于飞行体飞行的飞行路径的信息处理装置中的飞行路径生成方法,其特征在于,具有:
    获取表示第一路径的输入线的步骤;
    生成以所述输入线的起点为起点、以所述输入线的终点为终点的输出曲线的步骤;
    计算出所述输入线中的各点与所述输出曲线之间的距离的累计值的步骤;以及
    根据所述输出曲线和所述距离的累计值,生成所述飞行路径的步骤。
  17. 根据权利要求16所述的飞行路径生成方法,其特征在于,生成所述飞行路径的步骤包括当所述累计值小于第一阈值时,生成所述输出曲线所示的所述飞行路径的步骤。
  18. 根据权利要求16所述的飞行路径生成方法,其特征在于,生成所述飞行路径的步骤包括:
    当所述累计值大于等于第一阈值时,确定所述输入线中的各点中所述距离最长的第一点的步骤;
    在所述第一点处分割所述输入线,在所述输入线上生成第一输入线部分和第二输入线部分的步骤;
    生成以所述第一输入线部分的起点为起点,以所述第一输入线部分的终点为终点的、所述输出曲线中的第一输出曲线部分的步骤;以及
    生成以所述第二输入线部分的起点为起点,以所述第二输入线部分的终点为终点的、所述输出曲线中的第二输出曲线部分的步骤。
  19. 根据权利要求18所述的飞行路径生成方法,其特征在于:计算出所述距离的累计值的步骤包括计算出所述第一输入线部分中的各点与所述第一输出曲线部分之间的距离的累计值的步骤;
    生成所述飞行路径的步骤包括反复进行所述第一输入线部分的分割以及与 所分割的所述第一输入线部分相对应的输出曲线部分的生成,直到所述距离的累计值小于所述第一阈值的步骤。
  20. 根据权利要求19所述的飞行路径生成方法,其特征在于,生成所述飞行路径的步骤包括连接所生成的多个输出曲线部分来生成所述飞行路径的步骤。
  21. 根据权利要求16至20中任一项所述的飞行路径生成方法,其特征在于,生成所述输出曲线的步骤包括:
    根据所述输入线的起点、所述输入线的终点、以及连结与所述起点及所述终点等距离的点的垂直平分线上的点即等距离点,来生成通过所述输入线的起点和所述输入线的终点、相对于所述垂直平分线对称的所述输出曲线的步骤;
    基于所述距离的累计值,计算出用于使所述等距离点在所述垂直平分线上移动的导函数的步骤;以及
    根据所述导函数的计算值,确定所述输出曲线有无变形的步骤。
  22. 根据权利要求21所述的飞行路径生成方法,其特征在于,生成所述输出曲线的步骤包括当所述导函数的计算值小于第二阈值时,使所述输出曲线不变的步骤。
  23. 根据权利要求21所述的飞行路径生成方法,其特征在于,生成所述输出曲线的步骤包括:当所述导函数的计算值大于等于第二阈值时,根据所述导函数的计算值,使所述等距离点在所述垂直平分线上移动的步骤;以及
    根据所述输入线的起点、所述输入线的终点、所移动的所述等距离点,使所述输出曲线变形的步骤。
  24. 根据权利要求23所述的飞行路径生成方法,其特征在于,生成所述输出曲线的步骤包括反复进行所述等距离点的移动以及所述输出曲线的变形,直到所述导函数的计算值小于所述第二阈值的步骤。
  25. 根据权利要求21至24中任一项所述的飞行路径生成方法,其特征在于,所述输出曲线是以所述输入线的起点、所述输入线的终点、以及所述等距离点作为控制点的二次贝塞尔曲线。
  26. 根据权利要求16至25中任一项所述的飞行路径生成方法,其特征在于,还包括将所述飞行路径的信息发送到所述飞行体的步骤。
  27. 根据权利要求21至25中任一项所述的飞行路径生成方法,其特征在 于,还包括将所述输入线的起点、所述输入线的终点以及所述等距离点的信息发送到所述飞行体的步骤。
  28. 根据权利要求16至25中任一项所述的飞行路径生成方法,其特征在于,还包括显示所述飞行路径的信息的步骤。
  29. 根据权利要求21至25中任一项所述的飞行路径生成方法,其特征在于,还包括显示所述输入线的起点、所述输入线的终点以及所述等距离点的信息的步骤。
  30. 根据权利要求16至25中任一项所述的飞行路径生成方法,其特征在于,所述信息处理装置是所述飞行体,所述飞行路径生成方法还包括按照所述飞行路径控制所述飞行体的飞行的步骤。
  31. 一种程序,其特征在于,其用于使生成用于飞行体飞行的飞行路径的信息处理装置执行以下步骤:
    获取表示第一路径的输入线;
    生成以所述输入线的起点为起点、以所述输入线的终点为终点的输出曲线;
    计算出所述输入线中的各点与所述输出曲线之间的距离的累计值;以及
    根据所述输出曲线和所述距离的累计值,生成所述飞行路径。
  32. 一种记录介质,其特征在于,其是计算机可读记录介质并记录有用于使生成用于飞行体飞行的飞行路径的信息处理装置执行以下步骤的程序:
    获取表示第一路径的输入线;
    生成以所述输入线的起点为起点、以所述输入线的终点为终点的输出曲线;
    计算出所述输入线中的各点与所述输出曲线之间的距离的累计值;以及
    根据所述输出曲线和所述距离的累计值,生成所述飞行路径。
PCT/CN2019/093764 2018-06-29 2019-06-28 信息处理装置、飞行路径生成方法、程序以及记录介质 WO2020001629A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201980005106.1A CN111226093A (zh) 2018-06-29 2019-06-28 信息处理装置、飞行路径生成方法、程序以及记录介质

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-125369 2018-06-29
JP2018125369A JP2020003428A (ja) 2018-06-29 2018-06-29 情報処理装置、飛行経路生成方法、プログラム、及び記録媒体

Publications (1)

Publication Number Publication Date
WO2020001629A1 true WO2020001629A1 (zh) 2020-01-02

Family

ID=68985275

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/093764 WO2020001629A1 (zh) 2018-06-29 2019-06-28 信息处理装置、飞行路径生成方法、程序以及记录介质

Country Status (3)

Country Link
JP (1) JP2020003428A (zh)
CN (1) CN111226093A (zh)
WO (1) WO2020001629A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2022070851A1 (zh) * 2020-09-30 2022-04-07

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106483975A (zh) * 2016-10-26 2017-03-08 广州极飞科技有限公司 确定无人机航线的方法及装置
WO2017096548A1 (en) * 2015-12-09 2017-06-15 SZ DJI Technology Co., Ltd. Systems and methods for auto-return
CN107038899A (zh) * 2017-03-29 2017-08-11 北京小米移动软件有限公司 一种进行飞行的方法和装置
CN107085437A (zh) * 2017-03-20 2017-08-22 浙江工业大学 一种基于eb‑rrt的无人机航迹规划方法
CN107479570A (zh) * 2017-07-05 2017-12-15 南宁学院 一种可调螺旋翼姿态的无人机自动飞行控制方法
CN107990897A (zh) * 2016-10-26 2018-05-04 杭州海康机器人技术有限公司 一种航线数据确定方法及装置

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5877381B2 (ja) * 2010-06-30 2016-03-08 パナソニックIpマネジメント株式会社 曲線分割装置、曲線分割方法、曲線分割プログラム及び集積回路
JP5892779B2 (ja) * 2010-12-14 2016-03-23 インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation 実際の位置特定データを提供するためのシステム
CN102129710A (zh) * 2010-12-30 2011-07-20 北京像素软件科技股份有限公司 一种飞行路径模拟方法及系统
JP2016169944A (ja) * 2015-03-11 2016-09-23 三菱電機株式会社 ヘリ位置表示システム
KR20180051996A (ko) * 2016-11-09 2018-05-17 삼성전자주식회사 무인 비행 장치 및 이를 이용한 피사체 촬영 방법

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017096548A1 (en) * 2015-12-09 2017-06-15 SZ DJI Technology Co., Ltd. Systems and methods for auto-return
CN106483975A (zh) * 2016-10-26 2017-03-08 广州极飞科技有限公司 确定无人机航线的方法及装置
CN107990897A (zh) * 2016-10-26 2018-05-04 杭州海康机器人技术有限公司 一种航线数据确定方法及装置
CN107085437A (zh) * 2017-03-20 2017-08-22 浙江工业大学 一种基于eb‑rrt的无人机航迹规划方法
CN107038899A (zh) * 2017-03-29 2017-08-11 北京小米移动软件有限公司 一种进行飞行的方法和装置
CN107479570A (zh) * 2017-07-05 2017-12-15 南宁学院 一种可调螺旋翼姿态的无人机自动飞行控制方法

Also Published As

Publication number Publication date
JP2020003428A (ja) 2020-01-09
CN111226093A (zh) 2020-06-02

Similar Documents

Publication Publication Date Title
JP6803919B2 (ja) 飛行経路生成方法、飛行経路生成システム、飛行体、プログラム、及び記録媒体
JP6765512B2 (ja) 飛行経路生成方法、情報処理装置、飛行経路生成システム、プログラム及び記録媒体
JP6803800B2 (ja) 情報処理装置、空撮経路生成方法、空撮経路生成システム、プログラム、及び記録媒体
US20200320886A1 (en) Information processing device, flight control instruction method, program and recording medium
JP6962775B2 (ja) 情報処理装置、空撮経路生成方法、プログラム、及び記録媒体
JP6878194B2 (ja) モバイルプラットフォーム、情報出力方法、プログラム、及び記録媒体
US20210185235A1 (en) Information processing device, imaging control method, program and recording medium
US20210208608A1 (en) Control method, control apparatus, control terminal for unmanned aerial vehicle
JPWO2018198281A1 (ja) 情報処理装置、空撮経路生成方法、空撮経路生成システム、プログラム、及び記録媒体
WO2021203940A1 (zh) 显示控制方法、显示控制装置、程序以及记录介质
CN111344650B (zh) 信息处理装置、飞行路径生成方法、程序以及记录介质
US20200217665A1 (en) Mobile platform, image capture path generation method, program, and recording medium
JP2018201119A (ja) モバイルプラットフォーム、飛行体、支持装置、携帯端末、撮像補助方法、プログラム、及び記録媒体
JP2019028560A (ja) モバイルプラットフォーム、画像合成方法、プログラム、及び記録媒体
JP7501535B2 (ja) 情報処理装置、情報処理方法、情報処理プログラム
WO2020001629A1 (zh) 信息处理装置、飞行路径生成方法、程序以及记录介质
US20210092306A1 (en) Movable body, image generation method, program, and recording medium
JP2019082837A (ja) 情報処理装置、飛行制御指示方法、プログラム、及び記録媒体
WO2020119572A1 (zh) 形状推断装置、形状推断方法、程序以及记录介质
WO2021016867A1 (zh) 终端设备及其数据处理方法、无人机及其控制方法
JP2021048559A (ja) 制御装置、制御方法、プログラム、及び記録媒体
WO2020108290A1 (zh) 图像生成装置、图像生成方法、程序以及记录介质
WO2020088397A1 (zh) 位置推定装置、位置推定方法、程序以及记录介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19825557

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19825557

Country of ref document: EP

Kind code of ref document: A1