WO2018020659A1 - Moving body, moving body control method, moving body control system, and moving body control program

Info

Publication number
WO2018020659A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
position information
moving body
positioning process
obtained based
Prior art date
Application number
PCT/JP2016/072314
Other languages
English (en)
Japanese (ja)
Inventor
幸良 笹尾
祐司 和田
Original Assignee
SZ DJI Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co., Ltd.
Priority to JP2017519597A (granted as JP6289750B1)
Priority to PCT/JP2016/072314 (WO2018020659A1)
Publication of WO2018020659A1
Priority to US16/256,776 (granted as US11029707B2)

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 - Simultaneous control of position or course in three dimensions
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots, associated with a remote control arrangement
    • G05D1/0044 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots, associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G5/00 - Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047 - Navigation or guidance aids for a single aircraft
    • G08G5/006 - Navigation or guidance aids for a single aircraft in accordance with predefined flight zones, e.g. to avoid prohibited zones
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G5/00 - Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047 - Navigation or guidance aids for a single aircraft
    • G08G5/0069 - Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft

Definitions

  • The present invention relates to moving body technology.
  • Non-Patent Document 1 discloses a positioning technique using GPS (Global Positioning System), which is a kind of GNSS, together with a gyro sensor.
  • The following two methods are conceivable for designating the route on which the UAV flies or the range in which flight is permitted.
  • The first method is to designate a flight route or range on map data.
  • The second method is to fly the UAV by manual operation and then use the route indicated by the flight log as the route for subsequent flights or as the boundary of the flight range.
  • With the first method, the UAV flight route or range may not be specified correctly.
  • With the second method, since the UAV flight for obtaining the flight log is performed manually, it is difficult to fly the target route with high accuracy.
  • An object of the present invention is to provide a technique for more accurately setting a route or a range in which a moving body moves.
  • The moving body includes: a first acquisition unit that acquires first position information obtained based on a positioning process of a terminal located at a first point; a second acquisition unit that acquires second position information obtained based on a positioning process of the moving body located at the first point; a calculation unit that calculates difference information between the position indicated by the first position information and the position indicated by the second position information; a third acquisition unit that acquires third position information obtained based on a positioning process of the terminal after movement; and a setting unit that sets information on the moving range of the moving body based on the third position information and the difference information.
  • The positioning processes of the terminal and the moving body are performed at the same point, and the first position information and the second position information are acquired. Difference information between these positions is then calculated.
  • The third position information is acquired by moving the terminal along the boundary of the movement range to be set.
  • The moving range of the moving body is set based on the third position information and the difference information. Thereby, the range in which the moving body is permitted to move can be set with high precision.
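The offset-correction idea above can be sketched as follows. This is an illustrative simplification, not the patent's implementation: positions are treated as flat (latitude, longitude) pairs and all function names are invented for the example.

```python
def difference_info(terminal_fix, uav_fix):
    """Difference between two (lat, lon) fixes taken at the same point."""
    return (uav_fix[0] - terminal_fix[0], uav_fix[1] - terminal_fix[1])

def correct_fix(terminal_fix, diff):
    """Shift a later terminal fix into the moving body's positioning frame."""
    return (terminal_fix[0] + diff[0], terminal_fix[1] + diff[1])

# First point: terminal and moving body measure side by side (values made up).
diff = difference_info((35.00010, 139.00020), (35.00000, 139.00000))

# A later terminal fix along the boundary, corrected with the stored difference.
corrected = correct_fix((35.00110, 139.00120), diff)
```

In practice the difference information could equally be stored as latitude/longitude components or as a vector, as the later passages on difference information suggest.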
  • the moving body may include a control unit that controls the moving body so as not to exceed the moving range.
  • the setting unit can set the information on the movement range according to the position information calculated based on the third position information and the difference information.
  • The third position information includes information on a plurality of positions, and the setting unit can set information on the movement range according to the plurality of positions calculated based on that information and the difference information.
  • the setting unit can set the boundary information of the movement range by connecting a plurality of positions.
  • the setting unit can set the information on the movement range so that the movement range is defined inside the area surrounded by connecting a plurality of positions.
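As one hedged sketch of how a movement range "defined inside the area surrounded by connecting a plurality of positions" could be enforced, a standard ray-casting point-in-polygon test can decide whether a commanded position stays inside the fence. The patent does not specify this method; the names and coordinates below are invented for the example.

```python
def inside_fence(point, fence):
    """Return True if (x, y) lies inside the polygon given by `fence`."""
    x, y = point
    inside = False
    n = len(fence)
    for i in range(n):
        x1, y1 = fence[i]
        x2, y2 = fence[(i + 1) % n]
        # Count crossings of a horizontal ray cast from the point.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A square fence built by connecting four recorded positions in order.
fence = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]
```

A control unit could reject or clamp any commanded position for which this check returns False.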
  • The positioning process method of the terminal may be different from the positioning process method of the moving body.
  • the terminal positioning method can include autonomous navigation.
  • the terminal positioning method can include positioning by GNSS.
  • the terminal positioning processing method can include positioning by a signal received from a base station or a wireless router.
  • the difference information can include latitude and longitude information.
  • the difference information can include vector information.
  • the terminal is a mobile phone, a smartphone, a tablet terminal, a laptop, or a navigation device.
  • the positioning process of the moving body located at the first point is executed according to the user's instruction.
  • the third location information includes location information obtained by a positioning process performed at a predetermined time in the terminal.
  • The moving body includes: a first acquisition unit that acquires first position information obtained based on a positioning process of a terminal located at a first point; a second acquisition unit that acquires second position information obtained based on a positioning process of the moving body located at the first point; a calculation unit that calculates difference information between the position indicated by the first position information and the position indicated by the second position information; a third acquisition unit that acquires third position information obtained based on a positioning process of the terminal after movement; and a setting unit that sets route information of the moving body based on the third position information and the difference information.
  • The positioning processes of the terminal and the moving body are performed at the same point, and the first position information and the second position information are acquired. Difference information between these positions is then calculated.
  • The third position information is acquired by moving the terminal along the route to be set for the moving body.
  • The route of the moving body is set based on the third position information and the difference information. Thereby, the route on which the moving body moves can be set with high precision.
  • a control unit that controls the moving body to move on the route can be provided.
  • the setting unit can set route information according to the position information calculated based on the third position information and the difference information.
  • The third position information includes information on a plurality of positions, and the setting unit can set route information according to the plurality of positions calculated based on that information and the difference information.
  • the setting unit can set a route by connecting a plurality of positions.
  • The information processing apparatus includes a setting unit that acquires difference information between the position indicated by first position information obtained based on a positioning process of a terminal located at a first point and the position indicated by second position information obtained based on a positioning process of a moving body located at the first point, acquires third position information obtained based on a positioning process of the terminal after movement, and sets information on the moving range of the moving body based on the third position information and the difference information.
  • The information processing apparatus in another mode includes a setting unit that acquires difference information between the position indicated by first position information obtained based on a positioning process of a terminal located at a first point and the position indicated by second position information obtained based on a positioning process of a moving body located at the first point, acquires third position information obtained based on a positioning process of the terminal after movement, and sets information on the route of the moving body based on the third position information and the difference information.
  • A moving body control method implemented in a moving body includes: a step of acquiring first position information obtained based on a positioning process of a terminal located at a first point; a step of acquiring second position information obtained based on a positioning process of the moving body located at the first point; a step of calculating difference information between the position indicated by the first position information and the position indicated by the second position information; a step of acquiring third position information obtained based on a positioning process of the terminal after movement; and a step of setting information on the moving range of the moving body based on the third position information and the difference information.
  • the moving body control method includes a step of controlling so that the moving body does not exceed the moving range.
  • the setting includes setting the information on the movement range according to the position information calculated based on the third position information and the difference information.
  • The third position information includes information on a plurality of positions, and the setting includes setting movement range information according to the plurality of positions calculated based on that information and the difference information.
  • Setting includes setting boundary information of the movement range by connecting a plurality of positions.
  • Setting includes setting movement range information so that the movement range is defined inside an area surrounded by connecting a plurality of positions.
  • A moving body control method implemented in a moving body includes: a step of acquiring first position information obtained based on a positioning process of a terminal located at a first point; a step of acquiring second position information obtained based on a positioning process of the moving body located at the first point; a step of calculating difference information between the position indicated by the first position information and the position indicated by the second position information; a step of acquiring third position information obtained based on a positioning process of the terminal after movement; and a step of setting route information of the moving body based on the third position information and the difference information.
  • The moving body control method includes a step of controlling the moving body so as not to deviate from the route.
  • Setting includes setting route information according to the position information calculated based on the third position information and the difference information.
  • The third position information includes information on a plurality of positions, and the setting includes setting route information according to the plurality of positions calculated based on that information and the difference information.
  • Setting includes setting a route by connecting a plurality of positions.
  • An information processing method implemented in a computer including a control unit includes: a step in which the control unit acquires difference information between the position indicated by first position information obtained based on a positioning process of a terminal located at a first point and the position indicated by second position information obtained based on a positioning process of a moving body located at the first point; a step of acquiring third position information obtained based on a positioning process of the terminal after movement; and a step of setting information on the moving range of the moving body based on the third position information and the difference information.
  • An information processing method implemented in a computer including a control unit includes: a step in which the control unit acquires difference information between the position indicated by first position information obtained based on a positioning process of a terminal located at a first point and the position indicated by second position information obtained based on a positioning process of a moving body located at the first point; a step of acquiring third position information obtained based on a positioning process of the terminal after movement; and a step of setting route information of the moving body based on the third position information and the difference information.
  • The moving body control system in one mode is a moving body control system including a moving body and a terminal, wherein the moving body includes: a first acquisition unit that acquires first position information obtained based on a positioning process by the terminal located at a first point; a second acquisition unit that acquires second position information obtained based on a positioning process by the moving body located at the first point; a calculation unit that calculates difference information between the position indicated by the first position information and the position indicated by the second position information; a third acquisition unit that acquires third position information obtained based on a positioning process of the terminal after movement; and a setting unit that sets information on the moving range of the moving body based on the third position information and the difference information.
  • The moving body control system in another mode is a moving body control system including a moving body and a terminal, wherein the moving body includes:
  • a first acquisition unit that acquires first position information obtained based on a positioning process by the terminal located at a first point;
  • a second acquisition unit that acquires second position information obtained based on a positioning process by the moving body located at the first point;
  • a calculation unit that calculates difference information between the position indicated by the first position information and the position indicated by the second position information;
  • a third acquisition unit that acquires third position information obtained based on a positioning process of the terminal after movement; and
  • a setting unit that sets route information of the moving body based on the third position information and the difference information.
  • The moving body control program causes a computer to execute: a step of acquiring first position information obtained based on a positioning process of a terminal located at a first point; a step of acquiring second position information obtained based on a positioning process of the moving body located at the first point; a step of calculating difference information between the position indicated by the first position information and the position indicated by the second position information; a step of acquiring third position information obtained based on a positioning process of the terminal after movement; and a step of setting information on the moving range of the moving body based on the third position information and the difference information.
  • The moving body control program in another mode causes a computer to execute: a step of acquiring first position information obtained based on a positioning process of a terminal located at a first point; a step of acquiring second position information obtained based on a positioning process of the moving body located at the first point; a step of calculating difference information between the position indicated by the first position information and the position indicated by the second position information; a step of acquiring third position information obtained based on a positioning process of the terminal after movement; and a step of setting route information of the moving body based on the third position information and the difference information.
  • The moving body control program causes a computer to execute: a step of acquiring difference information between the position indicated by first position information obtained based on a positioning process of a terminal located at a first point and the position indicated by second position information obtained based on a positioning process of the moving body located at the first point; a step of acquiring third position information obtained based on a positioning process of the terminal after movement; and a step of setting information on the moving range of the moving body based on the third position information and the difference information.
  • The moving body control program in another mode causes a computer to execute: a step of acquiring difference information between the position indicated by first position information obtained based on a positioning process of a terminal located at a first point and the position indicated by second position information obtained based on a positioning process of the moving body located at the first point; a step of acquiring third position information obtained based on a positioning process of the terminal after movement; and a step of setting route information of the moving body based on the third position information and the difference information.
  • The unmanned aircraft control system 100 includes a UAV (Unmanned Aerial Vehicle) 101 and a controller 301.
  • The UAV 101 can perform flight by a user's manual operation through the controller 301, automatic flight along a preset route, and the like.
  • the UAV 101 includes a UAV main body 110, a plurality of rotor blades 120, a gimbal 130, an imaging device 140, a camera 150, and the like.
  • the flight of the UAV 101 can be controlled according to a virtual fence set via the controller 301 or the like.
  • the virtual fence defines a boundary of a range where the flight (movement) of the UAV 101 is permitted (hereinafter referred to as “flight range”).
  • When the virtual fence is set, the flight of the UAV 101 is controlled so as not to cross it.
  • The virtual fence is used, for example, when the UAV 101 is allowed to fly only over the land of a specific area while spraying pesticide. In this case, the boundary between the specific area and the outside is set as a virtual fence, and even if the user sends an instruction via the controller 301 to move outside that land, the UAV 101 is controlled so as not to move out of the specific area.
  • The plurality of rotor blades 120 generate the lift and propulsion of the UAV 101 by their rotation. By controlling the rotation of the plurality of rotor blades 120, the flight of the UAV main body 110 is controlled.
  • the UAV 101 has four rotor blades 120.
  • the number of rotor blades 120 is not limited to four, and can be set to an arbitrary number.
  • The UAV 101 may instead be a fixed-wing UAV without the rotor blades 120.
  • The UAV 101 may also have both rotor blades 120 and fixed wings.
  • the gimbal 130 supports the imaging device 140 so that the posture of the imaging device 140 can be changed.
  • the gimbal 130 adjusts the posture of the imaging device 140 by rotating the imaging device 140 around the yaw axis, the pitch axis, and the roll axis.
  • the imaging device 140 includes a lens device, and generates and records image data of an optical image of a subject formed through the lens device.
  • the image data generated by the imaging device 140 includes still image data and moving image data.
  • The plurality of cameras 150 are sensing cameras for controlling the flight of the UAV 101. As shown in FIG. 1, for example, two cameras 150 can be provided on the front surface, which is the nose of the UAV main body 110.
  • Two more cameras 150 may be provided on the bottom surface of the UAV main body 110. By using the parallax of the images captured by a pair of cameras 150, the distance to an object existing around the UAV main body 110 can be obtained.
  • the pair of cameras 150 can be provided on at least one of the nose, the tail, the side surface, the bottom surface, and the ceiling surface.
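The distance-from-parallax step relies on the usual stereo relation: two cameras separated by baseline B, with focal length f expressed in pixels, see an object at pixel disparity d when it lies at distance Z = f * B / d. A minimal sketch with made-up numbers (the patent does not give the cameras' parameters):

```python
def stereo_distance(focal_px, baseline_m, disparity_px):
    """Distance to an object from the disparity between a stereo camera pair."""
    if disparity_px <= 0:
        raise ValueError("object at infinity or invalid match")
    return focal_px * baseline_m / disparity_px

# e.g. f = 700 px, baseline 0.1 m, disparity 20 px -> about 3.5 m
distance = stereo_distance(700.0, 0.1, 20.0)
```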
  • the controller 301 is a remote controller (terminal device) that operates the UAV 101.
  • As the terminal device, various types are used, such as a type that can be carried by the user or a type that can be mounted on a carrier (such as an automobile).
  • Specific examples of the terminal device include a mobile phone, a smartphone, a tablet terminal, a laptop, a navigation device, or a dedicated controller device.
  • the controller 301 communicates with the UAV 101.
  • the controller 301 transmits a signal to the UAV 101 and controls various operations including the flight of the UAV 101.
  • the controller 301 can also receive signals including various types of information from the UAV 101.
  • the controller 301 includes a display unit 320, an operation unit 330, and a main body 340.
  • The display unit 320 is a user interface that displays images and information on the results of processing by the controller 301.
  • the display unit 320 includes arbitrary display means including a liquid crystal display.
  • the operation unit 330 is a user interface for accepting user operation instructions.
  • the operation unit 330 includes, for example, a button and a joystick-shaped operation member.
  • A touch panel that serves as both the operation unit 330 and the display unit 320 may be adopted.
  • the main body 340 includes a control unit 310, a storage unit 350, a communication unit 360, and a sensor 370 as main components.
  • the control unit 310 controls the operation of each component included in the controller 301 and controls the execution of various processes.
  • the control unit 310 includes, for example, a CPU (Central Processing Unit) and a memory.
  • The control unit 310 loads programs stored in the storage unit 350 into the memory and executes them, realizing various functions by controlling the operation of each component included in the controller 301. The functions realized by the control unit 310 will be described later.
  • the storage unit 350 stores various programs necessary for execution of processing in the controller 301 and information on processing results.
  • the storage unit 350 includes a storage medium such as a semiconductor memory.
  • the communication unit 360 is a communication interface for communicating with an external device. For example, the communication unit 360 outputs a control signal for controlling the operation of the UAV 101. The communication unit 360 can also receive a signal from a GNSS satellite used for the positioning process of the UAV 101.
  • Sensor 370 includes, for example, a gyro sensor, an acceleration sensor, a geomagnetic sensor, and an image sensor.
  • the sensor 370 can detect, for example, the inclination of the controller 301, the direction in which a predetermined part of the controller 301 faces, and whether or not the controller 301 is moving.
  • The control unit 310 includes a positioning processing unit 311, a position recording unit 312, a transmission / reception unit 313, and a database 314 as its main functional elements. These functions are realized, for example, when the control unit 310 loads a program stored in the storage unit 350 into memory, executes it, and controls the operation of each component included in the controller 301. At least a part of these functions can be realized by circuits or other hardware instead of the program.
  • the control unit 310 can realize other functions included in a general remote controller or a terminal device, but description thereof is omitted here for convenience.
  • the database 314 stores various information such as information necessary for processing in the controller 301 and information generated by the processing.
  • the positioning processing unit 311 identifies the position of the controller 301 by positioning processing. Any method can be adopted for the positioning process. For example, the positioning processing unit 311 identifies the position of the controller 301 by executing a positioning process based on a signal from a GNSS satellite received via the communication unit 360. The positioning processing unit 311 can also perform a positioning process based on sensing information from the sensor 370 (that is, by autonomous navigation) and specify the position of the controller 301. The positioning processing unit 311 can also execute a positioning process based on a signal received from a communication device such as a communication router or a base station existing in the vicinity.
  • the positioning processing unit 311 outputs information on the position of the controller 301 specified by the positioning process and information on timing (for example, date and time) when the position is specified as positioning information.
  • the positioning process by the positioning processing unit 311 may be executed, for example, at a preset timing (for example, every second) or based on a user instruction.
  • the position recording unit 312 stores the positioning information output by the positioning processing unit 311 in the database 314.
  • the position information included in the positioning information may be specified by latitude and longitude, or may be specified by other coordinate information.
  • The position recording unit 312 may store the positioning information in the database 314 every time the positioning process is executed by the positioning processing unit 311, or may store only the positioning information acquired by a part of the executed positioning processes.
  • For example, the position recording unit 312 can store positioning information in the database 314 every time a predetermined time elapses (for example, every 10 seconds) or every time the controller 301 moves a predetermined distance (for example, every 10 m).
  • The position recording unit 312 can also store the positioning information in the database 314 when a user instruction is received.
  • the transmission / reception unit 313 transmits / receives information and signals to / from an external device.
  • the transmission / reception unit 313 transmits the positioning information output by the positioning processing unit 311 or the positioning information stored in the database 314 to the UAV 101.
  • This flow includes processing for acquiring position information (third position information described later) of the controller 301 used for setting the flight range (virtual fence) or flight path of the UAV 101.
  • the third position information is information indicating the movement trajectory of the controller 301.
  • The third position information is obtained, for example, by the controller 301 performing the positioning process while the user, holding the controller 301, moves along the virtual fence or flight path to be set.
  • This flow is realized by, for example, the control unit 310 executing a predetermined program stored in the storage unit 350 and controlling each element of the controller 301.
  • In step S101, the control unit 310 receives an instruction to start recording positioning information via the operation unit 330. Thereafter, the process proceeds to step S102.
  • In step S102, the control unit 310 causes the controller 301 to perform the positioning process.
  • The details of the positioning process are as described above. Thereafter, the process proceeds to step S103.
  • In step S103, the control unit 310 stores the positioning information acquired by the positioning process in step S102 in the storage unit 350.
  • The positioning information includes, for example, position information of the controller 301 and positioning time information. Thereafter, the process proceeds to step S104.
  • In step S104, the control unit 310 determines whether an instruction to end the recording of positioning information has been received via the operation unit 330. If such an end instruction has been received (Yes), the process proceeds to step S105. If not (No), the process returns to step S102.
  • This positioning process may be performed every predetermined time (for example, every 10 seconds or every 1 second), or may be performed every time an instruction is received from the user.
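The recording flow of steps S101 through S105 can be sketched schematically as follows. The data source and stop condition are stand-ins for the positioning processing unit 311 and the end instruction from the operation unit 330; the fixed-count stop is purely for illustration.

```python
def record_trajectory(position_source, stop_after):
    """Collect fixes from position_source until stop_after fixes are stored."""
    fixes = []                               # S103: storage for positioning info
    while True:
        fixes.append(next(position_source))  # S102: run one positioning process
        if len(fixes) >= stop_after:         # S104: end instruction received?
            break
    return fixes                             # S105: log transmitted to the UAV

# Three made-up (lat, lon) fixes playing the role of the positioning unit.
log = record_trajectory(
    iter([(35.000, 139.000), (35.001, 139.000), (35.001, 139.001)]), 3)
```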
  • the figure shows an example of a screen displayed on the display unit 320 of the controller 301.
  • On this screen, a map 601, a recording start button 602, and a recording end button 603 are displayed.
  • an icon 604 and an icon 605 related to the positioning position are displayed.
  • When the recording start button 602 is pressed, the control unit 310 receives the instruction to start recording positioning information in step S101.
  • Once recording starts, positioning processing is performed by the controller 301, and in step S103 an icon 604 is displayed at the position on the map 601 corresponding to each recorded positioning position.
  • As the controller 301 moves, its positioning position also moves, and icons are displayed sequentially at the corresponding positions on the map 601.
  • An icon 605 is displayed at the current position of the controller 301 on the map 601.
  • The positioning information continues to be recorded until the user presses the recording end button 603, that is, until a recording end instruction is accepted in step S104.
  • In this example, the icon 604 indicating a recorded positioning position is round, and the icon 605 indicating the current position is human-shaped.
  • the shape of the icon is not limited to these, and an arbitrary shape can be adopted.
  • FIG. 6 shows an example of the map displayed on the display unit 320 when an instruction to end the recording of positioning information is received in step S104 of FIG. 4. Icons are displayed at the respective positions on the map 701 corresponding to the plurality of positioning positions recorded from the start to the end of recording.
  • An icon 702 indicates the position (first point) where the recording of positioning information was started.
  • An icon 703 indicates the position where the recording of positioning information was finished. Such a screen is displayed, for example, when the user walks clockwise around the white area in the center of FIG. 6 while holding the controller 301.
  • The positioning information stored in the storage unit 350 as described above indicates the trajectory (movement log) along which the controller 301 was moved, and is referred to as movement trajectory information in the following description.
  • The movement trajectory information includes information on one or a plurality of positions.
  • In step S105, the control unit 310 transmits the positioning information stored in the storage unit 350 in step S103, that is, the movement trajectory information, to the UAV 101.
  • Thereafter, the process illustrated in FIG. 4 ends.
  • The movement trajectory information is used to specify the flight range and the flight path in the processing described later.
  • For example, the flight range or the flight path is specified by performing the offset correction described later on the positions indicated by the movement trajectory information.
  • Next, a first modification will be described with reference to FIGS.
  • In this modification, approximation processing is performed on the plurality of measured positions.
  • The plurality of positions obtained by the positioning process are distributed discretely.
  • The control unit 310 performs approximation processing on the movement trajectory indicated by the plurality of positioning positions shown in FIG. 6 and converts the movement trajectory into a shape such as the polygon 802 shown in the figure.
  • The shape obtained by this approximation process may be a polygon, a circle, or another shape. Any method can be adopted for executing the approximation process.
  • An example of such an approximation process is disclosed in Non-Patent Document 2.
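As one concrete possibility for such an approximation (the patent leaves the method open), the discretely distributed positions could be approximated by the polygon of their convex hull, for example with Andrew's monotone chain algorithm. This is an illustrative sketch, not the method of Non-Patent Document 2:

```python
def convex_hull(points):
    """Approximate a discrete set of (lat, lng) positions by the polygon
    of their convex hull (Andrew's monotone chain algorithm)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a counter-clockwise turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]  # outer-periphery vertices, counter-clockwise
```

The returned vertex list corresponds to the outer periphery that would be transmitted as movement trajectory information in step S105.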
  • In step S105 in FIG. 4, the control unit 310 transmits information on the positions of the outer periphery of the approximated shape, such as a polygon, to the UAV 101 as movement trajectory information.
  • Note that the approximation process may be performed by the UAV 101 instead of the controller 301.
  • Alternatively, the control unit 310 may record positioning information at timings according to the user's instructions and set, as the movement trajectory, a shape having vertices at the positions indicated by the recorded positioning information.
  • In the figure, an icon 903 and an icon 904 indicate positions indicated by the positioning information recorded at timings according to the user's instructions.
  • An icon 905 indicates the current location of the user.
  • Further, the control unit 310 may change the positions indicated by movement trajectory information that has already been set, according to a user instruction.
  • FIG. 9 shows the map 801 and the polygon 802 described above. Icons indicated by "+" marks are displayed between the vertices of the polygon 802 indicating the movement trajectory. For example, when the user taps the icon 803 indicated by a "+" mark, new icons 805 and 806 are displayed on both sides of the movement trajectory as shown in the figure, and an icon 804 is newly displayed at the position where the icon 803 was displayed.
  • The user can change the shape of the movement trajectory by dragging the icon 804.
  • The user can also change the shape of the movement trajectory by dragging another vertex.
  • The information on the positions of the movement trajectory whose shape has been changed is transmitted from the controller 301 to the UAV 101 as movement trajectory information.
  • The UAV main body includes a UAV control unit 210, a memory 220, a communication interface 230, and a rotary blade mechanism 240 as main components.
  • The UAV control unit 210 controls the operation of various elements included in the UAV 101 and controls the execution of various processes.
  • The UAV control unit 210 includes, for example, a CPU and a memory.
  • The UAV control unit 210 loads and executes a program stored in the memory 220 and controls various elements included in the UAV 101, thereby realizing various functions. Functions realized by the UAV control unit 210 will be described later. Further, the processing by the UAV control unit 210 is controlled according to commands received from an external device via the communication interface 230.
  • The memory 220 may store a program for controlling the entire UAV.
  • The memory 220 may also store various types of log information of the UAV 101 and various data and information, such as image data captured by the imaging device 140 and the camera 150.
  • A computer-readable storage medium can be used as the memory 220.
  • For example, SRAM, DRAM, EEPROM, flash memory, or a USB memory can be used.
  • The memory 220 may be removable from the UAV 101.
  • The communication interface 230 is an interface for communicating with the outside.
  • For example, the communication interface 230 can receive instructions from the remote controller terminal by wireless communication and can transmit various data and information stored in the memory of the UAV 101.
  • The communication interface 230 can also receive signals from GNSS satellites.
  • The rotary blade mechanism 240 is a mechanism for rotating the plurality of rotary blades 120.
  • The rotary blade mechanism 240 includes the plurality of rotary blades 120 and a plurality of drive motors.
  • The UAV 101 may have various sensors, such as a barometer, a laser sensor, an acceleration sensor, and a gyro sensor.
  • The UAV 101 can also include other devices and mechanisms.
  • The UAV control unit 210 includes a flight control unit 211, a transmission / reception unit 212, a positioning processing unit 213, a difference calculation unit 214, a flight control information setting unit 215, and a database 216 as main functional configurations. These functions are realized by, for example, the UAV control unit 210 loading and executing a program stored in the memory 220 and controlling the operation of each component included in the UAV 101. At least some of these functions can also be realized by various circuits and hardware without executing a program.
  • The UAV control unit 210 can realize various other functions provided in a general unmanned aerial vehicle, but their description is omitted here for convenience.
  • The database 216 stores various information, such as information necessary for the processing executed in the UAV 101 and information generated by that processing.
  • The flight control unit 211 controls the flight of the UAV 101 by controlling the operation of the rotary blade mechanism 240 and the like based on signals received from outside the UAV 101 and information stored in the database 216.
  • For example, the flight control unit 211 can control the flight of the UAV 101 based on the flight range or route information.
  • The transmission / reception unit 212 transmits and receives information and signals to and from external devices.
  • For example, the transmission / reception unit 212 receives the positioning information (position information) of the controller 301 transmitted from the controller 301 and signals for controlling the flight. That is, the transmission / reception unit 212 functions as a position information acquisition unit.
  • The transmission / reception unit 212 can also transmit image data captured by the imaging device 140 and the camera 150 to the outside.
  • The position information acquired by the transmission / reception unit 212 includes position information (first position information) obtained based on the positioning process of the controller 301 located at a certain point (first point).
  • The position information acquired by the transmission / reception unit 212 also includes position information (third position information), such as the above-described movement trajectory information, obtained based on the positioning process of the controller 301 after movement.
  • The positioning processing unit 213 specifies the position (for example, the latitude and longitude) of the UAV 101 by positioning processing.
  • The positioning processing unit 213 can perform any type of positioning processing.
  • For example, the positioning processing unit 213 identifies the position of the UAV 101 by executing positioning processing based on signals from GNSS satellites received via the transmission / reception unit 212.
  • The positioning processing unit 213 can also specify the altitude of the UAV 101 from the barometer measurement results of the UAV 101. That is, the positioning processing unit 213 functions as an acquisition unit that acquires the position information (second position information) of the UAV 101.
  • The positioning processing by the positioning processing unit 213 is performed at a preset timing (for example, every 10 seconds) or according to a user instruction.
  • The difference calculation unit 214 calculates or specifies difference information between the position indicated by the position information (first position information) of the controller 301 acquired by the transmission / reception unit 212 and the position indicated by the position information (second position information) of the UAV 101 acquired by the positioning processing unit 213.
  • In other words, the difference calculation unit 214 can specify the offset of the position of the UAV 101 with respect to the position of the controller 301.
  • The difference information may include information on the distance between the positions and the orientation of their positional relationship.
  • The difference information may include latitude and longitude differences between the positions.
  • The difference information can also include vector information (information including distance and direction) indicating the relationship between the two positions.
  • The difference calculation unit 214 stores the calculated difference information in the database 216.
  • More specifically, the difference calculation unit 214 calculates or specifies the difference information between the positions acquired by the positioning processing of the controller 301 and the UAV 101 when both are located at substantially the same point (first point).
  • Because their positioning methods can differ, the above-described offset occurs between the measured positions even though both are located at the same point.
  • For example, the UAV 101 may perform positioning processing using signals from GNSS satellites while the controller 301 performs positioning using autonomous navigation; in such a case, an offset occurs.
  • The movement trajectory information of the controller 301 described above is log information obtained by positioning the controller 301 while it moves along the actual route, so the positioning positions are shown with high accuracy. By adding the above-described offset to such movement trajectory information, the position information for when the UAV 101 moves along the route can be calculated with high accuracy.
  • In step S401, the difference calculation unit 214 acquires the position information (first position information) of the controller 301 received (acquired) by the transmission / reception unit 212.
  • The first position information is position information acquired by the positioning process of the controller 301 when the controller 301 is placed at the first point. Thereafter, the process proceeds to step S402.
  • In step S402, the difference calculation unit 214 acquires the position information (second position information) of the UAV 101 acquired by the positioning processing unit 213.
  • The second position information is position information acquired by the positioning process of the UAV 101 when the UAV 101 is placed at the first point. Thereafter, the process proceeds to step S403.
  • In step S403, the difference calculation unit 214 acquires difference information between the position indicated by the first position information acquired in step S401 and the position indicated by the second position information acquired in step S402. Thereafter, the process proceeds to step S404.
  • Here, let the latitude and longitude of the position indicated by the first position information be MOBILE[lat] and MOBILE[lng], and let the latitude and longitude of the position indicated by the second position information be UAV[lat] and UAV[lng].
  • The difference in latitude (OFFSET[lat]) and the difference in longitude (OFFSET[lng]) between the position indicated by the first position information and the position indicated by the second position information are then as follows:
  • OFFSET[lat] = UAV[lat] - MOBILE[lat]
  • OFFSET[lng] = UAV[lng] - MOBILE[lng]
  • In step S404, the difference calculation unit 214 stores the difference information calculated in step S403 in the database 216.
  • For example, the difference calculation unit 214 stores the latitude difference OFFSET[lat] and the longitude difference OFFSET[lng] in the database 216 as difference information. Thereafter, the process shown in FIG. 14 ends.
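As a minimal sketch of the difference computation in steps S401–S404 (function and variable names are illustrative, not from the patent), assuming positions are (latitude, longitude) tuples:

```python
def compute_offset(mobile, uav):
    """Difference information between the controller's measured position
    (first position information) and the UAV's measured position (second
    position information), both taken at the same first point."""
    offset_lat = uav[0] - mobile[0]  # OFFSET[lat] = UAV[lat] - MOBILE[lat]
    offset_lng = uav[1] - mobile[1]  # OFFSET[lng] = UAV[lng] - MOBILE[lng]
    return offset_lat, offset_lng
```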
  • The flight control information setting unit 215 sets information on the flight range or route of the UAV 101 based on the difference information calculated by the difference calculation unit 214 and the movement trajectory information (third position information) acquired by the transmission / reception unit 212.
  • That is, the flight control information setting unit 215 can set the flight range or route information according to position information calculated based on the movement trajectory information of the controller 301 and the difference information.
  • The flight range or route information may be indicated by latitude and longitude, or by other coordinate information for indicating a position.
  • For example, the latitude (UAV'[lat]) and longitude (UAV'[lng]) of a position calculated from the latitude (MOBILE'[lat]) and longitude (MOBILE'[lng]) indicated by the movement trajectory information are as follows:
  • UAV'[lat] = MOBILE'[lat] + OFFSET[lat]
  • UAV'[lng] = MOBILE'[lng] + OFFSET[lng]
  • In the following description, the process of calculating a new position from a position indicated by the movement trajectory information by adding the difference information (offset), as in the above formulas, is referred to as offset correction.
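A sketch of this offset correction applied to a whole trajectory (names are illustrative placeholders):

```python
def offset_correct(trajectory, offset):
    """Apply offset correction: shift each (lat, lng) position of the
    movement trajectory by the difference information to obtain the
    corresponding UAV'[lat], UAV'[lng] positions."""
    off_lat, off_lng = offset
    return [(lat + off_lat, lng + off_lng) for lat, lng in trajectory]
```

Connecting the returned positions yields the flight-range boundary or route.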
  • The flight control information setting unit 215 can set information on the boundary of the flight range of the UAV 101 by connecting a plurality of positions indicated by UAV'[lat] and UAV'[lng]. Alternatively, the flight control information setting unit 215 can set the route information of the UAV 101 by connecting a plurality of such positions.
  • The flight control information setting unit 215 can also set the flight range or route of the UAV 101 inside the area surrounded by the line formed by connecting the positions indicated by UAV'[lat] and UAV'[lng].
  • For example, the boundary of the flight range or the route can be set at positions a predetermined distance (for example, 2 m) from the line connecting the positions indicated by UAV'[lat] and UAV'[lng].
  • The flight control information setting unit 215 can also set, as the flight range or route, the positions of the outer periphery of a polygon output by performing approximation processing on the positions indicated by UAV'[lat] and UAV'[lng].
  • The approximation process may output an arbitrary shape, such as a circle, an ellipse, or an irregular shape, and the flight control information setting unit 215 can set the positions of that shape as the flight range or route.
  • Alternatively, the flight control information setting unit 215 may output a polygon (or an arbitrary shape such as a circle, an ellipse, or an irregular shape) by performing approximation processing on the positions indicated by UAV'[lat] and UAV'[lng], and set the flight range (virtual fence) or route of the UAV 101 inside the area of that shape.
  • The flight control information setting unit 215 stores the information on the set flight range or route in the database 216.
  • The flight control unit 211 controls the flight of the UAV 101 based on the flight range or route information and the position information acquired from the positioning processing unit 213.
  • For example, the flight control unit 211 controls the flight of the UAV 101 so that the position of the UAV 101 indicated by the position information acquired from the positioning processing unit 213 does not go beyond the set flight range.
  • Alternatively, the flight control unit 211 controls the flight of the UAV 101 so that the position of the UAV 101 indicated by the position information acquired from the positioning processing unit 213 stays on the set route.
  • As described above, in the present embodiment, the controller 301 and the UAV 101 are arranged at the same point, positioning processing is performed by each, and the difference information between the resulting positions, that is, the offset, is calculated.
  • The user then moves the controller 301 along the virtual fence or route to be set (for example, by walking while holding the controller 301), and movement trajectory information is obtained.
  • This movement trajectory information is log information measured while the controller 301 is moved along the virtual fence or route to be set, so the positioning positions are shown with high accuracy. By adding the offset to such movement trajectory information, a virtual fence or route can be set with high accuracy.
  • In this way, position information is acquired while moving the controller 301, and a virtual fence or route is set based on that position information. Therefore, the problem of virtual fence inaccuracy caused by the difficulty of UAV flight operation, which arises when a virtual fence or the like is set based on the flight log of the UAV 101, does not occur.
  • The processing by the difference calculation unit 214 and the flight control information setting unit 215 implemented in the UAV 101 described above may instead be performed in an external information processing apparatus such as a server apparatus.
  • In that case, the first position information, the second position information, and the third position information necessary for the processing are transmitted from the UAV 101 and the controller 301 to the information processing apparatus.
  • The information processing apparatus can include a difference information acquisition unit, a third position information acquisition unit, and a setting unit that sets the flight range or route information based on the acquired difference information and third position information.
  • The flight range or route information set as a result of the processing by the information processing apparatus is transmitted to the UAV 101.
  • The UAV 101 then controls its flight based on the received flight range or route information.
  • In step S501, the controller 301 executes positioning processing by the positioning processing unit 311 according to a user instruction and acquires the position information (first position information) of the controller 301.
  • In step S502, the controller 301 transmits the first position information acquired in step S501 to the UAV 101.
  • In step S503, the UAV 101 executes positioning processing by the positioning processing unit 213 according to a user instruction and acquires the position information (second position information) of the UAV 101.
  • In step S504, the UAV 101 specifies, as difference information, the difference between the position indicated by the first position information acquired from the controller 301 and the position indicated by the second position information acquired by the positioning process of the UAV 101.
  • The UAV 101 stores the specified difference information in the database 216.
  • In step S505, the controller 301 moves away from the first point by being carried by the user or by a carrier (for example, an automobile).
  • For example, the user moves along the virtual fence or route to be set by walking or the like while carrying the controller 301.
  • While moving, the controller 301 performs positioning processing at preset intervals (for example, every 10 seconds) or in response to user instructions until an end instruction is received, and acquires position information (third position information).
  • Alternatively, the route may be set using a map displayed on the display unit 320. In this case, the third position information corresponds to this route.
  • In step S506, the controller 301 transmits the third position information acquired in step S505 to the UAV 101.
  • In step S507, the UAV 101 generates the flight range or route information based on the offset, that is, the difference information specified in step S504, and the third position information acquired from the controller 301 in step S506.
  • The method of generating the flight range or route information is as described above.
  • In step S508, after receiving a flight instruction, the UAV 101 controls its flight based on the flight range or route information generated in step S507.
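Putting steps S501–S507 together, the setup phase can be summarized in one function (all names are illustrative placeholders for the exchange described above):

```python
def setup_flight_range(controller_fix, uav_fix, trajectory):
    """Combine the steps above: compute the offset from the two positioning
    results at the first point (S501-S504), then offset-correct the
    controller's movement trajectory (S505-S507) to obtain the positions
    defining the UAV's flight range (virtual fence) or route."""
    off_lat = uav_fix[0] - controller_fix[0]
    off_lng = uav_fix[1] - controller_fix[1]
    return [(lat + off_lat, lng + off_lng) for lat, lng in trajectory]
```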
  • The present invention is not limited to the above-described embodiment and can be implemented with various other modifications without departing from the gist of the present invention.
  • The above-described embodiment is merely an example in all respects, is not to be interpreted restrictively, and various modifications can be employed.
  • For example, the UAV 101 in the unmanned aircraft control system 100 of the above embodiment may be replaced with another aircraft.
  • For example, the UAV 101 can be replaced with a manned aircraft.
  • In that case, the flight range of the manned aircraft and its route during automatic flight can be set by the processing described above.
  • The UAV 101 in the unmanned aerial vehicle control system 100 may also be replaced with any other arbitrary moving body.
  • Examples of such a moving body include other aircraft that move in the air, vehicles that move on the ground, and ships that move on the water.
  • For such a moving body, the movement range and the movement route can be set by the same processing as the setting of the flight range indicating the movement range of the UAV 101 and the flight path indicating its movement route described above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • General Engineering & Computer Science (AREA)
  • Navigation (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A moving body is provided with a first acquisition unit, a second acquisition unit, a calculation unit, and a setting unit. The first acquisition unit acquires first position information obtained based on the positioning process of a terminal located at a first point. The second acquisition unit acquires second position information obtained based on the positioning process of the moving body located at the first point. The calculation unit calculates difference information relating to the difference between the position indicated by the first position information and the position indicated by the second position information. A third acquisition unit acquires third position information obtained based on the positioning process of the terminal after movement. The setting unit sets information relating to a movement range of the moving body based on the third position information and the difference information.
PCT/JP2016/072314 2016-07-29 2016-07-29 Corps mobile, procédé de commande de corps mobile, système de commande de corps mobile et programme de commande de corps mobile WO2018020659A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2017519597A JP6289750B1 (ja) 2016-07-29 2016-07-29 移動体、移動体制御方法、移動体制御システム、及び移動体制御プログラム
PCT/JP2016/072314 WO2018020659A1 (fr) 2016-07-29 2016-07-29 Corps mobile, procédé de commande de corps mobile, système de commande de corps mobile et programme de commande de corps mobile
US16/256,776 US11029707B2 (en) 2016-07-29 2019-01-24 Moving object, moving object control method, moving object control system, and moving object control program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/072314 WO2018020659A1 (fr) 2016-07-29 2016-07-29 Corps mobile, procédé de commande de corps mobile, système de commande de corps mobile et programme de commande de corps mobile

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/256,776 Continuation US11029707B2 (en) 2016-07-29 2019-01-24 Moving object, moving object control method, moving object control system, and moving object control program

Publications (1)

Publication Number Publication Date
WO2018020659A1 true WO2018020659A1 (fr) 2018-02-01

Family

ID=61017142

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/072314 WO2018020659A1 (fr) 2016-07-29 2016-07-29 Corps mobile, procédé de commande de corps mobile, système de commande de corps mobile et programme de commande de corps mobile

Country Status (3)

Country Link
US (1) US11029707B2 (fr)
JP (1) JP6289750B1 (fr)
WO (1) WO2018020659A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2020189702A1 (fr) * 2019-03-19 2020-09-24
JPWO2020203126A1 (fr) * 2019-04-02 2020-10-08
CN112863219A (zh) * 2020-12-30 2021-05-28 深圳酷派技术有限公司 位置更新方法、装置、存储介质及电子设备
JPWO2021166175A1 (fr) * 2020-02-20 2021-08-26
WO2021220409A1 (fr) * 2020-04-28 2021-11-04 株式会社ナイルワークス Système d'édition de zone, dispositif d'interface utilisateur et procédé d'édition de zone de travail
JP7570710B2 (ja) 2020-04-28 2024-10-22 株式会社ナイルワークス エリア編集システム、作業エリアの編集方法

Families Citing this family (5)

Publication number Priority date Publication date Assignee Title
US11435656B1 (en) * 2018-02-27 2022-09-06 Snap Inc. System and method for image projection mapping
KR102480259B1 (ko) * 2018-05-10 2022-12-22 베이징 시아오미 모바일 소프트웨어 컴퍼니 리미티드 무인 항공기의 경로 정보를 획득 및 전송하는 방법
JP7094365B2 (ja) * 2018-07-18 2022-07-01 良平 上瀧 空中権管理システム
CN113498498B (zh) 2019-03-06 2024-04-19 索尼集团公司 行动控制设备和行动控制方法、以及程序
WO2022094961A1 (fr) * 2020-11-06 2022-05-12 深圳市大疆创新科技有限公司 Procédé et appareil de commande de robot à commande non humaine et robot à commande non humaine

Citations (2)

Publication number Priority date Publication date Assignee Title
JP2010160735A (ja) * 2009-01-09 2010-07-22 Toyota Motor Corp 移動ロボット、走行計画マップ生成方法、管理システム
JP2014040231A (ja) * 2012-07-13 2014-03-06 Honeywell Internatl Inc 自主的な空間飛行計画および仮想空間抑制システム

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
WO2016130994A1 (fr) * 2015-02-13 2016-08-18 Unmanned Innovation, Inc. Système de planification de vol télécommandé pour véhicule aérien sans pilote
US10762795B2 (en) * 2016-02-08 2020-09-01 Skydio, Inc. Unmanned aerial vehicle privacy controls
US9592912B1 (en) * 2016-03-08 2017-03-14 Unmanned Innovation, Inc. Ground control point assignment and determination system

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
JP2010160735A (ja) * 2009-01-09 2010-07-22 Toyota Motor Corp 移動ロボット、走行計画マップ生成方法、管理システム
JP2014040231A (ja) * 2012-07-13 2014-03-06 Honeywell Internatl Inc 自主的な空間飛行計画および仮想空間抑制システム

Cited By (10)

Publication number Priority date Publication date Assignee Title
JPWO2020189702A1 (fr) * 2019-03-19 2020-09-24
JPWO2020203126A1 (fr) * 2019-04-02 2020-10-08
WO2020203126A1 (fr) * 2019-04-02 2020-10-08 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme
CN113631477A (zh) * 2019-04-02 2021-11-09 索尼集团公司 信息处理装置、信息处理方法和程序
JP7452533B2 (ja) 2019-04-02 2024-03-19 ソニーグループ株式会社 情報処理装置、情報処理方法及びプログラム
JPWO2021166175A1 (fr) * 2020-02-20 2021-08-26
JP7412037B2 (ja) 2020-02-20 2024-01-12 株式会社ナイルワークス ドローンシステム、操作器および作業エリアの定義方法
WO2021220409A1 (fr) * 2020-04-28 2021-11-04 株式会社ナイルワークス Système d'édition de zone, dispositif d'interface utilisateur et procédé d'édition de zone de travail
JP7570710B2 (ja) 2020-04-28 2024-10-22 株式会社ナイルワークス エリア編集システム、作業エリアの編集方法
CN112863219A (zh) * 2020-12-30 2021-05-28 深圳酷派技术有限公司 位置更新方法、装置、存储介质及电子设备

Also Published As

Publication number Publication date
JP6289750B1 (ja) 2018-03-07
JPWO2018020659A1 (ja) 2018-07-26
US20190171238A1 (en) 2019-06-06
US11029707B2 (en) 2021-06-08

Similar Documents

Publication Publication Date Title
JP6289750B1 (ja) 移動体、移動体制御方法、移動体制御システム、及び移動体制御プログラム
US11377211B2 (en) Flight path generation method, flight path generation system, flight vehicle, program, and storage medium
US10648809B2 (en) Adaptive compass calibration based on local field conditions
JP6878567B2 (ja) 3次元形状推定方法、飛行体、モバイルプラットフォーム、プログラム及び記録媒体
JP6430073B2 (ja) 姿勢推定装置、姿勢推定方法及び観測システム
JP6899846B2 (ja) 飛行経路表示方法、モバイルプラットフォーム、飛行システム、記録媒体及びプログラム
WO2018218536A1 (fr) Procédé de commande de vol, appareil et terminal de commande et procédé de commande s'y rapportant et véhicule aérien sans pilote
JPWO2018193574A1 (ja) 飛行経路生成方法、情報処理装置、飛行経路生成システム、プログラム及び記録媒体
WO2019230604A1 (fr) Système d'inspection
US20210229810A1 (en) Information processing device, flight control method, and flight control system
WO2021168819A1 (fr) Procédé et dispositif de commande de retour d'un véhicule aérien sans pilote
US20210208608A1 (en) Control method, control apparatus, control terminal for unmanned aerial vehicle
JP2020170213A (ja) ドローン作業支援システム及びドローン作業支援方法
JP2019032234A (ja) 表示装置
US20210185235A1 (en) Information processing device, imaging control method, program and recording medium
CN112313599B (zh) 控制方法、装置和存储介质
JP2019082837A (ja) 情報処理装置、飛行制御指示方法、プログラム、及び記録媒体
JP7114191B2 (ja) 無人航空機システム
CN110892353A (zh) 控制方法、控制装置、无人飞行器的控制终端
WO2023187891A1 (fr) Système de détermination et procédé de détermination
JP6974290B2 (ja) 位置推定装置、位置推定方法、プログラム、及び記録媒体

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2017519597

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16910562

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC , EPO FORM 1205A 15.05.2019.

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC , EPO FORM 1205A DATED 15.05.2019.

122 Ep: pct application non-entry in european phase

Ref document number: 16910562

Country of ref document: EP

Kind code of ref document: A1