WO2021251441A1 - Procédé, système et programme (Method, System, and Program) - Google Patents

Procédé, système et programme (Method, System, and Program)

Info

Publication number
WO2021251441A1
Authority
WO
WIPO (PCT)
Prior art keywords
restricted area
map
area
flight restricted
flight
Prior art date
Application number
PCT/JP2021/021987
Other languages
English (en)
Japanese (ja)
Inventor
剛史 中村
Original Assignee
株式会社Clue
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Clue
Priority to JP2022530612A (JPWO2021251441A1/ja)
Publication of WO2021251441A1 (fr)

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C13/00: Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C13/02: Initiating means
    • B64C13/16: Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C13/18: Initiating means actuated automatically, e.g. responsive to gust detectors using automatic pilot
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/123: Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams
    • G08G1/127: Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams to a central station; Indicators in a central station
    • G08G1/13: Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams to a central station, the indicator being in the form of a map
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G5/00: Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00: Maps; Plans; Charts; Diagrams, e.g. route diagram
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04Q: SELECTING
    • H04Q9/00: Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00: Type of UAV
    • B64U10/10: Rotorcrafts

Definitions

  • This disclosure relates to methods, systems and programs.
  • Patent Document 1 discloses a technique for operating an unmanned flying object using a touch panel.
  • This disclosure has been made in view of this background, and its purpose is to provide methods, systems, and programs that make it easy to set flight restricted areas for an aerial vehicle in the field.
  • According to one aspect of this disclosure, a method is provided that includes: displaying a map on a display unit; acquiring information related to a designated area on the map, input based on an operation on the map displayed on the display unit; and outputting information related to a flight restricted area of an aerial vehicle in real space based on the designated area on the map.
  • According to another aspect, a system is provided that includes: a display control unit that displays a map on a display unit; an input information acquisition unit that acquires information related to a designated area on the map, input based on an operation on the map displayed on the display unit; and an output control unit that outputs information related to a flight restricted area of an aerial vehicle in real space based on the designated area on the map.
  • According to yet another aspect, a program is provided that causes a computer to function as: a display control unit that displays a map on a display unit; an input information acquisition unit that acquires information related to a designated area on the map, input based on an operation on the map displayed on the display unit; and an output control unit that outputs information related to a flight restricted area of an aerial vehicle in real space based on the designated area on the map.
  • FIG. 1 is a diagram showing an outline of a system 1 according to an embodiment of the present disclosure.
  • the system 1 includes an information processing terminal 10 and an unmanned flying object 20.
  • the system 1 according to the present embodiment can be used, for example, for photographing an area FRA1 including a building S1 or the like.
  • the user U who uses the information processing terminal 10 operates the touch panel of the information processing terminal 10 to control the flight of the unmanned flying object 20.
  • the touch panel displays an image including the above area taken by the camera 28 mounted on the unmanned flying object 20.
  • information on the flight restricted area of the unmanned flying object 20 can be superimposed on the image displayed on the touch panel. With such a display, the user U can grasp at a glance where the unmanned flying object 20 can and cannot fly.
  • the information processing terminal 10 is implemented as a so-called tablet-type compact computer.
  • the information processing terminal 10 may be realized by a portable information processing terminal such as a smartphone or a game machine, or may be realized by a stationary information processing terminal such as a personal computer.
  • the information processing terminal 10 may also be realized by a plurality of hardware devices, with its functions distributed among them.
  • FIG. 2 is a block diagram showing the configuration of the information processing terminal 10 according to the present embodiment.
  • the information processing terminal 10 includes a control unit 11 and a touch panel 12 which is an example of a display unit.
  • the processor 11a is an arithmetic unit that controls the operation of the control unit 11, controls the transmission and reception of data between each element, and performs processing necessary for program execution.
  • the processor 11a is, for example, a CPU (Central Processing Unit), and executes each process by executing a program stored in the storage 11c and expanded in the memory 11b, which will be described later.
  • the memory 11b includes a main storage device composed of a volatile storage device such as a DRAM (Dynamic Random Access Memory), and an auxiliary storage device composed of a non-volatile storage device such as a flash memory or an HDD (Hard Disk Drive). The memory 11b is used as a work area of the processor 11a, and also stores the BIOS (Basic Input/Output System) executed when the control unit 11 starts, various setting information, and the like.
  • the storage 11c stores programs and information used for various processes. For example, when a user operates an aerial vehicle via the information processing terminal 10 to image an area including the building S1 or the like, a program for controlling the flight of the aerial vehicle may be stored in the storage 11c.
  • the transmission/reception unit 11d connects the control unit 11 to a network such as the Internet, and may include a short-range communication interface such as Bluetooth (registered trademark) or BLE (Bluetooth Low Energy).
  • the input / output unit 11e is an interface to which input / output devices are connected, and in the present embodiment, the touch panel 12 is connected.
  • the bus 11f transmits, for example, an address signal, a data signal, and various control signals between the connected processor 11a, memory 11b, storage 11c, transmission / reception unit 11d, and input / output unit 11e.
  • the touch panel 12 is an example of a display unit, and includes a display surface on which acquired images and information are displayed.
  • this display surface receives input through contact with the display surface, and can be implemented by various techniques such as a resistive film method or a capacitance method.
  • an image captured by the unmanned flying object 20 can be displayed on the display surface of the touch panel 12.
  • Information about the flight restricted area of the unmanned flying object 20 may be superimposed on the image captured by the unmanned flying object 20 and displayed on the display surface.
  • In addition, buttons, objects, and the like for flight control of the unmanned flying object 20 and for control of the imaging device may be displayed.
  • via the touch panel 12, the user can provide input to the images, buttons, and the like displayed on the display surface.
  • although the touch panel 12 is described as an example of the display unit in the present embodiment, the present technology is not limited to such an example.
  • the display unit may be realized by another display device such as a display, a monitor, or a smartphone.
  • the input device for acquiring the input information may be realized by a method other than the touch panel 12.
  • the input device may be realized by various input devices such as a mouse, a keyboard, a voice recognition device, and a line-of-sight recognition device instead of the touch panel 12.
  • the information processing device 10 may be realized independently of at least one of the display device such as the touch panel 12 and the input device.
  • the information processing apparatus 10 may be realized by one or a plurality of servers such as a cloud server and may be provided so as to be able to communicate with the touch panel 12.
  • FIG. 3 is a block diagram showing an example of the functional configuration of the unmanned aircraft 20 according to the present embodiment.
  • the unmanned vehicle 20 includes a transmission / reception unit 22, a flight controller 23, a battery 24, an ESC 25, a motor 26, a propeller 27, and a camera 28 in the main body 21.
  • the unmanned flying object 20 is an example of a flying object.
  • the type of the flying object is not particularly limited, and may be, for example, a so-called multi-rotor type drone as shown in FIG.
  • the flight controller 23 can have one or more processors 23A such as a programmable processor (eg, central processing unit (CPU)).
  • the flight controller 23 has a memory 23B and can access the memory 23B.
  • Memory 23B stores logic, code, and / or program instructions that the flight controller can execute to perform one or more steps.
  • the memory 23B may include, for example, a separable medium such as an SD card or a random access memory (RAM) or an external storage device.
  • the data acquired from the sensors 23C may be directly transmitted and stored in the memory 23B.
  • still image / moving image data taken by the camera 28 is recorded in the built-in memory or the external memory.
  • the flight controller 23 includes a control module configured to control the state of the flying object.
  • the control module may adjust the spatial placement, velocity, and/or acceleration of the aerial vehicle with six degrees of freedom (translational motion x, y, and z, and rotational motion θx, θy, and θz).
  • the propulsion mechanism (motor 26, etc.) of the air vehicle is controlled via the ESC (Electric Speed Controller) 25.
  • the control module can control one or more of the camera 28, the sensors 23C, and the like.
  • the flight controller 23 can communicate with the transmission/reception unit 22, which is configured to transmit and/or receive data from one or more external devices (e.g., a terminal such as the information processing terminal 10, a display device, or another remote controller).
  • the transmission/reception unit 22 can use one or more of a local area network (LAN), a wide area network (WAN), infrared, radio, Wi-Fi, a point-to-point (P2P) network, a telecommunications network, cloud communication, and the like.
  • the transmission/reception unit 22 can transmit and/or receive one or more of the data acquired by the camera 28 and the sensors 23C, processing results generated by the flight controller 23, predetermined control data, user commands from the information processing terminal 10 or a remote controller, and the like.
  • Sensors 23C may include an inertial sensor (acceleration sensor, gyro sensor), a GPS sensor, a proximity sensor (e.g., lidar), or a vision/image sensor (e.g., a camera).
  • the battery 24 can be a known battery such as a lithium polymer battery.
  • the power for driving the unmanned vehicle 20 is not limited to the electric power supplied from the battery 24 or the like, and may be, for example, the power of an internal combustion engine or the like.
  • the camera 28 is an example of an image pickup device.
  • the type of the camera 28 is not particularly limited, and may be, for example, an ordinary digital camera, an omnidirectional camera, an infrared camera, a thermographic image sensor, or the like.
  • the camera 28 may be connected to the main body 21 so as to be independently displaceable by a gimbal or the like (not shown).
  • FIG. 4 is a block diagram showing a functional configuration of the control unit 11 according to the present embodiment.
  • the control unit 11 includes a map acquisition unit 111, an image acquisition unit 112, a display control unit 113, an input information acquisition unit 114, a flight restricted area determination unit 115, a coordinate conversion unit 116, and an output control unit 117.
  • Each of these functional units can be realized by the processor 11a reading a program stored in the storage 11c into the memory 11b and executing the program.
  • the map information 118 may be stored in, for example, a storage 11c, an external storage, a cloud server, or the like.
  • the map acquisition unit 111 has a function of acquiring map information 118.
  • the map information 118 has information related to the map.
  • the information related to the map is, for example, a map or an aerial photograph (which may include images captured by the unmanned flying object 20) associated with position information (including at least one of longitude information, latitude information, and altitude information).
  • Such map information 118 may include not only geographical information but also information on the type of structure or area (house, park, public facility, commercial facility, industrial facility, river, etc.).
  • the map information 118 may be acquired by using, for example, an API (Application Programming Interface) provided by a service related to a map.
  • the user may update the map information 118 as appropriate by searching, through input to the touch panel 12 on which the map is displayed, for the place where the flight restricted area should be set.
  • the acquired map information 118 is output to the display control unit 113.
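  • As a rough illustration of this acquisition step (not part of the disclosure itself), the following Python sketch fetches a static map image centred on a given latitude/longitude. The endpoint, parameter names, and API key are hypothetical placeholders for whatever map service's documented API is actually used.

```python
import urllib.parse
import urllib.request

# Hypothetical static-map endpoint; a real implementation would use the
# documented API of the chosen map service.
MAP_ENDPOINT = "https://maps.example.com/staticmap"

def fetch_map_image(lat: float, lon: float, zoom: int = 17,
                    size: str = "1024x768", api_key: str = "YOUR_KEY") -> bytes:
    """Download a map image centred on (lat, lon) for display on the touch panel 12."""
    params = urllib.parse.urlencode({
        "center": f"{lat},{lon}",  # position information tied to the map
        "zoom": zoom,
        "size": size,
        "key": api_key,
    })
    with urllib.request.urlopen(f"{MAP_ENDPOINT}?{params}") as resp:
        return resp.read()  # raw image bytes (e.g. PNG) handed to the display control unit

# Example: fetch the map around a survey site
# png_bytes = fetch_map_image(35.681236, 139.767125)
```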
  • the image acquisition unit 112 has a function of acquiring information on an image captured by the unmanned flying object 20.
  • the image acquisition unit 112 appropriately acquires an image captured by the camera 28 mounted on the unmanned flying object 20.
  • the image to be acquired may be a moving image obtained in real time, or may be a still image captured at an arbitrary timing.
  • the acquired image information is output to the display control unit 113.
  • the display control unit 113 has a function of displaying acquired images and information on the touch panel 12. The display control unit 113 also has a function of displaying, within the image, buttons, objects, text, and other elements for providing information to the user of the system 1 and for acquiring input information based on the user's operations. It may further have a function of displaying, on the touch panel 12, a display based on the input information obtained by the input information acquisition unit 114, which will be described later.
  • the display control unit 113 may have a function of displaying a map related to the acquired map information 118. As a result, the map is displayed on the touch panel 12. Further, the display control unit 113 may have a function of displaying the captured image acquired by the image acquisition unit 112.
  • the input information acquisition unit 114 has a function of acquiring input information generated based on an operation on an image displayed on the touch panel 12.
  • the input information referred to here includes, for example, information regarding a position on an image displayed on the touch panel 12.
  • the position on the image is, for example, the position of the pixels constituting the image. That is, the input information includes information indicating the position on the image at which the user performed an operation. Such operations include, for example, tap, touch, swipe, and slide.
  • Such input information may include, for example, input information relating to the setting of a flight restricted area.
  • the input information related to the setting of a flight restricted area is information (designated area information) generated by an operation of setting, on the map displayed on the touch panel 12, a designated area corresponding to the flight restricted area of the unmanned flying object 20 (one way of deriving such information from touch positions is sketched below).
  • Such input information may include, for example, information regarding the registration, modification or deletion of flight restricted areas.
  • the input information may include input information for controlling the flight of the unmanned vehicle 20.
  • for example, when buttons or objects for operating the flight of the unmanned flying object 20 are displayed on the touch panel 12 by the display control unit 113, input information for controlling the flight is generated by operations on those buttons or objects.
  • such input information can include, for example, controls for takeoff, landing, ascent, descent, translation, rotation, and speed of the unmanned flying object 20, and controls for the imaging process, zoom, and shooting direction of the camera 28.
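  • Since the designated area information above originates from touch positions on the displayed map, one possible (hypothetical) realization converts each tapped pixel into geographic coordinates using the bounds of the current map view. The sketch below assumes a north-up view in which latitude and longitude vary linearly across the screen; a production system would instead invert the actual map projection (e.g., Web Mercator).

```python
from dataclasses import dataclass

@dataclass
class MapView:
    """Geographic bounds of the map currently shown on the touch panel (north-up)."""
    north: float      # latitude at the top edge
    south: float      # latitude at the bottom edge
    west: float       # longitude at the left edge
    east: float       # longitude at the right edge
    width_px: int     # displayed map width in pixels
    height_px: int    # displayed map height in pixels

def tap_to_latlon(view: MapView, x_px: float, y_px: float) -> tuple[float, float]:
    """Convert a tapped pixel position into (lat, lon) by linear interpolation."""
    lon = view.west + (view.east - view.west) * (x_px / view.width_px)
    lat = view.north - (view.north - view.south) * (y_px / view.height_px)
    return lat, lon

# Accumulating the taps that drop the pins 201 yields the designated-area polygon:
# polygon = [tap_to_latlon(view, x, y) for (x, y) in pin_taps_px]
```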
  • the flight restricted area determination unit 115 has a function of determining a flight restricted area in real space based on the input information related to the setting of the flight restricted area. For example, the flight restricted area determination unit 115 determines the flight restricted area in the real space based on the designated area information included in the above input information.
  • the flight restricted area can be specified by, for example, the position information (latitude information and longitude information) of the designated area on the map.
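  • Because the flight restricted area is specified by the latitude/longitude of the designated polygon, the flight restricted area determination unit 115 can test whether a position lies inside it with a standard ray-casting check. The sketch below is one possible implementation, not taken from the disclosure.

```python
def point_in_polygon(lat: float, lon: float,
                     polygon: list[tuple[float, float]]) -> bool:
    """Ray-casting test: is (lat, lon) inside the polygon of (lat, lon) vertices?

    Coordinates are treated as planar, which is a reasonable approximation
    over the small, local extent of a typical flight restricted area.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        # Count crossings of a ray cast from the point along the latitude axis.
        if (lon1 > lon) != (lon2 > lon):
            t = (lon - lon1) / (lon2 - lon1)
            if lat < lat1 + t * (lat2 - lat1):
                inside = not inside
    return inside

# Example with a rectangular designated area 202:
# area = [(35.0, 139.0), (35.0, 139.1), (35.1, 139.1), (35.1, 139.0)]
# point_in_polygon(35.05, 139.05, area)  # -> True
```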
  • the coordinate conversion unit 116 has a function of converting the flight restricted area into position coordinates in the image captured by the unmanned flying object 20. For example, the coordinate conversion unit 116 obtains the coordinates of the flight restricted area on the captured image based on the latitude information and the longitude information of the flight restricted area determined by the flight restricted area determination unit 115. This makes it possible to recognize which part of the captured image displayed on the touch panel 12 (described later) the flight restricted area corresponds to.
  • the output control unit 117 has a function of outputting information related to the flight restricted area in the real space.
  • the output mode of the information by the output control unit 117 is not particularly limited.
  • the output control unit 117 may output the information to the touch panel 12.
  • the information may be superimposed on the image captured by the unmanned flying object 20, for example. More specifically, the output control unit 117 may superimpose, in an AR (augmented reality) manner, an image indicating a position (for example, an area) corresponding to at least one of the inside and the outside of the flight restricted area, and display it on the touch panel 12.
  • the output control unit 117 may also control the display of an object for moving the unmanned flying object 20 when the object is superimposed on the image captured by the unmanned flying object 20.
  • when the object is at a position corresponding to the outside of the flight restricted area (that is, the non-flight area) in the captured image, the output control unit 117 may control the display of the object in a manner indicating that it cannot be set as the movement target position.
  • such a manner may include, for example, changing the color or shape of the object, or making the object for completing the setting of the movement target unselectable.
  • the output control unit 117 may also control the display of the object for setting the movement target position so that the object can be moved by operations on the touch panel 12 only to positions inside the flight restricted area in the captured image.
  • the output control unit 117 may output the information related to the flight restricted area in the real space to the unmanned flying object 20.
  • the output control unit 117 may control the unmanned flying object 20 so that it flies only in the area inside the flight restricted area. Thereby, for example, the unmanned flying object 20 flies only within the area corresponding to the flight restricted area.
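  • A minimal sketch of how such a restriction might be enforced on the output side: a commanded movement target is forwarded to the vehicle only if it lies inside the restricted-area polygon (reusing point_in_polygon from the earlier sketch). The uplink and UI function names are hypothetical.

```python
def gate_movement_target(target_lat: float, target_lon: float,
                         flight_area: list[tuple[float, float]]) -> bool:
    """Return True only if the commanded target lies inside the flight restricted area."""
    return point_in_polygon(target_lat, target_lon, flight_area)

# Hypothetical call sites on the output side:
# if gate_movement_target(lat, lon, flight_area):
#     send_goto_command(lat, lon)   # uplink to the flight controller 23
# else:
#     mark_target_unsettable()      # e.g. switch object 108 to the 108a style
```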
  • FIG. 5 is a flowchart of a series of controls in the method of setting a flight restricted area using the system 1 according to the present embodiment.
  • the map acquisition unit 111 acquires the map information 118 (step SQ101). Then, the acquired map is displayed on the touch panel 12 by the display control unit 113 (step SQ103).
  • FIG. 6 is a diagram showing a first screen example of the touch panel 12 according to the method of setting the flight restricted area by the system 1 according to the present embodiment.
  • the map M1 is displayed on the screen V1 of the touch panel 12. On the screen V1, a button 101 for modifying the designated area is also displayed.
  • by operating the touch panel 12, the user can set a designated area on the map M1. Further, by slide or pinch input on the map M1, the displayed area can be panned or zoomed.
  • FIG. 7 is a diagram showing a second screen example of the touch panel 12 according to the method of setting the flight restricted area by the system 1 according to the present embodiment.
  • a plurality of pins 201 may be set on the map M1 by the user's operation on the touch panel 12.
  • the designated area 202 can be set with each pin 201 as a vertex.
  • the method of setting the designated area 202 is not particularly limited; for example, it may be set freehand. When the same area has been set in the past, the designated area may also be set by recalling the previously set information.
  • the designated area 202 may be finalized, for example, when the placement of the pins 201 is completed and an enclosed area is generated.
  • the flight restricted area determination unit 115 determines the flight restricted area in the real space based on the determined designated area 202 (step SQ107).
  • the information relating to the flight restricted area may be temporarily stored in, for example, the storage 13 or the like, and the unmanned flying object 20 may be controlled using the stored information.
  • FIG. 8 is a flowchart relating to a series of controls in the information output method relating to the flight restricted area using the system 1 according to the present embodiment.
  • the unmanned flying object 20 captures an image of an area including the flight restricted area and its vicinity with the mounted camera 28 (step SQ201).
  • the captured image obtained by such an imaging process is acquired by the image acquisition unit 112 (step SQ203).
  • the position of the unmanned flying object 20 or the like when the captured image is captured is also acquired.
  • the information related to the position can be used for the conversion process between the coordinates on the captured image and the coordinates on the real space, which will be described later.
  • the display control unit 113 displays the captured image on the touch panel 12 (step SQ205).
  • the captured image may be a moving image received in real time from the unmanned flying object 20, or may be a still image.
  • the coordinate conversion unit 116 associates the coordinates on the captured image with the real space coordinates (step SQ207).
  • this process serves to display the area corresponding to the flight restricted area on the captured image, as described later.
  • the correspondence between the coordinates on the captured image (local coordinates) and the real-space coordinates (global coordinates) can be established by a known technique such as a mapping transformation.
  • the coordinate conversion unit 116 converts the flight restricted area into the coordinates on the captured image based on the above correspondence (step SQ209). By such processing, the area corresponding to the flight restricted area on the captured image is uniquely determined.
  • the output control unit 117 superimposes the area outside the flight restricted area on the captured image and displays it on the touch panel 12 (step SQ211).
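  • The correspondence of steps SQ207 to SQ209 between real-space (global) coordinates and captured-image (local) coordinates can be realized, for example, with a standard pinhole-camera projection once the vehicle's position, altitude, and camera attitude are known. The following numpy sketch makes simplifying assumptions not stated in the disclosure: a locally flat ground plane, no lens distortion, and a known camera rotation matrix.

```python
import numpy as np

EARTH_RADIUS_M = 6_378_137.0

def latlon_to_enu(lat: float, lon: float,
                  ref_lat: float, ref_lon: float) -> tuple[float, float]:
    """Approximate east/north offsets in metres from a reference point."""
    east = np.radians(lon - ref_lon) * EARTH_RADIUS_M * np.cos(np.radians(ref_lat))
    north = np.radians(lat - ref_lat) * EARTH_RADIUS_M
    return east, north

def project_to_image(east: float, north: float, drone_alt_m: float,
                     R_cam: np.ndarray, fx: float, fy: float,
                     cx: float, cy: float):
    """Project a ground point (east, north, altitude 0) into pixel coordinates.

    R_cam rotates world (ENU) vectors into the camera frame; for a camera
    pointing straight down it maps east -> +x, north -> -y, up -> -z.
    Returns None for points behind the image plane.
    """
    v_world = np.array([east, north, -drone_alt_m])  # camera-to-point vector
    v_cam = R_cam @ v_world
    if v_cam[2] <= 0:
        return None
    u = fx * v_cam[0] / v_cam[2] + cx
    v = fy * v_cam[1] / v_cam[2] + cy
    return u, v

# Projecting every vertex of the restricted-area polygon and filling the two
# sides of the resulting boundary line 203 gives the superimposed display of
# step SQ211.
```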
  • FIG. 9 is a diagram showing a first screen example of the touch panel 12 relating to the information output method relating to the flight restricted area by the system 1 according to the present embodiment.
  • the image P1 captured from the unmanned flying object 20 is displayed on the screen V1 of the touch panel 12.
  • the captured image P1 is obtained with the shooting direction of the camera 28 of the unmanned flying object 20 pointing straight down.
  • the region FRA10 inside the flight restricted area and the region FRA11 outside the flight restricted area are displayed superimposed on the captured image P1, separated by the boundary line 203 of the flight restricted area.
  • on the screen V1, a button 103 for landing the unmanned flying object 20, a button 104 for moving it vertically, a button 105 for rotating it around the yaw axis, an object 108 for setting the movement target position, and a button 109 for starting the movement of the unmanned flying object 20 are also displayed.
  • the region FRA10 inside the flight restricted area and the region FRA11 outside the flight restricted area can also be displayed in other manners.
  • for example, no image processing may be applied to the region FRA10 inside the flight restricted area, while an object is displayed over the region FRA11 outside the flight restricted area.
  • with the touch panel 12, the inside and outside of the flight restricted area can thus be visually recognized from the viewpoint of the unmanned flying object 20. That is, during the flight of the unmanned flying object 20, the user can visually recognize the position in real space corresponding to the flight restricted area that the user set through the interface of the touch panel 12.
  • the display positions of the areas inside and outside the flight restricted area may be updated according to the movement of the unmanned flying object 20.
  • FIG. 10 is a diagram showing a second screen example of the touch panel 12 relating to the information output method relating to the flight restricted area by the system 1 according to the present embodiment.
  • the captured image P2 is obtained with the camera 28 of the unmanned flying object 20 pointed not straight down but at an oblique angle.
  • the region FRA10 inside the flight restricted area and the region FRA11 outside the flight restricted area are displayed separately as information related to the flight restricted area on the captured image P2.
  • the position information included in the captured image may include information related to the imaging angle of the camera 28.
  • based on the information related to the imaging angle, the coordinate conversion unit 116 associates the coordinates of the flight restricted area with the coordinates of the captured image, so that the corresponding region can be displayed on the captured image P2. At this time, predetermined altitude information may be included as the position information of the flight restricted area. As a result, the position of the flight restricted area to be superimposed on the captured image P2 can be uniquely determined.
  • FIG. 11 is a diagram showing a third screen example of the touch panel 12 relating to the information output method relating to the flight restricted area by the system 1 according to the present embodiment.
  • the captured image P1 shown in FIG. 11 is the same as the captured image P1 shown in FIG.
  • in the example shown, the object 108 for setting the movement target position of the unmanned flying object 20 has been moved onto the region FRA11 outside the flight restricted area.
  • in this case, the appearance of the object 108 changes, and a display like the object 108a appears on the screen V1. The object 108a may be displayed in a manner indicating that the movement target cannot be set at this position.
  • in addition, the button 109 for starting the movement of the unmanned flying object 20 may be changed to an unselectable state and displayed as such. This prevents the user from moving the unmanned flying object 20 out of the flight restricted area.
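  • The behaviour of the object 108/108a and the button 109 amounts to a containment check in screen space each time the target marker is dragged. A sketch of the UI-state decision, assuming area_polygon_px is the restricted-area boundary already converted to image pixels in step SQ209 and reusing the ray-casting helper above; the returned flags are hypothetical.

```python
def update_target_marker(marker_px: tuple[float, float],
                         area_polygon_px: list[tuple[float, float]]) -> dict:
    """Decide how the movement-target object and the start button are drawn.

    marker_px and the polygon vertices are (x, y) pixel coordinates; the
    ray-casting helper above is reused with axes passed as (y, x).
    """
    x, y = marker_px
    inside = point_in_polygon(y, x, [(py, px) for (px, py) in area_polygon_px])
    return {
        "marker_style": "normal" if inside else "unsettable",  # object 108 vs 108a
        "start_button_enabled": inside,                        # state of button 109
    }
```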
  • the example of the control method of the unmanned aircraft 20 by the system 1 according to the present embodiment has been described above.
  • an example of a control method in the actual flight of the unmanned flying object 20 has been described, but the present technology is not limited to such an example.
  • this technology can also be used for simulation when flying a virtual drone in a VR (Virtual Reality) or AR (Augmented Reality) space.
  • as the simulation technique using the VR space or the AR space, a known technique can be used.
  • as described above, with the system 1 according to the present embodiment, a flight restricted area can be set via the touch panel 12, and information regarding the flight restricted area can be obtained via the touch panel 12.
  • the flight restricted area can be superimposed on the image captured by the unmanned flying object 20 and displayed in an AR manner.
  • the user who operates the touch panel 12 can set the flight restricted area of the unmanned flying object 20 by himself or herself and intuitively grasp its position.
  • with such a display, for example, even if the shooting angle of the camera 28 of the unmanned flying object 20 changes, an object indicating the flight restricted area can be superimposed at the appropriate position in the captured image.
  • the inside and outside of the flight restricted area are displayed in an AR manner by an object or the like, but the present technology is not limited to such an example.
  • for example, when the unmanned flying object 20 approaches the boundary of the flight restricted area, the touch panel 12 may output an alert to that effect. If it is determined that the unmanned flying object 20 has flown outside the flight restricted area, or that it is being controlled such that it will cross the boundary of the flight restricted area within a predetermined time, the unmanned flying object 20 may be automatically controlled to move back toward the inside of the flight restricted area.
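  • Such a proximity alert can be driven by the shortest distance from the vehicle's current position to the boundary polygon. A sketch in local ENU metres; the 5 m margin and the alert callback are hypothetical values chosen for illustration, not taken from the disclosure.

```python
import math

def distance_to_boundary_m(pos_xy: tuple[float, float],
                           polygon_xy: list[tuple[float, float]]) -> float:
    """Shortest distance in metres from a point to the polygon boundary (local ENU)."""
    px, py = pos_xy
    best = float("inf")
    n = len(polygon_xy)
    for i in range(n):
        x1, y1 = polygon_xy[i]
        x2, y2 = polygon_xy[(i + 1) % n]
        dx, dy = x2 - x1, y2 - y1
        # Closest point on the segment, with the parameter clamped to [0, 1].
        t = max(0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)))
        best = min(best, math.hypot(px - (x1 + t * dx), py - (y1 + t * dy)))
    return best

ALERT_MARGIN_M = 5.0  # hypothetical threshold

# if distance_to_boundary_m(drone_xy, area_xy) < ALERT_MARGIN_M:
#     show_boundary_alert()  # e.g. a banner on the touch panel 12
```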
  • the device described in the present specification may be realized as a single device, or may be realized by a plurality of devices, etc., which are partially or wholly connected by a network.
  • the control unit and the storage of the information processing terminal 10 may be realized by different servers connected to each other by a network.
  • the series of processes by the apparatus described in the present specification may be realized using software, hardware, or a combination of software and hardware. A computer program for realizing each function of the information processing terminal 10 according to the present embodiment can be created and implemented on a PC or the like. A computer-readable recording medium storing such a computer program can also be provided.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. Further, the above computer program may be distributed, for example, via a network without using a recording medium.
  • (Item 1) A method including: displaying a map on a display unit; acquiring information related to a designated area on the map, input based on an operation on the map displayed on the display unit; and outputting information related to a flight restricted area of an aerial vehicle in real space based on the designated area on the map.
  • (Item 2) The method according to item 1.
  • (Item 3) The method according to item 2, further comprising: displaying, on the display unit, a captured image taken by the flying object flying in the flight restricted area; and superimposing information on the flight restricted area on the captured image and displaying it on the display unit.
  • (Item 4) The method according to item 3, wherein a display indicating a position corresponding to at least one of the inside and the outside of the flight restricted area in the captured image is superimposed on the captured image and displayed on the display unit.
  • (Item 5) The method according to item 4.
  • (Item 6) The method according to any one of items 2 to 5, including displaying, on the display unit, an image captured by the flying object flying in the flight restricted area, wherein an object for setting the movement target position of the flying object is displayed on the display unit displaying the captured image, and, when the position of the object corresponds to the outside of the flight restricted area in the captured image, the object is displayed on the display unit in a manner indicating that it cannot be set as the movement target position.
  • (Item 7) The method according to any one of items 2 to 6, including displaying, on the display unit, an image captured by the flying object flying in the flight restricted area, wherein an object for setting the movement target position of the flying object is displayed on the display unit displaying the captured image, and the display of the object is controlled so that the object can be moved, based on operations on the display unit, only to positions inside the flight restricted area in the captured image.
  • (Item 8) The method according to any one of items 1 to 7, wherein information relating to the flight restricted area in real space is output to the flying object.
  • (Item 9) The method according to item 8, wherein the flying object is controlled to fly only in the area inside the flight restricted area in real space.
  • (Item 10) The method according to any one of items 1 to 9, wherein the information relating to the flight restricted area in real space includes information relating to a non-flight area outside the flight restricted area.
  • (Item 11) A system relating to flight control of an aerial vehicle, including: a display control unit that displays a map on a display unit; an input information acquisition unit that acquires information related to a designated area on the map, input based on an operation on the map displayed on the display unit; and an output control unit that outputs information related to a flight restricted area of the aerial vehicle in real space based on the designated area on the map.
  • (Item 12) A program that causes a computer to function as: a display control unit that displays a map on a display unit; an input information acquisition unit that acquires information related to a designated area on the map, input based on an operation on the map displayed on the display unit; and an output control unit that outputs information related to a flight restricted area of an aerial vehicle in real space based on the designated area on the map.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Navigation (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

[Translated from French] The problem to be solved by the present invention is to make it possible to easily set a flight restricted area of a flying object in the field. The solution is a method used in determining a flight restricted area, the method comprising: displaying a map on a touch panel (12); acquiring information relating to a region designated on the map displayed on the touch panel (12), the region being input on the basis of an operation performed on the map; and outputting information relating to a flight restricted area of an unmanned flying object (20) in real space on the basis of the region designated on the map.
PCT/JP2021/021987 2020-06-10 2021-06-09 Procédé, système et programme WO2021251441A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022530612A JPWO2021251441A1 (fr) 2020-06-10 2021-06-09

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-101296 2020-06-10
JP2020101296 2020-06-10

Publications (1)

Publication Number Publication Date
WO2021251441A1 (fr)

Family

ID=78846095

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/021987 WO2021251441A1 (fr) 2020-06-10 2021-06-09 Procédé, système et programme

Country Status (2)

Country Link
JP (1) JPWO2021251441A1 (fr)
WO (1) WO2021251441A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017175804A1 (fr) * 2016-04-08 2017-10-12 株式会社ナイルワークス Procédé de pulvérisation de substance chimique utilisant un véhicule aérien sans pilote, programme et dispositif
JP2017208678A (ja) * 2016-05-18 2017-11-24 本郷飛行機株式会社 小型無人飛行機の通信及び制御装置並びにこれらの方法
WO2019003396A1 (fr) * 2017-06-29 2019-01-03 株式会社オプティム Système, procédé et programme de fourniture d'images
JP2019039875A (ja) * 2017-08-28 2019-03-14 Necソリューションイノベータ株式会社 飛行経路設定装置、飛行経路設定方法、及びプログラム

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7295320B1 (ja) 2022-09-21 2023-06-20 ソフトバンク株式会社 情報処理装置、プログラム、システム、及び情報処理方法
JP7295321B1 (ja) 2022-09-21 2023-06-20 ソフトバンク株式会社 情報処理装置、プログラム、システム、及び情報処理方法
JP2024044400A (ja) * 2022-09-21 2024-04-02 ソフトバンク株式会社 情報処理装置、プログラム、システム、及び情報処理方法
JP2024044499A (ja) * 2022-09-21 2024-04-02 ソフトバンク株式会社 情報処理装置、プログラム、システム、及び情報処理方法

Also Published As

Publication number Publication date
JPWO2021251441A1 (fr) 2021-12-16

Similar Documents

Publication Publication Date Title
US11233943B2 (en) Multi-gimbal assembly
WO2020143677A1 (fr) Procédé de commande de vol et système de commande de vol
CN108351574A (zh) 用于设置相机参数的系统、方法和装置
WO2021251441A1 (fr) Procédé, système et programme
WO2021259252A1 (fr) Procédé et appareil de simulation de vol, dispositif électronique et véhicule aérien sans pilote
WO2019227289A1 (fr) Procédé et dispositif de commande de chronophotographie
WO2019230604A1 (fr) Système d'inspection
WO2021199449A1 (fr) Procédé de calcul de position et système de traitement d'informations
US11082639B2 (en) Image display method, image display system, flying object, program, and recording medium
JPWO2018146803A1 (ja) 位置処理装置、飛行体、位置処理システム、飛行システム、位置処理方法、飛行制御方法、プログラム、及び記録媒体
WO2020048365A1 (fr) Procédé et dispositif de commande de vol pour aéronef, et dispositif terminal et système de commande de vol
JP2023100642A (ja) 検査システム
JP6966810B2 (ja) 管理サーバ及び管理システム、表示情報生成方法、プログラム
JP7435599B2 (ja) 情報処理装置、情報処理方法、及びプログラム
WO2021168821A1 (fr) Procédé de commande de plateforme mobile et dispositif
JP6681101B2 (ja) 検査システム
WO2020225979A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme et système de traitement d'informations
WO2020042186A1 (fr) Procédé de commande de plateforme mobile, plateforme mobile, dispositif terminal et système
JP6730764B1 (ja) 飛行体の飛行経路表示方法及び情報処理装置
JP2020036163A (ja) 情報処理装置、撮影制御方法、プログラム及び記録媒体
JP7004374B1 (ja) 移動体の移動経路生成方法及びプログラム、管理サーバ、管理システム
JP6684012B1 (ja) 情報処理装置および情報処理方法
WO2022188151A1 (fr) Procédé de photographie d'image, appareil de commande, plateforme mobile et support de stockage informatique
WO2022000245A1 (fr) Procédé de positionnement d'aéronef, et procédé et appareil de commande pour système de positionnement assisté
JP6800505B1 (ja) 飛行体の管理サーバ及び管理システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21821110; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2022530612; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21821110; Country of ref document: EP; Kind code of ref document: A1)