WO2021064982A1 - Information processing device and method - Google Patents

Information processing device and method

Info

Publication number
WO2021064982A1
Authority
WO
WIPO (PCT)
Prior art keywords
field image
information processing
visual field
point
processing device
Prior art date
Application number
PCT/JP2019/039281
Other languages
English (en)
Japanese (ja)
Inventor
賢次 小関
好司 岸田
Original Assignee
株式会社トラジェクトリー
Priority date
Filing date
Publication date
Application filed by 株式会社トラジェクトリー
Priority to PCT/JP2019/039281 (WO2021064982A1)
Priority to JP2019572703A (JP6684012B1)
Publication of WO2021064982A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]

Definitions

  • The present disclosure relates to an information processing device and an information processing method for setting a route point of a moving body.
  • Flying objects can be used, for example, for aerial photography, inspection, and surveying.
  • When an air vehicle is made to fly autonomously for such an application and a predetermined process is to be performed at a predetermined position, route points for the air vehicle must be set.
  • For example, Patent Document 1 discloses a technique for setting a flight path based on three-dimensional geographic information.
  • An object of the present invention is to provide an information processing device and an information processing method capable of setting a route point of a moving body more accurately and easily.
  • According to one aspect of the present disclosure, there is provided an information processing device for setting a route point of a moving body, including: a display control unit that displays field-of-view images in a virtual three-dimensional space corresponding to a movement target space on a display device; a determination unit that determines one selected field-of-view image from the displayed field-of-view images; and a point setting unit that sets the route point of the moving body in the movement target space based on the viewpoint position of the selected field-of-view image.
  • According to another aspect, there is provided an information processing method for setting a route point of a moving body, including: displaying field-of-view images in a virtual three-dimensional space corresponding to a movement target space on a display device; determining one selected field-of-view image from the displayed field-of-view images; and setting the route point of the moving body in the movement target space based on the viewpoint position of the selected field-of-view image.
  • According to the present disclosure, the route point of a moving body can be set more accurately and easily.
  • The technique according to the embodiments of the present disclosure has the following configurations.
  • (Item 1) An information processing device for setting the route point of a moving body, including: a display control unit that displays field-of-view images in a virtual three-dimensional space corresponding to the movement target space on a display device; a determination unit that determines one selected field-of-view image from the displayed field-of-view images; and a point setting unit that sets the route point of the moving body in the movement target space based on the viewpoint position of the selected field-of-view image.
  • (Item 2) The information processing device according to item 1, wherein the point setting unit sets a position in the movement target space corresponding to the viewpoint position of the selected field-of-view image as the route point of the moving body.
  • (Item 3) The information processing device according to item 1 or 2, wherein the point setting unit sets the orientation of the moving body at its route point based on the line-of-sight direction at the viewpoint position of the selected field-of-view image.
  • (Item 4) The information processing device according to any one of items 1 to 3, wherein the moving body is equipped with a camera and the point setting unit sets the direction of the optical axis of the camera based on the line-of-sight direction at the viewpoint position of the selected field-of-view image.
  • (Item 5) The information processing device according to any one of items 1 to 4, wherein the moving body is equipped with a camera and the point setting unit sets, as the route point, a position in the movement target space at which the camera can generate a captured image corresponding to the selected field-of-view image.
  • (Item 6) The information processing device according to any one of items 1 to 5, wherein the display control unit controls display of the field-of-view image on a stereoscopic viewing device.
  • (Item 7) The information processing device according to any one of items 1 to 6, wherein the display control unit displays a position corresponding to the viewpoint position of the field-of-view image on a screen showing map information corresponding to the movement target space.
  • (Item 8) The information processing device according to any one of items 1 to 7, wherein the information on the virtual three-dimensional space is generated based on geographic information provided from an external server.
  • (Item 9) An information processing method for setting the route point of a moving body, including: displaying field-of-view images in a virtual three-dimensional space corresponding to the movement target space on a display device; determining one selected field-of-view image from the displayed field-of-view images; and setting the route point of the moving body in the movement target space based on the viewpoint position of the selected field-of-view image.
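  • As an illustration only, the configuration of items 1 and 9 can be sketched in a few lines of Python; all names here (FieldImage, InformationProcessingDevice, and so on) are hypothetical and not part of the disclosure.

    from dataclasses import dataclass
    from typing import List, Tuple

    Vec3 = Tuple[float, float, float]

    @dataclass
    class FieldImage:
        pixels: bytes          # rendered view of the virtual 3D space
        viewpoint: Vec3        # viewpoint position in virtual-space coordinates
        line_of_sight: Vec3    # unit vector of the viewing direction

    class InformationProcessingDevice:
        def display(self, images: List[FieldImage]) -> None:
            """Display control unit: show field-of-view images on a display device."""

        def determine(self, images: List[FieldImage], chosen: int) -> FieldImage:
            """Determination unit: return the one selected field-of-view image."""
            return images[chosen]

        def set_route_point(self, selected: FieldImage) -> Vec3:
            """Point setting unit: derive the route point from the viewpoint position (item 2)."""
            return selected.viewpoint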
  • FIG. 1 is a diagram for explaining an outline of a route setting system for an air vehicle.
  • The virtual three-dimensional space 1000 is a space generated based on virtual three-dimensional space data corresponding to an actual flight target space (an example of a movement target space).
  • The virtual three-dimensional space 1000 may be generated based on geographic information provided by an external server, such as spatial data published by Google Maps (registered trademark), Google Earth (registered trademark), or the Geospatial Information Authority of Japan.
  • Such three-dimensional spatial data may include object information such as the building object 1010.
  • The building object 1010 may be added to the geographic information, and conversely, a building object included in the geographic information may be deleted from the virtual three-dimensional space 1000. This makes it possible to represent, for example, the flight target space before or after construction of a building.
  • In the virtual three-dimensional space 1000, a plurality of route points 1001 are set, and a flight path 1002 is set so as to connect these route points 1001.
  • At a route point 1001, for example, the orientation of the flying object (including reversals of heading), its attitude, the orientation of the camera mounted on it, and the like can be set.
  • Thereby, the flight course at the route point 1001 can be set appropriately, and the imaging direction at the route point 1001 can be specified.
  • In the present embodiment, the orientation and attitude of the flying object and camera described above are set by using a field-of-view image in the virtual three-dimensional space 1000.
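  • As a concrete, purely illustrative sketch of what might be stored for a route point 1001 and the flight path 1002 connecting such points, the following Python structures use hypothetical field names; the disclosure itself does not prescribe a data format.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class RoutePoint:
        position: tuple                               # (x, y, z) in the flight target space
        heading_deg: float                            # horizontal orientation of the flying object
        elevation_deg: float                          # elevation angle of the flying object
        camera_heading_deg: Optional[float] = None    # optical-axis direction of the camera, if set
        camera_elevation_deg: Optional[float] = None

    @dataclass
    class FlightPath:
        points: List[RoutePoint] = field(default_factory=list)

        def add(self, p: RoutePoint) -> None:
            # the flight path is defined simply by connecting route points in order
            self.points.append(p)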
  • FIG. 2 is a diagram showing an outline of the system 1 according to the present embodiment.
  • The system 1 includes an information processing device 10 and a display device 20 (21).
  • Such a system 1 can transmit information about a flight path to the flying object 30.
  • The system 1 can also receive geographic information from the external server 40.
  • The information processing device 10 is connected to the display device 20 via a network such as the Internet.
  • Such networks include local area networks (LAN), wide area networks (WAN), infrared, wireless, WiFi, point-to-point (P2P) networks, telecommunication networks, cloud communication, and the like.
  • The display device 20 is, for example, a display, a television, a screen, or a monitor.
  • The display device 20 has a function of displaying information transmitted from the information processing device 10.
  • The display device 20 may include an input device such as a mouse, keyboard, switch, button, sensor, microphone, or touch panel, and may transmit information entered through the input device to the information processing device 10.
  • The display device may be VR goggles 21.
  • The VR goggles 21 are an example of a display device that presents the virtual three-dimensional space corresponding to the flight target space to the user by stereoscopic vision. Using the VR goggles 21 enhances the sense of presence in the virtual three-dimensional space and allows information on the route points of the flying object to be set more precisely.
  • The display device is not limited to the VR goggles 21; any display device capable of stereoscopic viewing may be used.
  • The flying object 30 is an example of a moving body.
  • Although an unmanned flying object is shown as an example of a moving body in FIG. 2, the moving body may instead be, for example, a vehicle, a ship, or a diving device; its form is not particularly limited as long as it is a moving body that operates autonomously under control signals.
  • Examples of the flying object 30 include an unmanned aerial vehicle (for example, a drone or a multicopter) that flies autonomously based on control information obtained in advance from a control device such as the information processing device 10.
  • The type of flying object is not particularly limited.
  • The flying object 30 flies autonomously based on the information on the flight path and route points transmitted from the information processing device 10.
  • The flying object 30 may be connected to the information processing device 10 via, for example, LTE, 5G, infrared, WiFi, Bluetooth (registered trademark), BLE (Bluetooth Low Energy), or a wired connection.
  • The external server 40 is composed of one or more servers (for example, cloud servers) and holds geographic information.
  • The geographic information may include, for example, map information and topographic information, as well as additional information and metadata associated with them.
  • The additional information is, for example, information about buildings.
  • The geographic information may be provided by a GIS (Geographic Information System).
  • The information processing device 10 may acquire geographic information from the external server 40 through an API (Application Programming Interface). When the information processing device 10 stores the geographic information in the storage 13 or the like in advance, it need not acquire the geographic information from the external server 40, as sketched below.
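  • A hedged sketch of this acquisition step follows: a hypothetical GIS endpoint is queried only when no cached copy exists in the local storage, mirroring the behavior described above. The URL and the cache path are assumptions, not part of the disclosure.

    import json
    import pathlib
    import urllib.request

    CACHE = pathlib.Path("storage/geographic_info.json")        # stand-in for storage 13
    GIS_API = "https://gis.example.com/api/v1/geography"        # hypothetical endpoint

    def get_geographic_info(area_id: str) -> dict:
        if CACHE.exists():                                      # already stored in advance
            return json.loads(CACHE.read_text())
        with urllib.request.urlopen(f"{GIS_API}?area={area_id}") as resp:
            data = json.loads(resp.read())
        CACHE.parent.mkdir(parents=True, exist_ok=True)
        CACHE.write_text(json.dumps(data))                      # cache for later runs
        return data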
  • Consider a case where the flight path of the flying object 30 is set and the flying object 30 performs predetermined work at a route point (a point along the flight path where work by the flying object 30, such as shooting or inspection, is performed). In order to carry out the work properly, the flying object 30 may need to face a predetermined direction or take a predetermined attitude with respect to an object or the target space. However, when the orientation and attitude of the flying object 30 are set numerically, it is difficult to confirm before the actual flight whether the orientation of the flying object 30 in the actual flight target space will be the desired one.
  • The system 1 therefore constructs a virtual three-dimensional space 1000 corresponding to the flight target space and acquires field-of-view images using a virtual flying object (or the camera mounted on it) in the virtual three-dimensional space 1000 as the viewpoint. From these field-of-view images, one that reflects the desired field of view is selected as the selected field-of-view image, and a route point of the flying object is set based on the viewpoint position of the selected field-of-view image. As a result, the user who flies the flying object 30 can grasp its route points more intuitively; that is, the route points of the flying object 30 can be set more accurately.
  • FIG. 3 is a diagram showing a hardware configuration of the information processing device 10 according to the present embodiment.
  • The illustrated configuration is an example; other configurations are possible.
  • The information processing device 10 is connected to a database (not shown) and forms part of the system.
  • The information processing device 10 may be a general-purpose computer such as a workstation or a personal computer, or may be logically realized by cloud computing.
  • The information processing device 10 includes at least a control unit 11, a memory 12, a storage 13, a communication unit 14, and an input/output unit 15, which are electrically connected to one another through a bus 16.
  • The control unit 11 is an arithmetic unit that controls the operation of the entire information processing device 10, controls the transmission and reception of data between the elements, and performs the information processing necessary for application execution and authentication processing.
  • The control unit 11 is, for example, a CPU (Central Processing Unit), and performs each kind of information processing by executing programs stored in the storage 13 and loaded into the memory 12.
  • The memory 12 includes a main memory composed of a volatile storage device such as DRAM (Dynamic Random Access Memory) and an auxiliary storage composed of a non-volatile storage device such as flash memory or an HDD (Hard Disc Drive).
  • The memory 12 is used as a work area of the control unit 11 and also stores the BIOS (Basic Input/Output System) executed when the information processing device 10 starts, various setting information, and the like.
  • The storage 13 stores various programs such as application programs.
  • A database storing the data used in each process may be built in the storage 13.
  • The communication unit 14 connects the information processing device 10 to the network and/or a blockchain network.
  • The communication unit 14 may include short-range communication interfaces such as Bluetooth (registered trademark) and BLE (Bluetooth Low Energy).
  • The input/output unit 15 includes information input devices such as a keyboard and mouse and output devices such as a display.
  • The bus 16 is commonly connected to the above elements and transmits, for example, address signals, data signals, and various control signals.
  • FIG. 4 is a functional block diagram of the flying object 30 according to the present embodiment.
  • For simplicity, the following functional blocks are described as being implemented in a single device (the flying object in FIG. 4); however, some of the functions may be implemented in an external device (for example, the information processing device 10) or logically realized by using cloud computing technology.
  • The flight controller 31 can have one or more processors, such as a programmable processor (for example, a central processing unit (CPU)).
  • The flight controller 31 has, and can access, a memory 311.
  • The memory 311 stores logic, code, and/or program instructions that the flight controller 31 can execute to perform one or more steps.
  • The memory 311 may include, for example, a removable medium such as an SD card or random access memory (RAM), or an external storage device.
  • Data acquired from the external device 35, such as the camera 351 or the sensor 352, may be transmitted directly to and stored in the memory 311.
  • For example, still-image and moving-image data captured by the camera or the like are recorded in internal memory or an external memory.
  • The external device 35 is mounted on the flying object via a gimbal 34.
  • The flight controller 31 includes a control unit 312 configured to control the state of the flying object.
  • For example, the control unit 312 adjusts the spatial arrangement, velocity, and/or acceleration of the flying object, which has six degrees of freedom (translational motions x, y, and z, and rotational motions θx, θy, and θz).
  • To do so, it controls the propulsion mechanism of the flying object (the motors 37 and the like) via the ESCs (Electronic Speed Controllers) 36.
  • The motor 37 rotates the propeller 38 to generate lift for the flying object.
  • The control unit 312 can also control one or more of the states of the mounted units and sensors.
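  • For illustration, a control unit like 312 might reduce position error with a simple proportional law over the translational part of the six-degree-of-freedom state; the gain, the names, and the omission of attitude control and ESC/motor mixing are all simplifying assumptions.

    from dataclasses import dataclass

    @dataclass
    class State6DoF:
        x: float
        y: float
        z: float
        theta_x: float   # roll
        theta_y: float   # pitch
        theta_z: float   # yaw

    def velocity_command(state: State6DoF, target: tuple, kp: float = 0.8) -> tuple:
        """Proportional velocity command (m/s) toward the target position."""
        return (kp * (target[0] - state.x),
                kp * (target[1] - state.y),
                kp * (target[2] - state.z))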
  • The flight controller 31 can communicate with a transmission/reception unit 33 configured to transmit and/or receive data to and from one or more external devices (for example, a transmitter ("propo"), a terminal, a display device, or another remote controller).
  • The transmission/reception unit 33 can use any suitable communication means, such as wired or wireless communication.
  • For example, the transmission/reception unit 33 can use one or more of a local area network (LAN), a wide area network (WAN), infrared, wireless, wired, WiFi, point-to-point (P2P) networks, telecommunication networks, cloud communication, and the like.
  • The transmission/reception unit 33 can transmit and/or receive one or more of the following: data acquired by the camera and various sensors, processing results generated by the flight controller 31, predetermined control data, user commands from a terminal or remote controller, and the like.
  • In the present embodiment, communication with the information processing device 10 can be performed via the transmission/reception unit 33.
  • The sensors 352 (included in the external device 35) according to the present embodiment may include inertial sensors (acceleration sensor, gyro sensor), a positioning sensor (GPS sensor), ranging sensors (for example, a laser sensor, an ultrasonic sensor, or LiDAR), or geomagnetic sensors, and may also include a vision/image sensor (for example, the camera 351).
  • FIG. 5 is a block diagram showing the function of the control unit 11 in the information processing device 10 according to the present embodiment.
  • The configuration shown in FIG. 5 is an example and is not limiting.
  • The control unit 11 includes an acquisition unit 111, a display control unit 112, a determination unit 113, and a point setting unit 114.
  • The acquisition unit 111 has a function of acquiring various information from inside or outside the control unit 11.
  • For example, the acquisition unit 111 acquires geographic information from the external server 40 or the storage 13.
  • The geographic information includes information on the actual flight target space corresponding to the virtual three-dimensional space described later.
  • The virtual three-dimensional space can be rendered by the control unit 11 when, for example, it is displayed on a display device or the like by the display control unit 112.
  • The display control unit 112 has a function of displaying field-of-view images in the virtual three-dimensional space corresponding to the flight target space on the display device. As shown in FIG. 1, this virtual three-dimensional space is modeled on the flight target space in which the flying object 30 is to fly.
  • FIG. 6 is a diagram showing an example of processing in a virtual three-dimensional space.
  • The virtual three-dimensional space 1000 shown in FIG. 6 includes the building object 1010.
  • In the virtual three-dimensional space 1000, a flight path 1001a and route points 1002a and 1002b of the flying object have been preset.
  • Here, a new flight path 1001b starting from the route point 1002b is to be set next; that is, the next route point is to be set.
  • First, a field-of-view image at a viewpoint position 1003 is acquired.
  • Specifically, the field-of-view image corresponding to the line of sight 1004c at the viewpoint position 1003 is acquired.
  • The viewpoint position 1003 and the line of sight 1004c can be changed as appropriate via, for example, the input/output unit 15. Through such changes, a plurality of field-of-view images are acquired.
  • The field-of-view image may be a still image or a moving image.
  • The display control unit 112 outputs the acquired field-of-view images to the display devices 20 and 21.
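  • The relationship between a viewpoint position such as 1003 and a line of sight such as 1004c can be illustrated as follows; the axis convention (x north, y east, z up) is an assumption, and the renderer itself is out of scope here.

    import math

    def line_of_sight(horizontal_deg: float, elevation_deg: float) -> tuple:
        """Unit direction vector from a horizontal angle and an elevation angle."""
        h = math.radians(horizontal_deg)
        e = math.radians(elevation_deg)
        return (math.cos(e) * math.cos(h),   # x (north)
                math.cos(e) * math.sin(h),   # y (east)
                math.sin(e))                 # z (up); negative when looking down

    def render_field_image(viewpoint: tuple, horizontal_deg: float, elevation_deg: float):
        direction = line_of_sight(horizontal_deg, elevation_deg)
        # hand viewpoint and direction to the 3D renderer and return the image
        return viewpoint, direction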
  • FIG. 7 is a diagram showing an example of a display mode in the display device 20 by the display control unit 112 according to the present embodiment.
  • In FIG. 7, a field-of-view image Img1 is displayed on the display device 20.
  • A part of the virtual three-dimensional space 1000 is projected onto the field-of-view image Img1.
  • The field-of-view image Img1 was acquired at the viewpoint position 1003 in FIG. 6.
  • The field-of-view image Img1 includes the building object 1010, the already-set flight path 1001a, and the route point 1002b.
  • The field-of-view image Img1 is an image looking down on the building object 1010 from an oblique direction.
  • The display control unit 112 may display a position 1006 corresponding to the viewpoint position 1003 of the field-of-view image Img1 on a screen 1005 showing map information corresponding to the flight target space.
  • The screen 1005 may be superimposed on the field-of-view image Img1.
  • The map information may be a two-dimensional map, or map information on the three-dimensional space acquired from a field of view different from that of the field-of-view image Img1. This makes it possible to grasp the viewpoint position 1003 from a bird's-eye view.
  • The display control unit 112 may also display information on the viewpoint position 1003 at which the field-of-view image Img1 was acquired.
  • In FIG. 7, viewpoint position information 1007 is displayed in the field-of-view image Img1.
  • The viewpoint position information 1007 shows latitude information (Lat.), longitude information (Lon.), horizontal angle information (Dir.), and elevation angle information (Ang.). This allows more accurate information on the viewpoint position to be grasped.
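  • Composing the viewpoint position information 1007 as an overlay string might look like the following; the exact formatting and the sample coordinates are assumptions.

    def viewpoint_overlay(lat: float, lon: float, dir_deg: float, ang_deg: float) -> str:
        return (f"Lat. {lat:.6f}  Lon. {lon:.6f}  "
                f"Dir. {dir_deg:.1f} deg  Ang. {ang_deg:.1f} deg")

    print(viewpoint_overlay(35.681236, 139.767125, 120.0, -15.0))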
  • A user setting the route of the flying object 30 can change the field-of-view image Img1 displayed on the display device 20 as appropriate in order to find the desired field-of-view image.
  • Such operations may be performed with an input device such as a touch panel attached to the display device 20, or via the input/output unit 15 of the information processing device 10.
  • The determination unit 113 has a function of determining a selected field-of-view image from the field-of-view images displayed on the display device 20. For example, suppose that the field-of-view image displayed on the display device 20 is changed as appropriate by the user's operations. When the desired field of view has been obtained and the user performs a determination operation via the input/output unit 15 or the like, the determination unit 113 determines the field-of-view image displayed on the display device 20 at that moment as the selected field-of-view image. When a plurality of field-of-view images have been obtained, the determination unit 113 may determine one of them as the selected field-of-view image. When the field-of-view image is a moving image, the determination unit 113 may extract one frame and determine it as the selected field-of-view image.
  • The point setting unit 114 has a function of setting a route point of the flying object 30 in the flight target space based on the viewpoint position of the selected field-of-view image. For example, the point setting unit 114 may set that viewpoint position itself as a route point of the flying object 30.
  • FIG. 8 is a diagram showing an example of a route point setting process by the point setting unit 114 according to the present embodiment.
  • In the example shown in FIG. 8, the point setting unit 114 sets a route point 1002c.
  • This route point 1002c is a position corresponding to the viewpoint position 1003 shown in FIG. 6. That is, the position at which the field-of-view image Img1 displayed in FIG. 7 was acquired can be used as a route point as it is. As a result, the route point desired by the user can be set more accurately.
  • The point setting unit 114 may also set the route point with an offset that takes into account the positional relationship of the flying object 30 and the camera 351 with respect to the viewpoint position.
  • The point setting unit 114 may set the orientation of the flying object 30 at the route point, or the direction of the optical axis of the camera 351 mounted on the flying object 30, based on the line-of-sight direction at the viewpoint position of the selected field-of-view image.
  • The orientation here includes the horizontal direction and the elevation angle. As shown in FIG. 8, the orientations 1004a, 1004b, and 1004c of the flying object 30 or of the optical axis of the camera 351 at the route points 1002a, 1002b, and 1002c may be the directions corresponding to the line-of-sight directions at the viewpoint positions corresponding to the respective route points.
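  • The inverse computation, deriving the horizontal direction and elevation angle to be set at a route point from the line-of-sight unit vector of the selected field-of-view image, can be sketched as follows; the axis convention matches the earlier sketch and is an assumption.

    import math

    def orientation_from_sight(d: tuple) -> tuple:
        """Return (horizontal_deg, elevation_deg) for a unit direction (x, y, z)."""
        horizontal = math.degrees(math.atan2(d[1], d[0]))                  # around the vertical axis
        elevation = math.degrees(math.asin(max(-1.0, min(1.0, d[2]))))    # clamp for safety
        return horizontal, elevation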
  • Thereby, in work performed in the flight target space represented by the virtual three-dimensional space 1000, the orientation of the flying object 30 and the direction of the optical axis of the camera 351 can be set more accurately and appropriately.
  • As a case in which the point setting unit 114 sets the orientation of the flying object 30 based on the line-of-sight direction, it is conceivable, for example, that an external device 35 such as a sensor 352 or an actuator mounted on the flying object 30 performs work on a building or the like corresponding to an object included in the field-of-view image. A case in which the point setting unit 114 sets the direction of the optical axis of the camera 351 based on the line-of-sight direction may be, for example, one in which a building or the like corresponding to an object included in the field-of-view image is photographed by the camera 351.
  • The point setting unit 114 may set, as a route point, a position in the flight target space at which the camera 351 can generate a captured image corresponding to the selected field-of-view image. That is, the point setting unit 114 may set the route point and orientation so that the camera 351 can generate a captured image corresponding to the selected field-of-view image.
  • Such a point and orientation can be obtained from the relative position and attitude relationship between the flying object 30 and the camera 351, and from the relationship between position information in the virtual three-dimensional space and position information in the actual flight target space.
  • Thereby, the image captured by the camera 351 comes closer to the selected field-of-view image.
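  • A hedged sketch of this correspondence: the route point is chosen so that the mounted camera, rather than the vehicle reference point, ends up at the selected viewpoint. The yaw-only rotation of the mounting offset and the pure-translation mapping between virtual and real coordinates are simplifying assumptions.

    import math

    def route_point_for_camera(viewpoint_virtual: tuple, yaw_deg: float,
                               camera_offset: tuple, origin_real: tuple) -> tuple:
        """All positions are (x, y, z); camera_offset is in the body frame."""
        yaw = math.radians(yaw_deg)
        # rotate the body-frame camera offset into the world frame (yaw only)
        ox = camera_offset[0] * math.cos(yaw) - camera_offset[1] * math.sin(yaw)
        oy = camera_offset[0] * math.sin(yaw) + camera_offset[1] * math.cos(yaw)
        oz = camera_offset[2]
        # virtual coordinates -> real coordinates (assumed to be a pure translation)
        real = tuple(v + o for v, o in zip(viewpoint_virtual, origin_real))
        # place the vehicle so that the camera sits at the viewpoint
        return (real[0] - ox, real[1] - oy, real[2] - oz)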
  • Information on the route points set by the point setting unit 114 is output to, for example, the flying object 30.
  • The flying object 30 performs processing related to its flight mode and to the work at each route point based on the acquired route point information. The route point information may also be stored in the storage 13 as appropriate. Furthermore, a flight path may be set based on the route points by another program of the control unit 11.
  • FIG. 9 is a flowchart relating to the control of the information processing apparatus 10 according to the present embodiment.
  • First, the acquisition unit 111 acquires geographic information from the external server 40 or the like (step SQ101).
  • Next, the display control unit 112 causes the display device 20 to display field-of-view images in the virtual three-dimensional space obtained from the acquired geographic information (step SQ103).
  • Next, the determination unit 113 determines one selected field-of-view image from the field-of-view images based on the user's operation or the like (step SQ105). The point setting unit 114 then sets a route point of the flying object 30 based on the selected field-of-view image (step SQ107).
  • Then, the control unit 11 outputs information on the set route point to the flying object 30 and/or the storage 13 or the like (step SQ109).
  • The route point information may be output each time one route point is set, or information on a plurality of route points may be output collectively after the control unit 11 has performed the setting processing for them.
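  • The flow of steps SQ101 to SQ109 can be tied together in a self-contained sketch; every helper below is a hypothetical stand-in for the corresponding unit described above, not the actual implementation.

    def acquire_geographic_info() -> dict:              # SQ101 (acquisition unit 111)
        return {"area": "flight target space"}

    def display_field_images(geo: dict) -> list:        # SQ103 (display control unit 112)
        return [{"viewpoint": (10.0, 5.0, 30.0), "sight": (0.7, 0.0, -0.7)}]

    def determine_selection(views: list) -> dict:       # SQ105 (determination unit 113)
        return views[0]                                 # stand-in for the user's choice

    def set_route_point(image: dict) -> tuple:          # SQ107 (point setting unit 114)
        return image["viewpoint"]

    def output_route_point(point: tuple) -> None:       # SQ109 (to flying object 30 / storage 13)
        print("route point:", point)

    if __name__ == "__main__":
        geo = acquire_geographic_info()
        image = determine_selection(display_field_images(geo))
        output_route_point(set_route_point(image))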
  • As described above, the system 1 acquires field-of-view images in the virtual three-dimensional space corresponding to the flight target space and sets route points of the flying object based on the desired field-of-view image.
  • This makes it possible to set route points (and the orientation of the flying object or camera) intuitively and more accurately in a space that simulates an environment close to the actual flight target space. The route points of a flying object can therefore be set more accurately and easily than before, reducing the burden on the user.
  • The devices described in this specification may be realized as a single device, or may be realized by a plurality of devices (for example, cloud servers), some or all of which are connected over a network.
  • For example, the control unit 11 and the storage 13 of the information processing device 10 may be realized by different servers connected to each other over a network.
  • The series of processes performed by the devices described in this specification may be realized by software, hardware, or a combination of software and hardware. A computer program for realizing each function of the information processing device 10 according to the present embodiment can be created and implemented on a PC or the like. A computer-readable recording medium storing such a computer program can also be provided.
  • The recording medium is, for example, a magnetic disk, an optical disc, a magneto-optical disk, or a flash memory. The above computer program may also be distributed, for example, via a network without using a recording medium.
  • The air vehicle according to the present disclosure can be expected to be used as an industrial air vehicle in investigations, surveying, observation, and the like.
  • The air vehicle of the present disclosure can also be used in aircraft-related industries such as those for multicopter drones.
  • Reference signs: 1 System; 10 Information processing device; 20 Display device; 21 VR goggles; 30 Flying object (an example of a moving body); 40 External server; 111 Acquisition unit; 112 Display control unit; 113 Determination unit; 114 Point setting unit


Abstract

The problem addressed by the present invention is to provide an information processing device that makes it possible to set a route point for a moving body more accurately and simply. The solution is an information processing device for setting a route point for a moving body, the device comprising: a display control unit for displaying, on a display device, field-of-view images in a virtual three-dimensional space corresponding to a space to be traveled; a determination unit for determining one selected field-of-view image from among the displayed field-of-view images; and a point setting unit for setting, based on the viewpoint position of the selected field-of-view image, a route point for the moving body in the space to be traveled.
PCT/JP2019/039281 2019-10-04 2019-10-04 Dispositif et procédé de traitement d'informations WO2021064982A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2019/039281 WO2021064982A1 (fr) 2019-10-04 2019-10-04 Dispositif et procédé de traitement d'informations
JP2019572703A JP6684012B1 (ja) 2019-10-04 2019-10-04 情報処理装置および情報処理方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/039281 WO2021064982A1 (fr) 2019-10-04 2019-10-04 Dispositif et procédé de traitement d'informations

Publications (1)

Publication Number Publication Date
WO2021064982A1 true WO2021064982A1 (fr) 2021-04-08

Family

ID=70286826

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/039281 WO2021064982A1 (fr) 2019-10-04 2019-10-04 Dispositif et procédé de traitement d'informations

Country Status (2)

Country Link
JP (1) JP6684012B1 (fr)
WO (1) WO2021064982A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022247498A1 (fr) * 2021-05-27 2022-12-01 北京三快在线科技有限公司 Surveillance de véhicule aérien sans pilote

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012053896A (ja) * 2009-05-18 2012-03-15 Kodaira Associates Kk 画像情報出力方法
JP2017076302A (ja) * 2015-10-16 2017-04-20 株式会社プロドローン 小型無人飛行機の制御方法


Also Published As

Publication number Publication date
JP6684012B1 (ja) 2020-04-22
JPWO2021064982A1 (ja) 2021-11-04


Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019572703

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19947469

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19947469

Country of ref document: EP

Kind code of ref document: A1