US11586225B2 - Mobile device, mobile body control system, mobile body control method, and program - Google Patents
Mobile device, mobile body control system, mobile body control method, and program
- Publication number
- US11586225B2 (application US16/991,056)
- Authority
- US
- United States
- Prior art keywords
- mobile device
- controller
- drone
- track
- mobile
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
- G—PHYSICS; G05—CONTROLLING; REGULATING; G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES; G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/0607—Rate of change of altitude or depth specially adapted for aircraft
- G05D1/0016—Control associated with a remote control arrangement characterised by the operator's input device
- G05D1/0022—Control associated with a remote control arrangement characterised by the communication link
- G05D1/0027—Control associated with a remote control arrangement involving a plurality of vehicles, e.g. fleet or convoy travelling
- G05D1/0033—Control associated with a remote control arrangement by having the operator tracking the vehicle either by direct line of sight or via one or more cameras located remotely from the vehicle
- G05D1/0044—Control associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
- G05D1/104—Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying
- G05D1/106—Simultaneous control of position or course in three dimensions specially adapted for aircraft with change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
Definitions
- the present disclosure relates to a mobile device, a mobile body control system, a mobile body control method, and a program. More specifically, the present disclosure relates to a mobile device, a mobile body control system, a mobile body control method, and a program by which a control target device can be reliably identified in a configuration for performing remote control on a plurality of mobile devices such as a plurality of drones, for example.
- a drone equipped with a camera is used, for example, to photograph a landscape on the ground from the sky.
- aerial images obtained by drones are also used to ascertain geographic features, to survey land, or at construction sites, for example.
- a drone performs flight control in accordance with an instruction from, for example, a remote controller on the ground.
- control of the flight direction, height, speed, etc. of the drone, control of starting and stopping photographing by a camera installed in the drone, control of photographing settings, etc. are also performed in accordance with instructions from the remote controller on the ground in some cases.
- Japanese Patent Laid-Open No. 2016-007648 discloses an operator terminal for controlling a plurality of robots by wireless communication.
- This document discloses a configuration in which identification colors unique to respective robots are assigned, and the identification color of a robot which is set as a control target of a terminal is displayed on the terminal, so that the control target robot can be identified by an operator.
- in this configuration, however, the operator needs to identify the control target by checking both the color displayed on the operator terminal and a color display board installed on each of the robots, which forces the operator to look away from the robots themselves.
- the present disclosure has been made in view of the above problems, and it is desirable to provide a mobile device, a mobile body control system, a mobile body control method, and a program for, in a configuration of controlling a plurality of mobile devices such as drones by using a remote controller, allowing a user (manipulator) who is manipulating the controller to identify a control target device without taking the eyes off the mobile devices such as the drones.
- a mobile device including a communication section that performs communication with a controller which selectively transmits control signals to a plurality of mobile devices, and a data processing section that performs movement control of the own device.
- the data processing section confirms whether or not an own-device selection signal which indicates that the own device is selected as a control target device has been received from the controller and, upon confirming reception of the own-device selection signal, performs movement control to cause the own device to move in accordance with a selected-device identification track which indicates that the own device is selected as the control target device.
- a mobile body control system including a controller that selectively transmits control signals to a plurality of mobile devices, and a mobile device that moves upon receiving a control signal from the controller.
- the controller transmits, to one of the plurality of mobile devices, a selection signal which indicates that the one mobile device is selected as a control target.
- the mobile device confirms whether or not an own-device selecting signal which indicates that the own device is selected as a control target device has been received from the controller and, upon confirming reception of the own-device selecting signal, performs movement control to cause the own device to move in accordance with a selected-device identification track which indicates that the own device is selected as the control target device.
- a mobile body control method which is executed by a mobile device, the mobile device including a communication section that performs communication with a controller which selectively transmits control signals to a plurality of mobile devices, and a data processing section that performs movement control of the own device.
- the mobile body control method includes causing the data processing section to confirm whether or not an own-device selecting signal which indicates that the own device is selected as a control target device has been received from the controller, and causing the data processing section to, upon confirming reception of the own-device selecting signal, perform movement control to cause the own device to move in accordance with a selected-device identification track which indicates that the own device is selected as the control target device.
- a mobile body control method which is executed by a mobile body control system including a controller that selectively transmits control signals to a plurality of mobile devices and a mobile device that moves upon receiving a control signal from the controller.
- the mobile body control method includes causing the controller to transmit, to one of the plurality of mobile devices, a selection signal which indicates that the one mobile device is selected as a control target.
- the mobile body control method further includes causing the mobile device to confirm whether or not an own-device selecting signal which indicates that the own device is selected as a control target device has been received from the controller and, upon confirming reception of the own-device selecting signal, perform movement control to cause the own device to move in accordance with a selected-device identification track which indicates that the own device is selected as the control target device.
- a program for causing a mobile device to perform mobile body control, the mobile device including a communication section that performs communication with a controller which selectively transmits control signals to a plurality of mobile devices and a data processing section that performs movement control of the own device.
- the program includes causing the data processing section to confirm whether or not an own-device selecting signal which indicates that the own device is selected as a control target device has been received from the controller, and causing the data processing section to, upon confirming reception of the own-device selecting signal, perform movement control to cause the own device to move in accordance with a selected-device identification track which indicates that the own device is selected as the control target device.
- the program according to the fifth embodiment of the present disclosure can be provided by a recording medium or communication medium for providing the program in a computer readable format to an information processing device or computer system that is capable of executing various program codes, for example. Since the program is provided in a computer readable format, processing in accordance with the program is executed on the information processing device or the computer system.
- a system refers to a logical set structure including a plurality of devices, and the devices of the structure are not necessarily included in the same casing.
- a user who is manipulating a controller is allowed to identify a control target device without taking the eyes off mobile devices such as drones.
- the configuration includes a communication section that performs communication with a controller which selectively transmits control signals to a plurality of mobile devices, and a data processing section that performs movement control of the own device.
- the data processing section confirms whether or not an own-device selecting signal which indicates that the own device is selected as a control target device has been received from the controller, and causes, upon receiving the own-device selecting signal, the own device to move in accordance with a selected-device identification track which indicates that the own device is selected.
- a drone starts flight in accordance with a selected-device identification track such as a forward/rearward horizontal flight track or a leftward/rightward horizontal flight track.
- FIG. 1 is a diagram for explaining an outline of a configuration and processing according to an embodiment of the present disclosure
- FIG. 2 is a diagram for explaining a configuration example of a controller (remote control device);
- FIG. 3 is a diagram for explaining another configuration example of the controller (remote control device).
- FIG. 4 is a diagram for explaining a configuration of communication between the controller and mobile devices (drones);
- FIG. 5 is a diagram for explaining another configuration of the communication between the controller and the mobile devices (drones);
- FIGS. 6 A to 6 C are diagrams for explaining examples of a selected-device identification track
- FIGS. 7 A to 7 C are diagrams for explaining further examples of the selected-device identification track
- FIGS. 8 A to 8 C are diagrams for explaining further examples of the selected-device identification track
- FIGS. 9 A and 9 B are diagrams for explaining further examples of the selected-device identification track
- FIG. 10 is a diagram for explaining a further example of the selected-device identification track
- FIGS. 11 A and 11 B are diagrams for explaining further examples of the selected-device identification track
- FIG. 12 is a diagram for explaining a further example of the selected-device identification track
- FIG. 13 is a diagram for explaining a further example of the selected-device identification track
- FIG. 14 is a diagram for explaining a further example of the selected-device identification track
- FIG. 15 is a diagram depicting a flowchart of a process sequence which is executed by a mobile device (drone);
- FIG. 16 is a diagram for explaining a track generation example
- FIG. 17 is a diagram depicting a flowchart for explaining another process sequence which is executed by the mobile device (drone);
- FIGS. 18 A and 18 B are diagrams for explaining a specific example of an enlargement/reduction parameter (scale value);
- FIG. 19 is a diagram for explaining another specific example of the enlargement/reduction parameter (scale value).
- FIG. 20 is a diagram depicting a flowchart for explaining a further process sequence which is executed by the mobile device (drone);
- FIG. 21 is a diagram depicting a flowchart for explaining the further process sequence which is executed by the mobile device (drone);
- FIG. 22 is a diagram for explaining generation of a 3D map and a process using the 3D map
- FIG. 23 is a diagram for explaining generation of a 3D map and a process using the 3D map
- FIG. 24 is a diagram for explaining generation of a 3D map and a process using the 3D map
- FIG. 25 is a diagram for explaining generation of a 3D map and a process using the 3D map
- FIG. 26 is a diagram depicting a flowchart for explaining a further process sequence which is executed by the mobile device (drone);
- FIG. 27 is a diagram depicting a flowchart for explaining a further process sequence which is executed by the mobile device (drone);
- FIG. 28 is a diagram for explaining a configuration example of the mobile device (drone).
- FIG. 29 is a diagram for explaining a configuration example of the controller (remote control device).
- FIG. 1 is a diagram depicting an entire configuration example of the movement control system according to the embodiment of the present disclosure.
- FIG. 1 depicts three drones including a drone-1 21 , a drone-2 22 , and a drone-3 23 , a controller (remote control device) 10 that controls the drones, and a user 1 who is a drone manipulator who manipulates the controller 10 .
- Control such as flight control is performed on all the drone-1 21 , the drone-2 22 , and the drone-3 23 by means of the single controller 10 manipulated by the user (manipulator) 1 .
- the controller 10 has a function of switching a control target drone. That is, the controller 10 is capable of performing three types of settings including a first setting for setting the drone-1 21 as a control target, a second setting for setting the drone-2 22 as a control target, and a third setting for setting the drone-3 23 as a control target.
- the controller 10 outputs a control signal having a frequency that varies according to the settings, for example.
- when a drone receives a control signal having the frequency assigned to the drone itself, the drone determines that the drone itself is the control target and performs various types of control, such as flight control or camera photographing control, based on the control signal.
- alternatively, control signals having identifiers (IDs) allocated to the respective drones may be broadcast.
- each of the drones may confirm the ID included in a received signal and determine that the drone itself is a control target when confirming that the received signal is a control signal having the ID allocated to the drone itself, so that the drone performs control based on the control signal.
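- A minimal sketch of this ID-based check is shown below; the message fields, the drone ID, and the function name are hypothetical and are not taken from the present disclosure.
```python
# Hypothetical sketch: a drone accepts a broadcast control message only when
# the message carries the ID allocated to the drone itself.
from dataclasses import dataclass, field

@dataclass
class ControlMessage:
    target_drone_id: str        # ID allocated to the selected drone
    command: str                # e.g., a flight or camera control command
    payload: dict = field(default_factory=dict)

OWN_DRONE_ID = "drone-2"        # assumed ID assigned to this drone

def handle_broadcast(message: ControlMessage) -> None:
    if message.target_drone_id != OWN_DRONE_ID:
        return                  # addressed to another drone; ignore it
    # The own device is the control target: act on the control signal.
    print(f"executing {message.command} with {message.payload}")

handle_broadcast(ControlMessage("drone-2", "set_altitude", {"meters": 30}))
```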
- the user 1 performs switching control on the three drones including the drone-1 21 , the drone-2 22 , and the drone-3 23 by using the single controller 10 in this manner.
- in the configuration according to the present disclosure, a situation in which the user loses track of which drone is being controlled is prevented, and the user 1 can identify the drone that is the control target of the controller 10 while watching the drones in the sky.
- in the embodiment described below, the control target of the controller is assumed to be a drone.
- however, processing according to the present disclosure is applicable not only to the case where the control targets of the controller are drones but also to cases where the control targets are various devices other than drones.
- the configuration and processing according to the present disclosure are applicable to a case where a plurality of vehicles or a plurality of robots exists as control targets and switching control of the vehicles or robots is performed by means of a single controller.
- FIG. 2 is a diagram depicting one configuration example of the controller 10 .
- the controller 10 depicted in FIG. 2 includes a touch panel-type display section. A user performs a touch operation on various icons displayed on the display section of the controller 10 , so that a process of switching a control target drone, a process of setting and transmitting a control signal to a control target drone, and the like can be performed.
- a display data region includes a control-target drone selecting section 11 and a selected-drone control section 12 .
- the control-target drone selecting section 11 is a display region in which an operation for switching a control target drone of the controller 10 is performed.
- the selected-drone control section 12 is a display region in which a control signal for performing control on a control target drone, such as flight control, is set and transmitted.
- a control-target drone display section 15 and a control-target drone switching section 16 are displayed in the control-target drone selecting section 11 .
- in the control-target drone switching section 16 , buttons for switching a control target drone of the controller 10 are displayed. For example, when a user operates (touches) a triangular icon on the right side, the control target drone of the controller 10 is switched from the drone-2 22 , which is the current control target, to the drone-3 23 .
- similarly, when the user operates (touches) the triangular icon on the left side, the control target drone of the controller 10 is switched from the drone-2 22 , which is the current control target, to the drone-1 21 .
- the controller 10 has the touch panel-type display section.
- the controller 10 is not limited to this configuration and can have various configurations.
- FIG. 3 is a diagram depicting one example of the controller 10 including no touch panel-type display section.
- the controller 10 depicted in FIG. 3 includes the control-target drone selecting section 11 and the selected-drone control section 12 .
- the control-target drone selecting section 11 is an operation region in which an operation for switching a control target drone of the controller 10 is performed.
- the selected-drone control section 12 is an operation region for setting and transmitting a control signal to perform control on a control target drone such as flight control.
- control-target drone display section 15 and the control-target drone switching section 16 are provided in the control-target drone selecting section 11 .
- Switches for switching a control target drone of the controller 10 are provided in the control-target drone switching section 16 .
- for example, when the switch on the right side is operated, the control target drone of the controller 10 is switched from the drone-2 22 , which is the current control target, to the drone-3 23 .
- when the switch on the left side is operated, the control target drone of the controller 10 is switched from the drone-2 22 , which is the current control target, to the drone-1 21 .
- FIG. 4 is a diagram for explaining one configuration example of communication between a controller and drones.
- solid line arrows each indicate a signal that is transmitted from the controller 10 to any of the drones 21 to 23
- dotted line arrows each indicate a signal that is exchanged among the drones.
- Signals that are transmitted from the controller 10 to the drones 21 to 23 include the following two types of signals: (a1) a selection signal and (a2) controller position information.
- signals that are transmitted from the controller 10 to the drones 21 to 23 also include a control signal for a drone. However, illustration of the control signal is omitted.
- the (a1) selection signal indicates that the drone having received this signal is selected as a control target device of a control signal that is transmitted from the controller 10 .
- a signal in any of various forms can be used as the selection signal.
- an ID set for each of the drones can be used as the selection signal.
- a signal with a unique frequency set for each of the drones may be used as the selection signal.
- the (a2) controller position information indicates the current position of the controller.
- the (a2) controller position information is 3D position information, which is specifically position information including latitude information, longitude information, height information, etc.
- the controller 10 includes an own-position information obtaining section such as an inertial measurement unit (IMU) or a global positioning system (GPS) and analyzes the own position of the controller 10 on the basis of information obtained by the own-position information obtaining section.
- Both the (a1) selection signal and the (a2) controller position information are regularly transmitted from the controller 10 to a drone.
- signals that are exchanged among the drones as indicated by the dotted line arrows in FIG. 4 include the own device position information of each drone.
- the drones each fly while reporting the own device position information to one another.
- the position information includes 3D position information, which is specifically position information including latitude information, longitude information, height information, etc.
- Each of the drones also includes an own-position information obtaining section such as an IMU or a GPS and analyzes the own position of the drone on the basis of information obtained by the own-position information obtaining section.
- This process of exchanging position information among the drones is also constantly executed during flight, so that the latest position information is constantly shared by the drones.
- the communication configuration explained above with reference to FIG. 4 is one example, and other communication configurations may be adopted.
- FIG. 5 depicts another communication configuration example.
- solid line arrows indicate signals that are transmitted from the controller 10 to the drones 21 to 23
- dotted line arrows indicate signals that are transmitted from the drones 21 to 23 to the controller 10 .
- the signals that are transmitted from the drones 21 to 23 to the controller 10 as indicated by the dotted line arrows in FIG. 5 include (b1) own drone position information.
- the drones each obtain the own position and transmit the obtained own-device position information to the controller 10 .
- This position information is 3D position information, which is specifically position information including latitude information, longitude information, height information, etc. For example, position information obtained by use of an IMU or GPS signal or the like is used.
- the signals that are transmitted from the controller 10 to the drones 21 to 23 include the following three types of signals: (a1) a selection signal, (a2) controller position information, and (a3) other drone position information.
- the signals that are transmitted from the controller 10 to the drones 21 to 23 include a control signal for a drone. However, illustration of the control signal is omitted.
- the (a1) selection signal and the (a2) controller position information are identical to those explained above with reference to FIG. 4 .
- the (a1) selection signal indicates that the drone having received this signal is selected as the control target device of a control signal that is transmitted from the controller 10 .
- the (a2) controller position information indicates the current position of the controller.
- the (a3) other drone position information indicates the position of a drone other than the own device. This information is generated on the basis of the “(b1) own drone position information” which the controller 10 receives from the drones 21 to 23 .
- the controller 10 generates, for each of the drones, correspondence data in which a drone identifier is associated with 3D position information regarding the drone and broadcasts the generated correspondence data.
- Each of the drones obtains, from the received data, the correspondence data (drone ID and position information) other than the position data associated with the ID of the own device, thereby confirming the positions of the drones excluding the own device.
- the position information is 3D position information, which is specifically position information including latitude information, longitude information, height information, etc.
- position information obtained by use of an IMU or GPS signal or the like is used.
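- The correspondence-data exchange described above could look like the following sketch; the drone IDs, the (latitude, longitude, height) tuple layout, and the function names are assumptions for illustration only.
```python
# Hypothetical sketch of FIG. 5-style position sharing: the controller combines
# each drone ID with its reported 3D position, and each drone keeps only the
# entries of the other drones.
from typing import Dict, Tuple

Position3D = Tuple[float, float, float]   # (latitude, longitude, height)

def build_broadcast(reported: Dict[str, Position3D]) -> Dict[str, Position3D]:
    """Controller side: correspondence data of drone ID and 3D position."""
    return dict(reported)

def extract_other_positions(broadcast: Dict[str, Position3D],
                            own_id: str) -> Dict[str, Position3D]:
    """Drone side: drop the entry whose ID matches the own device."""
    return {drone_id: pos for drone_id, pos in broadcast.items() if drone_id != own_id}

reports = {"drone-1": (35.01, 139.10, 40.0),
           "drone-2": (35.02, 139.11, 55.0),
           "drone-3": (35.03, 139.12, 30.0)}
print(extract_other_positions(build_broadcast(reports), own_id="drone-2"))
```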
- Communication processes depicted in FIG. 5 are also constantly executed during flight of the drones, so that the latest position information is constantly shared by the controller and the drones.
- the controller 10 performs switching control of a plurality of drones.
- in the configuration according to the present disclosure, a situation in which the user loses track of which drone is being controlled is prevented, and the user 1 can identify the control target drone of the controller 10 while watching the drones in the sky.
- the drone selected as the control target carries out special flight for indicating that the drone is selected as the control target, that is, flight in accordance with a “selected-device identification track.”
- the user 1 on the ground, manipulating the controller 10 confirms that one of a plurality of drones in the sky carries out special flight, that is, flight in accordance with a “selected-device identification track,” so that the user 1 can confirm that the one drone carrying out the flight in accordance with the “selected-device identification track” is a drone selected as the control target of the controller 10 .
- When performing this confirmation process, the user 1 , who is a manipulator of the controller 10 , does not need to look at the controller 10 . That is, while continuously observing the drones in the sky, the user 1 can assuredly confirm which drone is selected as the control target. Accordingly, the user 1 can perform a confirmation operation without taking the eyes off the drones.
- for this purpose, the user 1 who is a manipulator of the controller 10 only needs to grasp in advance which flight form is set as the "selected-device identification track."
- FIGS. 6 A to 6 C depict three examples of the “selected-device identification track.”
- when a drone having received a selection signal for the drone itself from the controller 10 that is being manipulated by the user 1 confirms, from the received selection signal, the fact that the drone is selected as a control target, the drone carries out forward/rearward horizontal flight to make the user 1 , who is manipulating the controller 10 , know that the drone is a device (drone) selected as the control target device.
- When the user 1 watching a plurality of drones in the sky sees a drone start forward/rearward horizontal flight, the user 1 can assuredly confirm that the drone is the current control target drone of the controller 10 .
- when a drone is selected as a control target by a selection signal from the controller 10 , the drone carries out leftward/rightward horizontal flight to make the user 1 know that the drone is a device (drone) selected as the control target device.
- the user 1 can assuredly confirm that the drone is the current control target drone of the controller 10 .
- when a drone is selected as a control target by a selection signal from the controller 10 , the drone carries out upward/downward vertical flight to make the user 1 know that the drone is a device (drone) selected as the control target device.
- the user 1 can assuredly confirm that the drone is the current control target drone of the controller 10 .
- FIGS. 7 A to 7 C depict the following three examples of the “selected-device identification track.”
- in each of (4) selected-device identification track example 4 to (6) selected-device identification track example 6, when a drone is selected as a control target by a selection signal from the controller 10 , the drone carries out "horizontally rotating flight," "vertically rotating flight," or "triangular flight" to make the user 1 know that the drone is a device (drone) selected as the control target device.
- the user 1 can assuredly confirm that the drone is the current control target drone of the controller 10 .
- FIGS. 8 A to 8 C depict the following three examples of the “selected-device identification track.”
- in each of (7) selected-device identification track example 7 to (9) selected-device identification track example 9, when a drone is selected as a control target by a selection signal from the controller 10 , the drone carries out "circular flight," "user-specified shape flight," or "swing flight" to make the user 1 know that the drone is a device (drone) selected as the control target device.
- the user 1 can assuredly confirm that the drone is the current control target drone of the controller 10 .
- FIGS. 9 A and 9 B depict the following two examples of the “selected-device identification track.”
- in each of (10) selected-device identification track example 10 and (11) selected-device identification track example 11, when a drone is selected as a control target by a selection signal from the controller 10 , the drone carries out "tilted flight" or "vertically-inverted flight" to make the user 1 know that the drone is a device (drone) selected as the control target device.
- the user 1 can assuredly confirm that the drone is the current control target drone of the controller 10 .
- the eleven examples of the selected-device identification track explained above with reference to FIGS. 6 A to 9 B each represent a flight example in which a user viewpoint is not taken into consideration. In some cases, depending on the viewpoint position of the user, whether or not flight in accordance with a selected-device identification track is being carried out is difficult to discern.
- FIG. 10 depicts the following example of the “selected-device identification track.”
- the “selected-device identification track” is set to have a circular track in a plane that is substantially orthogonal to a viewpoint direction of the user 1 .
- the user 1 can assuredly observe the drone flying in accordance with the circular track in a plane that is substantially orthogonal to the viewpoint direction of the user 1 . That is, the drone flying in accordance with the selected-device identification track can be assuredly discerned as the control target drone.
- when the distance between the drone and the user 1 is short, the size of a circular track as the "selected-device identification track" is set to be small, and when the distance is long, the size is set to be large.
- the size of the track is changed according to the distance to the user 1 as described above, so that the user 1 can assuredly discern a drone that is flying in accordance with the “selected-device identification track.”
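- One way to realize a circular track that faces the user and grows with distance, as described above, is sketched below; the scaling constants and the orthonormal-basis construction are assumptions, not values given in the present disclosure.
```python
# Hypothetical sketch: a circular track lying in a plane roughly orthogonal to
# the line of sight from the user to the drone, with a radius that increases
# as the drone-to-user distance increases.
import numpy as np

def identification_circle(drone_pos, user_pos, n_points=36,
                          base_radius=2.0, radius_per_meter=0.05):
    drone_pos = np.asarray(drone_pos, dtype=float)
    user_pos = np.asarray(user_pos, dtype=float)
    sight = drone_pos - user_pos
    distance = np.linalg.norm(sight)
    sight /= distance                           # unit viewpoint direction
    helper = np.array([0.0, 0.0, 1.0])
    if abs(np.dot(helper, sight)) > 0.9:        # avoid a near-parallel helper
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(sight, helper)
    u /= np.linalg.norm(u)
    v = np.cross(sight, u)                      # u, v span the plane facing the user
    radius = base_radius + radius_per_meter * distance
    angles = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    return [drone_pos + radius * (np.cos(a) * u + np.sin(a) * v) for a in angles]

track = identification_circle(drone_pos=(10.0, 40.0, 30.0), user_pos=(0.0, 0.0, 1.5))
```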
- when the drone flies in accordance with the "selected-device identification track," if the position of the track of the flight is invisible to the user 1 , for example, if the position is shielded by an obstacle, it is difficult for the user 1 to confirm the drone that is flying in accordance with the "selected-device identification track."
- the drone receives controller position information from the controller 10 , generates a 3D map of a 3D space along a flight route, analyzes the 3D map, and analyzes a 3D position of the obstacle.
- FIG. 13 depicts a case (D1) where a selected drone 31 that is selected as a control target by a selection signal from the controller 10 carries out flight to move upward to a position higher than other non-selected drones 32 a and 32 b in order to make the user 1 know that the drone is a device (drone) selected as a control target device.
- the user 1 who is observing drones while manipulating the controller 10 can confirm, by seeing a drone start to move upward to a highest position among the flying drones, that the drone at the highest position is the current control target drone of the controller 10 .
- the user 1 observing drones while manipulating the controller 10 can confirm, by seeing a drone start flight to approach the user 1 and reach the position closest to the user 1 among the flying drones, that the drone at the closest position is the current control target drone of the controller 10 .
- the flight in accordance with the plurality of different “selected-device identification tracks” can be separately carried out, but the flight in accordance with the plurality of “selected-device identification tracks” may be carried out in combination.
- in order to carry out flight in accordance with the "selected-device identification track," a drone needs to confirm a signal transmitted from the controller 10 , for example, a selection signal which indicates that the drone is selected as the control target.
- a process of confirming controller position information and positions of other drones may be needed depending on the flight form.
- steps in the flowcharts in FIGS. 15 , 17 , 20 , 21 , 26 , and 27 can be executed in accordance with a program stored in an internal memory of an information processing device installed in a drone and under control of a control section (data processing section) that includes a central processing unit (CPU) or the like having a function of executing the program in the information processing device.
- First, at step S 101 , the data processing section of the drone detects the fact that the own device (own drone) is selected as a control target device of the controller.
- the controller 10 transmits a selection signal to each of drones.
- the selection signal indicates that the drone is selected as a control target device of a control signal that is transmitted from the controller 10 .
- a signal in any of various forms can be used as the selection signal.
- an ID set to each of drones can be used as the selection signal.
- a signal with a unique frequency set to each of drones may be used as the selection signal.
- the drone analyzes the selection signal included in signals transmitted from the controller 10 and detects the fact that the own device (own drone) is selected as the control target.
- At step S 102 , the drone having detected the fact that the own device (own drone) is selected as the control target receives controller position information.
- the controller 10 transmits controller position information (3D position information) to drones as occasion demands, and the drone receives the transmitted position information and confirms the 3D position of the controller 10 .
- At step S 103 , the drone obtains the own position of the own device (own drone).
- the drone includes an own-position information obtaining section such as an IMU or a GPS and obtains the current own position (3D position) of the drone on the basis of information obtained by the own-position information obtaining section.
- At step S 104 , the drone generates a selected-device identification track.
- a process of generating the selected-device identification track is executed to determine positions, moving speeds, and moving directions for respective track constituting points (process points).
- FIG. 16 depicts a circular track 50 as one example of the selected-device identification track.
- Step S 104 is executed to determine a position (3D position), a moving speed, and a moving direction for each of the process points 51 .
- One process point to be processed first is defined as a start point 52 . From the position of the start point, positions (3D positions), moving speeds, and moving directions are determined for all the process points on the track 50 . Accordingly, one selected-device identification track is generated.
- FIG. 16 illustrates positions (3D positions), moving speeds, and moving directions for two process points P25 and P31 as examples.
- the process point P25 is set as a selected-device identification track constituting point having a position (3D position), a moving speed, and a moving direction as follows.
- the process point P31 is set as a selected-device identification track constituting point having a position (3D position), a moving speed, and a moving direction as follows.
- the data processing section of the drone generates one selected-device identification track by executing a process of determining the positions (3D positions), the moving speeds, and the moving directions for all the process points on the track 50 from the start point 52 .
- process points set in FIG. 16 are depicted at intervals for easy understanding. However, in the actual track generating process, the process points are densely set on the track 50 , and the process of determining the positions (3D positions), the moving speeds, and the moving directions for all the process points is executed, thereby to generate one selected-device identification track.
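- As a rough illustration of this process-point generation, a horizontal circular track could be produced as in the sketch below; the circle parameters and the constant reference speed are assumptions (the disclosure notes that speeds may differ from point to point).
```python
# Hypothetical sketch: generate process points of a circular selected-device
# identification track, each with a 3D position, a moving speed, and a moving
# direction, starting from a start point and proceeding around the circle.
import math
from dataclasses import dataclass

@dataclass
class ProcessPoint:
    position: tuple      # 3D position of the track constituting point
    speed: float         # moving speed at this point
    direction: tuple     # unit vector of the moving direction (circle tangent)

def circular_reference_track(center, radius=3.0, n_points=64, reference_speed=1.5):
    cx, cy, cz = center
    points = []
    for i in range(n_points):
        angle = 2.0 * math.pi * i / n_points
        position = (cx + radius * math.cos(angle), cy + radius * math.sin(angle), cz)
        direction = (-math.sin(angle), math.cos(angle), 0.0)
        points.append(ProcessPoint(position, reference_speed, direction))
    return points

reference_track = circular_reference_track(center=(0.0, 0.0, 30.0))
```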
- At step S 105 , the drone starts flight in accordance with the selected-device identification track generated at step S 104 .
- by observing this flight, the user can confirm that the drone is the drone selected as the current control target of the controller 10 .
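- The overall flow of FIG. 15 could be summarized as in the sketch below; every helper called here is hypothetical and stands in for processing that the present disclosure describes only at the flowchart level.
```python
# Hypothetical sketch of steps S 101 to S 105 executed by the data processing
# section of the drone.
def selected_device_identification_sequence(drone):
    if not drone.received_own_device_selection_signal():        # step S 101
        return
    controller_position = drone.receive_controller_position()   # step S 102
    own_position = drone.obtain_own_position()                   # step S 103
    track = drone.generate_identification_track(own_position,   # step S 104
                                                 controller_position)
    drone.start_flight_along(track)                              # step S 105
```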
- (2a) process sequence for carrying out flight in accordance with a selected-device identification track in which a user viewpoint is taken into consideration will be explained.
- First, at step S 201 , the data processing section of the drone detects the fact that the own device (own drone) is selected as the control target device of the controller.
- the controller 10 transmits a selection signal to each of drones.
- the selection signal indicates that the drone is selected as a control target device of a control signal that is transmitted from the controller 10 .
- a signal in any of various forms can be used as the selection signal.
- an ID set to each of drones can be used as the selection signal.
- a signal with a unique frequency set to each of drones may be used as the selection signal.
- the drone analyzes the selection signal included in signals transmitted from the controller 10 and detects the fact that the own device (own drone) is selected as the control target.
- At step S 202 , the drone having detected the fact that the own device (own drone) is selected as the control target receives controller position information.
- the controller 10 transmits controller position information (3D position information) to the drone as occasion demands, and the drone receives the transmitted position information and confirms the 3D position of the controller.
- At step S 203 , the drone obtains the own position of the own device (own drone).
- the drone includes an own-position information obtaining section such as an IMU or a GPS and obtains the current own position (3D position) of the drone on the basis of information obtained by the own-position information obtaining section.
- At step S 204 , the drone generates a selected-device identification track.
- a process of generating the selected-device identification track is executed to determine positions, moving speeds, and moving directions for respective track constituting points (process points).
- This process is similar to that explained above with reference to FIG. 16 .
- the determined track and the determined speed at each of track constituting points on the determined track, which are generated at step S 204 , are set as a “reference track” and a “reference speed,” respectively.
- At step S 205 , the data processing section of the drone calculates the distance between the own device (own drone) and the controller.
- the distance can be calculated on the basis of the position of the controller obtained at step S 202 and the position of the own device (own drone) obtained at step S 203 .
- At step S 206 , the data processing section of the drone calculates or obtains an enlargement/reduction parameter (scale value) according to the distance between the own device (own drone) and the controller.
- the reference track generated at step S 204 is a track that is used in a case where the distance between the user and the drone is equal to a prescribed reference distance (Ls), that is, is a reference “selected-device identification track.”
- the “selected-device identification track” according to the reference distance (Ls) and the enlargement/reduction parameter (scale value) according to the distance between the own device (own drone) and the controller will be explained in detail with reference to FIGS. 18 A and 18 B .
- FIG. 18 A depicts an example of the “selected-device identification track” according to the reference distance (Ls).
- the “selected-device identification track” according to the reference distance (Ls) is the reference track generated at step S 204 .
- a moving speed and a moving direction are specified for each of the track constituting points (process points).
- the moving speeds of the respective reference track constituting points (process points) are set to be different from one another.
- the moving speed of each of the reference track constituting points (process points) is defined as the reference speed.
- FIG. 18 B is a diagram depicting one example of the enlargement/reduction parameter (scale value) according to the distance between the own device (own drone) and the controller.
- the horizontal axis represents the distance between the own device (own drone) and the controller while the vertical axis represents the enlargement/reduction parameter (scale value).
- the value of the enlargement/reduction parameter (scale value) is set to become greater when the distance between the own device (own drone) and the controller is longer.
- the drone holds, in a storage section (memory), for example, a table or a function corresponding to the graph in FIG. 18 B and calculates the enlargement/reduction parameter (scale value) on the basis of the distance between the own device (own drone) and the controller calculated at step S 205 .
- a storage section for example, a table or a function corresponding to the graph in FIG. 18 B and calculates the enlargement/reduction parameter (scale value) on the basis of the distance between the own device (own drone) and the controller calculated at step S 205 .
- the distance between the own device (own drone) and the controller has a proportional relation with the enlargement/reduction parameter (scale value), but this is one example, and another relation may be set.
- the value of the enlargement/reduction parameter (scale value) is basically set to become greater with an increase in the distance between the own device (own drone) and the controller.
- in a case where the distance between the own device (own drone) and the controller is longer than the reference distance (Ls), the enlargement/reduction parameter (scale value) is a value greater than 1.
- the data processing section of the drone calculates the enlargement/reduction parameter (scale value) by using the function stored in the storage section on the basis of the distance between the own device (own drone) and the controller calculated at step S 205 or obtains the enlargement/reduction parameter (scale value) from the table stored in the storage section.
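- A minimal sketch of such a distance-to-scale mapping is shown below; the reference distance and the proportional form are assumptions corresponding to the graph described above, not prescribed values.
```python
# Hypothetical sketch: enlargement/reduction parameter (scale value) that grows
# with the drone-to-controller distance, with the reference distance Ls mapping
# to a scale of 1.0.
REFERENCE_DISTANCE_M = 20.0    # assumed reference distance (Ls)
MIN_SCALE = 0.5                # assumed lower bound for very short distances

def enlargement_reduction_scale(distance_m: float) -> float:
    return max(MIN_SCALE, distance_m / REFERENCE_DISTANCE_M)

print(enlargement_reduction_scale(10.0))   # closer than Ls  -> scale below 1
print(enlargement_reduction_scale(60.0))   # farther than Ls -> scale above 1
```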
- At step S 207 , the data processing section of the drone generates an enlarged/reduced track which is obtained by enlarging or reducing the reference track with use of the enlargement/reduction parameter (scale value) according to the distance between the own device (own drone) and the controller calculated or obtained at step S 206 .
- The enlarged or reduced track generated in this manner is set and used as the actual "selected-device identification track."
- A specific example of a process of enlarging/reducing the track is depicted in FIG. 19 .
- in a case where the enlargement/reduction parameter (scale value) is 1, the reference track is directly set as the "selected-device identification track."
- the speed for each of the points constituting the enlarged/reduced track determined at step S 207 is calculated at step S 208 and subsequent steps.
- the speed for each of the points constituting the enlarged/reduced track is also changed by application of the enlargement/reduction parameter (scale value) that has been applied to the enlarged/reduced track determined at step S 207 .
- a speed for each of the points constituting the reference track is defined as a reference speed. Speeds for the respective constituting points are different from one another.
- a maximum speed allowable for the drone is prescribed.
- in a case where the speed obtained by applying the enlargement/reduction parameter (scale value) exceeds the allowable maximum speed, a process of setting the allowable maximum speed as the set speed is executed.
- Step S 208 and the subsequent steps of executing this speed setting process will be explained below.
- At step S 208 , the data processing section of the drone first obtains, as a process point, a start position which is one point of the enlarged/reduced track generated at step S 207 .
- This step is similar to that explained above with reference to FIG. 16 .
- a start position to be processed first is obtained as a process point.
- At step S 209 , the data processing section of the drone executes a determination process based on the following determination expression (Expression 1): (process point reference speed) × (scale value) > (allowable maximum speed) . . . (Expression 1)
- the (process point reference speed) refers to a speed at a process point of the reference track. This speed is the process point-corresponding reference speed already calculated at step S 204 .
- the (scale value) is calculated or obtained at step S 206 and is an enlargement/reduction parameter (scale value) according to the distance between the own device (own drone) and the controller.
- the allowable maximum speed is a prescribed maximum speed that is allowable for the drone.
- In a case where it is determined at step S 209 that the above determination expression (Expression 1) is not satisfied, the process proceeds to step S 210 .
- At step S 210 , the data processing section of the drone sets, as the moving speed for the process point, the speed calculated by (process point reference speed) × (scale value).
- On the other hand, in a case where it is determined at step S 209 that the above determination expression (Expression 1) is satisfied, the process proceeds to step S 211 .
- At step S 211 , the data processing section of the drone sets the allowable maximum speed as the moving speed for the process point instead of the speed calculated by (process point reference speed) × (scale value).
- After the moving speed for one process point is determined at step S 210 or step S 211 , whether or not the moving speeds at all the process points of the track (enlarged/reduced track) have been determined is determined at step S 212 .
- In a case where an unprocessed process point remains, at step S 213 , the unprocessed process point is selected as a new process point, and a moving speed therefor is determined by execution of step S 209 and the subsequent steps.
- In a case where it is determined at step S 212 that the moving speeds for all the process points of the track (enlarged/reduced track) have been determined, the process proceeds to step S 214 .
- At step S 214 , the drone starts flight in accordance with the generated enlarged/reduced track and the determined moving speeds for the respective process points. That is, flight in accordance with the "selected-device identification track" is started.
- as a result, the user 1 can easily identify the drone flying in accordance with the selected-device identification track even when the drone is far away.
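- The enlargement of the track and the speed clamping of Expression 1 (steps S 207 to S 213 ) could be combined as in the sketch below; each track point is represented here as a (position, reference speed, direction) tuple, and the allowable maximum speed is an assumed value.
```python
# Hypothetical sketch: scale each reference track point about the track center
# (step S 207) and set per-point moving speeds, clamping them to the allowable
# maximum speed when (reference speed) x (scale value) exceeds it (Expression 1).
ALLOWABLE_MAX_SPEED = 5.0      # assumed maximum speed allowed for the drone

def enlarge_track(reference_points, scale, center):
    cx, cy, cz = center
    scaled = []
    for (x, y, z), reference_speed, direction in reference_points:
        position = (cx + scale * (x - cx),
                    cy + scale * (y - cy),
                    cz + scale * (z - cz))
        speed = min(reference_speed * scale, ALLOWABLE_MAX_SPEED)
        scaled.append((position, speed, direction))
    return scaled

reference = [((3.0, 0.0, 30.0), 1.5, (0.0, 1.0, 0.0)),
             ((0.0, 3.0, 30.0), 1.5, (-1.0, 0.0, 0.0))]
print(enlarge_track(reference, scale=4.0, center=(0.0, 0.0, 30.0)))
```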
- (2b) process sequence for carrying out flight in accordance with a selected-device identification track in which a user viewpoint and an obstacle are taken into consideration will be explained.
- This process sequence corresponds to that for a case where flight in accordance with the selected-device identification track which has been explained with reference to FIG. 12 is carried out. That is, in a case where a drone flies in accordance with a “selected-device identification track,” if the position of a track for the flight is invisible to the user 1 , for example, if the position is shielded by an obstacle, it is difficult for the user 1 to confirm the drone flying in accordance with the “selected-device identification track.”
- the drone flies in accordance with the “selected-device identification track” after moving to a position visible to the user 1 as depicted in FIG. 12 .
- the drone receives controller position information from the controller 10 , generates a 3D map of a 3D space along the flight route, and analyzes the 3D position of the obstacle by analyzing the 3D map.
- First, at step S 301 , the data processing section of the drone generates a 3D map in which an obstacle position is included, in the movement process of the own device (own drone).
- the drone has a function of creating a 3D map based on an image photographed by a camera installed in the drone and information obtained by an own-position obtaining section such as an IMU or a GPS, thereby to generate a 3D map in which the position of an obstacle is included in the moving process of the own device (own drone).
- FIG. 22 depicts one example of the 3D map generated by the drone.
- in the 3D map, objects identified in units of section areas defined by a grid are recorded, as depicted in FIG. 22 .
- although FIG. 22 is a drawing of a 2D plane, the 3D map actually represents a 3D space.
- Each rectangular section area depicted in FIG. 22 corresponds to one cube-shaped section area.
- the data processing section of the drone In the movement process of the own device (own drone) from a start position, the data processing section of the drone generates a 3D map, in which an obstacle position is included, by using information obtained by the camera, the IMU, the GPS, or the like. This 3D map generating process is constantly executed during the flight of the drone. The generated map is stored in the storage section of the drone.
- At step S 302 , the data processing section of the drone detects the fact that the own device (own drone) is selected as the control target device of the controller.
- the controller 10 transmits a selection signal to each drone.
- the selection signal indicates that the drone is selected as a control target device of a control signal that is transmitted from the controller 10 .
- a signal in any of various forms can be used as the selection signal.
- an ID set to each of drones can be used as the selection signal.
- a signal with a unique frequency set to each of drones may be used as the selection signal.
- the drone analyzes the selection signal included in signals transmitted from the controller 10 and detects the fact that the own device (own drone) is selected as the control target.
- At step S 303 , the drone having detected the fact that the own device (own drone) is selected as the control target receives controller position information.
- the controller 10 transmits controller position information (3D position information) to the drone as occasion demands, and the drone receives the transmitted position information and confirms the 3D position of the controller.
- At step S 304 , the drone obtains the own position of the own device (own drone).
- the drone includes an own-position information obtaining section such as an IMU or a GPS and obtains the current own position (3D position) of the drone on the basis of information obtained by the own-position information obtaining section.
- At step S 305 , the drone generates a selected-device identification track.
- a process of generating the selected-device identification track is executed to determine positions, moving speeds, and moving directions for respective track constituting points (process points).
- This process is similar to that explained above with reference to FIG. 16 .
- the drone analyzes the 3D map generated at step S 301 and detects an obstacle that is located between the own device (own drone) and the controller.
- the drone classifies a plurality of section areas constituting the 3D map into user observable section areas and user unobservable section areas.
- a plurality of trees exist between the drone 20 and the controller 10 held by the user 1 .
- the plurality of trees are detected as obstacles.
- a plurality of section areas constituting the 3D map are classified into user observable section areas and user unobservable section areas.
- that is, a plurality of section areas constituting the 3D map are each set as either a "user observable section area" or a "user unobservable section area."
- the process of generating a 3D map and the process of classifying section areas are constantly executed during flight of the drone, and the updated data is stored as occasion demands in the storage section of the drone.
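- One possible (purely illustrative) way to classify section areas is a straight-line visibility test on an occupancy grid, sketched below; the grid representation, the sampling-based test, and all parameter values are assumptions.
```python
# Hypothetical sketch: label each free cell of a 3D occupancy grid as user
# observable (True) or user unobservable (False) by sampling the straight line
# between the user (controller) position and the cell center.
import numpy as np

def classify_section_areas(occupied, cell_size, user_pos, samples=50):
    observable = {}
    user_pos = np.asarray(user_pos, dtype=float)
    for idx in np.ndindex(*occupied.shape):
        if occupied[idx]:
            continue                      # obstacle cells are not candidates
        center = (np.asarray(idx, dtype=float) + 0.5) * cell_size
        visible = True
        for t in np.linspace(0.0, 1.0, samples):
            sample = user_pos + t * (center - user_pos)
            cell = tuple((sample // cell_size).astype(int))
            inside = all(0 <= c < n for c, n in zip(cell, occupied.shape))
            if inside and occupied[cell]:
                visible = False           # an obstacle shields this section area
                break
        observable[idx] = visible
    return observable

grid = np.zeros((6, 6, 4), dtype=bool)
grid[3, :, :2] = True                     # a hypothetical row of trees as obstacles
labels = classify_section_areas(grid, cell_size=2.0, user_pos=(1.0, 1.0, 1.5))
```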
- the drone selects, as a track-start-point setting section area, a “user observable section area” that is closest to the current position of the own device (own drone).
- a track-start-point setting section area 71 depicted in FIG. 24 is the closest “user observable section area.”
- At step S 308 , the drone generates a track including the track-start-point setting section area and sets, as track setting section areas, the section areas each including a part of the generated track.
- for example, a track 72 depicted in FIG. 24 is generated.
- At step S 309, the drone determines whether or not all the track setting section areas are "user observable section areas."
- In a case where the track setting section areas include at least one section area that is not a "user observable section area," the determination at step S 309 is No, and the process proceeds to step S 310.
- In a case where all the track setting section areas are "user observable section areas," the determination at step S 309 is Yes, and the process proceeds to step S 313.
- the data processing section of the drone determines, at step S 310 , whether or not any “user observable section area” that is not selected as the track-start-point setting section area is left unprocessed.
- In a case where such an unprocessed "user observable section area" is left, it is set, at step S 311, as a new track-start-point setting section area, and steps S 308 and S 309 are executed again.
- In a case where it is determined, at step S 310, that no "user observable section area" that is not selected as a track-start-point setting section area is left unprocessed, the process proceeds to step S 312 to report an error to the controller. Then, the process is ended.
- On the other hand, in a case where the determination at step S 309 is Yes, that is, in a case where all the track setting section areas are "user observable section areas," the process proceeds to step S 313.
- the data processing section of the drone determines, at step S 313, the track that includes only "user observable section areas" as the selected-device identification track.
- At step S 314, the drone starts flight in accordance with the track determined at step S 313.
- a selected-device identification track 75 depicted in FIG. 25 is generated at step S 313 .
- the selected-device identification track 75 depicted in FIG. 25 includes “user observable section areas” only.
- the user 1 can observe the entire selected-device identification track 75 without it being shielded by any obstacle and thus can determine that the drone flying in accordance with this track is the control target drone of the controller 10 .
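- The candidate search, from selecting the closest "user observable section area" as the track-start-point setting section area through the determination at step S 313, can be summarized by the following hedged Python sketch; generate_track and areas_covered_by stand in for the track generation and section-area lookup described above and are hypothetical names.

```python
# Minimal sketch: try observable start areas in order of distance from the drone, and
# return the first candidate track that passes through user observable section areas only.
import math
from typing import Callable, Dict, List, Optional, Tuple

Point = Tuple[float, float, float]

def find_observable_track(
    drone_pos: Point,
    observable_area_centers: Dict[str, Point],             # "user observable section areas"
    generate_track: Callable[[Point], List[Point]],        # builds a track through a start area (step S 308)
    areas_covered_by: Callable[[List[Point]], List[str]],  # section areas the track passes through
) -> Optional[List[Point]]:
    candidates = sorted(observable_area_centers.items(),
                        key=lambda item: math.dist(drone_pos, item[1]))
    for _, start_center in candidates:                      # closest area first; the rest at step S 311
        track = generate_track(start_center)
        if all(a in observable_area_centers for a in areas_covered_by(track)):
            return track                                     # step S 313: fully observable track found
    return None                                              # step S 312: report an error to the controller
```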
- (3a) Next, a process sequence for carrying out flight in accordance with a selected-device identification track in which the relative position (height) with respect to other drones is taken into consideration will be explained.
- This process sequence corresponds to that for a case where flight in accordance with the selected-device identification track which has been explained with reference to FIG. 13 is carried out. That is, as depicted in FIG. 13 , the selected drone 31 that is selected as the control target by a selection signal from the controller 10 carries out flight to move upward to a position higher than the positions of the other non-selected drones 32 a and 32 b in order to make the user 1 know that the drone is a device (drone) selected as the control target device.
- the user 1 who is observing drones while manipulating the controller 10 can confirm, by seeing a drone start to move upward to the highest position among the flying drones, that the drone at the highest position is the current control target drone of the controller 10 .
- the data processing section of the drone detects the fact that the own device (own drone) is selected as the control target device of the controller.
- the controller 10 transmits a selection signal to each drone.
- the selection signal indicates that the drone is selected as the device to be controlled by the control signals transmitted from the controller 10 .
- a signal in any of various forms can be used as the selection signal.
- an ID set to each of drones can be used as the selection signal.
- a signal with a unique frequency set to each of drones may be used as the selection signal.
- the drone analyzes the selection signal included in signals transmitted from the controller 10 and detects the fact that the own device (own drone) is selected as the control target.
- At step S 402, the drone having detected the fact that the own device (own drone) is selected as the control target obtains the positions of other devices (other drones).
- the position information regarding the other devices can be received through communication performed among the drones or communication performed via the controller.
- At step S 403, the drone having detected the fact that the own device (own drone) is selected as the control target obtains the own position of the own device (own drone).
- the drone includes an own-position information obtaining section such as an IMU or a GPS and obtains the current own position (3D position) of the drone on the basis of information obtained by the own-position information obtaining section.
- At step S 404, the drone compares the position information regarding the own device (own drone) with the position information regarding the other devices (other drones) and determines whether or not the height of the own device (own drone) is the highest among those of the other devices (other drones).
- In a case where the height of the own device (own drone) is determined, at step S 404, to be the highest among those of the other devices (other drones), the process proceeds to step S 405.
- At step S 405, the drone carries out flight in accordance with a selected-device identification track that differs from the "selected-device identification track for moving upward to a position higher than any of the other mobile devices (other drones)."
- On the other hand, in a case where it is determined, at step S 404, that another device (another drone) is located at a position higher than the own device (own drone), the process proceeds to step S 406.
- the drone obtains the current height (h_0) of the own device (own drone) at step S 406 .
- At step S 407, the drone calculates a target height (h_1) to be reached as a result of upward movement of the own device (own drone).
- At step S 408, the drone causes the own device (own drone) to move upward to the target height (h_1) calculated at step S 407.
- the user 1 observing drones while manipulating the controller 10 can confirm, by seeing the drone start to move upward to the highest position among the flying drones, that the drone is the current control target drone of the controller 10 .
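- Combining steps S 406 to S 408 with Expression 2, that is, h_1 = (h_max) + (h_off), the target-height calculation can be sketched as follows; h_max is taken here as the highest altitude among the other drones, and the offset value is only an example.

```python
# Minimal sketch of Expression 2: the selected drone climbs to h_1 = h_max + h_off so
# that it clearly becomes the highest of the flying drones. h_off is a tunable offset.
from typing import List

def target_height(other_drone_heights: List[float], h_off: float = 2.0) -> float:
    h_max = max(other_drone_heights)
    return h_max + h_off                     # Expression 2: h_1 = (h_max) + (h_off)

# Example: the other drones fly at 10 m, 12 m, and 9 m, so the selected drone climbs to 14 m.
print(target_height([10.0, 12.0, 9.0]))      # 14.0
```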
- (3b) Next, a process sequence for carrying out flight in accordance with a selected-device identification track in which the relative position (distance to the user) with respect to other drones is taken into consideration will be explained.
- This process sequence corresponds to that for a case where flight in accordance with the selected-device identification track which has been explained with reference to FIG. 14 is carried out. That is, as depicted in FIG. 14 , the selected drone 31 selected as a control target by a selection signal from the controller 10 carries out flight to approach a position closer to the user 1 and the controller 10 than the other non-selected drones 32 a and 32 b in order to make the user 1 know that the drone is a device (drone) selected as the control target device.
- the user 1 observing drones while manipulating the controller 10 can confirm, by seeing the drone move to a position closest to the user 1 among the flying drones, that the drone is the current control target drone of the controller 10 .
- the data processing section of the drone detects the fact that the own device (own drone) is selected as the control target device of the controller.
- the controller 10 transmits a selection signal to each of drones.
- the selection signal indicates that the drone is selected as the device to be controlled by the control signals transmitted from the controller 10 .
- a signal in any of various forms can be used as the selection signal.
- an ID set to each of drones can be used as the selection signal.
- a signal with a unique frequency set to each of drones may be used as the selection signal.
- the drone analyzes a selection signal included in signals transmitted from the controller 10 and detects the fact that the own device (own drone) is selected as a control target.
- At step S 502, the drone having detected the fact that the own device (own drone) is selected as the control target receives controller position information.
- the controller 10 transmits controller position information (3D position information) to the drone as occasion demands, and the drone receives the transmitted position information and confirms the 3D position of the controller.
- At step S 503, the drone having detected the fact that the own device (own drone) is selected as the control target obtains the positions of other devices (other drones).
- information regarding the positions of the other devices can be received through communication performed among the drones or communication performed via the controller.
- At step S 504, the drone having detected the fact that the own device (own drone) is selected as the control target obtains the own position of the own device (own drone).
- the drone includes an own-position information obtaining section such as an IMU or a GPS and obtains the current own position (3D position) of the drone on the basis of information obtained by the own-position information obtaining section.
- At step S 505, the drone compares the position information regarding the own device (own drone) with the position information regarding the other devices (other drones) to determine whether or not the own device (own drone) is located closer to the position of the controller than any of the other devices (other drones).
- In a case where the own device (own drone) is determined, at step S 505, to be located at a position closer to the controller than any of the other devices (other drones), the process proceeds to step S 506.
- At step S 506, the drone carries out flight in accordance with a selected-device identification track that differs from the "selected-device identification track for moving to a position closer to the controller than any of the other devices (other drones)."
- On the other hand, in a case where another device (another drone) is determined, at step S 505, to be located at a position closer to the controller than the own device (own drone), the process proceeds to step S 507.
- the drone obtains, at step S 507 , the current position (P0) of the own device (own drone).
- Next, the drone calculates a target distance (d_1), that is, the distance from the controller at which the own device (own drone) will be located closer to the controller than any of the other devices (other drones).
- At step S 509, the drone obtains a position P1 at which the distance from the controller is equal to the target distance (d_1), on the straight line connecting the controller to the own device (own drone).
- At step S 510, the drone causes the own device (own drone) to move to the position P1 calculated at step S 509.
- the user 1 observing drones while manipulating the controller 10 can confirm, by seeing the drone start to move toward the user and reach the position closest to the user among the flying drones, that the drone is the current control target drone of the controller 10 .
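- A minimal sketch of steps S 507 to S 510, using Expression 3, that is, d_1 = (d_min) − (d_off), is given below; d_min is taken here as the smallest controller-to-other-drone distance, d_off is an example offset, and the function name is an illustrative assumption.

```python
# Minimal sketch: the target position P1 lies on the straight line connecting the
# controller to the selected drone, at distance d_1 = d_min - d_off from the controller,
# so the selected drone ends up closer to the controller than any other drone.
import math
from typing import List, Tuple

Point = Tuple[float, float, float]

def target_position_p1(controller_pos: Point, own_pos: Point,
                       other_drone_positions: List[Point], d_off: float = 2.0) -> Point:
    d_min = min(math.dist(controller_pos, p) for p in other_drone_positions)
    d_1 = d_min - d_off                              # Expression 3: d_1 = (d_min) - (d_off)
    norm = math.dist(controller_pos, own_pos)        # assumed non-zero for this sketch
    direction = tuple((o - c) / norm for o, c in zip(own_pos, controller_pos))
    # P1: move along the controller-to-drone line to distance d_1 from the controller (step S 509).
    return tuple(c + d_1 * d for c, d in zip(controller_pos, direction))
```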
- a mobile device (drone) 100 includes a hardware part 110 and a software part 120 .
- the software part 120 corresponds to a data processing section that executes various processes in accordance with a program (software) stored in a storage section 116 , for example.
- the data processing section includes a processor such as a CPU having a program executing function such that various processes are executed by the processor executing the program.
- the hardware part 110 includes a sensor section (camera, etc.) 111 , an IMU section 112 , a GPS section 113 , a communication section 114 , a propeller driving section 115 , and the storage section 116 .
- the sensor section (camera, etc.) 111 includes various sensors such as a camera, a distance sensor, and a temperature sensor.
- the IMU section 112 and the GPS section 113 are own-position information obtaining sections such as an IMU and a GPS as explained above.
- the communication section 114 performs communication with a controller 200 or another mobile device 300 .
- the propeller driving section 115 is a section that drives a propeller for causing the drone to fly.
- the mobile device 100 is not limited to drones, and a vehicle or a robot may be used therefor.
- In a case of a vehicle, the driving section includes wheels, etc.
- In a case of a walking robot, the driving section is formed as a leg driving section for walking.
- the storage section 116 stores a program which is executed by the software part 120 , for example. Further, the storage section 116 is also used as a work area or a storage area for various parameters that are used to execute the program.
- the storage section 116 also stores sensor acquisition information such as camera photograph information acquired by the sensor section 111 , own position information acquired by the IMU section 112 and the GPS section 113 , and further, device position information received from the controller 200 and the other mobile device 300 .
- the storage section 116 is used as a storage region for 3D map data, etc. generated in the software part 120 .
- the software part 120 corresponds to a data processing section that executes various processes in accordance with the program (software) stored in the storage section 116 , for example.
- the software part 120 includes a processor such as a CPU having a program executing function such that various processes are executed by the processor executing the program.
- a plurality of process blocks in the software part 120 in FIG. 28 are depicted as processing sections that are independent of one another according to the process types of a plurality of processes to be executed in the software part 120 .
- the software part 120 includes a 3D map generation section (obstacle detection section) 121 , an own position detection section 122 , an information integration section 123 , a control form determination section 124 , a normal track planning section 125 , an identification track planning section 126 , and a device control section 127 .
- the 3D map generation section (obstacle detection section) 121 generates a 3D map by using information obtained by the sensor section 111 such as a camera, together with own position information calculated on the basis of information obtained by the IMU section 112 and the GPS section 113, and further detects obstacles in the 3D map.
- the own position detection section 122 calculates the own position (3D position) on the basis of information obtained by the IMU section 112 and the GPS section 113 .
- the information integration section 123 integrates 3D map information and obstacle detection information generated by the 3D map generation section (obstacle detection section) 121 , the own position information calculated by the own position detection section 122 , and reception information received from the controller 200 or the other mobile device 300 via the communication section 114 and inputs the integrated information to the control form determination section 124 .
- the control form determination section 124 determines a control form, such as a flight form, for the mobile device 100 on the basis of the 3D map information, obstacle detection information, and own position information inputted from the information integration section 123 and reception information received from the controller 200 and the other mobile device 300 via the communication section 114 .
- the normal track planning section 125 plans a normal-time movement track (flight track) for the mobile device 100 .
- the identification track planning section 126 plans a track for a selected-device identification movement (flight) to make a user know that the mobile device is selected as a control target.
- the device control section 127 controls the propeller driving section 115 so that the mobile device 100 moves (flies) in accordance with the track planned by the normal track planning section 125 or the identification track planning section 126 .
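- The wiring of the process blocks of the software part 120 can be pictured with the following purely illustrative sketch; the class, method, and parameter names are assumptions and do not reflect an actual implementation.

```python
# Minimal sketch (hypothetical names): sensing results are integrated, a control form is
# chosen, and either the normal track or the selected-device identification track is
# planned and handed to device control, mirroring sections 121 to 127.
class SoftwarePartSketch:
    def __init__(self, map_gen, own_pos, integrate, choose_form,
                 plan_normal, plan_identification, control_device):
        self.map_gen = map_gen                           # 3D map generation section 121
        self.own_pos = own_pos                           # own position detection section 122
        self.integrate = integrate                       # information integration section 123
        self.choose_form = choose_form                   # control form determination section 124
        self.plan_normal = plan_normal                   # normal track planning section 125
        self.plan_identification = plan_identification   # identification track planning section 126
        self.control_device = control_device             # device control section 127

    def step(self, sensor_data, imu_gps_data, received_data):
        world = self.integrate(self.map_gen(sensor_data), self.own_pos(imu_gps_data), received_data)
        if self.choose_form(world) == "identification":  # own device selected by the controller
            track = self.plan_identification(world)
        else:
            track = self.plan_normal(world)
        self.control_device(track)                       # e.g., drives the propeller driving section 115
```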
- Next, a configuration example of the controller (remote control device) 200 will be explained with reference to FIG. 29.
- the controller (remote control device) 200 includes a hardware part 210 and a software part 220 .
- the software part 220 corresponds to a data processing section that executes various processes in accordance with a program (software) stored in a storage section 216 , for example.
- the software part 220 includes a processor such as a CPU having a program executing function such that various processes are executed by the processor executing the program.
- the hardware part 210 includes an output section 211 , an input section 212 , a communication section 213 , an IMU section 214 , a GPS section 215 , and the storage section 216 .
- the output section 211 includes an image output section, a sound output section, a lamp, and the like, for example.
- the image output section may include a touch panel such as a liquid crystal display, for example, so as to also have a function of the input section 212 .
- the sound output section is a loudspeaker.
- the lamp is an LED lamp such as a display lamp of the control target drone which has been explained above with reference to FIG. 3 .
- the input section 212 can be operated by a user and includes an input section for a drone control signal, an input section for a selection of a control target drone, and the like.
- the input section 212 may be formed by use of a touch panel type display section.
- the communication section 213 performs communication with a control target mobile device such as a drone.
- the IMU section 214 and the GPS section 215 are own-position information obtaining sections such as an IMU and a GPS.
- the storage section 216 stores a program which is executed in the software part 220 , for example. Moreover, the storage section 216 is used also as a work area or a storage area for various parameters that are used to execute the program.
- the storage section 216 also stores own position information obtained by the IMU section 214 and the GPS section 215 , and further, position information received from the mobile device 100 .
- the storage section 216 is used also as a storage area for various data generated in the software part 220 .
- the software part 220 corresponds to a data processing section that executes various processes in accordance with the program (software) stored in the storage section 216 , for example.
- the software part 220 includes a processor such as a CPU having a program executing function such that various processes are executed by the processor executing the program.
- a plurality of process blocks in the software part 220 in FIG. 29 are depicted as processing sections that are independent of one another according to the process types of a plurality of processes to be executed in the software part 220 .
- the software part 220 includes an output control section 221 , an inputted-information analysis section 222 , a control target determination section 223 , a transmission-data generation section 224 , an information integration section 225 , and a controller position estimation section 226 .
- the output control section 221 generates information to be outputted to the output section 211 , and outputs the information.
- the output control section 221 generates display information that is used to switch the control target drone, information indicating the current control target drone, and the like, and outputs the generated information to a display section.
- the inputted-information analysis section 222 analyzes information inputted via the input section 212 . Specifically, the inputted-information analysis section 222 receives and analyzes control-target-drone switching information, etc. from a user and outputs the analysis result to the control target determination section 223 .
- the control target determination section 223 receives the analysis information from the inputted-information analysis section 222 and determines a drone that is a control target of the controller 200 . Information regarding the determination is inputted to the transmission-data generation section 224 .
- the transmission-data generation section 224 transmits a selection signal to a mobile device (drone) via the communication section 213 .
- the information integration section 225 generates integrated information by integrating the controller position calculated by the controller position estimation section 226 with drone position information received from the drone via the communication section 213, etc., and outputs the integrated information to the transmission-data generation section 224 so that the information is transmitted to the drone via the communication section 213 .
- the integrated information is transferred to the output control section 221 to cause the information to be outputted to the output section 211 .
- the controller position estimation section 226 calculates the position (3D position) of the controller 200 on the basis of information obtained by the IMU section 214 and the GPS section 215 , etc.
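- A minimal sketch of the controller-side flow from analyzed user input to transmission of a selection signal is given below; the message format and function names are hypothetical and only illustrate how sections 222 to 226 could cooperate.

```python
# Minimal sketch (hypothetical names): the analyzed user input determines the control
# target drone, and a selection signal naming that drone is transmitted together with
# the controller's own estimated 3D position.
from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class SelectionMessage:
    selected_drone_id: str
    controller_position: Tuple[float, float, float]

def send_selection(user_input: str,
                   determine_target: Callable[[str], str],                       # control target determination section 223
                   estimate_position: Callable[[], Tuple[float, float, float]],  # controller position estimation section 226
                   transmit: Callable[[SelectionMessage], None]) -> None:        # sections 224 / 213
    target_id = determine_target(user_input)
    transmit(SelectionMessage(target_id, estimate_position()))
```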
- a mobile device including:
- a communication section that performs communication with a controller which selectively transmits control signals to a plurality of mobile devices
- the selected-device identification track is formed according to a movement form which is any one of forward/rearward horizontal movement, leftward/rightward horizontal movement, upward/downward vertical movement, or rotational movement in a particular direction.
- the mobile device includes a drone
- the selected-device identification track includes a flight track of the drone.
- the selected-device identification track is formed according to a flight form which is any one of forward/rearward horizontal flight, leftward/rightward horizontal flight, upward/downward vertical flight, rotational flight in a particular direction, swing flight, tilted flight, or vertically-inverted flight.
- upon confirming reception of the own-device selecting signal, the data processing section generates the selected-device identification track and causes the own device to move in accordance with the generated selected-device identification track.
- the data processing section generates, as the selected-device identification track, a track for moving in a plane that is substantially orthogonal to a visual line direction of a user who is manipulating the controller.
- the data processing section generates the selected-device identification track which varies in size depending on a distance between the controller and the own device.
- the data processing section generates the selected-device identification track that is larger in size when the distance between the controller and the own device is longer.
- the data processing section generates the selected-device identification track in a region that is observable by a user who is manipulating the controller.
- the data processing section determines whether or not any obstacle exists between the controller and the own device, detects a region where no obstacle exists between the controller and the own device in a case where any obstacle exists between the controller and the own device, and generates the selected-device identification track in the detected region.
- the data processing section generates a three-dimensional map based on information acquired during movement of the own device and detects, by using the generated three-dimensional map, the region that is observable by the user who is manipulating the controller.
- the data processing section generates, as the selected-device identification track, a track for moving to a highest position among positions of all mobile devices that are to be control targets of the controller.
- the data processing section generates, as the selected-device identification track, a track for moving to a position closest to the controller among positions of all mobile devices that are to be control targets of the controller.
- the data processing section generates, as the selected-device identification track, a track in which positions of points constituting the track and moving speeds at the respective track constituting points are specified.
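- As a hedged illustration of the items above, the following fragment builds a selected-device identification track as a list of track constituting points whose size scales with the controller-to-drone distance and whose per-point speed is capped when Expression 1, that is, (process point reference speed) × (scale value) > allowable maximum speed, holds; the circular shape, the scale rule, and all numeric values are illustrative assumptions.

```python
# Minimal sketch: a circular identification track around a center point, enlarged when the
# controller is farther away, with the scaled speed clipped to the allowable maximum speed.
import math
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TrackPoint:
    position: Tuple[float, float, float]
    speed: float

def identification_track(center: Tuple[float, float, float], distance_to_controller: float,
                         reference_speed: float = 1.0, allowable_max_speed: float = 3.0,
                         n_points: int = 16) -> List[TrackPoint]:
    scale = max(1.0, distance_to_controller / 10.0)   # larger track when the drone is farther away
    speed = reference_speed * scale
    if speed > allowable_max_speed:                    # Expression 1: reference speed x scale > allowable maximum speed
        speed = allowable_max_speed
    cx, cy, cz = center
    return [TrackPoint((cx + scale * math.cos(2 * math.pi * i / n_points),
                        cy + scale * math.sin(2 * math.pi * i / n_points), cz), speed)
            for i in range(n_points)]
```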
- a mobile body control system including:
- a controller that selectively transmits control signals to a plurality of mobile devices
- the controller transmits, to one of the plurality of mobile devices, a selection signal which indicates that the one mobile device is selected as a control target, and
- a mobile body control method which is executed by a mobile device, the mobile device including a communication section that performs communication with a controller which selectively transmits control signals to a plurality of mobile devices, and a data processing section that performs movement control of the own device, the method including:
- causing the data processing section to, upon confirming reception of the own-device selecting signal, perform movement control to cause the own device to move in accordance with a selected-device identification track which indicates that the own device is selected as the control target device.
- a mobile body control method which is executed by a mobile body control system including a controller that selectively transmits control signals to a plurality of mobile devices and a mobile device that moves upon receiving a control signal from the controller, the method including:
- causing the controller to transmit, to one of the plurality of mobile devices, a selection signal which indicates that the one mobile device is selected as a control target;
- causing the data processing section to, upon confirming reception of the own-device selecting signal, perform movement control to cause the own device to move in accordance with a selected-device identification track which indicates that the own device is selected as the control target device.
- a series of the processes explained herein can be executed by hardware, software, or a composite structure thereof.
- a program having a sequence of the processes recorded therein can be executed after being installed into a memory incorporated in dedicated hardware in a computer or can be executed after being installed into a general-purpose computer capable of executing various processes.
- a program may be previously recorded in a recording medium.
- the program can be installed in the computer from the recording medium.
- the program can be received over a network such as a local area network (LAN) or the Internet and be installed into a recording medium such as an internal hard disk.
- a system refers to a logical set structure including a plurality of devices, and the devices of the structure are not necessarily included in the same casing.
- a user who is manipulating a controller can identify a control target device without taking the eyes off mobile devices such as drones.
- the configuration includes a communication section that performs communication with a controller which selectively transmits control signals to a plurality of mobile devices, and a data processing section that performs movement control of the own device.
- the data processing section confirms whether or not an own-device selecting signal which indicates that the own device is selected as a control target device has been received from the controller and causes, upon reception of the own-device selecting signal, the own device to move in accordance with a selected-device identification track which indicates that the own device is selected.
- a drone starts flight in accordance with a selected-device identification track such as a forward/rearward horizontal flight track or a leftward/rightward horizontal flight track.
Abstract
Description
(process point reference speed)×(scale value)>allowable maximum speed (Expression 1)
h_1=(h_max)+(h_off) (Expression 2)
d_1=(d_min)−(d_off) (Expression 3)
-
- confirms whether or not an own-device selection signal which indicates that the own device is selected as a control target device has been received from the controller, and
- upon confirming reception of the own-device selection signal, performs movement control to cause the own device to move in accordance with a selected-device identification track which indicates that the own device is selected as the control target device.
-
- confirms whether or not an own-device selecting signal which indicates that the own device is selected as a control target device has been received from the controller, and
- upon confirming reception of the own-device selecting signal, performs movement control to cause the own device to move in accordance with a selected-device identification track which indicates that the own device is selected as the control target device.
-
- confirm whether or not an own-device selecting signal which indicates that the own device is selected as a control target device has been received from the controller, and
- upon confirming reception of the own-device selecting signal, perform movement control to cause the own device to move in accordance with a selected-device identification track which indicates that the own device is selected as the control target device.
Claims (17)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019150327A JP2021033447A (en) | 2019-08-20 | 2019-08-20 | Movable device, movable body control system, movable body control method, and program |
JP2019-150327 | 2019-08-20 | ||
JPJP2019-150327 | 2019-08-20 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210055744A1 US20210055744A1 (en) | 2021-02-25 |
US11586225B2 true US11586225B2 (en) | 2023-02-21 |
Family
ID=74647322
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/991,056 Active 2040-10-05 US11586225B2 (en) | 2019-08-20 | 2020-08-12 | Mobile device, mobile body control system, mobile body control method, and program |
Country Status (2)
Country | Link |
---|---|
US (1) | US11586225B2 (en) |
JP (1) | JP2021033447A (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024018766A1 (en) * | 2022-07-21 | 2024-01-25 | パナソニックホールディングス株式会社 | Autonomous movable body control system, autonomous movable body, and control device |
-
2019
- 2019-08-20 JP JP2019150327A patent/JP2021033447A/en active Pending
-
2020
- 2020-08-12 US US16/991,056 patent/US11586225B2/en active Active
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016007648A (en) | 2014-06-20 | 2016-01-18 | オムロン株式会社 | Robot control system |
US20170144297A1 (en) | 2014-06-20 | 2017-05-25 | Omron Corporation | Robot control system |
US20170269587A1 (en) * | 2015-09-04 | 2017-09-21 | YooJung Hong | Drone controller |
US9508263B1 (en) * | 2015-10-20 | 2016-11-29 | Skycatch, Inc. | Generating a mission plan for capturing aerial images with an unmanned aerial vehicle |
US20190176967A1 (en) * | 2016-03-31 | 2019-06-13 | Nikon Corporation | Flying device, electronic device, and program |
US20180164801A1 (en) * | 2016-12-14 | 2018-06-14 | Samsung Electronics Co., Ltd. | Method for operating unmanned aerial vehicle and electronic device for supporting the same |
US20180275659A1 (en) * | 2017-03-21 | 2018-09-27 | Kabushiki Kaisha Toshiba | Route generation apparatus, route control system and route generation method |
US20180312274A1 (en) * | 2017-04-27 | 2018-11-01 | Qualcomm Incorporated | Environmentally Aware Status LEDs for Use in Drones |
US20180356189A1 (en) * | 2017-06-13 | 2018-12-13 | Raytheon Company | Recapture of remotely-tracked command guided vehicle into the tracker's field-of-view |
US20190077507A1 (en) * | 2017-09-14 | 2019-03-14 | Drone Racing League, Inc. | Three-dimensional pathway tracking system |
US20210034078A1 (en) * | 2017-12-27 | 2021-02-04 | Intel Corporation | Dynamic generation of restricted flight zones for drones |
US20190220020A1 (en) * | 2019-03-26 | 2019-07-18 | Intel Corporation | Methods and apparatus for dynamically routing robots based on exploratory on-board mapping |
Also Published As
Publication number | Publication date |
---|---|
US20210055744A1 (en) | 2021-02-25 |
JP2021033447A (en) | 2021-03-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2022202920B2 (en) | System and method for autonomous operation of a machine | |
EP3272586B1 (en) | Work vehicle | |
EP3164769B1 (en) | Machine safety dome | |
JP6601554B2 (en) | Unmanned aerial vehicle, unmanned aircraft control system, flight control method, and computer program | |
JP6586109B2 (en) | Control device, information processing method, program, and flight system | |
US20200007751A1 (en) | Control apparatus, movable apparatus, and remote-control system | |
CN114397903A (en) | Navigation processing method and control equipment | |
JP7029565B2 (en) | Maneuvering equipment, information processing methods, and programs | |
JP2015043488A (en) | Remote controller and remote construction method using the same | |
WO2017169841A1 (en) | Display device and display control method | |
KR20170107341A (en) | Mobile robot and method for controlling the same | |
US20210034052A1 (en) | Information processing device, instruction method for prompting information, program, and recording medium | |
JP7076501B2 (en) | Work vehicle | |
JP2019101765A (en) | Tracking image presentation system for moving object | |
JP7287262B2 (en) | Remote control system and remote control server | |
US11586225B2 (en) | Mobile device, mobile body control system, mobile body control method, and program | |
JP6740170B2 (en) | Mobile body control system, program, and control method | |
JP6368503B2 (en) | Obstacle monitoring system and program | |
CN113574487A (en) | Unmanned aerial vehicle control method and device and unmanned aerial vehicle | |
EP4072130A1 (en) | Work assisting server and work assisting system | |
JP6699944B2 (en) | Display system | |
KR102181809B1 (en) | Apparatus and method for checking facility | |
JP6527848B2 (en) | Monitoring device and program | |
JP2020170293A (en) | Image display method and remote-control system | |
CN109947096B (en) | Controlled object control method and device and unmanned system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WANG, CHAO;KOBAYASHI, DAI;ISHIZUKA, TATSUYA;SIGNING DATES FROM 20200803 TO 20200826;REEL/FRAME:055246/0251 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |