WO2019206044A1 - Information processing device, information presentation instruction method, program, and recording medium - Google Patents
Information processing device, information presentation instruction method, program, and recording medium
- Publication number
- WO2019206044A1 (PCT/CN2019/083477)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- flying body
- terminal
- interest
- point
- Prior art date
Links
- 230000010365 information processing Effects 0.000 title claims abstract description 33
- 238000000034 method Methods 0.000 title claims description 35
- 238000012545 processing Methods 0.000 claims abstract description 93
- 238000004891 communication Methods 0.000 claims description 104
- 238000003384 imaging method Methods 0.000 description 183
- 238000010586 diagram Methods 0.000 description 40
- 230000033001 locomotion Effects 0.000 description 19
- 238000001514 detection method Methods 0.000 description 18
- 230000001133 acceleration Effects 0.000 description 12
- 238000013459 approach Methods 0.000 description 10
- 230000007246 mechanism Effects 0.000 description 8
- 230000009471 action Effects 0.000 description 7
- 238000005259 measurement Methods 0.000 description 7
- 210000003128 head Anatomy 0.000 description 5
- 230000000007 visual effect Effects 0.000 description 5
- 230000008859 change Effects 0.000 description 4
- 230000006870 function Effects 0.000 description 4
- 239000003550 marker Substances 0.000 description 4
- 230000008569 process Effects 0.000 description 4
- 239000000725 suspension Substances 0.000 description 3
- 230000005540 biological transmission Effects 0.000 description 2
- 238000009434 installation Methods 0.000 description 2
- 239000004973 liquid crystal related substance Substances 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 230000009467 reduction Effects 0.000 description 2
- 230000000052 comparative effect Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 239000000463 material Substances 0.000 description 1
- 230000002093 peripheral effect Effects 0.000 description 1
- 210000001747 pupil Anatomy 0.000 description 1
- 239000011347 resin Substances 0.000 description 1
- 229920005989 resin Polymers 0.000 description 1
- 230000000630 rising effect Effects 0.000 description 1
- 238000005096 rolling process Methods 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
- 238000002366 time-of-flight method Methods 0.000 description 1
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0027—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement involving a plurality of vehicles, e.g. fleet or convoy travelling
- G05D1/0033—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by having the operator tracking the vehicle either by direct line of sight or via one or more cameras located remotely from the vehicle
- G05D1/0038—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
- G05D1/0094—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/104—Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying
- G05D1/106—Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
- G05D1/1064—Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones specially adapted for avoiding collisions with other aircraft
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/20—Remote controls
Definitions
- the present disclosure relates to an information processing apparatus, an information presentation instruction method, a program, and a recording medium that present information according to a point of interest of a user in an image captured by a flying body.
- a flight mode is known in which the user does not need to keep the unmanned aircraft in visual view: for example, FPV (First Person View) flight, in which the user operates the unmanned aircraft with a terminal while observing the captured image obtained by the unmanned aircraft and displayed on the display of the terminal (refer to Patent Document 1).
- Patent Document 1 Japanese Laid-Open Patent Publication No. 2016-203978
- when the unmanned aircraft is flown in FPV, it is difficult for the user to grasp the situation around the unmanned aircraft from the captured image alone. For example, in a case where a plurality of unmanned aircraft fly toward the same destination, the unmanned aircraft draw closer to each other as they approach the destination, and may collide with each other.
- in one aspect, an information processing apparatus presents information according to a point of interest of a user in an image captured by a flying body, and includes a processing unit. The processing unit acquires a first point of interest, in a first image captured by a first flying body, on which a first user operating a first terminal that instructs control of the first flying body focuses, and a second point of interest, in a second image captured by a second flying body, on which a second user operating a second terminal that instructs control of the second flying body focuses; determines whether the first point of interest and the second point of interest are a common point of interest, that is, whether they indicate the same point of interest; and, in the case of a common point of interest, causes information related to the second flying body to be presented to the first terminal.
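- As a rough illustration of the common point-of-interest determination described above, the following sketch compares the two points of interest after each has been resolved to a geographic position; the latitude/longitude representation and the 50 m proximity tolerance are illustrative assumptions, not details taken from the disclosure.

```python
# Minimal sketch of a common point-of-interest check, assuming each user's
# point of interest has already been resolved to a (latitude, longitude) pair.
# The 50 m tolerance is an illustrative assumption.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in meters."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def is_common_point_of_interest(poi1, poi2, tolerance_m=50.0):
    """True if the two points of interest indicate the same point."""
    return haversine_m(poi1[0], poi1[1], poi2[0], poi2[1]) <= tolerance_m

# Example: both users gaze at (nearly) the same building.
print(is_common_point_of_interest((35.6586, 139.7454), (35.6588, 139.7452)))  # True
```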
- the processing unit may determine whether the first flying body is moving, and if the first flying body is moving, present information related to the second flying body to the first terminal.
- the processing unit may acquire the position information of the first flying body and the position information of the second flying body, and, in a case where the distance between the first flying body and the second flying body, that is, the first distance, is less than or equal to a first threshold, present the information related to the second flying body to the first terminal.
- the processing unit may acquire the position information of the first flying body and the position information of the second flying body, and present, at a position in the screen of the first terminal that accords with the position of the second flying body relative to the first flying body, information indicating the presence of the second flying body.
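- A hypothetical sketch of the on-screen presentation just described: the indicator for the second flying body is clamped to the border of the first terminal's screen in the direction of the second flying body relative to the first. The screen size, the heading convention, and the clamping rule are all assumptions for illustration, not details from the disclosure.

```python
# Place a "second flying body is here" indicator on the first terminal's
# screen edge, from the second body's position relative to the first.
import math

def indicator_position(rel_east_m, rel_north_m, heading_deg,
                       width=1920, height=1080):
    """Clamp the relative bearing of the other aircraft to the screen border."""
    bearing = math.degrees(math.atan2(rel_east_m, rel_north_m))  # 0 deg = north
    angle = math.radians(bearing - heading_deg)  # bearing seen from own nose
    dx, dy = math.sin(angle), -math.cos(angle)   # screen: +y points down
    # Scale the unit direction until it touches the screen border.
    s = min((width / 2) / abs(dx) if dx else float("inf"),
            (height / 2) / abs(dy) if dy else float("inf"))
    return int(width / 2 + dx * s), int(height / 2 + dy * s)

# Second aircraft 100 m east and 30 m north of the first, first heading north:
print(indicator_position(100.0, 30.0, 0.0))  # indicator on the right edge
```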
- the processing unit may acquire the position information of the first flying body and the position information of the second flying body, and, in a case where the distance between the first flying body and the second flying body, that is, the first distance, is less than or equal to a first threshold, present to the first terminal first recommendation information that recommends limiting the speed of flight of the first flying body.
- the processing unit may present first recommendation information recommending that the flight speed of the first flying body be limited to a lower speed the shorter the first distance is.
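- The following sketch illustrates one way the first recommendation information could tighten the recommended speed as the first distance shrinks; the threshold and speed values are placeholders, and the linear mapping is an assumption rather than a rule stated in the disclosure.

```python
# Illustrative sketch: the closer the two flying bodies, the lower the
# recommended speed cap. All constants are assumed placeholders.
FIRST_THRESHOLD_M = 100.0   # assumed first threshold
MAX_SPEED_MPS = 15.0        # assumed normal speed limit
MIN_SPEED_MPS = 1.0         # assumed floor when almost touching

def recommended_speed_cap(first_distance_m):
    """Linearly tighten the speed cap as the first distance shrinks."""
    if first_distance_m > FIRST_THRESHOLD_M:
        return None  # no recommendation; normal flight
    frac = max(first_distance_m, 0.0) / FIRST_THRESHOLD_M
    return MIN_SPEED_MPS + (MAX_SPEED_MPS - MIN_SPEED_MPS) * frac

for d in (150.0, 100.0, 50.0, 10.0):
    print(d, recommended_speed_cap(d))  # None, 15.0, 8.0, 2.4
```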
- the processing unit may acquire the position information of the common point of interest and the position information of the first flying body, and, in a case where the distance between the common point of interest and the first flying body, that is, the second distance, is less than or equal to a second threshold, present to the first terminal second recommendation information that recommends limiting the speed of flight of the first flying body.
- the processing unit may present second recommendation information recommending that the flight speed of the first flying body be limited to a lower speed the shorter the second distance is.
- the information processing device may be a server, and may further include a communication unit.
- the processing unit may acquire the first point of interest from the first terminal via the communication unit, acquire the second point of interest from the second terminal via the communication unit, and transmit the information to be presented at the first terminal to the first terminal via the communication unit.
- the information processing device may be the first terminal, and may further include a communication unit and a presentation unit.
- the processing unit may acquire the first image from the first flying body via the communication unit, detect the first point of interest in the first image, acquire the second point of interest from the second terminal via the communication unit, and output the information to be presented at the first terminal to the presentation unit.
- in one aspect, an information presentation instruction method for presenting information according to a point of interest of a user in an image captured by a flying body includes: a step of acquiring a first point of interest, in a first image captured by a first flying body, on which a first user operating a first terminal that instructs control of the first flying body focuses, and a second point of interest, in a second image captured by a second flying body, on which a second user operating a second terminal that instructs control of the second flying body focuses; a step of determining whether the first point of interest and the second point of interest are a common point of interest indicating the same point of interest; and a step of, in the case of a common point of interest, presenting information related to the second flying body to the first terminal.
- the information presentation instruction method may further include the step of determining whether the first flying body is moving.
- the step of presenting information related to the second flying body may include the step of presenting the information related to the second flying body to the first terminal while the first flying body is moving.
- the information presentation instruction method may further include the steps of acquiring the position information of the first flying body and acquiring the position information of the second flying body.
- the step of presenting information related to the second flying body may include the step of presenting the information related to the second flying body to the first terminal in a case where the distance between the first flying body and the second flying body, that is, the first distance, is less than or equal to a first threshold.
- the information presentation instruction method may further include the steps of acquiring the position information of the first flying body and acquiring the position information of the second flying body.
- the step of presenting information related to the second flying body may include the step of presenting, at a position in the screen of the first terminal that accords with the position of the second flying body relative to the first flying body, information indicating the presence of the second flying body.
- the information presentation instruction method may further include: a step of acquiring the position information of the first flying body; a step of acquiring the position information of the second flying body; and a step of presenting, to the first terminal, first recommendation information that recommends limiting the speed of flight of the first flying body in a case where the distance between the first flying body and the second flying body, that is, the first distance, is less than or equal to a first threshold.
- the step of presenting the first recommendation information may include the step of presenting first recommendation information recommending that the flight speed of the first flying body be limited to a lower speed the shorter the first distance is.
- the information presentation instruction method may further include: a step of acquiring the position information of the common point of interest; a step of acquiring the position information of the first flying body; and a step of presenting, to the first terminal, second recommendation information that recommends limiting the speed of flight of the first flying body in a case where the distance between the common point of interest and the first flying body, that is, the second distance, is less than or equal to a second threshold.
- the step of presenting the second recommendation information may include the step of presenting second recommendation information recommending that the flight speed of the first flying body be limited to a lower speed the shorter the second distance is.
- the information processing device may be a server.
- the step of acquiring the first point of interest and the second point of interest may include the steps of receiving the first point of interest from the first terminal and receiving the second point of interest from the second terminal.
- the step of presenting information related to the second flying body may include the step of transmitting the information to be presented at the first terminal to the first terminal.
- the information processing device may be the first terminal.
- the step of acquiring the first point of interest and the second point of interest may include the steps of: receiving the first image from the first flying body; detecting the first point of interest in the first image; and receiving the second point of interest from the second terminal.
- the step of presenting information related to the second flying body may include the step of outputting the information to be presented at the first terminal to the presentation unit included in the first terminal.
- in one aspect, a program causes an information processing apparatus that presents information according to a point of interest of a user in an image captured by a flying body to perform the following steps: a step of acquiring a first point of interest, in a first image captured by a first flying body, on which a first user operating a first terminal that instructs control of the first flying body focuses, and a second point of interest, in a second image captured by a second flying body, on which a second user operating a second terminal that instructs control of the second flying body focuses; a step of determining whether the first point of interest and the second point of interest are a common point of interest indicating the same point of interest; and a step of, in the case of a common point of interest, presenting information related to the second flying body to the first terminal.
- in one aspect, a recording medium is a computer-readable medium recording a program that causes an information processing apparatus that presents information according to a point of interest of a user in an image captured by a flying body to perform the following steps: a step of acquiring a first point of interest, in a first image captured by a first flying body, on which a first user operating a first terminal that instructs control of the first flying body focuses, and a second point of interest, in a second image captured by a second flying body, on which a second user operating a second terminal that instructs control of the second flying body focuses; a step of determining whether the first point of interest and the second point of interest are a common point of interest indicating the same point of interest; and a step of, in the case of a common point of interest, presenting information related to the second flying body to the first terminal.
- FIG. 1 is a schematic view showing a configuration example of a flight system in the first embodiment.
- FIG. 2 is a diagram showing an example of a specific appearance of an unmanned aerial vehicle.
- FIG. 3 is a block diagram showing one example of a hardware configuration of an unmanned aerial vehicle.
- FIG. 4 is a perspective view showing an example of an appearance of a terminal on which a transmitter is mounted.
- FIG. 5 is a block diagram showing one example of a hardware configuration of a transmitter.
- Fig. 6A is a block diagram showing one example of a hardware configuration of a terminal.
- Fig. 6B is a block diagram showing the hardware configuration of the server.
- Fig. 7 is a sequence diagram showing an information presentation instruction procedure performed by the server in the first operation example.
- FIG. 8 is a diagram showing an example of detection of a point of interest.
- FIG. 9A is a diagram showing a captured image displayed by each display when the points of interest of the two users are common points of interest.
- FIG. 9B is a diagram showing the captured images GZ1 and GZ2 respectively displayed on the display of each terminal when the points of interest of the two users are not the common point of interest.
- FIG. 10 is a diagram showing the positional relationship of two unmanned aerial vehicles.
- Fig. 11A is a view showing a captured image of the unmanned aerial vehicle displayed on the display of the terminal.
- Fig. 11B is a view showing a captured image of the unmanned aerial vehicle displayed on the display of the other terminal.
- Fig. 12A is a sequence diagram showing an information presentation instruction procedure from the unmanned-aerial-vehicle viewpoint, performed by the server in the second operation example.
- Fig. 12B is a sequence diagram, continued from Fig. 12A, showing the information presentation instruction procedure from the unmanned-aerial-vehicle viewpoint performed by the server.
- FIG. 13 is a diagram showing the space defined by a threshold D set for the distance between two unmanned aerial vehicles.
- FIG. 14A is a diagram showing a recommended screen displayed on the display when the distance is within the threshold.
- FIG. 14B is a diagram showing a recommended screen displayed on the display in the case where the distance is within the threshold.
- Fig. 15A is a view showing a state in which an unmanned aerial vehicle is manipulated by visual observation.
- Fig. 15B is a diagram showing a state in which an unmanned aerial vehicle is operated in the FPV flight mode.
- Fig. 16A is a sequence diagram showing an information presentation instruction procedure from the destination viewpoint, performed by the server in the second embodiment.
- Fig. 16B is a sequence diagram, continued from Fig. 16A, showing the information presentation instruction procedure from the destination viewpoint performed by the server.
- FIG. 17 is a sequence diagram showing an information presentation instruction procedure from the unmanned-aerial-vehicle viewpoint, performed by the terminal in the first operation example of the third embodiment.
- FIG. 18A is a sequence diagram showing an information presentation instruction procedure from the unmanned-aerial-vehicle viewpoint, performed by the terminal in the second operation example of the third embodiment.
- Fig. 18B is a sequence diagram, continued from Fig. 18A, showing the information presentation instruction procedure from the unmanned-aerial-vehicle viewpoint performed by the terminal.
- FIG. 19 is a perspective view showing the appearance of a head-mounted display in the fourth embodiment.
- Fig. 20 is a block diagram showing the hardware configuration of a head mounted display.
- the flying body is exemplified by an unmanned aerial vehicle (UAV).
- Unmanned aircraft include aircraft that move in the air.
- the unmanned aerial vehicle is also written as "UAV".
- the information processing apparatus is exemplified by a server, a terminal, and the like.
- the information presentation instruction method defines operations of the information processing apparatus.
- a program (for example, a program that causes the information processing apparatus to execute various processes) is recorded in the recording medium.
- FIG. 1 is a schematic diagram showing a configuration example of the flight system 10 in the first embodiment.
- the flight system 10 includes a plurality of unmanned aerial vehicles 100, a transmitter 50, a plurality of terminals 80, and a server 300.
- the unmanned aircraft 100, the transmitter 50, the terminal 80, and the server 300 can communicate with each other by wired communication or wireless communication (for example, a wireless LAN (Local Area Network)).
- the terminal 80 can communicate with the server 300 by wired communication or wireless communication.
- unmanned aircraft 100A, 100B are shown as a plurality of unmanned aerial vehicles 100.
- Terminals 80A, 80B are shown as a plurality of terminals 80.
- FIG. 2 is a diagram showing an example of a specific appearance of the unmanned aircraft 100.
- a perspective view of the unmanned aerial vehicle 100 when flying in the moving direction STV0 is shown in FIG.
- the unmanned aircraft 100 is an example of a flying body.
- the roll axis (see the x-axis) is defined as the direction parallel to the ground and along the moving direction STV0.
- the pitch axis (see the y-axis) is defined as the direction parallel to the ground and perpendicular to the roll axis, and the yaw axis (see the z-axis) is defined as the direction perpendicular to the ground and perpendicular to both the roll axis and the pitch axis.
- the unmanned aerial vehicle 100 is configured to include a UAV main body 102, a universal joint 200, an imaging device 220, and a plurality of imaging devices 230.
- the UAV main body 102 is one example of a housing of the unmanned aerial vehicle 100.
- the imaging devices 220 and 230 are an example of an imaging unit.
- the UAV body 102 includes a plurality of rotors (propellers).
- the UAV main body 102 causes the unmanned aerial vehicle 100 to fly by controlling the rotation of a plurality of rotors.
- the UAV body 102 causes the unmanned aerial vehicle 100 to fly using, for example, four rotors.
- the number of rotors is not limited to four.
- the unmanned aerial vehicle 100 can be a fixed-wing aircraft without a rotor.
- the imaging device 220 is an imaging camera that images a subject included in a desired imaging range (for example, the situation above an imaging target, scenery such as mountains and rivers, or a building on the ground).
- the plurality of imaging devices 230 may be sensing cameras that image the surroundings of the unmanned aircraft 100 in order to control the flight of the unmanned aircraft 100.
- the two imaging devices 230 may be disposed on the nose, that is, the front side, of the unmanned aircraft 100. Further, the other two imaging devices 230 may be disposed on the bottom surface of the unmanned aircraft 100.
- the two camera units 230 on the front side can be paired to function as a so-called stereo camera.
- the two imaging devices 230 on the bottom side may also be paired to function as a stereo camera.
- the three-dimensional spatial data around the unmanned aerial vehicle 100 can be generated from images captured by the plurality of imaging devices 230.
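- As a simplified illustration of how such three-dimensional data can be obtained from a stereo camera pair, the sketch below recovers depth from disparity using the pinhole-stereo relation Z = f * B / d; the focal length, baseline, and disparity values are assumed, and a real pipeline would compute disparity by stereo matching over calibrated, rectified images.

```python
# Sketch of stereo depth recovery for the sensing cameras. All numbers are
# illustrative assumptions, not specifications from the disclosure.
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo: depth Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# 700 px focal length, 12 cm baseline, 20 px disparity -> 4.2 m
print(depth_from_disparity(700.0, 0.12, 20.0))
```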
- the number of imaging devices 230 included in the unmanned aerial vehicle 100 is not limited to four.
- the unmanned aerial vehicle 100 may include at least one camera 230.
- the unmanned aerial vehicle 100 may include at least one camera 230 on the nose, the tail, the side, the bottom surface, and the top surface of the unmanned aircraft 100, respectively.
- the angle of view that can be set in the camera 230 can be larger than the angle of view that can be set in the camera 220.
- the camera 230 may have a single focus lens or a fisheye lens.
- FIG. 3 is a block diagram showing one example of the hardware configuration of the unmanned aircraft 100.
- the unmanned aerial vehicle 100 is configured to include a UAV control unit 110, a communication interface 150, a memory 160, a universal joint 200, a rotor mechanism 210, an imaging device 220, imaging devices 230, a GPS receiver 240, an inertial measurement device (IMU: Inertial Measurement Unit) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser measuring instrument 290.
- the communication interface 150 is an example of a communication section.
- the UAV control unit 110 is configured by, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor).
- the UAV control unit 110 performs signal processing for overall control of the operation of each part of the unmanned aircraft 100, input/output processing of data with other parts, arithmetic processing of data, and storage processing of data.
- the UAV control unit 110 controls the flight of the unmanned aircraft 100 in accordance with a program stored in the memory 160.
- the UAV control unit 110 controls the flight of the unmanned aerial vehicle 100 in accordance with an instruction received from the remote transmitter 50 through the communication interface 150.
- the memory 160 can be detached from the unmanned aircraft 100.
- the UAV control unit 110 can specify the environment around the unmanned aircraft 100 by analyzing a plurality of images captured by the plurality of imaging devices 230.
- the UAV control unit 110 controls the flight based on the environment around the unmanned aircraft 100, for example, avoiding an obstacle.
- the UAV control unit 110 acquires date information indicating the current date.
- the UAV control section 110 can acquire date information indicating the current date from the GPS receiver 240.
- the UAV control unit 110 can acquire date information indicating the current date from a timer (not shown) mounted on the unmanned aircraft 100.
- the UAV control unit 110 acquires position information indicating the position of the unmanned aircraft 100.
- the UAV control unit 110 can acquire position information indicating the latitude, longitude, and altitude at which the unmanned aircraft 100 is located from the GPS receiver 240.
- the UAV control unit 110 can acquire latitude and longitude information indicating the latitude and longitude of the unmanned aircraft 100 from the GPS receiver 240, respectively, and acquire height information indicating the height of the unmanned aircraft 100 from the barometric altimeter 270 as position information.
- the UAV control unit 110 can acquire the distance between the emission point of the ultrasonic wave generated by the ultrasonic sensor 280 and the reflection point of the ultrasonic wave as the height information.
- the UAV control unit 110 acquires orientation information indicating the orientation of the unmanned aircraft 100 from the magnetic compass 260.
- the orientation information indicates, for example, an orientation corresponding to the orientation of the nose of the unmanned aerial vehicle 100.
- the UAV control unit 110 can acquire position information indicating the position where the unmanned aircraft 100 should be located, in accordance with the imaging range to be captured, when the imaging device 220 performs imaging.
- the UAV control unit 110 can acquire position information indicating the position where the unmanned aircraft 100 should exist from the memory 160.
- the UAV control unit 110 can acquire position information indicating a position where the unmanned aircraft 100 should exist from the other device such as the transmitter 50 via the communication interface 150.
- the UAV control unit 110 can refer to the three-dimensional map database, specify the position where the unmanned aircraft 100 can exist in order to capture the imaging range to be captured, and acquire the position as position information indicating the position where the unmanned aircraft 100 should exist.
- the UAV control unit 110 acquires imaging information indicating an imaging range of each of the imaging device 220 and the imaging device 230.
- the UAV control unit 110 acquires angle of view information indicating the angles of view of the imaging device 220 and the imaging device 230 from the imaging device 220 and the imaging device 230 as parameters for specifying the imaging range.
- the UAV control unit 110 acquires information indicating the imaging directions of the imaging device 220 and the imaging device 230 as parameters for specifying the imaging range.
- the UAV control unit 110 acquires posture information indicating the posture state of the imaging device 220 from the universal joint 200 as, for example, information indicating the imaging direction of the imaging device 220.
- the UAV control unit 110 acquires information indicating the orientation of the unmanned aircraft 100.
- the information indicating the posture of the imaging device 220 indicates the angles by which the universal joint 200 has rotated from reference rotation angles about the pitch axis and the yaw axis.
- the UAV control unit 110 acquires position information indicating the position where the unmanned aircraft 100 is located as a parameter for specifying the imaging range.
- the UAV control unit 110 can acquire imaging information by generating imaging range information indicating the geographical range captured by the imaging device 220, based on the angles of view and the imaging directions of the imaging device 220 and the imaging device 230 and the position of the unmanned aircraft 100.
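- A simplified sketch of deriving an imaging range from the angle of view and the aircraft position follows: it assumes a camera pointing straight down over flat ground, so the footprint reduces to a rectangle; a real imaging range also depends on the gimbal attitude and the imaging direction.

```python
# Ground footprint of a nadir-pointing camera, under flat-ground assumptions.
# Altitude and field-of-view values are illustrative.
import math

def nadir_footprint_m(altitude_m, hfov_deg, vfov_deg):
    """Width/height on the ground covered by a nadir-pointing camera."""
    w = 2 * altitude_m * math.tan(math.radians(hfov_deg) / 2)
    h = 2 * altitude_m * math.tan(math.radians(vfov_deg) / 2)
    return w, h

# 80 m altitude with a 62.2 x 48.8 degree field of view:
print(nadir_footprint_m(80.0, 62.2, 48.8))
```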
- the UAV control unit 110 can acquire imaging information indicating an imaging range to be captured by the imaging device 220.
- the UAV control unit 110 can acquire the imaging information to be captured by the imaging device 220 from the memory 160.
- the UAV control unit 110 can acquire the imaging information to be captured by the imaging device 220 from the other device such as the transmitter 50 via the communication interface 150.
- the UAV control unit 110 can acquire stereoscopic information (three-dimensional information) indicating the three-dimensional shape of objects existing around the unmanned aircraft 100.
- the object is a part of a landscape such as a building, a road, a vehicle, or a tree.
- the stereoscopic information is, for example, three-dimensional spatial data.
- the UAV control unit 110 can acquire stereoscopic information by generating, from the images obtained by the plurality of imaging devices 230, stereoscopic information indicating the three-dimensional shape of objects existing around the unmanned aircraft 100.
- the UAV control unit 110 can acquire stereoscopic information indicating a three-dimensional shape of an object existing around the unmanned aircraft 100 by referring to the three-dimensional map database stored in the memory 160.
- the UAV control unit 110 can acquire stereoscopic information related to the three-dimensional shape of the object existing around the unmanned aircraft 100 by referring to the three-dimensional map database managed by the server existing on the network.
- the UAV control unit 110 acquires image data captured by the imaging device 220 and the imaging device 230.
- the UAV control unit 110 controls the universal joint 200, the rotor mechanism 210, the imaging device 220, and the imaging device 230.
- the UAV control unit 110 controls the imaging range of the imaging device 220 by changing the imaging direction or the angle of view of the imaging device 220.
- the UAV control unit 110 controls the imaging range of the imaging device 220 supported by the universal joint 200 by controlling the rotation mechanism of the universal joint 200.
- the imaging range refers to a geographical range captured by the imaging device 220 or the imaging device 230.
- the imaging range is defined by latitude, longitude, and altitude.
- the imaging range can be a range of three-dimensional spatial data defined by latitude, longitude, and altitude.
- the imaging range is specified in accordance with the angle of view and imaging direction of the imaging device 220 or the imaging device 230, and the position of the unmanned aircraft 100.
- the imaging directions of the imaging device 220 and the imaging device 230 are defined by the azimuth and the depression angle of the direction in which the front of the imaging lens of the imaging device 220 or the imaging device 230 faces.
- the imaging direction of the imaging device 220 is a direction specified from the orientation of the nose of the unmanned aircraft 100 and the posture state of the imaging device 220 on the universal joint 200.
- the imaging direction of the imaging device 230 is a direction specified from the orientation of the nose of the unmanned aircraft 100 and the position where the imaging device 230 is provided.
- the UAV control unit 110 controls the flight of the unmanned aircraft 100 by controlling the rotor mechanism 210. That is, the UAV control unit 110 controls the position including the latitude, longitude, and altitude of the unmanned aircraft 100 by controlling the rotor mechanism 210.
- the UAV control unit 110 can control the imaging ranges of the imaging device 220 and the imaging device 230 by controlling the flying of the unmanned aircraft 100.
- the UAV control section 110 can control the angle of view of the imaging apparatus 220 by controlling the zoom lens included in the imaging apparatus 220.
- the UAV control unit 110 can control the angle of view of the imaging device 220 by digital zoom using the digital zoom function of the imaging device 220.
- the UAV control unit 110 can cause the unmanned aircraft 100 to move to a specific position on a specific date so that the imaging device 220 can capture the desired imaging range under the desired environment.
- that is, by causing the unmanned aircraft 100 to move to the specific position on the specific date, the UAV control unit 110 can move the imaging device 220 to the specific position and cause it to capture the desired imaging range.
- the UAV control unit 110 can set the flight mode of the unmanned aircraft 100.
- the flight mode includes, for example, a normal flight mode, a low speed flight mode, a temporary stop mode, and the like.
- the information of the set flight mode can be saved in the memory 160.
- the normal flight mode is a flight mode that can fly without speed limit.
- the low-speed flight mode is a flight mode in which flight at a speed higher than a predetermined speed is prohibited and flight is possible only at the limited speed.
- the temporary stop mode is a flight mode in which movement of the unmanned aircraft 100 is prohibited and the aircraft can hover.
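- A minimal sketch of the three flight modes described above and of how a commanded speed might be clipped under each; the 5 m/s cap for the low-speed flight mode is an assumed placeholder, since the disclosure only speaks of a predetermined speed.

```python
# Flight modes and speed clipping; constants are assumed placeholders.
from enum import Enum

class FlightMode(Enum):
    NORMAL = "normal"        # fly without a speed limit
    LOW_SPEED = "low_speed"  # flight above a predetermined speed is prohibited
    PAUSED = "paused"        # movement prohibited; hover in place

LOW_SPEED_CAP_MPS = 5.0  # assumed predetermined speed

def clip_speed(mode: FlightMode, commanded_mps: float) -> float:
    if mode is FlightMode.PAUSED:
        return 0.0  # hover: ignore the commanded speed
    if mode is FlightMode.LOW_SPEED:
        return min(commanded_mps, LOW_SPEED_CAP_MPS)
    return commanded_mps

print(clip_speed(FlightMode.LOW_SPEED, 12.0))  # 5.0
```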
- the UAV control unit 110 adds information related to the captured image to the captured image captured by the imaging device 220 as additional information (an example of metadata).
- the additional information can include various parameters.
- the various parameters may include parameters (flight parameters) related to the flight of the unmanned aircraft 100 at the time of shooting and information (imaging parameters) related to the imaging of the camera 220 at the time of shooting.
- the flight parameters may include at least one of imaging position information, imaging path information, imaging time information, and other information.
- the imaging parameters may include at least one of imaging angle of view information, imaging direction information, imaging posture information, imaging range information, and subject distance information.
- the imaging path information indicates a path (imaging path) in which a captured image is captured.
- the imaging path information is information of a path on which the unmanned aircraft 100 is flying at the time of shooting, and may be composed of a collection of imaging positions in which the imaging positions are continuously connected.
- the imaging position can be based on the position acquired by the GPS receiver 240.
- the imaging time information indicates the time (imaging time) at which the captured image is captured.
- the imaging time information can be based on the time information of the timer referred to by the UAV control unit 110.
- the imaging angle of view information indicates angle of view information of the imaging device 220 when the captured image is captured.
- the imaging direction information indicates the imaging direction (imaging direction) of the imaging device 220 when the captured image is captured.
- the imaging posture information indicates posture information of the imaging device 220 when the captured image is captured.
- the imaging range information indicates the imaging range of the imaging device 220 when the captured image is captured.
- the subject distance information indicates information of the distance from the imaging device 220 to the subject. The subject distance information may be based on the detection information measured by the ultrasonic sensor 280 or the laser measuring instrument 290.
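- The sketch below shows one possible shape for such additional information, grouped into flight parameters and imaging parameters; the field names and values are illustrative assumptions, since the disclosure specifies only the kinds of information that may be included.

```python
# Hypothetical metadata attached to a captured image; field names are assumed.
additional_info = {
    "flight_parameters": {
        "imaging_position": {"lat": 35.6586, "lon": 139.7454, "alt_m": 80.0},
        "imaging_path": [  # consecutive imaging positions form the path
            {"lat": 35.6580, "lon": 139.7450, "alt_m": 78.0},
            {"lat": 35.6586, "lon": 139.7454, "alt_m": 80.0},
        ],
        "imaging_time": "2019-04-19T10:15:30Z",
    },
    "imaging_parameters": {
        "angle_of_view_deg": 62.2,
        "imaging_direction_deg": {"azimuth": 120.0, "depression": 30.0},
        "imaging_posture": {"pitch_deg": -30.0, "yaw_deg": 120.0},
        "imaging_range": {"lat_span": 0.0008, "lon_span": 0.0011},
        "subject_distance_m": 42.5,  # e.g. from ultrasonic sensor or laser
    },
}
```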
- the communication interface 150 is in communication with the transmitter 50, the terminal 80, and the server 300.
- the communication interface 150 receives various instructions and information from the remote transmitter 50 and outputs them to the UAV control unit 110.
- the communication interface 150 can transmit the captured image and additional information related to the captured image to the terminal 80.
- the memory 160 stores programs and the like required for the UAV control unit 110 to control the universal joint 200, the rotor mechanism 210, the imaging device 220, the imaging device 230, the GPS receiver 240, the inertial measurement device 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measuring instrument 290.
- the memory 160 may be a computer-readable recording medium, and may include at least one of an SRAM (Static Random Access Memory), a DRAM (Dynamic Random Access Memory), an EPROM (Erasable Programmable Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), and a flash memory such as a USB memory.
- the memory 160 may be disposed inside the UAV body 102, or may be configured to be detachable from the UAV body 102.
- the universal joint 200 rotatably supports the image pickup device 220 around at least one of the axes.
- the universal joint 200 can rotatably support the image pickup device 220 centering on the yaw axis, the pitch axis, and the roll axis.
- the universal joint 200 can change the imaging direction of the imaging device 220 by rotating the imaging device 220 around at least one of the yaw axis, the pitch axis, and the roll axis.
- the rotor mechanism 210 includes a plurality of rotors 211, a plurality of drive motors 212 that rotate the plurality of rotors 211, and a current sensor 213 that measures a current value (actual measurement value) of a drive current for driving the drive motor 212.
- the drive current is supplied to the drive motor 212.
- the imaging device 220 captures a subject of a desired imaging range and generates data of the captured image.
- the image data obtained by the imaging by the imaging device 220 is stored in a memory of the imaging device 220 or in the memory 160.
- the imaging device 230 captures the surroundings of the unmanned aircraft 100 and generates data of the captured image.
- the image data of the imaging device 230 is stored in the memory 160.
- the GPS receiver 240 receives a plurality of signals indicating the time transmitted from a plurality of navigation satellites (i.e., GPS satellites) and the position (coordinates) of each GPS satellite.
- the GPS receiver 240 calculates the position of the GPS receiver 240 (i.e., the position of the unmanned aircraft 100) based on the received plurality of signals.
- the GPS receiver 240 outputs the position information of the unmanned aircraft 100 to the UAV control unit 110.
- the calculation of the position indicated by the position information of the GPS receiver 240 may be performed by the UAV control unit 110 instead of the GPS receiver 240.
- in this case, information indicating the time and the position of each GPS satellite included in the plurality of signals received by the GPS receiver 240 is input to the UAV control unit 110.
- the inertial measurement device 250 detects the posture of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
- the inertial measurement device 250 detects the acceleration in the three-axis direction of the front, rear, left and right, and up and down of the unmanned aircraft 100 and the angular velocity in the three-axis directions of the pitch axis, the roll axis, and the yaw axis as the posture of the unmanned aircraft 100.
- the magnetic compass 260 detects the orientation of the nose of the unmanned aircraft 100, and outputs the detection result to the UAV control unit 110.
- the barometric altimeter 270 detects the flying height of the unmanned aircraft 100 and outputs the detection result to the UAV control section 110.
- the ultrasonic sensor 280 emits ultrasonic waves, detects ultrasonic waves reflected from the ground and the objects, and outputs the detection results to the UAV control unit 110.
- the detection result may show the distance from the unmanned aircraft 100 to the ground, that is, the altitude.
- the detection result may indicate the distance from the unmanned aircraft 100 to the object.
- the laser measuring instrument 290 irradiates the laser light onto the object, receives the reflected light reflected by the object, and measures the distance between the unmanned aircraft 100 and the object by the reflected light.
- the distance measurement by the laser may use, for example, a time-of-flight method.
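- A minimal sketch of the time-of-flight idea: the distance is half the round-trip time multiplied by the propagation speed (the speed of light for the laser measuring instrument 290; roughly 343 m/s if the same idea is applied to the ultrasonic sensor 280).

```python
# Time-of-flight ranging: distance = speed * round_trip_time / 2.
def tof_distance_m(round_trip_s: float, speed_mps: float = 299_792_458.0) -> float:
    """Distance from a round-trip time; defaults to the speed of light."""
    return speed_mps * round_trip_s / 2.0

print(tof_distance_m(2.0e-7))        # laser: ~30 m
print(tof_distance_m(0.05, 343.0))   # ultrasound at ~343 m/s: ~8.6 m
```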
- FIG. 4 is a perspective view showing one example of the appearance of the terminal 80 on which the transmitter 50 is mounted.
- a smartphone 80S is shown as an example of the terminal 80.
- the directions of the arrows shown in FIG. 4 indicate the up, down, left, and right directions, respectively, as viewed with respect to the transmitter 50.
- the transmitter 50 is used in a state in which, for example, a person who uses the transmitter 50 (hereinafter referred to as an "operator") holds with both hands.
- the transmitter 50 has, for example, a case 50B of a resin material having a substantially square bottom surface and a substantially rectangular parallelepiped shape (in other words, a substantially box shape) having a height shorter than one side of the bottom surface.
- a left control bar 53L and a right control bar 53R are protruded from substantially the center of the casing surface of the transmitter 50.
- the left control bar 53L and the right control bar 53R are used in operations by which the operator remotely controls the movement of the unmanned aircraft 100 (for example, forward-backward movement, left-right movement, up-down movement, and change of orientation) (movement control operations).
- FIG. 4 shows the left control bar 53L and the right control bar 53R at the positions of the initial state, in which no external force is applied by either of the operator's hands.
- the left control bar 53L and the right control bar 53R automatically return to a predetermined position (for example, the initial position shown in FIG. 4) after the external force applied by the operator is released.
- a power button B1 of the transmitter 50 is disposed on the near side (in other words, the operator side) of the left control bar 53L.
- when the power button B1 is pressed once, the remaining capacity of the battery (not shown) built into the transmitter 50 is displayed on the battery remaining amount display unit L2.
- when the power button B1 is pressed again, for example, the power of the transmitter 50 is turned on, and power can be supplied to each unit of the transmitter 50 for use.
- an RTH (Return To Home) button B2 is disposed on the near side (in other words, the operator side) of the right control bar 53R.
- when the RTH button B2 is pressed, the transmitter 50 transmits, to the unmanned aircraft 100, a signal for automatically returning it to a predetermined position.
- thereby, the transmitter 50 can cause the unmanned aircraft 100 to automatically return to the predetermined position (for example, the takeoff position stored by the unmanned aircraft 100).
- the RTH can be utilized in such cases.
- the remote state display unit L1 and the battery remaining amount display unit L2 are disposed on the near side (in other words, the operator side) of the power button B1 and the RTH button B2.
- the remote state display unit L1 is configured by, for example, an LED (Light Emitting Diode), and displays the wireless connection state between the transmitter 50 and the unmanned aircraft 100.
- the battery remaining amount display unit L2 is configured by, for example, an LED, and displays the remaining capacity of the battery (not shown) built in the transmitter 50.
- two antennas AN1 and AN2 protrude from the rear side surface of the casing 50B of the transmitter 50, on the rear side of the left control bar 53L and the right control bar 53R.
- the antennas AN1 and AN2 transmit signals generated by the transmitter control unit 61 (that is, signals for controlling the movement of the unmanned aircraft 100) to the unmanned aircraft 100 in accordance with the operations of the operator's left control bar 53L and right control bar 53R. These signals are one kind of operation input signal input with the transmitter 50.
- the antennas AN1, AN2 can cover, for example, a transmission and reception range of 2 km.
- the antennas AN1 and AN2 can receive images captured by the imaging device 220 of the unmanned aircraft 100 wirelessly connected to the transmitter 50, and various data acquired by the unmanned aircraft 100, when these images or data are transmitted from the unmanned aircraft 100.
- although the transmitter 50 in this example does not include a display unit, it may include one.
- the terminal 80 can be mounted on the bracket HLD.
- the bracket HLD can be attached and mounted on the transmitter 50. Thereby, the terminal 80 is mounted on the transmitter 50 via the bracket HLD.
- the terminal 80 and the transmitter 50 can be connected via a wired cable such as a USB cable.
- the terminal 80 may not be installed on the transmitter 50, and the terminal 80 and the transmitter 50 may be separately provided.
- FIG. 5 is a block diagram showing one example of a hardware configuration of the transmitter 50.
- the transmitter 50 is configured to include a left control bar 53L, a right control bar 53R, a transmitter control unit 61, a wireless communication unit 63, an interface unit 65, a power button B1, an RTH button B2, an operation unit group OPS, and a remote status display unit L1.
- the transmitter 50 is an example of an operation device that instructs control of the unmanned aircraft 100.
- the left control bar 53L is used, for example, for an operation of remotely controlling the movement of the unmanned aircraft 100 by the operator's left hand.
- the right control bar 53R is used, for example, for an operation of remotely controlling the movement of the unmanned aircraft 100 with the operator's right hand.
- the movement of the unmanned aircraft 100 is, for example, any one of, or a combination of, movement in the forward direction, movement in the backward direction, movement in the left direction, movement in the right direction, movement in the ascending direction, movement in the descending direction, rotation of the unmanned aircraft 100 in the left direction, and rotation of the unmanned aircraft 100 in the right direction; the same applies hereinafter.
- when the power button B1 is pressed once, a signal indicating that it has been pressed once is input to the transmitter control unit 61.
- the transmitter control unit 61 displays the remaining amount of the capacity of the battery (not shown) built in the transmitter 50 on the remaining battery level display unit L2 based on the signal. Thereby, the operator can easily check the remaining amount of the capacity of the battery built in the transmitter 50.
- when the power button B1 is pressed twice, a signal indicating that it has been pressed twice is input to the transmitter control unit 61.
- the transmitter control unit 61 instructs a battery (not shown) built in the transmitter 50 to supply power to each unit in the transmitter 50 based on the signal. Thereby, the operator turns on the power of the transmitter 50, and the use of the transmitter 50 can be easily started.
- when the RTH button B2 is pressed, a signal indicating that it has been pressed is input to the transmitter control unit 61.
- the transmitter control unit 61 generates, in accordance with this signal, a signal for causing the unmanned aircraft 100 to automatically return to a predetermined position (for example, the takeoff position of the unmanned aircraft 100), and transmits it to the unmanned aircraft 100 via the wireless communication unit 63 and the antennas AN1 and AN2. Thereby, the operator can automatically return (recover) the unmanned aircraft 100 to the predetermined position by a simple operation of the transmitter 50.
- the operation unit group OPS is composed of a plurality of operation units OP (for example, operation units OP1, . . . , operation unit OPn) (n: an integer of 2 or more).
- the operation unit group OPS is configured by operation units other than the left control bar 53L, the right control bar 53R, the power button B1, and the RTH button B2 shown in FIG. 4 (for example, various operation units for assisting the remote control of the unmanned aircraft 100 using the transmitter 50).
- the various operation units referred to here correspond to, for example, a button for instructing the capture of a still image with the imaging device 220 of the unmanned aircraft 100, and a button for instructing the start and end of recording of a moving image with the imaging device 220 of the unmanned aircraft 100.
- the transmitter control unit 61 is constituted by a processor (for example, a CPU, an MPU, or a DSP).
- the transmitter control unit 61 performs signal processing for integrally controlling the operation of each unit of the transmitter 50, input/output processing of data with other units, calculation processing of data, and storage processing of data.
- the transmitter control unit 61 is an example of a processing unit.
- the transmitter control unit 61 can acquire the data of the captured image captured by the imaging device 220 of the unmanned aircraft 100 by the wireless communication unit 63 and store the data in a memory (not shown), and output it to the terminal 80 via the interface unit 65. In other words, the transmitter control unit 61 can cause the terminal 80 to display the data of the captured image captured by the imaging device 220 of the unmanned aerial vehicle 100. Thereby, the captured image captured by the imaging device 220 of the unmanned aerial vehicle 100 can be displayed in the terminal 80.
- the transmitter control unit 61 can generate an instruction signal for controlling the flight of the unmanned aircraft 100 specified by the operation by the operations of the operator's left control bar 53L and right control bar 53R.
- the transmitter control unit 61 can remotely control the unmanned aircraft 100 by transmitting the indication signal to the unmanned aircraft 100 via the wireless communication unit 63 and the antennas AN1 and AN2. Thereby, the transmitter 50 can remotely control the movement of the unmanned aircraft 100.
- the wireless communication unit 63 is connected to the two antennas AN1 and AN2.
- the wireless communication unit 63 performs transmission and reception of information and data with the unmanned aircraft 100 via the two antennas AN1 and AN2, using a predetermined wireless communication method (for example, wireless LAN).
- the interface unit 65 performs input and output of information and data between the transmitter 50 and the terminal 80.
- the interface unit 65 may be, for example, a USB port (not shown) provided on the transmitter 50.
- the interface unit 65 may be an interface other than the USB port.
- FIG. 6A is a block diagram showing one example of the hardware configuration of the terminal 80.
- the terminal 80 may include a terminal control unit 81, an interface unit 82, an operation unit 83, a communication unit 85, a memory 87, a display unit 88, and an imaging unit 89.
- the display unit 88 is an example of a presentation unit.
- the terminal control unit 81 is configured by, for example, a CPU, an MPU, or a DSP.
- the terminal control unit 81 performs signal processing for controlling the operation of each unit of the terminal 80 as a whole, input/output processing of data with other units, calculation processing of data, and storage processing of data.
- the terminal control unit 81 can acquire data and information from the unmanned aircraft 100 via the communication unit 85. For example, the terminal control unit 81 can acquire the captured image from the unmanned aircraft 100 and its additional information via the communication unit 85. The terminal control unit 81 can acquire data and information from the transmitter 50 via the interface unit 82. The terminal control unit 81 can acquire data and information input through the operation unit 83. The terminal control unit 81 can acquire data and information stored in the memory 87. The terminal control unit 81 can transmit data and information to the display unit 88, and display the display information based on the data and the information on the display unit 88.
- the terminal control unit 81 can directly acquire the position information of the unmanned aircraft 100 from the unmanned aircraft 100 via the communication unit 85 or acquire the position information of the unmanned aircraft 100 as the imaging position information included in the additional information.
- the terminal control unit 81 can sequentially acquire the position information of the unmanned aircraft 100, and calculate, based on the position information, information on the speed at which the unmanned aircraft 100 moves and the direction in which it is moving.
- Information on the position, speed, and moving direction of the unmanned aircraft 100 may be included in the additional information and notified to the server 300 or the like.
- the terminal control section 81 can execute an application for instructing control of the unmanned aircraft 100.
- the terminal control unit 81 can generate various data used in the application.
- the terminal control unit 81 can acquire a captured image from the unmanned aircraft 100.
- the terminal control unit 81 can cause the display unit 88 to display a captured image from the unmanned aircraft 100.
- the terminal control unit 81 can acquire an image (user image) of the area around the user's eyes captured by the imaging unit 89.
- the user image may be an image captured when the user observes the display portion 88 that displays the captured image from the unmanned aircraft 100.
- the terminal control unit 81 detects an eye (for example, a pupil) of the user by performing image recognition (for example, segmentation processing or object recognition processing) on the user image.
- the terminal control unit 81 can detect a point of interest that the user who operates the terminal 80 focuses on in the captured image displayed on the display unit 88. In this case, the terminal control unit 81 can acquire the on-screen coordinates of the attention point (the line-of-sight detection position) that the user is gazing at, using a well-known line-of-sight detection technique. In other words, the terminal control unit 81 can recognize which position of the captured image displayed on the display unit 88 the user's eyes are observing.
- the terminal control unit 81 can acquire the imaging range information included in the additional information related to the captured image from the unmanned aircraft 100. That is, the terminal control unit 81 can specify the geographical imaging range as a range on the map based on the imaging range information from the unmanned aircraft 100. The terminal control unit 81 can detect at which position of the captured image displayed on the display unit 88 the coordinates of the point of interest lie, and detect which position in the geographical imaging range corresponding to the range of the captured image that position corresponds to. Thereby, a specific position included in the geographical imaging range can be detected as the point of interest.
- the terminal control unit 81 can communicate with an external map server having a map database via the communication unit 85, and can detect what kind of object exists on the map at a geographically designated position corresponding to the point of interest. Thereby, it is possible to detect a specified object included in the geographical imaging range as a point of interest.
- the memory 87 may have a map database that the map server has.
- the terminal control unit 81 can recognize each object in the captured image by performing image recognition (for example, division processing and object recognition processing) on the captured image from the unmanned aircraft 100. In this case, for example, even when the information of the map database is relatively old, it is possible to recognize the object reflected in the captured image at the time of shooting.
- the information as the point of interest may be position information (latitude and longitude, or latitude, longitude, and altitude), or may be information of an object identified by a unique name (for example, the name of a specific tower). Further, the information of the object may include object information and position information in addition to the unique name.
- the method of detecting the point of interest is an example, and the point of interest may also be detected by other methods.
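- By way of illustration of the screen-to-geography mapping described above, the following Python sketch converts a detected gaze position on the screen into a geographical position within the imaging range. It is a minimal, non-authoritative sketch: the rectangular latitude/longitude imaging range and the helper name screen_to_geo are assumptions of the example (a real implementation would account for camera pose and terrain).

```python
# Minimal sketch (assumed interfaces): map a gaze point on the displayed
# captured image to a geographic position, approximating the imaging range
# by an axis-aligned latitude/longitude rectangle.

def screen_to_geo(gaze_px, screen_size, imaging_range):
    """gaze_px: (x, y) pixel position of the detected line of sight.
    screen_size: (width, height) of the displayed captured image in pixels.
    imaging_range: (lat_min, lat_max, lon_min, lon_max) of the ground area
    covered by the captured image (taken from the additional information)."""
    x, y = gaze_px
    w, h = screen_size
    lat_min, lat_max, lon_min, lon_max = imaging_range
    u, v = x / w, y / h  # normalize to [0, 1]; image top assumed at lat_max
    lat = lat_max - v * (lat_max - lat_min)
    lon = lon_min + u * (lon_max - lon_min)
    return lat, lon

# Example: the user gazes at the center of a 1920x1080 display.
point_of_interest = screen_to_geo((960, 540), (1920, 1080),
                                  (35.00, 35.01, 139.00, 139.01))
```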
- the interface unit 82 performs input and output of information and data between the transmitter 50 and the terminal 80.
- the interface unit 82 may be, for example, a USB connector (not shown) provided in the terminal 80.
- the interface unit 82 may be an interface other than the USB connector.
- the operation unit 83 receives data and information input by the operator of the terminal 80.
- the operation unit 83 may include a button, a key, a touch display screen, a microphone, and the like.
- in the case where the operation unit 83 and the display unit 88 are constituted by a touch display screen, the operation unit 83 can accept a touch operation, a click operation, a drag operation, and the like.
- the communication unit 85 communicates with the unmanned aircraft 100 by various wireless communication methods.
- the wireless communication method may include, for example, communication by wireless LAN, Bluetooth (registered trademark), short-range wireless communication, or a public wireless network. Further, the communication unit 85 can also perform wired communication.
- the memory 87 may have, for example, a ROM that stores a program defining the operation of the terminal 80 and data of set values, and a RAM that temporarily stores various information and data used when the terminal control unit 81 performs processing.
- the memory 87 may include memory other than the ROM and the RAM.
- the memory 87 can be disposed inside the terminal 80.
- the memory 87 can be configured to be detachable from the terminal 80.
- the program can include an application.
- the display unit 88 is configured by, for example, an LCD (Liquid Crystal Display), and displays various kinds of information and data output from the terminal control unit 81.
- the display unit 88 can display data of a captured image taken by the imaging device 220 of the unmanned aerial vehicle 100.
- the imaging unit 89 includes an image sensor and captures an image.
- the imaging unit 89 may be disposed on the front side including the display unit 88.
- the imaging unit 89 can capture an image (user image) whose subject includes the area around the eyes of the user who views the image displayed on the display unit 88.
- the imaging unit 89 can output the user image to the terminal control unit 81.
- the imaging unit 89 can also capture an image other than the user image.
- FIG. 6B is a block diagram showing the hardware configuration of the server 300.
- the server 300 is an example of an information processing apparatus.
- the server 300 has a server control unit 310, a communication unit 320, a memory 340, and a memory 330.
- the server control unit 310 is configured using, for example, a CPU, an MPU, or a DSP.
- the server control unit 310 performs signal processing for controlling the operation of each part of the server 300, input/output processing of data with other parts, arithmetic processing of data, and storage processing of data.
- the server control unit 310 can acquire data and information from the unmanned aircraft 100 via the communication unit 320.
- the server control unit 310 can acquire data and information from the terminal 80 via the communication unit 320.
- the server control section 310 can execute an application for instructing control of the unmanned aircraft 100.
- the server control unit 310 can generate various data used in the application.
- the server control unit 310 performs processing related to an information presentation instruction for avoiding collision of the unmanned aircraft 100.
- the server control unit 310 presents information based on the user's point of interest in the image captured by the unmanned aircraft 100.
- the server control unit 310 can directly acquire the position information of each unmanned aerial vehicle 100 from the unmanned aircraft 100 via the communication unit 320, or acquire the position information of the unmanned aerial vehicle 100 from each terminal 80 as the imaging position information included in the additional information.
- the server control unit 310 can sequentially acquire the position information of the unmanned aircraft 100, and calculate, based on the position information, information on the speed at which the unmanned aircraft 100 moves and the direction in which it is moving.
- the server control unit 310 may acquire information on the position, speed, and moving direction of the unmanned aircraft 100 included in the additional information from each terminal 80 via the communication unit 320.
- the memory 340 may have, for example, a program that defines the operation of the server 300, a ROM that stores data of the set value, and a RAM that temporarily stores various information and data used when the server control unit 310 performs processing.
- the memory 340 may include memory other than the ROM and the RAM.
- the memory 340 can be disposed inside the server 300.
- the memory 340 can be configured to be detachable from the server 300.
- the program can include an application.
- the communication unit 320 can communicate with other devices (for example, the transmitter 50, the terminal 80, and the unmanned aircraft 100) by wire or wirelessly.
- the memory 330 is a large-capacity recording medium capable of storing captured images, map information, and the like.
- FIG. 7 is a sequence diagram showing an information presentation instruction step by the server 300 in the first operation example.
- the unmanned aircraft 100 is used for FPV flight using the transmitter 50 and the terminal 80.
- the operator of the transmitter 50 and the terminal 80 can manipulate the unmanned aircraft 100 while observing, for example, the captured image of the unmanned aircraft 100 displayed on the display unit 88 of the terminal 80, without needing to visually observe the unmanned aircraft 100 itself.
- a plurality of users each operate a transmitter 50 and a terminal 80 that instruct the control of the flight of an unmanned aircraft 100.
- the user Ua (user U1) operates the transmitter 50 and the terminal 80 that instruct the control of the flight of the unmanned aircraft 100A.
- the user Ub (user U2) operates the transmitter 50 and the terminal 80 that instruct the control of the flight of the other unmanned aircraft 100B.
- the transmitter 50, the terminal 80, and the unmanned aircraft 100 operated by the user Ua, and their parts, are also indicated by "A" appended to the reference numeral (for example, the terminal 80A, the display unit 88A, and the unmanned aircraft 100A).
- the transmitter 50, the terminal 80, and the unmanned aircraft 100 operated by the user Ub, and their parts, are also indicated by "B" appended to the reference numeral (for example, the terminal 80B, the display unit 88B, and the unmanned aircraft 100B).
- the imaging device 220 repeats the shooting (T1).
- the UAV control unit 110 can record the captured image captured by the imaging device 220 in the memory 160, and also record the additional information related to the captured image in the memory 160.
- the UAV control unit 110 transmits the captured image stored in the memory 160 and its additional information to the terminal 80 via the communication interface 150 (T2).
- the terminal control unit 81 receives the captured image transmitted from the unmanned aircraft 100A and its additional information via the communication unit 85 (T3).
- the terminal control unit 81 causes the display unit 88 to display a captured image.
- the terminal control unit 81 detects a point of interest (T4) that the user who operates the terminal 80 focuses on in the captured image displayed on the display unit 88.
- FIG. 8 is a diagram showing an example of detection of a point of interest.
- the captured image GZ1 captured by the imaging device 220 is displayed on the display unit 88.
- the terminal control unit 81 determines, by line-of-sight detection, that the user's gaze position with respect to the captured image GZ1 displayed on the display unit 88 is the tower J1, and detects it as the point of interest tp1.
- the terminal control unit 81 transmits the information of the point of interest to the server 300 via the communication unit 85 (T5). Further, the terminal control unit 81 may transmit at least a part of the additional information acquired by the terminal 80 from the unmanned aircraft 100 via the communication unit 85 in addition to the information of the point of interest.
- the server control unit 310 receives information (for example, information of a point of interest) transmitted from the terminal 80 via the communication unit 320, and stores it in the memory 330 (T6).
- similarly, the server 300 receives information (for example, information of a point of interest) from the terminal 80B related to the other unmanned aircraft 100B, in addition to the terminal 80A related to the unmanned aircraft 100A, and stores it in the memory 330.
- the server control unit 310 determines whether, among the information of the one or more points of interest stored in the memory 330, there is information of a plurality of points of interest whose position information or object is the same (common) (T7).
- a point of interest whose position information or object is common in this way is also referred to as a common point of interest.
- FIG. 9A is a diagram showing the captured images GZ1 and GZ2 displayed on the respective display units 88 when the attention points of the two users (operators) are the common points of interest.
- the two users may be, for example, the users U1 and U2.
- the captured images GZ1 and GZ2 are images captured by the unmanned aircraft 100A and 100B from different imaging directions, and each includes the tower J1, the bridge J2, and the building J3.
- the points of interest tp1 and tp2 are both the tower J1; since they are common, they are common points of interest.
- FIG. 9B is a view showing the captured images GZ1 and GZ2 respectively displayed on the display units 88A and 88B of the respective terminals when the points of interest of the two users are not the common points of interest.
- the point of interest tp1 with respect to the captured image GZ1 is the tower J1.
- the point of interest tp2 with respect to the captured image GZ2 is the bridge J2. Therefore, the points of interest tp1 and tp2 are not common points of interest.
- when the information of the point of interest is not information of an object such as the tower but geographical position information, the two points of interest may be determined to be a common point of interest, for example, when the distance between them is smaller than a threshold.
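- As a concrete illustration of this determination, the following Python sketch treats two points of interest as common when they name the same object, or, for position-based points of interest, when the distance between them is below a threshold. The 50 m threshold and the equirectangular distance approximation are assumptions of the example, not values specified in the present disclosure.

```python
import math

def is_common_point_of_interest(p1, p2, threshold_m=50.0):
    """p1, p2: each either an object name (str, e.g. "tower J1") or a
    (lat, lon) tuple. threshold_m: assumed distance threshold in meters."""
    if isinstance(p1, str) and isinstance(p2, str):
        # Object-based points of interest are common when they identify
        # the same object.
        return p1 == p2
    # Position-based points of interest: approximate the ground distance
    # with an equirectangular projection (adequate over short distances).
    (lat1, lon1), (lat2, lon2) = p1, p2
    mean_lat = math.radians((lat1 + lat2) / 2)
    dx = math.radians(lon2 - lon1) * math.cos(mean_lat) * 6_371_000
    dy = math.radians(lat2 - lat1) * 6_371_000
    return math.hypot(dx, dy) < threshold_m
```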
- when there is no information of a plurality of common points of interest, the server control unit 310 returns to step T6 in FIG. 7.
- in step T7, when there is information of a plurality of common points of interest, the server control unit 310 transmits, to the terminal 80A that has transmitted the information of the common point of interest, via the communication unit 320, the information of the other unmanned aerial vehicle 100B different from the unmanned aerial vehicle 100A whose flight the terminal 80A instructs to control (T8).
- similarly, in step T8, the server control unit 310 transmits, to the terminal 80B that has transmitted the information of the common point of interest, via the communication unit 320, the information of the unmanned aircraft 100A different from the unmanned aircraft 100B whose flight the terminal 80B instructs to control.
- the information of the other unmanned aerial vehicle 100B herein may include, for example, information indicating the presence of the other unmanned aerial vehicle 100B, position information of the other unmanned aerial vehicle 100B, information on the moving direction of the other unmanned aerial vehicle 100B, and the like.
- the information of the unmanned aircraft 100A herein may include, for example, information indicating the presence of the unmanned aircraft 100A, position information of the unmanned aircraft 100A, information on the moving direction of the unmanned aircraft 100A, and the like.
- the terminal control unit 81 receives information of the other unmanned aircraft 100B via the communication unit 85 (T9). Similarly, in the other terminal 80B, the terminal control unit 81 receives the information of the unmanned aircraft 100A (T9).
- the terminal control unit 81 of the terminal 80A causes the display unit 88 to display a superimposed image with respect to the captured image GZ1 displayed on the display unit 88 based on the received information of the unmanned aircraft 100B (T10).
- for example, an arrow-shaped mark mk1 can be displayed superimposed on the captured image GZ1.
- the terminal control unit 81 of the terminal 80A may display information of the other unmanned aircraft 100B in addition to the mark mk1 which is a superimposed image with respect to the captured image GZ1 displayed on the display unit 88.
- the terminal control unit 81 may display whether another unmanned aerial vehicle exists, as well as its position, speed, moving direction, and the like.
- the terminal 80 may include a speaker for sound output of information of other unmanned aerial vehicles 100B.
- the terminal 80 may also include a vibrator that represents information of other unmanned aerial vehicles 100B by vibration.
- the server 300 acquires information of the point of interest from each terminal 80, and determines whether or not there is a common point of interest among the plurality of acquired points of interest.
- the point of interest is a location or object that the user is interested in, and there is a high possibility that the unmanned aircraft 100 is flying toward it. Therefore, when the points of interest are a common point of interest, the possibility that the unmanned aircraft 100 collide with each other becomes higher as they approach the geographical position corresponding to the point of interest. Even in such a case, information related to the other unmanned aircraft 100 operated by another user can be presented to the user of each terminal 80 who is confirming his or her own aircraft during FPV flight. Therefore, the user of each terminal 80 can recognize that another user is also paying attention to the same point of interest, and can maneuver the unmanned aircraft 100 with improved safety.
- the server control unit 310 may determine whether or not the unmanned aircraft 100A and 100B, for which the points of interest have been received, are moving. In this case, the server control unit 310 can acquire the information of the points of interest via the communication unit 320 while also acquiring time-series position information of the unmanned aircraft 100A and 100B.
- the position information may be included in the additional information of the captured image as the imaging position information, and may be sequentially acquired from the terminals 80A and 80B via the communication unit 320, or may be directly obtained from the unmanned aircraft 100A and 100B in order.
- when the unmanned aircraft 100A and 100B are moving, the server control unit 310 can instruct the information presentation based on the common point of interest; when the unmanned aircraft 100A and 100B are not moving, the server control unit 310 may not instruct the information presentation based on the common point of interest.
- thereby, when the unmanned aircraft 100A and 100B are moving, the possibility of a collision increases, and thus the server 300 can instruct the information presentation.
- when the unmanned aircraft 100A and 100B are not moving, the possibility of a collision between them is low even when attention is paid to a common position or object, and therefore the server 300 may not instruct the information presentation.
- the server control unit 310 can determine whether the unmanned aircraft 100A is moving. In the case where the unmanned aircraft 100A is moving, the server control section 310 can cause the terminal 80A to display information related to the unmanned aircraft 100B.
- thereby, the server 300 can notify the information of the other unmanned aircraft 100B as warning information, and can suppress a collision between the unmanned aircraft 100A and the unmanned aircraft 100B.
- the server control unit 310 can calculate the distance r1 from the unmanned aircraft 100A to the unmanned aircraft 100B based on the acquired position information of the unmanned aircraft 100A, 100B. When the distance r1 is less than or equal to the threshold, the server control unit 310 can cause the terminal 80A to display information related to the unmanned aircraft 100B, for example, a flag indicating the presence of the unmanned aircraft 100B.
- the threshold here may be the same as the threshold Dist1 used in the second operation example described later.
- the server 300 can suppress the occurrence of a collision of the unmanned aircraft 100A with the unmanned aircraft 100B.
- FIG. 10 is a diagram showing the positional relationship of the two unmanned aerial vehicles 100A and 100B. Three-dimensional coordinates with the unmanned aircraft 100A as the origin are set. In this case, in FIG. 10, the unmanned aircraft 100B is located in the positive x direction and the positive y direction, and is also located in the positive z direction (that is, at a position higher in the sky than the unmanned aircraft 100A).
- FIG. 11A is a view showing a captured image GZ1 of the unmanned aerial vehicle 100A displayed on the display unit 88A of the terminal 80A.
- the positional relationship of the unmanned aerial vehicles 100A, 100B is the positional relationship of FIG.
- the terminal control unit 81A of the terminal 80A receives an instruction to display information from the server 300 (see T8 in FIG. 7), and displays various information via the display unit 88A.
- in FIG. 11A, at the upper right end portion of the display unit 88A, a mark mk1 shaped like an arrow pointing from right to left is displayed superimposed on the captured image GZ1.
- this mark mk1 indicates that another unmanned aircraft 100B is flying at the upper right, in a blind spot outside the display range of the display unit 88A. Therefore, when the imaging direction of the unmanned aircraft 100A is shifted in the direction indicated by the mark mk1, the other unmanned aerial vehicle 100B appears in the captured image GZ1.
- the server control unit 310 of the server 300 can acquire the location information of the unmanned aerial vehicle 100A via the communication unit 320.
- the server control unit 310 can acquire the position information of the unmanned aircraft 100B via the communication unit 320.
- the position information may be included in the additional information of the captured image as the imaging position information, and may be sequentially acquired from the terminals 80A and 80B via the communication unit 320, or may be directly acquired from the unmanned aircraft 100A and 100B in order.
- the server control unit 310 can determine the position at which the marker mk1 is displayed in consideration of the positional relationship of the unmanned aircrafts 100A and 100B based on the position information of the unmanned aircrafts 100A and 100B.
- the server control unit 310 can instruct the terminal 80A via the communication unit 320 to display information related to the unmanned aircraft 100B (for example, information indicating that the unmanned aircraft 100B exists) including the information indicating the position of the mark mk1.
- the mark mk1 may also indicate the moving direction of the unmanned aircraft 100B.
- for example, the mark mk1 may indicate that the unmanned aerial vehicle 100B is flying from right to left with respect to the geographical range and orientation corresponding to the captured image displayed on the display unit 88A.
- information related to the unmanned aircraft 100B such as the position and speed of the unmanned aircraft 100B may be displayed by information other than the mark mk1.
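- One conceivable way of placing such a mark, sketched below in Python, is to compare the bearing from the own aircraft to the other aircraft with the imaging direction and clamp the mark to the nearer screen edge. The 30-degree half field of view and the flat local coordinate frame are assumptions of the example, not details specified in the present disclosure.

```python
import math

def marker_direction(own_pos, other_pos, imaging_yaw_deg, half_fov_deg=30.0):
    """own_pos, other_pos: (x, y) horizontal positions of the two unmanned
    aircraft in a common local frame (meters, x east, y north).
    imaging_yaw_deg: compass heading of the imaging direction."""
    dx = other_pos[0] - own_pos[0]
    dy = other_pos[1] - own_pos[1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360  # 0 deg = north
    # Angle of the other aircraft relative to the optical axis; positive
    # values lie to the right of the displayed image.
    rel = (bearing - imaging_yaw_deg + 180) % 360 - 180
    if abs(rel) < half_fov_deg:
        return "in view"  # the other aircraft may appear in the image
    return "right edge" if rel > 0 else "left edge"

# Example: the camera faces north and the other aircraft is to the
# northeast, so the mark is placed at the right edge of the screen.
print(marker_direction((0, 0), (100, 100), 0.0))  # -> "right edge"
```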
- FIG. 11B is a view showing the captured image GZ2 of the unmanned aerial vehicle 100B displayed on the display unit 88B of the other terminal 80B.
- the positional relationship of the unmanned aerial vehicles 100A, 100B is the positional relationship of FIG.
- a mark mk2 shaped like an arrow pointing from left to right is displayed superimposed on the captured image GZ2.
- this mark mk2 indicates that another unmanned aircraft 100A is flying at the left, in a blind spot outside the display range of the display unit 88B. Therefore, when the imaging direction of the unmanned aircraft 100B is shifted in the direction indicated by the mark mk2, the other unmanned aircraft 100A appears in the captured image GZ2.
- the server control unit 310 of the server 300 can acquire the location information of the unmanned aerial vehicle 100A via the communication unit 320.
- the server control unit 310 can acquire the position information of the unmanned aircraft 100B via the communication unit 320.
- the position information may be included in the additional information of the captured image as the imaging position information, and may be sequentially acquired from the terminals 80A and 80B via the communication unit 320, or may be directly acquired from the unmanned aircraft 100A and 100B in order.
- the server control unit 310 can determine the position at which the marker mk2 is displayed in consideration of the positional relationship of the unmanned aircrafts 100A and 100B based on the position information of the unmanned aircrafts 100A and 100B.
- the server control unit 310 can instruct the terminal 80B via the communication unit 320 to display information related to the unmanned aircraft 100A (for example, information indicating that the unmanned aircraft 100A exists) including the information indicating the position of the mark mk2.
- the mark mk2 may also indicate the moving direction of the unmanned aircraft 100A.
- for example, the mark mk2 may indicate that the unmanned aerial vehicle 100A is flying from left to right with respect to the geographical range and orientation corresponding to the captured image displayed on the display unit 88B.
- information related to the unmanned aircraft 100A such as the position and speed of the unmanned aircraft 100A may be displayed by information other than the mark mk2.
- the server control section 310 of the server 300 can acquire the positional information of the unmanned aircraft 100A and the unmanned aircraft 100B.
- the server control section 310 can instruct the display section 88A to display information indicating the presence of the unmanned aircraft 100B at a position according to the position of the unmanned aircraft 100B with respect to the unmanned aircraft 100A.
- the terminal 80A receives the instruction from the server 300, and can display the mark mk1 (an example of presence information) indicating the presence of the unmanned aircraft 100B at a position corresponding to the position of the unmanned aircraft 100B with respect to the unmanned aircraft 100A, for example, at the upper right end of the screen of the display unit 88A.
- thereby, the user U1 who operates the terminal 80A can easily and intuitively grasp the position of the unmanned aircraft 100B. Therefore, the user U1 can more easily operate the terminal 80A in consideration of the position of the unmanned aircraft 100B. Therefore, the server 300 can suppress a collision of the unmanned aircraft 100A with the unmanned aircraft 100B.
- in this way, when the unmanned aircraft 100A (the own aircraft) and the other unmanned aircraft 100B corresponding to another user paying attention to the common point of interest are present, the mark mk1 indicating the other unmanned aircraft 100B is displayed on the display unit 88A of the terminal 80A even if the other unmanned aircraft 100B does not appear in the captured image.
- in this way, the server control unit 310 acquires the point of interest tp1 that the user U1, who operates the terminal 80A instructing the control of the flight of the unmanned aircraft 100A, pays attention to in the captured image GZ1 captured by the unmanned aircraft 100A and displayed on the terminal 80A.
- the server control unit 310 acquires the point of interest tp2 that the user U2, who operates the terminal 80B instructing the control of the flight of the unmanned aircraft 100B, pays attention to in the captured image GZ2 captured by the unmanned aircraft 100B and displayed on the terminal 80B.
- the server control unit 310 determines whether the point of interest tp1 and the point of interest tp2 are common points of interest indicating the same point of interest. When they are a common point of interest, the server control unit 310 causes the terminal 80A to display the mark mk1 indicating the presence or proximity of the unmanned aircraft 100B.
- the server control unit 310 is an example of a processing unit.
- the unmanned aerial vehicle 100A is one example of the first flying body.
- the terminal 80A is an example of the first terminal.
- the captured image GZ1 is an example of the first image.
- the point of interest tp1 is an example of the first point of interest.
- the unmanned aerial vehicle 100B is an example of a second flying body.
- the terminal 80B is an example of the second terminal.
- the captured image GZ2 is an example of the second image.
- the point of interest tp2 is an example of the second point of interest.
- thereby, the user U1 can grasp information related to the unmanned aircraft 100B existing around the unmanned aircraft 100A. Therefore, even in FPV flight, in which it is difficult to confirm the situation around the unmanned aircraft 100A, and even when the destinations of a plurality of unmanned aircraft 100 are the same destination corresponding to a common point of interest, the user U1 can operate the terminal 80A in consideration of the information related to the unmanned aircraft 100B. Therefore, the server 300 can suppress a collision of the unmanned aircraft 100A with the unmanned aircraft 100B.
- in the first operation example, a case is shown in which a mark indicating the presence of another unmanned aircraft having a common point of interest is displayed on the display unit of the terminal superimposed on the captured image.
- in the second operation example, a case is shown in which, when there is a common point of interest and the distance to the other unmanned aircraft is less than or equal to a threshold, recommendation information is displayed on the terminal that instructs the control of the flight of the unmanned aircraft.
- FIGS. 12A and 12B are sequence diagrams showing an information presentation instruction step from the unmanned aerial vehicle viewpoint by the server 300 in the second operation example.
- the same steps as those in the first operation example are denoted by the same reference numerals, and the description thereof is omitted or simplified.
- the flight system 10 performs the processing of T1 to T6.
- in step T7, in the case where a plurality of unmanned aerial vehicles 100 having a common point of interest exist, the server control unit 310 of the server 300 determines whether the distance r1 from the unmanned aircraft 100A to the other unmanned aerial vehicle 100B is equal to or less than a threshold Dist1 (T8A).
- FIG. 13 is a diagram showing the thresholds Dist1 and Dist2 set with respect to the distance r1 between the two unmanned aircraft 100A and 100B.
- the distance r1 between the two unmanned aircraft 100A and 100B is expressed by formula (1), using the position coordinates (x, y, z) of the unmanned aircraft 100B in the coordinate system having the unmanned aircraft 100A as its origin: r1 = √(x² + y² + z²) … (1)
- the threshold Dist1, which is compared with the distance r1 to the other unmanned aerial vehicle 100B, is a value for recommending a speed reduction when the aircraft is expected to approach the other unmanned aircraft 100B.
- the threshold Dist2 that is compared with the distance r1 from the other unmanned aircraft 100B is a value for recommending a temporary stop such as hovering when it is expected to collide with the other unmanned aircraft 100B. Therefore, the threshold Dist2 is a value smaller than the threshold Dist1.
- when the distance r1 is larger than the threshold Dist1 in step T8A of FIG. 12A, the server control unit 310 of the server 300 returns to step T6.
- when the distance r1 is equal to or less than the threshold Dist1, the server control unit 310 determines whether or not the distance r1 is equal to or smaller than the threshold Dist2 (T9A).
- when the distance r1 is larger than the threshold Dist2, the server control unit 310 recommends the low-speed flight mode and generates recommendation information for recommending the low-speed flight mode (T10A).
- when the distance r1 is equal to or less than the threshold Dist2, the server control unit 310 recommends a temporary stop (temporary stop mode) such as hovering, and generates recommendation information for recommending the temporary stop (T11A).
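- The decision logic of steps T8A to T11A can be summarized in a short sketch. The following Python fragment is an illustration only; the threshold values and the coordinate handling are assumptions of the example.

```python
import math

DIST1 = 100.0  # assumed threshold [m] for recommending the low-speed flight mode
DIST2 = 30.0   # assumed threshold [m] for recommending a temporary stop

def recommend(pos_a, pos_b):
    """pos_a, pos_b: (x, y, z) positions of the unmanned aircraft 100A and
    100B in a common frame. Mirrors steps T8A to T11A."""
    # Distance r1 of formula (1), taking the unmanned aircraft 100A as origin.
    x, y, z = (pb - pa for pa, pb in zip(pos_a, pos_b))
    r1 = math.sqrt(x * x + y * y + z * z)
    if r1 > DIST1:
        return None                         # no recommendation (back to T6)
    if r1 > DIST2:
        return "low-speed flight mode"      # T10A
    return "temporary stop (hovering)"      # T11A
```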
- the server control unit 310 transmits the recommendation information of step T10A or step T11A to the terminal 80A that instructs the control of the flight of the unmanned aircraft 100A via the communication unit 320 (T12A).
- the terminal control unit 81 of the terminal 80A receives the recommendation information from the server 300 via the communication unit 85 (T13A).
- the terminal control unit 81 displays a recommendation screen including recommendation information on the display unit 88 based on the recommendation information (T14A).
- FIG. 14A is a diagram showing the recommendation screen GM1 displayed on the display unit 88 in the case where the distance r1 is within the threshold Dist1.
- a message "Please set to the low-speed flight mode” is displayed.
- FIG. 14B is a diagram showing the recommendation screen GM2 displayed on the display unit 88 in the case where the distance r1 is within the threshold Dist2.
- a message "Please stop temporarily” is displayed.
- the messages shown in FIGS. 14A and 14B are displayed separately on the recommendation screens GM1 and GM2. In addition, these messages may be displayed superimposed on the captured image, or may be displayed superimposed on the captured image together with the mark shown in the first operation example.
- in this way, when the unmanned aircraft 100 approach each other, the server 300 can present warning information for restricting the flight speed to the terminal 80 that instructs the control of the flight of the unmanned aircraft 100. Therefore, even in the case where the unmanned aircraft 100 are close to each other, the terminal 80 can improve the safety of the flight of the unmanned aircraft 100 while causing the unmanned aircraft 100 to perform FPV flight. Further, when the unmanned aircraft 100 approach each other more closely, warning information for further limiting the speed can be presented; for example, each unmanned aerial vehicle 100 can also be made to hover. Therefore, the server 300 can change the importance of the warning in stages according to the degree of proximity of the unmanned aircraft 100 while presenting information. Therefore, the user of each terminal 80 can recognize the approach of an unmanned aircraft 100 other than the unmanned aircraft 100 operated by that user when performing FPV flight toward the common point of interest, and can take necessary measures to maneuver the unmanned aircraft 100 in accordance with the presented information.
- in the above description, the terminal 80A displays the recommendation screen.
- however, the server control unit 310 may instruct the unmanned aircraft 100A to perform flight control such as the low-speed flight mode or a temporary stop (hovering or the like) instead of, or in addition to, the instruction to display the recommendation screen.
- in this way, when a plurality of unmanned aircraft 100 approach each other, the low-speed flight mode is recommended.
- further, when there is a high possibility that a plurality of unmanned aircraft 100 collide, a temporary stop such as hovering is recommended. Thereby, it is helpful to avoid collisions of the unmanned aircraft 100 with each other.
- the server control unit 310 can cause the terminal 80A to display recommendation information recommending that the flight speed of the unmanned aerial vehicle 100A be restricted (for example, recommendation information for setting the low-speed flight mode).
- in this case, instruction information for the display can be transmitted to the terminal 80A.
- by confirming the display of the recommendation information on the terminal 80A, the user U1 can grasp that restricting the flight speed of the unmanned aircraft 100A is recommended.
- the terminal 80A performs a setting for restricting the flight speed, and causes the unmanned aircraft 100A to fly at the restricted speed.
- the setting of the speed limit may be automatically set by the terminal control unit 81 based on the recommended information, or may be manually set via the operation unit 83.
- thereby, compared with the case of high-speed flight, the user U1 can more easily confirm the state of the unmanned aircraft 100A on the screen of the terminal 80A, and a collision with the unmanned aircraft 100B can be suppressed.
- the server control unit 310 can cause the terminal 80A to display recommendation information such that the shorter the distance r1 is, the more the flight speed of the unmanned aircraft 100A is restricted to a low speed (for example, recommendation information for a temporary stop).
- in this case, instruction information for the display can be transmitted to the terminal 80A.
- the shorter the distance r1, the higher the possibility of a collision becomes even with a short moving distance. However, since the server 300 makes the unmanned aircraft 100A fly at a lower speed as the distance r1 becomes shorter, more time is required to move to the position of the unmanned aircraft 100B, and a collision can be more easily avoided.
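- As an illustration, one possible mapping from the distance r1 to a speed ceiling is a linear scaling, sketched below in Python. This mapping and its numerical values are assumptions of the example; the text above only requires that a shorter distance r1 lead to a lower permitted speed.

```python
def speed_limit(r1, v_max=15.0, dist1=100.0):
    """r1: distance [m] to the other unmanned aircraft.
    v_max: assumed normal maximum speed [m/s].
    dist1: assumed distance [m] at which the limitation starts."""
    if r1 >= dist1:
        return v_max               # far enough apart: no limitation
    return v_max * r1 / dist1      # the shorter r1, the lower the ceiling

# Example: at half the threshold distance the permitted speed is halved.
assert speed_limit(50.0) == 7.5
```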
- FIG. 15A is a diagram showing a state in which unmanned aircraft are manipulated by visual observation in a comparative example.
- in this case, the fields of view CA1 of the users U1 and U2 are wide. Therefore, the users U1 and U2 can easily avoid situations in which the unmanned aircraft approach each other, and it is less likely that the unmanned aircraft collide with each other.
- FIG. 15B is a diagram showing a state in which the unmanned aerial vehicles are operated in the FPV flight mode of the present embodiment.
- when the two users U1 and U2 operate the unmanned aircraft 100A and 100B respectively while observing the display units 88 of their terminals 80, the fields of view CA2 of the users U1 and U2 are narrow. Even if another unmanned aerial vehicle flies near the unmanned aircraft operated by the users U1 and U2, it is hard for them to notice. Therefore, the unmanned aircraft easily collide with each other.
- the server 300 can take the lead in performing information presentation based on the point of interest of the user of each terminal 80. That is, the server control unit 310 can acquire the point of interest tp1 from the terminal 80A via the communication unit 320, and acquire the point of interest tp2 from the terminal 80B. The server control unit 310 can transmit information to be displayed on the terminal 80A (for example, information related to the unmanned aircraft 100B and recommendation information) to the terminal 80A via the communication unit 320.
- the server 300 can collectively process the information of the points of interest detected by the plurality of terminals 80 existing in the flight system 10, and instruct the information presentation. Therefore, the server 300 can reduce, for example, the processing load on the terminals 80 involved in the information presentation processing based on the information of the common point of interest.
- in the above description, the server control unit 310 of the server 300 transmits the information related to the other unmanned aircraft 100 and the recommendation information to the terminal 80; however, it may also instruct the unmanned aircraft 100 to perform control corresponding to the recommendation information.
- the server control unit 310 can transmit the flight control information such as the low-speed flight mode and the temporary stop to the terminal 80 via the communication unit 320.
- when the terminal control unit 81 of the terminal 80 receives the flight control information via the communication unit 85, it can instruct the unmanned aircraft 100 to control its flight based on the flight control information.
- the server control section 310 can limit the flying speed of the unmanned aircraft 100A.
- the restriction indication information for the restriction may be directly transmitted to the unmanned aircraft 100A or may be transmitted via the terminal 80A.
- the server 300 can limit the speed of the unmanned aircraft 100A according to the positional relationship between the unmanned aircraft 100A and the unmanned aircraft 100B by performing the restriction instruction of the speed of the flight of the unmanned aircraft 100A.
- thereby, even if the user U1 operates the terminal 80A without noticing the presence of the unmanned aircraft 100B, it is possible to suppress the unmanned aircraft 100A from flying at high speed in accordance with an instruction from the terminal 80A, and a collision with the unmanned aircraft 100B can be suppressed.
- the server control unit 310 can limit the speed of flight of the unmanned aircraft 100A to a low speed.
- the restriction indication information for the restriction may be directly transmitted to the unmanned aircraft 100A or may be transmitted via the terminal 80A.
- thereby, the closer the unmanned aircraft 100A is to the unmanned aircraft 100B, the lower the flight speed becomes. Therefore, although a collision becomes easier the closer the unmanned aircraft 100A is to the unmanned aircraft 100B, since the flight speed is restricted to a correspondingly low speed, the server 300 can suppress a collision with the unmanned aircraft 100B.
- in the first embodiment, a case where a plurality of unmanned aerial vehicles 100 approach each other is shown.
- in the second embodiment, a case where unmanned aircraft 100 approach a destination that is a common point of interest is shown.
- the configuration of the flight system 10 in the second embodiment has substantially the same configuration as that of the first embodiment.
- the same components as those of the first embodiment are denoted by the same reference numerals, and the description thereof will be omitted or simplified.
- FIGS. 16A and 16B are sequence diagrams showing an information presentation instruction step from the destination viewpoint by the server 300 in the second embodiment.
- the same steps as those of the first operation example shown in FIG. 7 and the second operation example shown in FIGS. 12A and 12B are denoted by the same reference numerals, and the description thereof is omitted or simplified.
- the flight system 10 performs the processing of T1 to T6.
- the server control unit 310 of the server 300 determines whether or not an unmanned aircraft 100 exists within the range of the radius r2 from the common point of interest (T8B).
- the radius r2 is a value for recommending a speed reduction under the premise that the unmanned aircraft 100 is expected to approach a common point of interest.
- the location information of the common point of interest can be obtained from the map information held in the memory 330 of the server 300.
- the map information may be stored in an external map server, and the server control unit 310 may acquire the map information via the communication unit 320.
- when no unmanned aircraft 100 exists within the range of the radius r2, the server control unit 310 returns to step T6.
- when an unmanned aircraft 100 exists within the range of the radius r2, the server control unit 310 determines whether or not an unmanned aircraft 100 exists within the range of the radius r3 from the common point of interest (T9B).
- the radius r3 is a value for recommending a temporary stop such as hovering on the premise that unmanned aircraft 100 are expected to collide at the common point of interest.
- the radius r3 is a value smaller than the radius r2.
- when no unmanned aircraft 100 exists within the range of the radius r3, the server control unit 310 recommends the low-speed flight mode to the terminal 80 corresponding to the relevant unmanned aircraft 100 (for example, an unmanned aircraft 100 located between the circle of radius r2 and the circle of radius r3) (T10B).
- in step T9B, in the case where an unmanned aircraft 100 is present within the range of the radius r3, the server control unit 310 recommends a temporary stop such as hovering to the terminal 80 corresponding to the relevant unmanned aircraft 100 (for example, an unmanned aircraft 100 located inside the circle of radius r2) (T11B).
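- The checks of steps T8B to T11B can likewise be sketched in Python. The radii values and the two-dimensional distance computation are assumptions of the example.

```python
import math

R2 = 200.0  # assumed radius [m] for recommending the low-speed flight mode
R3 = 50.0   # assumed radius [m] for recommending a temporary stop

def destination_recommendation(uav_pos, poi_pos):
    """uav_pos, poi_pos: (x, y) horizontal positions [m] of an unmanned
    aircraft and the common point of interest in a common local frame."""
    d = math.dist(uav_pos, poi_pos)
    if d > R2:
        return None                        # outside radius r2: no action
    if d > R3:
        return "low-speed flight mode"     # between r3 and r2 (T10B)
    return "temporary stop (hovering)"     # inside r3 (T11B)
```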
- the server control unit 310 transmits the recommendation information of step T10B or step T11B to the terminal 80 corresponding to the corresponding unmanned aircraft 100 via the communication unit 320 (T12B).
- the terminal control unit 81 of the terminal 80 corresponding to the corresponding unmanned aircraft 100 receives the recommendation information from the server 300 via the communication unit 85 (T13A).
- the terminal control unit 81 displays a recommendation screen on the display unit 88 based on the recommendation information (T14A).
- the server control section 310 of the server 300 can acquire the location information of the common point of interest.
- the server control section 310 can acquire the location information of the unmanned aircraft 100.
- the server control unit 310 can cause the terminal 80 to display recommendation information recommending that the flight speed of the unmanned aircraft 100 be limited. In this case, instruction information for the display can be transmitted to the terminal 80.
- the terminal 80 performs a setting for limiting the flight speed, and causes the unmanned aircraft 100 to fly at the limited speed.
- the setting of the speed limit may be automatically set by the terminal control unit 81 based on the recommended information, or may be manually set via the operation unit 83.
- the server control section 310 can cause the terminal 80 to display recommendation information that the shorter the distance from the common point of interest to the unmanned aircraft 100, the more the speed of the flight of the unmanned aircraft 100 is limited to the low speed. In this case, the displayed indication information can be transmitted to the terminal 80.
- the shorter the distance from the common point of interest to the unmanned aerial vehicle 100, the higher the probability of a collision becomes even with a short moving distance.
- however, the shorter the distance from the common point of interest to the unmanned aerial vehicle 100, the more the server 300 causes the unmanned aerial vehicle 100 to fly at a low speed. Therefore, the server 300 can extend the time required to move to the common point of interest, and a collision can be more easily avoided.
- in the above description, the server control unit 310 of the server 300 transmits the recommendation information to the terminal 80; however, it may also instruct the unmanned aircraft 100 to perform control corresponding to the recommendation information.
- the server control unit 310 can transmit the flight control information such as the low-speed flight mode and the temporary stop to the terminal 80 via the communication unit 320.
- when the terminal control unit 81 of the terminal 80 receives the flight control information via the communication unit 85, it can instruct the unmanned aircraft 100 to control its flight based on the flight control information.
- the server control section 310 may limit the flying speed of the unmanned aircraft 100A.
- the restriction indication information for the restriction may be directly transmitted to the unmanned aircraft 100A or may be transmitted via the terminal 80A.
- the server 300 can limit the speed of the unmanned aircraft 100A according to the positional relationship between the unmanned aircraft 100A and the common point of interest by performing the restriction instruction of the flight speed of the unmanned aircraft 100A.
- thereby, the unmanned aircraft 100A can be suppressed from flying at high speed in accordance with an instruction from the terminal 80A, and collisions with an object existing at the common point of interest (the destination) and with another unmanned aerial vehicle 100B approaching the common point of interest can be suppressed.
- the server control unit 310 can limit the speed of flight of the unmanned aerial vehicle 100A to a low speed.
- the restriction indication information for the restriction may be directly transmitted to the unmanned aircraft 100A or may be transmitted via the terminal 80A.
- in a case where a plurality of unmanned aerial vehicles 100 simultaneously approach the destination that is the common point of interest, the server control unit 310 may perform control such that the unmanned aircraft 100 approach the common point of interest sequentially in a predetermined order.
- This control information may be directly transmitted to the unmanned aircraft 100A or may be transmitted via the terminal 80A.
- the server 300 can avoid collision of the unmanned aircraft 100 with each other and cause each unmanned aircraft 100 to reach the destination.
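- One conceivable realization of such ordering control is sketched below in Python, under the assumption (not specified in the present disclosure) that approach clearance is granted in order of current distance to the common point of interest.

```python
import math

def approach_order(uavs, poi):
    """uavs: {uav_id: (x, y)} current horizontal positions [m].
    poi: (x, y) position of the destination (common point of interest).
    Returns a per-aircraft action: the nearest aircraft proceeds while the
    others hold (for example, hover) until the one ahead has arrived."""
    ranked = sorted(uavs, key=lambda uid: math.dist(uavs[uid], poi))
    return {uid: ("proceed" if i == 0 else "hold")
            for i, uid in enumerate(ranked)}

# Example with three aircraft at different distances from the destination.
print(approach_order({"100A": (10, 0), "100B": (40, 0), "100C": (25, 0)},
                     (0, 0)))
# -> {'100A': 'proceed', '100C': 'hold', '100B': 'hold'}
```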
- in the first and second embodiments, the case where the server 300 performs the operation of the information presentation instruction for avoiding collision of the unmanned aircraft 100 is shown.
- in the third embodiment, a case is shown in which any one of the plurality of terminals 80 performs the operation of the information presentation instruction for avoiding collision of the unmanned aircraft 100.
- the flight system 10 of the third embodiment has substantially the same configuration as the first embodiment.
- the same components as those in the first embodiment are denoted by the same reference numerals, and the description thereof will be omitted or simplified.
- the terminal control unit 81 of the terminal 80P performs, instead of the server 300, the processing related to the information presentation instruction for avoiding collision of the unmanned aircraft 100.
- the terminal control unit 81 presents information based on the point of interest of the user in the image captured by the unmanned aircraft 100.
- the terminal control unit 81 can perform the same processing as that performed by the server control unit 310 of the server 300 in the first and second embodiments.
- the terminal control unit 81 is an example of a processing unit.
- the terminal 80 will be mainly described as the terminal 80P or another terminal 80Q. There may be more than one terminal 80Q.
- the terminal 80P instructs the control of the flight of the unmanned aircraft 100P to be operated by the user Up. Further, the terminal 80Q instructs the control of the flight of the unmanned aircraft 100Q to be operated by the user Uq.
- the terminal 80P may be the terminal 80A.
- the terminal 80Q may be the terminal 80B.
- the unmanned aerial vehicle 100P may be the unmanned aerial vehicle 100A.
- the unmanned aerial vehicle 100Q may be the unmanned aerial vehicle 100B.
- the terminal 80P and the other terminal 80Q that perform the information presentation instruction according to the common point of interest may establish a communication link in advance so as to be able to communicate with each other.
- FIG. 17 is a sequence diagram showing an information presentation instruction step from the unmanned aerial vehicle viewpoint by the terminal 80 in the first operation example of the third embodiment.
- the same steps as those in FIG. 7 in the first operation example of the first embodiment are denoted by the same step numbers, and the description thereof is omitted or simplified.
- the terminal 80P is an example of an information processing apparatus.
- the flight system 10 performs the processing of T1 to T5.
- the imaging device 220 repeats imaging.
- the UAV control unit 110 can record the captured image captured by the imaging device 220 in the memory 160, and also record the additional information related to the captured image in the memory 160.
- the UAV control unit 110 transmits the captured image stored in the memory 160 and its additional information to the terminal 80P via the communication interface 150.
- the terminal control unit 81 receives the captured image transmitted from the unmanned aircraft 100P and its additional information (T3C) via the communication unit 85.
- the terminal control unit 81 detects the point of interest of the user Up of the operation terminal 80P and stores it in the memory 87 (T4C). Further, the terminal control unit 81 receives the information including the point of interest transmitted from the other terminal 80Q via the communication unit 85, and stores it in the memory 87 (T6C). Therefore, the terminal 80P locally detects and acquires the point of interest of the user Up as the terminal 80P of the own machine, and acquires the point of interest of the user Uq of the other terminal 80Q from the other terminal 80Q.
- the terminal control unit 81 determines whether or not information of a plurality of common points of interest exists in the information of the plurality of points of interest stored in the memory 87 (T7C). When there is no information of a plurality of common points of interest, the terminal control unit 81 returns to the first step T3C of the terminal 80P.
- in step T7C, when there is information of a plurality of common points of interest, the terminal control unit 81 transmits the information of the other unmanned aircraft 100 (for example, the unmanned aerial vehicle 100P) to the terminal 80Q that transmitted the information of the common point of interest via the communication unit 85 (T8C).
- the terminal 80Q that has transmitted the information of the common point of interest receives the information presentation instruction from the terminal 80P, and superimposes a mark indicating the presence of the other unmanned aircraft 100P on the captured image displayed on the display unit 88.
- similarly, based on the information from the other terminal 80Q, the terminal control unit 81 of the terminal 80P displays, superimposed on the captured image displayed on the display unit 88, information indicating the presence of the other unmanned aircraft 100Q whose flight control is instructed by the other terminal 80Q (T10C).
- in this way, since the terminal 80P performs the information presentation instruction based on the common point of interest, the installation of the server 300 can be omitted, the configuration of the flight system 10 can be simplified, and the cost can be reduced.
- FIGS. 18A and 18B are sequence diagrams showing an information presentation instruction step from the unmanned aerial vehicle viewpoint by the terminal 80 in the second operation example of the third embodiment.
- the same steps as those in the second operation example of the first embodiment and in the first operation example of the third embodiment shown in FIG. 17 are denoted by the same step numbers, and the description thereof will be omitted or simplified.
- among the plurality of terminals 80, the terminal that performs the same operation as that performed by the server 300 in the operation examples of the first embodiment is referred to as the terminal 80P.
- Step T3D is the same processing as step T3C shown in FIG. 17.
- Step T4D is the same processing as step T4C shown in FIG. 17.
- Step T6D is the same processing as step T6C shown in FIG. 17.
- the terminal control unit 81 determines whether or not information of a plurality of common points of interest exists in the information of the plurality of points of interest stored in the memory 87 (T7D). When there is no information of a plurality of common points of interest, the terminal control unit 81 returns to step T3D.
- in step T7D, when there is information of a plurality of common points of interest, the terminal control unit 81 determines whether the distance r1 from the unmanned aircraft 100P to the other unmanned aerial vehicle 100Q is equal to or less than the threshold Dist1 (T8D).
- when the distance r1 is larger than the threshold Dist1, the terminal control unit 81 returns to step T3D.
- when the distance r1 is equal to or less than the threshold Dist1, the terminal control unit 81 determines whether or not the distance r1 is equal to or smaller than the threshold Dist2 (T9D). In the case where the distance r1 is larger than the threshold Dist2, the terminal control unit 81 recommends the low-speed flight mode and generates recommendation information for recommending the low-speed flight mode (T10D). On the other hand, when the distance r1 is equal to or less than the threshold Dist2, the terminal control unit 81 recommends a temporary stop (temporary stop mode) such as hovering, and generates recommendation information for recommending the temporary stop (T11D).
- the terminal control unit 81 transmits the recommendation information of step T10D or step T11D, via the communication unit 85, to the other terminal 80Q that instructs the flight control of the other unmanned aircraft 100Q (T12D).
- the terminal control unit 81 of the other terminal 80Q receives the recommendation information (T13D) from the terminal 80P via the communication unit 85.
- the terminal control unit 81 displays a recommendation screen including recommendation information on the display unit 88 based on the recommendation information (T14D).
- the other terminal 80Q that receives the information presentation instruction based on the common point of interest can display the recommended screen on the display unit 88. Therefore, the user Uq of the other terminal 80Q can perform manipulation by referring to the recommended screen when manipulating the other unmanned aircraft 100Q, and thus the safety of the manipulation can be improved.
- In addition, when another terminal 80Q that transmitted a common point of interest exists, the terminal control unit 81 of the terminal 80P also displays on its display unit 88 a recommendation screen including the recommendation information (T15D).
- The terminal 80P that issues the information presentation instruction based on the common point of interest can thus display the recommendation screen on the display unit 88. Therefore, the user Up of the terminal 80P can refer to the recommendation screen when maneuvering the unmanned aircraft 100P, which improves the safety of the maneuver.
- In this way, the low-speed flight mode is recommended when a plurality of unmanned aircraft 100 (for example, the unmanned aircraft 100P and 100Q) are approaching each other, and a temporary stop such as hovering is recommended when the possibility of their colliding is high. This helps the unmanned aircraft 100 avoid colliding with one another. Further, the server 300 can be omitted, which simplifies the configuration of the flight system 10 and reduces cost.
- In this way, in the terminal 80P, the terminal control unit 81 acquires the captured image GZ1 from the unmanned aircraft 100P via the communication unit 85, detects the point of interest tp1 in the captured image GZ1, and acquires the point of interest tp2 from the other terminal 80Q via the communication unit 85. The terminal control unit 81 then causes the display unit 88 to display the information to be presented on the terminal 80P (for example, information related to the other unmanned aircraft 100Q and recommendation information).
- The terminal 80P can thus perform the whole series of processing, from detecting a point of interest through determining a common point of interest to displaying information based on that determination. The terminal 80P therefore does not need a separate server 300 to instruct the information display based on the detection of points of interest and the determination of common points of interest, which simplifies the structure for displaying information based on the detection of a common point of interest.
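As an informal illustration only (not part of the disclosed embodiments), the terminal-80P-side flow just described can be summarized as a loop: receive the captured image, detect the own user's point of interest, collect the peers' points of interest, and present information when a common point of interest is found. All names below are hypothetical stand-ins; transport, gaze detection, and rendering are stubbed, and the placeholder `is_common` is replaced by a geographic test sketched later in this text.

```python
# Minimal sketch of the terminal-80P loop (steps T3C/T4C/T6C/T7C/T8C/T10C).
# Every identifier here is hypothetical; this is not an actual device API.

def is_common(a, b):
    # Placeholder identity test; a geographic version appears later in this text.
    return a == b

def presentation_loop(own_uav, gaze_detector, peer_terminals, display):
    while True:
        image, metadata = own_uav.receive_image()                 # T3C
        own_poi = gaze_detector.detect(image, metadata)           # T4C
        peer_pois = {t: t.receive_poi() for t in peer_terminals}  # T6C
        commons = [t for t, poi in peer_pois.items()
                   if is_common(own_poi, poi)]                    # T7C
        if not commons:
            continue
        for t in commons:
            t.send_other_uav_info(own_uav.info())                 # T8C
        display.overlay_markers(image,
                                [t.controlled_uav_info() for t in commons])  # T10C
```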
- (Fourth Embodiment) The third embodiment showed the case where the smartphone 80S serving as the terminal 80 instructs the control of the flight of the unmanned aircraft 100. The fourth embodiment shows the case where an HMD (Head Mounted Display) 500 serving as the terminal 80 instructs the control of the flight of the unmanned aircraft 100.
- The flight system 10 of the fourth embodiment has substantially the same configuration as that of the first embodiment, except that the terminal 80 is replaced by the HMD 500. Components identical to those of the first embodiment are denoted by the same reference numerals, and their description is omitted or simplified.
- FIG. 19 is a perspective view showing the appearance of the HMD 500 in the fourth embodiment.
- The HMD 500 has a mounting portion 510 mounted on the user's head and a main body portion 520 supported by the mounting portion 510.
- FIG. 20 is a block diagram showing the hardware configuration of the HMD 500.
- the HMD 500 includes a processing unit 521, a communication unit 522, a storage unit 523, an operation unit 524, a display unit 525, an acceleration sensor 526, an imaging unit 527, and an interface unit 528. These respective structures of the HMD 500 may be disposed on the main body portion 520.
- the processing unit 521 is configured using, for example, a CPU, an MPU, or a DSP.
- the processing unit 521 performs signal processing for integrally controlling the operation of each unit of the main body unit 520, input/output processing of data with other units, calculation processing of data, and storage processing of data.
- the processing unit 521 can acquire data and information from the unmanned aircraft 100 via the communication unit 522.
- the processing unit 521 can also acquire data and information input through the operation unit 524.
- the processing unit 521 can also acquire data and information stored in the storage unit 523.
- The processing unit 521 can send data and information, including the captured image from the unmanned aircraft 100, to the display unit 525 and cause the display unit 525 to display the resulting display information.
- the processing section 521 can execute an application for instructing control of the unmanned aircraft 100.
- the processing unit 521 can generate various data used in the application.
- The processing unit 521 can perform line-of-sight detection based on the image of the user's eyes captured by the imaging unit 527 and detect the point of interest as in the first embodiment. Further, the processing unit 521 can instruct the control of the flight of the unmanned aircraft 100 according to the result of the line-of-sight detection. That is, the processing unit 521 can maneuver the unmanned aircraft 100 with the movement of the line of sight. For example, the processing unit 521 can instruct the unmanned aircraft 100 via the communication unit 522 to fly in the direction of the geographic position or object corresponding to the on-screen position watched by the user wearing the HMD 500. The user's point of interest can thus become the destination of the unmanned aircraft 100.
- The processing unit 521 can acquire the information of the acceleration detected by the acceleration sensor 526 and instruct the control of the flight of the unmanned aircraft 100 according to the acceleration. For example, the processing unit 521 can instruct the unmanned aircraft 100 via the communication unit 522 to fly in the direction in which the head of the user wearing the HMD 500 is tilted.
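As one possible, purely illustrative reading of the head-tilt control just described (the axis conventions, scaling, and command format are assumptions, not the disclosed method), the gravity vector measured by a three-axis accelerometer such as the acceleration sensor 526 can be converted into a horizontal velocity command:

```python
# Hypothetical sketch: head tilt -> velocity command. Not an actual HMD API.
import math

def tilt_to_command(ax, ay, az, max_speed=2.0):
    """Map measured acceleration (m/s^2, gravity along +z when level) to speeds."""
    roll = math.atan2(ay, az)                        # left/right head tilt
    pitch = math.atan2(-ax, math.hypot(ay, az))      # forward/back head tilt
    cap = math.radians(30)                           # saturate at ~30 degrees
    vx = max_speed * max(-1.0, min(1.0, pitch / cap))  # forward speed
    vy = max_speed * max(-1.0, min(1.0, roll / cap))   # lateral speed
    return vx, vy

# Example: head tilted slightly forward and to the right.
print(tilt_to_command(-1.2, 0.8, 9.7))
```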
- the communication unit 522 communicates with the unmanned aircraft 100 by various wireless communication methods.
- the wireless communication method may include, for example, communication by wireless LAN, Bluetooth (registered trademark), short-range wireless communication, or a public wireless network. Further, the communication unit 522 can also communicate via wire.
- The storage unit 523 may include, for example, a ROM that stores a program defining the operation of the HMD 500 and data of set values, and a RAM that temporarily stores various information and data used when the processing unit 521 performs processing.
- the storage portion 523 may be disposed to be detachable from the HMD 500.
- the program can include an application.
- the operation unit 524 accepts data and information input by the user.
- The operation unit 524 may include buttons, keys, a touch display, a touch pad, a microphone, and the like.
- The operation unit 524 can accept operations such as tracking and tap-and-fly.
- the display unit 525 is configured by, for example, an LCD (Liquid Crystal Display), and displays various kinds of information and data output from the processing unit 521.
- the display unit 525 can display data of a captured image taken by the imaging device 220 of the unmanned aerial vehicle 100.
- the acceleration sensor 526 may be a three-axis acceleration sensor capable of detecting the attitude of the HMD 500.
- the acceleration sensor 526 can output the detected posture information to the processing unit 521 as one of the operation information.
- The imaging unit 527 captures various images. To detect the direction in which the user is looking, that is, the line of sight, the imaging unit 527 can capture the user's eyes and output the image to the processing unit 521.
- the interface unit 528 can perform input and output of information and data with an external device.
- The HMD 500 can perform the same operations as in the first to third embodiments. Therefore, even when the terminal 80 is the HMD 500, the same effects as in the first to third embodiments can be obtained. In addition, when the user wears the HMD 500, the outward field of view is largely blocked compared with the case where it is not worn, so the user can view the image with a heightened sense of immersion and enjoy instructing the flight control of the FPV flight of the unmanned aircraft 100.
- Further, by receiving the information presentation based on whether the points of interest detected by the processing unit 521 form a common point of interest, the HMD 500 can cause the display unit 525 to display information and recommendation information about unmanned aircraft 100 other than the one whose flight control it instructs. Therefore, even though the outward field of view of the HMD 500 is largely blocked, the user wearing the HMD 500 can confirm the presented information and thereby improve the operational safety of the unmanned aircraft 100 operated with the HMD 500.
- When the HMD 500 can instruct the control of the flight of the unmanned aircraft 100 based on the acceleration detected by the acceleration sensor 526, it can issue flight control instructions in the same manner as for an unmanned aircraft 100 operated with the left and right control sticks of the transmitter 50. In this case, the flight system 10 need not include the transmitter 50.
- Alternatively, the HMD 500 may not instruct the flight control of the unmanned aircraft 100 based on the acceleration detected by the acceleration sensor 526. In this case, the user can maneuver the unmanned aircraft 100 using the transmitter 50 while checking the display unit 525 of the HMD 500.
Abstract
Provided is an information processing device capable of suppressing collisions of unmanned aircraft during FPV flight. The information processing device presents information based on a user's point of interest in an image captured by a flying body, and includes a processing unit. The processing unit acquires a first point of interest that a first user, operating a first terminal instructing control of a first flying body, focuses on in a first image captured by the first flying body, and a second point of interest that a second user, operating a second terminal instructing control of a second flying body, focuses on in a second image captured by the second flying body; determines whether the first point of interest and the second point of interest are a common point of interest indicating the same point of interest; and, if they are a common point of interest, presents information related to the second flying body to the first terminal.
Description
The present disclosure relates to an information processing device, an information presentation instruction method, a program, and a recording medium that present information based on a user's point of interest in an image captured by a flying body.
Conventionally, FPV (First Person View) flight is known, in which a user operates the flight of an unmanned aircraft without visually observing it, for example by using a terminal and watching, on the terminal's display, the captured image obtained by the unmanned aircraft (see Patent Document 1).
[Prior Art Documents]
Patent Documents
Patent Document 1: Japanese Patent Application Publication No. 2016-203978
[Summary of the Invention]
[Technical Problem to Be Solved by the Invention]
When an unmanned aircraft performs FPV flight, it is difficult for a user who watches only the captured image to confirm the situation around the aircraft. For example, when a plurality of unmanned aircraft fly toward the same destination, they come closer to each other as they approach the destination, and may collide with one another.
[Technical Means for Solving the Problem]
In one aspect, an information processing device presents information based on a user's point of interest in an image captured by a flying body, and includes a processing unit. The processing unit acquires a first point of interest that a first user, operating a first terminal instructing control of a first flying body, focuses on in a first image captured by the first flying body, and a second point of interest that a second user, operating a second terminal instructing control of a second flying body, focuses on in a second image captured by the second flying body; determines whether the first and second points of interest are a common point of interest indicating the same point of interest; and, if so, presents information related to the second flying body to the first terminal.
The processing unit may determine whether the first flying body is moving and, when it is moving, present the information related to the second flying body to the first terminal.
The processing unit may acquire position information of the first flying body and of the second flying body, and present the information related to the second flying body to the first terminal when a first distance, i.e. the distance between the first flying body and the second flying body, is equal to or less than a first threshold.
The processing unit may acquire position information of the first flying body and of the second flying body, and present information indicating the presence of the second flying body at a position on the screen of the first terminal that corresponds to the position of the second flying body relative to the first flying body.
The processing unit may acquire position information of the first flying body and of the second flying body, and, when the first distance is equal to or less than the first threshold, present to the first terminal first recommendation information recommending that the flight speed of the first flying body be limited.
The processing unit may present first recommendation information recommending that the shorter the first distance, the lower the speed to which the flight of the first flying body is limited.
The processing unit may acquire position information of the common point of interest and of the first flying body, and, when a second distance, i.e. the distance between the common point of interest and the first flying body, is equal to or less than a second threshold, present to the first terminal second recommendation information recommending that the flight speed of the first flying body be limited.
The processing unit may present second recommendation information recommending that the shorter the second distance, the lower the speed to which the flight of the first flying body is limited.
The information processing device may be a server and may further include a communication unit. The processing unit may acquire the first point of interest from the first terminal and the second point of interest from the second terminal via the communication unit, and transmit the information to be presented to the first terminal via the communication unit.
The information processing device may be the first terminal and may further include a communication unit and a presentation unit. The processing unit may acquire the first image from the first flying body via the communication unit, detect the first point of interest in the first image, acquire the second point of interest from the second terminal via the communication unit, and present the information to be presented to the first terminal on the presentation unit.
In one aspect, an information presentation instruction method presents information based on a user's point of interest in an image captured by a flying body, and includes: a step of acquiring a first point of interest that a first user, operating a first terminal instructing control of a first flying body, focuses on in a first image captured by the first flying body, and a second point of interest that a second user, operating a second terminal instructing control of a second flying body, focuses on in a second image captured by the second flying body; a step of determining whether the first point of interest and the second point of interest are a common point of interest indicating the same point of interest; and a step of presenting information related to the second flying body to the first terminal if they are a common point of interest.
The method may further include a step of determining whether the first flying body is moving. The step of presenting the information related to the second flying body may include presenting it to the first terminal when the first flying body is moving.
The method may further include steps of acquiring position information of the first flying body and of the second flying body. The step of presenting the information related to the second flying body may include presenting it to the first terminal when the first distance, i.e. the distance between the first and second flying bodies, is equal to or less than a first threshold.
The method may further include steps of acquiring position information of the first flying body and of the second flying body. The step of presenting the information related to the second flying body may include presenting information indicating the presence of the second flying body at a position on the screen of the first terminal that corresponds to the position of the second flying body relative to the first flying body.
The method may further include: steps of acquiring position information of the first flying body and of the second flying body; and a step of presenting to the first terminal, when the first distance is equal to or less than the first threshold, first recommendation information recommending that the flight speed of the first flying body be limited.
The step of presenting the first recommendation information may include presenting first recommendation information recommending that the shorter the first distance, the lower the speed to which the flight of the first flying body is limited.
The method may further include: a step of acquiring position information of the common point of interest; a step of acquiring position information of the first flying body; and a step of presenting to the first terminal, when the second distance, i.e. the distance between the common point of interest and the first flying body, is equal to or less than a second threshold, second recommendation information recommending that the flight speed of the first flying body be limited.
The step of presenting the second recommendation information may include presenting second recommendation information recommending that the shorter the second distance, the lower the speed to which the flight of the first flying body is limited.
The information processing device may be a server. The step of acquiring the first and second points of interest may include receiving the first point of interest from the first terminal and the second point of interest from the second terminal. The step of presenting the information related to the second flying body may include transmitting the information to be presented to the first terminal.
The information processing device may be the first terminal. The step of acquiring the first and second points of interest may include: receiving the first image from the first flying body; detecting the first point of interest in the first image; and receiving the second point of interest from the second terminal. The step of presenting the information related to the second flying body may include presenting the information to a presentation unit included in the first terminal.
In one aspect, a program causes an information processing device, which presents information based on a user's point of interest in an image captured by a flying body, to execute the steps of the above method: acquiring the first and second points of interest, determining whether they are a common point of interest, and presenting information related to the second flying body to the first terminal if they are.
In one aspect, a recording medium is a computer-readable medium on which is recorded a program that causes an information processing device, which presents information based on a user's point of interest in an image captured by a flying body, to execute those same steps.
The above summary does not enumerate all features of the present disclosure. Subcombinations of these feature groups may also constitute an invention.
FIG. 1 is a schematic diagram showing a configuration example of the flight system in the first embodiment.
FIG. 2 is a diagram showing an example of the specific appearance of the unmanned aircraft.
FIG. 3 is a block diagram showing an example of the hardware configuration of the unmanned aircraft.
FIG. 4 is a perspective view showing an example of the appearance of a terminal on which the transmitter is mounted.
FIG. 5 is a block diagram showing an example of the hardware configuration of the transmitter.
FIG. 6A is a block diagram showing an example of the hardware configuration of the terminal.
FIG. 6B is a block diagram showing the hardware configuration of the server.
FIG. 7 is a sequence diagram showing the information presentation instruction procedure performed by the server in the first operation example.
FIG. 8 is a diagram showing a detection example of a point of interest.
FIG. 9A is a diagram showing the captured images displayed on the respective displays when the points of interest of two users are a common point of interest.
FIG. 9B is a diagram showing the captured images GZ1 and GZ2 displayed on the displays of the respective terminals when the points of interest of two users are not a common point of interest.
FIG. 10 is a diagram showing the positional relationship between two unmanned aircraft.
FIG. 11A is a diagram showing the captured image of an unmanned aircraft displayed on the display of a terminal.
FIG. 11B is a diagram showing the captured image of an unmanned aircraft displayed on the display of another terminal.
FIG. 12A is a sequence diagram showing the information presentation instruction procedure from the unmanned aircraft viewpoint performed by the server in the second operation example.
FIG. 12B is a sequence diagram, continuing from FIG. 12A, of the same procedure.
FIG. 13 is a diagram showing the spaces of the thresholds set for the distance between two unmanned aircraft.
FIG. 14A is a diagram showing the recommendation screen shown on the display when the distance is within a threshold.
FIG. 14B is a diagram showing the recommendation screen shown on the display when the distance is within a threshold.
FIG. 15A is a diagram showing a situation in which unmanned aircraft are maneuvered by visual observation.
FIG. 15B is a diagram showing a situation in which unmanned aircraft are maneuvered in the FPV flight mode.
FIG. 16A is a sequence diagram showing the information presentation instruction procedure from the destination viewpoint performed by the server in the second embodiment.
FIG. 16B is a sequence diagram, continuing from FIG. 16A, of the same procedure.
FIG. 17 is a sequence diagram showing the information presentation instruction procedure from the unmanned aircraft viewpoint performed by the terminal in the first operation example of the third embodiment.
FIG. 18A is a sequence diagram showing the information presentation instruction procedure from the unmanned aircraft viewpoint performed by the terminal in the second operation example of the third embodiment.
FIG. 18B is a sequence diagram, continuing from FIG. 18A, of the same procedure.
FIG. 19 is a perspective view showing the appearance of the head mounted display in the fourth embodiment.
FIG. 20 is a block diagram showing the hardware configuration of the head mounted display.
The present disclosure will be described below through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Not all combinations of the features described in the embodiments are necessarily essential to the solution of the invention.
The claims, specification, drawings, and abstract contain matters subject to copyright protection. The copyright holder does not object to reproduction of these documents as they appear in the files or records of the Patent Office. In all other cases, all copyrights are reserved.
In the following embodiments, an unmanned aerial vehicle (UAV) is taken as an example of a flying body. An unmanned aerial vehicle includes an aircraft that moves in the air. In the drawings of this specification, the unmanned aerial vehicle is also labeled "UAV". The information processing device is exemplified by a server, a terminal, or the like. The information presentation instruction method defines operations in the information processing device. The recording medium records a program (for example, a program that causes the information processing device to execute various processing).
(First Embodiment)
FIG. 1 is a schematic diagram showing a configuration example of the flight system 10 in the first embodiment. The flight system 10 includes a plurality of unmanned aircraft 100, a transmitter 50, a plurality of terminals 80, and a server 300. The unmanned aircraft 100, the transmitter 50, the terminals 80, and the server 300 can communicate with one another by wired or wireless communication (for example, a wireless LAN (Local Area Network)). The terminal 80 can communicate with the server 300 by wired or wireless communication. In FIG. 1, unmanned aircraft 100A and 100B are shown as the plurality of unmanned aircraft 100, and terminals 80A and 80B as the plurality of terminals 80.
Next, a configuration example of the unmanned aircraft 100 will be described. FIG. 2 is a diagram showing an example of the specific appearance of the unmanned aircraft 100, as a perspective view of the unmanned aircraft 100 flying in the movement direction STV0. The unmanned aircraft 100 is an example of a flying body.
As shown in FIG. 2, a roll axis (see the x-axis) is defined in the direction parallel to the ground and along the movement direction STV0. A pitch axis (see the y-axis) is set parallel to the ground and perpendicular to the roll axis, and a yaw axis (see the z-axis) is set perpendicular to the ground and perpendicular to both the roll and pitch axes.
The unmanned aircraft 100 includes a UAV body 102, a gimbal 200, an imaging device 220, and a plurality of imaging devices 230. The UAV body 102 is an example of a housing of the unmanned aircraft 100. The imaging devices 220 and 230 are examples of an imaging unit.
The UAV body 102 includes a plurality of rotors (propellers) and makes the unmanned aircraft 100 fly by controlling their rotation, for example using four rotors. The number of rotors is not limited to four. The unmanned aircraft 100 may also be a fixed-wing aircraft without rotors.
The imaging device 220 is an imaging camera that captures subjects included in a desired imaging range (for example, the sky above an imaging target, scenery such as mountains and rivers, or buildings on the ground).
The plurality of imaging devices 230 may be sensing cameras that capture the surroundings of the unmanned aircraft 100 in order to control its flight. Two imaging devices 230 may be provided on the nose, i.e. the front, of the unmanned aircraft 100, and two others on its bottom. The two front imaging devices 230 may be paired to function as a so-called stereo camera, and likewise the two bottom ones. Three-dimensional spatial data around the unmanned aircraft 100 can be generated from the images captured by the plurality of imaging devices 230. The number of imaging devices 230 included in the unmanned aircraft 100 is not limited to four; at least one suffices, and at least one may be provided on each of the nose, tail, sides, bottom, and top. The angle of view settable on the imaging devices 230 may be larger than that of the imaging device 220, and the imaging devices 230 may have single-focus lenses or fisheye lenses.
FIG. 3 is a block diagram showing an example of the configuration of the unmanned aircraft 100. The unmanned aircraft 100 includes a UAV control unit 110, a communication interface 150, a memory 160, the gimbal 200, a rotor mechanism 210, the imaging device 220, the imaging devices 230, a GPS receiver 240, an inertial measurement unit (IMU) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser rangefinder 290. The communication interface 150 is an example of a communication unit.
The UAV control unit 110 is configured using, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor). The UAV control unit 110 performs signal processing for overall control of the operation of each part of the unmanned aircraft 100, input/output processing of data with the other parts, arithmetic processing of data, and storage processing of data.
The UAV control unit 110 controls the flight of the unmanned aircraft 100 in accordance with a program stored in the memory 160, and in accordance with instructions received from the remote transmitter 50 through the communication interface 150. The memory 160 may be removable from the unmanned aircraft 100.
The UAV control unit 110 can identify the environment around the unmanned aircraft 100 by analyzing a plurality of images captured by the plurality of imaging devices 230, and controls the flight according to that environment, for example avoiding obstacles.
The UAV control unit 110 acquires date information indicating the current date. It may acquire this from the GPS receiver 240 or from a timer (not shown) mounted on the unmanned aircraft 100.
The UAV control unit 110 acquires position information indicating the position of the unmanned aircraft 100. It may acquire, from the GPS receiver 240, position information indicating the latitude, longitude, and altitude at which the unmanned aircraft 100 is located, or acquire latitude/longitude information from the GPS receiver 240 and altitude information from the barometric altimeter 270 as the position information. It may also use, as altitude information, the distance between the emission point and the reflection point of the ultrasonic waves generated by the ultrasonic sensor 280.
The UAV control unit 110 acquires, from the magnetic compass 260, orientation information indicating the orientation of the unmanned aircraft 100, for example the azimuth corresponding to the orientation of its nose.
The UAV control unit 110 may acquire position information indicating the position at which the unmanned aircraft 100 should be located when the imaging device 220 captures the intended imaging range. This information may be acquired from the memory 160, or from another device such as the transmitter 50 through the communication interface 150. The UAV control unit 110 may also refer to a three-dimensional map database to specify a position where the unmanned aircraft 100 can be located in order to capture the intended imaging range, and acquire that position as the position information indicating where the unmanned aircraft 100 should be.
The UAV control unit 110 acquires imaging information indicating the respective imaging ranges of the imaging devices 220 and 230. As parameters for specifying an imaging range, it acquires angle-of-view information from the imaging devices 220 and 230 and information indicating their imaging directions. As information indicating the imaging direction of the imaging device 220, it acquires, for example, attitude information of the imaging device 220 from the gimbal 200; this attitude is expressed as the rotation angles of the gimbal 200 from the reference rotation angles of the pitch and yaw axes. The UAV control unit 110 also acquires information indicating the orientation of the unmanned aircraft 100 and position information indicating where it is located, as parameters for specifying the imaging range. It can then delimit the geographic range captured by the imaging device 220 from the angles of view and imaging directions of the imaging devices 220 and 230 and the position of the unmanned aircraft 100, and generate imaging information indicating the imaging range, thereby acquiring the imaging information.
The UAV control unit 110 may acquire imaging information indicating the imaging range that the imaging device 220 is to capture, either from the memory 160 or from another device such as the transmitter 50 through the communication interface 150.
The UAV control unit 110 may acquire stereoscopic information (three-dimensional information) indicating the stereoscopic shapes of objects present around the unmanned aircraft 100. An object is, for example, part of a landscape such as a building, road, vehicle, or tree, and the stereoscopic information is, for example, three-dimensional spatial data. The UAV control unit 110 may generate this information from the images obtained by the plurality of imaging devices 230, or acquire it by referring to a three-dimensional map database stored in the memory 160 or managed by a server on a network.
The UAV control unit 110 acquires the image data captured by the imaging devices 220 and 230.
The UAV control unit 110 controls the gimbal 200, the rotor mechanism 210, and the imaging devices 220 and 230. It controls the imaging range of the imaging device 220 by changing its imaging direction or angle of view, and controls the imaging range of the gimbal-supported imaging device 220 by controlling the rotation mechanism of the gimbal 200.
In this specification, an imaging range is the geographic range captured by the imaging device 220 or 230, defined by latitude, longitude, and altitude; it may be a range in three-dimensional spatial data defined by latitude, longitude, and altitude. The imaging range is specified from the angle of view and imaging direction of the imaging device 220 or 230 and the position of the unmanned aircraft 100. The imaging direction of the imaging devices 220 and 230 is defined by the azimuth and depression angle faced by the front surface on which the imaging lens is provided. The imaging direction of the imaging device 220 is specified from the azimuth of the nose of the unmanned aircraft 100 and the attitude of the imaging device 220 relative to the gimbal 200, while that of an imaging device 230 is specified from the azimuth of the nose and the position where the imaging device 230 is installed.
The UAV control unit 110 controls the flight of the unmanned aircraft 100 by controlling the rotor mechanism 210; that is, it controls the position of the unmanned aircraft 100, including latitude, longitude, and altitude. By controlling the flight it can control the imaging ranges of the imaging devices 220 and 230. It can control the angle of view of the imaging device 220 by controlling its zoom lens, or by digital zoom using the digital zoom function of the imaging device 220.
When the imaging device 220 is fixed to the unmanned aircraft 100 and cannot be moved, or when it has no zoom function and its angle of view cannot be changed, the UAV control unit 110 can make the imaging device 220 capture the desired imaging range under the desired environment by moving the unmanned aircraft 100 to a specific position on a specific date.
The UAV control unit 110 can set the flight mode of the unmanned aircraft 100. Flight modes include, for example, a normal flight mode, a low-speed flight mode, and a temporary stop mode, and the information of the set flight mode may be stored in the memory 160. The normal flight mode allows flight without a speed limit. The low-speed flight mode prohibits flight at or above a prescribed speed, so that flight is speed-limited. The temporary stop mode prohibits movement of the unmanned aircraft 100 and allows hovering.
The UAV control unit 110 attaches, to a captured image taken by the imaging device 220, information related to that image as additional information (an example of metadata). The additional information may include various parameters: parameters related to the flight of the unmanned aircraft 100 at the time of imaging (flight parameters) and information related to the imaging by the imaging device 220 at the time of imaging (imaging parameters). The flight parameters may include at least one of imaging position information, imaging path information, imaging time information, and other information. The imaging parameters may include at least one of imaging angle-of-view information, imaging direction information, imaging attitude information, imaging range information, and subject distance information.
The imaging path information indicates the path along which the captured image was taken (the imaging path); it is information on the path flown by the unmanned aircraft 100 during imaging and may consist of a set of successively connected imaging positions. An imaging position may be based on the position acquired by the GPS receiver 240. The imaging time information indicates the time at which the captured image was taken (the imaging time) and may be based on the time information of the timer referred to by the UAV control unit 110.
The imaging angle-of-view information indicates the angle of view of the imaging device 220 when the captured image was taken; the imaging direction information indicates its imaging direction at that time; the imaging attitude information indicates its attitude at that time; and the imaging range information indicates its imaging range at that time. The subject distance information indicates the distance from the imaging device 220 to the subject and may be based on detection information measured by the ultrasonic sensor 280 or the laser rangefinder 290.
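As a purely illustrative data-structure sketch of the additional information just listed (the field names and types are assumptions for illustration, not an actual format of the aircraft):

```python
# Hypothetical layout of the additional information (metadata) attached to a
# captured image, following the flight/imaging parameters described above.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Position = Tuple[float, float, float]  # (latitude, longitude, altitude)

@dataclass
class AdditionalInfo:
    # Flight parameters
    imaging_position: Optional[Position] = None
    imaging_path: List[Position] = field(default_factory=list)  # successive positions
    imaging_time: Optional[str] = None                # e.g. an ISO 8601 timestamp
    # Imaging parameters
    angle_of_view_deg: Optional[float] = None
    imaging_direction: Optional[Tuple[float, float]] = None  # (azimuth, depression)
    imaging_attitude: Optional[Tuple[float, float]] = None   # gimbal pitch/yaw angles
    imaging_range: Optional[Tuple[Position, Position]] = None  # range corners
    subject_distance_m: Optional[float] = None
```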
The communication interface 150 communicates with the transmitter 50, the terminal 80, and the server 300. It receives various commands and information for the UAV control unit 110 from the remote transmitter 50, and can transmit captured images and their related additional information to the terminal 80.
The memory 160 stores the programs and the like needed by the UAV control unit 110 to control the gimbal 200, the rotor mechanism 210, the imaging devices 220 and 230, the GPS receiver 240, the inertial measurement unit 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser rangefinder 290. The memory 160 may be a computer-readable recording medium and may include at least one of flash memories such as an SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), and USB memory. The memory 160 may be provided inside the UAV body 102 and may be set to be removable from it.
The gimbal 200 rotatably supports the imaging device 220 about at least one axis. It may rotatably support the imaging device 220 about the yaw, pitch, and roll axes, and may change the imaging direction of the imaging device 220 by rotating it about at least one of these axes.
The rotor mechanism 210 has a plurality of rotors 211, a plurality of drive motors 212 that rotate the rotors 211, and a current sensor 213 that measures the current value (measured value) of the drive current for the drive motors 212. The drive current is supplied to the drive motors 212.
The imaging device 220 captures a subject in the desired imaging range and generates captured image data, which is stored in the memory of the imaging device 220 or in the memory 160.
The imaging devices 230 capture the surroundings of the unmanned aircraft 100 and generate captured image data, which is stored in the memory 160.
The GPS receiver 240 receives a plurality of signals indicating the times transmitted from a plurality of navigation satellites (i.e. GPS satellites) and the positions (coordinates) of the respective satellites, and calculates from the received signals the position of the GPS receiver 240 (i.e. the position of the unmanned aircraft 100). The GPS receiver 240 outputs the position information of the unmanned aircraft 100 to the UAV control unit 110. The position calculation may instead be performed by the UAV control unit 110 in place of the GPS receiver 240; in that case, the information on the times and the positions of the GPS satellites contained in the received signals is input to the UAV control unit 110.
The inertial measurement unit 250 detects the attitude of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110. It detects, as the attitude, the accelerations in the three axial directions of front/rear, left/right, and up/down, and the angular velocities about the three axes of pitch, roll, and yaw.
The magnetic compass 260 detects the azimuth of the nose of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
The barometric altimeter 270 detects the flight altitude of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
The ultrasonic sensor 280 emits ultrasonic waves, detects the waves reflected by the ground or objects, and outputs the detection result to the UAV control unit 110. The detection result can indicate the distance from the unmanned aircraft 100 to the ground, i.e. the altitude, or the distance to an object.
The laser rangefinder 290 irradiates an object with laser light, receives the light reflected by the object, and measures the distance between the unmanned aircraft 100 and the object from the reflected light. Time-of-flight is one example of a laser-based distance measurement method.
Next, configuration examples of the transmitter 50 and the terminal 80 will be described. FIG. 4 is a perspective view showing an example of the appearance of a terminal 80 on which the transmitter 50 is mounted. In FIG. 4, a smartphone 80S is shown as an example of the terminal 80. The up, down, front, rear, left, and right directions relative to the transmitter 50 follow the arrows shown in FIG. 4. The transmitter 50 is used, for example, while held with both hands by the person using it (hereinafter, the "operator").
The transmitter 50 has a resin housing 50B with a roughly box-like (roughly rectangular parallelepiped) shape, having a roughly square bottom surface and a height shorter than one side of the bottom. A left control stick 53L and a right control stick 53R protrude from roughly the center of the housing surface of the transmitter 50.
The left control stick 53L and the right control stick 53R are used in the operator's operations for remotely controlling the movement of the unmanned aircraft 100 (for example, forward/backward movement, left/right movement, up/down movement, and orientation changes) (movement control operations). In FIG. 4, the control sticks are shown in the initial positions where no external force is applied by the operator's hands; they automatically return to predetermined positions (for example, the initial positions shown in FIG. 4) after the external force applied by the operator is released.
A power button B1 of the transmitter 50 is disposed on the near side (in other words, the operator's side) of the left control stick 53L. When the operator presses the power button B1 once, the remaining capacity of the battery (not shown) built into the transmitter 50 is shown on a battery level indicator L2. When the operator presses it again, the transmitter 50 is powered on, and power can be supplied to each part of the transmitter 50 for use.
An RTH (Return To Home) button B2 is disposed on the near side (in other words, the operator's side) of the right control stick 53R. When the operator presses the RTH button B2, the transmitter 50 transmits to the unmanned aircraft 100 a signal for automatically returning it to a predetermined position. The transmitter 50 can thereby make the unmanned aircraft 100 automatically return to the predetermined position (for example, the takeoff position stored by the unmanned aircraft 100). The RTH button B2 can be used, for example, when the operator loses sight of the aircraft during outdoor imaging with the unmanned aircraft 100, or when operation becomes impossible due to radio interference or an unexpected failure.
A remote status indicator L1 and the battery level indicator L2 are disposed on the near side (in other words, the operator's side) of the power button B1 and the RTH button B2. The remote status indicator L1 is composed of, for example, an LED (Light Emitting Diode) and shows the wireless connection status between the transmitter 50 and the unmanned aircraft 100. The battery level indicator L2 is also composed of, for example, an LED and shows the remaining capacity of the battery (not shown) built into the transmitter 50.
Two antennas AN1 and AN2 protrude from the rear side surface of the housing 50B of the transmitter 50, behind the left control stick 53L and the right control stick 53R. The antennas AN1 and AN2 transmit to the unmanned aircraft 100 the signal generated by the transmitter control unit 61 according to the operator's operation of the sticks (i.e. the signal for controlling the movement of the unmanned aircraft 100); this signal is one of the operation input signals input by the transmitter 50. The antennas AN1 and AN2 can cover a transmission/reception range of, for example, 2 km. They can also receive the images captured by the imaging device 220 of the wirelessly connected unmanned aircraft 100, and the various data acquired by the unmanned aircraft 100, when these are transmitted from it.
Although the transmitter 50 in FIG. 4 does not include a display, it may include one.
The terminal 80 may be mounted on a holder HLD, which in turn may be attached to the transmitter 50, so that the terminal 80 is mounted on the transmitter 50 via the holder HLD. The terminal 80 and the transmitter 50 may be connected by a wired cable (for example, a USB cable). Alternatively, the terminal 80 may not be mounted on the transmitter 50, and the terminal 80 and the transmitter 50 may be provided separately.
FIG. 5 is a block diagram showing an example of the hardware configuration of the transmitter 50. The transmitter 50 includes the left control stick 53L, the right control stick 53R, a transmitter control unit 61, a wireless communication unit 63, an interface unit 65, the power button B1, the RTH button B2, an operation unit group OPS, the remote status indicator L1, the battery level indicator L2, and a display unit DP. The transmitter 50 is an example of an operation device that instructs the control of the unmanned aircraft 100.
The left control stick 53L is used, for example, by the operator's left hand for operations to remotely control the movement of the unmanned aircraft 100, and the right control stick 53R by the operator's right hand. The movement of the unmanned aircraft 100 is any one, or a combination, of forward movement, backward movement, leftward movement, rightward movement, ascent, descent, leftward rotation, and rightward rotation; the same applies below.
When the power button B1 is pressed once, a signal indicating one press is input to the transmitter control unit 61, which then shows the remaining capacity of the battery (not shown) built into the transmitter 50 on the battery level indicator L2. The operator can thus easily check the remaining battery capacity. When the power button B1 is pressed twice, a signal indicating two presses is passed to the transmitter control unit 61, which then instructs the built-in battery (not shown) to supply power to each part of the transmitter 50. The operator can thus power on the transmitter 50 and easily start using it.
When the RTH button B2 is pressed, a signal indicating the press is input to the transmitter control unit 61, which generates a signal for automatically returning the unmanned aircraft 100 to a predetermined position (for example, its takeoff position) and transmits it to the unmanned aircraft 100 through the wireless communication unit 63 and the antennas AN1 and AN2. The operator can thus make the unmanned aircraft 100 automatically return to the predetermined position by a simple operation of the transmitter 50.
The operation unit group OPS consists of a plurality of operation units OP (for example, operation units OP1, ..., OPn, where n is an integer of 2 or more). It consists of operation units other than the left control stick 53L, the right control stick 53R, the power button B1, and the RTH button B2 shown in FIG. 3 (for example, various operation units that assist the remote control of the unmanned aircraft 100 by the transmitter 50). These correspond, for example, to a button for instructing still-image capture by the imaging device 220 of the unmanned aircraft 100, a button for instructing the start and end of video recording by the imaging device 220, a dial for adjusting the tilt of the gimbal 200 (see FIG. 2) of the unmanned aircraft 100 in the tilt direction, a button for switching the flight mode of the unmanned aircraft 100, and a dial for configuring the imaging device 220 of the unmanned aircraft 100.
Since the remote status indicator L1 and the battery level indicator L2 were described with reference to FIG. 4, their description is omitted here.
The transmitter control unit 61 is composed of a processor (for example, a CPU, MPU, or DSP). It performs signal processing for overall control of the operation of each part of the transmitter 50, input/output processing of data with the other parts, arithmetic processing of data, and storage processing of data. The transmitter control unit 61 is an example of a processing unit.
The transmitter control unit 61 can acquire, through the wireless communication unit 63, the data of the images captured by the imaging device 220 of the unmanned aircraft 100, store it in a memory (not shown), and output it to the terminal 80 through the interface unit 65. In other words, the transmitter control unit 61 can cause the terminal 80 to display the captured image data taken by the imaging device 220 of the unmanned aircraft 100. The captured images taken by the imaging device 220 of the unmanned aircraft 100 can thus be displayed on the terminal 80.
The transmitter control unit 61 can generate, through the operator's operation of the left control stick 53L and the right control stick 53R, an instruction signal for controlling the flight of the unmanned aircraft 100 designated by that operation, and can remotely control the unmanned aircraft 100 by transmitting this instruction signal through the wireless communication unit 63 and the antennas AN1 and AN2. The transmitter 50 can thereby remotely control the movement of the unmanned aircraft 100.
The wireless communication unit 63 is connected to the two antennas AN1 and AN2, and transmits and receives information and data to and from the unmanned aircraft 100 through them using a predetermined wireless communication method (for example, wireless LAN).
The interface unit 65 inputs and outputs information and data between the transmitter 50 and the terminal 80. It may be, for example, a USB port (not shown) provided on the transmitter 50, or an interface other than a USB port.
FIG. 6A is a block diagram showing an example of the hardware configuration of the terminal 80. The terminal 80 may include a terminal control unit 81, an interface unit 82, an operation unit 83, a communication unit 85, a memory 87, a display unit 88, and an imaging unit 89. The display unit 88 is an example of a presentation unit.
The terminal control unit 81 is configured using, for example, a CPU, MPU, or DSP. It performs signal processing for overall control of the operation of each part of the terminal 80, input/output processing of data with the other parts, arithmetic processing of data, and storage processing of data.
The terminal control unit 81 can acquire data and information from the unmanned aircraft 100 via the communication unit 85, for example captured images and their additional information. It can also acquire data and information from the transmitter 50 through the interface unit 82, data and information input through the operation unit 83, and data and information stored in the memory 87. It can send data and information to the display unit 88 and cause the display unit 88 to display the resulting display information.
The terminal control unit 81 may acquire the position information of the unmanned aircraft 100 directly from the unmanned aircraft 100 via the communication unit 85, or as the imaging position information contained in the additional information. It may acquire the position information of the unmanned aircraft 100 successively and calculate from it the speed and direction of its movement. The information on the position, speed, and movement direction of the unmanned aircraft 100 may be included in the additional information and reported to the server 300 and the like.
The terminal control unit 81 can execute an application for instructing the control of the unmanned aircraft 100, and can generate various data used in the application.
The terminal control unit 81 can acquire the captured image from the unmanned aircraft 100 and cause the display unit 88 to display it.
The terminal control unit 81 can acquire an image including the area around the user's eyes (user image) captured by the imaging unit 89. The user image may be an image taken while the user watches the display unit 88 on which the captured image from the unmanned aircraft 100 is shown. The terminal control unit 81 detects the user's eyes (for example, the pupils) by performing image recognition (for example, segmentation processing and object recognition processing) on the captured image.
The terminal control unit 81 can detect the point of interest that the user operating the terminal 80 focuses on in the captured image shown on the display unit 88. In this case, the terminal control unit 81 can use a well-known line-of-sight detection technique to acquire the coordinates of the point of interest, i.e. the on-screen position (gaze detection position) that the user is gazing at on the display unit 88 showing the captured image. In other words, the terminal control unit 81 can identify which position of the displayed captured image the user's eyes are looking at.
The terminal control unit 81 can acquire the imaging range information contained in the additional information related to the captured image from the unmanned aircraft 100; that is, based on the imaging range information from the unmanned aircraft 100, it can specify the geographic imaging range as a range on a map. The terminal control unit 81 can detect which position of the displayed captured image the coordinates of the point of interest correspond to, and which position in the geographic imaging range represented by the image that position corresponds to. A designated position included in the geographic imaging range can thereby be detected as the point of interest.
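As a purely illustrative sketch of the mapping just described, on-screen gaze coordinates can be interpolated into the geographic imaging range. The sketch assumes the range is given as an axis-aligned (latitude, longitude) rectangle, which is a simplification of the imaging range information described above; the function name is hypothetical.

```python
# Hypothetical mapping from a gaze position on the displayed image to a
# geographic position inside the imaging range.

def gaze_to_geo(gx, gy, width, height, range_sw, range_ne):
    """gx, gy: gaze pixel coordinates; range_sw/range_ne: (lat, lon) corners."""
    u = gx / width            # 0.0 (left) .. 1.0 (right)
    v = 1.0 - gy / height     # screen y grows downward; latitude grows upward
    lat = range_sw[0] + v * (range_ne[0] - range_sw[0])
    lon = range_sw[1] + u * (range_ne[1] - range_sw[1])
    return lat, lon

# Example: gaze at the center of a 1920x1080 image maps to the range center.
print(gaze_to_geo(960, 540, 1920, 1080, (35.0, 139.0), (35.01, 139.01)))
```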
The terminal control unit 81 can communicate, via the communication unit 85, with an external map server having a map database and detect what kind of object exists on the map at the geographic position corresponding to the point of interest. A designated object included in the geographic imaging range can thereby be detected as the point of interest. The memory 87 may also hold the map database held by the map server.
The terminal control unit 81 can recognize each object in the captured image from the unmanned aircraft 100 by performing image recognition (for example, segmentation processing and object recognition processing) on it. In this case, the objects appearing in the image at the time of shooting can be recognized even when, for example, the information in the map database is outdated.
The information of a point of interest may be position information (latitude and longitude, or latitude, longitude, and altitude), or information on an object identified by a proper name such as "XX Tower". The object information may include, in addition to a proper name such as "XX Tower", other object information and position information. The method of detecting the point of interest described here is one example; the point of interest may be detected by other methods.
The interface unit 82 inputs and outputs information and data between the transmitter 50 and the terminal 80. It may be, for example, a USB connector (not shown) provided on the terminal 80, or an interface other than a USB connector.
The operation unit 83 accepts data and information input by the operator of the terminal 80. It may include buttons, keys, a touch display, a microphone, and the like. Here it is mainly assumed that the operation unit 83 and the display unit 88 are constituted by a touch display, in which case the operation unit 83 can accept touch, tap, and drag operations.
The communication unit 85 communicates with the unmanned aircraft 100 by various wireless communication methods, which may include, for example, communication by wireless LAN, Bluetooth (registered trademark), short-range wireless communication, or a public wireless network. The communication unit 85 may also perform wired communication.
The memory 87 may include, for example, a ROM storing a program defining the operation of the terminal 80 and set-value data, and a RAM temporarily storing various information and data used when the terminal control unit 81 performs processing. The memory 87 may include memories other than the ROM and RAM, may be provided inside the terminal 80, and may be set to be removable from the terminal 80. The program may include an application.
The display unit 88 is composed of, for example, an LCD (Liquid Crystal Display) and displays various information and data output from the terminal control unit 81, including the data of the captured images taken by the imaging device 220 of the unmanned aircraft 100.
The imaging unit 89 includes an image sensor and captures images. It may be provided on the front side, where the display unit 88 is located. It can capture an image (user image) whose subject includes the area around the eyes of the user watching the image shown on the display unit 88, and output the user image to the terminal control unit 81. The imaging unit 89 may also capture images other than user images.
FIG. 6B is a block diagram showing the hardware configuration of the server 300. The server 300 is an example of an information processing device and has a server control unit 310, a communication unit 320, a memory 340, and a storage 330.
The server control unit 310 is configured using, for example, a CPU, MPU, or DSP. It performs signal processing for overall control of the operation of each part of the server 300, input/output processing of data with the other parts, arithmetic processing of data, and storage processing of data.
The server control unit 310 can acquire data and information from the unmanned aircraft 100 and from the terminals 80 via the communication unit 320. It can execute an application for instructing the control of the unmanned aircraft 100 and generate various data used in the application.
The server control unit 310 performs processing related to the information presentation instruction for avoiding collisions of the unmanned aircraft 100, and presents information based on a user's point of interest in an image captured by an unmanned aircraft 100.
The server control unit 310 may acquire the position information of each unmanned aircraft 100 directly from the unmanned aircraft 100 via the communication unit 320, or from each terminal 80 as the imaging position information contained in the additional information. It may acquire the position information of an unmanned aircraft 100 successively and calculate from it the speed and direction of its movement. It may also acquire, from each terminal 80 via the communication unit 320, the information on the position, speed, and movement direction of the unmanned aircraft 100 contained in the additional information.
The memory 340 may include, for example, a ROM storing a program defining the operation of the server 300 and set-value data, and a RAM temporarily storing various information and data used when the server control unit 310 performs processing. The memory 340 may include memories other than the ROM and RAM, may be provided inside the server 300, and may be set to be removable from the server 300. The program may include an application.
The communication unit 320 can communicate with other devices (for example, the transmitter 50, the terminal 80, and the unmanned aircraft 100) by wire or wirelessly. The storage 330 is a large-capacity recording medium capable of storing captured images, map information, and the like.
(First operation example of the first embodiment)
FIG. 7 is a sequence diagram showing the information presentation instruction procedure performed by the server 300 in the first operation example. In the present embodiment, it is assumed that the unmanned aircraft 100 performs FPV flight using the transmitter 50 and the terminal 80. In FPV flight, the operator of the transmitter 50 and the terminal 80 maneuvers the unmanned aircraft 100 without visually observing it, for example while watching the captured image of the unmanned aircraft 100 shown on the display unit 88 of the terminal 80.
Here, it is assumed that a plurality of users operate transmitters 50 and terminals 80 instructing the control of the flight of unmanned aircraft 100. For example, a user Ua (user U1) operates the transmitter 50 and terminal 80 that instruct the control of the flight of the unmanned aircraft 100A, and a user Ub (user U2) operates the transmitter 50 and terminal 80 that instruct the control of the flight of the other unmanned aircraft 100B.
The parts of the transmitter 50, terminal 80, and unmanned aircraft 100 operated by user A are denoted with "A" appended to the reference numerals (for example, terminal 80A, display unit 88A, unmanned aircraft 100A), and the parts of the transmitter 50 and terminal 80 operated by user B with "B" (for example, terminal 80B, display unit 88B, unmanned aircraft 100B). There may be a plurality of unmanned aircraft 100B, terminals 80B, and users Ub.
During flight, in the unmanned aircraft 100 (for example, the unmanned aircraft 100A), the imaging device 220 repeatedly captures images (T1). The UAV control unit 110 may record the captured images in the memory 160, together with the additional information related to them. The UAV control unit 110 transmits the captured image and its additional information stored in the memory 160 to the terminal 80 via the communication interface 150 (T2).
In the terminal 80 (for example, the terminal 80A), the terminal control unit 81 receives, via the communication unit 85, the captured image and its additional information transmitted from the unmanned aircraft 100A (T3). The terminal control unit 81 displays the captured image on the display unit 88 and detects the point of interest that the user operating the terminal 80 focuses on in the displayed captured image (T4).
FIG. 8 is a diagram showing a detection example of a point of interest. In this example, the captured image GZ1 taken by the imaging device 220 is shown on the display unit 88. For the captured image GZ1 shown on the display unit 88, the terminal control unit 81 determines by line-of-sight detection that the tower J1 is the user's gaze position and detects the point of interest tp1.
Returning to FIG. 7, the terminal control unit 81 transmits the information of the point of interest to the server 300 via the communication unit 85 (T5). In addition to the point-of-interest information, the terminal control unit 81 may also transmit, via the communication unit 85, at least part of the additional information that the terminal 80 acquired from the unmanned aircraft 100.
In the server 300, the server control unit 310 receives, via the communication unit 320, the information transmitted from the terminal 80 (for example, the point-of-interest information) and stores it in the storage 330 (T6). Besides the unmanned aircraft 100A, the server 300 likewise receives information (for example, point-of-interest information) for the other unmanned aircraft 100B and stores it in the storage 330.
The server control unit 310 determines whether, among the information of one or more points of interest stored in the storage 330, there exists information of a plurality of points of interest whose indicated position information or object is the same (common) (T7). A shared point of interest, i.e. one whose position information or object is common, is also called a common point of interest.
FIG. 9A is a diagram showing the captured images GZ1 and GZ2 shown on the respective display units 88 when the points of interest of two users (operators) are a common point of interest. The two users may be the users U1 and U2.
The captured images GZ1 and GZ2 are images captured by the imaging devices 220 of the unmanned aircraft 100A and 100B in different imaging directions, and each includes the tower J1, the bridge J2, and the building J3. In FIG. 9A, the points of interest tp1 and tp2 are both the tower J1, so they are a common point of interest.
FIG. 9B is a diagram showing the captured images GZ1 and GZ2 respectively shown on the display units 88A and 88B of the terminals when the points of interest of the two users are not a common point of interest. In FIG. 9B, the point of interest tp1 for the captured image GZ1 is the tower J1, whereas the point of interest tp2 for the captured image GZ2 is the bridge. The points of interest tp1 and tp2 are therefore not a common point of interest.
When the point-of-interest information is geographic position information rather than object information such as a tower, the two points of interest may be determined to be a common point of interest when, for example, the distance between them is less than a threshold.
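As a purely illustrative sketch of the common-point-of-interest test just described: named objects are compared by identity, while bare geographic positions are considered common when closer than a threshold. The 50 m threshold and the dictionary layout are assumptions for illustration, not values from the disclosure.

```python
# Hypothetical common-point-of-interest test.
import math

def haversine_m(p1, p2):
    """Great-circle distance in meters between (lat, lon) points."""
    r = 6371000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def is_common(poi1, poi2, threshold_m=50.0):
    # poi = {"name": str or None, "position": (lat, lon) or None}
    if poi1.get("name") and poi2.get("name"):
        return poi1["name"] == poi2["name"]          # same named object
    if poi1.get("position") and poi2.get("position"):
        return haversine_m(poi1["position"], poi2["position"]) < threshold_m
    return False

print(is_common({"name": "XX Tower"}, {"name": "XX Tower"}))                    # True
print(is_common({"position": (35.0, 139.0)}, {"position": (35.0001, 139.0)}))  # True (~11 m apart)
```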
When there is no information of a plurality of common points of interest, i.e. no common point of interest exists, the server control unit 310 returns to the first step T6 in FIG. 7.
On the other hand, when information of a plurality of common points of interest exists in step T7, the server control unit 310 transmits, via the communication unit 320 to the terminal 80A that transmitted the common point-of-interest information, information on the other unmanned aircraft 100B, which differs from the unmanned aircraft 100A whose flight the terminal 80A controls (T8). Likewise, in step T8, the server control unit 310 transmits, via the communication unit 320 to the terminal 80B that transmitted the common point-of-interest information, information on the unmanned aircraft 100A, which differs from the unmanned aircraft 100B whose flight the terminal 80B controls.
The information on the other unmanned aircraft 100B here may include, for example, information indicating the presence of the other unmanned aircraft 100B, its position information, and information on its movement direction. Likewise, the information on the unmanned aircraft 100A may include, for example, information indicating the presence of the unmanned aircraft 100A, its position information, and information on its movement direction.
In the terminal 80A, the terminal control unit 81 receives the information on the other unmanned aircraft 100B via the communication unit 85 (T9). Likewise, in the other terminal 80B, the terminal control unit 81 receives the information on the unmanned aircraft 100A (T9). Based on the received information on the unmanned aircraft 100B, the terminal control unit 81 of the terminal 80A causes the display unit 88 to display an overlay image on the captured image GZ1 shown on the display unit 88 (T10).
As the overlay image, an arrow-like marker mk1 may be superimposed on the captured image GZ1. In addition to displaying the marker mk1 as the overlay image on the captured image GZ1 shown on the display unit 88, the terminal control unit 81 of the terminal 80A may also display other information on the unmanned aircraft 100B, for example whether another unmanned aircraft exists and its position, speed, and movement direction.
Although displaying the information on the other unmanned aircraft 100B is illustrated here, the information may be presented by methods other than display. For example, the terminal 80 may include a speaker and output the information on the other unmanned aircraft 100B by sound, or may include a vibrator and indicate the information by vibration.
Through the processing of FIG. 7, the server 300 acquires point-of-interest information from each terminal 80 and determines whether a common point of interest exists among the acquired points of interest. A point of interest is a position or object the user focuses on, and the unmanned aircraft 100 is likely to be flown toward it. Therefore, when the points of interest are a common point of interest, the possibility of the unmanned aircraft 100 colliding with one another increases as they approach the geographic position corresponding to that point. Even in such a case, each terminal 80 can present to its user information related to the other unmanned aircraft 100 maneuvered by other users, alongside the own aircraft being monitored in FPV flight. Each terminal 80's user can thus recognize that other users are also focusing on the same point of interest and maneuver the unmanned aircraft 100 with improved safety.
Between T6 and T7, the server control unit 310 may also determine whether the unmanned aircraft 100A and 100B whose points of interest were received are moving. In this case, the server control unit 310 may acquire, via the communication unit 320, the point-of-interest information together with the time-series position information of the unmanned aircraft 100A and 100B. This position information may be included as imaging position information in the additional information of the captured images and acquired successively from the terminals 80A and 80B via the communication unit 320, or acquired successively and directly from the unmanned aircraft 100A and 100B. When at least one of the unmanned aircraft 100A and 100B is moving, the server control unit 310 may instruct the information presentation based on the common point of interest; when neither is moving, it need not instruct it.
That is, when the unmanned aircraft 100A and 100B are moving while focusing on a common position or object, the possibility of collision increases, so the server 300 may instruct the presentation of information. When neither unmanned aircraft 100A nor 100B is moving, the possibility of their colliding is low even if they focus on a common position or object, so the server 300 need not instruct the information presentation.
In this way, the server control unit 310 may determine whether the unmanned aircraft 100A is moving and, when it is moving, cause the terminal 80A to display the information related to the unmanned aircraft 100B.
When the unmanned aircraft 100A is moving, the possibility of collision with the other unmanned aircraft 100B increases. Even in this case, the server 300 can report the information on the other unmanned aircraft 100B as warning information and suppress a collision between the unmanned aircraft 100A and the unmanned aircraft 100B.
In the server 300, the server control unit 310 may also calculate the distance r1 from the unmanned aircraft 100A to the unmanned aircraft 100B from their acquired position information. When the distance r1 is equal to or less than a threshold, the server control unit 310 may cause the terminal 80A to display information related to the unmanned aircraft 100B, for example a marker indicating the presence of the unmanned aircraft 100B. This threshold may be the same as the threshold Dist1 used in the second operation example described later.
Thus, even when a plurality of unmanned aircraft 100 fly close to each other and the possibility of their colliding increases, the user U1 instructing the control of the flight of the unmanned aircraft 100A can learn of the presence of the other unmanned aircraft 100B. The server 300 can thereby suppress the occurrence of a collision between the unmanned aircraft 100A and the unmanned aircraft 100B.
FIG. 10 is a diagram showing the positional relationship between the two unmanned aircraft 100A and 100B. Three-dimensional coordinates with the unmanned aircraft 100A as the origin are set. In this case, in FIG. 10, the unmanned aircraft 100B is located in the positive x and positive y directions, and in the positive z direction (i.e. higher in the sky than the unmanned aircraft 100A).
FIG. 11A is a diagram showing the captured image GZ1 of the unmanned aircraft 100A shown on the display unit 88A of the terminal 80A. In FIG. 11A, the positional relationship of the unmanned aircraft 100A and 100B is assumed to be that of FIG. 10.
The terminal control unit 81A of the terminal 80A receives the display-information instruction from the server 300 (see T8 in FIG. 7) and displays various information on the display unit 88A. In FIG. 11A, a right-to-left arrow-like marker mk1 is superimposed on the captured image GZ1 at the upper right edge of the display unit 88A. As shown in FIG. 10, the marker mk1 indicates that another unmanned aircraft 100B is flying in the upper-right direction, which is a blind spot of the display unit 88A. Therefore, if the imaging direction of the unmanned aircraft 100A is shifted in the direction indicated by the marker mk1, the other unmanned aircraft 100B will appear in the captured image GZ1.
In this case, the server control unit 310 of the server 300 may acquire the position information of the unmanned aircraft 100A and of the unmanned aircraft 100B via the communication unit 320. This position information may be included as imaging position information in the additional information of the captured images and acquired successively from the terminals 80A and 80B via the communication unit 320, or acquired successively and directly from the unmanned aircraft 100A and 100B. The server control unit 310 may determine the position at which the marker mk1 is displayed from the position information of the unmanned aircraft 100A and 100B, taking their positional relationship into account, and may instruct the terminal 80A via the communication unit 320 to display the information related to the unmanned aircraft 100B (for example, information indicating its presence), including the information of the position at which the marker mk1 is displayed.
The marker mk1 may also indicate the movement direction of the unmanned aircraft 100B, i.e. that the unmanned aircraft 100B is flying from right to left within the geographic range and orientation corresponding to the captured image shown on the display unit 88A. Information related to the unmanned aircraft 100B, such as its position and speed, may also be displayed by information other than the marker mk1.
FIG. 11B is a diagram showing the captured image GZ2 of the unmanned aircraft 100B shown on the display unit 88B of the other terminal 80B. In FIG. 11B, the positional relationship of the unmanned aircraft 100A and 100B is assumed to be that of FIG. 10.
At the lower left edge of the display unit 88B, a left-to-right arrow-like marker mk2 is superimposed on the captured image GZ2. As shown in FIG. 10, the marker mk2 indicates that another unmanned aircraft 100A is flying to the left, in a blind spot of the display unit 88B. Therefore, if the imaging direction of the unmanned aircraft 100B is shifted in the direction indicated by the marker mk2, the other unmanned aircraft 100A will appear in the captured image GZ2.
In this case, the server control unit 310 of the server 300 may acquire the position information of the unmanned aircraft 100A and of the unmanned aircraft 100B via the communication unit 320, in the same manner as above. The server control unit 310 may determine the position at which the marker mk2 is displayed from the position information of the unmanned aircraft 100A and 100B, taking their positional relationship into account, and may instruct the terminal 80B via the communication unit 320 to display the information related to the unmanned aircraft 100A (for example, information indicating its presence), including the information of the position at which the marker mk2 is displayed.
The marker mk2 may also indicate the movement direction of the unmanned aircraft 100A, i.e. that the unmanned aircraft 100A is flying from left to right within the geographic range and orientation corresponding to the captured image shown on the display unit 88B. Information related to the unmanned aircraft 100A, such as its position and speed, may also be displayed by information other than the marker mk2.
In this way, the server control unit 310 of the server 300 may acquire the position information of the unmanned aircraft 100A and of the unmanned aircraft 100B, and may instruct the display unit 88A to display the information indicating the presence of the unmanned aircraft 100B at a position corresponding to the position of the unmanned aircraft 100B relative to the unmanned aircraft 100A.
Receiving the instruction from the server 300, the terminal 80A can thus display a marker indicating the presence of the unmanned aircraft 100B (an example of presence information) at the position corresponding to the relative position of the unmanned aircraft 100B, for example at the upper right edge of the screen of the display unit 88. The user U1 operating the terminal 80A can easily and intuitively grasp where the unmanned aircraft 100B is, and can therefore operate the terminal 80A more easily while taking the position of the unmanned aircraft 100B into account. The server 300 can thereby suppress a collision between the unmanned aircraft 100A and the unmanned aircraft 100B.
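One conceivable way to choose the screen-edge position of such a marker, offered only as an illustration (the eight-sector mapping and names are assumptions, not the patented layout), is to quantize the bearing of the other aircraft relative to the camera azimuth:

```python
# Hypothetical choice of a screen-edge slot for the overlay marker, based on
# the relative bearing between the camera azimuth and the other aircraft.
import math

def marker_position(own_xy, other_xy, camera_azimuth_deg):
    """Return a screen-edge label such as 'upper right' for the overlay marker."""
    dx = other_xy[0] - own_xy[0]   # east offset (m)
    dy = other_xy[1] - own_xy[1]   # north offset (m)
    bearing = math.degrees(math.atan2(dx, dy)) % 360   # 0 degrees = north
    rel = (bearing - camera_azimuth_deg) % 360         # relative to camera view
    sectors = ["top", "upper right", "right", "lower right",
               "bottom", "lower left", "left", "upper left"]
    return sectors[int(((rel + 22.5) % 360) // 45)]

# Example matching FIG. 10 / FIG. 11A: the other aircraft ahead-right of the camera.
print(marker_position((0, 0), (40, 30), camera_azimuth_deg=10))  # -> 'upper right'
```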
Thus, in the first operation example, when an unmanned aircraft 100A (the own aircraft) corresponding to a user focusing on a common point of interest and another unmanned aircraft 100B exist, the marker mk1 indicating the other unmanned aircraft 100B is displayed even if the other unmanned aircraft 100B does not appear in the captured image shown on the display unit 88A of the terminal 80A.
The server control unit 310 acquires the point of interest tp1, which is the point that the user U1, operating the terminal 80A instructing the control of the flight of the unmanned aircraft 100A, focuses on in the captured image GZ1 taken by the unmanned aircraft 100A and shown on the terminal 80A. The server control unit 310 acquires the point of interest tp2, which is the point that the user U2, operating the terminal 80B instructing the control of the flight of the unmanned aircraft 100B, focuses on in the captured image GZ2 taken by the unmanned aircraft 100B and shown on the terminal 80B. The server control unit 310 determines whether the points of interest tp1 and tp2 are a common point of interest indicating the same point of interest, and if so, causes the terminal 80A to display the marker mk1 indicating the presence or approach of the unmanned aircraft 100B.
The server control unit 310 is an example of a processing unit. The unmanned aircraft 100A is an example of a first flying body. The terminal 80A is an example of a first terminal. The captured image GZ1 is an example of a first image. The point of interest tp1 is an example of a first point of interest. The unmanned aircraft 100B is an example of a second flying body. The terminal 80B is an example of a second terminal. The captured image GZ2 is an example of a second image. The point of interest tp2 is an example of a second point of interest.
The user U1 can thus grasp the information related to the unmanned aircraft 100B present around the unmanned aircraft 100A. Therefore, even though it is difficult to confirm the surroundings of the unmanned aircraft 100A during FPV flight, and even when a plurality of unmanned aircraft 100 head for the same destination corresponding to a common point of interest, the user U1 can operate the terminal 80A while taking the information related to the unmanned aircraft 100B into account. The server 300 can thereby suppress a collision between the unmanned aircraft 100A and the unmanned aircraft 100B.
(Second operation example of the first embodiment)
The first operation example showed the case where a marker indicating the presence of another unmanned aircraft sharing a common point of interest is displayed superimposed on the captured image on the terminal's display. The second operation example shows the case where, when the common point of interest is the same and the distance to the other unmanned aircraft is less than a threshold, recommendation information is displayed on the terminal instructing the control of the flight of that unmanned aircraft.
FIGS. 12A and 12B are sequence diagrams showing the information presentation instruction procedure from the unmanned aircraft viewpoint performed by the server 300 in the second operation example. Steps identical to those of the first operation example shown in FIG. 7 are given the same reference numerals, and their description is omitted or simplified.
First, the flight system 10 executes the processing of T1 to T6.
In step T7, when a plurality of unmanned aircraft 100 with a common point of interest exist, the server control unit 310 of the server 300 determines whether the distance r1 from the unmanned aircraft 100A to the other unmanned aircraft 100B is equal to or less than the threshold Dist1 (T8A).
FIG. 13 is a diagram showing the spaces of the thresholds Dist1 and Dist2 set for the distance r1 between the two unmanned aircraft 100A and 100B.
With the position of the unmanned aircraft 100A as the origin, the distance r1 between the two unmanned aircraft 100A and 100B is expressed by equation (1), using the position coordinates (x, y, z) of the unmanned aircraft 100B:
r1 = (x² + y² + z²)^(1/2) ...... (1)
The threshold Dist1, against which the distance r1 to the other unmanned aircraft 100B is compared, is a value for recommending deceleration when approach to the other unmanned aircraft 100B is anticipated. The threshold Dist2, against which the distance r1 is also compared, is a value for recommending a temporary stop such as hovering when a collision with the other unmanned aircraft 100B is anticipated. The threshold Dist2 is therefore smaller than the threshold Dist1.
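As a purely illustrative sketch of equation (1) and the two-threshold check that follows, the sketch below computes r1 and picks a recommendation. The concrete threshold values are assumptions; only the relation Dist2 < Dist1 follows the text.

```python
# Hypothetical two-threshold recommendation logic (steps T8A/T9A/T10A/T11A).
import math

DIST1 = 100.0  # m: recommend the low-speed flight mode (assumed value)
DIST2 = 30.0   # m: recommend a temporary stop such as hovering (assumed value)

def recommend(own_pos, other_pos):
    """own_pos/other_pos: (x, y, z) positions in a shared local frame (m)."""
    x, y, z = (o - s for s, o in zip(own_pos, other_pos))
    r1 = math.sqrt(x * x + y * y + z * z)    # equation (1)
    if r1 > DIST1:
        return None                          # no recommendation (back to T6)
    if r1 > DIST2:
        return "low-speed flight mode"       # T10A
    return "temporary stop (hover)"          # T11A

print(recommend((0, 0, 0), (50, 30, 20)))    # distance ~61.6 m -> low-speed mode
```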
When the distance r1 is not equal to or less than the threshold Dist1, i.e. when the distance r1 is greater than the threshold Dist1, the server control unit 310 of the server 300 returns from T8A in FIG. 12A to the server's first step T6.
On the other hand, when the distance r1 is equal to or less than the threshold Dist1, the processing proceeds to FIG. 12B, and the server control unit 310 determines whether the distance r1 to the unmanned aircraft 100B is equal to or less than the threshold Dist2 (T9A).
When the distance r1 is not equal to or less than the threshold Dist2, i.e. when the distance r1 is greater than the threshold Dist2, the server control unit 310 recommends the low-speed flight mode and generates recommendation information for recommending the low-speed flight mode (T10A). On the other hand, when the distance r1 is equal to or less than the threshold Dist2 in step T9A, the server control unit 310 recommends a temporary stop (temporary stop mode) enabling hovering and the like, and generates recommendation information for recommending the temporary stop (T11A).
The server control unit 310 transmits the recommendation information of step T10A or step T11A, via the communication unit 320, to the terminal 80A instructing the control of the flight of the unmanned aircraft 100A (T12A).
The terminal control unit 81 of the terminal 80A receives the recommendation information from the server 300 via the communication unit 85 (T13A), and displays on the display unit 88 a recommendation screen including the recommendation information (T14A).
FIG. 14A is a diagram showing the recommendation screen GM1 shown on the display unit 88 when the distance r1 is within the threshold Dist1; it displays, for example, the message "Please set the low-speed flight mode". FIG. 14B is a diagram showing the recommendation screen GM2 shown on the display unit 88 when the distance r1 is within the threshold Dist2; it displays the message "Please stop temporarily". The messages shown in FIGS. 14A and 14B are displayed alone on the recommendation screens GM1 and GM2, but they may be superimposed on the captured image, or superimposed together with the marker shown in the first operation example.
Through the processing shown in FIGS. 12A and 12B, when the unmanned aircraft 100 approach each other to some degree, the server 300 can present warning information to the terminals 80 instructing their flight control so that the flight speed is limited. Therefore, even when the unmanned aircraft 100 approach each other, the terminals 80 can let the unmanned aircraft 100 continue FPV flight with improved flight safety. Furthermore, when the unmanned aircraft 100 approach each other even more closely, further warning information can be presented to limit the speed, for example making each unmanned aircraft 100 hover. The server 300 can thus present information while changing the importance of the warning in stages according to how close the unmanned aircraft 100 are to each other. Each terminal 80's user can therefore, while flying FPV toward a common point of interest, recognize the approach of unmanned aircraft 100 other than the one they maneuver, take the necessary measures according to the presented information, and maneuver the unmanned aircraft 100 accordingly.
In the second operation example above, the terminal 80A displays the recommendation screen when approach to the other unmanned aircraft 100B is anticipated. Instead of, or together with, the instruction to display the recommendation screen, the server control unit 310 may instruct the unmanned aircraft 100A to perform flight control such as the low-speed flight mode or a temporary stop (hovering and the like).
Thus, in the second operation example, the low-speed flight mode is recommended when a plurality of unmanned aircraft (for example, the unmanned aircraft 100A and 100B) approach each other, and a temporary stop such as hovering is recommended when the possibility of their colliding is high. This helps the unmanned aircraft 100 avoid colliding with one another.
Further, when the distance r1 from the unmanned aircraft 100A to the unmanned aircraft 100B is equal to or less than the threshold Dist1, the server control unit 310 may cause the terminal 80A to display information recommending that the flight speed of the unmanned aircraft 100A be limited (for example, recommendation information for setting the low-speed flight mode). In this case, the display instruction information may be transmitted to the terminal 80A.
By checking the recommendation information displayed on the terminal 80A, the user U1 can thus grasp that a limit on the flight speed of the unmanned aircraft 100A is recommended. The terminal 80A performs the setting for limiting the flight speed and can fly the unmanned aircraft 100A under it. The speed-limit setting may be made automatically by the terminal control unit 81 according to the recommendation information, or manually via the operation unit 83. With the speed limited, the user U1 can confirm the state of the unmanned aircraft 100A on the screen of the terminal 80A more easily than during high-speed flight, and a collision with the unmanned aircraft 100B can be suppressed.
The server control unit 310 may also cause the terminal 80A to display recommendation information such that the shorter the distance r1, the lower the speed to which the flight of the unmanned aircraft 100A is limited (for example, recommendation information for a temporary stop). In this case, the display instruction information may be transmitted to the terminal 80A.
The shorter the distance r1, the higher the possibility of collision even over a short movement distance; but by making the unmanned aircraft 100A fly at a lower speed the shorter the distance r1 is, the server 300 can lengthen the time needed to move to the position of the unmanned aircraft 100B and easily avoid a collision.
FIG. 15A is a diagram showing a comparative situation in which unmanned aircraft are maneuvered by visual observation. When the two users U1 and U2 each maneuver the unmanned aircraft 100A and 100B by direct observation, their field of view CA1 is wide. The users U1 and U2 can therefore easily avoid situations in which the aircraft approach each other, and collisions between the unmanned aircraft are unlikely.
FIG. 15B is a diagram showing a situation in which the unmanned aircraft of the present embodiment are maneuvered in the FPV flight mode. When the two users U1 and U2 maneuver the unmanned aircraft 100A and 100B while watching the display units 88 of the terminals 80, their field of view CA2 narrows. Even if another unmanned aircraft flies near the aircraft they maneuver, the users U1 and U2 have difficulty noticing it, so collisions between the unmanned aircraft occur easily.
In the present embodiment, the server 300 can also take the lead in presenting information based on the points of interest of the users of the terminals 80. That is, the server control unit 310 may acquire the point of interest tp1 from the terminal 80A and the point of interest tp2 from the terminal 80B via the communication unit 320, and transmit to the terminal 80A, via the communication unit 320, the information to be displayed on the terminal 80A (for example, information related to the unmanned aircraft 100B and recommendation information).
The server 300 can thus centrally process the point-of-interest information detected by the plurality of terminals 80 in the flight system 10 and instruct the presentation of information. The server 300 can thereby reduce the processing burden placed on the terminals 80 by, for example, the information presentation processing based on a common point of interest.
In step T10 of the first operation example and step T14A of the second operation example, the server control unit 310 of the server 300 transmits the information related to the other unmanned aircraft 100 and the recommendation information to the terminal 80, but it may also instruct the unmanned aircraft 100 to perform the control corresponding to the recommendation information. In this case, the server control unit 310 may transmit flight control information such as the low-speed flight mode or a temporary stop to the terminal 80 via the communication unit 320. When the terminal control unit 81 of the terminal 80 receives the flight control information via the communication unit 85, it may instruct the control of the flight of the unmanned aircraft 100 according to that flight control information.
For example, when the distance r1 from the unmanned aircraft 100A to the unmanned aircraft 100B is equal to or less than the threshold Dist1, the server control unit 310 may limit the flight speed of the unmanned aircraft 100A. The limit instruction information may be transmitted directly to the unmanned aircraft 100A or via the terminal 80A.
By issuing the instruction to limit the flight speed of the unmanned aircraft 100A, the server 300 can limit the speed of the unmanned aircraft 100A according to the positional relationship between the unmanned aircraft 100A and the unmanned aircraft 100B. In this case, even if the user U1 operates the terminal 80A without noticing the presence of the unmanned aircraft 100B, the unmanned aircraft 100A is kept from flying at high speed according to instructions from the terminal 80A, and a collision with the unmanned aircraft 100B can be suppressed.
For example, the server control unit 310 may limit the flight speed of the unmanned aircraft 100A to a lower speed the shorter the distance r1 is. The limit instruction information may be transmitted directly to the unmanned aircraft 100A or via the terminal 80A. In this case, the closer the unmanned aircraft 100A comes to the unmanned aircraft 100B, the lower its flight speed. Although the aircraft collide more easily the closer they are, the flight speed is also limited to be lower, so the server 300 can suppress a collision with the unmanned aircraft 100B.
(Second Embodiment)
The first embodiment showed the case where a plurality of unmanned aircraft 100 approach one another. The second embodiment shows the case where an unmanned aircraft 100 approaches the destination that is the common point of interest.
The configuration of the flight system 10 in the second embodiment is substantially the same as in the first embodiment. Components identical to those of the first embodiment are given the same reference numerals, and their description is omitted or simplified.
FIGS. 16A and 16B are sequence diagrams showing the information presentation instruction procedure from the destination viewpoint performed by the server 300 in the second embodiment. Steps identical to those of the first operation example shown in FIG. 7 and the second operation example shown in FIG. 12 are given the same reference numerals, and their description is omitted or simplified.
First, the flight system 10 executes the processing of T1 to T6.
In step T7, when a plurality of unmanned aircraft 100 with a common point of interest exist, the server control unit 310 of the server 300 determines whether an unmanned aircraft 100 exists within a radius r2 of the common point of interest (T8B). The radius r2 is a value for recommending a speed reduction on the assumption that the unmanned aircraft 100 is approaching the common point of interest.
The position information of the common point of interest may be obtained from the map information stored in the storage 330 of the server 300. The map information may also be stored in an external map server, and the server control unit 310 may acquire the map information via the communication unit 320.
When no unmanned aircraft 100 exists within the radius r2 of the position of the common point of interest, the server control unit 310 returns to the first step of the server 300.
On the other hand, when an unmanned aircraft 100 exists within the radius r2 of the position of the common point of interest, the server control unit 310 determines whether an unmanned aircraft 100 exists within a radius r3 of the common point of interest (T9B). The radius r3 is a value for recommending a temporary stop such as hovering on the assumption that the unmanned aircraft would collide at the common point of interest. The radius r3 is smaller than the radius r2.
When no unmanned aircraft 100 exists within the radius r3, the server control unit 310 recommends the low-speed flight mode to the terminal 80 corresponding to the relevant unmanned aircraft 100 (for example, an unmanned aircraft 100 located between the circle of radius r2 and the circle of radius r3) (T10B).
On the other hand, when an unmanned aircraft 100 exists within the radius r3 in step T9B, the server control unit 310 recommends a temporary stop such as hovering to the terminal 80 corresponding to the relevant unmanned aircraft 100 (for example, an unmanned aircraft 100 located inside the circle of radius r3) (T11B).
The server control unit 310 transmits the recommendation information of step T10B or step T11B, via the communication unit 320, to the terminal 80 corresponding to the relevant unmanned aircraft 100 (T12B).
The terminal control unit 81 of the terminal 80 corresponding to the relevant unmanned aircraft 100 receives the recommendation information from the server 300 via the communication unit 85 (T13A), and displays a recommendation screen on the display unit 88 according to the recommendation information (T14A).
In this way, the server control unit 310 of the server 300 may acquire the position information of the common point of interest and the position information of the unmanned aircraft 100. When the distance from the common point of interest to the unmanned aircraft 100 is within the radius r2 of the common point of interest (equal to or less than the radius r2), the server control unit 310 may cause the terminal 80 to display information recommending that the flight speed of the unmanned aircraft 100 be limited. In this case, the display instruction information may be transmitted to the terminal 80.
A plurality of unmanned aircraft 100 can be expected to fly toward the destination that is the common point of interest. Therefore, when other unmanned aircraft 100 also approach the destination, the possibility of collision increases. By checking the recommendation information shown on the terminal 80, the user can grasp that a limit on the flight speed is recommended. The terminal 80 makes the setting for limiting the flight speed and flies the unmanned aircraft 100 with the speed limited. The speed-limit setting may be made automatically by the terminal control unit 81 according to the recommendation information, or manually via the operation unit 83. With the speed limited, the user can confirm the state of the unmanned aircraft on the screen of the terminal 80 more easily than during high-speed flight, and collisions with the other unmanned aircraft 100 can be suppressed.
The server control unit 310 may also cause the terminal 80 to display recommendation information such that the shorter the distance from the common point of interest to the unmanned aircraft 100, the lower the speed to which the flight of the unmanned aircraft 100 is limited. In this case, the display instruction information may be transmitted to the terminal 80.
When the distance from the common point of interest to the unmanned aircraft 100 is short, the possibility of collision increases even over a short movement distance. In this case, the shorter that distance, the lower the speed at which the server 300 makes the unmanned aircraft 100 fly. The server 300 can thereby lengthen the time needed to move to the common point of interest and easily avoid a collision.
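One possible monotone mapping that realizes "the shorter the distance, the lower the speed limit", offered purely as an illustration (the linear ramp shape and all numeric values are assumptions; the text only requires monotonicity), is:

```python
# Hypothetical speed limit as a function of distance to the common point of
# interest: a linear ramp inside radius r2 down to a hover floor inside r3.

def speed_limit(dist_to_poi, r2=200.0, r3=50.0, v_normal=10.0, v_floor=0.0):
    if dist_to_poi >= r2:
        return v_normal                       # outside r2: no limit
    if dist_to_poi <= r3:
        return v_floor                        # inside r3: temporary stop (hover)
    frac = (dist_to_poi - r3) / (r2 - r3)     # 0.0 at r3 .. 1.0 at r2
    return v_floor + frac * (v_normal - v_floor)

for d in (250, 150, 75, 40):
    print(d, round(speed_limit(d), 2))        # 10.0, 6.67, 1.67, 0.0
```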
Although the server control unit 310 of the server 300 transmits the recommendation information to the terminal 80 in steps T10B and T11B, it may also instruct the unmanned aircraft 100 to perform the control corresponding to the recommendation information. In this case, the server control unit 310 may transmit flight control information such as the low-speed flight mode or a temporary stop to the terminal 80 via the communication unit 320. When the terminal control unit 81 of the terminal 80 receives the flight control information via the communication unit 85, it may instruct the control of the flight of the unmanned aircraft 100 according to that flight control information.
For example, when the distance between the common point of interest and the unmanned aircraft 100A is equal to or less than the radius r2, the server control unit 310 may limit the flight speed of the unmanned aircraft 100A. The limit instruction information may be transmitted directly to the unmanned aircraft 100A or via the terminal 80A.
By issuing the instruction to limit the flight speed of the unmanned aircraft 100A, the server 300 can limit the speed of the unmanned aircraft 100A according to the positional relationship between the unmanned aircraft 100A and the common point of interest. In this case, even if the user U1 operates the terminal 80A without noticing the common point of interest, the unmanned aircraft 100A is kept from flying at high speed according to instructions from the terminal 80A, and collisions with objects present at the common point of interest (the destination) and with other unmanned aircraft 100B approaching the common point of interest can be suppressed.
For example, the server control unit 310 may limit the flight speed of the unmanned aircraft 100A to a lower speed the shorter the distance between the common point of interest and the unmanned aircraft 100A is. The limit instruction information may be transmitted directly to the unmanned aircraft 100A or via the terminal 80A.
For example, when a plurality of unmanned aircraft 100 approach the destination that is the common point of interest at the same time, the server control unit 310 may perform control in a predetermined order so that the unmanned aircraft 100 approach the common point of interest one by one. The control information may be transmitted directly to the unmanned aircraft 100A or via the terminal 80A. The server 300 can thereby let each unmanned aircraft 100 reach the destination while avoiding collisions between them.
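As a purely illustrative sketch of such one-at-a-time sequencing (the distance-based ordering rule is an assumption; the text only requires a predetermined order):

```python
# Hypothetical clearance scheduler: sort aircraft by distance to the common
# point of interest and clear only the head of the queue, holding the rest
# (e.g. in the temporary stop mode) until it has arrived.

def approach_clearances(uav_distances):
    """uav_distances: {uav_id: distance_to_common_poi_in_m}."""
    order = sorted(uav_distances, key=uav_distances.get)
    clearances = {uav: ("approach" if i == 0 else "hold")
                  for i, uav in enumerate(order)}
    return order, clearances

order, clearances = approach_clearances({"100A": 120.0, "100B": 80.0, "100C": 300.0})
print(order)        # ['100B', '100A', '100C']
print(clearances)   # {'100B': 'approach', '100A': 'hold', '100C': 'hold'}
```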
(Third Embodiment)
The first and second embodiments showed the case where the server 300 performs the information presentation instruction operation for avoiding collisions of the unmanned aircraft 100. The third embodiment shows the case where one terminal 80P among the plurality of terminals 80 performs the information presentation instruction operation for avoiding collisions of the unmanned aircraft 100.
The flight system 10 of the third embodiment has substantially the same structure as the first embodiment. Components identical to those of the first embodiment are given the same reference numerals, and their description is omitted or simplified.
In the third embodiment, the terminal control unit 81 of the terminal 80P performs processing related to the information presentation instruction for avoiding collisions of the unmanned aircraft 100, as the server 300 does, and presents information based on a user's point of interest in an image captured by an unmanned aircraft 100. That is, the terminal control unit 81 can perform the same processing as that performed by the server control unit 310 of the server 300 in the first and second embodiments. The terminal control unit 81 is an example of a processing unit.
In the third embodiment, the terminals 80 are mainly described as a terminal 80P or other terminals 80Q; there may be a plurality of terminals 80Q. The terminal 80P instructs the control of the flight of the unmanned aircraft 100P and is operated by a user Up, while a terminal 80Q instructs the control of the flight of an unmanned aircraft 100Q and is operated by a user Uq. The terminal 80P may be the terminal 80A, the terminal 80Q may be the terminal 80B, the unmanned aircraft 100P may be the unmanned aircraft 100A, and the unmanned aircraft 100Q may be the unmanned aircraft 100B. The terminal 80P that issues the information presentation instruction based on the common point of interest and the other terminals 80Q may establish a communication link in advance so that they can communicate.
(First operation example of the third embodiment)
FIG. 17 is a sequence diagram showing the information presentation instruction procedure from the unmanned aircraft viewpoint performed by the terminal 80 in the first operation example of the third embodiment. Steps identical to those of FIG. 7 in the first operation example of the first embodiment are given the same step numbers, and their description is omitted or simplified.
Here, among the plurality of terminals 80, the terminal that performs the same operation as the server in the first operation example of the first embodiment is designated as the terminal 80P. The terminal 80P is an example of an information processing device.
First, the flight system 10 performs the processing of T1 to T5.
As with the unmanned aircraft 100 and the terminal 80 of the first embodiment, in the unmanned aircraft 100P the imaging device 220 repeatedly captures images. The UAV control unit 110 may record the captured images in the memory 160, together with their related additional information, and transmits the captured image and its additional information stored in the memory 160 to the terminal 80P via the communication interface 150.
In the terminal 80P, the terminal control unit 81 receives, via the communication unit 85, the captured image and its additional information transmitted from the unmanned aircraft 100P (T3C). The terminal control unit 81 detects the point of interest of the user Up operating the terminal 80P and stores it in the memory 87 (T4C). In addition, the terminal control unit 81 receives, via the communication unit 85, the information including the points of interest transmitted from the other terminals 80Q and stores it in the memory 87 (T6C). The terminal 80P thus itself detects and acquires the point of interest of its own user Up, and acquires the points of interest of the users Uq of the other terminals 80Q from those terminals.
The terminal control unit 81 determines whether information of a plurality of common points of interest exists among the points of interest stored in the memory 87 (T7C). When no such information exists, the terminal control unit 81 returns to the first step T3C of the terminal 80P.
On the other hand, when information of a plurality of common points of interest exists in step T7C, the terminal control unit 81 transmits, via the communication unit 85 to the terminals 80Q that transmitted the common point-of-interest information, information on the other unmanned aircraft 100 (for example, the unmanned aircraft 100P) (T8C). The terminal 80Q that transmitted the common point-of-interest information can thus receive the information presentation instruction from the terminal 80P and superimpose, as an overlay image on the captured image shown on its display unit 88, a marker indicating the presence of the other unmanned aircraft 100P.
In addition, when the present terminal (terminal 80P) and another terminal 80Q that transmitted a common point of interest both exist, the terminal control unit 81 of the terminal 80P superimposes, on the captured image shown on the display unit 88 and based on the information of the other unmanned aircraft 100Q whose flight the other terminal 80Q controls, a marker indicating the presence of the other unmanned aircraft 100Q (T10C).
Thus, in the first operation example of the third embodiment, when the unmanned aircraft 100P (the own aircraft) maneuvered by the user Up focusing on a common point of interest and another unmanned aircraft 100Q exist, a marker indicating the other unmanned aircraft 100Q is displayed even if the other unmanned aircraft 100Q does not appear in the captured image shown on the display unit 88 of the terminal 80P. The user Up can thereby grasp the presence of the other unmanned aircraft 100Q corresponding to the user Uq who shares the common point of interest with the user Up. Further, since the terminal 80P issues the information presentation instruction based on the common point of interest, the server 300 can be omitted, which simplifies the configuration of the flight system 10 and reduces cost.
(Second operation example of the third embodiment)
FIGS. 18A and 18B are sequence diagrams showing the information presentation instruction procedure from the unmanned aircraft viewpoint performed by the terminal 80 in the second operation example of the third embodiment. Steps identical to those of FIG. 12 of the second operation example of the first embodiment and of FIG. 17 of the first operation example of the third embodiment are given the same step numbers, and their description is omitted or simplified.
Here too, among the plurality of terminals 80, the terminal that performs the same operation as the server 300 in the first operation example of the first embodiment is designated as the terminal 80P.
First, the flight system 10 performs the processing of T1 to T5, T3D, T4D, and T6D. Step T3D is the same processing as step T3C shown in FIG. 17, step T4D is the same as step T4C, and step T6D is the same as step T6C.
In the terminal 80P, the terminal control unit 81 determines whether information of a plurality of common points of interest exists among the points of interest stored in the memory 87 (T7D). When no such information exists, the terminal control unit 81 returns to the first step T3D of the terminal 80P.
On the other hand, when information of a plurality of common points of interest exists in step T7D, the terminal control unit 81 determines whether the distance r1 from the unmanned aircraft 100P to the other unmanned aircraft 100Q is equal to or less than the threshold Dist1 (T8D).
When the distance r1 is greater than the threshold Dist1, the terminal control unit 81 returns to the first step T3D of the terminal 80P.
On the other hand, when the distance r1 is equal to or less than the threshold Dist1, the terminal control unit 81 determines whether the distance r1 is equal to or less than the threshold Dist2 (T9D). When the distance r1 is greater than the threshold Dist2, the terminal control unit 81 recommends the low-speed flight mode and generates recommendation information for recommending the low-speed flight mode (T10D). On the other hand, when the distance r1 is equal to or less than the threshold Dist2 in step T9D, the terminal control unit 81 recommends a temporary stop (temporary stop mode) such as hovering and generates recommendation information for recommending the temporary stop (T11D).
The terminal control unit 81 transmits the recommendation information of step T10D or step T11D, via the communication unit 85, to the other terminal 80Q instructing the flight control of the other unmanned aircraft 100Q (T12D).
The terminal control unit 81 of the other terminal 80Q receives the recommendation information from the terminal 80P via the communication unit 85 (T13D) and displays on the display unit 88 a recommendation screen including the recommendation information (T14D). The other terminal 80Q receiving the information presentation instruction based on the common point of interest can thus display the recommendation screen on the display unit 88. Therefore, the user Uq of the other terminal 80Q can refer to the recommendation screen when maneuvering the other unmanned aircraft 100Q, which improves the safety of the maneuver.
In addition, when another terminal 80Q that transmitted a point of interest common with the present terminal (terminal 80P) exists, the terminal control unit 81 of the terminal 80P displays on the display unit 88 a recommendation screen including the recommendation information (T15D). The terminal 80P issuing the information presentation instruction based on the common point of interest can thus display the recommendation screen on its display unit 88. Therefore, the user Up of the terminal 80P can refer to the recommendation screen when maneuvering the unmanned aircraft 100P, which improves the safety of the maneuver.
In the second operation example of the third embodiment, the low-speed flight mode is recommended when a plurality of unmanned aircraft 100 (for example, the unmanned aircraft 100P and 100Q) are approaching each other, and a temporary stop such as hovering is recommended when the possibility of their colliding is high. This helps the unmanned aircraft 100 avoid colliding with one another. Further, the server 300 can be omitted, which simplifies the configuration of the flight system 10 and reduces cost.
In this way, in the terminal 80P, the terminal control unit 81 acquires the captured image GZ1 from the unmanned aircraft 100P via the communication unit 85, detects the point of interest tp1 in the captured image GZ1, acquires the point of interest tp2 from the other terminal 80Q via the communication unit 85, and causes the display unit 88 to display the information to be presented on the terminal 80P (for example, information related to the other unmanned aircraft 100Q and recommendation information).
The terminal 80P can thus perform the whole series of processing, from detecting a point of interest through determining a common point of interest to displaying information based on that determination. The terminal 80P therefore does not need a separate server 300 to instruct the information display based on the detection of points of interest and the determination of common points of interest, which simplifies the structure for displaying information based on the detection of a common point of interest.
(Fourth Embodiment)
The third embodiment showed the case where the smartphone 80S serving as the terminal 80 instructs the control of the flight of the unmanned aircraft 100. The fourth embodiment shows the case where an HMD (Head Mounted Display) 500 serving as the terminal 80 instructs the control of the flight of the unmanned aircraft 100.
The flight system 10 of the fourth embodiment has substantially the same structure as the first embodiment, except that the terminal 80 is replaced by the HMD 500. Components identical to those of the first embodiment are given the same reference numerals, and their description is omitted or simplified.
FIG. 19 is a perspective view showing the appearance of the HMD 500 in the fourth embodiment. The HMD 500 has a mounting portion 510 mounted on the user's head and a main body portion 520 supported by the mounting portion 510.
FIG. 20 is a block diagram showing the hardware configuration of the HMD 500. The HMD 500 has a processing unit 521, a communication unit 522, a storage unit 523, an operation unit 524, a display unit 525, an acceleration sensor 526, an imaging unit 527, and an interface unit 528. Each of these components of the HMD 500 may be provided in the main body portion 520.
The processing unit 521 is configured using, for example, a CPU, MPU, or DSP. It performs signal processing for overall control of the operation of each part of the main body portion 520, input/output processing of data with the other parts, arithmetic processing of data, and storage processing of data.
The processing unit 521 can acquire data and information from the unmanned aircraft 100 via the communication unit 522, data and information input through the operation unit 524, and data and information stored in the storage unit 523. It can send data and information, including the captured image of the unmanned aircraft 100, to the display unit 525 and cause the display unit 525 to display the resulting display information. The processing unit 521 can execute an application for instructing the control of the unmanned aircraft 100 and generate various data used in the application.
The processing unit 521 can perform line-of-sight detection based on the image of the user's eyes captured by the imaging unit 527 and detect the point of interest as in the first embodiment. It can also instruct the control of the flight of the unmanned aircraft 100 according to the result of the line-of-sight detection; that is, it can maneuver the unmanned aircraft 100 with the movement of the line of sight. For example, the processing unit 521 can instruct the unmanned aircraft 100 via the communication unit 522 to fly in the direction of the geographic position or object corresponding to the on-screen position watched by the user wearing the HMD 500. The user's point of interest can thus become the destination of the unmanned aircraft 100.
The processing unit 521 can acquire the information of the acceleration detected by the acceleration sensor 526 and instruct the control of the flight of the unmanned aircraft 100 according to the acceleration. For example, it can instruct the unmanned aircraft 100 via the communication unit 522 to fly in the direction in which the head of the user wearing the HMD 500 is tilted.
The communication unit 522 communicates with the unmanned aircraft 100 by various wireless communication methods, which may include, for example, communication by wireless LAN, Bluetooth (registered trademark), short-range wireless communication, or a public wireless network. The communication unit 522 may also communicate by wire.
The storage unit 523 may include, for example, a ROM storing a program defining the operation of the HMD 500 and set-value data, and a RAM temporarily storing various information and data used when the processing unit 521 performs processing. The storage unit 523 may be set to be removable from the HMD 500. The program may include an application.
The operation unit 524 accepts data and information input by the user. It may include buttons, keys, a touch display, a touch pad, a microphone, and the like, and can accept operations such as tracking and tap-and-fly.
The display unit 525 is configured using, for example, an LCD (Liquid Crystal Display) and displays various information and data output from the processing unit 521, including the data of the captured images taken by the imaging device 220 of the unmanned aircraft 100.
The acceleration sensor 526 may be a three-axis acceleration sensor capable of detecting the attitude of the HMD 500, and may output the detected attitude information to the processing unit 521 as one item of operation information.
The imaging unit 527 captures various images. To detect the direction in which the user is looking, that is, the line of sight, the imaging unit 527 can capture the user's eyes and output the image to the processing unit 521. The interface unit 528 can input and output information and data to and from external devices.
The HMD 500 can perform the same operations as in the first to third embodiments. Therefore, even when the terminal 80 is the HMD 500, the same effects as in the first to third embodiments can be obtained. In addition, when the user wears the HMD 500, the outward field of view is largely blocked compared with the case where it is not worn, so the user can view the image with a heightened sense of immersion and enjoy instructing the flight control of the FPV flight of the unmanned aircraft 100. Further, by receiving the information presentation based on whether the points of interest detected by the processing unit 521 form a common point of interest, the HMD 500 can cause the display unit 525 to display information and recommendation information about unmanned aircraft 100 other than the one whose flight control it instructs. Therefore, even though the outward field of view of the HMD 500 is largely blocked, the user wearing the HMD 500 can confirm the presented information and thereby improve the operational safety of the unmanned aircraft 100 operated with the HMD 500.
When the HMD 500 can instruct the control of the flight of the unmanned aircraft 100 based on the acceleration detected by the acceleration sensor 526, it can issue flight control instructions in the same manner as for an unmanned aircraft 100 operated with the left and right control sticks of the transmitter 50. In this case, the flight system 10 need not include the transmitter 50.
Alternatively, the HMD 500 may not instruct the flight control of the unmanned aircraft 100 based on the acceleration detected by the acceleration sensor 526. In this case, the user can maneuver the unmanned aircraft 100 using the transmitter 50 while checking the display unit 525 of the HMD 500.
Although the present disclosure has been described above using embodiments, the technical scope of the present disclosure is not limited to the scope described in the above embodiments. It is obvious to those skilled in the art that various changes and improvements can be made to the above embodiments. It is clear from the claims that modes to which such changes or improvements are made can also be included in the technical scope of the present disclosure.
The execution order of processes such as actions, procedures, steps, and stages in the devices, systems, programs, and methods shown in the claims, specification, and drawings may be realized in any order, as long as "before", "in advance", and the like are not explicitly indicated and as long as the output of a preceding process is not used in a subsequent process. Even where operation flows in the claims, specification, and drawings are described using "first", "next", and the like for convenience, this does not mean they must be performed in that order.
[Reference Numerals]
10 flight system
50 transmitter
50B housing
53L left control stick
53R right control stick
61 transmitter control unit
63 wireless communication unit
65 interface unit
80, 80A, 80B terminal
81 terminal control unit
82 interface unit
83 operation unit
85 communication unit
87 memory
88, 88A, 88B display unit
89 imaging unit
100, 100A, 100B unmanned aircraft
102 UAV body
110 UAV control unit
150 communication interface
160 memory
200 gimbal
210 rotor mechanism
211 rotor
212 drive motor
213 current sensor
220 imaging device
240 GPS receiver
250 inertial measurement unit
260 magnetic compass
270 barometric altimeter
280 ultrasonic sensor
290 laser rangefinder
300 server
310 server control unit
320 communication unit
330 storage
340 memory
500 HMD
510 mounting portion
520 main body portion
521 processing unit
522 communication unit
523 storage unit
524 operation unit
525 display unit
526 acceleration sensor
527 imaging unit
528 interface unit
AN1, AN2 antenna
B1 power button
B2 RTH button
CA1, CA2 field of view
Dist1, Dist2 threshold
GM1, GM2 recommendation screen
GZ1, GZ2 captured image
J1 tower
J2 bridge
J3 building
mk1, mk2 marker
r1 distance
r2, r3 radius
tp1, tp2 point of interest
U1, U2 user
Claims (22)
- An information processing device that presents information based on a user's point of interest in an image captured by a flying body, comprising a processing unit that acquires a first point of interest that a first user, operating a first terminal instructing control of a first flying body, focuses on in a first image captured by the first flying body, and a second point of interest that a second user, operating a second terminal instructing control of a second flying body, focuses on in a second image captured by the second flying body; determines whether the first point of interest and the second point of interest are a common point of interest; and, if they are the common point of interest, presents information related to the second flying body to the first terminal.
- The information processing device according to claim 1, wherein the processing unit determines whether the first flying body is moving and, when the first flying body is moving, presents the information related to the second flying body to the first terminal.
- The information processing device according to claim 1 or 2, wherein the processing unit acquires position information of the first flying body and position information of the second flying body, and presents the information related to the second flying body to the first terminal when a first distance, being the distance between the first flying body and the second flying body, is equal to or less than a first threshold.
- The information processing device according to any one of claims 1 to 3, wherein the processing unit acquires position information of the first flying body and position information of the second flying body, and presents information indicating the presence of the second flying body at a position on the screen of the first terminal corresponding to the position of the second flying body relative to the first flying body.
- The information processing device according to any one of claims 1 to 4, wherein the processing unit acquires position information of the first flying body and position information of the second flying body, and, when a first distance, being the distance between the first flying body and the second flying body, is equal to or less than a first threshold, presents to the first terminal first recommendation information recommending that the flight speed of the first flying body be limited.
- The information processing device according to claim 5, wherein the processing unit presents the first recommendation information, which recommends that the shorter the first distance, the lower the speed to which the flight of the first flying body is limited.
- The information processing device according to any one of claims 1 to 6, wherein the processing unit acquires position information of the common point of interest and position information of the first flying body, and, when a second distance, being the distance between the common point of interest and the first flying body, is equal to or less than a second threshold, presents to the first terminal second recommendation information recommending that the flight speed of the first flying body be limited.
- The information processing device according to claim 7, wherein the processing unit presents the second recommendation information, which recommends that the shorter the second distance, the lower the speed to which the flight of the first flying body is limited.
- The information processing device according to any one of claims 1 to 8, wherein the information processing device is a server further comprising a communication unit, and the processing unit acquires the first point of interest from the first terminal and the second point of interest from the second terminal via the communication unit, and transmits, via the communication unit, the information to be presented to the first terminal.
- The information processing device according to any one of claims 1 to 8, wherein the information processing device is the first terminal further comprising a communication unit and a presentation unit, and the processing unit acquires the first image from the first flying body via the communication unit, detects the first point of interest in the first image, acquires the second point of interest from the second terminal via the communication unit, and presents, on the presentation unit, the information to be presented to the first terminal.
- An information presentation instruction method in an information processing device that presents information based on a user's point of interest in an image captured by a flying body, comprising: a step of acquiring a first point of interest that a first user, operating a first terminal instructing control of a first flying body, focuses on in a first image captured by the first flying body, and a second point of interest that a second user, operating a second terminal instructing control of a second flying body, focuses on in a second image captured by the second flying body; a step of determining whether the first point of interest and the second point of interest are a common point of interest indicating the same point of interest; and a step of presenting information related to the second flying body to the first terminal if they are the common point of interest.
- The information presentation instruction method according to claim 11, further comprising a step of determining whether the first flying body is moving, wherein the step of presenting the information related to the second flying body includes a step of presenting the information related to the second flying body to the first terminal when the first flying body is moving.
- The information presentation instruction method according to claim 11 or 12, further comprising a step of acquiring position information of the first flying body and a step of acquiring position information of the second flying body, wherein the step of presenting the information related to the second flying body includes a step of presenting the information related to the second flying body to the first terminal when a first distance, being the distance between the first flying body and the second flying body, is equal to or less than a first threshold.
- The information presentation instruction method according to any one of claims 11 to 13, further comprising a step of acquiring position information of the first flying body and a step of acquiring position information of the second flying body, wherein the step of presenting the information related to the second flying body includes a step of presenting information indicating the presence of the second flying body at a position on the screen of the first terminal corresponding to the position of the second flying body relative to the first flying body.
- The information presentation instruction method according to claim 11 or 14, further comprising: a step of acquiring position information of the first flying body; a step of acquiring position information of the second flying body; and a step of presenting to the first terminal, when a first distance, being the distance between the first flying body and the second flying body, is equal to or less than a first threshold, first recommendation information recommending that the flight speed of the first flying body be limited.
- The information presentation instruction method according to claim 15, wherein the step of presenting the first recommendation information includes a step of presenting the first recommendation information, which recommends that the shorter the first distance, the lower the speed to which the flight of the first flying body is limited.
- The information presentation instruction method according to any one of claims 11 to 16, further comprising: a step of acquiring position information of the common point of interest; a step of acquiring position information of the first flying body; and a step of presenting to the first terminal, when a second distance, being the distance between the common point of interest and the first flying body, is equal to or less than a second threshold, second recommendation information recommending that the flight speed of the first flying body be limited.
- The information presentation instruction method according to claim 17, wherein the step of presenting the second recommendation information includes a step of presenting the second recommendation information, which recommends that the shorter the second distance, the lower the speed to which the flight of the first flying body is limited.
- The information presentation instruction method according to any one of claims 11 to 18, wherein the information processing device is a server; the step of acquiring the first point of interest and the second point of interest includes a step of receiving the first point of interest from the first terminal and a step of receiving the second point of interest from the second terminal; and the step of presenting the information related to the second flying body includes a step of transmitting, to the first terminal, the information to be presented to the first terminal.
- The information presentation instruction method according to any one of claims 11 to 18, wherein the information processing device is the first terminal; the step of acquiring the first point of interest and the second point of interest includes a step of receiving the first image from the first flying body, a step of detecting the first point of interest in the first image, and a step of receiving the second point of interest from the second terminal; and the step of presenting the information related to the second flying body includes a step of presenting, on a presentation unit included in the first terminal, the information to be presented to the first terminal.
- A program causing an information processing device, which presents information based on a user's point of interest in an image captured by a flying body, to execute: a step of acquiring a first point of interest that a first user, operating a first terminal instructing control of a first flying body, focuses on in a first image captured by the first flying body, and a second point of interest that a second user, operating a second terminal instructing control of a second flying body, focuses on in a second image captured by the second flying body; a step of determining whether the first point of interest and the second point of interest are a common point of interest; and a step of presenting information related to the second flying body to the first terminal if they are the common point of interest.
- A recording medium, being a computer-readable medium on which is recorded a program causing an information processing device, which presents information based on a user's point of interest in an image captured by a flying body, to execute: a step of acquiring a first point of interest that a first user, operating a first terminal instructing control of a first flying body, focuses on in a first image captured by the first flying body, and a second point of interest that a second user, operating a second terminal instructing control of a second flying body, focuses on in a second image captured by the second flying body; a step of determining whether the first point of interest and the second point of interest are a common point of interest; and a step of presenting information related to the second flying body to the first terminal if they are the common point of interest.