WO2024069789A1 - Aerial Photography System, Aerial Photography Method, and Aerial Photography Program - Google Patents
- Publication number
- WO2024069789A1 (PCT/JP2022/036120)
- Authority
- WO
- WIPO (PCT)
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/08—Arrangements of cameras
Definitions
- the present invention relates to an aerial photography system, an aerial photography method, and an aerial photography program.
- Patent Document 1 discloses a camera viewpoint display system that detects the aircraft's position and nose direction, as well as the pan and tilt angles of a camera device mounted on the aircraft, calculates the camera viewpoint from each of these pieces of data, and displays the viewpoint on a map on a monitor screen. With this system, an operator controls the aircraft's position and attitude, as well as the camera's shooting direction, while grasping the aircraft's position and heading from a ground station.
- Patent document 2 discloses a technology that automatically controls the position and shooting direction so that a specific object is tracked and photographed by multiple UAVs.
- the present invention was made in consideration of the above problems, and aims to provide an aerial photography system that reduces the labor required for photography and enables appropriate photography according to the conditions of the subject.
- an aerial photography system comprises a moving body that flies over a target area, a camera mounted on the moving body that photographs the target area, an event detection unit that detects an event based on an image captured by the camera or an input from an external system, and a photography condition determination unit that determines photography conditions including at least one of a target photography position and a target photography direction of the moving body according to the detected event.
- the shooting condition determination unit may determine the shooting conditions according to the type of the detected event.
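The per-event-type determination above can be sketched as a simple lookup from event type to shooting conditions. The event names, coordinates, and zoom values below are hypothetical placeholders, not values from the disclosure; a minimal sketch in Python:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ShootingConditions:
    # Target shooting position of the moving body (x, y, z in field coordinates).
    target_position: Optional[Tuple[float, float, float]] = None
    # Target shooting direction as (pan, tilt) angles in degrees.
    target_direction: Optional[Tuple[float, float]] = None
    # Target zoom amount for the camera (1.0 = widest).
    target_zoom: float = 1.0

# Hypothetical event-type table; a real system would derive these from the
# correspondence between game states and shooting parameters.
EVENT_CONDITIONS = {
    "kickoff": ShootingConditions((0.0, -10.0, 15.0), (0.0, -45.0), 1.0),
    "corner":  ShootingConditions((45.0, 30.0, 12.0), (-30.0, -40.0), 2.0),
    "foul":    ShootingConditions(None, None, 3.0),  # zoom in; position decided elsewhere
}

def determine_conditions(event_type: str) -> ShootingConditions:
    """Return shooting conditions for a detected event type (wide view by default)."""
    return EVENT_CONDITIONS.get(event_type, ShootingConditions())
```

A default wide view for unrecognized events keeps the camera useful even when event detection produces a type with no table entry.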
- the target shooting direction may be achieved by controlling at least one of the nose direction of the moving body and the angle of the camera relative to the moving body.
- the shooting conditions may include a target zoom amount for the camera.
- the shooting condition determination unit may determine the shooting conditions based on the operation received via the controller even if the event detection unit detects the event.
- the aerial photography system may further include a controller that accepts input of the shooting conditions by a user, and the shooting condition determination unit may determine the shooting conditions based on the input via the controller if the event detection unit has not detected an event, and may determine the shooting conditions based on the event if the event detection unit detects an event.
- the aerial photography system may include a plurality of the moving bodies, and photograph a single target area by flying the plurality of moving bodies simultaneously over the target area, and the photography condition determination unit may determine different photography conditions for each of the plurality of moving bodies.
- the shooting condition determination unit may set, for multiple moving bodies flying simultaneously, shooting conditions for shooting the same shooting range from different target shooting positions, or shooting conditions for shooting an area including the same shooting range with different zoom amounts.
- the shooting condition determination unit may determine the shooting conditions according to the ball trajectory predicted by the event detection unit as the detection result of the event.
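A ball-trajectory prediction of this kind can be illustrated with a drag-free projectile model; the shooting condition determination unit could then place the target shooting position near the predicted landing point. This is an illustrative sketch, not the disclosed prediction method:

```python
import math

def predict_landing_point(pos, vel, g=9.81):
    """Predict where a ball launched from `pos` (x, y, z in meters; z up)
    with velocity `vel` (m/s) returns to ground level (z = 0), ignoring drag.

    Returns (x, y, time_to_land)."""
    x0, y0, z0 = pos
    vx, vy, vz = vel
    # Solve z0 + vz*t - 0.5*g*t^2 = 0 and take the positive root.
    t = (vz + math.sqrt(vz * vz + 2.0 * g * z0)) / g
    return (x0 + vx * t, y0 + vy * t, t)
```

A real implementation would also account for drag and spin, and would update the prediction as new image frames arrive.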
- when an event indicating that a foul has occurred in a competition held in the target area is detected, the shooting condition determination unit may determine the shooting conditions so that the shooting range covers the ball used in the competition or the vicinity of the referee's position.
- the aerial photography system may include a plurality of moving bodies and photograph a single target area by flying them simultaneously over the target area, and when an event indicating that a foul has occurred in the competition is detected, the photography condition determination unit may determine the photography conditions so that the plurality of moving bodies photograph the ball or the vicinity of the referee's position from different target photography positions, target photography directions, or zoom amounts.
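One way to give multiple drones different target positions, directions, and zoom amounts around the same focus point (the ball or the referee) is to spread them at evenly spaced bearings. The radius, altitude, and zoom steps below are hypothetical choices for illustration:

```python
import math

def assign_foul_coverage(drone_ids, focus_point, radius=10.0, altitude=12.0, base_zoom=1.5):
    """Assign each drone a distinct position on a circle around `focus_point`,
    all looking at the same point but each with a different zoom amount."""
    n = max(len(drone_ids), 1)
    assignments = {}
    for i, drone_id in enumerate(drone_ids):
        bearing = 2.0 * math.pi * i / n  # evenly spaced around the circle
        assignments[drone_id] = {
            "target_position": (focus_point[0] + radius * math.cos(bearing),
                                focus_point[1] + radius * math.sin(bearing),
                                altitude),
            "look_at": focus_point,
            "zoom": base_zoom + i,  # each drone uses a different zoom amount
        }
    return assignments
```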
- the system may include a flight path generation unit that generates a flight path for the moving body, and the flight path generation unit may automatically generate the flight path to the target shooting position that is determined based on the event detected from the captured image.
- the flight path generation unit may generate the flight path within the court defined in the target area, may generate the flight path to the target shooting position by connecting a plurality of shooting positions that have been set in advance, and may change the shooting positions to be connected depending on the detection status of the event.
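Connecting preset shooting positions into a flight path can be modeled as a shortest-path search over an adjacency graph whose usable edges change with the event detection status. The graph below is a hypothetical fragment, not the actual layout of the embodiment's shooting positions:

```python
from collections import deque

# Hypothetical adjacency between preset shooting positions; which edges are
# usable would depend on the active geofence and the detected event.
ADJACENCY = {
    "L206": {"L207"},
    "L207": {"L206", "L208"},
    "L208": {"L207", "L210"},
    "L210": {"L208", "L209"},
    "L209": {"L210"},
}

def generate_flight_path(start, goal, adjacency=ADJACENCY):
    """Breadth-first search for the shortest chain of preset shooting
    positions from `start` to `goal`; returns None if unreachable."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adjacency.get(path[-1], ()):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None
```

Changing the shooting positions to be connected, as described above, amounts to swapping in a different adjacency map when the event status changes.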
- an aerial photography method includes an event detection step for detecting an event based on an image captured by a camera photographing a target area or an input from an external system, and a photography condition determination step for determining photography conditions including at least one of a target photography position and a target photography direction of a moving body equipped with the camera according to the detected event.
- an aerial photography program causes a computer to execute an event detection command for detecting an event based on an image acquired by a camera photographing a target area or an input from an external system, and a photography condition determination command for determining photography conditions including at least one of a target photography position and a target photography direction of a moving body equipped with the camera in accordance with the detected event.
- the computer program may be provided stored on various computer-readable recording media, or may be provided so as to be downloadable via a network such as the Internet.
- the present invention reduces the labor required for photography and enables appropriate photography according to the subject's circumstances.
- FIG. 1 is an overall configuration diagram of an aerial photography system according to an embodiment of the present invention
- FIG. 2 is a simplified external perspective view of the drone according to the embodiment.
- FIG. 3 is a functional configuration diagram of the drone according to the embodiment.
- FIG. 4(a) is a simplified front view of the exterior of the control device of the embodiment.
- FIG. 4(b) is a schematic diagram showing the direction in which the drone moves or turns in response to input from the control device.
- FIG. 5 is a functional configuration diagram of the control device according to the embodiment.
- FIG. 6 is a functional configuration diagram of the server according to the embodiment.
- FIG. 7 is a schematic diagram showing an example of shooting positions of the drone that are set in advance in the shooting target field where the drone flies.
- FIG. 8 is a schematic state transition diagram showing the transition of flight modes of the drone.
- FIG. 9 is a schematic state transition diagram showing the state transition of the drone depending on the aircraft state of the drone.
- FIG. 10 is a schematic state transition diagram showing the state transition of the drone according to the aircraft behavior state of the drone.
- FIG. 11 is a schematic state transition diagram showing state transitions of a game in a stadium as an example of a field to be photographed.
- FIG. 12 is a schematic state transition diagram showing the state transition of the offensive and defensive states in the stadium.
- FIG. 13 is a table showing an example of the correspondence between the game state in the stadium and the shooting range, camera position, shooting direction, and zoom amount captured by the drone.
- FIG. 14 is a schematic diagram showing possible shooting positions and the flight paths along which the shooting position can be transitioned.
- FIG. 15 is a flowchart of control executed during flight of the drone.
- FIG. 16 is a flowchart showing control of flight restrictions in the drone (details of S1002 in FIG. 15).
- FIG. 17 is a flowchart showing flight mode switching control of the drone (details of S1010 in FIG. 15).
- FIG. 18 is a diagram showing a first example of a screen displayed on a terminal of the aerial photography system.
- FIG. 19 is a diagram showing a second example of a screen displayed on the terminal of the aerial photography system.
- FIG. 20 is a diagram showing a third example of a screen displayed on the terminal of the aerial photography system.
- FIG. 21 is a diagram showing a fourth example of a screen displayed on the terminal of the aerial photography system.
- FIG. 1 is an overall configuration diagram of an aerial photography system 1 (hereinafter also referred to as "system 1") according to one embodiment of the present invention.
- System 1 uses a drone 100 (an example of a moving body) to take aerial photographs of a competition taking place at a stadium F (FIG. 7) (an example of a target area) or an event taking place at an event venue.
- a single system 1 may include multiple drones 100. In this case, system 1 can photograph a single stadium F by flying multiple drones simultaneously over the stadium F.
- the system 1 mainly includes a controller 200 that allows the pilot to operate the drone 100, a server 300 that manages the flight and photography of the drone 100, an external input device 600, and an external system 700.
- the drone 100 and the controller 200 are connected to each other via wireless communication (which may include communication via a base station 800).
- the controller 200 and the server 300 are connected to each other via a communication network 400 such as an internet line.
- the drone 100 acquires satellite signals from an artificial satellite 500 to determine its own position, etc.
- the external input device 600 is a device capable of transmitting and receiving information to and from the system 1, separate from the controller 200, and is composed of a mobile terminal such as a smartphone or tablet terminal.
- the external input device 600 can be operated, for example, by the manager, coach, bench players, referee, or court equipment personnel of the competition taking place at the stadium F.
- the external input device 600 has, for example, a function for receiving an emergency command to suspend filming, and the drone 100 performs an emergency evacuation based on the command.
- the external input device 600 may also receive an input to switch the flight mode of the drone 100.
- the external input device 600 may be equipped with a display device, and may display information similar to that displayed on the display unit 201 of the controller 200.
- the external input device 600 may acquire event information that occurs during the competition. The event information is referred to when the user of the external input device 600 makes an input to switch the flight mode of the drone 100.
- the external system 700 may be any system configured separately from the system 1. For example, systems such as a court facility system, a match management system, and a referee support system may be applied as systems deployed in relation to the competition held at the stadium F, and systems such as a weather observation system or an earthquake observation system deployed independently of the competition may also be applied. Multiple external systems 700 may be connected to the system 1. The system 1 may receive an emergency command to stop filming or a command to switch the flight mode of the drone 100 from the various external systems 700. In addition, the various external systems 700 may acquire event information that occurs during the competition.
- the court facilities system which is an example of the external system 700, may obtain the brightness of the captured image from the system 1, for example, and control the illuminance adjustment or blinking of the lighting in the stadium F.
- the court facilities system may also receive a request for lighting illuminance from the system 1 and control the illuminance adjustment or blinking.
- the configuration of system 1 is not limited to that shown in FIG. 1, and the drone 100, the controller 200, the server 300, and the base station 800 may each be connected to each other so that they can communicate with each other via a communication network 400 such as an Internet line.
- the drone 100 may perform wireless communication directly with the communication network 400 using a communication method such as LTE, without going through the controller 200. In that case, the drone 100, the controller 200, and the base station 800 do not need to communicate wirelessly with each other directly; it is sufficient if each can connect to the communication network 400 from its own location. This system configuration is therefore suitable for cases where the drone 100 and the controller 200 are far apart (for example, when a pilot operates the drone remotely).
- the drone 100, the controller 200, the base station 800, and the server 300 are each connected to each other so that they can communicate with each other via a communication network 400 such as an Internet line, and the drone 100 and the base station 800 may be communicatively connected to the communication network 400 by satellite communication via an artificial satellite 500.
- multiple servers 300 may be connected to one drone 100 via multiple communication networks 400, i.e., the system may be made redundant.
- with this configuration, the drone 100 can be controlled even when the controller 200 is in a remote location, making the system suitable for remote operation; however, the system is not limited to this, and can also be applied to visual line-of-sight flight in which the pilot manually controls the drone 100 while watching it.
- the device described in the above embodiment may be realized as a single device, or may be realized by multiple devices (e.g., drone 100, controller 200, cloud server 300) that are partially or completely connected by a communication network 400.
- each functional unit and memory unit of server 300 may be realized by being implemented in different servers 300, drones 100, and controllers 200 that are connected to each other by the communication network 400.
- Fig. 2 is a simplified external perspective view of the drone 100 of this embodiment.
- Fig. 3 is a functional configuration diagram of the drone 100 of this embodiment. As described above, the drone 100 takes aerial photographs of competitions held in the stadium F (Fig. 7) and events held in the event venue.
- the term "drone" refers to any flying object that has the ability to autonomously control its attitude, regardless of its power source (electric motor, engine, etc.), control method (wireless or wired; fully autonomous or partially manual), and whether it is manned or unmanned.
- Drones are also sometimes referred to as Unmanned Aerial Vehicles (UAVs), flying objects, multicopters, RPAS (Remotely Piloted Aircraft Systems), or UAS (Unmanned Aircraft Systems).
- the exterior of the drone 100 is mainly composed of a housing 101 and multiple propellers 122.
- the housing 101 is, for example, a roughly rectangular parallelepiped, but may have any shape.
- Rod-shaped connecting parts 102 extending laterally are connected to the left and right sides of the housing 101.
- the other ends of the connecting parts 102 are respectively connected to propellers 122 and motors 121 that rotate the propellers 122.
- the motors 121 are, for example, electric motors.
- the propellers 122 may be composed of a single propeller, or may be composed of multiple propellers arranged coaxially.
- the number and shape of the blades of each propeller are not particularly limited.
- a propeller guard (not shown) may be provided on the outside of the propeller 122 to prevent the propeller from interfering with obstacles.
- a photographing camera 141 is held by a camera holder 142 below the housing 101.
- an obstacle detection camera 131 is disposed on the front surface of the housing 101.
- the obstacle detection camera 131 is a so-called dual camera consisting of two cameras that form a pair.
- the obstacle detection camera 131 is disposed so as to capture an image in front of the drone 100.
- the obstacle detection camera 131 may be disposed not only on the front surface but also on all surfaces of the housing 101, for example, on six surfaces in the case of a housing 101 that is a substantially rectangular parallelepiped.
- the drone 100 is equipped with an alarm device 250 that alerts people around the drone 100 to the presence of the drone 100.
- the alarm device 250 has, for example, a warning light 251 and a speaker 252.
- the warning light 251 is provided for each propeller 122 or motor 121, and is disposed, for example, on each side of multiple motors 121.
- the warning light 251 may be disposed along the cylindrical side of the motor 121 so that it can be seen from all directions in addition to the front.
- the speaker 252 outputs an alarm sound and is provided in the housing 101 of the drone 100.
- the speaker 252 is provided, for example, on the underside of the housing 101, and transmits the alarm sound downwards of the drone 100.
- the drone 100 is equipped with an arithmetic device such as a CPU (Central Processing Unit) for executing information processing, and storage devices such as a RAM (Random Access Memory) and a ROM (Read Only Memory), and thereby has the following functional blocks: a measurement unit 110, a flight function unit 120, an obstacle detection unit 130, an imaging unit 140, and a communication unit 150.
- the measurement unit 110 is a functional unit that measures information related to the drone 100 or its surroundings.
- the measurement unit 110 has, for example, a position measurement unit 111, a direction measurement unit 112, an altitude measurement unit 113, and a speed measurement unit 114.
- the measurement unit 110 may also include various sensors that acquire information such as temperature, air pressure, wind speed, and acceleration.
- the position measurement unit 111 receives signals from the artificial satellites 500 and measures the position (absolute position) of the aircraft based on the signals.
- the position measurement unit 111 measures its current position using, for example, GNSS (Global Navigation Satellite System), GPS (Global Positioning System), or RTK-GNSS (Real Time Kinematic GNSS), but is not limited to these.
- the position information includes at least two-dimensional coordinate information in a planar view (e.g., latitude, longitude), and preferably includes three-dimensional coordinate information including altitude information.
- the base station 800 which provides information on the reference points of fixed stations used for relative positioning such as RTK, is connected to the drone 100 and the controller 200 so as to be able to communicate wirelessly with them, making it possible to measure the position of the drone 100 with greater accuracy.
- depending on the positioning method employed, the base station 800 can be omitted, or the accuracy of the position coordinate estimation by the base station 800 or the drone 100 can be further improved.
- the direction measurement unit 112 measures the orientation of the aircraft (nose direction, heading direction).
- the direction measurement unit 112 is composed of, for example, a geomagnetic sensor or a compass that measures the nose direction (heading direction) of the drone 100 by measuring geomagnetism.
- the altitude measurement unit 113 measures the altitude above the ground (hereinafter also referred to as "flight altitude") as the distance from the ground below the drone 100 (vertically downward).
- the speed measurement unit 114 detects the flight speed of the drone 100.
- the speed measurement unit 114 may measure the speed using a known sensor such as a gyro sensor.
- Flight function unit 120 is a mechanism and function unit that causes the drone 100 to fly, and generates thrust in the airframe for lifting the drone 100 and moving it in a desired direction. As shown in Figures 2 and 3, the flight function unit 120 has a plurality of motors 121, a plurality of propellers 122, and a flight control unit 123.
- the flight control unit 123 independently controls the multiple motors 121 to rotate each propeller 122, causing the drone 100 to perform various operations such as taking off, moving forward, turning, and landing, and controls the attitude angle control and flight operations of the drone 100 from takeoff, during flight, and until landing.
- the flight control unit 123 has a processing unit, also called a flight controller.
- the processing unit can have one or more processors, such as a programmable processor (e.g., a central processing unit (CPU), MPU, or DSP).
- the processing unit has access to a memory (storage unit).
- the memory stores logic, code, and/or program instructions that the processing unit can execute to perform one or more steps.
- the memory may include, for example, a separable medium such as an SD card or RAM, or an external storage device.
- Various data acquired by the measurement unit 110, or video or still image data captured by the imaging camera 141 may be directly transmitted to and stored in the memory. Each data may also be recorded in an external memory.
- the processing unit includes a control module configured to control the state of the drone 100.
- the control module controls the flight function section 120 (thrust generating section) of the drone 100 to adjust the spatial arrangement, attitude angle, angular velocity, angular acceleration, angular jerk, and/or acceleration of the drone 100 having six degrees of freedom (translational motion x, y, and z, and rotational motion θx, θy, and θz).
- the flight control unit 123 can control the flight of the drone 100 based on control signals from the controller 200 or based on a preset autonomous flight program.
- the flight control unit 123 can also control the flight of the drone 100 by controlling the motor 121 based on various information such as the field to be photographed, flight permitted/prohibited areas, information on the corresponding flight geofences, map information including two-dimensional or three-dimensional map data, the current position information of the drone 100, attitude information (heading information), speed information, and acceleration information, and any combination of these.
- the "shooting target field" or "target area" refers to a two-dimensional area (for example, the stadium F) that is the subject of shooting.
- FIG. 7 is a schematic diagram showing an example of a playing field F, which is an example of a field to be photographed by a drone, viewed from above.
- the playing field F is composed of a court F100 that is roughly rectangular and is defined by, for example, a straight outer edge, and an outer court area F200 that is a predetermined area that covers the outer edge of the court F100.
- the outer edge of the court F100 is composed of mutually opposing goal lines F110a, F110b and mutually opposing touch lines F111a, F111b that are connected at roughly right angles.
- the connection points of the goal lines F110a, F110b and the touch lines F111a, F111b are the corners F112a, F113a, F112b, F113b.
- Goals F120a, F120b are provided approximately in the center of the pair of goal lines F110a, F110b.
- Penalty areas F130a, F130b are defined in specific areas inside the court F100 adjacent to the goals F120a, F120b, and penalty lines F140a, F140b are drawn on the outer edges of the penalty areas.
- a halfway line F150 is drawn in the center of the court F100, connecting the midpoints of a pair of touchlines F111a, F111b and dividing the court F100 into approximately equal parts.
- the halfway line F150 is approximately parallel to the goal lines F110a, F110b.
- goal lines F110a, F110b, touchlines F111a, F111b, penalty lines F140a, F140b, and halfway line F150 are required by the rules for players to play the game, and therefore all of these lines are generally drawn in a manner that allows them to be seen, but the technical scope of the present invention is not limited to this.
- a soccer stadium is used as an example, but the sports that are photographed by the system of the present invention are not limited to soccer, and include any type of sports, such as tennis.
- the subject of the photography is not limited to sports, and the system can also be applied to other events (concerts, ceremonies, etc.).
- the shooting positions L101-L105, L206-L215 may be two-dimensional coordinates on a plane, or may be three-dimensional coordinate information that also defines the height at the corresponding positions.
- the flight height of the drone 100 may be manually controllable based on input from the controller 200.
- Photographing positions L101 to L105 are defined, for example, on the touchline F111b, at approximately equal intervals along the touchline F111b.
- photographing position L101 is a point located in a range including the intersection of the halfway line F150 and the touchline F111b and slightly outside the court F100.
- Photographing positions L103 and L105 are points near the corners F112a or F112b on both sides of the touchline F111b.
- Photographing positions L102 and L104 are points between photographing positions L103 and L105 and photographing position L101. Note that the above positions are merely examples, and are not limited to these and may be any appropriate positions.
- Photographing positions L206 to L215 are points defined within the court F100.
- photographing positions L206 and L211 are points near the center of the penalty lines F140a and F140b on a line parallel to the goal lines F110a and F110b, and are so-called goal-front photographing positions.
- Photographing positions L207 and L212 are photographing positions closer to the touchline F111a or F111b and closer to the halfway line F150 than photographing positions L206 and L211. More specifically, for example, photographing positions L207 and L212 are points on an imaginary line segment connecting photographing position L101 and goals F120a and F120b, and are, for example, points approximately in the center of the imaginary line segment.
- Photographing positions L209 and L215 are points that are linearly symmetrical to photographing positions L207 and L212.
- Shooting position L208 is a point between shooting position L207 and the halfway line F150.
- Shooting position L210 is a point between shooting position L209 and the halfway line F150.
- Shooting position L213 is a point between shooting position L212 and the halfway line F150.
- Shooting position L214 is a point between shooting position L215 and the halfway line F150.
- an evacuation point H200 is set to which the drone 100 is to be evacuated if an abnormality or malfunction of the drone 100 or the system 1 is detected.
- the abnormality referred to here is an abnormality related to the stability of the aerial movement of the drone 100.
- the abnormality includes, for example, a case where the calculation load associated with the operation control (behavior control, shooting control, etc.) of the drone 100 exceeds a load threshold.
- the abnormality may include a transient abnormality related to the environment, such as a case where the measured value of the behavior control value (e.g. speed) of the drone 100 exceeds an allowable value due to the influence of a strong wind or the like.
- the evacuation point H200 is set at a point different from the shooting positions L101 to L105 and L206 to L215, and in this embodiment, it is set outside the touchline F111a and along the touchline F111a. There may be multiple evacuation points H200, and in this embodiment, there are three.
- the evacuation point H220 is set near the extension of the halfway line F150.
- the evacuation points H210 and H230 are set closer to the goals F120a and F120b than the shooting positions L206 and L211.
- the evacuation points H210 and H230 are set at the ends of an area partitioned by a geofence G200, which will be described later, for example.
- at the evacuation points, the drone 100 itself is replaced or the battery installed in the drone 100 is replaced.
- a geofence indicates a virtual boundary line that divides an area, and in particular, the geofence in this embodiment indicates a fence that is the boundary line between a flight-permitted area, where the drone 100 is permitted to fly or move, and a no-fly area.
- a geofence is a boundary line that divides an area that extends three-dimensionally, including in plane and height. If a moving object such as the drone 100 comes into contact with a geofence, flight or movement is restricted to prevent the aircraft from flying outside the flight-permitted area.
- the boundary line of the geofence in the height direction may include an upper limit and a lower limit.
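A geofence check with horizontal bounds plus upper and lower altitude limits can be sketched as follows. Real fences may be arbitrary three-dimensional regions, and the coordinates below are hypothetical, though G200's altitude floor is set higher than G100's as in this embodiment:

```python
def inside_geofence(position, fence):
    """Return True if (x, y, z) lies within a box-shaped geofence that has
    horizontal bounds and both an upper and a lower altitude limit."""
    x, y, z = position
    return (fence["x_min"] <= x <= fence["x_max"]
            and fence["y_min"] <= y <= fence["y_max"]
            and fence["z_min"] <= z <= fence["z_max"])

# Hypothetical fences (meters): G100 along the touchline, G200 over the court.
# Note G200's altitude lower limit (z_min) is higher than G100's.
G100 = {"x_min": -5, "x_max": 110, "y_min": -5, "y_max": 5, "z_min": 3, "z_max": 30}
G200 = {"x_min": 0, "x_max": 105, "y_min": 0, "y_max": 68, "z_min": 10, "z_max": 30}
```

Switching the fence applied to the drone then reduces to evaluating the same check against a different fence definition when the flight mode changes.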
- the geofences G100, G200 that are applied to whether or not flight is permitted are switched according to the control of the system 1 while the drone 100 is flying.
- the number of geofences G100, G200 depicted in the figure is two, but the number is arbitrary, and specifically may be three or more.
- the geofence G100 is an area that includes the shooting positions L101 to L105, and defines an area that includes the touchline F111b and the area nearby. In other words, the geofence G100 is defined near the outer edge of the court F100, and a portion of it extends into the outer court area F200.
- the geofence G100 is a geofence that is primarily applied in the outer edge flight mode M102, which will be described later.
- the geofence G200 is an area that includes the shooting positions L206 to L215, and is set at least inside the court F100. This geofence G200 is a geofence that is primarily applied in the on-court flight mode M105, which will be described later.
- the areas defined by the multiple geofences G100, G200 at least partially contact or overlap each other.
- the areas defined by the multiple geofences G100, G200 also overlap in the height direction.
- the heights of the multiple geofences G100, G200 may differ from each other. Specifically, the lower limit of the altitude of the geofence G200 set inside the stadium F is set higher than the lower limit of the altitude of the geofence G100 set on the outer edge of the stadium F.
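The three-dimensional geofence described above, a planar boundary combined with lower and upper altitude limits, can be sketched as a simple containment test. All names, coordinates, and bounds below are hypothetical illustration values, not values from this disclosure; an axis-aligned box stands in for an arbitrary boundary shape.

```python
from dataclasses import dataclass

@dataclass
class Geofence:
    """A flight-permitted area: planar extent plus altitude limits (illustrative)."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    alt_min: float  # lower altitude limit
    alt_max: float  # upper altitude limit

    def contains(self, x: float, y: float, alt: float) -> bool:
        """True if the point lies inside the flight-permitted area."""
        return (self.x_min <= x <= self.x_max
                and self.y_min <= y <= self.y_max
                and self.alt_min <= alt <= self.alt_max)

# Hypothetical bounds: the on-court fence G200 has a higher altitude floor
# than the outer-edge fence G100, as the description above states.
G100 = Geofence(-5.0, 110.0, -5.0, 73.0, 2.0, 30.0)
G200 = Geofence(0.0, 105.0, 0.0, 68.0, 10.0, 30.0)
```

With these example bounds, a point at 5 m altitude is inside G100 but below the floor of G200, matching the differing lower limits described above.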
- the obstacle detection unit 130 is a functional unit that detects obstacles around the drone 100.
- the obstacles may include, for example, people, for example, players, objects, animals such as birds, fixed equipment, and a ball.
- the obstacle detection unit 130 measures the position, speed vector, and the like of an obstacle located, for example, below the drone 100 based on the acquired image.
- the obstacle detection unit 130 includes, for example, an obstacle detection camera 131, a ToF (Time of Flight) sensor 132, and a laser sensor 133.
- the ToF sensor 132 measures the time it takes for a laser pulse emitted from the sensor to return to the light-receiving element in the sensor, and determines the distance to an object by converting this round-trip time into distance.
- the laser sensor 133 uses, for example, the LiDAR (Light Detection And Ranging) method to shine light such as near-infrared light, visible light, or ultraviolet light on the target object and measure the distance by capturing the reflected light with an optical sensor.
- FIG. 2 shows that the obstacle detection camera 131 is positioned facing forward, but the type, position and number of the camera 131, ToF sensor 132 and laser sensor 133 are arbitrary, and the ToF sensor 132 or laser sensor 133 may be positioned instead of the camera 131, or the ToF sensor 132 or laser sensor 133 may be provided on all six surfaces of the housing 101, i.e., the front, back, top, bottom and both sides.
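The ToF principle described above, converting a laser pulse's round-trip time into distance, follows d = c·t/2. A minimal sketch (the function name is ours, not from the disclosure):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """Convert a ToF round-trip time into a one-way distance in metres.

    The pulse travels to the object and back, so the one-way distance is
    half the total path length.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0
```

For example, a 20 ns round trip corresponds to roughly 3 m to the obstacle.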
- the photographing unit 140 is a functional unit that photographs images of a competition in the stadium F (FIG. 7) or an event at an event venue, and includes a photographing camera 141, a camera holding unit 142, and a photographing control unit 143. As shown in FIG. 2, the photographing camera 141 (imaging device) is disposed at the bottom of the main body of the drone 100, and outputs image data related to a peripheral image photographed around the drone 100.
- the photographing camera 141 is a video camera (color camera) that photographs moving images.
- the moving images may include audio data acquired by a microphone (not shown). In addition to or instead of this, the photographing camera 141 may also be configured to photograph still images.
- the orientation of the photographic camera 141 (the attitude of the photographic camera 141 relative to the housing 101 of the drone 100) can be adjusted by a camera actuator (not shown) built into the camera holding unit 142.
- the photographic camera 141 may have an automatic control function for parameters such as exposure, contrast, or ISO.
- the camera holding unit 142 may have a so-called gimbal control mechanism that suppresses the transmission of shaking or vibration of the aircraft to the photographic camera 141.
- the photographic control unit 143 controls the photographic camera 141 and the camera holding unit 142 to adjust the orientation of the photographic camera 141, the photographic magnification (zoom amount), the camera's photographic conditions, etc.
- Image data acquired by the photographic camera 141 can be transmitted to the memory unit of the drone 100 itself, the controller 200, the server 300, etc.
- the communication unit 150 is capable of radio wave communication via the communication network 400 and includes, for example, a radio wave communication module.
- the communication unit 150 is capable of communication with the controller 200 and the like via the communication network 400 (including the wireless base station 800).
- FIG. 4 is a front view of the appearance of the controller 200 of this embodiment in a simplified manner.
- FIG. 5 is a functional configuration diagram of the controller 200 of this embodiment.
- the controller 200 is a mobile information terminal that controls the drone 100 in response to the pilot's operations and displays information received from the drone 100 (e.g., position, altitude, remaining battery level, camera image, etc.).
- the flight state (altitude, attitude, etc.) of the drone 100 may be remotely controlled by the pilot via the controller 200, or may be autonomously controlled by the drone 100.
- the drone 100 performs autonomous flight.
- manual operation may be possible during basic operations such as takeoff and return, and in an emergency.
- the controller 200 includes a display unit 201 and an input unit 202 as a hardware configuration.
- the display unit 201 and the input unit 202 are connected to each other so that they can communicate with each other wired or wirelessly.
- the display unit 201 may be configured as a touch panel or liquid crystal monitor that is integrated into the controller 200, or may be configured as a display device such as a liquid crystal monitor, tablet terminal, or smartphone that is connected to the controller 200 wired or wirelessly.
- the display unit 201 as a hardware configuration may be configured as a touch panel display by integrally incorporating an element that accepts input such as touch.
- the input unit 202 is a mechanism through which the pilot inputs operational commands such as flight direction and takeoff/landing when piloting the drone 100. As shown in FIG. 4A, the input unit 202 has a left slider 326L, a right slider 326R, a left input stick 327L, a right input stick 327R, a power button 328, and a return button 329.
- the left slider 326L and the right slider 326R are operators that accept, for example, a 0/1 input or a one-dimensional stepless or stepwise input; the pilot slides them with the left and right index fingers while holding the controller 200 in his or her hands.
- the left input stick 327L and the right input stick 327R are operators that accept an input of multi-dimensional stepless or stepwise information, and are, for example, so-called joysticks.
- the left input stick 327L and the right input stick 327R may also accept an input of 0/1 by pressing them.
- the power button 328 and the return button 329 are operators that accept press input, and are constituted by mechanical switches or the like.
- the left input stick 327L and the right input stick 327R accept input operations that instruct the three-dimensional flight operations of the drone 100, including, for example, takeoff, landing, ascent, descent, right turn, left turn, forward movement, backward movement, left movement, and right movement.
- FIG. 4B is a schematic diagram showing the movement direction or rotation direction of the drone 100 corresponding to each input of the left input stick 327L and right input stick 327R shown in FIG. 4A. Note that this correspondence is an example.
- the controller 200 includes a processor such as a CPU for executing information processing, and storage devices such as RAM and ROM, which constitute the software configuration of the main functional blocks of the display control unit 210, input control unit 220, and communication unit 240.
- the display control unit 210 displays to the pilot the drone 100 or the status information of the drone 100 acquired from the server 300.
- the display control unit 210 can display images relating to various information such as the shooting target field, flight permitted/prohibited areas, flight geofence, map information, current position information of the drone 100, attitude information (directional information), speed information, acceleration information, and remaining battery power.
- the "current position information" referred to here need only include information on the horizontal position of the current position of the drone 100 (i.e., latitude and longitude), and need not include altitude information (absolute altitude or relative altitude).
- the display control unit 210 has a mode display unit 211 and a shooting status display unit 212.
- the mode display unit 211 is a functional unit that displays at least the state, i.e., the mode, to which the drone 100 belongs on the display unit 201.
- the mode to which the drone 100 belongs is, for example, the flight mode shown in FIG. 8, but instead of or in addition to this, the aircraft state shown in FIG. 9, the aircraft action state shown in FIG. 10, the game state shown in FIG. 11, or the offensive and defensive states shown in FIG. 12 may be displayed on the display unit 201.
- the screen G1 displayed on the display unit 201 displays, for example, a display field G21 for the flight mode to which the drone 100 belongs, as well as a status display field G22 showing the aircraft status, aircraft behavior status, match status, and offensive/defensive status.
- the shooting status display unit 212 is a functional unit that displays, on the display unit 201, an image captured by the imaging camera 141 mounted on the drone 100.
- the screen G1 displayed on the display unit 201 displays, for example, an image field G40 in which an image being captured by the drone 100 is displayed.
- the screen G1 and each state will be described in detail later.
- the input control unit 220 shown in FIG. 5 receives various inputs from a user such as a pilot.
- the input control unit 220 of this embodiment mainly has the following functional units: an aircraft position operation unit 221, an aircraft attitude operation unit 222, a camera attitude operation unit 223, a camera zoom operation unit 224, a flight mode switching unit 225, a target position receiving unit 226, a power supply input unit 227, and a return input unit 228.
- the aircraft position operation unit 221 includes an up/down movement input unit 221a and a left/right movement input unit 221b.
- the aircraft attitude operation unit 222 includes a forward/backward movement input unit 222a and a yaw rotation input unit 222b.
- the up-down movement input unit 221a is an input unit for allowing the pilot to move the drone 100 up and down, and acquires input to the right input stick 327R. That is, when the right input stick 327R is moved upward (toward the back when held in the hand), the drone 100 rises, and when the right input stick 327R is moved downward (toward the front when held in the hand), the drone 100 descends.
- the left-right movement input unit 221b is an input unit for allowing the pilot to move the drone 100 left and right, and acquires input to the right input stick 327R. That is, when the right input stick 327R is moved to the right, the drone 100 moves to the right, and when the right input stick 327R is moved to the left, the drone 100 moves to the left.
- the forward/backward movement input unit 222a is an input unit for allowing the pilot to move the drone 100 forward/backward, and acquires input to the left input stick 327L. That is, when the left input stick 327L is moved upward (toward the rear when held in the hand), the drone 100 moves forward, and when the left input stick 327L is moved downward (toward the front when held in the hand), the drone 100 moves backward.
- the yaw rotation input unit 222b is an input unit for allowing the pilot to yaw rotate the drone 100, and acquires input to the left input stick 327L. That is, when the left input stick 327L is moved to the right, the drone 100 turns right, and when the left input stick 327L is moved to the left, the drone 100 turns left.
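The stick-to-motion assignment spelled out in the preceding paragraphs can be captured as a lookup table. This sketch reflects the example correspondence above; as the description notes, the mapping itself is only one example, and the command names here are ours.

```python
# Stick deflection -> drone motion, following the assignment described above:
# right stick up/down -> ascend/descend, right stick left/right -> lateral move,
# left stick up/down -> forward/backward, left stick left/right -> yaw turn.
STICK_COMMANDS = {
    ("right", "up"): "ascend",
    ("right", "down"): "descend",
    ("right", "left"): "move_left",
    ("right", "right"): "move_right",
    ("left", "up"): "forward",
    ("left", "down"): "backward",
    ("left", "left"): "turn_left",
    ("left", "right"): "turn_right",
}

def stick_to_command(stick: str, direction: str) -> str:
    """Resolve a stick deflection to a drone motion command."""
    return STICK_COMMANDS[(stick, direction)]
```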
- the camera attitude operation unit 223 is an input unit for operating the camera holding unit 142 via the shooting control unit 143 and for controlling the orientation of the shooting camera 141 relative to the housing 101 of the drone 100.
- the camera attitude operation unit 223 obtains input to the right slider 326R.
- the camera attitude operation unit 223 accepts operation of either or both of the pitch angle and yaw angle of the shooting camera 141 relative to the housing 101.
- the camera zoom operation unit 224 is an input unit for operating the shooting magnification of the shooting camera 141, i.e., the zoom amount, and obtains input to the left slider 326L.
- the flight mode switching unit 225 is an input unit for switching flight modes. Flight modes selectable by the flight mode switching unit 225 include at least, for example, the outer edge flight mode M102 (see FIG. 8), the inside court flight mode M105 (see FIG. 8), and the fixed position flight mode M103 or M107 (see FIG. 8).
- the flight mode switching unit 225 accepts switching of flight modes via, for example, a touch panel display integrated with the display unit 201.
- the target position receiving unit 226 is a functional unit that receives input of a target shooting position to which the drone 100 should head.
- the target position receiving unit 226 receives input of a point on the stadium F. For example, when at least a portion of an image or schematic diagram of the stadium F is displayed on the display unit 201, the target position receiving unit 226 may receive input of the target shooting position via a touch panel display that is configured integrally with the display unit 201.
- the target position receiving unit 226 may receive a selection input of a target shooting position when a point that can be selected as the target shooting position, i.e., a shooting position, is specified in advance.
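Where selectable shooting positions are specified in advance, a point touched on the display can be snapped to the nearest predefined position. A minimal sketch; the planar coordinates assigned to the shooting positions are hypothetical illustration values.

```python
import math

# Hypothetical planar coordinates (metres) for a few predefined shooting
# positions; the real positions are defined relative to the stadium F.
SHOOTING_POSITIONS = {
    "L206": (10.0, 34.0),
    "L210": (52.5, 34.0),
    "L215": (95.0, 34.0),
}

def snap_to_shooting_position(x: float, y: float) -> str:
    """Return the name of the predefined shooting position nearest the touch point."""
    return min(SHOOTING_POSITIONS,
               key=lambda name: math.dist((x, y), SHOOTING_POSITIONS[name]))
```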
- the flight modes of the drone 100 mainly include an advance preparation mode M100, an off-court takeoff and landing mode M101, an outer edge flight mode M102, an off-court fixed position flight mode M103, an on-court entry mode M104, an on-court flight mode M105, an off-court exit mode M106, an on-court fixed position flight mode M107, and an on-court takeoff and landing mode M108.
- the advance preparation mode M100 is a mode in which advance settings such as geofence settings are made.
- from the advance preparation mode M100, the drone transitions to the off-court takeoff and landing mode M101.
- in this off-court takeoff and landing mode M101, the drone 100 takes off from point L101g (see FIG. 14). Note that in the off-court takeoff and landing mode M101, the drone 100 may take off from a point outside the court F100 other than point L101g.
- the off-court takeoff and landing mode M101 is the mode to which the drone 100 belongs when control starts or ends.
- the drone 100 transitions from the off-court takeoff and landing mode M101 to the outer edge flight mode M102.
- the outer edge flight mode M102 is a mode in which the drone flies along part or all of the outer edge of the court F100 to photograph the stadium F, and more specifically, flies at one of the photographing positions L101 to L105 (FIG. 14) to photograph.
- the outer edge flight mode M102 is a mode in which the drone flies above the touchline F111b.
- the "outer edge" on which the outer edge flight mode M102 flies is a concept that includes not only directly above the touchline F111b but also slightly outside the court F100.
- the drone 100 receives user instructions via the target position receiving unit 226 of the controller 200 and flies to the specified one of the shooting positions L101 to L105.
- the shooting direction may be manually controlled according to the user's instructions, or may be fixed at a specified angle.
- the drone 100 may also change its shooting position while keeping the shooting direction fixed, in a so-called dolly shot, for example to follow and photograph a specific player.
- the outer edge flight mode M102 can transition to the off-court takeoff and landing mode M101, the off-court fixed position flight mode M103, or the on-court entry mode M104.
- the off-court fixed position flight mode M103 is a mode in which the drone 100 flies in a fixed position outside the area of the court F100.
- the off-court fixed position flight mode M103 can transition to the outer edge flight mode M102.
- the on-court entry mode M104 is a mode in which the drone 100 performs a series of processes required for entering the area of the court F100. The drone 100 transitions to the on-court flight mode M105 via the on-court entry mode M104.
- the on-court flight mode M105 is a mode in which the drone flies above the court F100 to photograph the stadium F, and more specifically, flies at one of the photographing positions L206 to L215 (FIG. 7) to photograph.
- the drone accepts a user command to select a photographing position via the target position receiving unit 226 of the controller 200, and flies at one of the specified photographing positions L206 to L215.
- the photographing direction may be manually controlled according to the user's instructions, or may be fixed at a predetermined angle.
- the on-court flight mode M105 can transition to an off-court exit mode M106, an on-court fixed position flight mode M107, or an on-court takeoff and landing mode M108.
- the off-court exit mode M106 is a mode in which the drone 100 performs a series of processes required for the drone 100 to exit the area of the court F100.
- the drone 100 transitions from the off-court exit mode M106 to the outer edge flight mode M102. Note that the off-court exit mode M106 and the on-court entry mode M104 can transition back and forth.
- the on-court fixed position flight mode M107 is a mode in which the drone flies in a fixed position within the area of the court F100.
- the on-court fixed position flight mode M107 can transition to the on-court flight mode M105.
- the on-court takeoff and landing mode M108 is a mode in which the drone takes off and lands within the area of the court F100, and is a mode to which the drone transitions mainly when a command to land on the spot is issued by manual intervention.
- a drone that takes off in the on-court takeoff and landing mode M108 transitions to the on-court flight mode M105.
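The flight-mode transitions listed above (M100 through M108) form a small state machine. A guard that rejects transitions not named in the description could look like this sketch; the mode identifiers follow the description, while the table representation and function are ours.

```python
# Allowed flight-mode transitions, transcribed from the description above.
FLIGHT_MODE_TRANSITIONS = {
    "M100": {"M101"},                    # advance preparation -> off-court takeoff/landing
    "M101": {"M102"},                    # off-court takeoff/landing -> outer edge flight
    "M102": {"M101", "M103", "M104"},    # outer edge flight
    "M103": {"M102"},                    # off-court fixed position flight
    "M104": {"M105", "M106"},            # on-court entry (M104 <-> M106 back and forth)
    "M105": {"M106", "M107", "M108"},    # on-court flight
    "M106": {"M102", "M104"},            # off-court exit
    "M107": {"M105"},                    # on-court fixed position flight
    "M108": {"M105"},                    # on-court takeoff/landing
}

def can_transition(current: str, target: str) -> bool:
    """True if the description above permits a direct transition."""
    return target in FLIGHT_MODE_TRANSITIONS.get(current, set())
```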
- the power input unit 227 is a functional unit that accepts the power on/off command for the controller 200 via the power button 328.
- the return input unit 228 is a functional unit that accepts a command to return the drone 100 located in the stadium F (FIG. 7) to the target landing point L101g (see FIG. 14) via the return button 329.
- the input control unit 220 may be capable of receiving touch input to the display unit 201 and transmitting control commands to the drone 100 in response to the input. More specifically, for example, when the user selects appropriate information such as a map or schematic diagram displayed on the display unit 201, a route to the selected point may be automatically generated, causing the drone 100 to fly autonomously.
- the communication unit 240 is a functional unit that transmits and receives signals between the controller 200 and an appropriate configuration included in the system 1.
- the controller 200 has a communication function that performs wireless communication with the drone 100 using, for example, Wi-Fi in the 2.4 GHz and 5.6 to 5.8 GHz frequency bands.
- the controller 200 also has a wireless communication function that can communicate with the server 300 via the communication network 400 using a communication standard such as LTE (Long Term Evolution).
- the communication unit 240 transmits various input signals by a user such as a pilot to the drone 100 or the server 300.
- the communication unit 240 also receives signals from the drone 100 or the server 300.
- (A-1-4. Server 300) (A-1-4-1. Overview of Server 300) FIG. 6 is a functional configuration diagram of the server 300 according to the present embodiment.
- the server 300 manages or controls the flight and photography of the drone 100.
- the server 300 includes an input/output unit (not shown) for inputting or outputting various types of information (image output, audio output).
- the server 300 may be a general-purpose computer such as a workstation or personal computer, or may be logically realized by cloud computing.
- the server 300 is equipped with a calculation device such as a CPU for executing information processing, and storage devices such as RAM and ROM, and as a software configuration, it mainly configures the following functional blocks: a presetting unit 310, an event detection unit 320, a photography condition determination unit 325, a flight mode switching unit 330, an outer edge flight control unit 340, an in-court flight control unit 350, a fixed position flight control unit 360, a communication unit 370, and a memory unit 380.
- the pre-setting unit 310 is a functional unit that performs the settings necessary for the flight of the drone 100 before the drone 100 flies over the field to be photographed.
- the presetting unit 310 mainly includes a geofence setting unit 311 .
- the geofence setting unit 311 is a functional unit that sets the geofence of the drone 100.
- the geofence includes information on the planar direction and the height direction.
- the geofence setting unit 311 sets a geofence according to the flight mode. That is, for example, the geofence setting unit 311 activates the geofence G100 (see FIG. 7) in the outer edge flight mode M102 (see FIG. 8). The geofence setting unit 311 also activates the geofence G200 (see FIG. 7) in the on-court flight mode M105 (see FIG. 8). The geofence setting unit 311 also sets a geofence different from the geofence G100 or geofence G200, a so-called third geofence, in the intermediate modes that are intermediate in the transition between the outer edge flight mode M102 and the on-court flight mode M105, that is, the off-court exit mode M106 and the on-court entry mode M104.
- the third geofence is a geofence that covers a combined area that combines the first area defined by the geofence in the first flight mode and the second area defined by the geofence in the second flight mode.
- the geofence G100 of the outer edge flight mode M102 and the geofence G200 of the on-court flight mode M105 overlap. Therefore, the third geofence is a geofence that covers the combined area of the geofences G100 and G200.
- the third geofence may be a geofence that covers an area that combines a first area defined by the first geofence corresponding to the first flight mode, a second area defined by the second geofence corresponding to the second flight mode, and a gap between the first area and the second area.
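The per-mode geofence switching and the "third geofence" covering the combined area can be sketched as follows. Fences are modeled as containment callables; the mode-to-fence policy mirrors the description, while this representation and the function names are assumptions.

```python
def make_union_fence(fence_a, fence_b):
    """Build the 'third geofence': a containment test over the combined area
    of two fences (each fence is a callable (x, y, alt) -> bool)."""
    return lambda x, y, alt: fence_a(x, y, alt) or fence_b(x, y, alt)

def active_fence(flight_mode, g100, g200):
    """Select the geofence applied in each flight mode, per the description:
    G100 in the outer edge flight mode, G200 in the on-court flight mode,
    and their union in the intermediate entry/exit modes."""
    if flight_mode == "M102":            # outer edge flight mode
        return g100
    if flight_mode == "M105":            # on-court flight mode
        return g200
    if flight_mode in ("M104", "M106"):  # on-court entry / off-court exit
        return make_union_fence(g100, g200)
    raise ValueError(f"no geofence policy sketched for {flight_mode}")
```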
- the event detection unit 320 is a functional unit that detects the state of the subject to be photographed or the drone 100.
- the event detection unit 320 detects an event based on the camera image of the photographing camera 141 or an input from the external system 700.
- the detection criteria for each event are stored in, for example, the storage unit 380, and the event detection unit 320 detects an event by referring to the storage unit 380.
- the event detection unit 320 may also detect an event by analysis using a neural network.
- the detection process by the event detection unit 320 can be performed using any known appropriate image analysis technology.
- the event detection unit 320 detects events that trigger a change in the flight mode or shooting conditions of the drone 100.
- the event detection unit 320 mainly has an aircraft state acquisition unit 321 , an aircraft action state acquisition unit 322 , a game state acquisition unit 323 , and an offensive/defensive state acquisition unit 324 .
- the aircraft status acquisition unit 321 is a functional unit that acquires the aircraft status of the drone 100.
- FIG. 9 is a diagram showing the state transitions of the aircraft state of the drone 100.
- the aircraft state is broadly divided into, for example, a normal operation flight mode M200, a detection and judgment mode M210, and an action mode M220.
- when the drone 100 starts flying, it transitions to the normal operation flight mode M200.
- the detection and judgment mode M210 includes an abnormality detection mode M211, a failure detection mode M212, a manual intervention mode M213, and a low battery mode M214.
- if an abnormality is detected in the normal operation flight mode M200, the mode transitions to the abnormality detection mode M211.
- This abnormality is a transient, in other words, reversible disturbance such as a drop in radio wave strength or strong winds. If the abnormality is resolved in abnormality detection mode M211, the mode transitions to normal operation flight mode M200.
- if a failure is detected, the drone 100 transitions to the failure detection mode M212. If a manual control command is received, the drone transitions to the manual intervention mode M213, and if it is detected that the remaining battery charge is less than a predetermined value, the drone transitions to the low battery mode M214. In addition, if a manual control command is received in the abnormality detection mode M211, the failure detection mode M212, or the low battery mode M214, the drone transitions to the manual intervention mode M213. From each state of the detection and judgment mode M210, the drone 100 transitions to the corresponding action mode M220.
- the action mode M220 is a state in which the drone 100 performs a series of actions that are preset for each state.
- the action mode M220 includes a landing mode M221 at an evacuation point, an emergency stop mode M222, a landing on the spot mode M223, a return mode M224, and a fixed position flight mode M225.
- the landing at evacuation point mode M221 is set to fly the drone 100 to the evacuation point H200 and land it.
- the landing at evacuation point mode M221 is entered when the abnormality is not resolved in the abnormality detection mode M211.
- the emergency stop mode M222 is set to stop the propellers 122 on the spot. In the emergency stop mode M222, the drone 100 falls freely.
- the emergency stop mode M222 can be selected in the manual intervention mode M213 when the propellers 122 are about to come into contact with a person or object.
- the on-site landing mode M223 is set to perform a soft landing on the spot.
- the return mode M224 is set to return to the takeoff and landing point.
- the fixed position flight mode M225 is a state in which the drone flies at a fixed position, and can transition to the normal operation flight mode M200 based on a user operation.
- the user operation is input, for example, by selecting a button displayed on the display unit 201.
- in the fixed position flight mode M225, if an event that can cause a transition from the normal operation flight mode M200 to the detection and judgment mode M210 (i.e., an abnormality, a malfunction, manual intervention, or a low battery) is detected, the drone 100 transitions from the fixed position flight mode M225 to the corresponding state of the detection and judgment mode M210 via the normal operation flight mode M200.
- the drone 100 in the fixed position flight mode M225 can transition to the return mode M224 based on a user operation.
- the drone 100 in the abnormality detection mode M211 and the failure detection mode M212 transitions to a landing mode M221 at an evacuation point.
- the drone 100 in the manual intervention mode M213 transitions to one of the following states depending on the input command: landing mode M221 at an evacuation point, emergency stop mode M222, landing on the spot mode M223, return mode M224, and fixed position flight mode M225.
- the drone 100 in the low battery mode M214 transitions to the return mode M224.
- the drone 100 in the normal operation flight mode M200 can also transition to the return mode M224 based on a user operation.
- the user operation is input, for example, by selecting a button displayed on the display unit 201.
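The transitions from the detection and judgment modes to the action modes described above amount to a small dispatch table: abnormality (M211) and failure (M212) lead to landing at an evacuation point (M221), low battery (M214) to return (M224), and manual intervention (M213) to whichever of the five actions the operator selects. A sketch; the function name and error handling are ours.

```python
from typing import Optional

# Default action-mode transitions from each detection/judgment mode,
# as described above; manual intervention (M213) is user-selected.
DEFAULT_ACTION = {
    "M211": "M221",  # abnormality not resolved -> land at evacuation point
    "M212": "M221",  # failure detected        -> land at evacuation point
    "M214": "M224",  # low battery             -> return mode
}

MANUAL_CHOICES = {"M221", "M222", "M223", "M224", "M225"}

def action_for(detection_mode: str, manual_choice: Optional[str] = None) -> str:
    """Resolve the action mode entered from a detection/judgment mode."""
    if detection_mode == "M213":
        if manual_choice not in MANUAL_CHOICES:
            raise ValueError("manual intervention requires a valid action choice")
        return manual_choice
    return DEFAULT_ACTION[detection_mode]
```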
- the aircraft behavior state acquisition unit 322 is a functional unit that acquires the aircraft behavior state of the drone 100.
- Each mode of the aircraft behavior state is a sub-mode of the aircraft state that is performed to realize a transition of the aircraft state.
- FIG. 10 is a diagram showing the state transitions of the aircraft behavior states.
- the aircraft's behavioral states are broadly divided into a takeoff mode M300, an evacuation mode M310, a normal mode M320, and a landing mode M340, for example.
- Takeoff mode M300 is a mode in which drone 100 takes off.
- the state transition of the aircraft's behavior state starts from takeoff mode M300.
- the aircraft's behavior state transitions from takeoff mode M300 to evacuation mode M310 or normal mode M320.
- Evacuation mode M310 mainly includes evacuation point arrival stationary mode M311 and evacuation moving mode M312.
- Normal mode M320 also includes point arrival stationary mode M321 and moving mode M322.
- Evacuation mode M310 and normal mode M320 can transition to each other via temporary suspension mode M330. This is just one example.
- the evacuation point arrival stationary mode M311 is a mode in which the drone moves to the evacuation point H200 and remains stationary there, i.e., hovers.
- when moving to another point, the drone 100 in the evacuation point arrival stationary mode M311 transitions to the evacuation moving mode M312.
- the aircraft behavior state transitions from the evacuation point arrival stationary mode M311 or the evacuation moving mode M312 to the temporary suspension mode M330.
- Point arrival stationary mode M321 is a mode in which the drone moves to a specified destination and remains stationary on the spot, i.e., hovers.
- when moving to another location under normal use conditions, the drone 100 in the point arrival stationary mode M321 transitions to the moving mode M322.
- the aircraft behavior state transitions from point arrival stationary mode M321 or moving mode M322 to temporary suspension mode M330.
- the drone 100 in the evacuation point arrival stationary mode M311, the point arrival stationary mode M321, the moving mode M322, and the temporary suspension mode M330 can transition to the landing mode M340.
- processing of the aircraft behavior state ends in the landing mode M340.
- the game status acquisition unit 323 shown in FIG. 6 is a functional unit that acquires the game status of the competition held at the stadium F.
- the game status acquisition unit 323 detects the game status by performing image processing on the captured image.
- the game status acquisition unit 323 may also acquire the game status based on decision-related information input by the umpire to the external input device 600 or the umpire support system, which is an example of the external system 700.
- the game status acquisition unit 323 may acquire the game status based on information input from the external input device 600 held by a team member, for example, the manager or coach.
- FIG. 11 is a diagram showing an example of state transitions in a match state.
- the match state includes a pre-match state M400, a normal play state M410, and an end-of-match state M460.
- the state transition starts from the pre-match state M400, and transitions from the pre-match state M400 to the normal play state M410.
- the normal play state M410 is a state in which the game is progressing. When the match ends, the state transitions from the normal play state M410 to the end-of-match state M460. Note that a transition from the normal play state M410 to the end-of-match state M460 may occur not only at the end of the match, but also during a break in the match, such as halftime.
- The match state also includes a play suspended without foul play state M420 and a play suspended with foul play state M440.
- A transition to the play suspended without foul play state M420 occurs, for example, when the ball crosses the goal line F110a, F110b or the touch line F111a, F111b and goes out of the court.
- From the play suspended without foul play state M420, a transition to a throw-in state M421, a goal kick state M422, or a corner kick state M423 occurs in accordance with events that occur under the rules of the game, such as the type of line the ball crossed or the affiliation of the player who kicked the ball out of the court.
- The throw-in state M421, the goal kick state M422, and the corner kick state M423 each transition back to the normal play state M410.
- When a foul occurs, a transition to the foul state M431 occurs.
- When an offside occurs or is recognized by the referee, a transition to the offside state M432 occurs.
- A transition then occurs from the foul state M431 or the offside state M432 to the play suspended with foul play state M440.
- From the play suspended with foul play state M440, a transition to the free kick state M441 or the penalty kick state M442 occurs depending on the location where the foul occurred and the event that occurred.
- Note that a so-called indirect free kick may be performed instead of a free kick.
- The free kick state M441 may also be subdivided into a free kick state for the attacking side and a free kick state for the defending side.
- In the free kick state M441 and the penalty kick state M442, when the event in each state ends, the match is resumed and the match state transitions to the normal play state M410.
- The normal play state M410 then transitions to the end-of-match state M460, and the state transition for the match state ends.
- The normal play state M410 may also transition to a penalty shootout state M443. Although not shown in the figure, the penalty shootout state M443 may transition to the end-of-match state M460, thereby terminating the state transition.
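The match-state transitions of FIG. 11 can be expressed as a transition table. The sketch below is an illustration only; it includes just the transitions stated in the text, and the state names are assumed labels for the M4xx identifiers.

```python
# Assumed transition table for the match states of FIG. 11.
GAME_TRANSITIONS = {
    "M400_pre_match": {"M410_normal_play"},
    "M410_normal_play": {"M460_end_of_match", "M420_suspended_no_foul",
                         "M431_foul", "M432_offside", "M443_pk_shootout"},
    "M420_suspended_no_foul": {"M421_throw_in", "M422_goal_kick",
                               "M423_corner_kick"},
    "M421_throw_in": {"M410_normal_play"},
    "M422_goal_kick": {"M410_normal_play"},
    "M423_corner_kick": {"M410_normal_play"},
    "M431_foul": {"M440_suspended_foul"},
    "M432_offside": {"M440_suspended_foul"},
    "M440_suspended_foul": {"M441_free_kick", "M442_penalty_kick"},
    "M441_free_kick": {"M410_normal_play"},
    "M442_penalty_kick": {"M410_normal_play"},
    "M443_pk_shootout": {"M460_end_of_match"},
    "M460_end_of_match": set(),
}

def is_valid_sequence(states):
    """Check that each consecutive pair of states is an allowed transition."""
    return all(b in GAME_TRANSITIONS[a] for a, b in zip(states, states[1:]))
```

A table of this form lets the event detection unit validate a detected state sequence before triggering a flight mode change.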
- Some of the match states shown in FIG. 11 may trigger a change in flight mode, while others may not.
- For example, the flight mode may be changed upon a transition to one of the shaded states in the figure, i.e., the pre-match state M400, the goal kick state M422, the corner kick state M423, the free kick state M441, the penalty kick state M442, the player substitution state M450, and the end-of-match state M460.
- The flight mode may also be changed to one that corresponds to the offensive or defensive state.
- The offensive and defensive state acquisition unit 324 is a functional unit that acquires the offensive and defensive states of the teams in the match held at the stadium F.
- For example, the offensive and defensive state acquisition unit 324 detects the offensive and defensive states by performing image processing on the captured images.
- The offensive and defensive state acquisition unit 324 may also acquire the offensive and defensive states based on decision-related information input by the referee to the external input device 600 or to the referee support system, which is an example of the external system 700.
- Alternatively, the offensive and defensive state acquisition unit 324 may acquire the offensive and defensive states based on information input from the external input device 600 held by a team member, for example, the manager or a coach.
- FIG. 12 is a diagram showing an example of a state transition between offensive and defensive states.
- The figure shows an example of the offensive and defensive states in soccer.
- The offensive and defensive state transitions to either an offensive state M510 or a defensive state M520.
- The offensive state M510 and the defensive state M520 transition to each other via an offense/defense switching state M530 or an offense/defense uncertain state M540.
- The offensive state M510 is a state in which one of the teams (hereinafter also referred to as "Team A"), designated in advance, is on the offensive.
- An offensive state is, for example, a state in which Team A is in possession of the ball and is advancing toward the other team (hereinafter also referred to as "Team B"); however, the offensive state is not limited to this and may be any predetermined state determined by a criterion stored in advance in the memory unit 380.
- The offensive state M510 includes a Team A offensive (own half) state M511, a Team A offensive (opponent's half) state M512, and a Team A fast attack state M513.
- A transition is possible between the Team A offensive (own half) state M511 and the Team A offensive (opponent's half) state M512, and between the Team A offensive (own half) state M511 and the Team A fast attack state M513.
- A transition is also possible from the Team A fast attack state M513 to the Team A offensive (opponent's half) state M512.
- The defensive state M520 includes a Team A defensive (opponent's half) state M521, a Team A defensive (own half) state M522, and a Team B fast attack state M523.
- The Team A defensive (opponent's half) state M521 and the Team A defensive (own half) state M522 can transition to each other, as can the Team A defensive (opponent's half) state M521 and the Team B fast attack state M523. A transition can also be made from the Team B fast attack state M523 to the Team A defensive (own half) state M522.
- The offense/defense switching state M530 and the offense/defense uncertain state M540 can be transitioned to from any of the following: the Team A offensive (own half) state M511, the Team A offensive (opponent's half) state M512, the Team A fast attack state M513, the Team A defensive (opponent's half) state M521, the Team A defensive (own half) state M522, and the Team B fast attack state M523.
- For example, the offensive and defensive state acquisition unit 324 detects a transition from the offense/defense switching state M530 or the offense/defense uncertain state M540 to the fast attack state M513 or M523.
- Specifically, the offensive and defensive state acquisition unit 324 analyzes, for example, the acceleration of the ball or the players, the fluctuation in the ball's movement direction or the players' orientation, the number of players in a specified area, the movement direction of the players, and the number of players moving in a certain direction from the images captured by the shooting camera 141.
- The offensive and defensive state acquisition unit 324 then detects the fast attack state M513 or M523 based on the results of this analysis.
- The offensive and defensive state acquisition unit 324 also determines whether the state is the Team A fast attack state M513 or the Team B fast attack state M523 depending on the movement direction of the players or the ball.
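The kind of analysis described above can be sketched as a simple heuristic. The function below is an illustration only, not the patented detection algorithm: the threshold values and the direction encoding are invented for the sketch, while the input quantities (ball acceleration, player movement directions, number of players moving one way) follow the text.

```python
# Illustrative fast-attack detector. Inputs mirror the quantities the text
# says are analyzed; thresholds are assumed values for demonstration.
def detect_fast_attack(ball_accel, player_dirs, accel_thresh=4.0,
                       count_thresh=5):
    """Return 'A' or 'B' when a fast attack by that team is suspected,
    else None. player_dirs: +1 = moving toward Team B's goal,
    -1 = moving toward Team A's goal (one entry per tracked player)."""
    if ball_accel < accel_thresh:
        return None  # ball not accelerating enough to suggest a fast attack
    toward_b = sum(1 for d in player_dirs if d > 0)
    toward_a = sum(1 for d in player_dirs if d < 0)
    if toward_b >= count_thresh:
        return "A"   # Team A fast attack state M513
    if toward_a >= count_thresh:
        return "B"   # Team B fast attack state M523
    return None
```

The same structure extends naturally to the other analyzed signals, such as fluctuation of the ball's movement direction or player counts per area.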
- The offensive and defensive states are not limited to those described above, and any state that triggers a change in the shooting conditions may be specified. For example, a state entered upon detection of a long pass may be specified. The states may also be specified appropriately depending on the content of the sport or event to be shot.
- The event detection unit 320 may determine an event based on input information from the external system 700, instead of or in addition to using the above-mentioned acquisition units 321 to 324.
- For example, the event detection unit 320 may determine an external disturbance such as a strong wind to be an event based on input information from a weather information system, which is an example of the external system 700.
- The event detection unit 320 may also determine an event based on input information from a court facility system, which is another example of the external system 700, or based on facility information entered by a person involved with the court facility.
- The shooting condition determination unit 325 is a functional unit that determines the shooting conditions set in the shooting camera 141 of the drone 100.
- The shooting condition determination unit 325 determines the shooting conditions according to the event detected by the event detection unit 320.
- The shooting conditions include at least one of the target shooting position and the target shooting direction of the drone 100.
- The target shooting direction includes, for example, information on either or both of the pitch angle with respect to the horizontal and the yaw angle with respect to a predetermined reference direction.
- The target shooting direction may also include information on the target zoom amount of the shooting camera 141. In the following description, the shooting direction will be described as including the pitch angle, the yaw angle, and the zoom amount.
- The shooting conditions may also include information on the shooting range of the shooting camera 141. Note that the technical scope of the present invention is not limited to a configuration in which both are set; it suffices that at least one of the target shooting position and the target shooting direction is automatically set depending on the event.
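The shooting conditions described above can be represented as a small data structure. The sketch below is an assumption for illustration: the field names and types are invented, and only the elements named in the text (target position, pitch angle, yaw angle, zoom amount) are modeled.

```python
# Hypothetical data structure for the shooting conditions described above.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ShootingDirection:
    pitch_deg: float  # pitch angle with respect to the horizontal
    yaw_deg: float    # yaw angle with respect to a reference direction
    zoom: str         # zoom amount stage, e.g. "IN", "Middle", "OUT"

@dataclass
class ShootingConditions:
    # At least one of target position and direction is set per event.
    target_position: Tuple[float, float, float]
    direction: Optional[ShootingDirection] = None
```

Keeping the direction optional mirrors the note that either the target shooting position or the target shooting direction alone may be automatically set.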
- The target shooting direction is achieved by controlling at least one of the nose direction of the drone 100 and the shooting direction of the shooting camera 141.
- The nose direction of the drone 100 is controlled by the flight control unit 123 of the drone 100.
- The shooting direction of the shooting camera 141 is controlled, for example, by the shooting control unit 143 driving the camera holding unit 142.
- "Control of the nose direction" and "control of the shooting direction" are concepts that include control not only in the left-right direction (the so-called "pan direction") but also in the up-down direction (the so-called "tilt direction").
- The photographing condition determination unit 325 determines the photographing conditions according to the type of event detected. In the normal play state M410, the photographing condition determination unit 325 allows manual control via the controller 200. When an event is detected by the event detection unit 320, the photographing condition determination unit 325 refers to the event-photography condition table T1 (see FIG. 13) stored in the memory unit 380 and determines the photographing conditions according to the event.
- The event-photography condition table T1 is a table in which events detected as match states and the photography conditions selected for those events are stored in association with each other. More specifically, in the event-photography condition table T1, each event is associated with the photography range, the photography position where the drone 100 is located, the photography direction of the photography camera 141, and the zoom amount of the photography camera 141.
- For example, the photography range is the penalty area F130a or F130b where the ball is located, the photography position is the photography position L206 or L211, and the photography direction is the direction of the goal F120a or F120b where the ball is located. The zoom amount is, for example, predetermined in stages, such as IN, Middle, and OUT, in descending order of zoom amount; it is IN in the penalty kick state M442 or the penalty shootout state M443.
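The event-photography condition table T1 can be sketched as a lookup keyed by event. This is an illustration only: the dictionary below reproduces just the penalty kick example from the text, the key and field names are assumptions, and the real table of FIG. 13 holds many more entries.

```python
# Minimal sketch of the event-photography condition table T1 (one entry).
EVENT_CONDITION_TABLE = {
    "M442_penalty_kick": {
        "range": "penalty area with ball (F130a/F130b)",
        "position": ("L206", "L211"),
        "direction": "toward goal with ball (F120a/F120b)",
        "zoom": "IN",  # zoom stages in descending order: IN, Middle, OUT
    },
}

def shooting_conditions_for(event):
    """Look up the shooting conditions for a detected event.
    None means no table entry applies and manual control continues
    (e.g. the normal play state M410)."""
    return EVENT_CONDITION_TABLE.get(event)
```

Because the table is plain data, accepting user corrections or swapping between multiple tables (e.g. spectating vs. coaching) reduces to replacing the dictionary.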
- In some events, the shooting position is one of the shooting positions L101, L102, and L104 in the outer edge flight mode M102. This configuration reduces the risk of the ball colliding with the drone 100.
- In addition, the entire court F100 can be shot from the shooting positions L101, L102, or L104 along the outer edge.
- The shooting direction is set so that the shooting range covers the area around the ball or the referee's position.
- In the corner kick state M423, the area in front of the goal can be photographed up close by shooting from the shooting positions L207, L209, L212, or L215.
- The ideal shooting direction required for events occurring in a match, such as free kicks, fast attacks, and corner kicks, varies even for the same shooting position.
- It is difficult to quickly and accurately achieve shooting from the ideal direction by manual operation, and there is a risk that important scenes will be missed as a result of operational errors or delays.
- If manual operation were attempted, multiple camera operators would have to be deployed.
- Because the shooting direction required for each event is determined to a certain extent, the above-mentioned configuration, which automatically controls the drone 100 to a shooting position and shooting direction in accordance with the event, makes it possible to take appropriate photos according to the match situation. It also reduces the number of camera operators, contributing to labor savings.
- The shooting condition determination unit 325 may select the shooting position that is closest to the ball from among the stored shooting positions.
- The event-photography condition table T1 may also store different photography conditions set for multiple drones 100.
- For example, the first drone 100 may photograph the ball location with a large zoom amount, while the second drone 100 photographs the ball location with a small zoom amount.
- The first drone 100 and the second drone 100 may also photograph while yaw-rotating in opposite directions.
- Alternatively, the first drone 100 may photograph from the side while the second drone 100 photographs from directly above. While it is even more difficult to manually control multiple drones 100 to the appropriate photography conditions, the above-mentioned automatically controlled configuration makes it possible to quickly photograph from multiple angles using multiple drones 100.
- Events for which overhead shooting is performed may be associated with opposite shooting directions; that is, for example, one event is associated with both a shooting direction from Team A's court toward Team B's court and a shooting direction from Team B's court toward Team A's court.
- In this case, the offensive and defensive state acquisition unit 324 detects which team is in possession of the ball, and the shooting condition determination unit 325 determines the shooting direction depending on the team in possession. Specifically, the shooting condition determination unit 325 sets the shooting direction to the offensive direction of the team in possession of the ball. With this configuration, it is possible to continuously take overhead shots of the moving ball.
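The possession-based direction choice above reduces to a small selection function. The sketch below is an assumption for illustration; the team labels and direction strings are invented placeholders.

```python
# Hypothetical selector: the overhead shooting direction is the offensive
# direction of the team currently in possession of the ball.
def overhead_direction(possessing_team):
    if possessing_team == "A":
        return "A_court_to_B_court"   # Team A attacks toward Team B's court
    if possessing_team == "B":
        return "B_court_to_A_court"   # Team B attacks toward Team A's court
    raise ValueError(f"unknown team: {possessing_team!r}")
```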
- The event-photography condition table T1 may be configured to accept correction input by the user, and the changes may be stored in the memory unit 380. With this configuration, the user's photography knowledge is reflected in the event-photography condition table T1, making it possible to realize more optimal automatic photography. Furthermore, multiple types of event-photography condition tables T1 may be stored in the memory unit 380, and the user may select the table to be applied. This is because, for example, the content to be photographed differs depending on whether the purpose of photography is to watch a sports game or to instruct athletes. With this configuration, automatic photography suited to the purpose can be easily realized. Note that the event-photography condition table T1 shown in FIG. 13 is, for example, a table set for coaching purposes, but it is merely one example, and the specific photography conditions stored in the table are arbitrary.
- In the above description, both the shooting positions L101 to L215 and the shooting direction are automatically set according to the detected event; instead, only the shooting positions L101 to L215 may be automatically set.
- In that case, the shooting direction is determined by input via the controller 200.
- Alternatively, the shooting condition determination unit 325 may be configured to determine the shooting direction according to the detected event and the selected shooting position L101 to L215.
- The shooting condition determination unit 325 may determine the shooting conditions based on the input to the controller 200 when the event detection unit 320 has not detected an event, and may determine the shooting conditions based on the event when the event detection unit 320 has detected an event.
- Even when an event has been detected, the shooting condition determination unit 325 may determine the shooting conditions based on an operation received via the controller 200 or the external system 700; that is, shooting conditions input from the controller 200 are applied in priority over the shooting conditions associated with the event.
- With this configuration, the user only needs to focus on operations that cannot be handled by automatic control, so the operational burden is reduced compared to a configuration in which everything is manually controlled, and operational errors can be reduced.
- That is, it is possible to achieve both convenience and freedom: the freedom of shooting according to the user's requests is ensured while the convenience of automatic shooting is maintained.
- Alternatively, when the event detection unit 320 detects an event, the photographing condition determination unit 325 may determine the photographing conditions based on the event, and when the event detection unit 320 does not detect an event, the drone may automatically track and photograph the ball.
- When multiple drones 100 are used, the shooting condition determination unit 325 may determine different shooting conditions for each of the drones 100.
- For example, the multiple drones 100 may shoot from different shooting positions and in different shooting directions: one drone may photograph a player taking a shot, while another photographs the goalkeeper of the opposing team.
- The shooting condition determination unit 325 may also set shooting conditions for multiple drones 100 flying simultaneously such that they each shoot the same shooting range from different target shooting positions. With this configuration, important scenes can be shot from multiple angles.
- The shooting condition determination unit 325 may also set shooting conditions for multiple drones 100 flying simultaneously such that they each shoot an area including the same shooting range with different zoom amounts. With this configuration, areas of particular interest in the stadium F can be shot under multiple shooting conditions, making it possible to more reliably capture important scenes.
- The shooting condition determination unit 325 may also analyze the captured image, predict the shooting range to be captured based on the analysis results, and determine the shooting conditions accordingly. For example, the shooting condition determination unit 325 may predict the movement distance of the ball after a predetermined time by analyzing the movement direction and the speed or acceleration of the ball from the captured image, and determine the shooting conditions such that the position of the ball after the predetermined time falls within the shooting range. Note that the speed of the ball may refer to the speed at the start of the prediction, i.e., the initial speed.
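The prediction step above can be worked through with a constant-acceleration model. The sketch below is an illustration only: it assumes the ball follows p(t) = p0 + v0·t + ½·a·t² per axis, which is one simple way to realize the described prediction, not necessarily the method used in the actual system.

```python
# Illustrative ball-position prediction under a constant-acceleration model.
def predict_ball_position(p0, v0, a, dt):
    """Predict the 2D ball position after dt seconds.
    p0: initial position (x, y) in meters
    v0: initial velocity (vx, vy) in m/s (the 'initial speed' in the text)
    a:  acceleration (ax, ay) in m/s^2, e.g. rolling deceleration
    Returns (x, y) from p(t) = p0 + v0*t + 0.5*a*t^2 per axis."""
    return tuple(p + v * dt + 0.5 * acc * dt * dt
                 for p, v, acc in zip(p0, v0, a))

# Example: ball at (10, 5) m moving at (4, 0) m/s, decelerating at 1 m/s^2.
pos = predict_ball_position((10.0, 5.0), (4.0, 0.0), (-1.0, 0.0), 2.0)
```

The shooting conditions would then be chosen so that the predicted position `pos` lies inside the shooting range.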
- The photographing condition determination unit 325 may also determine the photographing conditions according to the trajectory of the ball predicted by the event detection unit 320 as a result of the detection of an event.
- For example, the photographing condition determination unit 325 may predict the trajectory of the ball when a fast attack state M513 or M523, a long pass, or the like is detected, and determine the photographing conditions according to the predicted trajectory.
- Specifically, the photographing condition determination unit 325 may change the photographing direction to the traveling direction of the ball and may decide to zoom out the photographing magnification. By zooming out, the traveling ball can be photographed more reliably.
- Note that the ball trajectory prediction is not limited to a mode in which a predetermined trajectory is predicted and then detected as an event by the event detection unit 320. That is, for example, ball trajectory prediction may be performed by the shooting condition determination unit 325 or another functional unit separately from the event detection unit 320, and the shooting conditions may be determined based only on the result of the trajectory prediction.
- When changing the shooting direction in response to a ball trajectory prediction, it is preferable to control the shooting direction of the shooting camera 141 via the shooting control unit 143 instead of controlling the nose direction.
- This is because changing the shooting direction based on a ball trajectory prediction requires a high response speed. Thus, for example, when changing the shooting direction based on a ball trajectory prediction, the shooting direction of the shooting camera 141 may be changed by the shooting control unit 143, and when changing the shooting direction not based on a trajectory prediction, the shooting direction may be changed by controlling the nose direction of the drone 100.
- In this case as well, the shooting condition determination unit 325 may decide to change the shooting direction to the direction of the ball's movement and to zoom out the shooting magnification.
- The flight mode switching unit 330 is a functional unit that switches the flight mode depending on the detection result of the event detection unit 320.
- The flight mode switching unit 330 mainly has a mode switching input acquisition unit 331, a permitted flight area switching unit 332, a geofence switching unit 333, and a flight path generation unit 334.
- The mode switching input acquisition unit 331 is a functional unit that acquires input information regarding the switching of flight modes.
- The flight modes are, for example, the outer edge flight mode M102, the fixed position flight mode M103 or M107, and the in-court flight mode M105 (all shown in FIG. 8).
- Whether the off-court fixed position flight mode M103 or the on-court fixed position flight mode M107 is used for flight in a fixed position is determined according to the position of the drone 100 at the time the flight mode selection input is received. In other words, when the drone 100 is in the outer court area F200, the off-court fixed position flight mode M103 is used, and when the drone 100 is inside the court F100, the on-court fixed position flight mode M107 is used.
- The permitted flight area switching unit 332 is a functional unit that switches the permitted flight area in response to the switching of the flight mode.
- The geofence switching unit 333 is a functional unit that switches geofences in response to the switching of flight modes. For example, when the flight mode is the outer edge flight mode M102, the geofence switching unit 333 sets a geofence G100; when the flight mode is the in-court flight mode M105, the geofence switching unit 333 sets a geofence G200. In addition, in the intermediate modes of the transition between the outer edge flight mode M102 and the in-court flight mode M105, i.e., the inner court entry mode M104 and the outer court exit mode M106, a third geofence different from the geofences G100 and G200 is set.
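The per-mode geofence switching above can be sketched with a simple containment check. This is an illustration only: the geofences are modeled as axis-aligned rectangles with invented coordinates, whereas the actual geofences G100, G200, and the third geofence may have any shape.

```python
# Hypothetical per-mode geofences as (xmin, ymin, xmax, ymax) rectangles.
# Coordinates are assumed values: a 100 x 68 m court with a 10 m margin.
GEOFENCES = {
    "M102_outer_edge": (-10.0, -10.0, 110.0, 78.0),  # G100: court + margin
    "M105_in_court":   (0.0, 0.0, 100.0, 68.0),      # G200: inside the court
    "M104_entry":      (-10.0, -10.0, 110.0, 78.0),  # third geofence
    "M106_exit":       (-10.0, -10.0, 110.0, 78.0),  # third geofence
}

def inside_geofence(mode, x, y):
    """Return True when (x, y) lies inside the geofence of the given mode."""
    xmin, ymin, xmax, ymax = GEOFENCES[mode]
    return xmin <= x <= xmax and ymin <= y <= ymax
```

Switching modes then amounts to changing which entry of `GEOFENCES` the flight controller checks positions against.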
- The flight path generation unit 334 is a functional unit that generates a flight path for the drone 100 during movement involving the switching of flight modes.
- For example, the flight path generation unit 334 determines the shooting position at which the mode transitions to the inner court entry mode M104 or the outer court exit mode M106.
- The flight path generation unit 334 also determines the shooting position at which the mode transitions from the inner court entry mode M104 to the in-court flight mode M105, and the shooting position at which the mode transitions from the outer court exit mode M106 to the outer edge flight mode M102.
- The flight path generation unit 334 then generates a specific flight path in the inner court entry mode M104 or the outer court exit mode M106. This flight path is generated, in principle, within the area of the third geofence.
- The outer edge flight control unit 340 is a functional unit that controls the flight of the drone 100 in the outer edge flight mode M102.
- The outer edge flight control unit 340 has a shooting condition command unit 341 and a flight path generation unit 342.
- The shooting condition command unit 341 is a functional unit that transmits commands regarding shooting conditions to the drone 100.
- The shooting conditions are, for example, a target shooting position or a shooting direction.
- The shooting condition command unit 341 acquires, from the shooting condition determination unit 325, a target shooting position located within the range of the flight area in the outer edge flight mode M102.
- The shooting condition command unit 341 may also acquire a target shooting position input by the user, for example, one received via the target position receiving unit 226 of the controller 200.
- The flight path generation unit 342 is a functional unit that generates a flight path along which the drone 100 moves in the outer edge flight mode M102. In other words, the flight path generation unit 342 generates a flight path within a flyable area in the outer edge flight mode M102.
- For example, the flight path generation unit 342 generates a flight path from the current position to the target position.
- The flight path generation unit 342 may generate a flight path within the flight range of the outer edge flight mode M102.
- For example, when the ball rolls out of the court, the flight path generation unit 342 may move the drone 100 to the outer court area F200 outside the touchline F111b.
- The flight path generation unit 342 may then cause shooting directly below the drone 100 or from that point toward the court F100. With this configuration, it is possible to follow and shoot the ball even if the ball rolls into the outer court area F200.
- In this case, the geofence G100 of the outer edge flight mode M102 may be set in advance to extend beyond the touchline F111b to the outside of the court F100. With this configuration, the drone 100 can be reliably kept within the geofence G100 even when it follows the ball and flies slightly outside the touchline F111b as described above.
- When the flight path generation unit 342 detects an obstacle on the flight path or near the drone 100, it regenerates a flight path that bypasses the obstacle on the inside of the court F100. Alternatively, when the flight path generation unit 342 detects an obstacle, it may cause the drone 100 to hover for a predetermined time and then move along the originally generated flight path. This is because, while the safety of the drone 100 is not ensured in the outer court area F200 of the stadium F, its safety inside the court F100 is highly likely to be ensured.
- The obstacle may be detected, for example, by the obstacle detection unit 130 of the drone 100, or from information provided by the external system 700 or the like.
- When an obstacle is detected, the outer edge flight control unit 340 may switch to manual control after causing the drone 100 to hover for a predetermined time. Furthermore, when an obstacle is detected, the outer edge flight control unit 340 may cause the drone 100 to hover for a predetermined time and then display a message prompting the user to re-input the target position via the display control unit 210.
- The in-court flight control unit 350 is a functional unit that controls the flight of the drone 100 in the in-court flight mode M105.
- The in-court flight control unit 350 has a shooting condition command unit 351 and a flight path generation unit 352.
- The shooting condition command unit 351 is a functional unit that transmits commands to the drone 100 regarding shooting conditions within the range of the flight area in the in-court flight mode M105.
- The shooting conditions are, for example, a target shooting position or a shooting direction.
- The flight path generation unit 352 generates a flight path along which the drone 100 moves in the in-court flight mode M105. That is, the flight path generation unit 352 generates a flight path within a flyable area in the in-court flight mode M105. More specifically, the flight path generation unit 352 generates a flight path by connecting multiple preset shooting positions in the in-court flight mode M105. Like the flight path generation unit 342 of the outer edge flight control unit 340, the flight path generation unit 352 may generate a flight path within the flight range of the in-court flight mode M105 when the current location and the target shooting position belong to flyable areas of different flight modes.
- The flight path generation unit 352 changes the connected shooting positions depending on the event detection status. That is, when an event is detected, the flight path generation unit 352 changes the connection relationship of the shooting positions on the flight path and generates a flight path to the target shooting position.
- When the flight path generation unit 352 detects an obstacle on the flight path or near the drone 100, it regenerates a flight path that bypasses the obstacle.
- The obstacle is detected by, for example, the obstacle detection unit 130.
- The flight path generation unit 352 may regenerate the flight path by changing the connections between the multiple shooting positions that have been set in advance, or may change the flight path to a higher altitude while maintaining its shape in plan view.
- Alternatively, the flight path generation unit 352 may cause the drone 100 to hover for a predetermined time when it detects an obstacle and then move along the flight path that was initially generated.
- When an obstacle is detected, the in-court flight control unit 350 may switch to manual control after causing the drone 100 to hover for a predetermined time.
- Alternatively, when an obstacle is detected, the in-court flight control unit 350 may display a message prompting the user to re-input the target position via the display control unit 210 after causing the drone 100 to hover for a predetermined time.
- The obstacle may be, for example, a bird, a fixed facility, or a player.
- The obstacles also include the ball.
- In the above description, flight control in the outer edge flight mode M102 is performed by the outer edge flight control unit 340, and flight control in the in-court flight mode M105 is performed by the in-court flight control unit 350.
- In addition, the shooting positions defined in each of the outer edge flight mode M102 and the in-court flight mode M105 are presented as options, and a flight path to the selected target position is generated.
- However, the technical scope of the present invention is not limited to this; the user may use the controller 200 to control the shooting position and orientation of the drone 100 and fly it freely at any position in the areas within the geofences G100 and G200 set corresponding to each flight mode.
- Furthermore, the division into the flight path generation units 334, 342, and 352 is an example; for example, a single flight path generation unit may generate the flight paths without such subdivision.
- FIG. 14 is a schematic diagram showing the routes that the drone 100 can follow, defined by the shooting positions L101-L105, L206-L215, and the evacuation point H200.
- Point L101g on the ground at the shooting position L101 is the takeoff and landing point for the drone 100.
- the position transition of the drone 100 begins with the step of taking off from point L101g and arriving at the shooting position L101.
- the drone 100 also descends at the shooting position L101, and lands at point L101g to end shooting.
- the drone 100 may only be able to transition to adjacent shooting positions.
- the point to which the drone 100 at shooting position L105 can transition while maintaining the outer edge flight mode is shooting position L104.
- the points to which the drone 100 at shooting position L105 can transition after switching to the inside court flight mode M105 are shooting positions L106 and L107.
- the flight path generation unit 334, 342, or 352 (see FIG. 6) generates a flight path for the drone 100 by referring to the possible transition paths.
- the drone 100 transitions to the selected shooting position via the available shooting positions. For example, when the drone 100 is at shooting position L105 and shooting position L215 is selected, the flight path generating unit 334, 342, or 352 (see FIG. 6) generates a flight path that transitions through shooting positions L105, L207, L208, L213, L212, and L215 in that order, and the drone 100 flies along this flight path.
- alternatively, the drone 100 may fly on a flight path that connects the current position to the target shooting position in a straight line. Furthermore, the drone 100 may be required to pass through an adjacent shooting position when the transition is accompanied by a flight mode switch, while being able to transition directly to a non-adjacent shooting position when no flight mode switch is involved. That is, for example, when the drone moves from shooting position L105 to shooting position L215, it may first transition from shooting position L105 to shooting position L207 accompanied by a mode switch, and then move linearly from shooting position L207 to shooting position L215 within the area inside the geofence G200.
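Because each shooting position can be reached only from a limited set of neighboring positions, the flight path generation described above amounts to a shortest-path search over an adjacency graph. The following is a minimal sketch; the adjacency data is an illustrative assumption loosely based on the example path, not the actual layout of FIG. 14:

```python
from collections import deque

# Hypothetical adjacency between shooting positions (illustrative only;
# the real transition graph is defined by FIG. 14 in the embodiment).
ADJACENT = {
    "L105": ["L104", "L207"],
    "L207": ["L105", "L208"],
    "L208": ["L207", "L213"],
    "L213": ["L208", "L212"],
    "L212": ["L213", "L215"],
    "L215": ["L212"],
    "L104": ["L105"],
}

def generate_flight_path(current, target):
    """Breadth-first search for the shortest chain of adjacent shooting positions."""
    queue = deque([[current]])
    visited = {current}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in ADJACENT.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # no route between the two positions

print(generate_flight_path("L105", "L215"))
# ['L105', 'L207', 'L208', 'L213', 'L212', 'L215']
```

With this structure, the example transition L105 → L207 → L208 → L213 → L212 → L215 falls out of the graph search rather than being hard-coded.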
- the outer edge flight control unit 340 and the inner court flight control unit 350 fly the drone 100 autonomously in each flight area according to the flight mode.
- the outer edge flight control unit 340 and the inner court flight control unit 350 may perform dolly shooting within each flight area, that is, the drone 100 may automatically follow and shoot a specific object such as a ball or a specified player.
- the outer edge flight control unit 340 and the inner court flight control unit 350 may automatically control the flight height of the drone 100.
- the autonomous flight mode may differ depending on the flight mode. For example, dolly shooting may be performed when controlled by the outer edge flight control unit 340, while automatic follow-up shooting of only the shooting direction with a fixed shooting position, or automatic follow-up shooting of the position and shooting direction may be performed when controlled by the inner court flight control unit 350.
- the outer edge flight control unit 340 and the inner court flight control unit 350 may also generate a flight path within the court (within the stadium F) for moving the drone 100 to a target position specified by the user in each flight area.
- the fixed position flight control unit 360 is a functional unit that controls the flight of the drone 100 in the off-court fixed position flight mode M103 and the on-court fixed position flight mode M107. In these fixed position flight modes, the fixed position flight control unit 360 hovers the drone 100 at a predetermined position and controls the nose direction or the direction of the shooting camera 141 so as to follow a specific player or the ball and shoot automatically. Note that "control of direction" here is a concept that includes control not only in the left-right direction (the so-called "pan" direction) but also in the up-down direction (the so-called "tilt" direction).
- the fixed position flight control unit 360 includes an image capture condition command unit 361.
- the image capture condition command unit 361 is a functional unit that transmits commands for the target position and the target image capture direction in the fixed position flight mode M103 or M107.
- the image capture direction may be information determined by the image capture condition determination unit 325, or may be information input by the user via the controller 200.
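Aiming the shooting camera 141 at a tracked target from a hovering position reduces to computing pan and tilt angles from the drone position and the target position. The sketch below assumes a simple east-north-up coordinate frame; the function name and conventions are illustrative assumptions, not the patent's actual interface:

```python
import math

def pan_tilt_to_target(drone_pos, target_pos):
    """Return (pan, tilt) in degrees to aim the camera from drone_pos at target_pos.

    Positions are (east, north, up) tuples in metres. Pan is measured
    clockwise from north; tilt is negative when looking down.
    """
    de = target_pos[0] - drone_pos[0]
    dn = target_pos[1] - drone_pos[1]
    du = target_pos[2] - drone_pos[2]
    pan = math.degrees(math.atan2(de, dn))                    # left-right (pan)
    tilt = math.degrees(math.atan2(du, math.hypot(de, dn)))   # up-down (tilt)
    return pan, tilt

# Drone hovering 30 m up, ball at ground level 30 m to the north:
pan, tilt = pan_tilt_to_target((0.0, 0.0, 30.0), (0.0, 30.0, 0.0))
print(round(pan, 1), round(tilt, 1))  # 0.0 -45.0
```

The same computation applies whether the target angles come from the image capture condition determination unit 325 or from user input via the controller 200.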
- the communication unit 370 has a modem or the like (not shown) and is capable of communicating with the drone 100, the controller 200, and the like via the communication network 400.
- the communication unit 370 may, for example, monitor the state of the drone 100 and its surroundings and notify the controller 200.
- the storage unit 380 is a functional unit that stores information related to flight control of the drone 100, and is, for example, a database.
- the storage unit 380 stores, for example, the coordinates of the multiple shooting positions L101 to L105 and L206 to L215 in the stadium F. These coordinates may be two-dimensional coordinates on a plane or three-dimensional coordinates including information in the height direction.
- the storage unit 380 also stores an event-shooting condition table T1 shown in FIG. 13. As described above, the event-shooting condition table T1 is recorded in a rewritable manner. Furthermore, multiple event-shooting condition tables T1 may be stored.
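The event-shooting condition table T1 can be held as a simple rewritable mapping from detected events to target shooting conditions. The event names and condition values below are illustrative assumptions, not the contents of FIG. 13:

```python
# Hypothetical event-to-shooting-condition table (cf. table T1).
# Recorded in a rewritable manner: entries can be updated at runtime.
event_shooting_conditions = {
    "free_kick":    {"position": "L210", "direction": "goal"},
    "corner_kick":  {"position": "L104", "direction": "corner"},
    "quick_attack": {"position": "L101", "direction": "ball"},
}

def lookup_shooting_condition(event):
    """Return the target shooting condition for a detected event, or None."""
    return event_shooting_conditions.get(event)

# Rewriting an entry, as allowed by the rewritable table:
event_shooting_conditions["free_kick"]["position"] = "L211"
print(lookup_shooting_condition("free_kick"))
```

Keeping multiple such tables, as the text allows, would amount to storing several of these mappings and selecting one per match or venue.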
Flowcharts
- FIG. 15 is a flowchart showing the overall flow of aerial photography control in this embodiment.
- FIG. 16 is a subroutine of the flight restriction process S1002 in FIG. 15.
- FIG. 17 is a subroutine of the photography condition switching process S1010 in FIG. 15.
- the control described in the flowchart shown in FIG. 15 is executed repeatedly in a loop. As shown in FIG. 15, if it is detected that the drone 100 is approaching the vicinity of the geofences G100, G200 while flying (YES in step S1001), the process proceeds to the flight restriction processing in step S1002. The subroutine of the flight restriction processing S1002 is explained with reference to FIG. 16.
- if it is not detected in step S1001 that the drone is approaching the vicinity of the geofences G100, G200 (NO in step S1001), the presence or absence of an obstacle in the path of or near the drone 100 is detected (step S1003). If an obstacle is detected (YES in step S1003), the drone 100 is made to hover, or a detour route is generated and the flight route of the drone 100 is changed to the detour route (step S1004).
- if no obstacle is detected (NO in step S1003), the presence or absence of an action determination based on the aircraft state is detected (step S1005). If an action determination based on the aircraft state is detected (YES in step S1005), the process proceeds to step S1006, where an event type determination process is executed (step S1006).
- if no action determination is detected (NO in step S1005), it is determined whether there is an input from the controller 200 by the user (step S1007). If an input from the controller 200 is detected (YES in step S1007), a command based on the input is executed (step S1008).
- if no input from the controller 200 is detected (NO in step S1007), the presence or absence of an event is determined (step S1009). If an event is detected (YES in step S1009), the process proceeds to the shooting condition switching process (step S1010). If no event is detected (NO in step S1009), the process returns to step S1001, and steps S1001 to S1009 are repeated.
- the overall processing of aerial photography control is performed in the following order: geofence restriction, obstacle detection, control based on aircraft status, control based on user input, and control based on in-game events such as the game status or offensive and defensive status.
- in other words, each of the preceding controls is executed before control based on in-game events. This order reflects the priority given to safe control processing. With this configuration, the safety of the flying drone 100 can be more reliably guaranteed.
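The priority ordering above can be expressed as an ordered chain of checks evaluated each control cycle, where the first check that fires determines the action. A minimal sketch; the state flags and action names are hypothetical placeholders for the units described in the embodiment:

```python
def control_cycle(state):
    """One pass of the aerial photography control loop of FIG. 15.

    Checks run in descending safety priority; the first one that fires
    determines this cycle's action. `state` is a dict of hypothetical flags.
    """
    checks = [
        (lambda s: s.get("near_geofence"),   "flight_restriction"),          # S1002
        (lambda s: s.get("obstacle"),        "hover_or_detour"),             # S1004
        (lambda s: s.get("aircraft_action"), "event_type_decision"),         # S1006
        (lambda s: s.get("user_input"),      "execute_user_command"),        # S1008
        (lambda s: s.get("event"),           "switch_shooting_conditions"),  # S1010
    ]
    for condition, action in checks:
        if condition(state):
            return action
    return "continue"  # nothing detected; loop back to S1001

# A detected obstacle outranks a detected in-game event:
print(control_cycle({"obstacle": True, "event": True}))  # hover_or_detour
```

Structuring the loop this way makes the safety ordering explicit: geofence and obstacle handling always preempt event-driven shooting control.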
- in step S1101, the flight control unit 123 of the drone 100 issues an operation command that restricts the drone 100 from advancing outside the geofence.
- that is, in step S1101, a restriction is set on the flight target position so that the drone 100 does not advance outside the geofence even when the drone 100 is operated manually.
- if the drone 100 does not advance outside the geofence (NO in step S1102), the process ends.
- if the drone 100 advances outside the geofence (YES in step S1102), the process proceeds to step S1103. Possible causes of this situation include, for example, strong wind sweeping the aircraft away, or a malfunction of the drone 100 preventing it from flying in the intended direction.
- in step S1103, the flight control unit 123 issues an operation command to return inside the geofence. More specifically, a flight target position command to return inside the geofence, that is, an operation command that sets the flight target position to a specified point inside the geofence, is given to the drone 100.
- in step S1104, it is determined whether the drone 100 is approaching the inside of the geofence by referring to information measured by the measurement unit 110 of the drone 100, such as position, direction, altitude, or speed.
- step S1104 is executed a predetermined time after step S1103. Note that in step S1104 it is sufficient that the drone 100 is closer to the inside of the geofence than it was at the time of step S1103; it is not necessary to determine whether the drone 100 is located inside the geofence.
- if it is determined in step S1104 that the drone 100 is not approaching the geofence (NO in step S1104), the above operation command is judged to be ineffective, and the flight control unit 123 forces the drone 100 to land (step S1105).
- in step S1106, information measured by the measurement unit 110 of the drone 100, such as the position and altitude, is referenced to determine whether the drone 100 is located within the geofence. If the drone 100 is located within the geofence (YES in step S1106), the process ends. If the drone 100 is not located within the geofence (NO in step S1106), the process returns to step S1104, and operation based on the operation command to return inside the geofence continues until the drone 100 returns within the geofence.
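The flight restriction subroutine of FIG. 16 can be sketched as the loop below. The `GeofenceDrone` class is a hypothetical stand-in for the measurement unit 110 and flight control unit 123, included only so the logic is runnable; none of its method names come from the patent:

```python
import time

class GeofenceDrone:
    """Minimal simulated stand-in for the drone's measurement/flight-control units."""
    def __init__(self, distance_outside, step=3.0):
        self.distance_outside = distance_outside  # metres outside the geofence
        self.step = step                          # progress made per return command
        self.landed = False

    def is_inside_geofence(self):
        return self.distance_outside <= 0.0

    def command_return_inside(self):
        # Simulate flying back toward the geofence (cf. step S1103).
        self.distance_outside -= self.step

    def force_land(self):
        self.landed = True                        # cf. step S1105

def flight_restriction(drone, check_interval=0.0):
    """Sketch of the FIG. 16 subroutine (method names are assumptions)."""
    if drone.is_inside_geofence():                # S1102: never left the fence
        return "ok"
    while not drone.is_inside_geofence():         # S1106 loop
        before = drone.distance_outside
        drone.command_return_inside()             # S1103: command return inside
        time.sleep(check_interval)                # wait a predetermined time
        # S1104: it suffices that the drone is *closer* than before,
        # not that it is already inside the geofence.
        if drone.distance_outside >= before:
            drone.force_land()                    # S1105: command ineffective
            return "forced_landing"
    return "returned"

print(flight_restriction(GeofenceDrone(5.0)))  # returned
```

A drone that makes no progress toward the fence (for example, swept away by wind) triggers the forced-landing branch instead of looping indefinitely.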
- FIG. 17 is a diagram showing an example of the process flow for switching shooting conditions.
- the shooting condition determination unit 325 refers to the event-shooting condition table T1 to determine the target values for the shooting position and shooting direction, and transmits a control command to the drone 100 (step S1301).
- when the target shooting position and the target shooting direction are reached (step S1303), the process proceeds to step S1305.
- in step S1305, the mode transitions to the manual control mode, and a message indicating that manual operation is permitted is displayed on the operation screen G3 (see FIG. 20).
- FIGS. 18 and 19 are examples of screens G1 and G2 displayed on the display unit 201 of the controller 200.
- the screen G1 shown in FIG. 18 displays a field map G10 that shows a schematic bird's-eye view of the stadium F and the shooting positions L101 to L215, an icon G11 that shows the position information of the drone 100, the shooting range G12 captured by the shooting camera 141, a display field G21 of the flight mode to which the drone 100 belongs, a status display field G22 that shows the status detected by the event detection unit 320, such as the aircraft status, aircraft behavior status, match status, and offensive and defensive status, a landing button G30 for landing the drone 100, and a video field G40 in which images captured by the drone 100 are displayed.
- the display field G21 displays the main control modes, either automatic control mode or manual control mode.
- the drone is flying in the outer edge flight mode at the shooting position L101.
- the position and shooting direction of the drone 100 may be controlled manually, or automatic tracking control of the ball or a specific player may be performed.
- automatic tracking control information about the ball or specific player being tracked may be displayed on the screen G1.
- the icon G11 representing the drone 100 displays an arrow indicating the direction of travel of the drone 100.
- the direction of the nose of the drone 100 is not limited to the direction of travel of the drone 100, and may be pointing in any direction.
- the direction of the nose of the drone 100 does not have to be constant while moving, and for example, the drone 100 may move while photographing a player or the ball by rotating in a yaw motion.
- FIG. 19 shows an example of the display on screen G2 when multiple drones 100 are photographing one stadium F.
- FIG. 19 particularly shows an example of the display when a free kick state M441 is detected.
- icons G11a and G11b of the two drones are displayed on the field map G10.
- the shooting ranges G13a and G13b photographed by each of the multiple drones 100, and the video columns G40a and G40b showing the captured images, are displayed in association with the icons G11a and G11b of the corresponding drones 100.
- the first drone 100 corresponding to the icon G11a takes localized shots near the goal 120a, while the second drone 100 corresponding to the icon G11b takes bird's-eye shots from the shooting position L101 on the outer edge.
- the shooting angles of the first drone and the second drone differ from each other. It is preferable that the shooting conditions of the first and second drones complement each other, so that each covers positions the other cannot capture. In this way, with a configuration in which multiple drones 100 shoot under different shooting conditions, the stadium F can be photographed from multiple angles.
- FIG. 20 is an example of screen G3 displayed on the display unit 201 when no event has been detected by the event detection unit 320. Because no event has been detected, automatic control of the drone 100 is not performed, and the display field G21 indicates "manual control mode".
- FIG. 21 is an example of screen G4 when the event detection unit 320 detects an event in the state of screen G3.
- the drone 100 switches to automatic control in response to the detection of the event, and the display field G21 indicates that it is in "automatic control mode".
- the status display field G22 indicates that a team A quick attack state M513 has been detected as the offensive and defensive state.
- an arrow G15 indicating the predicted trajectory of the ball and the ball G16 after movement based on the trajectory are displayed on the field map G10. Also, an arrow G17 indicating the change in the shooting direction of the shooting camera 141 is displayed.
- the present invention is not limited to the above embodiment, and various configurations can be adopted based on the contents of this specification.
- the series of processes described in relation to the above embodiment may be implemented using software, hardware, or a combination of software and hardware.
- a computer program for implementing each function of the server 300 according to this embodiment may be created and implemented in a PC or the like.
- a computer-readable recording medium on which such a computer program is stored may also be provided. Examples of the recording medium include a magnetic disk, an optical disk, a magneto-optical disk, and a flash memory.
- the above computer program may also be distributed, for example, via the communication network 400, without using a recording medium.
- Aerial photography system
- 100 Drone (mobile body)
- 141 Shooting camera
- 200 Pilot (controller)
- 220 Input control unit
- 300 Server
- 320 Event detection unit
- 330 Flight mode switching unit
- 331 Mode switching determination unit
- 380 Storage unit
- F Stadium
- F100 Court
- F200 Area outside the court
- M102 Outer edge flight mode
- M105 Inner court flight mode
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2024548908A JPWO2024069789A1 | 2022-09-28 | 2022-09-28 | |
PCT/JP2022/036120 WO2024069789A1 (ja) | 2022-09-28 | 2022-09-28 | 空中撮影システム、空中撮影方法および空中撮影プログラム |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2022/036120 WO2024069789A1 (ja) | 2022-09-28 | 2022-09-28 | 空中撮影システム、空中撮影方法および空中撮影プログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024069789A1 (ja) | 2024-04-04 |
Family
ID=90476657
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/036120 WO2024069789A1 (ja) | 2022-09-28 | 2022-09-28 | 空中撮影システム、空中撮影方法および空中撮影プログラム |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2024069789A1 |
WO (1) | WO2024069789A1 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017011469A (ja) * | 2015-06-22 | 2017-01-12 | カシオ計算機株式会社 | 撮影装置、撮影方法、およびプログラム |
WO2017057157A1 (ja) * | 2015-09-30 | 2017-04-06 | 株式会社ニコン | 飛行装置、移動装置、サーバおよびプログラム |
US20180129212A1 (en) * | 2016-11-09 | 2018-05-10 | Samsung Electronics Co., Ltd. | Unmanned aerial vehicle and method for photographing subject using the same |
JP2019161486A (ja) * | 2018-03-14 | 2019-09-19 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | 動体検出装置、制御装置、移動体、動体検出方法、及びプログラム |
JP2020115642A (ja) * | 2016-02-03 | 2020-07-30 | ソニー株式会社 | 複数カメラネットワークを利用して静止シーン及び/又は移動シーンを取り込むためのシステム及び方法 |
2022
- 2022-09-28 JP JP2024548908A patent/JPWO2024069789A1/ja active Pending
- 2022-09-28 WO PCT/JP2022/036120 patent/WO2024069789A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JPWO2024069789A1 | 2024-04-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11797009B2 (en) | Unmanned aerial image capture platform | |
US20220374012A1 (en) | Fitness And Sports Applications For An Autonomous Unmanned Aerial Vehicle | |
US11755041B2 (en) | Objective-based control of an autonomous unmanned aerial vehicle | |
US11188101B2 (en) | Method for controlling aircraft, device, and aircraft | |
US10816967B2 (en) | Magic wand interface and other user interaction paradigms for a flying digital assistant | |
US10336469B2 (en) | Unmanned aerial vehicle movement via environmental interactions | |
US10377484B2 (en) | UAV positional anchors | |
US10357709B2 (en) | Unmanned aerial vehicle movement via environmental airflow | |
US20210072745A1 (en) | Systems and methods for uav flight control | |
JP6816156B2 (ja) | Uav軌道を調整するシステム及び方法 | |
CN110325939A (zh) | 用于操作无人驾驶飞行器的系统和方法 | |
US20220137647A1 (en) | System and method for operating a movable object based on human body indications | |
US12007763B2 (en) | Magic wand interface and other user interaction paradigms for a flying digital assistant | |
CN108268050A (zh) | 运动控制装置、头戴显示设备、无人机和飞行系统 | |
WO2024069789A1 (ja) | 空中撮影システム、空中撮影方法および空中撮影プログラム | |
CN116888045A (zh) | 无人飞行器、控制终端、救机方法和救机系统 | |
WO2024069788A1 (ja) | 移動体システム、空中撮影システム、空中撮影方法および空中撮影プログラム | |
WO2024069790A1 (ja) | 空中撮影システム、空中撮影方法および空中撮影プログラム | |
WO2024189898A1 (ja) | 撮影システム、撮影方法および撮影プログラム | |
WO2024166318A1 (ja) | 撮影システム、撮影方法および撮影プログラム | |
WO2024252444A1 (ja) | 判定システム、判定方法、及び判定プログラム | |
CN114641744A (zh) | 控制方法、设备、系统及计算机可读存储介质 | |
WO2023238208A1 (ja) | 空中撮影システム、空中撮影方法及び空中移動体管理装置 | |
WO2024180639A1 (ja) | 撮影システム、撮影方法、移動体制御装置及びプログラム | |
WO2024018643A1 (ja) | 撮影システム、撮影方法、撮影制御装置及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22960852 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2024548908 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |