WO2021060113A1 - Imaging system, control device, control method, program, and storage medium - Google Patents
Imaging system, control device, control method, program, and storage medium
- Publication number
- WO2021060113A1 (PCT/JP2020/035124)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- subject
- shooting
- photographing
- control
- signal
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 68
- 238000004891 communication Methods 0.000 claims abstract description 131
- 230000007274 generation of a signal involved in cell-cell signaling Effects 0.000 claims abstract description 106
- 238000003384 imaging method Methods 0.000 claims description 85
- 238000012545 processing Methods 0.000 claims description 68
- 238000002360 preparation method Methods 0.000 claims description 33
- 230000008569 process Effects 0.000 claims description 33
- 238000013459 approach Methods 0.000 claims description 12
- 238000001514 detection method Methods 0.000 claims description 10
- 230000008859 change Effects 0.000 claims description 8
- 238000010586 diagram Methods 0.000 description 11
- 230000010365 information processing Effects 0.000 description 6
- 230000006870 function Effects 0.000 description 5
- 230000007246 mechanism Effects 0.000 description 4
- 230000015572 biosynthetic process Effects 0.000 description 3
- PXHVJJICTQNCMI-UHFFFAOYSA-N Nickel Chemical compound [Ni] PXHVJJICTQNCMI-UHFFFAOYSA-N 0.000 description 2
- 238000000605 extraction Methods 0.000 description 2
- 238000003825 pressing Methods 0.000 description 2
- 239000004065 semiconductor Substances 0.000 description 2
- UFHFLCQGNIYNRP-UHFFFAOYSA-N Hydrogen Chemical compound [H][H] UFHFLCQGNIYNRP-UHFFFAOYSA-N 0.000 description 1
- HBBGRARXTFLTSG-UHFFFAOYSA-N Lithium ion Chemical compound [Li+] HBBGRARXTFLTSG-UHFFFAOYSA-N 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 229910052739 hydrogen Inorganic materials 0.000 description 1
- 239000001257 hydrogen Substances 0.000 description 1
- 229910001416 lithium ion Inorganic materials 0.000 description 1
- 238000012423 maintenance Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 229910052759 nickel Inorganic materials 0.000 description 1
- 230000002265 prevention Effects 0.000 description 1
- 230000001141 propulsive effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06V20/54—Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/20—Remote controls
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/006—Apparatus mounted on flying objects
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B2206/00—Systems for exchange of information between different pieces of apparatus, e.g. for exchanging trimming information, for photo finishing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
Definitions
- the present invention relates to a photographing system, a control device, a control method, a program, and a storage medium, and more particularly to a technique for photographing a pre-registered subject with an unmanned aerial vehicle.
- Patent Document 1 discloses a technique for photographing a plurality of moving objects using an autonomous flying robot.
- The technique of Patent Document 1 photographs moving objects, such as an unspecified number of intruders or vehicles that have entered a surveillance space, with the camera of an autonomous flying robot for crime-prevention purposes.
- An object of the present invention is to provide a technique capable of photographing a pre-registered subject with an unmanned aerial vehicle.
- The imaging system according to the first aspect of the present invention is an imaging system including an unmanned aerial vehicle having a photographing means capable of photographing a subject in a flight state, a terminal of the subject, and a control device capable of communicating with the unmanned aerial vehicle.
- The control device includes: a storage means for registering subject information in which the subject is set as a shooting target; a determination means for determining whether or not the subject exists in a predetermined shooting area based on position information of the subject acquired by communication with the terminal and map information;
- a signal generation means for generating a control signal for controlling the photographing means based on the determination of the determination means; and
- a communication control means for transmitting the subject information and the control signal to the unmanned aerial vehicle.
- The unmanned aerial vehicle includes: an identification means for identifying the subject based on subject information distributed from the terminal of the subject and the subject information transmitted from the communication control means; and
- a shooting control means that controls the photographing means based on the control signal so as to control the shooting of the subject identified by the identification means.
- the storage means registers as the subject information group information in which users of a plurality of vehicles constituting the subject are set as a group to be photographed.
- the determination means is characterized in that it determines whether or not the subject is traveling in a predetermined shooting area based on the position information and the map information of the subject.
- When the determination means determines that at least one of the plurality of vehicles set as the group has entered the photographing area, it determines that the plurality of vehicles are traveling in the photographing area.
- In this case, the signal generation means generates a control signal instructing the start of shooting, and
- the shooting control means controls the shooting means based on the control signal to start shooting a group of the plurality of vehicles.
- When, after the start of shooting, the determination means determines that at least one of the plurality of vehicles is traveling in the shooting area,
- the signal generation means generates a control signal instructing the continuation of shooting, and
- the imaging control means controls the imaging means based on the control signal to continue the imaging.
- When the determination means determines that all of the plurality of vehicles have left the shooting area, the signal generation means generates a control signal instructing the end of shooting, and
- the imaging control means controls the imaging means based on the control signal to end the imaging.
- When the determination means determines that all of the plurality of vehicles set as the group have entered the photographing area, it determines that the plurality of vehicles are traveling in the photographing area.
- In this case, the signal generation means generates a control signal instructing the start of shooting, and
- the shooting control means controls the shooting means based on the control signal to start shooting a group of the plurality of vehicles.
- When the determination means determines that at least one of the plurality of vehicles has entered the photographing area or a predetermined preparation area set in front of the photographing area,
- the signal generation means generates an area notification signal for notifying the users that the vehicle has entered the shooting area or the preparation area, and
- the communication control means is characterized in that the area notification signal is transmitted to the plurality of vehicles.
- The control device further includes: an image processing means that performs image processing for extracting the user's face from the image data taken by the photographing means; and
- an image determination means for determining, based on the result of the image processing and the group information, whether or not the faces of the users set as the group have been photographed.
- When the faces of the set number of users have not been photographed, the signal generation means generates a parameter control signal for controlling the shooting parameters of the photographing means so that the faces of the set number of users can be photographed, and
- the photographing control means of the unmanned aerial vehicle controls the photographing means based on the parameter control signal to perform the shooting.
- When the image determination means determines that the faces of the set number of users have not been photographed, the signal generation means generates a shooting guidance signal that guides the users to redo the shooting, and
- the communication control means is characterized in that the shooting guidance signal is transmitted to the plurality of vehicles.
- The determination means acquires, based on the position information, the inter-vehicle distance of the plurality of vehicles traveling in the photographing area or a predetermined preparation area set in front of the photographing area, and determines whether the inter-vehicle distance exceeds the upper limit of a predetermined reference distance range.
- When it does, the signal generation means generates a distance notification signal for notifying the users that the inter-vehicle distance exceeds the upper limit of the reference distance range, and
- the communication control means is characterized in that the distance notification signal is transmitted to the plurality of vehicles.
- The determination means acquires, based on the position information, the inter-vehicle distance of the plurality of vehicles traveling in the photographing area or a predetermined preparation area set in front of the photographing area, and determines whether the inter-vehicle distance is equal to or less than the lower limit of the predetermined reference distance range.
- When it is, the signal generation means generates an approach notification signal for notifying the users that the inter-vehicle distance is equal to or less than the lower limit of the reference distance range, and
- the communication control means is characterized in that the approach notification signal is transmitted to the plurality of vehicles.
- Each of the plurality of vehicles further includes an acquisition means for acquiring position information of the vehicle, a vehicle communication means for transmitting the position information of the vehicle, and a detection means for detecting speed information of the vehicle, and the vehicle communication means transmits the speed information to the control device.
- The determination means of the control device acquires, based on the speed information, the speed difference of the plurality of vehicles traveling in the shooting area or a predetermined preparation area set in front of the shooting area, and determines whether the speed difference exceeds a predetermined reference speed.
- When it does, the signal generation means generates a speed notification signal for notifying the users that the speed difference exceeds the reference speed, and
- the communication control means is characterized in that the speed notification signal is transmitted to the plurality of vehicles.
- The control device further includes a backlight determination means for determining whether or not the shooting state is backlit based on the image data photographed by the photographing means.
- When the shooting state is determined to be backlit, the signal generation means generates a flight control signal instructing a change in the flight position of the unmanned aerial vehicle so as to avoid the backlight,
- the communication control means transmits the flight control signal to the unmanned aerial vehicle, and
- the flight control means of the unmanned aerial vehicle changes the flight position based on the flight control signal.
- When the shooting state is determined to be backlit, the signal generation means generates a parameter control signal for controlling the photographing means so as to move its angle of view in the horizontal direction or in the vertical direction,
- the communication control means transmits the parameter control signal to the unmanned aerial vehicle, and
- the shooting control means of the unmanned aerial vehicle changes the angle of view of the photographing means based on the parameter control signal.
- The storage means registers, as the subject information, user information set for a pedestrian or a user of a single vehicle constituting the subject.
- the determination means is characterized in that it determines whether or not the subject has entered a predetermined shooting area based on the position information and the map information of the subject.
- When the determination means determines that the subject set as the shooting target has entered the shooting area, the signal generation means generates a control signal instructing the start of shooting, and
- the shooting control means controls the shooting means based on the control signal to start shooting the subject.
- When the determination means determines that the subject has entered the shooting area or a predetermined preparation area set in front of the shooting area,
- the signal generation means generates an area notification signal for notifying the subject that the subject has entered the shooting area or the preparation area.
- the communication control means is characterized in that the area notification signal is transmitted to the subject.
- The control device further includes: an image processing means that performs image processing for extracting the face of the subject from the image data taken by the photographing means; and
- an image determination means for determining whether or not the face of the subject has been photographed based on the result of the image processing.
- When the face of the subject has not been photographed, the signal generation means generates a parameter control signal for controlling the shooting parameters of the photographing means so that the face of the subject can be photographed, and
- the photographing control means of the unmanned aerial vehicle controls the photographing means based on the parameter control signal to perform the shooting.
- When the image determination means determines that the face of the subject has not been photographed,
- the signal generation means generates a shooting guidance signal that guides the subject to redo the shooting, and
- the communication control means is characterized in that the shooting guidance signal is transmitted to the subject.
- The control device according to the 19th aspect of the present invention is a control device capable of communicating with an unmanned aerial vehicle having a photographing means capable of photographing a subject in a flight state.
- It comprises: a storage means for registering subject information in which the subject is set as a shooting target; a determination means for determining whether or not the subject exists in a predetermined shooting area based on the position information and the map information of the subject; a signal generation means for generating a control signal for controlling the photographing means based on the determination of the determination means; and a communication control means for transmitting the subject information and the control signal to the unmanned aerial vehicle.
- the control method according to the twentieth aspect of the present invention is a control method in a control device capable of communicating with an unmanned aerial vehicle having a photographing means capable of photographing a subject in a flight state.
- It includes: a storage step of registering, in a storage means, subject information in which the subject is set as a shooting target; a determination step in which the determination means determines whether or not the subject exists in a predetermined shooting area based on the position information and the map information of the subject; a signal generation step in which the signal generation means generates a control signal for controlling the photographing means based on the determination in the determination step; and
- a communication control step in which the communication control means transmits the subject information and the control signal to the unmanned aerial vehicle.
- the program according to the 21st aspect of the present invention is a program for causing a computer to execute each step of a control method in a control device capable of communicating with an unmanned aerial vehicle having a photographing means capable of photographing a subject in a flight state.
- The control method includes: a storage step of registering, in a storage means, subject information in which the subject is set as a shooting target; a determination step in which the determination means determines whether or not the subject exists in a predetermined shooting area based on the position information and the map information of the subject;
- a signal generation step in which the signal generation means generates a control signal for controlling the photographing means based on the determination in the determination step; and
- a communication control step in which the communication control means transmits the subject information and the control signal to the unmanned aerial vehicle.
- The storage medium according to the 22nd aspect of the present invention is a computer-readable storage medium storing a program for causing a computer to execute each step of a control method in a control device capable of communicating with an unmanned aerial vehicle having a photographing means capable of photographing a subject in a flight state.
- The control method includes: a storage step of registering, in a storage means, subject information in which the subject is set as a shooting target; a determination step in which the determination means determines whether or not the subject exists in a predetermined shooting area based on the position information and the map information of the subject;
- a signal generation step in which the signal generation means generates a control signal for controlling the photographing means based on the determination in the determination step; and
- a communication control step in which the communication control means transmits the subject information and the control signal to the unmanned aerial vehicle.
- According to the photographing system of the first aspect of the present invention, it is possible to provide a technique capable of photographing a pre-registered subject with an unmanned aerial vehicle.
- According to the photographing system of the second aspect of the present invention, it is possible to provide a technique capable of photographing, with an unmanned aerial vehicle, users traveling on vehicles, with a plurality of vehicle users registered in advance as a group set as the subject.
- According to the photographing system of the third aspect of the present invention, the timing of the start of photographing can be controlled, and even when the photographing area is small or the inter-vehicle distance between the vehicles of the group is wide, it is possible to photograph starting from the vehicle that first entered the photographing area.
- According to the photographing system of the fourth aspect of the present invention, it is possible to control the timing of the end of shooting and to photograph all of the plurality of vehicles set as a group without exception.
- According to the photographing system of the fifth aspect of the present invention, it is possible to control the timing of the start of shooting and to perform shooting when all of the plurality of vehicles set as a group are in the shooting area, so that only images that meet the users' needs are shot.
- According to the photographing system of the sixth aspect of the present invention, by notifying the users with an area notification signal before photographing, it is possible to travel in preparation for photographing, for example by arranging the formation of the vehicles in the group.
- According to the photographing system of the seventh aspect of the present invention, it is possible to photograph the faces of all the members of the group by controlling the photographing unit based on the parameter control signal.
- According to the photographing system of the eighth aspect of the present invention, a shooting guidance signal that guides the users to redo the shooting is generated, and by transmitting the shooting guidance signal to the plurality of vehicles, it becomes possible to immediately retake the picture in the shooting area.
- According to the photographing system of the ninth aspect of the present invention, if the inter-vehicle distance is too wide, it may not be possible to photograph the plurality of users at the same time while they travel in the photographing area; by transmitting to the plurality of vehicles a distance notification signal notifying that the inter-vehicle distance exceeds the upper limit of the reference distance range and notifying the users, it is possible to urge the users to reduce the inter-vehicle distance.
- According to the photographing system of the tenth aspect of the present invention, if the inter-vehicle distance is too close, by transmitting to the plurality of vehicles an approach notification signal notifying that the inter-vehicle distance is equal to or less than the lower limit of the reference distance range and notifying the users, it is possible to urge the users to increase the inter-vehicle distance.
- According to the photographing system of the eleventh aspect of the present invention, when the speed difference exceeds the reference speed, it may not be possible to photograph the plurality of users at the same time while they travel in the photographing area; by transmitting to the plurality of vehicles a speed notification signal notifying that the speed difference exceeds the reference speed and notifying the users, it is possible to urge the users to reduce the speed difference.
- According to the photographing system of the twelfth aspect of the present invention, when it is determined based on the photographed image data that the shooting state is backlit, the flight position of the unmanned aerial vehicle is changed, so that it becomes possible to take a picture while avoiding the backlight.
- According to the photographing system of the thirteenth aspect of the present invention, when it is determined based on the photographed image data that the shooting state is backlit, the angle of view of the photographing unit is changed, so that it becomes possible to shoot while avoiding the backlight.
- According to the photographing system of the fourteenth aspect of the present invention, it is possible to provide a technique capable of photographing, with an unmanned aerial vehicle, a pre-registered pedestrian or user of a single vehicle as the subject.
- According to the imaging system of the fifteenth aspect of the present invention, it is possible to control the timing of the start of imaging. As a result, even when the shooting area is small, it is possible to photograph the subject without missing the shooting timing.
- According to the imaging system of the sixteenth aspect of the present invention, by notifying the subject with the area notification signal before photographing, the subject can prepare in advance for photographing.
- According to the imaging system of the seventeenth aspect of the present invention, by controlling the imaging unit based on the parameter control signal, it becomes possible to photograph the face of a pedestrian who is the subject or the face of a user traveling on a single vehicle.
- According to the imaging system of the eighteenth aspect of the present invention, a shooting guidance signal that guides the subject to redo the shooting is generated, and by transmitting the shooting guidance signal to the terminal of the subject, it becomes possible to immediately retake the picture in the shooting area.
- According to the control device of the 19th aspect of the present invention, the control method of the 20th aspect, the program of the 21st aspect, and the storage medium of the 22nd aspect, it is possible to provide a control technique capable of photographing a pre-registered subject with an unmanned aerial vehicle.
- A block diagram illustrating the functional configuration of the unmanned aerial vehicle.
- A block diagram illustrating the functional configuration of the processing unit.
- FIG. 1 is a diagram showing an example of the configuration of the photographing system STM according to the first embodiment.
- the photographing system STM includes an unmanned aerial vehicle DRN, a terminal of the subject, and a control device CNT (control server) capable of communicating with the unmanned aerial vehicle DRN.
- the unmanned aerial vehicle DRN has a photographing unit 200 (camera) capable of photographing a plurality of vehicles in a flight state.
- the subject includes, for example, a pedestrian, a user of a single vehicle, or a user of a plurality of vehicles.
- In the present embodiment, an example will be described in which users of a plurality of vehicles constituting the subject are set as a group to be photographed.
- The control device CNT can remotely communicate with the plurality of vehicles 1A and 1B and the unmanned aerial vehicle DRN via the network NT, and can output a signal for controlling the unmanned aerial vehicle DRN via the network NT.
- The information processing device 18 is an external terminal that manages vehicle rental (a vehicle use service). When a vehicle is rented to a user, the information processing device 18 transmits the user information that identifies the user (including, for example, the user's terminal information) and the vehicle information that identifies the rented vehicle to the control device CNT via the network NT.
- the information processing device 18 (external terminal) can be installed at an external base (agency) such as a hotel, a rental car company, or a dealer who provides vehicle sales and maintenance services.
- The external base can provide a service in which a plurality of users A and B who rented vehicles are set as a group GR, and users A and B riding the vehicles 1A and 1B are photographed by the unmanned aerial vehicle DRN.
- When the control device CNT acquires the information (user information (including the user's terminal information) and vehicle information) transmitted from the information processing device 18 (external terminal) via the communication interface unit 23 (communication I/F),
- it registers the subject information set for the subject to be photographed in the database DB of the storage unit 22.
- For example, the control device CNT registers, in the database DB of the storage unit 22, the group information in which the users A and B of the plurality of vehicles 1A and 1B are set as one group GR.
- When the plurality of vehicles set as a group are traveling in a predetermined shooting area, the processing unit 21 of the control device CNT generates a control signal for controlling the shooting unit 200 of the unmanned aerial vehicle DRN and transmits it to the unmanned aerial vehicle DRN via the network NT.
- the imaging unit 200 of the unmanned aerial vehicle DRN can perform imaging based on the control signal transmitted from the control device CNT.
- the image captured by the photographing unit 200 is transmitted to the control device CNT via the network NT and stored in the database DB of the storage unit 22.
- When the rented vehicles are returned, the captured images can be confirmed (viewed) by preview display on the information processing device 18 (external terminal) or on the terminals of the plurality of users A and B, for example, a portable terminal SP (for example, a smartphone), and if users A and B like the captured images, they can purchase the image data.
- When purchasing image data, the image data may be downloaded to the portable terminals SP (smartphones) of the plurality of users A and B, or the image data may be saved in a storage medium such as a CD-ROM or DVD and provided to the users.
- the specific functional configurations of the processing unit 21 of the control device CNT and the unmanned aerial vehicle DRN will be described in detail later.
- As the vehicles 1A and 1B, for example, an electric two-wheeled vehicle such as a saddle-ride type vehicle can be used.
- Here, a saddle-ride type vehicle refers to a vehicle on which the driver sits astride the vehicle body, and the concept includes scooter-type two-wheeled vehicles and the like.
- The vehicles 1A and 1B have the same configuration, and the configuration of the vehicle 1A will be described as a representative in the following description.
- FIG. 1 shows an example of two vehicles 1A and 1B, but the present invention is not limited to this example, and a group may be formed by three or more vehicles.
- the vehicle 1A includes a power source 11, a battery 12 (power supply device) that supplies electric power to the vehicle, an operation mechanism 13, a vehicle control device 14 that controls the vehicle, and a communication device 15.
- the power source 11 is an electric motor
- the battery 12 can supply electric power to the power source 11 and each element constituting the vehicle 1.
- A rechargeable secondary battery is used as the battery 12; examples thereof include a lead storage battery, a lithium-ion battery, and a nickel-metal hydride battery.
- the battery 12 can be charged by connecting it to a power source capable of supplying a predetermined voltage via a cable.
- Alternatively, the battery may be replaced with a charged battery at a battery exchange station provided along the traveling route, and the charged battery 12 may be mounted on the vehicle.
- the operation mechanism 13 is configured to be able to input an operation for controlling the power source 11, and for example, outputs a predetermined signal to the vehicle control device 14 described later based on the operation input by the user.
- Examples of the operation input to the operation mechanism 13 include a rotation operation using a predetermined key corresponding to the vehicle (an ignition key, a remote key, etc.) and a pressing operation using a push switch (a start switch, etc.).
- The vehicle control device 14 is an ECU (electronic control unit) capable of controlling the operation of the entire vehicle 1A; for example, it can send and receive signals to and from each component of the vehicle 1A via predetermined signal lines. As an example, the vehicle control device 14 can receive a signal corresponding to an operation input to the operation mechanism 13 and control the power source 11 to start.
- the function of the vehicle control device 14 can be realized by either hardware or software.
- the function of the vehicle control device 14 may be realized by the CPU (central processing unit) executing a predetermined program using the memory.
- Alternatively, the function of the vehicle control device 14 may be realized by a known semiconductor device such as a PLD (programmable logic device) or an ASIC (application-specific integrated circuit).
- Although the vehicle control device 14 is shown here as a single element, it may be divided into two or more elements as needed.
- the communication device 15 has an antenna for realizing communication with the control device CNT via the network NT. Further, the communication device 15 includes a TCU (telematics control unit) and the like that perform signal processing for realizing communication with the control device CNT via the network NT.
- The TCU can acquire voltage information indicating the voltage value of the battery 12 from the battery 12, and can acquire control information indicating the control state of the vehicle 1A from the vehicle control device 14 (ECU).
- the TCU transmits the acquired voltage information of the battery 12 and the control information of the vehicle control device 14 (ECU) to the control device CNT via the network NT. Further, the TCU can intervene in the vehicle control in the vehicle control device 14 based on the information received from the control device CNT.
- The communication device 15 can perform vehicle-to-vehicle communication among the plurality of vehicles constituting the group GR; the communication device 15 of the vehicle 1A wirelessly communicates with the other vehicle 1B constituting the group GR so that information can be exchanged between the vehicles.
- Through this vehicle-to-vehicle communication, the vehicle control device 14 can control the vehicle speed and the inter-vehicle distance when traveling in the photographing area so that they are adjusted within the group GR.
- the detection device 16 includes various sensors for detecting various states of the vehicle 1A, for example, a gyro sensor, a GPS sensor, a vehicle speed sensor for detecting vehicle speed information, and the like.
- the vehicle control device 14 can control the vehicle 1A based on the information detected by the detection device 16, and the communication device 15 transmits the detection result of the detection device 16 to the control device CNT via the network NT. It is possible.
- the gyro sensor detects the rotational movement of the vehicle 1A.
- the vehicle control device 14 can determine the course of the vehicle 1A based on the detection result of the gyro sensor, the vehicle speed sensor, and the like.
- the GPS sensor detects the current position of the vehicle 1A.
- the communication device 15 can wirelessly communicate with a server device that provides map information and traffic information to acquire information on the current position of the vehicle 1A.
- The communication device 15 and the detection device 16 function as an acquisition unit for acquiring the position information of the vehicle, and
- the communication device 15 functions as a vehicle communication unit for transmitting the position information of the vehicle via the network NT.
- the display device 17 is configured to be able to display the remaining battery level of the battery 12 and the notification information received from the control device CNT, together with the vehicle speed meter and the tachometer.
- The display device 17 displays the notification information to the user and can urge the user to adjust the vehicle speed and the inter-vehicle distance. This makes it possible to travel in the shooting area with the vehicle speed and the inter-vehicle distance adjusted within the group GR in preparation for shooting.
- the control device CNT has a processing unit 21, a storage unit 22, and a communication interface unit 23 (communication I / F), and is installed in, for example, a management company that provides a vehicle use service.
- The processing unit 21 is composed of a processor including a CPU and a memory, and
- the storage unit 22 is composed of a RAM serving as a program processing area, a ROM storing various programs and data, and a relatively large-capacity HDD (hard disk drive); it may also be arranged in a distributed manner on the cloud.
- The processing unit 21 communicates with the vehicles 1A and 1B and the unmanned aerial vehicle DRN through the communication interface unit 23 via the network NT, and can store information about the vehicles 1A and 1B and the unmanned aerial vehicle DRN in the storage unit 22 or read such information from the storage unit 22. Further, it can store the image data photographed by the photographing unit 200 of the unmanned aerial vehicle DRN in the storage unit 22.
- the storage unit 22 registers subject information in which the subject is set as a shooting target.
- the subject information includes the terminal information of the user and the vehicle information of the vehicle used by the user.
- the storage unit 22 can register group information in which the users of the plurality of vehicles are set as one group GR.
- the group GR is composed of a user A who uses the vehicle 1A and a user B who uses the vehicle 1B.
- FIG. 2 is a block diagram illustrating the functional configuration of the unmanned aerial vehicle DRN.
- the photographing unit 200 is a camera mounted on the unmanned aerial vehicle DRN, and the photographing unit 200 is configured to be capable of photographing a plurality of vehicles in the flight state of the unmanned aerial vehicle DRN.
- the photographing unit 200 of the unmanned aerial vehicle DRN can take a still image or a moving image.
- the communication interface unit 201 (communication I / F) can communicate with the vehicles 1A and 1B and the control device CNT via the network NT.
- the communication interface unit 201 transmits the image data captured by the photographing unit 200 to the control device CNT.
- the identification unit 202 identifies the subject based on the subject information distributed from the subject terminal (portable terminal SP) and the subject information transmitted from the control device CNT (communication control unit 230).
- The information used to identify the subject is not limited to the subject information distributed from the terminal of the subject; it is also possible to identify the plurality of vehicles constituting the group GR based on the vehicle information distributed from the plurality of vehicles 1A and 1B and the subject information (group information) transmitted from the control device CNT.
- That is, the group information including the vehicle information for identifying each vehicle is transmitted from the communication interface unit 23 of the control device CNT to the unmanned aerial vehicle DRN as information for identifying the group GR. Further, while the plurality of vehicles 1A and 1B are traveling, vehicle information identifying each vehicle is distributed from the communication device 15, and by collating this vehicle information with the group information transmitted from the communication interface unit 23 of the control device CNT, the identification unit 202 can identify the plurality of vehicles constituting the group GR, as sketched below.
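For illustration only, the following is a minimal Python sketch of how such collation might look. The `GroupInfo` structure, the vehicle identifiers, and the terminal identifiers are assumptions made for the example, not a data model defined in this application.

```python
from dataclasses import dataclass, field

@dataclass
class GroupInfo:
    """Hypothetical layout of the group information registered in the database DB."""
    group_id: str
    vehicle_ids: set = field(default_factory=set)        # vehicle information identifying each vehicle
    user_terminal_ids: set = field(default_factory=set)  # terminal information of the users

def is_group_vehicle(broadcast_vehicle_id: str, group: GroupInfo) -> bool:
    """Collate a vehicle ID distributed from a traveling vehicle with the group
    information transmitted from the control device CNT."""
    return broadcast_vehicle_id in group.vehicle_ids

# Example: the identification unit 202 checks received IDs against the group GR.
group_gr = GroupInfo("GR", vehicle_ids={"VEH-1A", "VEH-1B"})
print(is_group_vehicle("VEH-1A", group_gr))  # True  -> belongs to group GR
print(is_group_vehicle("VEH-9Z", group_gr))  # False -> not a shooting target
```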
- The shooting control unit 203 controls the shooting unit 200 based on the control signal so as to control the shooting of the plurality of vehicles identified by the identification unit 202.
- Based on the control signal, the shooting control unit 203 can control the shooting unit 200 to move its angle of view in the horizontal direction (pan control), to move its angle of view in the vertical direction (tilt control), and to enlarge (zoom in) or reduce (zoom out) the angle of view for shooting.
- the rotor 204 rotates with the motor 205 as the drive source, and the propulsive force of the unmanned aerial vehicle DRN is generated.
- the unmanned aerial vehicle DRN is provided with at least four rotors 204 and motors 205, and the flight control unit 207 can control the output of each motor 205.
- the flight control unit 207 can turn to change the flight position and change the flight altitude based on the control signal transmitted from the control device CNT.
- The sensor 206 is, for example, a distance sensor, and detects the distance between the plurality of vehicles identified by the identification unit 202 and the unmanned aerial vehicle DRN.
- FIG. 2 is a block diagram illustrating the functional configuration of the processing unit 21.
- FIG. 3 is a diagram schematically illustrating the processing of the determination unit 210.
- The determination unit 210 can perform various determination processes, and determines whether or not the subject exists in a predetermined shooting area based on the position information of the subject acquired by communication with the terminal of the subject and the map information. For example, when the users of a plurality of vehicles are set as the subject, the determination unit 210 determines, based on the position information of the plurality of vehicles 1A and 1B set as the group GR and the map information, whether or not the plurality of vehicles 1A and 1B are traveling in the shooting area. The determination unit 210 can access the map information database built in the storage unit 22, and determines whether or not the vehicles are traveling in the set shooting area by comparing the position information of the plurality of vehicles with the map information.
- When it is determined that at least one of the plurality of vehicles 1A and 1B set as the group GR (for example, the vehicle 1A) has entered the shooting area, the determination unit 210 determines that the plurality of vehicles 1A and 1B are traveling in the shooting area (a minimal sketch of this determination follows).
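A minimal sketch of this area determination, assuming the shooting area can be approximated by a latitude/longitude bounding box derived from the map information. The application itself compares positions against a map information database; the coordinates and the rectangular area below are placeholders for illustration.

```python
from typing import Iterable, Tuple

# Hypothetical shooting area expressed as a latitude/longitude bounding box.
SHOOTING_AREA = {"lat_min": 35.300, "lat_max": 35.310,
                 "lon_min": 139.480, "lon_max": 139.495}

def in_shooting_area(position: Tuple[float, float], area=SHOOTING_AREA) -> bool:
    """Check one vehicle position against the (simplified) shooting area."""
    lat, lon = position
    return (area["lat_min"] <= lat <= area["lat_max"]
            and area["lon_min"] <= lon <= area["lon_max"])

def group_in_shooting_area(positions: Iterable[Tuple[float, float]]) -> bool:
    """The group is treated as traveling in the shooting area as soon as at
    least one vehicle of the group has entered it."""
    return any(in_shooting_area(p) for p in positions)

# Example: vehicle 1A is inside the area, vehicle 1B is still outside.
print(group_in_shooting_area([(35.305, 139.488), (35.299, 139.470)]))  # True
```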
- The signal generation unit 220 can generate various signals based on the determination of the determination unit 210. For example, the signal generation unit 220 generates a control signal for controlling the shooting unit 200 of the unmanned aerial vehicle DRN based on the determination of the determination unit 210.
- When shooting has not yet started and the determination unit 210 determines that the plurality of vehicles are traveling in the shooting area, the signal generation unit 220 generates a control signal instructing the start of shooting.
- After the start of shooting, when the determination unit 210 determines that at least one of the plurality of vehicles is still traveling in the shooting area, the signal generation unit 220 generates a control signal instructing the continuation of shooting.
- When the determination unit 210 determines that all of the plurality of vehicles 1A and 1B have left the shooting area, the signal generation unit 220 generates a control signal instructing the end of shooting. This decision logic is sketched below.
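The start/continue/end decision of the signal generation unit 220 can be summarized as a small decision function. The sketch below is illustrative only; the signal names ("START", "CONTINUE", "END") are assumptions, not identifiers defined in this application.

```python
from typing import Optional, Sequence

def decide_control_signal(vehicle_in_area: Sequence[bool],
                          shooting_started: bool) -> Optional[str]:
    """Decide which control signal the signal generation unit 220 would issue.

    vehicle_in_area: per-vehicle flags saying whether each group vehicle is
    currently inside the shooting area.
    """
    any_inside = any(vehicle_in_area)
    if not shooting_started:
        # At least one group vehicle entered the area -> instruct start of shooting.
        return "START" if any_inside else None
    # Shooting already started: continue while any vehicle remains in the area,
    # end once all vehicles have left it.
    return "CONTINUE" if any_inside else "END"

# Example run mirroring steps S415 -> S435 -> S445 of the processing flow.
print(decide_control_signal([True, False], shooting_started=False))  # START
print(decide_control_signal([True, False], shooting_started=True))   # CONTINUE
print(decide_control_signal([False, False], shooting_started=True))  # END
```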
- The communication control unit 230 can transmit the signals generated by the signal generation unit 220 via the communication interface unit 23; for example, it transmits the control signal generated by the signal generation unit 220 to the unmanned aerial vehicle DRN via the communication interface unit 23.
- the communication control unit 230 transmits subject information and control signals to the unmanned aerial vehicle DRN.
- the storage unit 22 registers group information in which users of a plurality of vehicles constituting the subject are set as a group to be photographed as a part of the subject information.
- the communication control unit 230 transmits subject information (including group information) and control signals to the unmanned aerial vehicle DRN.
- The identification unit 202 of the unmanned aerial vehicle DRN can identify the plurality of vehicles constituting the group GR based on the group information, and the shooting unit 200 can then photograph the identified plurality of vehicles.
- It is also possible for the signal generation unit 220 to generate an area notification signal for notifying the users that a vehicle has entered the shooting area or the preparation area, and for the communication control unit 230 to transmit the area notification signal to the plurality of vehicles.
- the communication control unit 230 transmits a control signal generated by the signal generation unit 220 to instruct the continuation of shooting or a control signal instructing the end of shooting to the unmanned aerial vehicle DRN.
- the shooting control unit 203 of the unmanned aerial vehicle DRN controls the shooting unit 200 based on the control signal instructing the continuation of shooting to control the shooting to continue. Further, the shooting control unit 203 of the unmanned aerial vehicle DRN controls the shooting unit 200 based on the control signal instructing the end of shooting to control the shooting to end.
- the image processing unit 240 can perform image processing for extracting the user's face from the image data taken by the photographing unit 200 of the unmanned aerial vehicle DRN.
- the image processing unit 240 performs image processing on the image of each frame.
- the image processing unit 240 can also perform image processing on an image sampled at a predetermined frame rate.
- The image determination unit 250 determines, based on the result of the image processing and the group information, whether or not the faces of the users set as the group GR have been photographed (a sketch of this face extraction and determination follows).
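A minimal sketch of the face extraction and image determination described above. The application does not specify a face detection algorithm, so OpenCV's bundled Haar cascade detector is used here purely as a stand-in, and `group_size` is assumed to come from the registered group information.

```python
import cv2

# Stand-in face detector (Haar cascade shipped with OpenCV); illustrative only.
_face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def count_faces(image_bgr) -> int:
    """Image processing step: extract user faces from one captured frame."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = _face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces)

def all_group_faces_captured(image_bgr, group_size: int) -> bool:
    """Image determination step: True when at least the set number of group
    members' faces appear in the captured image data."""
    return count_faces(image_bgr) >= group_size
```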
- The backlight determination unit 260 determines whether or not the shooting state is backlit based on the image data photographed by the shooting unit 200. For example, when the image data taken by the shooting unit 200 includes a region whose pixel values locally exceed a reference pixel value, the backlight determination unit 260 can determine that the image was taken in a backlit shooting state. In this case, the signal generation unit 220 generates a flight control signal instructing a change of the flight position of the unmanned aerial vehicle DRN so as to avoid the backlight, or generates a control signal so as to change the angle of view of the shooting unit 200. A specific process will be described in the additional processing relating to backlight determination following step S570 of FIG. 5.
- FIG. 4 is a diagram illustrating a processing flow of the storage unit 22 and the processing unit 21 (determination unit 210, signal generation unit 220, communication control unit 230).
- In step S400, the storage unit 22 registers the information of the group GR.
- When the control device CNT acquires the information (user information and vehicle information) transmitted from the information processing device 18 (external terminal) via the communication interface unit 23, the storage unit 22 registers, in the database DB,
- the group information in which the users A and B of the plurality of vehicles 1A and 1B are set as one group GR.
- In step S405, the determination unit 210 acquires the map information from the map information database built in the storage unit 22.
- In step S410, the determination unit 210 acquires the position information of the plurality of vehicles, and in step S415, the determination unit 210 determines, based on the position information of the plurality of vehicles 1A and 1B set as the group GR and the map information, whether or not the plurality of vehicles 1A and 1B are traveling in the predetermined shooting area.
- If, in the determination of step S415, the vehicles are not traveling in the shooting area (S415-No), the determination unit 210 returns the process to step S410 and repeats the same processing.
- If the determination of step S415 indicates that the vehicles are traveling in the shooting area (S415-Yes), the process proceeds to step S420.
- In step S420, the signal generation unit 220 generates a control signal instructing the start of shooting as the control signal for controlling the shooting unit 200 of the unmanned aerial vehicle DRN, based on the determination of the determination unit 210.
- In step S425, the communication control unit 230 transmits the registered information of the group GR and the generated control signal to the unmanned aerial vehicle DRN.
- the identification unit 202 of the unmanned aerial vehicle DRN identifies a plurality of vehicles constituting the group GR based on the group information, and the photographing unit 200 photographs the specified plurality of vehicles.
- In step S430, the determination unit 210 determines whether all of the plurality of vehicles 1A and 1B have left the shooting area. When not all of the vehicles have left the shooting area (S430-No), that is,
- when the determination unit 210 determines that at least one of the plurality of vehicles is still traveling in the shooting area, the process proceeds to step S435.
- In step S435, the signal generation unit 220 generates a control signal instructing the continuation of shooting, and
- in step S440, the communication control unit 230 transmits the control signal instructing the continuation of shooting generated by the signal generation unit 220 to the unmanned aerial vehicle DRN.
- the shooting control unit 203 of the unmanned aerial vehicle DRN controls the shooting unit 200 based on the control signal instructing the continuation of shooting to control the shooting to continue.
- When all of the plurality of vehicles have left the shooting area (S430-Yes), the signal generation unit 220 generates a control signal instructing the end of shooting in step S445.
- In step S450, the communication control unit 230 transmits the control signal instructing the end of shooting generated by the signal generation unit 220 to the unmanned aerial vehicle DRN.
- the shooting control unit 203 of the unmanned aerial vehicle DRN controls the shooting unit 200 based on the control signal instructing the end of shooting to control the shooting to end.
- FIG. 5 is a diagram illustrating a processing flow of the image processing unit 240 and the image determination unit 250.
- First, the image processing unit 240 acquires the image data captured by the shooting unit 200 of the unmanned aerial vehicle DRN (step S500).
- In step S510, the image processing unit 240 performs image processing for extracting the user's face from the image data.
- In step S520, the image determination unit 250 performs image determination of whether or not the faces of the users set as the group GR have been photographed, based on the result of the image processing acquired in step S510 and the group information registered in advance.
- If the faces of the set number of users have been photographed in the determination of step S520 (S520-Yes), the process proceeds to step S530.
- In step S530, the storage unit 22 stores the captured image data in the database and ends the process.
- The image data stored in the database can be provided for preview display on the information processing device 18 (external terminal) or on the portable terminals SP of the plurality of users A and B when the rented vehicles are returned. If users A and B like the captured images, they can purchase the image data. In this case, the image data can be downloaded to the portable terminals SP (smartphones) of the plurality of users A and B, or stored in a storage medium and provided to the users.
- If the faces of the set number of users have not been photographed in the determination of step S520 (S520-No), the process proceeds to step S540.
- In step S540, the signal generation unit 220 generates a parameter control signal for controlling the shooting parameters.
- When the faces of the set number of users have not been photographed, the signal generation unit 220 generates the parameter control signal for controlling the shooting parameters of the shooting unit 200 so that the faces of all of the set number of users can be photographed.
- For example, the signal generation unit 220 can generate, as the shooting parameters, a parameter control signal that moves the angle of view of the shooting unit 200 in the horizontal direction (pan control) or a parameter control signal that moves the angle of view in the vertical direction (tilt control). It can also generate a parameter control signal for shooting with the angle of view enlarged (zoomed in) or reduced (zoomed out), as in the sketch below.
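As an illustration of what such a parameter control signal could carry, the sketch below encodes pan/tilt/zoom adjustments as a simple message. The field names, value ranges, and JSON encoding are assumptions for the example, not a format defined in this application.

```python
import json
from dataclasses import asdict, dataclass

@dataclass
class ParameterControlSignal:
    """Hypothetical payload of a parameter control signal (pan/tilt/zoom)."""
    pan_deg: float = 0.0    # positive = move the angle of view horizontally to the right
    tilt_deg: float = 0.0   # positive = move the angle of view vertically upward
    zoom: float = 1.0       # >1.0 zooms in, <1.0 zooms out

def widen_view_signal() -> str:
    """Example: zoom out slightly so that the faces of all group members fit."""
    return json.dumps(asdict(ParameterControlSignal(zoom=0.8)))

print(widen_view_signal())  # {"pan_deg": 0.0, "tilt_deg": 0.0, "zoom": 0.8}
```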
- In step S550, the communication control unit 230 transmits the parameter control signal generated by the signal generation unit 220 to the unmanned aerial vehicle DRN.
- The shooting control unit 203 of the unmanned aerial vehicle DRN controls the shooting unit 200 based on the parameter control signal to perform shooting.
- By controlling the angle of view of the shooting unit 200 based on the parameter control signal, the faces of all the members of the group can be photographed.
- In step S560, the signal generation unit 220 generates a shooting guidance signal for guiding the users to redo the shooting.
- When the image determination unit 250 determines that the faces of the set number of users A and B have not been photographed, the signal generation unit 220 generates a shooting guidance signal that guides users A and B to redo the shooting.
- In step S570, the communication control unit 230 transmits the shooting guidance signal to the plurality of vehicles 1A and 1B.
- the display device 17 of each vehicle shows the user a display based on the shooting guidance signal and guides the user to take a picture again.
- the photography guidance signal can be generated and the photography guidance signal can be transmitted to a plurality of vehicles to immediately retake the picture in the shooting area. It will be possible.
- step S570 the process is returned to step S500, and the same process is repeatedly executed thereafter.
- the backlight determination unit 260 determines whether or not the photographing state is backlight based on the image data photographed by the photographing unit 200. For example, when the image data taken by the photographing unit 200 includes a region where the pixel value locally exceeds the reference pixel value, the backlight determination unit 260 determines that the image is taken in the backlit shooting state.
- the signal generation unit 220 generates a flight control signal instructing the change of the flight position of the unmanned aerial vehicle DRN so as to avoid the backlight when the shooting state is determined to be backlight. For example, the signal generation unit 220 generates a flight control signal instructing the unmanned aerial vehicle DRN to turn so that the sun does not enter within the viewing angle of the photographing unit.
- the communication control unit 230 transmits a flight control signal to the unmanned aerial vehicle DRN.
- the flight control unit 207 of the unmanned aerial vehicle changes the flight position based on the flight control signal.
- the signal generation unit 220 controls to move the angle of view of the shooting unit 200 in the horizontal direction (Pan control) or moves the angle of view of the shooting unit 200 to the vertical method when the shooting state is determined to be backlight. It is also possible to generate a control signal (parameter control signal) so as to perform control (Perpendicular control).
- the communication control unit 230 transmits a parameter control signal to the unmanned aerial vehicle DRN.
- the shooting control unit 203 of the unmanned aerial vehicle DRN changes the angle of view of the shooting unit 200 based on the parameter control signal.
- shooting is performed in a state where the backlight is avoided by changing the flight position of the unmanned aerial vehicle or changing the angle of view of the shooting unit 200. Becomes possible.
- the control device CNT checks the inter-vehicle distance when a plurality of vehicles 1A and 1B travel in the shooting area or a predetermined preparation area set in front of the shooting area, and the inter-vehicle distance suitable for shooting (predetermined reference distance).
- the notification information is transmitted to a plurality of vehicles so as to be within the range).
- a distance notification signal notifying that the inter-vehicle distance is too wide is transmitted to a plurality of vehicles 1A and 1B, respectively. Notify the user of the vehicle.
- an approach notification signal for notifying that the inter-vehicle distance is too close is transmitted to a plurality of vehicles 1A and 1B to each vehicle. Notify the user of.
- FIG. 6 is a diagram for explaining the flow of the inter-vehicle distance adjustment process of a plurality of vehicles.
- the determination unit 210 acquires the position information of the plurality of vehicles 1A and 1B traveling in the shooting area or a predetermined preparation area set in front of the shooting area.
- the determination unit 210 acquires the inter-vehicle distances of the plurality of vehicles 1A and 1B based on the position information. For example, the determination unit 210 can acquire the inter-vehicle distance based on the difference in the position information.
- step S620 when the determination unit 210 determines that the acquired inter-vehicle distance exceeds the upper limit of the predetermined reference distance range (S620-Yes), the process proceeds to step S630.
- step S630 the signal generation unit 220 generates a distance notification signal for notifying the user that the inter-vehicle distance exceeds the upper limit of the reference distance range.
- step S640 the communication control unit 230 transmits the distance notification signal to the plurality of vehicles 1A and 1B. If the inter-vehicle distance is too wide, it may not be possible to shoot multiple users at the same time when traveling in the shooting area, so the distance to notify that the inter-vehicle distance exceeds the upper limit of the reference distance range. By transmitting a notification signal to a plurality of vehicles and notifying the user, it is possible to urge the user to reduce the inter-vehicle distance.
- the determination unit 210 proceeds to the process in step S650.
- step S650 when the determination unit 210 determines that the acquired inter-vehicle distance is not equal to or less than the lower limit of the reference distance range (S650-No), the determination unit 210 returns the process to step S600 and repeatedly executes the same process. ..
- the inter-vehicle distances of the plurality of vehicles 1A and 1B are inter-vehicle distances suitable for shooting (distances within a predetermined reference distance range), and the notification signals (distance notification signal, approach notification signal) are not generated. Continue to execute the inter-vehicle distance check process.
- the determination unit 210 proceeds to the process in step S660.
- step S660 the signal generation unit 220 generates an approach notification signal for notifying the user that the inter-vehicle distance is equal to or less than the lower limit of the reference distance range.
- step S670 the communication control unit 230 transmits an approach notification signal to a plurality of vehicles. If the inter-vehicle distance is too close, multiple users may overlap when traveling in the shooting area and may not be able to shoot at the same time. By transmitting a signal to a plurality of vehicles and notifying the user, it is possible to urge the user to increase the inter-vehicle distance.
- the control device CNT checks the speed difference between the vehicles when the plurality of vehicles 1A and 1B travel in the shooting area or a predetermined preparation area set in front of the shooting area, and the speed difference suitable for shooting (predetermined).
- the speed notification signal is transmitted to a plurality of vehicles so as to be equal to or less than the reference speed of.
- FIG. 7 is a diagram illustrating a flow of speed difference adjustment processing of a plurality of vehicles.
- the determination unit 210 acquires speed information of a plurality of vehicles 1A and 1B traveling in the shooting area or a predetermined preparation area set in front of the shooting area.
- the determination unit 210 acquires the speed difference between the plurality of vehicles 1A and 1B based on the speed information of the plurality of vehicles 1A and 1B. For example, the determination unit 210 can acquire the speed difference between vehicles based on the difference in speed information.
- step S720 if the acquired speed difference does not exceed the reference speed (S720-No), the determination unit 210 returns the process to step S700 and repeatedly executes the same process.
- the speed difference between the plurality of vehicles 1A and 1B is a speed difference suitable for shooting (below a predetermined reference speed), and the speed difference check process is continuously executed without generating a speed notification signal. ..
- step S720-Yes the determination unit 210 proceeds to the process in step S730.
- step S730 the signal generation unit 220 generates a speed notification signal for notifying the user that the speed difference exceeds the reference speed.
- step S740 the communication control unit 230 transmits speed notification signals to the plurality of vehicles 1A and 1B.
- a speed notification signal for notifying that the speed difference exceeds the reference speed is sent to multiple vehicles. By transmitting and notifying the user, it is possible to urge the user to reduce the speed difference.
- the storage unit 22 of the control device CNT registers as subject information user information set as a shooting target for a single or a plurality of pedestrians or users of a single vehicle constituting the subject.
- the subject information includes the terminal information of the user and the vehicle information of the vehicle used by the user.
- the terminal information of the pedestrian is registered in the storage unit 22 as the subject information in the subject information.
- the determination unit 210 determines whether or not the subject exists in a predetermined shooting area based on the position information and the map information of the subject acquired by communication with the terminal of the subject (for example, the SP in FIG. 1). That is, the determination unit 210 determines whether or not the subject has entered the predetermined shooting area (FIG. 3) based on the position information and the map information of the subject.
- the signal generation unit 220 generates a control signal instructing the start of shooting
- the communication control unit 230 uses the signal generation unit 220.
- the generated control signal and subject information are transmitted to the unmanned aerial vehicle DRN.
- the identification unit 202 of the unmanned aerial vehicle DRN identifies the subject based on the subject information delivered from the terminal of the subject and the subject information transmitted from the communication control unit 230, and the shooting control unit 203 of the unmanned aerial vehicle DRN controls.
- the photographing unit 200 is controlled based on the signal to start photographing the subject.
- the signal generation unit 220 indicates that the subject has entered the shooting area or the preparation area.
- the area notification signal to be notified is generated, and the communication control unit 230 transmits the area notification signal to the subject.
- control device CNT has an image processing unit 240 that performs image processing for extracting the face of the subject from the image data taken by the photographing means, and the face of the subject is photographed based on the result of the image processing. It is provided with an image determination unit 260 for determining whether or not the image is correct.
- the signal generation unit 220 When the subject's face is not photographed by the determination of the image determination unit 260, the signal generation unit 220 generates a parameter control signal for controlling the imaging parameters of the imaging unit 200 so that the subject's face can be photographed, and communicates.
- the control unit 230 transmits the generated parameter control signal to the unmanned aerial vehicle DRN.
- the shooting control unit 203 of the unmanned aerial vehicle DRN controls the shooting unit 200 based on the parameter control signal to shoot the subject. Further, when the face of the subject is not photographed by the determination of the image determination unit, the signal generation unit 220 generates a shooting guidance signal for guiding the subject to redo the shooting, and the communication control unit 230 is the terminal of the subject. Sends a shooting guidance signal to. Upon receiving the shooting guidance signal, the subject's terminal displays a display based on the shooting guidance signal to the user and guides the user to shoot again.
- the communication control unit 230 can also transmit a shooting guidance signal to the vehicle of the user who is the subject.
- the vehicle display device 17 shows the user a display based on the shooting guidance signal and guides the user to take a picture again.
- the photographing guidance signal is generated and the photography guidance signal is transmitted to the vehicle, so that the photograph can be immediately retaken in the shooting area.
- the photographing system of the above embodiment includes an unmanned aerial vehicle (for example, DRN of FIG. 1) having a photographing means (for example, 200 in FIG. 1) capable of photographing a subject in a flight state, and a terminal of the subject (for example, FIG. 1).
- An imaging system eg, STM of FIG. 1 having a control device (eg, CNT of FIG. 1) capable of communicating with the SP) and the unmanned aerial vehicle.
- the control device (CNT) is A storage means (for example, 22 in FIG. 1) for registering subject information in which the subject is set as a shooting target, and A determination means (for example, 210 in FIG.
- a signal generation means for example, 220 in FIG. 2) that generates a control signal for controlling the photographing means based on the determination of the determination means.
- a communication control means for example, 230 in FIG. 2) for transmitting the subject information and the control signal to the unmanned aerial vehicle is provided.
- the unmanned aerial vehicle (DRN) Specific means for identifying the subject (for example, 202 in FIG. 2) based on the subject delivered from the terminal of the subject and the subject information transmitted from the communication control means.
- the imaging control means for example, 203 in FIG. 2) is provided by controlling the imaging means based on the control signal to control the imaging of the subject specified by the specific means.
- the shooting system of configuration 1 it is possible to provide a technique capable of shooting a pre-registered subject with an unmanned aerial vehicle.
- the storage means registers as the subject information group information in which users of a plurality of vehicles constituting the subject are set as a group to be photographed.
- the determination means determines whether or not the subject is traveling in a predetermined shooting area based on the position information and the map information of the subject.
- the shooting system of configuration 2 it is possible to shoot a user traveling in a vehicle with an unmanned aerial vehicle, with a user of a plurality of vehicles registered in advance as a group as a subject. That is, according to the shooting system of the configuration 1, it is possible to shoot a user traveling in a plurality of vehicles at the same time by an unmanned aerial vehicle based on preset group information, and a user who wants the group to shoot at the same time. It is possible to provide a shooting system that meets the needs.
- the determination means (210) determines that at least one of the plurality of vehicles set as the group has entered the photographing area, the plurality of vehicles take the image. Judging that the area is running,
- the signal generation means (220) generates a control signal instructing the start of photographing, and generates a control signal.
- the photographing control means (230) controls the photographing means (200) based on the control signal to start photographing a group of the plurality of vehicles (1A, 1B).
- the timing of shooting start can be controlled, and even if the shooting area is small or the inter-vehicle distance of the group vehicles is wide, the vehicle that first entered the shooting area is selected. You can shoot.
- the signal generation means (220) when the determination means (210) determines that at least one of the plurality of vehicles (1A, 1B) is traveling in the photographing area, The signal generation means (220) generates a control signal instructing the continuation of shooting to generate a control signal.
- the imaging control means (230) controls the imaging means based on the control signal to continue the imaging.
- the signal generation means (220) When the determination means (210) determines that all of the plurality of vehicles have left the shooting area, The signal generation means (220) generates a control signal instructing the end of photographing, and generates a control signal.
- the imaging control means (230) controls the imaging means based on the control signal to end the imaging.
- the shooting system of configuration 4 it is possible to control the timing of the end of shooting, and it is possible to shoot a plurality of vehicles set as a group without exception.
- the determination means (210) determines that all of the plurality of vehicles (1A, 1B) set as the group have entered the photographing area
- the plurality of vehicles (1A, 1B) 1A, 1B) is determined to be traveling in the shooting area
- the signal generation means (220) generates a control signal instructing the start of photographing, and generates a control signal.
- the shooting control means (230) controls the shooting means based on the control signal to start shooting a group of the plurality of vehicles.
- the timing of shooting start can be controlled, and shooting can be performed when all the vehicles set as a group are in the shooting area, and the user's needs can be met. It is possible to shoot only the images that satisfy the requirements.
- the determination means (210) includes at least one of the plurality of vehicles (1A, 1B) in the photographing area or a predetermined preparation area set in front of the photographing area. If it is determined that The signal generation means (220) generates an area notification signal for notifying the user that the user has entered the shooting area or the preparation area. The communication control means (230) transmits the area notification signal to the plurality of vehicles.
- the shooting system of configuration 6 it is possible to perform driving in preparation for shooting, such as arranging a formation of vehicles in a group by notifying the user of an area notification signal before shooting.
- the control device is An image processing means (for example, 240 in FIG. 2) that performs image processing for extracting the user's face from the image data taken by the photographing means, and Based on the result of the image processing and the group information, an image determination means (for example, 250 in FIG. 2) for determining whether or not the faces of the users set as the group have been photographed is further provided.
- the signal generating means (220) A parameter control signal for controlling the shooting parameters of the shooting means is generated so that the faces of the set number of users can be shot when the faces of the set number of users are not shot.
- the photographing control means (203) of the unmanned aerial vehicle (DRN) is The imaging means is controlled based on the parameter control signal to perform the imaging.
- the shooting system of configuration 7 by controlling the shooting unit based on the parameter control signal, it is possible to shoot the faces of all the members of the group.
- the signal generation means (220) when the faces of the set number of users are not photographed by the determination of the image determination means (250), The signal generation means (220) generates a shooting guidance signal that guides the user to redo the shooting.
- the communication control means (230) transmits the shooting guidance signal to the plurality of vehicles (1A, 1B).
- a shooting guidance signal is generated to guide the user to redo the shooting, and the shooting guidance signal is transmitted to a plurality of vehicles. This makes it possible to immediately retake the picture in the shooting area.
- the determination means (210) is The inter-vehicle distances of the plurality of vehicles (1A, 1B) traveling in the photographing area or a predetermined preparation area set in front of the photographing area are acquired based on the position information, and the inter-vehicle distance is a predetermined reference distance.
- the signal generating means (220) A distance notification signal for notifying the user that the inter-vehicle distance exceeds the upper limit of the reference distance range is generated.
- the communication control means (230) transmits the distance notification signal to the plurality of vehicles (1A, 1B).
- the inter-vehicle distance is the upper limit of the reference distance range.
- the determination means (210) is The inter-vehicle distances of the plurality of vehicles (1A, 1B) traveling in the photographing area or a predetermined preparation area set in front of the photographing area are acquired based on the position information, and the inter-vehicle distance is a predetermined reference distance.
- the signal generating means (220) An approach notification signal for notifying the user that the inter-vehicle distance is equal to or less than the lower limit of the reference distance range is generated.
- the communication control means (230) transmits the approach notification signal to the plurality of vehicles (1A, 1B).
- the inter-vehicle distance is the lower limit of the reference distance range.
- the plurality of vehicles (1A, 1B) are Acquisition means for acquiring vehicle position information (for example, communication device 15 and detection device 16 in FIG. 1) and A vehicle communication means for transmitting the position information of the vehicle (for example, the communication device 15 in FIG. 1) and A detection means (for example, 16 in FIG. 1) for detecting the speed information of the vehicle is provided.
- the vehicle communication means (15) transmits the speed information to the control device, and the vehicle communication means (15) transmits the speed information to the control device.
- the determination means (210) of the control device (CNT) is The speed difference of the plurality of vehicles (1A, 1B) traveling in the shooting area or a predetermined preparation area set in front of the shooting area is acquired based on the speed information, and the speed difference is a predetermined reference speed. If it is determined that the value exceeds The signal generating means (220) A speed notification signal for notifying the user that the speed difference exceeds the reference speed is generated. The communication control means (230) transmits the speed notification signal to the plurality of vehicles.
- the shooting system of configuration 11 when the speed difference exceeds the reference speed, it may not be possible to shoot a plurality of users at the same time when traveling in the shooting area, so that the speed difference exceeds the reference speed.
- a speed notification signal By transmitting a speed notification signal to a plurality of vehicles and notifying the user, it is possible to urge the user to reduce the speed difference.
- the control device CNT is Further provided with a backlight determining means (for example, 260 in FIG. 2) for determining whether or not the photographing state is backlight based on the image data photographed by the photographing means.
- the signal generation means (220) generates a flight control signal instructing the change of the flight position of the unmanned aerial vehicle so as to avoid the backlight when the shooting state is determined to be backlight.
- the communication control means (230) transmits the flight control signal to the unmanned aerial vehicle.
- the flight control means of the unmanned aerial vehicle (DRN) (for example, 207 in FIG. 2) is The flight position is changed based on the flight control signal.
- the shooting system of the configuration 12 when it is determined that the shooting state is backlit based on the shot image data, the shooting can be performed in a state where the backlight is avoided by changing the flight position of the unmanned aerial vehicle. It will be possible.
- the signal generating means (220) controls to move the angle of view of the photographing means (200) in the horizontal direction when the photographing state is determined to be backlight, or the photographing means.
- a parameter control signal is generated so as to control the movement of the angle of view of (200) in the vertical method.
- the communication control means (230) transmits the parameter control signal to the unmanned aerial vehicle.
- the imaging control means (203) of the unmanned aerial vehicle (DRN) is The angle of view of the photographing means (200) is changed based on the parameter control signal.
- the shooting system of the configuration 13 when it is determined that the shooting state is backlit based on the shot image data, the shooting can be performed in a state where the backlight is avoided by changing the angle of view of the shooting unit. It will be possible.
- the storage means (22) registers user information set as a shooting target for a user of a pedestrian or a single vehicle constituting the subject as the subject information.
- the determination means (210) determines whether or not the subject has entered a predetermined shooting area based on the position information and the map information of the subject.
- the photographing system of the configuration 14 it is possible to provide a technique capable of photographing a pedestrian or a user of a single vehicle by an unmanned aerial vehicle with a pedestrian or a user of a single vehicle as a subject registered in advance.
- the determination means (210) determines that the subject set as the shooting target has entered the shooting area.
- the signal generation means (220) generates a control signal instructing the start of photographing, and generates a control signal.
- the shooting control means (203) controls the shooting means (200) based on the control signal to start shooting the subject.
- the shooting system of configuration 15 it is possible to control the timing of shooting start. As a result, even when the shooting area is small, it is possible to shoot the subject without missing the shooting timing.
- the determination means (200) determines that the subject has entered the shooting area or a predetermined preparation area set in front of the shooting area.
- the determination means (200) determines that the subject has entered the shooting area.
- the signal generation means (220) generates an area notification signal for notifying the subject that the subject has entered the shooting area or the preparation area.
- the communication control means (230) transmits the area notification signal to the subject.
- the subject can prepare for shooting by notifying the subject of the area notification signal before shooting.
- the control device is An image processing means (for example, 240 in FIG. 2) that performs image processing for extracting the face of the subject from the image data taken by the photographing means (200), and An image determination means for determining whether or not the face of the subject has been photographed based on the result of the image processing (for example, 250 in FIG. 2) is further provided.
- the signal generating means (220) A parameter control signal for controlling the shooting parameters of the shooting means (200) is generated so that the face of the subject can be shot when the face of the subject is not shot.
- the photographing control means (203) of the unmanned aerial vehicle (DRN) is The imaging means (200) is controlled based on the parameter control signal to perform the imaging.
- the shooting system of configuration 17 it is possible to shoot the face of the subject by controlling the shooting unit based on the parameter control signal.
- the signal generation means (220) when the face of the subject is not photographed by the determination of the image determining means (250), The signal generation means (220) generates a shooting guidance signal that guides the subject to redo the shooting.
- the communication control means (230) transmits the shooting guidance signal to the subject.
- a shooting guidance signal for guiding the subject to redo the shooting is generated, and the shooting guidance signal is transmitted to the subject for shooting. It will be possible to take a picture again immediately in the area.
- the control device of the above embodiment is a control device (for example, of FIG. 1) capable of communicating with an unmanned aerial vehicle (for example, DRN of FIG. 1) having a photographing means (for example, 200 of FIG. CNT)
- a storage means for example, 22 in FIG. 1) for registering subject information in which the subject is set as a shooting target, and
- a determination means for example, 210 in FIG. 2) for determining whether or not the subject exists in a predetermined shooting area based on the position information and the map information of the subject.
- a signal generation means for example, 220 in FIG. 2) that generates a control signal for controlling the photographing means based on the determination of the determination means.
- a communication control means for example, 230 in FIG. 2) for transmitting the subject information and the control signal to the unmanned aerial vehicle is provided.
- the control method of the above embodiment is a control method in a control device capable of communicating with an unmanned aerial vehicle having a photographing means capable of photographing a subject in a flight state.
- a storage step (for example, S400 in FIG. 4) of registering subject information in which the subject is set as a shooting target in the storage means (22), and
- a determination step (for example, S415 in FIG. 4) in which the determination means (210) determines whether or not the subject exists in a predetermined shooting area based on the position information and the map information of the subject.
- a signal generation step (for example, S420 in FIG. 4) in which the signal generation means (220) generates a control signal for controlling the photographing means based on the determination in the determination step.
- the communication control means (230) has a communication control step (for example, S425 in FIG. 4) for transmitting the subject information and the control signal to the unmanned aerial vehicle.
- the program of the above embodiment is a program for causing a computer to execute each step of a control method in a control device capable of communicating with an unmanned aerial vehicle having a photographing means capable of photographing a subject in a flight state.
- a storage step for example, S400 in FIG. 4 of registering subject information in which the subject is set as a shooting target in the storage means (22), and
- a determination step for example, S415 in FIG. 4) in which the determination means (210) determines whether or not the subject exists in a predetermined shooting area based on the position information and the map information of the subject.
- the communication control means (230) has a communication control step (for example, S425 in FIG. 4) for transmitting the subject information and the control signal to the unmanned aerial vehicle.
- the storage medium of the above embodiment is A computer-readable storage medium that stores a program that causes a computer to execute each step of a control method in a control device capable of communicating with an unmanned aerial vehicle having a photographing means capable of photographing a subject in a flight state.
- a storage step (for example, S400 in FIG. 4) of registering subject information in which the subject is set as a shooting target in the storage means (22), and
- a determination step (for example, S415 in FIG. 4) in which the determination means (210) determines whether or not the subject exists in a predetermined shooting area based on the position information and the map information of the subject.
- a signal generation step for example, S420 in FIG.
- the communication control means (230) has a communication control step (for example, S425 in FIG. 4) for transmitting the subject information and the control signal to the unmanned aerial vehicle.
- control device of the configuration 19 the control method of the configuration 20, the program of the configuration 21, and the storage medium of the configuration 22, it is possible to provide a control technique for photographing a pre-registered subject by an unmanned aerial vehicle.
- 1A, 1B Vehicle, DRN: Unmanned aerial vehicle, 200: Imaging unit, 201: Communication interface unit (communication I / F), 202: Specific unit, 203: Imaging control unit, 207: Flight control unit, CNT: Control device ( Control server), 210: Judgment unit, 220: Signal generation unit, 230: Communication control unit, 240: Image processing unit, 250: Image judgment unit, 260: Backlight determination unit
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Radar, Positioning & Navigation (AREA)
- Automation & Control Theory (AREA)
- Traffic Control Systems (AREA)
- Studio Devices (AREA)
Abstract
Description
前記制御装置は、
前記被写体を撮影対象として設定した被写体情報を登録する記憶手段と、
前記端末との通信により取得した当該被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに存在するか判定する判定手段と、
前記判定手段の判定に基づいて前記撮影手段を制御する制御信号を生成する信号生成手段と、
前記被写体情報および前記制御信号を前記無人航空機に送信する通信制御手段と、を備え、
前記無人航空機は、
前記被写体の端末から配信される被写体情報と前記通信制御手段から送信された前記被写体情報とに基づいて、前記被写体を特定する特定手段と、
前記制御信号に基づいて前記撮影手段を制御して、前記特定手段により特定された前記被写体の撮影を制御する撮影制御手段と、
を備えることを特徴とする。
前記判定手段は、前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアを走行しているか否かを判定することを特徴とする。
前記信号生成手段は撮影開始を指示する制御信号を生成し、
前記撮影制御手段は、当該制御信号に基づいて前記撮影手段を制御して、前記複数の車両のグループの撮影を開始することを特徴とする。
前記信号生成手段は撮影継続を指示する制御信号を生成し、
前記撮影制御手段は、当該制御信号に基づいて前記撮影手段を制御して、前記撮影を継続し、
前記判定手段が、前記複数の車両のすべてが前記撮影エリアから出たと判定した場合に、
前記信号生成手段は撮影終了を指示する制御信号を生成し、
前記撮影制御手段は、当該制御信号に基づいて前記撮影手段を制御して、前記撮影を終了することを特徴とする。
前記信号生成手段は撮影開始を指示する制御信号を生成し、
前記撮影制御手段は、当該制御信号に基づいて前記撮影手段を制御して、前記複数の車両のグループの撮影を開始することを特徴とする。
前記信号生成手段は、当該撮影エリアまたは準備エリアに入ったことを前記ユーザーに報知するエリア報知信号を生成し、
前記通信制御手段は、前記複数の車両に前記エリア報知信号を送信することを特徴とする。
前記撮影手段で撮影された画像データから前記ユーザーの顔を抽出する画像処理を行う画像処理手段と、
前記画像処理の結果と前記グループ情報とに基づいて、前記グループとして設定された人数分のユーザーの顔が撮影されたか否かを判定する画像判定手段と、を更に備え、
前記信号生成手段は、
前記設定された人数分のユーザーの顔が撮影されなかった場合に、前記設定された人数分のユーザーの顔を撮影できるように、前記撮影手段の撮影パラメータを制御するパラメータ制御信号を生成し、
前記無人航空機の前記撮影制御手段は、
前記パラメータ制御信号に基づいて前記撮影手段を制御して前記撮影を行う
ことを特徴とする。
前記信号生成手段は、前記撮影のやり直しを前記ユーザーに案内する撮影案内信号を生成し、
前記通信制御手段は、前記複数の車両に前記撮影案内信号を送信することを特徴とする。
前記撮影エリアまたは前記撮影エリアの手前に設定された所定の準備エリアを走行する前記複数の車両の車間距離を前記位置情報に基づいて取得し、前記車間距離が所定の基準距離範囲の上限の距離を超えていると判定する場合に、
前記信号生成手段は、
前記車間距離が前記基準距離範囲の上限の距離を超えていることを前記ユーザーに報知する距離報知信号を生成し、
前記通信制御手段は、前記複数の車両に前記距離報知信号を送信することを特徴とする。
前記撮影エリアまたは前記撮影エリアの手前に設定された所定の準備エリアを走行する前記複数の車両の車間距離を前記位置情報に基づいて取得し、前記車間距離が所定の基準距離範囲の下限の距離以下と判定する場合に、
前記信号生成手段は、
前記車間距離が前記基準距離範囲の下限の距離以下であることを前記ユーザーに報知する接近報知信号を生成し、
前記通信制御手段は、前記複数の車両に前記接近報知信号を送信することを特徴とする。
車両の位置情報を取得する取得手段と、
前記車両の位置情報を送信する車両通信手段と、
前記車両の速度情報を検出する検出手段を更に備え、
前記車両通信手段は前記速度情報を前記制御装置に送信し、
前記制御装置の前記判定手段は、
前記撮影エリアまたは前記撮影エリアの手前に設定された所定の準備エリアを走行する前記複数の車両の速度差を前記速度情報に基づいて取得し、前記速度差が所定の基準速度を超えると判定した場合に、
前記信号生成手段は、
前記速度差が前記基準速度を超えることを前記ユーザーに報知する速度報知信号を生成し、
前記通信制御手段は、前記複数の車両に前記速度報知信号を送信することを特徴とする。
前記撮影手段により撮影された画像データに基づいて、撮影状態が逆光か否かを判定する逆光判定手段を更に備え、
前記信号生成手段は、前記撮影状態が逆光と判定された場合に、前記逆光を回避するように前記無人航空機の飛行位置の変更を指示する飛行制御信号を生成し、
前記通信制御手段は、前記無人航空機に前記飛行制御信号を送信し、
前記無人航空機の飛行制御手段は、
前記飛行制御信号に基づいて前記飛行位置を変更することを特徴とする。
前記通信制御手段は、前記無人航空機に前記パラメータ制御信号を送信し、
前記無人航空機の撮影制御手段は、
前記パラメータ制御信号に基づいて前記撮影手段の画角を変更することを特徴とする。
前記判定手段は、前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに入ったか否かを判定する
ことを特徴とする。
前記判定手段が、前記撮影対象として設定された前記被写体が前記撮影エリアに入ったと判定した場合に、
前記信号生成手段は撮影開始を指示する制御信号を生成し、
前記撮影制御手段は、当該制御信号に基づいて前記撮影手段を制御して、前記被写体の撮影を開始することを特徴とする。
前記判定手段は、前記撮影エリアまたは前記撮影エリアの手前に設定された所定の準備エリアに前記被写体が入ったと判定した場合に、
前記信号生成手段は、当該撮影エリアまたは準備エリアに入ったことを前記被写体に報知するエリア報知信号を生成し、
前記通信制御手段は、前記被写体に前記エリア報知信号を送信することを特徴とする。
前記撮影手段で撮影された画像データから前記被写体の顔を抽出する画像処理を行う画像処理手段と、
前記画像処理の結果に基づいて、前記被写体の顔が撮影されたか否かを判定する画像判定手段と、を更に備え、
前記信号生成手段は、
前記被写体の顔が撮影されなかった場合に、前記被写体の顔を撮影できるように、前記撮影手段の撮影パラメータを制御するパラメータ制御信号を生成し、
前記無人航空機の前記撮影制御手段は、
前記パラメータ制御信号に基づいて前記撮影手段を制御して前記撮影を行う
ことを特徴とする。
前記信号生成手段は、前記撮影のやり直しを前記被写体に案内する撮影案内信号を生成し、
前記通信制御手段は、前記被写体に前記撮影案内信号を送信することを特徴とする。
前記被写体を撮影対象として設定した被写体情報を登録する記憶手段と、
前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに存在するか判定する判定手段と、
前記判定手段の判定に基づいて前記撮影手段を制御する制御信号を生成する信号生成手段と、
前記被写体情報および前記制御信号を前記無人航空機に送信する通信制御手段と、を備えることを特徴とする。
前記被写体を撮影対象として設定した被写体情報を記憶手段に登録する記憶工程と、
判定手段が、前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに存在するか判定する判定工程と、
信号生成手段が、前記判定工程での判定に基づいて前記撮影手段を制御する制御信号を生成する信号生成工程と、
通信制御手段が、前記被写体情報および前記制御信号を前記無人航空機に送信する通信制御工程と、を有することを特徴とする。
前記被写体を撮影対象として設定した被写体情報を記憶手段に登録する記憶工程と、
判定手段が、前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに存在するか判定する判定工程と、
信号生成手段が、前記判定工程での判定に基づいて前記撮影手段を制御する制御信号を生成する信号生成工程と、
通信制御手段が、前記被写体情報および前記制御信号を前記無人航空機に送信する通信制御工程と、を有することを特徴とする。
前記被写体を撮影対象として設定した被写体情報を記憶手段に登録する記憶工程と、
判定手段が、前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに存在するか判定する判定工程と、
信号生成手段が、前記判定工程での判定に基づいて前記撮影手段を制御する制御信号を生成する信号生成工程と、
通信制御手段が、前記被写体情報および前記制御信号を前記無人航空機に送信する通信制御工程と、を有することを特徴とする。
(撮影システムの構成)
図1は、第1実施形態にかかる撮影システムSTMの構成の一例を示す図である。撮影システムSTMは、無人航空機DRNと、被写体の端末及び無人航空機DRNと通信可能な制御装置CNT(制御サーバ)とを有する。無人航空機DRNは、飛行状態で複数の車両を撮影可能な撮影部200(カメラ)を有する。被写体には、例えば、歩行者、または、単独の車両のユーザー、または、複数の車両のユーザーが含まれる。第1実施形態では、被写体を構成する複数の車両のユーザーを撮影対象のグループとして説明する例について説明する。
次に、無人航空機DRNの機能構成を説明する。図2のST21は無人航空機DRNの機能構成を例示するブロック図である。撮影部200は、無人航空機DRNに搭載されているカメラであり、撮影部200は、無人航空機DRNの飛行状態で複数の車両を撮影可能に構成されている。無人航空機DRNの撮影部200は静止画または動画撮影を行うことが可能である。
次に、制御装置CNTの処理部21の具体的な機能構成を説明する。図2のST22は処理部21の機能構成を例示するブロック図である。図3は判定部210の処理を模式的に説明する図である。
図4は記憶部22および処理部21(判定部210、信号生成部220、通信制御部230)の処理の流れを説明する図である。
図5は画像処理部240及び画像判定部250の処理の流れを説明する図である。ステップS500において、画像処理部240は、無人航空機DRNの撮影部200で撮影された画像データを取得する。
尚、ステップS570の後、逆光判定に関する追加処理を行うことも可能である。逆光判定部260は、撮影部200により撮影された画像データに基づいて、撮影状態が逆光か否かを判定する。例えば、撮影部200により撮影された画像データにおいて、画素値が局所的に基準画素値を超える領域が含まれているとき、逆光判定部260は逆光の撮影状態で撮影された画像と判定する。
制御装置CNTは、複数の車両1A、1Bが撮影エリアまたは撮影エリアの手前に設定された所定の準備エリアを走行する際の車間距離をチェックして、撮影に適した車間距離(所定の基準距離範囲に収まる距離)になるように報知情報を複数の車両に送信する。
制御装置CNTは、複数の車両1A、1Bが撮影エリアまたは撮影エリアの手前に設定された所定の準備エリアを走行する際の車両間の速度差をチェックして、撮影に適した速度差(所定の基準速度以下)になるように速度報知信号を複数の車両に送信する。
先に説明した第1実施形態では、被写体を構成する複数の車両のユーザーを撮影対象のグループとして説明したが、被写体の構成はこの例に限られず、例えば、歩行者、または、単独の車両のユーザーであってもよい。第2実施形態では、歩行者、または、単独の車両のユーザーを撮影対象とする構成について説明する。撮影システムSTMの構成及び車両、制御装置CNT、無人航空機DRNの機能構成は、図1、図2と同様である。以下、第1実施形態と相違する部分について、以下説明する。
上記実施形態は、少なくとも以下の構成を開示する。
前記制御装置(CNT)は、
前記被写体を撮影対象として設定した被写体情報を登録する記憶手段(例えば、図1の22)と、
前記端末との通信により取得した当該被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに存在するか判定する判定手段(例えば、図2の210)と、
前記判定手段の判定に基づいて前記撮影手段を制御する制御信号を生成する信号生成手段(例えば、図2の220)と、
前記被写体情報および前記制御信号を前記無人航空機に送信する通信制御手段(例えば、図2の230)と、を備え、
前記無人航空機(DRN)は、
前記被写体の端末から配信される被写体と前記通信制御手段から送信された前記被写体情報とに基づいて、前記被写体を特定する特定手段(例えば、図2の202)と、
前記制御信号に基づいて前記撮影手段を制御して、前記特定手段により特定された前記被写体の撮影を制御する撮影制御手段(例えば、図2の203)と、を備える。
前記判定手段は、前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアを走行しているか否かを判定する。
前記信号生成手段(220)は撮影開始を指示する制御信号を生成し、
前記撮影制御手段(230)は、当該制御信号に基づいて前記撮影手段(200)を制御して、前記複数の車両(1A、1B)のグループの撮影を開始する。
前記信号生成手段(220)は撮影継続を指示する制御信号を生成し、
前記撮影制御手段(230)は、当該制御信号に基づいて前記撮影手段を制御して、前記撮影を継続し、
前記判定手段(210)が、前記複数の車両のすべてが前記撮影エリアから出たと判定した場合に、
前記信号生成手段(220)は撮影終了を指示する制御信号を生成し、
前記撮影制御手段(230)は、当該制御信号に基づいて前記撮影手段を制御して、前記撮影を終了する。
前記信号生成手段(220)は撮影開始を指示する制御信号を生成し、
前記撮影制御手段(230)は、当該制御信号に基づいて前記撮影手段を制御して、前記複数の車両のグループの撮影を開始する。
前記信号生成手段(220)は、当該撮影エリアまたは準備エリアに入ったことを前記ユーザーに報知するエリア報知信号を生成し、
前記通信制御手段(230)は、前記複数の車両に前記エリア報知信号を送信する。
前記撮影手段で撮影された画像データから前記ユーザーの顔を抽出する画像処理を行う画像処理手段(例えば、図2の240)と、
前記画像処理の結果と前記グループ情報とに基づいて、前記グループとして設定された人数分のユーザーの顔が撮影されたか否かを判定する画像判定手段(例えば、図2の250)と、を更に備え、
前記信号生成手段(220)は、
前記設定された人数分のユーザーの顔が撮影されなかった場合に、前記設定された人数分のユーザーの顔を撮影できるように、前記撮影手段の撮影パラメータを制御するパラメータ制御信号を生成し、
前記無人航空機(DRN)の前記撮影制御手段(203)は、
前記パラメータ制御信号に基づいて前記撮影手段を制御して前記撮影を行う。
前記信号生成手段(220)は、前記撮影のやり直しを前記ユーザーに案内する撮影案内信号を生成し、
前記通信制御手段(230)は、前記複数の車両(1A、1B)に前記撮影案内信号を送信する。
前記撮影エリアまたは前記撮影エリアの手前に設定された所定の準備エリアを走行する前記複数の車両(1A、1B)の車間距離を前記位置情報に基づいて取得し、前記車間距離が所定の基準距離範囲の上限の距離を超えていると判定する場合に、
前記信号生成手段(220)は、
前記車間距離が前記基準距離範囲の上限の距離を超えていることを前記ユーザーに報知する距離報知信号を生成し、
前記通信制御手段(230)は、前記複数の車両(1A、1B)に前記距離報知信号を送信する。
前記撮影エリアまたは前記撮影エリアの手前に設定された所定の準備エリアを走行する前記複数の車両(1A、1B)の車間距離を前記位置情報に基づいて取得し、前記車間距離が所定の基準距離範囲の下限の距離以下と判定する場合に、
前記信号生成手段(220)は、
前記車間距離が前記基準距離範囲の下限の距離以下であることを前記ユーザーに報知する接近報知信号を生成し、
前記通信制御手段(230)は、前記複数の車両(1A、1B)に前記接近報知信号を送信する。
車両の位置情報を取得する取得手段(例えば、図1の通信装置15及び検出装置16)と、
前記車両の位置情報を送信する車両通信手段(例えば、図1の通信装置15)と、
前記車両の速度情報を検出する検出手段(例えば、図1の16)と、を備え、
前記車両通信手段(15)は前記速度情報を前記制御装置に送信し、
前記制御装置(CNT)の前記判定手段(210)は、
前記撮影エリアまたは前記撮影エリアの手前に設定された所定の準備エリアを走行する前記複数の車両(1A、1B)の速度差を前記速度情報に基づいて取得し、前記速度差が所定の基準速度を超えると判定した場合に、
前記信号生成手段(220)は、
前記速度差が前記基準速度を超えることを前記ユーザーに報知する速度報知信号を生成し、
前記通信制御手段(230)は、前記複数の車両に前記速度報知信号を送信する。
前記撮影手段により撮影された画像データに基づいて、撮影状態が逆光か否かを判定する逆光判定手段(例えば、図2の260)を更に備え、
前記信号生成手段(220)は、前記撮影状態が逆光と判定された場合に、前記逆光を回避するように前記無人航空機の飛行位置の変更を指示する飛行制御信号を生成し、
前記通信制御手段(230)は、前記無人航空機に前記飛行制御信号を送信し、
前記無人航空機(DRN)の飛行制御手段(例えば、図2の207)は、
前記飛行制御信号に基づいて前記飛行位置を変更する。
前記通信制御手段(230)は、前記無人航空機に前記パラメータ制御信号を送信し、
前記無人航空機(DRN)の撮影制御手段(203)は、
前記パラメータ制御信号に基づいて前記撮影手段(200)の画角を変更する。
前記判定手段(210)は、前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに入ったか否かを判定する。
前記信号生成手段(220)は撮影開始を指示する制御信号を生成し、
前記撮影制御手段(203)は、当該制御信号に基づいて前記撮影手段(200)を制御して、前記被写体の撮影を開始する。
前記信号生成手段(220)は、当該撮影エリアまたは準備エリアに入ったことを前記被写体に報知するエリア報知信号を生成し、
前記通信制御手段(230)は、前記被写体に前記エリア報知信号を送信する。
前記撮影手段(200)で撮影された画像データから前記被写体の顔を抽出する画像処理を行う画像処理手段(例えば、図2の240)と、
前記画像処理の結果に基づいて、前記被写体の顔が撮影されたか否かを判定する画像判定手段と(例えば、図2の250)、を更に備え、
前記信号生成手段(220)は、
前記被写体の顔が撮影されなかった場合に、前記被写体の顔を撮影できるように、前記撮影手段(200)の撮影パラメータを制御するパラメータ制御信号を生成し、
前記無人航空機(DRN)の前記撮影制御手段(203)は、
前記パラメータ制御信号に基づいて前記撮影手段(200)を制御して前記撮影を行う。
前記信号生成手段(220)は、前記撮影のやり直しを前記被写体に案内する撮影案内信号を生成し、
前記通信制御手段(230)は、前記被写体に前記撮影案内信号を送信する。
前記被写体を撮影対象として設定した被写体情報を登録する記憶手段(例えば、図1の22)と、
前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに存在するか判定する判定手段(例えば、図2の210)と、
前記判定手段の判定に基づいて前記撮影手段を制御する制御信号を生成する信号生成手段(例えば、図2の220)と、
前記被写体情報および前記制御信号を前記無人航空機に送信する通信制御手段(例えば、図2の230)と、を備える。
前記被写体を撮影対象として設定した被写体情報を記憶手段(22)に登録する記憶工程(例えば、図4のS400)と、
判定手段(210)が、前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに存在するか判定する判定工程(例えば、図4のS415)と、
信号生成手段(220)が、前記判定工程での判定に基づいて前記撮影手段を制御する制御信号を生成する信号生成工程(例えば、図4のS420)と、
通信制御手段(230)が、前記被写体情報および前記制御信号を前記無人航空機に送信する通信制御工程(例えば、図4のS425)と、を有する。
前記被写体を撮影対象として設定した被写体情報を記憶手段(22)に登録する記憶工程(例えば、図4のS400)と、
判定手段(210)が、前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに存在するか判定する判定工程(例えば、図4のS415)と、
信号生成手段(220)が、前記判定工程での判定に基づいて前記撮影手段を制御する制御信号を生成する信号生成工程と(例えば、図4のS420)、
通信制御手段(230)が、前記被写体情報および前記制御信号を前記無人航空機に送信する通信制御工程(例えば、図4のS425)と、を有する。
コンピュータに、飛行状態で被写体を撮影可能な撮影手段を有する無人航空機と通信可能な制御装置における制御方法の各工程を実行させるプログラムを記憶したコンピュータ可読の記憶媒体であって、当該制御方法が、
前記被写体を撮影対象として設定した被写体情報を記憶手段(22)に登録する記憶工程(例えば、図4のS400)と、
判定手段(210)が、前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに存在するか判定する判定工程(例えば、図4のS415)と、
信号生成手段(220)が、前記判定工程での判定に基づいて前記撮影手段を制御する制御信号を生成する信号生成工程(例えば、図4のS420)と、
通信制御手段(230)が、前記被写体情報および前記制御信号を前記無人航空機に送信する通信制御工程(例えば、図4のS425)と、を有する。
Claims (22)
- 飛行状態で被写体を撮影可能な撮影手段を有する無人航空機と、前記被写体の端末及び前記無人航空機と通信可能な制御装置と、を有する撮影システムであって、
前記制御装置は、
前記被写体を撮影対象として設定した被写体情報を登録する記憶手段と、
前記端末との通信により取得した当該被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに存在するか判定する判定手段と、
前記判定手段の判定に基づいて前記撮影手段を制御する制御信号を生成する信号生成手段と、
前記被写体情報および前記制御信号を前記無人航空機に送信する通信制御手段と、を備え、
前記無人航空機は、
前記被写体の端末から配信される被写体情報と前記通信制御手段から送信された前記被写体情報とに基づいて、前記被写体を特定する特定手段と、
前記制御信号に基づいて前記撮影手段を制御して、前記特定手段により特定された前記被写体の撮影を制御する撮影制御手段と、
を備えることを特徴とする撮影システム。 - 前記記憶手段は、前記被写体を構成する複数の車両のユーザーを撮影対象のグループとして設定したグループ情報を、前記被写体情報として登録し、
前記判定手段は、前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアを走行しているか否かを判定する
ことを特徴とする請求項1に記載の撮影システム。 - 前記判定手段が、前記グループとして設定された前記複数の車両のうち少なくとも一台が前記撮影エリアに入ったと判定した場合に、前記複数の車両が前記撮影エリアを走行中であると判定し、
前記信号生成手段は撮影開始を指示する制御信号を生成し、
前記撮影制御手段は、当該制御信号に基づいて前記撮影手段を制御して、前記複数の車両のグループの撮影を開始することを特徴とする請求項2に記載の撮影システム。 - 前記判定手段が、前記複数の車両のうち少なくとも一台が前記撮影エリアを走行していると判定した場合に、
前記信号生成手段は撮影継続を指示する制御信号を生成し、
前記撮影制御手段は、当該制御信号に基づいて前記撮影手段を制御して、前記撮影を継続し、
前記判定手段が、前記複数の車両のすべてが前記撮影エリアから出たと判定した場合に、
前記信号生成手段は撮影終了を指示する制御信号を生成し、
前記撮影制御手段は、当該制御信号に基づいて前記撮影手段を制御して、前記撮影を終了することを特徴とする請求項3に記載の撮影システム。 - 前記判定手段が、前記グループとして設定された前記複数の車両の全てが前記撮影エリアに入ったと判定した場合に、前記複数の車両が前記撮影エリアを走行中であると判定し、
前記信号生成手段は撮影開始を指示する制御信号を生成し、
前記撮影制御手段は、当該制御信号に基づいて前記撮影手段を制御して、前記複数の車両のグループの撮影を開始することを特徴とする請求項2に記載の撮影システム。 - 前記判定手段は、前記撮影エリアまたは前記撮影エリアの手前に設定された所定の準備エリアに前記複数の車両のうち少なくとも一台が入ったと判定した場合に、
前記信号生成手段は、当該撮影エリアまたは準備エリアに入ったことを前記ユーザーに報知するエリア報知信号を生成し、
前記通信制御手段は、前記複数の車両に前記エリア報知信号を送信することを特徴とする請求項2乃至5のいずれか1項に記載の撮影システム。 - 前記制御装置は、
前記撮影手段で撮影された画像データから前記ユーザーの顔を抽出する画像処理を行う画像処理手段と、
前記画像処理の結果と前記グループ情報とに基づいて、前記グループとして設定された人数分のユーザーの顔が撮影されたか否かを判定する画像判定手段と、を更に備え、
前記信号生成手段は、
前記設定された人数分のユーザーの顔が撮影されなかった場合に、前記設定された人数分のユーザーの顔を撮影できるように、前記撮影手段の撮影パラメータを制御するパラメータ制御信号を生成し、
前記無人航空機の前記撮影制御手段は、
前記パラメータ制御信号に基づいて前記撮影手段を制御して前記撮影を行う
ことを特徴とする請求項2乃至6のいずれか1項に記載の撮影システム。 - 前記画像判定手段の判定により、前記設定された人数分のユーザーの顔が撮影されなかった場合に、
前記信号生成手段は、前記撮影のやり直しを前記ユーザーに案内する撮影案内信号を生成し、
前記通信制御手段は、前記複数の車両に前記撮影案内信号を送信することを特徴とする請求項7に記載の撮影システム。 - 前記判定手段は、
前記撮影エリアまたは前記撮影エリアの手前に設定された所定の準備エリアを走行する前記複数の車両の車間距離を前記位置情報に基づいて取得し、前記車間距離が所定の基準距離範囲の上限の距離を超えていると判定する場合に、
前記信号生成手段は、
前記車間距離が前記基準距離範囲の上限の距離を超えていることを前記ユーザーに報知する距離報知信号を生成し、
前記通信制御手段は、前記複数の車両に前記距離報知信号を送信することを特徴とする請求項2乃至8のいずれか1項に記載の撮影システム。 - 前記判定手段は、
前記撮影エリアまたは前記撮影エリアの手前に設定された所定の準備エリアを走行する前記複数の車両の車間距離を前記位置情報に基づいて取得し、前記車間距離が所定の基準距離範囲の下限の距離以下と判定する場合に、
前記信号生成手段は、
前記車間距離が前記基準距離範囲の下限の距離以下であることを前記ユーザーに報知する接近報知信号を生成し、
前記通信制御手段は、前記複数の車両に前記接近報知信号を送信することを特徴とする請求項2乃至8のいずれか1項に記載の撮影システム。 - 前記複数の車両は、
車両の位置情報を取得する取得手段と、
前記車両の位置情報を送信する車両通信手段と、
前記車両の速度情報を検出する検出手段と、を備え、
前記車両通信手段は前記速度情報を前記制御装置に送信し、
前記制御装置の前記判定手段は、
前記撮影エリアまたは前記撮影エリアの手前に設定された所定の準備エリアを走行する前記複数の車両の速度差を前記速度情報に基づいて取得し、前記速度差が所定の基準速度を超えると判定した場合に、
前記信号生成手段は、
前記速度差が前記基準速度を超えることを前記ユーザーに報知する速度報知信号を生成し、
前記通信制御手段は、前記複数の車両に前記速度報知信号を送信することを特徴とする請求項2乃至10のいずれか1項に記載の撮影システム。 - 前記制御装置は、
前記撮影手段により撮影された画像データに基づいて、撮影状態が逆光か否かを判定する逆光判定手段を更に備え、
前記信号生成手段は、前記撮影状態が逆光と判定された場合に、前記逆光を回避するように前記無人航空機の飛行位置の変更を指示する飛行制御信号を生成し、
前記通信制御手段は、前記無人航空機に前記飛行制御信号を送信し、
前記無人航空機の飛行制御手段は、
前記飛行制御信号に基づいて前記飛行位置を変更することを特徴とする請求項2乃至11のいずれか1項に記載の撮影システム。 - 前記信号生成手段は、前記撮影状態が逆光と判定された場合に、前記撮影手段の画角を水平方向に移動させる制御、または前記撮影手段の画角を垂直方法に移動させる制御を行うようにパラメータ制御信号を生成し、
前記通信制御手段は、前記無人航空機に前記パラメータ制御信号を送信し、
前記無人航空機の撮影制御手段は、
前記パラメータ制御信号に基づいて前記撮影手段の画角を変更することを特徴とする請求項12に記載の撮影システム。 - 前記記憶手段は、前記被写体を構成する、歩行者または単独の車両のユーザーを撮影対象として設定したユーザー情報を、前記被写体情報として登録し、
前記判定手段は、前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに入ったか否かを判定する
ことを特徴とする請求項1に記載の撮影システム。 - 前記判定手段が、前記撮影対象として設定された前記被写体が前記撮影エリアに入ったと判定した場合に、
前記信号生成手段は撮影開始を指示する制御信号を生成し、
前記撮影制御手段は、当該制御信号に基づいて前記撮影手段を制御して、前記被写体の撮影を開始することを特徴とする請求項14に記載の撮影システム。 - 前記判定手段は、前記撮影エリアまたは前記撮影エリアの手前に設定された所定の準備エリアに前記被写体が入ったと判定した場合に、
前記信号生成手段は、当該撮影エリアまたは準備エリアに入ったことを前記被写体に報知するエリア報知信号を生成し、
前記通信制御手段は、前記被写体に前記エリア報知信号を送信することを特徴とする請求項14または15に記載の撮影システム。 - 前記制御装置は、
前記撮影手段で撮影された画像データから前記被写体の顔を抽出する画像処理を行う画像処理手段と、
前記画像処理の結果に基づいて、前記被写体の顔が撮影されたか否かを判定する画像判定手段と、を更に備え、
前記信号生成手段は、
前記被写体の顔が撮影されなかった場合に、前記被写体の顔を撮影できるように、前記撮影手段の撮影パラメータを制御するパラメータ制御信号を生成し、
前記無人航空機の前記撮影制御手段は、
前記パラメータ制御信号に基づいて前記撮影手段を制御して前記撮影を行う
ことを特徴とする請求項14乃至16のいずれか1項に記載の撮影システム。 - 前記画像判定手段の判定により、前記被写体の顔が撮影されなかった場合に、
前記信号生成手段は、前記撮影のやり直しを前記被写体に案内する撮影案内信号を生成し、
前記通信制御手段は、前記被写体に前記撮影案内信号を送信することを特徴とする請求項17に記載の撮影システム。 - 飛行状態で被写体を撮影可能な撮影手段を有する無人航空機と通信可能な制御装置であって、
前記被写体を撮影対象として設定した被写体情報を登録する記憶手段と、
前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに存在するか判定する判定手段と、
前記判定手段の判定に基づいて前記撮影手段を制御する制御信号を生成する信号生成手段と、
前記被写体情報および前記制御信号を前記無人航空機に送信する通信制御手段と、
を備えることを特徴とする制御装置。 - 飛行状態で被写体を撮影可能な撮影手段を有する無人航空機と通信可能な制御装置における制御方法であって、
前記被写体を撮影対象として設定した被写体情報を記憶手段に登録する記憶工程と、
判定手段が、前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに存在するか判定する判定工程と、
信号生成手段が、前記判定工程での判定に基づいて前記撮影手段を制御する制御信号を生成する信号生成工程と、
通信制御手段が、前記被写体情報および前記制御信号を前記無人航空機に送信する通信制御工程と、
を有することを特徴とする制御方法。 - コンピュータに、飛行状態で被写体を撮影可能な撮影手段を有する無人航空機と通信可能な制御装置における制御方法の各工程を実行させるプログラムであって、当該制御方法が、
前記被写体を撮影対象として設定した被写体情報を記憶手段に登録する記憶工程と、
判定手段が、前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに存在するか判定する判定工程と、
信号生成手段が、前記判定工程での判定に基づいて前記撮影手段を制御する制御信号を生成する信号生成工程と、
通信制御手段が、前記被写体情報および前記制御信号を前記無人航空機に送信する通信制御工程と、を有することを特徴とするプログラム。 - コンピュータに、飛行状態で被写体を撮影可能な撮影手段を有する無人航空機と通信可能な制御装置における制御方法の各工程を実行させるプログラムを記憶したコンピュータ可読の記憶媒体であって、当該制御方法が、
前記被写体を撮影対象として設定した被写体情報を記憶手段に登録する記憶工程と、
判定手段が、前記被写体の位置情報と地図情報とに基づいて、当該被写体が所定の撮影エリアに存在するか判定する判定工程と、
信号生成手段が、前記判定工程での判定に基づいて前記撮影手段を制御する制御信号を生成する信号生成工程と、
通信制御手段が、前記被写体情報および前記制御信号を前記無人航空機に送信する通信制御工程と、
を有することを特徴とする記憶媒体。
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021548848A JP7171938B2 (ja) | 2019-09-27 | 2020-09-16 | 撮影システム、制御装置、制御方法、プログラム及び記憶媒体 |
EP20870217.5A EP4037305A4 (en) | 2019-09-27 | 2020-09-16 | PHOTOGRAPHY SYSTEM, CONTROL DEVICE, CONTROL METHOD, PROGRAM, AND STORAGE MEDIA |
CN202080064980.5A CN114401895B (zh) | 2019-09-27 | 2020-09-16 | 拍摄系统、控制装置、控制方法以及存储介质 |
US17/701,092 US20220212787A1 (en) | 2019-09-27 | 2022-03-22 | Image capturing system, control device, control method, and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019177692 | 2019-09-27 | ||
JP2019-177692 | 2019-09-27 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/701,092 Continuation US20220212787A1 (en) | 2019-09-27 | 2022-03-22 | Image capturing system, control device, control method, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021060113A1 true WO2021060113A1 (ja) | 2021-04-01 |
Family
ID=75165718
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/035124 WO2021060113A1 (ja) | 2019-09-27 | 2020-09-16 | 撮影システム、制御装置、制御方法、プログラム及び記憶媒体 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20220212787A1 (ja) |
EP (1) | EP4037305A4 (ja) |
JP (1) | JP7171938B2 (ja) |
CN (1) | CN114401895B (ja) |
WO (1) | WO2021060113A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2023078640A (ja) * | 2021-11-26 | 2023-06-07 | トヨタ自動車株式会社 | 車両撮影システムおよび車両撮影方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005124176A (ja) * | 2003-09-25 | 2005-05-12 | Fuji Photo Film Co Ltd | 自動撮影システム |
JP2017182690A (ja) | 2016-03-31 | 2017-10-05 | セコム株式会社 | 自律移動ロボット |
JP2018038029A (ja) * | 2016-05-11 | 2018-03-08 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 撮影制御方法、撮影制御システム及び撮影制御サーバ |
JP2019145947A (ja) * | 2018-02-19 | 2019-08-29 | 大日本印刷株式会社 | 撮影システム及び撮影制御装置 |
JP2019177692A (ja) | 2018-12-17 | 2019-10-17 | 住友ベークライト株式会社 | 多層フィルム及び包装体 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11240628B2 (en) * | 2014-07-29 | 2022-02-01 | GeoFrenzy, Inc. | Systems and methods for decoupling and delivering geofence geometries to maps |
EP3101889A3 (en) * | 2015-06-02 | 2017-03-08 | LG Electronics Inc. | Mobile terminal and controlling method thereof |
JP6275285B2 (ja) * | 2015-12-29 | 2018-02-07 | 楽天株式会社 | 物流システム、荷物運搬方法、及びプログラム |
JP2018092237A (ja) * | 2016-11-30 | 2018-06-14 | キヤノンマーケティングジャパン株式会社 | 無人航空機制御システム、無人航空機制御システムの制御方法、およびプログラム |
KR101959366B1 (ko) * | 2018-04-10 | 2019-03-21 | 주식회사 아르고스다인 | 무인기와 무선단말기 간의 상호 인식 방법 |
-
2020
- 2020-09-16 JP JP2021548848A patent/JP7171938B2/ja active Active
- 2020-09-16 CN CN202080064980.5A patent/CN114401895B/zh active Active
- 2020-09-16 EP EP20870217.5A patent/EP4037305A4/en active Pending
- 2020-09-16 WO PCT/JP2020/035124 patent/WO2021060113A1/ja active Application Filing
-
2022
- 2022-03-22 US US17/701,092 patent/US20220212787A1/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005124176A (ja) * | 2003-09-25 | 2005-05-12 | Fuji Photo Film Co Ltd | 自動撮影システム |
JP2017182690A (ja) | 2016-03-31 | 2017-10-05 | セコム株式会社 | 自律移動ロボット |
JP2018038029A (ja) * | 2016-05-11 | 2018-03-08 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 撮影制御方法、撮影制御システム及び撮影制御サーバ |
JP2019145947A (ja) * | 2018-02-19 | 2019-08-29 | 大日本印刷株式会社 | 撮影システム及び撮影制御装置 |
JP2019177692A (ja) | 2018-12-17 | 2019-10-17 | 住友ベークライト株式会社 | 多層フィルム及び包装体 |
Non-Patent Citations (1)
Title |
---|
See also references of EP4037305A4 |
Also Published As
Publication number | Publication date |
---|---|
US20220212787A1 (en) | 2022-07-07 |
EP4037305A1 (en) | 2022-08-03 |
JP7171938B2 (ja) | 2022-11-15 |
CN114401895B (zh) | 2024-06-14 |
JPWO2021060113A1 (ja) | 2021-04-01 |
CN114401895A (zh) | 2022-04-26 |
EP4037305A4 (en) | 2022-11-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20190164431A1 (en) | Movable body, dispatch system, server, and method for dispatching movable body | |
JP6382118B2 (ja) | 画像調整方法、サーバ及び動画撮影システム | |
CN109781124B (zh) | 一种无人机救援方法、装置、无人机及车辆 | |
CN110493561B (zh) | 服务器、车辆拍摄系统及车辆拍摄方法 | |
KR20200096518A (ko) | 정보 처리 장치, 이동체, 제어 시스템, 정보 처리 방법 및 프로그램 | |
CN110741631A (zh) | 车辆的图像提供系统、服务器系统及车辆的图像提供方法 | |
WO2021060113A1 (ja) | 撮影システム、制御装置、制御方法、プログラム及び記憶媒体 | |
KR102480424B1 (ko) | 지역 감시기능을 갖는 퍼스널 모빌리티 | |
CN111625014A (zh) | 一种无人机控制方法、车载终端及计算机可读存储介质 | |
US11657723B2 (en) | Management device | |
JP7210394B2 (ja) | 情報提供装置、情報提供方法、およびプログラム | |
JP7444224B2 (ja) | 人物特定装置、人物特定方法およびプログラム | |
WO2020137398A1 (ja) | 操作制御装置、撮像装置、操作制御方法 | |
CN110301133B (zh) | 信息处理装置、信息处理方法和计算机可读记录介质 | |
JP7346365B2 (ja) | 撮影管理システム、撮影管理装置、運営装置、撮影管理装置の制御方法、運営装置の制御方法、及びプログラム | |
KR20190007277A (ko) | 무인 비행체를 이용한 차량용 블랙박스 영상 촬영 장치 및 그 방법 | |
WO2021060114A1 (ja) | ナビゲーションシステム、経路設定装置、経路設定方法、プログラム及び記憶媒体 | |
EP3565233B1 (en) | Camera, camera processing method, server, server processing method, and information processing device | |
JP7208114B2 (ja) | 情報提供装置、情報提供方法、およびプログラム | |
CN114364944B (zh) | 导航系统、路径设定装置、路径设定方法以及存储介质 | |
CN114868167A (zh) | 安全系统和监控方法 | |
JP7311331B2 (ja) | 情報提供装置、情報提供方法、およびプログラム | |
CN111625015A (zh) | 无人机控制方法及车载终端 | |
JP2024154845A (ja) | 車両撮影システム | |
JP2023064443A (ja) | サーバ、情報処理システムおよび情報処理方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20870217 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2021548848 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2020870217 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2020870217 Country of ref document: EP Effective date: 20220428 |