WO2018163571A1 - Information processing device, information processing method, and information processing program - Google Patents
- Publication number
- WO2018163571A1 (PCT/JP2017/046314)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- shooting
- imaging
- information processing
- moving body
- failed
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B7/00—Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
- G03B7/30—Safety arrangements for control of exposure
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/006—Apparatus mounted on flying objects
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B43/00—Testing correct operation of photographic apparatus or parts thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/634—Warning indications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/93—Regeneration of the television signal or of selected parts thereof
- H04N5/95—Time-base error compensation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
Definitions
- the present technology relates to an information processing apparatus, an information processing method, and an information processing program.
- Patent Document 1 discloses shooting with a camera mounted on a small electric aircraft (unmanned aerial vehicle).
- Such aerial shooting is used for various purposes, such as filming movies and dramas, and surveying.
- In such applications, the user does not operate the camera directly; instead, a computer system acting as a host device controls the camera, and a captured image is obtained by automatic shooting.
- the host device issues a shooting start command to the camera, and the camera attempts to shoot in response to this request.
- However, shooting may fail depending on the situation at that time, for example when autofocus does not succeed and the shutter cannot be released, or when the captured image cannot be written to the memory card.
- The present technology has been made in view of this problem, and its purpose is to provide an information processing apparatus, an information processing method, and an information processing program capable of notifying a moving body when imaging by an imaging device mounted on the moving body has failed.
- The first technique is an information processing apparatus that operates in an imaging device mounted on a moving body and notifies the moving body of the failure when shooting by the imaging device fails.
- the second technique is an information processing method for notifying the moving body of failure when shooting by the imaging device mounted on the moving body has failed.
- the third technique is an information processing program for causing a computer to execute an information processing method for notifying a moving body when photographing by an imaging device mounted on the moving body has failed.
- the fourth technique is an information processing apparatus that operates in a moving body equipped with an imaging device and determines whether or not shooting by the imaging device has failed.
- FIG. 1A is a plan view showing an external configuration of a moving body
- FIG. 1B is a side view showing an external configuration of the moving body.
- FIG. 2 is a block diagram showing the configuration of the moving body.
- FIG. 3 is a block diagram showing the configuration of the imaging device.
- FIG. 4 is a schematic diagram of communication between a conventional moving body and an imaging device.
- FIG. 5 is a schematic diagram of communication between the moving body and the imaging device according to the present technology.
- <1. Embodiment> [1-1. Configuration of moving body] [1-2. Configuration of imaging device] [1-3. Shooting result notification process] <2. Modification>
- the moving body 100 is an electric small aircraft (unmanned aerial vehicle) called a drone.
- FIG. 1A is a plan view of the moving body 100, and FIG. 1B is a front view of the moving body 100.
- the fuselage is composed of, for example, a cylindrical or rectangular tube-shaped body portion 1 as a central portion and support shafts 2a to 2f fixed to the upper portion of the body portion 1.
- the body portion 1 has a hexagonal cylindrical shape, and six support shafts 2a to 2f extend radially from the center of the body portion 1 at equiangular intervals.
- the body portion 1 and the support shafts 2a to 2f are made of a light material having high strength.
- Each component is arranged so that the center of gravity of the body portion 1 and the support shafts 2a to 2f lies on a vertical line passing through the center of the support shafts 2a to 2f. Further, the circuit unit 5 and the battery 6 are provided in the body portion 1 so that the overall center of gravity also lies on this vertical line.
- The number of rotor blades and motors is six, but a configuration having four rotor blades and motors, or eight or more, may also be used.
- Motors 3a to 3f as drive sources for the rotor blades are attached to the tip portions of the support shafts 2a to 2f, respectively.
- Rotor blades 4a to 4f are attached to the rotation shafts of the motors 3a to 3f.
- a circuit unit 5 including a control unit for controlling each motor is attached to a central portion where the support shafts 2a to 2f intersect.
- the motor 3a and the rotor blade 4a, and the motor 3d and the rotor blade 4d constitute a pair.
- (motor 3b, rotor blade 4b) and (motor 3e, rotor blade 4e) constitute a pair
- (motor 3c, rotor blade 4c) and (motor 3f, rotor blade 4f) constitute a pair.
- a battery 6 as a power source is disposed on the bottom of the body 1.
- the battery 6 includes, for example, a lithium ion secondary battery and a battery control circuit that controls charging / discharging.
- the battery 6 is detachably attached to the inside of the body portion 1. By making the center of gravity of the battery 6 coincide with the center of gravity of the airframe, the stability of the center of gravity increases.
- An electric small aircraft generally called a drone enables desired navigation by controlling the output of a motor.
- While hovering, the tilt of the aircraft is detected with an onboard gyro sensor, and the output of the motors on the lowered side is increased while the output on the raised side is decreased. This keeps the aircraft level.
- To move forward, the motor output on the traveling-direction side is reduced and the output on the opposite side is increased, producing a forward-leaning posture and a propulsive force in the traveling direction.
- the installation position of the battery 6 as described above can balance the stability of the aircraft and the ease of control.
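The leveling behavior described above can be sketched as a simple proportional adjustment of motor outputs. The four-rotor layout, the gain value, and the function below are illustrative assumptions for explanation, not the control method of the present technology.

```python
def level_outputs(base_output, roll_deg, pitch_deg, gain=0.02):
    """Toy proportional leveling for an assumed four-rotor layout.

    A positive roll means the right side has dropped, so the right motor
    output is increased and the left decreased; likewise a positive pitch
    means the nose has dropped, boosting the front motor. The gain and
    layout are illustrative assumptions.
    """
    return {
        "front": base_output * (1 + gain * pitch_deg),  # nose down -> boost front
        "back":  base_output * (1 - gain * pitch_deg),
        "right": base_output * (1 + gain * roll_deg),   # right down -> boost right
        "left":  base_output * (1 - gain * roll_deg),
    }
```

With `level_outputs(100, roll_deg=5, pitch_deg=0)`, the right motor output rises above the left while front and back stay equal, which is the leveling response the text describes.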
- FIG. 2 is a block diagram showing the configuration of the moving body 100.
- The moving body 100 includes a control unit 110, a GPS (Global Positioning System) module 130, a motion sensor 140, a communication unit 150, an external input / output 160, the battery 6, and the motors 3a to 3f. The support shafts, rotor blades, and the like described in the external configuration of the moving body 100 are omitted. The control unit 110, the GPS module 130, the motion sensor 140, the communication unit 150, and the external input / output 160 are assumed to be included in the circuit unit 5 shown in the external view of the moving body 100 in FIG. 1.
- the control unit 110 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like.
- the ROM stores a program that is read and operated by the CPU.
- the RAM is used as a work memory for the CPU.
- The CPU executes various processes in accordance with programs stored in the ROM and issues commands, thereby controlling the entire moving body 100: for example, it sets and changes the flight course, sets the timing of shooting by the imaging device 200, and gives imaging instructions to the imaging apparatus 200.
- the control unit 110 controls the flight of the moving body 100 by controlling the outputs of the motors 3a to 3f.
- the GPS module 130 acquires the current position, altitude, and current time of the moving body 100 and supplies them to the control unit 110.
- the GPS module 130 acquires shooting position information indicating a shooting position of an image shot by the imaging apparatus 200 and shooting time information indicating a shooting time. Note that the shooting time information may be acquired by a time measuring function provided in the imaging apparatus 200.
- The motion sensor 140 detects the movement of the moving body 100 in two or three axial directions using, for example, an acceleration sensor, an angular velocity sensor, or a gyro sensor, and supplies the detected angular velocities around the axes to the control unit 110.
- the communication unit 150 is a communication module for communicating with an external device (a personal computer, a tablet terminal, a smartphone, or the like) that functions as a device (referred to as a base station) for controlling the moving body 100 from the ground.
- the mobile unit 100 transmits the state of the mobile unit 100 in flight to the base station via communication by the communication unit 150. It also receives instructions from the base station. Furthermore, it is also possible to transmit an image captured by the imaging apparatus 200 to the base station.
- Communication methods in the communication unit 150 include Bluetooth (registered trademark), wireless LAN (Local Area Network), Wi-Fi, ZigBee, and the like, which are wireless communications. Since the mobile body 100 is a flying body that flies in the air, communication with the base station is performed by wireless communication.
- the external input / output 160 is various communication terminals, modules, and the like for connecting the mobile unit 100 and external devices by wired connection or wireless connection.
- the mobile device 100 is connected to the imaging device 200 by wired communication using USB (Universal Serial Bus).
- Since USB can carry power in addition to data, it is possible to provide a battery in only one of the moving body 100 and the imaging apparatus 200, with no battery in the other.
- FIG. 3 is a block diagram illustrating a configuration of the imaging apparatus 200.
- the imaging device 200 is attached to the bottom surface of the body portion 1 of the moving body 100 so that the imaging device 200 is suspended via a camera mount 50.
- By driving the camera mount 50, the imaging apparatus 200 can shoot with the lens directed in any direction, over 360 degrees horizontally and also vertically.
- the operation control of the camera mount 50 is performed by the control unit 110.
- the imaging apparatus 200 includes a control unit 210, an optical imaging system 220, a lens driving driver 230, an imaging element 240, an image signal processing unit 250, an image memory 260, a storage unit 270, and an external input / output 280.
- the control unit 210 includes a CPU, a RAM, a ROM, and the like.
- the CPU controls the entire imaging apparatus 200 by executing various processes in accordance with programs stored in the ROM and issuing commands.
- the control unit 210 also functions as the information processing unit 300.
- The information processing unit 300 determines whether shooting by the imaging apparatus 200 has succeeded or failed and notifies the moving body 100 of the result; this point will be described later. The shooting result may be notified not only to the moving body 100 but also to the base station.
- The information processing unit 300 is configured by a program, and the program may be installed in the imaging device 200 in advance, or may be distributed by download or on a storage medium and installed by the user.
- the control unit 210 may function as the information processing unit 300 when the control unit 210 executes the program. Further, the information processing unit 300 may have a configuration independent of the control unit 210. Furthermore, the information processing unit 300 is not only realized by a program, but may also be realized by combining a dedicated device, a circuit, or the like using hardware having the function.
- The optical imaging system 220 includes an imaging lens for condensing light from the subject onto the image sensor 240, a drive mechanism for moving the imaging lens to perform focusing and zooming, a shutter mechanism, an iris mechanism, and the like. These are driven based on control signals from the control unit 210 and the lens driving driver 230. The optical image of the subject obtained through the optical imaging system 220 is formed on the image sensor 240.
- the lens driving driver 230 is configured by, for example, a microcomputer, and performs autofocus so as to focus on a target subject by moving the photographing lens by a predetermined amount along the optical axis direction under the control of the control unit 210. Further, in accordance with control from the control unit 210, operations of the drive mechanism, shutter mechanism, iris mechanism, and the like of the optical imaging system 220 are controlled. Thereby, adjustment of exposure time (shutter speed), aperture value (F value), etc. are made.
- the image sensor 240 photoelectrically converts incident light from a subject to convert it into a charge amount, and outputs a pixel signal. Then, the image sensor 240 outputs the pixel signal to the image signal processing unit 250.
- As the image sensor 240, a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide Semiconductor) sensor is used.
- The image signal processing unit 250 performs sample-and-hold with CDS (Correlated Double Sampling) processing to maintain a good S/N (Signal/Noise) ratio, AGC (Auto Gain Control) processing, A/D (Analog/Digital) conversion, and the like on the image signal output from the image sensor 240 to create an image signal.
- The image signal processing unit 250 may further apply predetermined signal processing to the image signal, such as demosaic processing, white balance adjustment processing, color correction processing, gamma correction processing, Y/C conversion processing, AE (Auto Exposure) processing, and resolution conversion processing.
- the image memory 260 is a volatile memory, for example, a buffer memory composed of a DRAM (Dynamic Random Access Memory).
- the image memory 260 temporarily stores image data that has been subjected to predetermined processing by the image signal processing unit 250.
- the storage unit 270 is a large-capacity storage medium such as a hard disk or an SD memory card.
- the captured image is saved in a compressed state based on a standard such as JPEG (Joint Photographic Experts Group).
- EXIF (Exchangeable Image File Format) data, including information such as the shooting position and the shooting time, may be stored in association with the captured image.
- Moving images are stored in a format such as MPEG2 (Moving Picture Experts Group 2) or MPEG4.
- the external input / output 280 is various communication terminals, modules, and the like for connecting the imaging apparatus 200 and external devices by wired connection or wireless connection.
- imaging device 200 is connected to moving body 100 by USB communication.
- the imaging apparatus 200 receives various settings related to shooting, shooting instructions, and the like from the control unit 110 of the moving body 100 by USB communication. Furthermore, the imaging apparatus 200 is supplied with power from the battery 6 of the moving body 100 by USB communication.
- the moving body 100 and the imaging apparatus 200 are connected by USB, and communicate using PTP (Picture Transfer Protocol) which is an image transfer protocol.
- With PTP, images and accompanying data can be transferred from the imaging apparatus 200, such as a digital camera, to the moving body 100 without adding a device driver.
- PTP is standardized as ISO 15740.
- PTP is basically a protocol in which the host device controls the device by issuing commands, but a mechanism called EVENT, by which the device notifies the host device, is also defined.
- In PTP, a shooting-success event is defined, but no shooting-failure event is defined. Therefore, as shown in FIG. 4, communication between the imaging device 200 and the moving body 100 would otherwise flow unilaterally from the moving body 100 to the imaging device 200: even if shooting fails, the moving body 100 is not notified, and the next shooting is performed with the failure left uncorrected.
- In the present technology, when the imaging device 200 recognizes a shooting failure, it can notify the moving body 100 of the failure. Communication between the imaging device 200 and the control unit 110 of the moving body 100 in this case is performed, for example, as shown in FIG. 5.
- The control unit 110 first requests the imaging apparatus 200 to start shooting. The imaging device 200 then returns a shooting-success or shooting-failure notification, and when the control unit 110 receives a shooting-failure notification, it issues, for example, a re-shooting instruction as shown in FIG. 5. Details of the instructions given to the imaging apparatus 200 after a shooting-failure notification will be described later.
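The exchange of FIG. 5 can be sketched as follows. Since standard PTP defines no shooting-failure event, the event codes and class names below are hypothetical stand-ins for a vendor extension, not actual PTP constants.

```python
EVENT_SHOOTING_SUCCESS = 0xC001  # hypothetical vendor-extension event codes,
EVENT_SHOOTING_FAILURE = 0xC002  # not defined in standard PTP (ISO 15740)


class Camera:
    """Stand-in for the imaging device 200: shoots and reports the result."""

    def __init__(self, shots_succeed):
        # Scripted outcomes for each shooting attempt, for illustration.
        self._shots_succeed = iter(shots_succeed)

    def start_shooting(self):
        ok = next(self._shots_succeed)
        return EVENT_SHOOTING_SUCCESS if ok else EVENT_SHOOTING_FAILURE


class MovingBody:
    """Stand-in for the control unit 110: re-instructs shooting on failure."""

    def __init__(self, camera, max_retries=1):
        self.camera = camera
        self.max_retries = max_retries

    def shoot_once(self):
        event = self.camera.start_shooting()
        retries = 0
        while event == EVENT_SHOOTING_FAILURE and retries < self.max_retries:
            retries += 1                      # the re-shooting instruction
            event = self.camera.start_shooting()
        return event, retries
```

With `MovingBody(Camera([False, True]))`, the first attempt fails, the failure notification triggers one re-shooting instruction, and the second attempt succeeds, matching the sequence of FIG. 5.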
- the moving body 100 and the imaging device 200 are configured as described above.
- A small electric aircraft called a drone can perform not only manual operation by a pilot but also automatic flight and automatic shooting using GPS and the like.
- Shooting information such as flight route information, shooting positions, shooting directions, and shooting times is set in advance, and the control unit 110 of the moving body 100 controls the flight of the moving body 100 and issues shooting instructions to the imaging device 200 according to these settings. The route information and the shooting information may also be acquired from the base station by wireless communication.
- shooting failure includes a case where shooting has not been performed (cannot be performed) and a case where shooting has been performed but the captured image does not satisfy a predetermined condition.
- Cases where shooting cannot be performed include when the remaining capacity of the storage unit 270 that stores captured images is equal to or less than a predetermined amount, and when the remaining amount of the battery 6 is equal to or less than a predetermined amount. In the former case, even though shooting itself can be executed, the captured image cannot be saved. In the latter case, the moving body 100 and/or the imaging device 200 may stop operating partway through.
- Cases where shooting was not performed (could not be performed) also include when a write error occurs while saving the captured image to the storage unit 270, and when shooting was not performed due to a failure, malfunction, mechanical error, or control error of the moving body 100 and/or the imaging device 200.
- Whether the subject is in focus can be determined by acquiring the focus distance parameter (metadata) recorded at the time of shooting and confirming whether it is within a predetermined threshold range. If the focus distance parameter is within the threshold range, the image is determined to be in focus; if it is outside the range, the image is determined to be out of focus.
- Whether the exposure is within a certain range can be determined by storing an EV (Exposure Value), a value indicating the brightness of the exposure at the time of shooting, in association with the captured image and referring to it.
- Shooting failure also includes the case where a planned shooting position is set in advance and the actual shooting position deviates from it, and the case where shooting timings are set at predetermined distance intervals or time intervals and the actual shooting positions or times deviate from those intervals.
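The focus and exposure checks described above can be sketched as threshold tests on per-image metadata. The field names and threshold values below are illustrative assumptions; the description only specifies that the focus distance parameter and the EV value are compared against predetermined ranges.

```python
def analyze_image(meta,
                  focus_range=(0.5, 100.0),  # assumed focus-distance window (metres)
                  ev_range=(-1.0, 1.0)):     # assumed EV window around proper exposure
    """Return a list of error strings; an empty list means the image passes.

    `meta` is assumed to hold the metadata recorded at shooting time,
    with hypothetical keys "focus_distance" and "ev".
    """
    errors = []
    lo, hi = focus_range
    if not (lo <= meta["focus_distance"] <= hi):
        errors.append("out_of_focus")
    lo, hi = ev_range
    if not (lo <= meta["ev"] <= hi):
        errors.append("bad_exposure")
    return errors
```

An image whose focus distance and EV both fall inside the windows yields no errors; one outside both windows is flagged for both conditions, which is the analysis-result error described in the text.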
- FIG. 6 is a flowchart illustrating the flow of the imaging result notification process performed by the information processing unit 300 and the imaging process performed by the imaging apparatus 200.
- First, the imaging apparatus 200 receives an instruction to start shooting. The instruction may be given from the control unit 110 of the moving body 100, or from an external device (such as a base station) other than the moving body 100 and the imaging apparatus 200.
- In step S12, the information processing unit 300 determines whether the remaining amount of the battery 6 is equal to or greater than a predetermined amount.
- The predetermined amount is, for example, the total of the remaining amount required for the moving body 100 to fly the flight path scheduled after the determination and the remaining amount required for the scheduled shooting. The determination can be performed by the information processing unit 300 receiving remaining-amount information for the battery 6 from the control unit 110 of the moving body 100 via USB communication. If the remaining amount of the battery 6 is equal to or greater than the predetermined amount, the process proceeds to step S13 (Yes in step S12).
- In step S13, the information processing unit 300 determines whether the remaining capacity of the storage unit 270 is equal to or greater than a predetermined amount.
- The predetermined amount is, for example, a capacity corresponding to the number of images to be acquired by the shooting scheduled after the determination. When the number of images to be shot is undecided, it may be a predetermined capacity, such as a capacity corresponding to 100 images or a specific number of gigabytes, or a capacity set in advance by the user.
- If the remaining capacity of the storage unit 270 is equal to or greater than the predetermined amount, the process proceeds to step S14 (Yes in step S13).
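Steps S12 and S13 amount to two resource checks before shooting. The sketch below assumes illustrative units (watt-hours for battery, megabytes for storage) and an assumed per-shot storage cost; none of these figures come from the description.

```python
def preflight_ok(battery_remaining_wh, flight_need_wh, shooting_need_wh,
                 storage_free_mb, planned_shots, mb_per_shot=25):
    """Return (ok, reason) for the checks of steps S12 and S13.

    Step S12: the battery must cover the remaining flight path plus the
    scheduled shooting. Step S13: free storage must cover the planned
    number of shots. Units and the 25 MB/shot figure are illustrative
    assumptions.
    """
    if battery_remaining_wh < flight_need_wh + shooting_need_wh:
        return False, "battery_low"
    if storage_free_mb < planned_shots * mb_per_shot:
        return False, "storage_low"
    return True, "ok"
```

For example, 50 Wh against a 30 Wh flight and 5 Wh of shooting passes both checks, while 30 Wh fails at step S12 before storage is even considered.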
- In step S14, when the moving body 100 reaches the planned shooting position, autofocusing is started by the imaging device 200.
- In step S15, it is determined whether the subject was brought into focus within a predetermined time by the autofocus of the imaging apparatus 200.
- If the subject is in focus (Yes in step S15), the imaging apparatus 200 performs shooting in step S16 and performs processing to save the captured image in the storage unit 270.
- In step S17, the information processing unit 300 determines whether the captured image has been stored in the storage unit 270. This can be done, for example, by referring to the shooting time information of the latest image stored in the storage unit 270. If the captured image is stored in the storage unit 270, the process proceeds to step S18 (Yes in step S17).
- In step S18, the information processing unit 300 performs image analysis processing on the captured image.
- In the image analysis processing, it is determined whether the captured image is in focus and whether the exposure is within a certain range including the proper exposure. If the subject is not in focus, or the exposure is not within that range, the analysis result is an error.
- It is also determined, by referring to the shooting position information and shooting time information stored as EXIF data in association with the captured image, whether shooting was performed at the scheduled shooting position and the scheduled shooting time. If not, the analysis result is an error.
- When shooting is performed continuously at fixed distance intervals, it is determined, by referring to the shooting position information stored in association with the captured image, whether the shooting position deviates from the fixed distance interval. Similarly, when shooting is performed continuously at fixed time intervals (for example, every 10 seconds), it is determined, by referring to the shooting time information stored in association with the captured image, whether the shooting time deviates from that interval. Whether shooting and saving were performed at the scheduled shooting position or scheduled shooting time can also be determined by checking the log (processing record) of the imaging apparatus 200. In this way, it can be judged whether shooting succeeded or failed.
- Points A to G on the maps of FIGS. 7A to 7C are scheduled shooting positions set at regular intervals, and the stars in FIGS. 7B and 7C indicate the positions at which images were actually shot.
- In FIG. 7B, since the actual shooting position deviates from the scheduled shooting position D, the analysis result of the image shot there is an error.
- In FIG. 7C, since the shot at the scheduled shooting position D was missed and no shooting was performed there, the analysis result is an error.
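The position check illustrated in FIGS. 7A to 7C can be sketched by matching each captured image's position against the nearest scheduled position within a tolerance. The coordinates, labels, and tolerance value below are illustrative assumptions.

```python
def check_positions(scheduled, shots, tolerance=1.0):
    """Return the labels of scheduled positions with no shot within
    tolerance: both deviated and missed positions come out as errors."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

    errors = []
    for label, point in scheduled.items():
        if not any(dist(point, s) <= tolerance for s in shots):
            errors.append(label)
    return errors


# Scheduled positions A..G at regular 10 m intervals (illustrative coordinates).
scheduled = {chr(ord("A") + i): (10.0 * i, 0.0) for i in range(7)}

# As in FIG. 7B: the shot intended for D landed 5 m off the planned point.
shots_7b = [(10.0 * i, 0.0) for i in range(7)]
shots_7b[3] = (30.0, 5.0)
```

Running `check_positions(scheduled, shots_7b)` flags only point D, matching FIG. 7B; dropping the shot for D entirely (FIG. 7C) flags D as well, since a missed shot and a deviated shot both leave the scheduled point unmatched.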
- If there is no error as a result of the analysis processing in step S18, the process proceeds from step S19 to step S20 (No in step S19). In step S20, the information processing unit 300 notifies the moving body 100 that the shooting has succeeded.
- If the analysis result is an error, the process proceeds to step S21 in any of the cases described above (Yes in step S19).
- In step S21, the information processing unit 300 notifies the moving body 100 that the shooting has failed.
- When the control unit 110 of the moving body 100 receives a shooting failure notification from the imaging apparatus 200, it performs predetermined processing according to the type of shooting failure.
- The control unit 110 controls the moving body 100 so as to return to a predetermined position (the base station, the position where the user is located, etc.). This is because photographing cannot continue in any of these cases.
- The user then needs to replace or charge the battery 6, add capacity to the storage unit 270, delete data to free up remaining capacity in the storage unit 270, or repair the moving body 100 and/or the imaging device 200.
- This control is executed, for example, by resetting the flight path in the control unit 110 and controlling the flight of the moving body 100 along the reset path.
- Alternatively, the base station may be notified of whether re-imaging is necessary, the flight path may be acquired from the base station, and the flight of the moving body 100 may be controlled based on that flight path.
- The control unit 110 of the moving body 100 controls whether to return to a position slightly before the failed shooting position and shoot again there, to continue moving as it is and shoot again at the failed position after finishing shooting at the last scheduled shooting position, or to return to the first position on the shooting path and start shooting again from the beginning.
- This control is realized, for example, by resetting the flight path in the control unit 110 and controlling the flight of the moving body 100 along the reset path.
- It can also be realized by notifying the base station of whether re-shooting is necessary after the shooting results are obtained, acquiring the flight path from the base station, and controlling the flight of the moving body 100 based on that flight path. For example, when the shooting positions or the number of shots are set in advance, if shooting has failed for a predetermined ratio (for example, 50%) of the shooting positions or shots, it is also possible to select the process of returning to the first position on the shooting path and re-shooting from the beginning.
- The reason such control is executed is that, since photographing itself remains possible, the moving body 100 can perform re-photographing without returning to the base station or the user. If the moving body 100 is notified of a shooting failure while it is flying and the imaging apparatus 200 is shooting, shooting can thus be performed again without returning to the base station or the user. Re-photographing can therefore be performed quickly, reducing the time and cost it requires. Note that which of the above processes is performed may be set in the moving body 100 in advance, or may be selected and set by the user before shooting starts.
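The dispatch logic described in this passage can be sketched as follows. This is a hedged illustration only: the failure-type names, the set of unrecoverable failures, and the 50% restart threshold are taken loosely from the text, and everything else (function names, data shapes) is assumed.

```python
# Failures after which shooting itself cannot continue (illustrative names):
# the moving body should return for charging, data transfer, or repair.
UNRECOVERABLE = {"battery_low", "storage_full", "write_error"}

def handle_failure_report(failures, scheduled_positions, failed_positions,
                          restart_ratio=0.5):
    """Choose the moving body's next action from the failure types
    reported by the imaging device."""
    if UNRECOVERABLE & set(failures):
        # Cannot keep shooting: return to the base station or the user.
        return "return_to_base"
    if len(failed_positions) >= restart_ratio * len(scheduled_positions):
        # Too many shots failed (e.g. 50% or more): redo the whole path.
        return "restart_from_beginning"
    # Otherwise re-shoot only the failed positions, e.g. after finishing
    # the remaining scheduled shots or by backing up slightly.
    return "reshoot_failed_positions"
```

Which branch is taken could equally be preset in the moving body or chosen by the user before the flight, as the text notes.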
- Processing in the present technology is performed as described above. According to the present technology, the moving body 100 can be notified directly from the imaging device 200 whether imaging has failed, without involving the user or another device. Re-photographing can therefore be executed during the flight, reducing the time and cost of subsequent re-photographing.
- In the embodiment described above, the information processing unit 300 operates on the imaging apparatus 200 and notifies the moving body 100 of imaging failure or success.
- However, the information processing unit 300 may instead operate on the moving body 100.
- In that case, the information processing unit 300 receives from the imaging device 200 the remaining-capacity information of the storage unit 270, information on whether the subject is in focus, and the image shot by the imaging device 200, and determines whether shooting has failed. The determination result is then supplied to the control unit 110 and reflected in the operation of the moving body 100. An instruction for re-shooting is also issued from the information processing unit of the moving body 100 to the imaging apparatus 200.
- In the embodiment, when the moving body 100 receives a shooting failure notification from the imaging device 200, it performs predetermined processing such as re-shooting according to the type of shooting failure. However, the processing performed when shooting fails (such as re-shooting control) may instead be carried out on the imaging apparatus 200 side.
- The present technology is not limited to a drone (small electric flying vehicle); it can be applied to automobiles, ships, robots, and the like that can be equipped with the imaging device 200 and move automatically without being operated by a person.
- The imaging device is not limited to a digital camera; the present technology can be applied to any device that has an imaging function, such as a smartphone, mobile phone, portable game machine, notebook computer, or tablet terminal, and that can be mounted on the moving body 100.
- In the embodiment, both a shooting failure notification and a shooting success notification are performed; however, the notification may be omitted when shooting succeeds and performed only when shooting fails.
- As specific examples of shooting failure, the cases where the shot image is out of focus or the exposure is not within a predetermined range have been described. However, shooting failure is not limited to these; it may be anything related to a parameter of the shot image.
- the imaging apparatus 200 may include an input unit, a display unit, and the like.
- The imaging device 200 may also be a device that can be used alone, without being connected to the moving body 100.
- the battery 6 may be included in the imaging apparatus 200, and power may be supplied from the battery 6 of the imaging apparatus 200 to the moving body 100.
- both the moving body 100 and the imaging device 200 may include a battery.
- Communication between the moving body 100 and the imaging device 200 may be performed by wireless communication.
- In the embodiment, communication between the moving body 100 and the imaging device 200 is performed using PTP, a USB communication standard; however, any communication method may be used.
- This technology can also be applied to video shooting. In the case of moving-image shooting, the shooting result notification process described with reference to FIG. 6 is likewise possible. However, since image analysis, such as whether the subject is in focus, is performed on the frame images constituting the moving image, only shooting failures may be notified. Shooting usually succeeds more often than it fails, so notifying success for every frame image would make the amount of notification processing enormous.
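The failure-only notification policy for video can be sketched in a few lines. The function and callback names are hypothetical; the point is simply that success is silent, so per-frame success messages never accumulate.

```python
def notify_frame_results(frame_ok_flags, notify):
    """Notify only failed frames; successful frames produce no message,
    keeping the notification volume small for long videos."""
    for i, ok in enumerate(frame_ok_flags):
        if not ok:
            notify(f"frame {i}: shooting failed")
```

With mostly-successful footage, this emits a handful of messages instead of one per frame.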
- the present technology can also have the following configurations.
- The information processing apparatus according to any one of (1) to (4), which acquires position information indicating the shooting position of an image shot by the imaging device, and performs a notification that shooting has failed when the shooting position is not a predetermined position.
- The information processing apparatus according to any one of (1) to (5), which acquires time information indicating the shooting time of an image shot by the imaging device, and performs a notification that shooting has failed when the shooting time is not a predetermined time.
- The information processing apparatus according to any one of (1) to (6), which acquires position information indicating the shooting position of an image shot by the imaging device, and performs a notification that shooting has failed when the shooting position deviates from predetermined consecutive fixed distance intervals.
- The information processing apparatus according to any one of (1) to (7), which acquires time information indicating the shooting time of an image shot by the imaging device, and performs a notification that shooting has failed when the shooting time deviates from predetermined consecutive fixed time intervals.
- the information processing apparatus according to any one of (1) to (13), which notifies the moving body of success when photographing by the imaging apparatus is successful.
- the information processing apparatus according to (17), wherein when imaging by the imaging apparatus has failed, the imaging apparatus is instructed to perform imaging again.
Abstract
Description
<1. Embodiment>
[1-1. Configuration of the moving body]
[1-2. Configuration of the imaging device]
[1-3. Shooting result notification processing]
<2. Modifications>
[1-1. Configuration of the moving body]
The configuration of the moving body 100 will be described with reference to FIGS. 1 and 2. In the present embodiment, the moving body 100 is a small electric flying vehicle (unmanned aerial vehicle) called a drone. FIG. 1A is a plan view of the moving body 100, and FIG. 1B is a front view of the moving body 100. The airframe is composed of, for example, a cylindrical or rectangular-tube-shaped body part 1 as the center and support shafts 2a to 2f fixed to the upper part of the body part 1. As one example, the body part 1 is a hexagonal tube, and the six support shafts 2a to 2f extend radially from the center of the body part 1 at equal angular intervals. The body part 1 and the support shafts 2a to 2f are made of a lightweight, high-strength material.
The configuration of the imaging device 200 will be described with reference to FIGS. 1 and 3. FIG. 3 is a block diagram showing the configuration of the imaging device 200.
Next, the shooting result notification processing will be described with reference to FIG. 6. FIG. 6 is a flowchart showing the flow of the shooting result notification processing performed by the information processing unit 300 and the shooting processing performed by the imaging device 200.
Although an embodiment of the present technology has been specifically described above, the present technology is not limited to the above embodiment, and various modifications based on the technical idea of the present technology are possible.
(1)
An information processing apparatus that operates in an imaging device mounted on a moving body, and
notifies the moving body that shooting has failed when shooting by the imaging device fails.
(2)
The information processing apparatus according to (1), which determines whether shooting has failed based on an image shot by the imaging device and/or information about the image.
(3)
The information processing apparatus according to (1) or (2), which performs a notification that shooting has failed when an image shot by the imaging device is out of focus.
(4)
The information processing apparatus according to any one of (1) to (3), which performs a notification that shooting has failed when the exposure of an image shot by the imaging device is not within a predetermined range.
(5)
The information processing apparatus according to any one of (1) to (4), which acquires position information indicating the shooting position of an image shot by the imaging device, and performs a notification that shooting has failed when the shooting position is not a predetermined position.
(6)
The information processing apparatus according to any one of (1) to (5), which acquires time information indicating the shooting time of an image shot by the imaging device, and performs a notification that shooting has failed when the shooting time is not a predetermined time.
(7)
The information processing apparatus according to any one of (1) to (6), which acquires position information indicating the shooting position of an image shot by the imaging device, and performs a notification that shooting has failed when the shooting position deviates from predetermined consecutive fixed distance intervals.
(8)
The information processing apparatus according to any one of (1) to (7), which acquires time information indicating the shooting time of an image shot by the imaging device, and performs a notification that shooting has failed when the shooting time deviates from predetermined consecutive fixed time intervals.
(9)
The information processing apparatus according to (1) or (2), which performs a notification that shooting has failed when shooting is not performed at a predetermined position and/or a predetermined time.
(10)
The information processing apparatus according to (1), which performs a notification that shooting has failed when no image is acquired by shooting of the imaging device.
(11)
The information processing apparatus according to (10), which performs a notification that shooting has failed when the remaining capacity of a storage unit, provided in the moving body or the imaging device, that stores shot images is equal to or less than a predetermined amount.
(12)
The information processing apparatus according to (10) or (11), which performs a notification that shooting has failed when a write error occurs in writing an image shot by the imaging device to a storage unit of the moving body or the imaging device.
(13)
The information processing apparatus according to any one of (10) to (12), which performs a notification that shooting has failed when the remaining charge of a battery of the moving body or the imaging device is equal to or less than a predetermined amount.
(14)
The information processing apparatus according to any one of (1) to (13), which notifies the moving body of success when shooting by the imaging device succeeds.
(15)
An information processing method of notifying a moving body that shooting has failed when shooting by an imaging device mounted on the moving body fails.
(16)
An information processing program causing a computer to execute an information processing method of notifying a moving body that shooting has failed when shooting by an imaging device mounted on the moving body fails.
(17)
An information processing apparatus that operates in a moving body equipped with an imaging device, and
determines whether shooting by the imaging device has failed.
(18)
The information processing apparatus according to (17), which instructs the imaging device to perform shooting again when shooting by the imaging device fails.
(19)
The information processing apparatus according to (17) or (18), which sets a movement path of the moving body when shooting by the imaging device fails.
(20)
The information processing apparatus according to any one of (17) to (19), which notifies an external device of the failure when shooting by the imaging device fails.
200 ... Imaging device
300 ... Information processing unit
Claims (20)
- An information processing apparatus that operates in an imaging device mounted on a moving body and notifies the moving body that shooting has failed when shooting by the imaging device fails.
- The information processing apparatus according to claim 1, which determines whether shooting has failed based on an image shot by the imaging device and/or information about the image.
- The information processing apparatus according to claim 2, which performs a notification that shooting has failed when an image shot by the imaging device is out of focus.
- The information processing apparatus according to claim 2, which performs a notification that shooting has failed when the exposure of an image shot by the imaging device is not within a predetermined range.
- The information processing apparatus according to claim 2, which acquires position information indicating the shooting position of an image shot by the imaging device, and performs a notification that shooting has failed when the shooting position is not a predetermined position.
- The information processing apparatus according to claim 2, which acquires time information indicating the shooting time of an image shot by the imaging device, and performs a notification that shooting has failed when the shooting time is not a predetermined time.
- The information processing apparatus according to claim 2, which acquires position information indicating the shooting position of an image shot by the imaging device, and performs a notification that shooting has failed when the shooting position deviates from predetermined consecutive fixed distance intervals.
- The information processing apparatus according to claim 2, which acquires time information indicating the shooting time of an image shot by the imaging device, and performs a notification that shooting has failed when the shooting time deviates from predetermined consecutive fixed time intervals.
- The information processing apparatus according to claim 2, which performs a notification that shooting has failed when shooting is not performed at a predetermined position and/or a predetermined time.
- The information processing apparatus according to claim 1, which performs a notification that shooting has failed when no image is acquired by shooting of the imaging device.
- The information processing apparatus according to claim 10, which performs a notification that shooting has failed when the remaining capacity of a storage unit, provided in the moving body or the imaging device, that stores shot images is equal to or less than a predetermined amount.
- The information processing apparatus according to claim 10, which performs a notification that shooting has failed when a write error occurs in writing an image shot by the imaging device to a storage unit of the moving body or the imaging device.
- The information processing apparatus according to claim 10, which performs a notification that shooting has failed when the remaining charge of a battery of the moving body or the imaging device is equal to or less than a predetermined amount.
- The information processing apparatus according to claim 1, which notifies the moving body of success when shooting by the imaging device succeeds.
- An information processing method of notifying a moving body that shooting has failed when shooting by an imaging device mounted on the moving body fails.
- An information processing program causing a computer to execute an information processing method of notifying a moving body that shooting has failed when shooting by an imaging device mounted on the moving body fails.
- An information processing apparatus that operates in a moving body equipped with an imaging device and determines whether shooting by the imaging device has failed.
- The information processing apparatus according to claim 17, which instructs the imaging device to perform shooting again when shooting by the imaging device fails.
- The information processing apparatus according to claim 17, which sets a movement path of the moving body when shooting by the imaging device fails.
- The information processing apparatus according to claim 17, which notifies an external device of the failure when shooting by the imaging device fails.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/482,640 US11061412B2 (en) | 2017-03-10 | 2017-12-25 | Information processing device and information processing method |
CN201780087880.2A CN110383810B (zh) | 2017-03-10 | 2017-12-25 | 信息处理设备、信息处理方法和信息处理程序 |
EP21216641.7A EP4002831A1 (en) | 2017-03-10 | 2017-12-25 | Information-processing device, information-processing method, and information-processing program |
EP17899534.6A EP3595286B1 (en) | 2017-03-10 | 2017-12-25 | Information-processing device, information-processing method, and information-processing program |
JP2019504343A JP7136079B2 (ja) | 2017-03-10 | 2017-12-25 | 情報処理装置、情報処理方法および情報処理プログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017045663 | 2017-03-10 | ||
JP2017-045663 | 2017-03-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018163571A1 true WO2018163571A1 (ja) | 2018-09-13 |
Family
ID=63448403
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/046314 WO2018163571A1 (ja) | 2017-03-10 | 2017-12-25 | 情報処理装置、情報処理方法および情報処理プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US11061412B2 (ja) |
EP (2) | EP4002831A1 (ja) |
JP (1) | JP7136079B2 (ja) |
CN (1) | CN110383810B (ja) |
WO (1) | WO2018163571A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018152737A (ja) * | 2017-03-13 | 2018-09-27 | ヤンマー株式会社 | 無人飛行撮影装置 |
WO2020189506A1 (ja) * | 2019-03-18 | 2020-09-24 | 株式会社ナイルワークス | ドローン、ドローンの制御方法、および、ドローンの制御プログラム |
CN111953892A (zh) * | 2019-05-16 | 2020-11-17 | 阿尔派株式会社 | 无人飞行器、检查方法及检查程序 |
JP2020191523A (ja) * | 2019-05-21 | 2020-11-26 | アルパイン株式会社 | 無人移動体 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113068501A (zh) * | 2020-01-06 | 2021-07-06 | 苏州宝时得电动工具有限公司 | 一种智能割草机 |
EP4117270A4 (en) * | 2020-03-03 | 2023-05-17 | Sony Group Corporation | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD AND INFORMATION PROCESSING SYSTEM |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09247656A (ja) * | 1996-03-05 | 1997-09-19 | Matsushita Electric Ind Co Ltd | 画像通信端末装置 |
JP2011240745A (ja) * | 2010-05-14 | 2011-12-01 | Chugoku Electric Power Co Inc:The | 無人飛行体の着陸を支援する方法、及び無人飛行体 |
JP2016138788A (ja) | 2015-01-27 | 2016-08-04 | 株式会社トプコン | 測量データ処理装置、測量データ処理方法およびプログラム |
JP2017034444A (ja) * | 2015-07-31 | 2017-02-09 | オリンパス株式会社 | 撮像装置および撮像方法 |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000278573A (ja) * | 1999-03-23 | 2000-10-06 | Minolta Co Ltd | データ記憶媒体の駆動システム |
DE10201520A1 (de) * | 2002-01-17 | 2003-07-31 | Bosch Gmbh Robert | Verfahren und Vorrichtung zur Bildfehlererkennung b zw. -anzeige bei Bildaufnahmesystemen |
US20050219400A1 (en) * | 2002-09-24 | 2005-10-06 | Hasselblad A/S | Image quality indicator |
WO2006120815A1 (ja) * | 2005-05-11 | 2006-11-16 | Matsushita Electric Industrial Co., Ltd. | 固体撮像装置、カメラ、自動車および監視装置 |
JP2008116823A (ja) * | 2006-11-07 | 2008-05-22 | Nikon Corp | カメラ |
US8385672B2 (en) * | 2007-05-01 | 2013-02-26 | Pictometry International Corp. | System for detecting image abnormalities |
JP4841582B2 (ja) * | 2008-03-18 | 2011-12-21 | 富士通株式会社 | 画像補正プログラムおよび画像補正装置 |
JP2010050521A (ja) * | 2008-08-19 | 2010-03-04 | Olympus Corp | 撮像装置 |
JP2012010026A (ja) * | 2010-06-23 | 2012-01-12 | Seiko Epson Corp | 撮像装置及び撮像制御回路 |
US8712157B2 (en) * | 2011-04-19 | 2014-04-29 | Xerox Corporation | Image quality assessment |
JP5910073B2 (ja) | 2011-12-26 | 2016-04-27 | 株式会社ニコン | 撮像装置、システム、電子機器およびプログラム |
JP6003613B2 (ja) | 2012-12-18 | 2016-10-05 | 株式会社ニコン | 補助撮像装置 |
KR101709482B1 (ko) | 2013-01-16 | 2017-02-23 | 삼성전자주식회사 | 서버 장치 및 서버의 제어 방법 |
KR101329583B1 (ko) * | 2013-07-09 | 2013-11-14 | 주식회사 두레텍 | 회전익 구조체를 이용한 공중관측 지형자료 구축 방법 및 그 시스템 |
JP6287092B2 (ja) * | 2013-11-14 | 2018-03-07 | ソニー株式会社 | 情報処理装置および情報処理方法、撮像システム、並びにプログラム |
EP2890112A1 (en) * | 2013-12-30 | 2015-07-01 | Nxp B.V. | Method for video recording and editing assistant |
WO2015107928A1 (ja) * | 2014-01-17 | 2015-07-23 | ソニー株式会社 | 撮影システム、警告発生装置および方法、撮像装置および方法、並びにプログラム |
JP6266799B2 (ja) * | 2014-09-10 | 2018-01-24 | 富士フイルム株式会社 | 撮影制御装置、撮影装置、撮影制御方法、及びプログラム |
JP6090274B2 (ja) * | 2014-09-24 | 2017-03-08 | カシオ計算機株式会社 | 撮影制御装置、同期撮影システム、撮影制御方法、同期撮影方法及びプログラム |
KR102251814B1 (ko) * | 2015-02-06 | 2021-05-13 | 삼성전자주식회사 | 메모리 장치, 그것의 동작 및 제어 방법 |
US20160309069A1 (en) * | 2015-04-17 | 2016-10-20 | mPerpetuo, Inc. | Lighting System for a Camera Including Multiple LEDS |
KR102364730B1 (ko) * | 2015-07-29 | 2022-02-18 | 엘지전자 주식회사 | 이동 단말기 및 이의 제어방법 |
CN110557586B (zh) * | 2016-05-31 | 2022-02-18 | 索尼半导体解决方案公司 | 光检测装置、先进驾驶辅助系统和自主驾驶系统 |
CN106127180A (zh) * | 2016-06-30 | 2016-11-16 | 广东电网有限责任公司电力科学研究院 | 一种机器人辅助定位方法及装置 |
EP3494443B1 (en) * | 2016-10-24 | 2023-01-11 | SZ DJI Technology Co., Ltd. | Systems and methods for controlling an image captured by an imaging device |
US10805507B2 (en) * | 2016-12-21 | 2020-10-13 | Shanghai Xiaoyi Technology Co., Ltd. | Method and system for configuring cameras to capture images |
JP6619761B2 (ja) | 2017-03-13 | 2019-12-11 | ヤンマー株式会社 | 無人飛行撮影装置 |
JP6482696B2 (ja) * | 2017-06-23 | 2019-03-13 | キヤノン株式会社 | 表示制御装置、表示制御方法、およびプログラム |
- 2017-12-25 EP EP21216641.7A patent/EP4002831A1/en active Pending
- 2017-12-25 JP JP2019504343A patent/JP7136079B2/ja active Active
- 2017-12-25 US US16/482,640 patent/US11061412B2/en active Active
- 2017-12-25 EP EP17899534.6A patent/EP3595286B1/en active Active
- 2017-12-25 WO PCT/JP2017/046314 patent/WO2018163571A1/ja active Application Filing
- 2017-12-25 CN CN201780087880.2A patent/CN110383810B/zh active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09247656A (ja) * | 1996-03-05 | 1997-09-19 | Matsushita Electric Ind Co Ltd | 画像通信端末装置 |
JP2011240745A (ja) * | 2010-05-14 | 2011-12-01 | Chugoku Electric Power Co Inc:The | 無人飛行体の着陸を支援する方法、及び無人飛行体 |
JP2016138788A (ja) | 2015-01-27 | 2016-08-04 | 株式会社トプコン | 測量データ処理装置、測量データ処理方法およびプログラム |
JP2017034444A (ja) * | 2015-07-31 | 2017-02-09 | オリンパス株式会社 | 撮像装置および撮像方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3595286A4 |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018152737A (ja) * | 2017-03-13 | 2018-09-27 | ヤンマー株式会社 | 無人飛行撮影装置 |
WO2020189506A1 (ja) * | 2019-03-18 | 2020-09-24 | 株式会社ナイルワークス | ドローン、ドローンの制御方法、および、ドローンの制御プログラム |
JPWO2020189506A1 (ja) * | 2019-03-18 | 2020-09-24 | ||
JP7045122B2 (ja) | 2019-03-18 | 2022-03-31 | 株式会社ナイルワークス | ドローン、ドローンの制御方法、および、ドローンの制御プログラム |
CN111953892A (zh) * | 2019-05-16 | 2020-11-17 | 阿尔派株式会社 | 无人飞行器、检查方法及检查程序 |
JP2020185941A (ja) * | 2019-05-16 | 2020-11-19 | アルパイン株式会社 | 無人航空機、点検方法および点検プログラム |
JP7305263B2 (ja) | 2019-05-16 | 2023-07-10 | アルパイン株式会社 | 無人航空機、点検方法および点検プログラム |
CN111953892B (zh) * | 2019-05-16 | 2024-01-26 | 阿尔派株式会社 | 无人飞行器、检查方法 |
JP2020191523A (ja) * | 2019-05-21 | 2020-11-26 | アルパイン株式会社 | 無人移動体 |
JP7362203B2 (ja) | 2019-05-21 | 2023-10-17 | アルパイン株式会社 | 無人移動体 |
Also Published As
Publication number | Publication date |
---|---|
CN110383810A (zh) | 2019-10-25 |
EP4002831A1 (en) | 2022-05-25 |
JP7136079B2 (ja) | 2022-09-13 |
US11061412B2 (en) | 2021-07-13 |
US20200014913A1 (en) | 2020-01-09 |
EP3595286B1 (en) | 2022-03-16 |
EP3595286A1 (en) | 2020-01-15 |
JPWO2018163571A1 (ja) | 2020-01-09 |
CN110383810B (zh) | 2021-10-08 |
EP3595286A4 (en) | 2020-01-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7136079B2 (ja) | 情報処理装置、情報処理方法および情報処理プログラム | |
JP7091613B2 (ja) | 撮像装置、カメラ装着ドローン、およびモード制御方法、並びにプログラム | |
JP7057637B2 (ja) | 制御装置、制御システム、制御方法、プログラム、及び記憶媒体 | |
JP6785412B2 (ja) | 無人航空機システム | |
WO2019104641A1 (zh) | 无人机、其控制方法以及记录介质 | |
CN107580161B (zh) | 摄影设备及方法、移动摄影装置、摄影用移动体及其控制装置 | |
JP6639979B2 (ja) | 撮影機器及び撮影用移動体 | |
WO2021212445A1 (zh) | 拍摄方法、可移动平台、控制设备和存储介质 | |
CN111345033A (zh) | 参数同步方法、拍摄装置和可移动平台 | |
CN107431749B (zh) | 一种跟焦器控制方法和装置及系统 | |
JP7136098B2 (ja) | 撮像装置、カメラ装着ドローン、およびモード制御方法、並びにプログラム | |
JP2017212528A (ja) | 撮像システム、撮像制御方法、撮像制御システム、移動体、制御方法、及びプログラム | |
US10942331B2 (en) | Control apparatus, lens apparatus, photographic apparatus, flying body, and control method | |
CN111630838B (zh) | 确定装置、摄像装置、摄像系统、移动体、确定方法以及程序 | |
JP6475568B2 (ja) | 撮像装置および飛行制御方法 | |
JP2021033177A (ja) | アダプタ、撮像装置、支持機構および移動体 | |
JP2020017924A (ja) | 移動体、撮像制御方法、プログラム、及び記録媒体 | |
JP2011197467A (ja) | 撮影装置 | |
US20240067369A1 (en) | Image capturing system equipped with device capable of flying, method of controlling image capturing system, and storage medium | |
JP6589902B2 (ja) | 情報処理装置、情報処理装置の制御方法及びプログラム | |
JP2020003730A (ja) | 移動体、合焦制御方法、プログラム、及び記録媒体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17899534 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2019504343 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2017899534 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2017899534 Country of ref document: EP Effective date: 20191010 |