CN107205111B - Image pickup apparatus, mobile apparatus, image pickup system, image pickup method, and recording medium - Google Patents


Info

Publication number
CN107205111B
CN107205111B (Application No. CN201611095306.5A)
Authority
CN
China
Prior art keywords
mobile device
image pickup
unit
imaging
information
Prior art date
Legal status
Active
Application number
CN201611095306.5A
Other languages
Chinese (zh)
Other versions
CN107205111A (en)
Inventor
福谷佳之
志村和彦
Current Assignee
Olympus Corp
Original Assignee
Olympus Corp
Priority date
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Publication of CN107205111A publication Critical patent/CN107205111A/en
Application granted granted Critical
Publication of CN107205111B publication Critical patent/CN107205111B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control

Abstract

Provided are an imaging device, a mobile device, an imaging system, an imaging method, and a recording medium that enable effective cooperation even when an operator of the mobile device and a photographer of the imaging device perform their operations separately. An imaging device (20) is mounted on a mobile device (10) and can perform bidirectional communication with the mobile device (10). The imaging device (20) includes an imaging unit (21) that images a subject to generate image data, and a 2nd control unit (28) that transmits, to the mobile device (10), information that can be displayed on a 1st display unit (32) of a manipulation device (30) used to manipulate the mobile device (10).

Description

Image pickup apparatus, mobile apparatus, image pickup system, image pickup method, and recording medium
Technical Field
The present invention relates to an imaging apparatus, a mobile apparatus, an imaging system, an imaging method, and a recording medium that capture an object and generate image data of the object.
Background
In recent years, the following technique is known: image data transmitted from a camera provided on a mobile robot is analyzed, and an obstacle present in the traveling direction of the mobile robot is detected to prevent a collision (see Patent Document 1). In this technique, a control device for manipulating the mobile robot analyzes the image data transmitted from the camera, and the analysis result and the image data are displayed on a display screen of the control device.
Documents of the prior art
Patent document
Patent Document 1: Japanese Patent Laid-Open Publication No. 2007-320024
Disclosure of Invention
Problems to be solved by the invention
However, in recent years, when an imaging device such as a camera is mounted on a mobile device such as a mobile robot or an unmanned aerial vehicle (for example, a drone) to perform imaging, usability has been insufficient in situations where an operator and a photographer cooperate, using separate operation terminals, to manipulate the mobile device and to operate the imaging device, respectively.
The present invention has been made in view of the above circumstances, and an object thereof is to provide an imaging apparatus, a mobile apparatus, an imaging system, an imaging method, and a program that enable effective cooperation even when an operator of the mobile apparatus and a photographer of the imaging apparatus perform their respective operations.
Means for solving the problems
In order to solve the above problems and achieve the object, an image pickup apparatus according to the present invention is an image pickup apparatus that is mounted on a mobile apparatus and is capable of bidirectional communication with the mobile apparatus, the image pickup apparatus including: an image pickup unit that picks up an image of an object to generate image data; and a control unit that controls so that information that can be displayed on a display unit of a manipulation device that manipulates the mobile device is transmitted to the mobile device.
In the imaging apparatus according to the present invention, the information includes at least the image data generated by the imaging unit.
In the imaging device according to the present invention, the imaging device further includes a distance detection unit that detects a distance from the imaging device to the object based on the image data, and the information further includes distance information on the distance detected by the distance detection unit.
In the imaging apparatus according to the present invention, the imaging apparatus further includes a communication unit that receives an instruction signal instructing the imaging apparatus to perform imaging from an operation device operating the imaging apparatus and transmits the image data to the operation device, and the control unit causes the imaging unit to perform imaging when the communication unit receives the instruction signal, and the information further includes reception information indicating that the communication unit has received the instruction signal.
In the imaging device according to the present invention, when a control signal for controlling the movement of the mobile device in accordance with the operation of the manipulation device is received from the mobile device, the control unit transmits control information related to the control signal to the operation device via the communication unit.
In the imaging device according to the present invention, in the above-described invention, the control unit transmits the position information to the operation device via the communication unit when the position information on the position of the mobile device is input from the mobile device.
In the imaging device according to the present invention, in the above-described invention, the control unit does not cause the imaging unit to perform imaging, even when the instruction signal is received, while the mobile device is returning to a predetermined position.
The mobile device according to the present invention is a mobile device equipped with an imaging device for imaging a subject to generate image data, the mobile device being capable of performing bidirectional communication with the imaging device and moving in accordance with a signal transmitted from an operating device having a display unit, the mobile device including a communication unit for receiving the signal transmitted from the operating device and transmitting information input from the imaging device to the operating device.
In the mobile device according to the present invention, the information includes at least the image data.
In the mobile device according to the present invention, the information further includes distance information on a distance from the imaging device to the object.
In the mobile device according to the present invention, the mobile device further includes: a position detection unit that detects position information relating to a current position of the mobile device; and a control unit that controls so as to transmit the position information detected by the position detection unit to the imaging device.
In the mobile device according to the present invention, the information further includes reception information indicating that the imaging device has received an instruction signal instructing the imaging device to perform imaging, and the control unit causes the mobile device to stand still when this reception information is received.
In the mobile device according to the present invention, in the above-described invention, the control unit keeps the mobile device returning when this reception information is received while the mobile device is returning to a predetermined position.
The imaging system of the present invention includes an imaging device that images a subject to generate image data, a mobile device to which the imaging device is attached and which can perform bidirectional communication with the imaging device, a manipulator that manipulates the mobile device, and an operation device that operates the imaging device, and is characterized in that the imaging device includes a control unit that controls so as to transmit information that can be displayed on a display unit of the manipulator to the mobile device, and the mobile device includes a communication unit that receives a signal transmitted from the manipulator and transmits the information input from the imaging device to the manipulator.
Further, an image pickup method according to the present invention is an image pickup method executed by an image pickup apparatus mounted on a mobile apparatus and capable of bidirectional communication with the mobile apparatus, the image pickup method including: an image pickup step of picking up an image of a subject to generate image data; and a control step of transmitting information that can be displayed on a display portion of a manipulation device that manipulates the mobile device to the mobile device.
A recording medium of the present invention records a program executed by an imaging apparatus that is mounted on a mobile apparatus and that is capable of bidirectional communication with the mobile apparatus, the program executing: an image pickup step of picking up an image of a subject to generate image data; and a control step of transmitting information that can be displayed on a display portion of a manipulation device that manipulates the mobile device to the mobile device.
Effects of the invention
According to the present invention, even when the operator of the mobile device and the photographer of the imaging device perform their respective operations, the operator and the photographer can cooperate effectively and carry out their operations in a coordinated manner.
Drawings
Fig. 1 is a schematic diagram showing a schematic configuration of an imaging system according to an embodiment of the present invention.
Fig. 2 is a block diagram showing a functional configuration of an image pickup system according to an embodiment of the present invention.
Fig. 3 is a flowchart illustrating an outline of an operation performed by the imaging apparatus according to the embodiment of the present invention.
Fig. 4 is a diagram showing an example of an image displayed by the operation device according to the embodiment of the present invention.
Fig. 5 is a diagram showing an example of an image displayed by the manipulator according to the embodiment of the present invention.
Fig. 6 is a flowchart showing an outline of an operation performed by the mobile device according to the embodiment of the present invention.
Fig. 7 is a schematic diagram showing a schematic configuration of an imaging system according to a modification of the embodiment of the present invention.
Description of the reference symbols
1, 1a: imaging system; 2: cable; 10: mobile device; 10a: rotor; 11: propulsion unit; 12: power source; 13: spatial information acquisition unit; 14: position and orientation detection unit; 15: height/attitude detection unit; 16: 1st recording unit; 17: 1st communication unit; 18: 2nd communication unit; 19: 1st control unit; 20: imaging device; 21: imaging unit; 22: elevation/azimuth detection unit; 23: clock; 24: 1st operation unit; 25: 3rd communication unit; 26: 4th communication unit; 27: 2nd recording unit; 28: 2nd control unit; 30: manipulation device; 31: 5th communication unit; 32: 1st display unit; 33: 2nd operation unit; 34: 3rd recording unit; 35: 3rd control unit; 40: operation device; 41: 6th communication unit; 42: 2nd display unit; 43: 3rd operation unit; 44: 4th recording unit; 45: 4th control unit; 191: power supply determination unit; 192: movement determination unit; 193: posture determination unit; 194: propulsion control unit; 195: direction control unit; 196: attitude control unit; 197: 1st communication control unit; 211: optical system; 212: image pickup element; 271: image data recording unit; 272: program recording unit; 281: image processing unit; 282: image feature extraction unit; 283: focal position acquisition unit; 284: viewing angle acquisition unit; 285: distance detection unit; 286: trimming unit; 287: 2nd communication control unit; 288: imaging control unit; 351: 3rd communication control unit; 352: 1st display control unit; 451: 4th communication control unit; 452: 2nd display control unit; P1: subject; U1: operator; U2: photographer.
Detailed Description
A mode for carrying out the present invention (hereinafter referred to as "embodiment") will be described with reference to the drawings. The present invention is not limited to the embodiments described below. In addition, in the description of the drawings, the same parts are denoted by the same reference numerals.
(schematic configuration of imaging System)
Fig. 1 is a schematic diagram showing a schematic configuration of an imaging system according to an embodiment of the present invention. Fig. 2 is a block diagram showing a functional configuration of the imaging system according to the embodiment of the present invention. The imaging system 1 shown in fig. 1 and 2 is a system in which the imaging device is mounted on the mobile device, the operator U1 moves the mobile device with the manipulation device, and the photographer U2 controls the imaging device with the operation device, whereby the subject P1 can be imaged.
As shown in fig. 1 and 2, the imaging system 1 includes a mobile device 10, an imaging device 20, a manipulation device 30, and an operation device 40. The imaging device 20 is detachably attached to the mobile device 10. Of course, the imaging device 20 may be attached to the mobile device 10 via a stabilizer, a vibration correction mechanism (e.g., a gimbal), a rig, or the like. The mobile device 10 and the imaging device 20 are connected to each other so as to be capable of bidirectional communication via a cable 2 such as a USB cable. The mobile device 10 and the manipulation device 30 are connected so as to be capable of bidirectional communication in a 1st frequency band (for example, a band based on the wireless standards of each country, such as 27 MHz, 40 MHz, 72 MHz, 73 MHz, 2.4 GHz, 5 GHz, or 5.8 GHz). The imaging device 20 and the operation device 40 are connected so as to be capable of bidirectional communication in a 2nd frequency band different from the 1st frequency band (for example, when the 1st frequency band is 5 GHz, a band based on the wireless standards of each country such as 5.8 GHz). In the present embodiment, the mobile device 10 and the imaging device 20 are connected by the cable 2, but the present invention is not limited to this, and a configuration capable of wireless communication in a 3rd frequency band may also be adopted.
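For illustration only, the separation of the three links described above can be summarized in the short Python sketch below; the device labels, the example band values, and the helper function are assumptions made for this sketch and are not part of the claimed configuration.

from dataclasses import dataclass

@dataclass(frozen=True)
class Link:
    # One communication path of the imaging system 1 shown in fig. 1.
    endpoints: tuple   # pair of device labels
    medium: str        # "wired" or "wireless"
    band_hz: float     # carrier band in Hz; 0.0 for the wired cable 2

# Hypothetical link table mirroring the description above.
LINKS = [
    Link(("mobile device 10", "manipulation device 30"), "wireless", 2.4e9),  # 1st band
    Link(("imaging device 20", "operation device 40"), "wireless", 5.8e9),    # 2nd band
    Link(("mobile device 10", "imaging device 20"), "wired", 0.0),            # cable 2 (USB)
]

def bands_disjoint(links) -> bool:
    # The two wireless links must use different bands so that manipulation of
    # the mobile device and operation of the imaging device do not interfere.
    bands = [link.band_hz for link in links if link.medium == "wireless"]
    return len(bands) == len(set(bands))

print(bands_disjoint(LINKS))  # True for the example table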
(Structure of Mobile device)
First, the structure of the mobile device 10 will be described. The mobile device 10 of the present invention is an unmanned aerial vehicle (UAV), configured here as a rotary-wing drone having four rotors 10a. The number of rotors 10a is not limited to four, and other numbers may be provided. Further, the mobile device 10 is not limited to a rotary-wing drone and may be another unmanned aircraft such as a fixed-wing drone. The mobile device 10 may also be a wirelessly operable autonomous device such as an autonomous robot, an automobile, a model automobile, a capsule endoscope, or a ship.
The mobile device 10 includes a propulsion unit 11, a power source 12, a spatial information acquisition unit 13, a position and orientation detection unit 14, a height/attitude detection unit 15, a 1st recording unit 16, a 1st communication unit 17, a 2nd communication unit 18, and a 1st control unit 19.
The propulsion unit 11 is configured using a plurality of (4 in the example of fig. 1) rotors 10a and a plurality of motors (not shown) for driving the rotors 10a, and flies the mobile device 10 under the control of the 1st control unit 19.
The power source 12 is configured using a battery, a booster circuit, and the like. The power source 12 adjusts the supplied voltage to a predetermined level and supplies power to each unit of the mobile device 10.
The spatial information acquisition unit 13 is configured using a laser radar or the like. The spatial information acquisition unit 13 acquires spatial information on the obstacles located around the mobile device 10 (the direction of and the distance to each irradiation point) by radiating pulsed laser light from the mobile device 10 and measuring the return time of the reflected pulses, and outputs the acquired result to the 1st control unit 19. The spatial information acquisition unit 13 may instead be configured using, for example, a plurality of image pickup devices, and may acquire the distance and direction of each obstacle around the mobile device 10 by extracting feature values from the image data from each image pickup device.
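As a concrete illustration of the time-of-flight measurement just described, the following sketch converts the measured return time of a pulse into a one-way distance; the scan format is an assumption used only for this example.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def pulse_return_to_distance(return_time_s: float) -> float:
    # Convert the measured round-trip time of a laser pulse into the
    # one-way distance to the reflecting obstacle (d = c * t / 2).
    return SPEED_OF_LIGHT * return_time_s / 2.0

def scan_to_obstacles(scan):
    # Turn a list of (azimuth_deg, return_time_s) samples into the
    # (direction, distance) pairs that would be passed to the 1st control unit.
    return [(azimuth, pulse_return_to_distance(t)) for azimuth, t in scan]

# Example: a pulse returning after about 66.7 ns corresponds to roughly 10 m.
print(pulse_return_to_distance(66.7e-9))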
The position and orientation detection unit 14 is configured using a GPS (Global Positioning System) receiver, a magnetic azimuth sensor, or the like. The position and orientation detection unit 14 detects current position information on the current position of the mobile device 10 and azimuth information on the angle (azimuth) of the head direction of the mobile device 10 (the direction in which the head is facing) with respect to a predetermined reference azimuth (for example, north), and outputs the detection results to the 1st control unit 19. In the present embodiment, the position and orientation detection unit 14 functions as a position detection unit.
The height/attitude detection unit 15 is configured using an air pressure sensor, a gyro sensor (angular velocity sensor), an inclination sensor (acceleration sensor), and the like. The height/attitude detection unit 15 detects height information on the height of the mobile device 10, tilt angle information on the tilt angle of the mobile device 10 (the tilt angle with respect to a reference attitude in which the 4 rotors 10a lie in a horizontal plane), and rotation angle information on the rotation angle of the mobile device 10 (the rotation angle around a vertical line passing through the center of the 4 rotors 10a), and outputs the detection results to the 1st control unit 19.
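For reference, the output of an air pressure sensor of the kind mentioned above is commonly converted to a height value with the international standard atmosphere relation; the sketch below shows that generic conversion and is not the firmware of the height/attitude detection unit 15.

def pressure_to_altitude_m(pressure_hpa: float, sea_level_hpa: float = 1013.25) -> float:
    # Standard-atmosphere conversion from barometric pressure to altitude:
    # h = 44330 * (1 - (p / p0) ** (1 / 5.255))
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

# Example: about 898.75 hPa corresponds to roughly 1000 m above the reference level.
print(round(pressure_to_altitude_m(898.75), 1))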
The 1st recording unit 16 is configured using a recording medium such as a flash memory or an SDRAM (Synchronous Dynamic Random Access Memory). The 1st recording unit 16 records various programs for driving the mobile device 10, and temporarily records information being processed.
The 1st communication unit 17 is configured using a predetermined communication module. The 1st communication unit 17 performs wireless communication with the manipulation device 30 that remotely controls the flight operation of the mobile device 10 under the control of the 1st control unit 19. The 1st communication unit 17 also transmits information input from the imaging device 20 to the manipulation device 30 under the control of the 1st control unit 19.
The 2nd communication unit 18 is configured using a communication module. The 2nd communication unit 18 communicates with the imaging device 20 via the cable 2 under the control of the 1st control unit 19.
The 1st control unit 19 is configured using a CPU (Central Processing Unit) or the like, and controls the operation of the propulsion unit 11 (the flying operation of the mobile device 10) and communication, based on instruction signals from the manipulation device 30 input via the 1st communication unit 17 and data from the imaging device 20 input via the 2nd communication unit 18.
Here, the detailed configuration of the 1st control unit 19 will be described. The 1st control unit 19 includes a power supply determination unit 191, a movement determination unit 192, a posture determination unit 193, a propulsion control unit 194, a direction control unit 195, an attitude control unit 196, and a 1st communication control unit 197.
The power supply determination unit 191 detects the remaining amount of the power source 12 (the remaining amount of power), and determines the flight time and flight distance for which the mobile device 10 can fly, based on the detection result.
The movement determination unit 192 determines the movement distance of the mobile device 10 based on the current position information and the azimuth information detected by the position and orientation detection unit 14, the height information detected by the height/attitude detection unit 15, and the flight time determined by the power supply determination unit 191.
The posture determination unit 193 determines the posture of the imaging device 20 based on the tilt angle information and the rotation angle information detected by the height/attitude detection unit 15.
The propulsion control unit 194 propels the mobile device 10 based on the determination results of the power supply determination unit 191, the movement determination unit 192, and the posture determination unit 193. Specifically, the propulsion control unit 194 independently controls the rotation speeds of the 4 rotors 10a, thereby raising, advancing, stopping, retreating, and lowering the mobile device 10. For example, the propulsion control unit 194 changes the moving direction of the mobile device 10 to the front-back direction (the head direction of the mobile device 10 being the front) or the left-right direction (left and right as viewed in the head direction of the mobile device 10).
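Raising, advancing, retreating, and turning by independently varying the rotor speeds is commonly realized with a "motor mixer". The following is a generic sketch for an X-configuration quadrotor, given only to illustrate the principle; it is not the actual control law of the propulsion control unit 194, and the sign conventions are assumptions.

def mix_x_quad(throttle: float, roll: float, pitch: float, yaw: float):
    # Map collective throttle (in [0, 1]) and roll/pitch/yaw commands
    # (each in [-1, 1]) to four rotor commands for an X-configuration quad.
    # Motor order: front-left, front-right, rear-left, rear-right.
    m_fl = throttle + roll + pitch - yaw
    m_fr = throttle - roll + pitch + yaw
    m_rl = throttle + roll - pitch + yaw
    m_rr = throttle - roll - pitch - yaw
    # Clamp to the valid command range of the motor drivers.
    return [min(max(m, 0.0), 1.0) for m in (m_fl, m_fr, m_rl, m_rr)]

# Pure climb: all rotors equal.  Pitching forward: front rotors slow, rear rotors speed up.
print(mix_x_quad(0.6, 0.0, 0.0, 0.0))
print(mix_x_quad(0.6, 0.0, -0.2, 0.0))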
The direction control unit 195 rotates the mobile device 10 based on the determination results of the power supply determination unit 191, the movement determination unit 192, and the posture determination unit 193. Specifically, the direction control unit 195 independently controls the rotation speeds of the 4 rotors 10a, thereby changing the tilt angle or the rotation angle of the mobile device 10.
The attitude control unit 196 controls the attitude of the mobile device 10 based on the determination result of the posture determination unit 193 and the operation signal from the manipulation device 30.
The 1st communication control unit 197 controls communication of each of the 1st communication unit 17 and the 2nd communication unit 18. Specifically, the 1st communication control unit 197 transmits information input from the imaging device 20 via the 2nd communication unit 18 to the manipulation device 30 via the 1st communication unit 17. The 1st communication control unit 197 also transmits information (hereinafter referred to as "mobile device information"), including operation information input from the manipulation device 30 via the 1st communication unit 17, to the imaging device 20 via the 2nd communication unit 18. Here, the mobile device information includes the current position of the mobile device 10, control information (operation information) related to the control signal transmitted from the manipulation device 30 to control the movement of the mobile device 10, the moving direction of the mobile device 10, the height of the mobile device 10, and the state of the mobile device 10 (e.g., normal movement, return movement, remaining battery level).
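The mobile device information listed above can be pictured as a small telemetry record; the field names and the JSON serialization below are illustrative assumptions, not a format defined in this embodiment.

from dataclasses import dataclass, asdict
import json

@dataclass
class MobileDeviceInfo:
    # Illustrative shape of the mobile device information forwarded by the
    # 1st communication control unit 197 to the imaging device 20.
    latitude: float
    longitude: float
    height_m: float
    heading_deg: float       # moving direction
    control_command: str     # operation info received from the manipulation device 30
    state: str               # e.g. "normal", "returning", "stationary"
    battery_percent: float

    def to_payload(self) -> bytes:
        # Serialize for transmission over the cable 2 (format is an assumption).
        return json.dumps(asdict(self)).encode("utf-8")

info = MobileDeviceInfo(35.68, 139.76, 42.0, 180.0, "forward", "normal", 78.0)
print(info.to_payload())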
(Structure of the imaging device)
Next, the configuration of the imaging device 20 will be described. The imaging device 20 is fixed to the mobile device 10 so as not to move, in a state where its imaging direction matches the head direction of the mobile device 10. The imaging device 20 may instead be fixed to the mobile device 10 via a rotation mechanism so that the imaging area can be changed with respect to the mobile device 10. Of course, the imaging device 20 may also be fixed to the mobile device 10 via a stabilizer, a vibration correction mechanism (e.g., a gimbal), a rig, or the like.
As shown in fig. 2, the imaging device 20 includes an imaging unit 21, an elevation/azimuth detection unit 22, a clock 23, a 1st operation unit 24, a 3rd communication unit 25, a 4th communication unit 26, a 2nd recording unit 27, and a 2nd control unit 28.
The imaging unit 21 generates image data by imaging a subject under the control of the 2nd control unit 28. The imaging unit 21 includes an optical system 211 and an image pickup element 212.
The optical system 211 has at least a focus lens, a zoom lens, a shutter, and an aperture. The optical system 211 forms a subject image on the light receiving surface of the image pickup element 212. The optical system 211 changes various imaging parameters such as the zoom magnification, focus position, aperture value, and shutter speed under the control of the 2nd control unit 28.
The image pickup element 212 receives the subject image formed by the optical system 211, performs photoelectric conversion on it, generates image data, and outputs the image data to the 2nd control unit 28. The image pickup element 212 is configured using a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor) sensor, or the like. The image pickup element 212 includes phase difference pixels (not shown) for detecting the distance to a subject and pixels for generating image data, arranged in a two-dimensional matrix. The number of phase difference pixels can be set as appropriate.
The elevation/azimuth detection unit 22 is configured using an azimuth sensor, a 3-axis acceleration sensor, a gyro sensor, and a magnetic azimuth sensor. The elevation/azimuth detection unit 22 detects elevation angle information on the inclination angle (elevation angle) of the imaging device 20 (the optical axis of the optical system 211) with respect to the horizontal plane and azimuth information on the angle (azimuth) of the imaging direction of the imaging device 20 with respect to a predetermined reference azimuth (for example, north), and outputs the detection results to the 2nd control unit 28.
The clock 23, in addition to its timekeeping function, generates date and time information on the date and time of the image pickup performed by the imaging unit 21, and outputs the date and time information to the 2nd control unit 28.
The 1st operation unit 24 is configured using a button, a switch, or the like provided on the outer surface of the imaging device 20 for receiving a user operation of the imaging device 20, and receives the input of instruction signals corresponding to the user's operations.
The 3rd communication unit 25 communicates with the mobile device 10 via the cable 2 under the control of the 2nd control unit 28.
The 4th communication unit 26 performs wireless communication with the operation device 40 under the control of the 2nd control unit 28. The 4th communication unit 26 is configured using a wireless communication module.
The 2nd recording unit 27 is configured using a flash memory, an SDRAM, a memory card, and the like. The 2nd recording unit 27 includes an image data recording unit 271 that records the image data generated by the imaging unit 21, and a program recording unit 272 that records various programs executed by the imaging device 20.
The 2nd control unit 28 is configured using a CPU or the like. The 2nd control unit 28 controls imaging by the imaging unit 21 based on instruction signals input from the operation device 40 via the 4th communication unit 26, and transmits the image data generated by the imaging unit 21 to the operation device 40 or the mobile device 10. In this case, the 2nd control unit 28 may transmit the image data as it is, may transmit image data thinned out temporally or spatially (resized image data), or may transmit compressed image data. The 2nd control unit 28 may also apply processing that improves visibility to the image data (for example, contrast enhancement processing or exposure value reduction processing), and may transmit the result of analyzing the image (the feature amount of the subject obtained by the image feature extraction unit 282, described later).
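Thinning out or compressing the image data before transmission, as mentioned above, could be done along the lines of the following Pillow-based sketch; the library choice, target resolution, and JPEG quality are assumptions made for illustration.

import io
from PIL import Image  # pip install pillow

def prepare_for_transmission(image_path: str, max_width: int = 640, quality: int = 70) -> bytes:
    # Downscale a captured frame and re-encode it as JPEG so that the
    # live-view data sent toward the mobile device or operation device stays small.
    img = Image.open(image_path)
    if img.width > max_width:
        new_height = round(img.height * max_width / img.width)
        img = img.resize((max_width, new_height))
    buf = io.BytesIO()
    img.convert("RGB").save(buf, format="JPEG", quality=quality)
    return buf.getvalue()

# payload = prepare_for_transmission("frame_0001.jpg")  # hypothetical file name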
Here, the detailed configuration of the 2nd control unit 28 will be described. The 2nd control unit 28 includes an image processing unit 281, an image feature extraction unit 282, a focal position acquisition unit 283, a viewing angle acquisition unit 284, a distance detection unit 285, a trimming unit 286, a 2nd communication control unit 287, and an imaging control unit 288.
The image processing unit 281 performs predetermined image processing on the image data generated by the imaging unit 21. Specifically, the image processing unit 281 performs gain amplification processing, white balance processing, gradation processing, synchronization processing, and the like on the image data.
The image feature extraction unit 282 extracts a feature amount included in the image data on which the image processing unit 281 has performed the image processing. Specifically, the image feature extraction unit 282 extracts features of the subject, such as a face, a position or size of the face, a sex, a luminance value, and a feature amount (vector, etc.), from the image data.
The focal position acquisition unit 283 acquires focal position information on the focal position of the optical system 211 of the imaging unit 21. Specifically, the focal position acquiring unit 283 acquires the position of the optical system 211 in focus with the subject as focal position information.
The viewing angle acquisition unit 284 acquires the current angle of view of the optical system 211 of the imaging unit 21. Specifically, the viewing angle acquisition unit 284 acquires the positions of the zoom lenses constituting the optical system 211 and calculates the angle of view from the acquired positions.
The distance detection unit 285 detects the distance from the imaging device 20 to the subject based on the change in contrast between two temporally adjacent pieces of image data generated by the imaging unit 21, on phase difference information from the phase difference pixels provided in the image pickup element 212, and the like. This information is used for focusing of the optical system 211, and, as information on the distance between the mobile device 10 and the subject or other objects, it is also important for the manipulation side that manipulates the mobile device 10. In shooting with the imaging device 20 (camera), various compositions and framings are generally produced by changing the distance to the subject and combining zoom operations, so in cooperative shooting by the mobile device 10 and the imaging device 20 it is important that the operator U1 grasp not only the distance information needed for flight and movement but also the distance on which the photographer U2 is concentrating. Not only the distance information itself but also the change in distance and the movement direction information that can be determined from it can be used effectively. Using such information for manipulation is advantageous, for example, for constraining the movement when other shooting conditions are to be changed while the distance is kept constant. By operating the manipulation device 30 using this information (the information on the distance from the imaging device 20 to the subject), the operator U1 can specify a still control such as hovering, or the 1st control unit 19 of the mobile device 10 can automatically perform the same still control based on this information. Further, this information can be used effectively when shooting in a predetermined order while changing the distance, when shooting from various angles at a fixed distance, and the like, and can be applied to accurate 3D information acquisition shooting and the like.
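To make the above concrete, the sketch below shows one simple contrast measure of the kind that can be compared between two frames, and a "hold position" hint derived from the stability of successive distance values; the threshold and function names are assumptions, not the actual processing of the distance detection unit 285.

import numpy as np

def contrast_measure(gray: np.ndarray) -> float:
    # Simple sharpness/contrast score of one frame (variance of horizontal
    # differences); comparing it across two frames while the focus position
    # changes is one way to rank candidate focus positions.
    return float(np.var(np.diff(gray.astype(np.float32), axis=1)))

def hold_position_hint(distances_m, tolerance_m: float = 0.3) -> bool:
    # Suggest hovering when the subject distance reported over the last few
    # frames is essentially constant, i.e. the photographer is framing a shot.
    d = list(distances_m)
    return len(d) >= 3 and (max(d) - min(d)) <= tolerance_m

print(hold_position_hint([12.1, 12.0, 12.2]))  # True -> suggest staying still
print(hold_position_hint([12.1, 10.4, 8.9]))   # False -> subject distance changing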
The trimming unit 286 performs trimming processing on the image data subjected to image processing by the image processing unit 281 to generate trimmed image data or thumbnail image data.
The 2nd communication control unit 287 controls communication of each of the 3rd communication unit 25 and the 4th communication unit 26. Specifically, the 2nd communication control unit 287 transmits information that can be displayed on the 1st display unit 32 of the manipulation device 30 for manipulating the mobile device 10 (hereinafter referred to as "imaging device information") to the mobile device 10 via the 3rd communication unit 25. When receiving an instruction signal instructing image capture from the operation device 40 via the 4th communication unit 26, the 2nd communication control unit 287 causes the imaging unit 21 to execute imaging and transmits, to the mobile device 10 via the 3rd communication unit 25, reception information indicating that the 4th communication unit 26 has received the instruction signal from the operation device 40. The 2nd communication control unit 287 also transmits the mobile device information input from the mobile device 10 via the 3rd communication unit 25 to the operation device 40 via the 4th communication unit 26.
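The two forwarding roles of the 2nd communication control unit 287 described above can be summarized in the following sketch; the link objects and their send()/recv() methods are placeholders assumed for this illustration.

class CommunicationRelay:
    # Sketch of the forwarding performed by the 2nd communication control
    # unit 287.  cable_link stands for the 3rd communication unit 25 and
    # radio_link for the 4th communication unit 26; both are assumed to
    # expose simple send()/recv() methods.

    def __init__(self, cable_link, radio_link):
        self.cable_link = cable_link   # to/from the mobile device 10
        self.radio_link = radio_link   # to/from the operation device 40

    def forward_camera_info(self, imaging_device_info: dict) -> None:
        # Information displayable on the 1st display unit 32 goes to the
        # mobile device, which relays it to the manipulation device 30.
        self.cable_link.send(imaging_device_info)

    def on_shooting_instruction(self, take_picture) -> None:
        # A release instruction from the operation device triggers imaging
        # and a "reception information" notice toward the mobile device.
        take_picture()
        self.cable_link.send({"type": "reception_info", "shooting": True})

    def forward_mobile_info(self) -> None:
        # Mobile device information arriving over the cable is passed on
        # to the operation device so the photographer can see it.
        self.radio_link.send(self.cable_link.recv())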
The imaging control unit 288 controls imaging by the imaging unit 21. Specifically, when an instruction signal instructing image capture is input from the operation device 40 via the 4th communication unit 26, the imaging control unit 288 causes the imaging unit 21 to perform imaging.
(Structure of the manipulation device)
Next, the structure of the manipulation device 30 will be described. The manipulation device 30 includes a 5th communication unit 31, a 1st display unit 32, a 2nd operation unit 33, a 3rd recording unit 34, and a 3rd control unit 35.
The 5th communication unit 31 is configured using a communication module. The 5th communication unit 31 wirelessly communicates with the 1st communication unit 17 of the mobile device 10 under the control of the 3rd control unit 35, thereby transmitting instruction signals from the manipulation device 30, and receiving status information transmitted from the mobile device 10 and outputting it to the 3rd control unit 35.
The 1st display unit 32 displays various information related to the mobile device 10 under the control of the 3rd control unit 35. The 1st display unit 32 is configured using a display panel such as a liquid crystal display (LCD) or an organic EL (Electro Luminescence) display.
The 2nd operation unit 33 is configured using a button, a switch, a cross key, or the like that receives operations by the operator, receives the input of instruction signals corresponding to the operator's operations, and outputs the instruction signals to the 3rd control unit 35.
The 3rd recording unit 34 records various programs related to the manipulation device 30. The 3rd recording unit 34 is configured using a flash memory or an SDRAM.
The 3rd control unit 35 is configured using a CPU and controls each unit of the manipulation device 30. The 3rd control unit 35 includes a 3rd communication control unit 351 that controls communication between the 5th communication unit 31 and the 1st communication unit 17 of the mobile device 10, and a 1st display control unit 352 that controls display on the 1st display unit 32.
(Structure of the operation device)
Next, the structure of the operation device 40 will be described. The operation device 40 includes a 6th communication unit 41, a 2nd display unit 42, a 3rd operation unit 43, a 4th recording unit 44, and a 4th control unit 45.
The 6th communication unit 41 is configured using a communication module. The 6th communication unit 41 wirelessly communicates with the 4th communication unit 26 of the imaging device 20 under the control of the 4th control unit 45, thereby transmitting instruction signals from the operation device 40, receiving information including image data transmitted from the imaging device 20, and outputting the information to the 4th control unit 45.
The 2nd display unit 42 displays images corresponding to the image data generated by the imaging device 20 and various information on the imaging device 20 under the control of the 4th control unit 45. The 2nd display unit 42 is configured using a display panel such as a liquid crystal display or an organic EL display.
The 3rd operation unit 43 is configured using buttons, switches, a touch panel, and the like for receiving operations by the photographer, receives the input of instruction signals corresponding to the photographer's operations, and outputs the instruction signals to the 4th control unit 45.
The 4th recording unit 44 records various programs related to the operation device 40 and the image data generated by the imaging device 20. The 4th recording unit 44 is configured using a flash memory, an SDRAM, a memory card, or the like.
The 4th control unit 45 is configured using a CPU and controls each unit of the operation device 40. The 4th control unit 45 includes a 4th communication control unit 451 that controls communication between the 6th communication unit 41 and the 4th communication unit 26 of the imaging device 20, and a 2nd display control unit 452 that controls display on the 2nd display unit 42.
(Operation of the imaging system)
Next, the operation of the imaging system 1 will be described. As the operation processing of the imaging system 1, the operation of the imaging device 20 will be described first, followed by the operation of the mobile device 10.
(operation of image pickup device)
Fig. 3 is a flowchart illustrating the operation of the imaging device 20. As shown in fig. 3, first, the 2nd control unit 28 determines whether communication with the mobile device 10 via the 3rd communication unit 25 and the cable 2 is possible (that is, whether the mobile device 10 and the imaging device 20 are connected via the cable 2) and whether the state of communication with the operation device 40 via the 4th communication unit 26 is good (step S101). Specifically, when it is determined that a communication connection with the mobile device 10 is not possible (for example, when it is determined that the cable 2 is not connected), or when it is determined that the state of communication with the operation device 40 is not good, the 2nd control unit 28 notifies that the communication state is abnormal by means of an LED (Light Emitting Diode) (not shown) or a speaker (not shown) provided on the outer surface of the imaging device 20 (or of the mobile device 10). In this case, the operation device 40 may cause the 2nd display unit 42 to display that the communication state is abnormal.
Next, the 2nd control unit 28 determines whether a communication mode is set in which communication is performed in cooperation with the mobile device 10 via the 3rd communication unit 25 and the 4th communication unit 26 in accordance with instruction signals from the operation device 40 (step S102). When the 2nd control unit 28 determines that the communication mode is set (yes in step S102), the imaging device 20 proceeds to step S103 described later. On the other hand, when the 2nd control unit 28 determines that the communication mode is not set (no in step S102), the imaging device 20 proceeds to step S119 described later.
In step S103, the 2nd control unit 28 determines whether communication with the mobile device 10 is possible via the 3rd communication unit 25. When the 2nd control unit 28 determines that communication with the mobile device 10 is possible via the 3rd communication unit 25 (yes in step S103), the imaging device 20 proceeds to step S104 described later. On the other hand, when the 2nd control unit 28 determines that communication with the mobile device 10 via the 3rd communication unit 25 is not possible (no in step S103), the imaging device 20 proceeds to step S115 described below.
In step S104, the imaging control unit 288 causes the imaging unit 21 to perform imaging. Then, the imaging control unit 288 executes imaging preparation processing in which the imaging unit 21 performs AF processing and AE processing (step S105). In this case, the distance detection unit 285 detects the distance from the imaging device 20 to the subject based on the image data generated by the imaging unit 21. Of course, the distance detection unit 285 may detect the distance from the imaging device 20 to the subject based on temporally continuous image data, or based on the result of the AF processing and the image data. The trimming unit 286 further performs trimming processing on the image data generated by the imaging unit 21 to generate trimmed image data. The distance detection unit 285 may also detect the distance from the imaging device 20 to the subject by using other known techniques.
Next, the 2nd communication control unit 287 transmits the imaging device information to the mobile device 10 via the 3rd communication unit 25 (step S106). Here, the imaging device information includes the distance information on the distance to the subject detected by the distance detection unit 285, the image data generated by the imaging unit 21, the trimmed image data generated by the trimming unit 286, and the content of the instruction signal input from the operation device 40 via the 4th communication unit 26 (for example, a release signal instructing shooting, a 1st release signal instructing the shooting preparation operations of AF processing and AE processing, or an instruction signal changing the shooting parameters of the imaging unit 21). Thus, as shown in fig. 4, the distance T2 to the subject, the image W1 corresponding to the image data captured by the imaging device 20, the travel information A1 relating to the travel direction of the mobile device 10, the remaining battery level A2 of the mobile device 10, and the position information G1 relating to the current position of the mobile device 10 on the map M1 can be displayed on the 1st display unit 32 of the manipulation device 30. As a result, the operator can intuitively grasp the state in which the photographer is currently shooting the subject. Further, since the operator can grasp information on the subject desired by the photographer, the operation content of the mobile device 10 can be changed in real time. Further, since the operator can grasp the photographer's shooting situation in real time, the operator can concentrate on the operation of the mobile device 10. Since the desired information differs depending on the user, the situation, and the subject, it is preferable that the displayed information can be changed; the information may be selected from a database or the like recorded in any of the recording units, or may be set by a program included in the recording unit. Summarizing the information before displaying it is also a preferable application; the summarizing function may be provided in any of the devices, but summarizing before communication makes it possible to reduce the amount of data and increase the speed.
Then, the 2nd control unit 28 acquires the mobile device information from the mobile device 10 via the 3rd communication unit 25 (step S107). Here, the mobile device information includes at least the current position information of the mobile device 10, its traveling direction, rotation information, height, operation status (such as a stationary state or a return state), and the remaining amount of the power source 12.
Next, the 2nd control unit 28 transmits the image data generated by the imaging unit 21 and the mobile device information input from the mobile device 10 to the operation device 40 via the 4th communication unit 26 (step S108). Thus, as shown in fig. 5, the 2nd display control unit 452 can display, on the 2nd display unit 42 of the operation device 40, the image W1 corresponding to the image data, the travel information A1 relating to the travel direction of the mobile device 10, the remaining battery level A2 of the mobile device 10, the position information G1 relating to the current position of the mobile device 10 on the map M1, and the height T1. As a result, the photographer can intuitively grasp how the operator is currently operating the mobile device 10. Further, since the photographer can predict the position of a desired subject from the traveling direction of the mobile device 10 and shoot in real time, the photographer can capture the subject with the desired composition. The photographer can also grasp the position of the mobile device 10 in real time. Although only the traveling direction of the mobile device 10 is used here as the mobile device information, the 2nd display control unit 452 may also cause the 2nd display unit 42 of the operation device 40 to display the state of the mobile device 10, such as altitude increase (or decrease), turning (turning direction), moving speed, and the movement history (movement log) of the mobile device 10. This enables the photographer to grasp in more detail how the operator is handling the mobile device 10, and to shoot in a coordinated manner. Further, when the photographer, observing the image, feels that the speed is too high or that staying still would be more suitable for shooting and wants to notify the operator, the desired assistance can be displayed or output as sound. For example, the photographer may request that the mobile device 10 be set to a specific speed, a specific height, or a specific position. In this case, such information about the mobile device 10 is displayed, and the operator can make the setting while observing it, so that specific and accurate settings can be made according to the situation.
When the photographing operation signal is received from the operation device 40 via the 4th communication unit 26 (yes in step S109) and the mobile device 10 is returning (yes in step S110), the imaging device 20 proceeds to step S116 described later. On the other hand, when the photographing operation signal is received from the operation device 40 via the 4th communication unit 26 (yes in step S109) and the mobile device 10 is not returning (no in step S110), the imaging device 20 proceeds to step S111 described later.
In step S111, the 2nd communication control unit 287 transmits information indicating the presence of the shooting instruction to the mobile device 10 via the 3rd communication unit 25.
Then, the imaging control unit 288 causes the imaging unit 21 to perform imaging (step S112), and the image processing unit 281 performs image processing on the image data generated by the imaging unit 21 and records the image data in the image data recording unit 271 (step S113). In this case, the imaging control unit 288 records the image data in the image data recording unit 271 in association with the mobile device information. This makes it possible, at the time of reproduction, to grasp the position at which the image data was shot. Of course, in addition to the image data and the mobile device information, the movement history (movement log) of the mobile device 10 may also be recorded in the image data recording unit 271 in association with them.
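One simple way to realize the association described in step S113 is a metadata sidecar stored next to each recorded frame; the file layout below is an assumption used only for illustration.

import json
from pathlib import Path

def record_with_telemetry(image_bytes: bytes, mobile_info: dict,
                          out_dir: str, frame_id: int) -> None:
    # Store the captured image together with the mobile device information
    # (and, optionally, a movement log) so that the shooting position can be
    # recalled at reproduction time.
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    (out / f"IMG_{frame_id:04d}.jpg").write_bytes(image_bytes)
    (out / f"IMG_{frame_id:04d}.json").write_text(
        json.dumps(mobile_info, indent=2), encoding="utf-8")

# record_with_telemetry(jpeg_bytes, {"lat": 35.68, "lon": 139.76, "height_m": 42.0,
#                                    "state": "normal"}, "captures", 1)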
Next, the 2nd communication control unit 287 transmits the image data generated by the imaging unit 21 to the mobile device 10 via the 3rd communication unit 25 (step S114).
Then, the 2nd control unit 28 determines whether the mobile device 10 is stopped, based on the information input from the mobile device 10 via the 3rd communication unit 25 (step S115). When the 2nd control unit 28 determines that the mobile device 10 is stopped (yes in step S115), the imaging device 20 ends this operation. On the other hand, when the 2nd control unit 28 determines that the mobile device 10 is not stopped (no in step S115), the imaging device 20 returns to step S101.
In step S116, the imaging control unit 288 stops imaging by the imaging unit 21. After step S116, the image pickup device 20 proceeds to step S115.
In step S109, if the photographing operation signal is not received from the operation device 40 via the 4th communication unit 26 (no in step S109), the imaging device 20 proceeds to step S117.
Next, when a parameter change signal for changing the imaging parameters of the imaging unit 21 is received from the operation device 40 via the 4th communication unit 26 (yes in step S117), the imaging control unit 288 changes the imaging parameters of the imaging unit 21, for example the zoom magnification and the aperture value, so that they correspond to the parameter change signal (step S118). After step S118, the imaging device 20 proceeds to step S115.
In step S117, if the parameter change signal for changing the imaging parameters of the imaging unit 21 is not received from the operation device 40 via the 4th communication unit 26 (no in step S117), the imaging device 20 proceeds to step S115.
In step S119, the imaging control unit 288 executes normal imaging processing in which the imaging unit 21 performs imaging and the result is recorded in the image data recording unit 271, in accordance with the operation of the 1st operation unit 24. After step S119, the imaging device 20 ends this operation.
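Read as pseudocode-like Python, the flow of fig. 3 is roughly the loop below; cam, mobile_link, operator_link, and ui are hypothetical stand-ins for the imaging unit 21, the 3rd and 4th communication units, and the 1st operation unit, and the step numbers appear only as comments.

def imaging_device_loop(cam, mobile_link, operator_link, ui):
    # Condensed sketch of steps S101-S119 of fig. 3 (illustrative only).
    while True:
        if not (mobile_link.connected() and operator_link.healthy()):   # S101
            ui.warn("communication abnormal")
        if not ui.cooperative_mode():                                   # S102
            cam.normal_shooting(ui)                                     # S119
            return
        if mobile_link.connected():                                     # S103
            frame = cam.live_view()                                     # S104, S105
            mobile_link.send(cam.imaging_device_info(frame))            # S106
            mobile_info = mobile_link.recv()                            # S107
            operator_link.send({"image": frame, "mobile": mobile_info}) # S108
            if operator_link.shooting_requested():                      # S109
                if mobile_info.get("state") == "returning":             # S110
                    cam.stop_shooting()                                 # S116
                else:
                    mobile_link.send({"type": "reception_info"})        # S111
                    cam.capture_and_record(mobile_info)                 # S112-S114
            elif operator_link.parameter_change():                      # S117
                cam.apply_parameters(operator_link.new_parameters())    # S118
        if mobile_link.mobile_stopped():                                # S115
            return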
(operation of the Mobile device)
Next, the operation of the mobile device 10 will be described. Fig. 6 is a flowchart showing the operation of the mobile device 10. As shown in fig. 6, first, the 1st control unit 19 determines the communication state of the mobile device 10 (step S201). Specifically, the 1st control unit 19 determines whether communication with the imaging device 20 via the 2nd communication unit 18 and the cable is possible and whether the state of communication with the manipulation device 30 via the 1st communication unit 17 is good. In this case, when it is determined that a communication connection with the imaging device 20 is not possible, or when it is determined that the state of communication with the manipulation device 30 is not good, the 1st control unit 19 notifies that the communication state is abnormal by means of an LED (not shown) provided on the outer surface of the mobile device 10.
Next, the 1st control unit 19 acquires the imaging device information from the imaging device 20 via the 2nd communication unit 18 (step S202).
When a manipulation signal is received from the manipulation device 30 via the 1st communication unit 17 (yes in step S203), the mobile device 10 proceeds to step S204 described later. In contrast, when no manipulation signal is received from the manipulation device 30 via the 1st communication unit 17 (no in step S203), the mobile device 10 proceeds to step S207 described later.
In step S204, if the reception information is present in the imaging device information acquired from the imaging device 20 (yes in step S204), the mobile device 10 proceeds to step S207 described later. On the other hand, if no reception information is present in the imaging device information acquired from the imaging device 20 (no in step S204), the mobile device 10 proceeds to step S205 described later.
If the 1st control unit 19 determines in step S205 that an obstacle is present in the traveling direction of the mobile device 10 (yes in step S205), the mobile device 10 proceeds to step S207 described later. On the other hand, if the 1st control unit 19 determines that there is no obstacle in the traveling direction of the mobile device 10 (no in step S205), the mobile device 10 proceeds to step S206 described later.
In step S206, the 1st control unit 19 moves the mobile device 10 in accordance with the manipulation signal input from the manipulation device 30. After step S206, the mobile device 10 proceeds to step S212 described later.
In step S207, the 1st control unit 19 determines whether the mobile device 10 can be stationary (can hover in the air). When the 1st control unit 19 determines that the mobile device 10 can be stationary (yes in step S207), the propulsion control unit 194 executes a stationary control process for keeping the mobile device 10 still (step S208). Specifically, the 1st control unit 19 makes the mobile device 10 stand still at its current position. After step S208, the mobile device 10 proceeds to step S212 described later.
If the 1st control unit 19 determines in step S207 that the mobile device 10 cannot be stationary (no in step S207), the mobile device 10 proceeds to step S209 described later.
Next, the 1st control unit 19 determines whether to return the mobile device 10 to the return location (step S209). Specifically, the 1st control unit 19 determines whether the mobile device 10 can return to the return location based on the remaining amount of the power source 12 and the current position information of the mobile device 10 detected by the position and orientation detection unit 14. Here, the return location is the takeoff start position of the mobile device 10 (the position at the time of the start of takeoff). When the 1st control unit 19 determines that the mobile device 10 can return to the return location (yes in step S209), the propulsion control unit 194 executes a return control process for returning the mobile device 10 to the return location (step S210). After step S210, the mobile device 10 proceeds to step S212 described later.
If the 1st control unit 19 determines in step S209 that the mobile device 10 cannot return to the return location (no in step S209), the mobile device 10 proceeds to step S211 described later.
Next, the propulsion control unit 194 executes avoidance control processing for moving the mobile device 10 to the avoidance space (step S211). Specifically, the propulsion control unit 194 calculates a movable distance from the remaining amount of the power source 12, and moves the mobile device 10 to the nearest safe avoidance space (a flat space without an obstacle) based on the calculated distance and the current position of the mobile device 10 detected by the position and orientation detection unit 14. After step S211, the mobile device 10 proceeds to step S212 described later.
In step S212, the 1 st communication control unit 197 transmits the mobile device information about the mobile device 10 to the manipulation device 30 via the 1 st communication unit 17.
Then, when the mobile device 10 is communicating with the imaging device 20 via the 2 nd communication unit 18 (yes in step S213), the 1 st communication control unit 197 acquires image data from the imaging device 20 via the 2 nd communication unit 18 (step S214).
Next, the 1 st communication control unit 197 transmits the necessary information to the manipulation device 30 via the 1 st communication unit 17 (step S215).
When the end signal is input via the manipulation device 30 (yes in step S216), the propulsion control unit 194 executes a stop control process for stopping the rotor 10a of the mobile device 10 (step S217). After step S217, the mobile device 10 ends the present process. On the other hand, when the end signal is not input via the manipulation device 30 (no in step S216), the mobile device 10 returns to step S201.
In step S213, if the mobile device 10 is not communicating with the imaging device 20 via the 2 nd communication unit 18 (no in step S213), the mobile device 10 proceeds to step S216.
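In steps S212 to S215 above, the mobile device 10 bundles its own information with whatever it has obtained from the imaging device 20 and sends the result to the manipulation device 30. The sketch below shows one conceivable framing of that downlink payload; the JSON-plus-binary format and the field names are assumptions of this example and are not specified in the disclosure.

```python
import json
from typing import Optional


def build_downlink_packet(mobile_info: dict,
                          camera_info: Optional[dict],
                          image_data: Optional[bytes]) -> bytes:
    """Bundle the mobile device information (step S212) and, when the link to
    the imaging device 20 was up (steps S213-S214), the imaging device
    information and the latest image, into one payload for the manipulation
    device 30 (step S215)."""
    header = {
        "mobile_device": mobile_info,
        "imaging_device": camera_info,
        "image_length": len(image_data) if image_data else 0,
    }
    head = json.dumps(header).encode("utf-8")
    return len(head).to_bytes(4, "big") + head + (image_data or b"")


if __name__ == "__main__":
    packet = build_downlink_packet(
        mobile_info={"lat": 35.68, "lon": 139.76, "alt_m": 42.0, "heading_deg": 110},
        camera_info={"subject_distance_m": 12.5, "instruction_received": True},
        image_data=b"\xff\xd8" + b"\x00" * 64)   # stand-in for JPEG image data
    print(len(packet), "bytes")
```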
According to the above-described embodiment of the present invention, even when the operator U1 of the mobile device 10 and the photographer U2 of the imaging device 20 perform their operations separately, the two can cooperate with each other effectively.
Further, according to one embodiment of the present invention, the 2 nd control unit 28 transmits the mobile device information input from the mobile device 10 to the operation device 40, and the 2 nd display unit 42 of the operation device 40 can display an image corresponding to the image data together with the information related to the mobile device 10, so that the photographer can intuitively grasp the manipulation performed by the operator.
Further, according to one embodiment of the present invention, the 2 nd control unit 28 transmits the imaging device information to the mobile device 10, the mobile device 10 transmits the imaging device information to the manipulation device 30, and the 1 st display unit 32 of the manipulation device 30 displays the information on the state of the mobile device 10 and the imaging device information of the imaging device 20, so that the operator can grasp the subject and the imaging conditions desired by the photographer.
Further, according to one embodiment of the present invention, since the distance information on the distance from the imaging device 20 to the subject is included in the imaging device information and the distance from the imaging device 20 to the subject is displayed on the 1 st display unit 32 of the manipulation device 30, the operator can intuitively grasp the distance to the subject and take it into account when manipulating the mobile device 10.
Further, according to one embodiment of the present invention, since the imaging device information further includes a reception signal indicating that the imaging instruction signal has been received, and the 1 st control unit 19 makes the mobile device 10 stationary when the mobile device 10 receives the reception signal from the imaging device 20, the photographer can image the subject with the desired composition and the shake of the mobile device 10 can be reduced.
Further, according to one embodiment of the present invention, when the instruction signal instructing imaging is input from the operation device 40 while the mobile device 10 is returning, the 2 nd control unit 28 suspends imaging by the imaging unit 21 and gives priority to the return of the mobile device 10, so that power consumption of the mobile device 10 caused by imaging can be prevented.
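The behaviour described here for the 2 nd control unit 28 can be summarized in a few lines: when an imaging instruction arrives from the operation device 40, imaging is skipped while the mobile device 10 is returning, and otherwise the image is captured and reception information is reported back. The state strings and the capture callback in this sketch are assumptions made for illustration.

```python
from typing import Callable, Optional


def handle_instruction_signal(mobile_state: str,
                              capture: Callable[[], bytes]) -> Optional[dict]:
    """Illustrative handling of the imaging instruction signal: suspend imaging
    while the mobile device is returning (the return takes priority), otherwise
    trigger image pickup and return reception information."""
    if mobile_state == "returning":
        return None                       # imaging suspended to save power for the return
    image = capture()                     # image pickup by the imaging unit
    return {"reception_info": True, "image_length": len(image)}


if __name__ == "__main__":
    fake_capture = lambda: b"\xff\xd8" + b"\x00" * 1024   # stand-in JPEG payload
    print(handle_instruction_signal("hovering", fake_capture))    # capture + reception info
    print(handle_instruction_signal("returning", fake_capture))   # None: imaging suspended
```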
Further, according to one embodiment of the present invention, the 2 nd control unit 28 transmits the mobile device information about the position and the traveling direction of the mobile device 10 input from the mobile device 10 to the operation device 40, so that the current position and the traveling direction of the mobile device 10 are displayed on the 2 nd display unit 42 of the operation device 40 and the photographer can intuitively grasp how the operator is manipulating the mobile device 10.
(other embodiments)
In one embodiment of the present invention, the mobile device and the imaging device are each provided with a communication unit capable of communicating with the outside. However, only one of the mobile device and the imaging device may be provided with such a communication unit, and the exchanges with the manipulation device and the operation device may be performed via that communication unit. For example, as in the imaging system 1a shown in fig. 7, a communication unit capable of communicating with the outside may be provided only in the mobile device 10, and the data of the imaging device 20 may be transmitted to the manipulation device 30 through the communication unit provided in the mobile device 10.
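In that variation, the imaging device 20 has no external link of its own, so its data reach the manipulation device 30 only by being forwarded over the communication unit of the mobile device 10. A minimal relay sketch, with a hypothetical framing and a stand-in send function:

```python
from typing import Callable


def relay_to_manipulation_device(camera_payload: bytes,
                                 send_external: Callable[[bytes], None]) -> None:
    """Forward data received from the imaging device 20 over the mobile
    device 10's external communication unit, tagging it so the manipulation
    device 30 can tell relayed camera data from the mobile device's own."""
    framed = b"CAM0" + len(camera_payload).to_bytes(4, "big") + camera_payload
    send_external(framed)


if __name__ == "__main__":
    relay_to_manipulation_device(
        b"imaging-device telemetry",
        send_external=lambda data: print("sent", len(data), "bytes"))
```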
The imaging system of the present invention can be applied not only to unmanned aerial vehicles and digital still cameras but also to electronic devices such as digital video cameras, surveillance cameras, and tablet-type portable devices having an imaging function, and to display devices that display images corresponding to image data captured by endoscopes and microscopes in the medical and industrial fields. The present invention can also be applied to an apparatus that is moved and used for imaging by two users. For example, the present invention can be applied to a teleoperation apparatus, to a medical apparatus in which one operator excises an affected part of a patient with a knife while observing an image and another operator images the inside of the patient for the first operator, and to an endoscope system including an endoscope and an observation scope.
The program executed by the imaging apparatus according to the present invention is provided by being recorded as file data in an installable or executable format on a computer-readable recording medium such as a CD-ROM, a Flexible Disk (FD), a CD-R, a DVD (Digital Versatile Disc), a USB medium, or a flash memory.
The program executed by the imaging apparatus according to the present invention may also be stored on a computer connected to a network such as the Internet and provided by being downloaded via the network. Further, the program executed by the imaging apparatus of the present invention may be provided or distributed via a network such as the Internet.
In the description of the flowcharts in the present specification, the order of the processing between steps is indicated using expressions such as "first" and "next", but the order of processing required to implement the present invention is not uniquely determined by these expressions. That is, the order of processing in the flowcharts described in this specification can be changed without departing from the scope of the invention.
Although several embodiments of the present invention have been described in detail with reference to the accompanying drawings, these embodiments are illustrative, and the present invention can be implemented in other forms to which various modifications and improvements have been applied based on the knowledge of those skilled in the art, including the aspects described in the summary of the invention.
The "section (module, unit)" may be rewritten as a "unit" or a "circuit". For example, the control unit may be rewritten as a control unit or a control circuit.
As described above, the present invention may include various embodiments not described herein, and various design changes and the like can be made within the scope of the technical idea defined by the claims.

Claims (21)

1. An image pickup apparatus that is mounted on a mobile apparatus and is capable of bidirectional communication with the mobile apparatus, the image pickup apparatus wirelessly communicating with an operation apparatus and being controlled by the operation apparatus, the image pickup apparatus comprising:
an image pickup unit that picks up an image of an object to generate image data; and
a control unit that performs control so that information that can be displayed on a display unit of a manipulation device that manipulates the mobile device is transmitted to the mobile device.
2. The image pickup apparatus according to claim 1,
the information includes at least the image data generated by the image pickup unit.
3. The image pickup apparatus according to claim 2,
the image pickup apparatus further has a distance detection section that detects a distance from the image pickup apparatus to the object based on the image data,
the information further includes distance information relating to the distance detected by the distance detecting unit.
4. The image pickup apparatus according to any one of claims 1 to 3,
the image pickup apparatus further has a communication section that receives an instruction signal instructing the image pickup apparatus to perform image pickup from the operation apparatus operating the image pickup apparatus and transmits the image data to the operation apparatus,
the control unit causes the image pickup unit to perform image pickup when the communication unit receives the instruction signal,
the information further includes reception information indicating that the communication unit has received the instruction signal.
5. The image pickup apparatus according to claim 4,
the control unit, when receiving from the mobile device a control signal for controlling the movement of the mobile device in accordance with the operation of the manipulation device, transmits control information related to the control signal to the operation apparatus via the communication unit.
6. The image pickup apparatus according to claim 4,
the control unit transmits the position information to the operation apparatus via the communication unit when position information on the position of the mobile device is input from the mobile device.
7. The image pickup apparatus according to claim 5,
the control unit transmits the position information to the operation apparatus via the communication unit when position information on the position of the mobile device is input from the mobile device.
8. The image pickup apparatus according to claim 4,
the control unit does not cause the image pickup unit to perform image pickup even when the instruction signal is received while the mobile device is returning to a predetermined position.
9. The image pickup apparatus according to claim 5,
the control unit does not cause the image pickup unit to perform image pickup even when the instruction signal is received while the mobile device is returning to a predetermined position.
10. The image pickup apparatus according to claim 6,
the control unit does not cause the image pickup unit to perform image pickup even when the instruction signal is received while the mobile device is returning to a predetermined position.
11. The image pickup apparatus according to claim 7,
the control unit does not cause the image pickup unit to perform image pickup even when the instruction signal is received while the mobile device is returning to a predetermined position.
12. A mobile device on which an image pickup device that images a subject to generate image data is mounted, the mobile device being capable of bidirectional communication with the image pickup device and moving in accordance with a signal transmitted from a manipulation device having a display portion, wherein
the mobile device has a communication unit that receives the signal transmitted from the manipulation device and transmits information input from the image pickup device to the manipulation device, the image pickup device wirelessly communicating with an operation device and being controlled by the operation device.
13. The mobile device according to claim 12,
the information includes at least the image data.
14. The mobile device according to claim 12 or 13,
the information further includes distance information relating to a distance from the image pickup device to the subject.
15. The mobile device according to claim 12 or 13,
the mobile device further has:
a position detection unit that detects position information relating to a current position of the mobile device; and
a control unit that performs control so that the position information detected by the position detection unit is transmitted to the image pickup device.
16. The mobile device of claim 15,
the information further includes reception information indicating that the image pickup device has received an instruction signal instructing the image pickup device to perform image pickup,
the control unit makes the mobile device stationary when the communication unit receives the reception information while receiving the signal.
17. The mobile device of claim 15,
the control unit maintains the return of the mobile device even when the information is received while the mobile device is being returned to a predetermined position.
18. The mobile device of claim 16,
the control unit maintains the return of the mobile device even when the information is received while the mobile device is being returned to a predetermined position.
19. An image pickup system comprising an image pickup device that images a subject to generate image data, a mobile device on which the image pickup device is mounted and which is capable of bidirectional communication with the image pickup device, a manipulation device that manipulates the mobile device, and an operation device that wirelessly communicates with the image pickup device and operates the image pickup device, wherein
the image pickup device has a control unit that performs control so that information that can be displayed on a display unit of the manipulation device is transmitted to the mobile device, and
the mobile device has a communication section that receives a signal transmitted from the manipulation device and transmits the information input from the image pickup device to the manipulation device.
20. An image pickup method executed by an image pickup apparatus that is mounted on a mobile apparatus and that is capable of bidirectional communication with the mobile apparatus, the image pickup apparatus wirelessly communicating with an operation apparatus and being controlled by the operation apparatus, the image pickup method comprising:
an image pickup step of picking up an image of a subject to generate image data; and
a control step of transmitting information that can be displayed on a display portion of a manipulation device that manipulates the mobile device to the mobile device.
21. A recording medium recording a program executed by an image pickup apparatus that is mounted on a mobile apparatus and is capable of bidirectional communication with the mobile apparatus, wherein the image pickup apparatus wirelessly communicates with an operation apparatus and is controlled by the operation apparatus, the program causing the image pickup apparatus to execute the steps of:
an image pickup step of picking up an image of a subject to generate image data; and
a control step of transmitting information that can be displayed on a display portion of a manipulation device that manipulates the mobile device to the mobile device.
CN201611095306.5A 2016-03-18 2016-12-01 Image pickup apparatus, mobile apparatus, image pickup system, image pickup method, and recording medium Active CN107205111B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-055313 2016-03-18
JP2016055313A JP2017169170A (en) 2016-03-18 2016-03-18 Imaging apparatus, moving apparatus, imaging system, imaging method, and program

Publications (2)

Publication Number Publication Date
CN107205111A CN107205111A (en) 2017-09-26
CN107205111B true CN107205111B (en) 2021-03-02

Family

ID=59904466

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611095306.5A Active CN107205111B (en) 2016-03-18 2016-12-01 Image pickup apparatus, mobile apparatus, image pickup system, image pickup method, and recording medium

Country Status (2)

Country Link
JP (1) JP2017169170A (en)
CN (1) CN107205111B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6570199B2 (en) * 2017-12-08 2019-09-04 浩幸 上野 Virtual pilot system for unmanned aerial vehicles
JP6473266B1 (en) * 2018-11-12 2019-02-20 弘幸 中西 Capsule endoscope
JP7320939B2 (en) * 2018-11-30 2023-08-04 ニデックプレシジョン株式会社 Blade operating device and blade operating method
JP7219204B2 (en) * 2019-11-26 2023-02-07 弘幸 中西 unmanned aerial vehicle
JP7420047B2 (en) 2020-10-21 2024-01-23 トヨタ自動車株式会社 robot system
CN113395444A (en) * 2021-04-25 2021-09-14 戴井之 Household intelligent monitoring anti-theft system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201479306U (en) * 2009-07-16 2010-05-19 姚臣 Remote wireless remote control network video camera
CN202494922U (en) * 2012-03-08 2012-10-17 陶重犇 Mobile robot platform controlled by Android operating system
CN204304922U (en) * 2014-11-20 2015-04-29 中国建材检验认证集团股份有限公司 A kind of photovoltaic module hot spot inspection device based on unmanned plane

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102514719A (en) * 2011-12-29 2012-06-27 广州市阳阳谷信息科技有限公司 Wireless remote control flying surveillance camera shooting command device
CN103426282A (en) * 2013-07-31 2013-12-04 深圳市大疆创新科技有限公司 Remote control method and terminal
US20160042637A1 (en) * 2014-08-11 2016-02-11 Clandestine Development, Llc Drone Safety Alert Monitoring System and Method
CN204650277U (en) * 2015-05-26 2015-09-16 中国安全生产科学研究院 On-the-spot mobile robot environmental information being carried out to mobile monitoring of harmful influence leakage accident

Also Published As

Publication number Publication date
CN107205111A (en) 2017-09-26
JP2017169170A (en) 2017-09-21

Similar Documents

Publication Publication Date Title
CN107205111B (en) Image pickup apparatus, mobile apparatus, image pickup system, image pickup method, and recording medium
US10979615B2 (en) System and method for providing autonomous photography and videography
US11797009B2 (en) Unmanned aerial image capture platform
WO2018103689A1 (en) Relative azimuth control method and apparatus for unmanned aerial vehicle
US11531340B2 (en) Flying body, living body detection system, living body detection method, program and recording medium
US10377487B2 (en) Display device and display control method
US20200304719A1 (en) Control device, system, control method, and program
WO2019230604A1 (en) Inspection system
KR20170086482A (en) Control device and control method of flying bot
JP7391053B2 (en) Information processing device, information processing method and program
JP6482855B2 (en) Monitoring system
JP6831949B2 (en) Display control system, display control device and display control method
JP6329219B2 (en) Operation terminal and moving body
WO2021014752A1 (en) Information processing device, information processing method, and information processing program
JP6821864B2 (en) Display control system, display control device and display control method
WO2020150974A1 (en) Photographing control method, mobile platform and storage medium
JP6803960B1 (en) Image processing equipment, image processing methods, programs, and recording media
US11354897B2 (en) Output control apparatus for estimating recognition level for a plurality of taget objects, display control system, and output control method for operating output control apparatus
WO2023137758A1 (en) Control method for movable platform, head-mounted device, system, and storage medium
KR20170019777A (en) Apparatus and method for controling capturing operation of flying bot
US20220182528A1 (en) Imaging device, imaging signal processing device, and imaging signal processing method
CN114600024A (en) Device, imaging system, and moving object
JP2022040134A (en) Estimation system and automobile
CN116745722A (en) Unmanned aerial vehicle control method and device, unmanned aerial vehicle and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant