WO2017169089A1 - Camera control device - Google Patents
- Publication number
- WO2017169089A1 (PCT/JP2017/003776)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- imaging
- altitude
- information
- actuator
- Prior art date
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/38—Releasing-devices separate from shutter
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/56—Accessories
- G03B17/561—Support related camera accessories
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B2206/00—Systems for exchange of information between different pieces of apparatus, e.g. for exchanging trimming information, for photo finishing
Definitions
- the present disclosure relates to a camera control device that controls a camera that captures an external image in a transportation device such as an aircraft or a train.
- Patent Document 1 discloses a configuration in which zoom amounts and inter-camera angles of a plurality of cameras are adjusted in accordance with a viewing angle designated by an operator's operation or an input from an external device. With this configuration, a wide and detailed inspection can be performed without causing overlapping or omission of display.
- the present disclosure provides a camera control device capable of imaging at an appropriate angle of view.
- the camera control device includes an interface and a controller.
- the interface receives first image data generated by imaging by the first camera, second image data generated by imaging by the second camera, and altitude information about the altitude output from the altitude sensor.
- drive signals are transmitted to the first actuator that can change the imaging direction of the first camera and to the second actuator that can change the imaging direction of the second camera.
- the controller outputs, via the interface, drive signals that drive the first actuator and the second actuator so that the lower the altitude indicated by the altitude information, the narrower the composite imaging area (the range that at least one of the first camera and the second camera can image) becomes, thereby controlling the imaging directions of the first camera and the second camera.
- the camera control device includes an interface, a geographic information database, and a controller.
- the interface receives first image data generated by imaging by the first camera, second image data generated by imaging by the second camera, position information on the current position output by the positioning sensor, and azimuth information output by the compass, and transmits drive signals to the first actuator that can change the imaging direction of the first camera and the second actuator that can change the imaging direction of the second camera.
- the geographic information database holds landmark information regarding the position of the landmark.
- the controller specifies a landmark located within a predetermined range of the current position based on the position information, the azimuth information, and the landmark information acquired from the geographic information database.
- the controller then outputs, via the interface, drive signals that drive the first actuator and the second actuator so that the position of the specified landmark is included in at least one of the imaging area of the first camera and the imaging area of the second camera, thereby controlling the imaging directions of the first camera and the second camera.
- the camera control device is effective for performing imaging at an appropriate angle of view.
- the figure showing the data content of the geographic information database in Embodiment 1; the figure showing a specific example of the orientations of the first camera and the second camera when the aircraft altitude in Embodiment 1 is low
- the flowchart explaining the processing for controlling the orientation of each camera in Embodiment 2; the figure showing the relationship between the current position in Embodiment 2, a landmark position, and the horizon position
- FIG. 1 is a diagram illustrating a configuration of an in-flight system 10 according to the first embodiment.
- the in-flight system 10 of the present embodiment is provided in an aircraft.
- the in-flight system 10 captures the scenery outside the aircraft while the aircraft is flying its route and acquires image data.
- the in-flight system 10 changes the direction of the camera based on the altitude information of the aircraft.
- the in-flight system 10 includes a server device 100, a monitor 200, a GPS module 300, a first camera 400a, a second camera 400b, and a compass 500.
- the server device 100 is connected to the monitor 200 and transmits image data to the monitor 200.
- the monitor 200 is installed in an aircraft cabin.
- the monitor 200 can display a video based on the image data received from the server device 100.
- the GPS module 300 acquires longitude / latitude information indicating the current position of the aircraft and altitude information indicating the current altitude of the aircraft and transmits them to the server apparatus 100.
- the first camera 400a generates image data by performing an imaging operation and outputs the image data to the server device 100.
- the first camera 400a includes a first actuator 401a.
- the first actuator 401a changes the imaging direction of the first camera 400a based on the data received from the server device 100.
- the server device 100 controls the first actuator 401a, so that the first camera 400a can pan (rotate in the yaw direction) and tilt (rotate in the pitch direction).
- the second camera 400b generates image data by performing an imaging operation and outputs the image data to the server device 100.
- the second camera 400b includes a second actuator 401b.
- the second actuator 401b changes the imaging direction of the second camera 400b based on the data received from the server device 100.
- when the server device 100 controls the second actuator 401b, the second camera 400b can be panned (rotated in the yaw direction) and tilted (rotated in the pitch direction).
- the compass 500 acquires azimuth information indicating the current azimuth of the aircraft and transmits it to the server apparatus 100.
- the direction information is information indicating the direction in which the aircraft is facing.
- FIG. 2 is a diagram illustrating a configuration of the server apparatus 100.
- the server apparatus 100 includes an interface (I / F) 101, a CPU 102, a memory 103, a geographic information database (DB) 104, and an operation unit 105.
- in the present embodiment, a configuration in which the geographic information database 104 is housed inside the server device 100 is described as an example.
- however, the geographic information database 104 need only be configured so that the CPU 102 can read from and write to it.
- for example, the geographic information database may be placed outside the server device 100 within the aircraft and connected to the interface 101 of the server device 100. Alternatively, the geographic information database may be located in a data center or the like outside the aircraft (on the ground) and communicate with the server device 100 via wireless communication.
- the CPU 102 executes a program stored in the memory 103 and performs various calculations and information processing.
- the CPU 102 can read from and write to the memory 103 and the geographic information database 104. Further, the CPU 102 communicates with the monitor 200, the GPS module 300, the first camera 400a, the second camera 400b, and the compass 500 via the interface 101.
- in particular, the CPU 102 transmits drive signals to the first actuator 401a of the first camera 400a and the second actuator 401b of the second camera 400b to drive them, thereby controlling the imaging directions of the first camera 400a and the second camera 400b.
- the CPU 102 manages the imaging direction of the first camera 400a as first direction information. Further, the CPU 102 manages the imaging direction of the second camera 400b as the second direction information.
- the first direction information and the second direction information are information indicating relative directions with respect to the aircraft in which the first camera 400a and the second camera 400b are installed.
- the CPU 102 collects information from the GPS module 300 and the geographic information database 104, combines the image data acquired from the first camera 400a and the second camera 400b through image processing, and sends the combined image data to the monitor 200.
- the CPU 102 receives a signal from the operation unit 105 and performs various operations in accordance with this signal. In particular, the CPU 102 controls the start and end of the imaging operation of the first camera 400a and the second camera 400b based on a signal from the operation unit 105.
- the memory 103 stores a program executed by the CPU 102, image data generated by imaging by the first camera 400a and the second camera 400b, calculation results of the CPU 102, information obtained from the geographic information database 104, and the like.
- the memory 103 is configured by a flash memory or a RAM.
- the interface 101 receives the image data generated by imaging by the first camera 400a, the image data generated by imaging by the second camera 400b, the longitude/latitude information and altitude information output by the GPS module 300, and the azimuth information output by the compass 500, and sends them to the CPU 102. The interface 101 also transmits drive signals output by the CPU 102 to the first actuator 401a and the second actuator 401b.
- the geographic information database 104 is a database that holds information on landmarks on the map (landmark information).
- the landmark information is information indicating a specific point on the map.
- the landmark is also called POI (Point Of Interest).
- the geographic information database 104 is composed of a hard disk or the like.
- FIG. 3 is a diagram showing a specific example of data contents of the geographic information database 104.
- the geographic information database 104 holds a plurality of sets of landmark names and latitude and longitude (geographic information) indicating the positions of the landmarks on the earth as landmark information.
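- as an illustrative sketch only, the landmark information described above can be modeled as (name, latitude, longitude) records; the sample entries below are hypothetical and not taken from the patent:

```python
from typing import NamedTuple, List

class Landmark(NamedTuple):
    """One row of the geographic information database: a named point on the earth."""
    name: str
    lat_deg: float   # latitude in degrees (north positive)
    lon_deg: float   # longitude in degrees (east positive)

# Hypothetical sample rows; the actual database contents are not given in the text.
GEO_DB: List[Landmark] = [
    Landmark("Mt. Fuji", 35.3606, 138.7274),
    Landmark("Tokyo Tower", 35.6586, 139.7454),
    Landmark("Osaka Castle", 34.6873, 135.5262),
]
```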
- the operation unit 105 is a user interface that receives input from a user (such as an aircraft cabin crew).
- the operation unit 105 is installed in an aircraft cabin.
- the operation unit 105 includes at least one of a keyboard, a mouse, a touch panel, a remote controller, and the like. When operated by the user, the operation unit 105 transmits a signal corresponding to the operation to the CPU 102.
- the in-flight system 10 is an example of an imaging system.
- the server device 100 is an example of a camera control device.
- the GPS module 300 is an example of a positioning sensor (longitude / latitude information acquisition unit) and an altitude sensor (altitude information acquisition unit).
- the CPU 102 is an example of a controller.
- the interface 101 is an example of a communication circuit.
- the first camera 400a and the second camera 400b are examples of an imaging device.
- the first actuator 401a and the second actuator 401b are an example of a camera orientation changing unit.
- the compass 500 is an example of an orientation sensor (orientation information acquisition unit).
- the geographic information database 104 is an example of a landmark database.
- the server device 100 acquires altitude information from the GPS module 300.
- the server device 100 drives the first actuator 401a and the second actuator 401b based on the altitude information, and changes the orientation (imaging direction) of the first camera 400a and the second camera 400b.
- when the user gives an instruction to start imaging by operating the operation unit 105, the CPU 102 instructs the first camera 400a and the second camera 400b to start imaging.
- the first camera 400 a and the second camera 400 b generate image data by performing an imaging operation, and output the image data to the server device 100.
- the CPU 102 performs image processing on the image data acquired from the first camera 400 a and the second camera 400 b and combines them, and sends the combined image data to the monitor 200.
- the monitor 200 displays the acquired image data.
- the first camera 400a and the second camera 400b are arranged in such directions that their respective angles of view (imaging areas) partially overlap.
- the CPU 102 can generate combined image data that is image data with a wider angle of view by combining the image data obtained by the first camera 400a and the second camera 400b.
- the CPU 102 changes the orientation of the first camera 400a and the second camera 400b based on the altitude information.
- the CPU 102 repeats the above processing at regular intervals until an instruction to stop imaging is given.
- FIG. 4 is a diagram illustrating a specific example of the orientations of the first camera 400a and the second camera 400b when the altitude of the aircraft is low.
- FIG. 5 is a diagram illustrating a specific example of the orientations of the first camera 400a and the second camera 400b when the altitude of the aircraft is high.
- the CPU 102 controls the imaging directions of the first camera 400a and the second camera 400b by transmitting drive signals to the first actuator 401a and the second actuator 401b so that the composite imaging region Rc becomes narrower as the altitude indicated by the altitude information becomes lower. This control may change both imaging directions or only one of them.
- the composite imaging region Rc is a wider imaging region in which the imaging region Ra of the first camera 400a and the imaging region Rb of the second camera 400b are combined.
- the composite imaging area Rc is an area that can be imaged by at least one of the two cameras.
- An axis corresponding to the optical axis of the composite imaging region Rc is defined as a composite optical axis.
- the combined optical axis is the sum of unit vectors indicating the directions of the optical axes of the two cameras. Note that the direction of the combined optical axis can be obtained by calculation from the first direction information and the second direction information indicating the directions of the two cameras and the direction information acquired from the compass 500.
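- a minimal sketch of this computation, assuming each camera's direction information is given as yaw (pan) and pitch (tilt) angles relative to the aircraft; the angle convention is an assumption for illustration:

```python
import math
from typing import Tuple

def unit_vector(yaw_rad: float, pitch_rad: float) -> Tuple[float, float, float]:
    """Unit vector along an optical axis given pan (yaw) and tilt (pitch) angles."""
    return (math.cos(pitch_rad) * math.cos(yaw_rad),
            math.cos(pitch_rad) * math.sin(yaw_rad),
            math.sin(pitch_rad))

def combined_optical_axis(cam1_dir: Tuple[float, float],
                          cam2_dir: Tuple[float, float]) -> Tuple[float, float, float]:
    """Sum of the two optical-axis unit vectors, as described for the composite axis."""
    v1 = unit_vector(*cam1_dir)
    v2 = unit_vector(*cam2_dir)
    return tuple(a + b for a, b in zip(v1, v2))

# First and second direction information (yaw, pitch) in radians, relative to the aircraft.
axis = combined_optical_axis((math.radians(-20), 0.0), (math.radians(20), 0.0))
```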
- as shown in FIG. 4, when the altitude indicated by the altitude information is lower than a set threshold, the CPU 102 drives the first actuator 401a and the second actuator 401b so that the composite imaging region Rc of the first camera 400a and the second camera 400b becomes smaller, changing the orientations of the first camera 400a and the second camera 400b.
- the composite imaging region Rc becoming smaller means, in other words, that the overlapping imaging region Ro between the imaging region Ra of the first camera 400a and the imaging region Rb of the second camera 400b becomes larger.
- as shown in FIG. 5, when the altitude indicated by the altitude information is higher than the set threshold, the CPU 102 drives the first actuator 401a and the second actuator 401b so that the composite imaging region Rc of the first camera 400a and the second camera 400b becomes larger, changing the orientations of the first camera 400a and the second camera 400b.
- the composite imaging region Rc becoming larger means, in other words, that the imaging region Ro where the imaging region Ra of the first camera 400a overlaps the imaging region Rb of the second camera 400b becomes smaller.
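- as a minimal sketch of this two-step control, assuming a hypothetical actuator API (`pan_to`, in degrees relative to the aircraft) and illustrative threshold and angle values that the text does not specify:

```python
ALTITUDE_THRESHOLD_M = 3000.0  # assumed threshold; no value is given in the text
PAN_LOW_ALT_DEG = 10.0         # converged: large overlap Ro, narrow composite Rc
PAN_HIGH_ALT_DEG = 30.0        # diverged: small overlap Ro, wide composite Rc

def update_orientations(altitude_m: float, actuator1, actuator2) -> None:
    """Drive both actuators so the composite imaging region Rc narrows at low altitude."""
    half_angle = PAN_LOW_ALT_DEG if altitude_m < ALTITUDE_THRESHOLD_M else PAN_HIGH_ALT_DEG
    actuator1.pan_to(-half_angle)  # hypothetical drive-signal wrapper
    actuator2.pan_to(+half_angle)
```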
- the server apparatus 100 includes the interface 101 and the CPU 102.
- the interface 101 receives the image data generated by imaging by the first camera 400a, the image data generated by imaging by the second camera 400b, and the altitude information about the altitude output by the GPS module 300, and transmits drive signals to the first actuator 401a that can change the imaging direction of the first camera 400a and the second actuator 401b that can change the imaging direction of the second camera 400b.
- the lower the altitude indicated by the altitude information, the narrower the CPU 102 makes the composite imaging area Rc, the range that at least one of the first camera 400a and the second camera 400b can image, by outputting drive signals that drive the first actuator 401a and the second actuator 401b, thereby controlling the imaging directions of the first camera 400a and the second camera 400b.
- with this server device 100, the lower the altitude, the narrower the composite imaging area Rc covered by the two cameras. Narrowing the composite imaging area Rc reduces the blind spot between the two cameras, so that subjects at shorter distances are included in the imaging area.
- at low altitude the scenery outside the aircraft is close, and the camera orientations can be controlled so that the possibility that such nearby subjects are included in the imaging area is increased. That is, the server device 100 of the present embodiment is effective for performing imaging at an appropriate angle of view (imaging area).
- the in-flight system 10 according to Embodiment 2 differs from the in-flight system 10 according to Embodiment 1 in that the orientations of the first camera 400a and the second camera 400b are controlled based on landmark information. Since the configuration of the in-flight system 10 of Embodiment 2 and the basic control of the imaging operation of the first camera 400a and the second camera 400b are the same as in the in-flight system 10 of Embodiment 1, their description is omitted.
- FIG. 6 is a flowchart illustrating a process for controlling the orientation of each camera in the second embodiment.
- the CPU 102 repeats the processing shown in FIG. 6 at regular intervals during the imaging operation by the first camera 400a and the second camera 400b.
- CPU 102 acquires longitude / latitude information and altitude information (step S401). Next, the CPU 102 acquires landmark information around the current position from the geographic information database 104 (step S402). Specifically, the CPU 102 first calculates a distance d2 from the current position to the horizon based on the altitude information.
- FIG. 7 is a diagram showing the relationship among the current position L, the landmark position D1, and the horizon position D2.
- the current position L is a current position of the aircraft, that is, a position indicated by latitude / longitude information and altitude information output from the GPS module 300.
- the ground surface position L0 is the position of the ground surface directly below the aircraft, that is, the position indicated by the latitude and longitude information output from the GPS module 300.
- the altitude h is the altitude indicated by the altitude information output by the GPS module 300, that is, the altitude of the aircraft.
- the radius R is the radius of the earth when the earth is assumed to be a true sphere.
- the landmark position D1 is the position on the earth of a specific landmark indicated by the landmark information held in the geographic information database 104.
- the horizon position D2 is a position on the horizon as viewed from the aircraft located at the current position L.
- the distance d1 from the ground surface position L0 to the landmark position D1 can be obtained by calculation from their respective longitudes and latitudes.
- the central angle of the arc determined by the ground surface position L0 and the landmark position D1 is defined as the angle θ.
- a distance d2 from the ground surface position L0 to the horizon position D2 can be obtained by the following equation (1).
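- equation (1) itself is not reproduced in this text. A standard reconstruction under the definitions above (earth as a true sphere of radius R, aircraft altitude h), assuming the line of sight from the current position L is tangent to the sphere at the horizon position D2, is:

```latex
d_2 = R\,\varphi, \qquad \cos\varphi = \frac{R}{R + h}
\quad\Longrightarrow\quad
d_2 = R \arccos\!\left(\frac{R}{R + h}\right)
```

- here φ denotes the central angle between the ground surface position L0 and the horizon position D2 (written as φ to avoid confusion with the landmark angle θ defined above).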
- the distance d2 can be calculated from the altitude h, that is, altitude information.
- the CPU 102 acquires from the geographic information database 104 the landmark information for landmarks that lie within the circular area centered on the ground surface position L0 indicated by the longitude/latitude information, with the calculated distance d2 as its radius, and that are included in the maximum composite imaging area of the first camera 400a and the second camera 400b (step S402).
- the maximum combined imaging area refers to an area when the combined imaging area Rc determined by the orientations of the two cameras is maximized. Specifically, the maximum combined imaging area is the combined imaging area Rc when the overlapping imaging area Ro of the first camera 400a and the second camera 400b is minimized.
- the CPU 102 identifies the maximum composite imaging area from the first direction information and the second direction information when the overlapping imaging region Ro of the two cameras is minimized, the azimuth information acquired from the compass 500, and the angle of view of each camera.
- the CPU 102 acquires, from the geographic information database 104, landmark information in a region where the circular region having the radius of the distance d2 and the maximum combined imaging region overlap.
- the CPU 102 determines whether or not the landmark indicated by the acquired landmark information exists in at least one of the imaging area Ra of the first camera 400a and the imaging area Rb of the second camera 400b (step S403). That is, the CPU 102 determines whether or not the acquired landmark is within the composite imaging region Rc.
- if the landmark exists in at least one of the imaging areas (Yes in step S403), the current camera orientations are maintained.
- if the landmark does not exist in either imaging area (No in step S403), the CPU 102 drives the first actuator 401a and the second actuator 401b with drive signals and changes the orientation of the camera (step S404).
- based on the position of the landmark specified by the acquired landmark information, the CPU 102 changes the orientation of the cameras so that the landmark indicated by the acquired landmark information exists in at least one of the imaging region Ra of the first camera 400a and the imaging region Rb of the second camera 400b. In changing the imaging directions of the first camera 400a and the second camera 400b, both imaging directions may be changed, or only one of them may be changed.
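- a minimal sketch of steps S401 to S404 under the assumptions above (spherical-earth horizon radius, duck-typed device objects); `in_max_composite_region`, `region_contains`, and `point_toward` are hypothetical placeholders, not APIs from the patent:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # spherical-earth assumption, as in the text

def horizon_arc_distance_m(altitude_m: float) -> float:
    """Reconstruction of equation (1): arc length along the ground from the
    ground surface position L0 to the horizon position D2 at altitude h."""
    return EARTH_RADIUS_M * math.acos(EARTH_RADIUS_M / (EARTH_RADIUS_M + altitude_m))

def ground_distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance d1 between two latitude/longitude points (haversine)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def in_max_composite_region(landmark, lat, lon, heading_deg) -> bool:
    """Placeholder for the maximum-composite-imaging-area test described in the
    text; a real check would use both cameras' directions and angles of view."""
    return True

def control_step(gps, compass, geo_db, cam1, cam2) -> None:
    """One pass of steps S401-S404 over hypothetical device objects."""
    lat, lon, alt = gps.position()                                   # step S401
    d2 = horizon_arc_distance_m(alt)
    candidates = [lm for lm in geo_db                                # step S402
                  if ground_distance_m(lat, lon, lm.lat_deg, lm.lon_deg) <= d2
                  and in_max_composite_region(lm, lat, lon, compass.heading())]
    for lm in candidates:
        # Step S403: is the landmark already inside either imaging region?
        if not (cam1.region_contains(lm) or cam2.region_contains(lm)):
            cam1.actuator.point_toward(lm)                           # step S404
            break  # simplification: reorient for the first uncovered landmark
```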
- the server apparatus 100 includes the interface 101, the geographic information database 104, and the CPU 102.
- the interface 101 receives the image data generated by imaging by the first camera 400a, the image data generated by imaging by the second camera 400b, the longitude/latitude information on the current position output by the GPS module 300, and the azimuth information output by the compass 500.
- the interface 101 transmits a drive signal to the first actuator 401a that can change the imaging direction of the first camera 400a and the second actuator 401b that can change the imaging direction of the second camera 400b.
- the geographic information database 104 holds landmark information regarding the position of the landmark.
- the CPU 102 specifies a landmark located within a predetermined range of the current position based on the longitude/latitude information, the azimuth information, and the landmark information acquired from the geographic information database 104. The CPU 102 then outputs drive signals that drive the first actuator 401a and the second actuator 401b so that the position of the specified landmark is included in at least one of the imaging region of the first camera 400a and the imaging region of the second camera 400b, thereby controlling the imaging directions of the first camera 400a and the second camera 400b.
- the server device 100 can thus control the camera orientations so that a landmark within the imageable range is captured in the imaging area of at least one camera. That is, the server device 100 of the present embodiment is effective for performing imaging at an appropriate angle of view (imaging area).
- Embodiments 1 and 2 have been described as examples of the technology disclosed in the present application. However, the technology of the present disclosure is not limited to them, and can also be applied to embodiments in which changes, replacements, additions, omissions, and the like are made as appropriate. It is also possible to combine the components described in Embodiments 1 and 2 to form new embodiments. Other embodiments are therefore exemplified below.
- Embodiments 1 and 2 describe a configuration including two cameras. Even when there are three or more cameras, the configuration of the present disclosure can be applied by performing the same processing on two of the cameras.
- the altitude information may be acquired using another altitude sensor such as a barometric sensor.
- the imaging areas can also be changed by the CPU 102 controlling cameras whose imaging area is adjustable, such as cameras having zoom lenses. Specifically, when the altitude indicated by the altitude information is lower than the set threshold, the CPU 102 changes the imaging region Ra of the first camera 400a and the imaging region Rb of the second camera 400b so that the overlapping imaging region Ro becomes larger (the angle of view moves to the wide-angle side).
- when the altitude indicated by the altitude information is higher than the set threshold, the CPU 102 changes the imaging region Ra of the first camera 400a and the imaging region Rb of the second camera 400b so that the overlapping imaging region Ro becomes smaller (the angle of view moves to the telephoto side).
- alternatively, imaging may be performed using a camera having a wide-angle lens, and zooming may be emulated by cutting out part of the image data covering the wide imaging area. By changing the region to be cut out according to the altitude, it is possible to control the overlap between the regions cut out from the first camera 400a and the second camera 400b.
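- a minimal sketch of this crop-based (electronic) zoom, assuming the image data is a NumPy-style array; the crop ratios and threshold are illustrative values, not taken from the patent:

```python
def crop_for_altitude(image, altitude_m: float, threshold_m: float = 3000.0):
    """Cut a centered region out of wide-angle image data; a larger crop at low
    altitude keeps a wider angle of view, enlarging the overlap Ro between the
    regions cut out from the two cameras."""
    h, w = image.shape[:2]
    scale = 0.9 if altitude_m < threshold_m else 0.6   # illustrative crop ratios
    cw, ch = int(w * scale), int(h * scale)
    x0, y0 = (w - cw) // 2, (h - ch) // 2
    return image[y0:y0 + ch, x0:x0 + cw]
```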
- Embodiment 1 and Embodiment 2 have described the configuration in which the imaging region is changed by controlling the actuator to pan and tilt the camera.
- the imaging region can also be changed by controlling the actuator to rotate the camera in the roll direction, that is, about the optical axis of the camera.
- the imaging areas (angles of view) of the first camera 400a and the second camera 400b are rectangular, with an aspect ratio such as 16:9. Therefore, the imaging region can be changed by rotating the camera in the roll direction between the horizontal position and the vertical position.
- the horizontal position refers to a position in the roll direction of the camera such that the longitudinal direction of the imaging region of the camera is parallel to the direction in which the two cameras are aligned.
- the vertical position refers to a position in the roll direction of the camera such that the longitudinal direction of the imaging area of the camera is perpendicular to the direction in which the two cameras are arranged.
- when the altitude indicated by the altitude information is higher than the set threshold, the CPU 102 controls the actuator 401a and the actuator 401b so that the first camera 400a and the second camera 400b are in the horizontal position.
- when the altitude indicated by the altitude information is lower than the set threshold, the CPU 102 controls the actuator 401a and the actuator 401b so that the first camera 400a and the second camera 400b are in the vertical position.
- in either case, the imaging regions of the first camera 400a and the second camera 400b partially overlap.
- alternatively, panning and tilting may be emulated by capturing with a camera having a wide-angle lens and cutting out part of the image data covering the wide imaging area; rotation in the roll direction can likewise be emulated by cutting out a correspondingly rotated part of the image data.
- in Embodiments 1 and 2, one altitude threshold is set to judge the altitude level, and the camera orientation is changed in two steps.
- a plurality of thresholds may be set instead, and the camera orientation changed in three or more steps.
- Embodiments 1 and 2 also describe a configuration in which the camera orientation is changed by comparing the altitude with a threshold.
- instead, a table indicating the correspondence between altitude information and camera orientation may be prepared, and the camera orientation determined from the altitude information by referring to the table.
- alternatively, the camera orientation may be calculated from the altitude information using a predetermined calculation formula.
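- a minimal sketch of the table-based variant, with a hypothetical altitude-to-pan-angle table; the linear interpolation between entries also serves as a simple calculation formula:

```python
import bisect

# Hypothetical correspondence table: (altitude in m, camera pan half-angle in degrees).
ALT_TO_PAN = [(0.0, 8.0), (1000.0, 12.0), (3000.0, 20.0), (10000.0, 30.0)]

def pan_angle_for_altitude(altitude_m: float) -> float:
    """Look up the camera pan angle from altitude, interpolating linearly."""
    alts = [a for a, _ in ALT_TO_PAN]
    i = bisect.bisect_right(alts, altitude_m)
    if i == 0:
        return ALT_TO_PAN[0][1]    # below the table: clamp to the first entry
    if i == len(ALT_TO_PAN):
        return ALT_TO_PAN[-1][1]   # above the table: clamp to the last entry
    (a0, p0), (a1, p1) = ALT_TO_PAN[i - 1], ALT_TO_PAN[i]
    return p0 + (p1 - p0) * (altitude_m - a0) / (a1 - a0)
```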
- Embodiments 1 and 2 describe a configuration in which the process of controlling the camera orientation is repeated at regular time intervals. This process may instead be repeated every time the aircraft moves a certain distance, using the movement distance acquired from the GPS module 300.
- Embodiments 1 and 2 have been described on the assumption that the camera directions and the imaging regions are changed in the horizontal direction (panning direction). The same applies when they are changed in the vertical direction (tilting direction).
- the present disclosure can provide a camera control device that can capture an image at an appropriate angle of view.
- the present disclosure can be applied to a camera control device in an aircraft, a train, or the like.
- 10 In-flight system, 100 Server device, 101 Interface, 102 CPU, 103 Memory, 104 Geographic information database, 105 Operation unit, 200 Monitor, 300 GPS module, 400a First camera, 400b Second camera, 500 Compass
Abstract
A camera control device of the present invention is provided with an interface and a controller. The interface receives first image data generated by a first camera by performing image pickup, second image data generated by a second camera by performing image pickup, and altitude information relating to an altitude outputted from an altitude sensor, and transmits drive signals to a first actuator capable of changing the image pickup direction of the first camera, and a second actuator capable of changing the image pickup direction of the second camera. The controller controls the image pickup directions of the first camera and the second camera by outputting drive signals, which are transmitted via the interface, and which drive the first actuator and the second actuator, so that the lower the altitude indicated by the altitude information becomes, the smaller a synthetic image pickup region becomes, said synthetic image pickup region being a range where the first camera and/or the second camera can pick up an image.
Description
Hereinafter, embodiments will be described in detail with reference to the drawings as appropriate. Description in more detail than necessary, however, may be omitted; for example, detailed descriptions of already well-known matters and duplicate descriptions of substantially identical configurations may be omitted. This is to keep the following description from becoming unnecessarily redundant and to make it easier for those skilled in the art to understand.
The accompanying drawings and the following description are provided so that those skilled in the art can fully understand the present disclosure, and are not intended to limit the subject matter recited in the claims.
(実施の形態1)
以下、図1~図5を用いて、実施の形態1を説明する。 (Embodiment 1)
The first embodiment will be described below with reference to FIGS.
以下、図1~図5を用いて、実施の形態1を説明する。 (Embodiment 1)
The first embodiment will be described below with reference to FIGS.
[1-1 構成]
図1は、実施の形態1における機内システム10の構成を示す図である。本実施の形態の機内システム10は、航空機内に備えられる。機内システム10は、航空機が航路を航行しているときの機外の風景を撮影して画像データを取得する。機内システム10は、航空機の高度情報に基づいて、カメラの向きを変更する。 [1-1 Configuration]
FIG. 1 is a diagram illustrating a configuration of an in-flight system 10 according to the first embodiment. In-flight system 10 of the present embodiment is provided in an aircraft. The in-flight system 10 captures the scenery outside the aircraft when the aircraft is navigating the route and acquires image data. The in-flight system 10 changes the direction of the camera based on the altitude information of the aircraft.
図1は、実施の形態1における機内システム10の構成を示す図である。本実施の形態の機内システム10は、航空機内に備えられる。機内システム10は、航空機が航路を航行しているときの機外の風景を撮影して画像データを取得する。機内システム10は、航空機の高度情報に基づいて、カメラの向きを変更する。 [1-1 Configuration]
FIG. 1 is a diagram illustrating a configuration of an in-
機内システム10は、サーバー装置100と、モニタ200と、GPSモジュール300と、第1カメラ400aと、第2カメラ400bと、コンパス500とを備える。サーバー装置100は、モニタ200と接続され、モニタ200へ画像データを送信する。モニタ200は、航空機の客室内に取り付けられる。モニタ200は、サーバー装置100から受信した画像データに基づく映像の表示が可能である。GPSモジュール300は、航空機の現在位置を示す経度緯度情報と、航空機の現在高度を示す高度情報とを取得してサーバー装置100に送信する。
The in-flight system 10 includes a server device 100, a monitor 200, a GPS module 300, a first camera 400a, a second camera 400b, and a compass 500. Server device 100 is connected to monitor 200 and transmits image data to monitor 200. The monitor 200 is installed in an aircraft cabin. The monitor 200 can display a video based on the image data received from the server device 100. The GPS module 300 acquires longitude / latitude information indicating the current position of the aircraft and altitude information indicating the current altitude of the aircraft and transmits them to the server apparatus 100.
第1カメラ400aは、撮像動作を行うことにより画像データを生成し、サーバー装置100に出力する。第1カメラ400aは、第1アクチュエータ401aを備える。第1アクチュエータ401aは、サーバー装置100から受信したデータに基づいて第1カメラ400aの撮像方向を変更する。サーバー装置100が第1アクチュエータ401aを制御することにより、第1カメラ400aのパン(ヨー方向の回転)及びチルト(ピッチ方向の回転)が可能である。
The first camera 400a generates image data by performing an imaging operation and outputs the image data to the server device 100. The first camera 400a includes a first actuator 401a. The first actuator 401a changes the imaging direction of the first camera 400a based on the data received from the server device 100. The server device 100 controls the first actuator 401a, so that the first camera 400a can pan (rotate in the yaw direction) and tilt (rotate in the pitch direction).
第2カメラ400bは、撮像動作を行うことにより画像データを生成し、サーバー装置100に出力する。第2カメラ400bは、第2アクチュエータ401bを備える。第2アクチュエータ401bは、サーバー装置100から受信したデータに基づいて第2カメラ400bの撮像方向を変更する。サーバー装置100が第2アクチュエータ401bを制御することにより、第2カメラ400bのパン(ヨー方向の回転)及びチルト(ピッチ方向の回転)が可能である。
The second camera 400b generates image data by performing an imaging operation and outputs the image data to the server device 100. The second camera 400b includes a second actuator 401b. The second actuator 401b changes the imaging direction of the second camera 400b based on the data received from the server device 100. When the server apparatus 100 controls the second actuator 401b, the second camera 400b can be panned (rotated in the yaw direction) and tilted (rotated in the pitch direction).
コンパス500は、航空機の現在方位を示す方位情報を取得してサーバー装置100に送信する。方位情報は、航空機の向いている方位を示す情報である。
The compass 500 acquires azimuth information indicating the current azimuth of the aircraft and transmits it to the server apparatus 100. The direction information is information indicating the direction in which the aircraft is facing.
図2は、サーバー装置100の構成を示す図である。サーバー装置100は、インターフェース(I/F)101と、CPU102と、メモリー103と、地理情報データベース(DB)104と、操作部105とを備える。本実施の形態では、地理情報データベース104がサーバー装置100の内部に接続される構成を例に挙げて説明をするが、地理情報データベース104は、CPU102に対して読み出し及び書き込みが可能なように構成されていればよい。例えば、地理情報データベースが機内におけるサーバー装置100の外部に配置され、サーバー装置100のインターフェース101と接続されていても良い。また、地理情報データベースは、機外(地上)のデータセンター等に配置され、無線通信を介してサーバー装置100と通信可能であってもよい。
FIG. 2 is a diagram illustrating a configuration of the server apparatus 100. The server apparatus 100 includes an interface (I / F) 101, a CPU 102, a memory 103, a geographic information database (DB) 104, and an operation unit 105. In this embodiment, the configuration in which the geographic information database 104 is connected to the inside of the server apparatus 100 will be described as an example. However, the geographic information database 104 is configured to be readable and writable with respect to the CPU 102. It only has to be done. For example, the geographic information database may be arranged outside the server apparatus 100 in the aircraft and connected to the interface 101 of the server apparatus 100. Further, the geographic information database may be arranged in a data center outside the aircraft (ground) or the like, and may be able to communicate with the server apparatus 100 via wireless communication.
CPU102は、メモリー103に格納されたプログラムを実行し、各種演算や情報処理等を行う。CPU102は、メモリー103、地理情報データベース104に対して読み出しと書き込みが可能である。また、CPU102は、インターフェース101を介してモニタ200と、GPSモジュール300と、第1カメラ400aと、第2カメラ400bと、コンパス500との通信を行う。
The CPU 102 executes a program stored in the memory 103 and performs various calculations and information processing. The CPU 102 can read from and write to the memory 103 and the geographic information database 104. Further, the CPU 102 communicates with the monitor 200, the GPS module 300, the first camera 400a, the second camera 400b, and the compass 500 via the interface 101.
特にCPU102は、第1カメラ400aの第1アクチュエータ401a及び第2カメラ400bの第2アクチュエータ401bに対して駆動信号を送信して駆動することにより、第1カメラ400a及び第2カメラ400bの撮像方向を制御する。CPU102は、第1カメラ400aの撮像方向を第1方向情報として管理している。また、CPU102は、第2カメラ400bの撮像方向を第2方向情報として管理している。第1方向情報及び第2方向情報は、第1カメラ400a及び第2カメラ400bが設置されている航空機に対する相対的な方向を示す情報である。
In particular, the CPU 102 transmits the drive signal to the first actuator 401a of the first camera 400a and the second actuator 401b of the second camera 400b to drive it, thereby changing the imaging direction of the first camera 400a and the second camera 400b. Control. The CPU 102 manages the imaging direction of the first camera 400a as first direction information. Further, the CPU 102 manages the imaging direction of the second camera 400b as the second direction information. The first direction information and the second direction information are information indicating relative directions with respect to the aircraft in which the first camera 400a and the second camera 400b are installed.
CPU102は、GPSモジュール300、地理情報データベース104から情報を収集し、第1カメラ400a及び第2カメラ400bから取得した画像データを、画像処理を行って合成し、モニタ200に対し合成した画像データを送る。CPU102は、操作部105からの信号を受信し、この信号に応じて様々な動作を行う。特にCPU102は、操作部105からの信号に基づいて、第1カメラ400a及び第2カメラ400bの撮像動作の開始及び終了を制御する。
The CPU 102 collects information from the GPS module 300 and the geographic information database 104, performs image processing to combine the image data acquired from the first camera 400 a and the second camera 400 b, and combines the image data combined with the monitor 200. send. The CPU 102 receives a signal from the operation unit 105 and performs various operations in accordance with this signal. In particular, the CPU 102 controls the start and end of the imaging operation of the first camera 400a and the second camera 400b based on a signal from the operation unit 105.
メモリー103は、CPU102が実行するプログラム、第1カメラ400a及び第2カメラ400bが撮像して生成した画像データ、CPU102の演算結果、地理情報データベース104から得られる情報等を格納する。メモリー103は、フラッシュメモリやRAMで構成される。
The memory 103 stores a program executed by the CPU 102, image data generated by imaging by the first camera 400a and the second camera 400b, calculation results of the CPU 102, information obtained from the geographic information database 104, and the like. The memory 103 is configured by a flash memory or a RAM.
インターフェース101は、第1カメラ400aが撮像を行って生成する画像データと、第2カメラ400bが撮像を行って生成する画像データと、GPSモジュール300が出力する経度緯度情報及び高度情報と、コンパス500が出力する方位情報とを受信してCPU102に送る。また、インターフェース101は、CPU102が出力する駆動信号を第1アクチュエータ401a及び第2アクチュエータ401bに対して送信する。
The interface 101 includes image data generated by imaging by the first camera 400a, image data generated by imaging by the second camera 400b, longitude / latitude information and altitude information output by the GPS module 300, and the compass 500. Is received and sent to the CPU 102. Further, the interface 101 transmits a drive signal output from the CPU 102 to the first actuator 401a and the second actuator 401b.
地理情報データベース104は、地図上のランドマークに関する情報(ランドマーク情報)を保持するデータベースである。ランドマーク情報は、地図上の特定の地点を示す情報である。なお、ランドマークは、POI(Point Of Interest)とも呼ばれる。地理情報データベース104は、ハードディスク等で構成される。
The geographic information database 104 is a database that holds information on landmarks on the map (landmark information). The landmark information is information indicating a specific point on the map. The landmark is also called POI (Point Of Interest). The geographic information database 104 is composed of a hard disk or the like.
図3は、地理情報データベース104のデータ内容の具体例を示す図である。地理情報データベース104は、ランドマーク名称と、ランドマークの地球上の位置を示す緯度及び経度(地理情報)との組をランドマーク情報として複数保持している。
FIG. 3 is a diagram showing a specific example of data contents of the geographic information database 104. The geographic information database 104 holds a plurality of sets of landmark names and latitude and longitude (geographic information) indicating the positions of the landmarks on the earth as landmark information.
操作部105は、ユーザー(航空機の客室乗務員等)からの入力を受け付けるユーザーインターフェースである。操作部105は、航空機の客室内に取り付けられる。操作部105は、キーボード、マウス、タッチパネル、リモコン等のうち少なくともいずれか1つで構成される。操作部105は、ユーザーに操作されると、操作に応じた信号をCPU102に送信する。
The operation unit 105 is a user interface that receives input from a user (such as an aircraft cabin crew). The operation unit 105 is installed in an aircraft cabin. The operation unit 105 includes at least one of a keyboard, a mouse, a touch panel, a remote controller, and the like. When operated by the user, the operation unit 105 transmits a signal corresponding to the operation to the CPU 102.
機内システム10は、撮像システムの一例である。サーバー装置100は、カメラ制御装置の一例である。GPSモジュール300は、測位センサ(経度緯度情報取得部)及び高度センサ(高度情報取得部)の一例である。CPU102は、コントローラの一例である。インターフェース101は、通信回路の一例である。第1カメラ400a及び第2カメラ400bは、撮像装置の一例である。第1アクチュエータ401aと第2アクチュエータ401bは、カメラの向き変更部の一例である。コンパス500は、方位センサ(方位情報取得部)の一例である。地理情報データベース104は、ランドマークデータベースの一例である。
The in-flight system 10 is an example of an imaging system. The server device 100 is an example of a camera control device. The GPS module 300 is an example of a positioning sensor (longitude / latitude information acquisition unit) and an altitude sensor (altitude information acquisition unit). The CPU 102 is an example of a controller. The interface 101 is an example of a communication circuit. The first camera 400a and the second camera 400b are examples of an imaging device. The first actuator 401a and the second actuator 401b are an example of a camera orientation changing unit. The compass 500 is an example of an orientation sensor (orientation information acquisition unit). The geographic information database 104 is an example of a landmark database.
[1-2 動作]
以上のように構成された機内システム10について、その動作を以下で説明する。サーバー装置100は、GPSモジュール300から高度情報を取得する。サーバー装置100は、高度情報に基づいて第1アクチュエータ401a及び第2アクチュエータ401bを駆動して、第1カメラ400a及び第2カメラ400bの向き(撮像方向)を変更する。 [1-2 Operation]
The operation of the in-flight system 10 configured as described above will be described below. The server device 100 acquires altitude information from the GPS module 300. The server device 100 drives the first actuator 401a and the second actuator 401b based on the altitude information, and changes the orientation (imaging direction) of the first camera 400a and the second camera 400b.
以上のように構成された機内システム10について、その動作を以下で説明する。サーバー装置100は、GPSモジュール300から高度情報を取得する。サーバー装置100は、高度情報に基づいて第1アクチュエータ401a及び第2アクチュエータ401bを駆動して、第1カメラ400a及び第2カメラ400bの向き(撮像方向)を変更する。 [1-2 Operation]
The operation of the in-
ユーザーが、サーバー装置100の操作部105を操作することにより撮像開始の指示を行うと、CPU102は、第1カメラ400a及び第2カメラ400bに対して撮像開始を指示する。第1カメラ400a及び第2カメラ400bは、撮像動作を行うことにより画像データを生成し、サーバー装置100に出力する。
When the user gives an instruction to start imaging by operating the operation unit 105 of the server apparatus 100, the CPU 102 instructs the first camera 400a and the second camera 400b to start imaging. The first camera 400 a and the second camera 400 b generate image data by performing an imaging operation, and output the image data to the server device 100.
CPU102は、第1カメラ400a及び第2カメラ400bから取得した画像データに対して画像処理を行って合成し、モニタ200に対し合成した画像データを送る。モニタ200は、取得した画像データを表示する。第1カメラ400a及び第2カメラ400bは、各々の画角(撮像領域)が一部重複するような向きに配置される。CPU102は、第1カメラ400a及び第2カメラ400bが撮像して得た画像データを合成することにより、より画角の広い画像データである合成画像データを生成することができる。
The CPU 102 performs image processing on the image data acquired from the first camera 400 a and the second camera 400 b and combines them, and sends the combined image data to the monitor 200. The monitor 200 displays the acquired image data. The first camera 400a and the second camera 400b are arranged in such directions that their respective angles of view (imaging areas) partially overlap. The CPU 102 can generate combined image data that is image data with a wider angle of view by combining the image data obtained by the first camera 400a and the second camera 400b.
また、CPU102は、高度情報に基づいて第1カメラ400a及び第2カメラ400bの向きを変更する。CPU102は、撮像停止の指示があるまで、上記の処理を一定時間毎に繰り返す。
Further, the CPU 102 changes the orientation of the first camera 400a and the second camera 400b based on the altitude information. The CPU 102 repeats the above processing at regular intervals until an instruction to stop imaging is given.
以下、高度情報に基づく第1カメラ400a及び第2カメラ400bの向きの制御について説明する。図4は、航空機の高度が低い場合の第1カメラ400a及び第2カメラ400bの向きの具体例を示す図である。図5は、航空機の高度が高い場合の第1カメラ400a及び第2カメラ400bの向きの具体例を示す図である。CPU102は、高度情報が示す高度が低いほど合成撮像領域Rcが狭くなるように、第1アクチュエータ401a及び第2アクチュエータ401bに駆動信号を送信することにより駆動して、第1カメラ400a及び第2カメラ400bの撮像方向を制御する。第1カメラ400a及び第2カメラ400bの撮像方向の制御は、両方の撮像方向を変更してもよいし、いずれか一方の撮像方向を変更してもよい。
Hereinafter, control of the orientation of the first camera 400a and the second camera 400b based on altitude information will be described. FIG. 4 is a diagram illustrating a specific example of the orientations of the first camera 400a and the second camera 400b when the altitude of the aircraft is low. FIG. 5 is a diagram illustrating a specific example of the orientations of the first camera 400a and the second camera 400b when the altitude of the aircraft is high. The CPU 102 drives the first camera 400a and the second camera by transmitting drive signals to the first actuator 401a and the second actuator 401b so that the composite imaging region Rc becomes narrower as the altitude indicated by the altitude information is lower. The imaging direction of 400b is controlled. Control of the imaging direction of the first camera 400a and the second camera 400b may change both imaging directions, or may change either one of the imaging directions.
ここで合成撮像領域Rcは、第1カメラ400aの撮像領域Ra及び第2カメラ400bの撮像領域Rbを合成した、より広い撮像領域である。言い換えると、合成撮像領域Rcとは、2つのカメラのうち少なくともいずれか一方のカメラにより撮像可能な領域である。また、第1カメラ400a及び第2カメラ400bが撮像して得た画像データを合成して得た合成画像データの撮像領域は、合成撮像領域Rcであると言うことができる。合成撮像領域Rcの光軸に相当する軸を合成光軸と定義する。合成光軸は、2つのカメラの各々の光軸の向きを示す単位ベクトルの和になっている。なお、合成光軸の向きは、2つのカメラのそれぞれの向きを示す第1方向情報及び第2方向情報と、コンパス500から取得する方位情報とから計算により求めることができる。
Here, the composite imaging region Rc is a wider imaging region in which the imaging region Ra of the first camera 400a and the imaging region Rb of the second camera 400b are combined. In other words, the composite imaging area Rc is an area that can be imaged by at least one of the two cameras. Moreover, it can be said that the imaging region of the composite image data obtained by synthesizing the image data acquired by the first camera 400a and the second camera 400b is the composite imaging region Rc. An axis corresponding to the optical axis of the composite imaging region Rc is defined as a composite optical axis. The combined optical axis is the sum of unit vectors indicating the directions of the optical axes of the two cameras. Note that the direction of the combined optical axis can be obtained by calculation from the first direction information and the second direction information indicating the directions of the two cameras and the direction information acquired from the compass 500.
As shown in FIG. 4, when the altitude indicated by the altitude information is lower than a set threshold, the CPU 102 drives the first actuator 401a and the second actuator 401b to change the orientations of the first camera 400a and the second camera 400b so that their composite imaging region Rc becomes smaller. A smaller composite imaging region Rc means, in other words, a larger overlapping imaging region Ro between the imaging region Ra of the first camera 400a and the imaging region Rb of the second camera 400b.
As shown in FIG. 5, when the altitude indicated by the altitude information is higher than the set threshold, the CPU 102 drives the first actuator 401a and the second actuator 401b to change the orientations of the first camera 400a and the second camera 400b so that their composite imaging region Rc becomes larger. A larger composite imaging region Rc means, in other words, a smaller overlapping imaging region Ro between the imaging region Ra of the first camera 400a and the imaging region Rb of the second camera 400b.
[1-3 Effects, etc.]
As described above, the server apparatus 100 according to the present embodiment includes the interface 101 and the CPU 102. The interface 101 receives the image data generated by imaging by the first camera 400a, the image data generated by imaging by the second camera 400b, and the altitude information regarding the altitude output by the GPS module 300, and transmits drive signals to the first actuator 401a, which can change the imaging direction of the first camera 400a, and the second actuator 401b, which can change the imaging direction of the second camera 400b. The CPU 102 outputs the drive signals for driving the first actuator 401a and the second actuator 401b to control the imaging directions of the first camera 400a and the second camera 400b so that the lower the altitude indicated by the altitude information, the narrower the composite imaging region Rc, which is the range that can be imaged by at least one of the first camera 400a and the second camera 400b.
With this server apparatus 100, the lower the altitude, the narrower the composite imaging region Rc of the two cameras. Narrowing the composite imaging region Rc reduces the blind spot between the two cameras, so that objects at shorter distances fall within the imaging region. At lower altitudes, subjects such as landmarks are more likely to be located at close range. With the server apparatus 100 of the present embodiment, the camera orientations can be controlled so that even in such cases those subjects are more likely to be included in the imaging region. That is, the server apparatus 100 of the present embodiment is effective for imaging at an appropriate angle of view (imaging region).
(Embodiment 2)
The second embodiment will be described below with reference to FIGS. 6 and 7.
[2-1 Configuration]
The in-flight system 10 according to the second embodiment differs from the in-flight system 10 according to the first embodiment in that it controls the orientations of the first camera 400a and the second camera 400b based on landmark information. The configuration of the in-flight system 10 of the second embodiment and the basic control of the imaging operation of the first camera 400a and the second camera 400b are the same as in the first embodiment, so their description is omitted.
[2-2 Operation]
Control of the orientations of the first camera 400a and the second camera 400b based on landmark information is described below. FIG. 6 is a flowchart illustrating the process of controlling the orientation of each camera in the second embodiment. The CPU 102 repeats the process shown in FIG. 6 at regular intervals during the imaging operation of the first camera 400a and the second camera 400b.
The CPU 102 acquires longitude/latitude information and altitude information (step S401). Next, the CPU 102 acquires landmark information around the current position from the geographic information database 104 (step S402). Specifically, the CPU 102 first calculates the distance d2 from the current position to the horizon based on the altitude information.
FIG. 7 is a diagram showing the relationship among the current position L, the landmark position D1, and the horizon position D2. The current position L is the current position of the aircraft, that is, the position indicated by the latitude/longitude information and altitude information output by the GPS module 300. The ground surface position L0 is the position on the ground directly below the aircraft, that is, the position indicated by the latitude/longitude information output by the GPS module 300. The altitude h is the altitude of the aircraft, that is, the altitude indicated by the altitude information output by the GPS module 300. The radius R is the radius of the Earth when the Earth is assumed to be a perfect sphere. The landmark position D1 is the position on the Earth of a specific landmark indicated by the landmark information held in the geographic information database 104. The horizon position D2 is a position on the horizon as seen from the aircraft at the current position L.
The distance d1 from the ground surface position L0 to the landmark position D1 can be calculated from their respective longitudes and latitudes. Let θ be the central angle of the arc determined by the ground surface position L0 and the horizon position D2. The distance d2 from the ground surface position L0 to the horizon position D2 can then be obtained by the following equation (1). As equation (1) shows, the distance d2 can be calculated from the altitude h, that is, from the altitude information.

d2 = Rθ = R·cos⁻¹(R/(h+R))  (1)

Next, the CPU 102 acquires from the geographic information database 104 the landmark information located both within the circular region centered on the ground surface position L0 indicated by the longitude/latitude information, with the calculated distance d2 as its radius, and within the maximum composite imaging region of the first camera 400a and the second camera 400b (step S402). Here, the maximum composite imaging region is the region obtained when the composite imaging region Rc, which is determined by the orientations of the two cameras, is at its maximum. Specifically, the maximum composite imaging region is the composite imaging region Rc obtained when the overlapping imaging region Ro of the first camera 400a and the second camera 400b is at its minimum. The CPU 102 identifies the maximum composite imaging region from the first direction information and the second direction information obtained when the overlapping imaging region Ro of the two cameras is at its minimum, the azimuth information acquired from the compass 500, and the angle of view of each camera. The CPU 102 thus acquires from the geographic information database 104 the landmark information located in the region where the circle of radius d2 and the maximum composite imaging region overlap.
The CPU 102 determines whether the landmark indicated by the acquired landmark information is present in at least one of the imaging region Ra of the first camera 400a and the imaging region Rb of the second camera 400b (step S403). In other words, the CPU 102 determines whether the acquired landmark is within the composite imaging region Rc.
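One simple way to picture the test in step S403 is a horizontal-only check of whether the landmark's bearing from the aircraft falls within either camera's horizontal angle of view. The sketch below makes that simplifying assumption; the bearing and azimuth values are hypothetical inputs.

```python
# Horizontal-only sketch of the step S403 test: a landmark counts as inside
# a camera's imaging region when its bearing from the aircraft lies within
# the camera's horizontal angle of view. Bearings are degrees from north.
def bearing_in_fov(bearing_deg: float, camera_azimuth_deg: float,
                   fov_deg: float) -> bool:
    # Signed angular difference wrapped to [-180, 180).
    diff = (bearing_deg - camera_azimuth_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0

def landmark_visible(bearing_deg, cam_a_azimuth, cam_b_azimuth, fov_deg) -> bool:
    # Inside region Ra or region Rb, i.e. inside the composite region Rc.
    return (bearing_in_fov(bearing_deg, cam_a_azimuth, fov_deg)
            or bearing_in_fov(bearing_deg, cam_b_azimuth, fov_deg))
```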
If the landmark indicated by the acquired landmark information is present in at least one of the imaging region Ra of the first camera 400a and the imaging region Rb of the second camera 400b (Yes in step S403), the CPU 102 keeps the current camera orientations.
On the other hand, if the landmark indicated by the acquired landmark information is present in neither the imaging region Ra of the first camera 400a nor the imaging region Rb of the second camera 400b (No in step S403), the CPU 102 drives the first actuator 401a and the second actuator 401b with drive signals to change the camera orientations (step S404). In this case, based on the position of the landmark specified by the acquired landmark information, the CPU 102 changes the camera orientations so that the landmark is present in at least one of the imaging region Ra of the first camera 400a and the imaging region Rb of the second camera 400b. In changing the imaging directions of the first camera 400a and the second camera 400b, both imaging directions may be changed, or only one of them.
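Putting steps S401 to S404 together, one pass of the FIG. 6 loop could look like the following sketch, reusing the helpers above. The GPS, database, and camera objects (and the precomputed bearing field) are hypothetical stand-ins, and only one camera is re-aimed here, which the text permits.

```python
# Sketch of one iteration of the FIG. 6 flowchart (steps S401-S404).
# gps, geo_db, cam_a, cam_b and the "bearing_deg" landmark field are
# illustrative assumptions, not the apparatus's actual interfaces.
def control_step(gps, geo_db, cam_a, cam_b, fov_deg: float) -> None:
    lat, lon, alt = gps.position()                       # S401
    candidates = visible_landmarks(lat, lon, alt,        # S402
                                   geo_db.landmarks_near(lat, lon))
    for lm in candidates:
        if landmark_visible(lm["bearing_deg"],           # S403
                            cam_a.azimuth, cam_b.azimuth, fov_deg):
            return                                       # Yes: keep orientation
    if candidates:                                       # S404: re-aim one camera
        cam_a.set_azimuth(candidates[0]["bearing_deg"])
```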
[2-3 Effects, etc.]
As described above, the server apparatus 100 according to the present embodiment includes the interface 101, the geographic information database 104, and the CPU 102. The interface 101 receives the image data generated by imaging by the first camera 400a, the image data generated by imaging by the second camera 400b, the longitude/latitude information regarding the current position output by the GPS module 300, and the azimuth information output by the compass 500. The interface 101 also transmits drive signals to the first actuator 401a, which can change the imaging direction of the first camera 400a, and the second actuator 401b, which can change the imaging direction of the second camera 400b. The geographic information database 104 holds landmark information regarding the positions of landmarks. The CPU 102 identifies a landmark located within a predetermined range of the current position based on the longitude/latitude information, the azimuth information, and the landmark information acquired from the geographic information database 104. The CPU 102 then outputs the drive signals for driving the first actuator 401a and the second actuator 401b to control the imaging directions of the first camera 400a and the second camera 400b so that the position of the identified landmark is included in at least one of the imaging region of the first camera 400a and the imaging region of the second camera 400b.
With this server apparatus 100, the camera orientations can be controlled so that a landmark within the imageable range is captured within the imaging region of one of the cameras. That is, the server apparatus 100 of the present embodiment is effective for imaging at an appropriate angle of view (imaging region).
(Other embodiments)
As described above, Embodiments 1 and 2 have been described as examples of the technology disclosed in the present application. However, the technology of the present disclosure is not limited to these embodiments, and is also applicable to embodiments in which changes, replacements, additions, omissions, and the like are made as appropriate. It is also possible to combine the components described in Embodiments 1 and 2 above into new embodiments. Other embodiments are therefore exemplified below.
Embodiments 1 and 2 describe a configuration with two cameras. Even when there are three or more cameras, the configuration of the present disclosure can be applied by performing the same processing on two of the cameras.
Embodiments 1 and 2 describe a configuration in which the altitude information is acquired from the GPS module 300. The altitude information may instead be acquired using another altitude sensor, such as a barometric pressure sensor.
Embodiments 1 and 2 describe a configuration in which the imaging regions are changed by controlling the actuators to change the camera orientations. The imaging regions can also be changed by having the CPU 102 control the imaging region of each camera, using cameras whose imaging regions are adjustable, such as cameras with zoom lenses. Specifically, when the altitude indicated by the altitude information is lower than the set threshold, the CPU 102 changes the imaging region Ra of the first camera 400a and the imaging region Rb of the second camera 400b so that their overlapping imaging region Ro becomes larger (so that the angle of view moves toward the wide-angle side). When the altitude indicated by the altitude information is higher than the set threshold, the CPU 102 changes the imaging region Ra of the first camera 400a and the imaging region Rb of the second camera 400b so that their overlapping imaging region Ro becomes smaller (so that the angle of view moves toward the telephoto side).
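A minimal sketch of this zoom-based variant follows. The set_focal_length interface and the focal-length values are assumptions for illustration; only the threshold decision mirrors the description above.

```python
# Sketch of the zoom-based variant: instead of re-aiming, the CPU widens
# the angle of view below the altitude threshold (larger overlap Ro) and
# narrows it above (smaller Ro). All values and the set_focal_length
# interface are illustrative assumptions.
WIDE_FOCAL_MM = 24.0   # wide-angle setting used at low altitude
TELE_FOCAL_MM = 70.0   # telephoto setting used at high altitude

def update_zoom(altitude_m: float, threshold_m: float, cam_a, cam_b) -> None:
    focal = WIDE_FOCAL_MM if altitude_m < threshold_m else TELE_FOCAL_MM
    cam_a.set_focal_length(focal)
    cam_b.set_focal_length(focal)
```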
Instead of zooming with a zoom lens, imaging may be performed with a camera having a wide-angle lens, and zooming may be achieved by cropping a portion of the image data covering the wide imaging region. By changing the cropped imaging region according to the altitude, the overlap between the cropped imaging regions of the first camera 400a and the second camera 400b can also be controlled.
Embodiments 1 and 2 describe a configuration in which the imaging regions are changed by controlling the actuators to pan and tilt the cameras. The imaging regions can also be changed by controlling the actuators to rotate the cameras in the roll direction, that is, about each camera's optical axis. The imaging regions (angles of view) of the first camera 400a and the second camera 400b are rectangular, with an aspect ratio such as 16:9. Therefore, the imaging regions can be changed by rotating the cameras in the roll direction between a landscape position and a portrait position. Here, the landscape position is the position in the camera's roll direction in which the longitudinal direction of the camera's imaging region is parallel to the direction in which the two cameras are aligned. The portrait position is the position in the camera's roll direction in which the longitudinal direction of the camera's imaging region is perpendicular to the direction in which the two cameras are aligned.
For example, when the altitude indicated by the altitude information is lower than the set threshold, the CPU 102 controls the actuator 401a and the actuator 401b so that the first camera 400a and the second camera 400b are in the landscape position. When the altitude indicated by the altitude information is higher than the set threshold, the CPU 102 controls the actuator 401a and the actuator 401b so that the first camera 400a and the second camera 400b are in the portrait position. In either case, the imaging regions of the first camera 400a and the second camera 400b are assumed to partially overlap. In this way, the imaging region along the direction in which the cameras are aligned can be varied, and the same effect as in the first embodiment can be obtained.
Instead of controlling the actuators, imaging may be performed with a camera having a wide-angle lens, and panning and tilting may be achieved by cropping a portion of the image data covering the wide imaging region; the image data can likewise be cropped in the roll direction.
Embodiment 1 describes a configuration in which a single altitude threshold is set to determine whether the altitude is high or low, and the camera orientations are changed in two steps. Multiple thresholds may be set so that the camera orientations change in three or more steps.
Embodiment 1 describes a configuration in which an altitude threshold is set to determine whether the altitude is high or low, and the camera orientations are changed accordingly. Alternatively, a table indicating the correspondence between altitude information and camera orientations may be prepared, and the camera orientations may be determined from the altitude information by referring to the table.
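A table-driven version of this control could look like the following sketch, in which a sorted list of altitude breakpoints maps to pan half-angles, generalizing the single threshold to several steps. The breakpoints and angles are illustrative assumptions.

```python
# Sketch of the table-driven variant: a sorted altitude-to-pan table
# replaces the single threshold, so the orientation changes in several
# steps. Breakpoints and pan angles are illustrative assumptions.
import bisect

ALTITUDE_BREAKS_M = [1000.0, 3000.0, 8000.0]     # table boundaries
PAN_HALF_ANGLES_DEG = [5.0, 15.0, 25.0, 35.0]    # one more entry than breaks

def pan_for_altitude(altitude_m: float) -> float:
    """Look up the pan half-angle for the band containing this altitude."""
    return PAN_HALF_ANGLES_DEG[bisect.bisect_right(ALTITUDE_BREAKS_M, altitude_m)]
```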
Embodiment 1 describes a configuration in which an altitude threshold is set to determine whether the altitude is high or low, and the camera orientations are changed accordingly. The camera orientations may instead be calculated from the altitude information using a predetermined formula and changed accordingly.
Embodiments 1 and 2 describe a configuration in which the process of controlling the camera orientations is repeated at regular time intervals. Using the travel distance acquired from the GPS module 300, this process may instead be repeated each time the aircraft travels a fixed distance.
Embodiments 1 and 2 describe changes to the camera orientations and imaging regions on the premise of the horizontal (panning) direction. Changes to the camera orientations and imaging regions apply equally in the vertical direction.
As described above, the embodiments have been described as examples of the technology in the present disclosure. For that purpose, the accompanying drawings and the detailed description have been provided.
Accordingly, the components described in the accompanying drawings and the detailed description may include not only components essential to solving the problem but also components that are not essential to solving the problem and are included to illustrate the above technology. Therefore, the mere fact that these non-essential components are described in the accompanying drawings or the detailed description should not be taken as an immediate finding that they are essential.
In addition, since the above embodiments are intended to illustrate the technology in the present disclosure, various changes, replacements, additions, omissions, and the like can be made within the scope of the claims or their equivalents.
Since the present disclosure can provide a camera control device capable of imaging at an appropriate angle of view, it is applicable to camera control devices in aircraft, trains, and the like.
10 In-flight system
100 Server apparatus
101 Interface
102 CPU
103 Memory
104 Geographic information database
105 Operation unit
200 Monitor
300 GPS module
400a First camera
400b Second camera
500 Compass
Claims (2)
- A camera control device comprising:
an interface that receives first image data generated by imaging by a first camera, second image data generated by imaging by a second camera, and altitude information regarding an altitude output by an altitude sensor, and that transmits drive signals to a first actuator capable of changing an imaging direction of the first camera and a second actuator capable of changing an imaging direction of the second camera; and
a controller that outputs the drive signals, transmitted via the interface, for driving the first actuator and the second actuator to control the imaging directions of the first camera and the second camera so that the lower the altitude indicated by the altitude information, the narrower a composite imaging region that is a range that can be imaged by at least one of the first camera and the second camera.
- A camera control device comprising:
an interface that receives first image data generated by imaging by a first camera, second image data generated by imaging by a second camera, position information regarding a current position output by a positioning sensor, and azimuth information output by a compass, and that transmits drive signals to a first actuator capable of changing an imaging direction of the first camera and a second actuator capable of changing an imaging direction of the second camera;
a geographic information database that holds landmark information regarding positions of landmarks; and
a controller that identifies a landmark located within a predetermined range of the current position based on the position information, the azimuth information, and the landmark information acquired from the geographic information database, and outputs the drive signals, transmitted via the interface, for driving the first actuator and the second actuator to control the imaging directions of the first camera and the second camera so that a position of the identified landmark is included in at least one of an imaging region of the first camera and an imaging region of the second camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/705,214 US10778899B2 (en) | 2016-03-30 | 2017-09-14 | Camera control apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016-068445 | 2016-03-30 | ||
JP2016068445A JP2019091961A (en) | 2016-03-30 | 2016-03-30 | Camera control unit |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/705,214 Continuation US10778899B2 (en) | 2016-03-30 | 2017-09-14 | Camera control apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017169089A1 true WO2017169089A1 (en) | 2017-10-05 |
Family
ID=59963846
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/003776 WO2017169089A1 (en) | 2016-03-30 | 2017-02-02 | Camera control device |
Country Status (3)
Country | Link |
---|---|
US (1) | US10778899B2 (en) |
JP (1) | JP2019091961A (en) |
WO (1) | WO2017169089A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018032436A1 (en) * | 2016-08-17 | 2018-02-22 | 深圳市大疆灵眸科技有限公司 | Pan-tilt control method and device, storage medium and unmanned aerial vehicle |
KR20190013224A (en) * | 2017-08-01 | 2019-02-11 | 엘지전자 주식회사 | Mobile terminal |
US12084199B2 (en) * | 2021-12-23 | 2024-09-10 | OneSky Flight LLC | Dynamic aircraft-specific graphical user interfaces |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001189888A (en) * | 1999-12-28 | 2001-07-10 | Ntt Data Corp | Device and method for indicating photographing and recording medium |
JP2008242725A (en) * | 2007-03-27 | 2008-10-09 | Seiko Epson Corp | Image data recording system and integrated circuit device |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH118843A (en) | 1997-06-16 | 1999-01-12 | Mitsubishi Heavy Ind Ltd | Visual inspection device |
JP3729161B2 (en) | 2001-08-07 | 2005-12-21 | カシオ計算機株式会社 | Target position search apparatus, target position search method and program |
JP2003289465A (en) | 2002-03-28 | 2003-10-10 | Fuji Photo Film Co Ltd | Imaging system and imaging method |
US7301569B2 (en) | 2001-09-28 | 2007-11-27 | Fujifilm Corporation | Image identifying apparatus and method, order processing apparatus, and photographing system and method |
US7880766B2 (en) | 2004-02-03 | 2011-02-01 | Panasonic Corporation | Detection area adjustment apparatus |
JP4418805B2 (en) | 2004-02-03 | 2010-02-24 | パナソニック株式会社 | Detection area adjustment device |
CA2554578A1 (en) * | 2004-02-17 | 2005-09-01 | Thales Avionics, Inc. | Broadcast passenger flight information system and method for using the same |
US7456847B2 (en) * | 2004-08-12 | 2008-11-25 | Russell Steven Krajec | Video with map overlay |
JP2013117649A (en) | 2011-12-05 | 2013-06-13 | Nikon Corp | Digital camera |
US10547825B2 (en) * | 2014-09-22 | 2020-01-28 | Samsung Electronics Company, Ltd. | Transmission of three-dimensional video |
2016
- 2016-03-30 JP JP2016068445A patent/JP2019091961A/en active Pending
2017
- 2017-02-02 WO PCT/JP2017/003776 patent/WO2017169089A1/en active Application Filing
- 2017-09-14 US US15/705,214 patent/US10778899B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001189888A (en) * | 1999-12-28 | 2001-07-10 | Ntt Data Corp | Device and method for indicating photographing and recording medium |
JP2008242725A (en) * | 2007-03-27 | 2008-10-09 | Seiko Epson Corp | Image data recording system and integrated circuit device |
Also Published As
Publication number | Publication date |
---|---|
JP2019091961A (en) | 2019-06-13 |
US20180007282A1 (en) | 2018-01-04 |
US10778899B2 (en) | 2020-09-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7503123B2 (en) | Surveying apparatus | |
CN109151402B (en) | Image processing method and image processing system of aerial camera and unmanned aerial vehicle | |
US20200218289A1 (en) | Information processing apparatus, aerial photography path generation method, program and recording medium | |
US11310412B2 (en) | Autofocusing camera and systems | |
CN111953892B (en) | Unmanned aerial vehicle and inspection method | |
WO2017169089A1 (en) | Camera control device | |
US10397474B2 (en) | System and method for remote monitoring at least one observation area | |
CN111247389B (en) | Data processing method and device for shooting equipment and image processing equipment | |
CN107040752B (en) | Intelligent ball-type camera, monitoring system and control method | |
JP2014063411A (en) | Remote control system, control method, and program | |
CN110706447A (en) | Disaster position determination method, disaster position determination device, storage medium, and electronic device | |
KR20030070553A (en) | Video picture processing method | |
JP2006270404A (en) | Device and method for controlling photographing and photographing control program | |
US11489998B2 (en) | Image capturing apparatus and method of controlling image capturing apparatus | |
JP7552589B2 (en) | Information processing device, information processing method, program, and information processing system | |
JP3919994B2 (en) | Shooting system | |
EP3547663A1 (en) | Panaoramic vision system with parallax mitigation | |
JP2021113005A (en) | Unmanned aircraft system and flight control method | |
US11415990B2 (en) | Optical object tracking on focal plane with dynamic focal length | |
CN111868656A (en) | Operation control system, operation control method, device, equipment and medium | |
US11586225B2 (en) | Mobile device, mobile body control system, mobile body control method, and program | |
JP2009086471A (en) | Overhead wire photographing system and method | |
CN114489085B (en) | Industrial robot motion control device based on machine vision | |
JP2011149957A (en) | Image display device, image display method, and program | |
CN114608555B (en) | Target positioning method, system and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17773638 Country of ref document: EP Kind code of ref document: A1 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17773638 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: JP |