CN112073630B - Image pickup apparatus, control method of image pickup apparatus, and recording medium - Google Patents

Info

Publication number: CN112073630B
Authority: CN (China)
Prior art keywords: image, unit, imaging, image pickup, image capturing
Legal status: Active (granted)
Application number: CN202010530853.1A
Other languages: Chinese (zh)
Other versions: CN112073630A
Inventor: 沼田爱彦
Current assignee: Canon Inc
Original assignee: Canon Inc
Application filed by Canon Inc
Publication of application: CN112073630A
Publication of grant: CN112073630B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/661: Transmitting camera control signals through networks, e.g. control via the Internet
    • H04N 23/69: Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • H04N 23/631: Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N 23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N 23/80: Camera processing pipelines; Components thereof
    • H04N 23/90: Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H04N 5/265: Mixing; Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects
    • H04N 23/698: Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

Abstract

The invention provides an image pickup apparatus, a control method of the image pickup apparatus, and a recording medium. To provide an image pickup apparatus in which the image pickup ranges of a plurality of image pickup units can be controlled easily, the image pickup apparatus includes: a first image pickup unit and a second image pickup unit; a driving mechanism configured to control an imaging range of at least one of the first imaging unit and the second imaging unit; a combining unit configured to combine a first image acquired by the first image capturing unit and a second image acquired by the second image capturing unit to generate a wide-angle image; a display unit configured to display an image; a user interface with which a user specifies an area or a position in the image displayed by the display unit; and a control unit configured to control, in accordance with the specified area or position, the imaging range of at least one of the first imaging unit and the second imaging unit and whether or not the combining unit performs the combining.

Description

Image pickup apparatus, control method of image pickup apparatus, and recording medium
Technical Field
The present invention relates to an imaging device and the like suitable for applications such as monitoring.
Background
In recent years, image pickup apparatuses have been proposed that acquire an image with a wider image pickup range than a single camera can provide (a wide-angle image) by combining images captured by a plurality of cameras arranged side by side (hereinafter referred to as a multi-eye camera). Japanese Patent Application Laid-Open No. 2004-118786 proposes an image pickup apparatus that can generate a wide-angle image by performing matching processing, which obtains the offset amounts between a plurality of images while shifting the images captured by the respective cameras of the multi-eye camera.
Further, so-called pan-tilt-zoom (PTZ) cameras, whose pointing direction can be controlled, have been proposed as image pickup apparatuses whose image pickup direction and the like can be changed after the apparatus is installed.
In the image pickup apparatus disclosed in japanese patent application laid-open No. 2004-118786, the positions of the respective multi-eye cameras are fixed. On the other hand, by adding a mechanism for controlling the imaging direction of each multi-eye camera, the position that the user desires to monitor can be changed.
However, to match the imaging direction of each camera of the multi-eye camera with the position that the user desires to monitor, the user must adjust the imaging direction of each camera by trial and error while checking its imaging range.
According to an aspect of the present invention, an object of the present invention is to provide an image pickup apparatus capable of easily controlling image pickup ranges of a plurality of image pickup units.
Disclosure of Invention
According to an aspect of the present invention, an image pickup apparatus for connecting to an external device via a network, the image pickup apparatus comprising: a first image pickup unit and a second image pickup unit; a driving mechanism configured to control an imaging range of at least one of the first imaging unit and the second imaging unit; and at least one processor or circuit that functions as: a combining unit configured to combine the first image acquired by the first image pickup unit and the second image acquired by the second image pickup unit to generate a wide-angle image; a receiving unit configured to receive, from the external device, information related to an area or a position specified by a user, wherein the area or the position is specified by one of the first image and the second image displayed on a screen of a display unit of the external device or by an image for indicating an imaging area displayed on a screen of a display unit of the external device; a control unit configured to control an imaging range of at least one of the first imaging unit and the second imaging unit and control whether or not the synthesizing unit synthesizes according to an area or a position specified by a user; and a transmission unit configured to transmit the first image, the second image, and the wide-angle image to the external device.
An image pickup apparatus for connecting to an external device via a network, the image pickup apparatus comprising: a first image pickup unit and a second image pickup unit; a driving mechanism configured to be able to control an imaging range of at least one of the first imaging unit and the second imaging unit; at least one processor or circuit operative to: a combining unit configured to combine the first image acquired by the first image pickup unit and the second image acquired by the second image pickup unit and generate a wide-angle image; a receiving unit configured to receive, from the external device, information related to an area or a position specified by a user, wherein the area or the position is specified by one of the first image and the second image displayed on a screen of a display unit of the external device or by an image for indicating an imaging area displayed on a screen of a display unit of the external device; and a control unit configured to determine guidance for instructing how to control an imaging range of at least one of the first imaging unit and the second imaging unit according to the specified area or position.
A control method for controlling an image pickup apparatus for connecting to an external device via a network, the image pickup apparatus having: a first image pickup unit and a second image pickup unit; a driving mechanism configured to control an imaging range of at least one of the first imaging unit and the second imaging unit, wherein the control method includes: combining the first image acquired by the first image pickup unit and the second image acquired by the second image pickup unit to generate a wide-angle image; receiving, from the external device, information related to an area or a position specified by a user, wherein the area or the position is specified by one of the first image and the second image displayed on a screen of a display unit of the external device or by an image for indicating an imaging area displayed on a screen of a display unit of the external device; controlling an imaging range of at least one of the first imaging unit and the second imaging unit according to the specified area or position and judging whether to perform composition; and transmitting the first image, the second image, and the wide-angle image to the external device.
A non-transitory computer-readable storage medium storing a computer program to control an image pickup apparatus connected to an external device via a network, the image pickup apparatus having: a first image pickup unit and a second image pickup unit; a driving mechanism configured to control an imaging range of at least one of the first imaging unit and the second imaging unit; a combining unit configured to combine the first image acquired by the first image pickup unit and the second image acquired by the second image pickup unit and generate a wide-angle image; and a receiving unit configured to receive, from the external device, information related to an area or a position specified by a user, wherein the area or the position is specified by one of the first image and the second image displayed on a screen of a display unit of the external device or by an image for indicating an imaging area displayed on a screen of a display unit of the external device; wherein the computer program comprises instructions for performing the following: controlling an imaging range of at least one of the first imaging unit and the second imaging unit according to the specified area or position and judging whether or not the synthesizing unit synthesizes; and transmitting the first image, the second image, and the wide-angle image to the external device.
Other features of the present invention will become apparent from the following description of exemplary embodiments with reference to the accompanying drawings.
Drawings
Fig. 1 is a layout diagram showing an image pickup apparatus according to embodiment 1 when viewed from the upper side.
Fig. 2 is a functional block diagram showing an image pickup apparatus according to embodiment 1.
Fig. 3A and 3B are diagrams showing an imaging range and a composite image display state of the imaging apparatus according to embodiment 1.
Fig. 4A to 4C are diagrams showing other image capturing ranges and image display states of the image capturing apparatus according to embodiment 1.
Fig. 5A to 5C are diagrams showing still other imaging ranges and image display states of the imaging apparatus according to embodiment 1.
Fig. 6A and 6B are diagrams showing an imaging range and a composite image display state of the imaging apparatus according to embodiment 2.
Fig. 7A to 7C are diagrams showing other image capturing ranges and image display states of the image capturing apparatus according to embodiment 2.
Fig. 8A to 8C are diagrams showing still other imaging ranges and image display states of the imaging apparatus according to embodiment 2.
Fig. 9A to 9D are diagrams showing a user interface of the image pickup apparatus according to embodiment 3.
Fig. 10A to 10D are diagrams showing a user interface of the image pickup apparatus according to embodiment 4.
Fig. 11A to 11C are diagrams showing a user interface of the image pickup apparatus according to embodiment 5.
Fig. 12A to 12C are diagrams showing other examples of a user interface of the image pickup apparatus according to embodiment 5.
Fig. 13A to 13D are diagrams showing a user interface of the image pickup apparatus according to embodiment 6.
Fig. 14A and 14B are flowcharts showing the operation of the image capturing apparatus according to the embodiment.
Detailed Description
An example of an image pickup apparatus according to an embodiment of the present invention will be described below with reference to the drawings. The same reference numerals are given to units having the same functions in the drawings, and repeated description thereof will be omitted.
In the embodiments, an example in which a web camera is applied as the image pickup apparatus will be described. However, the image pickup apparatus is assumed to encompass electronic apparatuses including a plurality of image pickup units, such as a digital still camera, a digital video camera, a smartphone with a camera, or a tablet computer with a camera.
Example 1
Fig. 1 shows the image pickup apparatus according to the present embodiment and a monitoring system using it: it is a layout diagram of the image pickup apparatus 100 viewed from the upper side (+Z-axis side), and fig. 2 is an internal functional block diagram. The image pickup apparatus 100 includes a first image pickup unit 110, a second image pickup unit 120, a first driving mechanism 111, a second driving mechanism 121, a control unit 130, a combining unit 140, and a first transmitting unit 150.
The first driving mechanism 111 and the second driving mechanism 121 function as driving units, and can control the imaging directions of the first imaging unit 110 and the second imaging unit 120 in the same plane (XY plane), respectively. The image pickup apparatus according to the present embodiment is configured to be able to control the image pickup direction in the panning direction.
Specifically, as shown in fig. 2, the first driving mechanism 111 and the second driving mechanism 121 each include a motor and a gear, and are configured to be capable of rotating the first image capturing unit 110 and the second image capturing unit 120 about the shaft 101 of fig. 1 as a rotation shaft by controlling electric power for driving the motors.
The power for driving the motor is controlled by the control unit 130. That is, the image pickup apparatus 100 is configured to be able to change the image pickup direction of each of the first image pickup unit 110 and the second image pickup unit 120 on the XY plane.
The first image pickup unit 110 and the second image pickup unit 120 include imaging optical systems 112 and 122, respectively, and solid-state image sensors 113 and 123, such as CMOS image sensors. An image is acquired by forming a subject image on the solid-state image sensors 113 and 123 via the imaging optical systems 112 and 122.
The driving and signal reading of the solid-state image sensors 113 and 123 are controlled by the control unit 130. The control unit 130 contains a CPU as a computer, and controls various operations of the entire apparatus based on a computer program stored in a memory (not shown).
The combining unit 140 combines the first image 114 acquired by the first image pickup unit 110 and the second image 124 acquired by the second image pickup unit 120 to generate a wide-angle image (panoramic image, combined image) 134.
Specifically, by applying a so-called "pattern matching technique" in which a correlation coefficient is obtained while shifting an overlapping portion of images, a positional shift amount between a plurality of images is obtained to generate a wide-angle image 134. Further, in the present embodiment, the combining unit 140 changes whether the wide-angle image 134 is to be generated according to the area (image area) or the position specified by the user.
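The offset estimation described above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the function name, the strip-by-strip search, and the `max_shift` limit are assumptions, and real stitching pipelines typically search in two dimensions.

```python
import numpy as np

def estimate_shift(left: np.ndarray, right: np.ndarray, max_shift: int = 50) -> int:
    """Estimate the horizontal overlap (in pixels) between two grayscale
    images by sliding their adjoining edges over each other and keeping
    the shift with the highest correlation coefficient."""
    h = min(left.shape[0], right.shape[0])
    best_shift, best_corr = 0, -1.0
    for shift in range(1, max_shift + 1):
        a = left[:h, -shift:].astype(float).ravel()   # right-edge strip of the left image
        b = right[:h, :shift].astype(float).ravel()   # left-edge strip of the right image
        if a.std() == 0 or b.std() == 0:              # flat strips carry no signal
            continue
        corr = np.corrcoef(a, b)[0, 1]                # correlation coefficient
        if corr > best_corr:
            best_corr, best_shift = corr, shift
    return best_shift
```

Once the shift is known, the two images can be pasted with the overlapping columns blended or taken from either source.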
In the case where the wide-angle image 134 is not generated, the combining unit transmits the first image 114 and the second image 124 to the first transmitting unit 150 without combining the first image 114 and the second image 124.
In addition, in the present embodiment, the imaging range is a range in which the imaging unit performs imaging. The imaging range is changed by controlling, for example, the imaging direction, the zoom magnification, the rotation angle of the imaging surface, or the like.
The first transmission unit 150, which serves as both a reception unit and a transmission unit, transmits the image transmitted from the combining unit 140 (the first image 114, the second image 124, or the wide-angle image 134) to the external client apparatus (external device) 180 via a wired or wireless network.
The external client apparatus 180 transmits a command for controlling the image pickup apparatus 100 to the first transmission unit 150 via the second transmission unit 181 and the network; the image pickup apparatus 100 receives the command and returns a response to the client apparatus 180.
The command is, for example, a command for controlling the first drive mechanism 111 and the second drive mechanism 121. That is, the user can control the directions of the first image capturing unit 110 and the second image capturing unit 120 from the external client apparatus 180 via the network.
The client apparatus 180 is an external device such as a PC, and includes a user interface 160 with which the user specifies an area or a position.
Reference numeral 182 denotes a control unit that performs internal control of the client apparatus 180 and contains a computer such as a CPU. The control unit 182 contains a memory (not shown), and a computer program for controlling the operation of the CPU within the control unit is stored in the memory.
Reference numeral 183 denotes a display unit, and displays an image or the like transmitted from the image pickup apparatus 100. The user interface 160 includes various switches and touch panels. Further, the user interface 160 includes a GUI such as buttons or icons displayed on the display unit 183. The user can provide various instructions to the image pickup apparatus 100 by operating the user interface 160.
The image pickup apparatus 100 according to the present embodiment controls the image pickup ranges of the first image pickup unit 110 and the second image pickup unit 120, and controls the combining operation in the combining unit 140, according to the image pickup frame (area) or (image pickup) position specified by the user via the user interface 160.
The network is constituted by a wired LAN, a wireless LAN, and the like. The image pickup apparatus 100 may be supplied with electric power via a network.
In the present embodiment, the synthesizing unit 140 is provided in the image pickup apparatus 100, but the synthesizing unit 140 may be provided in the client apparatus 180.
The image pickup apparatus 100 and the client apparatus 180 are included in an image pickup system.
In embodiment 1 shown in fig. 2, the image pickup apparatus 100 includes a first transmission unit 150 that transmits an image to the client apparatus 180 and operates according to a command from the client apparatus 180.
That is, in the present embodiment, the display unit 183, the control unit 182, and the user interface 160 are separate from the image pickup apparatus. However, the image pickup apparatus 100 may integrally include a memory for storing image data, a display unit 183 for displaying images, and a user interface 160, such as switches or a touch panel, for receiving user operations.
That is, the image pickup apparatus 100 may integrally have the function of the client apparatus 180. The image pickup apparatus according to the present embodiment is assumed to have a system configuration including the functions of the client apparatus 180 integrally or individually.
Fig. 3A and 3B are diagrams showing a user interface for a user to specify an area or a position in the image capturing apparatus 100 according to the present embodiment, and a relationship between the image capturing directions of the first image capturing unit 110 and the second image capturing unit 120.
Fig. 3A and 3B show a state before the user designates the image capturing position. In fig. 3A, a wide-angle image 134 generated by synthesizing the first image 114 and the second image 124 is displayed in a user interface 160 such as a GUI or the like on a display unit 183. At this time, it is assumed that the first image capturing unit 110 and the second image capturing unit 120 are oriented in the image capturing direction shown in fig. 3B.
Here, it is assumed that the user designates a position (center of the area) using a mouse or a cursor in a state in which the wide-angle image 134 is displayed on the display unit 183.
In embodiment 1, an example of a case where the user designates the center of a region using a cursor such as an arrow or the like is shown.
First, as shown in fig. 4A, a case will be considered in which the user designates two centers 161 and 162 of imaging ranges for the first image capturing unit 110 and the second image capturing unit 120 as image capturing positions. In this case, it is considered that the user desires to capture images of both the periphery of the center 161 and the periphery of the center 162. Therefore, as shown in fig. 4B, the combining unit 140 does not perform the combining.
The imaging range is changed by changing the imaging directions of the first imaging unit 110 and the second imaging unit 120 so that the first image 114 and the second image 124 are displayed at positions centered on centers 161 and 162, respectively.
Fig. 4C shows the first and second imaging units 110 and 120 and the imaging ranges 115 and 125 after movement. By displaying the first image 114 and the second image 124 independently instead of the wide-angle image 134, the user can be provided with images within a plurality of imaging ranges.
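The relationship between a position designated on the displayed image and the pan angle needed to centre an imaging unit on it can be illustrated with a simple calculation. This is a hedged sketch: the function name and the assumption of an undistorted pinhole projection are not from the patent.

```python
import math

def position_to_pan_deg(x_px: float, image_width_px: float, fov_deg: float) -> float:
    """Map a horizontal pixel position in the displayed image to a pan angle
    relative to the image centre, assuming an ideal pinhole camera model."""
    # focal length expressed in pixel units for the displayed field of view
    f_px = (image_width_px / 2) / math.tan(math.radians(fov_deg / 2))
    dx = x_px - image_width_px / 2       # offset from the image centre
    return math.degrees(math.atan2(dx, f_px))
```

The control unit would then drive the pan motor by the returned angle so that the designated position becomes the new imaging-range centre.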
On the other hand, as shown in fig. 5A, a case will be considered in which the user designates only one center 161 of the area as a position. In this case, it is considered that the user desires to take an image of only the surroundings of the center 161.
Accordingly, as shown in fig. 5B, the combining unit 140 performs combining, and the imaging directions of the first imaging unit 110 and the second imaging unit 120 are shifted so that the wide-angle image 134 of the imaging range centered on the center 161 is displayed.
Fig. 5C shows the first and second imaging units 110 and 120 and the imaging ranges 115 and 125 after movement. Specifically, the imaging range 115 of the first imaging unit 110 overlaps the imaging range 125 of the second imaging unit 120, and the center of the overlapping range 135 moves to the position of the center 161.
By performing control in this way and displaying the wide-angle image 134 instead of displaying the first image 114 and the second image 124 independently, it is possible to provide the user with an image of a wide imaging range around the center 161.
As shown in fig. 5C, when the image capturing ranges 115 and 125 are equal in size, the center 136 of the overlap range 135 coincides with the center 161 (image capturing position) of the image capturing range. Therefore, the center of the display image of the wide-angle image 134 can be made coincident with the center 161 of the imaging range.
Here, the size of the overlap range 135 may be any size sufficient for obtaining the positional offset between the first image 114 and the second image 124. Specifically, the number of pixels of the first image 114 and the second image 124 included in the overlap range 135 is preferably 100 or more. The width of the overlap range 135 is further preferably 20% or more of the narrower of the first imaging range 115 and the second imaging range 125.
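These two guidelines can be expressed as a small check. The function below is a hypothetical helper, not part of the patent; it simply encodes the 100-pixel and 20% figures quoted above.

```python
def overlap_is_sufficient(overlap_width_px: int, height_px: int,
                          width1_px: int, width2_px: int) -> bool:
    """Check the two guidelines quoted in the text: the overlap range should
    contain at least 100 pixels, and its width should be at least 20% of the
    narrower of the two imaging ranges."""
    pixel_count = overlap_width_px * height_px       # pixels inside the overlap
    narrower = min(width1_px, width2_px)             # the not-relatively-wide range
    return pixel_count >= 100 and overlap_width_px >= 0.2 * narrower
```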
As described above, the image capturing apparatus 100 according to the present embodiment changes at least one of the image capturing direction of the first image capturing unit 110 and the image capturing direction of the second image capturing unit 120 according to the image capturing position specified by the user.
Further, whether the combining unit performs the combining process changes according to the image capturing position specified by the user. With this configuration, the imaging directions can be aimed at the position the user desires to monitor, and appropriate combining control can be performed, more easily than in the related art.
Fig. 3A shows an example in which the wide-angle image 134 obtained by the composition is displayed in a display state when the user designates the image capturing range, but the first image 114 and the second image 124 may be displayed separately.
The image need not be displayed until the user designates the imaging range. For example, an indication corresponding to the maximum imaging ranges of the first and second image capturing units may be shown on the display, and the user may specify the imaging range within that indication. In the case where the combined wide-angle image is displayed and the full panning direction (360 degrees) is covered by a plurality of imaging units (two or more), the full panning direction is preferably displayed.
When the user designates an imaging range again after having designated one, the presence or absence of the combining process may likewise be changed according to the number of imaging ranges designated by the user. Specifically, when the user specifies one imaging range, the combining process is performed and the wide-angle image 134 is displayed; when the user designates two imaging ranges, the first image 114 and the second image 124 may be displayed separately without performing the combining process.
Whether two imaging ranges have been designated simultaneously can be determined as follows: for example, when the mouse is clicked twice, the interval between the two clicks is within a predetermined time, and the two designated positions are separated by a predetermined distance or more, the two imaging ranges are regarded as designated simultaneously. Conversely, when the interval between the two clicks exceeds the predetermined time or the two positions are not separated by the predetermined distance, it is determined that only one imaging range is designated.
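The determination described above can be sketched as follows. The function name and the concrete threshold values (1 second, 200 pixels) are illustrative assumptions standing in for the patent's unspecified "predetermined time interval" and "predetermined distance".

```python
import math

def interpret_clicks(t1_s: float, xy1: tuple, t2_s: float, xy2: tuple,
                     max_interval_s: float = 1.0,
                     min_dist_px: float = 200.0) -> str:
    """Decide whether two clicks designate two separate imaging ranges
    (shown without combining) or a single range (shown as a combined
    wide-angle image), following the rule described above."""
    within_time = abs(t2_s - t1_s) <= max_interval_s
    far_apart = math.hypot(xy2[0] - xy1[0], xy2[1] - xy1[1]) >= min_dist_px
    return "two ranges" if (within_time and far_apart) else "one range"
```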
Example 2
Next, embodiment 2 will be described with reference to fig. 6A to 8C. The image pickup apparatus 200 according to embodiment 2 is different from the image pickup apparatus 100 described in embodiment 1 in the structures of the first driving mechanism and the second driving mechanism and the structures of the first image pickup unit and the second image pickup unit.
In the image pickup apparatus 200, the first image pickup unit 210 and the second image pickup unit 220 include a mechanism for changing a zoom magnification in addition to control of an image pickup direction. Specifically, the imaging optical system of each image pickup unit includes a zoom lens movable in the optical axis direction, and the first driving mechanism 211 and the second driving mechanism 221 drive the zoom lens so that the image pickup range can be controlled according to the image pickup direction and the zoom magnification of the image pickup unit.
The other structures are substantially the same as those of embodiment 1; for the purpose of description, the 100-series reference numerals of the units described in embodiment 1 are replaced with corresponding 200-series reference numerals in embodiment 2.
In the image pickup apparatus 200, in addition to the presence or absence of the combining process in the combining unit 240, the image pickup directions of the first image pickup unit 210 and the second image pickup unit 220 and the zoom magnifications of the first image pickup unit 210 and the second image pickup unit 220 are also changed according to the image pickup frame specified by the user.
Fig. 6A to 8C are diagrams showing a relationship between a user interface 260 used by a user to designate a photographing frame in the image pickup apparatus 200, and the image pickup directions of the first image pickup unit 210 and the second image pickup unit 220.
Fig. 6A and 6B show display states when the user designates an imaging range. In fig. 6A, a wide-angle image 234 generated by combining the first image 214 and the second image 224 is displayed. The first imaging unit 210, the second imaging unit 220, and the imaging ranges 215 and 225 at this time are shown in fig. 6B. In fig. 6B, it is assumed that the state before the user designates the image capturing range is a state in which the zoom magnification of the first image capturing unit 210 and the second image capturing unit 220 is minimum (so-called wide-angle end).
The imaging directions of the first imaging unit 210 and the second imaging unit 220 are controlled so that the combined image covers the maximum range, that is, so that the angle difference in the panning direction is maximized while the overlap necessary to generate the combined image is ensured.
Here, it is assumed that the user designates an image capturing frame (area) on the display screen of the wide-angle image 234 using a mouse or a cursor. Embodiment 2 shows an example in which the user designates the image capturing frame as a rectangular frame.
First, as shown in fig. 7A, consider a case where the user designates an image pickup frame as two rectangular image pickup frames 261 and 262 using a mouse or the like.
In this case, it is considered that the user desires to image the range of the two rectangular image capturing frames 261 and 262. Therefore, as shown in fig. 7B, the synthesizing unit 240 does not perform synthesis. Then, the imaging ranges of the first imaging unit 210 and the second imaging unit 220 are changed by controlling the imaging direction and controlling the zoom magnification so that the first image 214 and the second image 224 in the object range corresponding to the imaging frames 261 and 262 are displayed separately.
Fig. 7C shows the first imaging unit 210, the second imaging unit 220, and the imaging ranges 215 and 225 after the imaging direction and the zoom magnification have been controlled. Compared with the state of fig. 6B, both the imaging direction and the zoom magnification (angle of view) of each imaging unit are changed. Further, by displaying the first image 214 and the second image 224 separately instead of the wide-angle image 234, images of a plurality of imaging ranges can be presented to the user.
On the other hand, as shown in fig. 8A, consider a case where the user designates only one rectangular image capturing frame 261. In this case, the image capturing range in which the user desires to capture an image is considered to be the range of only the rectangular image capturing frame 261. Accordingly, as shown in fig. 8B, the combining unit 240 performs combining, and changes the imaging range by controlling the imaging directions and zoom magnifications of the first imaging unit 210 and the second imaging unit 220 so that a wide-angle image 234 obtained by imaging the range of the imaging frame 261 is displayed.
Fig. 8C shows the first imaging unit 210, the second imaging unit 220, and the imaging ranges 215 and 225 after the imaging direction and the zoom magnification have been controlled. Compared with the state of fig. 6B, both the imaging direction and the zoom magnification of each imaging unit are changed. Specifically, the imaging range 215 of the first imaging unit 210 overlaps with the imaging range 225 of the second imaging unit 220, and the union of the imaging ranges 215 and 225 is moved to coincide with the imaging frame 261.
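As an illustration only (not part of the patent disclosure), the geometry of this control can be sketched as follows: given a target pan range for the imaging frame and a required overlap fraction for stitching, the pan center and angle of view of each of the two imaging units can be derived. The function name, the overlap fraction, and the representation of angles as scalars are all assumptions.

```python
def split_pan_range(frame_left, frame_right, overlap_frac=0.1):
    """Divide the pan range [frame_left, frame_right] (degrees) between
    two imaging units so that their fields of view together cover the
    frame and share an overlap of overlap_frac of one view's width."""
    total = frame_right - frame_left
    # The union of two views of width w with overlap o*w covers 2w - o*w,
    # so 2w - o*w = total  =>  w = total / (2 - o).
    w = total / (2.0 - overlap_frac)
    c1 = frame_left + w / 2.0    # pan center of the first unit
    c2 = frame_right - w / 2.0   # pan center of the second unit
    return (c1, w), (c2, w)
```

Under these assumptions, covering a 90-degree frame with 10% overlap would require roughly a 47-degree angle of view per unit.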
When the range of the imaging frame 261 is divided and imaged by the first imaging unit 210 and the second imaging unit 220 in this way, and displayed as the wide-angle image 234 obtained by combining the acquired first image 214 and second image 224, the following advantages are obtained.
First, a case where the range of the image capturing frame 261 exceeds the maximum angle of view (angle of view at the wide angle end) of the first image capturing unit 210 and the second image capturing unit 220 will be described. In this case, the range of the image capturing frame 261 cannot be captured by each individual image capturing unit.
Therefore, when the range of the image capturing frame 261 is displayed without performing the synthesis processing, a part of the range of the image capturing frame 261 is displayed as the first image 214 and another part is displayed as the second image 224.
On the other hand, displaying the range of the imaging frame 261 as the single wide-angle image 234 is preferable to displaying the first image 214 and the second image 224 independently, because it improves the visibility of the overlapping range.
In other words, by performing the combining process even in the case where the range of the imaging frame 261 exceeds the angle of view at the wide-angle end of each imaging unit, an imaging range exceeding the angle of view at the wide-angle end can be displayed as a single wide-angle image, which improves visibility.
Next, a case where the range of the image capturing frame 261 does not exceed the widest angle of view (angle of view at the wide angle end) of the first image capturing unit 210 and the second image capturing unit 220 will be described. In this case, the range of the image capturing frame 261 can be captured by each image capturing unit as a single unit. That is, even in the case where the synthesis process is not performed, the range of the photographic frame 261 can be displayed as a single image.
Here, in order to improve the resolution of the image, it is more preferable to divide and image the range of the image frame 261 with the first image capturing unit 210 and the second image capturing unit 220 and display the range of the image frame 261 as a wide-angle image 234 obtained by combining the acquired first image 214 and second image 224.
As described above, in the case where only the range of one image pickup frame 261 is specified, the first image 214 and the second image 224 are not displayed independently. By displaying the wide-angle image 234, the imaging range that can be displayed as a single image can be widened or the resolution of the imaging range can be improved.
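The display decision of embodiment 2 can be summarized in a small sketch. This is illustrative only; the function name and return labels are not from the patent: one designated frame leads to a combined wide-angle image, while two frames are imaged and displayed separately.

```python
def display_mode(num_frames, num_units=2):
    """Decide whether the synthesizing unit combines the images."""
    if num_frames == 1:
        return "combine"   # divide the single frame between units and stitch
    if num_frames <= num_units:
        return "separate"  # one unit per frame, no synthesis
    return "share"         # more frames than units: a unit covers several
```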
Fig. 6B shows an example in which the imaging ranges of the first imaging unit 210 and the second imaging unit 220 are set at the maximum angle of view (so-called wide-angle end) in a state before the user designates the image capturing frame, but the imaging ranges may not be set at the wide-angle end.
When the user designates an imaging frame, the desired range is narrowed down from the state in which the imaging range is at its maximum, which makes the range easy to designate. Therefore, the imaging ranges are preferably set at the wide-angle end beforehand.
Embodiment 3
The image pickup apparatus 300 according to embodiment 3 differs from the image pickup apparatus 200 described in embodiment 2 in the structures of the first driving mechanism and the second driving mechanism. In the image pickup apparatus 300, the first imaging unit 310 and the second imaging unit 320 can each rotate about two mutually perpendicular axes.
Specifically, in addition to a rotation mechanism (so-called pan drive mechanism) centered on the Z axis (vertical axis) of fig. 1, a rotation mechanism (so-called tilt drive mechanism) capable of controlling an angle with respect to the Z axis is included. Further, the other structures are substantially the same as those of embodiment 2, but in embodiment 3, for the purpose of illustration, the reference numerals of 200 series of the units described in embodiment 2 are replaced with the reference numerals of 300 series.
Fig. 9A to 9D are diagrams showing a user interface 360 used by a user in the image pickup apparatus 300 to designate an image pickup frame. In the image pickup apparatus 300, the state before the user designates the image pickup frame and the method of designating the image pickup frame are the same as those in the image pickup apparatus 200.
In the case where the user designates two imaging frames, the imaging directions and zoom magnifications of the first imaging unit 310 and the second imaging unit 320 are controlled such that the first image 314 and the second image 324 corresponding to the ranges of the imaging frames designated by the user are displayed.
On the other hand, in embodiment 3, consider a case where the user designates only one rectangular imaging frame 361. In this case, as shown in figs. 9A and 9B, the range of the imaging frame 361 is divided and imaged by the first imaging unit 310 and the second imaging unit 320, and is displayed as the wide-angle image 334 obtained by combining the acquired first image 314 and second image 324. Here, the imaging directions of the first imaging unit 310 and the second imaging unit 320 differ according to the form of the imaging frame 361 designated by the user.
For example, consider a case (a portrait-oriented frame) in which the aspect ratio of the imaging frame 361, i.e., its length in the tilt direction (the longitudinal direction of fig. 9A) divided by its length in the pan direction (the lateral direction of fig. 9A), is equal to or greater than a first threshold value. In this case, as shown in fig. 9C, the range of the imaging frame 361 is divided in the tilt direction and imaged by the first imaging unit 310 and the second imaging unit 320.
Here, the aspect ratio is a value obtained by dividing the longitudinal length by the transverse length. For example, the area indicated by the one-dot chain line in fig. 9C is an area corresponding to the imaging range 315 of the first imaging unit 310, and the area indicated by the two-dot chain line is an area corresponding to the imaging range 325 of the second imaging unit 320.
That is, in fig. 9C, the imaging ranges (imaging direction and zoom magnification) of the first imaging unit 310 and the second imaging unit 320 are controlled such that the imaging ranges of the first imaging unit 310 and the second imaging unit 320 in the pan direction are the same, and the imaging ranges are different only in the tilt direction.
On the other hand, consider a case (a landscape-oriented frame) in which the aspect ratio of the imaging frame 361, i.e., its length in the tilt direction (the longitudinal direction of fig. 9B) divided by its length in the pan direction (the lateral direction of fig. 9B), is smaller than the first threshold value. In this case, as shown in fig. 9D, the range of the imaging frame 361 is divided in the pan direction and imaged by the first imaging unit 310 and the second imaging unit 320.
The area indicated by the one-dot chain line in fig. 9D is an area corresponding to the imaging range 315 of the first imaging unit 310, and the area indicated by the two-dot chain line is an area corresponding to the imaging range 325 of the second imaging unit 320.
That is, in fig. 9D, the imaging directions of the first imaging unit 310 and the second imaging unit 320 are controlled such that the imaging ranges of the two units are the same in the tilt direction and differ only in the pan direction.
In this way, by changing the direction in which the imaging range is divided according to the aspect ratio of the imaging frame (the ratio of its length in the tilt direction to its length in the pan direction), the range that can be displayed as a single image can be further widened, or the resolution of the imaging range can be further improved.
The first threshold value may be determined based on the aspect ratio of the imaging range of each imaging unit.
Specifically, an average value of the aspect ratio of the imaging range of the first imaging unit 310 and the aspect ratio of the second imaging unit 320 may be set as the first threshold.
The first threshold may deviate from this average by about 20%. That is, the first threshold is preferably equal to or greater than 0.8 times, and equal to or less than 1.2 times, the average of the aspect ratio of the imaging range of the first imaging unit 310 and the aspect ratio of the imaging range of the second imaging unit 320.
Since the aspect ratio of the imaging range of each imaging unit does not change according to the zoom magnification, the aspect ratio can be uniquely defined regardless of the size of the imaging range.
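A hedged sketch of this rule follows; the function names are illustrative and not from the patent. The first threshold is derived from the average aspect ratio of the two units' imaging ranges, and the frame's aspect ratio then decides the dividing direction.

```python
def first_threshold(unit1_aspect, unit2_aspect):
    """Average of the two units' (tilt length / pan length) aspect ratios.
    The patent allows the threshold to deviate roughly +/-20% from this."""
    return (unit1_aspect + unit2_aspect) / 2.0

def split_direction(frame_h, frame_w, threshold):
    """Divide in the tilt direction for portrait-like frames, otherwise
    in the pan direction."""
    aspect = frame_h / frame_w   # length in tilt dir / length in pan dir
    return "tilt" if aspect >= threshold else "pan"
```

For 16:9 imaging units (aspect 9/16 = 0.5625), a frame twice as tall as wide would be divided in the tilt direction, and a frame twice as wide as tall in the pan direction.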
Embodiment 4
The image pickup apparatus 400 according to embodiment 4 differs from the image pickup apparatus 300 described in embodiment 3 in the structures of the first driving mechanism and the second driving mechanism. The image pickup apparatus 400 can control the imaging directions and imaging ranges of the first imaging unit 410 and the second imaging unit 420, and additionally includes a mechanism (a so-called rotation drive mechanism) capable of rotating the imaging surface of each imaging unit about the optical axis of that unit.
Further, the other structures are substantially the same as those of embodiment 3, but reference numerals of 300 series for the units described in embodiment 3 are replaced with reference numerals of 400 series for the purpose of illustration in embodiment 4.
Fig. 10A to 10D are diagrams showing a user interface 460 used by the user to designate an imaging frame in the image pickup apparatus 400. In the image pickup apparatus 400, as in the image pickup apparatus 300, the state before the user designates an imaging frame, the method of designating an imaging frame, and the operation when the user designates two imaging frames are the same as those of the image pickup apparatus 200 and the like, and therefore a description thereof will be omitted.
On the other hand, a case where the user designates one rectangular image capturing frame 461 as an image capturing frame will be considered.
In this case, the range of the image capturing frame 461 is divided and captured with the first image capturing unit 410 and the second image capturing unit 420, and is displayed as a wide-angle image 434 obtained by synthesizing the acquired first image 414 and second image 424.
Here, the imaging directions of the first imaging unit 410 and the second imaging unit 420 change according to the form of the imaging frame 461 designated by the user. Further, in the present embodiment, the rotation angles (the angles of the imaging surfaces) of the first imaging unit 410 and the second imaging unit 420 also differ according to the form of the designated imaging frame 461.
In general, a solid-state image sensor in an image pickup unit for monitoring is longer in length in the lateral direction than in the longitudinal direction in many cases. Therefore, in the case where the image capturing unit rotates around the optical axis, the image capturing range of the image capturing unit changes.
Therefore, the imaging ranges can easily be matched to the shape of the imaging frame 461 by controlling the imaging direction and the rotation angle as follows, according to the ratio (aspect ratio) of the length of the imaging frame 461 in the tilt direction to its length in the pan direction.
First, a case will be considered in which the aspect ratio of the imaging frame 461 (its length in the tilt direction divided by its length in the pan direction) is equal to or greater than the second threshold value, which is larger than the first threshold value. At this time, as shown in fig. 10A, the range of the imaging frame 461 is divided in the tilt direction and imaged by the first imaging unit 410 and the second imaging unit 420. Each imaging unit is rotated by 90 degrees into a portrait orientation, so that the imaging ranges 415 and 425 become longer in the tilt direction.
Next, consider the following: the ratio of the length of the image capturing frame 461 in the pitch direction to the length thereof in the pan direction is equal to or greater than the first threshold value and less than the second threshold value. At this time, as shown in fig. 10B, the range of the imaging frame 461 is divided and imaged in the pitch direction by the first imaging unit 410 and the second imaging unit 420.
The imaging units are not rotated, so the imaging ranges 415 and 425 remain landscape-oriented, longer in the pan direction than in the tilt direction.
Next, the following will be considered: the ratio of the length of the image capturing frame 461 in the pitch direction to the length thereof in the pan direction is smaller than the first threshold value and equal to or larger than the third threshold value. The third threshold is less than the first threshold.
At this time, as shown in fig. 10C, the range of the imaging frame 461 is divided in the pan direction and imaged by the first imaging unit 410 and the second imaging unit 420. In addition, each imaging unit is rotated by 90 degrees into a portrait orientation, so that the imaging ranges 415 and 425 become longer in the tilt direction.
Finally, a case will be considered in which the aspect ratio of the imaging frame 461 (its length in the tilt direction divided by its length in the pan direction) is smaller than the third threshold value.
At this time, as shown in fig. 10D, the range of the image capturing frame 461 is divided and captured in the panning direction by the first image capturing unit 410 and the second image capturing unit 420.
The imaging units are not rotated, and the imaging ranges 415 and 425 remain landscape-oriented, longer in the pan direction.
In fig. 10A to 10D, the area indicated by the one-dot chain line is an area corresponding to the imaging range 415 of the first imaging unit 410, and the area indicated by the two-dot chain line is an area corresponding to the imaging range 425 of the second imaging unit 420.
In this way, by controlling the rotation angle of the imaging surface of each imaging unit around each optical axis thereof in addition to the imaging direction and the imaging range of each imaging unit, it is possible to further widen the range that can be displayed as a single image or to further improve the resolution of the imaging range.
The second threshold and the third threshold may be determined based on the aspect ratio of the imaging range of each imaging unit. Specifically, the second threshold value is preferably equal to or greater than 1.4 times and equal to or less than 2.8 times the first threshold value, and the third threshold value is preferably equal to or greater than 0.35 times and equal to or less than 0.7 times the first threshold value.
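The four cases above can be sketched as follows. This is an illustration only; the concrete choices T2 = 2 × T1 and T3 = 0.5 × T1 are assumptions taken from within the stated preferred ranges, and the function name is invented.

```python
def split_and_rotation(frame_h, frame_w, t1):
    """Return (dividing direction, rotation angle in degrees) for a frame
    of height frame_h (tilt direction) and width frame_w (pan direction),
    given the first threshold t1. Assumes t2 = 2*t1 and t3 = t1/2."""
    t2, t3 = 2.0 * t1, 0.5 * t1
    aspect = frame_h / frame_w       # tilt length / pan length
    if aspect >= t2:
        return ("tilt", 90)   # fig. 10A: split vertically, units portrait
    if aspect >= t1:
        return ("tilt", 0)    # fig. 10B: split vertically, units landscape
    if aspect >= t3:
        return ("pan", 90)    # fig. 10C: split horizontally, units portrait
    return ("pan", 0)         # fig. 10D: split horizontally, units landscape
```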
Embodiment 5
In embodiments 1 to 4, image pickup apparatuses including two imaging units (i.e., a first imaging unit and a second imaging unit) have been described, but the image pickup apparatus may include three or more imaging units. That is, one or more imaging units may be included in addition to the first imaging unit and the second imaging unit. This is preferable because, as the number of imaging units increases, the apparatus can respond more flexibly to requests from the user.
The image pickup apparatus 500 described in embodiment 5 includes four image pickup units: in addition to the first imaging unit 510 and the second imaging unit 520, there are a third imaging unit and a fourth imaging unit.
As with the image capturing units of the image capturing apparatus 400 described in embodiment 4, each image capturing unit can control all of pan, tilt, zoom, and rotation.
Further, the other structures are substantially the same as those of embodiment 4, but in embodiment 5, for the purpose of illustration, the reference numerals of the 400 series of the units described in embodiment 4 are replaced with the reference numerals of the 500 series of the units.
Fig. 11A to 11C are diagrams showing a user interface 560 for a user to designate a picture frame in the image pickup apparatus 500.
First, a case will be considered in which the user designates the same number of imaging frames (4 in this case) as the number of imaging units included in the imaging apparatus 500 as rectangular imaging frames 561, 562, 563, and 564.
In this case, as shown in fig. 11A, the imaging directions, imaging ranges, and rotation angles of the first imaging unit 510, the second imaging unit 520, the third imaging unit, and the fourth imaging unit are controlled so that the respective imaging frames become imaging ranges. In this case, the synthesizing unit does not perform synthesizing processing on any image.
Next, a case will be considered in which the user designates a number M of imaging frames that is larger than the number N of imaging units included in the image pickup apparatus 500 (M > N). Fig. 11B shows a case where the user designates five imaging frames (M = 5), i.e., imaging frames 561, 562, 563, 564, and 565. In this case, one imaging unit must image the ranges of two imaging frames.
Therefore, as shown in fig. 11C, one imaging unit images the range of the two imaging frames 561 and 562 whose centers are closest to each other. The imaging directions, imaging ranges, and rotation angles of the remaining three imaging units are controlled so that they image the imaging frames 563, 564, and 565, respectively.
That is, the imaging range of one predetermined imaging unit is controlled such that at least two image frames or imaging positions are included in the imaging range of the predetermined imaging unit.
At this time, not all the imaging ranges of the imaging units that image the imaging frames 561 and 562 are displayed, but only the image portions corresponding to the imaging frames 561 and 562 may be cut out and displayed.
In this case, the image of the cut-out frame can be enlarged and displayed. In the case where six or more image capturing frames are specified by the user, the number of image capturing frames for which one image capturing unit is responsible may be changed according to the difference between the number of image capturing frames specified by the user and the number of image capturing units included in the image capturing apparatus.
Fig. 11B shows a case where the number of frames designated by the user is greater than the number of imaging units by 1 (M − N = 1), but the number of designated frames may exceed the number of imaging units by 2 or more. In the above-described example, control is performed such that the range of the plurality of imaging frames whose centers are closest to each other is imaged by one imaging unit.
However, for example, for each combination of a plurality of imaging frames, the area of the smallest rectangle enclosing those imaging frames may be calculated, the combination with the smallest enclosing rectangle may be selected, and that range may be imaged by one imaging unit. In addition, the imaging range controllable by each of the plurality of imaging units (the range over which the imaging direction can be changed, or the range over which the zoom magnification can be changed) may be stored in the memory.
Information such as the control characteristics of each of the plurality of imaging units (for example, the speed at which the imaging direction or zoom magnification is changed), the number of pixels, and the current imaging range may be stored in the memory in advance. Based on such information, the grouping of imaging frames and the imaging unit that images each group can be determined so as to achieve the shortest control time or the best image quality. The time required for control can thereby be shortened, or the image quality optimized.
In this case, the imaging directions, imaging ranges, and rotation angles of the imaging units may be controlled such that ranges whose imaging-frame centers are closest are preferentially imaged by a single imaging unit. That is, when the difference between the number of imaging frames designated by the user and the number of imaging units is referred to as a first number, one imaging unit sequentially takes on up to the first number of additional imaging frames, starting from the imaging frames whose centers are closest together.
Then, the respective image pickup units are driven so that the respective image pickup units pick up images of each of the other image pickup frames.
Even in this case, the synthesizing unit does not perform the synthesizing process on any image.
That is, in the case of M − N > 1, for example, the range of three or more imaging frames that are close to one another among the M imaging frames is imaged by one imaging unit. Alternatively, pairs of mutually close imaging frames among the M imaging frames are selected, and the range of each pair is imaged by one corresponding imaging unit. Alternatively, the two approaches may be combined.
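As one possible illustration of the closest-centers rule (the data layout and names are assumptions, not from the patent), the pair of imaging frames to be covered by a single imaging unit can be found by brute force over all pairs of frame centers:

```python
from itertools import combinations
import math

def closest_pair(centers):
    """Return the indices of the two frame centers (x, y tuples) whose
    Euclidean distance is minimum; these frames share one imaging unit."""
    best = min(combinations(range(len(centers)), 2),
               key=lambda ij: math.dist(centers[ij[0]], centers[ij[1]]))
    return best
```

For the small frame counts involved here, the quadratic search over all pairs is negligible.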
Next, a case will be considered in which the user designates the number of image capturing frames smaller than the number of image capturing units included in the image capturing apparatus 500. In this case, at least one image capturing frame may be divided and captured by a plurality of image capturing units. The image frame to be divided and imaged with the plurality of imaging units can be determined as follows.
The first method is a method of preferentially dividing and imaging a frame having a large size (area). Fig. 12A shows a case where the user designates three image capturing frames (i.e., image capturing frames 561, 562, and 563). The size (area) of the imaging frame 561 is larger than the sizes of the imaging frames 562 and 563. In this case, the imaging frame 561 is divided and imaged with two imaging units, and an image obtained by imaging the range of the imaging frame 561 is displayed as an image obtained by combining the two images.
On the other hand, the imaging frames 562 and 563 are each imaged independently by one of the remaining two imaging units. As described in embodiment 3 or 4, how the imaging frame 561 is divided and imaged can be determined based on the ratio of its length in the tilt direction to its length in the pan direction.
That is, in the case where the number of image capturing frames specified by the user is smaller than the number of image capturing units, the difference between the number of image capturing frames specified by the user and the number of image capturing units is referred to as a second number. At this time, each of the second number of image frames may be divided and imaged sequentially from the image frame having the larger size of the image frame designated by the user.
Alternatively, the imaging frames may be divided and imaged sequentially, starting from the imaging frame designated by the user as the one whose resolution should be improved.
In the case where two image capturing frames (i.e., image capturing frames 561 and 562) are specified by the user, it is possible to select whether to divide and capture each image capturing frame with two image capturing units or to capture one image capturing frame of the image capturing frames with one image capturing unit and to capture the other image capturing frame with three image capturing units.
In this case, depending on the size of the imaging frame 561 and the size of the imaging frame 562, a method to be used may be determined, or an imaging frame designated by a user may be divided and imaged.
Specifically, as shown in fig. 12B, in the case where the difference in the sizes of the imaging frames 561 and 562 is smaller than the fourth threshold value, each of the imaging frames 561 and 562 is divided and imaged with two imaging units.
On the other hand, as in fig. 12C, in the case where the difference in size between the imaging frames 561 and 562 is equal to or larger than the fourth threshold value, a larger imaging frame (imaging frame 562 in the figure) may be divided and imaged with three imaging units, and a smaller imaging frame (imaging frame 561 in the figure) may be imaged with one imaging unit.
Here, the difference in size between two imaging frames can be defined as the ratio of the area of the smaller imaging frame to the area of the larger imaging frame. The fourth threshold is preferably greater than 1/3 and less than 1, and more preferably 1/2.
That is, in the case where the number of image capturing frames specified by the user is two or more smaller than the number of image capturing units, the ratio of the size of the image capturing frame to the number of image capturing units used to divide and capture the image capturing frame is obtained for each image capturing frame. The number of imaging units used to segment and image each imaging frame may be determined so as to minimize the dispersion of the ratio.
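A minimal sketch of this allocation rule follows; it is illustrative only, and brute force over allocations is adequate for the small frame and unit counts involved.

```python
from itertools import product

def allocate_units(areas, n_units):
    """Distribute n_units imaging units among frames with the given areas
    (>= 1 unit each) so that the dispersion (variance) of the ratio
    area / units-assigned is minimized. Returns units per frame."""
    k = len(areas)
    best, best_var = None, float("inf")
    for alloc in product(range(1, n_units + 1), repeat=k):
        if sum(alloc) != n_units:
            continue
        ratios = [a / u for a, u in zip(areas, alloc)]
        mean = sum(ratios) / k
        var = sum((r - mean) ** 2 for r in ratios) / k
        if var < best_var:
            best, best_var = alloc, var
    return best
```

Under this sketch, two equal frames with four units get two units each, while a frame three times larger than the other gets three of the four units, matching the fourth-threshold behavior described above.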
The second method is a method of determining a picture frame to be divided in response to a request from a user.
In the case where two imaging frames (i.e., imaging frames 561 and 562) are designated by the user, it is possible to select whether each imaging frame is divided and imaged by two imaging units, or one imaging frame is imaged by one imaging unit and the other imaging frame is imaged by three imaging units. At this time, the user is allowed to select, through the user interface, the imaging frame whose resolution should be increased. The imaging frame selected for higher resolution is divided and imaged by three imaging units, and the other imaging frame is imaged by one imaging unit.
Such a structure is preferable because it can respond flexibly to requests from the user, for example by preferentially increasing the resolution of the range, among the plurality of imaging frames, that the user wishes to capture at high resolution.
The image pickup apparatus may include two imaging units, or three or more imaging units, arranged such that when their imaging ranges are joined in the pan direction, imaging can be performed in all directions over 360 degrees.
In this case, the display state when the user designates an imaging frame is preferably a wide-angle image obtained by imaging in all directions over 360 degrees. This is because, as described above, when the user narrows the desired range down from the state in which the imaging range is maximum, the imaging range is easy to designate.
Embodiment 6
In the above-described embodiments 1 to 5, examples were described in which each imaging unit in the image pickup apparatus is actually moved when the user designates an imaging frame. However, instead of moving the imaging units automatically, a guide indicating where each imaging unit should be pointed may be displayed so that the desired imaging range corresponding to the designated imaging frame can be imaged. Such guidance may be provided by light emitting elements 670 mounted on the image pickup apparatus 600 or by a guidance display on the screen of the display unit.
Next, an image pickup apparatus 600 according to embodiment 6 will be described with reference to fig. 13A to 13D. The image pickup apparatus 600 is different from the image pickup apparatus 100 described in embodiment 1 in that it includes a plurality of light emitting elements 670, and the plurality of light emitting elements 670 serve as display elements for indicating to a user the positions to which the respective image pickup units are moved.
The light emitting elements 670 are configured as LEDs, and for example, 16 light emitting elements 670 are disposed along the outer periphery of the image pickup device 600 in the XY plane. The control unit controls each light emitting element 670 so that the light emitting element is turned on or off. Further, the other structures are substantially the same as those of embodiment 1, and reference numerals of the 100 series of the units described in embodiment 1 are replaced with reference numerals of the 600 series for the purpose of illustration in embodiment 6.
Fig. 13A to 13D are diagrams showing a user interface with which the user specifies an image capturing frame in the image pickup apparatus 600, and the relationship between the on and off states of each light emitting element.
In the image capturing apparatus 600, the state before the user designates the image capturing frame and the method of designating the image capturing frame are the same as those of the image capturing apparatus 100, and therefore illustration thereof is omitted.
As shown in fig. 13A, a case will be considered in which the user designates two centers 661 and 662 of the imaging range as imaging positions. In this case, since the user is considered to desire to image both the area around the center 661 and the area around the center 662, the synthesizing unit 640 does not perform synthesis.
As shown in fig. 13B, when the direction of each image capturing unit is to be changed, only the LEDs are turned on at the positions where the first image 612 and the second image 624 would be displayed centered on the centers 661 and 662, respectively.
In fig. 13B, the darkened LEDs are the LEDs that are turned on. The user directs each image capturing unit in the direction of the LEDs that are on.
On the other hand, as shown in fig. 13C, a case will be considered in which the user designates only one center 661 of the imaging range as the imaging position.
In this case, since it is considered that the user desires to image only the surroundings of the center 661, the combining unit 640 performs the combination. As shown in fig. 13D, only the LEDs at positions indicating the directions of the image capturing units 610 and 620 are turned on.
Accordingly, when images of the image pickup units 610 and 620 whose directions are changed are combined, a wide-angle image 634 centered on the center 661 is displayed. In fig. 13D, the darkened LED is an on LED.
The user can move each image capturing unit to the desired angular position by following the LEDs that are turned on. Then, when the user has actually moved each image capturing unit to the position where the first image 614 and the second image 624 are displayed centered on centers 661 and 662, respectively, each LED is turned off.
That is, in a case where the imaging range of at least one of the first imaging unit and the second imaging unit reaches a predetermined state corresponding to the display of guidance, the guidance display state is changed from on to off. Thus, the user can easily know that the change of the image capturing unit in the desired direction is completed.
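The on/off guidance behavior described above can be sketched as follows. The tolerance value, function name, and the representation of directions as pan angles are hypothetical; the embodiment states only that each guidance display turns off once the corresponding image capturing unit reaches the guided state.

```python
def update_guidance_leds(current_dirs, target_dirs, tol_deg=2.0):
    """Return an on/off flag per LED: an LED stays on until its image
    capturing unit's direction is within `tol_deg` of the guided
    direction, then turns off (hypothetical tolerance).

    current_dirs, target_dirs: pan angles in degrees, one per unit.
    """
    def ang_diff(a, b):
        # Smallest absolute angular difference, handling the 360° wrap.
        return abs((a - b + 180.0) % 360.0 - 180.0)

    return [ang_diff(c, t) > tol_deg for c, t in zip(current_dirs, target_dirs)]
```

A unit that has already reached its guided direction yields `False` (LED off), so the user can see at a glance which units still need adjusting.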
In this way, the image capturing apparatus 600 according to the present embodiment notifies the user of the position to which at least one of the first image capturing unit and the second image capturing unit is to be moved, according to the image capturing position or the image capturing frame specified by the user.
Further, whether or not the synthesizing unit performs the synthesizing process varies depending on the image capturing position or image capturing frame specified by the user. With this structure, the imaging direction of each image capturing unit of the multi-eye camera can be directed to the position the user desires to monitor more easily than in the related art.
In the above-described embodiment, a plurality of display elements such as LEDs are arranged on the outer periphery of the image pickup apparatus to display how the direction of each image capturing unit should be changed. However, for example, a CG image as in fig. 13B or 13D may instead be displayed as a guide on the entire display screen of the display unit. Alternatively, a sub-screen may be superimposed on the image displayed on the display unit, and a CG image as in fig. 13B or 13D may be displayed on the sub-screen.
As described above, how the direction of each image capturing unit should be changed may be displayed as a guide on the entire screen or on a sub-screen. With this structure, a user who operates the image pickup apparatus remotely can easily control the direction of each image capturing unit while viewing the screen of the display unit.
Fig. 14A and 14B are flowcharts showing the operation of the image capturing apparatus according to the embodiments. Note that the steps for displaying are performed at the client device.
In step S1 of fig. 14A, it is determined whether the number M of imaging positions or imaging frames specified by the user is equal to the number N of imaging units.
If yes in step S1, in step S2, the imaging range of the imaging unit is controlled so that each imaging unit images each imaging position or imaging frame. That is, the imaging direction, zoom magnification, rotation of the imaging surface, and the like are controlled. In step S3, the images captured by the respective imaging units are displayed individually without being synthesized.
In contrast, in the case of no in step S1, the process advances to step S4 to determine whether the number M of imaging positions or imaging frames is greater than the number N of imaging units. In the case of yes, the process advances to step S5 to control the imaging ranges of the respective imaging units such that frames close to each other are imaged by the same imaging unit, and each of the other imaging positions or imaging frames is imaged by the respective imaging units.
Subsequently, in step S6, it is determined whether a menu or the like is set so that only the inside of the image capturing frame is displayed. In the case of yes, in step S7, the image corresponding to the image capturing frame is cut out and displayed without synthesizing the images captured by the respective image capturing units. The cut-out portion at this time can be appropriately enlarged and displayed.
In contrast, in the case of no in step S6, the process advances to step S8, and the images captured by the respective image capturing units are not synthesized, and the images are displayed without cutting out.
On the other hand, in the case of no in step S4, in step S9, it is determined whether or not the largest image capturing frame is set to be divided with priority.
In the case of yes, in step S10, the imaging range of the imaging unit is controlled such that the maximum imaging frame is divided and imaged with priority by the plurality of imaging units to be synthesized, and each of the other imaging frames is imaged with each imaging unit.
In the case of no in step S9, the process advances to step S11, and the imaging range of the imaging unit is controlled so that, for example, the imaging frame in which the user desires to take an image with priority at high resolution is divided and imaged with a plurality of imaging units to perform composition, and each of the other imaging frames is imaged with each of the imaging units. The operations of steps S4 to S11 described above correspond to the operations of embodiment 5.
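The dispatch of steps S1 to S11 can be summarized as a minimal sketch. The strategy labels and parameter names below are hypothetical; only the branching structure follows the flowchart.

```python
def plan_capture(num_frames, num_units, prioritize_largest=True):
    """Return a label for the capture strategy chosen by steps S1-S11.

    num_frames: number M of imaging positions/frames specified by the user
    num_units:  number N of imaging units in the apparatus
    prioritize_largest: hypothetical flag standing in for the setting
        checked in step S9 (whether the largest frame is divided with
        priority).
    """
    if num_frames == num_units:                    # S1 yes
        return "one_unit_per_frame_no_synthesis"   # S2-S3
    if num_frames > num_units:                     # S4 yes
        return "group_nearby_frames_no_synthesis"  # S5
    if prioritize_largest:                         # S9 yes
        return "divide_largest_frame_synthesize"   # S10
    return "divide_priority_frame_synthesize"      # S11
```

The key design point is that synthesis is performed only when there are fewer specified frames than imaging units, so that several units can jointly cover one frame.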
In step S12 subsequent to step S10 or S11, it is determined whether or not the aspect ratio of the image capturing frame to be divided and imaged (divided frame) is equal to or greater than a second threshold. In the case of yes in step S12, the process advances to step S13, where the image capturing frame is divided in the pitch (longitudinal) direction, and the image capturing surface of each image capturing unit is rotated so that the image capturing range of each image capturing unit becomes longer in the pitch direction.
That is, each image capturing surface is rotated by 90 degrees so that its image capturing range becomes vertically long, and the images from the respective image capturing units are synthesized and displayed.
In contrast, in the case of no in step S12, in step S14, it is determined whether or not the aspect ratio of the divided and imaged image capturing frame (divided frame) is equal to or greater than the first threshold.
In the case of yes in step S14, the process advances to step S15, where the image capturing frame is divided in the pitch (longitudinal) direction and imaged such that the image capturing range of each image capturing unit is long in the panning direction (i.e., horizontally long), and images from each image capturing unit are synthesized and displayed.
In the case of no in step S14, in step S16, it is determined whether or not the aspect ratio of the divided and imaged image capturing frame (divided frame) is equal to or greater than a third threshold. In the case of yes in step S16, the process advances to step S17, and the image pickup frame is divided in the panning direction.
The image capturing surface of each image capturing unit is rotated by 90 degrees so that its image capturing range becomes long in the pitch direction (i.e., vertically long), and images from each image capturing unit are synthesized and displayed.
In contrast, in the case of no in step S16, in step S18, the imaging frame is divided in the panning direction and imaged such that the imaging range of each imaging unit is long in the panning direction (i.e., horizontally long), and images from each imaging unit are synthesized and displayed.
The control of steps S12 to S18 corresponds to the operation described in embodiment 4, where, as described in embodiment 4, the thresholds satisfy the relationship second threshold &gt; first threshold &gt; third threshold.
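The threshold comparisons of steps S12 to S18 can be sketched as follows. The parameter names and return labels are assumptions; the embodiment specifies only the ordering second threshold > first threshold > third threshold and the four resulting division layouts.

```python
def division_layout(aspect_ratio, t1, t2, t3):
    """Choose how a divided frame is imaged per steps S12-S18.

    t1, t2, t3: first, second, and third thresholds, which must satisfy
    t2 > t1 > t3 as stated for embodiment 4. Returns a (split direction,
    unit orientation) pair using hypothetical labels.
    """
    assert t2 > t1 > t3, "thresholds must satisfy second > first > third"
    if aspect_ratio >= t2:                 # S12 yes -> S13
        return ("split_pitch", "portrait")   # rotate surfaces 90°
    if aspect_ratio >= t1:                 # S14 yes -> S15
        return ("split_pitch", "landscape")
    if aspect_ratio >= t3:                 # S16 yes -> S17
        return ("split_pan", "portrait")     # rotate surfaces 90°
    return ("split_pan", "landscape")      # S18
```

Reading the thresholds from largest to smallest mirrors the flowchart order S12, S14, S16, so each aspect ratio falls into exactly one of the four layouts.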
The present invention has been described in detail above with reference to the preferred embodiments, but the present invention is not limited to the above embodiments, and various modifications may be made based on the gist of the present invention, and these modifications are not excluded from the scope of the present invention.
For example, in the above-described embodiments, each imaging unit is configured to change each imaging range (imaging direction, zoom magnification, rotation angle of the imaging plane around its optical axis, or the like) by a driving mechanism. However, for example, a driving mechanism capable of controlling the imaging range of at least one imaging unit may be included, and thus the imaging ranges of a plurality of imaging units can be relatively changed.
A computer program for realizing some or all of the control functions of the above-described embodiments may be supplied to the image pickup apparatus via a network or any of various storage media. A computer (a CPU, an MPU, or the like) in the image pickup apparatus can read and execute the program. In this case, the program and the storage medium storing the program constitute the present invention.
While the invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
The present application claims the benefit of japanese patent application 2019-108881 filed on date 6-11 of 2019, which is incorporated herein by reference in its entirety.

Claims (13)

1. An image pickup apparatus for connecting to an external device via a network, the image pickup apparatus comprising:
a first image pickup unit and a second image pickup unit;
a driving mechanism configured to control an imaging range of at least one of the first imaging unit and the second imaging unit;
a combining unit configured to combine the first image acquired by the first image pickup unit and the second image acquired by the second image pickup unit to generate a wide-angle image;
a receiving unit configured to receive, from the external device, information related to an area or a position specified by a user, wherein the area or the position is specified with the wide-angle image, with the first image, with the second image, or with an image for indicating an area corresponding to the wide-angle image;
a control unit configured to control an imaging range of at least one of the first imaging unit and the second imaging unit by controlling the driving mechanism according to a region or a position specified by a user, and to control whether or not the synthesizing unit performs synthesizing; and
a transmission unit configured to transmit the wide-angle image to the external device in a case where the control unit controls the synthesizing unit to synthesize, and transmit at least one of the first image and the second image to the external device in a case where the control unit controls the synthesizing unit not to synthesize.
2. The image pickup apparatus according to claim 1, wherein the control unit controls the image pickup range by changing at least one of an image pickup direction, a zoom magnification, and a rotation angle of an image pickup surface of the first image pickup unit or the second image pickup unit.
3. The image pickup apparatus according to claim 1, wherein in the case where the number of specified areas or positions is 1, the control unit controls an image pickup range of at least one of the first image pickup unit and the second image pickup unit, and controls so that the synthesizing unit synthesizes.
4. The image pickup apparatus according to claim 3, wherein, in a case where the number of specified areas is 1 and the shape of the area is rectangular, the control unit changes whether the first image pickup unit and the second image pickup unit divide and pick up the area in the longitudinal direction or divide and pick up the area in the lateral direction according to the aspect ratio of the area.
5. The image pickup apparatus according to claim 4, wherein the control unit causes the region to be segmented and picked up in the longitudinal direction if the aspect ratio is greater than a first value, and causes the region to be segmented and picked up in the lateral direction if the aspect ratio is less than the first value.
6. The image capturing apparatus according to claim 1, wherein the transmission unit transmits, to the external device, an image in a state in which the image capturing ranges of the first image capturing unit and the second image capturing unit are set to be widest in a case where the user designates the area or the position.
7. The image pickup apparatus according to claim 1, wherein in the case where the number of specified areas or positions is 2, the control unit controls the image pickup ranges of the first image pickup unit and the second image pickup unit, and causes the synthesizing unit not to perform synthesizing.
8. The image pickup apparatus according to claim 1, further comprising:
a third image pick-up unit for picking up images,
wherein the control unit is configured to control imaging ranges of the first imaging unit, the second imaging unit, and the third imaging unit,
wherein, in a case where the number of specified areas or positions is greater than the number of imaging units, the control unit controls the imaging range of one of the first imaging unit, the second imaging unit, and the third imaging unit such that at least two of the areas or positions are included in the imaging range of the one of the first imaging unit, the second imaging unit, and the third imaging unit.
9. The image pickup apparatus according to claim 1, wherein the control unit is configured to crop an image such that a cropped image corresponding to the specified area is transmitted to the external device.
10. The image pickup apparatus according to claim 1, further comprising:
a third image pick-up unit for picking up images,
wherein the control unit is configured to control imaging ranges of the first imaging unit, the second imaging unit, and the third imaging unit,
wherein, in the case where the number of specified areas or positions is smaller than the number of imaging units, the control unit controls so that at least one of the areas or positions is divided and imaged by a plurality of imaging units.
11. The image pickup apparatus according to claim 1, further comprising:
a plurality of imaging units including the first imaging unit and the second imaging unit,
wherein the transmitting unit transmits, to the external device, a wide-angle image in a panning direction based on images obtained from the plurality of imaging units in a case where the user designates the area or the position.
12. A control method for controlling an image pickup apparatus for connecting to an external device via a network, the image pickup apparatus having:
a first image pickup unit and a second image pickup unit; and
a driving mechanism configured to control an imaging range of at least one of the first imaging unit and the second imaging unit,
the control method comprises the following steps:
combining the first image acquired by the first image pickup unit and the second image acquired by the second image pickup unit to generate a wide-angle image;
receiving information related to an area or a position specified by a user from the external device, wherein the area or the position is specified with the wide-angle image, with the first image, with the second image, or with an image for indicating an area corresponding to the wide-angle image;
controlling an imaging range of at least one of the first imaging unit and the second imaging unit by controlling the driving mechanism according to the specified region or position, and controlling whether or not to perform composition; and
the wide-angle image is transmitted to the external device in a case where synthesis is controlled, and at least one of the first image and the second image is transmitted to the external device in a case where synthesis is not controlled.
13. A non-transitory computer-readable storage medium storing a computer program to cause a computer to execute the control method according to claim 12.
CN202010530853.1A 2019-06-11 2020-06-11 Image pickup apparatus, control method of image pickup apparatus, and recording medium Active CN112073630B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-108881 2019-06-11
JP2019108881A JP2020202503A (en) 2019-06-11 2019-06-11 Imaging device, computer program, and recording medium

Publications (2)

Publication Number Publication Date
CN112073630A CN112073630A (en) 2020-12-11
CN112073630B true CN112073630B (en) 2023-06-06

Family

ID=71094084

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010530853.1A Active CN112073630B (en) 2019-06-11 2020-06-11 Image pickup apparatus, control method of image pickup apparatus, and recording medium

Country Status (4)

Country Link
US (1) US11184547B2 (en)
EP (1) EP3751838A1 (en)
JP (1) JP2020202503A (en)
CN (1) CN112073630B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021019228A (en) 2019-07-17 2021-02-15 キヤノン株式会社 Imaging apparatus, computer program, and recording medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004118786A (en) * 2002-09-30 2004-04-15 Sony Corp Image processing apparatus and method for processing image, recording medium, as well as program
CN102342099A (en) * 2009-05-29 2012-02-01 (株)荣国电子 Intelligent monitoring camera apparatus and image monitoring system implementing same
CN106060454A (en) * 2015-04-08 2016-10-26 安讯士有限公司 Monitoring camera
CN107454309A (en) * 2016-04-15 2017-12-08 佳能株式会社 Camera system, image processing equipment and its control method, program and recording medium
CN109714525A (en) * 2017-10-26 2019-05-03 佳能株式会社 Photographic device, system, the control method of photographic device and storage medium
CN109714524A (en) * 2017-10-26 2019-05-03 佳能株式会社 Photographic device, system, the control method of photographic device and storage medium

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7183549B2 (en) * 2004-09-09 2007-02-27 Flir Systems, Inc. Multiple camera systems and methods
JP2009010915A (en) * 2007-05-31 2009-01-15 Takeshi Yanagisawa Video display method and video system
CN102714711A (en) * 2010-02-01 2012-10-03 (株)荣国电子 Tracking and monitoring camera device and remote monitoring system using same
JP5593772B2 (en) * 2010-03-29 2014-09-24 ソニー株式会社 Image processing apparatus, image processing method, and program
JP5573349B2 (en) * 2010-05-17 2014-08-20 パナソニック株式会社 Panorama development image photographing system and method
JP5672862B2 (en) * 2010-08-27 2015-02-18 ソニー株式会社 Imaging apparatus, imaging system, and imaging method
JP5574423B2 (en) * 2010-09-24 2014-08-20 カシオ計算機株式会社 Imaging apparatus, display control method, and program
JP6075066B2 (en) * 2012-12-28 2017-02-08 株式会社リコー Image management system, image management method, and program
CN107580178B (en) * 2013-01-07 2022-04-05 华为技术有限公司 Image processing method and device
JP6292227B2 (en) * 2013-04-30 2018-03-14 ソニー株式会社 Image processing apparatus, image processing method, and program
CN114422738A (en) * 2015-04-01 2022-04-29 猫头鹰实验室股份有限公司 Compositing and scaling angularly separated sub-scenes
JP6335395B2 (en) * 2015-09-30 2018-05-30 富士フイルム株式会社 Image processing apparatus, imaging apparatus, image processing method, and program
JP6554162B2 (en) 2017-12-20 2019-07-31 東北電力株式会社 Power plant performance evaluation method and power plant performance evaluation program
JP7163057B2 (en) * 2018-04-26 2022-10-31 キヤノン株式会社 IMAGING DEVICE, IMAGING METHOD, PROGRAM AND RECORDING MEDIUM
JP7182935B2 (en) 2018-07-30 2022-12-05 キヤノン株式会社 IMAGING DEVICE, CONTROL METHOD AND PROGRAM
KR102525000B1 (en) * 2018-08-08 2023-04-24 삼성전자 주식회사 Electronic device for blurring an image obtained by combining a plurality of images based on depth information and method of driving the electronic device
JP7210199B2 (en) 2018-09-20 2023-01-23 キヤノン株式会社 IMAGING DEVICE, CONTROL METHOD, COMPUTER PROGRAM, AND STORAGE MEDIUM
JP7271132B2 (en) 2018-10-26 2023-05-11 キヤノン株式会社 Imaging device and surveillance system
KR102619271B1 (en) * 2018-11-01 2023-12-28 한화비전 주식회사 Video capturing device including plurality of cameras and video capturing system including the same
JP7250483B2 (en) 2018-11-12 2023-04-03 キヤノン株式会社 Imaging device, computer program, and storage medium
JP7316809B2 (en) 2019-03-11 2023-07-28 キヤノン株式会社 Image processing device, image processing device control method, system, and program
JP2020188349A (en) 2019-05-14 2020-11-19 キヤノン株式会社 Imaging device, imaging method, computer program, and storage medium
JP7286412B2 (en) 2019-05-22 2023-06-05 キヤノン株式会社 IMAGE PROCESSING DEVICE AND CONTROL METHOD THEREOF, IMAGING DEVICE, MONITORING SYSTEM

Also Published As

Publication number Publication date
CN112073630A (en) 2020-12-11
JP2020202503A (en) 2020-12-17
US20200396385A1 (en) 2020-12-17
EP3751838A1 (en) 2020-12-16
US11184547B2 (en) 2021-11-23

Similar Documents

Publication Publication Date Title
CN110022431B (en) Image pickup apparatus, image pickup method, display apparatus, and display method
US20050100087A1 (en) Monitoring system and method, program, and recording medium
KR101677303B1 (en) Camera device, camera system, control device and program
JP2008311804A (en) Imaging apparatus and method
JP2020188349A (en) Imaging device, imaging method, computer program, and storage medium
CN110351475B (en) Image pickup system, information processing apparatus, control method therefor, and storage medium
US8692879B2 (en) Image capturing system, image capturing device, information processing device, and image capturing method
US11140327B2 (en) Image-capturing device and method for operating image-capturing system of two cameras
CN112073630B (en) Image pickup apparatus, control method of image pickup apparatus, and recording medium
JPH118845A (en) Panoramic image generation device and its method
JP4948294B2 (en) Imaging apparatus, imaging apparatus control method, and program
US9300878B2 (en) Imaging apparatus and method for controlling imaging apparatus
JP2000341574A (en) Camera device and camera control system
KR102009988B1 (en) Method for compensating image camera system for compensating distortion of lens using super wide angle camera and Transport Video Interface Apparatus used in it
JP6836306B2 (en) Imaging control device, its control method, program and recording medium
JP2006115091A (en) Imaging device
JP2020122883A (en) Camera platform device
US11516390B2 (en) Imaging apparatus and non-transitory storage medium
JP2018146847A (en) Universal head imaging system
US20240137642A1 (en) Imaging apparatus
US20210409607A1 (en) Control apparatus, control method, and storage medium
JP2005005816A (en) Wide angle camera and wide angle camera system
JP2022047281A (en) Imaging apparatus, control method, program, and storage medium
JP2002218288A (en) Camera display system
CN117641127A (en) Control apparatus, image pickup apparatus, system, control method, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant