WO2013186806A1 - 画像撮像装置および画像撮像方法 - Google Patents
画像撮像装置および画像撮像方法 Download PDFInfo
- Publication number
- WO2013186806A1 (PCT/JP2012/003800)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- imaging
- exposure
- image
- imaging units
- Prior art date
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B37/00—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
- G03B37/04—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/58—Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/65—Control of camera operation in relation to power supply
- H04N23/651—Control of camera operation in relation to power supply for reducing power consumption by affecting camera operations, e.g. sleep mode, hibernation mode or power off of selective parts of the camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/002—Special television systems not provided for by H04N7/007 - H04N7/18
- H04N7/005—Special television systems not provided for by H04N7/007 - H04N7/18 using at least one opto-electrical conversion device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/189—Recording image signals; Reproducing recorded image signals
Definitions
- the present invention relates to an image capturing apparatus and method.
- the present invention relates to an apparatus and method for capturing a composite image.
- Digital still cameras and digital video cameras have become widespread, and there are more opportunities to save captured still images and movies on a computer and to view them on the screens of game consoles and television (TV) systems. It is also popular to upload captured video to Internet posting sites and share it with other users.
- Some digital cameras can shoot panoramic images, and it has become easy to shoot panoramic images with a wide viewing angle.
- a software tool that generates a panoramic image by combining a plurality of images taken by a digital camera while changing the shooting direction is also often used.
- Mobile devices such as camera-equipped mobile phones also offer panorama shooting modes and panoramic image applications that capture images with the camera while the tilt of the device is changed and automatically combine them into a panoramic image.
- Because a panoramic image covers the celestial sphere, outdoor shots often include the sun or streetlights, so the difference between bright and dark areas is large and the dynamic range is very wide. For this reason, when a panoramic image is captured with the exposure adjusted to a specific object, “whiteout” may occur in bright areas and “blackout” in dark areas. “Whiteout” means that a portion exposed to strong light becomes pure white; “blackout” means that a dark portion receiving little light becomes solid black.
- Conventionally, when taking multi-stage exposure images with an omnidirectional multi-lens camera and combining the images captured by the multiple cameras into a single high-dynamic-range image, every camera shoots while changing its exposure value in multiple stages under the same exposure settings.
- However, the dynamic range of the field of view varies greatly depending on the camera angle, yet the exposure values of all cameras are set with the number of exposure steps determined by the darkest and brightest fields of view (bracket shooting), so depending on the camera angle a large number of images with “whiteout” or “blackout” are shot.
- These “whiteout” and “blackout” images are redundant. As a result, the data amount of the captured images increases and the time required for bracket shooting also increases.
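To make the inefficiency concrete, the following sketch compares the shot counts of global bracketing with per-camera bracketing. The one-stop step size and the example latitude values are illustrative assumptions, not figures from the specification.

```python
def bracket_shot_counts(range_stops_per_camera, step=1.0):
    """Compare the total shot count of global bracketing (every camera
    sweeps the widest dynamic range found among all cameras) with
    per-camera bracketing (each camera sweeps only its own range).
    Each entry is the exposure latitude, in stops, that camera needs."""
    steps_needed = lambda r: int(r / step) + 1   # shots to span r stops
    global_total = steps_needed(max(range_stops_per_camera)) * len(range_stops_per_camera)
    per_camera_total = sum(steps_needed(r) for r in range_stops_per_camera)
    return global_total, per_camera_total

# e.g. three cameras needing 2, 6 and 4 stops of latitude:
# global bracketing takes 7 shots x 3 cameras = 21 exposures,
# per-camera bracketing takes 3 + 7 + 5 = 15 exposures.
```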
- the present invention has been made in view of these problems, and an object of the present invention is to provide a technology capable of efficiently generating a panoramic image without overexposure or underexposure.
- To solve the above problems, an image capturing apparatus according to one aspect of the present invention includes a multi-eye imaging unit including a plurality of imaging units, an exposure setting unit that individually sets an exposure value for each of the plurality of imaging units, a storage unit that holds the captured images with different exposure values captured by each of the imaging units whose exposure values have been set by the exposure setting unit, and a synthesizing unit that generates a captured image with an adjusted dynamic range by combining the captured images with different exposure values.
- Another aspect of the present invention is an image capturing method.
- This method includes an exposure setting step of individually setting an exposure value for each of a plurality of imaging units, and a synthesis step of generating a captured image with an adjusted dynamic range by reading out, from a memory that holds them, the captured images with different exposure values captured by each imaging unit whose exposure value was set in the exposure setting step, and synthesizing them.
- a composite image can be efficiently generated using a multi-lens camera.
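As a rough sketch of the claimed flow (a per-unit reference exposure value, storage of differently exposed frames, synthesis into a dynamic-range-adjusted image), the following Python merges one camera's bracketed frames with a simple hat-weighted average. The weighting scheme, the `2**ev` gain model, and all function names are assumptions made for illustration, not the patent's specified algorithm.

```python
import numpy as np

def bracket_around(reference_ev, steps=1, stop=1.0):
    """Exposure values bracketed around a per-camera reference EV."""
    return [reference_ev + i * stop for i in range(-steps, steps + 1)]

def hdr_merge(frames, evs):
    """Combine differently exposed 8-bit frames from one imaging unit
    into a single dynamic-range-adjusted frame.  Each frame is assumed
    (for this sketch) to have been captured with relative gain 2**ev;
    mid-tone pixels get the highest weight."""
    acc = np.zeros(frames[0].shape, dtype=np.float64)
    wsum = np.zeros(frames[0].shape, dtype=np.float64)
    for img, ev in zip(frames, evs):
        x = img.astype(np.float64) / 255.0
        w = 1.0 - np.abs(2.0 * x - 1.0) + 1e-3   # hat weight, never zero
        acc += w * (x / 2.0 ** ev)               # undo assumed exposure gain
        wsum += w
    return acc / wsum
```

Run once per imaging unit, this yields the per-camera HDR frames that the stitching stage would then join into the panorama.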
- FIG. 1 is a perspective view of a panoramic imaging device according to an embodiment. FIG. 2 is a diagram explaining the angle of view of the side imaging unit of the multi-view imaging unit of the panoramic imaging device of FIG. 1 and the installation position of the operation/display unit. FIG. 3 is a diagram explaining the internal structure of the multi-eye imaging unit of FIG. 1.
- FIGS. 4(a) to 4(c) are diagrams explaining the positional relationship between the six imaging units mounted on the fixing member of FIG. 3. FIG. 5 is a diagram showing the central axes of the fixing member and the main body.
- FIGS. 6(a) and 6(b) are diagrams illustrating a separable configuration of the multi-view imaging unit and the main body unit.
- FIGS. 8(a) to 8(d) are views explaining the relationship between the angle of view of the imaging unit installed on the side surface of the fixing member and the angle of view of the imaging unit installed on the top surface of the fixing member.
- FIGS. 10(a) and 10(b) are diagrams illustrating an operation screen displayed on the operation/display unit. A functional block diagram of the panoramic video stream generation unit.
- FIGS. 14(a) and 14(b) are diagrams explaining the data structure of a panoramic video stream generated by the video stream switching recording control. A block diagram of a control system for selectively controlling a multi-lens camera. A diagram explaining the functional configuration of the multi-view camera control system. FIG. 17(a) is a flowchart explaining the writing process by the multi-camera control system of FIG. 16, and FIG. 17(b) is a flowchart explaining the reading process by the multi-camera control system of FIG. 16.
- FIGS. 20(a) and 20(b) are diagrams illustrating an example of an operation screen displayed on the operation/display unit 40.
- A diagram explaining the case where a reference camera is designated and common settings are applied in a batch.
- A diagram showing an example of an operation screen for designating a reference camera and applying common settings in a batch.
- A diagram explaining a case where the outdoors are imaged.
- FIGS. 29(a) and 29(b) are diagrams illustrating, for comparison, captured images taken by setting a common reference exposure value for a plurality of imaging units.
- FIGS. 30(a) to 30(c) are diagrams illustrating the reference exposure value set for each imaging unit by the exposure setting unit of FIG. 28 and the variable exposure values changed in multiple stages around the reference exposure value.
- FIGS. 33(a) to 33(c) are diagrams explaining a method of pseudo-HDR synthesis by the HDR synthesis unit. A functional block diagram of the panoramic video stream generation unit.
- FIG. 1 is a perspective view of a panoramic imaging apparatus 100 according to an embodiment.
- the panoramic imaging device 100 includes a multi-eye imaging unit 10 and a main body unit 30.
- the multi-eye imaging unit 10 and the main body unit 30 are cylindrical, and the multi-eye imaging unit 10 and the main body unit 30 are coupled with the central axis aligned.
- The multi-view imaging unit 10 is covered with a cylindrical camera casing 12, and a plurality of imaging units 20A to 20F (hereinafter collectively referred to as imaging units 20) are mounted on the camera casing 12.
- Each of the imaging units 20A to 20F includes components necessary for shooting such as a lens and an imaging device.
- Five imaging units 20A to 20E (hereinafter also referred to as the “side imaging units”) are arranged at equal angles so that their lenses are positioned on the circumference of the cylindrical camera casing 12.
- One imaging unit 20F (hereinafter also referred to as the “zenith imaging unit”) is arranged so that its lens is positioned on the upper surface of the camera housing 12.
- the main body 30 includes an operation / display unit 40, a battery, a main circuit, and an external interface.
- the operation / display unit 40 is provided at a position recessed inward from the outer periphery of the cylindrical main body 30.
- the moving image data imaged by each of the imaging units 20A to 20F of the multi-eye imaging unit 10 is transmitted to the main circuit of the main body unit 30 via a serial interface or a USB (Universal Serial Bus) interface.
- the main circuit is equipped with a function of multiplexing a moving image frame picked up by each of the image pickup units 20A to 20F to generate a panoramic moving image stream.
- the main circuit includes an external interface for connecting an external recording device such as a USB memory, and the generated panoramic video stream is recorded in the external recording device via the external interface.
- the operation / display unit 40 is a display provided with a touch panel. Each image photographed by each of the imaging units 20A to 20F of the multi-eye imaging unit 10 is displayed on the display. Further, a panoramic image obtained by combining a plurality of images taken by the plurality of imaging units 20A to 20F can be displayed on the display. Further, an operation menu can be displayed on the display, and the user can input an operation on the screen by directly touching the touch panel with a finger while looking at the display screen.
- FIG. 2 is a diagram illustrating the angle of view of the side imaging unit 20A of the multi-eye imaging unit 10 of the panoramic imaging apparatus 100 and the installation position of the operation / display unit 40.
- the main body 30 of the panoramic imaging apparatus 100 is attached to a tripod 80.
- As shown in the figure, the field of view photographed by the side imaging unit 20A spreads over the angle of view θ of the side imaging unit 20A.
- The photographer sits in front of the tripod 80, reaches up from below, and operates the operation menu displayed on the operation/display unit 40. Since the operation/display unit 40 is provided in the recessed portion of the cylindrical main body 30, the photographer can touch the touch panel from outside the field of view of the side imaging unit 20A even when operating the operation menu on the screen. As a result, the photographer's finger can be prevented from unintentionally appearing in the image.
- By providing the operation/display unit 40 in the recessed portion of the cylindrical main body 30, illumination light from the operation/display unit 40 can also be prevented from being reflected into any of the imaging units 20A to 20F. The depth of the recess and the inclination of the operation/display unit 40 within it are designed so that the influence of the illumination light from the operation/display unit 40 is below a predetermined level.
- FIG. 3 is a diagram illustrating the internal structure of the multi-eye imaging unit 10.
- a plurality of imaging units 20A to 20F are fixed to a single fixing member.
- If each imaging unit were fixed to a separate fixing member, the positional deviation between the plurality of imaging units 20A to 20F would vary with the thermal expansion of each member. If the plurality of imaging units 20A to 20F are fixed to one fixing member 14 as in the present embodiment, the influence of characteristic changes due to thermal expansion of the fixing member 14 on the imaging units 20A to 20F becomes uniform, and the positional accuracy between the plurality of imaging units 20A to 20F can be improved.
- The fixing member 14 is a regular pentagonal prism; the five side imaging units 20A, 20B, 20C, 20D, and 20E are installed on its five side surfaces, and the single zenith imaging unit 20F is installed on its zenith surface.
- In the present embodiment the fixing member 14 is a pentagonal prism, but another polygonal prism may be used, with an imaging unit provided on each surface around the prism.
- A panoramic image has subjects to be photographed through 360 degrees around the photographer in the horizontal direction, while the region overhead is often sky and the region underfoot is the ground.
- When imaging units are provided on the outer peripheral surfaces of a polygonal prism and one imaging unit is provided on its upper surface as in the present embodiment, a plurality of images can be taken with the line-of-sight direction changed at equal angles in the photographer's horizontal direction.
- By arranging a plurality of imaging units at equal intervals, the combining process when generating a panoramic image by connecting the imaging data of the imaging units can be performed easily.
- A configuration is also conceivable in which an imaging unit is provided on each surface of a regular dodecahedron (twelve imaging units in total), but the present embodiment has the advantage that the images necessary for panoramic image synthesis can be taken with a smaller number of imaging units.
- FIGS. 4(a) to 4(c) are diagrams explaining the positional relationship between the six imaging units 20A to 20F mounted on the fixing member 14 of FIG. 3. FIG. 4(a) is a top view seen from the direction of the central axis of the fixing member 14 (the z-axis direction of FIG. 3), FIG. 4(b) is a side view seen from the direction perpendicular to the side surface of the prism of the fixing member 14 to which the imaging unit 20A is attached (the x-axis direction of FIG. 3), and FIG. 4(c) is a side view seen from the y-axis direction of FIG. 3, perpendicular to both the x-axis and z-axis directions.
- the five side surface imaging units 20A to 20E form a circle with a radius L at an equal circumferential angle (72 °) in the horizontal plane around the fixing member 14 of the pentagonal column.
- the imaging directions of the side imaging units 20A to 20E are the radial direction of the circle.
- The zenith imaging unit 20F arranged on the zenith surface of the fixing member 14 and the side imaging unit 20B arranged on a side surface of the fixing member 14 are arranged to form a circle of radius L at a circumferential angle of 90° in the vertical plane.
- The zenith imaging unit 20F and the side imaging units 20A, 20C, 20D, and 20E arranged on the other side surfaces of the fixing member 14 are likewise arranged to form circles of radius L at a circumferential angle of 90°.
- In other words, the six imaging units 20A to 20F are arranged to form a sphere of radius L.
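The placement described above can be checked numerically. The sketch below places the five side lenses at 72° intervals on the horizontal circle of radius L and the zenith lens on the central axis; the coordinate convention (z along the central axis, following FIG. 3's axes) and the function name are assumptions of this sketch.

```python
import math

def lens_positions(L=1.0):
    """Positions of the six lenses: the five side imaging units 20A-20E
    at equal 72-degree circumferential angles in the horizontal plane,
    and the zenith imaging unit 20F on the central (z) axis."""
    pts = []
    for k in range(5):                               # side units
        a = math.radians(72.0 * k)
        pts.append((L * math.cos(a), L * math.sin(a), 0.0))
    pts.append((0.0, 0.0, L))                        # zenith unit
    return pts
```

Every returned point lies at distance L from the center, and the zenith lens subtends a 90° circumferential angle with each side lens, matching the sphere of radius L described above.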
- FIG. 5 is a diagram illustrating the fixing member 14 to which the six imaging units 20A to 20F are fixed and the central axis of the main body 30.
- the multi-view imaging unit 10 is connected to the main body 30 so that the central axes of the fixing member 14 and the main body 30 match.
- Since the tripod seat fixing hole 50 of the main body 30 is provided on the central axis of the main body 30, the position of a tripod coupled to the tripod seat fixing hole 50 coincides with the central axis of the fixing member 14. Therefore, even when the main body 30 installed on the tripod is tilted or rotated, it is easy to adjust the shooting position, the line-of-sight direction, and the like.
- FIGS. 6(a) and 6(b) are diagrams illustrating a separable configuration of the multi-view imaging unit 10 and the main body 30.
- In FIG. 6(a), the multi-view imaging unit 10 and the main body 30 are separated, and the main body 30 may be connected at a physically distant position using an extendable cable or the like, with the image data converted in accordance with an interface standard for connecting peripheral devices, such as USB (Universal Serial Bus).
- the multiview imaging unit 10 and the main body unit 30 may be connected via a removable relay cable that extends the signal line of the camera.
- both the multi-view imaging unit 10 and the main body unit 30 may be provided with a wireless communication function, and both may be wirelessly connected. Accordingly, panoramic photography can be performed by bringing only the multi-lens imaging unit 10 even in a narrow place where the main body unit 30 cannot be carried.
- For example, the multi-view imaging unit 10 alone can be placed into a hole to perform panoramic shooting inside it.
- the main body unit 30 may be a portable terminal such as a tablet PC or a mobile phone, and the multi-eye imaging unit 10 and the main body unit 30 are separated and data is transmitted between the multi-eye imaging unit 10 and the main body unit 30 by wireless communication. You may exchange. Thus, the photographer can operate the multi-view imaging unit 10 from a remote place while carrying the main body unit 30, and it is further easier to avoid the photographer's reflection.
- FIG. 6B shows a configuration in which a tripod seat fixing hole 52 is provided on the bottom surface of the multi-view imaging unit 10, and the tripod can be directly coupled to the multi-view imaging unit 10. If the multi-eye imaging unit 10 is provided with a wireless communication function, data captured by the multi-eye imaging unit 10 can be transmitted to the main body unit 30 and other portable terminals.
- FIG. 7 is a diagram showing a configuration in which an imaging unit 20G is also provided on the bottom surface of the multi-view imaging unit 10.
- In this configuration, omnidirectional photographing is performed by a method such as hanging the multi-view imaging unit 10 from the ceiling with a wire.
- omnidirectional imaging may be performed by throwing the multi-lens imaging unit 10 in the air. If the multi-eye imaging unit 10 is provided with a wireless communication function, data captured by the multi-eye imaging unit 10 can be transmitted to the main body unit 30 and other portable terminals.
- FIG. 8A to 8D illustrate the relationship between the angle of view of the imaging unit 20A installed on the side surface of the fixing member 14 and the angle of view of the imaging unit 20F installed on the zenith surface of the fixing member 14.
- FIG. 8A shows the relationship between the visual field 60A of the side imaging unit 20A installed on the side surface of the fixing member 14 and the visual field 60B of the zenith part imaging unit 20F installed on the zenith surface of the fixing member 14.
- FIG. 8(b) is a top view of the multi-view imaging unit 10 of the panoramic imaging device 100 (viewed from the z-axis direction of FIG. 3), showing the horizontal angle of view θAH of the field of view 60A of the side imaging unit 20A installed on the side surface of the fixing member 14.
- FIGS. 8C and 8D are side views of the panoramic imaging device 100 viewed from the side (views viewed from the x-axis direction and the y-axis direction in FIG. 3).
- FIG. 8C shows the horizontal field angle ⁇ FH of the zenith imaging unit 20F installed on the zenith surface of the fixing member 14.
- FIG. 8D shows the vertical field angle ⁇ AV of the side imaging unit 20A installed on the side surface of the fixed member 14 and the vertical field angle ⁇ FV of the zenith part imaging unit 20F installed on the zenith surface of the fixed member 14. Show.
- With these angles of view, panoramic shooting through 180° in the vertical direction is possible.
- Note that the relationship between the vertical angle of view θAV and the horizontal angle of view θAH of the side imaging unit 20A is θAV > θAH.
- In an ordinary imaging unit, the horizontal angle of view is larger than the vertical angle of view; here, the imaging units 20A to 20E are installed on the side surfaces of the fixing member 14 rotated by 90° with respect to their normal orientation.
- FIG. 9 is a diagram illustrating the overlapping of the visual fields by the five side surface imaging units 20A to 20E installed on the side surface of the fixing member 14 in the top view of the multi-view imaging unit 10 of the panoramic imaging device 100 as viewed from above.
- the horizontal angle of view of each of the side surface imaging units 20A to 20E is 121 ° as an example, and panoramic imaging in the horizontal direction of 360 ° can be performed by combining the images captured by the five side surface imaging units 20A to 20E.
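The 360° coverage claim is easy to verify arithmetically; the helper below (its name and signature are illustrative) reports whether n equally spaced cameras with a given horizontal angle of view cover the full circle, and by how much adjacent fields overlap.

```python
def horizontal_coverage(n_cameras=5, fov_deg=121.0):
    """With cameras spaced 360/n degrees apart, adjacent fields of view
    overlap by (fov - spacing) degrees; covering the full circle
    requires that overlap to be positive."""
    spacing = 360.0 / n_cameras
    overlap_deg = fov_deg - spacing
    return overlap_deg > 0.0, overlap_deg
```

For the five side imaging units with a 121° horizontal angle of view spaced 72° apart, adjacent fields overlap by 49°, which is what gives the stitching process material to work with at the seams.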
- FIGS. 10(a) and 10(b) are diagrams explaining an operation screen displayed on the operation/display unit 40.
- Images taken by the six imaging units 20A to 20F of the panoramic imaging device 100 shown in FIG. 10(a) are displayed in areas A to F of the operation screen shown in FIG. 10(b), respectively.
- any of the imaging units 20A to 20F can be selected as the reference camera.
- Here, area B is selected and highlighted; for example, the frame of area B is displayed in a different color.
- Area B corresponds to the imaging unit 20B; the imaging unit 20B serves as the reference camera, that is, the reference when setting shooting conditions such as exposure and white balance.
- When the photographer presses the auto exposure (AE) lock button 74, the imaging unit 20B serving as the reference camera is set to the optimum exposure value by automatic exposure, and that optimum exposure value is reflected in the other imaging units 20A, 20C, 20D, 20E, and 20F.
- The AE lock button 74 is a toggle; when it is pressed again, the AE lock is released and the exposure can be adjusted automatically and independently for each of the imaging units 20A to 20F.
- When the photographer presses the auto white balance (AWB) lock button 76, the white balance of the imaging unit 20B serving as the reference camera is adjusted by automatic white balance, and the correction value is likewise reflected in the other imaging units 20A, 20C, 20D, 20E, and 20F.
- The AWB lock button 76 is also a toggle; when it is pressed again, the AWB lock is released and the white balance can be adjusted automatically and independently for each of the imaging units 20A to 20F.
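The reference-camera behavior of the AE lock toggle can be sketched as follows; the class and attribute names are assumptions made for illustration, not the apparatus's actual interfaces.

```python
from dataclasses import dataclass

@dataclass
class ImagingUnit:
    name: str
    exposure_value: float = 0.0   # EV chosen by this unit's auto exposure
    ae_locked: bool = False

def toggle_ae_lock(units, reference, engage):
    """Engaging AE lock copies the reference camera's auto-exposure
    result to every unit; releasing it lets each unit adjust its
    exposure automatically and independently again."""
    for u in units:
        u.ae_locked = engage
        if engage:
            u.exposure_value = reference.exposure_value
```

AWB lock would follow the same pattern with a white-balance correction value in place of the exposure value.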
- When the photographer presses the moving image recording start/stop button 70, moving images are shot by the imaging units 20A to 20F; when the button is pressed again, moving image recording stops.
- When the still image shooting button 72 is pressed, still images are shot by the imaging units 20A to 20F.
- The recorded moving images or still images are recorded in a USB memory or the like connected to the external interface of the main circuit of the main body unit 30.
- FIG. 11 is a functional configuration diagram of the panoramic video stream generation unit 200 installed in the main circuit of the main body unit 30.
- the shooting control unit 210 sets shooting parameters such as exposure values, white balance values, and frame rates of the imaging units 20A to 20F of the multi-lens imaging unit 10 individually or collectively. Further, the shooting control unit 210 controls zooming of the imaging units 20A to 20F, start / stop of shooting, and the like.
- the moving image frames shot by the imaging units 20A to 20F are stored in the frame memories 220A to 220F, respectively.
- the moving image stream multiplexing unit 230 multiplexes the moving image frames shot by the imaging units 20A to 20F stored in the frame memories 220A to 220F, generates a multi-view moving image stream, and supplies the multi-view moving image stream to the panorama stitching unit 242.
- the panorama stitching unit 242 synthesizes a panoramic video by stitching processing that joins the video frames shot by the imaging units 20A to 20F, and generates a panoramic video stream.
- the panoramic stitch unit 242 can output the multi-view video stream as it is without performing the stitching process.
- the panoramic stitch unit 242 can output both a multi-view video stream and a panoramic video stream as necessary.
- the panoramic stitch unit 242 records at least one of the multi-view video stream and the panoramic video stream in the panoramic video storage unit 240.
- the display control unit 260 reads out the moving image frames shot by the imaging units 20A to 20F from the frame memories 220A to 220F and displays them on the screen of the operation / display unit 40.
- the user interface unit 250 gives operation menu information of the imaging units 20A to 20F to the display control unit 260, and the display control unit 260 displays the operation menu on the screen of the operation / display unit 40.
- the touch panel control unit 270 detects a touch operation by a user's finger on the touch panel, and gives information on a touch position and the like to the user interface unit 250.
- the user interface unit 250 specifies the operation content selected by the user in the operation menu from the touch position information given by the touch panel control unit 270, and sends an operation command to the imaging control unit 210.
- the imaging control unit 210 controls the imaging units 20A to 20F based on the operation command given from the user interface unit 250.
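The data path from the frame memories through the moving image stream multiplexing unit can be sketched as below. Representing frames as plain objects and the multi-view stream as a flat list of (time, unit, frame) records is an assumption of this sketch, not the stream format the patent defines.

```python
def multiplex_streams(frame_memories):
    """Interleave the frames of all imaging units, time step by time
    step, into one multi-view moving image stream."""
    stream = []
    for t, frames_at_t in enumerate(zip(*frame_memories)):
        for unit_index, frame in enumerate(frames_at_t):
            stream.append((t, unit_index, frame))
    return stream
```

The panorama stitching stage would consume records time step by time step, joining the frames of one instant into a panoramic frame.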
- FIG. 12 shows a configuration related to moving image stream switching recording control in the panoramic moving image stream generation unit 200.
- configurations and operations of the imaging control unit 210 and the moving image stream multiplexing unit 230 that differ from those illustrated in FIG. 11 will be described.
- the configurations of the panoramic video storage unit 240, the display control unit 260, the user interface unit 250, and the touch panel control unit 270 in FIG. 11 are omitted here.
- broken lines indicate control signal lines, and thick lines indicate image data transmission paths.
- the moving image frames shot by the imaging units 20A to 20F are stored in the frame memories 220A to 220F, respectively, and the moving image stream multiplexing unit 230 reads out the moving image frames shot by the imaging units 20A to 20F stored in the frame memories 220A to 220F and multiplexes them to generate a multi-view moving image stream.
- the moving image stream multiplexing unit 230 has functions of a motion detection unit 232, a mode determination unit 234, and a control signal generation unit 236 in addition to the function of multiplexing the moving image frames shot by the imaging units 20A to 20F.
- the motion detector 232 detects motion vectors in the moving image frames A to F photographed by the imaging units 20A to 20F, and obtains the sum of the magnitudes of the motion vectors of the moving image frames A to F.
- the motion detection unit 232 may instead detect motion vectors in the moving image frames shot by one or more specific imaging units of interest, and obtain the sum of the magnitudes of the motion vectors of those one or more moving image frames of interest.
- when the sum of the magnitudes of the motion vectors calculated by the motion detection unit 232 is less than a predetermined threshold, the mode determination unit 234 sets the “imaging unit intermittent operation mode”, in which the imaging units 20A to 20F are sequentially and intermittently operated at the frame period, switching to one of the imaging units 20A to 20F for each frame, and the captured image output by the one imaging unit at the switching destination is read from the corresponding frame memory and multiplexed into one moving image stream.
- when the sum of the magnitudes of the motion vectors calculated by the motion detection unit 232 is greater than or equal to the predetermined threshold, the mode determination unit 234 sets the “imaging unit simultaneous operation mode”, in which all the imaging units 20A to 20F are operated at the same time, and the captured images output by the imaging units 20A to 20F are read from the frame memories 220A to 220F and multiplexed into one moving image stream.
- the mode determination unit 234 gives information on the set operation mode of the imaging unit to the imaging control unit 210 and the control signal generation unit 236.
- the imaging control unit 210 supplies a control signal corresponding to the operation mode set by the mode determination unit 234 to the imaging units 20A to 20F.
- in the “imaging unit intermittent operation mode”, the imaging control unit 210 gives a control signal for causing the imaging units 20A to 20F to intermittently operate sequentially at the frame period. Accordingly, power need not be supplied to the imaging units that are not operating, and power consumption can be reduced.
- in the “imaging unit simultaneous operation mode”, the imaging control unit 210 gives a control signal for operating all the imaging units 20A to 20F simultaneously. Since all the imaging units 20A to 20F operate, more power is consumed, but the quality of the synthesized panoramic video does not deteriorate even when the subject moves rapidly or the scene changes.
- the control signal generation unit 236 supplies a control signal corresponding to the operation mode set by the mode determination unit 234 to the frame memories 220A to 220F.
- in the “imaging unit intermittent operation mode”, the control signal generation unit 236 switches to one of the frame memories 220A to 220F for each frame, and provides a control signal that activates reading from the frame memory at the switching destination and deactivates reading from the other frame memories.
- in the “imaging unit simultaneous operation mode”, the control signal generation unit 236 provides a control signal for activating reading from all the frame memories 220A to 220F.
- as a modification, the imaging units 20A to 20F may be operated continuously while the frame memories 220A to 220F are sequentially and intermittently operated at the frame period, switching to one of the frame memories 220A to 220F for each frame, so that the captured image is read from the one frame memory at the switching destination and multiplexed into one moving image stream. This also reduces the power supplied to the frame memories.
- the mode determination unit 234 may switch between the “imaging unit intermittent operation mode” and the “imaging unit simultaneous operation mode” according to not only the amount of motion in the captured images but also the remaining battery level and the allowable power consumption of the panorama imaging apparatus 100. For example, to operate for a long time, the power consumed per unit time must be suppressed, so the mode is switched to the “imaging unit intermittent operation mode”. Alternatively, the “imaging unit simultaneous operation mode” may be used while the remaining battery level is sufficient, switching to the “imaging unit intermittent operation mode” when the remaining battery level is low. Also, the “imaging unit intermittent operation mode” may be set when the panorama imaging apparatus 100 is driven by a built-in rechargeable battery, and the “imaging unit simultaneous operation mode” may be set when it is driven by external power supplied from an AC adapter.
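As an illustrative sketch only (the patent does not disclose code), the mode determination criteria above might be combined as follows; the threshold and flag names are assumptions made for this example:

```python
def determine_mode(motion_sum, motion_threshold,
                   within_power_limit, on_battery, battery_low):
    """Return "simultaneous" or "intermittent" operation mode."""
    # Large motion calls for simultaneous operation to preserve image
    # quality, but only when the power budget allows it.
    if motion_sum >= motion_threshold and within_power_limit:
        if on_battery and battery_low:
            return "intermittent"   # conserve the remaining battery
        return "simultaneous"
    # Small motion, or a power restriction: intermittent operation saves power.
    return "intermittent"
```

For example, a large motion sum with sufficient power selects the simultaneous mode, while the same motion under a power restriction falls back to the intermittent mode.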
- the moving image stream multiplexing unit 230 multiplexes the image frames transmitted from the frame memories 220A to 220F in either operation mode, generates a multi-view moving image stream, and supplies it to the main processor 280.
- the main processor 280 performs a stitching process on the multi-view video stream, generates a panoramic video stream, and records it in a secondary storage device 290 such as a flash memory.
- the main processor 280 implements the functional configurations of the panorama stitching unit 242 and the switching unit 246 by software in order to execute the panorama stitching processing. These functions may also be implemented by hardware.
- the panorama stitching unit 242 joins the image frames captured by the imaging units 20A to 20F included in the multi-view video stream supplied from the video stream multiplexing unit 230 to synthesize a panoramic image, and generates and outputs a panoramic video stream in which the panoramic image frames are arranged in time series.
- the panorama stitching unit 242 outputs the multi-view video stream as it is without synthesizing the panoramic image.
- the switching unit 246 switches between the panoramic video stream output from the panorama stitching unit 242 and the multi-view video stream according to the power consumption restriction, and records them in the secondary storage device 290. For example, the following three-stage control is possible depending on the amount of available power.
- (a) When the available power is small, the panorama stitching unit 242 outputs a multi-view video stream without generating a panoramic video stream.
- the switching unit 246 turns off the switch 247 on the data transmission path 243 of the panoramic video stream, turns on the switch 248 on the data transmission path 244 of the multi-view video stream, and records the multi-view video stream in the secondary storage device 290. Power consumption can be reduced by not performing the panorama stitching process.
- (b) When the available power is medium, the panorama stitching unit 242 generates and outputs a panoramic video stream, but does not output a multi-view video stream.
- the switching unit 246 turns on the switch 247 on the data transmission path 243 of the panoramic video stream, turns off the switch 248 on the data transmission path 244 of the multi-view video stream, and records the panoramic video stream in the secondary storage device 290. Since the panorama stitching unit 242 operates, more power is consumed than in the case of (a), but there is an advantage that the panoramic video is generated in real time.
- (c) When the available power is large, the panorama stitching unit 242 generates and outputs a panoramic video stream and also outputs a multi-view video stream.
- the switching unit 246 turns on the switch 247 on the data transmission path 243 of the panoramic video stream, turns on the switch 248 on the data transmission path 244 of the multi-view video stream, and records both streams in the secondary storage device 290. Since both the multi-view video stream and the panoramic video stream are generated and recorded, the processing amount is large, the amount of recorded data increases, and the power consumption is the largest. However, recording not only the panoramic video stream but also the multi-view video stream has the advantage of broadening its uses.
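The three-stage control of (a) to (c) might be modeled as follows; this is a hedged sketch, and the power-level labels and return structure are assumptions made for illustration, not an API from the disclosure:

```python
def stream_switch_state(available_power):
    """Map a power level to the switch states and recorded streams."""
    if available_power == "small":
        # (a) No stitching: record only the multi-view stream.
        return {"switch_247": False, "switch_248": True,
                "record": ["multi-view"]}
    if available_power == "medium":
        # (b) Stitch in real time: record only the panoramic stream.
        return {"switch_247": True, "switch_248": False,
                "record": ["panoramic"]}
    # (c) Ample power: record both streams.
    return {"switch_247": True, "switch_248": True,
            "record": ["panoramic", "multi-view"]}
```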
- when the frame rates of the imaging units 20A to 20F differ, the imaging control unit 210 and the control signal generation unit 236 switch among the imaging units 20A to 20F and the frame memories 220A to 220F at a ratio corresponding to the frame rates. That is, round-robin control weighted by the frame rate ratio is performed.
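Round-robin switching weighted by the frame rate ratio can be sketched, for example, with a credit-based scheduler; the algorithm below is an illustrative assumption, since the patent does not specify how the weighting is realized:

```python
def weighted_round_robin(frame_rates, num_slots):
    """frame_rates: dict of imaging unit -> fps.
    Returns a read schedule of num_slots entries in which each unit is
    serviced in proportion to its frame rate."""
    credit = {u: 0.0 for u in frame_rates}
    schedule = []
    total = sum(frame_rates.values())
    for _ in range(num_slots):
        # Each unit accumulates credit at its frame rate; the unit with
        # the highest credit is serviced next.
        for u, fps in frame_rates.items():
            credit[u] += fps
        chosen = max(credit, key=credit.get)
        credit[chosen] -= total
        schedule.append(chosen)
    return schedule
```

With hypothetical units "A" at 30 fps and "B" at 15 fps, unit "A" is read twice as often as unit "B".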
- FIG. 13 is a flowchart for explaining a moving image stream switching recording control procedure by the panoramic moving image stream generation unit 200.
- the motion detection unit 232 detects a motion vector from images captured by the imaging units 20A to 20F (S10).
- the motion detection unit 232 calculates the sum of the magnitudes of motion vectors of the captured images of all the imaging units 20A to 20F or one or more imaging units of interest (S12).
- the mode determination unit 234 determines whether or not the sum of the magnitudes of the motion vectors of the captured images is greater than or equal to a threshold (S14). If the sum is less than the threshold (N in S14), the “imaging unit intermittent operation mode” is set (S20). In the “imaging unit intermittent operation mode”, a “low frame rate recording mode” in which a multi-view video stream or a panoramic video stream is recorded at a low frame rate is used.
- if the sum of the magnitudes of the motion vectors of the captured images is greater than or equal to the threshold (Y in S14), the mode determination unit 234 determines whether or not the panorama imaging apparatus 100 is operating within the power consumption limit (S16). For example, whether the operation is within the power consumption limit is determined in consideration of the allowable power consumption per unit time in the case of long-time shooting, the remaining battery level, and the like. If the panorama imaging apparatus 100 is not operating within the power consumption limit (N in S16), the mode determination unit 234 sets the “imaging unit intermittent operation mode” (S20); if it is operating within the power consumption limit (Y in S16), it sets the “imaging unit simultaneous operation mode” (S18). In the “imaging unit simultaneous operation mode”, a “high frame rate recording mode” in which a multi-view video stream or a panoramic video stream is recorded at a high frame rate is used.
- the imaging control unit 210 switches the imaging units 20A to 20F to simultaneous operation or intermittent operation according to the mode set by the mode determination unit 234, and the control signal generation unit 236 switches the frame memories 220A to 220F to simultaneous output or intermittent output according to the mode set by the mode determination unit 234 (S22).
- the panoramic video stream generation unit 200 multiplexes the frame data output from the frame memories 220A to 220F according to each mode to generate a multi-view video stream or a panoramic video stream (S24).
- if shooting is stopped by the user (Y in S26), the process is terminated; if shooting is not stopped (N in S26), the process returns to step S10 and the processes from S10 onward are repeated.
- FIGS. 14A and 14B are diagrams illustrating the data structure of a multi-view video stream generated by video stream switching recording control.
- the number of imaging units is 4, and the frame rate of each imaging unit is 30 fps (frames per second).
- FIG. 14A shows a multi-view video stream generated in the “imaging unit intermittent operation mode”.
- the imaging units 20A to 20D are intermittently operated sequentially in a frame cycle, and multiplexing is performed in the order of the frame A captured by the imaging unit 20A, the frame B captured by the imaging unit 20B, the frame C captured by the imaging unit 20C, and the frame D captured by the imaging unit 20D.
- FIG. 14B shows a multi-view video stream generated in the “imaging unit simultaneous operation mode”.
- the imaging units 20A to 20D operate simultaneously, and in each frame period the frame A captured by the imaging unit 20A, the frame B captured by the imaging unit 20B, the frame C captured by the imaging unit 20C, and the frame D captured by the imaging unit 20D are multiplexed.
- since the four imaging units 20A to 20D operate simultaneously, adjacent captured images in the panoramic image obtained by combining the four captured images are captured at the same timing. Therefore, even when the motion is intense, the image quality of the panoramic image obtained by joining the captured images is maintained.
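The two stream layouts of FIGS. 14(a) and 14(b) can be sketched as follows; this illustration assumes one frame per cycle in the intermittent mode and one frame from every unit per cycle in the simultaneous mode, with example frame labels:

```python
def multiplex(units, cycles, mode):
    """Build the frame order of the multi-view stream for a list of
    imaging unit labels over a number of frame cycles."""
    stream = []
    if mode == "intermittent":
        # FIG. 14(a): one unit per frame cycle, in round-robin order.
        for i in range(cycles):
            stream.append(units[i % len(units)])
    else:
        # FIG. 14(b): frames from all units in every frame cycle.
        for _ in range(cycles):
            stream.extend(units)
    return stream
```

In the intermittent mode, four cycles yield one frame from each of the four units; in the simultaneous mode, every cycle carries all four frames.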
- when the movement of the subject is small, the image quality does not deteriorate much even if the frame rate of the camera is lowered.
- when the movement of the subject is intense, lowering the frame rate of the camera (corresponding to a longer shutter speed in the case of a CMOS image sensor) significantly degrades the image quality due to motion blur.
- if all the cameras are always operated simultaneously at a high frame rate, the highest image quality can be obtained in any scene.
- however, power consumption increases, and a problem arises of straining the recording capacity of a data storage device such as a flash memory, SSD (Solid State Drive), or HDD (Hard Disk Drive). Therefore, it is desirable to be able to adaptively control the operating state of the cameras and the recorded data according to the scene.
- the operation mode of the imaging units 20A to 20F and the output mode of the frame memories 220A to 220F can be switched according to the movement of the scene.
- when the scene movement is large, the image quality deteriorates unless the effective frame rate of each imaging unit is ensured; therefore, image quality is prioritized over power consumption, and the “imaging unit simultaneous operation mode” is set.
- when the scene movement is small, the image quality does not deteriorate much even if the effective frame rate of each imaging unit is lowered, so power consumption is reduced by switching to the “imaging unit intermittent operation mode”.
- thereby, the power of the panoramic imaging device 100 can be used efficiently. Furthermore, in the “imaging unit intermittent operation mode”, a low-frame-rate video stream is generated, so the recording capacity can also be saved.
- FIG. 15 is a configuration diagram of a control system for selectively controlling the multi-lens camera.
- the imaging control unit 210 in FIG. 12 includes a bus interface / camera control interface unit 300, a transmission mask register 310, and a control data transmission logic 320, and, in response to commands from the main processor 280, controls arbitrary ones of the plurality of imaging units 20A to 20F collectively or selectively. For example, control such as automatic exposure (AE), automatic white balance (AWB), and exposure value (EV) setting can be performed collectively or selectively.
- the bus interface / camera control interface unit 300 includes a bus interface of a predetermined standard that connects the main processor 280 and the imaging units 20 and a dedicated interface for controlling the imaging units 20A to 20F, and gives a control signal to the control data transmission logic 320 in response to an instruction from the main processor 280.
- the control data transmission logic 320 is a circuit for writing control data to the imaging units 20A to 20F or reading data from the imaging units 20A to 20F.
- for writing, the control data transmission logic 320 transmits write data to the imaging units 20A to 20F by unicast, multicast, or broadcast; for reading, it enables read data to be received by unicast from the imaging units 20A to 20F.
- the transmission mask register 310 is a register for controlling the input / output operation of the circuit of the control data transmission logic 320. There are as many transmission mask registers 310 as there are imaging units 20A to 20F.
- FIG. 16 is a diagram illustrating the functional configuration of the multi-view camera control system of FIG.
- the camera control interface 302 is an example of the bus interface / camera control interface unit 300 in FIG. 15 and constitutes the application processor 140 together with the main processor 280.
- Application processor 140 is implemented, for example, by system on chip (SoC) technology.
- the control data transmission logic 320 of FIG. 15 is a circuit that connects the camera control interface 302 and the imaging units 20A to 20F by a bus interface such as I2C or SPI, and includes switches 130A to 130F and a multiplexer 132.
- the data transmission unit Tx of the camera control interface 302 is connected to the imaging units 20A to 20F via the switches 130A to 130F, and the control data instructed by the main processor 280 is selectively written to the imaging units 20A to 20F.
- the control data write transmission register 110 (hereinafter simply referred to as the “write transmission register”) provides transmission masks for the number of the imaging units 20A to 20F, in which a mask for switching on / off the switches 130A to 130F corresponding to the imaging units 20A to 20F is set.
- in the illustrated example, the switches 130A and 130B corresponding to the two imaging units 20A and 20B are turned on, and the switches 130C to 130F corresponding to the other four imaging units 20C to 20F are turned off. Thereby, the control data is selectively written to the two imaging units 20A and 20B.
- the imaging units 20A to 20F are connected to the data receiving unit Rx of the camera control interface 302 via the multi-input, one-output multiplexer 132, and data is selectively read out from the imaging units 20A to 20F in accordance with a read instruction from the main processor 280.
- the camera setting value readout transmission register 120 (hereinafter simply referred to as the “readout transmission register”) provides transmission selection masks for the number of the imaging units 20A to 20F, in which a mask for selecting one of the plurality of input streams of the multiplexer 132 is set.
- in the illustrated example, the input stream of one imaging unit 20B is selected from the input streams from the six imaging units 20A to 20F input to the multiplexer 132, and the data from the imaging unit 20B is output as the output stream of the multiplexer 132 and supplied to the receiving unit Rx of the camera control interface 302.
- data can thus be read from and written to the imaging units 20A to 20F collectively or selectively. If an independent interface were provided for each of the imaging units 20A to 20F, the circuit scale would increase; in the configuration of FIG. 16, however, a single interface reduces the number of circuits. In addition, simply by rewriting the write transmission register 110, the imaging units to be written can be arbitrarily selected and data can be written to the selected imaging units simultaneously, so writing can be performed at high speed.
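As a rough behavioral model of the register, switch, and multiplexer arrangement of FIG. 16 (an illustrative sketch, not the actual circuit; the class and field names are assumptions), a bitmask can select which units latch written data and which single unit's data is read back:

```python
class CameraControlModel:
    """Bitmask model of the write/read selection for N imaging units."""
    def __init__(self, num_units):
        self.registers = [{} for _ in range(num_units)]  # per-unit settings
        self.write_mask = 0    # write transmission register (bit i = unit i)
        self.read_select = 0   # readout transmission register (one-hot)

    def write(self, key, value):
        # Data is broadcast on the bus; only units whose switch is on latch it.
        for i, regs in enumerate(self.registers):
            if self.write_mask & (1 << i):
                regs[key] = value

    def read(self, key):
        # The multiplexer forwards only the selected unit's output.
        for i, regs in enumerate(self.registers):
            if self.read_select & (1 << i):
                return regs.get(key)
        return None
```

For example, with the write mask selecting units 0 and 1, a single write reaches both units, and the read mask then picks one unit's value off the multiplexer.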
- FIG. 17A is a flowchart for explaining the writing process by the multi-view camera control system of FIG.
- the main processor 280 prepares control data to be written in the imaging units 20A to 20F (S30).
- the control data is, for example, data relating to shooting conditions such as an exposure value, and is specified by the user via the user interface unit 250.
- the main processor 280 determines an imaging unit to be written, and writes a value designating the imaging unit to be written into the write transmission register 110 (S32).
- in the write transmission register 110, a transmission mask for selecting the imaging units to be written is set.
- the on / off states of the switches 130A to 130F are set by the write transmission register 110.
- setting the transmission mask of the write transmission register 110 is necessary for selectively transmitting the control data to the imaging units to be written.
- the main processor 280 determines an imaging unit to be read, and writes a value for designating the imaging unit to be read in the read transmission register 120 (S34).
- the readout transmission register 120 provides transmission masks for the number of imaging units, in which a transmission mask for selecting the imaging unit to be read is set.
- in the multiplexer 132, the input stream designated by the readout transmission register 120 is selected and output as the output stream.
- the setting of the transmission mask of the read transmission register 120 is necessary for selectively receiving an ACK from the imaging unit to be written.
- each imaging unit of the writing destination returns ACK when writing is completed.
- if ACKs had to be received from all the imaging units to be written and ACK reception confirmed individually, the circuit scale would increase and a procedure differing from the bus standard would be required.
- on the other hand, transmitting the ACK from only one of the imaging units to be written and confirming it can substitute for confirming the ACKs of all the imaging units to be written, and the circuit becomes simple. For this reason, in the present embodiment, only one bit is set in the readout transmission register 120 at the time of ACK confirmation. If it is desired to strictly check whether writing has been performed correctly on each imaging unit, the register value can be read individually from each imaging unit after the writing process to confirm the ACK.
- the camera control interface 302 outputs a signal for writing the control data from the transmission terminal Tx to the imaging units 20A to 20F via the switches 130A to 130F (S36). At this time, the on / off states of the switches 130A to 130F are switched according to the transmission mask of the write transmission register 110, and the control data is written only to the imaging units 20 selected as the writing target.
- the camera control interface 302 receives an ACK signal from the imaging units 20A to 20F via the multiplexer 132 at the reception terminal Rx (S38). At this time, the ACK is confirmed from the imaging unit selected as the reading target according to the transmission mask of the readout transmission register 120.
- when ACK confirmation is not performed, step S34 and step S38 can be omitted.
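The write sequence S30 to S38, including the single-ACK confirmation described above, might look like the following sketch; the Camera class and its ACK behavior are assumptions made for illustration:

```python
class Camera:
    """Hypothetical stand-in for an imaging unit's register interface."""
    def __init__(self):
        self.params = {}

    def write(self, key, value):
        self.params[key] = value
        return "ACK"              # each written unit returns ACK on completion

def batch_write(cameras, write_mask, key, value):
    """Write (key, value) to every camera whose bit is set in write_mask,
    then confirm the ACK of just one written camera (one-bit read mask)."""
    written = []
    for i, cam in enumerate(cameras):
        if write_mask & (1 << i):       # switch for camera i is on
            written.append((i, cam.write(key, value)))
    # Single-ACK confirmation: check only the first written unit's ACK,
    # standing in for confirmation of all written units.
    ack_index, ack = written[0]
    return ack == "ACK", ack_index
```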
- FIG. 17B is a flowchart for explaining a reading process by the multi-camera control system of FIG.
- the main processor 280 determines the register of the imaging unit 20 to be read (S40).
- the main processor 280 determines an imaging unit to be read, and writes a value for designating the imaging unit to be read into the write transmission register 110 (S42).
- in the write transmission register 110, a transmission mask for selecting the imaging units to be read is set.
- the on / off states of the switches 130A to 130F are set by the write transmission register 110.
- setting the transmission mask of the write transmission register 110 is necessary for selectively transmitting the read address to the imaging units to be read.
- the main processor 280 determines an imaging unit to be read, and writes a value for designating the imaging unit to be read in the read transmission register 120 (S44).
- the readout transmission register 120 provides transmission masks for the number of imaging units, in which a transmission mask for selecting the imaging unit to be read is set.
- in the multiplexer 132, the input stream designated by the readout transmission register 120 is selected and output as the output stream.
- the setting of the transmission mask of the read transmission register 120 is necessary for selectively receiving data and ACK from the imaging unit to be read.
- the camera control interface 302 outputs the read address from the transmission terminal Tx to the imaging units 20A to 20F via the switches 130A to 130F (S46). At this time, the on / off states of the switches 130A to 130F are switched according to the transmission mask of the write transmission register 110, and the read address is sent only to the imaging units 20 selected as the data read target.
- the camera control interface 302 receives the data corresponding to the designated address from the imaging units 20A to 20F via the multiplexer 132 at the reception terminal Rx (S48). At this time, the data is received from the imaging unit selected as the read target according to the transmission mask of the readout transmission register 120, and the ACK is confirmed.
- FIG. 18 is a diagram for explaining an implementation example of the multi-view camera control system of FIG.
- a serial communication method with peripheral devices called I2C (Inter-Integrated Circuit) is used.
- the I2C interface 304 is an interface on the application processor 140 side and is a master of I2C communication.
- the imaging units 20A to 20D are I2C communication slaves.
- the I2C state monitoring circuit 150 is a circuit that enables the tristate buffer 170 for reception in the reception state. Thereby, reception of data from the multiplexer 176 is enabled. At the time of writing, the tristate buffer 170 is blocked.
- SDATA output from the I2C interface 304 is supplied to tristate buffers 172A to 172D corresponding to the imaging units 20A to 20D.
- at the time of writing, the tristate buffer corresponding to the imaging unit to be written is enabled according to the transmission mask of the write transmission register 110, and SDATA is transmitted only to the imaging unit to be written.
- the pull-up resistor 174 is a resistor for adjusting the output signals of the tristate buffers 172A to 172D so as to have an appropriate logic level.
- SCLK output from the I2C interface 304 is a clock signal for synchronizing the imaging units 20A to 20D.
- at the time of reading, the input values from the designated input terminals among the input terminals of the multi-input, one-output multiplexer 176 are supplied to the I2C interface 304 via the tristate buffer 170.
- FIG. 19 is a diagram for explaining another implementation example of the multi-view camera control system of FIG.
- a bus interface called SPI (Serial Peripheral Interface) is used.
- the SPI interface 306 is an interface on the application processor 140 side and is a master of SPI communication.
- the imaging units 20A to 20D are SPI communication slaves.
- the CS # signal output from the SPI interface 306 is a chip select signal and is supplied to the NAND circuits 180A to 180D.
- the write transmission register 110 holds a transmission mask for selecting the imaging units 20A to 20D to be written, and the register value is supplied to the NAND circuits 180A to 180D corresponding to the imaging units 20A to 20D.
- in the illustrated example, the transmission mask of the write transmission register 110 is “1100”, so 1 is input to the two NAND circuits 180A and 180B, and 0 is input to the remaining two NAND circuits 180C and 180D.
- Output values CAM1_CS # to CAM4_CS # of the NAND circuits 180A to 180D are inverted and input to the imaging units 20A to 20D.
- thereby, the two imaging units 20A and 20B are selected, and the control data TxData from the MOSI (Master Out Slave In) terminal of the SPI interface 306 is transmitted to the imaging units 20A and 20B.
- the SCLK signal from the CLK of the SPI interface 306 is supplied to the imaging units 20A to 20D for synchronization.
- the read data RxData from the multiplexer 182 is received by MISO (Master In Slave Out) of the SPI interface 306.
- the readout transmission register 120 holds a transmission mask for selecting the imaging units 20A to 20D to be read, and is used to select the input signal of the multi-input, one-output multiplexer 182.
- in the illustrated example, the transmission mask of the readout transmission register 120 is “1000”, so the input stream from the imaging unit 20A is selected from among the plurality of input streams to the multi-input, one-output multiplexer 182 and output as the output stream of the multiplexer 182.
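At the behavioral level, the chip-select gating above reduces to: a camera receives the SPI transaction only when the master asserts chip select and its mask bit is 1. A minimal sketch of that rule (the signal modeling is an assumption of this illustration, not the gate-level circuit), with the mask given as a bit string like the “1100” example:

```python
def selected_cameras(cs_asserted, mask):
    """Return indices of cameras that see the SPI transaction.
    mask: write transmission register value as a bit string, e.g. "1100"
    (character i corresponds to imaging unit i)."""
    if not cs_asserted:
        return []                       # chip select not asserted: nobody selected
    return [i for i, bit in enumerate(mask) if bit == "1"]
```

With the mask “1100”, cameras 0 and 1 (corresponding to the imaging units 20A and 20B) are selected, matching the example above.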
- FIGS. 20(a) and 20(b) are diagrams illustrating an example of an operation screen displayed on the operation / display unit 40.
- images captured by the six imaging units 20A to 20F of the panorama imaging apparatus 100 shown in FIG. 20(a) are displayed in areas A to F of the operation screen shown in FIG. 20(b), respectively.
- the frames of areas A to F on the operation screen are displayed in a color such as blue when the corresponding imaging unit is selected as a reference camera.
- when areas are grouped, the frame color changes for each group.
- by clicking a frame of areas A to F, the selection as the reference camera is toggled, and designation / non-designation is switched each time the frame is clicked.
- when the camera freeze button 75 is pressed, the image currently captured by the selected imaging unit is frozen as it is. That is, the imaging unit keeps operating, but the displayed image is held as of the moment the camera freeze button 75 was pressed.
- the camera freeze button 75 operates as a toggle: when the button is pressed again, the camera freeze is canceled and imaging by the selected imaging unit is resumed.
- when the grouping button 77 is pressed, areas arbitrarily selected from the plurality of areas A to F are grouped. In other words, specific imaging units are selected from the imaging units 20A to 20F and grouped. The areas to be grouped can be selected by clicking on the operation screen or by surrounding them with a free curve.
- FIG. 21 is a diagram for explaining a case where a reference camera is designated and shooting parameters are collectively set.
- FIG. 22 is an operation screen example in that case.
- the imaging unit 20A is set as a reference camera, and shooting parameters such as the exposure value of the reference camera are also set in the other imaging units 20B to 20F at once.
- the imaging unit 20A in the front direction is set as a reference camera, and the other imaging units 20B to 20F perform shooting with the same exposure setting as the reference camera.
- when the shooting parameters of the reference camera are adjusted, the same parameters are also reflected in the other imaging units 20B to 20F.
- the readout transmission register 120 is set to a value that designates the imaging unit 20A selected as the reference camera as the imaging unit to be read, and imaging parameters such as the exposure value and white balance value are read from the imaging unit 20A, which is the reference camera. Thereafter, the write transmission register 110 is set to a value that designates the remaining imaging units 20B to 20F other than the imaging unit 20A selected as the reference camera as the imaging units to be written, and the imaging parameters read from the imaging unit 20A, which is the reference camera, are written to the remaining imaging units 20B to 20F at once.
- a more specific operation procedure is as follows. (1) Click the area A of the front camera on the operation screen. Thereby, the frame of the area A becomes blue, and the imaging unit 20A becomes the reference camera. (2) The automatic white balance lock button 76 is pressed to lock the white balance of all the imaging units 20A to 20F with the adjustment value of the imaging unit 20A that is the reference camera. (3) The automatic exposure lock button 74 is pressed to lock the exposure values of all the imaging units 20A to 20F with the adjustment value of the imaging unit 20A that is the reference camera. (4) Shoot with the moving image recording start / stop button 70 or the still image shooting button 72.
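The reference-camera batch setting can be sketched as a read-then-broadcast of parameters; the dict-based camera model and the parameter names are illustrative assumptions, not the disclosed register interface:

```python
def apply_reference(cameras, ref_index, keys=("exposure", "white_balance")):
    """Read the given parameters from the reference camera and write them
    to all the other cameras at once."""
    ref = cameras[ref_index]
    for i, cam in enumerate(cameras):
        if i != ref_index:
            for k in keys:
                cam[k] = ref[k]   # batch write of the read-out parameters
    return cameras
```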
- FIG. 23 is a diagram for explaining a case where the outside is photographed from the room through the window, and FIGS. 24 and 25 are operation screen examples in that case.
- the window-side imaging units 20A, 20B and 20E need to limit exposure for outdoor shooting, while the remaining imaging units 20C and 20D need to shoot with exposure settings suited to the indoor scene.
- regions A, B, and E corresponding to the window-side imaging units 20A, 20B, and 20E are selected by being surrounded by a free curve 192.
- a grouping button 77 is pressed to group the selected areas A, B, and E.
- the area B is selected and the imaging unit 20B is selected as the reference camera.
- the same shooting parameters as those of the imaging unit 20B as the reference camera are reflected in the other imaging units 20A and 20E belonging to the same group.
- the readout transparency register 120 is set to a value that designates the imaging unit 20B, selected as the reference camera of the group, as the imaging unit to be read, and shooting parameters such as the exposure value and white balance value are read from the imaging unit 20B. Thereafter, the write transparency register 110 is set to a value that designates the remaining imaging units 20A and 20E in the group as the imaging units to be written, and the imaging parameters read from the reference camera are written to the remaining imaging units 20A and 20E at once.
- a more specific operation procedure is as follows. (1) Select the areas A, B, and E corresponding to the window-side imaging units 20A, 20B, and 20E on the operation screen. (2) Press the grouping button 77 to group the selected areas A, B, and E. The frames of areas A, B, and E turn green, forming the first group; the remaining areas C, D, and F form the second group. (3) Click area B. The frame of area B turns blue, and the imaging unit 20B corresponding to area B becomes the reference camera of the first (green) group. (4) Click area D. The frame of area D turns blue, and the imaging unit 20D corresponding to area D becomes the reference camera of the second group.
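The per-group locking can be sketched as below. The data layout and function names are assumptions for illustration; only the behaviour (each group copies its reference camera's parameters to its members) follows the text above.

```python
# Hypothetical sketch of group-wise parameter locking: each group has a
# reference camera whose parameters are written to the other members.
groups = {
    "window": {"members": ["A", "B", "E"], "reference": "B"},
    "indoor": {"members": ["C", "D", "F"], "reference": "D"},
}
# Illustrative starting parameters: each unit has its own exposure/white balance.
params = {u: {"ev": i, "wb": 5000 + 100 * i} for i, u in enumerate("ABCDEF")}

def lock_group(group):
    # Read from the group's reference camera, then write to every member.
    ref = dict(params[group["reference"]])
    for unit in group["members"]:
        params[unit].update(ref)

for g in groups.values():
    lock_group(g)

assert params["A"] == params["B"] == params["E"]   # first group locked to 20B
assert params["C"] == params["D"] == params["F"]   # second group locked to 20D
```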
- when the automatic white balance lock button 76 is pressed, the white balance of the imaging units 20A, 20B, and 20E in the first group is locked with the adjustment value of the imaging unit 20B, the reference camera of that group.
- likewise, the white balance of the imaging units 20C, 20D, and 20F in the second group is locked with the adjustment value of the imaging unit 20D, the reference camera of that group.
- when the automatic exposure lock button 74 is pressed, the exposure values of the imaging units 20A, 20B, and 20E in the first group are locked with the adjustment value of the imaging unit 20B, the reference camera of that group.
- likewise, the exposure values of the imaging units 20C, 20D, and 20F in the second group are locked with the adjustment value of the imaging unit 20D, the reference camera of that group.
- FIG. 26 is a diagram for explaining a case where some cameras are frozen
- FIG. 27 is an example of an operation screen in that case.
- the camera image is frozen in a state where the person 194 is not shown, and the camera shooting is temporarily interrupted.
- imaging by the imaging units 20A and 20B is temporarily suspended so that the person 194 does not appear in the captured image, and each is frozen with an image in which the person 194 is not shown.
- the regions A and B to be frozen are selected and grouped by pressing the grouping button 77; then either region A or B is selected as the reference camera and the camera freeze button 75 is pressed. The imaging units 20A and 20B corresponding to regions A and B of the group can thus be temporarily stopped, frozen with an image in which no person appears.
- a more specific operation procedure is as follows. (1) Select areas A and B of the captured image to be frozen on the operation screen. (2) The grouping button 77 is pressed to group the selected areas A and B. Thereby, the frames of the regions A and B become green. (3) Click area A or B. As a result, the image pickup unit corresponding to the selected region becomes the reference camera, but in actuality, the effect is the same as when the group for which the camera is to be frozen is designated. (4) Press the camera freeze button 75 in a state where no person is captured. As a result, the camera group designated in (3) is frozen, and the areas A and B are frozen with an image in which no person is shown. (5) Thereafter, photographing is performed by the remaining imaging units 20C to 20F that are not frozen.
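The freeze behaviour amounts to holding the last captured frame for every unit in the designated group while the remaining units continue live capture. A minimal sketch, with the frame source and function names invented for illustration:

```python
# Hypothetical sketch of the group-freeze state: frozen units keep returning
# the image held at the moment the camera freeze button 75 was pressed.
frozen = {}                      # unit -> image held while the unit is frozen

def capture(unit):               # stand-in for reading a live frame
    return f"live-frame-{unit}"

def freeze_group(units):
    # Pressing the freeze button: hold the current (person-free) image of
    # every unit in the designated group.
    for u in units:
        frozen[u] = capture(u)

def get_frame(unit):
    # Frozen units return the held image; the rest capture normally.
    return frozen[unit] if unit in frozen else capture(unit)

freeze_group(["A", "B"])                  # units 20A and 20B hold their images
assert get_frame("A") == "live-frame-A"   # held image for a frozen unit
assert "C" not in frozen                  # units 20C..20F keep shooting live
```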
- by inserting the data write mask register and the data read selection register between the interface through which the main processor accesses the cameras and the interface of each camera module, the cameras can be controlled collectively and at high speed; as a result, camera control becomes simple and efficient. Since the multi-view camera control system is realized merely by inserting registers into an existing interface, neither the camera-module interface nor the processor interface needs to be changed, and the control system is easy to design.
- FIG. 28 is a functional configuration diagram of the panoramic video stream generation unit 200 for performing multistage exposure shooting.
- the exposure setting unit 400 in FIG. 28 can be implemented in the imaging control unit 210 described in FIG. 11, and the HDR synthesis unit 420 in FIG. 28 can be implemented in the video stream multiplexing unit 230 described in FIG. 11.
- the exposure setting unit 400 sets an exposure value for each of the plurality of imaging units 20A to 20F. Specifically, an individual reference exposure value is set for each of the imaging units 20A to 20F, and a relative exposure value is set by changing the exposure value up and down about the reference exposure value.
- the individual reference exposure values of the imaging units 20A to 20F may be set to optimum exposure values by the automatic exposure (AE) mechanism of the imaging units 20A to 20F.
- the exposure value setting for each of the imaging units 20A to 20F by the exposure setting unit 400 can be performed collectively or selectively using the multi-view camera control system described with reference to FIG. 15.
- Each of the frame buffers 430A to 430F is a frame memory having an area for storing an image shot with a plurality of exposure values set in each of the imaging units 20A to 20F.
- the HDR synthesizing unit 420 reads low-dynamic-range captured images with different exposure values captured by the imaging units 20A to 20F from the frame buffers 430A to 430F, and synthesizes an imaging region with an optimal exposure value by stitching processing. As a result, a high dynamic range (HDR) panoramic image is generated.
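The selection of optimally exposed content from the bracketed low-dynamic-range images can be sketched as follows. This is a deliberately simplified per-pixel model with assumed usability thresholds; the patent's actual synthesis operates on stitched imaging regions, not individual pixel lists.

```python
# Simplified sketch: for each pixel position, take the value from the first
# bracketed exposure in which it is neither blacked out nor blown out.
# The 16..240 usable range is an assumption for illustration.
def best_exposed(brackets):
    """brackets: list of (ev, image) pairs; each image is a list of 8-bit
    luminances. Returns one merged list of luminances."""
    merged = []
    for px in zip(*(img for _, img in brackets)):
        usable = [v for v in px if 16 <= v <= 240]
        merged.append(usable[0] if usable else px[0])
    return merged

# Three brackets of a 3-pixel strip: pixel 0 is black at EV 0, pixel 2 is
# blown out except at EV -2, pixel 1 is fine everywhere.
out = best_exposed([(0, [0, 128, 255]), (+2, [64, 200, 255]), (-2, [0, 60, 180])])
assert out == [64, 128, 180]
```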
- the HDR synthesizing unit 420 can generate an LDR panoramic image obtained by performing tone mapping processing on the HDR synthesized image to compress the dynamic range.
- the HDR synthesizing unit 420 extracts feature points between adjacent captured images, and stitches adjacent captured images based on the feature points to synthesize a panoramic image.
- a known technique used in image matching can be used for the feature point extraction processing.
- the panoramic image generated by the HDR synthesizing unit 420 is stored in the panoramic image storage unit 410.
- FIGS. 29(a) and 29(b) are diagrams illustrating, for comparison, captured images taken with a common reference exposure value set for the plurality of imaging units 20A to 20F.
- FIG. 29 (a) shows images 440A to 446A when bracket shooting is performed with a certain imaging unit 20A changing the exposure value in multiple stages around the reference exposure value (here, the reference exposure value is 0).
- FIG. 29 (b) shows images 440B to 446B when bracket imaging is performed with different exposure values in multiple stages around the same reference exposure value EV as in FIG. 29 (a) by another imaging unit 20B.
- the images 440B and 441B cause “blackout” and the amount of information is zero.
- the image 446B taken at exposure value EV+7 is partly "overexposed".
- if bracket shooting is performed with the same reference exposure value in all the imaging units 20A to 20F, changing the exposure value in multiple steps above and below it, images suffering "blackout" or "overexposure" are captured, which is wasteful.
- the appropriate exposure value varies greatly depending on the field of view of each camera.
- it is effective to perform bracket shooting in which exposure values are changed in multiple steps.
- the dynamic range of field brightness varies greatly with the camera angle, so every camera would have to perform multi-stage bracket shooting over a wide exposure range covering the darkest through the brightest fields of view, which increases the amount of image data and lengthens the shooting time.
- if the exposure value is instead changed in multiple steps above and below the same reference exposure value for all cameras, then depending on the camera angle a large number of images with "overexposure" or "blackout" will be captured, which is wasteful.
- in this embodiment, a reference exposure value is set individually for each of the imaging units 20A to 20F, and each imaging unit performs bracket shooting by changing the exposure value in multiple stages above and below its individual reference exposure value.
- FIG. 30A shows the reference exposure value EVa set in the imaging unit 20A and the relative exposure values obtained by changing the exposure value three steps in the plus direction and three steps in the minus direction from EVa.
- FIG. 30B shows the reference exposure value EVb set in the imaging unit 20B and the relative exposure values obtained by changing the exposure value three steps in the plus direction and three steps in the minus direction from EVb.
- FIG. 30C shows the reference exposure value EVc set in the imaging unit 20C and the relative exposure values obtained by changing the exposure value three steps in the plus direction and three steps in the minus direction from EVc.
- the reference exposure values EVa, EVb, EVc set for the respective imaging units 20A, 20B, 20C are optimum exposure values in the field of view of the respective imaging units 20A, 20B, 20C, and are generally different values.
- in this example, the reference exposure values EVa, EVb, EVc of the imaging units 20A, 20B, 20C are each shifted by two stages. Since each imaging unit performs multi-stage exposure shooting in which the exposure value is changed by +3, +2, +1, -1, -2, -3 around its own reference value, the three imaging units 20A, 20B, and 20C as a whole cover a total of 11 different exposure values.
- Each of the imaging units 20A, 20B, and 20C performs multi-stage exposure shooting in which the exposure value is changed by three steps in the positive direction and three steps in the negative direction around the appropriate reference exposure value.
- each of the imaging units 20A, 20B, and 20C only needs to perform bracket shooting in seven stages, from +3 to -3 including the reference exposure value, so the time required for bracket shooting can be greatly reduced. Thus, in the present embodiment, even when the number of bracket-shooting stages per imaging unit 20A to 20F is small, the dynamic range available for HDR composition is greatly expanded, and HDR panoramic images can be synthesized efficiently and without waste.
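The coverage arithmetic above can be checked directly: three reference values spaced two stops apart, each bracketed over ±3 stops, span 11 distinct absolute exposure values. The concrete EV numbers below are illustrative, not taken from the patent's figures.

```python
# Verify the coverage claim: 3 cameras x 7 brackets, references 2 stops apart.
refs = {"20A": 0, "20B": 2, "20C": 4}   # EVa, EVb, EVc (illustrative values)
covered = {ref + d for ref in refs.values() for d in range(-3, 4)}
assert covered == set(range(-3, 8))     # EV -3 .. EV +7
assert len(covered) == 11               # 11 distinct exposure values overall
```

With a single shared reference, covering the same 11-stop span would require each camera to shoot 11 brackets instead of 7.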
- FIG. 31 is a diagram for explaining a panoramic image synthesized by the HDR synthesizing unit 420.
- a panoramic image shown in FIG. 31 is obtained by stitching and synthesizing an imaging area having an appropriate exposure value from captured images taken with a plurality of exposure values by the imaging units 20A to 20F.
- a region 450 is a landscape outside a window, extracted from an image captured by a certain imaging unit 20A with the exposure value appropriate for its field of view.
- a region 451 is an indoor landscape, extracted from an image captured by another imaging unit 20B with the exposure value appropriate for its field of view.
- FIG. 32 is a flowchart for explaining the multi-stage exposure shooting procedure according to this embodiment.
- automatic exposure (AE) and automatic white balance (AWB) are performed in each of the imaging units 20A to 20F, and each unit shoots with its optimal reference exposure value (S50).
- the exposure setting unit 400 sets ⁇ N-stage relative exposure values in the imaging units 20A to 20F around the reference exposure values of the imaging units 20A to 20F (S52).
- bracket imaging is performed with the set relative exposure value of ⁇ N steps (S54).
- the HDR synthesizing unit 420 synthesizes a panoramic image with a high dynamic range by synthesizing a properly exposed area from the captured images obtained by bracket shooting by the imaging units 20A to 20F by stitching processing (S56).
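The S50–S56 flow can be sketched as below. The unit and stitch interfaces are stand-ins invented for illustration, not the patent's actual APIs; only the step ordering follows the flowchart.

```python
# Hedged sketch of the multi-stage exposure procedure of FIG. 32.
class FakeUnit:
    """Stand-in for one imaging unit 20A..20F."""
    def __init__(self, ref_ev):
        self.ref_ev = ref_ev
    def auto_expose(self):          # S50: per-unit AE yields the reference EV
        return self.ref_ev
    def capture(self, ev):          # S54: capture one frame at the given EV
        return ("frame", ev)

def stitch(parts):                  # S56: stand-in for HDR stitching
    return [frame for brackets in parts for frame in brackets]

def multistage_shoot(units, n):
    parts = []
    for unit in units:
        ref = unit.auto_expose()                        # S50
        evs = [ref + d for d in range(-n, n + 1)]       # S52: +/-N relative EVs
        parts.append([unit.capture(ev) for ev in evs])  # S54: bracket shooting
    return stitch(parts)                                # S56

pano = multistage_shoot([FakeUnit(0), FakeUnit(2)], n=3)
assert len(pano) == 2 * 7           # two units, seven brackets each
```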
- in the above description, multi-stage exposure shooting was performed by changing the exposure value a predetermined number of steps above and below the reference exposure value, and a high-dynamic-range panoramic image was synthesized by stitching the properly exposed areas of the images captured by the imaging units 20A to 20F.
- a pseudo high dynamic range panoramic image can be synthesized from images captured under individual reference exposure values in the imaging units 20A to 20F.
- a method of this pseudo HDR synthesis will be described.
- FIG. 33 (a) to 33 (c) are diagrams for explaining a method of pseudo-HDR synthesis by the HDR synthesis unit 420.
- each of the image pickup units 20A to 20F is individually subjected to automatic exposure control and automatic white balance control so that one image is captured under an optimal exposure value.
- the HDR synthesizing unit 420 synthesizes an HDR panoramic image by matching corresponding points between two captured images whose shooting directions are adjacent to each other. In this case, it is considered that the luminance and color of the matching points are equal. Since the optimum exposure value by automatic exposure of each of the imaging units 20A to 20F is known, HDR synthesis is performed without rounding the luminance and color information of each pixel of the captured image to 8 bits.
- owing to the offsets between the appropriate exposure values of the three imaging units 20A, 20B, and 20C, the 8-bit luminance information of the image shot by the imaging unit 20A falls in the luminance range shown in FIG. 33(a), that of the imaging unit 20B in the range shown in FIG. 33(b), and that of the imaging unit 20C in the range shown in FIG. 33(c).
- the luminance of the image captured by the imaging unit 20B lies in a range 3 bits brighter than that of the imaging unit 20A, and the luminance of the image captured by the imaging unit 20C lies in a range a further 6 bits brighter than that of the imaging unit 20B.
- accordingly, the pixels of the image captured by the imaging unit 20B are bit-shifted by 3 bits relative to the pixels of the image captured by the imaging unit 20A, and the pixels of the image captured by the imaging unit 20C are bit-shifted by 9 bits relative to those of the imaging unit 20A.
- in this way, high-dynamic-range luminance and color-difference information can be obtained by bit-shifting each pixel value according to the appropriate exposure value of each imaging unit 20A to 20F and adding the values in a 32-bit space.
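The bit-shift alignment can be sketched numerically. The stop offsets below follow the 3-bit and further 6-bit offsets described above (so unit 20C sits 9 bits above unit 20A); the unit labels and pixel values are illustrative assumptions.

```python
# Sketch of pseudo-HDR alignment: each unit's 8-bit pixels are shifted into
# a shared 32-bit space according to how many bits brighter its appropriate
# exposure is than the darkest unit's.
stop_offset = {"20A": 0, "20B": 3, "20C": 9}   # bits, relative to unit 20A

def to_linear32(unit, pixel8):
    # Place an 8-bit pixel from the given unit into the common 32-bit scale.
    return pixel8 << stop_offset[unit]

# A pixel of value 200 from 20A and a pixel of value 25 from 20B describe
# the same scene radiance once aligned (25 << 3 == 200):
assert to_linear32("20A", 200) == 200
assert to_linear32("20B", 25) == 200
```

Matching points in overlapping images thus land on the same linear value without the information being rounded back to 8 bits.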
- in the description so far, the shooting direction of the imaging units was not adjusted to any particular direction.
- if, for example, the shooting direction of the front-facing imaging unit 20A is aligned with magnetic north, the center of the panoramic image after stitch processing comes to face north.
- FIG. 34 is a functional configuration diagram of the panoramic video stream generation unit 200 that can perform orientation alignment. A configuration and operation different from the panoramic video stream generation unit 200 of FIG. 11 will be described.
- a triaxial geomagnetic sensor 252 and a triaxial acceleration sensor 254 are mounted on the multi-view imaging unit 10 or the main body unit 30 of the panoramic imaging apparatus 100; a triaxial gyro sensor may also be mounted.
- the triaxial geomagnetic sensor 252 detects the geomagnetic vector with three axes
- the triaxial acceleration sensor 254 detects the acceleration vector of the panoramic imaging device 100 with three axes.
- if the panoramic imaging device 100 is equipped with at least one of a three-axis gyro (angular velocity) sensor, a three-axis acceleration sensor, and a three-axis geomagnetic sensor, the tilt of the panoramic imaging device 100 can be detected in three axes and posture information can be acquired.
- the azimuth information can be acquired by detecting the geomagnetic vector with three axes by the three-axis geomagnetic sensor.
- the azimuth information acquired by the triaxial geomagnetic sensor 252, and the posture information acquired by at least one of the triaxial geomagnetic sensor 252, the triaxial acceleration sensor 254, and the triaxial gyro sensor (if installed), are supplied to the user interface unit 250.
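Deriving a compass heading from the geomagnetic vector can be sketched as below. This simplified version assumes the device is held level (no tilt compensation from the acceleration sensor) and uses an illustrative axis convention; a real implementation would first rotate the field vector into the horizontal plane using the posture information.

```python
# Simplified heading from the horizontal components of the geomagnetic
# vector, assuming a level device; axis convention is an assumption.
import math

def heading_deg(mx, my):
    """Heading clockwise from magnetic north, in degrees [0, 360)."""
    return math.degrees(math.atan2(my, mx)) % 360

assert heading_deg(1.0, 0.0) == 0.0     # pointing at magnetic north
assert heading_deg(0.0, 1.0) == 90.0    # pointing east
```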
- the user interface unit 250 instructs the display control unit 260 to display on the screen the azimuth in which the multi-view imaging unit 10 of the panoramic imaging apparatus 100 is shooting and the attitude of the panoramic imaging apparatus 100, and the operation / display unit 40 displays this shooting-direction and attitude information graphically.
- the user interface unit 250 displays, on the screen, an instruction for the user to adjust the direction of the multi-eye imaging unit 10 so that the imaging direction of any imaging unit is magnetic north, for example.
- FIGS. 35 (a), (b) and FIGS. 36 (a), (b) are diagrams illustrating a user interface for adjusting the imaging direction of the imaging unit 20A in the front direction to magnetic north.
- FIG. 35A is a schematic view of the multi-view imaging unit 10 of the panoramic imaging apparatus 100 as seen from the zenith, and there is an imaging unit 20A that captures the front direction on the opposite side of the operation / display unit 40.
- with this imaging unit 20A as the front camera, guidance prompting the user to orient the multi-eye imaging unit 10 so that the front camera's shooting direction is aligned with magnetic north is displayed on the operation / display unit 40.
- the magnetic north measured by the triaxial geomagnetic sensor 252 is indicated by an arrow in FIG. 35A, and the imaging direction of the imaging unit 20A is deviated from the magnetic north.
- FIG. 35B is a diagram illustrating a screen displayed on the operation / display unit 40. Normally, as shown in FIG. 10B, images taken by the six imaging units 20A to 20F are displayed in the areas A to F of the operation / display unit 40; but when the area A corresponding to the imaging unit 20A serving as the front camera is selected, only the image captured by the imaging unit 20A is displayed enlarged on the operation / display unit 40, as shown in FIG. 35B.
- magnetic north measured by the three-axis geomagnetic sensor 252 is displayed as a compass image (reference numeral 44) below the area A (reference numeral 42) in which the image photographed by the imaging unit 20A is displayed, and below that, the message "Move the front camera to magnetic north" is displayed (reference numeral 46).
- as the imaging direction of the imaging unit 20A changes, the direction of magnetic north on the displayed compass changes accordingly.
- following the displayed message, the user rotates the main body 30 of the panoramic imaging device 100, changing the orientation of the imaging unit 20A, until the compass's magnetic north (reference numeral 44) points straight up.
- FIGS. 36(a) and 36(b) show the state in which the orientation of the imaging unit 20A, the front camera, coincides with magnetic north as measured by the triaxial geomagnetic sensor 252.
- in FIG. 36(a), the orientation of the imaging unit 20A coincides with magnetic north.
- in FIG. 36(b), the compass's magnetic north points straight up (reference numeral 44) and the message "Good!" is displayed (reference numeral 46).
- the user can set the orientation of the main body 30 of the panoramic imaging device 100 so that the imaging direction of the imaging unit 20A, which is a front camera, becomes magnetic north.
- the center of the panoramic image after stitching is true north.
- by taking the imaging unit 20A on the opposite side of the operation / display unit 40 as the front camera and aligning the front camera's direction with magnetic north, the correspondence between the shooting direction of the front camera and the direction of the center of the panoramic image becomes clear.
- since the panoramic imaging apparatus 100 is also equipped with the three-axis acceleration sensor 254, the inclination of the multi-view imaging unit 10 can be detected, and horizontal correction can be performed during stitch processing without using a level or the like.
- in the above, the shooting direction of the imaging unit 20A, the front camera, was set to magnetic north, but the shooting direction of the imaging unit 20A may instead be set to any direction desired as the center of the panoramic image. The shooting direction of an imaging unit other than the front camera may also be aligned with magnetic north.
- FIGS. 37(a), (b) and FIGS. 38(a), (b) are diagrams illustrating a user interface for adjusting the shooting direction of a specific imaging unit to magnetic north.
- the imaging unit 20C is in the direction closest to magnetic north.
- FIG. 37B is a screen displayed on the operation / display unit 40, and images taken by the imaging units 20A to 20F are displayed in the areas A to F, respectively (reference numeral 42). Since the imaging unit 20C is in the direction closest to magnetic north, the compass is displayed in the area C where the captured image of the imaging unit 20C is displayed, indicating the direction of magnetic north (reference numeral 44). A message “Move the camera to magnetic north” is displayed at the bottom (reference numeral 46).
- the user changes the orientation of the imaging unit 20C by rotating the main body 30 of the panorama imaging device 100 until the magnetic north of the compass magnet indicated by reference numeral 44 is directed straight upward according to the displayed message.
- FIGS. 38(a) and 38(b) show the state in which the orientation of the imaging unit 20C coincides with magnetic north as measured by the triaxial geomagnetic sensor 252.
- in FIG. 38(a), the orientation of the imaging unit 20C coincides with magnetic north.
- in FIG. 38(b), the compass's magnetic north points straight up (reference numeral 44) and the message "Good!" is displayed (reference numeral 46).
- in this way, the user can orient the main body 30 of the panoramic imaging device 100 so that the shooting direction of a specific imaging unit, here the one closest to magnetic north, is set to magnetic north.
- the panoramic imaging device 100 described above includes the multi-eye imaging unit 10, comprising the plurality of imaging units 20A to 20F directed in different shooting directions, and the main body unit 30 including the operation / display unit 40; however, at least some of the imaging units of the multi-eye imaging unit 10 may be directed in the same direction.
- if the multi-lens imaging unit 10 includes at least two imaging units oriented in the same shooting direction with different viewpoint positions, images with parallax can be captured; since depth can be determined from parallax images, a three-dimensional image can be generated.
- if the parallax images are panoramic images, a three-dimensional panoramic image can be generated.
- a stereo imaging device may thus be configured, and those skilled in the art will understand that the various characteristic configurations and processes of the above-described embodiments are applicable to such a stereo imaging device as well.
- the omnidirectional panoramic image is described as an example of the panoramic image.
- the panoramic image need not be an omnidirectional panoramic image; it may be an image obtained by combining a plurality of images captured by a plurality of cameras with different shooting directions.
- the term "panoramic image" used in this specification is not limited to a "panorama" in the narrow sense, that is, a horizontally long or vertically long image or a 360-degree panoramic image; it simply denotes an image covering a wide range.
- the output composite image does not have to be a so-called panoramic image; the present invention can also be applied when a normal image of arbitrary size is used as the composite image.
- the output composite image may be an image in which a plurality of images having different resolutions are hierarchized. Such a hierarchized image may be configured such that when a partial area of the image is enlarged, the enlarged area is replaced with an image with higher resolution.
Description
FIG. 1 is a perspective view of a panoramic imaging apparatus 100 according to an embodiment. The panoramic imaging apparatus 100 includes a multi-view imaging unit 10 and a main body unit 30. In this embodiment, the multi-view imaging unit 10 and the main body unit 30 are cylindrical and are joined with their central axes aligned.
Next, moving-image stream switching and recording control in the panoramic video stream generation unit 200 of the panoramic imaging apparatus 100 will be described.
FIG. 15 is a configuration diagram of a control system for selectively controlling the multi-view cameras. The imaging control unit 210 of FIG. 12 includes a bus interface / camera control interface unit 300, a transparency mask register 310, and control data transparency logic 320; upon receiving a command from the main processor 280, it collectively and selectively controls arbitrary imaging units among the plurality of imaging units 20A to 20F. For example, controls such as automatic exposure (AE), automatic white balance (AWB), and exposure value (EV) can be performed collectively or selectively.
(1) Click the area A of the front camera on the operation screen. The frame of area A turns blue, and the imaging unit 20A becomes the reference camera.
(2) Press the automatic white balance lock button 76 to lock the white balance of all the imaging units 20A to 20F with the adjustment value of the imaging unit 20A, the reference camera.
(3) Press the automatic exposure lock button 74 to lock the exposure values of all the imaging units 20A to 20F with the adjustment value of the imaging unit 20A, the reference camera.
(4) Shoot with the moving image recording start / stop button 70 or the still image shooting button 72.
(1) On the operation screen, select the areas A, B, and E corresponding to the window-side imaging units 20A, 20B, and 20E.
(2) Press the grouping button 77 to group the selected areas A, B, and E. The frames of areas A, B, and E turn green, forming the first group; the remaining areas C, D, and F form the second group.
(3) Click area B. The frame of area B turns blue, and the imaging unit 20B corresponding to area B becomes the reference camera of the first (green) group.
(4) Click area D. The frame of area D turns blue, and the imaging unit 20D corresponding to area D becomes the reference camera of the second group.
(5) When the automatic white balance lock button 76 is pressed, the white balance of the imaging units 20A, 20B, and 20E in the first group is locked with the adjustment value of the imaging unit 20B, the group's reference camera; likewise, the white balance of the imaging units 20C, 20D, and 20F in the second group is locked with the adjustment value of the imaging unit 20D, the group's reference camera.
(6) When the automatic exposure lock button 74 is pressed, the exposure values of the imaging units 20A, 20B, and 20E in the first group are locked with the adjustment value of the imaging unit 20B; likewise, the exposure values of the imaging units 20C, 20D, and 20F in the second group are locked with the adjustment value of the imaging unit 20D.
(7) Shoot with the moving image recording start / stop button 70 or the still image shooting button 72.
(1) On the operation screen, select the areas A and B of the captured images to be frozen.
(2) Press the grouping button 77 to group the selected areas A and B. The frames of areas A and B turn green.
(3) Click area A or B. The imaging unit corresponding to the selected area becomes the reference camera, which in practice has the same effect as designating the group whose cameras are to be frozen.
(4) Press the camera freeze button 75 while no person is in view. The camera group designated in (3) enters the frozen state, and areas A and B freeze with an image in which no person appears.
(5) Thereafter, shooting is performed with the remaining, unfrozen imaging units 20C to 20F.
FIG. 28 is a functional configuration diagram of the panoramic video stream generation unit 200 for performing multi-stage exposure shooting. Here, only the configuration related to multi-stage exposure shooting is illustrated; the remaining configuration is omitted for convenience of explanation. The exposure setting unit 400 of FIG. 28 can be implemented in the imaging control unit 210 described with FIG. 11, and the HDR synthesis unit 420 of FIG. 28 can be implemented in the video stream multiplexing unit 230 described with FIG. 11.
In the description above, the shooting direction of the imaging units was not adjusted to any particular direction; however, if, for example, the shooting direction of the front-facing imaging unit 20A is aligned with magnetic north, the center of the panoramic image after stitch processing comes to face north. Below, the configuration and processing for orienting the panoramic image are described.
Claims (7)
- 1. An image capturing device comprising: a multi-view imaging unit including a plurality of imaging units; an exposure setting unit that individually sets an exposure value for each of the plurality of imaging units; a storage unit that holds captured images with different exposure values captured by the imaging units for which the exposure setting unit has set exposure values; and a synthesis unit that generates a captured image with an adjusted dynamic range by combining the captured images with different exposure values.
- 2. The image capturing device according to claim 1, wherein the exposure setting unit sets, for each imaging unit, an individual reference exposure value and relative exposure values obtained by changing the exposure value by a predetermined number of stages around the reference exposure value.
- 3. The image capturing device according to claim 2, wherein the exposure setting unit sets the predetermined number of stages of the relative exposure values for each imaging unit such that the ranges of exposure values set for imaging units with adjacent shooting directions partially overlap.
- 4. The image capturing device according to claim 1, wherein the exposure setting unit sets an individual reference exposure value for each imaging unit, and the synthesis unit, when combining captured images with different exposure values captured at the reference exposure values, shifts the bits of the luminance value of each pixel according to the reference exposure value of each imaging unit before combining the luminance values, thereby synthesizing a captured image with an adjusted dynamic range.
- 5. The image capturing device according to any one of claims 2 to 4, wherein the reference exposure value is set by an automatic exposure mechanism of each imaging unit.
- 6. An image capturing method comprising: an exposure setting step of individually setting an exposure value for each of a plurality of imaging units; and a synthesis step of generating a captured image with an adjusted dynamic range by reading out and combining captured images with different exposure values from a memory holding the captured images captured by the imaging units for which the exposure setting step has set exposure values.
- 7. A computer-readable recording medium storing a program that causes a computer to realize: an exposure setting step of individually setting an exposure value for each of a plurality of imaging units; and a synthesis step of generating a captured image with an adjusted dynamic range by reading out and combining captured images with different exposure values from a memory holding the captured images captured by the imaging units for which the exposure setting step has set exposure values.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014520799A JP5881827B2 (ja) | 2012-06-11 | 2012-06-11 | 画像撮像装置および画像撮像方法 |
US14/405,238 US9979884B2 (en) | 2012-06-11 | 2012-06-11 | Image capturing device, and image capturing method |
EP12879030.0A EP2860957B1 (en) | 2012-06-11 | 2012-06-11 | Image capturing device, image capturing method and computer-readable recording medium |
PCT/JP2012/003800 WO2013186806A1 (ja) | 2012-06-11 | 2012-06-11 | 画像撮像装置および画像撮像方法 |
CN201280073714.4A CN104335569B (zh) | 2012-06-11 | 2012-06-11 | 图像生成设备以及图像生成方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2012/003800 WO2013186806A1 (ja) | 2012-06-11 | 2012-06-11 | 画像撮像装置および画像撮像方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013186806A1 true WO2013186806A1 (ja) | 2013-12-19 |
Family
ID=49757680
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/003800 WO2013186806A1 (ja) | 2012-06-11 | 2012-06-11 | 画像撮像装置および画像撮像方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9979884B2 (ja) |
EP (1) | EP2860957B1 (ja) |
JP (1) | JP5881827B2 (ja) |
CN (1) | CN104335569B (ja) |
WO (1) | WO2013186806A1 (ja) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104486544A (zh) * | 2014-12-08 | 2015-04-01 | 广东欧珀移动通信有限公司 | 一种全景照片的拍摄方法及装置 |
JP2016057602A (ja) * | 2014-09-10 | 2016-04-21 | 清水建設株式会社 | 撮影器具及びパノラマ撮影装置 |
JP2016178435A (ja) * | 2015-03-19 | 2016-10-06 | カシオ計算機株式会社 | 撮像制御装置、撮像制御方法及びプログラム |
US20160301866A1 (en) | 2015-04-10 | 2016-10-13 | Samsung Electronics Co., Ltd. | Apparatus and method for setting camera |
WO2016174707A1 (ja) * | 2015-04-27 | 2016-11-03 | 株式会社日立ハイテクノロジーズ | Charged particle beam device and sample observation method using the device |
CN106161967A (zh) * | 2016-09-13 | 2016-11-23 | 维沃移动通信有限公司 | Backlit scene panoramic shooting method and mobile terminal |
JP2016208306A (ja) * | 2015-04-23 | 2016-12-08 | パナソニックIpマネジメント株式会社 | Image processing device, imaging system including the same, and image processing method |
JP2017073620A (ja) * | 2015-10-06 | 2017-04-13 | 株式会社トプコン | Image processing device, image processing method, and image processing program |
JP2017143390A (ja) * | 2016-02-09 | 2017-08-17 | キヤノン株式会社 | Image processing device and method |
WO2017149875A1 (ja) * | 2016-02-29 | 2017-09-08 | ソニー株式会社 | Imaging control device, imaging device, and imaging control method |
KR101806840B1 (ko) | 2016-12-15 | 2017-12-11 | 인천대학교 산학협력단 | High-resolution 360-degree video generation system using multiple cameras |
CN107566749A (zh) * | 2017-09-25 | 2018-01-09 | 维沃移动通信有限公司 | Shooting method and mobile terminal |
JP2018186514A (ja) * | 2018-06-13 | 2018-11-22 | パナソニックIpマネジメント株式会社 | Image processing device, imaging system including the same, and image processing method |
US10536633B2 (en) | 2015-02-06 | 2020-01-14 | Panasonic Intellectual Property Management Co., Ltd. | Image processing device, imaging system and imaging apparatus including the same, and image processing method |
Families Citing this family (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103636187B (zh) * | 2011-08-30 | 2016-11-16 | 松下电器产业株式会社 | Imaging device |
US9406028B2 (en) * | 2012-08-31 | 2016-08-02 | Christian Humann | Expert system for prediction of changes to local environment |
US10257494B2 (en) * | 2014-09-22 | 2019-04-09 | Samsung Electronics Co., Ltd. | Reconstruction of three-dimensional video |
US11205305B2 (en) | 2014-09-22 | 2021-12-21 | Samsung Electronics Company, Ltd. | Presentation of three-dimensional video |
US10531071B2 (en) * | 2015-01-21 | 2020-01-07 | Nextvr Inc. | Methods and apparatus for environmental measurements and/or stereoscopic image capture |
CN104869297A (zh) * | 2015-06-15 | 2015-08-26 | 联想(北京)有限公司 | Image processing method and electronic device |
CN106817516A (zh) * | 2015-11-30 | 2017-06-09 | 英业达科技有限公司 | Camera integration system and method |
CN105872393A (zh) * | 2015-12-08 | 2016-08-17 | 乐视移动智能信息技术(北京)有限公司 | Method and device for generating high dynamic range images |
KR20180101165A (ko) | 2016-01-03 | 2018-09-12 | 휴먼아이즈 테크놀로지즈 리미티드 | Stitching frames into a panoramic frame |
CN105530433A (zh) * | 2016-01-28 | 2016-04-27 | 北京金久瑞科技有限公司 | Image capture synchronization control device for a multi-camera system |
WO2017147001A1 (en) | 2016-02-24 | 2017-08-31 | Endochoice, Inc. | Circuit board assembly for a multiple viewing element endoscope using cmos sensors |
WO2017156302A1 (en) * | 2016-03-10 | 2017-09-14 | Visbit Inc. | Time multiplexing programmable field of view imaging |
KR101822169B1 (ko) * | 2016-03-11 | 2018-01-25 | 삼성전자주식회사 | Electronic device for providing panoramic image and control method thereof |
US10645282B2 (en) * | 2016-03-11 | 2020-05-05 | Samsung Electronics Co., Ltd. | Electronic apparatus for providing panorama image and control method thereof |
EP3229070B1 (en) * | 2016-04-06 | 2018-09-19 | Facebook, Inc. | Three-dimensional, 360-degree virtual reality camera exposure control |
US10200624B2 (en) * | 2016-04-06 | 2019-02-05 | Facebook, Inc. | Three-dimensional, 360-degree virtual reality exposure control |
CN105721788B (zh) * | 2016-04-07 | 2019-06-07 | 福州瑞芯微电子股份有限公司 | Multi-camera electronic device and shooting method thereof |
CN106570849A (zh) * | 2016-10-12 | 2017-04-19 | 成都西纬科技有限公司 | Image optimization method |
US9921464B1 (en) * | 2017-05-03 | 2018-03-20 | Seung Kwon Choi | Gimbal for 360-degree video and picture shooting |
US11049218B2 (en) | 2017-08-11 | 2021-06-29 | Samsung Electronics Company, Ltd. | Seamless image stitching |
CN108476291A (zh) * | 2017-09-26 | 2018-08-31 | 深圳市大疆创新科技有限公司 | Image generation method, image generation device, and machine-readable storage medium |
CN111373730B (zh) * | 2017-09-28 | 2021-04-13 | 深圳传音制造有限公司 | Panoramic shooting method and terminal |
DE102017123261A1 (de) * | 2017-10-06 | 2019-04-11 | Soccerwatch.Tv Gmbh | Panoramic camera for video recording of playing fields and a method for operating this panoramic camera |
CN107566739B (zh) * | 2017-10-18 | 2019-12-06 | 维沃移动通信有限公司 | Photographing method and mobile terminal |
US11057573B2 (en) * | 2017-12-20 | 2021-07-06 | Texas Instruments Incorporated | Multi camera image processing |
US10764496B2 (en) * | 2018-03-16 | 2020-09-01 | Arcsoft Corporation Limited | Fast scan-type panoramic image synthesis method and device |
CN108377345B (zh) * | 2018-04-11 | 2020-04-03 | 浙江大华技术股份有限公司 | Exposure parameter value determination method and device, multi-view camera, and storage medium |
FR3082690B1 (fr) * | 2018-06-15 | 2021-05-21 | Safran Electronics & Defense | Proximity surveillance device |
US11624909B2 (en) | 2018-06-18 | 2023-04-11 | Magic Leap, Inc. | Head-mounted display systems with power saving functionality |
CN112567287A (zh) * | 2018-06-18 | 2021-03-26 | 奇跃公司 | Augmented reality display with frame modulation functionality |
CN109348124B (zh) * | 2018-10-23 | 2021-06-11 | Oppo广东移动通信有限公司 | Image transmission method and device, electronic device, and storage medium |
US11375104B2 (en) * | 2019-08-15 | 2022-06-28 | Apple Inc. | System for producing a continuous image from separate image sources |
CN110581956A (zh) * | 2019-08-26 | 2019-12-17 | Oppo广东移动通信有限公司 | Image processing method and device, storage medium, and electronic device |
CN114143472A (zh) * | 2019-09-02 | 2022-03-04 | 深圳市道通智能航空技术股份有限公司 | Image exposure method and device, photographing apparatus, and unmanned aerial vehicle |
US20230031023A1 (en) * | 2021-07-29 | 2023-02-02 | Qualcomm Incorporated | Multiple camera system |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11355624A (ja) * | 1998-06-05 | 1999-12-24 | Fuji Photo Film Co Ltd | Photographing device |
JP2000032303A (ja) * | 1998-07-09 | 2000-01-28 | Fuji Photo Film Co Ltd | Imaging device |
JP2005197952A (ja) * | 2004-01-06 | 2005-07-21 | Sony Corp | Imaging device and imaging method |
JP2010074535A (ja) * | 2008-09-18 | 2010-04-02 | Fujifilm Corp | Photographing device and photographing method |
JP2010098616A (ja) * | 2008-10-17 | 2010-04-30 | Sanyo Electric Co Ltd | Image processing circuit |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0912047B1 (en) * | 1997-10-23 | 2004-04-07 | Olympus Optical Co., Ltd. | Imaging apparatus comprising means for expanding the dynamic range |
US6546197B2 (en) * | 2000-03-15 | 2003-04-08 | Fuji Photo Film Co., Ltd. | Method and apparatus of correcting image data picked up from photographic film |
JP2002251608A (ja) * | 2001-02-23 | 2002-09-06 | Mixed Reality Systems Laboratory Inc | Control device for imaging apparatus and control method thereof, image processing apparatus and method thereof, program code, and storage medium |
US20030043292A1 (en) * | 2001-08-31 | 2003-03-06 | Pyle Norman C. | System and method for automatic capture of light producing scenes |
JP2004048561A (ja) * | 2002-07-15 | 2004-02-12 | Fuji Photo Film Co Ltd | Imaging device and photometry device |
US7561731B2 (en) * | 2004-12-27 | 2009-07-14 | Trw Automotive U.S. Llc | Method and apparatus for enhancing the dynamic range of a stereo vision system |
JP4932660B2 (ja) * | 2007-10-05 | 2012-05-16 | 富士フイルム株式会社 | Image recording device and image recording method |
JP2009276956A (ja) | 2008-05-14 | 2009-11-26 | Fujifilm Corp | Image processing device and method, and program |
WO2010118177A1 (en) * | 2009-04-08 | 2010-10-14 | Zoran Corporation | Exposure control for high dynamic range image capture |
JP5464955B2 (ja) | 2009-09-29 | 2014-04-09 | 株式会社ソニー・コンピュータエンタテインメント | Panoramic image display device |
CN101963751B (zh) | 2010-08-19 | 2011-11-30 | 西北工业大学 | High-resolution real-time panoramic high dynamic range image acquisition device and method |
JP5409577B2 (ja) * | 2010-10-05 | 2014-02-05 | 株式会社ソニー・コンピュータエンタテインメント | Panoramic image generation device and panoramic image generation method |
JP5701664B2 (ja) * | 2011-04-07 | 2015-04-15 | オリンパス株式会社 | Imaging device |
JP5693355B2 (ja) * | 2011-04-27 | 2015-04-01 | キヤノン株式会社 | Imaging device, control method thereof, program, and storage medium |
2012
- 2012-06-11 EP EP12879030.0A patent/EP2860957B1/en active Active
- 2012-06-11 US US14/405,238 patent/US9979884B2/en active Active
- 2012-06-11 CN CN201280073714.4A patent/CN104335569B/zh active Active
- 2012-06-11 WO PCT/JP2012/003800 patent/WO2013186806A1/ja active Application Filing
- 2012-06-11 JP JP2014520799A patent/JP5881827B2/ja active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP2860957A4 * |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016057602A (ja) * | 2014-09-10 | 2016-04-21 | 清水建設株式会社 | Photographing instrument and panoramic photographing device |
JP2019215581A (ja) * | 2014-09-10 | 2019-12-19 | 清水建設株式会社 | Photographing instrument and panoramic photographing device |
CN104486544B (zh) * | 2014-12-08 | 2017-08-11 | 广东欧珀移动通信有限公司 | Panoramic photo shooting method and device |
CN104486544A (zh) * | 2014-12-08 | 2015-04-01 | 广东欧珀移动通信有限公司 | Panoramic photo shooting method and device |
US11290645B2 (en) | 2015-02-06 | 2022-03-29 | Panasonic Intellectual Property Management Co., Ltd. | Imaging processing device, imaging system and imaging apparatus including the same, and image processing method |
US10536633B2 (en) | 2015-02-06 | 2020-01-14 | Panasonic Intellectual Property Management Co., Ltd. | Image processing device, imaging system and imaging apparatus including the same, and image processing method |
JP2016178435A (ja) * | 2015-03-19 | 2016-10-06 | カシオ計算機株式会社 | Imaging control device, imaging control method, and program |
EP3079346A3 (en) * | 2015-04-10 | 2016-11-16 | Samsung Electronics Co., Ltd. | Apparatus and method for setting camera |
US20160301866A1 (en) | 2015-04-10 | 2016-10-13 | Samsung Electronics Co., Ltd. | Apparatus and method for setting camera |
US10257416B2 (en) | 2015-04-10 | 2019-04-09 | Samsung Electronics Co., Ltd. | Apparatus and method for setting camera |
JP2016208306A (ja) * | 2015-04-23 | 2016-12-08 | パナソニックIpマネジメント株式会社 | Image processing device, imaging system including the same, and image processing method |
JPWO2016174707A1 (ja) * | 2015-04-27 | 2018-01-11 | 株式会社日立ハイテクノロジーズ | Charged particle beam device and sample observation method using the device |
WO2016174707A1 (ja) * | 2015-04-27 | 2016-11-03 | 株式会社日立ハイテクノロジーズ | Charged particle beam device and sample observation method using the device |
JP2017073620A (ja) * | 2015-10-06 | 2017-04-13 | 株式会社トプコン | Image processing device, image processing method, and image processing program |
JP2017143390A (ja) * | 2016-02-09 | 2017-08-17 | キヤノン株式会社 | Image processing device and method |
WO2017149875A1 (ja) * | 2016-02-29 | 2017-09-08 | ソニー株式会社 | Imaging control device, imaging device, and imaging control method |
US10582127B2 (en) | 2016-02-29 | 2020-03-03 | Sony Corporation | Image processing device, display device, reproduction control method, and image processing system |
CN106161967B (zh) * | 2016-09-13 | 2020-03-17 | 维沃移动通信有限公司 | Backlit scene panoramic shooting method and mobile terminal |
CN106161967A (zh) * | 2016-09-13 | 2016-11-23 | 维沃移动通信有限公司 | Backlit scene panoramic shooting method and mobile terminal |
KR101806840B1 (ko) | 2016-12-15 | 2017-12-11 | 인천대학교 산학협력단 | High-resolution 360-degree video generation system using multiple cameras |
CN107566749A (zh) * | 2017-09-25 | 2018-01-09 | 维沃移动通信有限公司 | Shooting method and mobile terminal |
JP2018186514A (ja) * | 2018-06-13 | 2018-11-22 | パナソニックIpマネジメント株式会社 | Image processing device, imaging system including the same, and image processing method |
Also Published As
Publication number | Publication date |
---|---|
JP5881827B2 (ja) | 2016-03-09 |
EP2860957A1 (en) | 2015-04-15 |
US20150116453A1 (en) | 2015-04-30 |
EP2860957A4 (en) | 2016-03-02 |
US9979884B2 (en) | 2018-05-22 |
JPWO2013186806A1 (ja) | 2016-02-01 |
CN104335569A (zh) | 2015-02-04 |
EP2860957B1 (en) | 2020-11-18 |
CN104335569B (zh) | 2017-08-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5881827B2 (ja) | Image capturing device and image capturing method | |
JP5828039B2 (ja) | Image generation device and image generation method | |
JP5828038B2 (ja) | Image capturing device and image capturing method | |
JP6151809B2 (ja) | Image capturing device and image capturing method | |
US8988558B2 (en) | Image overlay in a mobile device | |
JP7135299B2 (ja) | Image processing device, display device, image processing method, and program | |
WO2013186805A1 (ja) | Image capturing device and image capturing method | |
US11736806B2 (en) | Auto exposure metering for spherical panoramic content | |
CN113574856A (zh) | Image processing device, image processing method, and program | |
CN105744132B (zh) | Optical lens accessory for panoramic image shooting | |
JP2022128489A (ja) | Imaging device | |
JP2020042064A (ja) | Display control device, imaging device, control method, program, and storage medium | |
US11431906B2 (en) | Electronic device including tilt OIS and method for capturing image and processing captured image | |
US11122202B2 (en) | Imaging device, image processing system, and image processing method | |
US20200412928A1 (en) | Imaging device, imaging system, and imaging method | |
JP2024084281A (ja) | Control device, method, and program for imaging device | |
JP2020016883A (ja) | Imaging device | |
WO2022019879A1 (en) | Wide angle adapter lens for enhanced video stabilization | |
JP2020043387A (ja) | Image processing device, image processing method, program, and storage medium | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12879030 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2014520799 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 14405238 Country of ref document: US |
WWE | Wipo information: entry into national phase |
Ref document number: 2012879030 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: DE |