US10582127B2 - Image processing device, display device, reproduction control method, and image processing system - Google Patents
- Publication number: US10582127B2
- Application: US 16/077,218 (US201716077218A)
- Authority: US (United States)
- Prior art keywords: view, reproduction, imaging, field, image
- Legal status: Active (an assumption based on the listed status, not a legal conclusion)
Classifications
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
- H04N5/23238
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B37/04—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe, with cameras or projectors providing touching or overlapping fields of view
- H04N13/211—Image signal generators using stereoscopic image cameras using a single 2D image sensor using temporal multiplexing
- H04N23/67—Focus control based on electronic image sensor signals
- H04N5/232
- H04N7/142—Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Definitions
- the present disclosure relates to an image processing device, a display device, a reproduction control method, and an image processing system.
- Patent Literature 1 discloses an example of a technique of stitching images from a plurality of cameras to generate an omnidirectional image.
- An omnidirectional image may also be generated by sequentially performing imaging while revolving the field of view of a single camera module (that is, while rotating the camera module) and stitching a plurality of captured images captured in such a manner.
- Patent Literature 2 discloses an example of a digital still camera for generating such an omnidirectional image.
- Patent Literature 1 JP 2006-039564A
- Patent Literature 2 JP 2015-156523A
- The technology according to the present disclosure has an object to resolve or reduce the inconvenience that results from a shift in imaging timing among a plurality of captured images captured at different timings.
- According to the present disclosure, there is provided an image processing device including: a reproduction control unit configured to control reproduction of omnidirectional video based on a plurality of captured images sequentially captured in a plurality of fields of view that revolve while partially overlapping one another.
- the omnidirectional video is covered by M fields of view from a first field of view to an M-th (M>1) field of view with a reference direction serving as a starting point.
- the reproduction control unit causes a display unit to display a reproduction image based on a first captured image corresponding to the first field of view and a second captured image corresponding to the M-th field of view captured earlier than the first captured image.
- In addition, according to the present disclosure, there is provided a display device including the above-described image processing device and display unit.
- a reproduction control method of controlling reproduction of omnidirectional video based on a plurality of captured images sequentially captured in a plurality of fields of view that revolve while partially overlapping one another is provided.
- the omnidirectional video is covered by M fields of view from a first field of view to an M-th (M>1) field of view with a reference direction serving as a starting point.
- the reproduction control method includes: generating, by an image processing device, a reproduction image based on a first captured image corresponding to the first field of view and a second captured image corresponding to the M-th field of view captured earlier than the first captured image, in a case where reproduction in a reproduction field of view that straddles the reference direction is requested; and causing a display unit to display the generated reproduction image.
- In addition, according to the present disclosure, there is provided an image processing system including: an imaging device configured to generate a plurality of captured images by sequentially performing imaging in a plurality of fields of view that revolve while partially overlapping one another; and an image processing device that includes a reproduction control unit configured to control reproduction of omnidirectional video based on the plurality of captured images.
- the omnidirectional video is covered by M fields of view from a first field of view to an M-th (M>1) field of view with a reference direction serving as a starting point.
- the reproduction control unit causes a display unit to display a reproduction image based on a first captured image corresponding to the first field of view and a second captured image corresponding to the M-th field of view captured earlier than the first captured image.
- The technology according to the present disclosure resolves or reduces the inconvenience, resulting from a shift in imaging timing, that occurs during reproduction of omnidirectional video or during integral processing based on a plurality of captured images captured at different timings.
- the effects described above are not necessarily limitative. With or in the place of the above effects, there may be achieved any one of the effects described in this specification or other effects that may be grasped from this specification.
- FIG. 1A is an explanatory diagram for describing an example of a schematic configuration of an omnidirectional camera.
- FIG. 1B is an explanatory diagram for describing another example of a schematic configuration of an omnidirectional camera.
- FIG. 2A is an explanatory diagram for describing a first example of an arrangement of a plurality of camera modules neighboring in an azimuth angle direction.
- FIG. 2B is an explanatory diagram for describing a second example of an arrangement of a plurality of camera modules neighboring in an azimuth angle direction.
- FIG. 3A is an explanatory diagram for describing a first example of an arrangement of a plurality of camera modules also neighboring in an attack and depression angle direction.
- FIG. 3B is an explanatory diagram for describing a second example of an arrangement of a plurality of camera modules also neighboring in an attack and depression angle direction.
- FIG. 4A is an explanatory diagram for describing a phase shift between two camera modules arranged in a landscape arrangement and neighboring in the azimuth angle direction.
- FIG. 4B is an explanatory diagram for describing a phase shift between two camera modules arranged in a portrait arrangement and neighboring in the azimuth angle direction.
- FIG. 5A is an explanatory diagram for describing a phase shift between two camera modules arranged in the portrait arrangement and neighboring in the attack and depression angle direction.
- FIG. 5B is an explanatory diagram for describing a phase shift between two camera modules arranged in the landscape arrangement and neighboring in the attack and depression angle direction.
- FIG. 6 is a schematic diagram showing an example of a configuration of a camera system according to a first embodiment.
- FIG. 7 is a block diagram showing an example of a configuration of an imaging device and a display terminal according to the first embodiment.
- FIG. 8 is an explanatory diagram for describing an example of control of imaging timing between two camera modules arranged in the landscape arrangement and neighboring in the azimuth angle direction.
- FIG. 9 is an explanatory diagram for describing an example of control of imaging timing between two camera modules arranged in the portrait arrangement and neighboring in the azimuth angle direction.
- FIG. 10 is an explanatory diagram for describing an example of control of imaging timing between two camera modules arranged in the portrait arrangement and adjacent in the attack and depression angle direction.
- FIG. 11 is an explanatory diagram for describing an example of control of imaging timing between two camera modules arranged in the landscape arrangement and neighboring in the attack and depression angle direction.
- FIG. 12 is an explanatory diagram for describing an example of an omnidirectional image captured without a phase shift over a plurality of frames.
- FIG. 13 is an explanatory diagram for describing some examples of displayed images that may be constructed in accordance with various reproduction fields of view.
- FIG. 14 is a flowchart showing an example of a flow of imaging control processing according to the first embodiment.
- FIG. 15A is a flowchart showing an example of a detailed flow of control information acquisition processing.
- FIG. 15B is a flowchart showing another example of a detailed flow of control information acquisition processing.
- FIG. 16 is a flowchart showing an example of a flow of reproduction control processing according to the first embodiment.
- FIG. 17 is an explanatory diagram for describing an example of a schematic configuration of a rotating camera.
- FIG. 18 is an explanatory diagram showing a camera module and an imaging field of view revolving in accordance with rotation of the camera module.
- FIG. 19 is a schematic diagram showing an example of a configuration of a camera system according to a second embodiment.
- FIG. 20 is a block diagram showing an example of a configuration of an imaging device according to a first example of the second embodiment.
- FIG. 21 is an explanatory diagram for describing an example of omnidirectional frames generated by an omnidirectional frame generation unit shown in FIG. 20 .
- FIG. 22 is a block diagram showing an example of a configuration of a display terminal according to the first example of the second embodiment.
- FIG. 23A is a first explanatory diagram showing a manner in which a user moves a reproduction field of view in a certain scenario.
- FIG. 23B is a second explanatory diagram showing a manner in which a user moves a reproduction field of view in a certain scenario.
- FIG. 23C is a third explanatory diagram showing a manner in which a user moves a reproduction field of view in a certain scenario.
- FIG. 24A is an explanatory diagram for describing generation of a reproduction image at the time point of FIG. 23A .
- FIG. 24B is an explanatory diagram for describing generation of a reproduction image at the time point of FIG. 23B .
- FIG. 24C is an explanatory diagram showing a manner in which a failure occurs in a reproduction image generated at the time point of FIG. 23C .
- FIG. 24D is an explanatory diagram showing a manner in which a reproduction image having no failure is generated at the time point of FIG. 23C .
- FIG. 25 is a flowchart showing an example of a flow of imaging control processing according to the first example of the second embodiment.
- FIG. 26 is a flowchart showing an example of a flow of reproduction control processing according to the first example of the second embodiment.
- FIG. 27 is a block diagram showing an example of a configuration of an imaging device according to a second example of the second embodiment.
- FIG. 28 is an explanatory diagram for describing reference frames for inter-frame prediction in the second example.
- FIG. 29 is a block diagram showing an example of a configuration of a display terminal according to the second example of the second embodiment.
- FIG. 30 is an explanatory diagram for describing an example of a required buffer size of a frame buffer shown in FIG. 29 .
- FIG. 31 is an explanatory diagram for describing a readout position of a reproduction image corresponding to a desired reproduction field of view in the frame buffer shown in FIG. 29 .
- FIG. 32 is a flowchart showing an example of a flow of imaging control processing according to the second example of the second embodiment.
- FIG. 33 is a flowchart showing an example of a flow of reproduction control processing according to the second example of the second embodiment.
- FIG. 34 is an explanatory diagram showing an example of transition of a buffer state of the frame buffer in a first scenario.
- FIG. 35 is an explanatory diagram showing an example of transition of a buffer state of the frame buffer in a second scenario.
- FIG. 36 is an explanatory diagram showing an example of transition of a buffer state of the frame buffer in a third scenario.
- an “omnidirectional image” refers to an image that covers the entire field of view of 360° in at least one direction (for example, one or more of an azimuth angle direction and an attack and depression angle direction).
- An omnidirectional image is typically generated by integrating a plurality of captured images.
- “Omnidirectional video” refers to video that covers the entire field of view of 360° in at least one direction. Reproduction of an omnidirectional image and reproduction of omnidirectional video are typically performed in a partial reproduction field of view. For example, in reproduction of omnidirectional video, a reproduction image (reproduction frame) of a (partial) reproduction field of view in any direction requested by a user in the entire field of view of 360° is generated and displayed on a screen.
- a reproduction frame of omnidirectional video may be cut out from an omnidirectional frame generated in advance, in agreement with the reproduction field of view, as an example.
- Alternatively, the reproduction frame of omnidirectional video may be constructed at the time of reproduction from captured images of one or more corresponding imaging fields of view.
- the expression of an “omnidirectional frame” has a meaning substantially equivalent to an omnidirectional image, but suggests that it is a kind of frame included in a moving image.
- a mere expression of a “frame” or an “individual frame” includes each of a series of captured images included in video that have been captured in individual imaging fields of view.
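- As an illustration of the cyclical nature of the reproduction field of view, the following sketch (not part of the patent; the function name and parameters are assumptions for illustration) cuts a reproduction frame out of an equirectangular omnidirectional frame, wrapping column indices so that a reproduction field of view that straddles the reference direction is handled seamlessly.

```python
import numpy as np

def cut_reproduction_frame(omni_frame: np.ndarray,
                           yaw_deg: float, h_fov_deg: float) -> np.ndarray:
    """Cut a reproduction field of view out of an equirectangular
    omnidirectional frame (height x width x channels)."""
    height, width = omni_frame.shape[:2]
    px_per_deg = width / 360.0
    center = yaw_deg * px_per_deg
    half = (h_fov_deg / 2.0) * px_per_deg
    # Wrap column indices modulo the frame width so that a field of view
    # straddling the reference direction (azimuth 0) reads from both the
    # right edge and the left edge of the omnidirectional frame.
    cols = np.arange(int(center - half), int(center + half)) % width
    return omni_frame[:, cols]

# Example: a 90-degree reproduction field of view centered on the
# reference direction combines pixels from both edges of the frame.
omni = np.zeros((512, 2048, 3), dtype=np.uint8)
view = cut_reproduction_frame(omni, yaw_deg=0.0, h_fov_deg=90.0)
```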
- the present disclosure includes a technology related to imaging and reproduction of omnidirectional video based on images captured by a plurality of camera modules and a technology related to imaging and reproduction of omnidirectional video based on images captured by a single rotating camera module.
- The first embodiment mainly describes the former, and the second embodiment mainly describes the latter. Note that characteristics described in these embodiments may be combined with each other in any way unless otherwise stated.
- Imaging device
- Timing of reading out a pixel value performed by each camera module of an omnidirectional camera may vary depending on the pixel position.
- For example, the timing of reading out a pixel value may differ between the leading pixel and the trailing pixel of one image sensor. Between camera modules that respectively image partially overlapping fields of view, this difference causes a shift in imaging timing of the overlapping portion of the fields of view. This shift is referred to as a phase shift in the present specification.
- In a case where a quickly moving subject is imaged by the omnidirectional camera, how the subject is seen does not agree between neighboring camera modules due to such a phase shift, and it may be difficult to integrally process captured images from those camera modules, as the sketch below illustrates. The same applies to a case where imaging is performed while the camera is moved quickly.
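- As a rough model (my own sketch under a uniform-readout-speed assumption, not text from the patent), the readout time of a pixel in a sequential readout system follows from its one-dimensionalized readout order, which makes the phase shift between two pixels explicit:

```python
def readout_offset_s(row: int, col: int, n_rows: int, n_cols: int,
                     t_focal_plane_s: float) -> float:
    """Time after readout start at which pixel (row, col) is read out,
    assuming line-by-line readout from the top-left corner and a
    uniform readout speed over the whole focal plane."""
    order = row * n_cols + col  # one-dimensionalized readout order
    return order * t_focal_plane_s / (n_rows * n_cols)

# With a 33 ms focal-plane time, the trailing pixel is read out almost
# 33 ms later than the leading pixel; when that trailing pixel overlaps
# the leading pixel of a neighboring module started at the same instant,
# this difference is exactly the phase shift discussed above.
delta = (readout_offset_s(1079, 1919, 1080, 1920, 0.033)
         - readout_offset_s(0, 0, 1080, 1920, 0.033))
print(delta)  # ~0.033 s
```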
- The present embodiment proposes a technology for resolving or reducing inconvenience resulting from such a phase shift in an omnidirectional camera (or in any device that integrally processes captured images from a plurality of camera modules).
- An omnidirectional camera refers to, in many cases, a camera capable of imaging the entire field of view of 360° around a certain reference axis.
- In many cases, the reference axis is a vertical axis, and the field of view of 360° is formed in the azimuth angle direction.
- Some omnidirectional cameras also have a wide field of view in the attack and depression angle direction, and may be capable of imaging even the zenith, for example.
- the present embodiment may be applied to various omnidirectional cameras having various fields of view. For example, the present embodiment is also applicable to a camera only having a field of view of less than 360° in any direction.
- FIG. 1A is an explanatory diagram for describing an example of a schematic configuration of an omnidirectional camera.
- An omnidirectional camera 10 a shown in FIG. 1A has an enclosure 11 a formed cylindrically around an axis 13 a .
- On the outer peripheral surface of the enclosure 11 a , a plurality of camera modules 20 a , 20 b , 20 c , 20 d , 20 e , 20 f , 20 g , and 20 h having lenses directed radially in directions different from one another are located.
- each camera module 20 images the field of view of the camera module to generate a captured image.
- the camera module 20 a images a first field of view to generate a first image signal.
- the camera module 20 b images a second field of view neighboring to the first field of view in the azimuth angle direction to generate a second image signal.
- the optical axes of eight camera modules 20 extend radially in the horizontal plane.
- an integral image that covers the entire field of view of 360° in the azimuth angle direction can be obtained by integrating captured images from all the camera modules 20 .
- Typically, an overlapping portion exists in some proportion between two neighboring fields of view, and in that case, the field of view of each camera module 20 in the azimuth angle direction has a viewing angle of more than 45°, as the check below illustrates.
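- The 45° figure can be checked with a small calculation (my own back-of-the-envelope sketch; the overlap model is an assumption): if each of N fields of view overlaps its neighbor by a proportion r OL of its own width, the non-overlapping portions must tile 360° in the azimuth angle direction.

```python
def required_viewing_angle_deg(n_modules: int, overlap_ratio: float) -> float:
    """Azimuthal viewing angle each module needs so that the
    non-overlapping portions of n_modules fields of view tile 360 deg."""
    return 360.0 / (n_modules * (1.0 - overlap_ratio))

print(required_viewing_angle_deg(8, 0.0))  # 45.0 -> no overlap at all
print(required_viewing_angle_deg(8, 0.1))  # 50.0 -> 10% overlap per module
```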
- FIG. 1B is an explanatory diagram for describing another example of a schematic configuration of an omnidirectional camera.
- An omnidirectional camera 10 b shown in FIG. 1B has an enclosure 11 b formed generally as a sphere having been cut at the top surface and the bottom surface that are orthogonal to an axis 13 b .
- On the outer peripheral surface of the enclosure 11 b , four pairs of camera modules having fields of view neighboring in the attack and depression angle direction are located.
- the pair of camera modules 20 i and 20 j , the pair of camera modules 20 k and 20 m , the pair of the camera modules 20 n and 20 p , and the pair of the camera modules 20 q and 20 r are located at a spacing of 90° in the azimuth angle direction between neighboring pairs.
- each camera module 20 images the field of view of the camera module to generate a captured image.
- the camera module 20 i images a first field of view to generate a first image signal.
- the camera module 20 j images a second field of view neighboring to the first field of view in the attack and depression angle direction to generate a second image signal.
- the optical axes of the two camera modules 20 of each pair extend radially in the vertical plane.
- an integral image having a viewing angle of 105° can be obtained in the attack and depression angle direction by integrating captured images generated by the camera modules 20 i and 20 j .
- an image that covers the entire field of view of 360° in the azimuth angle direction can also be obtained.
- FIG. 1A and FIG. 1B are mere examples.
- the present embodiment may be applied to a camera of any shape including any number of two or more camera modules.
- Readout of a pixel value from an image sensor is usually performed in the rolling shutter system or the global shutter system.
- In a case where a charge coupled device (CCD) is employed as the image sensor, pixel value readout is performed in principle in the global shutter system.
- In a case where a complementary metal oxide semiconductor (CMOS) sensor is employed, pixel value readout is usually performed in the rolling shutter system.
- In the rolling shutter system, which is a sequential readout system, charges accumulated in respective pixels are sequentially read out as pixel values from an upper line to a lower line in a two-dimensional focal plane, and from a left pixel to a right pixel in each line.
- Depending on the pixel readout direction in the sequential readout system, there may be two typical patterns of landscape arrangement and portrait arrangement for the arrangement of camera modules.
- FIG. 2A is an explanatory diagram for describing a first example of the arrangement of a plurality of camera modules neighboring in the azimuth angle direction.
- In FIG. 2A , the fields of view of a plurality of camera modules are expressed two-dimensionally using the horizontal axis representing an azimuth angle φ and the vertical axis representing an attack and depression angle θ.
- a field of view Fv 1 a is the field of view of the camera module 20 a .
- a field of view Fv 1 b is the field of view of the camera module 20 b .
- a field of view Fv 1 c is the field of view of the camera module 20 c .
- the fields of view Fv 1 a and Fv 1 b neighbor to each other in the azimuth angle direction, and include an overlapping portion Fo 1 a in common.
- the fields of view Fv 1 b and Fv 1 c neighbor to each other in the azimuth angle direction, and include an overlapping portion Fo 1 b in common.
- the camera module 20 reads out the pixel value of each pixel in the field of view of its own for each horizontal line and from the left to the right in each line, as indicated by the arrow in the drawing. Consequently, for example, the field of view Fv 1 a of the camera module 20 a precedes the field of view Fv 1 b of the camera module 20 b in the pixel readout direction.
- the time until the readout timing reaches the overlapping portion Fo 1 a after readout of the pixel value by the camera module 20 a is started is shorter than the time required for reading out one line by the proportion of the overlapping portion.
- FIG. 2B is an explanatory diagram for describing a second example of the arrangement of a plurality of camera modules neighboring in the azimuth angle direction.
- a field of view Fv 2 a is the field of view of the camera module 20 a .
- a field of view Fv 2 b is the field of view of the camera module 20 b .
- a field of view Fv 2 c is the field of view of the camera module 20 c .
- the fields of view Fv 2 a and Fv 2 b neighbor to each other in the azimuth angle direction, and include an overlapping portion Fo 2 a in common.
- the fields of view Fv 2 b and Fv 2 c neighbor to each other in the azimuth angle direction, and include an overlapping portion Fo 2 b in common.
- the camera module 20 reads out the pixel value of each pixel in the field of view of its own for each vertical line and from the bottom to the top in each line, as indicated by the arrow in the drawing. Consequently, for example, the field of view Fv 2 a of the camera module 20 a precedes the field of view Fv 2 b of the camera module 20 b in the pixel readout direction.
- the time until the readout timing reaches the overlapping portion Fo 2 a after readout of pixel values by the camera module 20 a is started is shorter than the time required for reading out all the lines in the focal plane by the proportion of the overlapping portion.
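- The two arrangements therefore differ sharply in how quickly the readout reaches the overlapping portion with the azimuthal neighbor. A sketch under the same uniform-readout assumption as before (the line count and times are illustrative):

```python
def time_to_overlap_s(overlap_ratio: float, t_focal_plane_s: float,
                      portrait: bool, n_lines: int = 1080) -> float:
    """Time from readout start until the readout position first enters
    the overlapping portion with the neighbor in the azimuth direction.

    Landscape (horizontal lines): the overlap lies at the end of every
    line, so it is reached within a fraction of one line time.
    Portrait (vertical lines): all non-overlapping lines are read first,
    so it is reached after a fraction of the whole focal-plane time.
    """
    if portrait:
        return (1.0 - overlap_ratio) * t_focal_plane_s
    t_line = t_focal_plane_s / n_lines
    return (1.0 - overlap_ratio) * t_line

# 33 ms focal plane, 10% overlap: ~27.5 microseconds (landscape)
# versus ~29.7 milliseconds (portrait).
print(time_to_overlap_s(0.1, 0.033, portrait=False))
print(time_to_overlap_s(0.1, 0.033, portrait=True))
```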
- FIG. 3A is an explanatory diagram for describing a first example of the arrangement of a plurality of camera modules also neighboring in an attack and depression angle direction.
- In FIG. 3A , the fields of view of a plurality of camera modules are expressed two-dimensionally using the horizontal axis representing an azimuth angle φ and the vertical axis representing an attack and depression angle θ.
- a field of view Fv 3 i is the field of view of the camera module 20 i .
- a field of view Fv 3 j is the field of view of the camera module 20 j .
- a field of view Fv 3 k is the field of view of the camera module 20 k .
- the fields of view Fv 3 i and Fv 3 j neighbor to each other in the attack and depression angle direction, and include an overlapping portion Fo 3 ij in common.
- the fields of view Fv 3 i and Fv 3 k neighbor to each other in the azimuth angle direction, and include an overlapping portion Fo 3 ik in common.
- the camera module 20 reads out the pixel value of each pixel in the field of view of its own for each vertical line and from the bottom to the top in each line, as indicated by the arrow in the drawing.
- the field of view Fv 3 i of the camera module 20 i precedes the field of view Fv 3 j of the camera module 20 j in the pixel readout direction.
- In addition, the field of view Fv 3 i precedes the field of view Fv 3 k of the camera module 20 k .
- the time until the readout timing reaches the overlapping portion Fo 3 ij after readout of the pixel value by the camera module 20 i is started is shorter than the time required for reading out one line by the proportion of the overlapping portion.
- FIG. 3B is an explanatory diagram for describing a second example of the arrangement of a plurality of camera modules also neighboring in an attack and depression angle direction.
- a field of view Fv 4 j is the field of view of the camera module 20 j .
- a field of view Fv 4 m is the field of view of the camera module 20 m .
- a field of view Fv 4 i is the field of view of the camera module 20 i .
- the fields of view Fv 4 j and Fv 4 m neighbor to each other in the azimuth angle direction, and include an overlapping portion Fo 4 jm in common.
- the fields of view Fv 4 j and Fv 4 i neighbor to each other in the attack and depression angle direction, and include an overlapping portion Fo 4 ji in common.
- the camera module 20 reads out the pixel value of each pixel in the field of view of its own for each horizontal line and from the left to the right in each line, as indicated by the arrow in the drawing. Consequently, for example, the field of view Fv 4 j of the camera module 20 j precedes the field of view Fv 4 m of the camera module 20 m in the pixel readout direction, and precedes the field of view Fv 4 i of the camera module 20 i .
- the time until the readout timing reaches the overlapping portion Fo 4 ji after readout of the pixel value by the camera module 20 j is started is shorter than the time required for reading out all the lines in the focal plane by the proportion of the overlapping portion.
- Consequently, the phase of the image signal corresponding to the overlapping portion differs between the first image signal generated by the first imaging unit and the second image signal generated by the second imaging unit. The phase herein may be understood to stand for the timing at which the pixel value at a certain pixel position is read out during one frame time.
- FIG. 4A is an explanatory diagram for describing a phase shift between two camera modules arranged in the landscape arrangement and neighboring in the azimuth angle direction.
- the field of view Fv 1 a is the field of view of the camera module 20 a .
- the field of view Fv 1 b is the field of view of the camera module 20 b .
- a pixel P 11 a and a pixel P 12 a are pixels in the field of view Fv 1 a .
- a pixel P 11 b and a pixel P 12 b are pixels in the field of view Fv 1 b .
- the pixel P 12 a and the pixel P 11 b occupy substantially the same position in the integral field of view.
- FIG. 4B is an explanatory diagram for describing a phase shift between two camera modules arranged in the portrait arrangement and neighboring in the azimuth angle direction.
- the field of view Fv 2 a is the field of view of the camera module 20 a .
- the field of view Fv 2 b is the field of view of the camera module 20 b .
- a pixel P 21 a and a pixel P 22 a are pixels in the field of view Fv 2 a .
- a pixel P 21 b and a pixel P 22 b are pixels in the field of view Fv 2 b .
- the pixel P 22 a and the pixel P 21 b occupy substantially the same position in the integral field of view.
- FIG. 5A is an explanatory diagram for describing a phase shift between two camera modules arranged in the portrait arrangement and neighboring in the attack and depression angle direction.
- the field of view Fv 3 i is the field of view of the camera module 20 i .
- the field of view Fv 3 j is the field of view of the camera module 20 j .
- a pixel P 31 i and a pixel P 32 i are pixels in the field of view Fv 3 i .
- a pixel P 31 j and a pixel P 32 j are pixels in the field of view Fv 3 j .
- the pixel P 32 i and the pixel P 31 j occupy substantially the same position in the integral field of view.
- FIG. 5B is an explanatory diagram for describing a phase shift between two camera modules arranged in the landscape arrangement and neighboring in the attack and depression angle direction.
- the field of view Fv 4 j is the field of view of the camera module 20 j .
- the field of view Fv 4 i is the field of view of the camera module 20 i .
- a pixel P 41 j and a pixel P 42 j are pixels in the field of view Fv 4 j .
- a pixel P 41 i and a pixel P 42 i are pixels in the field of view Fv 4 i .
- the pixel P 42 j and the pixel P 41 i occupy substantially the same position in the integral field of view.
- The phase shift described using FIGS. 4A, 4B, 5A, and 5B causes inconsistency in how the subject or background is seen between captured images that reflect neighboring fields of view, and reduces the accuracy of integral processing such as stitching or stereo matching. Therefore, in the camera system that will be described in detail from the following section, control of imaging timing for resolving such a phase shift is introduced.
- FIG. 6 is a schematic diagram showing an example of a configuration of a camera system according to the first embodiment.
- a camera system 1 may include an imaging device 100 , a server 160 , and display terminals 170 a and 170 b.
- the imaging device 100 includes, for example, a plurality of imaging units (camera modules) respectively having such a plurality of fields of view that integrally cover the entire field of view of 360° around a certain reference axis.
- the imaging device 100 may include any number of imaging units, and those imaging units may have any arrangement.
- the imaging device 100 is an omnidirectional camera placed in an imaging environment 102 .
- the imaging device 100 may be, for example, a standalone device that records captured images in a local memory.
- the imaging device 100 may be connected to the server 160 , or the display terminal 170 a or 170 b via a network 165 .
- the server 160 is an information processing device that accumulates captured images generated by the imaging device 100 (or images for reproduction generated from those captured images through integral processing).
- the server 160 receives an image from the imaging device 100 via the network 165 or via a direct connection line (not shown), and causes a storage medium to store the received image.
- the server 160 distributes the image received from the imaging device 100 to the display terminal 170 . Note that, in the case where the imaging device 100 and the display terminal 170 are directly connected, the camera system 1 may not include the server 160 .
- the network 165 may be a public network such as the Internet, or a private network such as a home network or a corporate network.
- the network 165 may include any combination of a wireless link and a wired link.
- the display terminal 170 a is a terminal device having the function of displaying an image captured by the imaging device 100 .
- the display terminal 170 a is a wearable terminal worn on the head of a user.
- the user can reproduce an image captured by the imaging device 100 on the screen of the display terminal 170 a .
- the image captured by the imaging device 100 is, for example, an omnidirectional image (each of frames included in a still image or a moving image) that integrally covers the entire field of view of 360°. In the case where the user is directed to a certain direction, a displayed image of a reproduction field of view corresponding to that direction in the omnidirectional image may be reproduced.
- When the direction in which the user is directed changes, the reproduction field of view changes, and a displayed image of the reproduction field of view after the change may be reproduced.
- the display terminal 170 b is also a terminal device having the function of displaying an image captured by the imaging device 100 .
- the display terminal 170 b is a mobile terminal held by the user.
- the user can reproduce an omnidirectional image captured by the imaging device 100 on the screen of the display terminal 170 b .
- In the case where the display terminal 170 b is directed to a certain direction, a displayed image of a reproduction field of view corresponding to that direction in the omnidirectional image may be reproduced.
- When that direction changes, the reproduction field of view changes, and a displayed image of the reproduction field of view after the change may be reproduced.
- the display terminal 170 may acquire an omnidirectional image directly from the imaging device 100 , or may acquire an omnidirectional image from the server 160 . Processing of constructing an omnidirectional image or an individual displayed image from captured images generated by the plurality of imaging units of the imaging device 100 may be performed by any of the imaging device 100 , the server 160 , and the display terminal 170 .
- FIG. 7 shows an example of a configuration of the imaging device 100 according to the first embodiment.
- the imaging device 100 includes a multi-camera unit 110 , an imaging control unit 120 , a memory 125 , an image processing unit 130 , and a frame buffer 135 .
- the multi-camera unit 110 includes a plurality of imaging units 112 a , 112 b , 112 c , . . . , and 112 n .
- the first imaging unit 112 a images a first field of view to generate a first image signal, and outputs the generated first image signal to the image processing unit 130 .
- the second imaging unit 112 b images a second field of view to generate a second image signal, and outputs the generated second image signal to the image processing unit 130 .
- the second field of view partially overlaps the first field of view.
- the third imaging unit 112 c images a third field of view to generate a third image signal, and outputs the generated third image signal to the image processing unit 130 .
- the third field of view partially overlaps at least one of the first field of view and the second field of view.
- the n-th imaging unit 112 n images an n-th field of view to generate an n-th image signal, and outputs the generated n-th image signal to the image processing unit 130 .
- The entire field of view of 360° around a certain reference axis may be integrally covered by all of the fields of view of the plurality of imaging units 112 a , 112 b , 112 c , . . . , and 112 n , or by a subset of those fields of view.
- the reference axis is the vertical axis, and the entire field of view of 360° in the azimuth angle direction is covered by a plurality of fields of view.
- the present embodiment is also applicable to the case where the reference axis has any inclination.
- the first imaging unit 112 a includes a CMOS image sensor (not shown) that photoelectrically converts an image of light that enters from the first field of view via a lens into an electric signal, and reads out the pixel value from the image sensor in the sequential readout system to generate the first image signal.
- the second imaging unit 112 b includes a CMOS image sensor (not shown) that photoelectrically converts an image of light that enters from the second field of view via a lens into an electric signal, and reads out the pixel value from the image sensor in the sequential readout system to generate the second image signal.
- the other imaging units 112 may also generate respective image signals with a technique similar to the first imaging unit 112 a and the second imaging unit 112 b except having specific fields of view, respectively.
- In one example, the first field of view of the first imaging unit 112 a and the second field of view of the second imaging unit 112 b neighbor in the azimuth angle direction.
- the first imaging unit 112 a and the second imaging unit 112 b are both arranged in the portrait arrangement, and read out pixel values per vertical line.
- the relation between the first and second fields of view in this case is equivalent to the relation between the field of view Fv 2 a and the field of view Fv 2 b illustrated in FIG. 2B .
- In another example, the first field of view of the first imaging unit 112 a and the second field of view of the second imaging unit 112 b neighbor in the azimuth angle direction.
- In this example, the first imaging unit 112 a and the second imaging unit 112 b are both arranged in the landscape arrangement, and read out pixel values per horizontal line.
- the relation between the first and second fields of view in this case is equivalent to the relation between the field of view Fv 1 a and the field of view Fv 1 b illustrated in FIG. 2A .
- In still another example, the first field of view of the first imaging unit 112 a and the second field of view of the second imaging unit 112 b neighbor in the attack and depression angle direction.
- In this example, the first imaging unit 112 a and the second imaging unit 112 b are both arranged in the portrait arrangement, and read out pixel values per vertical line.
- the relation between the first and second fields of view in this case is equivalent to the relation between the field of view Fv 3 i and the field of view Fv 3 j illustrated in FIG. 3A .
- In yet another example, the first field of view of the first imaging unit 112 a and the second field of view of the second imaging unit 112 b neighbor in the attack and depression angle direction.
- In this example, the first imaging unit 112 a and the second imaging unit 112 b are both arranged in the landscape arrangement, and read out pixel values per horizontal line.
- the relation between the first and second fields of view in this case is equivalent to the relation between the field of view Fv 4 j and the field of view Fv 4 i illustrated in FIG. 3B .
- the imaging control unit 120 is a controller module that controls an imaging operation in the multi-camera unit 110 .
- the imaging control unit 120 causes capturing of an omnidirectional image in the multi-camera unit 110 to be started in accordance with a trigger for starting imaging detected via some user interface (not shown) or communication interface (not shown), for example.
- In a case where the omnidirectional image constitutes a moving image, capturing of the omnidirectional image may be repeated over a plurality of frames until a trigger for terminating imaging is detected.
- the imaging control unit 120 controls imaging timing of each imaging unit 112 of the multi-camera unit 110 for the imaging operation in each frame. For example, the imaging control unit 120 controls imaging timing of at least one of the first image signal generated in the first imaging unit 112 a and the second image signal generated in the second imaging unit 112 b such that the phase of the above-described first image signal and the phase of the above-described second image signal corresponding to the overlapping portion of the first field of view and the second field of view agree. In addition, the imaging control unit 120 also similarly controls imaging timing of another pair of imaging units having fields of view that partially overlap each other such that the phases of image signals corresponding to the overlapping portion agree in the pair.
- Here, suppose that the first field of view of the first imaging unit 112 a precedes the second field of view of the second imaging unit 112 b in the pixel readout direction.
- In this case, the imaging control unit 120 may delay readout of the pixel value from the leading pixel in the second imaging unit 112 b by the time it takes for the readout pixel position in the first imaging unit 112 a to reach the overlapping portion of the first field of view and the second field of view from the leading pixel, as the scheduling sketch below illustrates.
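- A minimal scheduling sketch of this rule (a hypothetical hook, not an API defined by the patent): the controller starts the second unit's readout only after the first unit has had time to read out the non-overlapping part of its field of view.

```python
import threading

def start_with_delay(start_readout, delay_s: float) -> threading.Timer:
    """Schedule a camera module's focal-plane readout to start after
    delay_s seconds. start_readout is a callable that triggers readout
    from the module's leading pixel (assumed hardware hook)."""
    timer = threading.Timer(delay_s, start_readout)
    timer.start()
    return timer

# Portrait arrangement, 10% overlap, 33 ms focal-plane time: delay the
# second unit by the time the first unit needs to reach the overlap.
overlap_ratio, t_focal_plane_s = 0.1, 0.033
delay_s = (1.0 - overlap_ratio) * t_focal_plane_s  # ~29.7 ms
# start_with_delay(second_unit.start_readout, delay_s)
```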
- FIG. 8 to FIG. 11 show examples of control of imaging timing performed by the imaging control unit 120 for each combination of the two arrangement patterns (landscape arrangement/portrait arrangement) of imaging units and two neighboring directions (azimuth angle direction/attack and depression angle direction) of the imaging units.
- a timing chart of reading out pixel values in the case where control of imaging timing is not performed is shown on the left side for comparison, and a timing chart of reading out pixel values in the case where control of imaging timing is performed by the imaging control unit 120 is shown on the right side.
- the horizontal axis of each of the timing charts represents the phase of image signal, and the vertical axis represents the pixel position one-dimensionalized in accordance with the readout order in the sequential readout system.
- FIG. 8 corresponds to the case where the first imaging unit 112 a and the second imaging unit 112 b are arranged in the landscape arrangement, and the fields of view of those imaging units neighbor in the azimuth angle direction.
- the solid line represents imaging timing (pixel value readout timing) of a first image signal Im 1 a generated by the first imaging unit 112 a at each pixel position
- the broken line represents imaging timing of a second image signal Im 1 b generated by the second imaging unit 112 b at each pixel position.
- Without control of imaging timing, imaging of the second image signal Im 1 b is started at the pixel P 11 b at the same time when imaging of the first image signal Im 1 a is started at the pixel P 11 a .
- Although the pixel P 11 b of the second image signal Im 1 b occupies substantially the same position as the pixel P 12 a of the first image signal Im 1 a in the integral field of view, the pixel value of the pixel P 12 a of the first image signal Im 1 a is not read out at this time point. Thereafter, at the time point when a time τ 1 elapses, the pixel value of the pixel P 12 a is read out by the first imaging unit 112 a.
- Thus, the imaging control unit 120 delays readout of the pixel value of the leading pixel P 11 b in the second imaging unit 112 b by the time τ 1 , until the readout pixel position in the first imaging unit 112 a reaches the first pixel P 12 a in the overlapping portion from the leading pixel P 11 a .
- As a result, the phase of the first image signal Im 1 a and the phase of the second image signal Im 1 b corresponding to the overlapping portion agree.
- FIG. 9 corresponds to the case where the first imaging unit 112 a and the second imaging unit 112 b are arranged in the portrait arrangement, and the fields of view of those imaging units neighbor in the azimuth angle direction.
- the solid line represents imaging timing (pixel value readout timing) of a first image signal Im 2 a generated by the first imaging unit 112 a at each pixel position
- the broken line represents imaging timing of a second image signal Im 2 b generated by the second imaging unit 112 b at each pixel position.
- Without control of imaging timing, imaging of the second image signal Im 2 b is started at the pixel P 21 b at the same time when imaging of the first image signal Im 2 a is started at the pixel P 21 a .
- Although the pixel P 21 b of the second image signal Im 2 b occupies substantially the same position as the pixel P 22 a of the first image signal Im 2 a in the integral field of view, the pixel value of the pixel P 22 a of the first image signal Im 2 a is not read out at this time point. Thereafter, at the time point when a time τ 2 elapses, the pixel value of the pixel P 22 a is read out by the first imaging unit 112 a.
- Thus, the imaging control unit 120 delays readout of the pixel value of the leading pixel P 21 b in the second imaging unit 112 b by the time τ 2 , until the readout pixel position in the first imaging unit 112 a reaches the first pixel P 22 a in the overlapping portion from the leading pixel P 21 a .
- As a result, the phase of the first image signal Im 2 a and the phase of the second image signal Im 2 b corresponding to the overlapping portion agree.
- FIG. 10 corresponds to the case where the first imaging unit 112 a and the second imaging unit 112 b are arranged in the portrait arrangement, and the fields of view of those imaging units neighbor in the attack and depression angle direction.
- the solid line represents imaging timing (pixel value readout timing) of a first image signal Im 3 a generated by the first imaging unit 112 a at each pixel position
- the broken line represents imaging timing of a second image signal Im 3 b generated by the second imaging unit 112 b at each pixel position.
- Without control of imaging timing, imaging of the second image signal Im 3 b is started at the pixel P 31 j at the same time when imaging of the first image signal Im 3 a is started at the pixel P 31 i .
- Although the pixel P 31 j of the second image signal Im 3 b occupies substantially the same position as the pixel P 32 i of the first image signal Im 3 a in the integral field of view, the pixel value of the pixel P 32 i of the first image signal Im 3 a is not read out at this time point. Thereafter, at the time point when a time τ 3 elapses, the pixel value of the pixel P 32 i is read out by the first imaging unit 112 a.
- Thus, the imaging control unit 120 delays readout of the pixel value of the leading pixel P 31 j in the second imaging unit 112 b by the time τ 3 , until the readout pixel position in the first imaging unit 112 a reaches the first pixel P 32 i in the overlapping portion from the leading pixel P 31 i .
- As a result, the phase of the first image signal Im 3 a and the phase of the second image signal Im 3 b corresponding to the overlapping portion agree.
- FIG. 11 corresponds to the case where the first imaging unit 112 a and the second imaging unit 112 b are arranged in the landscape arrangement, and the fields of view of those imaging units neighbor in the attack and depression angle direction.
- the solid line represents imaging timing (pixel value readout timing) of a first image signal Im 4 a generated by the first imaging unit 112 a at each pixel position
- the broken line represents imaging timing of a second image signal Im 4 b generated by the second imaging unit 112 b at each pixel position.
- Without control of imaging timing, imaging of the second image signal Im 4 b is started at the pixel P 41 i at the same time when imaging of the first image signal Im 4 a is started at the pixel P 41 j .
- Although the pixel P 41 i of the second image signal Im 4 b occupies substantially the same position as the pixel P 42 j of the first image signal Im 4 a in the integral field of view, the pixel value of the pixel P 42 j of the first image signal Im 4 a is not read out at this time point. Thereafter, at the time point when a time τ 4 elapses, the pixel value of the pixel P 42 j is read out by the first imaging unit 112 a.
- Thus, the imaging control unit 120 delays readout of the pixel value of the leading pixel P 41 i in the second imaging unit 112 b by the time τ 4 , until the readout pixel position in the first imaging unit 112 a reaches the first pixel P 42 j in the overlapping portion from the leading pixel P 41 j .
- As a result, the phase of the first image signal Im 4 a and the phase of the second image signal Im 4 b corresponding to the overlapping portion agree.
- Note that, in a case where a phase shift has only a small influence on integral processing, the imaging control unit 120 may choose not to relatively delay the imaging timing and may ignore that phase shift.
- The memory 125 may store, in advance, timing control information that defines the delay time τ 1 , τ 2 , τ 3 , or τ 4 described using FIG. 8 to FIG. 11 .
- the timing control information not only defines the delay time of imaging timing of the second imaging unit 112 b based on imaging timing of the first imaging unit 112 a (that is, the delay time of imaging timing of the second image signal with respect to the first image signal), but also may define the delay time of imaging timing of each of the other imaging units 112 c to 112 n .
- the imaging control unit 120 may acquire timing control information that defines such a delay time in advance in association with each of the imaging units 112 from the memory 125 , and may transmit an imaging start instruction signal to each of the imaging units 112 in accordance with the acquired timing control information.
- Alternatively, the imaging start instruction signal may be successively forwarded between the imaging units 112 having neighboring fields of view, with the signal transmitted from the imaging control unit 120 to the first imaging unit 112 a serving as a trigger: from the imaging control unit 120 to the first imaging unit 112 a , from the first imaging unit 112 a to the second imaging unit 112 b , from the second imaging unit 112 b to the third imaging unit 112 c , and so forth.
- the imaging control unit 120 may dynamically determine the delay time of imaging timing of the other imaging units 112 b to 112 n based on the imaging timing of the first imaging unit 112 a , and may transmit the imaging start instruction signal to each of the imaging units 112 in accordance with the determined delay time.
- The delay time of imaging timing between two imaging units 112 having neighboring fields of view may depend on the frame time, as will be described later. Therefore, for example, the imaging control unit 120 may dynamically determine the delay time to be applied to each imaging unit depending on a required frame time (or a required frame rate) that may be set in a variable manner, as the sketch below illustrates.
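- Combining the two options above, a sketch (the names are assumptions) that either looks up a precomputed timing-control table from memory or derives equal per-unit delays from the variable required frame time:

```python
from typing import Dict, List, Optional

def delay_schedule_s(n_cam: int, t_frame_s: float,
                     table: Optional[Dict[int, float]] = None) -> List[float]:
    """Imaging-start delays of each unit relative to the first unit.

    If a precomputed timing-control table (unit index -> delay) is
    available, use it; otherwise derive the equal spacing
    t_frame_s / n_cam, which changes with the required frame rate.
    """
    if table is not None:
        return [table.get(i, 0.0) for i in range(n_cam)]
    tau = t_frame_s / n_cam
    return [i * tau for i in range(n_cam)]

# Eight units at 30 fps: delays 0, 1/240, 2/240, ... seconds.
print(delay_schedule_s(8, 1.0 / 30.0))
```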
- In addition, the imaging control unit 120 controls the pixel readout speed in the first imaging unit 112 a such that pixel values within the range from the leading pixel (for example, the pixel P 11 a in FIG. 8 , the pixel P 21 a in FIG. 9 , the pixel P 31 i in FIG. 10 , or the pixel P 41 j in FIG. 11 ) to the first pixel in the overlapping portion (for example, the pixel P 12 a in FIG. 8 , the pixel P 22 a in FIG. 9 , the pixel P 32 i in FIG. 10 , or the pixel P 42 j in FIG. 11 ) are read out during the delay time (for example, τ 1 , τ 2 , τ 3 , or τ 4 ) applied to pixel value readout in the second imaging unit 112 b.
- Here, suppose that N CAM imaging units 112 integrally cover the entire field of view of 360° around a certain reference axis.
- the N CAM imaging units 112 may be all the imaging units included in the multi-camera unit 110 , or may be a subset of those imaging units (for example, four camera modules on the top surface side or four camera modules on the bottom surface side in the example of FIG. 1B ).
- Suppose also that the imaging timing of the first imaging unit 112 a is the earliest among the N CAM imaging units 112 , and the imaging timing of the n-th imaging unit 112 n is the latest.
- the field of view of the n-th imaging unit 112 n neighbors the field of view of the first imaging unit 112 a (in the direction opposite to the pixel readout direction). In terms of imaging timing, imaging of the i-th frame by the first imaging unit 112 a is performed subsequently to imaging of the (i−1)-th frame by the n-th imaging unit 112 n .
- In this way, a phase shift in the overlapping portion between a captured image of the (i−1)-th frame by the n-th imaging unit 112 n and a captured image of the i-th frame by the first imaging unit 112 a can be resolved for any integer i, and integral processing of those captured images can be smoothed.
- pixel value readout in the N CAM imaging units 112 can be repeated cyclically without a phase shift over a plurality of frames in the case where the following relational expression holds between a frame time T FRAME which is a reciprocal of a required frame rate of an omnidirectional image and the number N CAM of imaging units 112 arranged around the reference axis which may be the vertical axis.
- T FRAME =(1−r OL )×T FP ×N CAM (1)
- r OL in Expression (1) represents the proportion of a portion in a captured image of one imaging unit 112 that overlaps a captured image of another imaging unit 112 .
- T FP is a time required for one imaging unit 112 to read out a captured image in one focal plane.
- the delay time τ of imaging timing between two imaging units 112 having neighboring fields of view may be determined as follows on the basis of Expression (1).
- the delay time τ of pixel value readout in the second imaging unit 112 b with respect to the first imaging unit 112 a in the N CAM imaging units 112 is equal to the quotient obtained by dividing the required frame time T FRAME by the number N CAM of imaging units 112 (that is, τ = T FRAME /N CAM ).
- the delay time between any two neighboring imaging units 112 may be the same value.
- the pixel readout speed S FP in each imaging unit 112 may be the quotient obtained by dividing the total number of pixels in one focal plane by the focal plane time T FP (that is, S FP = (total number of pixels)/T FP ), for example.
- calculation of the delay time τ and the pixel readout speed S FP may be performed in advance (for example, during manufacturing of the camera or during calibration after manufacturing) as described above.
- timing control information that defines the delay time of imaging timing of each imaging unit 112 and readout control information that defines the pixel readout speed S FP may be generated in advance and stored in the memory 125 .
- the imaging control unit 120 may dynamically calculate one or both of the delay time of imaging timing and the pixel readout speed of each imaging unit 112 using the above-described parameters.
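- as a concrete illustration of the relationships above, the following Python sketch derives the per-unit delay τ and the pixel readout speed S FP from a required frame rate and checks the constraint of Expression (1); the function name and the example parameter values are illustrative assumptions, not part of the original disclosure.
```python
def derive_timing(frame_rate_hz: float, n_cam: int, r_ol: float,
                  t_fp_s: float, pixels_per_focal_plane: int):
    """Derive the inter-unit delay and the pixel readout speed.

    frame_rate_hz: required frame rate of the omnidirectional image
    n_cam:         number N_CAM of imaging units around the reference axis
    r_ol:          proportion r_OL of the overlapping portion
    t_fp_s:        focal plane time T_FP in seconds
    pixels_per_focal_plane: total number of pixels in one focal plane
    """
    t_frame = 1.0 / frame_rate_hz  # frame time T_FRAME (reciprocal of rate)
    # Expression (1): cyclic readout without a phase shift requires
    # T_FRAME >= (1 - r_OL) * T_FP * N_CAM.
    if t_frame < (1.0 - r_ol) * t_fp_s * n_cam:
        raise ValueError("Expression (1) violated: frame time too short")
    tau = t_frame / n_cam                   # delay between neighboring units
    s_fp = pixels_per_focal_plane / t_fp_s  # pixel readout speed S_FP
    return tau, s_fp

# Example with illustrative values: 8 units, 30 fps, 10% overlap, 3 ms T_FP.
tau, s_fp = derive_timing(30.0, 8, 0.10, 0.003, 1920 * 1080)
print(f"tau = {tau * 1000:.2f} ms, S_FP = {s_fp:.0f} pixels/s")
```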
- FIG. 12 is an explanatory diagram for describing an example of an omnidirectional image captured over a plurality of frames without a phase shift.
- the i-th frame of the omnidirectional image includes eight captured images Im 51 a , Im 51 b , Im 51 c , Im 51 d , Im 51 e , Im 51 f , Im 51 g , and Im 51 h . Then, imaging timings of these captured images are delayed by the delay time τ sequentially from the captured image preceding in the pixel readout direction, and 8×τ is equal to the frame time T FRAME . That is, the product of the delay time τ and the number N CAM of imaging units 112 is equal to the frame time T FRAME of the omnidirectional image, and the total of the phase delays is equivalent to 360°.
- the phase in the overlapping portion agrees between the captured image Im 50 h captured latest in the (i−1)-th frame and the captured image Im 51 a captured earliest in the i-th frame
- the phase in the overlapping portion also agrees between the captured image Im 51 h captured latest in the i-th frame and a captured image Im 52 a captured earliest in the (i+1)-th frame.
- the image processing unit 130 or the display terminal 170 which will be described later can integrally process a captured image of the earliest field of view in the i-th (or (i+1)-th) frame and a captured image of the latest field of view in the (i−1)-th (or i-th) frame without being influenced by a phase shift.
- an omnidirectional image usually represents an image of a field of view around the camera two-dimensionally in accordance with the cylindrical projection (such as the equirectangular projection or the rectilinear projection, for example).
- the field of view circulates in one of the dimensions of an image, and the field of view does not circulate in the other dimension.
- a cyclical field of view is applied to the azimuth angle direction and a non-cyclical field of view is applied to the attack and depression angle direction in conformity to properties of the human visual system.
- the delay time of imaging timing between the imaging units 112 having fields of view neighboring in the non-cyclical direction may be determined with any technique as long as image signals of the overlapping portion of the fields of view agree in phase.
- the delay time may not be set.
- the image processing unit 130 integrally processes images represented by image signals input from the plurality of imaging units 112 of the multi-camera unit 110 to generate an output image.
- the image signals input from the plurality of imaging units 112 represent captured images that respectively reflect fields of view different from one another.
- the image processing unit 130 may use a partial image in an overlapping portion of two captured images that reflect neighboring fields of view to stitch those captured images and generate a combined image.
- the image processing unit 130 may determine a parallax of a subject by executing stereo matching using two captured images overlapping each other.
- the image processing unit 130 may generate a depth map representing a result of parallax determination, or may generate an omnidirectional stereoscopic image.
- the image processing unit 130 may simply forward those images to another device (for example, the server 160 or the display terminal 170 described using FIG. 6 ).
- the image processing unit 130 may subject the images to compression encoding in any video compression system before forwarding the images.
- the image processing unit 130 may avoid the influence caused by a phase shift by integrally processing a captured image of the i-th (or (i+1)-th) frame and a captured image of the (i−1)-th (or i-th) frame in the case where the field of view targeted for processing straddles the phase origin.
- the frame buffer 135 may buffer not only captured images of the newest frame imaged by the imaging units 112 , respectively, but also captured images of the immediately preceding frame.
- FIG. 7 also shows an example of a configuration of the display terminal 170 according to the first embodiment.
- the display terminal 170 includes a reproduction control unit 180 , a frame buffer 185 , and a display unit 190 .
- the reproduction control unit 180 acquires an omnidirectional image output from the imaging device 100 in accordance with a trigger for reproduction start detected via some user interface (not shown), for example, to reproduce the omnidirectional image on the screen of the display unit 190 .
- the reproduction control unit 180 may receive the image directly from the imaging device 100 , or may receive the image via an intermediate device such as the server 160 , for example.
- the omnidirectional image acquired by the reproduction control unit 180 typically includes a plurality of images that respectively reflect fields of view different from one another per frame.
- Each image may be a captured image itself generated by the imaging unit 112 of the imaging device 100 , or an image (for example, a stereoscopic image including a right-eye image and a left-eye image) generated by processing a captured image.
- the reproduction control unit 180 trims or links one or more images corresponding to a reproduction field of view instructed by a user (or autonomously determined in accordance with the attitude of the terminal, for example) to construct a displayed image of the reproduction field of view. Then, the reproduction control unit 180 outputs the constructed displayed image to the display unit 190 .
- FIG. 13 is an explanatory diagram for describing some examples of displayed images that may be constructed in accordance with various reproduction fields of view.
- the fields of view of eight imaging units 112 cover the field of view of 360° in the azimuth angle direction, and two imaging units 112 cover the integral field of view in the attack and depression angle direction. Therefore, an omnidirectional image of one frame includes sixteen images in total.
- the reproduction field of view is smaller than the entire field of view of the omnidirectional image.
- three illustrative reproduction fields of view Fr 1 , Fr 2 , and Fr 3 are shown by thick-frame rectangles.
- the reproduction control unit 180 may acquire an image Im 60 p in the (i−1)-th frame and images Im 61 b and Im 61 d in the i-th frame, for example, and trim required portions from those images Im 60 p , Im 61 b , and Im 61 d and link the required portions to construct a displayed image of the reproduction field of view Fr 1 .
- the reproduction control unit 180 may acquire images Im 61 e , Im 61 g , and Im 61 i in the i-th frame, and trim required portions from them and link the required portions to construct a displayed image of the reproduction field of view Fr 2 .
- the reproduction control unit 180 may acquire images Im 61 m , Im 61 n , Im 61 o , and Im 61 p in the i-th frame and images Im 62 a and Im 62 b in the (i+1)-th frame, and trim required portions from them and link the required portions to construct a displayed image of the reproduction field of view Fr 3 .
- the reproduction field of view Fr 1 reflects a subject J 1 , and the image of the subject J 1 may be reproduced without distortion or failure since there is no phase shift between the partial images included in the displayed image of the reproduction field of view Fr 1 , even though those partial images have been captured by different camera modules.
- the reproduction field of view Fr 3 also reflects the subject J 1 , and the image of the subject J 1 may be reproduced without distortion or failure since there is no phase shift between partial images included in the displayed image of the reproduction field of view Fr 3 .
- if the displayed image of the reproduction field of view Fr 3 were instead constructed only from the images Im 61 a , Im 61 b , Im 61 m , Im 61 n , Im 61 o , and Im 61 p in the i-th frame,
- the image of the subject J 1 would fail due to a phase shift between the images Im 61 a and Im 61 b and the remaining images.
- such a failure may be prevented by the technique described in the above-described embodiment.
- the frame buffer 185 buffers individual images included in an omnidirectional image (still image or moving image) acquired by the reproduction control unit 180 for the past two frames. Accordingly, if the reproduction field of view is set in any direction in the entire field of view of 360°, it is possible to make access to all the partial images required for the reproduction control unit 180 to appropriately construct the displayed image of that reproduction field of view.
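- the following minimal Python sketch illustrates one way such double buffering could be organized, keeping the newest image and the image of the immediately preceding frame per field of view; the class and method names are hypothetical, not the patent's API.
```python
from collections import deque

class TwoFrameBuffer:
    """Keeps, per field of view, the newest image and the image of the
    immediately preceding frame, so that a displayed image straddling
    the phase origin can mix partial images of frame i and frame i-1."""

    def __init__(self, num_views: int):
        self._slots = [deque(maxlen=2) for _ in range(num_views)]

    def push(self, view_index: int, image) -> None:
        self._slots[view_index].append(image)   # the older frame drops out

    def newest(self, view_index: int):
        return self._slots[view_index][-1]

    def previous(self, view_index: int):
        slot = self._slots[view_index]
        # Fall back to the newest image until two frames have arrived.
        return slot[0] if len(slot) == 2 else slot[-1]
```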
- the display unit 190 is typically a display device that may include a screen and a display driver.
- the display unit 190 reproduces a displayed image of a reproduction field of view input from the reproduction control unit 180 on the screen.
- the screen may be, for example, equivalent to a microdisplay in the eyeglass wearable terminal 170 a shown in FIG. 6 , and a touch panel in the mobile terminal 170 b .
- the omnidirectional image is not limited to these examples, but may be displayed on a monitor of a fixed terminal such as a personal computer (PC) or a television device, for example.
- the omnidirectional image may be projected onto a screen by a projector.
- FIG. 14 is a flowchart showing an example of a flow of imaging control processing that may be executed by the imaging control unit 120 according to the first embodiment.
- the imaging control unit 120 first acquires control information for controlling an imaging operation in the multi-camera unit 110 (step S 110 ).
- the control information acquired herein may include timing control information that defines the delay time of imaging timing of each of the plurality of imaging units 112 , and readout control information that defines the pixel readout speed.
- the imaging control unit 120 awaits a trigger for starting imaging (step S 115 ). For example, when a user input that instructs the start of imaging is detected via a user interface, or a control command that instructs the start of imaging is detected via a communication interface, the imaging control processing proceeds to step S 120 .
- In step S 120 , the imaging control unit 120 supplies the imaging start instruction signal to the first imaging unit 112 a , which should operate at the earliest timing among the plurality of imaging units 112 .
- the first imaging unit 112 a is triggered by the supplied imaging start instruction signal to start readout of the pixel value from the leading pixel in the focal plane by the sequential readout system.
- the pixel readout speed in each imaging unit 112 may be designated by the imaging start instruction signal, or may be set via a different control signal.
- after the determined delay time has elapsed, the imaging start instruction signal is supplied to the second imaging unit 112 b (step S 130 ). Such waiting for a delay time and supply of the imaging start instruction signal are repeated until imaging of all the fields of view has been started (step S 135 ).
- the image processing unit 130 integrally processes captured images that reflect different fields of view imaged respectively in this manner by the plurality of imaging units 112 at different timings (step S 140 ).
- the imaging control unit 120 supplies the imaging start instruction signal to the first imaging unit 112 a again (step S 120 ). Such capturing of the omnidirectional image may be repeated until the termination of imaging is instructed.
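- as a minimal sketch of this control flow, the following Python loop staggers the start instructions by the delay time τ; the unit interface (start_readout) and all names are illustrative assumptions, not the patent's API.
```python
import time

def run_staggered_capture(imaging_units, tau_s: float, stop_requested):
    """Repeat steps S120 to S135: supply the imaging start instruction to
    each unit with a stagger of tau seconds between neighboring units.
    `imaging_units` is ordered along the pixel readout direction."""
    n = len(imaging_units)
    while not stop_requested():
        frame_start = time.monotonic()
        for k, unit in enumerate(imaging_units):
            wait = frame_start + k * tau_s - time.monotonic()
            if wait > 0:
                time.sleep(wait)                 # wait the delay time
            unit.start_readout()                 # imaging start instruction
        # The next frame begins one frame time (n * tau) after this one.
        rest = frame_start + n * tau_s - time.monotonic()
        if rest > 0:
            time.sleep(rest)
```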
- FIG. 15A is a flowchart showing an example of a detailed flow of control information acquisition processing that may be executed in step S 110 in FIG. 14 .
- the imaging control unit 120 first acquires, from the memory 125 , timing control information that defines the imaging timing of each module, determined in advance in consideration of the frame time T FRAME of an omnidirectional image and the number of camera modules N CAM (step S 111 ).
- the imaging control unit 120 acquires readout control information that defines the pixel readout speed determined in advance from the memory 125 (step S 112 ).
- FIG. 15B is a flowchart showing another example of a detailed flow of control information acquisition processing that may be executed in step S 110 in FIG. 14 .
- the imaging control unit 120 first determines the imaging timing per module (for example, the delay time of imaging timing of another imaging unit 112 based on the imaging timing of one imaging unit 112 ), and generates timing control information indicating the determined imaging timing (step S 113 ).
- the imaging control unit 120 determines the pixel readout speed on the basis of parameters such as the total number of pixels in one focal plane and the focal plane time, for example, and generates readout control information indicating the determined pixel readout speed (step S 114 ).
- FIG. 16 is a flowchart showing an example of a flow of reproduction control processing that may be executed by the display terminal 170 according to the first embodiment.
- the reproduction control unit 180 of the display terminal 170 determines the reproduction field of view on the basis of an instruction from the user or a result of measuring the attitude of the terminal (step S 181 ). Subsequent processing branches in accordance with whether or not the determined reproduction field of view straddles the phase origin (step S 183 ).
- in the case where the reproduction field of view does not straddle the phase origin, the reproduction control unit 180 acquires one or more images corresponding to the reproduction field of view from an image set included in an omnidirectional image of a single frame (step S 185 ).
- on the other hand, in the case where the reproduction field of view straddles the phase origin, the reproduction control unit 180 acquires a plurality of images corresponding to the reproduction field of view from image sets included in an omnidirectional image over a plurality of frames (step S 187 ).
- the reproduction control unit 180 trims or links one or more images acquired in step S 185 or S 187 in agreement with the reproduction field of view to construct the displayed image of the reproduction field of view (step S 189 ).
- the display unit 190 reproduces the displayed image of the reproduction field of view constructed by the reproduction control unit 180 on the screen (step S 191 ). Such reproduction of an omnidirectional image may be repeated until the termination of reproduction is instructed (step S 193 ).
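- the branch in steps S 183 to S 189 can be sketched as follows in Python; the frame representation (lists of (start_deg, end_deg, image) entries covering 0° to 360° of azimuth) and the helper names are illustrative assumptions.
```python
def build_display_image(view_start_deg: float, view_width_deg: float,
                        current_frame, previous_frame):
    """Sketch of steps S183 to S189: select the partial images needed for
    the reproduction field of view, mixing two frames when it straddles
    the phase origin (360 degrees wraps back to 0)."""
    view_end = view_start_deg + view_width_deg
    straddles_origin = view_end > 360.0            # step S183

    def clipped(frame, lo, hi):
        # Partial images of `frame` overlapping the range [lo, hi).
        return [(max(s, lo), min(e, hi), img)
                for (s, e, img) in frame if s < hi and e > lo]

    if not straddles_origin:                       # step S185
        parts = clipped(current_frame, view_start_deg, view_end)
    else:                                          # step S187
        # Fields just before the phase origin come from the preceding
        # frame; fields just after the origin come from the newer frame.
        parts = clipped(previous_frame, view_start_deg, 360.0)
        parts += clipped(current_frame, 0.0, view_end - 360.0)
    return parts   # step S189 would trim and link these partial images
```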
- the imaging timing of at least one of two imaging units that image fields of view that partially overlap each other and generate image signals, respectively, is controlled such that the image signals corresponding to the overlapping portion of those fields of view agree in phase. That is, since a phase shift between two imaging units is suppressed, it is possible to prevent integral processing of captured images from the two imaging units from becoming difficult due to a phase shift.
- CMOS can be employed as an image sensor, for example, and it is not necessary to mount a global shutter on the CMOS device. This means that the present embodiment facilitates low-cost manufacture or size reduction of an omnidirectional camera capable of imaging a quickly moving subject in high quality or an omnidirectional camera that may even support quick camera work.
- readout of the pixel value of the leading pixel in the second imaging unit may be delayed by the time it takes for the readout pixel position in the first imaging unit, whose field of view precedes the field of view of the second imaging unit in the pixel readout direction, to reach the overlapping portion of the fields of view from the leading pixel.
- in the case where the delay time per imaging unit is determined in advance, no additional calculation burden is imposed on camera processing during imaging.
- a phase shift can be suppressed while providing flexibility of causing the user to variably set parameters such as the required frame rate, for example.
- the above-described delay time may be determined on the basis of the number of imaging units that integrally cover the entire field of view of 360° around a certain reference axis and a required frame time.
- a phase shift can be suppressed smoothly at any position in the circulating direction (for example, the azimuth angle direction).
- the pixel readout speed in each imaging unit may be controlled such that pixel values in a range from the leading pixel to reach the overlapping portion of the fields of view are read out during the above-described delay time.
- the above-described delay time can be achieved by controlling the pixel readout speed, so that a phase shift can be suppressed reliably.
- a single camera module performs imaging M times while rotating 360° around a certain rotation axis.
- M captured images cover the entire field of view of 360°. By stitching these M captured images, one omnidirectional image can be generated.
- omnidirectional video for X omnidirectional frames can include X×M captured images.
- the display terminal can provide the user with an experience as if a scene as reflected in the omnidirectional frame spreads around the user.
- the imaging timing is greatly different between the first captured image positioned at the beginning in the forward direction in the omnidirectional frame and the M-th captured image positioned at the end in the forward direction.
- the time difference between the imaging timings of the two captured images positioned on both ends of the omnidirectional frame is (M−1)×Δt, where Δt is the interval between consecutive imagings (for example, 500 ms when M=6 and Δt=100 ms; these values are merely illustrative). If a reproduction image is to be generated on the basis of the first captured image and the M-th captured image in the case where the reproduction field of view straddles a reference direction equivalent to the boundary of the omnidirectional frame, an inconsistency in how a subject or the background is seen, occurring due to the time difference (M−1)×Δt between them, interferes with generating a reproduction image free of failure.
- the present embodiment proposes a technology for resolving or reducing inconvenience resulting from a shift in imaging timing in reproduction of omnidirectional video based on captured images from such a rotating camera module.
- FIG. 17 is an explanatory diagram for describing an example of a schematic configuration of a rotating camera.
- a rotating camera 50 shown in FIG. 17 includes an enclosure 61 including a camera module 60 , a rotating member 63 , and a fixed member 65 .
- the enclosure 61 has one side surface (or the bottom surface) coupled to the rotating member 63 in a fixed manner.
- the rotating member 63 is coupled to the fixed member 65 rotatably around a rotation axis 51 .
- the rotation axis 51 is the vertical axis.
- the camera module 60 performs imaging periodically while rotating in this manner to generate captured images, as will be described later in detail.
- the intersection between the rotation axis 51 and the optical axis of the camera module 60 is referred to as a nodal point. While the camera module 60 is rotating, a nodal point 53 does not move.
- FIG. 18 shows the camera module 60 as seen from vertically above and an imaging field of view revolving around the nodal point 53 (or the rotation axis 51 ) in accordance with the rotation of the camera module 60 .
- the camera module 60 performs imaging six times during a rotation.
- the camera module 60 is directed to a first imaging direction D 1 a and performs first imaging for a first imaging field of view Fv 6 a .
- the camera module 60 is directed to a second imaging direction D 1 b and performs second imaging for a second imaging field of view Fv 6 b .
- the camera module 60 is directed to a third imaging direction D 1 c and performs third imaging for a third imaging field of view Fv 6 c .
- the camera module 60 is directed to a fourth imaging direction D 1 d and performs fourth imaging for a fourth imaging field of view Fv 6 d .
- the camera module 60 is directed to a fifth imaging direction D 1 e and performs fifth imaging for a fifth imaging field of view Fv 6 e .
- the camera module 60 is directed to a sixth imaging direction D 1 f and performs sixth imaging for a sixth imaging field of view Fv 6 f .
- any two imaging fields of view imaged consecutively overlap each other if the viewing angle of each of the imaging fields of view is larger than 60° (=360°/6).
- the rotating camera shown in FIG. 17 and FIG. 18 is a mere example.
- the present embodiment is also applicable to the case where the rotation axis 51 has any inclination.
- the rotating camera may have two or more camera modules.
- FIG. 17 shows an example where the rotating member 63 has an arm-like shape and the fixed member 65 has a rod-like shape, whilst these members may have other shapes.
- FIG. 19 is a schematic diagram showing an example of a configuration of the camera system according to the second embodiment.
- a camera system 5 may include an imaging device 200 , a server 240 , and display terminals 250 a and 250 b.
- the imaging device 200 is a device similar to the rotating camera 50 illustrated in FIG. 17 that, for example, performs imaging periodically while rotating around a certain rotation axis.
- the imaging device 200 is placed in an imaging environment 202 .
- the imaging device 200 may be, for example, a standalone device that records captured images in a local memory.
- the imaging device 200 may be connected to the server 240 or the display terminal 250 a or 250 b via a network 245 .
- the server 240 is an information processing device that accumulates captured images generated by the imaging device 200 or omnidirectional frames generated from those captured images.
- the server 240 receives an image from the imaging device 200 via the network 245 or via a direct connection line (not shown), and causes a storage medium to store the received image.
- the server 240 distributes the image received from the imaging device 200 to the display terminal 250 . Note that, in the case where the imaging device 200 and the display terminal 250 are directly connected, the camera system 5 may not include the server 240 .
- the network 245 may be a public network such as the Internet, or a private network such as a home network or a corporate network.
- the network 245 may include any combination of a wireless link and a wired link.
- the display terminal 250 a is a terminal device having the function of displaying an image captured by the imaging device 200 .
- the display terminal 250 a is a wearable terminal worn on the head of a user.
- the user can reproduce an image captured by the imaging device 200 on the screen of the display terminal 250 a .
- for example, when the user wearing the display terminal 250 a faces a certain direction, an image of a reproduction field of view corresponding to that direction in the entire field of view of 360° may be reproduced.
- when the user then changes the direction he or she faces, the reproduction field of view changes, and an image of the reproduction field of view after the change may be reproduced.
- the display terminal 250 b is also a terminal device having the function of displaying an image captured by the imaging device 200 .
- the display terminal 250 b is a mobile terminal held by the user.
- the user can reproduce an omnidirectional image captured by the imaging device 200 on the screen of the display terminal 250 b .
- for example, when the user directs the display terminal 250 b to a certain direction, a displayed image of a reproduction field of view corresponding to that direction in the entire field of view of 360° may be reproduced.
- the display terminal 250 may acquire an omnidirectional image directly from the imaging device 200 , or may acquire an image from the server 240 .
- Processing of constructing omnidirectional frames or individual reproduction images from a series of captured images may be performed by any of the imaging device, the server, and the display terminal.
- in a first example, the imaging device generates each of the omnidirectional frames included in omnidirectional video
- in a second example, a display terminal generates a reproduction image for omnidirectional video from the individual captured images generated by the imaging device
- FIG. 20 shows an example of a configuration of the imaging device 200 according to a first example of the second embodiment.
- the imaging device 200 includes an imaging unit 210 , an imaging control unit 220 , an omnidirectional frame generation unit 230 , and a frame memory 235 .
- the imaging unit 210 generates a captured image by imaging an imaging field of view to which the optical axis is directed at that time point, under control exerted by the imaging control unit 220 which will be described later. More specifically, while the imaging control unit 220 causes the imaging direction of the imaging unit 210 to make one rotation around the rotation axis (causes the imaging field of view to make one revolution), the imaging unit 210 sequentially images at least M (M>1) imaging fields of view partially overlapping one another to generate M captured images. The entire field of view of 360° of the omnidirectional frame is integrally covered by the M imaging fields of view from the first to the M-th. The imaging unit 210 sequentially outputs image signals representing these captured images to the omnidirectional frame generation unit 230 .
- the imaging control unit 220 controls the rotation of an enclosure of the imaging device 200 and an imaging operation in the imaging unit 210 .
- the imaging control unit 220 starts a rotation around the rotation axis of the enclosure of the imaging device 200 in accordance with an instruction for the start of imaging detected via some user interface (not shown) or communication interface (not shown), for example.
- the imaging control unit 220 causes the imaging unit 210 to perform imaging at respective timings when the imaging unit 210 is directed to M imaging fields of view (or more imaging fields of view) from the first imaging field of view to the M-th imaging field of view with the reference direction serving as the starting point.
- the imaging control unit 220 causes periodic imaging by the imaging unit 210 to terminate and stops the rotation of the enclosure of the imaging device 200 in accordance with an instruction for termination of imaging detected via some user interface or communication interface, for example.
- the omnidirectional frame generation unit 230 integrally processes captured images input from the imaging unit 210 to generate an omnidirectional frame.
- M captured images sequentially input from the imaging unit 210 respectively reflect imaging fields of view different from one another, and cover the entire field of view of 360° as a whole.
- the omnidirectional frame generation unit 230 may overlap overlapping portions of two captured images that reflect neighboring fields of view to couple those two captured images.
- the omnidirectional frame generation unit 230 may trim one of the overlapping portions of two captured images that reflect neighboring fields of view, and then may couple the two captured images.
- stitching in the present specification shall include both of these two techniques.
- FIG. 21 is an explanatory diagram for describing an example of omnidirectional frames generated by the omnidirectional frame generation unit 230 .
- the vertical axis of the chart shown in the upper half of FIG. 21 represents phases of a rotation of the camera module, with the phase φ 0 of the reference direction serving as a reference.
- the horizontal axis of the chart represents the time.
- the number M of imaging fields of view imaged during one rotation is equal to 6.
- An image Im(i,j) is the j-th captured image generated during the period of the i-th rotation (1≤i, 1≤j≤M).
- An image Ia(i) is an omnidirectional frame generated from M captured images generated during the period of the i-th rotation.
- the omnidirectional frame generation unit 230 stitches captured images Im( 1 , 1 ) to Im( 1 ,M) generated during the period of the first rotation to generate a first omnidirectional frame Ia( 1 ).
- the omnidirectional frame generation unit 230 stitches captured images Im( 2 , 1 ) to Im( 2 ,M) generated during the period of the second rotation to generate a second omnidirectional frame Ia( 2 ).
- the omnidirectional frame generation unit 230 repeats generation of omnidirectional frames in this manner at a predetermined omnidirectional frame rate while imaging by the imaging unit 210 is being continued, and causes the frame memory 235 to store generated omnidirectional frames.
- the omnidirectional frame generation unit 230 may forward the generated omnidirectional frames to a server (for example, the server 240 shown in FIG. 19 ) that assists in imaging or reproduction of omnidirectional video.
- the omnidirectional frame generation unit 230 (or the server 240 ) may distribute the omnidirectional frames to one or more display terminals in real time.
- the omnidirectional frame generation unit 230 may subject omnidirectional video to compression encoding in any video compression system before storing, forwarding, or distributing the omnidirectional frames.
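- the per-rotation grouping and stitching described above can be summarized in a short Python generator; `stitch` is a placeholder for the coupling/trimming operation and, like the other names, is an illustrative assumption.
```python
def generate_omnidirectional_frames(captured_images, m: int, stitch):
    """Group the incoming stream of captured images Im(i, 1)..Im(i, M)
    rotation by rotation and stitch each group into one omnidirectional
    frame Ia(i), as in FIG. 21."""
    rotation = []
    for image in captured_images:
        rotation.append(image)
        if len(rotation) == m:          # one full rotation accumulated
            yield stitch(rotation)      # omnidirectional frame Ia(i)
            rotation = []
```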
- FIG. 22 shows an example of a configuration of a display terminal 250 according to the first example.
- the display terminal 250 includes an image acquisition unit 260 , a frame buffer 265 , a field-of-view determination unit 275 , a reproduction control unit 280 , and a display unit 290 .
- the image acquisition unit 260 acquires omnidirectional video including a series of omnidirectional frames generated by the imaging device 200 from the imaging device 200 or the server 240 .
- the image acquisition unit 260 may decode video information acquired from the imaging device 200 or the server 240 to restore the series of omnidirectional frames.
- the image acquisition unit 260 causes the frame buffer 265 to store the respective omnidirectional frames included in the omnidirectional video.
- the respective omnidirectional frames are images generated by stitching M captured images corresponding to the above-described M imaging fields of view.
- the frame buffer 265 temporarily stores the omnidirectional frames acquired by the image acquisition unit 260 .
- the frame buffer 265 buffers not only the newest omnidirectional frame, but also at least an omnidirectional frame which is the previous frame.
- the field-of-view determination unit 275 determines a reproduction field of view required by the user. For example, the field-of-view determination unit 275 may set an initial reproduction field of view at the start of reproduction on the basis of the reference direction defined in advance, and may revolve the reproduction field of view in the forward direction or the reverse direction in accordance with a change in attitude of the display terminal 250 after the start of reproduction. The field-of-view determination unit 275 may measure a change in attitude of the display terminal 250 using an acceleration sensor or a gyro sensor. In addition, the field-of-view determination unit 275 may determine the reproduction field of view in accordance with the orientation of the display terminal 250 measured by a geomagnetic sensor.
- the field-of-view determination unit 275 may determine the reproduction field of view in accordance with a user input (such as a tap or a drag, for example) detected via a user interface (not shown) or a voice command detected via a voice recognition module (not shown).
- the reproduction control unit 280 causes the image acquisition unit 260 to start acquisition of omnidirectional frames in accordance with a trigger for reproduction start.
- the acquired omnidirectional frames are buffered by the frame buffer 265 .
- the reproduction control unit 280 causes a reproduction image corresponding to the reproduction field of view determined by the field-of-view determination unit 275 to be displayed on the screen of the display unit 290 .
- the reproduction control unit 280 generates a reproduction image by cutting out a reproduction image in a portion corresponding to the reproduction field of view in one omnidirectional frame from the omnidirectional frame.
- the reproduction control unit 280 cuts out, from the omnidirectional frame, a reproduction image in a portion corresponding to the reproduction field of view in the next omnidirectional frame to generate the next reproduction image.
- as these reproduction images are sequentially displayed on the display unit 290 , which will be described later, video is presented to the user.
- in the case where reproduction in a reproduction field of view that straddles the reference direction is requested, the reproduction control unit 280 generates a reproduction image on the basis of a first partial image corresponding to the first imaging field of view in the omnidirectional frame at that time point and a second partial image corresponding to the M-th imaging field of view in a past omnidirectional frame captured earlier than (typically, immediately before) that omnidirectional frame.
- FIGS. 23A, 23B, and 23C show a manner in which the user moves a reproduction field of view in a certain scenario.
- in FIG. 23A , a reproduction direction D 21 is equal to φ 0 +120°.
- the reproduction field of view Fr 41 covers an orientation from φ 0 +120° to φ 0 +240°.
- the reproduction direction herein shall correspond to the starting point of the reproduction field of view, rather than the center of the reproduction field of view (for example, in the case where the imaging field of view rotates to the right, the reproduction direction corresponds to the left end of the reproduction field of view).
- in FIG. 23B , a reproduction direction D 22 is equal to φ 0 +30°.
- a reproduction field of view Fr 42 covers an orientation from φ 0 +30° to φ 0 +150°. That is, the reproduction field of view has moved 90° to the left with respect to the time point of FIG. 23A .
- in FIG. 23C , a reproduction direction D 23 is equal to φ 0 −60°.
- a reproduction field of view Fr 43 covers an orientation from φ 0 −60° to φ 0 +60°. That is, the reproduction field of view has further moved 90° to the left with respect to the time point of FIG. 23B .
- the reproduction fields of view Fr 41 and Fr 42 do not straddle the reference direction D 0
- the reproduction field of view Fr 43 straddles the reference direction D 0 at the time point of FIG. 23C .
- FIG. 24A and FIG. 24B are explanatory diagrams for describing generation of a reproduction image at the time points of FIG. 23A and FIG. 23B , respectively.
- in FIG. 24A , omnidirectional frames Ia(i−1) and Ia(i−2) buffered by the frame buffer 265 are shown.
- An omnidirectional frame Ia(i) that may be being written into the frame buffer 265 at this time point is not illustrated. Since the reproduction field of view Fr 41 that should be reproduced occupies the range from φ 0 +120° to φ 0 +240°, the reproduction control unit 280 may cut out a partial image of this range from the omnidirectional frame Ia(i−1) to generate a reproduction image.
- similarly, in FIG. 24B , since the reproduction field of view Fr 42 occupies the range from φ 0 +30° to φ 0 +150°, the reproduction control unit 280 may cut out a partial image of this range from the omnidirectional frame Ia(i−1) to generate a reproduction image.
- strictly speaking, with the lapse of time, the buffered omnidirectional frames should have advanced to later frames.
- however, the ordinal numbers of the frames herein remain i−1 and i−2 for ease of description.
- FIG. 24C shows a manner in which a failure occurs in a reproduction image in the case where the reproduction image is generated from a single omnidirectional frame at the time point of FIG. 23C .
- FIG. 24D shows a manner in which a reproduction image without a failure is generated in accordance with the present example at the same time point.
- the reproduction field of view Fr 43 straddles the reference direction, and occupies the range from φ 0 −60° (=φ 0 +300°) to φ 0 (=φ 0 +360°) and the range from φ 0 to φ 0 +60°.
- the reproduction control unit 280 generates a reproduction image Ir 43 by coupling the second partial image corresponding to the M-th imaging field of view that occupies the range from φ 0 +300° to φ 0 +360° in the omnidirectional frame Ia(i−2) and the first partial image corresponding to the first imaging field of view that occupies the range from φ 0 to φ 0 +60° in the omnidirectional frame Ia(i−1), as shown in FIG. 24D . Since these two partial images have been captured with an imaging timing difference which is significantly shorter than the imaging timing difference between the both ends of a single omnidirectional frame, the subject fits at the coupled portion of the two partial images, and a great failure does not occur in the reproduction image.
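- a minimal sketch of this coupling, assuming the buffered omnidirectional frames are equirectangular NumPy arrays whose columns map linearly to azimuth from the reference direction (an assumption for illustration only):
```python
import numpy as np

def couple_across_reference(frame_prev: np.ndarray, frame_curr: np.ndarray,
                            start_deg: float, end_deg: float) -> np.ndarray:
    """Generate a reproduction image such as Ir43 in FIG. 24D: the second
    partial image [start_deg, 360) is taken from the older frame Ia(i-2),
    the first partial image [0, end_deg) from the newer frame Ia(i-1)."""
    width = frame_prev.shape[1]
    col = lambda deg: int(round(deg / 360.0 * width))
    second = frame_prev[:, col(start_deg):width]   # M-th imaging field side
    first = frame_curr[:, 0:col(end_deg)]          # first imaging field side
    # The seam joins images captured almost consecutively, so the subject
    # fits at the coupled portion.
    return np.concatenate([second, first], axis=1)

# Example: reproduction field of view from 300 deg to 60 deg (Fr43).
# img = couple_across_reference(ia_i_minus_2, ia_i_minus_1, 300.0, 60.0)
```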
- the display unit 290 is typically a display device that may include a screen and a display driver.
- the display unit 290 causes a reproduction image corresponding to the reproduction field of view generated by the reproduction control unit 280 to be displayed on the screen.
- the screen may be equivalent to a microdisplay in the eyeglass wearable terminal 250 a shown in FIG. 19 , and a touch panel in the mobile terminal 250 b .
- the reproduction image may be displayed on a monitor of a fixed terminal such as a PC or a television device, for example.
- the reproduction image may be projected onto a screen by a projector.
- the display unit 290 displays reproduction images sequentially generated by the reproduction control unit 280 , it is possible for the user to view video of any partial reproduction field of view in the entire field of view of 360° of the omnidirectional video.
- FIG. 25 is a flowchart showing an example of a flow of imaging control processing that may be executed by the imaging control unit 220 according to the first example.
- the imaging control unit 220 awaits a trigger for starting imaging (step S 210 ). For example, when a user input that instructs the start of imaging is detected via a user interface, or a control command that instructs the start of imaging is detected via a communication interface, the imaging control processing proceeds to step S 215 .
- In step S 215 , the imaging control unit 220 causes the rotation of the enclosure of the imaging device 200 to be started. Thereafter, when imaging timing arrives (step S 220 ), the imaging control unit 220 instructs imaging by supplying the imaging start instruction signal to the imaging unit 210 (step S 225 ). The imaging control unit 220 repeats the imaging instruction M times while the enclosure makes one rotation, and when M captured images are accumulated (step S 230 ), causes the omnidirectional frame generation unit 230 to generate an omnidirectional frame from the M captured images (step S 235 ).
- the imaging control unit 220 causes the omnidirectional frame generation unit 230 to generate an omnidirectional frame in this manner at a predetermined omnidirectional frame rate. In the meantime, the enclosure of the imaging device 200 continues rotating.
- when the termination of imaging is instructed, the imaging control unit 220 stops the rotation of the enclosure (step S 245 ) and terminates the imaging control processing.
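- the imaging loop of FIG. 25 might be sketched as follows; `read_phase_deg`, `trigger_capture`, and `build_frame` are hypothetical stand-ins for the device interfaces.
```python
import time

def rotating_capture_loop(read_phase_deg, trigger_capture, build_frame,
                          m: int, stop_requested):
    """Issue M imaging instructions per rotation (steps S220 to S235).
    read_phase_deg is assumed to return an accumulated rotation phase in
    degrees that does not wrap at 360."""
    images = []
    next_phase = 0.0
    step = 360.0 / m
    while not stop_requested():
        if read_phase_deg() >= next_phase:    # imaging timing arrives (S220)
            images.append(trigger_capture())  # imaging instruction (S225)
            next_phase += step
            if len(images) == m:              # M captured images ready (S230)
                build_frame(images)           # omnidirectional frame (S235)
                images = []
        else:
            time.sleep(0.0005)                # poll the rotation phase
```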
- FIG. 26 is a flowchart showing an example of a flow of reproduction control processing that may be executed by the display terminal 250 according to the first example.
- the reproduction control unit 280 of the display terminal 250 causes the field-of-view determination unit 275 to determine the reproduction field of view on the basis of an instruction from the user or a result of measuring the attitude of the terminal (step S 260 ). Subsequent processing branches in accordance with whether or not the determined reproduction field of view straddles the reference direction (step S 265 ).
- in the case where the reproduction field of view does not straddle the reference direction, the reproduction control unit 280 generates a reproduction image corresponding to the reproduction field of view from a single buffered omnidirectional frame (step S 270 ). On the other hand, in the case where the reproduction field of view straddles the reference direction, the reproduction control unit 280 generates a reproduction image by coupling two partial images corresponding to the reproduction field of view respectively cut out from a certain omnidirectional frame and the omnidirectional frame captured immediately before it (step S 272 ).
- the reproduction control unit 280 causes the display unit 290 to display the generated reproduction image (step S 280 ). Generation and display of such a reproduction image is repeated until the termination of reproduction is instructed (step S 290 ).
- the camera system may be configured similarly to the camera system 5 described using FIG. 19 .
- the camera system according to the present example at least includes an imaging device 300 illustrated in FIG. 27 and a display terminal 350 illustrated in FIG. 29 .
- the camera system may further include a server that assists in imaging or reproduction of omnidirectional video.
- FIG. 27 shows an example of a configuration of the imaging device 300 according to the second example of the second embodiment.
- the imaging device 300 includes an imaging unit 310 , a direction detection unit 315 , an imaging control unit 320 , an information multiplexing unit 330 , and a frame memory 335 .
- the imaging unit 310 generates a captured image by imaging an imaging field of view to which the optical axis is directed at that time point, under control exerted by the imaging control unit 320 which will be described later. More specifically, while the imaging control unit 320 causes the imaging direction of the imaging unit 310 to make one rotation around the rotation axis (causes the imaging field of view to make one revolution), the imaging unit 310 sequentially images at least M (M>1) imaging fields of view partially overlapping one another to generate M captured images. The entire field of view of 360° of the omnidirectional frame is integrally covered by the M imaging fields of view from the first to the M-th. The imaging unit 310 sequentially outputs image signals representing these captured images to the information multiplexing unit 330 .
- the direction detection unit 315 detects the direction (imaging direction) to which the optical axis of the imaging unit 310 is directed.
- the direction detection unit 315 may detect the imaging direction using a sensor device such as a rotational angle sensor, an acceleration sensor, a gyro sensor, or a geomagnetic sensor, for example.
- the direction detection unit 315 outputs sensor data indicating the detected imaging direction to the imaging control unit 320 .
- the imaging control unit 320 controls rotation of the enclosure of the imaging device 300 and the imaging operation in the imaging unit 310 .
- the imaging control unit 320 causes rotation of the enclosure of the imaging device 300 around the rotation axis to be started in accordance with an instruction for the start of imaging.
- the imaging control unit 320 causes the imaging unit 310 to perform imaging at respective timings when the imaging unit 310 is directed to M (or more) imaging fields of view from the first imaging field of view to the M-th imaging field of view with the reference direction serving as the starting point during a period in which the enclosure is rotating.
- the imaging control unit 320 terminates periodic imaging performed by the imaging unit 310 , and stops rotation of the enclosure of the imaging device 300 .
- the imaging control unit 320 outputs imaging control information to the information multiplexing unit 330 .
- the imaging control information includes imaging direction information that specifies the imaging direction of each captured image.
- the imaging direction information may include one or more of the following parameters, for example:
- a relative angle (an angular difference from the immediately preceding imaging direction)
- the imaging control information may further include viewing angle information that specifies the viewing angles of individual imaging fields of view, frame rate information that specifies one or both of the omnidirectional frame rate and the individual frame rate, and the like.
- while imaging by the imaging unit 310 is being continued, the information multiplexing unit 330 causes the frame memory 335 to store a series of captured images, or transmits the captured images to the server or the display terminal.
- the information multiplexing unit 330 multiplexes the above-described imaging control information on the series of captured images.
- the information multiplexing unit 330 may insert imaging control information in a header region of video information including a series of captured images, for example.
- the information multiplexing unit 330 may transmit the imaging control information in a blanking period on a transmission path on which the series of captured images are transmitted.
- the information multiplexing unit 330 may include an encoding unit 332 that executes compression encoding in any video compression system before storing or transmitting the series of captured images.
- the encoding unit 332 encodes an encoding target frame (captured image) using, as a reference frame, every (M+1)-th frame counted with the encoding target frame serving as a reference, that is, a frame distant from the encoding target frame by M individual frames on the time axis.
- FIG. 28 is an explanatory diagram for describing a reference frame for inter-frame prediction in the second example.
- a k-th captured image Im(k) is the encoding target frame.
- a reference frame positioned immediately before the captured image Im(k) on the time axis is a captured image Im(k−1)
- a reference frame positioned immediately after the captured image Im(k) on the time axis is a captured image Im(k+1).
- the captured images Im(k−1) and Im(k+1) are images that reflect imaging fields of view different from the encoding target frame, and are not effective in most cases as the reference frame for inter-frame prediction.
- a captured image Im(k−M) and a captured image Im(k+M), distant from the encoding target frame by M individual frame times on the time axis, have the same rotational phase as the encoding target frame, that is, reflect the same imaging field of view. Consequently, by performing inter-frame prediction using every (M+1)-th reference frame with the encoding target frame serving as a reference, effective predictive encoding can be performed, and encoding efficiency can be increased.
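- the reference-frame selection rule can be expressed compactly; the function below is an illustrative sketch of the rule, not the encoder's actual interface.
```python
def reference_frame_indices(k: int, m: int, num_frames: int) -> list:
    """Reference candidates for encoding target frame k (FIG. 28): the
    frames k-M and k+M share the target's rotational phase, that is, the
    same imaging field of view, unlike the temporally adjacent frames
    k-1 and k+1."""
    return [i for i in (k - m, k + m) if 0 <= i < num_frames]
```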
- the information multiplexing unit 330 may not necessarily include the encoding unit 332 .
- a device (for example, the server 240 ) external to the imaging device 300 may encode the omnidirectional video.
- FIG. 29 shows an example of a configuration of the display terminal 350 according to the second example.
- the display terminal 350 includes an image acquisition unit 360 , a writing control unit 365 , a frame buffer 370 , the field-of-view determination unit 275 , a reproduction control unit 380 , and the display unit 290 .
- the image acquisition unit 360 acquires a series of captured images to be used for constructing omnidirectional video that are generated by the imaging device 300 , from the imaging device 300 or the server 240 .
- the image acquisition unit 360 may include a decoding unit 362 that decodes an encoded stream in the case where the captured images have been subjected to compression encoding to restore the series of captured images.
- the decoding unit 362 may decode each of M captured images encoded using inter-frame prediction, using every M+1-th reference frame with the captured image serving as a reference.
- the image acquisition unit 360 may not necessarily include the decoding unit 362 .
- the image acquisition unit 360 sequentially outputs the acquired captured images to the writing control unit 365 .
- the image acquisition unit 360 acquires imaging control information associated with (for example, multiplexed on) a series of captured images from the imaging device 300 or the server 240 .
- the imaging control information may include, for example, the above-described imaging direction information, viewing angle information, and frame rate information. Then, the image acquisition unit 360 outputs the acquired imaging control information to the reproduction control unit 380 .
- the writing control unit 365 controls writing of captured images into the frame buffer 370 .
- in addition to the M captured images corresponding to the M imaging fields of view imaged in a first period, the frame buffer 370 stores at least one captured image corresponding to at least one imaging field of view imaged in a second period subsequent to the first period.
- the writing control unit 365 writes each of captured images input from the image acquisition unit 360 into the frame buffer 370 as an individual frame.
- the frame buffer 370 may be a ring buffer that cyclically stores at least M+1 individual frames, corresponding to the M captured images and the at least one additional captured image.
- FIG. 30 is an explanatory diagram for describing an example of a required buffer size of the frame buffer 370 shown in FIG. 29 .
- a required buffer size BS of the frame buffer 370 is expressed by the following expression.
- R TOTAL represents the total angular width in the phase direction that the frame buffer 370 should hold.
- R TOTAL is equivalent to the sum of a total viewing angle 371 of 360°, a reproduction viewing angle 372 , a movement margin 373 , and a writing frame 374 .
- in the case where the reproduction direction (the starting point of the reproduction field of view) is located at the end of the buffered range,
- the reproduction field of view occupies from the beginning θ init +360° of the in-buffer phase to θ init +360°+R v . Consequently, also in the case where a movement of the reproduction field of view is not considered, a buffer size equivalent to the sum of at least the total viewing angle 371 and the reproduction viewing angle 372 is required.
- R c represents an angular width obtained by excluding one of the overlapping portions on both ends from the imaging viewing angle, and is typically equal to 360°/M.
- R c is referred to as an individual frame angle.
- the movement margin 373 may have a definition different from Expression (4) depending on a condition or restriction concerning a movement of the reproduction field of view.
- the writing frame 374 is incorporated as capacity for the writing control unit 365 to write a new frame into the frame buffer 370 .
- the angular width of the writing frame 374 is equal to the individual frame angle R c . Consequently, the total angular width R TOTAL in the phase direction that the frame buffer 370 should hold may be given by the following expression.
- the function Int( ) returns the largest integer smaller than its argument. Since writing into the buffer is performed using the individual frame angle R c as one unit, although 360°+R m is not necessarily a multiple of the individual frame angle R c , it is desirable that such a quantizing operation be applied.
- the writing control unit 365 writes image data of the t-th individual frame into the frame buffer 370 from the writing position φ write,t determined by Expression (6).
- φ init represents the writing position of the initial individual frame
- mod represents a modulo operation
- Expression (6) expresses that the writing position of an individual frame is cyclically determined using the total angular width R TOTAL as a divisor.
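- since Expressions (5) and (6) themselves are not reproduced here, the following Python sketch reconstructs their apparent intent from the surrounding description; the exact quantization may differ from the patent's, and all names are illustrative.
```python
import math

def total_angular_width(r_v: float, r_m: float, r_c: float) -> float:
    """Reconstruction of the idea of Expression (5): the 360-degree total
    viewing angle plus the movement margin R_m, quantized up to whole
    individual frame angles R_c, plus the reproduction viewing angle R_v
    and one writing frame."""
    quantized = (math.floor((360.0 + r_m) / r_c) + 1) * r_c
    return quantized + r_v + r_c

def write_position(t: int, phi_init: float, r_c: float,
                   r_total: float) -> float:
    """Writing position of the t-th individual frame in the spirit of
    Expression (6): advanced by one individual frame angle per frame and
    wrapped cyclically using R_TOTAL as the divisor."""
    return (phi_init + t * r_c) % r_total
```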
- the reproduction control unit 380 controls reproduction of omnidirectional video based on M captured images sequentially captured in a plurality of fields of view that revolve while partially overlapping one another. As described above, reproduction of omnidirectional video is performed by reproducing a reproduction image corresponding to a reproduction field of view which is a part of the entire field of view of 360°. The entire field of view of 360° of omnidirectional video is covered by the M imaging fields of view from the first to the M-th, with the reference direction serving as the starting point.
- when reproduction of omnidirectional video is requested, the reproduction control unit 380 causes the image acquisition unit 360 to start acquisition of a series of captured images. Each of the acquired captured images is buffered by the frame buffer 370 as an individual frame after the overlapping portions are trimmed, for example. Then, the reproduction control unit 380 generates a reproduction image corresponding to the reproduction field of view determined by the field-of-view determination unit 275 . In the present example, the reproduction image corresponding to the reproduction field of view may be generated by reading out image data of an appropriate buffer region from the frame buffer 370 .
- in the case where the reproduction field of view straddles the reference direction D 0 , the reproduction control unit 380 reads out image data of a range that straddles the in-buffer phase corresponding to the reference direction D 0 to generate a reproduction image corresponding to that reproduction field of view.
- an individual frame corresponding to the M-th imaging field of view is stored on the smaller phase side
- an individual frame corresponding to the first imaging field of view is stored on the larger phase side. Since these two individual frames are images captured consecutively (or with an imaging timing difference which is significantly shorter than the imaging timing difference between the both ends of a single omnidirectional frame), a great failure does not occur in the reproduction image generated in this manner.
- after reproduction start is instructed, the reproduction control unit 380 may wait until one or more reproduction start conditions, which will be described next, are met, and may start display of a reproduction image only after those reproduction start conditions are met.
- the initial (at the start of reproduction) reproduction direction (the starting point of the reproduction field of view) is denoted by θ v
- the imaging direction (the starting point of the imaging field of view) of the newest individual frame already buffered is denoted by θ c
- a first reproduction start condition is expressed by Expression (7), for example.
- the small-letter theta (θ) means that the variable indicates a direction having a range of more than or equal to zero and less than 360°.
- the reproduction control unit 380 can derive the imaging direction θ c and the individual frame angle R c of each individual frame from the imaging control information input from the image acquisition unit 360 .
- the reproduction direction θ v and the reproduction viewing angle R v are determined by the field-of-view determination unit 275 .
- a second reproduction start condition is expressed by Expression (8), for example.
- θ init represents the starting point of the imaging field of view of the individual frame buffered first.
- θ c ′ represents the imaging direction of the newest individual frame already buffered, and shall increase without returning to zero even when exceeding 360°, unlike θ c .
- the left side of Expression (8) is equivalent to a phase difference between the starting point (θ v ) of the initial reproduction field of view and the boundary (θ init ) of a phase having no image data in the reverse direction.
- the inside of the first bracket on the right side is obtained by adding a margin for one individual frame (R c ) to the terminal end (θ init +360°+R v ) of a reproduction field of view whose starting point is located 360° ahead of the boundary of a phase having no image data.
- the second reproduction start condition means whether, even if the user immediately moves the reproduction field of view from the initial reproduction field of view in the reverse direction, preparation for an individual frame after one omnidirectional frame of the reproduction field of view after the movement is in time.
- in the case where the second reproduction start condition is satisfied, even if the reproduction field of view is moved immediately after the start of reproduction, a reproduction image may be generated appropriately since the image data corresponding to the reproduction field of view after the movement has already been stored in the frame buffer 370 . Note that, in the case where it is assumed that the user does not rapidly move the reproduction field of view, the second reproduction start condition may be omitted.
- a readout position ⁇ read ( ⁇ v , ⁇ c , ⁇ write,t ) when reading out a reproduction image from the frame buffer 370 may be derived as in the following expression.
- FIG. 31 shows a manner in which a buffer position ( ⁇ write,t ⁇ ( ⁇ c +R c ) ⁇ 360°+ ⁇ v ) having a phase lead of 360° is calculated as the readout position ⁇ read of the reproduction image since the imaging field of view of the individual frame during writing overlaps the reproduction field of view.
- the reproduction control unit 280 reads out, from the frame buffer 370, image data of individual frames corresponding to one imaging field of view or to two or more consecutively imaged fields of view in accordance with the readout position calculated in this manner, and generates a reproduction image on the basis of the read-out image data. Then, the reproduction control unit 280 causes the generated reproduction image to be displayed on the screen of the display unit 290.
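- Since Expression (9) itself is not reproduced in this text, the following sketch implements only the FIG. 31 overlap case literally; the overlap test and the non-overlap fallback branch are assumptions:

```python
# A hedged sketch of readout-position calculation for the frame buffer.

def readout_position(theta_v, theta_c, theta_write_t, r_c, r_total):
    """Map reproduction direction theta_v to an in-buffer readout position
    while the frame at imaging direction theta_c is written at theta_write_t."""
    if theta_c <= theta_v < theta_c + r_c:  # writing overlaps the view (assumed test)
        # FIG. 31 case: use the copy with a phase lead of 360 degrees.
        pos = theta_write_t - (theta_c + r_c) - 360 + theta_v
    else:
        pos = theta_write_t - (theta_c + r_c) + theta_v  # assumed plain mapping
    return pos % r_total  # wrap into the ring buffer's total angular width
```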
- FIG. 32 is a flowchart showing an example of a flow of imaging control processing that may be executed by the imaging control unit 320 according to the second example.
- the imaging control unit 320 awaits a trigger for starting imaging (step S310). For example, when a user input instructing the start of imaging is detected via a user interface, or a control command instructing the start of imaging is received via a communication interface, the imaging control processing proceeds to step S315.
- in step S315, the imaging control unit 320 starts the rotation of the enclosure of the imaging device 300. Thereafter, when the imaging timing arrives (step S320), the imaging control unit 320 instructs imaging by supplying the imaging start instruction signal to the imaging unit 310 (step S325).
- the imaging control unit 320 issues the imaging instruction M times per rotation of the enclosure, for example. Each time a captured image is generated by the imaging unit 310, the imaging control unit 320 causes the information multiplexing unit 330 to store the captured image in the frame memory 335, or causes the captured image to be transmitted to a server or a display terminal (step S335). Imaging control information is multiplexed onto the series of stored or transmitted captured images.
- the imaging control unit 320 stops the rotation of the enclosure (step S345) and terminates the imaging control processing.
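- The flow of FIG. 32 might be outlined as in the following sketch; every device method here is a hypothetical placeholder, not an API of the imaging device 300:

```python
# A schematic sketch of the FIG. 32 flow (steps S310 to S345).
import time

def imaging_control_loop(device, m, frame_interval_s, stop_requested):
    # S310 (trigger detection) is assumed to be handled by the caller.
    device.start_rotation()                    # S315: start enclosure rotation
    while not stop_requested():
        for _ in range(m):                     # M imaging instructions per rotation
            time.sleep(frame_interval_s)       # S320: wait for imaging timing
            image = device.capture()           # S325: supply imaging start instruction
            device.store_or_transmit(image)    # S335: store/transmit; control info multiplexed
    device.stop_rotation()                     # S345: stop rotation, end processing
```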
- FIG. 33 is a flowchart showing an example of a flow of reproduction control processing that may be executed by the display terminal 350 according to the second example.
- the writing control unit 365 of the display terminal 350 starts writing captured images acquired by the image acquisition unit 360 (decoded as necessary) into the frame buffer 370 (step S350).
- the reproduction control unit 380 causes the field-of-view determination unit 275 to determine an initial reproduction field of view on the basis of an instruction from the user or a result of measuring the attitude of the terminal (step S352). Thereafter, the reproduction control unit 380 waits until one or more reproduction start conditions are met (step S354).
- the reproduction control unit 380 calculates the readout position in the frame buffer 370 corresponding to the desired reproduction field of view in accordance with Expression (9) (step S365). Then, the reproduction control unit 380 reads out, from the frame buffer 370, image data of one reproduction viewing angle using the calculated readout position as the starting point, and generates a reproduction image (step S370).
- the reproduction control unit 380 causes the generated reproduction image to be displayed on the display unit 290 (step S380). Generation and display of a reproduction image are repeated until the termination of reproduction is instructed (step S390). In each repetition, the reproduction field of view is also determined again and moves in accordance with the user's request (step S360).
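- The flow of FIG. 33 might be outlined as in the following sketch; all objects are hypothetical stand-ins for the units 275, 365, 370, 380 and the display unit 290:

```python
# A schematic sketch of the FIG. 33 flow (steps S350 to S390).

def reproduction_control_loop(writer, buf, fov_unit, display, stop_requested):
    writer.start_writing(buf)                          # S350: begin buffering
    fov = fov_unit.determine_initial()                 # S352: initial field of view
    while not buf.start_conditions_met():              # S354: wait for start conditions
        pass                                           # (busy-wait for brevity)
    while not stop_requested():                        # S390: until termination
        fov = fov_unit.update(fov)                     # S360: re-determine field of view
        pos = buf.readout_position(fov)                # S365: Expression (9)
        image = buf.read(pos, fov.viewing_angle)       # S370: one viewing angle of data
        display.show(image)                            # S380: display reproduction image
```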
- FIG. 34 to FIG. 36 show examples of transition of the buffer state of the frame buffer in three scenarios, respectively.
- the individual frame angle R_c is 60°, and the reproduction viewing angle R_v is 120°.
- the reproduction direction θ_v is θ_0 + 60° and does not move.
- the total angular width R_TOTAL of the frame buffer 370 is 600° in accordance with Expression (4) and Expression (5), which is equivalent to ten individual frames.
- the writing position Θ_init of the zero-th individual frame is equal to 0°.
- a region in which data is being written is hatched by dots.
- the symbol x is shown in a region having no data.
- a region in which data that can be read out exists is white.
- a readout region corresponding to the reproduction field of view is indicated by a thick frame.
- the black bars under the blocks representing in-buffer regions indicate the ranges accessible as the reproduction direction.
- a buffer region [0°, 360°] is accessible as the reproduction direction.
- writing into the frame buffer 370 circulates, and writing into the buffer region [0°, 60°] of the individual frame is started again.
- the readout region of the reproduction image advances 360° from the previous buffer region [60°, 120°] to be a buffer region [420°, 540°].
- the readout region of the reproduction image advances 360° from the previous buffer region [420°, 540°] and circulates to be a buffer region [180°, 300°].
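- This circulation can be checked numerically with Expression (6); the computation below uses only the scenario's stated parameters:

```python
# A worked check of the FIG. 34 scenario using Expression (6),
# write position = (theta_init + t * R_c) mod R_TOTAL,
# with R_c = 60, R_TOTAL = 600 (ten individual frames) and theta_init = 0.

R_C, R_TOTAL = 60, 600
print([(0 + t * R_C) % R_TOTAL for t in range(12)])
# [0, 60, 120, 180, 240, 300, 360, 420, 480, 540, 0, 60]
# After ten individual frames, writing circulates back to region [0, 60].
```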
- the individual frame angle R_c is 60°
- the reproduction viewing angle R_v is 120°.
- the reproduction direction θ_v is initially θ_0 + 240°, and after reproduction is started, moves 60° per individual frame time in the forward direction.
- the readout position moves 60° in the forward direction in accordance with the movement of the reproduction field of view, and a reproduction image is read out from a buffer region [300°, 420°].
- the individual frame angle R_c is 60°
- the reproduction viewing angle R_v is 120°.
- the reproduction direction θ_v is initially θ_0 + 60°, and after reproduction is started, moves 60° per individual frame time in the reverse direction.
- the readout position moves 60° in the reverse direction in accordance with the movement of the reproduction field of view, but is displaced by 360° because it would otherwise deviate from the accessible buffer region, and a reproduction image is read out from the buffer region [360°, 480°].
- the readout position moves 60° in the reverse direction in accordance with the movement of the reproduction field of view, and a reproduction image is read out from the buffer region [300°, 420°].
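- The 360° displacement in this reverse-movement case can be sketched as follows; the accessible-range bounds and the helper name are illustrative assumptions:

```python
# A small sketch of the FIG. 36 reverse-movement case: when the readout window
# would leave the accessible buffer region, it is displaced by 360 degrees so
# the older in-buffer copy is read instead.

def displaced_readout(pos, r_v, accessible_lo, accessible_hi):
    """Shift the window [pos, pos + r_v] forward one rotation if it falls
    outside the accessible range [accessible_lo, accessible_hi]."""
    if pos < accessible_lo or pos + r_v > accessible_hi:
        pos += 360
    return pos

# A 60-degree reverse move from a window starting at 60 lands at 0; with the
# accessible range currently starting at 360, the window becomes [360, 480].
print(displaced_readout(0, 120, 360, 600))  # 360
```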
- the second embodiment of the technology according to the present disclosure has been described in detail so far using FIGS. 17, 18, 19, 20, 21, 22, 23A, 23B, 23C, 24A, 24B, 24C, 24D, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, and 36.
- a reproduction image is displayed based on the first captured image corresponding to the first field of view and the second captured image corresponding to the M-th field of view captured earlier than the first captured image. Since the imaging timing difference between partial images included in the reproduction image is thus significantly smaller than the imaging timing difference between the two ends of a single omnidirectional frame, a significant failure can be prevented from occurring in the integrally generated reproduction image.
- reproduction of omnidirectional video is performed by reproducing a reproduction image corresponding to a reproduction field of view which is a part of the entire field of view of 360°.
- even when the user moves the reproduction field of view across the reference direction equivalent to the starting point of the omnidirectional frame, a failure in the reproduction image is prevented, and thus the user's sense of immersion in the omnidirectional video is not impaired.
- At least one captured image corresponding to at least one imaging field of view captured during another omnidirectional frame period subsequent to that omnidirectional frame period is buffered for reproduction of omnidirectional video. Therefore, in whichever direction in 360° the user directs the reproduction field of view, a natural reproduction image can be generated and displayed.
- a reproduction image without a failure can be generated promptly with a simple mechanism of cyclic writing of frames into the ring buffer and readout of a reproduction image from an appropriate readout region corresponding to the reproduction direction.
- video that is reproduced with a small delay can also be provided for the user.
- display of a reproduction image is started on the condition that, in a case where the reproduction field of view is moved immediately after the start of reproduction, the reproduction image corresponding to the moved reproduction field of view can still be generated. It is therefore possible to avoid an interruption of reproduction due to the absence of necessary image data when the user moves the reproduction field of view after reproduction has started.
- a reproduction image is integrally generated on the basis of the first partial image corresponding to the first imaging field of view in the first omnidirectional frame and the second partial image corresponding to the M-th imaging field of view in the second omnidirectional frame captured earlier. Also in this case, it is possible to prevent a failure in the reproduction image and to avoid impairing the user's sense of immersion in the omnidirectional video.
- captured images generated by sequentially imaging a plurality of imaging fields of view while rotating the imaging direction around the rotation axis are used for reproduction of omnidirectional video. Therefore, there is no need to utilize an omnidirectional camera including a plurality of camera modules, and a camera system that provides omnidirectional video can be constructed at lower cost.
- it is sufficient to add a rotary mechanism to a general digital video camera.
- the control processing of each of the apparatuses described in the present specification may be realized using software, hardware, or a combination of software and hardware.
- Programs configuring the software are stored in advance in, for example, storage media (non-transitory media) provided inside or outside each of the apparatuses. Each program is then read into, for example, a random access memory (RAM) at the time of execution and executed by a processor such as a central processing unit (CPU).
- an aspect of the present technology may also be configured as below.
- An image processing device including:
- a reproduction control unit configured to control reproduction of omnidirectional video based on a plurality of captured images sequentially captured in a plurality of fields of view that revolve while partially overlapping one another, in which
- the omnidirectional video is covered by M fields of view from a first field of view to an M-th (M>1) field of view with a reference direction serving as a starting point, and
- the reproduction control unit causes a display unit to display a reproduction image based on a first captured image corresponding to the first field of view and a second captured image corresponding to the M-th field of view captured earlier than the first captured image.
- the image processing device in which the reproduction control unit causes the omnidirectional video to be reproduced by reproducing the reproduction image corresponding to the reproduction field of view which is a part of an entire field of view of 360°.
- the image processing device further including:
- a buffer configured to store M captured images corresponding to the M fields of view captured in a first period and at least one captured image corresponding to at least one field of view captured in a second period subsequent to the first period.
- the buffer is a ring buffer configured to cyclically store the M captured images and the at least one captured image.
- the reproduction control unit causes the display unit to display the reproduction image read out from the buffer, the reproduction image being based on an image corresponding to one field of view or two or more fields of view imaged consecutively.
- the reproduction control unit starts display of the reproduction image.
- the image processing device further including: a decoding unit configured to decode each of a series of captured images encoded using inter-frame prediction, with every (M+1)-th captured image serving as a reference frame.
- the image processing device further including:
- an image acquisition unit configured to acquire an omnidirectional image generated by stitching M captured images corresponding to the M fields of view, in which
- the reproduction control unit causes the display unit to display the reproduction image on a basis of a first partial image corresponding to the first field of view in a first omnidirectional image and a second partial image corresponding to the M-th field of view in a second omnidirectional image captured earlier than the first omnidirectional image.
- the image processing device according to any one of (1) to (8), further including:
- a determination unit configured to determine the reproduction field of view requested, in which
- the reproduction control unit causes the display unit to display the reproduction image corresponding to the reproduction field of view determined by the determination unit.
- the image processing device according to any one of (1) to (9), in which the plurality of captured images are images generated by sequentially imaging the M fields of view with an imaging device while rotating an imaging direction around a rotation axis.
- a display device including
- the omnidirectional video is covered by M fields of view from a first field of view to an M-th (M>1) field of view with a reference direction serving as a starting point,
- the reproduction control method including:
- An image processing system including:
- an imaging device configured to generate a plurality of captured images by sequentially performing imaging in a plurality of fields of view that revolve while partially overlapping one another;
- an image processing device that includes a reproduction control unit configured to control reproduction of omnidirectional video based on the plurality of captured images, in which
- the omnidirectional video is covered by M fields of view from a first field of view to an M-th (M>1) field of view with a reference direction serving as a starting point, and
- the reproduction control unit causes a display unit to display a reproduction image based on a first captured image corresponding to the first field of view and a second captured image corresponding to the M-th field of view captured earlier than the first captured image.
- An imaging control device including:
- a control unit configured to control imaging timing of at least one of a first image signal generated by imaging a first field of view in a first imaging unit and a second image signal generated by imaging a second field of view that partially overlaps the first field of view in a second imaging unit, such that a phase of the first image signal and a phase of the second image signal that correspond to an overlapping portion of the first field of view and the second field of view agree.
- the first image signal indicates a pixel value read out in a sequential readout system from an image sensor of the first imaging unit
- the second image signal indicates a pixel value read out in the sequential readout system from an image sensor of the second imaging unit.
- the imaging control device in which the first field of view precedes the second field of view in the pixel readout direction, and
- the control unit delays readout of the pixel value of the pixel at the beginning in the second imaging unit by the time taken for the readout pixel position in the first imaging unit to reach the overlapping portion from the pixel at the beginning.
- the entire field of view of 360° is covered by fields of view of a plurality of imaging units including the first imaging unit and the second imaging unit, and
- a delay time of readout of the pixel value in the second imaging unit with respect to the first imaging unit is equal to a quotient obtained by dividing a required frame time by the number of the plurality of imaging units.
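- Under this rule, the per-unit delays might be computed as in the following sketch; the 30 fps frame time and the four-unit count are assumed values, not taken from the disclosure:

```python
# A sketch of the stated timing rule: each imaging unit's readout is delayed,
# relative to the first, by the required frame time divided by the number of
# imaging units.

def readout_delays(frame_time_s, n_cam):
    """Per-unit readout start delays relative to the first imaging unit."""
    step = frame_time_s / n_cam
    return [i * step for i in range(n_cam)]

print(readout_delays(1 / 30, 4))
# [0.0, 0.00833..., 0.01666..., 0.025] -- successive units each lag ~8.3 ms.
```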
- the imaging control device in which the control unit controls the pixel readout speed in the first imaging unit such that pixel values in the range from the beginning of a frame until the overlapping portion is reached are read out during the delay time of readout of the pixel value in the second imaging unit with respect to the first imaging unit.
- imaging timing of the first imaging unit is the earliest among the plurality of imaging units
- imaging timing of a third imaging unit among the plurality of imaging units is the latest among the plurality of imaging units
- an image of an i-th (i is an integer) frame in the first field of view imaged by the first imaging unit is integrally processed with an image of an (i−1)-th frame in a third field of view imaged by the third imaging unit.
- the first imaging unit and the second imaging unit read out pixel values per vertical line.
- the first imaging unit and the second imaging unit read out pixel values per horizontal line.
- the first imaging unit and the second imaging unit read out pixel values per vertical line.
- the first imaging unit and the second imaging unit read out pixel values per horizontal line.
- the imaging control device according to any of (1) to (10), further including:
- an image processing unit that stitches an image represented by the first image signal and an image represented by the second image signal.
- the imaging control device according to any of (1) to (11), further including:
- an image processing unit that executes stereo matching using an image represented by the first image signal and an image represented by the second image signal.
- the imaging control device according to any of (3) to (6), further including:
- a memory that stores control information that defines in advance the delay time of imaging timing of the second image signal with respect to the first image signal, in which
- the control unit delays readout of the pixel value of the pixel at the beginning in the second imaging unit by the delay time defined by the control information.
- the control unit dynamically determines the delay time of imaging timing of the second image signal with respect to the first image signal depending on the setting of a frame time or a frame rate, and delays readout of the pixel value of the pixel at the beginning in the second imaging unit by the dynamically determined delay time.
- An imaging device including:
- An imaging control method of controlling an imaging device including a first imaging unit that images a first field of view to generate a first image signal and a second imaging unit that images a second field of view that partially overlaps the first field of view to generate a second image signal, including:
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Studio Devices (AREA)
- Stereoscopic And Panoramic Photography (AREA)
Abstract
Description
[Math. 1]
T_FRAME = (1 − r_OL) · T_FP · N_CAM (1)
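As a quick numeric illustration of Expression (1), the values below are assumptions chosen for the example, not taken from the disclosure:

```python
# A numeric check of Expression (1), T_FRAME = (1 - r_OL) * T_FP * N_CAM,
# assuming a 10% overlap ratio, a 33.3 ms per-module frame period, and four
# camera modules.

r_OL, T_FP, N_CAM = 0.10, 1 / 30, 4
T_FRAME = (1 - r_OL) * T_FP * N_CAM
print(f"T_FRAME = {T_FRAME * 1000:.1f} ms")  # -> T_FRAME = 120.0 ms
```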
(3) Information Multiplexing Unit
[Math. 4]
R_m = max(R_v, R_c) (4)
[Math. 6]
Θ_write,t = (Θ_init + t × R_c) mod R_TOTAL (6)
[Math. 7]
θ_v + R_v + R_c ≤ θ_c + R_c (7)
[Math. 8]
θ_v − θ_init ≥ (θ_init + 360° + R_v + R_c) − (θ_c′ + R_c) (8)
- generating, by an image processing device, a reproduction image based on a first captured image corresponding to the first field of view and a second captured image corresponding to the M-th field of view captured earlier than the first captured image, in a case where reproduction in a reproduction field of view that straddles the reference direction is requested; and
- causing a display unit to display the generated reproduction image.
- 1 camera system
- 10 a, 10 b omnidirectional camera
- 100 imaging device
- 112 a-112 n imaging unit (camera module)
- 120 imaging control unit (imaging control device)
- 130 image processing unit
- 170 display terminal
- 180 reproduction control unit
- 190 display unit
- τ1, τ2, τ3, τ4 delay time
- 5 camera system
- 50 rotating camera
- 200, 300 imaging device
- 210, 310 imaging unit
- 220, 320 imaging control unit
- 230 omnidirectional frame generation unit
- 330 information multiplexing unit
- 250, 350 display terminal
- 260, 360 image acquisition unit
- 365 writing control unit
- 265, 370 frame buffer
- 275 field-of-view determination unit
- 280, 380 reproduction control unit
- 290 display unit
Claims (13)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2016-037836 | 2016-02-29 | ||
| JP2016037836 | 2016-02-29 | ||
| PCT/JP2017/007230 WO2017150394A1 (en) | 2016-02-29 | 2017-02-24 | Image processing device, display device, playback control method, and image processing system |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20190028643A1 US20190028643A1 (en) | 2019-01-24 |
| US10582127B2 true US10582127B2 (en) | 2020-03-03 |
Family
ID=59743681
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/077,218 Active US10582127B2 (en) | 2016-02-29 | 2017-02-24 | Image processing device, display device, reproduction control method, and image processing system |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US10582127B2 (en) |
| JP (1) | JP6724977B2 (en) |
| WO (2) | WO2017149875A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12219207B2 (en) | 2021-02-26 | 2025-02-04 | Arashi Vision Inc. | Video splicing method and apparatus, and computer device and storage medium |
Families Citing this family (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10339595B2 (en) | 2016-05-09 | 2019-07-02 | Grabango Co. | System and method for computer vision driven applications within an environment |
| WO2018013439A1 (en) | 2016-07-09 | 2018-01-18 | Grabango Co. | Remote state following devices |
| WO2018148613A1 (en) | 2017-02-10 | 2018-08-16 | Grabango Co. | A dynamic customer checkout experience within an automated shopping environment |
| US10778906B2 (en) * | 2017-05-10 | 2020-09-15 | Grabango Co. | Series-configured camera array for efficient deployment |
| IL271528B2 (en) | 2017-06-21 | 2024-08-01 | Grabango Co | Linking observed human activity on video to a user account |
| US20190079591A1 (en) | 2017-09-14 | 2019-03-14 | Grabango Co. | System and method for human gesture processing from video input |
| KR101963449B1 (en) * | 2017-09-20 | 2019-04-01 | 주식회사 쓰리아이 | System and method for generating 360 degree video |
| US10963704B2 (en) | 2017-10-16 | 2021-03-30 | Grabango Co. | Multiple-factor verification for vision-based systems |
| US11481805B2 (en) | 2018-01-03 | 2022-10-25 | Grabango Co. | Marketing and couponing in a retail environment using computer vision |
| US11128764B2 (en) * | 2018-05-17 | 2021-09-21 | Canon Kabushiki Kaisha | Imaging apparatus, control method, and non-transitory computer readable medium |
| JP7249755B2 (en) * | 2018-10-26 | 2023-03-31 | キヤノン株式会社 | Image processing system, its control method, and program |
| US11288648B2 (en) | 2018-10-29 | 2022-03-29 | Grabango Co. | Commerce automation for a fueling station |
| EP3691249A1 (en) | 2019-01-29 | 2020-08-05 | Koninklijke Philips N.V. | Image signal representing a scene |
| AU2020231365A1 (en) | 2019-03-01 | 2021-09-16 | Grabango Co. | Cashier interface for linking customers to virtual data |
| CN111464804A (en) * | 2020-04-08 | 2020-07-28 | 北京小米松果电子有限公司 | Omnidirectional parallax view synthesis method and device and storage medium |
| US20230262208A1 (en) * | 2020-04-09 | 2023-08-17 | Looking Glass Factory, Inc. | System and method for generating light field images |
| WO2024252913A1 (en) * | 2023-06-07 | 2024-12-12 | ソニーグループ株式会社 | Information processing device, information processing method, and program |
Citations (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0962861A (en) | 1995-08-21 | 1997-03-07 | Matsushita Electric Ind Co Ltd | Panoramic imager |
| JP2000002927A (en) | 1998-06-16 | 2000-01-07 | Olympus Optical Co Ltd | Automatic universal head, supporting structure thereof and photographing system |
| JP2004531113A (en) | 2001-02-09 | 2004-10-07 | リー,クジン | Omnidirectional three-dimensional image data acquisition apparatus by annotation, method and method for enlarging photosensitive area |
| JP2006039564A (en) | 2004-07-28 | 2006-02-09 | Microsoft Corp | Camera system and panoramic camera system |
| JP2008118602A (en) | 2006-11-03 | 2008-05-22 | Advanced Technology:Kk | Television broadcast video image constitution system |
| JP2011512735A (en) | 2008-02-08 | 2011-04-21 | グーグル インコーポレイテッド | Panorama camera with multiple image sensors using timed shutters |
| WO2013165006A1 (en) | 2012-05-01 | 2013-11-07 | セントラルエンジニアリング株式会社 | Stereo camera and stereo camera system |
| WO2013186806A1 (en) | 2012-06-11 | 2013-12-19 | 株式会社ソニー・コンピュータエンタテインメント | Image capturing device, and image capturing method |
| US20140098224A1 (en) * | 2012-05-17 | 2014-04-10 | Hong Kong Applied Science and Technology Research Institute Company Limited | Touch and motion detection using surface map, object shadow and a single camera |
| US20140142486A1 (en) * | 2008-11-09 | 2014-05-22 | 3D Systems, Inc. | Bikini brace |
| US8749547B2 (en) * | 2010-02-05 | 2014-06-10 | Sony Corporation | Three-dimensional stereoscopic image generation |
| JP2014115863A (en) | 2012-12-11 | 2014-06-26 | Sony Corp | Information processing apparatus, information processing method, and program |
| JP2014168147A (en) | 2013-02-28 | 2014-09-11 | Nikon Corp | Image processing program and digital camera |
| JP2015156523A (en) | 2012-06-06 | 2015-08-27 | ソニー株式会社 | Image processing device, image processing method, and program |
2016
- 2016-12-02 WO PCT/JP2016/085927 patent/WO2017149875A1/en not_active Ceased
2017
- 2017-02-24 US US16/077,218 patent/US10582127B2/en active Active
- 2017-02-24 JP JP2018503116A patent/JP6724977B2/en active Active
- 2017-02-24 WO PCT/JP2017/007230 patent/WO2017150394A1/en not_active Ceased
Patent Citations (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0962861A (en) | 1995-08-21 | 1997-03-07 | Matsushita Electric Ind Co Ltd | Panoramic imager |
| JP2000002927A (en) | 1998-06-16 | 2000-01-07 | Olympus Optical Co Ltd | Automatic universal head, supporting structure thereof and photographing system |
| JP2004531113A (en) | 2001-02-09 | 2004-10-07 | リー,クジン | Omnidirectional three-dimensional image data acquisition apparatus by annotation, method and method for enlarging photosensitive area |
| JP2006039564A (en) | 2004-07-28 | 2006-02-09 | Microsoft Corp | Camera system and panoramic camera system |
| JP2008118602A (en) | 2006-11-03 | 2008-05-22 | Advanced Technology:Kk | Television broadcast video image constitution system |
| JP2011512735A (en) | 2008-02-08 | 2011-04-21 | グーグル インコーポレイテッド | Panorama camera with multiple image sensors using timed shutters |
| US20140142486A1 (en) * | 2008-11-09 | 2014-05-22 | 3D Systems, Inc. | Bikini brace |
| US8749547B2 (en) * | 2010-02-05 | 2014-06-10 | Sony Corporation | Three-dimensional stereoscopic image generation |
| WO2013165006A1 (en) | 2012-05-01 | 2013-11-07 | セントラルエンジニアリング株式会社 | Stereo camera and stereo camera system |
| US20140098224A1 (en) * | 2012-05-17 | 2014-04-10 | Hong Kong Applied Science and Technology Research Institute Company Limited | Touch and motion detection using surface map, object shadow and a single camera |
| JP2015156523A (en) | 2012-06-06 | 2015-08-27 | ソニー株式会社 | Image processing device, image processing method, and program |
| WO2013186806A1 (en) | 2012-06-11 | 2013-12-19 | 株式会社ソニー・コンピュータエンタテインメント | Image capturing device, and image capturing method |
| JP2014115863A (en) | 2012-12-11 | 2014-06-26 | Sony Corp | Information processing apparatus, information processing method, and program |
| JP2014168147A (en) | 2013-02-28 | 2014-09-11 | Nikon Corp | Image processing program and digital camera |
Non-Patent Citations (1)
| Title |
|---|
| International Search Report and Written Opinion of PCT Application No. PCT/JP2017/007230, dated May 16, 2017, 7 pages. |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12219207B2 (en) | 2021-02-26 | 2025-02-04 | Arashi Vision Inc. | Video splicing method and apparatus, and computer device and storage medium |
Also Published As
| Publication number | Publication date |
|---|---|
| US20190028643A1 (en) | 2019-01-24 |
| WO2017149875A1 (en) | 2017-09-08 |
| JP6724977B2 (en) | 2020-07-15 |
| JPWO2017150394A1 (en) | 2018-12-20 |
| WO2017150394A1 (en) | 2017-09-08 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10582127B2 (en) | Image processing device, display device, reproduction control method, and image processing system | |
| US11765396B2 (en) | Apparatus and methods for video compression | |
| US20200336725A1 (en) | Systems, methods and apparatus for compressing video content | |
| US11064110B2 (en) | Warp processing for image capture | |
| EP3673646B1 (en) | Image stitching with electronic rolling shutter correction | |
| US10958820B2 (en) | Intelligent interface for interchangeable sensors | |
| US10652517B2 (en) | Virtual reality 360-degree video camera system for live streaming | |
| CN106576160B (en) | Imaging architecture for depth camera mode with mode switching | |
| TW201205308A (en) | Combining data from multiple image sensors | |
| US20170366814A1 (en) | Apparatus and methods for image encoding using spatially weighted encoding quality parameters | |
| US10743002B2 (en) | Sequential in-place blocking transposition for image signal processing | |
| US20120242780A1 (en) | Image processing apparatus and method, and program | |
| US20120169840A1 (en) | Image Processing Device and Method, and Program | |
| US20150304629A1 (en) | System and method for stereophotogrammetry | |
| US11363214B2 (en) | Local exposure compensation | |
| US20230216999A1 (en) | Systems and methods for image reprojection | |
| WO2017205597A1 (en) | Image signal processing-based encoding hints for motion estimation | |
| KR101806840B1 (en) | High Resolution 360 degree Video Generation System using Multiple Cameras | |
| US20180211413A1 (en) | Image signal processing using sub-three-dimensional look-up tables | |
| EP3190460A1 (en) | Image capturing device on a moving body | |
| EP3091742A1 (en) | Device and method for encoding a first image of a scene using a second image having a lower resolution and captured at the same instant | |
| US20180213139A1 (en) | Image processing apparatus and method | |
| US11636708B2 (en) | Face detection in spherical images |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: SONY CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ORYOJI, HIROSHI;REEL/FRAME:046614/0921 Effective date: 20180606 |
|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |