WO2017002360A1 - Full-spherical video imaging system and computer-readable recording medium - Google Patents

Full-spherical video imaging system and computer-readable recording medium

Info

Publication number
WO2017002360A1
Authority
WO
WIPO (PCT)
Prior art keywords
image data
tilt angle
image
acquisition
vertical direction
Prior art date
Application number
PCT/JP2016/003104
Other languages
English (en)
Inventor
Hideaki Yamamoto
Original Assignee
Ricoh Company, Ltd.
Priority date
Filing date
Publication date
Priority claimed from JP2016124881A external-priority patent/JP6677098B2/ja
Application filed by Ricoh Company, Ltd. filed Critical Ricoh Company, Ltd.
Priority to EP16817470.4A priority Critical patent/EP3318053A4/fr
Priority to CN201680037512.2A priority patent/CN107710728A/zh
Priority to US15/578,282 priority patent/US20180146136A1/en
Publication of WO2017002360A1 publication Critical patent/WO2017002360A1/fr

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/58Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors

Definitions

  • the present invention relates to a full-spherical video imaging system and a computer-readable recording medium.
  • a technique of cutting out a part of a distorted circular image where a hemisphere is recorded by wide-angle image capture using a fisheye lens and transforming the distorted image into a planar regular image by performing computer image processing is already known.
  • Patent Literature 1 discloses a method and a configuration of equipment for transforming each frame image of a video to record a full-spherical video where upward vertical direction is consistently correct even if the video is captured with a main body of a video capture apparatus including two fisheye lenses and an acceleration sensor tilted. This is achieved by recording two fisheye videos and performing, after capturing the videos, zenith correction simultaneously with transformation into an equirectangular projection by using data about tilt angles of the main body obtained from the acceleration sensor.
  • Patent Literature 2 discloses a technique of collecting video information full-spherically, or omni-directionally, from a dome-shaped field of view with a horizontal angle of view of 360 degrees and a vertical angle of view of 180 degrees, by using a specific electromagnetic wave as a data medium.
  • Patent Literature 3 discloses, as in Patent Literature 2, a technique of collecting video information full-spherically, or omni-directionally, from a dome-shaped field of view with a horizontal angle of view of 360 degrees and a vertical angle of view of 180 degrees, by using a specific electromagnetic wave as a data medium.
  • Patent Literature 1 discloses a technique for displaying, when image capture is performed using a hand-holdable full-spherical camera with its main body tilted, a still image to which tilt angle correction has been applied.
  • However, a novel problem has arisen in that fluctuation in the output value of the acceleration sensor caused by noise makes it difficult to obtain an angle relative to the vertical direction.
  • Patent Literature 2 and Patent Literature 3, which differ from the present invention in technical features, likewise cannot solve this novel problem that fluctuation in the output value of the acceleration sensor caused by noise makes it difficult to obtain an angle relative to the vertical direction.
  • the present invention is characterized by the following features.
  • a full-spherical video imaging system comprising: an acquisition unit configured to acquire tilt angles relative to a vertical direction from an image sensor with a second cycle time of "one over a whole number", the second cycle time being shorter than a first cycle time with which the acquisition unit acquires original image data from the image sensor; a calculator configured to calculate an average tilt angle from a tilt angle of the image sensor relative to the vertical direction acquired simultaneously with acquisition of the original image data and a tilt angle of the image sensor relative to the vertical direction acquired in a period straddling the acquisition of the original image data; and a generator configured to generate corrected image data by correcting the original image data on the basis of the average tilt angle.
  • According to the present invention, tilt angles of the main body can be obtained with high accuracy by setting the acceleration-sensor-data-acquisition cycle time shorter than the image-data-acquisition cycle time. Accordingly, highly accurate omnidirectional (full-spherical) video images can be generated.
  • Fig. 1 is a functional block diagram describing an imaging system of an embodiment of the present invention.
  • Fig. 2 is a hardware configuration diagram of an image capture apparatus of the embodiment.
  • Fig. 3 is a hardware configuration diagram of an information processing apparatus of the embodiment.
  • Fig. 4 is a diagram illustrating a tilt-angle-acquisition cycle time and an image-data-acquisition cycle time of the embodiment.
  • Fig. 5A is a diagram describing a projection relationship of a fisheye lens used in the image capture apparatus of the embodiment from an appearance of the fisheye lens in side view.
  • Fig. 5B is a diagram describing a projection relationship of a fisheye lens used in the image capture apparatus of the embodiment from a projection function f of a captured image in plan view.
  • Fig. 6A is a diagram describing a format of an omnidirectional (full-spherical) image captured by the image capture apparatus of the embodiment presented in a planar form.
  • Fig. 6B is a diagram describing a format of an omnidirectional (full-spherical) image captured by the image capture apparatus of the embodiment presented in a spherical form.
  • Fig. 7 is a schematic diagram describing a tilt of the image capture apparatus of the embodiment.
  • Fig. 8 is a diagram describing vertical correction calculation for an omnidirectional (full-spherical) image captured by the image capture apparatus of the embodiment using (A) a camera coordinate system and (B) a global coordinate system.
  • Fig. 9A is a conversion table for an omnidirectional (full-spherical) image captured by the image capture apparatus of the embodiment in terms of a matrix of coordinate values of a post-transformation image and a pre-transformation image.
  • Fig. 9B is a diagram describing a relationship between coordinate values of the post-transformation image and coordinate values of the pre-transformation image.
  • Fig. 10 is a flowchart for the image capture apparatus of the embodiment.
  • Fig. 11 is a flowchart for the information processing apparatus of the embodiment.
  • Fig. 12 is a set of images obtained by joining images captured with two fisheye lenses and having undergone tilt angle correction of the embodiment.
  • Fig. 13 is a set of images obtained by joining images captured with two fisheye lenses but having not undergone the tilt angle correction of the embodiment.
  • the accompanying drawings are intended to depict exemplary embodiments of the present invention and should not be interpreted to limit the scope thereof. Identical or similar reference numerals designate identical or similar components throughout the various drawings.
  • an omnidirectional (full-spherical) image capture apparatus detects a vertical direction and corrects a conversion table for use in image processing according to the vertical direction at generation of an omnidirectional (full-spherical) image.
  • The term "imaging system" is used to refer to a plurality of apparatuses, e.g., a digital camera and an information processing apparatus, that are used separately.
  • The term "image capture apparatus" conceptually includes the imaging system unless otherwise specified.
  • a full-spherical camera is used as the image capture apparatus, and an image forming apparatus is used as the information processing apparatus.
  • Functional blocks of the present embodiment are described below.
  • Fig. 1 illustrates functional blocks of an imaging system of the present embodiment.
  • the image capture apparatus used in the present embodiment includes an acquisition unit 11; the information processing apparatus includes a calculator 21 and a generator 31.
  • the acquisition unit 11 acquires tilt angles relative to the vertical direction from an acceleration sensor with a cycle time shorter than an image-data-acquisition cycle time.
  • the acceleration sensor is a three-axis acceleration sensor that detects acceleration of the image capture apparatus in three-axial directions orthogonal to each other, which are up and down, right and left, and near and far directions. For instance, when the image capture apparatus is hand-held still, the acceleration sensor detects only gravitational acceleration.
  • image data is compressed into video data.
  • information about the tilt angles is recorded in frame-by-frame video data. Because the tilt angles relative to the vertical direction are acquired with the cycle time shorter than the image-data-acquisition cycle time, not only information about tilt angles in sync with acquisition of image data but also information about tilt angles out of sync with the acquisition of the image data are recorded. Accordingly, information about a plurality of tilt angles is contained for a single image data piece. This will be described in detail later.
  • The calculator 21 calculates, on a per-frame basis from the plurality of tilt angles related to that frame, an average tilt angle relative to the vertical direction for each of the three axial directions.
  • the average value is not limited to an arithmetic mean value; a median or a mode value may alternatively be used.
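  • As an illustration of the averaging described above, the sketch below reduces the tilt samples associated with one frame to a single per-axis value. The sample layout and the names `TiltSample` and `frame_tilt` are assumptions made for this example and do not appear in the disclosure.

```python
import statistics
from typing import Sequence, Tuple

# One tilt sample per acceleration-sensor reading: (ax, ay, az) components
# in the camera coordinate system. Names and layout are illustrative only.
TiltSample = Tuple[float, float, float]

def frame_tilt(tilt_samples: Sequence[TiltSample],
               method: str = "mean") -> TiltSample:
    """Reduce the tilt samples associated with one frame to a single
    representative tilt per axis; mean, median, or mode may be used."""
    reducers = {
        "mean": statistics.fmean,
        "median": statistics.median,
        "mode": statistics.mode,
    }
    reduce = reducers[method]
    ax, ay, az = zip(*tilt_samples)          # regroup the samples per axis
    return (reduce(ax), reduce(ay), reduce(az))

# Example: seven samples straddling one frame (cf. Fig. 4).
samples = [(0.01, 0.02, 0.98), (0.02, 0.01, 0.99), (0.00, 0.02, 0.98),
           (0.01, 0.01, 0.99), (0.02, 0.00, 0.98), (0.01, 0.02, 0.99),
           (0.00, 0.01, 0.98)]
print(frame_tilt(samples, "median"))
```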
  • The generator 31 corrects the tilt relative to the vertical direction for each frame of the image data using the calculated average tilt angle and generates corrected image data on the basis of the correction.
  • tilt angle correction is applied to a predetermined conversion table, which will be described in detail later.
  • an image capture apparatus (hereinafter, sometimes also referred to as "full-spherical camera") 110 includes a first image sensor 130A, a second image sensor 130B, a DRAM (Dynamic Random Access Memory) 132, an external storage 134, an acceleration sensor 136, a USB connector 138, a wireless NIC (Network Interface Controller) 140, a CPU (Central Processing Unit) 112, a ROM (Read Only Memory) 114, an image processing block 116, and a video processing block 118.
  • The DRAM 132, the external storage 134, the acceleration sensor 136, the USB connector 138, and the wireless NIC 140 are connected via a bus 119 to the CPU 112, the ROM 114, the image processing block 116, and the video compression processing block 118 through a DRAM interface 120, an external storage interface 122, an external sensor interface 124, a USB interface 126, and a serial block 128, respectively, each of which generates the required interface signals.
  • Through these interfaces, the DRAM 132, the external storage 134, the acceleration sensor 136, the USB connector 138, and the wireless NIC 140 exchange data, such as image data and tilt angles, with the CPU 112, the ROM 114, the image processing block 116, and the video compression processing block 118.
  • a configuration for obtaining omnidirectional images using two image sensors is employed.
  • the number of the image sensors may alternatively be three or more.
  • the lens used in the image sensor does not necessarily have an angle of view of 180 degrees or wider. The angle of view can be adjusted as appropriate.
  • a wide-angle lens, such as a fisheye lens, is typically used as the lens.
  • the image capture apparatus is not limited to an omnidirectional one. An image capture apparatus capable of capturing an image with a horizontal angle of view of 360 degrees may alternatively be used.
  • the acceleration sensor 136 is used to detect a tilt of the full-spherical camera 110 at image capture. Hence, a tilt direction of the full-spherical camera 110 can be detected instantly and easily.
  • When the full-spherical camera 110 is held horizontally, in the manner in which a digital camera is typically handled, the up and down directions of the full-spherical camera 110 coincide with the vertical direction with respect to the ground surface.
  • digitized image data is fed to the image processing block 116 by the first image sensor 130A and the second image sensor 130B.
  • the thus-fed image data undergoes image processing performed by using the image processing block 116, the CPU 112, the ROM 114, and the like and, after undergoing video compression, is stored in the DRAM 132.
  • Information about tilt angles acquired by the acceleration sensor 136 is also stored in the DRAM 132 such that the information about the tilt angles is recorded in a stored video file in accordance with predetermined program instructions.
  • the video file is eventually stored in the external storage 134.
  • Examples of the external storage include a CompactFlash (registered trademark) memory and an SD (Secure Digital) memory.
  • FIG. 3 is a hardware configuration diagram of the information processing apparatus of the present embodiment.
  • an image forming apparatus is used as an example of the information processing apparatus.
  • the information processing apparatus is not limited to such an image forming apparatus, and may be of any form so long as it provides the functions required of the imaging system of the present invention.
  • an information processing apparatus 150 includes an external storage 160, a display 162 made up of a liquid crystal or organic electroluminescent panel or the like, a USB connector 166, a wireless NIC 164, a CPU (Central Processing Unit) 152, a RAM (Random Access Memory) 154, an HDD 156, which is a non-volatile memory such as a flash memory, and an input device 158 such as a keyboard, mouse, and a touch panel.
  • the external storage 160, the display 162, the USB connector 166, the wireless NIC 164, the CPU 152, the RAM 154, the HDD 156, and the input device 158 are connected to each other via a BUS 168, and exchange different types of data with each other.
  • A video file is provided to the information processing apparatus 150 through communication with the full-spherical camera 110.
  • the provided video file is temporarily stored in the HDD 156.
  • the video file is read out from the HDD 156 and decoded into frame-by-frame image data by the CPU 152 in accordance with predetermined program instructions.
  • the video file is temporarily stored in the RAM 154.
  • Each frame of the image data has a plurality of tilt-angle information pieces for each of the axial directions. The tilt angle of a single frame in each direction is calculated as an average value of the plurality of data pieces.
  • The conversion table is corrected by predetermined correcting means depending on the tilt angles, i.e., the angle information contained in the image data.
  • a method for correcting the conversion table will be described later.
  • Frame-by-frame corrected image data is generated using the corrected conversion table, and the processing is iterated until all the frames have been transformed. The corrected image data is generated from the video file on memory in the RAM 154, rather than by temporarily converting the video file stored in the RAM 154 into a still-image file; performing the processing on memory without converting the video file into still images speeds up the operation.
  • When corrected image data has been generated for every frame, the corrected frames are joined and encoded into video data.
  • the encoded video data is stored in the HDD 156 and output to display a video on the display 162, which is a display device, or the like.
  • Fig. 4 is a diagram illustrating a tilt-angle-acquisition cycle time and an image-data-acquisition cycle time in image capture with the full-spherical camera.
  • Fig. 4 illustrates an example where the tilt-angle-acquisition cycle time is "1/4" the image-data-acquisition cycle time.
  • the tilt-angle-acquisition cycle time is not limited thereto, and may be any cycle time shorter than the image-data-acquisition cycle time, e.g., "1/5" or "1/10" the image-data-acquisition cycle time.
  • information about tilt angles that vary depending on tilt of the full-spherical camera is acquired for a video file that is generated during image capture and, furthermore, the information about the tilt angles is recorded in the video file.
  • The cycle time for acquisition of tilt angles from the acceleration sensor is "1/4" the image-data-acquisition cycle time. Accordingly, while a single image data piece is acquired, the acceleration sensor acquires four tilt angles. The cycle time with which the acceleration sensor acquires tilt angles is required to be constant; if the acquisition cycle time is not constant, the reliability of the calculated tilt angles relative to the vertical direction decreases. Referring to Fig. 4, information about seven tilt angles is recorded for a single image data piece, because the tilt angle acquired in the same cycle as the image data and the tilt angles acquired in the three preceding and three following cycles are all related to the image data as tilt-angle information.
  • A characteristic of the tilt-angle information related to the image data is that it additionally contains information about tilt angles that are out of sync with the image-data-acquisition cycle time but lie within the cycles straddling the acquisition of the image data.
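  • A minimal sketch of this association, assuming the 1/4 ratio and the three straddling cycles of Fig. 4; the function name `samples_for_frame` and the flat sample log are hypothetical and only illustrate how sensor samples could be related to one frame.

```python
from typing import List

def samples_for_frame(frame_index: int,
                      tilt_log: List[float],
                      ratio: int = 4,
                      straddle: int = 3) -> List[float]:
    """Pick the tilt samples related to one frame, assuming the sensor is
    read `ratio` times per image-data cycle. The sample taken in the same
    cycle as the frame plus `straddle` samples before and after it are
    kept, giving seven samples for ratio=4, straddle=3 as in Fig. 4."""
    center = frame_index * ratio               # sample aligned with the frame
    lo = max(0, center - straddle)
    hi = min(len(tilt_log), center + straddle + 1)
    return tilt_log[lo:hi]

# tilt_log holds one reading per sensor cycle; frame 2 of a 4x-rate log:
tilt_log = [float(i) for i in range(40)]
print(samples_for_frame(2, tilt_log))          # -> samples 5.0 .. 11.0
```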
  • FIGs. 5A and 5B are diagrams describing the projection relationship of the fisheye lens used in the digital camera of the embodiment from an appearance of the fisheye lens in side view (Fig. 5A) and a projection function f of a captured image in plan view (Fig. 5B).
  • An image captured through a fisheye lens having an angle of view wider than 180 degrees is an image of a scene covering substantially half of a sphere whose center is at the image capture position.
  • Examples of a method (function) for projective transformation include central projection, stereographic projection, equidistant projection, equi-solid angle projection, and orthographic projection.
  • the central projection is a method used in image capture with a digital camera having a general angle of view.
  • the other four methods are used in digital cameras using a wide-angle lens having an ultra-wide angle of view, such as a fisheye lens.
  • FIGs. 6A and 6B are diagrams describing the format of an omnidirectional (full-spherical) image captured by the digital camera of the present embodiment presented in a planar form (Fig. 6A) and a spherical form (Fig. 6B).
  • the format of the omnidirectional (full-spherical) image is such that, when developed onto a plane as illustrated in Fig. 6A, the image has pixel values at angular coordinates of horizontal angles from 0 to 360 degrees and vertical angles from 0 to 180 degrees.
  • the angular coordinates are associated with respective coordinate points on the spherical surface illustrated in Fig. 6B and substantially equivalent to latitude/longitude coordinates on a globe.
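  • A small sketch of that angular-coordinate layout, assuming a W×H equirectangular image whose columns span horizontal angles of 0 to 360 degrees and whose rows span vertical angles of 0 to 180 degrees; the pixel origin and the rounding are illustrative assumptions, not taken from the disclosure.

```python
def angles_to_pixel(theta_deg: float, phi_deg: float,
                    width: int, height: int) -> tuple:
    """Map a horizontal angle (0..360 deg) and a vertical angle (0..180 deg)
    to pixel coordinates of an equirectangular image."""
    x = int(theta_deg / 360.0 * (width - 1))   # column: horizontal angle
    y = int(phi_deg / 180.0 * (height - 1))    # row: vertical angle from zenith
    return x, y

# A 3584x1792 equirectangular frame: the zenith maps to the top row.
print(angles_to_pixel(180.0, 0.0, 3584, 1792))   # -> (1791, 0)
```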
  • Fig. 7 is a schematic diagram describing a tilt of the full-spherical camera of the present embodiment.
  • the vertical direction coincides with the z1-axis of the x1-y1-z1 three-dimensional Cartesian coordinate system, which is a global coordinate system.
  • When this direction coincides with the orientation of the digital camera illustrated in Figs. 9A and 9B, the camera is in a non-tilted state. When they do not coincide, the digital camera is in a tilted state.
  • An inclination angle α relative to the gravity vector and a slope angle β in the x1-y1 plane can be obtained from the output of the acceleration sensor using Equation (1), where:
  • Ax is a value of an x0-axis direction component in the camera coordinate system of the acceleration sensor
  • Ay is a value of a y0-axis direction component in the camera coordinate system of the acceleration sensor
  • Az is a value of a z0-axis direction component in the camera coordinate system of the acceleration sensor.
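  • Equation (1) itself is not reproduced in this text. The sketch below shows one commonly used formulation of an inclination angle and a slope angle from the three acceleration components Ax, Ay, Az; it is offered only as an assumption for illustration, not as the patent's Equation (1).

```python
import math

def tilt_from_acceleration(ax: float, ay: float, az: float):
    """Inclination angle alpha (relative to the gravity vector) and slope
    angle beta (in the x1-y1 plane) from the three accelerometer components.
    A standard formulation, NOT a reproduction of the patent's Equation (1)."""
    g = math.sqrt(ax * ax + ay * ay + az * az)   # magnitude ~ gravity when still
    alpha = math.acos(az / g) if g else 0.0      # angle between z0-axis and gravity
    beta = math.atan2(ay, ax)                    # direction of the tilt in the plane
    return math.degrees(alpha), math.degrees(beta)

print(tilt_from_acceleration(0.0, 0.0, 9.8))     # level camera -> (0.0, 0.0)
```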
  • Fig. 8 is a diagram describing vertical correction calculation for an omnidirectional (full-spherical) image captured by the digital camera of the present embodiment using (A) the camera coordinate system and (B) the global coordinate system.
  • In Fig. 8, three-dimensional Cartesian coordinates in the global coordinate system are denoted by (x1, y1, z1), and spherical coordinates are denoted by (θ1, φ1).
  • Three-dimensional Cartesian coordinates in the camera coordinate system are denoted by (x0, y0, z0), and spherical coordinates are denoted by (θ0, φ0).
  • Transformation from the spherical coordinates (θ1, φ1) to the spherical coordinates (θ0, φ0) is performed using the equations presented in Fig. 8.
  • Performing rotational transformation using three-dimensional Cartesian coordinates is required to correct a tilt.
  • Transformation from the spherical coordinates (θ1, φ1) to the three-dimensional Cartesian coordinates (x1, y1, z1) is therefore performed using the equations presented in Fig. 8.
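  • The equations themselves are presented in Fig. 8 and are not reproduced here. The sketch below only illustrates the general pattern (spherical to Cartesian, a rotational transformation, then back to spherical) using conventional formulas and an illustrative single-axis rotation, not the patent's exact equations.

```python
import math

def spherical_to_cartesian(theta: float, phi: float):
    """Unit vector from spherical coordinates (theta: azimuth, phi: polar
    angle from the zenith). The convention is illustrative only."""
    return (math.sin(phi) * math.cos(theta),
            math.sin(phi) * math.sin(theta),
            math.cos(phi))

def cartesian_to_spherical(x: float, y: float, z: float):
    phi = math.acos(max(-1.0, min(1.0, z)))
    theta = math.atan2(y, x)
    return theta, phi

def rotate_about_y(v, angle):
    """Rotate a 3-D vector about the y-axis; a stand-in for the rotational
    transformation that aligns the camera zenith with the vertical."""
    x, y, z = v
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, y, -s * x + c * z)

# Correct a direction for a camera inclined by alpha about the y-axis:
alpha = math.radians(10.0)
theta1, phi1 = math.radians(30.0), math.radians(60.0)
v_global = spherical_to_cartesian(theta1, phi1)
v_camera = rotate_about_y(v_global, alpha)       # global -> camera frame (sketch)
print(cartesian_to_spherical(*v_camera))
```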
  • Fig. 9A illustrates a conversion table for an omnidirectional (full-spherical) image captured by the digital camera of the present embodiment.
  • Fig. 9A is a diagram illustrating a matrix of coordinate values of a post-transformation image and a pre-transformation image to describe the conversion table (or conversion data).
  • Fig. 9B is a diagram describing a relationship between coordinate values of the post-transformation image and coordinate values of the pre-transformation image.
  • The conversion table for use in image transformation contains, for each coordinate of the post-transformation image, a data set of the coordinate values (θ, φ) (pix: pixel) of the post-transformation image and the corresponding coordinate values (x, y) (pix) of the pre-transformation image.
  • the presented conversion table has a tabular data structure, the data structure is not limited to a tabular form. In short, it is required only that the conversion table be conversion data.
  • a post-transformation image is generated from a captured image (pre-transformation image) in accordance with the conversion table illustrated in Fig. 9A.
  • Pixels of the post-transformation image are generated by referring to the pixel values at the coordinate values (x, y) (pix) of the pre-transformation image associated with the coordinate values (θ, φ) (pix), based on the correspondence between the pre-transformation image and the post-transformation image in the conversion table illustrated in Fig. 9A.
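  • A toy sketch of that lookup, assuming the conversion table is held as a mapping from post-transformation coordinates to pre-transformation coordinates; the data layout and the nearest-neighbour copying are assumptions made for brevity, not the patent's exact table format.

```python
from typing import Dict, List, Tuple

def apply_conversion_table(
        pre_image: List[List[int]],
        table: Dict[Tuple[int, int], Tuple[int, int]],
        width: int, height: int) -> List[List[int]]:
    """Build the post-transformation image by looking up, for each output
    coordinate (theta, phi) in pixels, the pre-transformation coordinate
    (x, y) stored in the conversion table, and copying that pixel value."""
    post = [[0] * width for _ in range(height)]
    for (t, p), (x, y) in table.items():
        post[p][t] = pre_image[y][x]
    return post

# Tiny worked example: a 2x2 table that flips the image horizontally.
pre = [[1, 2],
       [3, 4]]
table = {(0, 0): (1, 0), (1, 0): (0, 0),
         (0, 1): (1, 1), (1, 1): (0, 1)}
print(apply_conversion_table(pre, table, 2, 2))   # -> [[2, 1], [4, 3]]
```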
  • Fig. 10 is a flowchart of the present embodiment. The description below is made with reference to the flowchart of Fig. 10.
  • Image capture is performed by placing the image capture apparatus in a video capture mode.
  • acquisition of captured image data is performed in accordance with a predetermined cycle time (step S1).
  • When acquisition of the image data is completed, predetermined image processing is performed (step S2). The image processing performed at this stage denotes general image processing, such as distortion correction and defective pixel correction, and is not tilt angle correction. Thereafter, the acquisition unit 11 acquires tilt angles relative to the vertical direction from the acceleration sensor in accordance with a predetermined cycle time (steps S3 to S6). In the present embodiment, tilt-angle acquisition is performed with a cycle time that is "1/4" the image-data-acquisition cycle time, as described above.
  • When acquisition of the image data and the tilt angles is completed, the image data is compressed into video data (step S7).
  • the compressed video is recorded in a predetermined recording medium (step S8).
  • the above is processing for one frame. Thereafter, processing (from step S1 to step S8) is performed similarly on every frame until the image capture ends (until “YES” is determined at step S9).
  • Information about the tilt angles is recorded in an expansion area of the video data stored in the recording medium (step S10).
  • Information about a total of six tilt angles, which are the tilt angle in sync with and in the same cycle as the acquired image data and the tilt angles acquired within the cycles straddling the acquisition of the image data, is also handled as tilt-angle information related to the image data and recorded. What has hitherto been described is the processing performed in the main body of the image capture apparatus; a sketch of this capture-side flow is given below.
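  • The capture-side flow of steps S1 to S10 could be sketched as follows. The sensor reads are simulated and every name here is hypothetical, so the sketch only illustrates the ordering of the steps, not the actual firmware.

```python
import json
import random
from typing import Tuple

def capture_loop(num_frames: int = 3, tilt_reads_per_frame: int = 4) -> dict:
    """Sketch of Fig. 10: per frame, acquire image data (S1), run general
    image processing (S2, not tilt correction), read the tilt sensor several
    times at the shorter cycle (S3-S6), compress and record (S7-S8); tilt
    information is then written into an expansion area of the video data
    (S10). All names are assumptions for this example."""
    def read_frame(i: int) -> list:             # stand-in for the image sensors
        return [i] * 4
    def read_tilt() -> Tuple[float, float, float]:
        return (random.gauss(0, 0.01), random.gauss(0, 0.01), 1.0)

    frames, tilt_log = [], []
    for i in range(num_frames):
        frame = read_frame(i)                   # S1
        frame = [p for p in frame]              # S2: placeholder processing
        tilt_log += [read_tilt() for _ in range(tilt_reads_per_frame)]  # S3-S6
        frames.append(frame)                    # S7-S8: compress and record
    return {"video": frames,                    # S10: tilt info stored alongside
            "expansion_area": json.dumps(tilt_log)}

print(capture_loop()["expansion_area"][:60])
```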
  • Fig. 11 is a flowchart for the information processing apparatus of the present embodiment. The description below is made with reference to the flowchart of Fig. 11.
  • the information processing apparatus acquires the video data processed by the image capture apparatus (step S11). Acquisition of the video data may be performed by any method for data transmission and reception. Examples of the method include wired or wireless network communication and use of an external storage.
  • the calculator 21 calculates average values on a per-frame basis from the information about the tilt angles recorded in the acquired video file (step S12).
  • the tilt angle can be calculated from the data obtained from the acceleration sensor by calculating the camera tilt parameter, which is described above in paragraphs [0040] through [0048], concerning the camera coordinate system and the global coordinate system.
  • the acceleration data is three-axis coordinates in the X direction, the Y direction, and the Z direction in each of the coordinate systems.
  • the generator 31 performs tilt angle correction relative to the vertical direction on a per-frame basis (step S13).
  • the tilt angle correction is performed using the conversion table described in paragraphs [0049] through [0054].
  • the conversion table is stored in a predetermined recording unit, e.g., a RAM.
  • the tilt angle correction is performed in accordance with predetermined program instructions.
  • step S14 additional information in the video file is read out.
  • the additional information includes time information and audio information for use in displaying video data.
  • Corrected image data having undergone the tilt angle correction is generated using the calculated average tilt angles and the additional information in the video file (step S15).
  • The corrected image data is generated on memory, with the data kept in a recording medium such as a RAM, rather than by temporarily converting the video file into a still-image file. This scheme allows the processing to be completed quickly.
  • The above is performed by the generator 31. Thereafter, the processing (from step S11 to step S15) by the calculator 21 and the generator 31 is performed in a similar manner until every frame has been processed (until "YES" is determined at step S16).
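  • The on-memory scheme of steps S12 to S15 might look like the sketch below, in which decoded frames are corrected and appended directly to an in-memory buffer instead of being written out as intermediate still-image files. The correction and the encoding are trivial placeholders; only the buffering pattern is the point.

```python
import io
from typing import Iterable, List

def correct_video_in_memory(frames: Iterable[List[int]],
                            tilt_angles: List[float]) -> bytes:
    """Per-frame correction and re-encoding kept entirely in RAM, mirroring
    the on-memory processing described for Fig. 11."""
    encoded = io.BytesIO()                        # buffer stays in RAM throughout
    for frame, tilt in zip(frames, tilt_angles):  # steps S12-S15, per frame
        corrected = [p for p in frame]            # placeholder for tilt correction
        encoded.write(bytes(corrected))           # placeholder for video encoding
    return encoded.getvalue()

frames = [[1, 2, 3], [4, 5, 6]]
print(correct_video_in_memory(frames, [0.5, 0.4]))
```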
  • Fig. 12 is a set of still images of a video having undergone the tilt angle correction of the present embodiment and then encoded.
  • Fig. 13 is a set of still images of a video encoded without having undergone the tilt angle correction of the present embodiment.
  • The difference between Fig. 12 and Fig. 13 is apparent in the distortion of the image plane relative to the upward vertical direction and in the horizontal plane.
  • the image plane appears to be a spherical surface.
  • applying the tilt angle correction of the present invention causes the upward vertical direction to be reproduced clearly even if the image capture apparatus is tilted, thereby providing a video image substantially equivalent to that obtained with the image capture apparatus not tilted but horizontally situated.
  • The present invention allows, by using the omnidirectional (full-spherical) image capture apparatus and the information processing apparatus, tilt angles to be acquired from the acceleration sensor with a cycle time that is "1/4" the image-data-acquisition cycle time, thereby achieving highly accurate tilt angle correction. Furthermore, by generating the tilt-angle-corrected image data on memory, high-speed processing can be achieved.
  • In the present embodiment, the imaging system of the present invention includes two pieces of equipment, which are the full-spherical camera and the image forming apparatus. However, if, for instance, an image capture apparatus capable of implementing the imaging system of the present invention by itself is available, the need for using an information processing apparatus such as the image forming apparatus can be eliminated.
  • any of the above-described apparatus, devices or units can be implemented as a hardware apparatus, such as a special-purpose circuit or device, or as a hardware/software combination, such as a processor executing a software program.
  • Any one of the above-described and other methods of the present invention may be embodied in the form of a computer program stored in any kind of storage medium. Examples of storage media include, but are not limited to, flexible disks, hard disks, optical discs, magneto-optical discs, magnetic tapes, non-volatile memory, semiconductor memory, read-only memory (ROM), etc.
  • any one of the above-described and other methods of the present invention may be implemented by an application specific integrated circuit (ASIC), a digital signal processor (DSP) or a field programmable gate array (FPGA), prepared by interconnecting an appropriate network of conventional component circuits or by a combination thereof with one or more conventional general purpose microprocessors or signal processors programmed accordingly.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

A full-spherical video imaging system according to the present invention includes: an acquisition unit configured to acquire tilt angles relative to a vertical direction from an image sensor with a second cycle time equal to "one over a whole number", which is shorter than a first cycle time with which the acquisition unit acquires original image data from the image sensor; a calculator configured to calculate an average tilt angle from a tilt angle of the image sensor relative to the vertical direction acquired simultaneously with the acquisition of the original image data and a tilt angle of the image sensor relative to the vertical direction acquired in a period straddling the acquisition of the original image data; and a generator configured to generate corrected image data by correcting the original image data on the basis of the average tilt angle.
PCT/JP2016/003104 2015-07-01 2016-06-28 Système d'imagerie vidéo entièrement sphérique et support d'enregistrement lisible par ordinateur WO2017002360A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP16817470.4A EP3318053A4 (fr) 2015-07-01 2016-06-28 Système d'imagerie vidéo entièrement sphérique et support d'enregistrement lisible par ordinateur
CN201680037512.2A CN107710728A (zh) 2015-07-01 2016-06-28 全球面视频成像系统和计算机可读记录介质
US15/578,282 US20180146136A1 (en) 2015-07-01 2016-06-28 Full-spherical video imaging system and computer-readable recording medium

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2015-132564 2015-07-01
JP2015132564 2015-07-01
JP2016-124881 2016-06-23
JP2016124881A JP6677098B2 (ja) 2015-07-01 2016-06-23 全天球動画の撮影システム、及びプログラム

Publications (1)

Publication Number Publication Date
WO2017002360A1 true WO2017002360A1 (fr) 2017-01-05

Family

ID=57608028

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/003104 WO2017002360A1 (fr) 2015-07-01 2016-06-28 Système d'imagerie vidéo entièrement sphérique et support d'enregistrement lisible par ordinateur

Country Status (1)

Country Link
WO (1) WO2017002360A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003304434A (ja) * 2002-04-10 2003-10-24 Tateyama Machine Kk 全方位撮像システム
JP2013182218A (ja) * 2012-03-02 2013-09-12 Ricoh Co Ltd 撮像装置、撮像方法、及びプログラム
WO2013133456A1 (fr) * 2012-03-09 2013-09-12 Ricoh Company, Limited Appareil de capture d'images, système de capture d'images, procédé de traitement d'images, appareil de traitement d'informations et support à mémoire lisible par ordinateur

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003304434A (ja) * 2002-04-10 2003-10-24 Tateyama Machine Kk 全方位撮像システム
JP2013182218A (ja) * 2012-03-02 2013-09-12 Ricoh Co Ltd 撮像装置、撮像方法、及びプログラム
WO2013133456A1 (fr) * 2012-03-09 2013-09-12 Ricoh Company, Limited Appareil de capture d'images, système de capture d'images, procédé de traitement d'images, appareil de traitement d'informations et support à mémoire lisible par ordinateur

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3318053A4 *

Similar Documents

Publication Publication Date Title
US20180146136A1 (en) Full-spherical video imaging system and computer-readable recording medium
US10594941B2 (en) Method and device of image processing and camera
US20200296282A1 (en) Imaging system, imaging apparatus, and system
KR101961364B1 (ko) 화상 캡처링 장치, 화상 캡처 시스템, 화상 처리 방법, 정보 처리 장치, 및 컴퓨터-판독 가능 저장 매체
CN109076200B (zh) 全景立体视频系统的校准方法和装置
JPWO2018235163A1 (ja) キャリブレーション装置、キャリブレーション用チャート、チャートパターン生成装置、およびキャリブレーション方法
JP7024817B2 (ja) 撮像装置、撮像システム、画像処理方法、情報処理装置、及びプログラム
KR102442089B1 (ko) 이미지의 관심 영역의 위치에 따라 이미지를 다면체에 매핑하는 장치 및 그 영상 처리 방법
WO2017002360A1 (fr) Système d'imagerie vidéo entièrement sphérique et support d'enregistrement lisible par ordinateur
JP6901580B2 (ja) パノラマ画像または映像の水平較正方法、システムおよび携帯端末
JP6879350B2 (ja) 画像表示システム、画像表示装置、画像表示方法、およびプログラム
JP2013109643A (ja) 球面勾配検出方法、エッジ点検出方法、球面勾配検出装置、エッジ点検出装置、球面勾配検出プログラム及びエッジ点検出プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16817470

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15578282

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2016817470

Country of ref document: EP