WO2012090309A1 - Information processing device, program therefor, and information processing method - Google Patents
- Publication number
- WO2012090309A1 (PCT/JP2010/073751)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- information processing
- processing apparatus
- parallax
- information
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
- G03B35/10—Stereoscopic photography by simultaneous recording having single camera with stereoscopic-base-defining system
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/296—Synchronisation thereof; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
- H04N13/31—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
Definitions
- the present invention relates to an information processing technique for generating a stereoscopic image.
- Patent Document 1 discloses a multi-view image creation system in which pattern matching is performed on a left image and a right image obtained by photographing a subject with a stereo camera having two left and right cameras, and corresponding pixels of the left image and the right image are identified.
- A pseudo image at each predetermined position along the arrangement direction of the left image and the right image is then created by searching for each pair of mutually corresponding pixels and interpolating or extrapolating the distance between them.
- However, when the arrangement direction of the left and right cameras differs from the arrangement direction of the observer's left and right eyes, that is, the horizontal direction, even if an image group is generated in which the amount of parallax between the left and right images along that different direction is varied, the above-described problem relating to the observer's stereoscopic vision remains unsolved.
- The present invention has been made to solve these problems, and an object of the present invention is to provide a technique capable of generating, as a stereoscopic image of a subject, an image group of the subject having a parallax corresponding to the horizontal direction of the real subject, regardless of whether the two cameras constituting the stereo camera are arranged in the horizontal direction.
- An information processing apparatus according to a first aspect includes: an imaging unit having a first imaging system and a second imaging system for imaging a subject from different directions; an acquisition means for acquiring determination information for determining a geometric relationship between the arrangement direction of the first imaging system and the second imaging system and the horizontal direction; a determination means for determining the geometric relationship based on the determination information; and a generating means for performing, based on the imaging result of the imaging unit, the one of two mutually different generation processes that is selected according to the determination result of the geometric relationship, thereby generating a stereoscopic image of the subject.
- An information processing apparatus according to a second aspect is the information processing apparatus according to the first aspect, wherein a first source image obtained from the imaging result is the target of the first generation process, a second source image obtained from the imaging result is the target of the second generation process, and the selection rules by which the first source image and the second source image are selected from the imaging result differ from each other.
- An information processing apparatus according to a third aspect is the information processing apparatus according to the second aspect, wherein the selection rule is a rule that, when the arrangement direction is determined to be the horizontal direction, adopts the first image obtained by the first imaging system and the second image obtained by the second imaging system as the first source image, and, when the arrangement direction is determined to be other than the horizontal direction, adopts either one of the first image and the second image as the second source image.
- An information processing apparatus according to a fourth aspect is the information processing apparatus according to the third aspect, wherein the generation unit generates a third image by spatially deforming either one of the first and second images in an image space, and performs the second generation process using the third image as the second source image.
- An information processing apparatus according to a fifth aspect is the information processing apparatus according to the third or fourth aspect, wherein the generation unit performs the second generation process based on distance information between the imaging unit and the subject estimated from either one of the first and second images.
- An information processing apparatus according to a sixth aspect is the information processing apparatus according to any one of the third to fifth aspects, further comprising a display unit, wherein the generation unit performs the second generation process using, as the second source image, whichever of the first image and the second image is displayed on the display unit.
- An information processing apparatus according to a seventh aspect is the information processing apparatus according to any one of the third to sixth aspects, further comprising a detection means that specifies, based on the determination information, posture information of the information processing apparatus in which the first and second imaging systems are mounted as the imaging unit.
- An information processing apparatus according to an eighth aspect is the information processing apparatus according to the seventh aspect, wherein the detection means specifies the posture information of the information processing apparatus from among two postures that differ from each other by 90 degrees around the optical axis of the imaging unit.
- An information processing apparatus according to a ninth aspect is the information processing apparatus according to the seventh aspect, wherein the detection means specifies the posture information of the information processing apparatus from among four postures that differ from each other by 90 degrees around the optical axis of the imaging unit.
- An information processing apparatus according to a tenth aspect is the information processing apparatus according to any one of the third to ninth aspects, wherein the determination information is at least one of: an operation signal generated by operating the information processing apparatus; at least one of the first image and the second image; and an output signal of a posture sensor provided in the information processing apparatus.
- An information processing apparatus according to an eleventh aspect is the information processing apparatus according to any one of the third to tenth aspects, wherein the first generation process and the second generation process generate the stereoscopic image by spatially shifting pixel values of the first source image and the second source image, respectively, in units of pixels.
- An information processing apparatus according to a twelfth aspect is the information processing apparatus according to any one of the third to eleventh aspects, wherein the generation means performs the first generation process and the second generation process so that the parallax state in the image group constituting the stereoscopic image differs from the parallax state between the first image and the second image.
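The pixel-shift generation described in the eleventh and twelfth aspects can be sketched in code. The following is a minimal illustration, not the patented implementation: the function name, the `gain` parameter, and the use of a per-pixel parallax map are all assumptions, with a simple horizontal shift in whole pixels standing in for the actual generation process.

```python
import numpy as np

def generate_stereo_pair(source, parallax, gain=1.0):
    """Toy sketch: shift each pixel of a single source image horizontally
    by an amount derived from its parallax value, producing a left-eye and
    a right-eye image whose parallax state can differ (via `gain`) from
    that of the originally captured pair."""
    h, w = source.shape[:2]
    left = np.zeros_like(source)
    right = np.zeros_like(source)
    for y in range(h):
        for x in range(w):
            # shift in whole pixels; larger parallax (nearer point) shifts more
            s = int(round(gain * parallax[y, x] / 2.0))
            if 0 <= x - s < w:
                left[y, x - s] = source[y, x]
            if 0 <= x + s < w:
                right[y, x + s] = source[y, x]
    return left, right
```

Scaling `gain` changes the parallax state of the generated pair independently of the captured images, in the spirit of the twelfth aspect.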
- An information processing apparatus according to a thirteenth aspect is the information processing apparatus according to any one of the first to twelfth aspects, wherein the information processing apparatus is a portable information terminal, a digital still camera, or a digital video camera.
- A program according to a fourteenth aspect, when executed by a computer mounted on an information processing apparatus, causes the information processing apparatus to function as the information processing apparatus according to any one of the first to thirteenth aspects.
- An information processing method according to a fifteenth aspect includes: determining the geometric relationship between the horizontal direction and the arrangement direction of the first imaging system and the second imaging system in an imaging unit having a first imaging system and a second imaging system that photograph a subject from different directions; and performing, based on the imaging result of the imaging unit, the one of two mutually different generation processes that is selected according to the determination result, thereby generating a stereoscopic image of the subject.
- According to the information processing apparatus of any one of the first to thirteenth aspects, the program of the fourteenth aspect, or the information processing method of the fifteenth aspect, the geometric relationship between the arrangement direction of the first imaging system and the second imaging system and the horizontal direction is determined, and the one of the mutually different first and second generation processes that is selected according to the determination result is performed based on the imaging result of the imaging unit to generate the stereoscopic image of the subject. Therefore, whether the arrangement direction of the first imaging system and the second imaging system is the horizontal direction or differs from it, a group of subject images having a parallax corresponding to the horizontal direction of the real subject can be generated as a stereoscopic image of the subject.
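As a rough illustration of the selection logic summarized above, the following sketch (all names are hypothetical, and Python is used purely for illustration) chooses a generation process and its source image(s) from the determined arrangement direction, mirroring the selection rules of the second and third aspects:

```python
def choose_generation(first_image, second_image, arrangement_is_horizontal):
    """Select a generation process and its source image(s) from the
    determined geometric relationship (illustrative names only)."""
    if arrangement_is_horizontal:
        # Horizontal array: both captured images feed the first process.
        return "first_process", (first_image, second_image)
    # Non-horizontal array: a single captured image feeds the second process.
    return "second_process", (first_image,)
```

Which of the two captured images is adopted in the non-horizontal case is itself part of the selection rule; the sketch arbitrarily picks the first.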
- FIG. 1 is a schematic diagram illustrating an outline of an external appearance of an information processing apparatus according to the embodiment.
- FIG. 2 is a schematic diagram illustrating an outline of an external appearance of the information processing apparatus according to the embodiment.
- FIG. 3 is a schematic diagram illustrating an outline of the appearance of the information processing apparatus according to the embodiment.
- FIG. 4 is a diagram illustrating an example of a main configuration of the information processing apparatus according to the embodiment.
- FIG. 5 is a diagram illustrating an example of a functional configuration of the information processing apparatus according to the embodiment.
- FIG. 6 is a diagram illustrating an example of a main functional configuration of the stereo camera according to the embodiment.
- FIG. 7 is a diagram illustrating an example of parallax between the left image and the right image.
- FIG. 8 is a diagram illustrating an example of parallax between an upper image and a lower image.
- FIG. 9 is a diagram illustrating an example of the upper image.
- FIG. 10 is a diagram illustrating an example of the lower image.
- FIG. 11 is a diagram illustrating an example of a parallax image.
- FIG. 12 is a diagram illustrating an example of a parallax image.
- FIG. 13 is a diagram illustrating an example of a left-eye image.
- FIG. 14 is a diagram illustrating an example of a right-eye image.
- FIG. 15 is a diagram illustrating a concept of an example of a procedure for generating a stereoscopic image.
- FIG. 16 is a diagram illustrating a concept of an example of a procedure for generating a stereoscopic image.
- FIG. 17 is a diagram illustrating a concept of an example of a procedure for generating a stereoscopic image.
- FIG. 18 is a diagram illustrating an example of a correspondence relationship of each pixel in each of the partial image of the source image and the partial image of the stereoscopic image.
- FIG. 19 is a diagram illustrating an example of a correspondence relationship between pixel coordinates and pixel shift values of a partial image of a source image and pixel coordinates of a partial image of a stereoscopic image.
- FIG. 20 is a diagram illustrating an example of the left image.
- FIG. 21 is a diagram illustrating an example of the right image.
- FIG. 22 is a diagram illustrating an example of the parallax image of the left image.
- FIG. 23 is a diagram illustrating an example of the parallax image of the right image.
- FIG. 24 is a diagram illustrating an example of the parallax image of the left image.
- FIG. 25 is a diagram illustrating an example of the parallax image of the right image.
- FIG. 26 is a diagram illustrating an example of a left-eye image.
- FIG. 27 is a diagram illustrating an example of the right-eye image.
- FIG. 28 is a diagram for explaining an example of a correspondence relationship between the upper image and the lower image and each stereoscopic image.
- FIG. 29 is a diagram for describing an example of a correspondence relationship between the upper image and the lower image, and each stereoscopic image.
- FIG. 30 is a diagram for explaining an example of a correspondence relationship between the left image and the right image and each stereoscopic image.
- FIG. 31 is a diagram illustrating an example of correspondence between stereoscopic images and posture information of the information processing device.
- FIG. 32 is a diagram illustrating an operation flow of the information processing apparatus according to the embodiment.
- FIG. 33 is a diagram illustrating an operation flow of the information processing apparatus according to the embodiment.
- FIG. 34 is a diagram illustrating an operation flow of the information processing apparatus according to the embodiment.
- FIG. 35 is a diagram illustrating an operation flow of the information processing apparatus according to the embodiment.
- <About the embodiment:> <External configuration of information processing apparatus 100A:> FIGS. 1 to 3 are schematic diagrams showing an outline of the external configuration of an information processing apparatus 100A according to an embodiment of the present invention.
- In FIG. 1 and the subsequent drawings, three mutually orthogonal XYZ axes or two mutually orthogonal XY axes are attached as appropriate to clarify the orientation relationship.
- The information processing apparatus 100A is configured as a foldable portable information terminal that functions as a terminal device that transmits and receives various types of information to and from mobile phones, cameras, server devices, and the like by wireless communication or the like.
- The information processing apparatus 100A includes a housing 200A, a housing 200B, and a hinge portion 400.
- the hinge part 400 mechanically connects the housing 200A and the housing 200B and electrically connects the housing 200A and the housing 200B.
- The information processing apparatus 100A can be folded at the hinge portion 400.
- FIGS. 1 to 3 each show the appearance of the information processing apparatus 100A in an opened state.
- FIG. 1 shows a surface (also referred to as “back surface”) that is an outer surface of the information processing apparatus 100A when the information processing apparatus 100A is folded.
- FIGS. 2 and 3 show the surface other than the back surface (also referred to as the "front surface") when the information processing apparatus 100A is opened.
- the housing 200A and the housing 200B are plate-like members, and have a role as housings for storing various electronic members.
- The housing 200A has a stereo camera 300 (FIG. 4) including a first camera 61 and a second camera 62 on its back surface side, and a display unit 43 on its front surface.
- The housing 200B has an operation unit 42, such as buttons, on its front surface, and houses a CPU (Central Processing Unit) 11A (FIG. 4) and the like for electrically controlling the information processing apparatus 100A.
- The information processing apparatus 100A generates a stereoscopic image that can be stereoscopically viewed by the operator, based on images of the subject photographed by the stereo camera 300 in various postures such as those shown in FIGS. 2 and 3.
- FIG. 6 is a diagram illustrating an example of a main functional configuration of the stereo camera 300 provided in the information processing apparatus 100A according to the embodiment.
- The stereo camera 300 mainly includes a first camera 61 and a second camera 62, which are provided a predetermined baseline length apart.
- the first camera 61 mainly includes a photographing optical system 72a, an image sensor 75a, and a control processing circuit 85a.
- The second camera 62 mainly includes a photographing optical system 72b, an image sensor 75b, and a control processing circuit 85b.
- Various operations of the stereo camera 300 are controlled based on a control signal 56 (FIG. 5) supplied from the control unit 13 of the CPU 11A via the input / output unit 41 and the data line DL.
- the stereo camera 300 captures the light from the subject 71 with the first camera 61 and the second camera 62 and acquires the first image 21 and the second image 22 constituting the stereo image.
- the generated first image 21 and second image 22 are supplied to the input / output unit 41 (FIG. 4) via the data line DL.
- The photographing optical systems 72a and 72b each mainly include a thin lens and a lens barrel (not shown) that supports the lens, and are optical systems that form an image of the subject 71 on the image sensors 75a and 75b, respectively. The image of an object point M on the subject 71 is formed as image points Pa and Pb on the image sensors 75a and 75b along principal rays 76a and 76b passing through the optical centers 73a and 73b, respectively.
- The optical centers 73a and 73b are usually the principal points of the photographing optical systems; when a telecentric optical system is employed as the photographing optical system, however, its focal point is usually taken as the optical center.
- The virtual principal ray 76av is a virtual principal ray obtained by translating the principal ray 76a so that it passes through the optical center 73b, and the virtual image point Pav corresponding to the image point Pa is formed on the image sensor 75b along the virtual principal ray 76av.
- The imaging centers 77a and 77b of the first camera 61 and the second camera 62 are the intersection of the image sensor 75a with the optical axis 74a and the intersection of the image sensor 75b with the optical axis 74b, respectively, and the baseline length b between the photographing optical systems 72a and 72b is the distance between the optical centers 73a and 73b.
- When the image points Pa and Pb corresponding to the same object point M on the subject 71 are expressed in a common image coordinate system in which the coordinates of the imaging centers coincide, the distance d between the virtual image point Pav and the image point Pb, that is, the distance between the image point positions, corresponds to the parallax of the first camera 61 and the second camera 62 with respect to the object point M.
- the focal lengths fr (more precisely, the distance between the optical center and the image sensor) of the photographing optical systems 72a and 72b are equal, and the optical axes 74a and 74b are parallel to each other.
- the main planes of the photographing optical systems 72a and 72b are on the same plane perpendicular to the optical axes 74a and 74b, and the optical centers 73a and 73b are also on the same plane.
- the image sensors 75a and 75b of the respective photographing optical systems are on the same plane perpendicular to the optical axes 74a and 74b.
- The image sensors 75a and 75b are installed with their scanning lines parallel so that the corresponding-point search processing between the first image 21 and the second image 22 can be performed easily.
- When these conditions are not satisfied, the CPU 11A applies processing using camera parameters or the like (also referred to as "parallelization processing", i.e., rectification) to the first image 21 and the second image 22 supplied from the first camera 61 and the second camera 62, respectively.
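The corresponding-point search that the parallel scanning lines make tractable can be illustrated with a toy sum-of-absolute-differences search along a single scanline. This is a generic sketch of the well-known technique, not the method of the patent; the function name and the window and search-range parameters are assumptions:

```python
import numpy as np

def disparity_along_scanline(left_row, right_row, x, win=3, max_d=16):
    """Match the pixel at `x` in the left scanline against candidates
    shifted by 0..max_d in the right scanline, assuming rectified
    ('parallelized') images, using a sum of absolute differences over
    a small window. Returns the best-matching disparity in pixels."""
    lo, hi = x - win, x + win + 1
    patch = left_row[lo:hi]
    best_d, best_cost = 0, float("inf")
    for d in range(max_d + 1):
        if lo - d < 0:
            break  # candidate window would fall off the image edge
        cand = right_row[lo - d:hi - d]
        cost = float(np.abs(patch - cand).sum())
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```

Because both rows lie on the same scanline after rectification, the search is one-dimensional, which is exactly what the parallel installation of the sensors buys.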
- The distance D between the principal planes of the photographing optical systems 72a and 72b and the object point M is expressed in terms of the parallax d, the focal length fr, and the baseline length b between the photographing optical systems 72a and 72b, and is given by equation (1): D = fr * b / d.
- the parallax is an index value regarding the distance from the stereo camera 300 of the point on the subject.
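Equation (1) amounts to the standard triangulation relation D = fr * b / d. A minimal numeric sketch follows (function and variable names assumed; fr and d must share the same units, e.g. pixels, and b sets the unit of D):

```python
def depth_from_parallax(d, fr, b):
    """Equation (1): distance D to the object point from the parallax d,
    the focal length fr (same units as d), and the baseline length b."""
    if d <= 0:
        raise ValueError("parallax must be positive for a finite distance")
    return fr * b / d
```

For example, with fr = 1000 pixels, b = 0.05 m, and d = 100 pixels, D = 0.5 m; halving the parallax doubles the estimated distance, matching the role of parallax as an index of distance noted above.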
- The image sensors 75a and 75b are, for example, CCD or CMOS image sensors having an effective pixel count of 3456 x 2592 pixels; they generate image signals corresponding to the intensity of the images formed on them and supply those signals to the control processing circuit 85a and the control processing circuit 85b, respectively.
- Whether the image sensors 75a and 75b are color image sensors or monochrome image sensors does not impair the usefulness of the present invention.
- Likewise, whether the number of pixels in the main scanning direction (horizontal scanning direction) is greater than, less than, or equal to the number of pixels in the sub-scanning direction (vertical scanning direction) does not impair the usefulness of the present invention.
- The control processing circuit 85a and the control processing circuit 85b shown in FIG. 6 process the image signals supplied from the image sensors 75a and 75b in synchronization and convert them into digital images, thereby generating the first image 21 and the second image 22 corresponding to the effective pixel count of each image sensor.
- the generated first image 21 and second image 22 constitute a stereo image of the subject.
- The stereo camera 300 can also continuously photograph the subject in time sequence while the first camera 61 and the second camera 62 are synchronized, thereby generating a plurality of first images 21 and a plurality of second images 22 (also referred to as a "time-series stereo image").
- FIG. 4 is a block diagram illustrating an example of a main configuration of the information processing apparatus 100A according to the embodiment.
- The information processing apparatus 100A mainly includes a CPU 11A, an input/output unit 41, an operation unit 42, a display unit 43, a ROM 44, a RAM 45, a storage device 46, a posture sensor 47, and the above-described stereo camera 300.
- The input/output unit 41 includes, for example, a connector that electrically connects the signal line 49 to the data line DL, which is electrically connected to each of the stereo camera 300 and the posture sensor 47, and exchanges data between the stereo camera 300 and the CPU 11A. Specifically, the input/output unit 41 supplies, for example, various control signals by which the CPU 11A controls the stereo camera 300 to the stereo camera 300 connected to the input/output unit 41 via the data line DL or the like. Further, the input/output unit 41 supplies the first image 21 and the second image 22 captured by the stereo camera 300 to the RAM 45 and the CPU 11A, respectively, and supplies the output signal 51 output from the posture sensor 47 to the CPU 11A.
- The input/output unit 41 also includes an interface for external devices, such as a USB interface. The information processing apparatus 100A can therefore also acquire, via the input/output unit 41, a first image 21 and a second image 22 that were captured in advance and stored in an external device such as a computer.
- The operation unit 42 includes, for example, various operation buttons provided on the front surface of the housing 200B. When the operation unit 42 is operated, an operation signal 52 corresponding to the operation is supplied to the CPU 11A.
- the CPU 11A sets various control parameters and various operation modes of the information processing apparatus 100A based on the supplied operation signal 52.
- Each functional unit of the information processing apparatus 100A performs processing according to the operation mode set via the operation unit 42.
- the operation unit 42 is provided with a button or a switch for the operator to input to the information processing apparatus 100A the posture of the information processing apparatus 100A when photographing the subject.
- The posture is, for example, the posture of the information processing apparatus 100A in which the arrangement direction of the first camera 61 and the second camera 62 is the horizontal direction (X-axis direction) as shown in FIG. 2, or the posture in which the arrangement direction is the vertical direction (Y-axis direction) as shown in FIG. 3.
- The operation signal 52 is a signal corresponding to the setting result of a switch set by the operator to register information about the posture in the information processing apparatus 100A, or to the posture input by the operator.
- Even if the operation unit 42 is provided with shooting buttons for the stereo camera 300 at positions easy for the operator to operate, and an operation signal 52 indicating which of the shooting buttons has been operated is supplied to the CPU 11A, the usefulness of the present invention is not impaired.
- The display unit 43 is composed of a liquid crystal display for three-dimensional display using a parallax barrier system, which can switch the spatial distribution direction of each barrier portion in the display unit 43 among a plurality of directions, such as the X-axis direction and the Y-axis direction in FIG. 2, under control from the CPU 11A. By switching the barrier direction of the display unit 43 according to the parallax direction between the right-eye image and the left-eye image displayed on the display unit 43, the CPU 11A switches the direction in which an observer viewing the stereoscopic image displayed on the display unit 43 can perceive depth with the naked eye.
- Alternatively, a three-dimensional display method may be employed in which the display unit 43 alternately displays the left-eye image and the right-eye image at high speed and the stereoscopic image displayed on the display unit 43 is observed through dedicated glasses whose shutter portions for the left eye and the right eye open and close alternately in synchronization with the switching; this does not impair the usefulness of the present invention. In this case, by switching the parallax direction between the left-eye image and the right-eye image displayed on the display unit 43, the CPU 11A can switch the direction in which an observer wearing the dedicated glasses can perceive depth.
- The display unit 43 can also display the first image 21 and the second image 22 supplied from the stereo camera 300, as well as various setting information relating to the information processing apparatus 100A, as two-dimensional images or text information visible to the observer.
- ROM (Read Only Memory) 44 is a read-only memory and stores a program for operating the CPU 11A.
- A readable/writable nonvolatile memory (for example, a flash memory) may be used instead of the ROM 44.
- RAM (Random Access Memory) 45 is a readable and writable volatile memory.
- the RAM 45 functions as an image storage unit that temporarily stores various images captured by the stereo camera 300, stereoscopic images generated by the information processing apparatus 100A, parallax information (parallax images), and the like.
- the RAM 45 also functions as a work memory that temporarily stores processing information of the CPU 11A.
- The storage device 46 is configured by, for example, a readable/writable nonvolatile memory such as a flash memory, a small hard disk device, or the like, and permanently records information such as various control parameters and various operation modes of the information processing apparatus 100A.
- The storage device 46 also permanently stores the stereoscopic image generated by the information processing apparatus 100A in association with the posture information of the information processing apparatus 100A detected based on the output signal 51 (FIG. 5) from the posture sensor 47.
- the attitude sensor 47 is constituted by, for example, a small attitude sensor in which a three-axis gyro sensor and a three-axis acceleration sensor are mounted by a MEMS (Micro Electro Mechanical Systems) technology or the like.
- The posture sensor 47 is electrically connected to the input/output unit 41 via the data line DL, and sequentially supplies to the input/output unit 41 an output signal 51 (FIG. 5) corresponding to the posture of the information processing apparatus 100A with respect to the vertical direction (direction of gravity).
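How an output signal like that of the posture sensor 47 might map onto the 90-degree-apart postures of the eighth and ninth aspects can be sketched from a gravity reading alone. This is a hypothetical illustration; the axis conventions, function name, and the mapping of the camera array to the device X axis are all assumptions, not the sensor's actual output processing:

```python
import math

def posture_from_gravity(ax, ay):
    """Decide the device posture from a gravity vector (ax, ay) measured
    in the device's X-Y plane. The camera array direction is taken as the
    device X axis, so the array is 'horizontal' when gravity lies mostly
    along Y, and 'vertical' when it lies mostly along X."""
    angle = math.degrees(math.atan2(ax, ay))  # 0 deg = gravity along +Y
    # Snap to one of four postures 90 degrees apart around the optical axis.
    quadrant = int(round(angle / 90.0)) % 4
    return ("horizontal", "vertical")[quadrant % 2]
```

Distinguishing all four quadrants (rather than only the two parities returned here) is what the ninth aspect's four-posture detection would require; the sketch collapses them because only the horizontal/non-horizontal decision drives the choice of generation process.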
- the CPU 11A acquires the output signal 51 supplied to the input / output unit 41 at a predetermined timing.
- The CPU (Central Processing Unit) 11A is a control processing device that supervises and controls each functional unit of the information processing apparatus 100A, and executes control and processing according to the program stored in the ROM 44.
- the CPU 11A also functions as an acquisition unit 12, a control unit 13, a determination unit 14, a detection unit 15, and a generation unit 16, as will be described later.
- the CPU 11A generates a left-eye image 27 and a right-eye image 28 that form a stereoscopic image of the subject from the subject image captured by the stereo camera 300 by using each of these functional units.
- The CPU 11A also controls the imaging operation of the stereo camera 300 and controls the display unit 43 so that various images, stereoscopic images, calculation results, various control information, and the like are displayed on the display unit 43.
- the CPU 11A is electrically connected to the input/output unit 41, the operation unit 42, the display unit 43, the ROM 44, the RAM 45, the storage device 46, the attitude sensor 47, and the like via a signal line 49. Therefore, the CPU 11A can, for example, execute control of the stereo camera 300 via the input/output unit 41, acquisition of image information from the stereo camera 300, acquisition of the output signal 51 from the attitude sensor 47, display on the display unit 43, and the like at predetermined timings.
- the functional units of the acquisition unit 12, the control unit 13, the determination unit 14, the detection unit 15, and the generation unit 16 are realized by executing predetermined programs on the CPU 11A.
- each of these functional units may be realized by a dedicated hardware circuit, for example.
- FIG. 5 is a block diagram illustrating an example of a main functional configuration of the information processing apparatus 100A according to the embodiment.
- the information processing apparatus 100A generates a stereoscopic image 29 of the subject, that is, a left-eye image 27 and a right-eye image 28, using the first image 21 obtained by photographing the subject with the first camera 61 (FIG. 1) of the stereo camera 300 and the second image 22 obtained with the second camera 62 (FIG. 1).
- FIGS. 32 to 34 are diagrams illustrating an operation flow S100 in which the information processing apparatus 100A according to the embodiment generates the stereoscopic image 29.
- the operation of each functional unit of the information processing apparatus 100A will be described below with reference to FIG. 5 and FIGS. 32 to 34.
- the position and posture of the information processing apparatus 100A are adjusted by the operator so that the subject can be captured by both the first camera 61 and the second camera 62 provided in the stereo camera 300.
- stereo camera 300 and control unit 13: when a shooting button provided on the operation unit 42 is operated in a state where the position and orientation of the information processing apparatus 100A have been adjusted, an operation signal instructing the start of the shooting operation by the stereo camera 300 is supplied, as shown in FIG. 5, as the operation signal 52 to the control unit 13 via the acquisition unit 12. The control unit 13 then supplies a control signal 56 (FIG. 5) that causes the stereo camera 300 to perform the shooting operation to each of the stereo camera 300 and the acquisition unit 12.
- the first camera 61 and the second camera 62 start a shooting operation for shooting a subject from different directions.
- a first image 21 and a second image 22 constituting a stereo image of the subject are thereby generated by the first camera 61 and the second camera 62 of the stereo camera 300, respectively.
- the generated first image 21 and second image 22 of the subject are supplied to the acquisition unit 12 via the input/output unit 41 (FIG. 4) and acquired by the acquisition unit 12 (step S110 in FIG. 32).
- the acquisition unit 12 then acquires determination information 55 (FIG. 5) for determining the geometric relationship between the arrangement direction of the first camera 61 and the second camera 62 and the horizontal direction (step S120).
- the arrangement direction of the first camera 61 and the second camera 62 is specifically, for example, the direction of the base line of the stereo camera 300 connecting the optical center 73a (FIG. 6) and the optical center 73b (FIG. 6).
- specifically, the acquisition unit 12 acquires, as the determination information 55, at least one of, for example, an operation signal 52 generated by operating the operation unit 42, at least one of the first image 21 and the second image 22, and the output signal 51 of the attitude sensor 47.
- the acquired determination information 55 is supplied to each of the determination unit 14 and the detection unit 15.
- based on the supplied determination information 55, the determination unit 14 determines whether the arrangement direction of the first camera 61 and the second camera 62 is, for example, the horizontal direction, the vertical direction perpendicular to the horizontal direction, or a direction oblique to the horizontal direction. That is, the determination unit 14 determines the geometric relationship between the arrangement direction of the first camera 61 and the second camera 62 and the horizontal direction based on the determination information 55 (step S130 in FIG. 32). As the determination process, for example, the determination unit 14 determines the geometric relationship based on a signal representing the attitude of the information processing apparatus 100A at the time of shooting contained in the operation signal 52, or based on the output signal 51.
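As one concrete way to picture this determination, the geometric relationship can be classified from the gravity vector reported by a three-axis acceleration sensor such as the attitude sensor 47. The sketch below is illustrative only and is not taken from the patent: the function name, the tolerance parameter, and the assumption that the device X axis runs along the camera baseline are all hypothetical.

```python
import math

def classify_arrangement(ax, ay, az, tol_deg=15.0):
    """Classify the camera arrangement direction relative to the horizontal.

    (ax, ay, az): accelerometer output in the device frame, where the X axis
    is assumed to run along the camera baseline. At rest the sensor measures
    gravity, so the tilt of the baseline out of the horizontal plane can be
    derived from the X component of the gravity vector.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        raise ValueError("no gravity reading")
    # Angle between the baseline (device X axis) and the horizontal plane.
    tilt = math.degrees(math.asin(max(-1.0, min(1.0, ax / g))))
    if abs(tilt) <= tol_deg:
        return "horizontal"   # landscape hold: generation process B
    if abs(tilt) >= 90.0 - tol_deg:
        return "vertical"     # portrait hold: generation process A
    return "oblique"

# Device held in landscape: gravity along -Y, baseline horizontal.
print(classify_arrangement(0.0, -9.8, 0.0))   # horizontal
# Device rotated 90 degrees: gravity along the baseline.
print(classify_arrangement(-9.8, 0.0, 0.0))   # vertical
```

A real determination would also debounce the sensor output over time; this sketch classifies a single reading.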
- the detection unit 15 detects, based on the determination information 55, the posture information 54 (FIG. 5) of the information processing apparatus 100A at the time when the image of the subject used by the generation unit 16 to generate the stereoscopic image 29 was captured by the stereo camera 300 (step S140).
- the detected posture information 54 is associated with the stereoscopic image 29 and temporarily stored in the RAM 45, and is permanently stored in the storage device 46 in response to a predetermined operation signal from the operation unit 42.
- the detection unit 15 can also acquire the determination information 55 supplied to the acquisition unit 12 at times other than shooting by the stereo camera 300, for example when a stereoscopic image stored in the storage device 46 is displayed on the display unit 43, and detect the attitude information 54 (FIG. 5) of the information processing apparatus 100A.
- operation of the generation unit 16: when the posture information 54 has been detected, the generation unit 16 checks, based on the determination result information 53 supplied from the determination unit 14, whether the arrangement direction of the first camera 61 and the second camera 62 is the horizontal direction (step S150 in FIG. 32).
- in step S150, when it is determined that the arrangement direction of the first camera 61 and the second camera 62 is other than the horizontal direction, the generation unit 16 performs generation process A to generate the stereoscopic image 29.
- when it is determined that the arrangement direction of the first camera 61 and the second camera 62 is the horizontal direction, the generation unit 16 performs generation process B to generate the stereoscopic image 29.
- the generation processes A and B are processes different from each other, and are performed based on the imaging result of the subject by the information processing apparatus 100A.
- that is, the generation unit 16 generates the stereoscopic image 29 of the subject by selecting between the mutually different generation process A and generation process B according to the determination result of the geometric relationship between the arrangement direction of the first camera 61 and the second camera 62 and the horizontal direction.
- switching between generation process A and generation process B will be described below, taking as examples the cases where the posture of the information processing apparatus 100A is set to the posture shown in FIG. 2 and to the posture shown in FIG. 3.
- in the posture shown in FIG. 2, the arrangement direction of the first camera 61 and the second camera 62 is the horizontal direction (X-axis direction); in the posture shown in FIG. 3, the arrangement direction is the vertical direction (Y-axis direction).
- the first image 21 of the subject photographed by the first camera 61 on the left side is referred to as the left image 25, and the second image 22 of the subject photographed by the second camera 62 on the right side (+X direction side) is referred to as the right image 26.
- when the posture of the information processing apparatus 100A is the posture shown in FIG. 3, the second image 22 of the subject photographed by the second camera 62 on the upper side is referred to as the upper image 23, and the first image 21 of the subject photographed by the first camera 61 on the lower side (+Y direction side) is referred to as the lower image 24.
- FIG. 7 is a diagram illustrating an example of the parallax between the left image 25e (first image 21e) and the right image 26e (second image 22e) when the posture of the information processing apparatus 100A is the posture illustrated in FIG. 2.
- FIG. 8 is a diagram illustrating an example of the parallax between the upper image 23f (second image 22f) and the lower image 24f (first image 21f) when the posture of the information processing apparatus 100A is the posture illustrated in FIG. 3.
- the first image 21e and the first image 21f are examples of the first image 21, and the second image 22e and the second image 22f are examples of the second image 22.
- the upper image 23f, the lower image 24f, the left image 25e, and the right image 26e are examples of the upper image 23, the lower image 24, the left image 25, and the right image 26, respectively.
- the pixel 68c on the foreground subject image 66c and the pixel 68d on the foreground subject image 66d are pixels corresponding to the same point on the near-side subject, and the pixel 69c and the pixel 69d are pixels corresponding to the same point on the far-side subject.
- the parallax 9c is the parallax for the pixel 68c and the pixel 68d, and the parallax 9d is the parallax for the pixel 69c and the pixel 69d.
- the left image 25e and the right image 26e are displayed side by side in the vertical direction (Y-axis direction in FIG. 7) so that the X coordinates of the left ends (right ends) of both images are equal, in order to make the parallax easy to grasp.
- the parallax 9c and the parallax 9d occur in the horizontal direction (X-axis direction). Further, the parallax 9c and the parallax 9d take different values because the near-side subject and the far-side subject are at different distances from the stereo camera 300. More specifically, the parallax 9c corresponding to the near-side subject is larger than the parallax 9d corresponding to the far-side subject. Thus, the magnitude of the parallax varies according to the distance from the stereo camera 300 to the point on the subject corresponding to each pixel on the image.
- in FIG. 8, foreground subject images 66a and 66b are photographed of the same near-side subject located in the +Z direction with respect to the stereo camera 300, and distant subject images 67a and 67b are photographed of the same far-side subject located farther than the near-side subject in the +Z direction with respect to the stereo camera 300.
- the pixel 68a on the foreground subject image 66a and the pixel 68b on the foreground subject image 66b are pixels corresponding to the same point on the near-side subject, and the pixel 69a and the pixel 69b are pixels corresponding to the same point on the far-side subject.
- the parallax 9a is the parallax for the pixel 68a and the pixel 68b, and the parallax 9b is the parallax for the pixel 69a and the pixel 69b.
- in FIG. 8, only the edges (outlines) of the characteristic portions in each subject image are displayed, as in FIG. 7.
- the upper image 23f and the lower image 24f are displayed side by side in the horizontal direction (X-axis direction in FIG. 8) so that the Y coordinates of the upper ends (lower ends) of both images are equal, in order to make the parallax easy to grasp.
- the parallax 9a and the parallax 9b occur in the vertical direction (Y-axis direction), and the parallax 9a corresponding to the near-side subject is larger in magnitude than the parallax 9b corresponding to the far-side subject.
- the arrangement direction of the left eye and the right eye of an observer observing the display unit 43 of the information processing apparatus 100A shown in FIG. 2 is, for example, the horizontal direction (X-axis direction).
- in this case, if the left image 25e and the right image 26e are displayed on the display unit 43 as the left-eye image 27 and the right-eye image 28, respectively, while their orientation with respect to the coordinate system shown in FIG. 7 is maintained, the observer can perceive stereoscopic depth because the arrangement direction of the observer's eyes coincides with the direction of the parallax between the images.
- when the display unit 43 is, for example, a parallax barrier type display, the longitudinal direction of each barrier in the display unit 43 is set along the Y-axis direction (vertical direction). This setting of the display unit 43 is performed, for example, by the CPU 11A controlling the display unit 43 based on the attitude information 54 of the information processing apparatus 100A acquired by the detection unit 15 or the like.
- on the other hand, when the orientation of the information processing apparatus 100A is the orientation shown in FIG. 3, if the upper image 23f and the lower image 24f are displayed while their orientation with respect to the coordinate system shown in FIG. 8 is maintained, the observer cannot perceive stereoscopic depth because the arrangement direction of the observer's eyes differs from the direction of the parallax between the upper image 23f and the lower image 24f.
- the arrangement direction of the left and right eyes of the observer observing the display unit 43 in FIG. 3 is the horizontal direction (X-axis direction).
- in this case, each barrier of the display unit 43, which is a parallax barrier type display, is rotated 90 degrees about the Z-axis relative to the information processing apparatus 100A from the longitudinal direction of each barrier in the case of FIG. 2, so that its longitudinal direction is set along the Y-axis direction (vertical direction) in FIG. 3.
- accordingly, if the left image 25e and the right image 26e, whose parallax direction is the horizontal direction (lateral direction), are displayed as they are on the display unit 43 as the left-eye image 27 and the right-eye image 28 of the stereoscopic image 29, the observer of the display unit 43 can perceive stereoscopic depth and feels no incongruity about the orientation of the image.
- by contrast, if the upper image 23f and the lower image 24f, whose parallax direction is the vertical direction (longitudinal direction), are displayed as they are on the display unit 43 as the left-eye image 27 and the right-eye image 28, that is, without converting the parallax direction to the horizontal direction, the observer of the display unit 43 cannot perceive stereoscopic depth, or feels a sense of incongruity about the orientation of the image even if depth can be perceived.
- the difference between the arrangement direction of the eyes of the observer and the parallax direction of the stereoscopic image affects the visibility of the stereoscopic image.
- therefore, for images having a parallax in the vertical direction, a generation process different from the stereoscopic image generation process used for the left image 25e and the right image 26e having a parallax in the horizontal direction (lateral direction) needs to be employed.
- for this reason, the information processing apparatus 100A determines the geometric relationship between the arrangement direction of the first camera 61 and the second camera 62 and the horizontal direction, and performs generation process A or generation process B, which differ from each other, according to the determination result, so that a group of subject images can be generated as a stereoscopic image of the subject. These generation processes are described below.
- in the following description, the term "source image" is used for a subject image that is used as an image constituting the stereoscopic image 29 of the subject, either after being subjected to spatial deformation in the image space or without being deformed.
- about generation process A: FIG. 28 is a diagram for explaining an example of the correspondence relationship, in generation process A, between the upper image 23 and the lower image 24 photographed by the second camera 62 and the first camera 61, respectively, with the information processing apparatus 100A in the posture shown in FIG. 3, and the left-eye image 27 and the right-eye image 28 that constitute the stereoscopic image.
- in generation process A, one of the upper image 23 and the lower image 24 is specified as the source image, and the left-eye image 27 and the right-eye image 28 are each generated by applying spatial deformation in the image space to the specified source image.
- FIG. 9 is a diagram showing an upper image 23a as an example of the upper image 23.
- FIG. 10 is a diagram illustrating a lower image 24a as an example of the lower image 24.
- FIG. 11 is a diagram illustrating a parallax image 30a as an example of the parallax image.
- FIG. 12 is a diagram illustrating a parallax image 31a as an example of the parallax image.
- FIG. 13 is a diagram illustrating a left-eye image 27a as an example of the left-eye image 27.
- FIG. 14 is a diagram showing a right-eye image 28a as an example of the right-eye image 28. In FIGS. 9 to 14, for convenience of illustration, the aspect ratio of the number of pixels of each image does not necessarily match the aspect ratio of the actual image. The same applies to each image in FIGS. 20 to 27 described later.
- in step S150 of FIG. 32, when it is determined that the arrangement direction of the first camera 61 and the second camera 62 is other than the horizontal direction, the generation unit 16 specifies either one of the first image 21 and the second image 22 as the source image (step S160 in FIG. 33). Note that the usefulness of the present invention is not impaired even if the generation unit 16 specifies, as the source image, the image displayed on the display unit 43 out of the first image 21 and the second image 22 and performs generation process A. If the image displayed on the display unit 43 is specified as the source image, an observer watching the display unit 43 is less likely to feel a sense of incongruity even when the display unit 43 switches from displaying the source image to displaying the stereoscopic image generated from it.
- next, the generation unit 16 acquires the parallax image 30a (FIG. 11) for the upper image 23a that is the source image (step S170 in FIG. 33).
- here, a "parallax image" is an image in which the parallaxes between each pixel of one of two images obtained by photographing the same subject from different viewpoints (directions) and the corresponding pixel of the other image are arranged according to the pixel arrangement of the one image. Since parallax and distance can be converted into each other as shown in equation (1), when the distance of each point on the subject is used instead of the parallax, the parallax image is also referred to as a "distance image".
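Equation (1) itself is not reproduced in this excerpt; it corresponds to the standard stereo triangulation relation in which distance and parallax are inversely proportional. The sketch below assumes that relation (Z = f · B / d); the function names and the example focal length and baseline values are illustrative, not taken from the patent.

```python
def parallax_to_distance(d_pixels, focal_length_px, baseline_m):
    # Standard stereo triangulation: Z = f * B / d.
    if d_pixels <= 0:
        raise ValueError("parallax must be positive")
    return focal_length_px * baseline_m / d_pixels

def distance_to_parallax(z_m, focal_length_px, baseline_m):
    # The inverse conversion: d = f * B / Z.
    return focal_length_px * baseline_m / z_m

# A larger parallax corresponds to a nearer subject point.
near = parallax_to_distance(8, focal_length_px=3000, baseline_m=0.05)
far = parallax_to_distance(4, focal_length_px=3000, baseline_m=0.05)
print(near, far)  # 18.75 37.5
```

This mutual convertibility is why the text treats "parallax image" and "distance image" as interchangeable representations.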
- each corresponding pixel of the other image corresponding to each pixel of the one image is identified by performing corresponding point search processing on the one image and the other image using a correlation calculation method. As the correlation calculation method used for the corresponding point search processing, for example, an NCC (Normalized Cross Correlation) method, a SAD (Sum of Absolute Differences) method, or a POC (Phase Only Correlation) method is employed.
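As a concrete illustration of the SAD measure on a single scanline, the sketch below matches a window around a reference pixel against candidate positions in the other image and returns the offset with the lowest cost. It is a minimal sketch only: the function name, window size, and search range are hypothetical, and a real implementation (NCC or POC included) would search 2-D windows and sub-pixel positions.

```python
def sad_disparity_row(ref_row, other_row, x, window=2, max_d=16):
    """Find the parallax of pixel x in ref_row by SAD block matching.

    Compares windows of (2*window + 1) samples against candidate
    positions in other_row, up to max_d samples away, and returns
    the offset with the minimum sum of absolute differences.
    """
    def patch(row, cx):
        # Clamp indices at the borders so windows never run off the row.
        return [row[min(max(cx + k, 0), len(row) - 1)]
                for k in range(-window, window + 1)]

    ref = patch(ref_row, x)
    best_d, best_cost = 0, float("inf")
    for d in range(0, max_d + 1):
        cand = patch(other_row, x - d)  # candidate match d samples away
        cost = sum(abs(a - b) for a, b in zip(ref, cand))
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

# The reference row contains the same intensity pattern as the other
# row, displaced 3 pixels to the right.
ref = [0, 0, 0, 0, 0, 10, 80, 90, 20, 0, 0, 0]
oth = [0, 0, 10, 80, 90, 20, 0, 0, 0, 0, 0, 0]
print(sad_disparity_row(ref, oth, 6))  # 3
```

Running this for every pixel of the reference image yields the parallax image described in the text.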
- the generation unit 16 acquires the parallax image 30a (FIG. 11) by performing a corresponding point search process on the upper image 23a and the lower image 24a (FIG. 10) with the upper image 23a (FIG. 9) as a reference.
- note that the usefulness of the present invention is not impaired even when a method that does not use the image not specified as the source image is employed.
- generally, the saturation of a subject image is higher the closer the subject is and lower the farther the subject is. Therefore, the distance corresponding to each pixel of the source image can be estimated based on the saturation of the source image.
- various other methods capable of estimating the distance from a single source image can also be adopted, such as a method of estimating the distance according to the position of a part within the screen.
- the distance acquired by estimation from a single source image may be used after being converted into parallax by equation (1), for example. That is, the usefulness of the present invention is not impaired even if the generation unit 16 performs generation process A based on distance information between the stereo camera 300 and the subject estimated from only one of the upper image 23 and the lower image 24.
- the term “distance information” is used as a general term for parallax and distance.
- when the generation unit 16 performs generation process A based on distance information estimated from one of the upper image 23 and the lower image 24, the usefulness of the present invention is not impaired even if only one of the upper image 23 and the lower image 24, that is, only the source image, is photographed.
- when the parallax image 30a has been acquired, the generation unit 16 performs smoothing processing on the parallax image 30a (step S180 in FIG. 33) and generates the parallax image 31a (FIG. 12) as the result of the smoothing.
- the smoothing processing, for example, reduces random noise in the parallax image 30a caused by search errors in the corresponding point search processing, and interpolates the parallax of pixels for which no corresponding point was obtained from neighboring pixels. As the smoothing processing, smoothing filter processing using various smoothing filters such as an averaging filter, a median filter, or a Gaussian filter is employed.
- the smoothing strength of the smoothing filter can be changed by changing the size of the filter, for example.
- when an averaging filter is employed as the smoothing filter and the image size of the parallax image 30a is, for example, 3456 pixels × 2592 pixels, averaging is performed with the same number of pixels (number of elements) in the vertical and horizontal directions, and a value of about 150 to 300 pixels (elements) is employed as the filter size in each direction.
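The averaging (box) filter described above can be sketched as follows. This is illustrative only: the function name is hypothetical, a tiny 3 × 3 window is used instead of the 150–300 elements suggested for a full-resolution image, and borders are handled by clamping, which the patent does not specify.

```python
def smooth_parallax(parallax, size=3):
    """Box (averaging) filter over a 2-D parallax map with a size x size window.

    Each output value is the mean of the surrounding window; border
    indices are clamped to the nearest valid pixel.
    """
    h, w = len(parallax), len(parallax[0])
    r = size // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    acc += parallax[yy][xx]
            out[y][x] = acc / (size * size)
    return out

# A noisy parallax spike is spread out and damped by the averaging.
noisy = [[4, 4, 4], [4, 13, 4], [4, 4, 4]]
print(smooth_parallax(noisy)[1][1])  # 5.0
```

A median filter would instead discard the spike entirely, which is why the text lists several filter choices.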
- even if the smoothing processing in step S180 is not performed on the parallax image 30a, the usefulness of the present invention is not impaired. The same applies to the smoothing processing in step S220 in FIG. 34.
- next, the generation unit 16 performs parallax adjustment for adjusting the parallax values of the parallax image 31a (step S190 in FIG. 33).
- the generation unit 16 then generates the left-eye image 27a (FIG. 13) and the right-eye image 28a (FIG. 14) based on the parallax image 31a (FIG. 12) subjected to the parallax adjustment (step S200 in FIG. 33).
- FIG. 15 is a diagram illustrating a concept of an example of a procedure for generating a stereoscopic image in the generation process A.
- FIG. 15 shows an example of a concept of a procedure for generating the left-eye image 27c and the right-eye image 28c by the generation process A based on the upper image 23c and the lower image 24c.
- the parallax adjustment processing step (step S190 in FIG. 33) in generation process A will be described in detail below with reference to FIG. 15 as appropriate.
- in FIG. 15, for convenience of illustration, the number of pixels in the main scanning direction (horizontal scanning direction) and the number of pixels in the sub-scanning direction (vertical scanning direction) of each image differ from the actual number of pixels of the image sensor.
- in FIG. 15, one coordinate system representing the orientation of the image coordinate system of the images is shown, and the origin of the image coordinate system of each image is at the upper-left corner (−X end and −Y end) of each image.
- the number of pixels and the image coordinate system of each image in FIGS. 16 and 17 described later are also the same as those in FIG. 15.
- one of the purposes of the parallax adjustment is to suppress the occurrence of a phenomenon in which stereoscopic viewing becomes impossible because the amount of parallax is too large relative to the number of pixels in the horizontal direction of the stereoscopic image, so that the observer of the stereoscopic image cannot associate the corresponding points in the left-eye image 27 and the right-eye image 28 with each other.
- another purpose is to set the shortest distance perceived by the observer of the stereoscopic image to the distance of the display unit 43. Further, even if the parallax adjustment in step S190 is not performed, the usefulness of the present invention is not impaired. The same applies to the parallax adjustment in step S230 in FIG. 34.
- in the upper image 23c, a pixel p1 at coordinates (10, 28) and a pixel p2 at coordinates (40, 24) are set. In the lower image 24c, a pixel p3 at coordinates (10, 20) and a pixel p4 at coordinates (40, 20) are set. The pixels p1 and p3 correspond to the same point on the subject, and the pixels p2 and p4 also correspond to the same point on the subject.
- the parallax image 31c (FIG. 15) is the parallax image between the upper image 23c and the lower image 24c, with the upper image 23c as the reference, when the upper image 23c is set as the source image, and has been subjected to the smoothing processing in step S180.
- the parallax values corresponding to the pixels p1 and p2 of the upper image 23c are 8 and 4, respectively.
- the pixel p1 and the pixel p2 are pixels corresponding to a point on the subject closest to the stereo camera 300 and a point farthest from the stereo camera 300, respectively.
- the parallax 8 is the maximum parallax in the parallax image 31c, and the parallax 4 is the minimum parallax. Further, the relationship between the pixel q1 and the pixel q2 in FIG. 16 to be described later is the same as that of the pixel p1 and the pixel p2.
- the parallax image 32c (FIG. 15) is a parallax image having, as its parallax values, the parallaxes d2 obtained by adjusting each parallax value in the parallax image 31c according to equations (2) and (3). The parallax values in the parallax image 32c are obtained by setting the value of Wmax in equation (2) to 2, that is, the value of k1 to 0.5. In the parallax image 32c, the parallax values corresponding to the pixels p1 and p2 of the upper image 23c are 4 and 2, respectively.
- in equation (3), the parallax values in the parallax image 31c are adjusted so that the maximum width of the parallax distribution in the parallax image 32c becomes Wmax, for the purpose of suppressing the phenomenon in which stereoscopic viewing becomes impossible because the parallax values are too large in the stereoscopic image. Note that the usefulness of the present invention is not impaired even if the parallaxes smaller than Wmax among the parallaxes in the parallax image 31c are not adjusted according to equation (3).
- the parallax image 33c (FIG. 15) is a parallax image having, as its parallax values, the parallaxes d3 obtained by adjusting each parallax value in the parallax image 32c according to equation (4). In the parallax image 33c, the parallax values corresponding to the pixels p1 and p2 of the upper image 23c are 0 and −2, respectively.
- in equation (4), each parallax value in the parallax image 32c is shifted uniformly for the purpose of setting the shortest distance perceived by the observer of the stereoscopic image to the distance of the display unit 43.
- the parallax adjustment process is completed by generating the parallax image 33c.
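Equations (2) to (4) are not reproduced in this excerpt, but the worked example (parallaxes 8 and 4, Wmax = 2, k1 = 0.5, results 4 and 2, then 0 and −2) is consistent with scaling by k1 = Wmax / (parallax width) followed by subtracting the scaled maximum. The sketch below implements that reconstruction; the function name and the capping of k1 at 1 (so parallaxes already narrower than Wmax are left unscaled, as the text suggests) are inferred, not quoted from the patent.

```python
def adjust_parallax(parallax_map, w_max=2.0):
    """Parallax adjustment reconstructed from the worked example in FIG. 15.

    Step 1 (eqs. (2)/(3) as inferred): scale all parallaxes by
    k1 = w_max / (d_max - d_min) so the width of the parallax
    distribution does not exceed w_max.
    Step 2 (eq. (4) as inferred): shift so the maximum parallax becomes 0,
    placing the nearest subject point at the display unit 43 distance.
    """
    flat = [d for row in parallax_map for d in row]
    d_max, d_min = max(flat), min(flat)
    width = d_max - d_min
    k1 = min(1.0, w_max / width) if width > 0 else 1.0
    scaled_max = k1 * d_max
    return [[k1 * d - scaled_max for d in row] for row in parallax_map]

# Reproduces the example: parallaxes 8 and 4 become 0 and -2 (Wmax = 2).
print(adjust_parallax([[8, 4]]))  # [[0.0, -2.0]]
```

After this adjustment, the maximum (nearest-point) parallax is zero and all other parallaxes are negative, matching the parallax image 33c.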
- the parallax adjustment processing is performed according to equations (2) to (4), as in the parallax adjustment step in generation process A.
- stereoscopic image generation processing step of generation process A: the stereoscopic image generation processing step (step S200 in FIG. 33) in generation process A will be described in detail below with reference to FIG. 15 as appropriate.
- (A-1) Calculation of pixel shift values: when the parallax adjustment (step S190 in FIG. 33) is completed, the stereoscopic image generation processing step (step S200 in FIG. 33) is started. First, pixel shift value calculation processing in generation process A is performed according to equations (5) and (6). The pixel shift value calculation processing distributes each parallax in the parallax-adjusted parallax image (for example, the parallax image 33c in FIG. 15) corresponding to the single source image (for example, the upper image 23c in FIG. 15) between the left-eye image 27 and the right-eye image 28. The pixel shift value dL for a pixel of the left-eye image 27 is calculated by equation (5), and the pixel shift value dR for a pixel of the right-eye image 28 is calculated by equation (6).
- the values of the sharing ratios rL and rR are, for example, each 0.5. When the source image itself is used as one of the left-eye image 27 and the right-eye image 28, the sharing ratio for that image is 1 and the sharing ratio for the other is 0. By appropriately setting the values of the sharing ratios rL and rR in the interval from 0 to 1, the ratio at which the parallax in the parallax-adjusted parallax image is distributed to the left-eye image 27 and the right-eye image 28 can be adjusted.
- the left-eye image 27c and the right-eye image 28c in FIG. 15 are stereoscopic images generated using the parallax image 33c.
- the pixels p1 and p2 in the upper image 23c, which is the source image, are shifted to the pixel p5 at coordinates (10, 28) and the pixel p6 at coordinates (39, 24) in the left-eye image 27c, respectively. Further, the pixels p1 and p2 are shifted to the pixel p7 at coordinates (10, 28) and the pixel p8 at coordinates (41, 24) in the right-eye image 28c, respectively.
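Equations (5) to (7) are not reproduced in this excerpt, but the coordinates of p5 to p8 above are consistent with dL = rL · d3, dR = −rR · d3, and a new X coordinate equal to the source X plus the shift. The sketch below implements that inferred reconstruction; the function names are hypothetical.

```python
def pixel_shift_values(d3, r_l=0.5, r_r=0.5):
    """Pixel shift values as inferred from the example in FIG. 15.

    With adjusted parallax d3, the left-eye shift is dL = rL * d3
    (eq. (5) as inferred) and the right-eye shift is dR = -rR * d3
    (eq. (6) as inferred); the sharing ratios rL and rR split the
    parallax between the two eye images.
    """
    return r_l * d3, -r_r * d3

def shifted_x(x, shift):
    # Eq. (7) as inferred: the new X coordinate is the source X plus the shift.
    return round(x + shift)

# Pixel p2 at x = 40 with adjusted parallax -2 moves to x = 39 in the
# left-eye image (p6) and x = 41 in the right-eye image (p8).
d_l, d_r = pixel_shift_values(-2)
print(shifted_x(40, d_l), shifted_x(40, d_r))  # 39 41
```

With rL = rR = 0.5 the parallax is split symmetrically; setting one ratio to 1 and the other to 0 would leave the source image unchanged as one of the eye images, as described in the text.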
- FIG. 35 is a diagram illustrating an example of an operation flow S10 in which the information processing apparatus 100A generates a stereoscopic image by shifting pixels.
- first, the partial image 58a (FIG. 18) for one line in the horizontal scanning direction (X-axis direction) at the upper end (−Y direction end) of the source image is selected (step S20 in FIG. 35).
- FIG. 18 is a diagram illustrating an example of the correspondence between part of the pixels 7a to 7j of the partial image 58a for one line in the horizontal scanning direction (X-axis direction) at the upper end (−Y direction end) of the source image and part of the pixels 8a to 8j of the partial image 58b for one line in the horizontal scanning direction at the upper end (−Y direction end) of the left-eye image 27 corresponding to the source image.
- the partial image 58a and the partial image 58b correspond to the same part of the subject.
- in FIG. 18, the pixels 7a to 7j and the pixels 8a to 8j are each displayed with shading classified according to their pixel values.
- FIG. 19 is a diagram showing an example of the correspondence between the pixel coordinates and pixel shift values of the pixels 7a to 7j of the partial image 58a (FIG. 18) of the source image and the pixel coordinates of the pixels 8a to 8j of the partial image 58b (FIG. 18) of the stereoscopic image (left-eye image 27).
- the first row and the fifth row in FIG. 19 show pixel numbers for specifying the pixels 7a to 7j of the partial image 58a and pixel numbers for specifying the pixels 8a to 8j of the partial image 58b.
- the X coordinate of each pixel 7a to 7j is shown in association with the pixel number shown in the first row.
- pixel shift values corresponding to the pixels 7a to 7j are shown in association with the pixel numbers shown in the first row.
- the X coordinate of each pixel of the partial image 58b is calculated by the equation (7) using the pixel shift value calculated by the equation (5).
- the X coordinates of the pixels 8a to 8j calculated by the equation (7) are shown in association with the pixel numbers shown in the fifth row.
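The coordinate update of equation (7) can be illustrated with a short sketch. Equation (7) is not reproduced in this excerpt, so both the sign convention and the rounding to whole pixels below are assumptions for illustration only.

```python
def new_x(x_src, shift):
    """Eq. (7)-style coordinate update (sketch, not the patent's equation):
    the destination X coordinate in the left-eye image is taken to be the
    source X coordinate minus the pixel shift value, in whole pixels
    (the deformation is performed with the pixel size as the minimum unit).
    """
    return x_src - round(shift)
```

For example, a source pixel at X = 10 with a pixel shift value of 2 would land at X = 8 under this assumed convention.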
- Next, the processing in step S40 will be described using as examples the pixels 7a to 7j of the partial image 58a and the pixels 8a to 8j of the partial image 58b shown in FIG. 18.
- as shown in FIG. 18, the pixels 7a, 7b, 7c, 7d, 7e, 7f, 7g, 7h, 7i, and 7j of the partial image 58a correspond to the pixels 8a, 8b, 8b, 8c, 8d, 8d, 8e, 8g, 8i, and 8j of the partial image 58b, respectively. That is, the pixels 8a to 8j comprise three types of pixels: first type pixels to which one of the pixels 7a to 7j corresponds, second type pixels to which two of the pixels correspond, and third type pixels to which none of the pixels 7a to 7j corresponds.
- as the pixel value of a first type pixel, the pixel value of the pixel of the partial image 58a corresponding to that pixel is adopted. As the pixel value of a second type pixel, a representative value, for example the average, of the two pixel values of the partial image 58a corresponding to that pixel is adopted. As the pixel value of a third type pixel, for example, the pixel value of the pixel spatially closest to the third type pixel, among the pixels of the partial image 58b whose pixel values have been acquired based on the correspondence with the partial image 58a, is adopted.
- the partial image 58b is thus specified by the pixel coordinates (X coordinates) and the pixel values determined for each of its pixels.
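The handling of the three pixel types can be sketched as a small line-resampling routine. This is a hypothetical illustration, not the patent's implementation: source pixels are bucketed by their destination X coordinate, duplicate targets (second type) are averaged, and unassigned destination pixels (third type) copy the spatially closest filled pixel.

```python
def resample_line(src_vals, dst_x, width):
    """Fill one destination scan line from source pixels mapped to dst_x.

    src_vals[i] is the value of source pixel i; dst_x[i] is the destination
    X coordinate computed from the pixel shift value.  Hypothetical helper,
    not taken from the patent text.
    """
    buckets = {}
    for v, x in zip(src_vals, dst_x):
        if 0 <= x < width:
            buckets.setdefault(x, []).append(v)
    line = [None] * width
    for x, vs in buckets.items():
        # first type: one source pixel; second type: representative (average) of two
        line[x] = sum(vs) / len(vs)
    filled = [x for x in range(width) if line[x] is not None]
    for x in range(width):
        if line[x] is None and filled:
            # third type: copy the spatially closest pixel that already has a value
            nearest = min(filled, key=lambda f: abs(f - x))
            line[x] = line[nearest]
    return line
```

For instance, two source pixels landing on the same destination X are averaged, and a hole next to them inherits the nearest filled value.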
- when the processing of step S40 is completed, it is confirmed whether the processing for generating the corresponding partial images of the left-eye image 27 (steps S30 to S40) has been completed for all the horizontal lines (X-axis direction) of the source image (step S50 in FIG. 35). If, as a result of the confirmation in step S50, the processing has not been completed for all the horizontal lines, the line adjacent in the +Y direction to the processed line of the source image is selected as a new processing target (step S60 in FIG. 35), and the process returns to step S30. If, as a result of the confirmation in step S50, the processing for generating the partial images of the left-eye image 27 has been completed for all the horizontal lines, the generation processing of the left-eye image 27 ends.
- the deformation of the source image based on the parallax may be performed with the pixel size as the minimum unit. Accordingly, a stereoscopic image can be obtained as long as parallax is acquired in units of the pixel size. Even when, for example, parallax is acquired in subpixel units by performing a corresponding point search that determines parallax in units of subpixels smaller than the pixel size, and the source image is deformed based on that parallax, a stereoscopic image can still be obtained by applying the deformation amount in units of pixels, so the usefulness of the present invention is not impaired.
- the generation unit 16 may generate an intermediate image 59 by temporarily shifting each pixel of the upper image 23 in the vertical direction based on the parallax between the upper image 23 and the lower image 24, and use the intermediate image 59 as a new source image to generate the left-eye image 27 and the right-eye image 28.
- that is, even if the generation unit 16 generates an intermediate image 59 by spatially deforming either one of the upper image 23 and the lower image 24 in its image space and performs the generation process A with the intermediate image 59 as a new source image, the stereoscopic image 29 can be generated, so the usefulness of the present invention is not impaired.
- when the stereoscopic image generation process (step S200 in FIG. 33) is completed, the process proceeds to the process of displaying the stereoscopic image (step S250 in FIG. 34).
- the stereoscopic image display process is the same as the stereoscopic image display process in the generation process B described later.
- the stereoscopic image display process in the generation processes A and B will be described later.
- in the generation process A, either the first image 21 or the second image 22 is used as the source image even when the arrangement direction of the first camera 61 and the second camera 62 is oblique to the horizontal direction.
- ◎ About the generation process B: FIG. 30 is a diagram for explaining an example of the correspondence in the generation process B between the left image 25 and the right image 26 captured by the first camera 61 and the second camera 62, respectively, in the information processing apparatus 100A in the attitude shown in FIG. 2, and the left-eye image 27 and the right-eye image 28, which constitute the stereoscopic image.
- in the generation process B, the left image 25 and the right image 26 are specified as source images. Then, the left-eye image 27 is generated by spatially deforming the left image 25 in the image space of the left image 25, and the right-eye image 28 is generated by spatially deforming the right image 26 in the image space of the right image 26.
- FIG. 20 is a diagram illustrating a left image 25b as an example of the left image 25.
- FIG. 21 is a diagram illustrating a right image 26b as an example of the right image 26.
- FIG. 22 is a diagram illustrating a parallax image 30Lb as an example of the parallax image of the left image 25b.
- FIG. 23 is a diagram illustrating a parallax image 30Rb as an example of the parallax image of the right image 26b.
- FIG. 24 is a diagram illustrating a parallax image 31Lb as an example of the parallax image of the left image 25b.
- FIG. 25 is a diagram illustrating a parallax image 31Rb as an example of the parallax image of the right image 26b.
- FIG. 26 is a diagram illustrating a left-eye image 27b as an example of the left-eye image 27.
- FIG. 27 is a diagram illustrating a right-eye image 28b as an example of the right-eye image 28.
- in the generation process B, as in the generation process A, the generation unit 16 specifies the first image 21 and the second image 22 as source images, respectively.
- first, the generation unit 16 acquires a parallax image 30Lb of the first image 21, that is, the left image 25b, which is a source image, computed against the second image 22, that is, the right image 26b, with the left image 25b as the reference.
- similarly, the generation unit 16 acquires a parallax image 30Rb of the second image 22, that is, the right image 26b, which is a source image, computed against the left image 25b, with the right image 26b as the reference (step S210 in FIG. 34).
- These parallax images are acquired by the generation unit 16 performing a corresponding point search process in the same manner as the generation process A.
- in the parallax image generation processing in the information processing apparatus 100A, when a parallax image is generated with reference to either the left image 25b or the right image 26b, the parallax image is generated such that the parallax value becomes larger the closer the subject is to the stereo camera 300.
- when the parallax images 30Lb and 30Rb are acquired, the generation unit 16 applies a smoothing process to each of the parallax images 30Lb and 30Rb (step S220 in FIG. 34), and generates parallax images 31Lb and 31Rb (FIGS. 24 and 25, respectively) as a result of the smoothing.
- next, the generation unit 16 performs parallax adjustment for adjusting the parallax values of the parallax images 31Lb and 31Rb (step S230 in FIG. 34).
- next, the generation unit 16 generates the left-eye image 27b (FIG. 26) and the right-eye image 28b (FIG. 27) based on the parallax-adjusted parallax images 31Lb and 31Rb (step S240 in FIG. 34).
- FIGS. 16 and 17 are diagrams showing the concept of an example of a procedure for generating a stereoscopic image in the generation process B.
- FIGS. 16 and 17 show the concept of an example of a procedure in which the left-eye image 27d and the right-eye image 28d are generated by the generation process B based on the left image 25d and the right image 26d.
- the parallax adjustment processing step (step S230 in FIG. 34) in the generation process B will be described in detail with reference to FIGS. 16 and 17 as appropriate.
- a pixel q1 at coordinates (24, 10) and a pixel q2 at coordinates (28, 40) are set in the left image 25d.
- a pixel q3 at coordinates (20, 10) and a pixel q4 at coordinates (20, 40) are set. Pixels q1 and q3 correspond to the same point on the subject, and pixels q2 and q4 also correspond to the same point on the subject.
- the parallax image 31Ld (FIG. 16) is the parallax image between the left image 25d and the right image 26d with the left image 25d as the reference when the left image 25d is set as the source image, and has been subjected to the smoothing process of step S220.
- the parallax image 31Rd (FIG. 16) is the parallax image between the right image 26d and the left image 25d with the right image 26d as the reference when the right image 26d is set as the source image, and has been subjected to the smoothing process of step S220.
- the parallax values corresponding to the pixels q1 and q2 of the left image 25d are 4 and 8, respectively.
- the pixels q1 and q2 are pixels corresponding to a point on the subject farthest from the stereo camera 300 and a point on the closest subject, respectively. Therefore, the parallax 8 is the maximum parallax in the parallax image 31Ld, and the parallax 4 is the minimum parallax.
- the parallax values corresponding to the pixels q3 and q4 of the right image 26d are 4 and 8, respectively.
- the pixel q3 and the pixel q4 are pixels corresponding to a point on the subject farthest from the stereo camera 300 and a point on the closest subject, respectively. Therefore, the parallax 8 is the maximum parallax in the parallax image 31Rd, and the parallax 4 is the minimum parallax.
- the parallax images 32Ld and 32Rd are parallax images having the parallax d2 as a parallax value as a result of adjusting the parallax values in the parallax images 31Ld and 31Rd by the equations (2) and (3), respectively.
- the parallax values in the parallax images 32Ld and 32Rd are obtained by setting the value of Wmax in the equation (2) to 2, that is, the value of k1 is 0.5.
- the parallax values corresponding to the pixels q1 and q2 of the left image 25d are 2 and 4, respectively.
- the parallax values corresponding to the pixels q3 and q4 of the right image 26d are 2 and 4, respectively.
- the parallax images 33Ld (FIG. 17) and 33Rd (FIG. 17) are parallax images having the parallax d3 as a parallax value as a result of adjusting the parallax values in the parallax images 32Ld and 32Rd by the equation (4), respectively.
- the parallax values corresponding to the pixels q1 and q2 of the left image 25d are -2 and 0, respectively.
- the parallax values corresponding to the pixels q3 and q4 of the right image 26d are -2 and 0, respectively.
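The two-stage adjustment that takes the parallaxes (4, 8) to (2, 4) and then to (-2, 0) can be sketched numerically. Equations (2) to (4) are not reproduced in this excerpt, so the concrete form below, a scale by k1 followed by a constant offset, is only an assumption consistent with the worked values.

```python
def adjust_parallax(d, k1=0.5, offset=-4):
    """Two-stage parallax adjustment sketch (NOT the patent's equations).

    Matches the worked example of FIGS. 16 and 17: parallaxes (4, 8) are
    compressed to (2, 4) with k1 = 0.5 (eq. (2)/(3)-style), then shifted
    to (-2, 0) with a constant offset of -4 (eq. (4)-style), which moves
    the zero-parallax plane.
    """
    d2 = k1 * d        # range compression of the parallax values
    d3 = d2 + offset   # offset applied to the compressed parallax
    return d2, d3
```

Under these assumed parameters, the maximum parallax 8 becomes 0 and the minimum parallax 4 becomes -2, as in the parallax images 33Ld and 33Rd.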
- ◎ Stereoscopic image generation processing step of the generation process B: The stereoscopic image generation processing step (step S240 in FIG. 34) in the generation process B will be described in detail below with reference to FIGS. 16 and 17 as appropriate.
- (B-1) Calculation of pixel shift values: When the parallax adjustment (step S230 in FIG. 34) is completed, the stereoscopic image generation processing step (step S240 in FIG. 34) is started. First, the pixel shift values in the generation process B are calculated by equations (9) and (10).
- this pixel shift value calculation processing distributes the difference between each parallax in the parallax-adjusted parallax images (parallax images 33Ld and 33Rd) corresponding to the two source images (the left image 25d and the right image 26d) and each parallax in the initial parallax images corresponding to the same two source images, to the left-eye image 27d and the right-eye image 28d based on the pixel shift sharing ratio rL of the left-eye image 27d and the pixel shift sharing ratio rR of the right-eye image 28d.
- the generation process B unlike the generation process A, as shown in FIG. 30, a left eye image 27d is generated based on the left image 25d, and a right eye image 28d is generated based on the right image 26d. Therefore, as shown in the equations (9) and (10), the method for calculating the allocated parallax is different from the generation process A.
- the pixel shift value dL for the pixel of the left-eye image 27d is calculated by the equation (9)
- the pixel shift value dR for the pixel of the right-eye image 28d is calculated by the equation (10).
- the operation and setting method of the sharing ratios rL and rR are the same as those for the sharing ratios rL and rR in equations (5) and (6).
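The distribution defined by equations (9) and (10) can be sketched as follows. The equations themselves are not reproduced in this excerpt, so the form below, the sharing ratios applied to the difference between the adjusted and the initial parallax, is a hedged reconstruction from the surrounding description.

```python
def pixel_shift_b(d_initial, d_adjusted, r_l=0.5, r_r=0.5):
    """Sketch of the generation-process-B split (eq. (9)/(10)-style,
    signs and exact form are assumptions).

    The quantity distributed to the two eye images is the difference
    between the adjusted parallax (e.g. parallax image 33Ld/33Rd) and the
    initial parallax (e.g. parallax image 31Ld/31Rd), divided by the
    sharing ratios rL and rR (normally rL + rR = 1).
    """
    delta = d_adjusted - d_initial
    return r_l * delta, r_r * delta   # (dL, dR)
```

With equal sharing ratios, the maximum parallax 8 adjusted down to 0 yields a shift of -4 pixels for each eye image in this sketch.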
- (B-3) Generation of the stereoscopic image by pixel shifting: When the X coordinates of the pixels of the left-eye image 27d and the right-eye image 28d corresponding to each pixel in the source images, that is, the left image 25d and the right image 26d, are calculated, each pixel of each source image is shifted based on the calculated X coordinates, thereby generating the left-eye image 27d and the right-eye image 28d (FIG. 17, respectively).
- the pixels q1 and q2 in the left image 25d that is the source image are shifted to the pixel q5 at the coordinates (21, 10) and the pixel q6 at the coordinates (24, 40), respectively.
- further, the pixels q3 and q4 in the right image 26d, which is the source image, are shifted to the pixel q7 at coordinates (23, 10) and the pixel q8 at coordinates (24, 40), respectively.
- generation of the entire image of the left-eye image 27d and the right-eye image 28d is performed by the same process as the generation method in the generation process A described above with reference to FIGS.
- when the stereoscopic image generation process (step S200 in FIG. 33) in the generation process B is completed, the process proceeds to the process of displaying the stereoscopic image, which is common to the generation processes A and B (step S250 in FIG. 34).
- as described above, the information processing apparatus 100A determines the geometric relationship between the arrangement direction of the first camera 61 and the second camera 62 and the horizontal direction, and generates a stereoscopic image according to the determination result. A group of subject images having parallax in the horizontal direction of the subject can thereby be generated as a stereoscopic image of the subject.
- ◎ Stereoscopic image display processing and storage processing: When the stereoscopic image 29 (FIG. 5) is generated by the generation process A or B, the CPU 11A displays the generated stereoscopic image 29, that is, the left-eye image 27 (FIG. 5) and the right-eye image 28 (FIG. 5), on the display unit 43 (step S250 in FIG. 34). Furthermore, in response to an operation instructing storage of the image from the display unit 43, the CPU 11A stores the generated stereoscopic image 29 and the posture information 54 of the information processing apparatus 100A at the time the source image of the stereoscopic image 29 was captured in the storage device 46 in association with each other (step S260 in FIG. 34).
- the posture information 54 (FIG. 5) is generated by the detection unit 15 based on the determination information 55 (FIG. 5) acquired by the acquisition unit 12.
- FIG. 31 is a diagram showing an example of correspondence between the stereoscopic image 29 and the posture information 54 of the information processing apparatus 100A.
- FIG. 31 shows five sets of stereoscopic images in the second to fifth rows. Each stereoscopic image is associated with four index values 0 to 3, each indicating the posture of the information processing apparatus 100A when the source image of the stereoscopic image is captured.
- in FIG. 31, the first column shows the file name under which the left-eye image 27 is stored, the second column shows the file name under which the right-eye image 28 is stored, and the third column shows the index value of the posture information of the information processing apparatus 100A. The data in each row from the second row onward are stored in the storage device 46 in association with each other.
- the file name is automatically generated based on, for example, a predetermined naming rule.
- starting from a state in which a straight line orthogonal to both the optical axis of the stereo camera 300 and the arrangement direction of the first camera 61 and the second camera 62 lies in a vertical plane together with the optical axis, when the information processing apparatus 100A is rotated successively by 90 degrees in a predetermined rotation direction around a rotation axis parallel to the optical axis, the arrangement direction of the first camera 61 and the second camera 62 takes one of two directions, namely the vertical direction or the horizontal direction. Further, the information processing apparatus 100A takes one of four postures, as illustrated in FIG. 31, for example.
- depending on the set operation mode, the detection unit 15 performs either a process of detecting, out of the two directions of the arrangement direction of the first camera 61 and the second camera 62, the direction corresponding to the direction of that straight line at the time of shooting by the stereo camera 300, or a process of detecting, out of the four postures of the information processing apparatus 100A, the posture corresponding to the posture of the information processing apparatus 100A at the time of shooting by the stereo camera 300.
- the detection of one of the two directions of the arrangement direction of the first camera 61 and the second camera 62 is a process of specifying the posture information of the information processing apparatus 100A from two postures that differ from each other by 90 degrees around the optical axis of the stereo camera 300.
- the process of detecting the posture corresponding to the posture of the information processing apparatus 100A at the time of shooting by the stereo camera 300 is a process of specifying the posture information of the information processing apparatus 100A from four postures that differ from each other by 90 degrees around the optical axis of the stereo camera 300.
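One way to obtain such a four-posture classification is to quantize the direction of gravity measured by an accelerometer-type attitude sensor such as the attitude sensor 47. The patent does not specify this computation; the following is a hypothetical sketch, and the mapping of angles to the index values 0 to 3 of FIG. 31 is an assumption.

```python
import math

def posture_index(ax, ay):
    """Quantize a 2-axis accelerometer reading (device-frame X/Y components
    of gravity, the plane perpendicular to the optical axis) into one of
    four 90-degree postures, index 0-3.

    Hypothetical illustration only: neither this computation nor the
    angle-to-index mapping is specified in the patent text.
    """
    angle = math.degrees(math.atan2(ay, ax)) % 360.0
    # Center each 90-degree sector on 0, 90, 180, 270 degrees.
    return int(((angle + 45.0) % 360.0) // 90.0)
```

Restricting the same quantization to two sectors 90 degrees apart would give the coarser two-direction detection described above.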
- the arrangement direction of the first camera 61 and the second camera 62, or the posture of the information processing apparatus 100A, detected by the detection unit 15 is associated by the CPU 11A with the stereoscopic image 29 as the posture information of the information processing apparatus 100A at the time the source image was captured. The CPU 11A then temporarily stores the posture information 54 associated with the stereoscopic image 29 in the RAM 45, and when it detects that an operation signal 52 from the operation unit 42 instructing storage of the generated stereoscopic image relates to the stereoscopic image 29, it permanently stores the stereoscopic image 29 and the posture information 54 at the time the source image of the stereoscopic image was captured in the storage device 46, in association with each other.
- when the posture information of the information processing apparatus 100A stored in association with the stereoscopic image 29 indicates one of the two directions described above, it can be determined at least whether the arrangement direction of the first camera 61 and the second camera 62 was horizontal when the source image was captured. The information processing apparatus 100A can therefore selectively execute the generation process A or the generation process B for generating the stereoscopic image 29 according to the determination result regarding the arrangement direction. Accordingly, the CPU 11A can generate a group of subject images having parallax in the horizontal direction of the real subject as a stereoscopic image of the subject, regardless of whether the arrangement direction is horizontal or non-horizontal.
- when the posture information of the information processing apparatus 100A stored in association with the stereoscopic image 29 indicates one of the four postures described above, for example, when the stored stereoscopic image is displayed on the display unit 43 and the posture of the information processing apparatus 100A at the time of display differs from its posture when the source image of the stereoscopic image 29 was captured, the CPU 11A can perform the necessary control of the display unit 43 based on the posture information 54 of the information processing apparatus 100A acquired by the detection unit 15, so that the direction of the parallax of the stereoscopic image observed by the viewer and the orientation of the stereoscopic image match those of the real subject. That is, according to the information processing apparatus 100A, regardless of whether the arrangement direction of the first camera 61 and the second camera 62 is the horizontal direction or a direction different from the horizontal direction, a stereoscopic image can be generated and displayed such that the observer can view it without feeling uncomfortable about the orientation of the image.
- even if the above-described information processing apparatus 100A is, for example, a digital still camera or a digital video camera each provided with the stereo camera 300 and the attitude sensor 47, the usefulness of the present invention is not impaired.
Abstract
Description
<◎ External configuration of the information processing apparatus 100A:>
FIGS. 1 to 3 are schematic diagrams showing an outline of the external configuration of the information processing apparatus 100A according to an embodiment of the present invention. In FIG. 1 and the subsequent figures, three mutually orthogonal axes X, Y, and Z, or two mutually orthogonal axes X and Y, are added as appropriate to clarify the directional relationships.
Next, the configuration and operation of the stereo camera 300 will be described. FIG. 6 is a diagram showing an example of the main functional configuration of the stereo camera 300 provided in the information processing apparatus 100A according to the embodiment. As shown in FIG. 6, the stereo camera 300 mainly includes a first camera 61 and a second camera 62. The first camera 61 and the second camera 62 are separated by a predetermined baseline length. The first camera 61 mainly includes an imaging optical system 72a, an image sensor 75a, and a control processing circuit 85a. The second camera 62 mainly includes an imaging optical system 72b, an image sensor 75b, and a control processing circuit 85b.
FIG. 4 is a block diagram showing an example of the main configuration of the information processing apparatus 100A according to the embodiment. As shown in FIG. 4, the information processing apparatus 100A mainly includes a CPU 11A, an input/output unit 41, an operation unit 42, a display unit 43, a ROM 44, a RAM 45, a storage device 46, an attitude sensor 47, and the stereo camera 300 described above.
FIG. 5 is a block diagram showing an example of the main functional configuration of the information processing apparatus 100A according to the embodiment. As shown in FIG. 5, the information processing apparatus 100A generates a stereoscopic image 29 of a subject, that is, a left-eye image 27 and a right-eye image 28, based on at least one of a first image 21 obtained by photographing the subject with the first camera 61 (FIG. 1) of the stereo camera 300 and a second image 22 obtained by photographing the subject with the second camera 62 (FIG. 1).
When the shooting button provided on the operation unit 42 is operated with the position and attitude of the information processing apparatus 100A adjusted, an operation signal instructing the stereo camera 300 to start a shooting operation is supplied as the operation signal 52 to the control unit 13 via the acquisition unit 12, as shown in FIG. 5. The control unit 13 then supplies a control signal 56 (FIG. 5) that causes the stereo camera 300 to perform the shooting operation to each of the stereo camera 300 and the acquisition unit 12.
The generated first image 21 and second image 22 of the subject are supplied to the acquisition unit 12 via the input/output unit 41 (FIG. 4) and are acquired by the acquisition unit 12 (step S110 in FIG. 32). When the first image 21 and the second image 22 are acquired, the acquisition unit 12 acquires determination information 55 (FIG. 5) with which the determination unit 14 determines the geometric relationship between the arrangement direction of the first camera 61 and the second camera 62 and the horizontal direction (step S120). Specifically, the arrangement direction of the first camera 61 and the second camera 62 is, for example, the direction of the baseline of the stereo camera 300 connecting the optical center 73a (FIG. 6) and the optical center 73b (FIG. 6).
Based on the supplied determination information 55, the determination unit 14 determines whether the arrangement direction of the first camera 61 and the second camera 62 is, for example, the horizontal direction, the vertical direction perpendicular to the horizontal direction, or a direction oblique to the horizontal direction. That is, the determination unit 14 determines the geometric relationship between the arrangement direction of the first camera 61 and the second camera 62 and the horizontal direction based on the determination information 55 (step S130 in FIG. 32). As processing for this determination, the determination unit 14 performs at least one of, for example, a process of determining the geometric relationship based on a signal in the operation signal 52 expressing the attitude of the information processing apparatus 100A at the time of shooting, a process of determining the geometric relationship based on the output signal 51, and a process of determining the geometric relationship by, for example, applying OCR (optical character recognition) to at least one of the first image 21 and the second image 22 to determine the orientation of characters in the image. As processing on the first image 21 and the second image 22, a process of determining the geometric relationship based on a feature portion, such as the arrangement direction of the two eyes of a person extracted by person recognition processing applied to the images, may also be adopted. The determination result of the geometric relationship determined by the determination unit 14 is supplied to the generation unit 16 as determination result information 53.
Next, the detection unit 15 detects, based on the determination information 55, attitude information 54 (FIG. 5) of the information processing apparatus 100A at the time when the images of the subject used by the generation unit 16 to generate the stereoscopic image 29 were captured by the stereo camera 300 (step S140). The detected attitude information 54 is temporarily stored in the RAM 45 in association with the stereoscopic image 29, and is permanently stored in the storage device 46 in response to a predetermined operation signal from the operation unit 42. The detection unit 15 also acquires the determination information 55 supplied to the acquisition unit 12 at times other than shooting by the stereo camera 300, such as when a stereoscopic image stored in the storage device 46 is displayed on the display unit 43, and detects the attitude information 54 (FIG. 5) of the information processing apparatus 100A.
When the attitude information 54 is detected, the generation unit 16 checks, based on the determination result information 53 supplied from the determination unit 14, whether the arrangement direction of the first camera 61 and the second camera 62 has been determined to be the horizontal direction (step S150 in FIG. 32).
Here, switching between the generation process A and the generation process B will be described using as an example a case in which the stereoscopic image 29 is generated with the attitude of the information processing apparatus 100A set to the attitude shown in FIG. 2 and to the attitude shown in FIG. 3, respectively. In FIG. 2, the arrangement direction of the first camera 61 and the second camera 62 is the horizontal direction (X-axis direction), and in FIG. 3, the arrangement direction is the vertical direction (Y-axis direction).
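The switching in step S150 amounts to a simple branch. The following hypothetical sketch (the function name and return values are placeholders, not from the patent) chooses the generation process B when the arrangement direction is horizontal, since that process generates the two eye images from the left and right images, and the generation process A otherwise.

```python
def select_generation_process(is_horizontal):
    """Step S150 branch (sketch): the generation process B is used when the
    arrangement direction of the first camera 61 and the second camera 62
    is horizontal (left/right images available); the generation process A
    is used otherwise (e.g. vertical or oblique arrangement).
    """
    return "B" if is_horizontal else "A"
```

In the example above, the attitude of FIG. 2 would select "B" and the attitude of FIG. 3 would select "A".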
FIG. 28 is a diagram for explaining an example of the correspondence in the generation process A between the upper image 23 and the lower image 24 captured by the second camera 62 and the first camera 61, respectively, in the information processing apparatus 100A in the attitude shown in FIG. 3, and the left-eye image 27 and the right-eye image 28, which constitute the stereoscopic image.
FIG. 15 is a diagram showing the concept of an example of a procedure for generating a stereoscopic image in the generation process A. FIG. 15 shows the concept of an example of a procedure in which the left-eye image 27c and the right-eye image 28c are generated by the generation process A based on the upper image 23c and the lower image 24c. The parallax adjustment processing step in the generation process A (step S190 in FIG. 33) will be described in detail below with reference to FIG. 15 as appropriate. In FIG. 15, for convenience of illustration, the number of pixels of each image in the main scanning direction (horizontal scanning direction) and the sub-scanning direction (vertical scanning direction) of the image sensor differs from the number of pixels the image sensor actually has. A single coordinate system representing the orientation of the image coordinate system of each image is provided, and the origin of the image coordinate system of each image is the upper left corner of the image (the -X end and the -Y end). The number of pixels and the image coordinate system of each image in FIGS. 16 and 17, described later, are the same as in FIG. 15.
The stereoscopic image generation processing step in the generation process A (step S200 in FIG. 33) will be described in detail below with reference to FIG. 15 as appropriate.
When the parallax adjustment (step S190 in FIG. 33) is completed, the stereoscopic image generation processing step (step S200 in FIG. 33) is started. First, the pixel shift values in the generation process A are calculated by equations (5) and (6). This pixel shift value calculation processing distributes each parallax in the parallax-adjusted parallax image (for example, the parallax image 33c in FIG. 15) corresponding to one source image (for example, the upper image 23c in FIG. 15) to the left-eye image 27 and the right-eye image 28 based on the pixel shift sharing ratio rL of the left-eye image 27 and the pixel shift sharing ratio rR of the right-eye image 28. The pixel shift value dL for the pixels of the left-eye image 27 is calculated by equation (5), and the pixel shift value dR for the pixels of the right-eye image 28 is calculated by equation (6). For example, when the parallax of the parallax image is distributed equally to the left-eye image 27 and the right-eye image 28, the values of the sharing ratios rL and rR are each 0.5. When the parallax of the parallax image is entirely assigned to either the left-eye image 27 or the right-eye image 28, the sharing ratio for that image is 1 and the sharing ratio for the other is 0. By adjusting the values of the sharing ratios rL and rR within the interval from 0 to 1 in this way, the ratio at which the parallax in the parallax-adjusted parallax image is distributed to the left-eye image 27 and the right-eye image 28 can be adjusted as appropriate. In equations (9) and (10), which define the pixel shift value calculation processing in the generation process B described later, the parallax to be distributed differs from that of the generation process A, but the operation and setting method of the sharing ratios rL and rR are the same as those for the sharing ratios rL and rR in equations (5) and (6).
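The distribution by equations (5) and (6) can be sketched as follows. The equations are not reproduced here, so the concrete form, each pixel shift value as the sharing ratio times the adjusted parallax, with rR = 1 - rL, is an assumption consistent with the behavior described above (equal split at rL = 0.5, all-to-one at rL = 1 or 0).

```python
def split_parallax_a(d, r_l):
    """Sketch of eq. (5)/(6) in the generation process A (assumed form):
    one adjusted parallax d is divided between the left- and right-eye
    images by the sharing ratios rL and rR = 1 - rL, yielding the pixel
    shift values (dL, dR).
    """
    r_r = 1.0 - r_l
    return r_l * d, r_r * d   # (dL, dR)
```

For example, with rL = 0.5 a parallax of 8 splits into shifts of 4 and 4, and with rL = 1 it is assigned entirely to the left-eye image.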
When the pixel shift values are calculated for each of the left-eye image 27 and the right-eye image 28, the X coordinates of the pixels of the left-eye image 27 are calculated by equation (7), and the X coordinates of the pixels of the right-eye image 28 are calculated by equation (8).
When the X coordinates of the pixels of the left-eye image 27 and the right-eye image 28 corresponding to each pixel in the source image are calculated, each pixel of the source image is shifted based on the calculated X coordinates, thereby generating the left-eye image 27 and the right-eye image 28.
FIG. 30 is a diagram for explaining an example of the correspondence in the generation process B between the left image 25 and the right image 26 captured by the first camera 61 and the second camera 62, respectively, in the information processing apparatus 100A in the attitude shown in FIG. 2, and the left-eye image 27 and the right-eye image 28, which constitute the stereoscopic image.
FIGS. 16 and 17 are diagrams showing the concept of an example of a procedure for generating a stereoscopic image in the generation process B. FIGS. 16 and 17 show the concept of an example of a procedure in which the left-eye image 27d and the right-eye image 28d are generated by the generation process B based on the left image 25d and the right image 26d. The parallax adjustment processing step in the generation process B (step S230 in FIG. 34) will be described in detail below with reference to FIGS. 16 and 17 as appropriate.
The stereoscopic image generation processing step in the generation process B (step S240 in FIG. 34) will be described in detail below with reference to FIGS. 16 and 17 as appropriate.
When the parallax adjustment (step S230 in FIG. 34) is completed, the stereoscopic image generation processing step (step S240 in FIG. 34) is started. First, the pixel shift values in the generation process B are calculated by equations (9) and (10). This pixel shift value calculation processing distributes the difference between each parallax in the parallax-adjusted parallax images (parallax images 33Ld and 33Rd) corresponding to the two source images (the left image 25d and the right image 26d) and each parallax in the initial parallax images (parallax images 31Ld and 31Rd) corresponding to the same two source images, to the left-eye image 27d and the right-eye image 28d based on the pixel shift sharing ratio rL of the left-eye image 27d and the pixel shift sharing ratio rR of the right-eye image 28d. In the generation process B, unlike the generation process A, the left-eye image 27d is generated based on the left image 25d and the right-eye image 28d is generated based on the right image 26d, as shown in FIG. 30. Accordingly, as shown in equations (9) and (10), the method of calculating the distributed parallax differs from that of the generation process A. The pixel shift value dL for the pixels of the left-eye image 27d is calculated by equation (9), and the pixel shift value dR for the pixels of the right-eye image 28d is calculated by equation (10). Although the parallax to be distributed in equations (9) and (10) differs from that of the generation process A, the operation and setting method of the sharing ratios rL and rR are the same as those for the sharing ratios rL and rR in equations (5) and (6).
When the pixel shift values are calculated for each of the left-eye image 27d and the right-eye image 28d, the X coordinates of the pixels of the left-eye image 27d are calculated by equation (11), and the X coordinates of the pixels of the right-eye image 28d are calculated by equation (12).
When the X coordinates of the pixels of the left-eye image 27d and the right-eye image 28d corresponding to each pixel in the source images, that is, the left image 25d and the right image 26d, are calculated, each pixel of each source image is shifted based on the calculated X coordinates, thereby generating the left-eye image 27d and the right-eye image 28d (FIG. 17, respectively).
When the stereoscopic image 29 (FIG. 5) is generated by the generation process A or B, the CPU 11A displays the generated stereoscopic image 29, that is, the left-eye image 27 (FIG. 5) and the right-eye image 28 (FIG. 5), on the display unit 43 (step S250 in FIG. 34). Furthermore, in response to an operation instructing storage of the image from the display unit 43, the CPU 11A stores the generated stereoscopic image 29 and the attitude information 54 of the information processing apparatus 100A at the time the source image of the stereoscopic image 29 was captured in the storage device 46 in association with each other (step S260 in FIG. 34). The attitude information 54 (FIG. 5) is generated by the detection unit 15 based on the determination information 55 (FIG. 5) acquired by the acquisition unit 12.
Although an embodiment of the present invention has been described above, the present invention is not limited to the above embodiment, and various modifications are possible.
200A, 200B Housings
400 Hinge
21 First image
22 Second image
23 Upper image
24 Lower image
25 Left image
26 Right image
27 Left-eye image
28 Right-eye image
29 Stereoscopic image
51 Output signal
52 Operation signal
53 Determination result information
54 Attitude information
55 Determination information
56 Control signal
58a, 58b Partial images
59 Intermediate image
61 First camera
62 Second camera
66a, 66b Near-view subjects
67a, 67b Distant-view subjects
b Baseline length
Claims (15)
- Imaging means having a first imaging system and a second imaging system that photograph a subject from mutually different directions;
acquisition means for acquiring determination information for determining a geometric relationship between an arrangement direction of the first imaging system and the second imaging system and a horizontal direction;
determination means for determining the geometric relationship based on the determination information; and
generation means for generating a stereoscopic image of the subject by performing, based on an imaging result of the imaging means, one generation process selected according to the determination result of the geometric relationship from among a first generation process and a second generation process that differ from each other,
an information processing apparatus comprising the above. - The information processing apparatus according to claim 1, wherein
a first source image obtained from the imaging result is the target of the first generation process,
a second source image obtained from the imaging result is the target of the second generation process, and
the rules for selecting the first source image and the second source image from the imaging result differ from each other. - The information processing apparatus according to claim 2, wherein
the selection rules are such that:
when the arrangement direction is determined to be the horizontal direction, a first image obtained by the first imaging system and a second image obtained by the second imaging system are adopted as the first source images; and
when the arrangement direction is determined to be other than the horizontal direction, either one of the first image and the second image is adopted as the second source image. - The information processing apparatus according to claim 3, wherein
the generation means generates a third image by spatially deforming the either one of the images in an image space, and performs the second generation process with the third image as the second source image. - The information processing apparatus according to claim 3 or claim 4, wherein
the generation means performs the second generation process based on distance information between the imaging means and the subject estimated from the either one of the images. - The information processing apparatus according to any one of claims 3 to 5, further comprising
display means,
wherein the generation means performs the second generation process with the image displayed on the display means, out of the first image and the second image, as the second source image. - The information processing apparatus according to any one of claims 3 to 6, further comprising:
detection means for detecting, based on the determination information, attitude information of the information processing apparatus at the time the first and second images were obtained by the imaging means; and
storage means for storing the stereoscopic image in association with the attitude information of the information processing apparatus detected by the detection means,
an information processing apparatus further comprising the above. - The information processing apparatus according to claim 7, wherein
the detection means specifies the attitude information of the information processing apparatus from among two postures that differ from each other by 90 degrees around the optical axis of the imaging means. - The information processing apparatus according to claim 7, wherein
the detection means specifies the attitude information of the information processing apparatus from among four postures that differ from each other by 90 degrees around the optical axis of the imaging means. - The information processing apparatus according to any one of claims 3 to 9, wherein
the determination information includes at least one of an operation signal generated by operation of the information processing apparatus, at least one of the first image and the second image, and an output signal of an attitude sensor provided in the information processing apparatus. - The information processing apparatus according to any one of claims 3 to 10, wherein
the first generation process and the second generation process use, in generating the stereoscopic image, images obtained by spatially shifting the pixel values of the first source image and the second source image, respectively, in units of pixels. - The information processing apparatus according to any one of claims 3 to 11, wherein
the generation means generates the stereoscopic image such that the state of parallax in the group of images constituting the stereoscopic image in the first generation process and the second generation process differs from the state of parallax between the first image and the second image. - The information processing apparatus according to any one of claims 1 to 12, wherein
the information processing apparatus is a portable information terminal, a digital still camera, or a digital video camera. - A program which, by being executed on a computer mounted in an information processing apparatus, causes the information processing apparatus to function as the information processing apparatus according to any one of claims 1 to 13.
- A determination step of determining, based on determination information for determining a geometric relationship between an arrangement direction of a first imaging system and a second imaging system, in imaging means having the first imaging system and the second imaging system that photograph a subject from mutually different directions, and a horizontal direction, the geometric relationship; and
a generation step of generating a stereoscopic image of the subject by performing, based on an imaging result of the imaging means, one generation process selected according to the determination result of the geometric relationship from among a first generation process and a second generation process that differ from each other,
an information processing method comprising the above.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/990,614 US9420263B2 (en) | 2010-12-28 | 2010-12-28 | Information processor and information processing method |
PCT/JP2010/073751 WO2012090309A1 (ja) | 2010-12-28 | 2010-12-28 | 情報処理装置、そのプログラム、および情報処理方法 |
JP2012550632A JP5464280B2 (ja) | 2010-12-28 | 2010-12-28 | 情報処理装置および情報処理方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2010/073751 WO2012090309A1 (ja) | 2010-12-28 | 2010-12-28 | 情報処理装置、そのプログラム、および情報処理方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012090309A1 true WO2012090309A1 (ja) | 2012-07-05 |
Family
ID=46382458
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/073751 WO2012090309A1 (ja) | 2010-12-28 | 2010-12-28 | 情報処理装置、そのプログラム、および情報処理方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US9420263B2 (ja) |
JP (1) | JP5464280B2 (ja) |
WO (1) | WO2012090309A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014197314A (ja) * | 2013-03-29 | 2014-10-16 | ソニー株式会社 | 画像処理装置及び画像処理方法 |
US20190281708A1 (en) * | 2012-06-06 | 2019-09-12 | Apple Inc. | Notched Display Layers |
JP2021508965A (ja) * | 2017-12-20 | 2021-03-11 | レイア、インコーポレイテッドLeia Inc. | クロスレンダリングマルチビューカメラ、システム、及び方法 |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5901376B2 (ja) * | 2012-03-23 | 2016-04-06 | Nintendo Co., Ltd. | Information processing apparatus, information processing program, information processing system, and information processing method |
US11019323B2 (en) * | 2015-02-06 | 2021-05-25 | Tara Chand Singhal | Apparatus and method for 3D like camera system in a handheld mobile wireless device |
JP6588840B2 (ja) * | 2016-02-04 | 2019-10-09 | Japan Display Inc. | Display device |
JP6335237B2 (ja) * | 2016-09-15 | 2018-05-30 | Subaru Corporation | Stereo distance measurement apparatus, stereo distance measurement method, and stereo distance measurement program |
JP6463319B2 (ja) * | 2016-10-19 | 2019-01-30 | Subaru Corporation | Stereo distance measurement apparatus, stereo distance measurement method, and stereo distance measurement program |
EP3486606A1 (de) * | 2017-11-20 | 2019-05-22 | Leica Geosystems AG | Stereo camera and stereophotogrammetric method |
EP3824620A4 (en) * | 2018-10-25 | 2021-12-01 | Samsung Electronics Co., Ltd. | METHOD AND DEVICE FOR PROCESSING VIDEO |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010206774A (ja) * | 2009-02-05 | 2010-09-16 | Fujifilm Corp | Three-dimensional image output apparatus and method |
JP2010226390A (ja) * | 2009-03-23 | 2010-10-07 | Nikon Corp | Imaging apparatus and imaging method |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4958233B2 (ja) | 2007-11-13 | 2012-06-20 | Tokyo Denki University | Multi-view image creation system and multi-view image creation method |
KR20110020082A (ko) * | 2009-08-21 | 2011-03-02 | LG Electronics Inc. | Control apparatus for a mobile terminal and method thereof |
- 2010-12-28: US application US13/990,614, published as US9420263B2 (en), status: Active
- 2010-12-28: JP application JP2012550632A, published as JP5464280B2 (ja), status: Expired - Fee Related
- 2010-12-28: WO application PCT/JP2010/073751, published as WO2012090309A1 (ja), status: Application Filing
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190281708A1 (en) * | 2012-06-06 | 2019-09-12 | Apple Inc. | Notched Display Layers |
US12016134B2 (en) * | 2012-06-06 | 2024-06-18 | Apple Inc. | Notched display layers |
JP2014197314A (ja) * | 2013-03-29 | 2014-10-16 | Sony Corporation | Image processing apparatus and image processing method |
US9684964B2 (en) | 2013-03-29 | 2017-06-20 | Sony Corporation | Image processing apparatus and image processing method for determining disparity |
JP2021508965A (ja) * | 2017-12-20 | 2021-03-11 | Leia Inc. | Cross-rendering multiview camera, system, and method |
JP7339259B2 (ja) | Cross-rendering multiview camera, system, and method | |
Also Published As
Publication number | Publication date |
---|---|
JP5464280B2 (ja) | 2014-04-09 |
US9420263B2 (en) | 2016-08-16 |
JPWO2012090309A1 (ja) | 2014-06-05 |
US20130258066A1 (en) | 2013-10-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5464280B2 (ja) | Information processing apparatus and information processing method | |
JP7059355B2 (ja) | Apparatus and method for generating a representation of a scene | |
KR101629479B1 (ko) | High-density multi-view image display system and method using active sub-pixel rendering | |
JP5456020B2 (ja) | Information processing apparatus and method | |
CN102783162B (zh) | Imaging apparatus | |
JP5464279B2 (ja) | Image processing apparatus, program thereof, and image processing method | |
KR101852209B1 (ko) | Autostereoscopic display and method of manufacturing the same | |
JP5757790B2 (ja) | Information processing program, information processing apparatus, information processing system, and information processing method | |
JP4928476B2 (ja) | Stereoscopic image generation apparatus, method thereof, and program thereof | |
JP6585938B2 (ja) | Stereoscopic image depth conversion apparatus and program therefor | |
US20160054572A1 (en) | Stereoscopic image | |
CN112929639 (zh) | Eye-tracking apparatus and method, 3D display device and method, and terminal | |
JP6021489B2 (ja) | Imaging apparatus, image processing apparatus, and method thereof | |
JP5840022B2 (ja) | Stereoscopic image processing apparatus, stereoscopic image capturing apparatus, and stereoscopic image display apparatus | |
US9167231B2 (en) | System and method for calibrating a stereoscopic camera based on manual alignment of test images | |
JPH07240945A (ja) | Virtual space generation and presentation apparatus | |
TWI500314B (zh) | A portrait processing device, a three-dimensional portrait display device, and a portrait processing method | |
CN101655747B (zh) | Three-dimensional stereoscopic mouse | |
US20140362197A1 (en) | Image processing device, image processing method, and stereoscopic image display device | |
JP2013074473A (ja) | Panoramic imaging apparatus | |
JP5741353B2 (ja) | Image processing system, image processing method, and image processing program | |
JP2012199759A (ja) | Information processing apparatus, program thereof, and information processing method | |
JP7395296B2 (ja) | Image processing apparatus, image processing method, and program | |
JP5582101B2 (ja) | Image processing apparatus, program thereof, and image processing method | |
KR100893381B1 (ko) | Real-time stereoscopic image generation method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10861341 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2012550632 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 13990614 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 10861341 Country of ref document: EP Kind code of ref document: A1 |