WO2003081921A1 - Three-dimensional image processing method and device - Google Patents
Three-dimensional image processing method and device
- Publication number
- WO2003081921A1 (PCT/JP2003/003791)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- parallax
- image
- stereoscopic
- stereoscopic image
- information
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/122—Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
- H04N13/128—Adjusting depth or disparity
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
Definitions
- the present invention relates to a stereoscopic image processing technique, and particularly to a method and an apparatus for generating or displaying a stereoscopic image based on a parallax image.
- stereoscopic image display (hereinafter simply referred to as “stereoscopic display”) has been studied in various ways for a long time, and has been put to practical use in somewhat limited markets such as theaters and special display devices. Research and development in this area is expected to accelerate with the aim of providing more realistic content, and an age is expected to come when individual users enjoy stereoscopic display even at home.
- along with the expected future spread of stereoscopic display, display forms that could not be realized with conventional display devices have been proposed.
- a technique has been disclosed in which a selected partial image of a two-dimensional image is displayed in a three-dimensional manner (see, for example, …).
- since stereoscopic vision is realized by parallax, even when parallax is expressed as the amount of pixel shift between left and right images, the same stereoscopic video may not be viewed properly on display devices whose hardware differs. If the parallax representing a distant point exceeds the interocular distance, stereoscopic vision cannot theoretically be achieved.
- because the resolution and screen size of display devices are diversifying today across PCs (personal computers), television receivers, and portable devices, creating content best suited for stereoscopic display on such varied hardware is a challenge; more accurately, no methodology for doing so has been provided. And even if such a methodology were given, it would be difficult for the average programmer to understand it and use it to create content and applications.
- the present invention has been made in view of such a background, and an object of the present invention is to propose a new expression method for stereoscopic display. Another object is to generate or display a stereoscopic image appropriate for the user even when the image to be displayed or the display device changes. Still another object is to adjust the stereoscopic effect by a simple operation when stereoscopic display is performed. Yet another object is to reduce the burden on the programmer when creating content or applications capable of appropriate stereoscopic display. Yet another object is to provide a technology for realizing appropriate stereoscopic display as a business model.
- the inventor's knowledge that forms the basis of the present invention is that the appropriate parallax is first determined by factors such as the hardware of the display device and the distance between the user and the display device (hereinafter, these will be collectively referred to as “hardware”).
- the expression of the appropriate parallax is generalized by the camera interval and the optical axis intersection position described later, and is once described in a general form that does not depend on hardware.
- “independent of hardware” means that reading hardware information unique to the display device is basically unnecessary; once this general description is made, a desired stereoscopic display is realized by generating or adjusting a parallax image based on the appropriate parallax.
- the first group is based on a technology for acquiring an appropriate parallax based on a response from a user. This technique can be used for “initial setting” of parallax by a user, and once a proper parallax is acquired in the device, the proper parallax is realized even when another image is displayed.
- however, the use of this technology is not limited to such initial setting.
- one embodiment of the present invention relates to a stereoscopic image processing device comprising an instruction acquisition unit that acquires a user's response to a stereoscopic image displayed based on a plurality of viewpoint images corresponding to different parallaxes, and a parallax specifying unit that specifies, based on the acquired response, the appropriate parallax for that user.
- the instruction acquisition unit is provided, for example, as a GUI (graphical user interface), and displays the image while changing the parallax between viewpoint images.
- a “stereoscopic image” is an image displayed with a three-dimensional effect; its actual data is a “parallax image” in which a plurality of images have parallax between them.
- a parallax image is generally a set of two-dimensional images. Each image constituting the parallax image is a “viewpoint image” having a corresponding viewpoint. That is, a parallax image is composed of a plurality of viewpoint images, and when it is displayed, it is displayed as a stereoscopic image. Display of a stereoscopic image is also simply referred to as “stereoscopic display”.
- Parallax is a parameter for creating a three-dimensional effect, and can be defined in various ways. For example, it can be expressed by the shift amount of pixels representing the same point between viewpoint images.
- that definition is followed hereinafter unless otherwise specified.
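As a concrete illustration of the pixel-shift definition above, the parallax of a single scene point can be measured from its horizontal image positions in the two viewpoint images. The function name below is hypothetical, not from the patent:

```python
def pixel_parallax(x_left, x_right):
    """Parallax of one scene point, defined as the horizontal shift (in
    pixels) of its image position between the left and right viewpoint
    images.  Under a crossed-axis convention, a positive value here
    corresponds to a point perceived in front of the zero-parallax plane."""
    return x_left - x_right

# A point imaged at x=120 in the left view and x=112 in the right view
# has a parallax of 8 pixels; identical positions give zero parallax.
print(pixel_parallax(120, 112))  # 8
print(pixel_parallax(100, 100))  # 0
```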
- a range of appropriate parallax may be specified. In that case, both ends of the range are called “limit parallax.” “Specification of appropriate parallax” may be performed with the maximum value allowable as the parallax of a near object, described later.
- the stereoscopic image processing device of the present invention may further include a parallax control unit that performs processing so that the specified proper parallax is realized even when another image is displayed.
- the parallax control unit may determine a plurality of viewpoints for generating the three-dimensional image according to the appropriate parallax. More specifically, the distance between a plurality of viewpoints and the intersection position of the optical axis at which the object is viewed from those viewpoints may be determined.
- An example of these processes is performed by a camera arrangement determination unit described later. If these processes are performed in real time, optimal three-dimensional display is always realized.
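The link between camera interval, optical axis intersection position, and the resulting parallax can be sketched with a toy converging-camera pinhole model. The formula below is a simplified approximation assumed for illustration, not the patent's own derivation:

```python
def on_screen_parallax(E, D, Z, f):
    """Approximate image-plane parallax of a point at depth Z seen by two
    cameras a distance E apart whose optical axes cross at distance D,
    with focal length f.  Points nearer than D get positive (near)
    parallax, points beyond D get negative values under this sign
    convention, and the crossing plane itself has zero parallax."""
    return f * E * (1.0 / Z - 1.0 / D)

# Zero parallax exactly at the optical-axis crossing plane:
print(on_screen_parallax(E=6.5, D=100.0, Z=100.0, f=50.0))       # 0.0
# A nearer point (Z < D) yields positive parallax:
print(on_screen_parallax(E=6.5, D=100.0, Z=50.0, f=50.0) > 0)    # True
```

Widening the camera interval E or moving the crossing distance D both rescale every point's parallax, which is why these two quantities suffice as a general description of the appropriate parallax.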
- the parallax control unit may perform control such that an appropriate parallax is realized in a predetermined basic three-dimensional space to be displayed. An example of this processing is performed by a projection processing unit described later.
- the parallax control unit may control the coordinates of the closest object and the coordinates of the most distant object in the three-dimensional space so that the proper parallax is realized.
- An example of this processing is performed by a projection processing unit described later.
- Objects can be static.
- “near” indicates a state in which parallax is provided so that the image is perceived stereoscopically in front of the plane containing the intersection of the optical axes of the cameras placed at the multiple viewpoints (hereinafter also referred to as the “optical axis intersection plane”).
- “distant” indicates a state in which parallax is provided so that the image is perceived stereoscopically behind the optical axis intersection plane. The larger the parallax of a near object, the closer it is perceived by the user; the larger the parallax of a distant object, the farther it is perceived. Unless otherwise specified, parallax is defined as a non-negative value whose sign does not reverse between near and distant positions, and both near and distant parallax are zero at the optical axis intersection plane.
- the optical axis intersection plane coincides with the screen surface of the display device. This is because, for a pixel without parallax, the lines of sight from the left and right eyes reach the same position on the screen plane, that is, intersect there.
- the parallax control unit may determine a horizontal shift amount of the plurality of two-dimensional images according to an appropriate parallax.
- in some cases, the input for stereoscopic display is not generated with a high degree of freedom starting from three-dimensional data, but is a parallax image that has already been generated and whose parallax is fixed. In such cases, it is not possible to return to the original three-dimensional space, or to the real space where the image was actually shot, to change the camera positions and redraw or reshoot the image. Therefore, the parallax is adjusted by horizontally shifting the viewpoint images constituting the parallax image, or the pixels included in them.
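The horizontal-shift adjustment just described can be sketched in pure Python on a single scan line (a real implementation would shift whole images; the function name and padding value are illustrative choices):

```python
def shift_row(row, shift, fill=0):
    """Horizontally shift one scan line of a viewpoint image by `shift`
    pixels (positive = right), padding the exposed pixels with `fill`.
    Shifting the left and right viewpoint images in opposite directions
    changes the parallax of every pixel pair by the same amount."""
    if shift == 0:
        return list(row)
    if shift > 0:
        return [fill] * shift + list(row[:-shift])
    return list(row[-shift:]) + [fill] * (-shift)

row = [1, 2, 3, 4, 5]
print(shift_row(row, 2))   # [0, 0, 1, 2, 3]
print(shift_row(row, -1))  # [2, 3, 4, 5, 0]
```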
- the parallax control unit may adjust the depth according to the appropriate parallax.
- An example of this processing is performed by a two-dimensional image generation unit of a third stereoscopic image processing device described later.
- the stereoscopic image processing apparatus may further include a parallax holding unit that records the appropriate parallax, and the parallax control unit may read the appropriate parallax at a predetermined timing, for example when the apparatus is started or when the stereoscopic image processing function of the apparatus or a part thereof is started, and use that value as an initial value for processing.
- “started” may be meant in a hardware or software sense. According to this aspect, once the user determines an appropriate parallax, automatic adjustment of the stereoscopic effect is realized thereafter. This is also called “initial setting of proper parallax.”
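The “initial setting of proper parallax” could be persisted and re-read at start-up roughly as follows (the file name and JSON layout are assumptions for illustration, not specified by the patent):

```python
import json
import os

SETTINGS_PATH = "stereo_settings.json"  # hypothetical storage location

def save_proper_parallax(near_limit, far_limit, path=SETTINGS_PATH):
    """Record the user's appropriate (limit) parallax so that later
    sessions can start from it -- the 'initial setting of proper parallax'."""
    with open(path, "w") as f:
        json.dump({"near_limit": near_limit, "far_limit": far_limit}, f)

def load_proper_parallax(path=SETTINGS_PATH, default=(10, 10)):
    """Read the stored parallax at start-up; fall back to a default if
    the user has never performed the initial setting."""
    if not os.path.exists(path):
        return default
    with open(path) as f:
        d = json.load(f)
    return d["near_limit"], d["far_limit"]

save_proper_parallax(12, 20)
print(load_proper_parallax())  # (12, 20)
```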
- Another embodiment of the present invention relates to a stereoscopic image processing method, comprising the steps of displaying stereoscopic images of different parallaxes to a user, and identifying the appropriate parallax for that user based on the user's responses to the displayed stereoscopic images.
- Still another embodiment of the present invention also relates to a stereoscopic image processing method, which includes a step of acquiring proper disparity depending on a user, and a step of performing processing on an image before display so that the acquired proper disparity is realized.
- acquisition may be a process of positively identifying or a process of reading from the parallax holding unit or the like.
- if each of these steps is implemented as a function of a library for stereoscopic display, and the library's functions can be called from a plurality of programs, the programmer does not need to write programs that take the hardware of each stereoscopic display device into account, which is effective.
- the second group of the present invention is based on a technique for adjusting parallax based on a user's instruction.
- This technology can be used for “manual adjustment” of parallax by a user, and the user can appropriately change the stereoscopic effect of an image being displayed.
- this technique can be used not only for manual adjustment, but also for reading an appropriate parallax and automatically adjusting the parallax of the image when stereoscopically displaying a certain image.
- the difference from the first group is that the automatic adjustment of the second group acts on a two-dimensional parallax image or an image with depth information; to change the parallax by going back to the three-dimensional data, the technology of the first group is used.
- the following relates to the second group.
- One embodiment of the present invention relates to a stereoscopic image processing apparatus comprising an instruction acquisition unit that acquires a user's instruction regarding a stereoscopic image displayed from a plurality of viewpoint images, and a parallax control unit that changes the amount of parallax between the plurality of viewpoint images according to the acquired instruction.
- An example of this processing is shown in FIG. 45 described later, and it is a typical example of “manual adjustment.” It is convenient if the user's instructions can be given through a simple GUI such as a button operation.
- Another embodiment of the present invention also relates to a stereoscopic image processing apparatus, comprising a parallax amount detection unit that detects a first parallax amount generated when a stereoscopic image is displayed from a plurality of viewpoint images, and a parallax control unit that changes the amount of parallax between the plurality of viewpoint images so that the first parallax amount falls within the range of a second parallax amount, which is the user's allowable amount of parallax. This is a typical example of “automatic adjustment,” and the above-mentioned appropriate parallax can be used as the second parallax amount.
- An example of this processing is shown in FIG. 46 described later.
- the parallax amount detection unit may detect the maximum value of the first parallax amount, and the parallax control unit may change the amount of parallax between the plurality of viewpoint images so that that maximum value does not exceed the maximum value of the second parallax amount.
- the intention is to protect the maximum amount of parallax, that is, the limit parallax, in order to avoid excessive stereoscopic effect due to excessive parallax.
- the maximum value here may be considered as the maximum value on the near side.
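Keeping the detected (first) parallax within the allowable (second) amount can be sketched as a simple rescaling that protects the limit parallax while preserving relative depth ordering. The function name is hypothetical:

```python
def clamp_parallax(parallaxes, limit):
    """Scale a set of detected (first) parallax amounts so that the
    largest does not exceed `limit`, the user's allowable (second)
    parallax amount.  Uniform scaling preserves relative depth."""
    peak = max(abs(p) for p in parallaxes)
    if peak <= limit:
        return list(parallaxes)          # already within range
    scale = limit / peak
    return [p * scale for p in parallaxes]

detected = [4, 10, 25]                   # pixels; 25 exceeds the limit
print(clamp_parallax(detected, limit=20))  # [3.2, 8.0, 20.0]
```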
- the parallax amount detection unit may detect the first parallax amount by computing corresponding-point matching between the plurality of viewpoint images, or by reading a first parallax amount previously recorded in a header of any of the plurality of viewpoint images.
- An example of these processes is shown in FIG. 47 described below.
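Corresponding-point matching as a way to detect the first parallax amount can be illustrated with a toy one-dimensional block-matching routine (the window size, search range, and penalty value are arbitrary illustrative choices; production systems use 2-D windows and sub-pixel refinement):

```python
def block_match(left_row, right_row, x, half=1, max_disp=4):
    """Find the disparity of the pixel at position x in `left_row` by
    comparing a small window against `right_row` at candidate shifts
    0..max_disp, using the sum of absolute differences as the cost."""
    best_d, best_cost = 0, float("inf")
    for d in range(max_disp + 1):
        cost = 0
        for k in range(-half, half + 1):
            xl, xr = x + k, x + k - d
            if 0 <= xl < len(left_row) and 0 <= xr < len(right_row):
                cost += abs(left_row[xl] - right_row[xr])
            else:
                cost += 255  # penalise out-of-bounds comparisons
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d

# The bright pixel sits at x=4 on the left and x=2 on the right,
# so the disparity detected at x=4 should be 2.
left  = [10, 10, 10, 10, 200, 10, 10, 10]
right = [10, 10, 200, 10, 10, 10, 10, 10]
print(block_match(left, right, 4))  # 2
```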
- the parallax control unit may change the amount of parallax between the plurality of viewpoint images by shifting their synthesis positions. This is common to FIGS. 45 to 47. A shift in the synthesis position is a horizontal or vertical shift, in units of pixels, of an entire image.
- the parallax control unit may adjust the depth information to change the amount of parallax.
- Another aspect of the present invention relates to a stereoscopic image processing method, comprising the steps of acquiring a user's instruction regarding a stereoscopic image displayed based on a plurality of viewpoint images, and changing the amount of parallax between the plurality of viewpoint images according to the instruction.
- Still another embodiment of the present invention also relates to a stereoscopic image processing method, comprising the steps of detecting a first parallax amount generated when a stereoscopic image is displayed from a plurality of viewpoint images, and changing the amount of parallax between the plurality of viewpoint images so that the first parallax amount falls within the range of a second parallax amount, which is the allowable amount of parallax.
- the third group of the present invention is based on a technique for correcting parallax based on a position in an image. This “automatic correction” works to reduce the user's discomfort with, or rejection of, stereoscopic display, and can be used in combination with the technologies of the first and second groups. In general, in stereoscopic display, when an object with parallax is observed near the edge of the image, discomfort is liable to arise; this has been pointed out as a technical or physiological issue. In the third group, this problem is reduced by processing such as weakening the parallax near the edges of the image, or adjusting the parallax so that objects move from the near side to the far side. The following relates to the third group.
- One embodiment of the present invention relates to a stereoscopic image processing apparatus comprising a parallax control unit that corrects the parallax between a plurality of viewpoint images for displaying a stereoscopic image, and a map holding unit that stores a correction map the parallax control unit refers to when performing the correction.
- This correction map is described so that the parallax is corrected based on the position in the viewpoint image.
- the correction map includes a parallax correction map, a distance sense correction map, and the like.
- the parallax control unit, for example, weakens the parallax in the periphery of the plurality of viewpoint images, or changes the parallax so that objects there are perceived farther from the user.
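A correction map that weakens parallax toward the image periphery might look like the following linear edge-attenuation factor (the linear ramp and margin width are illustrative choices; the patent leaves the map's shape open):

```python
def edge_attenuation(width, x, margin):
    """Correction factor (0..1) that linearly reduces parallax within
    `margin` pixels of the left or right image edge, easing the
    discomfort of large parallax observed near the image borders."""
    dist = min(x, width - 1 - x)        # distance to the nearer edge
    if dist >= margin:
        return 1.0
    return dist / margin

width, margin = 11, 3
factors = [round(edge_attenuation(width, x, margin), 2) for x in range(width)]
print(factors)  # [0.0, 0.33, 0.67, 1.0, 1.0, 1.0, 1.0, 1.0, 0.67, 0.33, 0.0]
```

Multiplying each pixel's parallax by its factor leaves the image center untouched while smoothly zeroing parallax at the edges.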
- the parallax control unit may change parallax by selectively performing processing on any of the plurality of viewpoint images.
- when a plurality of viewpoint images are generated from three-dimensional data, that is, when viewpoint images can be generated by returning to the three-dimensional space, the parallax control unit may change the parallax by controlling the camera parameters used in generating the plurality of viewpoint images. The camera parameters include the distance between the left and right cameras, the angle at which the cameras look at the object, and the optical axis intersection position.
- similarly, when a plurality of viewpoint images are generated from three-dimensional data, the parallax control unit may change the parallax by distorting the three-dimensional space itself, for example in the world coordinate system, when generating the plurality of viewpoint images.
- the parallax control unit may change the parallax by operating the depth information.
- Another aspect of the present invention relates to a stereoscopic image processing method, comprising the steps of acquiring a plurality of viewpoint images for displaying a stereoscopic image, and changing the parallax between the acquired plurality of viewpoint images based on positions within those images.
- These steps may be implemented as functions of a library for stereoscopic display, and a plurality of programs may be configured to be able to call the functions of the library as functions.
- a fourth group of the present invention relates to a technology that provides the first to third groups and their related functions as a software library, reduces the burden on programmers and users, and promotes the spread of stereoscopic image display applications.
- the following is the fourth group.
- One embodiment of the present invention relates to a stereoscopic image processing method in which information related to stereoscopic image display is held in a memory and shared between a plurality of different programs, and when one of these programs displays a stereoscopic image, the state of the image to be output is determined with reference to the held information. An example of the state of the image is how much parallax is given to the parallax image.
- the “held information” may include any one of information of a format of an image input to the stereoscopic image display device, a display order of the viewpoint images, and a parallax amount between the viewpoint images. Further, in addition to sharing the stored information, processing unique to displaying a stereoscopic image may be shared by a plurality of programs.
- An example of the “processing specific to stereoscopic image display” is processing for determining the held information. Other examples include processing related to a graphical user interface for determining the appropriate parallax, display processing of a parallax adjustment screen that supports realizing the appropriate parallax state, processing for detecting and tracking the user's head position, and processing for displaying an image for adjusting the display device.
- Another embodiment of the present invention relates to a stereoscopic image processing apparatus comprising a stereoscopic effect adjusting unit that provides the user with a graphical user interface for adjusting the stereoscopic effect of a stereoscopically displayed image, and a parallax control unit that generates a parallax image in a manner that protects the limit parallax determined as a result of the user's adjustment of the stereoscopic effect.
- the apparatus may further include an information detection unit that acquires information to be referred to in order to optimize the stereoscopic image display, and a conversion unit that converts the format of the parallax image generated by the parallax control unit according to the acquired information.
- the parallax control unit may control the camera parameters based on the three-dimensional data to generate a parallax image while maintaining the limit parallax, or may generate a parallax image by controlling the depth of the image with the depth information. Alternatively, a parallax image may be generated after a horizontal shift amount of a plurality of two-dimensional images having parallax is determined.
- the fifth group of the present invention relates to one application or business model using the above-described stereoscopic image processing technology or its related technology.
- the software library of the fourth group can be used for this purpose. The following relates to the fifth group.
- One embodiment of the present invention relates to a stereoscopic image processing method, which converts the appropriate parallax for stereoscopically displaying a parallax image into an expression format that does not depend on the hardware of any display device, and circulates the appropriate parallax between different display devices in that format.
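One plausible hardware-independent expression is the visual angle subtended at the viewer's eye by the on-screen pixel shift, computed from a display's screen width, resolution, and observation distance. The sketch below uses hypothetical display parameters to show a round trip between two devices (this is an illustrative scheme, not the patent's specified format):

```python
import math

def pixels_to_angle(parallax_px, screen_width_mm, screen_width_px, distance_mm):
    """Convert a pixel parallax into the angle (degrees) it subtends at
    the viewer's eye -- a representation independent of any one display's
    resolution or size, so it can circulate between devices."""
    shift_mm = parallax_px * screen_width_mm / screen_width_px
    return math.degrees(math.atan2(shift_mm, distance_mm))

def angle_to_pixels(angle_deg, screen_width_mm, screen_width_px, distance_mm):
    """Inverse conversion, performed on the receiving display device."""
    shift_mm = distance_mm * math.tan(math.radians(angle_deg))
    return shift_mm * screen_width_px / screen_width_mm

# 16 px on a 340 mm / 1920 px monitor viewed at 600 mm becomes a
# different pixel count on a 75 mm / 720 px handheld viewed at 300 mm:
a = pixels_to_angle(16, screen_width_mm=340, screen_width_px=1920, distance_mm=600)
px_on_other = angle_to_pixels(a, screen_width_mm=75, screen_width_px=720, distance_mm=300)
print(round(px_on_other, 1))  # 13.6
```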
- Another embodiment of the present invention also relates to a stereoscopic image processing method, comprising the steps of reading the user's appropriate parallax, acquired by a first display device, into a second display device; adjusting the parallax of a parallax image according to the appropriate parallax; and outputting the adjusted parallax image from the second display device.
- the first display device is, for example, a device normally used by the user, and the second display device is a device provided at another place. The method may further include the steps of reading information about the hardware of the first display device into the second display device, and, based on the read hardware information of the first display device and the hardware information of the second display device, correcting the parallax of the parallax image whose parallax was adjusted in the adjusting step, according to the appropriate parallax on the second display device.
- the information on hardware may include at least one of the size of the display screen, the optimum observation distance of the display device, and the image separation performance of the display device.
- Another embodiment of the present invention relates to a stereoscopic image processing system including a first display device, a second display device, and a server connected via a network. The first display device sends the user's appropriate parallax information, acquired by that device, to the server; the server receives the appropriate parallax information and records it in association with the user; and when the user requests output of image data at the second display device, that device reads the user's appropriate parallax information from the server, adjusts the parallax, and outputs a parallax image.
- the sixth group of the present invention is based on a technology for devising a new expression method using a stereoscopic image.
- This stereoscopic image processing device displays a stereoscopic image based on a plurality of viewpoint images corresponding to different parallaxes, and includes a recommended parallax acquisition unit that acquires the parallax range recommended for displaying stereoscopic images on the stereoscopic image display device, and a parallax control unit that sets the parallax so that the stereoscopic image is displayed within the acquired recommended parallax range.
- the device may further include an object specifying unit that receives from the user the specification of a predetermined object included in the stereoscopic image, and an optical axis intersection position setting unit that associates the optical axis intersection position of the plurality of viewpoint images with the position of the specified object so that the object is displayed near the position of the display screen on which the stereoscopic image is displayed.
- the device may further include a specification information adding unit that writes optical axis correspondence information describing that the object is associated with the optical axis intersection position and is to be expressed near the position of the display screen, and associates that information with the object.
- the optical axis intersection position setting unit may acquire the optical axis correspondence information, associate the optical axis intersection position with the object described in the acquired optical axis correspondence information, and express that object near the position of the display screen on which the stereoscopic image is displayed.
- the device may further include an identification information acquisition unit that acquires identification information associated with the object, and the parallax control unit may reflect the acquired identification information when expressing the object in the stereoscopic image.
- the identification information may include information on the timing at which the object is to be expressed in the basic expression space, and the parallax control unit may reflect the acquired timing when expressing the object in the stereoscopic image.
- Another embodiment of the present invention relates to a stereoscopic image processing method.
- This stereoscopic image processing method makes selectable a predetermined object included in a stereoscopic image displayed based on a plurality of viewpoint images corresponding to different parallaxes; when an object is selected, the optical axis intersection position associated with the plurality of viewpoint images is made to correspond to the position of the selected object, and the optical axis intersection position is made to substantially coincide with the position of the display screen on which the stereoscopic image is displayed.
- the display screen can thus be set at the boundary between the distant space and the near space, and it becomes possible to represent an object as if it crosses the display screen and approaches the observer.
- the specified object may have a predetermined interface, and the optical axis intersection position setting unit may associate the optical axis intersection position with that interface. Further, the stereoscopic image may be generated starting from three-dimensional data. When a stereoscopic image is generated starting from three-dimensional data, it is easy to add various effects to the stereoscopic image. For example, when an object is expressed so as to extend beyond an interface, that is, the display screen, an effect of deforming the display screen can be added.
- Still another embodiment of the present invention also relates to a stereoscopic image processing method.
- This stereoscopic image processing method sets an interface that separates space, as part of a stereoscopic image, near the display screen on which a stereoscopic image generated based on a plurality of viewpoint images corresponding to different parallaxes is displayed, and represents the stereoscopic image using that interface as the boundary between the near space and the far space.
- the interface may be a boundary surface between substances, or a thin plate. Examples of the thin plate include a glass plate and, further, paper.
- Still another embodiment of the present invention relates to a stereoscopic image processing method.
- In this stereoscopic image processing method, the moving speed in the near or far direction is changed for an object that is to be included in a stereoscopic image generated based on a plurality of viewpoint images corresponding to different parallaxes and expressed in a basic expression space including the objects to be stereoscopically displayed.
- Yet another embodiment of the present invention also relates to a stereoscopic image processing method.
- In this stereoscopic image processing method, an object to be expressed in a basic expression space including the objects to be stereoscopically displayed is expressed so as to fall within a predetermined parallax range, while at least one of the front plane or the rear plane of the basic expression space is set at a position where no object is present.
- Yet another embodiment of the present invention also relates to a stereoscopic image processing method.
- In this stereoscopic image processing method, the parallax of an object to be expressed in a basic expression space including the objects to be stereoscopically displayed is calculated, and the parallax of the object is calculated not from the actual size of the object but from a size including an extended area set in front of the object.
- After the object, including the front extended area, reaches the front plane of the basic expression space by moving, if the object moves further forward, it may be expressed as moving within the front extended area.
- Yet another embodiment of the present invention also relates to a stereoscopic image processing method.
- In this stereoscopic image processing method, when a stereoscopic image is generated based on a plurality of viewpoint images corresponding to different parallaxes, the parallax of an object to be expressed in a basic expression space including the objects to be stereoscopically displayed is calculated not from the actual size of the object but from a size including an extended area set behind the object.
- After the object, including the rear extended area, reaches the rear plane of the basic expression space by moving, if the object moves further backward, it may be expressed as moving within the rear extended area.
- A seventh group of embodiments according to the present invention is based on a technique of adjusting the parallax to be set according to the state of the image.
- One embodiment of the present invention relates to a stereoscopic image processing device.
- This stereoscopic image processing apparatus has a parallax control unit that controls the parallax so that it does not become larger than the parallax at which the ratio between the width and the depth of an object represented in the stereoscopic image can be correctly perceived by human eyes.
- Another embodiment of the present invention also relates to a stereoscopic image processing device.
- This stereoscopic image processing apparatus is provided with a parallax control unit that controls the parallax so that the ratio between the width and the depth of an object represented in the stereoscopic image does not become larger than the ratio correctly perceptible to the human eye.
- Still another embodiment of the present invention also relates to a stereoscopic image processing apparatus.
- This stereoscopic image processing apparatus includes an image determination unit that performs a frequency analysis on a stereoscopic image to be displayed based on a plurality of viewpoint images corresponding to different parallaxes, and a parallax control unit that adjusts the amount of parallax according to the amount of high-frequency components found by the frequency analysis. The parallax control unit may increase the amount of parallax when the amount of high-frequency components is large.
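A toy version of this frequency-based adjustment might look as follows. The gradient-based measure, threshold, and gain are placeholders of our own choosing; a real image determination unit could instead use an FFT or band-pass filter:

```python
def high_freq_amount(image):
    # Crude high-frequency measure: mean absolute horizontal gradient
    # over a 2-D list of pixel values (a stand-in for the frequency
    # analysis performed by the image determination unit).
    total, count = 0, 0
    for row in image:
        for a, b in zip(row, row[1:]):
            total += abs(a - b)
            count += 1
    return total / count if count else 0.0

def adjust_parallax(base_parallax, image, threshold=10.0, gain=1.5):
    # Per the text: increase the amount of parallax when the amount of
    # high-frequency components is large. Threshold and gain values are
    # purely illustrative.
    if high_freq_amount(image) > threshold:
        return base_parallax * gain
    return base_parallax
```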
- Still another embodiment of the present invention also relates to a stereoscopic image processing apparatus.
- This stereoscopic image processing apparatus includes an image determination unit that detects the motion of a stereoscopic image displayed based on a plurality of viewpoint images corresponding to different parallaxes, and a parallax control unit that adjusts the amount of parallax according to the amount of motion of the stereoscopic image.
- The parallax control unit may reduce the amount of parallax when the amount of motion of the stereoscopic image is small.
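A motion-based variant can be sketched the same way; frame differencing is one simple motion measure (the threshold and attenuation values, like the function names, are our assumptions):

```python
def motion_amount(prev_frame, frame):
    # Mean absolute difference between consecutive frames as a simple
    # motion measure (a stand-in for the image determination unit).
    diffs = [abs(a - b) for pr, cr in zip(prev_frame, frame)
             for a, b in zip(pr, cr)]
    return sum(diffs) / len(diffs)

def adjust_parallax_for_motion(base_parallax, prev_frame, frame,
                               threshold=5.0, attenuation=0.5):
    # Per the text: when the motion of the stereoscopic image is small,
    # reduce the amount of parallax.
    if motion_amount(prev_frame, frame) < threshold:
        return base_parallax * attenuation
    return base_parallax
```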
- Still another embodiment of the present invention also relates to a stereoscopic image processing device.
- This stereoscopic image processing apparatus, when generating a stereoscopic moving image from a two-dimensional moving image to which depth information has been added, performs control so that fluctuations in the maximum and minimum depth values included in the depth information, which occur as the two-dimensional moving image progresses, fall within a predetermined threshold. According to this apparatus, the possibility that the observer of the stereoscopic image feels discomfort due to rapid changes in parallax can be reduced.
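The depth-range smoothing described here can be pictured with a short sketch; the per-frame step limit and function name are our illustrative assumptions:

```python
def smooth_depth_range(prev_near, prev_far, cur_near, cur_far, max_step):
    # Limit the frame-to-frame change of the depth minimum and maximum
    # to max_step, so that the parallax does not change abruptly as the
    # two-dimensional moving image progresses.
    def limit(prev, cur):
        return prev + max(-max_step, min(max_step, cur - prev))
    return limit(prev_near, cur_near), limit(prev_far, cur_far)
```

If a scene cut would jump the depth range from [0, 10] to [5, 30], a step limit of 2 per frame lets the range drift there gradually over several frames instead.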
- Still another embodiment of the present invention relates to a stereoscopic image processing method.
- In this stereoscopic image processing method, an appropriate parallax of a stereoscopic image displayed based on a plurality of viewpoint images corresponding to different parallaxes is set for each scene.
- Still another embodiment of the present invention relates to a stereoscopic image processing method.
- In this stereoscopic image processing method, an appropriate parallax of a stereoscopic image displayed based on a plurality of viewpoint images corresponding to different parallaxes is set at predetermined time intervals.
- The stereoscopic image processing apparatus includes: a camera arrangement setting unit that sets the arrangement of a plurality of virtual cameras for generating a plurality of viewpoint images when original data serving as the starting point of a stereoscopic image is input; an object area determination unit that determines whether an area containing no information of the object to be displayed occurs in any of the viewpoint images generated by those cameras; and a camera parameter adjustment unit that, when such an area occurs, adjusts at least one of the angle of view of the virtual cameras, the camera interval, and the optical-axis intersection position so as to eliminate the area containing no object information.
- FIG. 1 is a diagram showing a positional relationship between a user, a screen, and a reproduction object 14 that can perform ideal stereoscopic viewing.
- FIG. 2 is a diagram showing an example of a photographing system for realizing the state of FIG.
- FIG. 3 is a diagram showing another example of a photographing system for realizing the state of FIG.
- FIG. 4 is a diagram showing another example of a photographing system for realizing the state of FIG.
- FIG. 5 is a diagram illustrating a model coordinate system used in the first stereoscopic image processing device.
- FIG. 6 is a diagram showing a world coordinate system used in the first stereoscopic image processing device.
- FIG. 7 is a diagram showing a camera coordinate system used in the first stereoscopic image processing device.
- FIG. 8 is a diagram showing a view volume used in the first stereoscopic image processing apparatus.
- FIG. 9 is a diagram showing a coordinate system after the perspective transformation of the volume in FIG.
- FIG. 10 is a diagram showing a screen coordinate system used in the first stereoscopic image processing device.
- FIG. 11 is a configuration diagram of the first stereoscopic image processing device.
- FIG. 12 is a configuration diagram of the second stereoscopic image processing device.
- FIG. 13 is a configuration diagram of a third stereoscopic image processing device.
- FIGS. 14A and 14B are diagrams respectively showing a left-eye image and a right-eye image displayed by the stereoscopic effect adjustment unit of the first stereoscopic image processing device.
- FIG. 15 is a diagram illustrating a plurality of objects having different parallaxes displayed by the stereoscopic effect adjustment unit of the first stereoscopic image processing device.
- FIG. 16 is a diagram illustrating an object in which parallax changes, which is displayed by the stereoscopic effect adjustment unit of the first stereoscopic image processing device.
- FIG. 17 is a diagram showing the relationship between the camera angle of view, image size, and parallax when an appropriate parallax is realized.
- FIG. 18 is a diagram showing a positional relationship of a photographing system for realizing the state of FIG.
- FIG. 19 is a diagram showing the positional relationship of the imaging system for realizing the state of FIG.
- FIG. 20 is a diagram showing a camera arrangement when generating a multi-viewpoint image with proper parallax.
- FIG. 21 shows a parallax correction map used by the distortion processing unit of the first stereoscopic image processing apparatus.
- FIG. 22 is a diagram illustrating a camera viewpoint when generating a parallax image according to the parallax correction map of FIG. 21.
- FIG. 23 is a diagram illustrating another camera viewpoint when generating a parallax image according to the parallax correction map of FIG. 21.
- FIG. 24 is a diagram illustrating a parallax correction map used by the distortion processing unit of the first stereoscopic image processing device.
- FIG. 25 is a diagram illustrating a camera viewpoint when generating a parallax image according to the parallax correction map of FIG. 24.
- FIG. 26 is a diagram illustrating a distance sense correction map used by the distortion processing unit of the first stereoscopic image processing device.
- FIG. 27 is a diagram illustrating the camera viewpoints when a parallax image is generated according to the distance-sense correction map of FIG. 26.
- FIG. 28 is a diagram illustrating another sense of distance correction map used by the distortion processing unit of the first stereoscopic image processing apparatus.
- FIG. 29 is a diagram illustrating the camera viewpoints when a parallax image is generated according to the distance-sense correction map of FIG. 28.
- FIGS. 30(a), 30(b), 30(c), 30(d), 30(e), and 30(f) are all top views of parallax distributions obtained as a result of processing performed on a three-dimensional space by the distortion processing unit of the first stereoscopic image processing device.
- FIG. 31 is a diagram illustrating the principle of processing by the distortion processing unit of the first stereoscopic image processing apparatus.
- FIG. 32 is a diagram specifically illustrating the process of FIG. 31.
- FIG. 33 is a diagram specifically illustrating the process of FIG. 31.
- FIG. 34 is a diagram specifically illustrating the process of FIG. 31.
- FIG. 35 is a diagram illustrating another example of the processing performed by the distortion processing unit of the first stereoscopic image processing device.
- FIG. 36 is a diagram specifically showing the process of FIG.
- FIG. 37 is a diagram showing a depth map.
- FIG. 38 is a diagram illustrating an example of processing by the distortion processing unit of the third stereoscopic image processing device.
- FIG. 39 is a diagram illustrating a depth map generated by the processing by the distortion processing unit of the third stereoscopic image processing device.
- FIG. 40 is a diagram illustrating another example of the processing performed by the distortion processing unit of the third stereoscopic image processing device.
- FIG. 41 is a diagram illustrating an example of a process performed by the two-dimensional image generation unit of the second stereoscopic image processing device.
- FIG. 42 is a diagram illustrating an example of a parallax image.
- FIG. 43 is a diagram illustrating a parallax image in which a combination position is shifted by the two-dimensional image generation unit of the second stereoscopic image processing device.
- FIG. 44 is a diagram illustrating a process of the image edge adjustment unit of the second stereoscopic image processing device.
- FIG. 45 is a diagram illustrating a process of the second stereoscopic image processing device.
- FIG. 46 is a diagram illustrating another process of the second stereoscopic image processing device.
- FIG. 47 is a diagram illustrating another process of the second stereoscopic image processing device.
- FIG. 48 is a diagram showing a planar image to which a depth map has been added.
- FIG. 49 is a diagram showing a depth map.
- FIG. 50 is a diagram showing how a two-dimensional image generation unit of the second stereoscopic image processing device generates a parallax image based on a depth map.
- FIG. 51 is a diagram illustrating a depth map corrected by the two-dimensional image generation unit of the second stereoscopic image processing device.
- FIG. 52 is a diagram illustrating a manner in which the stereoscopic image processing apparatus according to the embodiment is used as a library.
- FIG. 53 is a configuration diagram in which the stereoscopic display library is incorporated into three-dimensional data software.
- FIG. 54 is a diagram showing how the stereoscopic display library is used in a network-based system.
- FIG. 55 is a diagram showing a state where an image composed of three-dimensional data is displayed on the display screen.
- FIG. 56 is a diagram showing another state in which an image composed of three-dimensional data is displayed on the display screen.
- FIG. 57 is a diagram showing another state in which an image composed of three-dimensional data is displayed on the display screen.
- FIG. 58 is a diagram showing a method of matching a surface of the object to be displayed with the display screen.
- FIG. 59 is a diagram showing another state in which an image is taken by aligning the optical axis intersection positions of the two virtual cameras with one surface of the aquarium.
- FIG. 60 is a configuration diagram of a fourth stereoscopic image processing device.
- FIG. 61 is a diagram illustrating, for convenience, the basic representation space T for an image displayed by the fourth stereoscopic image processing device.
- FIG. 62 is a diagram in which a region where no object exists is represented by being included in the basic representation space T.
- FIG. 63 is a diagram in which a region where no object exists is represented by being included in the basic representation space T.
- FIG. 64 is a diagram illustrating a state in which a moving object is formed so that not only the bird but also the spaces in front of and behind it are included in the target for calculating the parallax.
- FIG. 65 is a diagram illustrating a state in which the bird 330 moves within the space included in advance after the moving object has reached the front projection plane.
- FIG. 66 is a diagram showing a state where an observer is observing a stereoscopic image on the display screen.
- FIG. 67 is a diagram illustrating a camera arrangement determined by the camera arrangement determining unit.
- FIG. 68 is a diagram illustrating a state where the observer is observing the parallax image obtained by the camera arrangement in FIG. 67.
- FIG. 69 is a view showing an image in which an appropriate parallax is obtained by the camera arrangement of FIG. 67, and a state where the observer is observing the display screen at the observer position shown in FIG.
- FIG. 70 is a diagram illustrating a state in which the camera arrangement shown in FIG. 67 captures the nearest point of a sphere located at a distance A from the display screen.
- FIG. 71 is a diagram showing the relationship between the optical-axis intersection distance of the two cameras and the camera interval required to obtain the parallax shown in FIG. 70.
- FIG. 72 is a diagram illustrating a state in which the camera arrangement shown in FIG. 67 captures the farthest point of a sphere located at a distance TA from the display screen.
- FIG. 73 is a diagram showing the relationship between the optical-axis intersection distance of the two cameras and the camera interval E1 required to obtain the parallax shown in FIG. 72.
- FIG. 74 is a diagram illustrating the relationship between the camera parameters required to set the parallax of a stereoscopic image within the appropriate parallax range.
- FIG. 75 is a diagram illustrating another relationship between the camera parameters required to set the parallax of a stereoscopic image within the appropriate parallax range.
- FIG. 76 is a configuration diagram of a fifth stereoscopic image processing device.
- FIG. 77 is a diagram illustrating a relationship between a temporary camera position, an angle of view, and first to third objects set by a creator who creates a three-dimensional image.
- FIG. 78 is a diagram illustrating a state where two virtual cameras are arranged based on the temporary camera positions determined in FIG.
- FIG. 79 is a diagram illustrating a state in which the camera arrangement is adjusted so that no area lacking object information occurs.
- FIG. 80 is a diagram showing a process of adjusting the angle of view.
- FIG. 81 is a diagram showing a relationship between a stereoscopic photographing apparatus for photographing a stereoscopic photograph in an entertainment facility, a photo studio, or the like and a subject.
- FIG. 82 is a diagram showing a configuration of the sixth stereoscopic image processing device.
- FIG. 83 is a diagram showing a state in which a camera is operated by remote control and a captured image is observed on a three-dimensional image display device.
- FIG. 84 is a diagram illustrating an example of photographing by a stereoscopic image photographing device including a sixth stereoscopic image processing device.
- FIG. 85 is a diagram illustrating a state in which image separability is measured with an illuminometer.
- FIG. 86 is a diagram showing a moiré image used for measuring image separability.
- FIG. 87 is a diagram illustrating an example of conversion of an appropriate parallax.
- FIG. 88 is a diagram illustrating another conversion example of the appropriate parallax.
- FIG. 89 is a diagram showing a table used for simplified determination of the parallax and the basic representation space.
- FIG. 1 shows the positional relationship between the user 10, the screen 12, and the stereoscopically displayed playback object 14.
- the interocular distance of user 10 is E
- the distance between user 10 and screen 12 is D
- The width of the playback object 14, as displayed, is W. Since the playback object 14 is displayed stereoscopically, it contains pixels perceived nearer than the screen 12, that is, near-placed pixels, and pixels perceived farther than the screen 12, that is, far-placed pixels.
- Pixels without parallax appear on the screen 12 itself, because they are seen at exactly the same position from both eyes.
- FIG. 2 shows an example of a photographing system that produces the ideal display state of FIG. 1.
- the distance between the two cameras 22 and 24 is E
- the distance from them to the optical axis crossing position when viewing the real object 20 is D
- If an object 20 whose actual width is W is photographed at an angle of view such that it appears with the same width as on the screen 12 above, parallax images can be obtained from the two cameras. Displaying them on the screen 12 of FIG. 1 realizes the ideal state of FIG. 1.
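The similar-triangles geometry behind FIGS. 1 and 2 can be written down directly. This is the standard stereoscopy relationship, stated here for illustration rather than quoted from the patent; the function names are ours:

```python
def screen_parallax(E, D, z):
    # On-screen parallax (same units as E) of a point perceived at
    # distance z, for interocular distance E and viewing distance D:
    #   p = E * (z - D) / z
    # Positive = far (uncrossed) parallax, negative = near (crossed)
    # parallax, zero for points on the screen plane.
    return E * (z - D) / z

def perceived_depth(E, D, p):
    # Inverse relation: the distance at which a point drawn with
    # on-screen parallax p is perceived.
    return E * D / (E - p)
```

A point displayed with zero parallax is perceived on the screen (z = D); as z grows toward infinity, the parallax approaches the interocular distance E, which is why far parallax must never exceed E.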
- FIGS. 3 and 4 show the arrangement of FIG. 2 with all the distances multiplied by A (A < 1) and by B (B > 1), respectively.
- FIG. 5 to FIG. 10 outline the processing in this embodiment up to the point where stereoscopic display is performed based on the three-dimensional data of the object 20.
- FIG. 5 shows the model coordinate system, that is, the coordinate space of each individual three-dimensional object 20. Coordinates are given to the object 20 when it is modeled in this space; usually, the origin is placed at the center of the object 20.
- Figure 6 shows the world coordinate system.
- the world space is a large space where a scene is formed by placing objects 20, floors, and walls.
- The process up to the modeling in FIG. 5 and the determination of the world coordinate system in FIG. 6 can be regarded as the "construction of three-dimensional data".
- Figure 7 shows the camera coordinate system.
- Conversion to the camera coordinate system is performed by setting the camera 22 at an arbitrary position in the world coordinate system, pointing in an arbitrary direction, with an arbitrary angle of view.
- The camera position, direction, and angle of view are all camera parameters.
- For stereoscopic display, the camera interval and the optical-axis intersection position are also decided when the parameters for the two cameras are determined.
- In addition, the origin is moved so that the midpoint between the two cameras becomes the origin.
- Figures 8 and 9 show the perspective coordinate system.
- the space to be displayed is clipped on the front projection plane 30 and the rear projection plane 32.
- The plane containing the point of maximum near parallax is the front projection plane 30, and the plane containing the point of maximum far parallax is the rear projection plane 32.
- Thereafter, this view volume is converted into a rectangular parallelepiped as shown in FIG. 9. The processing in FIGS. 8 and 9 is also called projection processing.
- FIG. 10 shows a screen coordinate system.
- images from each of a plurality of cameras are converted into a coordinate system of a screen, and a plurality of two-dimensional images, that is, parallax images are generated.
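The chain of FIGS. 5 to 10 amounts to the familiar graphics pipeline. The following sketch, under simplifying assumptions of our own (rotation omitted, a single axis-aligned camera at the origin looking toward +z, invented function names), traces one vertex through it:

```python
import math

def translate(p, t):
    # Model -> world: place the object by translating its coordinates.
    return tuple(a + b for a, b in zip(p, t))

def world_to_camera(p, cam_pos):
    # World -> camera: express the point relative to the camera position
    # (orientation change omitted for brevity).
    return tuple(a - b for a, b in zip(p, cam_pos))

def project_to_screen(p_cam, fov_deg, width, height):
    # Camera -> screen: perspective divide by depth, scale by the focal
    # length implied by the angle of view, map to pixel coordinates.
    f = (width / 2) / math.tan(math.radians(fov_deg) / 2)
    x, y, z = p_cam
    sx = width / 2 + f * x / z
    sy = height / 2 - f * y / z
    return sx, sy

# One vertex of object 20 through the pipeline:
model_pt = (1.0, 0.0, 0.0)                     # model coordinates (FIG. 5)
world_pt = translate(model_pt, (0, 0, 10))     # world coordinates (FIG. 6)
cam_pt = world_to_camera(world_pt, (0, 0, 0))  # camera coordinates (FIG. 7)
screen_pt = project_to_screen(cam_pt, 90, 640, 480)  # screen coords (FIG. 10)
```

For stereoscopic display, the same pipeline is simply run once per camera, e.g. with cameras at (−E/2, 0, 0) and (+E/2, 0, 0), and the resulting two-dimensional images form the parallax images.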
- FIG. 11, FIG. 12, and FIG. 13 show the configuration of a stereoscopic image processing apparatus 100 partially different from each other.
- Hereinafter, these are also referred to as the first, second, and third stereoscopic image processing devices 100, respectively. They can be integrated into a single device, but are divided into three here to avoid complicating the figures.
- the first stereoscopic image processing device 100 is effective when the object and space to be drawn can be obtained from the stage of three-dimensional data, and therefore, the main input is three-dimensional data.
- the second stereoscopic image processing apparatus 100 is effective for adjusting the parallax of a plurality of two-dimensional images to which parallax has already been given, that is, an existing parallax image, and thus inputs a two-dimensional parallax image.
- the third stereoscopic image processing apparatus 100 operates the depth information of the image with depth information to realize proper parallax. Therefore, the input is mainly an image with depth information.
- An "image format determination unit" may be provided as a preprocessing stage that determines whether the input is three-dimensional data, a parallax image, or an image with depth information, and then activates the most suitable of the first to third stereoscopic image processing apparatuses 100.
- the first stereoscopic image processing apparatus 100 is used to set a stereoscopic effect for stereoscopic display.
- the first three-dimensional image processing apparatus 100 further has a sub-function “parallax correction” for artificially reducing parallax in a peripheral portion of an image.
- The displacement between the plurality of viewpoint images is more likely to be perceived as a "double image" toward the edges of the image. This is mainly caused by mechanical errors such as warping of the screen in parallax-barrier display devices. Therefore, at the periphery of the image, various methods are implemented, such as 1) reducing both the near parallax and the far parallax, 2) reducing the near parallax while leaving the far parallax unchanged, and 3) shifting the whole image toward the far parallax regardless of near or far parallax. Note that the "parallax correction" function also exists in the third stereoscopic image processing apparatus 100, but its processing differs because of the difference in input data.
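Method 1) above — attenuating both near and far parallax toward the image edge — could be sketched like this on a per-pixel disparity map. The radial-falloff rule and the edge_fraction value are our assumptions; the actual distortion conversion uses the correction maps described later:

```python
def attenuate_peripheral_parallax(disparity_map, edge_fraction=0.2):
    # disparity_map is a 2-D list of signed per-pixel disparities.
    # Each value is scaled by a factor that is 0 at the image edge and
    # 1 in the interior, so both near (negative) and far (positive)
    # parallax shrink toward the periphery.
    h = len(disparity_map)
    w = len(disparity_map[0])
    out = []
    for y, row in enumerate(disparity_map):
        new_row = []
        for x, d in enumerate(row):
            # Distance to the nearest image edge, normalized.
            dist = min(x, y, w - 1 - x, h - 1 - y) / max(w, h)
            scale = min(1.0, dist / edge_fraction)
            new_row.append(d * scale)
        out.append(new_row)
    return out
```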
- The first stereoscopic image processing device 100 includes: a stereoscopic effect adjustment unit 112 that adjusts the stereoscopic effect based on the user's response to a stereoscopically displayed image; a parallax information storage unit 120 that stores the appropriate parallax specified by the stereoscopic effect adjustment unit 112; a parallax control unit 114 that reads the appropriate parallax from the parallax information storage unit 120 and generates a parallax image having the appropriate parallax from the original image; an information acquisition unit 118 that acquires hardware information of the display device and the stereoscopic display method; and a format conversion unit 116 that changes the format of the parallax image generated by the parallax control unit 114 based on the information acquired by the information acquisition unit 118.
- the original image is simply called a three-dimensional image, but strictly speaking, it is the object and space data described in the world coordinate system.
- Examples of the information acquired by the information acquisition unit 118 include the number of viewpoints for stereoscopic display, the method of the stereoscopic display device (such as space division or time division), and whether shutter glasses are used.
- In the case of the binocular method, the information also includes the arrangement of the viewpoint images, whether there are lines in the parallax image where the parallax is inverted, the result of head tracking, and so on. Note that, as an exception, only the head tracking result is input directly to the camera arrangement determination unit 132 via a path (not shown) and processed there.
- The above configuration can be realized in hardware by the CPU, memory, and other LSIs of any computer, and in software by programs having a GUI function, a parallax control function, and other functions; here, functional blocks realized by their cooperation are drawn. Those skilled in the art will therefore understand that these functional blocks can be realized in various forms: by hardware alone, by software alone, or by a combination of the two. The same applies to the configurations described below.
- The stereoscopic effect adjustment unit 112 includes an instruction acquisition unit 122 and a parallax specifying unit 124.
- the instruction obtaining unit 122 obtains an appropriate parallax range when the user specifies the range of the parallax for the stereoscopically displayed image.
- the parallax specifying unit 124 specifies a proper parallax when the user uses the display device based on the range. Appropriate parallax is expressed in an expression format that does not depend on the hardware of the display device. Achieving proper parallax enables stereoscopic vision that matches the user's physiology.
- The parallax control unit 114 includes: a camera temporary arrangement unit 130 that provisionally sets camera parameters; a camera arrangement determination unit 132 that corrects the provisionally set camera parameters according to the appropriate parallax; an origin moving unit 134 that moves the origin so that the midpoint between the multiple cameras becomes the origin; a projection processing unit 138 that performs the above-described projection processing; and a two-dimensional image generation unit 142 that generates a parallax image by performing conversion to the screen coordinate system.
- In addition, a distortion processing unit 136, which performs spatial distortion conversion (hereinafter also simply called "distortion conversion") to reduce the parallax in the peripheral portion of the image as necessary, is provided between the camera temporary arrangement unit 130 and the camera arrangement determination unit 132.
- the distortion processing unit 136 reads a correction map described later from the correction map holding unit 140 and uses it.
- The second stereoscopic image processing apparatus 100 in FIG. 12 receives a plurality of parallax images as input; these are simply called the input images.
- The second stereoscopic image processing device 100 reads the appropriate parallax acquired earlier by the first stereoscopic image processing device 100, adjusts the parallax of the input images to fall within the appropriate parallax range, and outputs the result. In that sense, the second stereoscopic image processing apparatus 100 has an "automatic adjustment" function for parallax.
- a GUI function is provided, and a “manual adjustment” function for changing the parallax according to the user's instruction is also provided.
- Although the parallax of an already generated parallax image usually cannot be changed, the second stereoscopic image processing apparatus 100 can change the stereoscopic effect, at a level sufficient for practical use, simply by shifting the synthesis positions of the viewpoint images forming the parallax image.
- the second stereoscopic image processing device 100 exhibits a good stereoscopic effect adjusting function even in a situation where input data cannot be traced back to three-dimensional data.
- differences from the first stereoscopic image processing apparatus 100 will be mainly described.
- The stereoscopic effect adjustment unit 112 is used for manual adjustment.
- The instruction acquisition unit 122 realizes, for example, numerical input of "+n" or "−n" on a screen, and the parallax specifying unit 124 specifies that value as the amount of parallax change, so that the relationship between the numerical value and the indicated stereoscopic effect is easy to grasp.
- "+n" is an instruction to strengthen the stereoscopic effect, and "−n" is an instruction to weaken it.
- Alternatively, "+n" may be an instruction to move the object as a whole in the near direction, and "−n" an instruction to move it as a whole in the far direction.
- Alternatively, the value of n need not be specified; only "+" and "−" buttons may be displayed, and the parallax changed each time a button is clicked.
- the second stereoscopic image processing apparatus 100 includes a parallax amount detection unit 150 and a parallax control unit 152.
- The parallax amount detection unit 150 examines the header areas of the parallax images and, if the amount of parallax is described there in the form of pixel counts, in particular the maximum near-parallax pixel count and the maximum far-parallax pixel count, acquires it. If the amount of parallax is not described, the matching unit 158 specifies it by detecting corresponding points between the parallax images using a known method such as block matching.
- The matching unit 158 may process only an important area such as the center of the image, or may narrow its detection down to the most important value, the maximum near-parallax pixel count.
- the detected amount of parallax is sent to the parallax control unit 152 in the form of the number of pixels.
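A minimal version of the block matching mentioned above — a sum-of-absolute-differences search over horizontal shifts — might look like the following. The SAD criterion, block size, and search range are conventional choices of ours, not details from the patent:

```python
def block_match_disparity(left, right, y, x, block=3, max_d=4):
    # Find the horizontal disparity of the block centered at (y, x) in
    # `left` by minimum sum-of-absolute-differences search in `right`.
    # Images are 2-D lists of pixel values.
    h, w = len(left), len(left[0])
    half = block // 2

    def sad(d):
        s = 0
        for dy in range(-half, half + 1):
            for dx in range(-half, half + 1):
                s += abs(left[y + dy][x + dx] - right[y + dy][x + dx - d])
        return s

    # Only consider shifts for which the whole block stays in bounds.
    candidates = [d for d in range(-max_d, max_d + 1)
                  if 0 <= x - half - d and x + half - d < w]
    return min(candidates, key=sad)
```

Running this at corresponding points across the image (or, as the text suggests, only near the center) yields the per-point disparity in pixel counts, from which the near and far maxima can be taken.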
- The range of the appropriate parallax may be determined by the manufacturer of the stereoscopic image display device, by the creator of the content displayed on it, or by other methods, for example by following guidelines and standards established by industry and academic organizations concerned with stereoscopic images.
- The position shift unit 160 of the parallax control unit 152 shifts the synthesis positions of the viewpoint images forming the parallax image in the horizontal direction so that the amount of parallax between the viewpoint images becomes the appropriate parallax.
- the shift may be performed on any of the viewpoint images.
- The position shift unit 160 also has another operation mode: when the user instructs an increase or decrease of the parallax via the stereoscopic effect adjustment unit 112, the image synthesis position is simply changed according to the instruction. That is, the position shift unit 160 has two functions: automatic adjustment to the appropriate parallax and manual adjustment by the user.
- The parallax writing unit 164 writes the amount of parallax, in pixel counts, into the header area of one of the viewpoint images constituting the parallax image, for use by the above-described parallax amount detection unit 150 or for other purposes.
- Since the shift by the position shift unit 160 leaves missing pixels at the image edges, the image edge adjustment unit 168 fills them in.
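The shift-and-fill behaviour of the position shift unit 160 and the image edge adjustment unit 168 can be pictured with a small sketch; edge replication is just one plausible fill strategy, chosen by us for illustration:

```python
def shift_viewpoint_image(image, shift):
    # Shift a viewpoint image (2-D list of pixels) horizontally by
    # `shift` pixels (positive = rightward), changing the synthesis
    # position and hence the overall parallax. Pixels exposed at the
    # edge are filled by replicating the nearest remaining column.
    out = []
    for row in image:
        if shift > 0:
            shifted = [row[0]] * shift + row[:-shift]
        elif shift < 0:
            shifted = row[-shift:] + [row[-1]] * (-shift)
        else:
            shifted = row[:]
        out.append(shifted)
    return out
```

Shifting only one of the two viewpoint images by n pixels changes the parallax of every point by n pixels, moving the whole scene uniformly in the near or far direction.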
- the third stereoscopic image processing apparatus 100 in FIG. 13 receives an image with depth information as an input.
- the third stereoscopic image processing apparatus 100 adjusts the depth so as to realize proper parallax. It also has the “parallax correction” function described above.
- First, the distortion processing unit 174 of the parallax control unit 170 performs distortion conversion according to the correction map held in the correction map holding unit 176, as described later.
- The depth information and the image after the distortion conversion are input to a two-dimensional image generation unit 178, where a parallax image is generated. Unlike the two-dimensional image generation unit 142 of the first stereoscopic image processing apparatus 100, the appropriate parallax is taken into account here.
- Although not shown, the two-dimensional image generation unit 178 has a function similar to the position shift unit 160 of the second stereoscopic image processing apparatus 100: it shifts pixels within the image horizontally according to the depth information to create the stereoscopic effect. The appropriate parallax is realized at this time by the processing described later.
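Depth-driven pixel shifting of this kind can be sketched as follows — a deliberately naive forward-mapping version. The gain, the rounding, and the gap handling are our simplifications; proper occlusion handling, where nearer pixels should win, is omitted:

```python
def parallax_from_depth(image, depth, gain=0.1):
    # Generate a left/right parallax image pair from a 2-D image with a
    # per-pixel depth map by shifting each pixel horizontally in
    # proportion to its depth. Gaps left by the forward mapping simply
    # retain the original pixel values.
    h, w = len(image), len(image[0])
    left = [row[:] for row in image]
    right = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            s = int(round(gain * depth[y][x]))  # per-pixel shift
            if 0 <= x + s < w:
                left[y][x + s] = image[y][x]
            if 0 <= x - s < w:
                right[y][x - s] = image[y][x]
    return left, right
```

Scaling the gain scales every disparity proportionally, which is one simple way the generated parallax can be kept inside an appropriate range.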
- the processing operation and principle of each unit of each stereoscopic image processing apparatus 100 in the above configuration are as follows.
- FIGS. 14(a) and 14(b) show the left-eye image 200 and the right-eye image 202 displayed in the process of specifying the appropriate parallax by the stereoscopic effect adjusting unit 112 of the first stereoscopic image processing apparatus 100. In each image, five black circles are displayed, each with a different parallax ranging from near to far.
- FIG. 15 schematically shows the sense of distance perceived by the user 10 when these five black circles are displayed.
- the user 10 responds that the range of these five senses of distance is “appropriate”, and the instruction acquiring unit 122 acquires this response.
- five black circles with different parallaxes are displayed simultaneously or sequentially, and the user 10 inputs whether or not the parallax is acceptable.
- alternatively, the display uses a single black circle whose disparity is changed continuously, and the user 10 responds when the allowable limit is reached in each of the far and near directions.
- the response may be based on a well-known technique, such as a normal key operation, a mouse operation, or a voice input.
- the determination of the parallax may be performed by a simpler method.
- the setting range of the basic expression space may be determined by a simple method.
- FIG. 89 shows a table used for the simple determination of the parallax and the basic expression space.
- the setting range of the basic expression space is divided into four ranks, A to D, ranging from a setting that enlarges the near space side to a setting that uses only the far space side, and the parallax is divided into five ranks, 1 to 5.
- for example, if the user prefers the strongest stereoscopic effect with the near-space setting, the rank is 5A. It is not always necessary to determine the rank while checking a stereoscopic display; only buttons for selecting the rank may be displayed, with a separate button beside them for confirming the stereoscopic effect, which displays a confirmation image when pressed.
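As a small sketch of how such a rank selection might be handled in software: the rank labels follow the table described above, but the decoded meanings below are illustrative assumptions, not values fixed by the specification.

```python
# Hypothetical decoding of a user-selected rank such as "5A" into a
# parallax level (1-5) and a basic-expression-space setting (A-D).
# The textual meanings of A-D are our assumptions for illustration.
SPACE_SETTINGS = {
    "A": "near space enlarged",
    "B": "near and far balanced",
    "C": "far space emphasized",
    "D": "far space only",
}

def decode_rank(rank: str) -> dict:
    """Split a rank like '5A' into its parallax level and space setting."""
    level, space = int(rank[:-1]), rank[-1].upper()
    if not 1 <= level <= 5 or space not in SPACE_SETTINGS:
        raise ValueError(f"unknown rank: {rank}")
    return {"parallax_level": level, "space": SPACE_SETTINGS[space]}

print(decode_rank("5A"))
```

Such a lookup lets the rank buttons set both parameters at once, without the user ever inspecting a stereoscopic test image.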
- the instruction acquisition unit 122 can acquire the appropriate disparity as a range, and the limit disparity on the near side and the far side is determined.
- the near maximum disparity is the disparity corresponding to the proximity allowed for the point seen closest to the user
- the far maximum disparity is the disparity corresponding to the distance allowed for the point seen farthest from the user.
- the near maximum disparity and the far maximum disparity are collectively referred to as the limit disparity.
- FIG. 17 illustrates the principle of actually adjusting the parallax of two viewpoints when a stereoscopically displayed image is extracted from a three-dimensional image.
- the limit parallax determined by the user is converted into the estimated angle of view of the temporarily placed camera.
- the limit parallaxes in the near and far directions are represented by the numbers of pixels M and N, and the angle of view θ of the camera corresponds to the number L of horizontal pixels of the display screen.
- the near limit viewing angle φ and the far limit viewing angle ψ are then expressed in terms of θ, M, N, and L.
- a basic expression space T (the depth of which is also described as T) is determined.
- S is the distance from the front projection plane 30, which is the front face of the basic representation space T, to the camera arrangement plane, i.e., the viewpoint plane 208.
- T and S can be specified by the user.
- A is the distance between the optical axis intersection plane 210 and the front projection plane 30.
- E is the distance between viewpoints.
- the point G, which is a pixel without parallax, lies where the optical axes K2 from both cameras intersect on the optical axis intersection plane 210, and the optical axis intersection plane 210 coincides with the screen position.
- the rays K1 that produce the near maximum parallax P intersect on the front projection plane 30, and the rays K3 that produce the far maximum parallax Q intersect on the rear projection plane 32.
- from these relations, the optical axis intersection distance D and the inter-camera distance E are automatically determined, fixing the camera parameters. If the camera arrangement determining unit 132 arranges the cameras according to these parameters, the projection processing unit 138 and the two-dimensional image generation unit 142 can process the image from each camera independently, and a parallax image with proper parallax can be generated and output. As described above, E and A contain no hardware information, so a hardware-independent expression format is realized.
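The derivation above can be sketched numerically. The fragment below is a simplified model under a small-angle assumption (our simplification, not the patent's exact formulas): the pixel limit parallaxes M and N are first converted to angular limits via the screen width L and angle of view θ, and E and D are then solved from the conditions that the angular parallax equals the near limit on the front projection plane and the far limit on the rear projection plane.

```python
import math

def camera_parameters(M, N, L, theta, S, T):
    """Sketch: derive inter-camera distance E and optical-axis
    crossing distance D from the limit parallaxes.

    M, N  : near / far limit parallax in pixels
    L     : horizontal pixel count of the display screen
    theta : camera angle of view (radians)
    S     : distance from the viewpoint plane to the front projection plane
    T     : depth of the basic representation space
    """
    # Pixel parallax -> angular parallax (theta corresponds to L pixels).
    phi = 2 * math.atan(M * math.tan(theta / 2) / L)  # near limit angle
    psi = 2 * math.atan(N * math.tan(theta / 2) / L)  # far limit angle

    # Small-angle model (assumption): a point at depth Z has angular
    # parallax E*(1/Z - 1/D); it must equal +phi at Z = S (front plane)
    # and -psi at Z = S + T (rear plane).
    a, b = 1.0 / S, 1.0 / (S + T)
    x = (psi * a + phi * b) / (phi + psi)  # x = 1/D
    D = 1.0 / x
    E = phi / (a - x)
    return E, D
```

Because x is a weighted average of 1/S and 1/(S+T), the crossing plane always falls inside the basic representation space, matching FIG. 17.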
- FIG. 20 shows a four-lens camera arrangement with four cameras 22, 24, 26, 28.
- A and E should be determined so that proper parallax is obtained between adjacent cameras, such as between the first camera 22 and the second camera 24. In other words, even if the A and E determined between the second camera 24 and the third camera 26, which are closer to the center, are diverted to the other cameras, almost the same effect is obtained.
- T constrains the arrangement of objects, and may be determined by the program as the size of a basic three-dimensional space. In that case, objects may always be placed only within this basic representation space T throughout the program, or, for dramatic effect, an object may intentionally be given parallax so that it sometimes jumps out of this space.
- T may instead be determined from the coordinates of the nearest and farthest objects in the three-dimensional space. If this is done in real time, objects can always be placed within the basic representation space T. As an exception to always keeping objects inside the basic representation space T, a short-lived exception can be created by operating under the relaxed condition that "the average position over a certain time should remain within the basic representation space T".
- the objects that define the basic expression space T may be limited to static objects. In this case, an exceptional operation in which a dynamic object protrudes from the basic expression space T can be provided.
- a conversion may be performed that reduces the space in which objects are already arranged to the width T of the basic expression space, and this may be combined with the operations described above.
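A minimal sketch of such a reduction, assuming a plain linear rescaling of object depths (the patent does not fix the form of the conversion):

```python
# Sketch (our assumption): linearly rescale the depth coordinates of
# already-placed objects so they fit inside the basic representation
# space of width T, preserving their relative order.
def fit_to_basic_space(depths, T):
    """Map object depths into [0, T]."""
    z_min, z_max = min(depths), max(depths)
    span = z_max - z_min
    if span == 0:
        return [T / 2 for _ in depths]  # all objects at one depth
    return [(z - z_min) * T / span for z in depths]

print(fit_to_basic_space([2.0, 5.0, 11.0], 6.0))
```

Run continuously, this realizes the real-time behavior described above, in which the nearest and farthest objects always define the boundaries of T.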
- a method for intentionally displaying objects so that they protrude from the basic expression space will be described later.
- if the stereoscopic effect adjusting unit 112 of the first stereoscopic image processing apparatus 100 uses, as the image displayed to the user, an image that readily produces double images, the limit parallax is determined to be smaller, and the appearance frequency of double images in other images can be reduced. Images with strong contrast between object and background are known to produce double images easily, and such images can be used at the stage of specifying the limit parallax, that is, at the time of initial setting.
- FIG. 21 to FIG. 36 show the processing by the distortion processing unit 136 of the first stereoscopic image processing apparatus 100 and its principle.
- FIG. 21 conceptually shows an example of the correction map stored in the correction map holding unit 140 of the first stereoscopic image processing apparatus 100.
- this map corrects the parallax directly, and the whole map corresponds to the parallax image as it is.
- FIG. 22 shows the change in parallax resulting from the camera-parameter operation by the camera arrangement determining unit 132 when the distortion processing unit 136 instructs a camera arrangement according to the correction map.
- "normal parallax" is assigned when viewing near the front, while "small parallax" is assigned when viewing directions far from the front.
- to achieve this, the camera arrangement determining unit 132 narrows the camera interval toward the periphery.
- FIG. 23 shows another example in which the camera arrangement determining unit 132 changes the parallax by changing the camera arrangement according to the instruction of the distortion processing unit 136.
- the parallax changes from "normal parallax" through "medium parallax" to "small parallax" toward the periphery of the image while only the left camera of the two cameras is moved. This method has a lower computational cost than that of FIG. 22.
- FIG. 24 shows another example of the correction map, which also changes the parallax. The center of the image is left untouched with normal parallax, and the parallax is gradually reduced in the surrounding parallax correction area.
- FIG. 25 shows the camera positions changed by the camera arrangement determining unit 132 according to this map. As the viewing direction deviates from the front, the position of the left camera is shifted toward the right camera, giving "small parallax".
- FIG. 26 conceptually shows another example of the correction map.
- This map corrects the sense of distance from the viewpoint to the object.
- to realize this, the camera arrangement determining unit 132 adjusts the optical axis intersection distance of the two cameras. If the optical axis intersection distance is reduced toward the periphery of the image, objects appear relatively deeper in the far direction, which is particularly useful for reducing the near parallax.
- the camera arrangement determining unit 132 may change the direction of the optical axis of the camera, and may change the direction of one of the cameras.
- FIG. 27 shows the change of the optical axis intersection position, i.e. the optical axis intersection plane 210, when a two-dimensional image is generated based on the map of FIG. 26. Toward the periphery of the image, the optical axis intersection plane 210 approaches the cameras.
- FIG. 28 shows another correction map for the sense of distance.
- FIG. 29 shows how the camera arrangement is changed according to the map of FIG. 28: objects are placed at their normal positions without correction in the central area of the image, and their positions are corrected in the peripheral area.
- FIGS. 30A to 30F show another distortion conversion by the distortion processing unit 136.
- the 3D space itself is directly distorted in the camera coordinate system.
- in FIGS. 30(a) to 30(f), the rectangular area is a top view of the original space, and the hatched area is a top view of the converted space.
- the point U in the original space of FIG. 30(a) moves to the point V after conversion, meaning this point has been moved farther away.
- in FIG. 30(a), the space is compressed in the depth direction toward the periphery, so that both near and far points approach a certain sense of distance, as shown by the point W in the figure.
- as a result, the sense of distance becomes uniform at the periphery of the image, where no object is specially placed. This alleviates the problem of double images and provides an expression easily adapted to the user's physiology.
- FIGS. 30(b), 30(c), 30(d), and 30(e) all show modifications of the transformation that bring the sense of distance closer to a constant value around the image; each shows an example of transforming points in the distance direction.
- FIG. 31 shows the principle for realizing the conversion of FIG. 30 (a).
- the rectangular parallelepiped space 228 includes a space in which the projection processing of the first camera 22 and the second camera 24 is performed.
- the view volume of the first camera 22 is determined by its angle of view together with the front projection plane 230 and the rear projection plane 232; that of the second camera 24 is determined by its angle of view together with the front projection plane 234 and the rear projection plane 236.
- the distortion processing unit 136 performs distortion conversion on the rectangular parallelepiped space 228.
- the origin is the center of the rectangular parallelepiped space 228. In the case of a multi-view system, the conversion principle is the same, only the number of cameras increases.
- FIG. 32 shows an example of the distortion conversion, which employs a reduction conversion in the Z direction. In practice, it processes individual objects in the space.
- the conversion formula is as follows.
- the point E is the intersection of the straight line and the plane. If its coordinates are written (X0, Y0, Z2), the straight line satisfies X = X0, Y = Y0, and Z2 is obtained from the equation of the plane.
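A hedged sketch of the Z-direction reduction of FIG. 32: points are compressed in depth the closer they lie to the periphery of the rectangular parallelepiped space 228. The linear falloff used below is our assumption for illustration; the patent defines the target depth via the intersection E of a straight line and a bounding plane.

```python
# Sketch (assumed linear model): reduce |z| as the point (x, y)
# approaches the periphery of the space. The origin is the centre of
# the rectangular parallelepiped space, as stated in the text.
def compress_depth(x, y, z, half_w, half_h, strength=0.5):
    """Return (x, y, z') with depth compressed toward the periphery.

    half_w, half_h : half-extents of the space in X and Y
    strength       : fraction of depth removed at the very edge
    """
    # Normalised distance to the periphery: 0 at the centre, 1 at an edge.
    r = min(max(abs(x) / half_w, abs(y) / half_h), 1.0)
    return x, y, z * (1.0 - strength * r)
```

Points at the centre are untouched, while points at the edge approach a constant sense of distance, which is the behavior illustrated by point W in FIG. 30(a).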
- FIG. 35 shows another example of distortion transformation. More precisely, in consideration of the fact that the camera shoots radially, reduction processing in the X-axis and Y-axis directions is also combined. Here, the conversion is performed with the midpoint of the two cameras as the representative camera position.
- the conversion formula is as follows.
- FIG. 36 verifies this transformation. Again, consider the ranges of X0 and Y0. When the point (X0, Y0, Z0) is moved to the point (X1, Y1, Z1) by the reduction processing, the reduction ratios Sx, Sy, and Sz are obtained.
- the processing changes at the boundary of the tangent between the planes, and in some cases, a sense of discomfort may occur.
- the connection may be made by a curved surface, or the space may be constituted only by curved surfaces. The calculation simply becomes finding the intersection E of the curved surface and the straight line.
- the reduction ratio is the same along the same straight line CD, but weighting may be applied.
- a weighting function G (L) for the distance L from the camera may be applied to Sx, Sy, and Sz.
- FIGS. 37 to 40 show the processing of the distortion processing unit 174 of the third stereoscopic image processing apparatus 100 and its principle.
- FIG. 37 shows a depth map of an image with depth information input to the third stereoscopic image processing apparatus 100; the depth values range from K1 to K2.
- the near depth is expressed as positive and the far depth is expressed as negative.
- FIG. 38 shows the relationship between the original depth range 240 and the converted depth range 242.
- the depth approaches a certain value as it goes to the periphery of the image.
- the distortion processing unit 174 converts the depth map according to this correction. The same applies when parallax is provided in the vertical direction. Since this conversion is only a reduction in the Z direction, it can be expressed by a simple scaling equation.
- FIG. 40 illustrates another principle of distortion transformation for a depth map. Since, strictly speaking, the space is observed radially from the user 10, reduction processing in the X-axis and Y-axis directions is also combined. Here, the interocular center is set as the observation position. The specific processing is the same as in the case of FIG. 36. Note that the original depth map has only a Z value, but when this calculation is performed, X and Y values are also retained. The Z value is converted into a pixel shift amount in the X or Y direction, while the X and Y values may be held as offset values for them.
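A sketch of the depth-map conversion of FIGS. 38 and 39, driving each depth value toward a constant at the image periphery. The radial weight below is our assumption; the patent only requires that the depth approach a certain value toward the edges.

```python
import numpy as np

def flatten_depth_periphery(depth, strength=1.0):
    """Drive a depth map toward 0 near the image periphery.

    depth : 2-D array, near depths positive and far depths negative,
            matching the convention in the text.
    """
    h, w = depth.shape
    ys = np.abs(np.linspace(-1.0, 1.0, h))[:, None]
    xs = np.abs(np.linspace(-1.0, 1.0, w))[None, :]
    r = np.maximum(ys, xs)                # 0 at centre, 1 at the edges
    weight = np.clip(1.0 - strength * r, 0.0, 1.0)  # per-pixel reduction
    return depth * weight
```

Because the weight depends only on pixel position, it implements a per-pixel reduction ratio in the Z direction while leaving the image coordinates untouched, consistent with the Z-only reduction described above.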
- the depth map converted by the distortion processing unit 174 and the original image are input to the two-dimensional image generation unit 178, where pixels are shifted in the horizontal direction and synthesized so as to realize appropriate parallax. The details are described later.
- FIGS. 41 to 51 show the processing of the position shift unit 160 of the second stereoscopic image processing apparatus 100 and of the two-dimensional image generation unit 178 of the third stereoscopic image processing apparatus 100, which can be regarded as its extension.
- FIG. 41 shows the principle of shifting the combined position of two parallax images by the position shift unit 160.
- the positions of the right-eye image R and the left-eye image L match in the initial state.
- when the images are shifted relative to each other in one direction, the parallax at near points increases and the parallax at far points decreases.
- when they are shifted in the opposite direction, the parallax at near points decreases and the parallax at far points increases.
- the above is the essence of parallax adjustment by shifting the parallax images.
- either image may be shifted alone, or both may be shifted in opposite directions. From this principle it can be seen that this adjustment applies to every stereoscopic display method using parallax, with or without glasses. Similar processing is possible for multi-view images and for vertical parallax.
- FIG. 42 shows the shift process at the pixel level.
- the first quadrangle 250 has near parallax; when the amount of parallax is represented by a positive number, it is "6 pixels".
- the second quadrangle 252 has far parallax; when the amount of parallax is represented by a negative number, it is "−6 pixels".
- these amounts of parallax are denoted F2 and F1, respectively.
- when the appropriate parallax is J1 to J2, the position shift unit 160 shifts the synthesis start positions of the two images by (J2 − F2) pixels relative to each other.
- Figure 43 shows the state after the end of the shift.
- the synthesis start positions are mutually shifted by −2 pixels, that is, in the direction that shifts the whole toward the far side.
- a double image in the near direction causes more discomfort than one in the far direction, and the main subject is usually placed in the near direction; in most cases it is therefore desirable to keep the parallax in the near direction within the limit.
- the following is an example of processing.
- 1. If the near point is outside the limit parallax and the far point is within it, the near point is shifted to the limit parallax point; however, if the parallax of the far point reaches the interocular distance, the processing is stopped there.
- 2. If both the near point and the far point are outside the limit parallax, the near point is likewise shifted to the limit parallax point; again, if the parallax of the far point reaches the interocular distance, the processing is stopped there.
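The two rules can be sketched as follows. This is our reading of them: F1, F2, and J2 follow the notation of FIG. 42 (far parallax negative, near parallax positive), and the clamp implements the stop condition on the interocular distance.

```python
def shift_for_limit_parallax(F1, F2, J2, eye_px):
    """Signed shift (pixels) of the synthesis position.

    F1, F2 : detected far / near maximum parallax (far is negative)
    J2     : near limit parallax
    eye_px : interocular distance expressed in pixels
    Returns 0 if the near point is already within the limit.
    """
    if F2 <= J2:
        return 0                     # near side already within the limit
    shift = J2 - F2                  # negative: shift the whole toward far
    # Stop once the far-point parallax would reach the interocular distance.
    limit = -(eye_px - abs(F1))
    return max(shift, min(limit, 0))
```

With the example values of FIG. 42 (F2 = 6, F1 = −6, J2 = 4), the function returns −2 pixels, matching the shift described for FIG. 43.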
- FIG. 44 shows the loss of the image edge due to the shift of the combining position.
- when the shift amount between the left-eye image 200 and the right-eye image 202 is 1 pixel, a missing portion 260 of 1-pixel width occurs at the right edge of the left-eye image 200 and at the left edge of the right-eye image 202.
- the image edge adjusting unit 168 duplicates the pixel column at the image edge, as shown in FIG. 44, to restore the number of horizontal pixels.
- alternatively, the missing portion 260 may be displayed in a specific color such as black or white, or may be hidden. The image may also be cropped or padded to match the size of the initial image. Further, the initial image may be made larger than the actual display size in advance so that the missing portion 260 does not affect the display.
- FIG. 45 shows the flow of manual adjustment of parallax by the second stereoscopic image processing apparatus 100.
- first, left and right images are manually created as parallax images (S10), and are distributed via a network or other route (S12).
- This is received by the second stereoscopic image processing device 100 (S14), and in the example of this figure, the image is first synthesized and displayed in a normal state without any shift (S16). That is, here, a case is considered where proper parallax has not yet been acquired or the position shift unit 160 has not been operated.
- when the user requests parallax adjustment, the position shift unit 160 receives the instruction in the "manual adjustment mode", adjusts the image synthesis position, and displays the result (S18).
- S10 and S12 constitute the procedure 270 of the image creator, and S14 onward constitute the procedure 272 of the second stereoscopic image processing device 100.
- FIG. 46 shows the flow of automatic adjustment by the second stereoscopic image processing apparatus 100.
- in the procedure 270 of the image creator, the generation of the left and right images (S30) and the image distribution (S32) are the same as in FIG. 45.
- in the procedure 272 of the second stereoscopic image processing apparatus 100, the same applies to the image reception (S34).
- the parallax between the parallax images, in particular the maximum parallax, is detected by the matching unit 158 of the parallax amount detection unit 150 (S36), while the appropriate parallax, in particular the limit parallax, is acquired from the parallax information holding unit 120 (S38).
- the position shift unit 160 shifts the synthesis position of the images so as to satisfy the limit parallax by the processing described above (S40), and the image is displayed stereoscopically after processing by the parallax writing unit 164, the image edge adjusting unit 168, and the format conversion unit 116 (S42).
- FIG. 47 shows a flow of still another automatic adjustment by the second stereoscopic image processing apparatus 100.
- the maximum parallax is detected at this point (S52) and recorded in the header of one of the viewpoint images (S54). This detection may be performed by corresponding point matching; however, when the creator generates the parallax images manually, the value is naturally known in the editing process and need only be recorded. Thereafter, the images are distributed (S56).
- the same processing can be performed in a multi-view system, and the same processing may be performed on the amount of parallax between adjacent viewpoint images.
- the maximum among the parallaxes of these multiple viewpoint images may be regarded as the "maximum parallax" between all viewpoint images, and the shift amount of the synthesis position determined from it.
- the header information only needs to be present in at least one of the multi-view images; if the multi-view images are combined into one image, the header of that image may be used.
- in some cases, already-combined images are distributed. The images can then be separated by an inverse transformation, the synthesis-position shift amount calculated, and the images recombined; alternatively, a pixel rearrangement process giving the same result may be performed.
- FIGS. 48 to 51 show the process of shifting the synthesis position for an image with depth information, performed by the two-dimensional image generation unit 178 of the third stereoscopic image processing apparatus 100.
- FIGS. 48 and 49 show a plane image 204 and a depth map, each of which forms an image with depth information. Here, the near depth is expressed as positive and the far depth is expressed as negative.
- a first quadrangle 250, a second quadrangle 252, and a third quadrangle 254 exist as objects.
- the first quadrangle 250 has a depth of "4" and is at the nearest point, the second quadrangle 252 has a depth of "2" and is at an intermediate point, and the third quadrangle 254 has a depth of "−4" and is at the farthest point.
- the two-dimensional image generation unit 178 shifts each pixel of the original plane image 204 by the value of the depth map to generate the other viewpoint image. If a left-eye image is the reference, the original plane image 204 becomes the left-eye image as it is; the first quadrangle 250 is shifted by 4 pixels to the left, the second quadrangle 252 by 2 pixels to the left, and the third quadrangle 254 by 4 pixels to the right, creating the right-eye image 202.
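The per-pixel shift can be sketched as follows, using the sign convention of the text (near depths positive, shifting pixels left; far depths negative, shifting pixels right). Treating the plane image as a single-channel array is our simplification.

```python
import numpy as np

def generate_right_eye(plane, depth_map):
    """Synthesise the right-eye image from a plane (left-eye) image by
    shifting each pixel horizontally by its depth value.

    Missing pixels are left as 0 for the subsequent image edge
    adjustment step to fill in.
    """
    h, w = depth_map.shape
    right = np.zeros_like(plane)
    for y in range(h):
        for x in range(w):
            nx = x - int(depth_map[y, x])  # positive depth -> left shift
            if 0 <= nx < w:
                right[y, nx] = plane[y, x]
    return right
```

An object with depth 4 thus lands 4 pixels to the left in the right-eye image, reproducing the example of the three quadrangles above.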
- the image edge adjustment unit 168 fills each portion 260 where pixel information is missing due to object movement with neighboring pixels whose parallax is "0" and which are judged to be background.
- the two-dimensional image generation unit 178 also calculates depths that satisfy the appropriate parallax. If the depth range is K1 to K2 and the depth value of each pixel is Gxy, the depth map has the form of FIG. 37 with Hxy replaced by Gxy. Suppose further that the appropriate parallax of the display device held by the user is known to be J1 to J2. Then the depth value Gxy of each pixel is converted as follows to obtain a new depth value Fxy.
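A minimal linear reading of this conversion, mapping the source depth range K1..K2 onto the appropriate-parallax range J1..J2 (the exact form is not fixed by the text):

```python
# Sketch (assumed linear form): remap a depth value Gxy from the range
# K1..K2 onto the appropriate-parallax range J1..J2, so the nearest
# depth K1 maps to J1 and the farthest depth K2 maps to J2.
def remap_depth(Gxy, K1, K2, J1, J2):
    """Return the new depth value Fxy for a pixel depth Gxy."""
    return J1 + (Gxy - K1) * (J2 - J1) / (K2 - K1)
```

Applied per pixel, this yields the new depth map Fxy used for the horizontal pixel shifts described above.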
- a weighting function F(Gxy) may be further applied to Gxy, and various other non-linear conversions are conceivable. It is also possible to shift objects in the opposite direction from the original plane image 204 to generate new left and right images. In the case of a multi-view system, the same processing may be repeated to generate multi-view images.
- although the stereoscopic image processing apparatus 100 has been described as an apparatus, it may be a combination of hardware and software, or software alone. In that case, it is convenient if arbitrary parts of the stereoscopic image processing apparatus 100 are made into a library that can be called from various programs. Programmers can then skip the programming that requires knowledge of stereoscopic display. For users, operations related to stereoscopic display, i.e. the GUI, become common across software and content, and the set information can be shared with other software, saving the trouble of resetting.
- Various programs can determine the state of the image by referring to the information.
- An example of the shared information is the information obtained by the information obtaining unit 118 of the stereoscopic image processing apparatus 100 described above. This information may be stored in a recording unit (not shown), the correction map storage unit 140, or the like.
- FIGS. 52 to 54 show an example in which the above-described stereoscopic image processing apparatus 100 is used as a library.
- FIG. 52 shows an application of the stereoscopic display library 300.
- the stereoscopic display library 300 is called as functions from a plurality of programs: program A 302, program B 304, program C 306, and so on.
- the parameter file 318 stores, in addition to the above information, the user's appropriate parallax and the like.
- the stereoscopic display library 300 is used by a plurality of devices, device A 312, device B 314, and device C 316, via an API (application program interface) 310.
- examples of the program A 302 include games, so-called Web3D applications, three-dimensional desktop screens, three-dimensional maps, viewers for parallax images treated as two-dimensional images, and viewers for images with depth information. Some games use coordinates differently, but the stereoscopic display library 300 can handle that as well.
- examples of the device A312 include any stereoscopic display device using parallax, such as a binocular or multi-view parallax barrier system, a shutter glasses system, and a polarized glasses system.
- FIG. 53 shows an example in which the three-dimensional display library 300 is incorporated in the three-dimensional data software 402.
- the three-dimensional data software 402 includes a program main body 404, the stereoscopic display library 300 for realizing appropriate parallax, and a shooting instruction processing unit 406.
- the program body 404 communicates with the user via the user interface 410.
- the shooting instruction processing unit 406 virtually shoots a predetermined scene during operation of the program body 404 in accordance with a user's instruction.
- the captured image is recorded in the image recording device 412.
- the image is output to the stereoscopic display device 408.
- the three-dimensional data software 402 is game software.
- the user can execute the game while experiencing an appropriate three-dimensional effect by using the three-dimensional display library 300 during the game.
- when the user wants to keep a record, for example upon achieving a complete victory in a fighting game, the user issues an instruction to the shooting instruction processing unit 406 via the user interface 410 and records the scene.
- at this time, the stereoscopic display library 300 is used to generate a parallax image having appropriate parallax for later reproduction on the stereoscopic display device 408, and this is recorded in an electronic album or the like of the image recording device 412.
- by recording as a two-dimensional parallax image, the three-dimensional data itself of the program main body 404 does not leak out, so copyright protection is also taken into account.
- FIG. 54 shows an example in which the three-dimensional data software 402 of FIG. 53 is incorporated into a network-based system 430.
- the game machine 432 is connected to a server 436 and a user terminal 434 via a network (not shown).
- the game machine 432 is for a so-called arcade game, and includes a communication unit 442, the three-dimensional data software 402, and a stereoscopic display device 440 for displaying the game stereoscopically.
- the three-dimensional data software 402 is that of FIG. 53. Parallax images displayed from the three-dimensional data software 402 on the stereoscopic display device 440 are optimally set in advance for that device.
- the parallax adjustment by the three-dimensional data software 402 is used when transmitting an image to a user via the communication unit 442 as described later.
- the display device used here only needs to have a function of generating a stereoscopic image by adjusting parallax, and does not necessarily need to be a device capable of stereoscopic display.
- the user terminal 434 includes a communication unit 454, a viewer program 452 for viewing three-dimensional images, and a stereoscopic display device 450 of arbitrary size and type for displaying images stereoscopically.
- the viewer program 452 incorporates the stereoscopic image processing device 100.
- the server 436 includes a communication unit 460, an image storage unit 462 that records images virtually shot by users during games, and a user information storage unit 464 that records each user's appropriate parallax information, mail address, and other personal information in association with that user.
- the server 436 functions, for example, as an official site of the game, and records scenes that the user likes during play as moving images or still images; stereoscopic display is possible for both.
- An example of image capturing in the above configuration is performed in the following manner.
- the user first performs stereoscopic display on the stereoscopic display device 450 of the user terminal 434, acquires the appropriate parallax using the function of the stereoscopic image processing device 100, and sends it to the server 436 via the communication unit 454.
- this appropriate parallax is stored in the user information storage unit 464 and is described in a general form independent of the hardware of the stereoscopic display device 450 held by the user.
- the user plays the game with the game machine 432 at an arbitrary time.
- during the game, the stereoscopic display device 440 performs stereoscopic display based on the initially set parallax or parallax manually adjusted by the user.
- the stereoscopic display library 300 included in the three-dimensional data software 402 of the game machine 432 is sent to the server via the two communication units 442 and 460.
- the correct parallax of the user is obtained from the user information storage unit 464 of the user 4346, and a parallax image is generated in accordance with the proper parallax.
- the image is again transmitted through the two communication units 442 and 4600.
- the parallax image related to the virtually photographed image is stored in the image storage unit 462.
- when the user views this parallax image on the user terminal 434, stereoscopic display is performed with the desired stereoscopic effect.
- at that time too, the parallax can be manually adjusted by the stereoscopic image processing device 100 included in the viewer program 452.
- the programming related to the stereoscopic effect, which should originally be done for each display device and for each user, is concentrated in the stereoscopic image processing device 100 and the stereoscopic display library 300.
- game software programmers therefore do not have to worry about the complicated requirements of stereoscopic display at all. This applies not only to game software but to any software that uses stereoscopic display, removing restrictions on the development of content and applications that use stereoscopic display; the spread of such content can thus be promoted dramatically.
- in this embodiment the user's appropriate parallax is registered in the server 436, but instead the user may bring an IC card or the like on which that information is recorded and use it with the game machine 432.
- the card may also record the user's scores and favorite images for the game.
- the present invention has been described based on embodiments. These embodiments are exemplifications, and those skilled in the art will understand that various modifications are possible in the combination of components and processes, and that such modifications are also within the scope of the present invention. Examples follow.
- the first stereoscopic image processing apparatus 100 can process three-dimensional data with high precision. However, the three-dimensional data may instead be converted into an image with depth information, and a parallax image may be generated by the third stereoscopic image processing apparatus 100; in some cases this requires less computation. Similarly, when a plurality of viewpoint images are input, a depth map may be created by high-precision corresponding-point matching, and a parallax image may then be generated by the third stereoscopic image processing apparatus 100.
- in the embodiments, the camera temporary arrangement unit 130 is configured as part of the stereoscopic image processing apparatus 100, but it may instead be a pre-process of the apparatus, since processing up to the temporary placement of the camera can be performed regardless of the appropriate parallax. Similarly, any processing unit constituting the first, second, or third stereoscopic image processing apparatus 100 can be moved outside the apparatus; those skilled in the art will understand that the degree of freedom in configuring the stereoscopic image processing apparatus 100 is high.
- a unit for enlarging character data may be provided.
- in some stereoscopic display schemes the horizontal resolution of the image seen by the user is halved, which can reduce the legibility of characters; stretching characters to twice their width in the horizontal direction is therefore effective.
- if there is also vertical parallax, it may be useful to stretch the characters vertically as well.
- an “operating display section” that puts characters or a mark such as “3D” on the displayed image may be provided.
- a switching unit between stereoscopic display and normal display may be provided, so that the user can know whether or not the parallax of the image can be adjusted.
- this unit may include a GUI; it is convenient if clicking a predetermined button switches the display from stereoscopic display to ordinary two-dimensional display, and vice versa.
- the information acquisition unit 118 does not necessarily acquire information by user input; some information may be acquired automatically, for example by a plug-and-play-like function.
- in the embodiments, the method of deriving E and A was used, but a method of fixing these and deriving other parameters may be used instead; the specification of variables is free.
- an expression in which “an object passes through a certain interface” can also be displayed.
- during stereoscopic viewing, the display screen and its surrounding frame are visually perceived, so a display method that uses the screen as a window can be considered; an interface between spaces, or a plate-like object, is then placed on that surface. In this case, the optical axis intersection position D is specified with the positional relationship shown in the figure.
- FIG. 55 shows a state in which an image generated from three-dimensional data is displayed on the display screen 400.
- one glass surface 401 of the water tank 410 coincides with the display screen 400, and the fish 301 is expressed as swimming inside the water tank 410. If processing is performed so that the far space is displayed behind the display screen 400 and the near space in front of it, the fish 301 normally swims in the far space as shown in FIG. 56, and occasionally, as shown in FIG. 57, “the fish 301 hits the display screen 400, breaks through it, and appears in the near space.”
- an expression such as “splashes fly around the place where the fish 301 passes through the display screen 400, reproducing the interface” is also possible.
- another example is the expression “because there is no water in the near space in front of the display screen, the fish 301 becomes short of breath after swimming in the near space for a while, and returns to the far space back through the interface, that is, the display screen 400.”
- the interface may be a mere surface, or a plate-like object such as glass or a thin object such as paper may be arranged there.
- the interface need not completely coincide with the display screen; it may simply be near it. Clearly, the above expression effects cannot be adequately conveyed to the observer with a two-dimensional image. In particular, if the original data serving as the starting point of the stereoscopic image is three-dimensional data, editing to achieve the above effects is easy.
- such an expression, in which an interface of the displayed object is matched to the display screen, can be generated by the method shown in FIG. 58. That is, a virtual water tank 410 is arranged in a three-dimensional space, and two images having parallax are generated by the two virtual cameras 430 and 440 arranged on the left side. At that time, the optical axis intersection positions of the two virtual cameras 430 and 440 are made to coincide with one surface of the water tank. Such an image can also be photographed as shown in FIG. 59: two cameras 430 and 440 are arranged facing the actual water tank 410 to photograph it, and the intersection of the optical axes of the two cameras is made to coincide with the water surface.
- FIG. 60 shows a configuration of a fourth stereoscopic image processing apparatus 100 for realizing the above processing.
- this stereoscopic image processing apparatus 100 has a configuration in which an object designating unit 180 is further provided in the stereoscopic effect adjusting unit 112 of the first stereoscopic image processing apparatus 100 shown in FIG. 11.
- the object designating unit 180 performs processing to position the interface of the object specified by the user near the display screen, or to make it coincide with the screen.
- the user is assumed to be a creator of a stereoscopic image, and the above-described processing is performed at the time of producing or editing a stereoscopic image. Note that the user may be an observer.
- the object designating unit 180 receives, from the user via a predetermined input device such as a mouse, the designation of the object that should correspond to the optical axis intersection plane of the two virtual cameras 430 and 440, and notifies the parallax control unit 114 of the designated object.
- the parallax control unit 114, more specifically the camera arrangement determining unit 132, adjusts the camera parameters so that the surface of the object specified by the user becomes the optical axis intersection plane of the two virtual cameras 430 and 440. The operation other than this processing may be the same as that of the stereoscopic image processing apparatus 100 shown in FIG. 11.
- Information indicating that the object is to be displayed near the display screen is added to the object determined in this manner.
- thus the optical axis intersection distance D is determined, and then the inter-camera distance E is determined by the processing described earlier.
- another expression method is proposed here. When there are multiple objects to be displayed, it is not always necessary to keep all of them within the appropriate parallax. At times, for effective display, some objects may be excluded from the appropriate-parallax condition under certain conditions, for example only for a certain period of time. Fundamentally, the basic expression space is determined for stationary objects as described above.
- each object may be provided with information indicating whether it is an object to be expressed within the basic expression space that contains the objects to be displayed stereoscopically (hereinafter simply referred to as “identification information”).
- an object to be expressed in the basic expression space is also called a “calculation object of the basic expression space”. The basic expression space may then be determined at any time based on the identification information.
- if the identification information is configured so that it can be changed as needed, the conditions for exclusion from the appropriate parallax can be set flexibly. For example, if the identification information specifies the time during which an object is excluded from the appropriate-parallax condition, the object can automatically be returned to the appropriate parallax range after that time.
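A minimal sketch of how such time-limited identification information could gate the parallax calculation; the field names and time values are hypothetical, not taken from the embodiment:

```python
def objects_for_parallax(objects, now):
    """Return the 'calculation objects of the basic expression space':
    objects whose identification information does not currently exclude
    them from the appropriate-parallax condition."""
    included = []
    for obj in objects:
        until = obj.get("identification", {}).get("excluded_until")
        if until is not None and now < until:
            continue  # temporarily outside the appropriate-parallax condition
        included.append(obj)
    return included

# A bird may exceed the appropriate parallax until t = 5, then return.
scene = [
    {"name": "house", "identification": {}},
    {"name": "bird",  "identification": {"excluded_until": 5.0}},
]
print([o["name"] for o in objects_for_parallax(scene, now=3.0)])  # ['house']
print([o["name"] for o in objects_for_parallax(scene, now=6.0)])  # ['house', 'bird']
```

Once the specified time has passed, the object simply reappears in the inclusion list, so the appropriate parallax range is restored automatically.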
- the camera arrangement determining unit 132 corrects the temporarily set camera parameters according to the appropriate parallax.
- this function may be further extended as follows: the camera arrangement determining unit 132 reads the identification information associated with each object and determines the camera parameters so as to reflect it.
- FIG. 61 shows, for convenience, the image displayed by the fourth stereoscopic image processing apparatus 100 in the depth direction, in particular the basic expression space T.
- a front projection plane 310 is set on the left side of the figure, and a rear projection plane 312 is set on the right side.
- the basic expression space T is between the front projection plane 310 and the rear projection plane 312.
- a house 350 is represented as a stationary object on the front projection plane 310 side, and a tree 370 is represented on the rear projection plane 312 side.
- a bird 330, which is a dynamic object, is moving forward in the space above the two stationary objects.
- the bird 330 can express its movement as long as it moves within the range of the basic expression space T, but when it reaches the front projection plane 310 or the rear projection plane 312, the bird 330 is treated as an object positioned on that projection plane, as shown on the left side of FIG. 61; fixed there at the maximum parallax, it cannot move further forward or backward in real space. Even so, if the object can be expressed as if it were still moving, its sense of presence can be maintained.
- in FIG. 62, a region where no object exists is therefore included in the basic expression space T.
- a space in which nothing exists is provided as part of the basic expression space T in front of the house 350, the front stationary object, so that the bird 330, a dynamic object, can move ahead of the house 350.
- likewise, a space in which nothing exists may be provided as part of the basic expression space T behind the tree 370, the stationary object placed at the rear.
- as a result, even when the bird 330, a dynamic object, moves in from behind and passes the position of the house 350, the bird 330 remains within the range of the basic expression space T; its movement is therefore expressed with appropriate parallax even as it moves further forward, and the observer feels no discomfort with the movement.
- a moving object 390, which includes the bird 330 together with its surrounding space, is formed as the target for calculating the parallax.
- when the front of the moving object 390 reaches the front projection plane 310, only the bird 330 is moved further.
- without this, the bird 330 would reach the front projection plane 310 immediately and its subsequent movement could not be expressed; with the moving object 390, the time until the maximum parallax is reached can be delayed.
- alternatively, the bird 330 may be moved within the space included in the moving object 390 in advance.
- in that case the maximum parallax is determined by the moving object 390, and since the bird 330 only gradually approaches that maximum parallax, it can continue to move forward in real space.
- the moving speed may be set to the originally expected speed, a higher speed, or a lower speed; giving the moving speed this flexibility enables various expressions. For example, by making the movement slower as the bird approaches the end of the moving object 390, forward movement can be expressed while preventing the parallax amount from becoming excessive in the front-back direction.
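The deceleration near the end of the moving object 390 can be sketched as follows; halving the remaining gap each step is an illustrative easing rule, not a formula from the embodiment:

```python
def eased_step(pos, front, base_step):
    """Advance pos toward front (the leading edge of the moving object),
    never covering more than half the remaining gap, so the parallax
    amount changes ever more slowly near the edge."""
    remaining = front - pos
    return pos + min(base_step, 0.5 * remaining)

pos = 0.0
for _ in range(10):
    pos = eased_step(pos, front=10.0, base_step=2.0)
# pos approaches 10.0 but never reaches it
```

With this rule the object keeps visibly moving forward while its parallax converges smoothly toward, and never exceeds, the limit set by the moving object.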
- N = LE(T − A) / (2(T + S)(A + S) tan(θ/2))
- N′ = LE(T − A′) / (2(T + S′)(A′ + S′) tan(θ/2))
- N″ = LE″(T − A″) / (2(T + S″)(A″ + S″) tan(θ/2))
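The first relation above can be evaluated directly. A sketch with the symbols as used in the text; the numeric values are purely illustrative:

```python
from math import tan, radians

def far_limit_parallax(L, E, T, A, S, theta_deg):
    """N = L*E*(T - A) / (2*(T + S)*(A + S)*tan(theta/2)), with L in
    pixels and E, T, A, S in a common length unit."""
    return L * E * (T - A) / (2 * (T + S) * (A + S) * tan(radians(theta_deg) / 2))

# Hypothetical setup: 1024-px screen, 6.5 cm camera interval, a 100 cm
# deep space starting 50 cm from the cameras, axes crossing 30 cm into it.
n = far_limit_parallax(L=1024, E=6.5, T=100, A=30, S=50, theta_deg=45)
```

Evaluating the primed variants is the same computation with the corresponding primed values of E, A, and S substituted.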
- controlling the movement speed in actual coordinates in this way, for an object moving toward the observer, can prevent a sudden change in the amount of parallax.
- the expression methods shown in FIGS. 61 to 65 can be realized by the following configuration of the stereoscopic image processing apparatus 100.
- namely, they can be realized by the fourth stereoscopic image processing apparatus 100 shown in FIG. 60.
- in that case, the camera arrangement determining unit 132 also has a function of reading, from the original data, information on the range to be calculated for the basic expression space and information on changes in the parallax amount of objects, and of reflecting them in the camera parameters. This information may be included in the original data itself, or may be stored, for example, in the parallax information storage unit 120.
- when the appropriate parallax processing determines that the parallax is too large relative to the correct-parallax state in which a sphere is seen with its true shape, processing is performed to reduce the parallax of the stereoscopic image.
- as a result, the sphere looks flattened in the depth direction, but in general the discomfort with such a display is small; since people are used to seeing two-dimensional images, they often feel no discomfort as long as the parallax lies between zero and the correct parallax.
- conversely, when the parallax of the stereoscopic image is too small relative to the correct-parallax state in which the sphere is seen correctly, the appropriate parallax processing increases the parallax.
- in this case the sphere appears swollen in the depth direction, and people may feel considerable discomfort with such a display.
- as in FIG. 11, when a stereoscopic image is generated from three-dimensional data, the parallax can be adjusted relatively easily by changing the camera arrangement. A procedure for correcting the parallax is described below with reference to FIGS. 66 to 71.
- as in FIG. 10, the correction of the parallax can be performed by the above-described first to fourth stereoscopic image processing apparatuses 100.
- for example, the first stereoscopic image processing apparatus 100 shown in FIG. 11 generates a stereoscopic image by three-dimensional image processing.
- the correction processing described here can also be realized by the fifth and sixth stereoscopic image processing apparatuses 100 described later.
- FIG. 66 shows a state where an observer is observing a stereoscopic image on a display screen 400 of a certain stereoscopic image display device 100.
- the screen size of the display screen 400 is L
- the distance between the display screen 400 and the observer is d
- the interocular distance is e.
- the near limit parallax M and the far limit parallax N are obtained in advance by the stereoscopic effect adjusting unit 112, and the appropriate parallax lies between the near limit parallax M and the far limit parallax N.
- here only the near limit parallax M is shown for ease of understanding, and the maximum pop-out amount m is determined from this value.
- the pop-out amount m is the distance from the display screen 400 to the nearest point.
- the unit of L, M, and N is the "pixel"; unlike parameters such as d, m, and e, which are lengths, they must be converted with a predetermined conversion formula, but to simplify the description they are treated here as if expressed in the same unit system.
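A sketch of such a conversion: M is turned from pixels into a length via the physical screen width, and the pop-out amount m then follows from the similar triangles formed by the two eyes and the nearest point (M/e = m/(d − m), i.e. m = M·d/(M + e)); the numbers are hypothetical:

```python
def popout_from_near_parallax(M_px, L_px, screen_width_cm, d, e):
    """Convert the near-limit parallax M from pixels to centimeters,
    then solve M/e = m/(d - m) for the pop-out amount m."""
    M = M_px * screen_width_cm / L_px  # pixels -> cm
    return M * d / (M + e)

# 20 px of near parallax on a 1024-px-wide, 40 cm screen viewed at 60 cm
m = popout_from_near_parallax(M_px=20, L_px=1024, screen_width_cm=40,
                              d=60, e=6.5)
```

The similar-triangle relation used here is standard viewing geometry rather than a formula quoted from the text, but it illustrates why M, d, and e must first be brought into the same unit system.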
- the camera arrangement is determined by the camera arrangement determining unit 132 of the parallax control unit 114 based on the nearest point and the farthest point of the sphere 21.
- the optical axis crossing distance of the two cameras 22 and 24 is D
- the camera spacing is Ec.
- here, the coordinate system is converted by scaling so that the field-of-view width of the cameras at the optical axis intersection distance matches the screen size L.
- suppose that the camera interval Ec is equal to the interocular distance e and the optical axis intersection distance D is equal to the observation distance d. Then, as shown in FIG. 68, the sphere 21 looks correct.
- in contrast, when the sphere 21 is observed on the original stereoscopic image display device 100 based on the image generated by the imaging system determined above, the sphere 21 appears stretched in the depth direction over the entire appropriate parallax range, as shown in FIG. 69.
- FIG. 70 shows the camera arrangement of FIG. 67 photographing the nearest point of the sphere, located at a distance A from the display screen 400.
- the maximum parallax M corresponding to the distance A is determined by the two straight lines connecting each of the two cameras 22 and 24 with the point at the distance A.
- FIG. 71 shows the camera interval E1 required to obtain the parallax M of FIG. 70 when the optical axis intersection distance of the two cameras 22 and 24 is set to d. This can be regarded as a conversion that makes all parameters of the shooting system other than the camera interval coincide with those of the observation system.
- E1 = Ec (d − A) / (D − A)
- when E1 is greater than the interocular distance e, it is determined that correction to reduce the parallax is required. Since it suffices for E1 to be at most the interocular distance e, Ec may be corrected, for example, to Ec × e / E1.
- similarly, a camera interval E2 is obtained for the farthest point; when E2 is larger than the interocular distance e, correction is likewise determined to be necessary. By adopting the smaller of the corrected camera intervals, excessive parallax disappears at both the near and far positions.
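A sketch of this correction, applying the conversion E1 = Ec(d − A)/(D − A) at the nearest and farthest points and shrinking Ec only when the larger converted interval exceeds the interocular distance e. Treating the far point's distance as signed (negative behind the screen) is an assumption made here for illustration:

```python
def corrected_camera_interval(Ec, D, d, A_near, A_far, e):
    """Shrink the camera interval Ec just enough that neither the
    nearest nor the farthest point produces parallax beyond what the
    interocular distance e allows."""
    E1 = Ec * (d - A_near) / (D - A_near)  # nearest point
    E2 = Ec * (d - A_far) / (D - A_far)    # farthest point (A_far < 0)
    worst = max(E1, E2)
    if worst > e:
        Ec = Ec * e / worst  # reduce the parallax to the limit
    return Ec

Ec_new = corrected_camera_interval(Ec=10.0, D=100.0, d=60.0,
                                   A_near=10.0, A_far=-20.0, e=6.5)
```

Because the same scaling factor is applied once, the constraint is satisfied simultaneously at both extreme points.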
- finally, the camera is set by returning the selected Ec to the original coordinate system of the three-dimensional space.
- here the correction was performed only on the camera interval, without changing the optical axis intersection distance, but the apparent object position may be changed by changing the optical axis intersection distance, or both the camera interval and the optical axis intersection distance may be changed.
- correction is also required when using a depth map. Suppose the depth map values indicate the shift amounts of points in pixels, and the initial values, generally the values described with the original image, realize optimal stereoscopic viewing. In that case the above processing need not be performed when the range of depth map values would have to be increased; it need be performed only when the range of depth map values must be reduced, that is, when the parallax must be reduced.
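The rule just stated, shrink the depth-map range when the parallax must be reduced but never widen it, can be sketched as follows (plain lists stand in for the depth map):

```python
def adjust_depth_map(depth, max_shift):
    """Scale pixel-shift values down only when they exceed max_shift;
    never scale up, since the original values are assumed optimal."""
    span = max(abs(v) for v in depth)
    if span <= max_shift:       # widening would be needed -> leave as is
        return list(depth)
    k = max_shift / span        # shrink only
    return [v * k for v in depth]

print(adjust_depth_map([-8, 0, 4], max_shift=4))  # [-4.0, 0.0, 2.0]
print(adjust_depth_map([-2, 0, 1], max_shift=4))  # [-2, 0, 1]
```

The asymmetry mirrors the text: reducing parallax below the optimum is tolerable, while inflating it past the authored values is not.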
- the maximum permissible value may be held in a header region of the image or the like, and the appropriate parallax processing may be performed so as to be within the maximum permissible value.
- in this case, hardware information such as the appropriate viewing distance is required, but compared with the hardware-independent processing described above, processing of higher performance can be realized.
- the above processing can be used not only when parallax is set automatically but also when parallax is set manually.
- the limit of parallax at which an observer feels discomfort differs depending on the image.
- in general, crosstalk becomes noticeable when the parallax is increased.
- crosstalk is particularly noticeable, when the parallax is increased, in images with a large luminance difference on both sides of a boundary.
- crosstalk is also noticeable in images with little motion.
- checking the extension of a file name often indicates whether the file is a moving image or a still image. When a file is determined to be a moving image, the state of motion may be detected by a known motion detection technique such as motion vectors, and the appropriate parallax may be corrected accordingly: for images with little motion, correction is applied so that the parallax becomes smaller than the original appropriate parallax, while for images with much motion no correction is applied, or, if movement is to be emphasized, correction may be added so that the parallax becomes larger than the original. Note that correcting the appropriate parallax is only one example; any correction within a predetermined parallax range is possible, and the depth map or the shift amount of the synthesis position of parallax images may also be corrected.
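A sketch of the decision just described; the extensions, the motion threshold, and the scale factors are hypothetical choices, not values from the embodiment:

```python
def parallax_scale(filename, motion_level, emphasize_motion=False):
    """Multiplier applied to the appropriate parallax: still images and
    nearly still video get reduced parallax (crosstalk shows most
    there); busy video is left unchanged, or boosted on request."""
    is_video = filename.lower().endswith((".mpg", ".avi", ".mp4"))
    if not is_video or motion_level < 0.2:
        return 0.8                 # smaller than the original parallax
    if emphasize_motion:
        return 1.2                 # larger, to emphasize movement
    return 1.0                     # no correction
```

In use, the returned factor would multiply the appropriate parallax (or, equivalently, the depth-map range or the parallax-image shift amount) before display.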
- these analysis results may be recorded in the header area of the file, and the stereoscopic image processing apparatus may read the header and use it when displaying the same stereoscopic images the next time and thereafter.
- the amount of high-frequency components and the motion distribution may be ranked through actual stereoscopic viewing by the creator or user of the image, or ranked through stereoscopic viewing by a plurality of evaluators with the average taken; any ranking method may be used. Also, the appropriate parallax need not be strictly observed: the camera parameters need not be recalculated constantly, and may be recalculated at regular intervals or at every scene change. This is particularly effective when the processing is performed by a device with low processing capacity.
- for recalculation at regular intervals, the parallax control unit 114 need only use an internal timer to instruct the camera arrangement determining unit 132 to recalculate the camera parameters at fixed intervals.
- the internal timer may use the reference frequency of the CPU that performs the arithmetic processing of the stereoscopic image processing apparatus 100, or a dedicated timer may be provided separately.
- FIG. 76 illustrates a configuration of a fifth stereoscopic image processing apparatus 100 that realizes calculation of an appropriate parallax according to the state of an image.
- here, an image determination unit 190 is newly provided in the first stereoscopic image processing apparatus 100 shown in FIG. 11. Since the other configurations and operations are the same, only the differences are described below.
- the image determination unit 190 includes a frequency component detection unit 192 that analyzes the frequency components of the image, determines the amount of high-frequency components, and notifies the parallax control unit 114 of a parallax suitable for the image, and, when the data is a moving image, a scene determination unit 194 that notifies the parallax control unit 114 of the camera parameter calculation timing by detecting scene changes or motion in the image.
- Scene change detection may be performed by using a known method.
- when the frequency components of an image are detected by the frequency component detection unit 192, the processing load increases; if an arithmetic processing device matching that load is used, there is a concern that the cost of the stereoscopic image processing apparatus 100 will rise. As described above, however, the appropriate parallax need not be strictly maintained at all times, so the frequency components of the image may be analyzed only when the image changes greatly, such as at a scene change, based on the detection result of the scene determination unit 194. With this configuration, the processing load of the image determination unit 190 can be reduced.
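The load-reduction idea, running the costly frequency analysis only when the scene determination reports a large change, can be sketched as follows (1-D pixel lists and a mean-difference change score stand in for real frames and detectors):

```python
def mean_abs_diff(prev, cur):
    """Crude scene-change score: mean absolute pixel difference."""
    return sum(abs(a - b) for a, b in zip(prev, cur)) / len(cur)

def high_freq_amount(frame):
    """Stand-in for the frequency analysis: mean absolute difference
    between neighboring pixels."""
    return sum(abs(a - b) for a, b in zip(frame, frame[1:])) / (len(frame) - 1)

def analyze_on_scene_change(prev, cur, threshold, cache):
    """Re-analyze only when the frame changes greatly; otherwise reuse
    the cached amount of high-frequency components."""
    if mean_abs_diff(prev, cur) > threshold:
        cache["hf"] = high_freq_amount(cur)
    return cache["hf"]

cache = {"hf": 0.0}
hf = analyze_on_scene_change([0, 0, 0, 0], [10, 0, 10, 0],
                             threshold=1.0, cache=cache)  # scene change
```

Between scene changes the cached value is returned unchanged, so the expensive analysis runs only a few times per sequence.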
- FIG. 77 shows the temporary camera position S(Xs, Ys, Zs) set by the creator of the three-dimensional data, the angle of view θ, and the first to third objects 700, 702, and 704.
- the temporary camera position S(Xs, Ys, Zs) is the center position of the virtual cameras when parallax images are generated based on those virtual cameras (hereinafter also called the camera group center position S).
- the first object 700 is the background.
- the creator sets the angle of view θ and the camera group center position S so that the second and third objects 702 and 704 fall within the angle of view θ, and so that object information exists throughout the angle of view θ owing to the first object 700, which is the background image.
- next, the optical axis intersection position A(Xa, Ya, Za), which serves as the reference between near and far, is obtained so that the desired parallax is obtained, as shown in FIG. 78.
- then the parameters of the two virtual cameras 722 and 724 are determined, specifically the positions of the cameras and their optical axes.
- depending on, for example, the size of the first object 700, which is the background image, and the camera positions of these two virtual cameras 722 and 724, first and second object-absent areas 740 and 742, in which no object information exists, may be generated as shown in FIG. 79.
- FIG. 80 is a flowchart showing the angle-of-view adjustment processing. This processing can be realized by the first stereoscopic image processing apparatus 100 shown in FIG. 11.
- first, the camera temporary arrangement unit 130 determines the camera group center position S (S110). Subsequently, the camera arrangement determining unit 132 determines the camera angle of view θ based on the camera group center position S (S112), determines the camera interval E (S114), and determines the optical axis intersection position A of the virtual cameras (S116). Further, the camera arrangement determining unit 132 performs coordinate conversion processing on the original data based on the camera interval E and the optical axis intersection position A (S118), and determines whether object information exists at every pixel on the display screen (S120).
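The S118-S120 portion of the flow can be sketched as an iterate-and-check loop; the readjustment step is left abstract here, since what the apparatus actually changes when object-absent pixels remain (angle of view, camera interval, and so on) depends on the situation:

```python
def adjust_until_covered(params, covers_all_pixels, readjust, max_iters=16):
    """After each (re)arrangement, check whether object information
    exists at every display pixel (S120); if not, readjust the camera
    parameters (back toward S112-S118) and check again."""
    for _ in range(max_iters):
        if covers_all_pixels(params):
            return params
        params = readjust(params)
    raise RuntimeError("could not cover every pixel")

# Toy usage: widen a background half-width until it covers the view.
result = adjust_until_covered(
    1.0,
    covers_all_pixels=lambda w: w >= 3.0,
    readjust=lambda w: w + 1.0,
)
```

The bounded iteration count is a safety choice for the sketch; a real implementation would follow the flowchart's own termination condition.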
- a method of expressing a stereoscopic image starting from a real image will be described.
- the difference between starting from three-dimensional data and starting from a real image is that, with a real image, there is no concept of the depth T of the basic expression space; T can instead be regarded as the depth range in which display with appropriate parallax is possible.
- as shown in FIGS. 17 and 18, the parameters required for setting the cameras for generating a stereoscopic image are the camera interval E, the optical axis intersection distance A, the angle of view θ, and the distance S to the front of the basic expression space.
- the remaining parameters can be calculated by specifying three of the six parameters E, A, θ, S, D, and T.
- in principle, any of the parameters may be specified freely; in the above-described embodiment, θ, S, and T were specified, and E, A, and D were calculated.
- T can also be regarded as a parameter expressing the limits of the expression range, and is preferably determined in advance. With three-dimensional data, it makes little difference which parameters are changed; with live-action photography, however, the required camera structure differs greatly in price and operability depending on which parameters are variable, so it is desirable to change the specified parameters according to the application.
- FIG. 81 shows the relationship between a stereoscopic photographing device 510, which takes stereoscopic photographs at an entertainment facility or photo studio, and a subject 552.
- the stereoscopic photographing device 510 includes a camera 550 and a stereoscopic image processing device 100.
- here, the shooting environment is fixed: the position of the camera 550 and the position of the subject 552 are determined in advance, so θ, S, and T are determined as parameters.
- this corresponds to the example shown in FIG. 18 with the two cameras replaced by one actual camera 550 provided with two lenses 522 and 524; with this single camera 550, the two parallax images serving as the basis of a stereoscopic image can be captured.
- FIG. 82 shows a configuration of a sixth stereoscopic image processing apparatus 100 that performs this processing.
- this stereoscopic image processing apparatus 100 is obtained by replacing the parallax detection unit 150 of the stereoscopic image processing apparatus 100 described earlier with a camera control unit 151.
- the camera control unit 151 has a lens interval adjusting unit 153 and an optical axis adjusting unit 155.
- the lens interval adjusting unit 153 adjusts the camera interval E, more precisely the lens interval E, by adjusting the positions of the two lenses 522 and 524.
- the optical axis adjusting unit 155 adjusts D by changing the directions of the optical axes of the two lenses 522 and 524.
- the appropriate parallax information of the stereoscopic image display device owned at home or the like is input through a portable recording medium such as a memory card, or through communication means such as the Internet.
- the information acquisition unit 118 receives the input of the appropriate parallax and notifies the camera control unit 151 of it.
- upon notification, the camera control unit 151 calculates E, A, and D and adjusts the lenses 522 and 524 so that the camera 550 can shoot with the appropriate parallax. This is realized because the library performs common processing for the stereoscopic display device that displays the subject and for the stereoscopic photographing device 510.
- since D and A are also determined, it is advisable to position the subject at the distance D for shooting. If there are multiple solutions, the one with the smaller E may be selected. Also, T may be set larger than the range occupied by the subject; if there is a background, T should be determined to include it.
- the proper parallax information does not necessarily need to be obtained by using a stereoscopic image display device owned by the user who is the subject.
- a desired stereoscopic effect may instead be selected on a typical stereoscopic image display device at the shooting site. This selection can be made via the stereoscopic effect adjusting unit 112. Alternatively, the user may simply choose from items such as "on screen / far / near" and "stereoscopic effect: large / medium / small", with settings corresponding to each item stored in the parallax information storage unit 120.
- a predetermined camera parameter may be used.
- the change of the optical axis intersection position may be realized by a mechanical structure, or by changing the range used as the image on a high-resolution CCD (Charge Coupled Device). The function of the position shift unit 160 may be used in this processing.
- FIG. 83 shows a camera 550, movable into places that humans cannot enter, operated by remote control using a controller 519, with the captured image observed on a stereoscopic image display device 511.
- a stereoscopic image processing device 100 having the configuration shown in FIG. 82 is incorporated in the stereoscopic image display device 511.
- the camera 550 has a mechanism that can automatically adjust the lens interval E. This camera 550 also has an optical or electronic zoom function, which determines the angle of view θ. However, the amount of parallax changes with this zoom operation: in general, the farther away a picture is taken, the smaller the angle formed at display time by the optical axes from the two viewpoints becomes, so with the lens interval unchanged the parallax becomes small and the stereoscopic effect poor. It is therefore necessary to change camera settings such as the lens interval E and the zoom amount appropriately. Here, the camera settings are controlled automatically in such cases, greatly reducing the burden of complicated camera setting. The camera settings may also be adjusted using the controller 519.
- when the operator of the camera 550 first operates the optical or electronic zoom using the controller 519, θ is determined. Next, the camera 550 is moved so that the subject to be photographed is displayed at the center of the stereoscopic image display device 511. The camera 550 focuses on the subject using its autofocus function and at the same time acquires the distance to it; in the initial state, this distance is taken as D. That is, the camera 550 is automatically set so that the subject appears near the display screen. The range T can be changed manually, and the operator specifies in advance the distribution in the depth direction of the objects whose front-to-back relationships are to be grasped. Thus θ, D, and T are determined, E, A, and S follow from the above three relational expressions, and the camera 550 is adjusted automatically and appropriately. In this case, since S is determined afterwards, it is uncertain what range T will finally cover; it is therefore better to give T some margin.
- alternatively, the subject may first be displayed at the center, a predetermined button pressed to lock the focus and D, and the direction of the camera 550 then changed. If the focus and D can also be changed manually, the depth position of the subject can be changed freely.
- FIG. 84 shows an example of photographing with the stereoscopic image photographing device 510.
- the stereoscopic image photographing device 510 has the configuration shown in FIG.
- the camera 550 is supplied in advance with the proper parallax of the stereoscopic image display device owned by the photographer, through a recording medium such as a portable memory or communication means such as the Internet.
- the camera 550 has a simple structure and is available at a relatively low price.
- the camera interval E, the optical axis crossing distance D, and the angle of view θ are fixed, and A, S, and T are determined from the above three relational expressions.
- since the appropriate range of the distance to the subject can be calculated, the distance to the subject is measured in real time, whether the measured distance falls within that range is determined, and the result is notified to the photographer.
- the distance to the subject may be obtained by a known technique such as the distance-measuring function of autofocus. As described above, any combination of camera parameters may be treated as variables or constants, and various forms are possible according to the application.
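A minimal sketch of the notification logic above, assuming the appropriate near and far distance limits have already been computed from the fixed E, D, and θ (the function name and the example limits are placeholders, not values from the specification):

```python
def check_subject_distance(measured_d, near_limit, far_limit):
    """Classify a real-time distance reading against the appropriate range."""
    if measured_d < near_limit:
        return "too close"   # near-placement parallax limit would be exceeded
    if measured_d > far_limit:
        return "too far"     # far-placement parallax limit would be exceeded
    return "ok"

# e.g. an appropriate range of 1.2 m .. 4.5 m (illustrative values)
status = check_subject_distance(0.9, 1.2, 4.5)   # subject too close
```

The returned status would drive whatever indicator the camera uses to notify the photographer, for example a viewfinder icon.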
- the camera 550 may be attached to various devices such as a microscope, a medical endoscope, and a portable terminal.
- the stereoscopic display device includes a stereoscopic image processing device for realizing stereoscopic vision.
- the proper parallax obtained by the stereoscopic effect adjusting unit 112 of the first to sixth stereoscopic image processing apparatuses 100 is a parameter determined by the user while viewing stereoscopically on a specific stereoscopic image processing apparatus 100.
- the proper parallax is maintained thereafter.
- two factors enter into the operation of adjusting the stereoscopic effect: the “image separation performance” specific to the stereoscopic display device and the “physiological limit” specific to the observer.
- “image separation performance” is an objective factor indicating how well multiple viewpoint images are separated. For stereoscopic displays with low separation performance, crosstalk is perceived even with little parallax, and when many observers make the adjustment, the range of appropriate parallax becomes narrow on average. Conversely, if the image separation performance is high, crosstalk is hardly perceived even with large parallax, and the range of appropriate parallax tends to be wide on average.
- the “physiological limit” is a subjective factor. Even if the image separation performance is very high and the images are completely separated, for example, the parallax range in which no discomfort is felt differs from observer to observer. This appears as variation of the proper parallax on the same stereoscopic image processing device 100.
- the image separation performance, also called the degree of separation, can be determined by measuring the illuminance of the reference image 572 while moving the illuminometer 570 horizontally at the optimum observation distance, as shown in the figure. In the case of a binocular system, for example, all white is displayed in the left-eye image and all black in the right-eye image. If the images are completely separated, the illuminance at a position where only the right-eye image should be visible is 0; conversely, measuring how much white from the left-eye image leaks in yields the image separation performance. In the figure, the graph at the right end is an example of the measurement result. Since this measurement is almost equivalent to measuring the density of moire, the image separation performance can also be measured by capturing a moire image at a distance where moire is observed, as shown in Fig. 86, and analyzing its density.
- the image separation performance can likewise be measured by measuring leaked light in the same manner. In practice, the calculation may subtract, as background, the value measured when both the left and right images are all black.
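One plausible way to turn these illuminance readings into a single number — the normalized leak ratio below is an assumption; the text only says that leaked light is measured and that an all-black reading may be subtracted as background:

```python
def separation_degree(lux_at_right_eye, lux_at_left_eye, lux_black_bg):
    """Estimate image separation from illuminance readings.

    Measurement setup per the text: the left-eye image shows all white,
    the right-eye image all black. Light leaking into the right-eye
    viewing position lowers separation. The all-black background
    reading is subtracted from both values first.
    """
    signal = lux_at_left_eye - lux_black_bg   # intended white level
    leak = lux_at_right_eye - lux_black_bg    # white leaking into black view
    if signal <= 0:
        raise ValueError("invalid readings: no signal above background")
    return 1.0 - leak / signal                # 1.0 = perfect separation

# No leak above background -> perfect separation
perfect = separation_degree(0.5, 200.0, 0.5)
```

A value near 1.0 would correspond to a high-rank display in the ranking scheme described next; how the ratio maps onto ranks is left open here.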
- the image separation performance can also be determined as the average of ranking evaluations by a large number of observers. In this way, the image separation performance of a stereoscopic display device can be given an objective criterion such as a rank or a numerical value. For example, if the rank of the stereoscopic display device 450 of FIG. 54 owned by the user, and the user's appropriate parallax on that device, are known, the proper parallax can be converted to suit another stereoscopic display device 450 of a different rank.
- a stereoscopic display device also has device-specific parameters such as the screen size, pixel pitch, and optimal viewing distance, and the information on these parameters is likewise used to convert the proper parallax. The conversion is described below parameter by parameter, using FIGS. 87 and 88.
- the proper parallax is held in the form N/L and M/L, where M is the near-limit disparity, N is the far-limit disparity, and L is the screen size.
- when the optimal viewing distance d increases by a factor of b, the absolute value of the parallax is preferably also increased by a factor of b; that is, the parallax angle seen by the eye is kept constant. Therefore, converting N/L to bN/L and M/L to bM/L realizes an appropriate parallax even when the optimum viewing distance differs. The figure shows the nearest position as an example.
- N/L is converted into bcN/(aL), and M/L is converted into bcM/(aL). This conversion can be applied to both the horizontal and vertical disparities.
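The conversions above can be sketched as one helper. Only the factor b (ratio of optimal viewing distances) is named explicitly in this excerpt; a and c are the scale factors of the other device parameters entering the combined formula bcN/(aL), so their meaning here is taken from the formula rather than from named definitions:

```python
def convert_proper_parallax(n_over_l, m_over_l, a=1.0, b=1.0, c=1.0):
    """Convert proper parallax (N/L, M/L) for another stereoscopic display.

    b alone (viewing distance ratio) gives N/L -> bN/L and M/L -> bM/L;
    combined with the factors a and c this becomes N/L -> b*c*N/(a*L)
    and M/L -> b*c*M/(a*L), as stated in the text.
    """
    scale = b * c / a
    return scale * n_over_l, scale * m_over_l

# Viewing distance doubled, other parameters unchanged: parallax doubles
n2, m2 = convert_proper_parallax(0.02, 0.03, b=2.0)
```

Since the stored values are the dimensionless ratios N/L and M/L, the same helper serves horizontal and vertical disparities alike.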
- the above conversion of the appropriate parallax can be realized by the configurations shown in FIGS. 52, 53, and 54.
- the front and back of the basic expression space may be determined using the Z buffer.
- the Z-buffer is a hidden-surface removal technique that yields a depth map of the object group as seen from the camera. The minimum and maximum of these Z values may be used as the frontmost and rearmost positions. As extra processing, a pass to obtain the Z values from the position of the virtual camera is added; since this pass does not require the final resolution, processing with a reduced number of pixels shortens the processing time. With this method the parallax of hidden surfaces is ignored, so the proper parallax range can be used effectively, and the method copes easily with multiple objects.
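A sketch of this idea with NumPy, assuming the renderer already provides a depth map from the virtual camera. The reduced-resolution pass is approximated here by strided subsampling, and background pixels are assumed to carry the far-plane value, which is excluded before taking the extremes (both are modeling assumptions, not details from the specification):

```python
import numpy as np

def nearest_farthest(depth_map, far_plane, step=4):
    """Frontmost/rearmost object positions from a Z-buffer.

    Subsamples the depth map every `step` pixels, since the final
    resolution is not needed, and ignores background pixels lying
    at the far plane.
    """
    z = depth_map[::step, ::step]
    objects = z[z < far_plane]          # drop background pixels
    if objects.size == 0:
        return None                     # nothing in view
    return float(objects.min()), float(objects.max())

# Toy depth map: background at 100.0, one object spanning depths 5..9
zbuf = np.full((64, 64), 100.0)
zbuf[10:30, 10:30] = np.linspace(5.0, 9.0, 400).reshape(20, 20)
near, far = nearest_farthest(zbuf, far_plane=100.0, step=1)
```

The pair (near, far) then bounds the basic expression space, so the proper parallax range is spent only on visible geometry, per the text.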
- when generating a stereoscopic image from three-dimensional data, the parallax control unit 114 may control the variation of the parameters related to the camera arrangement, set for generating the parallax images, so that it falls within a predetermined threshold. Further, when generating a stereoscopic moving image from a two-dimensional moving image to which depth information has been given, the parallax control unit 114 may control the variation of the maximum or minimum depth contained in the depth information generated as the two-dimensional moving image progresses, so that it falls within a preset threshold. The thresholds used for these controls may be stored in the parallax information storage unit 120.
- the size of the basic expression space may change rapidly owing to fast movement of objects, frame-in, and frame-out; in other words, the camera arrangement parameters may fluctuate significantly. If the fluctuation exceeds a predetermined threshold, the change may be allowed only up to that threshold.
- a threshold may be set for this variation.
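The thresholding described above amounts to clamping the per-frame variation of a parameter (a camera-arrangement value, the expression-space size, or a depth extremum). A minimal sketch, with illustrative names and values:

```python
def clamp_variation(previous, target, threshold):
    """Limit how far a parameter may move in one update step.

    If the requested change exceeds `threshold`, only a change of
    `threshold` is allowed, smoothing sudden jumps caused by fast
    object movement, frame-in, or frame-out.
    """
    delta = target - previous
    if delta > threshold:
        return previous + threshold
    if delta < -threshold:
        return previous - threshold
    return target

# An object suddenly enters the frame: requested jump 10.0, threshold 2.0
value = clamp_variation(previous=5.0, target=15.0, threshold=2.0)
```

Applied every frame, the parameter converges to the target over several frames instead of jumping, which is the behavior the parallax control unit 114 is described as enforcing.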
- the burden on the programmer can be reduced when creating content or applications that can provide appropriate stereoscopic display.
- the present invention is applicable to a stereoscopic image processing method, a stereoscopic image processing device, and the like.
Description
Claims
Priority Applications (13)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP03715473A EP1489857B1 (en) | 2002-03-27 | 2003-03-27 | 3-dimensional image processing method and device |
CN038071592A CN1643939B (zh) | 2002-03-27 | 2003-03-27 | 立体图像处理方法及装置 |
KR1020047015105A KR100812905B1 (ko) | 2002-03-27 | 2003-03-27 | 입체 화상 처리 방법, 장치 및 컴퓨터 판독가능 기록 매체 |
US10/949,528 US8369607B2 (en) | 2002-03-27 | 2004-09-27 | Method and apparatus for processing three-dimensional images |
US12/976,262 US8131064B2 (en) | 2002-03-27 | 2010-12-22 | Method and apparatus for processing three-dimensional images |
US12/986,509 US8417024B2 (en) | 2002-03-27 | 2011-01-07 | Method and apparatus for processing three-dimensional images |
US12/986,453 US8254668B2 (en) | 2002-03-27 | 2011-01-07 | Method and apparatus for processing three-dimensional images |
US12/986,491 US8879824B2 (en) | 2002-03-27 | 2011-01-07 | Method and apparatus for processing three-dimensional images |
US12/986,551 US8724886B2 (en) | 2002-03-27 | 2011-01-07 | Method and apparatus for processing three-dimensional images |
US12/986,530 US8577128B2 (en) | 2002-03-27 | 2011-01-07 | Method and apparatus for processing three-dimensional images |
US12/986,471 US8577127B2 (en) | 2002-03-27 | 2011-01-07 | Method and apparatus for processing three-dimensional images |
US13/088,752 US8472702B2 (en) | 2002-03-27 | 2011-04-18 | Method and apparatus for processing three-dimensional images |
US13/283,361 US8559703B2 (en) | 2002-03-27 | 2011-10-27 | Method and apparatus for processing three-dimensional images |
Applications Claiming Priority (14)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002087494 | 2002-03-27 | ||
JP2002-87496 | 2002-03-27 | ||
JP2002-87497 | 2002-03-27 | ||
JP2002087497A JP2003284095A (ja) | 2002-03-27 | 2002-03-27 | 立体画像処理方法および装置 |
JP2002087495A JP2003284093A (ja) | 2002-03-27 | 2002-03-27 | 立体画像処理方法および装置 |
JP2002-87494 | 2002-03-27 | ||
JP2002-87495 | 2002-03-27 | ||
JP2002087496A JP3702243B2 (ja) | 2002-03-27 | 2002-03-27 | 立体画像処理方法および装置 |
JP2002087493 | 2002-03-27 | ||
JP2002-87493 | 2002-03-27 | ||
JP2003003761A JP3749227B2 (ja) | 2002-03-27 | 2003-01-09 | 立体画像処理方法および装置 |
JP2003003763A JP3857988B2 (ja) | 2002-03-27 | 2003-01-09 | 立体画像処理方法および装置 |
JP2003-3761 | 2003-01-09 | ||
JP2003-3763 | 2003-01-09 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/949,528 Continuation US8369607B2 (en) | 2002-03-27 | 2004-09-27 | Method and apparatus for processing three-dimensional images |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003081921A1 true WO2003081921A1 (fr) | 2003-10-02 |
Family
ID=28458078
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2003/003791 WO2003081921A1 (fr) | 2002-03-27 | 2003-03-27 | Procede de traitement d'images tridimensionnelles et dispositif |
Country Status (4)
Country | Link |
---|---|
EP (11) | EP1489857B1 (ja) |
KR (1) | KR100812905B1 (ja) |
CN (1) | CN1643939B (ja) |
WO (1) | WO2003081921A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8482598B2 (en) | 2005-03-18 | 2013-07-09 | Ntt Data Sanyo System Corporation | Stereoscopic image display apparatus, stereoscopic image displaying method and computer program product |
CN103636187A (zh) * | 2011-08-30 | 2014-03-12 | 松下电器产业株式会社 | 摄像装置 |
KR101417024B1 (ko) | 2007-11-22 | 2014-07-08 | 엘지이노텍 주식회사 | 카메라를 이용한 3d 영상 획득 방법 |
Families Citing this family (94)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
RU2407224C2 (ru) * | 2005-04-19 | 2010-12-20 | Конинклейке Филипс Электроникс Н.В. | Восприятие глубины |
KR100739730B1 (ko) * | 2005-09-03 | 2007-07-13 | 삼성전자주식회사 | 3d 입체 영상 처리 장치 및 방법 |
KR101185870B1 (ko) * | 2005-10-12 | 2012-09-25 | 삼성전자주식회사 | 3d 입체 영상 처리 장치 및 방법 |
KR100704634B1 (ko) | 2005-12-19 | 2007-04-09 | 삼성전자주식회사 | 사용자의 위치에 따른 입체 영상을 디스플레이하는 장치 및방법 |
KR100762670B1 (ko) * | 2006-06-07 | 2007-10-01 | 삼성전자주식회사 | 스테레오 이미지로부터 디스패리티 맵을 생성하는 방법 및장치와 그를 위한 스테레오 매칭 방법 및 장치 |
KR101311896B1 (ko) * | 2006-11-14 | 2013-10-14 | 삼성전자주식회사 | 입체 영상의 변위 조정방법 및 이를 적용한 입체 영상장치 |
TWI331872B (en) | 2006-12-29 | 2010-10-11 | Quanta Comp Inc | Method for displaying stereoscopic image |
KR101345303B1 (ko) * | 2007-03-29 | 2013-12-27 | 삼성전자주식회사 | 스테레오 또는 다시점 영상의 입체감 조정 방법 및 장치 |
KR20080100984A (ko) | 2007-05-15 | 2008-11-21 | 삼성전자주식회사 | 3차원 영상 디스플레이 방법 및 장치 |
US8726194B2 (en) * | 2007-07-27 | 2014-05-13 | Qualcomm Incorporated | Item selection using enhanced control |
KR100906784B1 (ko) * | 2007-11-15 | 2009-07-09 | (주)레드로버 | 입체영상 제작 프로그램용 플러그인 모듈 및 입체영상 제작방법 |
KR100924432B1 (ko) * | 2007-12-06 | 2009-10-29 | 한국전자통신연구원 | 다시점 영상의 인식 깊이감 조절 장치 및 방법 |
CN101472190B (zh) * | 2007-12-28 | 2013-01-23 | 华为终端有限公司 | 多视角摄像及图像处理装置、系统 |
JP5684577B2 (ja) * | 2008-02-27 | 2015-03-11 | ソニー コンピュータ エンタテインメント アメリカ リミテッド ライアビリテイ カンパニー | シーンの深度データをキャプチャし、コンピュータのアクションを適用する方法 |
WO2010010521A2 (en) * | 2008-07-24 | 2010-01-28 | Koninklijke Philips Electronics N.V. | Versatile 3-d picture format |
JP4657331B2 (ja) | 2008-08-27 | 2011-03-23 | 富士フイルム株式会社 | 3次元表示時における指示位置設定装置および方法並びにプログラム |
KR101633627B1 (ko) | 2008-10-21 | 2016-06-27 | 코닌클리케 필립스 엔.브이. | 입력 3차원 비디오 신호를 프로세싱하는 방법 및 시스템 |
JP4737573B2 (ja) * | 2009-02-05 | 2011-08-03 | 富士フイルム株式会社 | 3次元画像出力装置及び方法 |
JP5327524B2 (ja) * | 2009-02-27 | 2013-10-30 | ソニー株式会社 | 画像処理装置、画像処理方法、およびプログラム |
TW201116041A (en) * | 2009-06-29 | 2011-05-01 | Sony Corp | Three-dimensional image data transmission device, three-dimensional image data transmission method, three-dimensional image data reception device, three-dimensional image data reception method, image data transmission device, and image data reception |
JP5491786B2 (ja) * | 2009-07-21 | 2014-05-14 | 富士フイルム株式会社 | 画像再生装置及び方法 |
US8509519B2 (en) * | 2009-07-29 | 2013-08-13 | Intellectual Ventures Fund 83 Llc | Adjusting perspective and disparity in stereoscopic image pairs |
JP5444955B2 (ja) * | 2009-08-31 | 2014-03-19 | ソニー株式会社 | 立体画像表示システム、視差変換装置、視差変換方法およびプログラム |
JP5415217B2 (ja) * | 2009-10-02 | 2014-02-12 | パナソニック株式会社 | 三次元映像処理装置 |
JP5299214B2 (ja) * | 2009-10-20 | 2013-09-25 | ソニー株式会社 | 画像処理装置、および画像処理方法、並びにプログラム |
JP2011109294A (ja) * | 2009-11-16 | 2011-06-02 | Sony Corp | 情報処理装置、情報処理方法、表示制御装置、表示制御方法、およびプログラム |
EP2334089A1 (en) * | 2009-12-04 | 2011-06-15 | Alcatel Lucent | A method and systems for obtaining an improved stereo image of an object |
JP5287702B2 (ja) * | 2009-12-25 | 2013-09-11 | ソニー株式会社 | 画像処理装置および方法、並びにプログラム |
JP2011138354A (ja) * | 2009-12-28 | 2011-07-14 | Sony Corp | 情報処理装置および情報処理方法 |
EP2525579B1 (en) | 2010-01-11 | 2019-03-06 | LG Electronics Inc. | Broadcasting receiver and method for displaying 3d images |
JP4951079B2 (ja) * | 2010-03-11 | 2012-06-13 | 株式会社東芝 | 立体表示装置、映像処理装置 |
EP2549766A1 (en) * | 2010-03-17 | 2013-01-23 | Panasonic Corporation | Replay device |
CN102835116B (zh) * | 2010-04-01 | 2015-03-25 | 诺基亚公司 | 用于选择立体成像视点对的方法和装置 |
EP2375763A2 (en) * | 2010-04-07 | 2011-10-12 | Sony Corporation | Image processing apparatus and image processing method |
JP5477128B2 (ja) * | 2010-04-07 | 2014-04-23 | ソニー株式会社 | 信号処理装置、信号処理方法、表示装置及びプログラム |
JP2011223481A (ja) * | 2010-04-14 | 2011-11-04 | Sony Corp | データ構造、画像処理装置、画像処理方法、およびプログラム |
JP5202584B2 (ja) * | 2010-06-24 | 2013-06-05 | 株式会社ソニー・コンピュータエンタテインメント | 画像処理装置、コンテンツ作成支援装置、画像処理方法、コンテンツ作成支援方法、および画像ファイルのデータ構造 |
EP2586209A1 (en) * | 2010-06-28 | 2013-05-01 | Thomson Licensing | Method and apparatus for customizing 3-dimensional effects of stereo content |
WO2012002347A1 (ja) * | 2010-06-30 | 2012-01-05 | 富士フイルム株式会社 | 画像処理装置、撮像装置及び画像処理方法 |
CN103098479A (zh) * | 2010-06-30 | 2013-05-08 | 富士胶片株式会社 | 图像处理装置、图像处理方法和图像处理程序 |
CN102340678B (zh) * | 2010-07-21 | 2014-07-23 | 深圳Tcl新技术有限公司 | 一种景深可调的立体显示装置及其景深调整方法 |
WO2012014708A1 (ja) * | 2010-07-26 | 2012-02-02 | 富士フイルム株式会社 | 画像処理装置、方法およびプログラム |
CN105915880B (zh) | 2010-08-10 | 2018-02-23 | 株式会社尼康 | 图像处理装置和图像处理方法 |
EP2434764A1 (en) * | 2010-09-23 | 2012-03-28 | Thomson Licensing | Adaptation of 3D video content |
US8693687B2 (en) * | 2010-10-03 | 2014-04-08 | Himax Media Solutions, Inc. | Method and apparatus of processing three-dimensional video content |
JP4956658B2 (ja) * | 2010-10-12 | 2012-06-20 | シャープ株式会社 | 立体映像変換装置及び立体映像表示装置 |
TW201216204A (en) * | 2010-10-13 | 2012-04-16 | Altek Corp | Method for combining dual-lens images into mono-lens image |
JP2012090094A (ja) * | 2010-10-20 | 2012-05-10 | Sony Corp | 画像処理装置、画像処理方法、およびプログラム |
US20120113093A1 (en) * | 2010-11-09 | 2012-05-10 | Sharp Laboratories Of America, Inc. | Modification of perceived depth by stereo image synthesis |
JP5640681B2 (ja) * | 2010-11-11 | 2014-12-17 | ソニー株式会社 | 情報処理装置、立体視表示方法及びプログラム |
JP5695395B2 (ja) | 2010-11-19 | 2015-04-01 | インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation | 立体画像生成方法及びその装置 |
CN103238337B (zh) * | 2010-11-23 | 2016-03-16 | 深圳超多维光电子有限公司 | 立体图像获取系统及方法 |
KR101783608B1 (ko) * | 2010-12-02 | 2017-10-10 | 엘지전자 주식회사 | 전자 장치 및 입체감 조정 방법 |
CN103947198B (zh) * | 2011-01-07 | 2017-02-15 | 索尼电脑娱乐美国公司 | 基于场景内容的预定三维视频设置的动态调整 |
EP2574065B1 (en) | 2011-01-26 | 2016-09-07 | FUJIFILM Corporation | Image processing device, image-capturing device, reproduction device, and image processing method |
RU2476825C2 (ru) * | 2011-03-01 | 2013-02-27 | Государственное образовательное учреждение высшего профессионального образования Томский государственный университет (ТГУ) | Способ управления движущимся объектом и устройство для его осуществления |
CN102971770B (zh) * | 2011-03-31 | 2016-02-10 | 松下电器产业株式会社 | 进行全周围立体图像的描绘的图像描绘装置、图像描绘方法 |
CN102740015B (zh) * | 2011-04-13 | 2015-03-11 | 鸿富锦精密工业(深圳)有限公司 | 同时播放不同频道的电视系统 |
CN103503449B (zh) * | 2011-04-28 | 2016-06-15 | 松下知识产权经营株式会社 | 影像处理装置及影像处理方法 |
US20120300034A1 (en) * | 2011-05-23 | 2012-11-29 | Qualcomm Incorporated | Interactive user interface for stereoscopic effect adjustment |
JP2012247891A (ja) * | 2011-05-26 | 2012-12-13 | Sony Corp | 画像処理装置、画像処理方法、およびプログラム |
JP5971632B2 (ja) * | 2011-05-26 | 2016-08-17 | パナソニックIpマネジメント株式会社 | 電子機器および合成画像の編集方法 |
KR20120133951A (ko) * | 2011-06-01 | 2012-12-11 | 삼성전자주식회사 | 3d 영상변환장치 그 깊이정보 조정방법 및 그 저장매체 |
TWI486052B (zh) | 2011-07-05 | 2015-05-21 | Realtek Semiconductor Corp | 立體影像處理裝置以及立體影像處理方法 |
CN103037236A (zh) * | 2011-08-22 | 2013-04-10 | 联发科技股份有限公司 | 图像处理方法以及装置 |
US9402065B2 (en) * | 2011-09-29 | 2016-07-26 | Qualcomm Incorporated | Methods and apparatus for conditional display of a stereoscopic image pair |
KR101888155B1 (ko) * | 2011-11-22 | 2018-08-16 | 삼성전자주식회사 | 디스플레이 장치 및 그 디스플레이 방법 |
CN102413350B (zh) * | 2011-11-30 | 2014-04-16 | 四川长虹电器股份有限公司 | 蓝光3d视频处理方法 |
JP5818674B2 (ja) * | 2011-12-21 | 2015-11-18 | 株式会社東芝 | 画像処理装置、方法、及びプログラム、並びに、画像表示装置 |
CN103227929B (zh) * | 2012-01-29 | 2016-10-26 | 晨星软件研发(深圳)有限公司 | 立体影像处理装置及立体影像处理方法 |
DE102012201564B3 (de) * | 2012-02-02 | 2013-05-29 | Leica Microsystems (Schweiz) Ag | System zur Darstellung stereoskopischer Bilder |
US8855442B2 (en) * | 2012-04-30 | 2014-10-07 | Yuri Owechko | Image registration of multimodal data using 3D-GeoArcs |
CN102752621A (zh) * | 2012-06-26 | 2012-10-24 | 京东方科技集团股份有限公司 | 景深保持装置、3d显示装置及显示方法 |
EP2680593A1 (en) * | 2012-06-26 | 2014-01-01 | Thomson Licensing | Method of adapting 3D content to an observer wearing prescription glasses |
CN103546736B (zh) | 2012-07-12 | 2016-12-28 | 三星电子株式会社 | 图像处理设备和方法 |
CN103686118A (zh) * | 2012-09-19 | 2014-03-26 | 珠海扬智电子科技有限公司 | 影像深度调整方法与装置 |
JP5903023B2 (ja) * | 2012-10-04 | 2016-04-13 | 株式会社ジオ技術研究所 | 立体視地図表示システム |
US9300942B2 (en) | 2012-10-18 | 2016-03-29 | Industrial Technology Research Institute | Method and control system for three-dimensional video playback using visual fatigue estimation |
CN103780894B (zh) * | 2012-10-18 | 2016-02-10 | 财团法人工业技术研究院 | 具视觉疲劳估算的立体影片播放方法与控制系统 |
RU2525228C2 (ru) * | 2012-10-19 | 2014-08-10 | Федеральное бюджетное учреждение "3 Центральный научно-исследовательский институт Министерства обороны Российской Федерации" | Устройство локации и навигации |
JP2014230251A (ja) * | 2013-05-27 | 2014-12-08 | ソニー株式会社 | 画像処理装置、および画像処理方法 |
CN103458259B (zh) * | 2013-08-27 | 2016-04-13 | Tcl集团股份有限公司 | 一种3d视频引起人眼疲劳度的检测方法、装置及系统 |
KR101528683B1 (ko) * | 2013-11-29 | 2015-06-15 | 영산대학교산학협력단 | 과도시차 객체 검출방법 |
CN103686416B (zh) * | 2013-12-27 | 2016-09-28 | 乐视致新电子科技(天津)有限公司 | 智能电视中的3d设置信息处理方法及装置 |
TW201528775A (zh) | 2014-01-02 | 2015-07-16 | Ind Tech Res Inst | 景深圖校正方法及系統 |
US9667951B2 (en) | 2014-02-18 | 2017-05-30 | Cisco Technology, Inc. | Three-dimensional television calibration |
CN103841403B (zh) * | 2014-03-11 | 2015-12-02 | 福州大学 | 一种无形变立体图像视差快速调节方法 |
CN104284177A (zh) * | 2014-10-28 | 2015-01-14 | 天津大学 | 会聚立体图像视差控制方法 |
KR101817436B1 (ko) * | 2016-08-02 | 2018-02-21 | 연세대학교 산학협력단 | 안구 전위 센서를 이용한 영상 표시 장치 및 제어 방법 |
JP6780983B2 (ja) * | 2016-08-18 | 2020-11-04 | マクセル株式会社 | 画像処理システム |
RU2672092C1 (ru) * | 2017-07-19 | 2018-11-12 | Федеральное государственное бюджетное образовательное учреждение высшего образования "Московский авиационный институт (национальный исследовательский университет)" | Способ измерения углового положения наземных неподвижных радиоконтрастных объектов |
CN113343207A (zh) * | 2021-06-07 | 2021-09-03 | 网易(杭州)网络有限公司 | 信息验证的方法、装置、计算机设备及存储介质 |
KR102317230B1 (ko) * | 2021-08-25 | 2021-10-26 | 주식회사 에스에이엠지엔터테인먼트 | 인공지능 기반 아케이드 게임 환경 확장 시스템 |
KR20230053956A (ko) | 2021-10-15 | 2023-04-24 | 김채은 | 잉크 찌꺼기가 나오지 않는 더 매끄러운 쇠볼을 가진 볼펜 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0568268A (ja) * | 1991-03-04 | 1993-03-19 | Sharp Corp | 立体視画像作成装置および立体視画像作成方法 |
JPH089421A (ja) * | 1994-06-20 | 1996-01-12 | Sanyo Electric Co Ltd | 立体映像装置 |
JPH09271043A (ja) * | 1996-03-29 | 1997-10-14 | Olympus Optical Co Ltd | 立体映像ディスプレイ装置 |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH01139507A (ja) | 1987-11-25 | 1989-06-01 | Hitachi Ltd | ドライフラワ輸送容器 |
US5502481A (en) * | 1992-11-16 | 1996-03-26 | Reveo, Inc. | Desktop-based projection display system for stereoscopic viewing of displayed imagery over a wide field of view |
EP0641132B1 (en) * | 1993-08-26 | 1999-04-14 | Matsushita Electric Industrial Co., Ltd. | Stereoscopic image pickup apparatus |
JPH07298307A (ja) * | 1994-04-28 | 1995-11-10 | Canon Inc | 画像記録再生装置 |
KR100414629B1 (ko) * | 1995-03-29 | 2004-05-03 | 산요덴키가부시키가이샤 | 3차원표시화상생성방법,깊이정보를이용한화상처리방법,깊이정보생성방법 |
US6005607A (en) * | 1995-06-29 | 1999-12-21 | Matsushita Electric Industrial Co., Ltd. | Stereoscopic computer graphics image generating apparatus and stereoscopic TV apparatus |
US6163337A (en) * | 1996-04-05 | 2000-12-19 | Matsushita Electric Industrial Co., Ltd. | Multi-view point image transmission method and multi-view point image display method |
GB9611938D0 (en) * | 1996-06-07 | 1996-08-07 | Philips Electronics Nv | Stereo image generation |
WO1998025413A1 (en) * | 1996-12-04 | 1998-06-11 | Matsushita Electric Industrial Co., Ltd. | Optical disc for high resolution and three-dimensional image recording, optical disc reproducing device, and optical disc recording device |
JP3500056B2 (ja) * | 1997-11-10 | 2004-02-23 | 三洋電機株式会社 | 2次元映像を3次元映像に変換する装置および方法 |
WO1999030280A1 (en) * | 1997-12-05 | 1999-06-17 | Dynamic Digital Depth Research Pty. Ltd. | Improved image conversion and encoding techniques |
JP2000209614A (ja) * | 1999-01-14 | 2000-07-28 | Sony Corp | 立体映像システム |
AUPQ119799A0 (en) * | 1999-06-25 | 1999-07-22 | Dynamic Digital Depth Research Pty Ltd | Ddc/3 image conversion and encoding techniques |
US7161614B1 (en) * | 1999-11-26 | 2007-01-09 | Sanyo Electric Co., Ltd. | Device and method for converting two-dimensional video to three-dimensional video |
JP2001218228A (ja) * | 2000-02-01 | 2001-08-10 | Canon Inc | 立体画像撮影用光学系及びそれを用いた立体画像撮影装置 |
JP2001326947A (ja) * | 2000-05-12 | 2001-11-22 | Sony Corp | 立体画像表示装置 |
EP1185112B1 (en) * | 2000-08-25 | 2005-12-14 | Fuji Photo Film Co., Ltd. | Apparatus for parallax image capturing and parallax image processing |
KR20020077692A (ko) * | 2001-04-02 | 2002-10-14 | 에이디아이 코퍼레이션 | 입체영상을 발생시키는 방법 및 장치 |
-
2003
- 2003-03-27 EP EP03715473A patent/EP1489857B1/en not_active Expired - Fee Related
- 2003-03-27 WO PCT/JP2003/003791 patent/WO2003081921A1/ja active Application Filing
- 2003-03-27 EP EP11161295.8A patent/EP2357838B1/en not_active Expired - Lifetime
- 2003-03-27 EP EP11161274A patent/EP2357835A3/en not_active Withdrawn
- 2003-03-27 EP EP11161340A patent/EP2387248A3/en not_active Withdrawn
- 2003-03-27 EP EP11161315A patent/EP2357839A3/en not_active Withdrawn
- 2003-03-27 EP EP11161320A patent/EP2357840A3/en not_active Ceased
- 2003-03-27 EP EP11161269A patent/EP2362670B1/en not_active Expired - Fee Related
- 2003-03-27 EP EP11161329.5A patent/EP2357841B1/en not_active Expired - Fee Related
- 2003-03-27 EP EP20110161281 patent/EP2357836B1/en not_active Expired - Fee Related
- 2003-03-27 EP EP20110161303 patent/EP2381691B1/en not_active Expired - Fee Related
- 2003-03-27 CN CN038071592A patent/CN1643939B/zh not_active Expired - Fee Related
- 2003-03-27 EP EP11161284A patent/EP2357837A3/en not_active Withdrawn
- 2003-03-27 KR KR1020047015105A patent/KR100812905B1/ko not_active IP Right Cessation
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0568268A (ja) * | 1991-03-04 | 1993-03-19 | Sharp Corp | 立体視画像作成装置および立体視画像作成方法 |
JPH089421A (ja) * | 1994-06-20 | 1996-01-12 | Sanyo Electric Co Ltd | 立体映像装置 |
JPH09271043A (ja) * | 1996-03-29 | 1997-10-14 | Olympus Optical Co Ltd | 立体映像ディスプレイ装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1489857A4 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8482598B2 (en) | 2005-03-18 | 2013-07-09 | Ntt Data Sanyo System Corporation | Stereoscopic image display apparatus, stereoscopic image displaying method and computer program product |
KR101417024B1 (ko) | 2007-11-22 | 2014-07-08 | 엘지이노텍 주식회사 | 카메라를 이용한 3d 영상 획득 방법 |
CN103636187A (zh) * | 2011-08-30 | 2014-03-12 | 松下电器产业株式会社 | 摄像装置 |
Also Published As
Publication number | Publication date |
---|---|
EP2357841B1 (en) | 2015-07-22 |
EP2357841A3 (en) | 2012-02-22 |
EP2357837A3 (en) | 2012-02-22 |
EP1489857B1 (en) | 2011-12-14 |
KR100812905B1 (ko) | 2008-03-11 |
EP2357838A2 (en) | 2011-08-17 |
EP2357835A3 (en) | 2012-02-22 |
EP1489857A1 (en) | 2004-12-22 |
EP2387248A3 (en) | 2012-03-07 |
EP2362670A1 (en) | 2011-08-31 |
KR20050004823A (ko) | 2005-01-12 |
EP2357839A3 (en) | 2012-02-29 |
EP2357840A3 (en) | 2012-02-29 |
EP2357836A2 (en) | 2011-08-17 |
EP2362670B1 (en) | 2012-10-03 |
EP2357838B1 (en) | 2016-03-16 |
EP2357840A2 (en) | 2011-08-17 |
EP2387248A2 (en) | 2011-11-16 |
EP2381691A3 (en) | 2012-05-23 |
EP2357838A3 (en) | 2012-02-22 |
CN1643939B (zh) | 2010-10-06 |
EP2381691A2 (en) | 2011-10-26 |
EP2357835A2 (en) | 2011-08-17 |
EP2381691B1 (en) | 2015-05-13 |
EP2357836A3 (en) | 2012-03-07 |
EP2357839A2 (en) | 2011-08-17 |
EP1489857A4 (en) | 2007-07-25 |
EP2357836B1 (en) | 2015-05-13 |
EP2357841A2 (en) | 2011-08-17 |
EP2357837A2 (en) | 2011-08-17 |
CN1643939A (zh) | 2005-07-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2003081921A1 (fr) | Procede de traitement d'images tridimensionnelles et dispositif | |
US8559703B2 (en) | Method and apparatus for processing three-dimensional images | |
JP2004221700A (ja) | 立体画像処理方法および装置 | |
JP3749227B2 (ja) | 立体画像処理方法および装置 | |
JP3857988B2 (ja) | 立体画像処理方法および装置 | |
JP2003284093A (ja) | 立体画像処理方法および装置 | |
JP2004221699A (ja) | 立体画像処理方法および装置 | |
JP4118146B2 (ja) | 立体画像処理装置 | |
JP2003284095A (ja) | 立体画像処理方法および装置 | |
JP3702243B2 (ja) | 立体画像処理方法および装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): CN KR US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): DE FR GB |
|
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 1020047015105 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10949528 Country of ref document: US Ref document number: 20038071592 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2003715473 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 2003715473 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1020047015105 Country of ref document: KR |