JP2004221700A - Stereoscopic image processing method and apparatus


Info

Publication number
JP2004221700A
Authority
JP
Japan
Prior art keywords
parallax
image
stereoscopic image
stereoscopic
dimensional
Prior art date
Legal status
Granted
Application number
JP2003003765A
Other languages
Japanese (ja)
Inventor
Takeshi Masutani
健 増谷
Original Assignee
Sanyo Electric Co Ltd
三洋電機株式会社
Priority date
Filing date
Publication date
Application filed by Sanyo Electric Co Ltd (三洋電機株式会社)
Priority to JP2003003765A
Publication of JP2004221700A
Application status: Granted

Abstract

[PROBLEMS] Because the appropriate parallax differs from one display device to another, programming an optimal three-dimensional display for every device is difficult, and this has hindered the spread of three-dimensional images.
A stereoscopic effect adjusting unit 112 displays a stereoscopic image to the user. When a displayed object reaches the limit parallax, the user responds to the stereoscopic effect adjusting unit 112. Based on the appropriate-parallax information thus acquired, a parallax control unit 114 generates parallax images so that the appropriate parallax is realized in subsequent stereoscopic display. The parallax is controlled by going back to the three-dimensional data and setting the camera parameters optimally. A library of functions for realizing the appropriate parallax is also provided.
[Selection diagram] FIG.

Description

[0001]
TECHNICAL FIELD OF THE INVENTION
The present invention relates to a stereoscopic image processing technique, and more particularly to a method and apparatus for generating or displaying a stereoscopic image based on a parallax image.
[0002]
[Prior art]
In recent years, the inadequacy of network infrastructure was regarded as the problem, but with the transition to broadband now under way, it is rather the limited variety and number of contents that make effective use of the wide band that has begun to stand out. Video has always been the most important means of expression, but much of the work to date has concerned improvements in display quality and data compression ratios, and efforts to widen the range of expression itself appear to lag behind.
[0003]
Under such circumstances, stereoscopic video display (hereinafter simply called stereoscopic display) has long been studied in various forms and has reached practical use in somewhat limited markets, such as theaters and applications using special display devices. Research and development aimed at more realistic content is expected to accelerate, and an era in which individual users enjoy stereoscopic display at home should arrive.
[0004]
Since stereoscopic display is expected to become widespread, display forms unimaginable on current display devices have also been proposed. For example, a technique for displaying a selected partial image of a two-dimensional image in three dimensions has been disclosed (see Patent Document 1).
[0005]
[Patent Document 1]
JP-A-11-39507
[0006]
[Problems to be solved by the invention]
Amid this trend, several problems with stereoscopic display have been pointed out. For example, it is difficult to optimize the parallax that produces the stereoscopic effect. A three-dimensional object is not actually projected; rather, the images for the left and right eyes are given a pixel shift, and it is not easy to make the resulting artificial stereoscopic effect feel natural.
[0007]
In addition, excessive parallax can cause problems, and some viewers of stereoscopic video (hereinafter simply called users) may complain of slight discomfort. This of course has many causes besides the stereoscopic display itself, for example a mismatch between the displayed scene and the situation or sensations around the user. As a rule of thumb, however, such problems tend to be observed when the parallax is too large, in other words, when the stereoscopic effect is too strong.
[0008]
The above concerns human physiology, but there are also technical factors that hinder the spread of stereoscopic video content and applications. Stereoscopic vision is realized by parallax, but when parallax is expressed as the pixel-shift amount between left and right images, hardware differences between display devices mean that the same stereoscopic image may be viewed properly in stereo on one device and not on another. If the parallax representing a distant point exceeds the interocular distance, stereoscopic vision is theoretically impossible. With display devices diversifying in resolution and screen size, as with today's PCs (personal computers), television receivers and portable devices, creating content optimal for stereoscopic display while allowing for all this hardware is difficult; more accurately, no methodology for doing so has been provided.
[0009]
And even if such a methodology were provided, it would be difficult for the average programmer to understand it and use it to create content and applications.
[0010]
The technology disclosed in the above-mentioned document was proposed as one method of solving these problems, but if stereoscopic display is to spread, it will be necessary to propose further techniques, accumulate new technologies, link them together, and apply them to products.
[0011]
The present invention has been made in view of such a background, and an object of the present invention is to propose a new expression method of stereoscopic display. Another object is to generate or display a stereoscopic image appropriate for a user even when the display target image or the display device changes. Still another object is to adjust the stereoscopic effect by a simple operation when stereoscopic display is performed. Still another object is to reduce the burden on a programmer when creating content or an application capable of appropriate three-dimensional display. Still another object is to provide a technology for realizing an appropriate three-dimensional display as a business model.
[0012]
[Means for Solving the Problems]
The insight that forms the basis of the present invention lies in separating the appropriate parallax from factors such as the hardware of the display device and the distance between the user and the display device (hereinafter collectively called "hardware"). That is, the expression of the appropriate parallax is generalized by the camera interval and the optical-axis intersection position, described later, and is thus described once in a general-purpose form independent of hardware. "Independent of hardware" means that reading hardware information unique to the display device is in principle unnecessary; once this general-purpose description exists, parallax images are generated or adjusted based on the appropriate parallax, and the desired stereoscopic display is realized.
[0013]
By providing as a library the control that acquires the appropriate parallax and realizes it when images are displayed stereoscopically, ordinary programmers can call this library and obtain optimal stereoscopic display without being aware of the complex principles of stereoscopic vision or the programming they require.
[0014]
Among the various aspects of the present invention, the first group is based on a technique for acquiring the appropriate parallax from a user's response. The technique can be used for "initial setting" of parallax by the user: once the appropriate parallax has been acquired in the device, that appropriate parallax is realized even when other images are displayed. The technique is also used not only for initial setting but for "manual adjustment", in which the user adjusts the parallax of the image being displayed as desired. The first group is described below.
[0015]
One aspect of the present invention relates to a stereoscopic image processing apparatus comprising: an instruction acquisition unit that acquires a user's response to a stereoscopic image displayed on the basis of a plurality of viewpoint images corresponding to different parallaxes; and a parallax specifying unit that specifies, based on the acquired response, the appropriate parallax for that user.
[0016]
The instruction acquisition unit is provided, for example, as a GUI (graphical user interface; the same applies hereinafter) and first displays the image while changing the parallax between the viewpoint images. When the displayed stereoscopic effect suits the user's preference, the user reports this by operating a button or the like.
[0017]
A “stereoscopic image” is an image displayed with a stereoscopic effect, and the substance of the data is a “parallax image” in which a plurality of images have parallax. Generally, a parallax image is a set of a plurality of two-dimensional images. Each image forming the parallax image is a “viewpoint image” having a corresponding viewpoint. That is, a parallax image is formed by a plurality of viewpoint images, and when the parallax image is displayed, it is displayed as a stereoscopic image. Display of a stereoscopic image is also simply referred to as “stereoscopic display”.
[0018]
"Parallax" is a parameter for creating a stereoscopic effect and can be defined in various ways. For example, it can be expressed as the shift amount, between viewpoint images, of the pixels representing the same point. Hereinafter, this specification follows that definition unless otherwise noted.
[0019]
A range may be specified as the appropriate parallax; in that case the two ends of the range are called the "limit parallaxes". The appropriate parallax may also be specified simply as the maximum value allowable as the parallax of near-placed objects, described later.
[0020]
The stereoscopic image processing apparatus of the present invention may further include a parallax control unit that performs processing so that the specified appropriate parallax is realized even when another image is displayed. When that other image is a stereoscopic image generated starting from three-dimensional data, the parallax control unit may determine, according to the appropriate parallax, the plurality of viewpoints used to generate the stereoscopic image; more specifically, it may determine the interval between the viewpoints and the intersection position of the optical axes along which the object is viewed from those viewpoints. An example of this processing is performed by the camera arrangement determination unit described later. If these processes are performed in real time, optimal stereoscopic display is realized at all times.
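As a concrete illustration (not the patent's own procedure), the camera interval and optical-axis crossing distance can be solved from the two limit parallaxes under a simplified converged-camera model, in which a point at depth z produces a screen-plane parallax of Ec(z − D)/z. The function name, units, and the linear model below are illustrative assumptions:

```python
import numpy as np

def camera_from_limit_parallax(z_near, z_far, p_near, p_far):
    """Solve for the camera interval Ec and the optical-axis crossing
    distance D so that the nearest point (depth z_near) produces the
    near limit parallax p_near and the farthest point (z_far) the far
    limit parallax p_far, all in scene units.

    Simplified converged-camera model: a point at depth z yields a
    screen-plane parallax of Ec * (z - D) / z (far positive).
    """
    # Substituting a = Ec and b = Ec * D gives two linear equations:
    #   near limit:  b / z_near - a = p_near
    #   far limit:   a - b / z_far  = p_far
    A = np.array([[-1.0, 1.0 / z_near],
                  [ 1.0, -1.0 / z_far]])
    a, b = np.linalg.solve(A, np.array([p_near, p_far]))
    return a, b / a  # Ec, D

# Nearest object at depth 50, farthest at 200, limit parallaxes 2.0 / 3.0:
Ec, D = camera_from_limit_parallax(50.0, 200.0, 2.0, 3.0)
print(f"camera interval Ec = {Ec:.2f}, crossing distance D = {D:.2f}")
```

Converting the scene-unit result into pixels depends on the screen mapping, which the hardware-dependent stage handles.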
[0021]
The parallax control unit may perform control such that appropriate parallax is realized in a predetermined basic three-dimensional space to be displayed. An example of this processing is performed by a projection processing unit described later.
[0022]
The parallax control unit may perform control so that the appropriate parallax is realized at the coordinates of the nearest object and of the farthest object in the three-dimensional space. An example of this processing is performed by a projection processing unit described later. The objects may be static.
[0023]
"Near placement" refers to a state in which parallax is given so that a point is perceived in stereo in front of the plane that contains the intersection of the optical axes of the cameras placed at the respective viewpoints (the intersection is hereinafter called the "optical-axis intersection position", and the plane the "optical-axis intersection plane"). "Far placement" refers to a state in which parallax is given so that a point is perceived in stereo behind the optical-axis intersection plane. The larger the parallax of a near-placed object, the closer it is perceived to the user; the larger the parallax of a far-placed object, the farther away it is perceived. Unless otherwise specified, near parallax and far parallax are both defined as non-negative values with no sign reversal between them, and both are zero at the optical-axis intersection plane.
[0024]
In a portion of the displayed object or space where there is no parallax, the optical axis intersection plane matches the screen surface of the display device. This is because, for a pixel to which no parallax is attached, the line of sight viewed from the left and right eyes reaches the same position in the screen plane, that is, intersects there.
[0025]
When the other image consists of a plurality of two-dimensional images to which parallax has already been given, the parallax control unit may determine the horizontal shift amount of those two-dimensional images according to the appropriate parallax. In this aspect the input for stereoscopic display is not generated with full freedom starting from three-dimensional data; it is an already generated parallax image whose parallax is fixed. In that case it is impossible to return to the original three-dimensional space, or to the real space where the image was actually shot, to change the camera positions and redraw or re-shoot. The parallax is therefore adjusted by horizontally shifting the viewpoint images forming the parallax image, or the pixels contained in them.
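A minimal sketch of such a horizontal shift, assuming numpy arrays for the viewpoint images; edge replication here stands in for the edge treatment performed by the image edge adjustment unit described later:

```python
import numpy as np

def shift_right_view(left, right, shift_px):
    """Change the overall parallax of an existing stereo pair by
    shifting the right viewpoint image horizontally by shift_px
    pixels. Columns exposed at the edge are filled by replicating
    the nearest valid column. left, right: HxWx3 uint8 arrays."""
    shifted = np.roll(right, shift_px, axis=1)
    if shift_px > 0:
        shifted[:, :shift_px] = shifted[:, shift_px:shift_px + 1]
    elif shift_px < 0:
        shifted[:, shift_px:] = shifted[:, shift_px - 1:shift_px]
    return left, shifted
```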
[0026]
When the other image is a planar image to which depth information has been given (hereinafter also called an "image with depth information"), the parallax control unit may adjust the depth according to the appropriate parallax. An example of this processing is performed by the two-dimensional image generation unit of the third stereoscopic image processing apparatus described later.
[0027]
The stereoscopic image processing apparatus may further include a parallax holding unit that records the appropriate parallax, and the parallax control unit may read the appropriate parallax at a predetermined timing, for example when the apparatus starts up or when the apparatus's stereoscopic image processing function, or part of it, starts, and use that value as the initial value for processing. "Start-up" here may be meant in either a hardware or a software sense. According to this aspect, once the user has determined the appropriate parallax, the stereoscopic effect is adjusted automatically thereafter. This is the function also called "initial setting of the appropriate parallax".
[0028]
Another aspect of the present invention relates to a stereoscopic image processing method comprising the steps of: displaying to a user a plurality of stereoscopic images with different parallaxes; and specifying the appropriate parallax for that user based on the user's responses to the displayed stereoscopic images.
[0029]
Still another aspect of the present invention also relates to a stereoscopic image processing method, which includes a step of acquiring the appropriate parallax that depends on a user and a step of processing an image before display so that the acquired appropriate parallax is realized. "Acquisition" here may be an active process of identification, or simply reading from the parallax holding unit or the like.
[0030]
If these steps are implemented as functions of a stereoscopic display library, and the library's functions can be called from multiple programs, programmers are freed from writing each program with the hardware of the stereoscopic display device in mind, which is effective.
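A minimal sketch of what such a library boundary could look like; the storage location, field names, and defaults below are assumptions for illustration, not the patent's API:

```python
import json, os

# Hypothetical storage location and field names, for illustration only.
_STORE = os.path.expanduser("~/.proper_parallax.json")

def save_proper_parallax(near_px: int, far_px: int) -> None:
    """Record the user's appropriate parallax range (the 'initial
    setting'); any program may call this once after the setup GUI."""
    with open(_STORE, "w") as f:
        json.dump({"near_px": near_px, "far_px": far_px}, f)

def load_proper_parallax() -> dict:
    """Read the stored range at start-up; conservative defaults if
    the user has never performed the initial setting."""
    if not os.path.exists(_STORE):
        return {"near_px": 10, "far_px": 15}
    with open(_STORE) as f:
        return json.load(f)

def required_shift(detected_near_px: int) -> int:
    """Pixels by which a parallax image must be shifted so that its
    detected maximum near parallax stays within the stored limit."""
    return max(0, detected_near_px - load_proper_parallax()["near_px"])
```

The point of the boundary is that calling programs never see the hardware-dependent details; they only hand over images and receive adjusted ones.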
[0031]
The second group of the present invention is based on a technique for adjusting parallax according to a user's instructions. The technique can be used for "manual adjustment" of parallax by the user, who can change the stereoscopic effect of the image being displayed as desired. It can also be used not only for manual adjustment but for reading the appropriate parallax and automatically adjusting the parallax of an image when it is displayed stereoscopically. The difference from the automatic adjustment of the first group is that the automatic adjustment of the second group acts on two-dimensional parallax images or on images with depth information; changing the parallax of images generated from three-dimensional data uses the techniques of the first group. The second group is described below.
[0032]
One aspect of the present invention relates to a stereoscopic image processing apparatus comprising: an instruction acquisition unit that acquires a user's instruction concerning a stereoscopic image displayed from a plurality of viewpoint images; and a parallax control unit that changes the parallax amount between the plurality of viewpoint images according to the acquired instruction. An example of this processing is shown in FIG. 45, described later, and is a typical example of "manual adjustment". It is convenient if the user's instruction is given through a simple GUI such as a button operation.
[0033]
Another aspect of the present invention also relates to a stereoscopic image processing apparatus comprising: a parallax amount detection unit that detects a first parallax amount produced when a stereoscopic image is displayed from a plurality of viewpoint images; and a parallax control unit that changes the parallax amount between the plurality of viewpoint images so that the first parallax amount falls within the range of a second parallax amount, the parallax amount permitted by the user. This is a typical example of "automatic adjustment", and the appropriate parallax described above can be used as the second parallax amount. An example of this processing is shown in FIG. 46, described later.
[0034]
The parallax amount detection unit may detect the maximum value of the first parallax amount, and the parallax control unit may change the parallax amount between the plurality of viewpoint images so that this maximum does not exceed the maximum value of the second parallax amount. The intention is to keep the maximum parallax amount, that is, the limit parallax, so as to avoid an excessive stereoscopic effect caused by excessive parallax. The maximum value here may be taken as the maximum on the near side.
[0035]
The parallax amount detection unit may detect the first parallax amount by computing corresponding-point matching between the plurality of viewpoint images, or by reading a first parallax amount recorded in advance in the header of one of the viewpoint images. Examples of these processes are shown in FIG. 47, described later.
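For illustration, here is a brute-force scanline block matching that estimates the maximum parallax of a pair when no value is recorded in the header; real implementations would use faster correspondence search, and the patent does not prescribe this particular method:

```python
import numpy as np

def max_parallax(left, right, block=8, max_disp=32):
    """Estimate the maximum parallax of a stereo pair by brute-force
    block matching along scanlines. left, right: HxW float grayscale
    arrays; returns the largest disparity found, in pixels."""
    h, w = left.shape
    best = 0
    for y in range(0, h - block, block):
        for x in range(max_disp, w - block, block):
            patch = left[y:y + block, x:x + block]
            errs = [np.abs(patch - right[y:y + block,
                                         x - d:x - d + block]).mean()
                    for d in range(max_disp)]
            best = max(best, int(np.argmin(errs)))
    return best
```

This is O(H·W·max_disp) and serves only to make the idea concrete; restricting the search to an important region, as the matching unit may do, reduces the cost.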
[0036]
The parallax control unit may change the parallax amount between the plurality of viewpoint images by shifting their synthesis positions; this is common to the processes of the figures cited above. The shift of the synthesis position is a pixel-unit shift of entire images in the horizontal or vertical direction. If the input is an image with depth information, the parallax control unit may change the parallax amount by adjusting the depth information.
[0037]
Another aspect of the present invention relates to a stereoscopic image processing method comprising: a step of acquiring a user's instruction concerning a stereoscopic image displayed on the basis of a plurality of viewpoint images; and a step of changing the parallax amount between the plurality of viewpoint images according to the instruction.
[0038]
Still another aspect of the present invention also relates to a stereoscopic image processing method comprising: a step of detecting a first parallax amount produced when a stereoscopic image is displayed from a plurality of viewpoint images; and a step of changing the parallax amount between the plurality of viewpoint images so that the first parallax amount falls within the range of a second parallax amount, the parallax amount permitted by the user.
[0039]
These steps may be implemented as functions of a library for stereoscopic display, and the library's functions may be called from a plurality of programs.
[0040]
A third group of the present invention is based on a technique for correcting parallax according to position within the image. This "automatic correction" acts to reduce the user's sense of incongruity with, or rejection of, the stereoscopic display, and can be combined with the technologies of the first and second groups. In stereoscopic display generally, technical and physiological problems have been pointed out, for example that the displacement between viewpoint images grows toward the edges of the image and more readily produces a sense of incongruity there. The third group reduces this problem by processing such as weakening the parallax near the edges of the image, or adjusting the parallax so that objects shift from near placement toward far placement. The third group is described below.
[0041]
One aspect of the present invention relates to a stereoscopic image processing apparatus comprising: a parallax control unit that corrects the parallax between a plurality of viewpoint images used to display a stereoscopic image; and a map holding unit that stores a correction map consulted by the parallax control unit during this processing. The correction map describes how the parallax is to be corrected according to position within the viewpoint images. Correction maps include a parallax correction map, a distance-sense correction map, and the like.
[0042]
The parallax control unit, for example, weakens the parallax in the periphery of the viewpoint images, or changes the parallax so that objects are perceived farther from the user. The parallax control unit may also change the parallax by selectively processing only some of the plurality of viewpoint images. One way such a correction map could be realized is sketched below.
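A sketch of one possible correction map, assuming the map is a per-pixel weight multiplied into a disparity map; the falloff shape and factors are assumptions, not the patent's map:

```python
import numpy as np

def edge_attenuation_map(h, w, margin=0.2, floor=0.5):
    """Example correction map: weight 1.0 in the centre of the image,
    falling linearly to `floor` at the borders. Multiplying a disparity
    map by it weakens parallax near the image edges."""
    y = np.clip(np.minimum(np.arange(h), np.arange(h)[::-1]) / (h * margin), 0, 1)
    x = np.clip(np.minimum(np.arange(w), np.arange(w)[::-1]) / (w * margin), 0, 1)
    return floor + (1.0 - floor) * np.minimum(y[:, None], x[None, :])

# disparity_corrected = disparity * edge_attenuation_map(*disparity.shape)
```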
[0043]
When the plurality of viewpoint images are generated from three-dimensional data, that is, when the viewpoint images can be regenerated by returning to the three-dimensional space, the parallax control unit may change the parallax by controlling the camera parameters used to generate the viewpoint images. The camera parameters include the interval between the left and right cameras, the angle at which the object is viewed from the cameras, and the optical-axis intersection position.
[0044]
Similarly, when the plurality of viewpoint images are generated from three-dimensional data, the parallax control unit may change the parallax by distorting the three-dimensional space itself, for example in the world coordinate system, before generating the viewpoint images. When the plurality of viewpoint images are generated from an image with depth information, on the other hand, the parallax control unit may change the parallax by manipulating the depth information.
[0045]
Another aspect of the present invention relates to a stereoscopic image processing method comprising: a step of acquiring a plurality of viewpoint images for displaying a stereoscopic image; and a step of changing the parallax between the acquired viewpoint images according to position within those images. These steps may be implemented as functions of a library for stereoscopic display, and the library's functions may be called from a plurality of programs.
[0046]
A fourth group of the present invention relates to a technology that provides the first to third groups and their related functions as a software library, reduces the burden on programmers and users, and promotes the spread of stereoscopic image display applications. Hereinafter, the fourth group will be described.
[0047]
One aspect of the present invention relates to a stereoscopic image processing method in which information related to stereoscopic image display is held in memory and shared among a plurality of different programs; when one of those programs displays a stereoscopic image, the state of the image to be output is determined with reference to the held information. An example of the state of an image is whether parallax is given to the parallax image, and to what degree.
[0048]
The "held information" may include any of the following: the format of the image input to the stereoscopic image display device, the display order of the viewpoint images, and the parallax amount between the viewpoint images. Besides sharing the held information, processing unique to stereoscopic image display may itself be shared by the programs. One example of "processing unique to stereoscopic image display" is the processing that determines the held information. Other examples include processing for the graphical user interface used to determine the appropriate parallax, processing that displays a parallax-adjustment screen to help realize the appropriate-parallax state, processing that detects and tracks the user's head position, and processing that displays an image for adjusting the stereoscopic display device.
[0049]
Another aspect of the present invention relates to a stereoscopic image processing apparatus comprising: a stereoscopic effect adjusting unit that provides the user with a graphical user interface for adjusting the stereoscopic effect of a stereoscopically displayed image; and a parallax control unit that generates parallax images in a manner that protects the limit parallax determined by the user's adjustment of the stereoscopic effect.
[0050]
The apparatus may further include an information detection unit that acquires information to be consulted in order to optimize the stereoscopic image display, and a conversion unit that converts the format of the parallax images generated by the parallax control unit according to the acquired information.
[0051]
The parallax control unit may generate the parallax images while keeping the limit parallax by controlling camera parameters based on three-dimensional data, by controlling the depth of an image with depth information, or by determining the horizontal shift amount of a plurality of two-dimensional images that already have parallax.
[0052]
A fifth group of the present invention relates to applications and business models that use the stereoscopic image processing technologies described above or their related technologies. The software library of the fourth group can be used here. The fifth group is described below.
[0053]
One aspect of the present invention relates to a stereoscopic image processing method in which the appropriate parallax for stereoscopically displaying a parallax image is first converted into an expression format that does not depend on the hardware of the display device, and the appropriate parallax in this format is then circulated among different display devices.
[0054]
Another aspect of the present invention also relates to a stereoscopic image processing method comprising the steps of: reading into a second display device the appropriate parallax of a user acquired on a first display device; adjusting, on the second display device, the parallax between parallax images according to that appropriate parallax; and outputting the adjusted parallax images from the second display device. For example, the first display device is the device the user normally uses, and the second display device is a device installed elsewhere. The method may further comprise the steps of: reading information about the hardware of the first display device into the second display device; and correcting, based on the hardware information read from the first display device and the hardware information of the second display device, the parallax of the parallax images adjusted in the adjusting step.
[0055]
The information on hardware may include at least one of the size of the display screen, the optimal observation distance of the display device, and the image separation performance of the display device.
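As a sketch of how a device-independent value could be mapped onto a concrete device using two of these items, suppose the distributed appropriate parallax is expressed as a visual angle (one plausible hardware-independent format; the patent itself generalizes via the camera interval and optical-axis intersection position):

```python
import math

def parallax_angle_to_px(theta_rad, view_dist_mm, screen_w_mm, screen_w_px):
    """Convert a display-independent parallax, expressed as a visual
    angle, into a pixel shift on a specific display, using the screen
    size and the optimal observation distance listed above."""
    on_screen_mm = 2.0 * view_dist_mm * math.tan(theta_rad / 2.0)
    return on_screen_mm * screen_w_px / screen_w_mm

# The same stored value yields different pixel shifts per device:
theta = math.radians(0.5)
px_tv     = parallax_angle_to_px(theta, 2000.0, 930.0, 1920)  # living-room TV
px_mobile = parallax_angle_to_px(theta, 300.0, 60.0, 240)     # handset
```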
[0056]
Another aspect of the present invention relates to a stereoscopic image processing system comprising a first display device, a second display device, and a server connected via a network. The first display device sends to the server the appropriate-parallax information of a user acquired on that device; the server receives the appropriate-parallax information and records it in association with the user; and when the user requests output of image data on the second display device, that device reads the user's appropriate-parallax information from the server, adjusts the parallax accordingly, and outputs a parallax image.
[0057]
A sixth group of the present invention is based on a technology for proposing a new expression method using a stereoscopic image.
[0058]
One aspect of the present invention relates to a stereoscopic image processing apparatus that displays a stereoscopic image based on a plurality of viewpoint images corresponding to different parallaxes, comprising: a recommended parallax acquisition unit that acquires the parallax range recommended when a stereoscopic image is displayed on the stereoscopic image display device; and a parallax control unit that sets the parallax so that the stereoscopic image is displayed within the acquired recommended parallax range.
[0059]
The apparatus may further comprise: an object specification unit that receives from the user the specification of a predetermined object included in the stereoscopic image; and an optical-axis intersection position setting unit that associates the optical-axis intersection position of the plurality of viewpoint images with the position of the specified object, setting the intersection position so that the object is expressed near the position of the display screen on which the stereoscopic image is displayed.
[0060]
For the specified object, a specification information adding unit may further associate with the object optical-axis correspondence information describing that the object is associated with the optical-axis intersection position and is to be expressed near the position of the display screen.
[0061]
The optical-axis intersection position setting unit may acquire the optical-axis correspondence information, associate the optical-axis intersection position with the object described in the acquired information, and thereby cause the object to be expressed near the position of the display screen on which the stereoscopic image is displayed.
[0062]
The apparatus may further comprise an identification information acquisition unit that acquires identification information which is associated with the image data used to generate the stereoscopic image and which indicates whether each object included in the stereoscopic image is to be expressed within the basic expression space, the space containing the objects to be displayed stereoscopically; the parallax control unit may reflect the acquired identification information when expressing objects in the stereoscopic image.
[0063]
The identification information may further include information on the timing at which the object is expressed within the basic expression space, and the acquired timing may be reflected when the object is expressed in the stereoscopic image.
[0064]
Another aspect of the present invention relates to a stereoscopic image processing method. This method makes selectable a predetermined object included in a stereoscopic image displayed on the basis of a plurality of viewpoint images corresponding to different parallaxes; when an object is selected, the optical-axis intersection position associated with the plurality of viewpoint images is made to correspond to the position of the selected object, and the intersection position is made to substantially coincide with the position of the display screen on which the stereoscopic image is displayed. According to this method the display screen can be set at the boundary between far space and near space, and the object can be expressed as if it were coming toward the viewer through the display screen.
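In the converged-camera model sketched earlier, this amounts to snapping the crossing distance D to the selected object's depth, which drives the object's parallax to zero; a toy numeric check with arbitrary values:

```python
def parallax(z, Ec, D):
    """Converged-camera screen parallax of a point at depth z
    (far positive, near negative), as in the model sketched earlier."""
    return Ec * (z - D) / z

Ec, D = 4.0, 120.0            # current rig
D = 80.0                      # user selects an object at depth 80
print(parallax(80.0, Ec, D))  # 0.0 -> the object sits on the screen plane
print(parallax(60.0, Ec, D))  # negative -> perceived in front of the screen
```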
[0065]
Further, the designated object may have a predetermined interface, and the optical axis intersection position setting unit may associate the optical axis intersection position on the interface. Further, a three-dimensional image may be generated starting from three-dimensional data. When a three-dimensional image is generated starting from three-dimensional data, it is easy to add various effects to the three-dimensional image. For example, when an object is expressed so as to exceed an interface, that is, beyond a display screen, an effect of deforming the display screen can be added.
[0066]
Still another aspect of the present invention also relates to a stereoscopic image processing method. This method sets, as part of a stereoscopic image, an interface that divides the space near the display screen on which a stereoscopic image generated from a plurality of viewpoint images corresponding to different parallaxes is shown, and expresses the stereoscopic image using that interface as the boundary between near space and far space. The interface may be a boundary surface between substances, or a thin plate; examples of thin plates include a glass plate and even a sheet of paper.
[0067]
Still another aspect of the present invention relates to a stereoscopic image processing method. In this method, the moving speed of an object expressed within the basic expression space, the space included in a stereoscopic image generated from a plurality of viewpoint images corresponding to different parallaxes and containing the objects to be displayed stereoscopically, is changed for movement in the near or the far direction.
[0068]
Yet another aspect of the present invention also relates to a stereoscopic image processing method. In this method, when a stereoscopic image is generated from a plurality of viewpoint images corresponding to different parallaxes, the objects to be expressed within the basic expression space containing the objects to be displayed stereoscopically are expressed within a predetermined parallax range, and at least one of the front surface and the rear surface of the basic expression space is set at a position where no object exists.
[0069]
Yet another aspect of the present invention also relates to a stereoscopic image processing method. In this method, when the parallax of an object to be expressed within the basic expression space is calculated in generating a stereoscopic image from a plurality of viewpoint images corresponding to different parallaxes, the parallax is calculated not from the actual size of the object but from a size that includes an extended area in front of the object. When the object, together with its front extended area, has moved to the front of the basic expression space and then moves further forward, the object may be expressed as moving within the front extended area.
[0070]
Yet another aspect of the present invention also relates to a stereoscopic image processing method. In this method, when the parallax of an object to be expressed within the basic expression space is calculated in generating a stereoscopic image from a plurality of viewpoint images corresponding to different parallaxes, the parallax is calculated not from the actual size of the object but from a size that includes an extended area behind the object. When the object, together with its rear extended area, has moved to the rear of the basic expression space and then moves further rearward, the object may be expressed as moving within the rear extended area.
[0071]
A seventh group of the present invention is based on a technique of adjusting a parallax to be set according to an image state.
[0072]
One aspect of the present invention relates to a stereoscopic image processing apparatus. When generating a stereoscopic image from three-dimensional data, this apparatus has a parallax control unit that controls the parallax so that it does not become larger than the parallax range within which the ratio between the width and the depth of an object expressed in the stereoscopic image is perceived correctly by human eyes.
[0073]
Another aspect of the present invention also relates to a stereoscopic image processing apparatus. This apparatus generates a stereoscopic image from a two-dimensional image given depth information, and includes a parallax control unit that controls the parallax so that it does not become larger than the parallax range within which the ratio between the width and the depth of an object expressed in the stereoscopic image is perceived correctly by human eyes.
[0074]
Still another aspect of the present invention also relates to a stereoscopic image processing apparatus. This apparatus includes an image determination unit that performs frequency analysis on a stereoscopic image to be displayed on the basis of a plurality of viewpoint images corresponding to different parallaxes, and a parallax control unit that adjusts the parallax amount according to the amount of high-frequency components found by the frequency analysis. When the amount of high-frequency components is large, the parallax control unit may adjust the parallax amount upward.
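The patent does not fix the transform; as one possible realization of the frequency analysis, the fraction of FFT energy above a cutoff can serve as the "amount of high-frequency components" that scales the parallax budget (function names and the gain are assumptions):

```python
import numpy as np

def high_freq_ratio(gray, cutoff=0.125):
    """Fraction of spectral energy above `cutoff` cycles/pixel;
    one possible measure of the 'amount of high-frequency components'."""
    spec = np.abs(np.fft.fft2(gray)) ** 2
    fy = np.fft.fftfreq(gray.shape[0])[:, None]
    fx = np.fft.fftfreq(gray.shape[1])[None, :]
    return spec[np.hypot(fy, fx) > cutoff].sum() / spec.sum()

def scaled_parallax(base_px, ratio, gain=0.5):
    """More high-frequency detail -> allow a somewhat larger parallax."""
    return base_px * (1.0 + gain * ratio)
```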
[0075]
Still another aspect of the present invention also relates to a stereoscopic image processing apparatus. This apparatus includes an image determination unit that detects the motion of a stereoscopic image displayed on the basis of a plurality of viewpoint images corresponding to different parallaxes, and a parallax control unit that adjusts the parallax amount according to the amount of motion. The parallax control unit may adjust the parallax amount downward when the motion of the stereoscopic image is small.
[0076]
Still another aspect of the present invention also relates to a stereoscopic image processing apparatus. When generating a stereoscopic image from three-dimensional data, this apparatus controls any change in a parameter of the camera arrangement set for generating the parallax images so that the change stays within a threshold provided in advance for that camera parameter. According to this apparatus, situations in which the parallax changes abruptly and the viewer of the stereoscopic image feels a sense of incongruity can be reduced.
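A sketch of such per-parameter clamping (the threshold values are placeholders); the same limiter could also serve the depth-range control of the next paragraph:

```python
def rate_limit(value, previous, max_step):
    """Clamp a parameter's frame-to-frame change to max_step so the
    parallax cannot jump abruptly; each camera parameter (interval,
    crossing distance, angle of view) would get its own threshold."""
    return previous + max(-max_step, min(max_step, value - previous))

# A requested jump of the camera interval from 4.0 to 7.0 in one frame
# is limited to a step of 0.2:
Ec = rate_limit(7.0, 4.0, 0.2)   # -> 4.2
```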
[0077]
Still another aspect of the present invention also relates to a stereoscopic image processing apparatus. When generating a stereoscopic moving image from a two-dimensional moving image given depth information, this apparatus controls the changes in the maximum or minimum depth value contained in the depth information that occur as the two-dimensional moving image progresses, so that they stay within a threshold provided in advance. According to this apparatus, situations in which the parallax changes abruptly and the viewer of the stereoscopic image feels a sense of incongruity can be reduced.
[0078]
Still another embodiment of the present invention relates to a stereoscopic image processing method. In this stereoscopic image processing method, an appropriate parallax of a stereoscopic image displayed based on a plurality of viewpoint images corresponding to different parallaxes is set for each scene.
[0079]
Still another embodiment of the present invention relates to a stereoscopic image processing method. In this stereoscopic image processing method, an appropriate parallax of a stereoscopic image displayed based on a plurality of viewpoint images corresponding to different parallaxes is set at predetermined time intervals.
[0080]
Another aspect of the present invention relates to a stereoscopic image processing apparatus comprising: a camera arrangement setting unit that, when original data serving as the starting point of a stereoscopic image is input, sets the arrangement of a plurality of virtual cameras for generating a plurality of viewpoint images; an object area determination unit that determines whether an area containing no information of the object to be displayed occurs in a viewpoint image generated for any of the virtual cameras; and a camera parameter adjustment unit that, when such an area occurs, adjusts at least one of the virtual cameras' angle of view, the camera interval, and the optical-axis intersection position so that no area without object information remains.
[0081]
It is to be noted that any combination of the above-described components and any conversion of the expression of the present invention between a method, an apparatus, a system, a recording medium, a computer program, and the like are also effective as embodiments of the present invention.
[0082]
BEST MODE FOR CARRYING OUT THE INVENTION
FIG. 1 shows the positional relationship among a user 10, a screen 12, and a reproduced object 14 displayed stereoscopically. The interocular distance of the user 10 is E, the distance between the user 10 and the screen 12 is D, and the width of the reproduced object 14 when displayed is W. Since the reproduced object 14 is displayed stereoscopically, it has pixels perceived nearer than the screen 12 (near-placed pixels) and pixels perceived farther than the screen 12 (far-placed pixels). Pixels with no parallax are seen at the same position on the screen 12 by both eyes and are therefore perceived on the screen surface itself.
[0083]
FIG. 2 shows a photographing system that produces the ideal display of FIG. 1. Two cameras 22 and 24 are placed at interval E, the distance from the cameras to the optical-axis intersection position when viewing a real object 20 (called the optical-axis intersection distance) is D, and the angle of view is chosen so as to span the same width W as the screen 12 at that distance. If an object 20 whose actual width is W is photographed in this arrangement, a parallax image is obtained from the two cameras; displaying it on the screen 12 of FIG. 1 realizes the ideal state of FIG. 1.
[0084]
FIGS. 3 and 4 show the positional relationship of FIG. 2 multiplied by A (A < 1) and by B (B > 1), respectively. Parallax images obtained with these positional relationships also realize the ideal state of FIG. 1. That is, ideal stereoscopic display starts from keeping W : D : E constant, and this relationship is also the basis for handling parallax.
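This invariance is easy to verify numerically: in a pinhole model, the pixel parallax of a point depends only on the ratios among the scene lengths, so scaling E, D, W (and the scene) by a common factor A leaves it unchanged. A small check, with illustrative numbers:

```python
def pixel_parallax(z, E, D, W, width_px):
    """Screen parallax in pixels for a point at depth z seen by two
    cameras at interval E converged at distance D, photographing a
    width-W field rendered to width_px pixels (pinhole model)."""
    on_plane = E * (z - D) / z        # parallax at the convergence plane
    return on_plane * width_px / W    # scene units -> pixels

# Scaling every length by A leaves the pixel parallax unchanged:
for A in (0.5, 1.0, 2.0):
    print(pixel_parallax(z=1.5 * A, E=0.065 * A, D=1.0 * A,
                         W=0.4 * A, width_px=640))   # same value each time
```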
[0085]
FIGS. 5 to 10 outline the processing by which stereoscopic display is produced from the three-dimensional data of the object 20 in the embodiment.
FIG. 5 shows the model coordinate system, that is, the coordinate space of an individual three-dimensional object 20. The coordinates given when modeling the object 20 are expressed in this space. Usually the origin is at the center of the object 20.
[0086]
FIG. 6 shows the world coordinate system. The world space is a large space in which a scene is formed by arranging the objects 20, floors, and walls. The process up to the modeling in FIG. 5 and the determination of the world coordinate system in FIG. 6 can be recognized as “construction of three-dimensional data”.
[0087]
FIG. 7 shows the camera coordinate system. Placing the camera 22 at an arbitrary position in the world coordinate system, facing an arbitrary direction with an arbitrary angle of view, performs the conversion into the camera coordinate system. Camera position, direction, and angle of view are the camera parameters. In stereoscopic display the parameters are determined for two cameras, so the camera interval and the optical-axis intersection position are determined as well. The origin is also moved to the midpoint between the two cameras.
[0088]
FIGS. 8 and 9 show the perspective coordinate system. First, as shown in FIG. 8, the space to be displayed is clipped by a front projection plane 30 and a rear projection plane 32. As described later, one feature of the embodiment is that the plane containing the point of maximum near parallax becomes the front projection plane 30 and the plane containing the point of maximum far parallax becomes the rear projection plane 32. After clipping, the view volume is converted into a rectangular parallelepiped as shown in FIG. 9. The processing in FIGS. 8 and 9 is also called projection processing.
[0089]
FIG. 10 shows a screen coordinate system. In the case of stereoscopic display, images from each of a plurality of cameras are converted into a coordinate system of a screen, and a plurality of two-dimensional images, that is, parallax images are generated.
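A compact sketch of this chain of coordinate transforms with homogeneous 4x4 matrices, for one camera of the pair (a standard graphics pipeline simplified to a symmetric square frustum; the numbers are placeholders, not the patent's matrices):

```python
import numpy as np

def translate(tx, ty, tz):
    m = np.eye(4)
    m[:3, 3] = [tx, ty, tz]
    return m

def perspective(near, far, width):
    """Maps the view volume clipped by the front (near) and rear (far)
    projection planes to a rectangular parallelepiped, as in FIGS. 8
    and 9 (symmetric square frustum for simplicity)."""
    m = np.zeros((4, 4))
    m[0, 0] = m[1, 1] = 2.0 * near / width
    m[2, 2] = (far + near) / (far - near)
    m[2, 3] = -2.0 * far * near / (far - near)
    m[3, 2] = 1.0
    return m

# model -> world -> camera -> perspective:
model_to_world  = translate(0, 0, 100)    # place the object in the scene
world_to_camera = translate(3.25, 0, 0)   # offset by half a camera interval
proj = perspective(near=50.0, far=200.0, width=40.0)

p_model = np.array([1.0, 2.0, 0.0, 1.0])  # vertex in model coordinates
clip = proj @ world_to_camera @ model_to_world @ p_model
screen_xy = clip[:2] / clip[3]            # normalized screen coordinates
```

Running the same vertex through the second camera's transform (offset in the opposite direction) and differencing the horizontal screen coordinates yields the pixel parallax discussed above.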
[0090]
FIGS. 11, 12, and 13 show the configurations of three stereoscopic image processing apparatuses 100 that differ in part from one another. For convenience they are also called the first, second, and third stereoscopic image processing apparatuses 100. They could be integrated into a single apparatus, but are divided into three here to keep the drawings simple. The first stereoscopic image processing apparatus 100 is effective when the objects and space to be drawn are available from the three-dimensional-data stage, so its main input is three-dimensional data. The second stereoscopic image processing apparatus 100 is effective for adjusting a plurality of two-dimensional images to which parallax has already been given, that is, for parallax adjustment of existing parallax images, so its input is two-dimensional parallax images. The third stereoscopic image processing apparatus 100 realizes the appropriate parallax by manipulating the depth information of images with depth information, so its main input is images with depth information. These three kinds of input are collectively called "original data".
[0091]
When the first to third stereoscopic image processing apparatuses 100 are mounted integrally, an "image format determination unit" may be provided as a preprocessing stage that distinguishes three-dimensional data, parallax images, and images with depth information, and activates whichever of the first to third stereoscopic image processing apparatuses 100 is optimal.
[0092]
The first stereoscopic image processing apparatus 100 has "initial setting" and "automatic adjustment" functions for setting the stereoscopic effect of stereoscopic display. When the user specifies his or her own appropriate parallax range for a stereoscopically displayed image, the system acquires it; thereafter, whenever another stereoscopic image is to be displayed, conversion processing that realizes this appropriate parallax is performed before display. Hence with the first stereoscopic image processing apparatus 100 the user can, in principle, enjoy stereoscopic display suited to himself or herself after going through the setting procedure only once.
[0093]
The first stereoscopic image processing apparatus 100 further has a sub-function of "parallax correction" that artificially weakens the parallax in the peripheral part of the image. As mentioned above, the displacement between viewpoint images is more readily perceived as a "double image" the closer it is to the edge of the image, mainly because of device-side errors such as parallax-barrier misalignment or warping of the display screen. Near the edges of the image, therefore, various methods are possible, such as: 1) reducing both the near parallax and the far parallax; 2) reducing the near parallax while leaving the far parallax unchanged; and 3) shifting everything toward far parallax regardless of whether it is near or far parallax. Note that the "parallax correction" function also exists in the third stereoscopic image processing apparatus 100, but its processing differs because the input data differ.
[0094]
The first stereoscopic image processing apparatus 100 includes: a stereoscopic effect adjusting unit 112 that adjusts the stereoscopic effect based on the user's responses to stereoscopically displayed images; a parallax information holding unit 120 that stores the appropriate parallax specified by the stereoscopic effect adjusting unit 112; a parallax control unit 114 that reads the appropriate parallax from the parallax information holding unit 120 and generates, from the original data, parallax images having the appropriate parallax; an information acquisition unit 118 that acquires the hardware information of the display device and the stereoscopic display method used; and a format conversion unit 116 that changes the format of the parallax images generated by the parallax control unit 114 based on the information acquired by the information acquisition unit 118. Here the original data is simply called three-dimensional data; strictly speaking, it is the object and space data described in the world coordinate system.
[0095]
Examples of the information acquired by the information acquisition unit 118 include the number of viewpoints for stereoscopic display, the method of the stereoscopic display device (space division, time division, and so on), whether shutter glasses are used, the arrangement of the viewpoint images in a multi-view system, whether the parallax images include a viewpoint-image arrangement with inverted parallax, and the result of head tracking. Only the head-tracking result is, exceptionally, input directly to the camera arrangement determination unit 132 via a path not shown, and is processed there.
[0096]
The above configuration can be realized in hardware by the CPU, memory, and other LSIs of any computer, and in software by a program having a GUI function, a parallax control function, and other functions; what is drawn here are the functional blocks realized by their cooperation. Those skilled in the art will therefore understand that these functional blocks can be realized in various forms, by hardware alone, by software alone, or by their combination, and the same applies to the configurations that follow.
[0097]
The stereoscopic effect adjusting unit 112 includes an instruction acquisition unit 122 and a parallax specifying unit 124. The instruction acquisition unit 122 acquires the appropriate parallax range when the user specifies the range of parallax for a stereoscopically displayed image. The parallax specifying unit 124 specifies from that range the appropriate parallax for when the user uses this display device. The appropriate parallax is expressed in a format that does not depend on the hardware of the display device. Realizing the appropriate parallax makes possible stereoscopic vision suited to the user's physiology.
[0098]
The parallax control unit 114 includes: a camera provisional arrangement unit 130 that provisionally sets camera parameters; a camera arrangement determination unit 132 that corrects the provisionally set camera parameters according to the appropriate parallax; an origin moving unit 134 that, when the camera parameters are determined, moves the origin to the midpoint of the plural cameras; a projection processing unit 138 that performs the projection processing described above; and a two-dimensional image generation unit 142 that performs the conversion to the screen coordinate system after projection processing and generates the parallax images. In addition, a distortion processing unit 136, which performs spatial distortion conversion (hereinafter simply called distortion conversion) to weaken the parallax in the image periphery, is placed between the camera provisional arrangement unit 130 and the camera arrangement determination unit 132 as necessary. The distortion processing unit 136 reads a correction map, described later, from the correction map holding unit 140 and uses it.
[0099]
If the display device needs adjustment for stereoscopic display, a GUI (not shown) for that purpose may be added. This GUI may perform processing such as finely shifting the entire displayed parallax image vertically and horizontally to determine the optimal display position.
[0100]
The second stereoscopic image processing apparatus 100 of FIG. 12 receives a plurality of parallax images as input, simply called the input image here. The second stereoscopic image processing apparatus 100 reads the appropriate parallax acquired by the first stereoscopic image processing apparatus 100, adjusts the parallax of the input image into the appropriate-parallax range, and outputs it. In that sense the second stereoscopic image processing apparatus 100 has an "automatic adjustment" function for parallax. In addition, it provides a GUI function for the case where the user wants to change the stereoscopic effect while stereoscopic display is actually in progress, that is, a "manual adjustment" function that changes the parallax according to the user's instructions.
[0101]
Although the parallax of an already generated parallax image cannot normally be changed, the second stereoscopic image processing apparatus 100 can change the stereoscopic effect to a degree sufficient for practical use by shifting the synthesis positions of the viewpoint images forming the parallax image. The second stereoscopic image processing apparatus 100 thus provides a good stereoscopic-effect adjustment function even in situations where the input data cannot be traced back to three-dimensional data. The description below focuses on the differences from the first stereoscopic image processing apparatus 100.
[0102]
Here the stereoscopic effect adjusting unit 112 is used for manual adjustment. The instruction acquisition unit 122 accepts on-screen input of numerical values such as "+n" and "−n", and the parallax specifying unit 124 takes the value as the parallax change amount. Several relationships between the numbers and the indicated stereoscopic effect are possible. For example, "+n" may be an instruction to strengthen the stereoscopic effect and "−n" to weaken it, with larger n meaning a larger change; or "+n" may be an instruction to move the whole object in the near direction and "−n" in the far direction. Alternatively, the value of n may not be specified at all: only "+" and "−" buttons are displayed, and the parallax changes each time one is clicked.
[0103]
The second stereoscopic image processing apparatus 100 includes a parallax amount detection unit 150 and a parallax control unit 152. If the input consists of a plurality of parallax images, the parallax amount detection unit 150 examines the header areas of those images and, if a parallax amount is described there in the form of a number of pixels, in particular the near maximum parallax pixel count and the far maximum parallax pixel count, acquires it. If no parallax amount is described, the matching unit 158 specifies the parallax amount by detecting corresponding points between the parallax images using a known method such as block matching. The matching unit 158 may process only an important region such as the central portion of the image, or may detect only the near maximum parallax pixel count, which matters most. The detected parallax amount is sent to the parallax control unit 152 in the form of a number of pixels.
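The precedence described here, header first and corresponding-point matching as a fallback, can be sketched as follows in Python with NumPy; the header keys and the simple sum-of-absolute-differences block search are assumptions rather than the patent's actual interface, and which sign corresponds to the near direction depends on the camera geometry.

import numpy as np

def detect_max_parallax(left, right, block=16, search=32):
    # Estimate the largest positive and largest negative horizontal
    # parallax (in pixels) between two grayscale images of equal shape.
    h, w = left.shape
    found = []
    for y in range(0, h - block, block):
        for x in range(search, w - block - search, block):
            ref = left[y:y + block, x:x + block].astype(float)
            errs = [np.abs(ref - right[y:y + block, x + d:x + d + block]).sum()
                    for d in range(-search, search + 1)]
            found.append(int(np.argmin(errs)) - search)
    return max(found), min(found)

def parallax_amount(header, left, right):
    # Prefer the parallax recorded in the header; otherwise fall back
    # to corresponding-point detection, as the matching unit 158 does.
    if "near_max" in header and "far_max" in header:
        return header["near_max"], header["far_max"]
    return detect_max_parallax(left, right)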
[0104]
In general, when a stereoscopic image is displayed on the display screen of a mobile phone, individual differences in stereoscopic effect are small, and it can be assumed that the user sometimes finds inputting the appropriate parallax troublesome. Even with a stereoscopic display device used by an unspecified number of users, input of appropriate parallax may be felt to be inconvenient. In such cases, the range of the appropriate parallax may be determined by the manufacturer of the stereoscopic image display device or by the creator of the content to be displayed on it, or it may be determined according to general guidelines, for example by reflecting a guideline or standard established by an industry group or academic society concerned with stereoscopic images. If there is a guideline such as “the maximum parallax should be about 20 mm on a 15-inch display screen”, possible processing includes following the guideline as-is or applying a correction based on it. In this case, the stereoscopic effect adjusting unit 112 becomes unnecessary.
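As a worked example of applying such a guideline, assuming a 4:3 panel and a horizontal resolution of 1024 pixels (neither of which the text specifies), the 20 mm figure translates into a pixel count as follows:

screen_width_mm = 304.8        # a 15-inch 4:3 screen is about 12 inches wide
horizontal_pixels = 1024       # assumed panel resolution
mm_per_pixel = screen_width_mm / horizontal_pixels
max_parallax_px = 20.0 / mm_per_pixel
print(round(max_parallax_px))  # roughly 67 pixels for the 20 mm guideline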
[0105]
The position shift unit 160 of the parallax control unit 152 shifts the synthesis positions of the viewpoint images forming the parallax image in the horizontal direction so that the parallax amount between the viewpoint images becomes the appropriate parallax. The shift may be applied to either of the viewpoint images. The position shift unit 160 also has another operation mode: when the user instructs an increase or decrease of parallax via the stereoscopic effect adjusting unit 112, it simply changes the image shift position according to the instruction. That is, the position shift unit 160 has two functions, automatic adjustment to the appropriate parallax and manual adjustment by the user.
[0106]
The parallax writing unit 164 writes the parallax amount, as a number of pixels, into the header region of one of the plural viewpoint images constituting the parallax image, for use by the above-described parallax amount detection unit 150 or for other purposes. The image edge adjustment unit 168 fills in the pixels missing at the image edges as a result of the shift by the position shift unit 160.
[0107]
The third stereoscopic image processing apparatus 100 in FIG. 13 receives an image with depth information as input. The third stereoscopic image processing apparatus 100 adjusts the depth so as to realize the appropriate parallax, and also has the “parallax correction” function described above. The distortion processing unit 174 of the parallax control unit 170 performs distortion conversion according to the correction map stored in the correction map holding unit 176, in a manner described later. The depth information and the image after distortion conversion are input to the two-dimensional image generation unit 178, where a parallax image is generated. Unlike the two-dimensional image generation unit 142 of the first stereoscopic image processing apparatus 100, the appropriate parallax is taken into account here. Since an image with depth information is itself a two-dimensional image, the two-dimensional image generation unit 178 has, although not shown, a function similar to the position shift unit 160 of the second stereoscopic image processing apparatus 100, and shifts the pixels in the image horizontally to produce the stereoscopic effect. At that time, the appropriate parallax is realized by the processing described below.
[0108]
The processing operation and principle of each unit of each stereoscopic image processing apparatus 100 in the above configuration are as follows.
FIGS. 14(a) and 14(b) show the left-eye image 200 and the right-eye image 202 displayed by the stereoscopic effect adjusting unit 112 of the first stereoscopic image processing apparatus 100 in the process of specifying the appropriate parallax. Five black circles are displayed in each image; the near parallax is set larger toward the top of the image, and the far parallax is set larger toward the bottom.
[0109]
FIG. 15 schematically shows the senses of distance perceived by the user 10 when these five black circles are displayed. The user 10 responds that the range spanned by these five senses of distance is “appropriate”, and the instruction acquisition unit 122 acquires this response. In this scheme, five black circles with different parallaxes are displayed simultaneously or in sequence, and the user 10 inputs whether each parallax is acceptable. In FIG. 16, on the other hand, the display itself consists of a single black circle whose parallax is changed continuously, and the user 10 responds when the parallax reaches the limit he or she allows in each of the far and near directions. The response may use a known technique such as an ordinary key operation, mouse operation, or voice input.
[0110]
The determination of the parallax may also be performed by a simpler method, and likewise the setting range of the basic expression space may be determined by a simple method. FIG. 89 shows a table used for simple determination of the parallax and the basic expression space. The setting range of the basic expression space is divided into four ranks, A to D, ranging from a setting that enlarges the near space side to a setting that uses only the far space side, and the parallax amount is likewise divided into five ranks, 1 to 5. Here, for example, if the user prefers the strongest stereoscopic effect and likes a display that protrudes the most, the rank is set to 5A. It is then unnecessary to determine the rank while checking the stereoscopic display; only buttons for determining the rank need be displayed. A button for confirming the stereoscopic effect may be provided beside them, and pressing it may display an image for confirming the stereoscopic effect.
[0111]
In either case, FIG. 15 or FIG. 16, the instruction acquisition unit 122 can acquire the appropriate parallax as a range, with the limit parallax determined on both the near side and the far side. The near maximum parallax is the parallax corresponding to the closest approach allowed for a point seen at the position nearest the user, and the far maximum parallax is the parallax corresponding to the greatest distance allowed for a point seen at the position farthest from the user. In general, however, it is the near maximum parallax that usually requires care, owing to physiological constraints of the user, and hereinafter “limit parallax” may refer to the near maximum parallax alone.
[0112]
FIG. 17 illustrates the principle of actually adjusting the parallax between two viewpoints when the image to be stereoscopically displayed is produced from three-dimensional data. First, the limit parallaxes determined by the user are converted into angles of the provisionally placed cameras. As shown in the figure, the near and far limit parallaxes can be expressed as pixel counts M and N, and since the angle of view θ of the camera corresponds to the number L of horizontal pixels of the display screen, the near maximum parallax angle φ and the far maximum parallax angle ψ are expressed in terms of θ, M, N, and L:
tan(φ/2) = M·tan(θ/2)/L
tan(ψ/2) = N·tan(θ/2)/L
Next, this information is applied to the extraction of a two-viewpoint image in the three-dimensional space. As shown in FIG. 18, a basic expression space T (whose depth is also denoted T) is determined first. Here it is assumed that the basic expression space T is determined from restrictions on the placement of objects. Let S be the distance from the front projection plane 30, the front face of the basic expression space T, to the camera placement plane, i.e. the viewpoint plane 208; T and S can be specified by the user. There are two viewpoints, D is the distance between the optical axis intersection plane 210 and the viewpoint plane 208, and A is the distance between the optical axis intersection plane 210 and the front projection plane 30.
[0113]
Next, if the near and far maximum parallaxes within the basic expression space T are P and Q respectively, then
E : S = P : A
E : (S + T) = Q : (T − A)
hold, where E is the distance between the viewpoints. The point G, a pixel without parallax, lies where the optical axes K2 of the two cameras intersect on the optical axis intersection plane 210, which coincides with the screen surface. The rays K1 that produce the near maximum parallax P intersect on the front projection plane 30, and the rays K3 that produce the far maximum parallax Q intersect on the rear projection plane 32.
[0114]
As shown in FIG. 19, P and Q are expressed using φ and ψ as
P = 2(S + A)·tan(φ/2)
Q = 2(S + A)·tan(ψ/2)
and as a result
E = 2(S + A)·tan(θ/2)·(SM + SN + TN)/(LT)
A = STM/(SM + SN + TN)
are obtained. Since S and T are known, A and E are determined automatically in this way; consequently the optical axis intersection distance D and the inter-camera distance E are determined, and the camera parameters are fixed. If the camera placement determination unit 132 places the cameras according to these parameters, the projection processing unit 138 and the two-dimensional image generation unit 142 process the images from the respective cameras independently, and parallax images having the appropriate parallax can be generated and output. As described, E and A contain no hardware information, so a hardware-independent description is realized.
[0115]
Thereafter, if the cameras are placed so as to preserve A, or D and E, when another image is stereoscopically displayed, the appropriate parallax is realized automatically. Since the whole process from specifying the appropriate parallax to ideal stereoscopic display can be automated, providing this function as a software library frees programmers who create content and applications from being conscious of programming for stereoscopic display. When L, M, and N are expressed in pixel counts, L indicates the display range, so L can also be used to indicate whether display is to be performed on the whole screen or on part of it; L, too, is a hardware-independent parameter.
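The closed forms above can be collected into a single routine. The following is a minimal sketch in Python; the parameter names are chosen for illustration and make no claim to be the library's actual interface.

import math

def camera_parameters(theta, M, N, L, S, T):
    # theta: camera angle of view [rad]; M, N: near/far limit parallax [px];
    # L: horizontal pixel count of the display range;
    # S: distance from the viewpoint plane to the front projection plane;
    # T: depth of the basic expression space.
    t = math.tan(theta / 2.0)
    phi = 2.0 * math.atan(M * t / L)   # near maximum parallax angle
    psi = 2.0 * math.atan(N * t / L)   # far maximum parallax angle
    A = S * T * M / (S * M + S * N + T * N)
    E = 2.0 * (S + A) * t * (S * M + S * N + T * N) / (L * T)
    D = S + A                          # optical axis intersection distance
    return phi, psi, A, E, D

Placing the two cameras the returned E apart, with their optical axes crossing at the returned distance D, then reproduces the user's limit parallaxes M and N for any scene that keeps the same S and T.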
[0116]
FIG. 20 shows a four-eye camera arrangement using four cameras 22, 24, 26, and 28. Strictly, the above A and E should be determined so that the appropriate parallax is obtained between each pair of adjacent cameras, such as the first camera 22 and the second camera 24. In practice, however, substantially the same effect is obtained even if the A and E determined between the second camera 24 and the third camera 26, which are near the center, are reused for the other camera pairs.
[0117]
Although T was treated above as a restriction on the placement of objects, it may instead be determined by the program as the size of the basic three-dimensional space. In that case, objects may be placed only within the basic expression space T throughout the program, or an object may occasionally be given a parallax that makes it jump out of the space for dramatic effect.
[0118]
As another example, T may be determined from the coordinates of the nearest and farthest objects in the three-dimensional space; if this is done in real time, objects can always be placed within the basic expression space T. As an exception to always keeping objects inside the basic expression space T, a short-lived exception can be created by operating under the relaxed condition that "the average position over a certain time should stay within the basic expression space T". Alternatively, the objects defining the basic expression space T may be limited to static objects, in which case an exceptional behavior in which a dynamic object protrudes from the basic expression space T can be allowed. As still another example, a conversion that reduces a space in which objects are already placed to the width T of the basic expression space may be applied, possibly in combination with the operations above. A method of intentionally displaying an object so that it protrudes from the basic expression space will be described later.
[0119]
Note that if the stereoscopic effect adjusting unit 112 of the first stereoscopic image processing apparatus 100 uses, as the image shown to the user, an image in which double images appear easily, the limit parallax is determined conservatively small, and the frequency with which double images appear when other images are displayed can be reduced. Images in which the colors and brightness of object and background contrast strongly are known to produce double images easily, and such an image may be used at the stage of specifying the limit parallax, that is, in the initial setting.
[0120]
FIGS. 21 to 36 show the processing by the distortion processing unit 136 of the first stereoscopic image processing apparatus 100 and its principle.
FIG. 21 conceptually illustrates an example of the correction map stored in the correction map holding unit 140 of the first stereoscopic image processing apparatus 100. This map corrects the parallax directly: the map as a whole corresponds to the parallax image, and the parallax becomes smaller toward the periphery. FIG. 22 shows the change in parallax produced when the camera placement determination unit 132 manipulates the camera parameters in accordance with instructions issued by the distortion processing unit 136 following this correction map. When the front direction is viewed from the left and right viewpoint positions of the two cameras, "normal parallax" is given; when a direction deviating greatly from the front is viewed, "small parallax" is given. In practice, the camera placement determination unit 132 narrows the camera interval toward the periphery.
[0121]
FIG. 23 shows another example in which the camera placement determination unit 132 changes the parallax by altering the camera arrangement according to the instruction of the distortion processing unit 136. Here, only the left camera of the two is moved, and the parallax changes as "normal parallax" → "medium parallax" → "small parallax" toward the image periphery. This method has a lower computational cost than that of FIG. 22.
[0122]
FIG. 24 shows another example of a correction map, which also changes the parallax. Near the center of the image the normal parallax is left untouched, and in the remaining parallax correction area the parallax is gradually reduced. FIG. 25 shows how the camera placement determination unit 132 changes the camera positions according to this map: only when the viewing direction deviates greatly from the front does the left camera move toward the right camera, giving "small parallax".
[0123]
FIG. 26 conceptually shows another example of a correction map. This map corrects the sense of distance from the viewpoint to the object; to achieve it, the camera placement determination unit 132 adjusts the optical axis intersection distance of the two cameras. If the optical axis intersection distance is shortened toward the image periphery, objects there appear relatively deeper in the far direction, so the correction works chiefly in the sense of reducing near parallax. To shorten the optical axis intersection distance, the camera placement determination unit 132 changes the direction of the camera optical axes; it may change the direction of only one of the cameras. FIG. 27 shows the change of the optical axis intersection position, that is, of the optical axis intersection plane 210, when a two-dimensional image is generated with the map of FIG. 26. The optical axis intersection plane 210 comes closer to the cameras toward the image periphery.
[0124]
FIG. 28 shows another correction map for the sense of distance, and FIG. 29 shows how the camera placement determination unit 132 changes the optical axis intersection plane 210 according to the instructions the distortion processing unit 136 derives from this map. In this example, objects in the central region of the image are placed at their normal positions without correction, and object positions are corrected only in the peripheral region. Accordingly, the optical axis intersection plane 210 in FIG. 29 does not change near the center of the image, and beyond a certain point it approaches the cameras. In FIG. 29, only the left camera is rotated.
[0125]
FIGS. 30(a) to 30(f) show another distortion conversion by the distortion processing unit 136. Unlike the previous examples, instead of changing the camera positions, the three-dimensional space itself is distorted directly in the camera coordinate system. In FIGS. 30(a) to 30(f), the rectangular area is a top view of the original space and the hatched area is a top view of the converted space. For example, the point U in the original space of FIG. 30(a) is moved by the conversion to a corresponding point in the hatched converted space; this means the point has been moved in the far direction. In FIG. 30(a), the space is compressed in the depth direction, in the direction of the arrows, toward the peripheral portion, so that points at or beyond a certain distance, such as the point W, come to have a nearly constant sense of distance. As a result, the sense of distance becomes uniform in the image periphery and no object is placed specially close to the viewer there, so the double-image problem is alleviated and the expression conforms more easily to the user's physiology.
[0126]
FIGS. 30(b), 30(c), 30(d), and 30(e) each show a variation of the conversion that brings the sense of distance toward a constant value in the image periphery, and FIG. 30(f) shows an example in which all points are converted in the far direction.
[0127]
FIG. 31 shows the principle for realizing the conversion of FIG. 30. The rectangular parallelepiped space 228 contains the spaces in which the projection processing of the first camera 22 and the second camera 24 is performed. The view volume of the first camera 22 is determined by its angle of view together with the front projection plane 230 and the rear projection plane 232, and that of the second camera 24 by its angle of view together with the front projection plane 234 and the rear projection plane 236. The distortion processing unit 136 applies the distortion conversion to the rectangular parallelepiped space 228, taking its center as the origin. In a multi-eye system the principle of the conversion is the same; only the number of cameras increases.
[0128]
FIG. 32 shows an example of the distortion conversion, which employs a reduction conversion in the Z direction only; in practice the processing is applied to the individual objects in the space. FIG. 33 expresses this conversion in the manner of a parallax correction map: the normal parallax holds on the Y axis, the parallax decreases as the absolute value of X increases, and the parallax vanishes at X = ±A. Since the reduction is performed only in the Z direction, the conversion formula is
X′ = X
Y′ = Y
Z′ = Sz·Z
The conversion is explained with reference to FIG. 34. First, consider the range X ≥ 0, Z ≥ 0. When the point (X0, Y0, Z0) is moved to the point (X0, Y0, Z1) by the reduction processing, the reduction ratio Sz is
Sz = Z1/Z0.
The coordinates of C are (X0, Y0, 0) and the coordinates of D are (X0, Y0, B).
E is the intersection of a straight line and a plane; writing its coordinates as (X0, Y0, Z2), Z2 is obtained as follows.
[0129]
Z = B − (B/A)·X (plane)
X = X0, Y = Y0 (straight line)
Z2 = B − X0·B/A
Therefore, since the reduction maps D to E,
Sz = Z2/B = 1 − X0/A,
and in general, for X,
Sz = 1 − X/A.
When the same calculation is performed for the other ranges of X and Z, the following results are obtained, verifying the conversion.
[0130]
When X ≥ 0: Sz = 1 − X/A
When X < 0: Sz = 1 + X/A
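A minimal sketch of the conversion just derived: each point's Z coordinate is reduced by a ratio that falls off with |X|, and the clamp at |X| ≥ A, where the parallax has already vanished, is an added assumption.

def distort_z(x, y, z, A):
    # Sz = 1 - |x|/A: normal depth on the Y axis (x = 0), no depth
    # (hence no parallax) at |x| = A and beyond.
    sz = max(0.0, 1.0 - abs(x) / A)
    return x, y, z * sz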
FIG. 35 shows another example of distortion conversion. Taking into account, more strictly, that the scene is observed radially from the cameras, reduction processing in the X-axis and Y-axis directions is combined as well. Here the midpoint of the two cameras is taken as representative of the camera position. The conversion formula is
X′ = Sx·X
Y′ = Sy·Y
Z′ = Sz·Z
FIG. 36 verifies this conversion. Again, consider the range X ≥ 0, Z ≥ 0. When the point (X0, Y0, Z0) is moved to the point (X1, Y1, Z1) by the reduction processing, the reduction ratios are
Sx = X1/X0, Sy = Y1/Y0, Sz = Z1/Z0.
Since E is the intersection of the plane and the straight line, Sx, Sy, and Sz can be obtained in the same way as before.
[0131]
When the converted space is expressed as a set of planes as above, the processing changes at the boundaries where the planes meet, which can produce a feeling of strangeness. In such cases the planes may be joined by a curved surface, or the space may be bounded by curved surfaces only; the calculation simply becomes that of the intersection E between a curved surface and a straight line.
[0132]
In the above examples the reduction ratio is the same everywhere on the same straight line CD, but it may be weighted; for example, a weighting function G(L) of the distance L from the camera may be applied to Sx, Sy, and Sz.
[0133]
FIGS. 37 to 40 show the processing by the distortion processing unit 174 of the third stereoscopic image processing apparatus 100 and its principle.
FIG. 37 shows the depth map of an image with depth information input to the third stereoscopic image processing apparatus 100. The depth is assumed to range from K1 to K2, with near depth expressed as positive and far depth as negative.
[0134]
FIG. 38 shows the relationship between the original depth range 240 and the converted depth range 242: the depth approaches a constant value toward the image periphery. The distortion processing unit 174 converts the depth map accordingly; the same applies when parallax is provided in the vertical direction. Since this conversion is again a reduction only in the Z direction, it can be written as
X′ = X
Y′ = Y
Z′ = Sz·Z
where Sz depends on the value of X:
When X ≥ 0: Sz = 1 − 2X/L
When X < 0: Sz = 1 + 2X/L
With this conversion, a new depth map having the elements shown in FIG. 39 is generated.
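A minimal sketch of this depth-map conversion, taking X as the horizontal pixel coordinate measured from the image center and L as the image width in pixels:

import numpy as np

def flatten_depth_toward_edges(depth):
    # Apply Sz = 1 - 2|X|/L column by column: the depth is unchanged at
    # the center column and approaches 0 at the left and right edges.
    h, w = depth.shape
    x = np.arange(w) - (w - 1) / 2.0   # X in about [-L/2, +L/2]
    sz = 1.0 - 2.0 * np.abs(x) / w
    return depth * sz[None, :]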
[0135]
FIG. 40 shows another principle of distortion conversion for a depth map. Strictly speaking, the space is observed radially from the user 10, so reduction processing in the X-axis and Y-axis directions is combined as well, with the point between the eyes taken as the observation position. The specific processing uses the same equations as the conversion of FIG. 35. While the original depth map holds only Z values, X and Y values are also maintained while this calculation is performed. The Z value is eventually converted into a pixel shift amount in the X or Y direction, and the X and Y values may be kept as offset values for that shift.
[0136]
In either case, the depth map and image converted by the distortion processing unit 174 are input to the two-dimensional image generation unit 178, where a synthesis process shifts the image horizontally so as to attain the appropriate parallax. The details are described later.
[0137]
FIGS. 41 to 51 show the processing of the position shift unit 160 of the second stereoscopic image processing apparatus 100 and, as an extension of it, the processing of the two-dimensional image generation unit 178 of the third stereoscopic image processing apparatus 100.
FIG. 41 shows the principle by which the position shift unit 160 shifts the combining position of two parallax images. As shown in the figure, the positions of the right-eye image R and the left-eye image L coincide in the initial state. When the left-eye image L is shifted relatively to the right, as in the upper part of the figure, the parallax at near points increases and the parallax at far points decreases; conversely, when the left-eye image L is shifted relatively to the left, as in the lower part, the parallax at near points decreases and the parallax at far points increases.
[0138]
The above is the essence of parallax adjustment by shifting parallax images. Either one of the images may be shifted, or both may be shifted in opposite directions. It also follows from this principle that the technique applies to any stereoscopic display method that uses parallax, with or without glasses, and the same processing works for multi-eye systems and for vertical parallax.
[0139]
FIG. 42 shows the shift processing at the pixel level. Both the left-eye image 200 and the right-eye image 202 contain a first rectangle 250 and a second rectangle 252. The first rectangle 250 has near parallax, which is "6 pixels" when parallax in the near direction is expressed as a positive number; the second rectangle 252 has far parallax, which is "−6 pixels" when parallax in the far direction is expressed as a negative number. These parallax amounts are denoted F2 and F1, respectively.
[0140]
Suppose, on the other hand, that the appropriate parallax of the display device held by the user is found to be J1 to J2. The position shift unit 160 then shifts the combining start position of the two images by (J2 − F2) pixels. FIG. 43 shows the state after the shift: with F1 = −6, F2 = 6 and J1 = −5, J2 = 4, the combining start positions are shifted by −2 pixels relative to each other, that is, in the direction that moves the whole image in the far direction. The final parallax amounts become E1 = −8 and E2 = 4, as shown in FIG. 43, so that at least the near-direction parallax falls within the limit. In general, double images in the near direction are considered more unpleasant than those in the far direction, and subjects are usually photographed placed toward the near side, so it is desirable to keep at least the near-direction parallax within the limit. The processing may, for example, follow the four cases below (a code sketch follows the list).
1. When the near point is outside the limit parallax and the far point is within it, the image is shifted until the near point reaches the limit parallax point; however, the processing stops if the parallax of the far point reaches the interocular distance first.
2. When the near point and the far point are both outside the limit parallax, the image is shifted until the near point reaches the limit parallax point; however, the processing stops if the parallax of the far point reaches the interocular distance first.
3. When both the near point and the far point are within the limit parallax, no processing is performed.
4. When the near point is within the limit parallax and the far point is outside it, the image is shifted until the far point reaches the limit parallax point; however, the processing stops if the near point reaches its limit parallax point first.
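Under the sign convention used above (near parallax positive, far parallax negative, limits J1 ≤ 0 ≤ J2), the four cases reduce to the following sketch; passing the interocular distance in pixels as the hard stop for the far point is one interpretation of "reaches the interocular distance".

def synthesis_shift(F1, F2, J1, J2, eye_px):
    # F1/F2: current far/near maximum parallax; J1/J2: far/near limit
    # parallax. Returns the pixel shift applied to the combining
    # position (negative = shift the whole scene in the far direction).
    if F2 <= J2 and F1 >= J1:
        return 0                      # case 3: both within the limits
    if F2 > J2:                       # cases 1 and 2: near point is outside
        shift = J2 - F2               # bring the near point to its limit
        if F1 + shift < -eye_px:      # stop if the far point would reach
            shift = -eye_px - F1      # the interocular distance
        return shift
    shift = J1 - F1                   # case 4: only the far point is outside
    if F2 + shift > J2:               # stop if the near point reaches
        shift = J2 - F2               # its limit first
    return shift

With the example above, synthesis_shift(-6, 6, -5, 4, 65) returns -2, matching FIG. 43.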
[0141]
FIG. 44 shows the loss at the image ends caused by shifting the combining position. Here the shift between the left-eye image 200 and the right-eye image 202 is one pixel, producing missing portions 260 one pixel wide at the right end of the left-eye image 200 and the left end of the right-eye image 202. In this case the image edge adjustment unit 168 duplicates the pixel column at the image end, as shown in FIG. 44, to restore the horizontal pixel count.
[0142]
As other methods, the missing portion 260 may be displayed in a specific color such as black or white, or may be hidden. Cutting and padding may also be performed so that the image matches the size of the initial image. Alternatively, the initial image may be made larger than the actual display size in advance so that the missing portion 260 does not affect the display.
[0143]
FIG. 45 shows the flow of manual parallax adjustment by the second stereoscopic image processing apparatus 100. First, left and right images are created manually as parallax images (S10) and distributed via a network or another route (S12). The second stereoscopic image processing apparatus 100 receives them (S14) and, in the example of this figure, first combines and displays the images in the normal state without any shift (S16); that is, a case is considered in which the appropriate parallax has not yet been acquired or the position shift unit 160 is not operating. Subsequently, the user issues an instruction about the stereoscopically displayed parallax images via the stereoscopic effect adjusting unit 112; the position shift unit 160 receives it in "manual adjustment mode", adjusts the image combining position, and the result is displayed (S18). S10 and S12 form the procedure 270 of the image creator, and S14 onward form the procedure 272 of the second stereoscopic image processing apparatus 100. Although not shown, if the shift amount is recorded in the header and referred to the next time the images are combined, readjustment can be omitted.
[0144]
FIG. 46 shows the flow of automatic adjustment by the second stereoscopic image processing apparatus 100. The generation of left and right images (S30) and image distribution (S32) in the image creator's procedure 270 are the same as in FIG. 45, as is image reception (S34) in the procedure 272 of the second stereoscopic image processing apparatus 100. Next, the matching unit 158 of the parallax amount detection unit 150 detects the parallax between the parallax images, in particular the maximum parallax (S36), and the appropriate parallax, especially the limit parallax, is acquired from the parallax information holding unit 120 (S38). Thereafter, the position shift unit 160 shifts the combining position of the images so as to satisfy the limit parallax by the processing described above (S40), and the processing of the parallax writing unit 164, the image edge adjustment unit 168, and the format conversion unit 116 is carried out to perform stereoscopic display (S42).
[0145]
FIG. 47 shows the flow of yet another automatic adjustment by the second stereoscopic image processing apparatus 100. In the image creator's procedure 270, after the left and right images are generated (S50), the maximum parallax is detected at this stage (S52) and recorded in the header of one of the viewpoint images of the parallax image (S54). The detection may be performed by corresponding-point matching, but when a creator generates the parallax images manually, the maximum parallax is naturally known during the editing process and may simply be recorded. The images are then distributed (S56).
[0146]
In the procedure 272 of the second stereoscopic image processing apparatus 100, image reception (S58) is the same as in FIG. 46. Next, the header inspection unit 156 of the parallax amount detection unit 150 reads the above maximum parallax from the header (S60), while the limit parallax is acquired from the parallax information holding unit 120 (S62); the subsequent steps S64 and S66 are the same as steps S40 and S42 of FIG. 46, respectively. With this method there is no need to compute the maximum parallax, and an appropriate stereoscopic effect can be realized over the whole image. Moreover, since the shift amount can be recorded in the header, the original images themselves are never damaged. Although not shown, if the detected maximum parallax is also recorded in the header in the flow of FIG. 46, subsequent processing can follow the procedure of FIG. 47.
[0147]
The same processing is possible with a multi-eye system: the parallax amounts between adjacent viewpoint images may be processed in the same way. In practice, however, the largest of the parallaxes between the plural viewpoint images may be regarded as the "maximum parallax" of the whole set, and the shift amount of the combining position determined from it.
[0148]
It suffices for the header information to be present in at least one of the multi-viewpoint images, and if the multi-viewpoint images are combined into a single image, the header of that image may be used.
[0149]
In some cases, images that have already been combined are distributed. In such cases, the images may be separated by an inverse conversion process, their combining position shift amount calculated, and the images recombined.
[0150]
FIGS. 48 to 51 show the process of shifting the combining position for an image with depth information, performed by the two-dimensional image generation unit 178 of the third stereoscopic image processing apparatus 100. FIGS. 48 and 49 show, respectively, the plane image 204 and the depth map that constitute the image with depth information. Here near depth is expressed as positive and far depth as negative. The objects are a first rectangle 250 with depth “4”, a second rectangle 252 with depth “2”, and a third rectangle 254 with depth “−4”: the first rectangle 250 is nearest, the second rectangle 252 is intermediate, and the third rectangle 254 is farthest.
[0151]
The two-dimensional image generation unit 178 first shifts each pixel of the original plane image 204 by the value in the depth map to generate the image of the other viewpoint. If the reference is the left-eye image, the original plane image 204 becomes the left-eye image as it is; the first rectangle 250 is shifted 4 pixels to the left, the second rectangle 252 is shifted 2 pixels to the left, and the third rectangle 254 is shifted 4 pixels to the right, creating the right-eye image 202 shown in FIG. 50. The image edge adjustment unit 168 fills the portions 260 where pixel information is missing owing to the movement of the objects with neighboring pixels that have parallax “0” and are judged to be background.
[0152]
Subsequently, the two-dimensional image generation unit 178 calculates depths that satisfy the appropriate parallax. If the depth range is K1 to K2 and the depth value of each pixel is Gxy, the depth map has the form shown in FIG. 49, each pixel holding its value Gxy. Suppose also that the appropriate parallax of the display device held by the user is found to be J1 to J2. In this case, the depth value Gxy of each pixel of the depth map is converted as follows to obtain a new depth value Fxy.
[0153]
Fxy = J1 + (Gxy − K1) × (J2 − J1)/(K2 − K1)
In the above example, with K1 = −4, K2 = 4 and J1 = −3, J2 = 2, the depth map of FIG. 49 is converted into the depth map of FIG. 51: “4” is converted to “2”, “2” to “1”, and “−4” to “−3”, and values between K1 and K2 are mapped into the range between J1 and J2. For example, the second rectangle 252 has Gxy = 2 and Fxy = 0.75; when Fxy does not become an integer, processing such as rounding it off, or reducing the parallax, may be applied.
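The linear remapping and the subsequent pixel shift can be sketched together as follows; rounding to the nearest integer and filling holes with a constant background value are simplifications of the rounding and edge-adjustment processing described in the text.

import numpy as np

def remap_depth(G, K1, K2, J1, J2):
    # Fxy = J1 + (Gxy - K1)(J2 - J1)/(K2 - K1), rounded to integers.
    F = J1 + (G - K1) * (J2 - J1) / float(K2 - K1)
    return np.rint(F).astype(int)

def right_eye_from_depth(plane, F, background=0):
    # Shift each pixel horizontally by its remapped depth (near pixels
    # move left, as in FIG. 50); holes are filled with the background.
    h, w = F.shape
    out = np.full((h, w), background, dtype=plane.dtype)
    for y in range(h):
        for x in range(w):
            nx = x - F[y, x]
            if 0 <= nx < w:
                out[y, nx] = plane[y, x]
    return out

For the example values, remap_depth applied to the depths 4, 2, and -4 with K1 = -4, K2 = 4, J1 = -3, J2 = 2 yields 2, 1, and -3, matching FIG. 51.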
[0154]
The above conversion formula is an example of a linear conversion; a weighting function F(Gxy) may additionally be applied to Gxy, and various nonlinear conversions are also conceivable. The left and right images may also both be newly generated by shifting the objects in opposite directions from the original plane image 204. In a multi-eye system, the same processing may be repeated to generate the required number of viewpoint images.
[0155]
The above is the configuration and operation of the stereoscopic image processing device 100 according to the embodiment.
Although the stereoscopic image processing apparatus 100 has been described as an apparatus, it may be a combination of hardware and software, or software alone. In that case it is convenient if arbitrary parts of the stereoscopic image processing apparatus 100 are turned into a library that can be called from various programs: the programmer can skip the portions of programming that require knowledge of stereoscopic display, and for the user the operations related to stereoscopic display, that is, the GUI, become common across software and content, while information once set can be shared by other software, saving the trouble of setting it again.
[0156]
It is also useful simply to share information among a plurality of programs, apart from processing related to stereoscopic display: various programs can determine the state of an image by referring to that information. An example of such shared information is the information acquired by the information acquisition unit 118 of the stereoscopic image processing apparatus 100 described above; it may be stored in a recording unit (not shown), in the correction map holding unit 140, or the like.
[0157]
FIGS. 52 to 54 show examples in which the above stereoscopic image processing apparatus 100 is used as a library. FIG. 52 shows the use of a stereoscopic display library 300. The stereoscopic display library 300 is referred to by function calls from a plurality of programs: program A 302, program B 304, program C 306, and so on. A parameter file 318 stores the user's appropriate parallax and the like in addition to the information mentioned above. The stereoscopic display library 300 is also used by a plurality of devices, device A 312, device B 314, device C 316, and so on, via an API (application program interface) 310.
[0158]
Examples of program A 302 and the like include games, three-dimensional applications such as so-called Web3D, three-dimensional desktop screens, three-dimensional maps, and viewers for parallax images as two-dimensional images or for images with depth information. Some games use coordinates in their own way, but the stereoscopic display library 300 can handle that as well.
[0159]
Examples of device A 312 and the like include any stereoscopic display device that uses parallax, such as two-eye or multi-eye parallax barrier systems, shutter glasses systems, and polarized glasses systems.
[0160]
FIG. 53 shows an example in which the stereoscopic display library 300 is incorporated into three-dimensional data software 402. The three-dimensional data software 402 comprises a program body 404, the stereoscopic display library 300 for realizing appropriate parallax for it, and a shooting instruction processing unit 406. The program body 404 communicates with the user via a user interface 410. Following the user's instruction, the shooting instruction processing unit 406 virtually photographs a predetermined scene during operation of the program body 404; the photographed image is recorded in an image recording device 412 and also output to a stereoscopic display device 408.
[0161]
Suppose, for example, that the three-dimensional data software 402 is game software. The user can then play the game while experiencing an appropriate stereoscopic effect produced by the stereoscopic display library 300. If, during play, the user wants to keep a record, for example at the moment a complete victory is achieved in a competitive battle game, he or she instructs the shooting instruction processing unit 406 via the user interface 410, and the scene is recorded. At that time a parallax image is generated using the stereoscopic display library 300 so that the parallax becomes appropriate when the scene is later reproduced on the stereoscopic display device 408, and it is recorded in an electronic album or the like of the image recording device 412. Since the recording takes the form of a two-dimensional parallax image, the three-dimensional data of the program body 404 itself does not leak out, which also serves copyright protection.
[0162]
FIG. 54 shows an example in which the three-dimensional data software 402 of FIG. 53 is incorporated into a network-based system 430.
A game machine 432 is connected to a server 436 and a user terminal 434 via a network (not shown). The game machine 432 is for so-called arcade games and comprises a communication unit 442, the three-dimensional data software 402, and a stereoscopic display device 440 that displays the game locally; the three-dimensional data software 402 is that of FIG. 53. The parallax images displayed on the stereoscopic display device 440 by the three-dimensional data software 402 are set optimally for the stereoscopic display device 440 in advance. The parallax adjustment by the three-dimensional data software 402 is used when an image is transmitted to the user via the communication unit 442, as described later. The display device used here need only have a function of generating stereoscopic images by adjusting parallax, and need not itself be capable of stereoscopic display.
[0163]
The user terminal 434 comprises a communication unit 454, a viewer program 452 for viewing stereoscopic images, and a stereoscopic display device 450 of arbitrary size and type that displays the stereoscopic images locally. The stereoscopic image processing apparatus 100 is implemented in the viewer program 452.
[0164]
The server 436 comprises a communication unit 460, an image holding unit 462 that records images virtually photographed by the user in connection with the game, and a user information holding unit 464 that records the user's appropriate parallax information, mail address, and other personal information in association with the user. The server 436 functions, for example, as the official site of the game, recording scenes the user liked during play as well as moving or still images of famous matches; stereoscopic display can use either moving images or still images.
[0165]
In the above configuration, image photographing proceeds, for example, as follows. The user first performs stereoscopic display on the stereoscopic display device 450 of the user terminal 434, acquires the appropriate parallax through the function of the stereoscopic image processing apparatus 100, and notifies the server 436 of it via the communication unit 454; the server stores it in the user information holding unit 464. This appropriate parallax is described in a general-purpose form independent of the hardware of the stereoscopic display device 450 owned by the user.
[0166]
The user plays the game on the game machine 432 at any time. Meanwhile, the stereoscopic display device 440 performs stereoscopic display with the initially set parallax, or with parallax manually adjusted by the user. If the user requests recording of an image during play or replay, the stereoscopic display library 300 built into the three-dimensional data software 402 of the game machine 432 acquires the user's appropriate parallax from the user information holding unit 464 via the two communication units 442 and 460, generates a parallax image in accordance with it, and stores the image in the image holding unit 462. When the user later returns home and downloads this parallax image to the user terminal 434, it is stereoscopically displayed with the desired stereoscopic effect; the parallax can also be adjusted manually at that point by the stereoscopic image processing apparatus 100 built into the viewer program 452.
[0167]
As described above, according to this application example, the programming related to the stereoscopic effect, which would otherwise have to be done for each display device and each user, is concentrated in the stereoscopic image processing apparatus 100 and the stereoscopic display library 300, so programmers need not concern themselves with the complicated requirements of stereoscopic display. This applies not only to game software but to any software that uses stereoscopic display, removing restrictions on the development of content and applications that use it, and can therefore promote their spread dramatically.
[0168]
In particular, games and other applications that inherently possess three-dimensional CG data have often not been used for stereoscopic display because coding correct stereoscopic display has been difficult. With the stereoscopic image processing apparatus 100 and the stereoscopic display library 300 of the embodiment, this obstacle is removed, and applications with stereoscopic display can be enriched.
[0169]
In FIG. 54, the user's appropriate parallax is registered in the server 436, but the user may instead bring an IC card or the like on which this information is recorded and use it with the game machine 432; the card may also record the user's scores in the game or favorite images.
[0170]
The present invention has been described on the basis of the embodiments. The embodiments are exemplifications, and those skilled in the art will understand that various modifications of the combinations of the components and processes are possible and that such modifications also fall within the scope of the present invention. Some examples follow.
[0171]
The first stereoscopic image processing apparatus 100 can process with high accuracy because three-dimensional data is input. However, the three-dimensional data may instead be reduced once to an image with depth information, and a parallax image generated with the third stereoscopic image processing apparatus 100; depending on the case, this may have the lower computational cost. Similarly, when a plurality of viewpoint images is input, a depth map may be created by high-precision corresponding-point matching and a parallax image then generated with the third stereoscopic image processing apparatus 100.
[0172]
In the first stereoscopic image processing apparatus 100, the camera provisional placement unit 130 was included as part of the apparatus, but it may instead be a pre-process outside the apparatus, since placement of cameras up to the provisional stage can be done without regard to appropriate parallax. Likewise, any processing unit constituting the first, second, or third stereoscopic image processing apparatus 100 can be moved outside it; those skilled in the art will well understand that the degree of freedom in configuring the stereoscopic image processing apparatus 100 is accordingly high.
[0173]
In the embodiment, the case where the control of the parallax is performed in the horizontal direction has been described, but the same processing can be performed in the vertical direction.
[0174]
A unit that enlarges character data may be provided for operation of the stereoscopic display library 300 or the stereoscopic image processing apparatus 100. For example, with a parallax image of two horizontal viewpoints, the horizontal resolution of the image seen by the user is halved, which can reduce the legibility of characters; stretching characters to twice their width is then effective. If there is parallax in the vertical direction as well, stretching characters vertically is likewise useful.
[0175]
During operation of the stereoscopic display library 300 or the stereoscopic image processing apparatus 100, an "operating display unit" may be provided that places characters or a mark such as "3D" on the displayed image, so that the user can tell whether the image is one whose parallax can be adjusted.
[0176]
A unit for switching between stereoscopic display and normal display may also be provided. If this unit includes a GUI, the user can conveniently switch from stereoscopic display to ordinary two-dimensional display, and vice versa, by clicking a predetermined button.
[0177]
The information acquisition unit 118 need not acquire information only through user input; it may also acquire information automatically, for example by a function such as plug and play.
[0178]
In the embodiment, a method of deriving E and A was adopted, but a method in which E and A are specified and the other parameters are derived may be used instead; which variables are specified is free.
[0179]
Another expression method is also proposed for stereoscopic display. In planar image display, an expression such as "an object passes through a certain interface" has limits in terms of realism, especially in the depth direction, and it is difficult for the observer to recognize the picture surface as an interface that actually separates spaces. If, as described below, objects are displayed stereoscopically on a stereoscopic image display device so that an interface represented in the image coincides with a physical entity such as the screen or its frame, the observer can recognize that interface, and such display creates a new expression method. Since the display screen and its surrounding frame are generally perceived visually, a display method that treats them as a window can be conceived; what is required is a designation that places an interface between spaces, or a plate-like object, on that surface. In this case, the optical axis crossing position D is specified in the positional relationship shown in FIG. 18.
[0180]
In the positional relationship of the photographing system shown in FIG. 18, let the near and far limit parallaxes within the basic expression space T be P and Q, respectively. Then
E : S = P : A
E : (S + T) = Q : (T − A)
are obtained. Solving these relations for E from the near and far limit parallaxes respectively gives
E = PS/A
E = Q(S + T)/(T − A)
and by selecting the smaller of these two values of E, a stereoscopic image whose parallax stays within the appropriate range is obtained.
[0181]
FIG. 55 shows a state in which an image composed from three-dimensional data is displayed on a display screen 400. The image shows a water tank 410 with one glass surface 401 coinciding with the display screen 400 and a fish 301 swimming inside. If processing is arranged so that the far side of the display screen 400 is the far space and the near side is the near space, the fish 301 is normally represented as swimming in the far space, as in FIG. 56, and occasionally, as in FIG. 57, an expression such as "the fish 301 breaks through the display screen 400 and appears in the near space" can be produced. When the fish 301 passes through the display screen 400, an expression such as "splashes fly from around the display screen 400 as the fish 301 passes, and the interface then restores itself" can be used. Another example of expression is "since there is no water in the near space in front of the display screen, the fish 301 grows short of breath after swimming in the near space for a while and passes back through the interface, that is, through the display screen 400, into the far space".
[0182]
It is not always necessary for the interface to be shown restoring itself when an object passes through it: the interface may remain broken, may deform in response to the collision of an object without letting it pass, or may transmit only the shock, applying for example an electric-discharge effect to the image. Clearly many kinds of interaction between the interface and an object can be expressed. The interface may be a bare surface, or a plate-like object such as glass or a thin object such as paper may be placed there; the interface also need not coincide exactly with the display screen, but may merely lie near it. It is likewise clear that these expression effects cannot be conveyed adequately to the observer with a two-dimensional image. In particular, if the original data from which the stereoscopic image starts is three-dimensional data, editing to express such effects is easy.
[0183]
An expression in which an interface of the displayed object coincides with the display screen can be generated by the method shown in FIG. 58: a virtual water tank 410 is placed in the three-dimensional space, and two images with parallax are generated from two virtual cameras 430 and 440 placed to its left, with the optical axis intersection positions of the two virtual cameras 430 and 440 made to coincide with one surface of the tank. Such an image can also be photographed as shown in FIG. 59: two cameras 430 and 440 are placed above a real water tank 410 to photograph it, with their optical axis intersection positions made to coincide with the water surface.
[0184]
FIG. 60 shows the configuration of a fourth stereoscopic image processing apparatus 100 that realizes the above processing. This stereoscopic image processing apparatus 100 adds an object designation unit 180 to the stereoscopic effect adjusting unit 112 of the first stereoscopic image processing apparatus 100 shown in FIG. 11. The object designation unit 180 positions, or matches, the interface of an object designated by the user to the vicinity of the display screen. Here the user is assumed to be a creator of stereoscopic images who performs this processing when creating or editing a stereoscopic image; the user may, however, also be an observer.
[0185]
The processing procedure of the stereoscopic image processing apparatus 100 shown in FIG. 60 is as follows. The object designation unit 180 receives, from the user through a predetermined input device such as a mouse, the designation of the object that should correspond to the optical axis intersection plane of the two virtual cameras 430 and 440, and notifies the parallax control unit 114. The parallax control unit 114, more specifically the camera placement determination unit 132, adjusts the camera parameters so that a surface of the designated object becomes the optical axis intersection plane of the two virtual cameras 430 and 440. Operations other than this may be the same as those of the stereoscopic image processing apparatus 100 shown in FIG. 11. Information indicating that it is to be displayed near the display screen is attached to the object determined in this way; at display time this information is read as appropriate, the optical axis intersection distance D is determined, and the inter-camera distance E is determined by the processing described above.
[0186]
Another expression method is also proposed. When a plurality of objects is displayed, it is not always necessary to keep all of them within the appropriate parallax; for effective display, some objects may at times be displayed outside the appropriate-parallax condition under certain constraints, for example for a limited time. As described above, the basic expression space is determined from stationary objects; more specifically, each object may be given information for determining whether it is an object to be expressed within the basic expression space (hereinafter simply "identification information"). An object to be expressed within the basic expression space is also called a "calculation object of the basic expression space". The basic expression space may be determined at any time on the basis of this identification information.
[0187]
If the identification information can be changed as needed, the conditions for excluding an object from the proper parallax can be set flexibly. For example, if the time during which an object is to be excluded from the proper parallax condition is described in the identification information, the object can automatically be returned to the proper parallax range after the specified time has passed.
[0188]
A method for temporarily removing some objects from the proper parallax condition and displaying them will be described below. For example, in the first stereoscopic image processing apparatus 100 shown in FIG. 11, the camera arrangement determination unit 132 corrects the provisionally set camera parameters in accordance with the proper parallax; this function may be further extended as follows. That is, the camera arrangement determination unit 132 reads the identification information associated with each object and determines the camera parameters in a manner that reflects the identification information.
[0189]
We propose yet another representation method. When the front and rear of the basic representation space, that is, the front projection plane serving as the near limit and the rear projection plane serving as the far limit, are determined by objects, an expression in which an object moves in front of or behind the space corresponding to those objects becomes impossible. FIG. 61 shows, for convenience, the basic expression space T, that is, the depth direction of the image displayed by the fourth stereoscopic image processing apparatus 100. A front projection plane 310 is set on the left side of the figure and a rear projection plane 312 on the right side; the basic expression space T lies between them. Within the range of the basic expression space T, a house 350 is placed as a stationary object on the front projection plane 310 side and a tree 370 on the rear projection plane 312 side. A bird 330, a dynamic object, moves forward through the space above these two stationary objects. As long as the bird 330 moves within the range of the basic expression space T, its movement can be expressed. However, when the bird 330 reaches the front projection plane 310 or the rear projection plane 312, like the bird 330 drawn at those planes in the figure, it becomes fixed at the maximum parallax and can no longer be expressed as moving further forward or backward in the real space. If the object could be expressed as if it were still moving, the sense of realism for the object could be maintained.
[0190]
As described above, a process of excluding a dynamic object from the calculation targets of the basic expression space T is conceivable. However, except when aiming for such an effect, the user may feel uncomfortable; in many cases it is preferable to express objects within the range of proper parallax.
[0191]
Therefore, as shown in FIG. 62, a region where no object exists is included in the basic expression space T. In FIG. 62, a space in which nothing exists is provided, as part of the basic expression space T, in front of the house 350, the stationary object at the front, so that the bird 330, a dynamic object, can move in front of the house 350. In FIG. 63, a space in which nothing exists is provided, as part of the basic expression space T, behind the tree 370, the stationary object placed at the rear. Thereby, for example, even if the bird 330 moves from the rear past the position corresponding to the front of the house 350, as long as it moves within the range of the basic expression space T it is expressed with proper parallax, and the observer does not feel uncomfortable with the movement.
[0192]
Further, as shown in FIG. 64, a moving object 390 may be formed as the target for parallax calculation, for example in a form that includes not only the bird 330 but also the space in front of and behind it. When the front of the moving object 390 reaches the front projection plane 310, only the bird 330 continues to move within it. In this case, for example, by making the moving speed of the bird 330 slower than its original speed, it is possible to delay the moment at which the bird 330 reaches the front projection plane 310 and its further movement can no longer be expressed.
[0193]
Further, as shown in FIG. 65, after the moving object 390 has passed beyond the front projection plane 310, the bird 330 may be moved within the space included in advance. In this way the maximum parallax is determined by the moving object 390, and the bird 330 merely approaches that maximum gradually, so it can be expressed as continuing to move forward in the real space. This can be realized by determining whether to enable or disable the movement based on the position of the object, that is, of the bird 330. The moving speed may be the originally assumed speed, a higher speed, or a lower speed; giving the speed this flexibility allows various expressions. For example, by slowing the movement as the bird approaches the end of the moving object 390, forward movement can be expressed while preventing the amount of parallax from becoming excessively large in the front-back direction.
[0194]
Also, if another object appears in front of or behind the moving object 390, the maximum parallax comes to depend on that object, so the bird 330 may be gradually returned to its original position within the moving object 390.
[0195]
Next, the principle of preventing a sudden change in parallax while changing the maximum parallax will be described with reference to FIGS. 17 and 18 described above. As mentioned above,
tan(φ/2) = M·tan(θ/2)/L
E : S = P : A
P = 2(S + A)tan(φ/2)
hold, and from these equations the near-side parallax amount of a certain object under a certain camera setting is expressed as
M = LEA / (2S(A + S)tan(θ/2))
Here, if the object moves forward while the camera settings are left unchanged, A increases and S decreases, so the amount of parallax increases.
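As a check on this relationship, the following Python sketch (function and parameter names are ours, not from the text) simply evaluates M = LEA / (2S(A + S)tan(θ/2)); E, S, and A share one distance unit, while L and the result are in pixels.

```python
import math

def near_parallax_px(E, S, A, theta_deg, L_px):
    # M = L*E*A / (2*S*(A+S)*tan(theta/2)); E, S, A in one common
    # distance unit, theta in degrees, L_px and the result in pixels.
    t = math.tan(math.radians(theta_deg) / 2.0)
    return L_px * E * A / (2.0 * S * (A + S) * t)
```

With E held fixed, increasing A and decreasing S (the object approaching) makes the returned parallax grow, which is the sudden change the following paragraphs set out to limit.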
[0196]
Here, assuming that M becomes M′, S becomes S′, and A becomes A′ when the object moves forward, the relations
M′ = LEA′ / (2S′(A′ + S′)tan(θ/2))
M < M′
hold.
[0197]
If E and A′ in the camera settings are changed to E″ and A″ so that
M″ = LE″A″ / (2S′(A″ + S′)tan(θ/2))
and at this time the relationship
M < M″ < M′
is satisfied, a sudden change in the amount of parallax can be prevented when an object moving toward the observer is displayed stereoscopically. Note that only one of E and A′ may be changed. In that case M″ is expressed as
M″ = LE″A′ / (2S′(A′ + S′)tan(θ/2))
or
M″ = LEA″ / (2S′(A″ + S′)tan(θ/2))
[0198]
To prevent a sudden change in the amount of parallax for movement of the object toward the back, it suffices to satisfy the relationship
M > M″ > M′
[0199]
The same applies to the parallax amount N on the far side:
N = LE(T - A) / (2(T + S)(A + S)tan(θ/2))
and similarly
N′ = LE(T - A′) / (2(T + S′)(A′ + S′)tan(θ/2))
N″ = LE″(T - A″) / (2(T + S′)(A″ + S′)tan(θ/2))
are obtained. Here, if the relationship
N > N″ > N′
is satisfied, a sudden change in the amount of parallax can be prevented for movement of the object toward the observer, and if the relationship
N < N″ < N′
is satisfied, a sudden change in the amount of parallax can be prevented for movement of the object toward the back.
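One way to meet the condition M < M″ < M′ (or M > M″ > M′) is to limit the per-update change of the parallax amount and back-solve the camera interval from the expressions above; the sketch below is a minimal Python illustration of that idea, with hypothetical names and a pixel step limit of our choosing.

```python
import math

def smoothed_interval(E, M_prev, S_new, A_new, theta_deg, L_px, max_step):
    """Return a camera interval E'' whose near-side parallax moves from
    M_prev toward its new value by at most max_step pixels per update."""
    t = math.tan(math.radians(theta_deg) / 2.0)
    # parallax M' that would result if E were left unchanged
    M_new = L_px * E * A_new / (2.0 * S_new * (A_new + S_new) * t)
    # clamp the change so that M < M'' < M' (or M > M'' > M')
    M_next = M_prev + max(-max_step, min(max_step, M_new - M_prev))
    # back-solve E'' from M'' = L*E''*A' / (2*S'*(A'+S')*tan(theta/2))
    return M_next * 2.0 * S_new * (A_new + S_new) * t / (L_px * A_new)
```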
[0200]
The configuration of the three-dimensional image display device 100 that realizes the expression method shown in FIGS. 61 to 65 will be described. The three-dimensional image display device 100 can be realized by the three-dimensional image display device 100 shown in FIG. However, when correcting the camera parameters provisionally set according to the appropriate parallax, the camera arrangement determination unit 132 uses the original data to obtain information on the range to be calculated in the basic expression space and information on a change in the amount of parallax of the object. There is also a function to read and reflect it in camera parameters. This information may be included in the original data itself, or may be stored in the parallax information storage unit 120, for example.
[0201]
In the embodiment, for example, when the proper parallax processing determines that the parallax is too large relative to the correct-parallax state in which a sphere is seen correctly, processing is performed so that the parallax of the stereoscopic image is reduced. At this time the sphere appears as a shape crushed in the depth direction, but the sense of strangeness toward such a display is generally small. Since people are usually used to seeing planar images, they often feel no discomfort as long as the parallax lies between zero and the correct parallax.
[0202]
Conversely, if the proper parallax processing determines that the parallax of the stereoscopic image is too small relative to the parallax state in which the sphere looks correct, processing is performed to increase the parallax. At this time the sphere looks, for example, like a shape bulging in the depth direction, and a person may feel considerable discomfort with such a display.
[0203]
When a single object is displayed stereoscopically, the above phenomenon tends to cause discomfort, and the tendency is clearly recognized especially for objects seen in real life, such as buildings and vehicles. Therefore, to reduce the discomfort, correction needs to be added to the processing that increases the parallax.
[0204]
When a stereoscopic image is generated from three-dimensional data, the parallax can be adjusted relatively easily by changing the camera arrangement. The procedure for correcting the parallax will be described with reference to FIGS. 66 to 75. The correction can be performed by the first to fourth stereoscopic image processing apparatuses 100 described above; here it is assumed that the first stereoscopic image processing apparatus 100 shown in FIG. 11 generates a stereoscopic image from three-dimensional data. The correction processing can also be realized by the fourth and sixth stereoscopic image processing apparatuses 100.
[0205]
FIG. 66 shows a state in which an observer views a stereoscopic image on the display screen 400 of a certain stereoscopic image display apparatus 100. The screen size of the display screen 400 is L, the distance between the display screen 400 and the observer is d, and the interocular distance is e. The near limit parallax M and the far limit parallax N have been obtained in advance by the stereoscopic effect adjusting unit 112, and the proper parallax lies between them. For ease of understanding, only the near limit parallax M is shown here, and the maximum pop-out amount m is determined from it; the pop-out amount m is the distance from the display screen 400 to the nearest point. Note that the units of L, M, and N are pixels, and strictly a predetermined conversion formula would be needed to relate them to the other parameters such as d, m, and e; for simplicity, they are treated here as if expressed in the same unit system.
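If the pixel and physical unit systems must actually be reconciled, the "predetermined conversion formula" is essentially the pixel pitch; the following minimal sketch (assuming a uniform pitch, names ours) converts a pixel-valued parallax into the unit system of d, m, and e.

```python
def parallax_px_to_mm(parallax_px, screen_width_mm, screen_width_px):
    # Convert a parallax given in pixels (such as M or N) into the
    # physical units shared by d, m and e, via the pixel pitch.
    pitch_mm = screen_width_mm / screen_width_px
    return parallax_px * pitch_mm
```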
[0206]
At this time, suppose that the camera arrangement shown in FIG. 67 has been determined by the camera arrangement determination unit 132 of the parallax control unit 114 on the basis of the nearest and farthest points of the sphere 21, in order to display the sphere 21. The optical axis intersection distance of the two cameras 22 and 24 is D, and the camera interval is Ec. To make the parameters easy to compare, the coordinate system is scaled so that the field width of the cameras at the optical axis intersection distance matches the screen size L. Suppose, for example, that the camera interval Ec equals the interocular distance e and the optical axis intersection distance D is smaller than the observation distance d. Then, as shown in FIG. 68, the sphere 21 is seen correctly when observed from the camera positions of FIG. 67. However, when the sphere 21 is observed on the original stereoscopic image display apparatus 100 using images generated by this imaging system, a sphere 21 stretched in the depth direction over the entire proper parallax range is observed, as shown in FIG. 69.
[0207]
A method for determining whether a stereoscopic image needs correction using this principle will be described below. FIG. 70 shows the camera arrangement of FIG. 67 photographing the nearest point of the sphere, at distance A from the display screen. At this time, the maximum parallax M corresponding to the distance A is given by the two straight lines connecting each of the two cameras 22 and 24 with the point at distance A. FIG. 71 shows the camera interval E1 required to obtain the parallax M of FIG. 70 when the optical axis intersection distance of the two cameras 22 and 24 is d; this amounts to a conversion that makes all parameters of the imaging system other than the camera interval coincide with those of the observation system. The following relationships hold in FIGS. 70 and 71:
M : A = Ec : (D - A)
M : A = E1 : (d - A)
Ec = E1(D - A) / (d - A)
E1 = Ec(d - A) / (D - A)
When E1 is larger than the interocular distance e, it is determined that correction to reduce the parallax is required. Since it suffices to set E1 to the interocular distance e, Ec may be corrected as in the following equation:
Ec = e(D - A) / (d - A)
[0208]
The same applies to the farthest point. In FIGS. 72 and 73, letting T, the depth of the basic expression space, be the distance between the nearest and farthest points of the sphere 21,
N : (T - A) = Ec : (D + T - A)
N : (T - A) = E2 : (d + T - A)
Ec = E2(D + T - A) / (d + T - A)
E2 = Ec(d + T - A) / (D + T - A)
Again, when E2 is larger than the interocular distance e, it is determined that correction is necessary. Since it suffices to set E2 to the interocular distance e, Ec may be corrected as in the following equation:
Ec = e(D + T - A) / (d + T - A)
[0209]
Ultimately, if the smaller of the two values of Ec obtained from the nearest point and the farthest point is selected, the parallax does not become too large at either point. The camera is set by converting the selected Ec back to the original coordinate system of the three-dimensional space.
[0210]
More generally, it suffices to set the camera interval Ec so as to simultaneously satisfy the two expressions
Ec < e(D - A) / (d - A)
Ec < e(D + T - A) / (d + T - A)
This means that, as shown in FIGS. 74 and 75, the interval between two cameras placed on the two optical axes K4 connecting the two eyes, set at the observation distance d with the interocular distance e, to the nearest point of the object, or on the two optical axes K5 connecting them to the farthest point, gives the upper limit of the camera interval Ec. That is, the two cameras 22 and 24 may be arranged so as to fall within the narrower of the interval between the two optical axes K4 in FIG. 74 and the interval between the two optical axes K5 in FIG. 75.
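A minimal sketch of this rule, using the same parameter names as the text: the camera interval is bounded by the smaller of the two expressions, corresponding to the narrower pair of optical axes in FIGS. 74 and 75.

```python
def max_camera_interval(e, D, d, A, T):
    # Upper limit of Ec: it must satisfy both
    #   Ec < e*(D - A) / (d - A)          (nearest point, axes K4)
    #   Ec < e*(D + T - A) / (d + T - A)  (farthest point, axes K5)
    return min(e * (D - A) / (d - A),
               e * (D + T - A) / (d + T - A))
```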
[0211]
Although the correction here was performed only on the camera interval without changing the optical axis intersection distance, the apparent position of the object may instead be changed by changing the optical axis intersection distance, or both the camera interval and the optical axis intersection distance may be changed.
[0212]
Correction is also required when a depth map is used. If the depth map value represents the shift amount of a point in pixels, and the initial values, generally the values described in the original data, already realize optimal stereoscopic vision, then the above processing need not be performed when the proper parallax processing requires the range of depth map values to be increased; it should be performed only when the range of depth map values needs to be reduced, that is, when the parallax needs to be reduced.
[0213]
When the initial value of the parallax is set relatively small, the maximum allowable value may be held in a header area of the image or the like, and the proper parallax processing may be performed so as to stay within that maximum. In these cases hardware information is required for the appropriate distance and the like, but higher-quality processing can be realized compared with the processing described above, which does not depend on hardware information. The above processing can be used not only when the parallax is set automatically but also when it is set manually.
[0214]
In addition, the limit of parallax at which an observer feels discomfort varies with the image. In general, in images with little change in pattern or color and in images with conspicuous edges, crosstalk becomes noticeable when the parallax is increased; the same holds for images with a large luminance difference across an edge. In other words, when the image to be displayed stereoscopically, that is, the parallax image or each viewpoint image, contains few high-frequency components, the observer tends to feel discomfort when viewing it. Therefore, the image may be subjected to frequency analysis by a method such as the Fourier transform, and the proper parallax corrected according to the resulting distribution of frequency components; that is, for an image containing many high-frequency components, a correction is made so that the parallax becomes larger than the proper parallax.
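The text does not fix a particular measure of high-frequency content; as one conceivable NumPy-based sketch (cut-off radius and scaling factor are arbitrary choices of ours), the fraction of spectral energy at high radial frequencies of a grayscale image could drive the correction.

```python
import numpy as np

def high_frequency_ratio(gray):
    # Fraction of spectral energy outside a radius of a quarter of the
    # image size, as a rough proxy for the "amount of high-frequency
    # components" (the cut-off radius is an arbitrary choice here).
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray))) ** 2
    h, w = gray.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2.0, xx - w / 2.0)
    return spectrum[r > min(h, w) / 4.0].sum() / spectrum.sum()

# e.g. allow somewhat larger parallax for detail-rich images:
# scale = 1.0 + 0.2 * high_frequency_ratio(image)   # factor illustrative
```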
[0215]
Further, crosstalk is conspicuous in images with little motion. In general, whether a file is a moving image or a still image can often be determined by checking the extension of the file name. When the file is determined to be a moving image, the state of motion may be detected by a known motion detection method such as motion vectors, and the proper parallax corrected accordingly: for an image with little motion, a correction is made so that the parallax becomes smaller than the original parallax, while no correction is applied to an image with much motion. Alternatively, when the motion is to be emphasized, a correction may be made so that the parallax becomes larger than the original parallax. Note that this correction of the proper parallax is an example, and any correction can be performed within a predetermined parallax range. The depth map can also be corrected, as can the shift amount of the synthesis position of the parallax images.
[0216]
Alternatively, these analysis results may be recorded in the header area of the file, and the stereoscopic image processing apparatus may read the header and use it when the stereoscopic image is displayed on subsequent occasions.
[0217]
In addition, the amount of high-frequency components and the motion distribution may be ranked through actual stereoscopic viewing by the creator or user, or ranked through stereoscopic viewing by a plurality of evaluators with the average value used; the ranking may be performed in any manner.
[0218]
Further, the proper parallax need not be strictly observed, and the camera parameters need not be recalculated constantly; they may be recalculated at regular time intervals or at each scene change. This is particularly effective when the processing is performed by a device with low processing capacity. For example, to recalculate at regular intervals when generating a stereoscopic image from three-dimensional data, the parallax control unit 114 of the first stereoscopic image processing apparatus 100 may simply use an internal timer to instruct the camera arrangement determination unit 132 to recalculate the camera parameters at fixed intervals. The internal timer may use the reference frequency of the CPU that performs the arithmetic processing of the stereoscopic image processing apparatus 100, or a dedicated timer may be provided separately.
[0219]
FIG. 76 shows the configuration of a fifth stereoscopic image processing apparatus 100 that calculates the proper parallax according to the state of the image. It is the first stereoscopic image processing apparatus 100 of FIG. 11 newly provided with an image determination unit 190; since the other configurations and operations are the same, the differences will mainly be described. The image determination unit 190 includes a frequency component detection unit 192, which analyzes the frequency components of the image to determine the amount of high-frequency components and notifies the parallax control unit 114 of a parallax suited to the image, and a scene determination unit 194, which, when the original data is a moving image, notifies the parallax control unit 114 of the timing for recalculating the camera parameters by detecting scene changes or motion in the image. Scene changes may be detected using a known method.
[0220]
When the original data is a moving image, always performing the process of adjusting the proper parallax based on the amount of high-frequency components increases the processing load of the frequency component detection unit 192, and using an arithmetic processing device matched to that load raises concern that the cost of the stereoscopic image processing apparatus 100 will increase. As described above, the proper parallax need not be strictly maintained at all times, so by analyzing the frequency components of the image only when the image changes greatly, such as at a scene change detected by the scene determination unit 194, the processing load of the image determination unit 190 can be reduced.
[0221]
When a plurality of virtual cameras are arranged in three-dimensional space and a parallax image is generated for each of them, areas in which no object information exists may appear in the parallax images. Taking the case where a stereoscopic image is generated starting from three-dimensional data as an example, the principle by which such areas appear, and a method of resolving them, will be described below. FIG. 77 shows the relationship among the temporary camera position S (Xs, Ys, Zs), the angle of view θ, and the first to third objects 700, 702, and 704 set by the creator of the three-dimensional data.
[0222]
The temporary camera position S (Xs, Ys, Zs) becomes the center of the virtual cameras when the parallax images are generated from the plurality of virtual cameras (hereinafter also referred to as the camera group center position S). The first object 700 corresponds to the background. Here, the creator sets the angle of view θ and the camera group center position S so that the second and third objects 702 and 704 fall within the angle of view θ, and so that object information exists throughout the angle of view θ owing to the first object 700, which serves as the background.
[0223]
Next, according to a predetermined program, a desired parallax is obtained as shown in FIG. 78, and the optical axis intersection position A (Xa, Ya, Za), which serves as the reference for near and far, and the parameters of the two virtual cameras 722 and 724, specifically the camera positions and their respective optical axes, are determined. At this time, if the angle of view θ is kept equal to the previously determined value, first and second object-zero areas 740 and 742, in which no object information exists, may be generated depending on the size of the first object 700 serving as the background, as shown in the figure.
[0224]
The first object-zero area 740 is represented by an angle α and the second object-zero area 742 by an angle β; no object information exists within these angle ranges. Therefore, as shown in FIG. 79, the angle of view θ may be adjusted so that α and β disappear; that is, the larger of α and β is subtracted from the angle of view θ. At this time, in order to subtract equally from both the left and right of the angle of view θ so as not to change the optical axis direction, the new angle of view θ1 is determined from θ1 = θ - 2α or θ1 = θ - 2β. However, since α and β may not be immediately obtainable from the parallax images, the angle of view θ may instead be narrowed little by little, checking each time whether an area without object information remains in the parallax images. The presence or absence of such an area can be confirmed by checking whether there is data to be drawn into every pixel on the display screen. The adjustment is not limited to the angle of view θ alone; the camera interval E and the optical axis intersection position A may also be changed so that object information exists at all pixels.
[0225]
FIG. 80 is a flowchart showing the angle-of-view adjustment processing, which can be realized by the first stereoscopic image processing apparatus 100 shown in FIG. 11. First, when original data serving as the starting point of a stereoscopic image is input to the apparatus, the temporary camera placement unit 130 determines the camera group center position S (S110). Subsequently, the camera arrangement determination unit 132 determines the camera angle of view θ based on the camera group center position S (S112), determines the camera interval E (S114), and determines the optical axis intersection position A of the virtual cameras (S116). Further, the camera arrangement determination unit 132 performs coordinate conversion processing on the original data based on the camera interval E and the optical axis intersection position A (S118), and determines whether object information exists at every pixel on the display screen (S120).
[0226]
When there is a pixel with no object information (N in S120), a correction that slightly narrows the angle of view θ is performed (S122), and the process returns to S114; the processing from S114 to S120 is repeated until object information exists at all pixels. However, when the adjustment is made by correcting only the angle of view θ, the determination of the camera interval E in S114 and of the optical axis intersection position A in S116 may be skipped. When object information exists at all pixels (Y in S120), the angle-of-view adjustment process ends.
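A skeleton of this S110-S122 loop might look as follows; the three callables are hypothetical stand-ins for the renderer and the determination steps, and the shrink factor is an arbitrary choice.

```python
def adjust_angle_of_view(theta, determine_E, determine_A, has_empty_pixels,
                         shrink=0.98, min_theta=1.0):
    """Narrow the angle of view until object information exists at every
    display pixel (sketch of the flow of FIG. 80; names hypothetical)."""
    while True:
        E = determine_E(theta)                    # S114
        A = determine_A(theta, E)                 # S116
        if not has_empty_pixels(theta, E, A):     # S118-S120
            return theta, E, A                    # Y in S120: done
        theta *= shrink                           # S122: narrow theta a little
        if theta < min_theta:
            raise RuntimeError("could not eliminate object-free pixels")
```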
[0227]
The above embodiment has mainly described stereoscopic images generated starting from three-dimensional data. Hereinafter, a method of expressing a stereoscopic image starting from photographed real images will be described. The difference from the case starting from three-dimensional data is that, with real images, the depth T of the basic expression space does not exist as such a concept; instead, T can be rephrased as the depth range within which display with proper parallax is possible.
[0228]
As shown in FIGS. 17 and 18, the parameters necessary for the camera settings that generate a stereoscopic image are the camera interval E, the distance A from the front projection plane to the optical axis intersection plane, the angle of view θ, the distance S of the front projection plane 30, the front surface of the basic expression space, from the camera arrangement plane, that is, the viewpoint plane 208, the distance D of the optical axis intersection plane 210 from the viewpoint plane 208, and the depth range T. The following relational expressions hold among them.
E = 2(S + A)tan(θ/2) · (SM + SN + TN) / (LT)
A = STM / (SM + SN + TN)
D = S + A
Therefore, if three of the six parameters E, A, θ, S, D, and T are specified, the remaining ones can be calculated. In general, any of the parameters may be specified, but in the above embodiment θ, S, and T are specified and E, A, and D are calculated. If θ and S are changed automatically, the magnification changes, so the expression intended by the programmer or photographer may not be achieved; automatically determining these is therefore often undesirable. T can be regarded as a parameter representing the limit of the expression range and is preferably determined in advance. With three-dimensional data, the effort of changing any of the parameters is almost the same, but with live-action photography this is not so: depending on which parameters the camera structure allows to be changed, the price varies greatly and the operability also changes, so the specified parameters are desirably chosen according to the application.
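For example, when θ, S, and T are the specified parameters, the remaining three follow directly from the relational expressions above; a minimal Python sketch (proper parallaxes M and N and screen size L in pixels, as before; names ours):

```python
import math

def solve_camera_params(theta_deg, S, T, M, N, L):
    # E = 2(S+A)tan(theta/2) * (SM + SN + TN) / (LT)
    # A = STM / (SM + SN + TN)
    # D = S + A
    t = math.tan(math.radians(theta_deg) / 2.0)
    k = S * M + S * N + T * N
    A = S * T * M / k
    E = 2.0 * (S + A) * t * k / (L * T)
    D = S + A
    return E, A, D
```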
[0229]
FIG. 81 shows the relationship between a subject 552 and a stereoscopic photographing device 510 that takes stereoscopic photographs at an entertainment facility or a photo studio. The stereoscopic photographing device 510 comprises a camera 550 and the stereoscopic image processing apparatus 100. Here the shooting environment is fixed: the positions of the camera 550 and of the subject 552 are determined in advance, so θ, S, and T are given as parameters. This photographing system corresponds to the example of FIG. 18 with an actual camera 550 substituted. The camera 550 is provided with two lenses 522 and 524, and this single camera can capture the two parallax images that serve as the basis of a stereoscopic image.
[0230]
FIG. 82 shows the configuration of a sixth stereoscopic image processing apparatus 100 that performs this processing. This apparatus is obtained by replacing the parallax detection unit 150 of the stereoscopic image processing apparatus 100 shown in FIG. 12 with a camera control unit 151. The camera control unit 151 has a lens interval adjustment unit 153 and an optical axis adjustment unit 155.
[0231]
The lens interval adjustment unit 153 adjusts the camera interval E, more precisely the lens interval E, by adjusting the positions of the two lenses 522 and 524. The optical axis adjustment unit 155 adjusts D by changing the optical axis direction of each of the two lenses 522 and 524. The subject 552 inputs the proper parallax information of the stereoscopic image display device kept at home or the like through a portable recording medium such as a memory card, or through communication means such as the Internet. The information acquisition unit 118 receives the input proper parallax and notifies the camera control unit 151. Upon receiving the notification, the camera control unit 151 calculates E, A, and D and adjusts the lenses 522 and 524 accordingly, so the camera 550 shoots with proper parallax. This is possible because the stereoscopic display device that will display the subject and the stereoscopic photographing device 510 share common processing through the library.
[0232]
If the subject is to appear on the screen surface at display time, D and A may be determined first, and the subject positioned at the distance D and photographed. In this case, E may be calculated separately for the near and far sides of the proper parallax, and the smaller E selected. T may be larger than the range occupied by the subject, and if there is a background, T may be determined so as to include it.
[0233]
Further, the proper parallax information does not necessarily have to have been determined on a stereoscopic image display device owned by the user who is the subject. For example, the desired stereoscopic effect may be selected on a representative stereoscopic image display device at the shooting site; this selection can be made through the stereoscopic effect adjusting unit 112. Alternatively, the user may simply choose among items such as "on the screen / toward the back / toward the front" or "stereoscopic effect: large / medium / small", and camera parameters determined in advance for those items may be held in the parallax information storage unit 120 and used. Further, the optical axis intersection position may be changed by a mechanical structure, but it may also be realized by changing the range used as the image on a high-resolution CCD (Charge Coupled Device); the function of the position shift unit 160 may be used for this processing.
[0234]
FIG. 83 shows a state in which a movable camera 550 is installed in a place that people cannot enter, the camera 550 is operated by remote control using a controller 519, and the captured image is observed on a stereoscopic image display device 511. The stereoscopic image processing apparatus 100 with the configuration shown in FIG. 82 is incorporated in the stereoscopic image display device 511.
[0235]
The camera 550 has a mechanism capable of automatically adjusting the lens interval E, and also an optical or electronic zoom function, which determines θ. However, the amount of parallax changes with this zoom operation: in general, the farther away the subject being photographed, the smaller the angle formed at display time by the optical axes of the viewpoints, so camera settings such as the lens interval E must be changed appropriately in accordance with the zoom amount. Here, in such a case, the camera settings are controlled automatically, greatly reducing complicated camera setting work. The controller 519 may also be used to adjust the camera settings.
[0236]
When the operator first operates the optical or electronic zoom using the controller 519, θ is determined. Next, the camera 550 is moved so that the subject to be photographed is displayed at the center of the stereoscopic display device 511. The camera 550 focuses on the subject by its autofocus function and at the same time acquires the distance; in the initial state this distance is taken as D. That is, the camera 550 is automatically set so that the subject appears to be located near the display screen. The range of T can be changed manually, and the operator specifies in advance the depth-direction distribution of the objects whose front-back relationship is to be grasped. Thus θ, D, and T are determined, E, A, and S are determined from the three relational expressions described above, and the camera 550 is adjusted appropriately and automatically. In this example, since S is determined afterward, it is uncertain what range T will finally cover, so it is preferable to set T to a somewhat large value.
[0237]
If the subject is to be displayed at the edge of the screen, the subject may first be displayed at the center, a predetermined button pressed to lock the focus and D, and then the direction of the camera 550 changed. If the focus and D can be changed manually, the depth position of the subject can be changed freely.
[0238]
FIG. 84 shows an example of shooting with the stereoscopic photographing device 510, which has the configuration shown in FIG. 82. The proper parallax of the stereoscopic image display device held by the photographer is input to the camera 550 in advance through a recording medium such as a portable memory, or through communication means such as the Internet. Here it is assumed that the camera 550 has a simple structure and is available at a relatively low price: the camera interval E, the optical axis intersection distance D, and the angle of view θ are fixed, and A, S, and T are determined from the three relational expressions above. Since the appropriate range of distances to the subject can be calculated from these values, the distance to the subject may be measured in real time, and whether that distance is appropriate for shooting may be indicated to the photographer by a message, a lamp color, or the like. The distance to the subject can be obtained by a known technique such as the distance measurement of an autofocus function.
[0239]
As described above, the combination of camera parameters used as variables or as constants is free, and various forms exist according to the application. Besides the above, the camera 550 may be attached to various devices such as a microscope, a medical endoscope, or a portable terminal.
[0240]
If the parallax is optimized for a specific stereoscopic display device, stereoscopic viewing may be difficult on another stereoscopic display device. In general, however, device performance improves over time, and it is rare for the parallax to be too large on a stereoscopic display device purchased later. Rather, the adjustment described above is important to avoid the danger that, because of improper settings of the photographing device, stereoscopic viewing becomes difficult regardless of the performance of the stereoscopic display device. Here, "stereoscopic display device" includes the stereoscopic image processing apparatus for realizing stereoscopic vision.
[0241]
The proper parallax obtained by the stereoscopic effect adjustment unit 112 of the first to sixth stereoscopic image processing apparatuses 100 is a parameter determined by the user while actually viewing stereoscopic images, and on that specific stereoscopic image processing apparatus 100 the proper parallax is maintained thereafter. Two factors enter into this stereoscopic-effect adjustment: the "image separation performance" specific to the stereoscopic display device, and the "physiological limit" specific to the observer. The image separation performance is an objective factor indicating how well the multiple viewpoint images are separated. On a display device with low separation performance, crosstalk is easily perceived even at small parallax, so when many observers perform the adjustment, the range of proper parallax becomes narrow on average; conversely, when the separation performance is high, crosstalk is hardly perceived even at large parallax, and the range of proper parallax tends to be wide on average. The physiological limit, on the other hand, is a subjective factor: even if the image separation performance is very high and the images are completely separated, the parallax range within which no discomfort is felt differs from observer to observer, and this appears as variation of the proper parallax on the same stereoscopic image processing apparatus 100.
[0242]
The image separation performance, also called the degree of separation, can be determined by measuring the illuminance of a reference image 572 while moving an illuminometer 570 in the horizontal direction at the optimum observation distance, as shown in FIG. 85. For a binocular system, for example, all white is displayed in the left-eye image and all black in the right-eye image; if the images were completely separated, the illuminance at the position where only the right-eye image is visible would be zero, so the image separation performance is obtained by measuring how much white of the left-eye image leaks there. The graph at the right end of the figure is an example of a measurement result. Since this measurement is nearly equivalent to measuring the density of moire, the image separation performance can also be measured by capturing a moire image at a distance at which moire is observed, as shown in FIG. 86, and analyzing its density.
[0243]
Even for a glasses-type stereoscopic display device or the like, the image separation performance can be measured in the same manner by measuring leaked light. In practice, the measured value when both the left and right images are all black may be taken into account as the background level in the calculation. The image separation performance may also be determined from the average of ranking evaluations by many observers.
[0244]
As described above, an objective criterion such as a numerical value can be given to the image separation performance of a stereoscopic display device. If, for example, the rank and proper parallax of the stereoscopic display device 450 are known, the proper parallax can be converted so as to match the rank of another stereoscopic display device 440. A stereoscopic display device also has parameters with device-specific values, such as the screen size, the pixel pitch, and the optimum observation distance, and information on these parameters is likewise used for the conversion of the proper parallax.
[0245]
Hereinafter, examples of converting the proper parallax will be described parameter by parameter with reference to FIGS. 87 and 88. It is assumed here that the proper parallax is held in the forms N/L and M/L, where M is the near limit parallax, N the far limit parallax, and L the screen size. Expressing them as ratios in this way makes differences in pixel pitch between stereoscopic display devices negligible, so in the drawings below the pixel pitch is assumed equal for ease of explanation.
[0246]
First, conversion for a difference in screen size will be described. As shown in FIG. 87, it is preferable to perform processing so that the absolute value of the parallax does not change regardless of the screen size; that is, the stereoscopic expression range in the front-back direction is kept the same. Suppose the screen size has been increased by a factor of a from the state shown at the top of the figure to the state shown at the bottom. At this time, by converting N/L to N/(aL) and M/L to M/(aL), proper parallax is realized even when the screen size differs. The figure shows an example for the nearest position.
[0247]
Next, conversion for a difference in observation distance will be described. As shown in FIG. 88, when the optimum observation distance d increases by a factor of b, the absolute value of the parallax is preferably also increased by a factor of b; that is, the parallax angle seen by the eyes is kept constant. Therefore, by converting N/L to bN/L and M/L to bM/L, proper parallax is realized even when the optimum observation distance differs. The figure shows this for the nearest position.
[0248]
Lastly, the factor of the image separation performance is added. Here the rank r of the image separation performance is assumed to be an integer equal to or greater than 0, with r = 0 meaning performance so poor that no parallax can be given. Then, if the image separation performance of the first stereoscopic display device is r0 and that of the second is r1, then with c = r1/r0, N/L is converted to cN/L and M/L to cM/L. As a result, proper parallax is realized even on stereoscopic display devices with different image separation performance. Note that the expression for deriving c shown here is only an example, and c may be derived from other expressions.
[0249]
When all of the above processes are performed, N/L is converted to bcN/(aL) and M/L to bcM/(aL). This conversion can be applied to both horizontal and vertical parallax. The conversion of the proper parallax described above can be realized by the configurations shown in FIGS. 52, 53, and 54.
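Putting the three conversions together, a minimal sketch (assuming c = r1/r0 as above; a is the screen-size ratio and b the observation-distance ratio; names ours):

```python
def convert_proper_parallax(n_over_l, m_over_l, a, b, r0, r1):
    # N/L -> b*c*N/(a*L) and M/L -> b*c*M/(a*L), with c = r1/r0.
    c = r1 / r0
    return (b * c * n_over_l / a, b * c * m_over_l / a)
```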
[0250]
The front and rear of the basic expression space may be determined using a Z buffer. The Z buffer is a hidden-surface-removal technique by which a depth map of the object group viewed from the camera is obtained; the minimum and maximum of the obtained Z values may be used as the frontmost and rearmost positions. As processing, a step of acquiring the Z values from the positions of the virtual cameras is added. Since final resolution is not required for this step, processing with a reduced number of pixels shortens the processing time. With this method hidden portions are ignored, so the proper parallax range can be used effectively, and a plurality of objects is also easy to handle.
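A minimal sketch of this Z-buffer approach, assuming a normalized depth buffer cleared to 1.0 and possibly rendered at reduced resolution:

```python
import numpy as np

def depth_extent(zbuffer, clear_value=1.0):
    # Frontmost and rearmost positions of the visible object group;
    # pixels still at the clear value hold no object and are skipped,
    # so hidden surfaces do not waste the proper parallax range.
    z = np.asarray(zbuffer)
    valid = z < clear_value
    return float(z[valid].min()), float(z[valid].max())
```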
[0251]
Further, when generating a stereoscopic image from three-dimensional data, the parallax control unit 114 may perform control so that the variation of the camera parameters set for generating the parallax images falls within a predetermined threshold. Likewise, when generating a stereoscopic moving image from a two-dimensional moving image to which depth information has been given, the parallax control unit 114 may perform control so that the change in the maximum or minimum value of the depth contained in the depth information, occurring as the two-dimensional moving image progresses, falls within a previously provided threshold. The thresholds used for these controls may be stored in the parallax information storage unit 120.
[0252]
When a stereoscopic image is generated from three-dimensional data and the basic expression space is determined from the objects existing in the field of view, the size of the basic expression space may change rapidly because of rapid movement of objects or frame-in/frame-out, and the parameters related to the camera arrangement may then fluctuate greatly. If this fluctuation exceeds a predetermined threshold, it may be limited to the threshold. Similarly, when a stereoscopic image is generated from a two-dimensional moving image to which depth information has been given, the same inconvenience can occur if the maximum or minimum amount of parallax is determined from the maximum or minimum depth; a threshold may likewise be provided for this fluctuation.
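Both cases reduce to clamping a per-frame change to a stored threshold; a minimal sketch (helper name ours):

```python
def clamp_change(previous, proposed, threshold):
    # Allow a camera parameter (or a depth min/max) to move by at most
    # +/- threshold per update, with the threshold read, for example,
    # from the parallax information storage unit 120.
    delta = proposed - previous
    delta = max(-threshold, min(threshold, delta))
    return previous + delta
```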
[0253]
[Effects of the Invention]
According to the present invention, the following effects can be obtained.
1. It is possible to generate or display a stereoscopic image that is easily adapted to human physiology.
2. Even if the display target image changes, a stereoscopic image appropriate for the user can be generated or displayed.
3. The stereoscopic effect of stereoscopic display can be adjusted by simple operations.
4. The burden on the programmer can be reduced when creating content or an application that enables appropriate three-dimensional display.
5. The labor of the user trying to optimize the stereoscopic display is reduced.
6. Even for a device that in principle cannot support plug and play, such as a retrofitted parallax barrier, stereoscopic effect adjustment and the handling of information outside the scope of the plug-and-play function, such as head tracking information, can be realized easily.
[Brief description of the drawings]
FIG. 1 is a diagram showing the positional relationship among a user, a screen, and a reproduction object 14 under which ideal stereoscopic viewing is achieved.
FIG. 2 is a diagram illustrating an example of a photographing system that realizes the state of FIG. 1;
FIG. 3 is a diagram showing another example of a photographing system for realizing the state of FIG. 1;
FIG. 4 is a diagram showing another example of a photographing system for realizing the state of FIG. 1;
FIG. 5 is a diagram illustrating a model coordinate system used in the first stereoscopic image processing apparatus.
FIG. 6 is a diagram illustrating a world coordinate system used in the first stereoscopic image processing apparatus.
FIG. 7 is a diagram illustrating a camera coordinate system used in the first stereoscopic image processing apparatus.
FIG. 8 is a diagram illustrating a view volume used in the first stereoscopic image processing apparatus.
FIG. 9 is a diagram illustrating the coordinate system after perspective transformation of the view volume in FIG. 8.
FIG. 10 is a diagram showing a screen coordinate system used in the first stereoscopic image processing apparatus.
FIG. 11 is a configuration diagram of a first stereoscopic image processing apparatus.
FIG. 12 is a configuration diagram of a second stereoscopic image processing device.
FIG. 13 is a configuration diagram of a third stereoscopic image processing device.
FIGS. 14A and 14B are diagrams respectively showing a left-eye image and a right-eye image displayed by a stereoscopic effect adjusting unit of the first stereoscopic image processing device.
FIG. 15 is a diagram illustrating a plurality of objects having different parallaxes displayed by the stereoscopic effect adjusting unit of the first stereoscopic image processing device.
FIG. 16 is a diagram illustrating an object whose parallax changes, which is displayed by the stereoscopic effect adjustment unit of the first stereoscopic image processing device.
FIG. 17 is a diagram illustrating a relationship among a camera angle of view, an image size, and parallax when an appropriate parallax is realized.
FIG. 18 is a diagram illustrating a positional relationship of a photographing system that realizes the state of FIG. 17.
FIG. 19 is a diagram showing a positional relationship of a photographing system for realizing the state of FIG. 17;
FIG. 20 is a diagram illustrating a camera arrangement when generating a multi-viewpoint image with proper parallax.
FIG. 21 is a diagram illustrating a parallax correction map used by a distortion processing unit of the first stereoscopic image processing device.
FIG. 22 is a diagram illustrating a camera viewpoint when generating a parallax image according to the parallax correction map of FIG. 21.
FIG. 23 is a diagram illustrating another camera viewpoint when generating a parallax image according to the parallax correction map of FIG. 21.
FIG. 24 is a diagram illustrating a parallax correction map used by a distortion processing unit of the first stereoscopic image processing device.
FIG. 25 is a diagram illustrating a camera viewpoint when generating a parallax image according to the parallax correction map of FIG. 24.
FIG. 26 is a diagram illustrating a sense of distance correction map used by a distortion processing unit of the first stereoscopic image processing apparatus.
FIG. 27 is a diagram illustrating a camera viewpoint when generating a parallax image according to the distance sense correction map of FIG. 26;
FIG. 28 is a diagram illustrating another sense of distance correction map used by the distortion processing unit of the first stereoscopic image processing apparatus.
FIG. 29 is a diagram illustrating a camera viewpoint when generating a parallax image according to the sense of distance correction map of FIG. 28;
FIGS. 30(a), 30(b), 30(c), 30(d), 30(e), and 30(f) are top views of parallax distributions obtained as a result of processing performed on a three-dimensional space by the distortion processing unit of the first stereoscopic image processing apparatus.
FIG. 31 is a diagram illustrating a principle of processing by a distortion processing unit of the first stereoscopic image processing apparatus.
FIG. 32 is a diagram specifically showing the processing of FIG. 31.
FIG. 33 is a diagram specifically showing the process of FIG. 31.
FIG. 34 is a diagram specifically showing the process of FIG. 31.
FIG. 35 is a diagram illustrating another example of the processing by the distortion processing unit of the first stereoscopic image processing device.
FIG. 36 is a diagram specifically showing the process of FIG. 35.
FIG. 37 is a diagram showing a depth map.
FIG. 38 is a diagram illustrating an example of processing by a distortion processing unit of the third stereoscopic image processing device.
FIG. 39 is a diagram illustrating a depth map generated by a process performed by a distortion processing unit of the third stereoscopic image processing apparatus.
FIG. 40 is a diagram illustrating another example of the processing by the distortion processing unit of the third stereoscopic image processing device.
FIG. 41 is a diagram illustrating an example of processing performed by a two-dimensional image generation unit of the second stereoscopic image processing device.
FIG. 42 is a diagram illustrating an example of a parallax image.
FIG. 43 is a diagram illustrating a parallax image whose combination position has been shifted by the two-dimensional image generation unit of the second stereoscopic image processing apparatus.
FIG. 44 is a diagram illustrating processing of an image edge adjustment unit of the second stereoscopic image processing device.
FIG. 45 is a diagram illustrating processing of the second stereoscopic image processing device.
FIG. 46 is a diagram illustrating another process of the second stereoscopic image processing device.
FIG. 47 is a diagram illustrating another process of the second stereoscopic image processing device.
FIG. 48 is a diagram showing a planar image to which a depth map has been added.
FIG. 49 is a diagram showing a depth map.
FIG. 50 is a diagram illustrating a manner in which a two-dimensional image generation unit of the second stereoscopic image processing apparatus generates a parallax image based on a depth map.
FIG. 51 is a diagram illustrating a depth map corrected by the two-dimensional image generation unit of the second stereoscopic image processing device.
FIG. 52 is a diagram illustrating a state in which the stereoscopic image processing apparatus according to the embodiment is used as a library.
FIG. 53 is a configuration diagram in which a three-dimensional display library is incorporated in three-dimensional data software.
FIG. 54 is a diagram illustrating a state in which a stereoscopic display library is used in a network-based system.
FIG. 55 is a diagram showing a state in which an image composed of three-dimensional data is displayed on a display screen.
FIG. 56 is a diagram showing another state in which an image constituted by three-dimensional data is displayed on a display screen.
FIG. 57 is a diagram showing another state in which an image formed by three-dimensional data is displayed on a display screen.
FIG. 58 is a diagram illustrating a method of matching an interface of an object to be displayed with a display screen.
FIG. 59 is a diagram showing another state in which an image is captured by making the optical axis intersection positions of two virtual cameras coincide with one surface of an aquarium.
FIG. 60 is a configuration diagram of a fourth stereoscopic image processing device.
FIG. 61 is a diagram illustrating, for convenience, the basic expression space T of an image displayed by the fourth stereoscopic image processing apparatus.
FIG. 62 is a diagram illustrating an area in which no object exists by including the area in the basic expression space T.
FIG. 63 is a diagram illustrating a region in which no object exists by including the region in a basic expression space T.
FIG. 64 is a diagram illustrating a state where a moving object is formed so as to include not only a bird but also the space before and after the target object for calculating the parallax.
FIG. 65 is a diagram illustrating a state in which a bird 330 moves in a previously included space after a moving object has passed a front projection plane.
FIG. 66 is a diagram showing a state where an observer is observing a stereoscopic image on a display screen.
FIG. 67 is a diagram illustrating a camera arrangement determined by a camera arrangement determining unit.
FIG. 68 is a diagram illustrating the manner in which an observer observes a parallax image obtained with the camera arrangement of FIG. 67.
FIG. 69 is a diagram illustrating the manner in which an observer at the position shown in FIG. 66 observes, on the display screen, an image for which proper parallax has been obtained with the camera arrangement of FIG. 67.
FIG. 70 is a diagram illustrating a state in which the nearest point of a sphere located at a distance A from the display screen is photographed with the camera arrangement shown in FIG. 67.
FIG. 71 is a diagram showing the relationship between the optical axis intersection distance of two cameras and the camera interval E1 required to obtain the parallax shown in FIG. 70.
FIG. 72 is a diagram illustrating a state in which the farthest point of a sphere located at a distance T - A from the display screen is photographed with the camera arrangement shown in FIG. 67.
FIG. 73 is a diagram showing the relationship between the optical axis intersection distance of two cameras and the camera interval E2 required to obtain the parallax shown in FIG. 72.
FIG. 74 is a diagram illustrating a relationship between camera parameters required to set a parallax of a stereoscopic image within an appropriate parallax range.
FIG. 75 is a diagram illustrating a relationship between camera parameters required for setting the parallax of a stereoscopic image within an appropriate parallax range (the viewing-geometry sketch following the reference signs list makes the screen-side half of this relationship concrete).
FIG. 76 is a configuration diagram of a fifth stereoscopic image processing device.
FIG. 77 is a diagram illustrating a relationship between a temporary camera position, an angle of view, and first to third objects set by a creator who creates three-dimensional data.
FIG. 78 is a diagram showing a state where two virtual cameras are arranged based on the temporary camera positions determined in FIG. 77.
FIG. 79 is a diagram showing a state where the camera arrangement is adjusted so that an area where no object information exists does not occur.
FIG. 80 is a diagram showing a view angle adjustment process.
FIG. 81 is a diagram illustrating the relationship between a stereoscopic photographing apparatus, which takes stereoscopic photographs in an entertainment facility, a photo studio, or the like, and its subject.
FIG. 82 is a diagram illustrating a configuration of a sixth stereoscopic image processing apparatus.
FIG. 83 is a diagram illustrating a state in which a camera is operated by remote control and the captured image is observed on a stereoscopic image display device.
FIG. 84 is a diagram illustrating an example of imaging by a stereoscopic imaging device including a sixth stereoscopic image processing device.
FIG. 85 is a diagram illustrating a state in which image resolvability is measured with an illuminometer.
FIG. 86 is a diagram showing a moire image used for measuring image resolvability.
FIG. 87 is a diagram illustrating an example of conversion of appropriate parallax.
FIG. 88 is a diagram illustrating another example of conversion of appropriate parallax.
FIG. 89 is a diagram illustrating a table used for simple determination of a parallax and a basic expression space.
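
FIGS. 48 to 51 above concern the generation of a parallax image from a planar image to which a depth map has been attached. A minimal sketch of depth-based pixel shifting of this general kind follows; the shift rule, the depth normalisation, and the lack of hole filling are simplifying assumptions of this sketch, not details taken from the patent's two-dimensional image generation unit.

import numpy as np

def shift_by_depth(image, depth, max_shift_px=8):
    """Synthesise a (left, right) view pair by shifting each pixel along x.

    image : (H, W) or (H, W, C) array
    depth : (H, W) array normalised to [0, 1]; larger = nearer = larger shift
    Colliding pixels simply overwrite one another and disoccluded pixels
    stay zero; a real implementation would composite in depth order and
    fill the resulting holes.
    """
    h, w = depth.shape
    disparity = np.rint(depth * max_shift_px).astype(int)
    left = np.zeros_like(image)
    right = np.zeros_like(image)
    xs = np.arange(w)
    for y in range(h):
        xl = np.clip(xs + disparity[y], 0, w - 1)  # nearer pixels move right in the left view
        xr = np.clip(xs - disparity[y], 0, w - 1)  # and left in the right view
        left[y, xl] = image[y, xs]
        right[y, xr] = image[y, xs]
    return left, right

The corrected depth map of FIG. 51 suggests that, rather than leaving such shifting artifacts in place, the depth values themselves are reworked before the parallax image is generated.
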
[Explanation of symbols]
Reference Signs List 10 user, 12 screen, 14 playback object, 20 real object, 22, 24, 26, 28 camera, 30 front projection plane, 32 rear projection plane, 100 stereoscopic image processing device, 112 stereoscopic effect adjustment unit, 114, 152, 170 parallax control unit, 116 format conversion unit, 118 information acquisition unit, 122 instruction acquisition unit, 124 parallax identification unit, 132 camera arrangement determination unit, 136, 174 distortion processing unit, 140, 176 correction map holding unit, 142 two-dimensional image generation unit, 150 parallax amount detection unit, 151 camera control unit, 156 header inspection unit, 158 matching unit, 160 position shift unit, 164 parallax writing unit, 168 image edge adjustment unit, 178 two-dimensional image generation unit, 180 object designation unit, 190 image determination unit, 192 frequency component detection unit, 194 scene determination unit, 210 optical axis crossing plane, 300 stereoscopic display library, 400 display screen, 402 three-dimensional data software, 406 shooting instruction processing unit, 430 network-based system, 432 game machine, 434 user terminal, 436 server, 452 viewer program, 510 stereo photography device.
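
FIGS. 66 to 75 relate a limit parallax measured on the display to the camera interval E1 and the optical axis intersection distance. A minimal sketch of the screen-side half of that geometry is given below; the function name and the formulas are reconstructed from standard stereoscopic viewing relations, not taken from the patent's own equations.

def depth_range_for_limit_parallax(eye_sep_mm, view_dist_mm,
                                   near_limit_mm, far_limit_mm):
    """Return (nearest, farthest) perceived distances from the viewer, in mm.

    By similar triangles, a point perceived at distance z produces the
    on-screen parallax
        p = eye_sep * (view_dist - z) / z   (crossed, z < view_dist)
        p = eye_sep * (z - view_dist) / z   (uncrossed, z > view_dist),
    and solving each relation at its limit parallax bounds z.
    """
    z_nearest = eye_sep_mm * view_dist_mm / (eye_sep_mm + near_limit_mm)
    if far_limit_mm >= eye_sep_mm:
        z_farthest = float("inf")  # uncrossed parallax can never reach eye_sep
    else:
        z_farthest = eye_sep_mm * view_dist_mm / (eye_sep_mm - far_limit_mm)
    return z_nearest, z_farthest

# Example: 65 mm eye separation, viewer 600 mm from the screen, 10 mm limit
# parallax on both sides -> points may appear between about 520 mm and 709 mm.
print(depth_range_for_limit_parallax(65.0, 600.0, 10.0, 10.0))

The camera-side half of the problem, choosing E1 and the optical axis intersection distance in FIGS. 70 to 73 so that the nearest point at distance A and the farthest point at distance T-A produce exactly these limit parallaxes, follows from the same similar-triangle construction.
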

Claims (12)

  1. A stereoscopic image processing device comprising a parallax control unit that, when a stereoscopic image is generated from three-dimensional data, controls the parallax so that the ratio of the width to the depth of an object represented in the stereoscopic image does not become larger than the ratio that human eyes can correctly perceive.
  2. A stereoscopic image processing device comprising a parallax control unit that, when a stereoscopic image is generated from a two-dimensional image given depth information, controls the parallax so that the ratio of the width to the depth of an object represented in the stereoscopic image does not become larger than the ratio that human eyes can correctly perceive.
  3. A stereoscopic image processing device characterized in that, when a stereoscopic image is generated from three-dimensional data and a parameter relating to the camera arrangement set for generating parallax images is changed, the change in the camera parameter is controlled to stay within a threshold provided in advance.
  4. A stereoscopic image processing device characterized in that, when a stereoscopic moving image is generated from a two-dimensional moving image given depth information, any change in the maximum or minimum depth value contained in the depth information that occurs as the two-dimensional moving image progresses is controlled to stay within a threshold provided in advance.
  5. A stereoscopic image processing device comprising:
    an image determination unit that performs frequency analysis on a stereoscopic image to be displayed on the basis of a plurality of viewpoint images corresponding to different parallaxes; and
    a parallax control unit that adjusts the parallax amount according to the amount of high-frequency components found by the frequency analysis.
  6. The stereoscopic image processing device according to claim 5, wherein the parallax control unit adjusts the parallax amount to be larger when the amount of high-frequency components is large.
  7. The stereoscopic image processing device according to claim 5, further comprising a scene information reflecting unit that associates the result of the frequency analysis of the stereoscopic image with the stereoscopic image.
  8. A stereoscopic image processing device comprising:
    an image determination unit that detects motion in a stereoscopic image displayed on the basis of a plurality of viewpoint images corresponding to different parallaxes; and
    a parallax control unit that adjusts the parallax amount according to the amount of motion of the stereoscopic image.
  9. The stereoscopic image processing device according to claim 8, wherein the parallax control unit adjusts the parallax amount to be smaller when the amount of motion of the stereoscopic image is small.
  10. The stereoscopic image processing device according to claim 8, further comprising a scene information reflecting unit that associates the result of detecting the motion of the stereoscopic image with the stereoscopic image.
  11. A stereoscopic image processing method, wherein the viewpoints of a stereoscopic image displayed on the basis of a plurality of viewpoint images corresponding to different parallaxes are set for each scene.
  12. A stereoscopic image processing method, wherein the viewpoints of a stereoscopic image displayed on the basis of a plurality of viewpoint images corresponding to different parallaxes are set at predetermined time intervals.
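
Claims 3 to 9 together describe an adjustment loop: analyse the displayed stereoscopic image, scale the parallax amount according to its high-frequency content (claims 5 and 6) and its motion (claims 8 and 9), and keep every change inside a threshold provided in advance (claims 3 and 4). The following sketch shows one way such a loop could look; the FFT-based high-frequency measure, the frame-difference motion measure, and all gain and scale constants are assumptions for illustration, not the patent's implementation.

import numpy as np

def high_freq_ratio(frame):
    """Fraction of spectral energy in the upper half of the frequency band.

    np.fft.fft2 places DC at index (0, 0), so the centre block of the
    unshifted spectrum holds the highest frequencies.
    """
    spec = np.abs(np.fft.fft2(frame.astype(float)))
    h, w = spec.shape
    high = spec[h // 4: 3 * h // 4, w // 4: 3 * w // 4].sum()
    return high / (spec.sum() + 1e-9)

def motion_amount(frame, prev_frame):
    """Mean absolute inter-frame difference as a crude motion measure."""
    return float(np.abs(frame.astype(float) - prev_frame.astype(float)).mean())

def adjust_parallax(parallax, frame, prev_frame,
                    hf_gain=0.2, motion_gain=0.2, max_step=0.05):
    """One update step of the parallax amount for grayscale (H, W) frames."""
    hf = high_freq_ratio(frame)                        # 0..1, larger = more detail
    motion = min(motion_amount(frame, prev_frame) / 10.0, 1.0)  # arbitrary scale
    # Claims 5-6: more high-frequency detail -> larger parallax.
    # Claims 8-9: less motion -> smaller parallax.
    # The 0.5 neutral point is an arbitrary choice for this sketch.
    target = parallax * (1.0 + hf_gain * (hf - 0.5)
                             - motion_gain * (1.0 - motion))
    # Claims 3-4: clamp the per-step change to a threshold provided in advance.
    step = float(np.clip(target - parallax, -max_step, max_step))
    return parallax + step

Here max_step plays the role of the pre-provided threshold of claims 3 and 4, preventing the parallax, and hence the camera parameters derived from it, from jumping abruptly between frames.
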
JP2003003765A 2003-01-09 2003-01-09 Stereoscopic image processing method and apparatus Granted JP2004221700A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003003765A JP2004221700A (en) 2003-01-09 2003-01-09 Stereoscopic image processing method and apparatus

Publications (1)

Publication Number Publication Date
JP2004221700A (en) 2004-08-05

Family

ID=32894941

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003003765A Granted JP2004221700A (en) 2003-01-09 2003-01-09 Stereoscopic image processing method and apparatus

Country Status (1)

Country Link
JP (1) JP2004221700A (en)

Cited By (72)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4591548B2 * 2008-06-04 2010-12-01 Sony Corporation Image coding apparatus and image coding method
JP2009296272A (en) * 2008-06-04 2009-12-17 Sony Corp Image encoding device and image encoding method
US8369630B2 (en) 2008-06-04 2013-02-05 Sony Corporation Image encoding device and image encoding method with distance information
JP2010102137A (en) * 2008-10-24 2010-05-06 Fujifilm Corp Three-dimensional photographing device, method and program
JP2010103866A (en) * 2008-10-27 2010-05-06 Fujifilm Corp Three-dimensional (3d) display and method of displaying in three dimensions, and program
US8130259B2 (en) 2008-10-27 2012-03-06 Fujifilm Corporation Three-dimensional display device and method as well as program
JP4625517B2 * 2008-10-27 2011-02-02 Fujifilm Corporation Three-dimensional display device, method and program
US8120606B2 (en) 2009-02-05 2012-02-21 Fujifilm Corporation Three-dimensional image output device and three-dimensional image output method
JP2010206774A (en) * 2009-02-05 2010-09-16 Fujifilm Corp Three-dimensional image output device and method
JP4737573B2 * 2009-02-05 2011-08-03 Fujifilm Corporation 3D image output apparatus and method
CN102308590B (en) * 2009-02-05 2014-04-16 富士胶片株式会社 Three-dimensional image output device and three-dimensional image output method
JP2010200213A (en) * 2009-02-27 2010-09-09 Sony Corp Image processing apparatus, image processing method, program, and three-dimensional image display apparatus
US8379113B2 (en) 2009-03-11 2013-02-19 Fujifilm Corporation Imaging apparatus, image correction method, and computer-readable recording medium
WO2010104089A1 (en) * 2009-03-11 2010-09-16 Fujifilm Corporation Imaging apparatus, image correction method, and computer-readable recording medium
US8885070B2 (en) 2009-03-11 2014-11-11 Fujifilm Corporation Imaging apparatus, image correction method, and computer-readable recording medium
JP2010226362A (en) * 2009-03-23 2010-10-07 Fujifilm Corp Imaging apparatus and control method thereof
JP2012100342A (en) * 2009-04-03 2012-05-24 Sony Corp Information processing equipment, information processing method, and program
JP2010259056A (en) * 2009-04-03 2010-11-11 Sony Corp Information processing device, information processing method, and program
US8971692B2 (en) 2009-04-03 2015-03-03 Sony Corporation Information processing device, information processing method, and program
US9756310B2 (en) 2009-04-03 2017-09-05 Sony Corporation Information processing device, information processing method, and program
CN102187681A (en) * 2009-08-31 2011-09-14 索尼公司 Three-dimensional image display system, parallax conversion device, parallax conversion method, and program
US9307224B2 (en) 2009-11-23 2016-04-05 Samsung Electronics Co., Ltd. GUI providing method, and display apparatus and 3D image providing system using the same
JP2011114868A (en) * 2009-11-23 2011-06-09 Samsung Electronics Co Ltd Gui providing method, and display apparatus and 3d image providing system using the same
JP2011250059A (en) * 2010-05-26 2011-12-08 Sharp Corp Image processing device, image display device and image pickup device
WO2011148921A1 * 2010-05-26 2011-12-01 Sharp Corporation Image processor, image display apparatus, and imaging device
CN102939764A (en) * 2010-05-26 2013-02-20 夏普株式会社 Image processor, image display apparatus, and imaging device
US8831338B2 (en) 2010-05-26 2014-09-09 Sharp Kabushiki Kaisha Image processor, image display apparatus, and image taking apparatus
JP2011250318A (en) * 2010-05-28 2011-12-08 Sharp Corp Three-dimensional image data generation device, display device, three-dimensional image data generation method, program, and recording medium
JP2012003350A (en) * 2010-06-14 2012-01-05 Hal Laboratory Inc Image display program, device, system, and method
US8773506B2 (en) 2010-06-25 2014-07-08 Fujifilm Corporation Image output device, method and program
WO2011162209A1 * 2010-06-25 2011-12-29 Fujifilm Corporation Image output device, method, and program
CN102959967A (en) * 2010-06-25 2013-03-06 富士胶片株式会社 Image output device, method, and program
CN102959967B (en) * 2010-06-25 2015-07-01 富士胶片株式会社 Image output device and method
JP2012015771A (en) * 2010-06-30 2012-01-19 Toshiba Corp Image processing apparatus, image processing program, and image processing method
US9258552B2 (en) 2010-06-30 2016-02-09 Fujifilm Corporation Playback device, compound-eye image pickup device, playback method and non-transitory computer readable medium
JP2012039484A (en) * 2010-08-10 2012-02-23 Nikon Corp Display device, display method, and program
WO2012020558A1 * 2010-08-10 2012-02-16 Nikon Corporation Image processing device, image processing method, display device, display method and program
US9488841B2 (en) 2010-08-10 2016-11-08 Nikon Corporation Image processing apparatus, image processing method, display apparatus, display method, and computer readable recording medium
CN103069814A (en) * 2010-08-10 2013-04-24 株式会社尼康 Image processing device, image processing method, display device, display method and program
US10462455B2 (en) 2010-08-10 2019-10-29 Nikon Corporation Display apparatus, display method, and computer readable recording medium
CN103098478A (en) * 2010-08-16 2013-05-08 富士胶片株式会社 Image processing device, image processing method, image processing program, and recording medium
WO2012023330A1 * 2010-08-16 2012-02-23 Fujifilm Corporation Image processing device, image processing method, image processing program, and recording medium
JP2012090256A (en) * 2010-09-22 2012-05-10 Nikon Corp Image display device and imaging device
CN102547324B (en) * 2010-09-22 2016-02-17 株式会社尼康 Image display device and camera head
US9076245B2 (en) 2010-09-22 2015-07-07 Nikon Corporation Image display apparatus and imaging apparatus
CN102547324A (en) * 2010-09-22 2012-07-04 株式会社尼康 Image display apparatus and imaging apparatus
US9035939B2 (en) 2010-10-04 2015-05-19 Qualcomm Incorporated 3D video control system to adjust 3D video rendering based on user preferences
JP2013540402A * 2010-10-04 2013-10-31 Qualcomm Incorporated 3D video control system for adjusting 3D video rendering based on user preferences
JP2012083573A (en) * 2010-10-12 2012-04-26 Canon Inc Stereoscopic video processor and method for controlling the same
JP2014501086A * 2010-11-23 2014-01-16 Shenzhen Super Perfect Optics Ltd. Stereo image acquisition system and method
KR101296902B1 (en) 2010-12-08 2013-08-14 엘지디스플레이 주식회사 Image processing unit and stereoscopic image display device using the same, and image processing method
US8711208B2 (en) 2010-12-24 2014-04-29 Fujifilm Corporation Imaging device, method and computer readable medium
JP5453552B2 * 2010-12-24 2014-03-26 Fujifilm Corporation Imaging apparatus, method and program
WO2012086298A1 * 2010-12-24 2012-06-28 Fujifilm Corporation Imaging device, method and program
JP2012141753A (en) * 2010-12-28 2012-07-26 Nintendo Co Ltd Image processing device, image processing program, image processing method and image processing system
JP6011862B2 * 2011-01-27 2016-10-19 Panasonic IP Management Co., Ltd. 3D image capturing apparatus and 3D image capturing method
JP6002043B2 * 2011-04-28 2016-10-05 Panasonic Intellectual Property Corporation of America Stereoscopic intensity adjusting device, stereoscopic intensity adjusting method, program, integrated circuit, recording medium
WO2012147329A1 * 2011-04-28 2012-11-01 Panasonic Corporation Stereoscopic intensity adjustment device, stereoscopic intensity adjustment method, program, integrated circuit, and recording medium
US9094657B2 (en) 2011-09-22 2015-07-28 Kabushiki Kaisha Toshiba Electronic apparatus and method
CN103024408A (en) * 2011-09-22 2013-04-03 株式会社东芝 Stereoscopic image converting apparatus and stereoscopic image output apparatus
CN103024408B (en) * 2011-09-22 2015-07-15 株式会社东芝 Stereoscopic image converting apparatus and stereoscopic image output apparatus
JP2013070267A (en) * 2011-09-22 2013-04-18 Toshiba Corp Stereoscopic image converting apparatus, stereoscopic image output apparatus and stereoscopic image converting method
JP2012065330A (en) * 2011-10-21 2012-03-29 Fujifilm Corp Three-dimensional display device and method, and program
JP2013168897A (en) * 2012-02-17 2013-08-29 Nintendo Co Ltd Display control program, display control apparatus, display control system, and display control method
US9019265B2 (en) 2012-02-17 2015-04-28 Nintendo Co., Ltd. Storage medium having stored therein display control program, display control apparatus, display control system, and display control method
WO2013128847A1 * 2012-03-02 2013-09-06 Panasonic Corporation Parallax adjustment device, three-dimensional image generator, and parallax adjustment method
US20140055579A1 (en) * 2012-03-02 2014-02-27 Panasonic Corporation Parallax adjustment device, three-dimensional image generation device, and method of adjusting parallax amount
JP2013258568A (en) * 2012-06-13 2013-12-26 Panasonic Corp Stereoscopic video recording apparatus, stereoscopic video display device, and stereoscopic video recording system using the same
WO2014064946A1 * 2012-10-26 2014-05-01 Nikon Corporation Image capture device, image processing device, image capture device control program, and image processing device control program
US9693036B2 (en) 2012-10-26 2017-06-27 Nikon Corporation Imaging apparatus, image processing device, computer-readable medium having stored thereon an imaging apparatus controlling program, and computer-readable medium having stored thereon an image processing program
JPWO2014064946A1 * 2012-10-26 2016-09-08 Nikon Corporation Imaging apparatus, image processing apparatus, control program for imaging apparatus, and control program for image processing apparatus
JP2015141418A (en) * 2014-01-29 2015-08-03 株式会社リコー Depth-disparity calibration of binocular optical augmented reality system

Similar Documents

Publication Publication Date Title
DE69632212T2 (en) Image conversion and encoding process
JP5340952B2 (en) 3D projection display
CA2752691C (en) Systems, apparatus and methods for subtitling for stereoscopic content
EP2188672B1 (en) Generation of three-dimensional movies with improved depth control
JP4707368B2 (en) Stereoscopic image creation method and apparatus
US6747610B1 (en) Stereoscopic image display apparatus capable of selectively displaying desired stereoscopic image
US7027659B1 (en) Method and apparatus for generating video images
JP3944188B2 (en) Stereo image display method, stereo image imaging method, and stereo image display apparatus
US8228327B2 (en) Non-linear depth rendering of stereoscopic animated images
JP4740135B2 (en) System and method for drawing 3D image on screen of 3D image display
EP2268049A2 (en) Image data creation device, image data reproduction device, and image data recording medium
JP3568195B2 (en) 3D image generation method
US6160909A (en) Depth control for stereoscopic images
CN101523924B 3D menu display
JP2009528587A (en) Rendering the output image
JP4764305B2 (en) Stereoscopic image generating apparatus, method and program
CN1144157C (en) System and method for creating 3D models from 2D sequential image data
US7596259B2 (en) Image generation system, image generation method, program, and information storage medium
US8711204B2 (en) Stereoscopic editing for video production, post-production and display adaptation
JP2005353047A (en) Three-dimensional image processing method and three-dimensional image processor
EP1671276B1 (en) Image rendering with interactive motion parallax
US9445072B2 (en) Synthesizing views based on image domain warping
KR101313740B1 OSMU (One Source Multi Use)-type Stereoscopic Camera and Method of Making Stereoscopic Video Content thereof
Schmidt et al. Multiviewpoint autostereoscopic displays from 4D-Vision GmbH
JP5317955B2 (en) Efficient encoding of multiple fields of view

Legal Events

Date Code Title Description
2004-07-02 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2006-05-30 A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2006-06-06 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2006-08-07 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2007-02-27 A02 Decision of refusal (JAPANESE INTERMEDIATE CODE: A02)